Power Apps: A Low-Code, Low-Cost Solution

Published

February 16, 2021

Data & App Engineering
Microsoft Azure

What is Microsoft Power Apps?

Power Apps, part of the broader Power Platform, is a suite of apps, services, connectors, and a data platform that provides a rapid, low-code application development environment. Power Apps empowers both individuals with little to no prior development experience and professional developers to rapidly build tailored, feature-rich applications that connect to business data stored either in the underlying data platform, Microsoft Dataverse (previously known as Common Data Service), or in on-premises and online data sources (e.g., Excel, SharePoint, Office 365, SQL Server, Azure SQL, Dynamics 365).

Applications built using Power Apps provide rich business logic and workflow capabilities to transform and automate business processes. Power Apps applications have a responsive design and can run seamlessly on desktop (any browser) or on mobile devices (phone or tablet).

This article will cover the advantages and limitations of Microsoft Power Apps.

Power Apps Advantages

Low-Code solution

Power Apps does not require the purchase or installation of an integrated development environment (IDE) or special licensing to develop. An Office 365 Enterprise E1 (or above) subscription gives users access to the Power Apps development environment.

Unlike a traditional development environment, where only professional developers can be involved in the actual application making, Power Apps empowers everyone to build the applications they need by using advanced functionality previously available only to professional developers. Power Apps “democratizes” the custom business app building experience by enabling users to build feature-rich, custom business apps without writing code.

Advanced features and reusable components

Developers can write code to extend business applications. Code can be used to create data and metadata, apply server-side logic using Azure functions, plug-ins, and workflow extensions, apply client-side logic using JavaScript, integrate with external data using virtual entities and webhooks, build custom connectors, and embed apps into website experiences to create integrated solutions.

Power Apps comes with AI Builder, an optional add-on that provides two kinds of Power Apps components, depending on the models you want to use:

Components that use prebuilt AI models that are ready to use right away (e.g., business card reader, receipt processor, text recognizer).
Components that use custom AI models that are built and trained by the developer (e.g., form processor, object detector).

Any components developed in an app can be imported into other applications.

Use of Industry standard data structures and processes

Power Apps works directly with Microsoft Dataverse, a key data platform offering that includes an extensive base of industry-standard tables covering typical scenarios. The data structures can be easily customized to fit the organization’s needs, with data populated via Power Query. There are numerous additional advantages to using Dataverse:

Easy to manage – Both the metadata and data are stored in the cloud.
Easy to secure – Data is stored securely, with granular role-based security in place to control access to tables for different users within the organization.
Access Dynamics 365 Data – Data from Dynamics 365 applications is also stored within Dataverse, letting organizations quickly build apps that use the Dynamics 365 data and extend the applications with Power Apps.
Rich metadata – Data types and relationships are used directly within Power Apps.
Logic and validation – Define calculated columns, business rules, workflows, and business process flows to ensure data quality and drive business processes.
Productivity tools – Tables are available within the add-ins for Microsoft Excel to increase productivity and ensure data accessibility.

Integration

Power Apps integrates with over 400 connectors (applications/services). For a full list of connectors please see List of all Power Apps connectors.

Of the 400 connectors, the most commonly used are Microsoft ones:

Office 365, getting the data into and out of SharePoint, Excel, Access, or any of the other Office 365 applications.
SQL Server and Azure SQL, working with custom-built databases.
Power BI, a business analytics service that aims to provide interactive visualizations and business intelligence capabilities.
Power Automate (Microsoft Flow), a service that helps create automated workflows between applications and services to synchronize files, get notifications, collect data, and more.

Accelerated time-to-market

With traditional application development approaches such as ‘waterfall’, and even modern iterative approaches such as ‘agile’, a significant amount of time can pass before users are presented with a minimum viable product. Power Apps provides a WYSIWYG (what you see is what you get) development experience that allows users to work with the actual app very early in the development process; any new requirements or features can easily be added in the next version.

Platform agnostic

Power Apps mobile applications run inside the Power Apps application, which is available on Android, iOS, and Windows and takes care of the differences between the operating systems. There is also a web version of Power Apps that allows a given application to be accessed via any modern web browser instead of the mobile app. For a complete list of supported browsers and platforms, please see System requirements, limits, and configuration values.

Availability, data location and localization

Power Apps and its parent Power Platform are available globally, with numerous options to pick from:

Global cloud: United States, Europe, Asia Pacific
Local cloud: Canada, Brazil, United Kingdom, France, Germany, India, Japan, Australia, United Arab Emirates, Switzerland, Republic of South Africa
Sovereign cloud: US Government, Germany (retiring), China.

All metadata as well as Dataverse data is localized. The applications and the data (metadata, Dataverse) are mirrored and replicated for redundancy.

Security

Power Apps inherits governance and security from the Power Platform. Users are authenticated via Office 365 or Azure AD, inheriting the organization’s authentication configuration, for example multi-factor authentication (MFA).

An application and its data can be secured at the levels below. Please see Power Apps Security for more information.

App-level security – restricts access to the app.
Form-level security – sets permissions to allow only specific security groups to access specific forms.
Record-level security – sets permissions for each individual row.
Field-level security – sets finer-grained permissions for individual fields within a record.

Governance and environment Administration

Power Apps is easily administrated via the Power Platform admin center, with capabilities to create and manage environments, get real-time, self-help recommendations and support for Power Apps and Power Automate, and view Dataverse analytics. For more information, please see Administer Power Platform.

Low-cost solution

Power Apps general pricing is divided into two subscription plans:

Per app plan – monthly cost to run one application per user.
Per user plan – monthly cost to run unlimited applications per user.

Power Apps add-ons (Portals and AI Builder) fall outside of the general pricing. For detailed information, please see Power Apps pricing.

There are deep volume discounts available when licensing at an enterprise level. For example, when acquiring 200+ licenses, the ‘Per app’ plan of $10/month and the ‘Per user’ plan of $40/month are brought down to $3/month and $12/month, respectively.

In a typical project, the bulk of the cost falls into the development bucket, which is associated with the IT project staff (e.g., project managers, developers, engineers). Power Apps greatly limits IT development involvement and the associated cost:

Professional developers have limited involvement in the making of Power Apps.
The need for project management is greatly reduced (since Power Apps projects are typically developed by the business).
IT Engineering has limited involvement in setting up the development environment.

The other portion of the cost is associated with hosting. Infrastructure resources (and licensing), whether on-premises or in the cloud, cost money, and there are additional costs associated with managing and maintaining them. Here too, Power Apps greatly reduces costs by limiting IT (Engineering) involvement and completely removing the cost of hosting:

IT Engineering is not involved in application publishing (promoting it through environments, and ultimately releasing it to production).
IT Engineering is not involved in the Power Apps maintenance as it is performed on the platform, with no impact to the applications.
There are no application hosting costs.

Power Apps Limitations

Every platform has its limitations, and Power Apps is no exception. The following are the most notable:

Data Source fetch limit of 500 records

In Power Apps, every data source (e.g., SharePoint, SQL Server, Common Data Service, OneDrive) is subject to a 500-item limit: a given data pull (query) will only return the first 500 items. Note: there are numerous strategies to overcome this limitation (e.g., delegation, static data, or delegation combined with an iterative paging function).

Integrated development environment (IDE)

The Power Apps integrated development environment runs in the web browser. While this eliminates specific hardware and installation requirements, it lacks the traditional developer ‘feel’. Perhaps more important is the limitation that only one user can edit a given app at a time.

Designated application layout

Applications are developed to fit a particular layout, tablet or phone (not both). Note: the tablet layout can be used to render a Power Apps application on a phone, with the drawback of not having the expected look and feel of a ‘phone’ application.

Final Thoughts

With very few limitations and substantial advantages, including ease of development, extensive features and integration capabilities, platform independence, high availability and redundancy, and enterprise-level security, all at a relatively low cost, Microsoft Power Apps is a clear winner for a company looking to empower its business and improve the bottom line.


Matillion Failure Notifications using Azure Queue Message

Published

July 28, 2020

Data & App Engineering
Microsoft Azure
Matillion

One of the common problems faced when using Matillion hosted on an Azure virtual machine is the inability to send failure emails when a scheduled job does not run successfully. Without a solution outside of Matillion, users would need to start the virtual machine to check the status of a job. Depending on the number of jobs and their scheduled run times, this could become a time-consuming manual process of starting and stopping the virtual machine.

The solution to this problem combines the Azure Queue Message component within Matillion, Azure Queues within an Azure Storage Account, and an Azure function using an Azure Queue Trigger written using .Net Core 3.1. When a job fails, the Queue Message component will send a message to the Queue listening for failures. This triggers the Azure Function, which sends an email notification of a failure, including the job title, to any email address added in the function.

Setting Up Azure Storage Account

To get started, ensure that there is an Azure Storage Account created. There is likely one already created for the resource group that the Matillion VM is using, but a new one can be created if there is a desire to keep messages in a separate location.

Within the created storage account, there is an option for ‘Queues’ under ‘Queue Service’. Selecting the ‘+ Queue’ button at the top of the pane will bring up an option for entering the queue name; this is the only setting that is needed. Keep in mind that queues can be used for both success and failure emails if desired, and this name will be entered in the Matillion component.

For Matillion to be able to send messages to the queue, access needs to be given to the storage account. There are multiple ways to provide access, the easiest being to grant the VM owner access to the account. To do this, navigate to the ‘Access control (IAM)’ pane within the storage account. Clicking the ‘+ Add’ button will bring up an ‘Add Role’ pane allowing the user to add the owner access for the VM. For more options on granting permission, check the Matillion documentation.

Setting Azure Credentials Within Matillion

Within Matillion, right clicking on the selected environment will allow the user to edit the environment setup. Select ‘Instance Credentials’ for the ‘Azure Credentials’. This will allow Matillion to access the storage account and send messages to the Queue.

Creating Azure Queue Storage Message Component

For this component, we are using an environment variable called ‘AuditJobName’ as the message to be sent. This variable is set using the ‘Set Scalar Variables’ setting of the orchestration job. This variable is populated with the automatic variable ‘job_name’. While any text can be entered as the message, using a variable for the job name allows for the same component to be used in multiple different jobs or within an audit framework without the need for multiple different messages. For more information on the automatic variables that can be used in a Matillion audit process, check out the Matillion documentation here.

Create Azure Function

To create the function app, search for ‘Function Apps’ within the Azure portal. This will open a pane allowing the settings to be chosen for all functions within this Function App container. Choose the necessary subscription and resource group for the project. Choose .Net Core as the runtime stack, which will default the version to 3.1 or above depending on the most recent version; this allows the function to be written using C# code. Select ‘Review + Create’ to complete the setup.
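
Once the Function App exists, the function itself is a small queue-triggered method. The sketch below shows roughly what it can look like; the queue name, connection setting name, SMTP server, and email addresses are placeholders, and any mail mechanism (SendGrid, Office 365 SMTP, etc.) can be substituted for the System.Net.Mail example used here.

using System.Net;
using System.Net.Mail;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class MatillionFailureNotifier
{
    // Runs whenever Matillion drops a message onto the failure queue.
    // "matillion-failures" and "StorageConnection" are placeholder names;
    // use the queue name created earlier and an app setting holding the
    // storage account connection string.
    [FunctionName("MatillionFailureNotifier")]
    public static void Run(
        [QueueTrigger("matillion-failures", Connection = "StorageConnection")] string jobName,
        ILogger log)
    {
        log.LogInformation($"Matillion job failed: {jobName}");

        // SMTP host, credentials, and addresses are illustrative only.
        using (var client = new SmtpClient("smtp.example.com", 587))
        {
            client.EnableSsl = true;
            client.Credentials = new NetworkCredential("alerts@example.com", "app-password");

            var message = new MailMessage(
                "alerts@example.com",                       // from
                "data-team@example.com",                    // to
                $"Matillion job failed: {jobName}",         // subject
                $"The Matillion job '{jobName}' reported a failure via the Azure queue."); // body

            client.Send(message);
        }
    }
}

The storage connection string referenced by the Connection property is added as an application setting on the Function App (or in local.settings.json during development), and the queue name must match the one entered in the Matillion Azure Queue Storage Message component.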


Streaming Data Analytics with Azure IoT and Power BI

Published

May 1, 2020

Data Analytics
Microsoft Azure
Power BI

Recently, we had a client reach out with an interesting challenge – to see a data stream from a device visualized in real-time without the need to refresh the page for updated results or even wait for a data pipeline to run its next scheduled job. The foundation of our solution was Microsoft Azure, where organizations can quickly deploy scalable, managed services around Internet of Things (IoT) applications. So, here is what we built – a combination of Azure components as the base architecture and Power BI’s streaming data set for visual analytics.

At a high level, four key pieces in Azure are required to get messages from a device to live visualizations in Power BI: the IoT device sends a message to Azure IoT Hub; when the message is received, an Azure Function transforms it into the proper JSON format and sends an output to an Event Hub; when the Event Hub receives the message, Azure Stream Analytics pushes it to a custom Power BI dataset. This allows the data to move in real time in Power BI without a data-sync-related delay. Let’s walk through these components in detail.

Create an Azure IoT Hub

To build this solution, the first component needed is an Azure IoT Hub. This hub serves as a manager for all of your IoT devices, provides security, and ensures reliable communication between the device and the hub. In a production scenario, these IoT devices could be any of the numerous smart devices found in a workplace, but for this example we will simulate a Raspberry Pi sending messages. Setting up an IoT Hub is simple: navigating to the IoT Hub pane within Azure, selecting “Add”, entering your preferred resource group, and creating a name for the hub will get it up and running. Creating a test device is just as easy. Within your new IoT Hub, select IoT devices and click “New”. There are settings for the device, but only an ID is required to get it running. If done successfully, the device should appear in the list (Fig 1).

Figure 1 – Azure IoT Hub Device List

To test the device, we used Visual Studio with the Azure IoT Hub extension enabled. Once connected with Azure, the test device should be visible. To test, right-click the device and select “Send D2C Message to IoT Hub” (Fig 2). This allows you to send plain text or a JSON template to the IoT Hub. The successful message count can be seen within the IoT Hub metrics by selecting the “Telemetry Messages Sent” metric.

Figure 2 – Visual Studio – Azure IoT Hub Extension D2C Messages
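
If you would rather drive the test from code than from the extension, a small console app using the Azure IoT device SDK (Microsoft.Azure.Devices.Client) can send the same device-to-cloud messages. The connection string below is a placeholder copied from the test device, and the payload is an arbitrary sample value.

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class SimulatedDevice
{
    // Placeholder: copy this from the test device created in the IoT Hub above.
    private const string DeviceConnectionString =
        "HostName=<your-hub>.azure-devices.net;DeviceId=<your-device>;SharedAccessKey=<key>";

    static async Task Main()
    {
        using var deviceClient = DeviceClient.CreateFromConnectionString(
            DeviceConnectionString, TransportType.Mqtt);

        var rand = new Random();
        while (true)
        {
            // Send a simple plain-text reading once per second.
            var payload = rand.Next(60, 80).ToString();
            using var message = new Message(Encoding.UTF8.GetBytes(payload));

            await deviceClient.SendEventAsync(message);
            Console.WriteLine($"Sent: {payload}");
            await Task.Delay(TimeSpan.FromSeconds(1));
        }
    }
}

Running this while watching the “Telemetry Messages Sent” metric confirms the hub is receiving data.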

Configure an Azure Event Hub

Next, the Event Hub needs to be created. The Event Hub is an ingestion service that helps process streaming data. While it may not be necessary for all projects, depending on the message source and format, we included it in order to do data transformations with an Azure Function. This Event Hub will take the output from our Azure Function and connect with an Azure Stream Analytics job to send data to Power BI. Creating an Event Hub namespace is similar to creating an IoT Hub: it requires a name, pricing option, and resource group. Once the namespace is created, an Event Hub entity can be added under the “Entities” pane (Fig 3).

Figure 3 – Azure Event Hub Namespace and Event Hub Entity

Develop an Azure Function

In our example, an Azure Function is necessary because the JSON format required by the Stream Analytics job did not match the string format of the messages coming from the device. So, with the Azure Functions extension within Visual Studio, we created a new function with an IoTHubTrigger template. This trigger means that every time the IoT Hub receives a message, the function will run using the message’s data. The function connects to the IoT Hub using the “Event Hub-compatible endpoint”, which can be found under the Built-in Endpoints settings of your IoT Hub.

Figure 4 – Azure Function

This small function returns the data from the message in a JSON object format to the previously created Event Hub as a new message. The function connects to the Event Hub using a connection string found within the shared access policies for the Event Hub Namespace. Developers should store the connection strings within the local.settings.json file that is created when using the IoTHubTrigger template. After deploying the Azure Function, the connection strings should be entered within the configuration settings on Azure.
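
The exact code depends on the message format coming off the device, but as a rough sketch (the hub name, connection setting names, and output property name are assumptions for illustration, not the exact function from the screenshot), an IoTHubTrigger function that re-shapes the message and forwards it to the Event Hub can look like this:

using System.Text;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class IoTHubToEventHub
{
    // Triggered on every message the IoT Hub receives; the connection settings
    // ("IoTHubEndpoint", "EventHubConnection") and the Event Hub name
    // ("streaming-events") are placeholders for your own app settings.
    [FunctionName("IoTHubToEventHub")]
    [return: EventHub("streaming-events", Connection = "EventHubConnection")]
    public static string Run(
        [IoTHubTrigger("messages/events", Connection = "IoTHubEndpoint")] EventData message,
        ILogger log)
    {
        // The device sends plain text; wrap it in the JSON shape the
        // Stream Analytics job expects (the property name "data" is an assumption).
        var body = Encoding.UTF8.GetString(message.Body.Array, message.Body.Offset, message.Body.Count);
        log.LogInformation($"Received device message: {body}");

        return JsonConvert.SerializeObject(new { data = body });
    }
}

Here “IoTHubEndpoint” holds the Event Hub-compatible endpoint mentioned above, and “EventHubConnection” holds the Event Hub namespace connection string; both live in local.settings.json locally and in the Function App configuration once deployed.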

Connect an Azure Stream Analytics Job

The next piece required is an Azure Stream Analytics job. This is what relays the data from the Event Hub to Power BI for live visualizations. Creating a new Stream Analytics job also requires a name and resource group. Once created, an input and an output need to be configured to relay messages. Stream Analytics has three input options: Event Hub, IoT Hub, and Blob Storage. When creating the input, select the existing Event Hub and choose the JSON serialization format.

There are many options for the stream output, including Event Hubs, SQL Database, and Blob Storage. We will select Power BI as the output for this project. Once authorized with Power BI, all that needs to be entered is the dataset name and the table name. This will then automatically create the dataset in Power BI. Please note that the data set won’t be created until the first row of data is streamed.

The final step for Stream Analytics is writing the query to select the desired data. Only the EventProcessedUtcTime column (needed to show time on a Power BI visualization axis) and your data column are required, but there is no limit on additional columns. The query must specify that the data comes from the Event Hub input and is pushed to the Power BI output, using the same names specified when they were created (Fig 5).

Figure 5 – Azure Stream Analytics Query

Stream Data with Power BI

Figure 6 – Power BI Tile Settings

The final piece of the process is to create the dashboard in Power BI. This dashboard needs to be created in the same workspace where the Stream Analytics job dataset was created. When adding a tile, select Custom Streaming Data to choose the dataset and configure the settings. For tiles that require an axis, the EventProcessedUtcTime column must be selected, as this allows the axis to move as more messages are sent over time. The ‘time window to display’ setting sets the maximum time shown on the axis (Fig 6). From our testing, shorter time frames respond better to frequent messages, but this setting can be changed for each visual, so choose what best provides the intended effect.

Once all the components have been configured, it is time to test. If using the IoT test device, the best way to test is to send messages from Visual Studio using a D2C message with the Azure IoT extension. Sending many messages allows time for troubleshooting.

Troubleshooting

If data is not being shown in Power BI, there are a couple of things to double-check. First, try refreshing the Power BI page while messages are still being sent; sometimes the original connection might not refresh once messages start being sent. If there are still no messages, edit the dataset and turn off historical analysis; this setting allows for saving data and can cause issues if data types change. A third place to troubleshoot is the Stream Analytics job: if data is not in the proper format, Stream Analytics will show a warning for the input or output, and data will not show in Power BI until it is in the expected format.

Functionality

While Azure and Power BI provide one of the best solutions for live streaming data in terms of ease of use and functionality, this setup is not perfect. Power BI does have some limitations when it comes to custom streaming datasets. The most prevalent issue is the lack of customization. There are only five tile choices for the dashboard – card, line chart, clustered bar chart, clustered column chart, and gauge. These are some of the simpler visualization options Power BI offers, which limits the ability to create more complex dashboards for greater insights. There is also no ability to change the visualization colors, which leaves users stuck with the default green color option. This prevents users from matching the theme of their Power BI environment, causing the dashboard to clash with other pages.

The other common issue with this system is the lack of error reporting. When something is not working – data not loading in Power BI for example – there is often no error code presented to the user. The user must go step by step through the process until the issue is found rather than skipping directly to the error. While there are some limitations, these problems are more related to the user experience rather than the functionality of the system. Overall, this system does exactly what it is intended to do and can provide great live insights into your IoT devices.


How to Integrate Azure AD into Your Web Application

Published

April 21, 2020

Data & App Engineering
Microsoft Azure

Authentication is very important to any application. While you could roll your own authentication, services like Azure AD (Active Directory) make getting your application up and running much faster. In this article, we will walk through, in detail, the steps necessary to set up Azure AD authentication with .NET Core and React.

To follow along, you will need the following:

1. Dotnet 3.1 (or latest LTS)

2. Nodejs

3. Access to an Azure account

Setup Azure AD Instances

To begin, we will go into Azure and create our Azure AD resources. Once you are logged in, simply search for ‘Azure Active Directory’. Once there, you will need to create two new app registrations, one for our backend application and one for our frontend SPA.

To continue, go to ‘App Registrations’ and create two apps. For this example, we’ll have AzureAdExampleBackend and AzureAdExampleFrontend.

For the frontend app, make sure to set a redirect URL. Since we will be using Dotnet Core, the default redirect is https://localhost:5001.

Next, we will go to our backend AD instance and setup access for our frontend instance. Go to ‘Expose an API’ and setup the scope for our backend API.

The scope is the permission that the API will expose to our frontend application. Users will need to consent to these scopes when authenticating with the application. The display names and descriptions are what users and admins will see when they first login to the app and their permissions are initially requested.

The next step is to add the API permissions to the frontend. Start by going to your frontend instance, opening ‘API permissions’ under the ‘Manage’ section of the sidebar, and hitting ‘Add a permission’. You can find your backend Azure AD instance under ‘My APIs’ and grant it the Read permission we set up in the previous step.

Once complete, return to the App Registrations and make note of your ClientIds and TenantId. We will be using these in the next section that goes over the actual application build.

The Code

To begin, assuming you have the latest Dotnet Core and Nodejs installed, we will use the react generator with the following command: dotnet new react. This is going to create a new .Net Core application with a React client application.

Before we go any further into the code, we will add our Azure AD configuration settings to appsettings.json, making sure to add the appropriate IDs where needed. For this file, make sure that you are using the backend instance ClientId.

"AzureAd": {
  "Instance": "https://login.microsoftonline.com/",
  "Domain": "https://login.microsoftonline.com/common",
  "TenantId": "<fill with your id>",
  "ClientId": "<fill with your id>"
},

We will begin by adding the Microsoft.AspNetCore.Authentication.AzureAD.UI package, by running `dotnet add package Microsoft.AspNetCore.Authentication.AzureAD.UI` in the command line. Then we’ll go to the `Startup.cs` file and add the following:

public void ConfigureServices(IServiceCollection services)
{
    services.Configure<CookiePolicyOptions>(options =>
    {
        options.CheckConsentNeeded = context => true;
        options.MinimumSameSitePolicy = SameSiteMode.None;
    });

    services.AddAuthentication(AzureADDefaults.AuthenticationScheme)
        .AddAzureAD(options => Configuration.Bind("AzureAd", options));

    services.Configure<OpenIdConnectOptions>(AzureADDefaults.OpenIdScheme, options =>
    {
        options.Authority = options.Authority + "/v2.0/";
        options.TokenValidationParameters.ValidateIssuer = false;
    });

    // ...
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // ...

    app.UseRouting();

    app.UseAuthentication();
    app.UseAuthorization();

    // ...
}

Next, we want to test our authentication by actually protecting an endpoint in our application. Luckily, the generator creates a controller for us, so go to WeatherForecastController.cs and add [Authorize] above the controller definition.

[Authorize]
[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase { ... }

For the next step, go into the ClientApp folder and run yarn add redux react-redux react-aad-msal msal. This installs redux and the react-aad-msal library, which makes it easier to authenticate with Azure AD in a React app. Now, create a new folder src/auth and within that folder create a new file authProvider.js. Add in the following, replacing the clientId with the frontend ClientId that was previously copied.

import { MsalAuthProvider, LoginType } from 'react-aad-msal';
import { Logger, LogLevel } from 'msal';

export const authProvider = new MsalAuthProvider(
  {
    auth: {
      authority: 'https://login.microsoftonline.com/common',
      clientId: '<your frontend clientId here>',
      postLogoutRedirectUri: window.location.origin,
      redirectUri: window.location.origin,
      validateAuthority: true,
      navigateToLoginRequestUrl: true,
    },
    cache: {
      cacheLocation: 'sessionStorage',
      storeAuthStateInCookie: true,
    },
  },
  {
    scopes: ['api://<your backend clientId here>/Read'],
  },
  LoginType.Redirect,
);

To see the full list of options available, take a look at the documentation for the react-aad-msal library. Next, create a store (for example, src/auth/authStore.js) that uses redux to track the authentication state:

import { createStore } from 'redux';
import { AuthenticationActions, AuthenticationState } from 'react-aad-msal';

const initialState = {
  initializing: false,
  initialized: false,
  idToken: null,
  accessToken: null,
  state: AuthenticationState.Unauthenticated,
};

const rootReducer = (state = initialState, action) => {
  switch (action.type) {
    case AuthenticationActions.Initializing:
      return {
        ...state,
        initializing: true,
        initialized: false,
      };
    case AuthenticationActions.Initialized:
      return {
        ...state,
        initializing: false,
        initialized: true,
      };
    case AuthenticationActions.AcquiredIdTokenSuccess:
      return {
        ...state,
        idToken: action.payload,
      };
    case AuthenticationActions.AcquiredAccessTokenSuccess:
      return {
        ...state,
        accessToken: action.payload,
      };
    case AuthenticationActions.AcquiredAccessTokenError:
      return {
        ...state,
        accessToken: null,
      };
    case AuthenticationActions.LoginSuccess:
      return {
        ...state,
        account: action.payload.account,
      };
    case AuthenticationActions.LoginError:
    case AuthenticationActions.AcquiredIdTokenError:
    case AuthenticationActions.LogoutSuccess:
      return { ...state, idToken: null, accessToken: null, account: null };
    case AuthenticationActions.AuthenticatedStateChanged:
      return {
        ...state,
        state: action.payload,
      };
    default:
      return state;
  }
};

export const store = createStore(rootReducer);

Now that we have our configuration set up, the next step is to start making changes to the component files. First, go to the `index.js` file in the root `src` directory. We want to add the following:

ReactDOM.render(
  <Provider store={store}>
    <AzureAD
      provider={authProvider}
      reduxStore={store}
      forceLogin={true}>
      <BrowserRouter basename={baseUrl}>
        <App />
      </BrowserRouter>
    </AzureAD>
  </Provider>,
  rootElement);

This passes our store to both the Provider and the AzureAD component. We pass in the necessary parameters and set forceLogin to true so it automatically redirects to the login page; users will need to log in before accessing the application. We will leverage the AzureAD component once again in our NavMenu, since we will want a log out button.

<AzureAD
  provider={authProvider}
  reduxStore={store}>
  {({ login, logout, authenticationState }) => {
    if (authenticationState === AuthenticationState.Authenticated) {
      return (<button onClick={logout} color="inherit" variant="outlined">Log Out</button>);
    }
  }}
</AzureAD>

Adding this to the end of the nav list will display a logout button. Since we are forced to log in when we first visit the application, we do not need a login button. Similar to the backend controller, since we used the generator to create the base application, a few frontend components are already set up, and one of them calls the endpoint that now requires authorization. Find the FetchData.js file and its componentDidMount() function; the changes are very minor. We simply need to get our access token before we load the weather data.

async componentDidMount() {
  const res = await authProvider.getAccessToken();
  this.setState({ token: res.accessToken }, () => this.populateWeatherData());
}

After this, simply add the token to the header of the request.

const response = await fetch('https://localhost:5001/weatherforecast', {
  headers: !this.state.token ? {} : { 'Authorization': `Bearer ${this.state.token}` }
});

After making all of these changes, simply run dotnet watch run in your command line and visit https://localhost:5001. You will be redirected to your Microsoft branded login page and then brought back to your application where you’ll be able to access the fetch data page and display the weather data from our controller. Since we added the [Authorize] attribute, the only way to access this data is to be authenticated against the application.

To continue working with Azure AD take a look at these resources:
