PowerApps: Connecting to a custom JSON API

Developer Nathan Burchell shares his experience of working with custom APIs on Microsoft’s low-code platform, whilst building a rostering tool for Greenwich staff. 

 

Royal Greenwich’s street services team are responsible for waste collection and street cleaning across the borough. We’ve written previously about how we’re building a rostering tool using Microsoft PowerApps, to improve the back-office efficiency of the service.  

We chose PowerApps because it offered the most flexibility of the low-code platforms we explored, and it allows us to introduce custom integrations where required. This post shares a developer’s experience of blending PowerApps’ low-code components with more traditional API-based development.

 

Why we needed to introduce a custom API for the app

We began building the app using Microsoft PowerApps for the front-end, and a set of SharePoint tables that acted as a temporary back-end database. We were able to set up SharePoint tables in a matter of minutes, which made it quick to start development. However, SharePoint has a significant drawback when working with large amounts of data: it slows the app down. Using SharePoint tables, we found that the application’s loading times fell below expectations in certain scenarios, which would have made working with the application a frustrating experience for users.

When our PowerApp connected to the SharePoint table (also known as a SharePoint list), it pulled in all of the data stored in that table. Every single row was loaded into the app, so the data had to be handled and filtered in the app itself. This hurt the app’s performance and slowed down certain interactions, largely because the filtered data was constantly refreshed in the background.

We needed to find a solution which would allow us to still pull data into the app, whilst also pre-filtering the data to reduce load times and keep the app performing optimally. 

 

The solution: connecting a custom API to the PowerApp 

Fortunately, we were able to meet the performance requirements by building a custom API, which reads data from and writes data to a database. The API structures the data in a form the PowerApp can understand and serves it from an endpoint in JSON format.

We initially created a mock version of the API, using a JSON file as the data source, before replacing this with a PostgreSQL database. We used the Python FastAPI framework to create the mock API. Although it was originally intended as a prototype, we realised that the FastAPI code would be a great fit for the production version of the API, so we were able to re-use some of the code when developing the database-backed version.
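
As a rough illustration, a JSON-backed mock like this only needs a few lines of FastAPI. The file name, endpoint path and field names below are placeholders rather than our real schema, so treat it as a sketch, not our actual code.

```python
# mock_api.py - minimal sketch of a JSON-file-backed mock API.
# "rounds.json", "/rounds" and the field names are illustrative placeholders.
import json
from pathlib import Path
from typing import Optional

from fastapi import FastAPI

app = FastAPI(title="Rostering mock API")

DATA_FILE = Path("rounds.json")  # hypothetical JSON data source


@app.get("/rounds")
def list_rounds(date: Optional[str] = None):
    """Return round allocations, optionally filtered by ISO date (YYYY-MM-DD)."""
    rounds = json.loads(DATA_FILE.read_text())
    if date is not None:
        rounds = [r for r in rounds if r.get("date") == date]
    return rounds  # FastAPI serialises the list to JSON automatically
```

Running it with `uvicorn mock_api:app --reload` serves the JSON locally, which gives us something realistic to develop against while the real database is still being built.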

Once this was set up, we created a ‘custom connector’ in PowerApps to link the app to the custom API. Within the connector settings we can configure the different types of requests that will be required, such as GET or POST. In this instance we required a GET request, as we are trying to pull data into the app, so we needed to define the endpoint URL that the GET request should connect to in order to pull through the JSON.
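
It can be worth sanity-checking the endpoint with a plain GET request before configuring the connector, to see exactly the JSON the PowerApp will receive. The URL and query parameter below are placeholders, not our real endpoint.

```python
# Replicate the connector's GET request to inspect the JSON it will receive.
# The URL and query parameter are placeholders.
import requests

response = requests.get(
    "https://example.org/api/rounds",   # endpoint the custom connector will call
    params={"date": "2024-06-03"},      # same kind of query the connector passes
    timeout=10,
)
response.raise_for_status()
print(response.json())                  # the JSON payload the PowerApp receives
```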

 

PowerApps interface, showing fields to configure a custom connector, including the URL, path and query. 

Defining the URL and type of request.

 

Microsoft PowerApps interface, showing the headers and body of the API response, in JSON format.

Fetching the response to the GET request.

 

The final step was to configure the PowerApp to allow the pre-filtering of the data to take place. For instance, our app requires us to pull in waste and recycling round collection information for each day, so we’d want to pull through the allocations for a specific date, rather than all dates. We set up a date picker, and the date selected by the user is stored as a variable. The connector passes this variable into the endpoint URL, so when the GET request is sent, it only fetches the data for the date selected by the user. When the app receives the data, there’s no need to filter it further because the API has already handled this, returning only the data for the specified date.
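
To give a flavour of the server side of this, a FastAPI endpoint that takes a date and asks PostgreSQL for just that day’s allocations might look something like the sketch below. The connection details, table name and column names are illustrative assumptions, not our actual schema.

```python
# rounds_api.py - hedged sketch of a date-filtered endpoint backed by PostgreSQL.
# The DSN, table name (round_allocations) and column names are placeholders.
from datetime import date

import psycopg2
import psycopg2.extras
from fastapi import FastAPI

app = FastAPI(title="Rostering API")

DSN = "dbname=rostering user=app password=change-me host=localhost"  # placeholder


@app.get("/allocations/{collection_date}")
def allocations_for_date(collection_date: date):
    """Return only the allocations for the requested date, so the PowerApp
    never has to filter the data itself."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
            cur.execute(
                "SELECT round_name, crew, collection_date "
                "FROM round_allocations "
                "WHERE collection_date = %s",
                (collection_date,),
            )
            return cur.fetchall()
```

Because the filtering happens in the SQL query, the response only ever contains one day’s worth of rows, which is what keeps the workload inside the app small.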

 

Results and challenges 

We’ve seen a dramatic increase in speed since making this change. When connecting the app to the mock API, data was displayed in a few tenths of a second, compared to 6-7 seconds using the SharePoint tables. This made it clear to us that SharePoint tables are great for prototyping (and for production use in simpler apps with lower data volumes), but aren’t robust enough for production use in more complex apps.

One challenge we faced was how to structure the incoming JSON. It’s vital that we structure the JSON in a way that caters for all of the app’s components. There’s been a significant learning curve in understanding what data structures the various PowerApps components require, so that we could design something that works for all of them.
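
By way of illustration only (the field names below are placeholders), defining the response shape explicitly, for example with Pydantic models in FastAPI, is one way to keep the payload a flat, predictable list of records that galleries and tables can consume without reshaping.

```python
# Hedged illustration: flat, consistently named records keep the JSON easy
# for app components to bind to. Field names here are placeholders.
from datetime import date
from typing import List

from pydantic import BaseModel


class Allocation(BaseModel):
    round_name: str
    crew: str
    collection_date: date


# Declaring this as the response model keeps every response in the same shape:
# @app.get("/allocations", response_model=List[Allocation])
```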

Another challenge we faced was not being able to develop across the full stack in local environments. When working on the API and PostgreSQL database locally, we’ve not found it possible to connect them to the (Microsoft cloud-hosted) PowerApp in a secure way. We were able to get around this by developing both sides of the stack against the stub, and then integrating them in a cloud-hosted development environment. This has made the process slightly longer, but fortunately the stub solution provided all the functionality we needed to proceed.

In conclusion, the custom API has significantly improved app performance. We'll monitor how this scales as we increase the amount of data being sent to and from the PowerApp in future performance testing.  

 
