Streaming Data into Power BI Using Alteryx

Published by Concentra

Discover analytics solutions from Concentra

Concentra’s analytics and business intelligence teams turn information into insight to give you the edge from your data. Learn more.

To find out more about Concentra’s bespoke Analytics Services and Solutions see Services.

Creating a Live Currency Exchange Rate Dashboard

I’m back with more on financial data. In this blog, I’ll demonstrate how to build a near real-time, self-updating dashboard that displays the most recent exchange rates of several currencies against sterling over the last 60 minutes.

To accomplish this, I’ll use streaming datasets, a recently added Power BI feature that enables displaying streaming data on dashboard tiles. Alteryx will fetch the exchange rates from the web and continuously append them to the Power BI streaming datasets, which will then automatically refresh the dashboard tiles with the most recent data.

Create your streaming datasets in Power BI

The first step is to create all the streaming datasets that we need – one per currency. In this article, we will get the exchange rates for the Euro (EUR), US Dollar (USD), Canadian Dollar (CAD), Swiss Franc (CHF), United Arab Emirates Dirham (AED), and the Indian Rupee (INR) against the British Pound (GBP), meaning we will need to create six streaming datasets in Power BI.

To create a streaming dataset, log in to Power BI and click on the Streaming datasets option in the Datasets section of the Power BI menu on the left side of the screen:

Clicking on Add streaming dataset at the top of the list of streaming datasets creates a new streaming dataset:

The first step in creating the new streaming dataset is to specify that the data will be posted through an API, give the dataset a name, define the list of value fields and their data types, and enable Historic data analysis so that the posted values are retained:

Notice that we defined two fields to be passed to the streaming dataset – Time as a DateTime value and ExchangeRate as a number, which will respectively hold the timestamp and the exchange rate of the currency that the dataset tracks. The values are passed to the stream API as JSON content. An example of how this content should be formatted is shown in the box below the field names. To finalise the creation of the streaming dataset, we just need to click the Create button.
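The formatting box from the Power BI dialog isn’t reproduced here, but a payload matching these two fields would look like the following (the timestamp and rate values are purely illustrative):

```json
[
  {
    "Time": "2017-05-04T09:41:00.000Z",
    "ExchangeRate": 1.18215
  }
]
```

Note that Power BI expects a JSON array of row objects, even when posting a single row.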

After creating all the required streaming datasets, our Power BI Streaming data window should look similar to the following:

To update each dataset, we just need to send an HTTP POST request to its respective API with valid JSON matching the field structure that the dataset expects. To get the API URL for each streaming dataset, we can click the API Info icon next to the dataset in the list:
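To make the mechanics concrete, here is a minimal Python sketch of the POST request that any client (Alteryx, in our case) ends up sending. The URL is a placeholder following the usual Power BI push-URL shape – substitute the real URL you copied for the dataset:

```python
import json
import urllib.request

# Placeholder push URL – copy the real one from your streaming dataset in Power BI.
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

def build_request(url, time_stamp, rate):
    """Build the HTTP POST request Power BI expects: a JSON array of row objects."""
    payload = json.dumps([{"Time": time_stamp, "ExchangeRate": rate}]).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(PUSH_URL, "2017-05-04T09:41:00.000Z", 1.18215)
# urllib.request.urlopen(req) would actually send it; omitted to keep the sketch offline.
```

Sending one such request per currency per minute is exactly what the Alteryx workflow below automates.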

How to push real-time exchange rate data to the Power BI streaming dataset API using Alteryx

We’ll get the current exchange rates for sterling from a well-known currency exchange website. We will use a workflow similar to the one I explained in my previous blog on this matter, so please do review it if you wish to implement this one.
It looks like this:

Now we will add three more tools sequentially to this workflow. We start by inserting another Formula tool after the last one to add two more fields: TileAPI, a string of length 1024 that will hold the Power BI API URL for the respective currency, and JSONContent, also a string of length 1024, that will hold the JSON-formatted content (timestamp and exchange rate) to push to that API.

For the TileAPI field, we will define an expression that returns the API URL corresponding to the value of the Currency field in the workflow. Note that at this stage you should have the list of unique API URLs, one per streaming dataset (as demonstrated at the end of the previous section), belonging to each currency.

So after selecting TileAPI in the Output Field list, enter the following expression in its Expression box:
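The expression itself isn’t reproduced in the source, but in Alteryx a conditional of the following shape would map each currency to its push URL. The URLs below are placeholders – substitute the real ones you copied for each streaming dataset:

```
IF [Currency] = 'EUR' THEN 'https://api.powerbi.com/beta/<workspace-id>/datasets/<eur-dataset-id>/rows?key=<eur-key>'
ELSEIF [Currency] = 'USD' THEN 'https://api.powerbi.com/beta/<workspace-id>/datasets/<usd-dataset-id>/rows?key=<usd-key>'
// ...repeat an ELSEIF branch for CAD, CHF, AED and INR...
ELSE '' ENDIF
```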

For JSONContent, after selecting the field in the Output Field list, enter the following in its Expression box:

'[{"Time":"' + DateTimeFormat(DateTimeNow(), '%Y-%m-%dT%H:%M:00.000Z') + '",' +
'"ExchangeRate":' + ToString([Rate], 5) + '}]'
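To sanity-check what this expression produces, here is a hypothetical Python equivalent that builds the same string and confirms it parses as valid JSON (the rate value is illustrative):

```python
import json
from datetime import datetime, timezone

def build_json_content(rate):
    """Mirror of the Alteryx expression: a one-row JSON array with a
    minute-truncated UTC timestamp and the rate to five decimal places."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:00.000Z")
    return '[{"Time":"' + stamp + '","ExchangeRate":' + format(rate, ".5f") + '}]'

content = build_json_content(1.18215)
rows = json.loads(content)  # parses cleanly, so Power BI will accept the shape
```

If the string failed to parse here, Power BI would silently reject the posted rows, so a check like this is a cheap way to validate the expression before scheduling the workflow.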

This will generate the appropriate JSON content to post to each API. The next step is to add a Download tool to the workflow to post the JSON values in the JSONContent field to their respective Power BI APIs given by the TileAPI field. To do this, add a Download tool as the next tool in the workflow and fill in its configuration as shown in the following images, which will allow you to post your data to the Power BI streaming API:

Finally, go to the Connection tab and fill in your Power BI username and password in the Username and Password boxes (these credentials will be required when you publish and schedule your workflow to be continuously executed by the Alteryx Scheduler and able to post to the Power BI service).

So the final workflow should look like this:

Making the Alteryx workflow continuously push the exchange rates to Power BI

We will now schedule the Alteryx workflow to run every minute on a continuous basis, so that the streaming datasets accumulate the historic exchange rates minute by minute.

To do this, open the workflow in Alteryx Designer and select Schedule Workflow on the Options menu (note that to be able to schedule workflows you must have an Alteryx Desktop Automation license).

Configure the schedule as shown in the following window and click on the OK button:

Note: If your scheduler controller is on a different machine from the one where you develop and publish the schedule, you will need to reconfigure the Controller drop-down accordingly.

To check that the schedule has been properly configured and published, open View Schedules from the Options menu. It should show your workflow in the Workflow Name list, and its State should be Active.

After the schedule has been configured, it should execute the workflow every minute. To check this, wait a few minutes and then look in the Results tab of the View Schedules window to confirm that each run finished successfully (the Result column should show green dot markers in all rows).

Creating the streaming exchange rates dashboard

We can now create the near real-time exchange rate dashboard. To do this, after logging on to the Power BI service, create a new dashboard.

After creating the new dashboard, we will add a tile displaying the data for each currency, one per streaming dataset. To do this, click on + Add tile, select CUSTOM STREAMING DATA under REAL-TIME DATA as the source, and click the Next button:

Now select the data set for the tile by selecting it from the list of available streaming datasets:

Change the Visualisation Type to Line chart, add Time as the Axis field and ExchangeRate as the Values field, and click the Next button:

Finally, on the next window fill in the tile Title and click the Apply button. After repeating these steps for each of the different currencies, we end up with a dashboard that looks like this:

And that’s it! You have an up-to-the-minute live dashboard that automatically updates as it receives new data streams.

As further improvements, a database could be built to store all the historical exchange rate data, enabling historical analysis over time as well as providing a source for predictive analytics to detect currency patterns and trends. The Alteryx workflow could also easily be transformed into a macro tool that receives any number of distinct currencies and dates as inputs and returns their respective currency rate data as an output, making it a dynamic, broad-purpose tool.

If you have any questions or comments, please feel free to contact us.
