Tracking Power BI Report Access Across All Workspaces Using ADF and the Power BI REST API
One of the most common requests I receive as a data engineer is:
“Can you get me a list of people who have accessed our Power BI reports?”
At first, this sounded simple enough. Power BI’s built-in Usage Metrics report gives you user and report access information right out of the box.
But, as expected, the requirements quickly evolved — the business wanted usage tracking across all workspaces and reports in the organization.
That’s when things got interesting.
The Challenge
The out-of-the-box Power BI usage analytics are scoped to individual reports and workspaces. Expanding this to every report across the tenant is not something Power BI natively supports — at least not easily.
So, I started looking for alternatives.
The Microsoft Purview Approach (and Its Limitation)
After some exploration, I found that Microsoft Purview (via compliance.microsoft.com) captures audit events such as "Viewed Report".
At first, this seemed perfect — it gave me all report access activity across the tenant for a selected time period. I thought:
“Yay… problem solved!”
But there was a catch.
- Purview retains audit logs for only the past 90 days.
- Extracting the data is manual and time-consuming.
So, while it provided visibility, it wasn’t a sustainable or scalable solution.
The Breakthrough: Power BI REST API
That’s when I stumbled upon the Power BI Admin REST API — a hidden gem for automation enthusiasts.
The endpoint I used was:
https://api.powerbi.com/v1.0/myorg/admin/activityevents
This API allows you to query Power BI audit logs programmatically for specific time windows.
Exactly what I needed to build a long-term, automated solution.
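To make the API's behavior concrete, here is a minimal Python sketch of how a client would query it. The URL format comes from the endpoint above; the pagination via `continuationUri` matches the Get Activity Events response shape, and `get_json` is a hypothetical callable (e.g. a wrapper around an HTTP GET with a bearer token) so the logic can be shown without network details.

```python
from datetime import date

# Base endpoint from this article.
BASE_URL = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def build_activity_events_url(day: date) -> str:
    """Build the URL for one UTC day's worth of activity events.

    startDateTime and endDateTime must fall within the same UTC day,
    so the API is queried one day per call.
    """
    d = day.strftime("%Y-%m-%d")
    return (
        f"{BASE_URL}?startDateTime='{d}T00:00:00.000Z'"
        f"&endDateTime='{d}T23:59:59.999Z'"
    )

def fetch_day(day: date, get_json) -> list:
    """Fetch all events for one day, following continuation pages.

    `get_json` maps a URL to the decoded JSON response; each page holds
    its events in `activityEventEntities` and, if more pages remain,
    the next URL in `continuationUri`.
    """
    events, url = [], build_activity_events_url(day)
    while url:
        page = get_json(url)
        events.extend(page.get("activityEventEntities", []))
        url = page.get("continuationUri")
    return events
```

ADF's Copy activity handles this same pagination for you via pagination rules; the sketch just shows what happens under the hood.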
And since I love working with Azure Data Factory (ADF), that became my orchestration tool of choice.
Building the Automated Solution in Azure Data Factory
Here’s how I automated Power BI audit log extraction into Azure SQL Database.
Step 1: App Registration
First, register an application in Azure AD and grant it the Power BI Service API permission (typically Tenant.Read.All for the read-only admin APIs).
This must be an Application-level permission (not Delegated).
Then authenticate this app in ADF using the Service Principal (Client ID, Secret, and Tenant ID).
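Behind the scenes, that service principal authentication is a standard OAuth2 client-credentials exchange against Azure AD. This hypothetical helper shows the token request ADF issues on your behalf; the tenant/client values are placeholders, while the scope is the standard Power BI resource scope.

```python
def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the Azure AD v2.0 client-credentials token request for the
    Power BI API. Returns (token_url, form_body) ready for an HTTP POST."""
    token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Power BI's resource scope for app-only (service principal) tokens.
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }
    return token_url, body
```

In ADF you never build this request yourself; you supply the Client ID, Secret, and Tenant ID in the linked service and it acquires the token for you.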
Step 2: Configure the REST API Source
In Azure Data Factory:
- Create a REST linked service to connect to the Power BI API.
- Create a dataset on that linked service for the admin API.
- In the dataset’s relative URL, enter the following expression:

activityevents?startDateTime='@{formatDateTime(getPastTime(item(), 'Day'), 'yyyy-MM-dd')}T00:00:00.000Z'&endDateTime='@{formatDateTime(getPastTime(item(), 'Day'), 'yyyy-MM-dd')}T23:59:59.999Z'
This dynamically builds the start and end timestamps for each day’s extraction.
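If the ADF expression syntax is unfamiliar, this Python equivalent may help: `getPastTime(n, 'Day')` subtracts n days from the current UTC time, `formatDateTime(..., 'yyyy-MM-dd')` renders the date part, and `item()` is the iteration value supplied by the ForEach loop set up in Step 4 (the `now` parameter is injected here only to keep the sketch deterministic).

```python
from datetime import datetime, timedelta, timezone

def relative_url_for(days_back: int, now: datetime) -> str:
    """Python equivalent of the ADF relative-URL expression:
    compute the date `days_back` days before `now` (getPastTime),
    format it as yyyy-MM-dd (formatDateTime), and wrap it in a
    full-day start/end window."""
    d = (now - timedelta(days=days_back)).strftime("%Y-%m-%d")
    return (
        f"activityevents?startDateTime='{d}T00:00:00.000Z'"
        f"&endDateTime='{d}T23:59:59.999Z'"
    )
```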

Step 3: Configure the Destination (SQL)
Create a SQL linked service and dataset pointing to your Azure SQL Database — this will store the audit logs retrieved from Power BI.

Step 4: Set Up the Copy Activity
Now, wrap your Copy Data activity in a ForEach activity:
- Add a pipeline parameter to define how many days of data you want to extract.
- In the ForEach activity’s “Items” property, use this expression:
@range(1, pipeline().parameters.Daystoextract)
This allows the pipeline to loop through a range of days, calling the API once per day.
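The loop above can be sketched in Python to show exactly what each iteration sees: ADF's `@range(1, N)` yields N integers starting at 1, and each value feeds `item()` in the Step 2 expression to pick one past day (as before, `now` is injected only to keep the example deterministic).

```python
from datetime import datetime, timedelta, timezone

def iteration_windows(days_to_extract: int, now: datetime) -> list:
    """Python equivalent of the ForEach over @range(1, N):
    returns (item_value, query_date) pairs, one per pipeline iteration."""
    windows = []
    for item in range(1, days_to_extract + 1):  # @range(1, N) -> 1..N
        day = (now - timedelta(days=item)).strftime("%Y-%m-%d")
        windows.append((item, day))
    return windows
```

So with Daystoextract = 7, the pipeline issues seven API calls, one for each of the previous seven days.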


Finally, map your columns between the REST source and SQL destination.

Step 5: Run and Schedule
Run your pipeline manually for a few test days to validate data flow.
Once verified, schedule it to run daily (e.g., using a trigger).
This ensures that you continuously capture and append audit data into SQL — effectively creating your own Power BI audit warehouse.
The Outcome
With this automated setup:
- You have a complete and permanent record of Power BI report views.
- Data is stored securely in Azure SQL for as long as you need.
- You can even build a Power BI dashboard on top of this SQL table for up-to-date usage insights.
No more 90-day limit. No more manual exports. Everything runs automatically.