Ever wanted to export all your Power BI reports from a workspace and save a copy, without manually going to each one and selecting the “Download report” option?
My team at DevScope just updated the PowerBIPS PowerShell module with a simple cmdlet to do just that:
Export-PBIReport -destinationFolder "<local folder>"
With an easy PowerShell script you can download all the reports from your Power BI workspaces (for backups, for example):
# Get the Auth Token
$authToken = Get-PBIAuthToken
# Define the Workspace you want to download the reports from - Optional, by default downloads from personal workspace
Set-PBIGroup -authToken $authToken -name "Demos - PBIFromTrenches" -Verbose
# Downloads the reports to a destination folder
Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports" -Verbose
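If you want to back up every workspace in one go, here is a hedged sketch, assuming the module’s Get-PBIGroup cmdlet returns one object per workspace with a “name” property (check the module help before relying on it):

# Sketch: loop over all workspaces and export each one's reports
# (assumes Get-PBIGroup returns objects with a 'name' property)
$authToken = Get-PBIAuthToken

foreach ($group in (Get-PBIGroup -authToken $authToken))
{
    # Switch the module context to this workspace
    Set-PBIGroup -authToken $authToken -name $group.name

    # Export its reports into a per-workspace subfolder
    Export-PBIReport -authToken $authToken -destinationFolder "C:\Temp\PBIReports\$($group.name)" -Verbose
}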
Recently we had the need to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics, for example to see exactly which queries cause high QPU or memory usage, and who ran them from which application.
Currently Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database:
But there’s no easy way to export or save that data for further analysis; you can only watch the live data, and it’s not very user-friendly:
We tried the good old ASTrace, but it isn’t compatible with Azure Analysis Services, and it’s not a very good practice anyway because it basically creates a Profiler session, a feature that will be deprecated soon.
Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “Azure-AS-Tracer” that allows you to point at an Analysis Services instance, instantly start collecting the events you want, and store them in the file system in JSONL format.
You can download it or contribute to it on GitHub: https://github.com/DevScope/Azure-AS-Tracer
It’s very simple to use: just download the binaries and change the following parameters in the ‘AzureASTrace.exe.config’ file:
- The connection string to the Analysis Services instance you want to monitor
- The path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
- The path to the output folder that will hold the JSONL files
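As a sketch, the relevant appSettings section might look like this (the key names here are illustrative assumptions; check the repository’s README for the real ones):

<!-- Illustrative sketch only: the key names are assumptions, see the repo README -->
<appSettings>
  <add key="AnalysisServices.ConnectionString"
       value="Provider=MSOLAP;Data Source=asazure://westeurope.asazure.windows.net/myserver;..." />
  <add key="TraceTemplate.FilePath" value=".\Templates\Default.xml" />
  <add key="Output.FolderPath" value=".\Output" />
</appSettings>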
After that you have two options:
- Run AzureASTrace as a console application, by simply executing AzureASTrace.exe
- Run AzureASTrace as a Windows service, by running ‘setup.install.bat’ and then starting the service
Either way, while it is running the events are saved to the output folder; AzureASTrace creates a file for every subscribed event type and groups the files by day:
To analyze those events you can use this Power BI Desktop template:
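If you prefer to script the analysis, here is a minimal PowerShell sketch that lists the slowest queries in one day’s file (the file name and the Duration/TextData/NTUserName property names are assumptions based on typical trace payloads; adjust them to the actual files):

# Minimal sketch - the file name and property names are assumptions
Get-Content "C:\AzureASTrace\Output\QueryEnd\QueryEnd-2018-01-15.jsonl" |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Sort-Object Duration -Descending |
    Select-Object -First 10 NTUserName, Duration, TextData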
Have you ever faced a scenario where you needed to load a collection of CSV/text files into SQL Server tables?
What solution did you choose?
- TSQL BULK INSERT?
- SSIS package (generated from SSMS Tasks -> Import Data, or built manually)?
- PowerShell “Import-Csv”?
And what if the SQL Server destination tables must be typed (numeric, date, text columns, …), the CSV file has formatting issues (e.g. text columns without quotes, datetimes not in ISO format), and you need to transform the columns into the desired types?
A much quicker way to transform CSV files into the desired shape is a Power BI Desktop query (or Power Query); for example, in seconds I can (see the M sketch after this list):
- Load the CSV
- Replace a value in all the columns (in this case, the literal text “NULL” with a real null)
- Auto-detect the data types
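In Power Query terms, those steps boil down to something like this M sketch (the file path, delimiter and typed column names are placeholders):

// M sketch - path, delimiter and column names are placeholders
let
    Source = Csv.Document(File.Contents("C:\Temp\sample.csv"), [Delimiter = ",", Encoding = 65001]),
    Headers = Table.PromoteHeaders(Source),
    // Replace the literal text "NULL" with a real null in every column
    NoNulls = Table.ReplaceValue(Headers, "NULL", null, Replacer.ReplaceValue, Table.ColumnNames(Headers)),
    Typed = Table.TransformColumnTypes(NoNulls, {{"Id", Int64.Type}, {"CreatedOn", type datetime}, {"Amount", type number}})
in
    Typed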
Now, loading these queries into a SQL Server database is very easy thanks to the DevScope PowerShell module “PowerBIETL” (also available in the PowerShell Gallery):
Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" -sqlConnStr "Data Source=.\SQL2014; Initial Catalog=DestinationDB; Integrated Security=SSPI" -sqlSchema "stg" -verbose
The cmdlet “Export-PBIDesktopToSQL” takes care of:
- Connecting to PBI Desktop and reading the tables
- Automatically creating the tables in the SQL database (if they do not exist), thanks to the DevScope “SQLHelper” PowerShell module and its “Invoke-SQLBulkCopy” cmdlet
- Bulk copying the data from PBI Desktop into the SQL tables
The cmdlet has 4 parameters:
- -pbiDesktopWindowName (mandatory) - a wildcard to find the Power BI Desktop window
- -tables (optional, defaults to all the tables) - an array of table names to import (see the example after this list)
- -sqlConnStr (mandatory) - the connection string to a SQL Server database
- -sqlSchema (optional, defaults to “dbo”) - the schema under which the tables will be created
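For instance, to import only a couple of tables into the “stg” schema (a hypothetical example; the table names are placeholders):

# Hypothetical example - the table names are placeholders
Export-PBIDesktopToSQL -pbiDesktopWindowName "*sample*" `
    -tables @("Customers", "Sales") `
    -sqlConnStr "Data Source=.\SQL2014;Initial Catalog=DestinationDB;Integrated Security=SSPI" `
    -sqlSchema "stg" -Verbose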
If you run it with the defaults, all the tables from the PBI Desktop file get copied into the SQL Server database:
Of course this only works for “one-time-only” or manual scenarios, but I assure you it is much quicker than building a SQL Server Integration Services package.
In this post I will show you how to analyse Power BI Desktop diagnostic trace files in a more visual way than Notepad.
First you need to collect some diagnostics by enabling tracing in Power BI Desktop: go to File –> Options –> Diagnostics –> Enable Tracing
If you click on “Open Traces folder”:
It will open the trace folder with all the trace logs:
PS – Trace logs are only generated after you exercise your Power BI report, so do some refreshes and interactions first to create them.
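If you prefer PowerShell, here is a quick sketch to list the newest logs (the Traces path below is the usual location and may differ on your machine):

# Sketch - the Traces location may differ depending on the install
$traceFolder = Join-Path $env:LOCALAPPDATA "Microsoft\Power BI Desktop\Traces"
Get-ChildItem $traceFolder -Recurse -Filter *.log |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 10 FullName, LastWriteTime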
Now, to analyse these logs you could of course open them in Notepad:
But that’s not very easy to read, so what better way to process and visualize this huge amount of text data than... Power BI, of course!
So I created a Power BI Desktop file to process and visualize the trace logs, which will allow you to quickly visualize things like:
- Duration of queries
- Performance issues
Usage instructions:
- Download and open the Power BI Desktop file
- “Edit Queries” and change the variable “VAR_LogFolder” to point to the trace logs folder:
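For reference, “VAR_LogFolder” is just a query holding a text value with the folder path; inside the Query Editor it looks something like this sketch (the exact path on your machine will differ):

// Sketch of the VAR_LogFolder query - replace with your own Traces folder
"C:\Users\<user>\AppData\Local\Microsoft\Power BI Desktop\Traces"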