Azure Analysis Services Tracer (aka AzureASTrace.exe)

Recently we had the need to analyse the queries made by users on Azure Analysis Services and to cross-reference that data with Azure AS metrics, for example to see exactly which queries cause high QPU or memory usage, and who made them from which application.

Currently Azure AS allows you to configure an Extended Events session to collect events from your Analysis Services database:

image

But there’s no easy way to export or save that data for further analysis. You can only watch live data, and the viewer is not very user friendly:

image

We tried to use the good old ASTrace, but it’s not compatible with Azure Analysis Services, and it’s not a very good practice anyway because it basically creates a Profiler session, a feature that will be deprecated soon.

Because we desperately needed to analyse the user queries to identify bottlenecks, my amazing BI team at DevScope built a great tool called “Azure-AS-Tracer” that allows you to point at an Analysis Services instance, instantly start collecting the events you want, and store them on the file system in JSONL format.

You can download it or contribute to it on GitHub: https://github.com/DevScope/Azure-AS-Tracer

It’s very simple to use: just download the binaries and change the following parameters in the configuration file ‘AzureASTrace.exe.config’:

  • ConnectionStrings/AnalysisServices – the connection string to the Analysis Services instance you want to monitor
  • AppSettings/XEventTemplateFilePath – the path to the XEvents trace template used to create the monitoring session on the Analysis Services instance
  • AppSettings/OutputFolder – the path to the output folder that will hold the JSONL files
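As a sketch, a standard .NET application configuration file carrying those three settings could look like the fragment below. The exact layout and the sample values (server URL, credentials, paths) are assumptions for illustration; check the file that ships with the binaries for the real key names:

```xml
<configuration>
  <connectionStrings>
    <!-- The Analysis Services instance to monitor (hypothetical Azure AS URL) -->
    <add name="AnalysisServices"
         connectionString="Data Source=asazure://westeurope.asazure.windows.net/myserver;User ID=user@contoso.com;Password=xxxx;" />
  </connectionStrings>
  <appSettings>
    <!-- XEvents trace template used to create the monitoring session -->
    <add key="XEventTemplateFilePath" value="C:\AzureASTrace\XEventTemplate.xml" />
    <!-- Folder that will hold the JSONL output files -->
    <add key="OutputFolder" value="C:\AzureASTrace\Output" />
  </appSettings>
</configuration>
```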

image

After that you have two options:

  • Run AzureASTracer as a console application, by simply executing AzureASTrace.exe

image

  • Run AzureASTracer as a Windows service, by running ‘setup.install.bat’ and starting the service

image

Either way, while running, the events will be saved to the output folder. AzureASTrace creates a file for every subscribed event type and groups the files by day:

image
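Because the output is plain JSONL (one JSON object per line), it’s straightforward to post-process the files with any tool you like. As a minimal sketch, the Python function below totals query duration per user across a folder of trace files; the field names (`NTUserName`, `Duration`) are assumptions about the event payload, not confirmed by the tool’s documentation:

```python
import json
from collections import defaultdict
from pathlib import Path

def summarize_trace(folder):
    """Aggregate total Duration (ms) per user across all .jsonl files in a folder.

    Field names NTUserName/Duration are hypothetical; adjust to match
    the actual xEvent payload written by AzureASTrace.
    """
    totals = defaultdict(int)
    for path in Path(folder).glob("*.jsonl"):
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue  # skip blank lines
                event = json.loads(line)
                user = event.get("NTUserName", "unknown")
                totals[user] += int(event.get("Duration", 0))
    return dict(totals)
```

The same pattern extends to grouping by application name or query text to find the heaviest queries.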

To analyze those events you can use this Power BI Desktop template:

Manage Analysis Services Tabular Partitions (OnPrem / Azure) w/ PowerShell

One of the biggest challenges in managing large Analysis Services databases is table partition maintenance. Either we do it manually – which eventually leads to an error-prone, time-consuming and boring task – or we AUTOMATE it, for example by leveraging PowerShell’s simplicity for systems/server administration. SSAS servers are no exception to this PowerShell ability, and the solution we present today is proof of that.

Describing the benefits of partitioning large tables is not in the scope of this article, so we will just leave a reference to an excellent source that does it for us: https://msdn.microsoft.com/en-us/library/hh230803.aspx#bkmk_benefits.

In order to assemble the example shown below we’ve used:

  • An Azure Analysis Services (AS) instance, though an on-prem one works just as well;
  • A Windows machine with PowerShell V3.

That is all we need to assemble our automated partition manager. Moreover, the sample scripts we are about to show depend on just one PowerShell module:

  • SSASHelper – A single lightweight PowerShell module with cmdlets to manage SSAS Instances (On-Prem. or Azure)

This is indeed a new module, developed to abstract some SSAS operations (e.g. the ones used in the scope of this blog post), hosted on GitHub: https://github.com/DevScope/ssas-powershell-modules.

Note that the examples provided refer to the module code as living in a folder alongside the scripts that use it (hence the Import-Module statements), presuming you’ve downloaded it; the SSASHelper module is also available in the PowerShell Gallery, ready to be installed, as shown on GitHub.

Time for what matters. We will leave you two scripts, the first for partition (re)creation; note that if the partition already exists and there are no alterations to its definition, everything is left untouched, data included:

# Clear the screen
cls

# On error stop the script
$ErrorActionPreference = "Stop"

# Get the current folder of the running script
$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Import helper modules
Import-Module "$currentPath\Modules\SSASHelper" -Force

# Connection strings (on prem & Azure)
# ex: "Data Source=.\sql2016tb;Initial Catalog=Adventure Works Internet Sales;"
$connStr = (Get-Content "$currentPath\SSASTestConnStr.txt" | Out-String)
 
$conn = Get-SSASConnection -connectionString $connStr -open

$ssasDatabase = "AW Internet Sales Tabular Model"

# Process only the most recent facts
2009..2015 | ForEach-Object {

    $year = $_

    if ($year -eq 2014) {
        $ssasTable = "Internet Sales"
        $partition = "Internet Sales $year"
        Invoke-SSASProcessCommand -connectionString $connStr -database $ssasDatabase -table $ssasTable -partition $partition -Verbose
    }
}

# Get all tables from the AS database
$cmdText = 'select [Name] from $SYSTEM.tmschema_tables'
$result = Invoke-SSASCommand -connectionString $connStr -commandtext $cmdText -commandType "query"

foreach ($item in $result) {

    $tableName = $item.Values

    # Process all the dimension tables (smaller tables),
    # i.e. every table not named like the fact tables
    if (!($tableName.Contains("Internet Sales"))) {
        Invoke-SSASProcessCommand -connectionString $connStr -database $ssasDatabase -table $tableName -Verbose
    }
}

Note that we are only creating partitions on the “Internet Sales” table, illustrating that this is only needed for the biggest tables.

The second script demonstrates partition, table or database processing. The example shown here leverages the SSASHelper module’s partition and table processing for large and small tables, respectively.

# Clear the screen
cls

# On error stop the script
$ErrorActionPreference = "Stop"

# Get the current folder of the running script
$currentPath = (Split-Path $MyInvocation.MyCommand.Definition -Parent)

# Import helper modules
Import-Module "$currentPath\Modules\SSASHelper" -Force

# Connection strings (on prem & Azure)
# ex: "Data Source=.\sql2016tb;Initial Catalog=Adventure Works Internet Sales;"
$connStr = (Get-Content "$currentPath\SSASTestConnStr.txt" | Out-String)
 
$conn = Get-SSASConnection -connectionString $connStr -open

$ssasDatabase = "AW Internet Sales Tabular Model"

# Process only the most recent facts
2009..2015 | ForEach-Object {

    $year = $_

    if ($year -eq 2014) {
        $ssasTable = "Internet Sales"
        $partition = "Internet Sales $year"
        Invoke-SSASProcessCommand -connectionString $connStr -database $ssasDatabase -table $ssasTable -partition $partition -Verbose
    }
}

# Get all tables from the AS database
$cmdText = 'select [Name] from $SYSTEM.tmschema_tables'
$result = Invoke-SSASCommand -connectionString $connStr -commandtext $cmdText -commandType "query"

foreach ($item in $result) {

    $tableName = $item.Values

    # Process all the dimension tables (smaller tables),
    # i.e. every table not named like the fact tables
    if (!($tableName.Contains("Internet Sales"))) {
        Invoke-SSASProcessCommand -connectionString $connStr -database $ssasDatabase -table $tableName -Verbose
    }
}

The scripts are commented and self-explanatory, so we will not describe them much further. Note just the partition query definition – with variables – in the first script, and the use of SSAS DMVs in the second, which lets us process entire tables for those that have only one partition; this way we don’t even need to know the partition names for those.

…and the result, showing the partitions created and one of them processed for the “Internet Sales” table:

image

Some example uses for these simple scripts include partition management to optimize database processing times, maintaining real-time partitions, etc.

We hope these modules/sample scripts help you achieve the same freedom! Feel free to give us your feedback.