Mad for Data - BI Consultant
Tuesday, July 19, 2022
Friday, July 17, 2020
Aggregation + Composite Model in Power BI
A composite model typically combines huge tables accessed through DirectQuery with smaller tables in import mode. With huge tables, performance is always a challenge. In this session we will look at how aggregations can help speed up performance: you will see how gigabytes of data can be analyzed and used in less than a second. Learn many tips, tricks, and insights into aggregations in Power BI and how to use them in a composite model.
Saturday, April 25, 2020
Big Data: Traffic monitoring system with Azure architectures
This is my video, in Italian, on a traffic monitoring system built with an Azure architecture.
Tuesday, December 24, 2019
Change the dataset of a Power BI report using PowerShell
# Requires the PowerBIPS module
Import-Module PowerBIPS
# Get an access token for the Power BI REST API
$authToken = Get-PBIAuthToken
# Set the current workspace (group) by name
Set-PBIGroup -authToken $authToken -name "test" -Verbose
# Rebind the reports from the source dataset to the target dataset
Set-PBIReportsDataset -authToken $authToken -sourceDatasetName "xxx" -targetDatasetName "xxx" -Verbose
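If the PowerBIPS module is not available, a similar rebind can be sketched against the raw Power BI REST API ("Rebind Report In Group"). This is a minimal sketch, not the method used above: $accessToken and the group/report/dataset ids are placeholders you would supply yourself.

```powershell
# Sketch: rebind a report to another dataset via the Power BI REST API.
# Assumes $accessToken already holds a valid access token and that the
# <group id>, <report id>, and <target dataset id> placeholders are filled in.
$headers = @{ Authorization = "Bearer $accessToken" }
$body    = @{ datasetId = "<target dataset id>" } | ConvertTo-Json
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/<group id>/reports/<report id>/Rebind" `
    -Headers $headers -ContentType "application/json" -Body $body
```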
Wednesday, November 13, 2019
New AI Functions in Power Query
AI functions are now available in Power Query. When you are in the query editor, you will see that there are three types of AI transformations available:
Text Analytics
Vision analytics
Azure Machine Learning models
Additionally, for any of these options you can pick which Premium capacity to run it on, because these features are exclusive to Premium. While you are in Power BI Desktop you pick the specific capacity to reference; then, once you publish your reports to a Premium workspace, the Power BI service will automatically run them in the Premium capacity associated with that workspace.
Thursday, August 1, 2019
Upload data to a Power BI real-time dataset
This blog post will show you how to load data into a Power BI real-time dataset using PowerShell.
The following code loads data into a Power BI real-time dataset from a table in the AdventureWorks database:
#Variables - details of the connection, stored procedure and parameters
$connectionString = "server=localhost;database='Adventureworksdw2017';trusted_connection=true;";
$storedProcedureCall = "[your stored procedure]";
#SQL Connection - connection to SQL server
$sqlConnection = new-object System.Data.SqlClient.SqlConnection;
$sqlConnection.ConnectionString = $connectionString;
#SQL Command - set up the SQL call
$sqlCommand = New-Object System.Data.SqlClient.SqlCommand;
$sqlCommand.Connection = $sqlConnection;
$sqlCommand.CommandText = $storedProcedureCall;
#SQL Adapter - get the results using the SQL Command
$sqlAdapter = new-object System.Data.SqlClient.SqlDataAdapter
$sqlAdapter.SelectCommand = $sqlCommand
$dataSet = new-object System.Data.DataSet
$recordCount = $sqlAdapter.Fill($dataSet)
#Close SQL Connection
$sqlConnection.Close();
#Get single table from dataset
$data = $dataSet.Tables[0]
$endpoint = "[your dataset url]"
#Loop through each row of data and push it to the real-time dataset endpoint
foreach($row in $data.Rows)
{
$payload = @{
"ProductKey" =$row.Item("ProductKey")
"SalesAmount" =$row.Item("SalesAmount")
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))
write-Host $row.Item("ProductKey")
}
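For reference, the streaming endpoint accepts a JSON array of row objects, so the ConvertTo-Json call above produces a request body of roughly this shape (the values below are illustrative, not taken from the real table):

```json
[
    {
        "ProductKey": 310,
        "SalesAmount": 3578.27
    }
]
```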
The script is available at this link.
Thursday, July 18, 2019
Wrangling Data Flow
Organizations need data preparation/wrangling for accurate analysis of data that is ever more complex and continues to grow every day. Data preparation is also required so that organizations can use the data effectively in various business processes and reduce the time to value.
Wrangling Data Flow in Azure Data Factory lets you do code-free data preparation/wrangling at cloud scale, iteratively. It integrates with Power Query Online and makes best-in-class Power Query M functions available for data wrangling at cloud scale via Spark execution.
Wrangling Data Flow translates the M generated by the Power Query Online Mashup Editor into Spark code for cloud-scale execution and provides a best-in-class monitoring experience.
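As a small illustration (the table and column names are made up, not from any real dataset), the kind of Power Query M that a wrangling data flow would translate into Spark looks like this:

```
let
    // Keep only the columns needed downstream, then filter out non-positive sales
    Source = Sales,
    KeptColumns = Table.SelectColumns(Source, {"ProductKey", "SalesAmount"}),
    PositiveSales = Table.SelectRows(KeptColumns, each [SalesAmount] > 0)
in
    PositiveSales
```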