JOPX on Microsoft Business Applications and Azure Cloud
Occasional rantings about Dynamics CRM/365, Power BI and Azure cloud. Taking the first small steps in machine learning, Python and algorithmic trading
Thursday, February 22, 2024
Classic Azure Application Insights deprecated on February 29th 2024 - 7 days to go
In case you missed it - classic Azure Application Insights will be deprecated on February 29th 2024. Even if you missed the notification e-mails, the warning is clearly visible when you navigate to an Azure Application Insights resource in the Azure Portal.
Migration is actually quite easy - you just click on the link provided and this will open up the menu depicted below which allows you to associate your Azure Application Insights resource to a Log Analytics Workspace. The good news is that there are no pricing changes when moving to the workspace-based model.
As indicated in the migration window, this is a one-way operation, so plan for it in advance - the points below might impact how you approach the migration:
- You can link multiple Application Insights resources to a single Log Analytics workspace or keep them separate - in most cases you will want to consolidate.
- Instrumentation keys do not change during the migration, so you don't need to worry about this
- The export feature is not available on the Application Insights workspace-based resources - you need to look at diagnostic settings for exporting telemetry
- There might be some schema changes - important to consider when doing KQL queries - check out query data across Log Analytics workspaces, applications and resources in Azure Monitor
- Existing log data will not immediately move to the Log Analytics workspace - only new logs generated after the migration will be stored in the new log location.
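If you prefer scripting over clicking through the portal, the same association can be done with the Azure CLI. This is a sketch: the resource names below are placeholders, and it assumes the application-insights CLI extension is available.

```shell
# Requires the Azure CLI application-insights extension
az extension add --name application-insights

# Associate a classic Application Insights resource with a Log Analytics
# workspace (one-way operation, same as the portal migration flow).
# All names below are placeholders - substitute your own.
az monitor app-insights component update \
  --app my-appinsights \
  --resource-group my-rg \
  --workspace "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.OperationalInsights/workspaces/my-workspace"
```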
Sunday, January 14, 2024
Procreate 5 manual - video overview
List of supporting videos for the book Procreate 5 Handleiding - tekenen op de iPad (Stefan de Groot, 2020)
Tuesday, December 26, 2023
Running SSIS packages in Azure Data Factory - scaling and monitoring
Lifting and shifting SSIS packages to Azure Data Factory (ADF) can provide several benefits. By moving your on-premises SSIS workloads to Azure, you can reduce operational costs and the burden of managing infrastructure that you have when you run SSIS on-premises or on Azure virtual machines.
You can also improve availability by specifying multiple nodes per cluster, on top of the high availability features of Azure and Azure SQL Database. You can also increase scalability by specifying multiple cores per node (scale up) and multiple nodes per cluster (scale out) - see Lift and shift SQL Server Integration Services workloads to the cloud.
To lift and shift SSIS packages to ADF, you use the Azure-SSIS Integration Runtime (IR), a cluster of virtual machines dedicated to executing SSIS packages. You can define the number of cores and compute capacity during the initial configuration (see Lift and shift SSIS packages using Azure Data Factory on SQLHack).
Even though there is a Microsoft article which explains how to Configure the Azure-SSIS integration runtime for high performance, there is little guidance on how to run it at the lowest possible cost while still completing the jobs in time. So should you choose a larger size on a single node, or a smaller size on multiple nodes? Based on experience, it seems perfectly possible to run most jobs on a single node, and up until now we have been running all of them on a D4_v3 Standard (4 cores, 16 GB). If you decide to run on a lower configuration, I would recommend monitoring failures, capacity usage and throughput (see Monitor integration runtime in Azure Data Factory for more details).
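To reason about the single-node-versus-multi-node trade-off, a quick back-of-the-envelope calculation helps. The sketch below compares the monthly cost of two hypothetical configurations; the hourly rates are made-up placeholders (check the Azure pricing page for real numbers) - only the arithmetic is the point.

```python
# Back-of-the-envelope cost comparison for Azure-SSIS IR sizing.
# Hourly rates below are HYPOTHETICAL placeholders, not real Azure prices.
def monthly_cost(hourly_rate: float, nodes: int, hours_per_day: float, days: int = 30) -> float:
    """Cost of running `nodes` nodes for `hours_per_day` hours a day."""
    return hourly_rate * nodes * hours_per_day * days

# Placeholder rates per node per hour (illustrative only)
rates = {"D2_v3": 0.25, "D4_v3": 0.50}

# Scenario: ETL window of 4 hours per day, IR stopped outside that window
single_d4 = monthly_cost(rates["D4_v3"], nodes=1, hours_per_day=4)
dual_d2 = monthly_cost(rates["D2_v3"], nodes=2, hours_per_day=4)

print(f"1 x D4_v3: {single_d4:.2f}")  # 60.00 with these placeholder rates
print(f"2 x D2_v3: {dual_d2:.2f}")    # 60.00 - same raw cost, different parallelism
```

With roughly linear pricing the raw cost of scaling up versus scaling out is similar, so the decision hinges on whether your packages actually parallelize across nodes; in practice, stopping the IR outside the ETL window usually saves far more than node sizing does.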
Reference:
- Configure the Azure-SSIS integration runtime for high performance
- Comparing SSIS and Azure Data Factory
Wednesday, November 29, 2023
Dynamics 365 and Power Platform monthly reading list November 2023
2023 Release Wave 2
- Dynamics 365 Customer Service: new dialer experience (preview) by @D365Goddess
- New sales capabilities in 2023 release wave 2 - helping sellers work smarter
- Dynamics 365 CRM Wave 2 Release: enhance categorization of knowledge search articles
- Configure sales Copilot for Dynamics 365 Sales (Preview) by @DianaBirkelbach
Technical topics (Configuration, customization and extensibility)
- Dataverse - use bulk operation messages (MS Learn) - CreateMultiple and UpdateMultiple are now GA! Read the small print to see when to use it
- Dataverse let's try elastic tables (preview) - by @temmy_raharjo
- Ways to deal with missing Power Platform environments by @inogic
- Edit subgrids side by side with Power Apps Grid or editable grid by @DianaBirkelbach
- Announcing monthly channel for model-driven apps
- Announcing general availability of custom connectors in solutions as well as environment variable secrets
- Announcing SharePoint Embedded Public Preview at ESPC23 - will be interesting to explore whether it is possible to combine this with Dynamics 365 data as an alternative for Power Pages. Head over to http://aka.ms/start-spe/ to start building your first SharePoint Embedded app.
- Dataverse + Azure Service Bus queue + Azure function for processing long operations
- Workflow automation in Dynamics 365 CRM: triggering actions on email send and receive events by @inogic
- August 2023 updates for modernization and theming in Power Apps
- Connecting to Dataverse from Function App using Managed Identity - using azd
- Power BI - Mastering sales calculations: a comprehensive guide to departmental analysis
- How to enable the enhanced email template editor in model-driven apps (Dynamics 365/Dataverse)
Copilots, AI and machine learning
- Write an email with Copilot in Dynamics 365 Customer Service by @nishantranacrm
- Getting your enterprise ready for Microsoft 365 Copilot (Ignite 2023 recording)
- Learn live: prepare, implement and secure Microsoft 365 Copilot (Ignite 2023 recording)
- New study validates the business value and opportunity of AI
- Architecture and deployment diagrams for Microsoft 365 Copilot
- Analyse the impact of AI-enhanced customer service with Copilot analytics
- What is prompt engineering? (McKinsey)
- Anti-hype LLM reading list
- Microsoft Chat Copilot vs Azure ChatGPT - which generative AI capability to choose for the enterprise
- Measuring the productivity impact of Generative AI (NBER digest)
- The power of prompting (Microsoft Research)
- Get started with Microsoft Copilot Studio: how to create your first Copilot by @lisacrosbie
- Use Copilot to summarize cases - Dynamics 365 Customer Service
Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users
- Microsoft to make Dynamics 365 Marketing a part of D365 Customer Insights
- Customer Service in the age of AI
- Refresh the sales experience with Dynamics 365 Sales modern update
Sunday, November 26, 2023
Implementing Azure Synapse Link for Dataverse: gotchas and tips
Azure Synapse Link for Dataverse allows you to easily export data from a Dataverse (or Dynamics 365) instance to Azure Data Lake Storage Gen2 (ADLS) and/or Azure Synapse. Azure Synapse Link for Dataverse provides a continuous replication of standard and custom entities/tables to Azure Synapse and Azure Data Lake.
I highly recommend viewing the awesome YouTube playlist Azure Synapse Link and Dataverse - better together from Scott Sewell (@Scottsewell) as an introduction.
1. Check the region of your Dataverse/Dynamics 365 instance
The configuration of Azure Synapse Link for Dataverse is done through the Power Platform maker portal but before you can get started you should first setup Azure Data Lake Storage Gen2 and Azure Synapse in your Azure subscription.
It is, however, best to first check in the configuration screen in which region your instance is located, since the storage account and Synapse workspace must be created in the same region as the Power Apps environment for which you want to enable Azure Synapse Link. From the PPAC user interface it is currently not possible to create a Dataverse/Dynamics 365 instance in a specific region, but this is possible with PowerShell - see Creating a Dataverse instance in a specific Azure region using Power Apps Admin PowerShell module.
If you need to move a Dataverse or Dynamics 365 instance to a different Azure region, you can open a Microsoft support ticket. Based on recent experience, this specific type of support request is handled fairly quickly (within 1-2 business days).
Azure Data Lake Storage is a set of capabilities built on Azure Blob Storage. When you create a storage account and check the "enable hierarchical namespace" checkbox on the advanced tab, you get an Azure Data Lake Storage Gen2 account.
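As a sketch, creating such an account with the Azure CLI only requires enabling the hierarchical namespace flag; the names and region below are placeholders.

```shell
# Create a storage account with hierarchical namespace enabled,
# which makes it an Azure Data Lake Storage Gen2 account.
# Account name, resource group and region are placeholders.
az storage account create \
  --name mydatalakegen2 \
  --resource-group my-rg \
  --location westeurope \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true
```

Remember that the region (`--location`) must match the region of the Power Apps environment, as noted above.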
2. Make sure all prerequisites are in place before enabling Azure Synapse Link
Definitely make sure that all security configuration steps outlined in Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace (Microsoft docs) are correctly set up. The exception messages shown in the Azure Synapse Link configuration pages aren't always very helpful.
3. Azure Synapse Link for Dataverse is a Lake Database
In the documentation from Microsoft (Understand lake database concepts) a lake database is defined as:
A lake database provides a relational metadata layer over one or more files in a data lake. You can create a lake database that includes definitions for tables, including column names and data types as well as relationships between primary and foreign key columns. The tables reference files in the data lake, enabling you to apply relational semantics to working with the data and querying it using SQL. However, the storage of the data files is decoupled from the database schema; enabling more flexibility than a relational database system typically offers.
The data is stored in ADLS Gen2 in accordance with the Common Data Model (CDM) - the folders conform to well-defined and standardized metadata structures (mapped 1:1 with Dataverse tables/entities). At the root you will find a metadata file (called model.json) which contains semantic information about all of the entity/table records, attributes and relationships between the tables/entities.
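To get a feel for what the sync engine writes, the snippet below parses a simplified, hand-made model.json and lists the entities and their attributes. The real file follows the same top-level structure (name, entities, attributes) but contains considerably more metadata, such as partition information.

```python
import json

# A trimmed-down, hand-made example of the CDM model.json that Azure
# Synapse Link writes to the root of the container (real files contain
# additional metadata such as partitions and annotations).
model_json = """
{
  "name": "cdm",
  "version": "1.0",
  "entities": [
    {
      "$type": "LocalEntity",
      "name": "account",
      "attributes": [
        {"name": "accountid", "dataType": "guid"},
        {"name": "name", "dataType": "string"}
      ]
    }
  ]
}
"""

model = json.loads(model_json)
for entity in model["entities"]:
    columns = [a["name"] for a in entity["attributes"]]
    print(f"{entity['name']}: {', '.join(columns)}")
# prints: account: accountid, name
```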
The way the files are being written depends on the Azure Synapse Link for Dataverse configuration - both the partitioning mode and in place vs append only mode can be configured - see Advanced Configuration Options in Azure Synapse Link
4. Synapse Link for Dataverse uses passthrough authentication using ACLs in Azure Data Lake - no support for SQL authentication
Since all the data for the tables in Azure Synapse Link for Dataverse is stored as CSV files in Azure Data Lake Storage, security needs to be set at the level of the files in Azure Data Lake Storage Gen2. There is no support for SQL authentication in the Lake DB which is created by Azure Synapse Link for Dataverse.
References:
- Use the Common Data Model to optimize Azure Data Lake Storage Gen2 (Microsoft docs)
- Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace (Microsoft docs)
- Azure Synapse Link for Dataverse - understanding advanced configuration settings (Microsoft docs)
- Do more with data - from Data Export Service to Azure Synapse Link for Dataverse (Microsoft docs)
- Use Power BI to analyze the CDS data in Azure Data Lake Storage Gen2
- Synapse Link for Dataverse - Option Sets
- Dataverse metadata in Synapse
- Setting up Azure Synapse Link for Dynamics 365/Dataverse
- Power BI modeling guidance for Power Platform
- Azure Synapse Serverless SQL Pools cheat sheet
- Controlling cost with Azure Synapse Serverless Pools
Wednesday, November 22, 2023
Near real-time and snapshots in Azure Synapse Link for Dataverse
The Azure Synapse Link for Dataverse documentation contains a section about Access near real-time data and read-only snapshot data but it does not really explain why you want to use one or the other.
When you open an Azure Synapse SQL Serverless Lake DB in SQL Server Management Studio, you see a clear distinction between the two versions of the table data, whereas in Azure Synapse Studio there is no obvious distinction besides the name: you will see both the "account" table and the "account_partitioned" view:
- Near real-time data: external tables on top of the underlying CSV files exported by the Azure Synapse Link for Dataverse sync engine. There is a soft SLA for the data to be present in these tables within 15 minutes.
- Snapshot data/partitioned views: views on top of the near-real time data which are updated on an hourly interval.
In most scenarios, it is best to run queries against these partitioned views, since you will avoid read conflicts and you are sure that a full transaction has been written to the CSV files in Azure Data Lake Storage.
A typical exception that you might receive when querying directly against the "tables" is "https://[datalakestoragegen2name].dfs.core.windows.net/[lakedbname]/[tablename]/Snapshot/2023-05_1684234580/2023-05.csv does not exist or you don't have file access rights", but this also depends on your specific context. If you have a lot of creates, updates or deletes on Dataverse tables, this will happen more regularly. Even though the partitioned views are updated on an hourly basis, it can also happen that the Synapse Link engine is refreshing the views at the exact moment you perform a query, which gives a similar exception, but the chances of this occurring are smaller.
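Since these read conflicts are transient, a simple retry with backoff around the query usually suffices. The sketch below is generic Python: the `run_query` callable is a stand-in for whatever client you use against the serverless SQL endpoint (e.g. pyodbc), and the flaky function at the bottom only simulates the transient error for illustration.

```python
import time

def query_with_retry(run_query, retries=3, backoff_seconds=2.0):
    """Retry a query a few times to ride out transient read conflicts
    while the Synapse Link engine rewrites CSV files or views."""
    for attempt in range(retries):
        try:
            return run_query()
        except Exception:  # in practice, catch your driver's error type
            if attempt == retries - 1:
                raise
            time.sleep(backoff_seconds * (attempt + 1))  # linear backoff

# Illustrative stand-in: fails twice with a transient error, then succeeds
calls = {"n": 0}
def flaky_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("file does not exist or you don't have file access rights")
    return [("account", 42)]

print(query_with_retry(flaky_query, backoff_seconds=0.01))
```

In production you would narrow the `except` clause to the specific error raised by your driver, so that genuine permission problems are not silently retried.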
You can check the last sync timestamp and sync status in the Power Platform maker portal (see screenshot below)
For the moment, you also have to manually check the monitoring page (which can be quite tedious if you have a lot of environments). There is an item in the Microsoft release planner, "Receive notifications about the state of Azure Synapse Link for Dataverse", which is apparently in public preview, but I haven't seen it yet in the environments I have access to (neither in https://make.powerapps.com nor in https://make.preview.powerapps.com/).
It is also not easy to see if something went wrong with the refresh of the partitioned views - up until now the easiest way to find out is running a SQL query against the Lake DB: select name, create_date from sys.views order by create_date desc.