Tuesday, May 04, 2021

Quick tip: ConfigurationManager and .NET Core

If you want to use the System.Configuration.ConfigurationManager class in your .NET Core app, you will first need to reference the System.Configuration.ConfigurationManager NuGet package, since it is not part of .NET Core due to the smaller framework footprint.
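
Once the package is referenced, the class works the same way as it did in the full .NET Framework. As a minimal sketch (the App.config keys "MySetting" and "MyDb" below are placeholder names, not from any real project):

    // Add the package first: dotnet add package System.Configuration.ConfigurationManager
    using System;
    using System.Configuration;

    class Program
    {
        static void Main()
        {
            // Reads <add key="MySetting" value="..."/> from the appSettings section of App.config
            string mySetting = ConfigurationManager.AppSettings["MySetting"];

            // Reads <add name="MyDb" connectionString="..."/> from the connectionStrings section
            string myDb = ConfigurationManager.ConnectionStrings["MyDb"]?.ConnectionString;

            Console.WriteLine($"MySetting = {mySetting}, MyDb = {myDb}");
        }
    }

At build time the App.config file is copied to the output folder as <assembly name>.dll.config, which is where ConfigurationManager picks it up in a .NET Core console application.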



Thursday, April 15, 2021

Getting started with Azure Data Factory for Dynamics 365 CRM consultants

The majority of Dynamics 365 CRM projects include some initial data load/migration activities, as customers don't start from a blank sheet. Besides the initial load of data, there might also be a need for ongoing data integrations. There are a number of solutions for data migration and data integration with Dynamics 365 CRM - this post explains why you should strongly consider Azure Data Factory as an option.

Besides the obvious option of writing code, you can also consider a third-party solution, and there are many good products on the market to meet integration needs, e.g. Tibco Scribe or the KingswaySoft SSIS Integration Toolkit for Dynamics 365. But if you are looking for something with a minimal initial cost investment (no separate licensing fee), a quick start-up time and a fully cloud-based setup (no virtual machine/server environments needed), Azure Data Factory is a great fit.

This is why - in my opinion - technical Dynamics 365 CRM consultants should at least be aware of the capabilities of Azure Data Factory and be able to set up simple data pipelines.

Azure Data Factory is a managed cloud service meant for building complex, hybrid ETL/ELT pipelines that integrate data silos and can include big data and machine learning transformations.


Azure Data Factory provides an easy-to-use drag-and-drop interface where you use activities/tasks to transform, move and copy your data across more than 100 supported data sources, both within the Microsoft stack and outside it (Salesforce, SAP, etc.). Azure Data Factory also has full support for running existing SSIS packages on its cloud infrastructure, and last but not least it has a flexible pricing model with low initial costs. As expected from an enterprise platform, it also provides great DevOps integration - something I will cover in a future blog post (always set up Git integration for Azure Data Factory - it will make the editing experience a lot easier - see the Tips & tricks section below for more details).

Below is a visualization of the different building blocks in  Azure Data Factory:

  • A data factory can have one or more pipelines, which group together the activities that define the actions to perform on your data. (See Pipelines and activities in Azure Data Factory for more details)
  • These activities use datasets as inputs and outputs. A dataset identifies the data in a specific data store. (See Datasets in Azure Data Factory for more details)
  • Before you can create a dataset you need to specify a linked service to link your data store to the data factory. The dataset represents the structure of the data within the linked data store, and the linked service defines the connection to the data source. (See Linked services in Azure Data Factory for more details)
  • Pipelines run on the compute infrastructure used by Azure Data Factory, which provides the different data integration capabilities. There are three integration runtime types (Azure, Self-hosted and Azure-SSIS), and depending on the required capabilities and network support you will need to use one or the other. (See Integration runtime in Azure Data Factory) A sketch of how these building blocks fit together follows this list.
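
To make these building blocks concrete, below is a minimal sketch based on the Azure Data Factory .NET SDK (the Microsoft.Azure.Management.DataFactory NuGet package). The resource names, connection string and folder paths are placeholders, and an authenticated credentials object is assumed to exist already - treat it as an illustration of how a linked service, a dataset, a pipeline with a copy activity and a pipeline run relate to each other rather than production-ready code.

    using System.Collections.Generic;
    using Microsoft.Azure.Management.DataFactory;
    using Microsoft.Azure.Management.DataFactory.Models;
    using Microsoft.Rest;

    class AdfBuildingBlocksSketch
    {
        // credentials: an already authenticated ServiceClientCredentials instance (e.g. a service principal token).
        public static void CreateDemoPipeline(ServiceClientCredentials credentials)
        {
            var client = new DataFactoryManagementClient(credentials) { SubscriptionId = "<subscription-id>" };

            // 1. Linked service: defines the connection to the data store (here an Azure Blob storage account).
            var storageLinkedService = new LinkedServiceResource(
                new AzureStorageLinkedService
                {
                    ConnectionString = new SecureString("DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>")
                });
            client.LinkedServices.CreateOrUpdate("<resource-group>", "<factory-name>", "SourceStorage", storageLinkedService);

            // 2. Dataset: identifies the data within the linked data store.
            var blobDataset = new DatasetResource(
                new AzureBlobDataset
                {
                    LinkedServiceName = new LinkedServiceReference { ReferenceName = "SourceStorage" },
                    FolderPath = "migration/input",
                    FileName = "accounts.csv"
                });
            client.Datasets.CreateOrUpdate("<resource-group>", "<factory-name>", "AccountsBlob", blobDataset);

            // 3. Pipeline: groups the activities - here a single copy activity reading the blob dataset.
            var pipeline = new PipelineResource
            {
                Activities = new List<Activity>
                {
                    new CopyActivity
                    {
                        Name = "CopyAccounts",
                        Inputs = new List<DatasetReference> { new DatasetReference { ReferenceName = "AccountsBlob" } },
                        // Output dataset defined like AccountsBlob but pointing at an output folder (not shown).
                        Outputs = new List<DatasetReference> { new DatasetReference { ReferenceName = "AccountsBlobCopy" } },
                        Source = new BlobSource(),
                        Sink = new BlobSink()
                    }
                }
            };
            client.Pipelines.CreateOrUpdate("<resource-group>", "<factory-name>", "DemoPipeline", pipeline);

            // 4. Trigger a run; the copy itself executes on the (Azure) integration runtime.
            var runResponse = client.Pipelines.CreateRun("<resource-group>", "<factory-name>", "DemoPipeline");
        }
    }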


Tips & tricks:

  • ADF has a connector which supports both Dynamics 365 (cloud version) as well as the legacy on-premises versions of Dynamics CRM (version 2015 and higher) - see Copy data from and to Dynamics 365 (CDS/Dataverse) or Dynamics CRM by using Azure Data Factory for more details. To use the on-premises versions of Dynamics CRM, the environment needs to be set up with Internet-facing deployment (IFD); information on connecting ADF to on-premises CRM (Dynamics 365 and Dynamics CRM on-premises with IFD) can be found on the same documentation page.
  • Set up Git integration for Azure Data Factory - by default, the Azure Data Factory user interface experience (UX) authors directly against the data factory service (live mode). Even when you are not deploying Azure Data Factory artifacts from source control, you will benefit from the better developer experience when Git integration is enabled.
  • For the copy activity, you are required to set the write behavior to Upsert when you use Dynamics 365 CRM as a destination (sink). You will need to provide an alternate key, or you can leave the alternate key blank when you use the internal ID of CRM within your mapping (see Azure Data Factory: Update an existing record using Dynamics 365 Internal Guid for more details). A sketch of such a sink configuration follows this list.
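
As an illustration of that last tip, below is a hedged sketch of what the sink side of such a copy activity could look like with the same .NET SDK; the sink properties mirror the connector's JSON settings (writeBehavior, alternateKeyName, ignoreNullValues), and the dataset and alternate key names are placeholders:

    using System.Collections.Generic;
    using Microsoft.Azure.Management.DataFactory.Models;

    // Copy activity with a Dynamics 365 / Dataverse destination (sink).
    // Upsert is the only write behavior supported by the Dynamics sink.
    var upsertAccounts = new CopyActivity
    {
        Name = "UpsertAccounts",
        Inputs = new List<DatasetReference> { new DatasetReference { ReferenceName = "AccountsBlob" } },
        // Dataset bound to a Dynamics/Dataverse linked service (not shown here).
        Outputs = new List<DatasetReference> { new DatasetReference { ReferenceName = "AccountsInDataverse" } },
        Source = new BlobSource(),
        Sink = new DynamicsSink
        {
            WriteBehavior = "Upsert",
            // Placeholder alternate key defined on the target entity; leave this out when you
            // map the internal CRM record id (GUID) in the column mapping instead.
            AlternateKeyName = "ak_accountnumber",
            IgnoreNullValues = false
        }
    };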


Monday, April 12, 2021

Dynamics 365 and Power Platform quarterly reading list Q1 2021

Technical topics (Configuration, customization and extensibility)
Topics for Dynamics 365 Business Application Platform consultants, project managers and power users

Friday, February 12, 2021

How to disable multi-factor authentication (MFA) for a Dynamics 365 trial environment

You might have noticed that if you now set up a new Dynamics 365 trial environment, users who log in are greeted with "Help us protect your account", which requires them to set up multi-factor authentication (MFA) even for accounts in trial/demo tenants. This is part of the Azure Active Directory security defaults.


To get rid of this, you need to log in to https://portal.azure.com and change the setting for the Azure Active Directory used by your demo/trial tenant. Navigate to Azure Active Directory in your subscription and select Properties in the left navigation pane. At the bottom of this screen, you will find the Access management for Azure resources section.


Click the Manage Security defaults link in this section and set Enable security defaults to No.




Wednesday, February 10, 2021

Power Platform and Dynamics 365 API request entitlements

Based on the number and the different types of licenses that you have within your Office 365 tenant, you are granted a specific number of API request entitlements in a 24-hour window. For more details, take a look at Request limits and allocations (Microsoft documentation).

It is important to keep in mind that these entitlement limits are different from the service protection limits, which are already enforced today.


It is time to start reviewing your architecture and your consumption of API requests on Power Platform and Dynamics 365, as Microsoft has announced that it will start enforcing these limits once the transition period ends (no date available yet in the official documentation - last updated February 2nd, 2021) - see Power Platform > Licensing > Request limits and allocations for more details.





Saturday, February 06, 2021

Quick tip: changing the access for a Teams recording in Microsoft Stream

Up until now, a Microsoft Teams recording is published in Microsoft Stream (this will change in early 2021), and the recording is only accessible to a limited set of people.


If you are the meeting owner, you can change permissions and allow people without access to view a meeting recording. Open the meeting in Stream and select the More options button > Update video details.

This will open a new screen in which you can either tick "Allow everyone in your company to view this video" or share the recording with specific people.





Wednesday, February 03, 2021

Quick tip: how to do a poll during a Microsoft Teams meeting

A colleague showed me last week how to do a poll during a Microsoft Teams meeting - check out the video below to see how to set this up yourself and add some interactivity to your meetings.