Wednesday, June 24, 2020

Using a personal access token in Postman to query the Azure DevOps Services REST API

If you want to query the Azure DevOps Services REST API you first need to authenticate. The easiest way to do this is to generate a personal access token (PAT). If you want to use this token in Postman, choose Basic authentication, leave the username blank and copy the token into the password input box.
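If you prefer scripting the same call, below is a minimal Python sketch of querying the REST API with a PAT - the organization name is a placeholder; Basic authentication with a blank username and the PAT as password is the same trick as in Postman:

import requests

# Personal access token generated in Azure DevOps (User settings > Personal access tokens)
pat = "<your-pat-here>"
organization = "<your-organization>"  # placeholder

# Basic authentication with a blank username and the PAT as password
response = requests.get(
    f"https://dev.azure.com/{organization}/_apis/projects?api-version=5.1",
    auth=("", pat),
)
response.raise_for_status()

for project in response.json()["value"]:
    print(project["name"])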

Saturday, June 20, 2020

Dynamics 365 and Power Platform monthly reading list May 2020

Technical topics (Configuration, customization and extensibility)

Topics for Dynamics 365 Business Application Platform consultants, project managers and power users

Wednesday, June 17, 2020

How to fix a broken EasyRepro project after upgrading Chrome

After I recently upgraded Chrome I noticed that my EasyRepro test project did not build/run anymore (read EasyRepro - UI automation test library for Dynamics 365 and CDS if you are not familiar with EasyRepro).

Luckily the exception pointed me in the right direction: "System.InvalidOperationException: session not created. This version of ChromeDriver only supports Chrome version 79 (SessionNotCreated)".


The EasyRepro testing framework leverages Selenium ChromeDriver, so you need to check the compatibility between the ChromeDriver version and your installed Chrome version - see https://chromedriver.chromium.org/downloads

To fix the error I just had to update the ChromeDriver NuGet package to a version compatible with the Chrome version installed on my machine.



Storage capacity enforcement for CDS instances

Storage capacity enforcement now seems to be activated. This means that when your storage usage is greater than your entitled capacity, admin operations like creating, copying and restoring environments will be blocked.

Thursday, June 11, 2020

Farewell to the Dynamics 365 Admin Center

In 2019 Microsoft announced that it would consolidate the different admin centers for Power Platform and Dynamics 365 Sales/Customer Service/Field Service. In the meantime, more and more functions seem to be moving to the new Power Platform Admin Center. Although the deprecation of the legacy Dynamics 365 admin center is not listed (yet) on Important changes (deprecations) coming in Power Apps, Power Automate and model-driven apps in Dynamics 365, I expect this admin portal to quickly fade away.

Update June 28th: @jukkan just mentioned that there is an update in Message Center which officially states that the old Dynamics 365 Admin Center will go away. I like the speed of innovation in the cloud but this seems like incredibly short notice.



To be able to access the Power Platform Admin Portal you will need to assign an appropriate service admin role - see Use service admin roles to manage your tenant for more details.

Features/functionality which is now surfaced in the new Power Platform Admin Portal:
  • App management in the Applications tab of the legacy Dynamics 365 Admin center has now moved to the Power Platform Admin Center - see Manage Dynamics 365 Apps for more details. This applies to installing and managing Microsoft first-party apps like Customer Service, but also to other apps installed through AppSource. You first need to select your environment in the new admin center and use the Resources > Dynamics 365 apps menu to open the list of available apps to install. The same screen can be used to upgrade existing apps to newer versions. (Unfortunately there is no indication when a new app version is available.)





Friday, June 05, 2020

Quick tip: exporting all NuGet packages used in a Visual Studio solution to a text file with PowerShell

Visual Studio lets you manage NuGet packages at solution level, with the option to consolidate package versions across projects, but it can also be useful to have an overview of all the packages you use in another format.



The PowerShell snippet below exports all NuGet packages used in a solution to a text file (you can ignore the exceptions thrown for projects without NuGet packages).

# Parse the project paths out of the solution file, then collect the package entries from each packages.config
Get-Content .\[yoursolutionname].sln |
    Where-Object { $_ -match 'Project.+, "(.+)\\([^\\]+)\.csproj", ' } |
    ForEach-Object { "$($matches[1])\packages.config" } |
    ForEach-Object { Get-Content $_ | Select-String "<package id" | ForEach-Object { $_.Line } } |
    Sort-Object -Unique |
    Out-File -FilePath c:\temp\packages_demo


Tuesday, May 19, 2020

Dynamics 365 and Power Platform monthly reading list April 2020

Dynamics 365 and Power Platform -2020 Wave 1 Topics

Technical topics (Configuration, customization and extensibility)
Topics for Dynamics 365 Business Application Platform consultants, analysts, project managers and power users

Monday, May 18, 2020

Power Platform ALM process guidance published and Power Platform Build Tools generally available

Microsoft just published dedicated documentation around Application Lifecycle Management (ALM) with the Power Platform and Dynamics 365. This is a great starting point for everyone who wants to mature their development and deployment practice.

At the same time Microsoft announced the general availability of the Microsoft Power Platform Build Tools for Azure DevOps. If you have used the preview version of the Power Platform Build Tools, you will need to install the new extensions and recreate your build and release pipelines. You can install the Power Platform Build Tools extension from the Azure Marketplace, and it is free to use within Azure DevOps Services (online) or Azure DevOps Server.


Keep in mind though that setting up a good ALM practice is about more than just having tooling in place (for a good discussion, take a look at Continuous Integration is not a tooling problem).

If you are already using tooling like the Power DevOps Tools from @waelhaemze, it might not (yet) make sense to switch over, but if you are new to DevOps/ALM for Dynamics 365/Power Platform, definitely take a look (for more background on this topic see My Perspective on the PowerApps Build Tools for Azure DevOps).

Friday, May 15, 2020

Quick tip: generate and integrate mock data for Dynamics 365

If you are doing demos for Dynamics 365 you might need to generate mock data - an interesting tool for this is Mockaroo. You can simply download a dataset that you generate on the website, but Mockaroo also exposes a number of APIs that you can access.
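As a rough sketch of what calling the Mockaroo generate API could look like from Python - the endpoint and parameter names below follow Mockaroo's API documentation as I recall it, and the schema name is purely hypothetical, so verify both against your own account:

import requests

API_KEY = "<your-mockaroo-api-key>"

# "demo_contacts" is a hypothetical schema saved on mockaroo.com
response = requests.get(
    "https://api.mockaroo.com/api/generate.json",
    params={"key": API_KEY, "schema": "demo_contacts", "count": 10},
)
response.raise_for_status()

for record in response.json():
    print(record)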


An interesting example of leveraging the API is outlined in Power Automate your demo data by @dylanhaskins


Thursday, May 14, 2020

EasyRepro - UI automation test library for Dynamics 365 and CDS

With the "One version" update policy of Dynamics 365 (Sales,Customer Service, Field Service,...) it starts making more sense to include some level of test automation in your projects. If you look at the test pyramid - it however also becomes apparent that user interface testing is still quite cumbersome and I think that the majority Dynamics 365 projects do not include any form of automated UI testing.

With the release of EasyRepro - the automated UI testing framework for Dynamics 365 - on GitHub in June 2017, writing automated UI tests has become easier, although it still requires basic developer skills and access to Visual Studio tooling.



The purpose of the EasyRepro library is to give Dynamics customers the ability to set up automated UI testing for their projects. The EasyRepro APIs provide an easy-to-use set of commands that make setting up UI testing quick and easy. The functionality provided covers the core CRM commands that end users perform on a typical workday, and developers are able to extend that coverage to more functionality.

If you want to quickly explore what is possible with the EasyRepro framework, clone the repo on GitHub, take a look at the Microsoft.Dynamics365.UIAutomation.Sample project and see how the different test classes are implemented.



Afterwards I highly recommend taking a look at the excellent series of blog posts written by the Microsoft PFE team.




Wednesday, May 13, 2020

Resolving xauth: timeout in locking authority file /home/pi/.Xauthority on Raspberry Pi

A couple of weeks ago, when I tried to start the Raspberry Pi desktop, it threw the error "xauth: timeout in locking authority file /home/pi/.Xauthority". Luckily I found a solution on the page "please help - startx fails/hangs, finally returns error message 'xauth: timeout in locking authority file /home/pi/.Xauthority'" - running xauth -b did the trick for me.

Sunday, May 03, 2020

Quick tip: reload Python modules in Jupyter with the autoreload magic method

If you write your own Python modules and include them in your Jupyter notebooks, you might notice that by default code updates in a custom Python module are not picked up once the module has been loaded by Jupyter. You can fix this by adding the autoreload magic at the top of your notebook:


%load_ext autoreload
%autoreload 2

For the documentation take a look at https://ipython.readthedocs.io/en/stable/config/extensions/autoreload.html

Friday, May 01, 2020

Quick tip: change Jupyter notebook startup folder in Anaconda

This excellent walk through explains how to change the Jupyter notebook startup folder in Anaconda - quick summary of the steps:

  1. Generate a config file by running jupyter notebook --generate-config in the Anaconda prompt
  2. Open c:\Users\<your username>\.jupyter\jupyter_notebook_config.py
  3. Uncomment c.NotebookApp.notebook_dir = '' and fill in your startup folder e.g. 'c:\jupyter_workspace'
For an overview of all of the configuration settings in the jupyter_notebook_config.py file take a look at Config file and command line options (Jupyter documentation).
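After step 3, the relevant line in jupyter_notebook_config.py would look something like this (the folder is just an example; since the config file is plain Python, a raw string avoids backslash-escaping surprises):

# jupyter_notebook_config.py - startup folder for new notebooks
c.NotebookApp.notebook_dir = r'c:\jupyter_workspace'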




Thursday, April 30, 2020

Running the Jupyter stock trading notebooks in Azure Notebooks

In the previous blog posts - Using Euronext stock data from Quandl in Jupyter notebooks and Working with multiple time series trading data from Quandl in Jupyter Notebooks - I showed how you can use Jupyter notebooks to analyse stock trading data, running the Anaconda distribution on my local machine. The notebooks are available on GitHub - https://github.com/jorisp/tradingnotebooks

It is however also possible to run them in the Microsoft cloud using Azure Notebooks on https://notebooks.azure.com/. Azure Notebooks is a free hosted service to develop and run Jupyter notebooks in the cloud, along with supporting packages for Python, R and F#. You can just log in and get started without having to set up or install anything, and run the code within your browser.

On the home page you will see a number of Jupyter notebook projects which you can clone into your own personal library.



By default, projects run on the Free Compute tier, which is limited to 4GB of memory and 1GB of data. You can bypass these limitations and increase compute power by using a virtual machine that you've provisioned in an Azure subscription. For more information see Use Azure Data Science Virtual Machines.

To be able to run the Jupyter notebook from https://github.com/jorisp/tradingnotebooks/blob/master/Quandl_API_Euronext_ABI_Shared.ipynb on Azure Notebooks, the only thing you need to do is install the Quandl Python module by adding !pip install quandl in a new code cell at the top.
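The extra cell at the top of the notebook is nothing more than this (the import simply verifies that the installation worked):

!pip install quandl

import quandl  # verify the module is importable after installation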


The Free Compute tier of Azure Notebooks runs a custom version of Anaconda, which is not always the latest release - if you depend on specific Python modules, be aware that the preinstalled versions might lag behind.



Updates to Dynamics 365 release schedule, unified interface transition and team member license enforcement

As outlined in Our commitment to customers help ensure business continuity, Microsoft has delayed some of the mandatory upgrades and changes to existing Dynamics 365 environments - below is a short overview:
  • Existing Dynamics 365 environments will receive the Wave 2020 update one month later, starting at the beginning of May. See Dynamics 365 release schedule for the exact dates for your geo.
  • Deprecation of the legacy web client and the mandatory transition to the Unified Interface is postponed from October 1st to December 1st, 2020.
  • Deprecation of Dynamics 365 for Outlook (the Outlook COM add-in) is scheduled for October 2020 - see The future of Outlook integration for more details.
  • Technical enforcement of Dynamics 365 Team Members licenses purchased or transitioned after October 1, 2018 will come into effect on January 31, 2021 (extended from September 30, 2020 - initially planned for April 1, 2020).

Tuesday, April 28, 2020

Using Power Automate flows to send daily RSS digest e-mails of Power Platform documentation updates

It is quite a challenge to keep up with the rapid updates of the Microsoft documentation on Power Platform, but did you know that you can use an RSS feed based on a search in the Microsoft documentation? https://docs.microsoft.com/api/search/rss?search=powerapps&locale=en will return an RSS feed with all recently updated pages on https://docs.microsoft.com about the Power Platform.

Power Automate contains an RSS-feed-to-e-mail template, but that flow will send out an e-mail every time a page is updated, which will flood your mailbox. So I built my own flow to receive a daily digest e-mail using Power Automate.


Here are some of the things I learned on the way (to be honest I don't use Power Automate flows that often):

  • I trigger the Power Automate flow on a daily basis, so I am using the List All RSS feed items action to retrieve the RSS feed items published since a specific date (use the formatDateTime function to produce the expected date format)

  • Some days no new RSS feed items are published, so you need to check whether the RSS body is empty, otherwise you will send out empty e-mails - the sketch below shows the same logic in code
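For illustration, here is a small standalone Python sketch of the same digest logic - this is not an export of the flow, just an equivalent of the two bullets above using the standard library plus requests:

from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime
import xml.etree.ElementTree as ET
import requests

feed_url = "https://docs.microsoft.com/api/search/rss?search=powerapps&locale=en"
cutoff = datetime.now(timezone.utc) - timedelta(days=1)  # items from the last day

root = ET.fromstring(requests.get(feed_url).content)
items = [
    (item.findtext("title"), item.findtext("link"))
    for item in root.iter("item")
    if parsedate_to_datetime(item.findtext("pubDate")) >= cutoff
]

# Mirror the empty-body check in the flow: no items means no e-mail
if items:
    digest = "\n".join(f"{title} - {link}" for title, link in items)
    print(digest)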

DM me on Twitter (@jopxtwits ) if you are interested in receiving an export of the Power Automate Flow.





What can you do with the Azure Cosmos DB free tier?

At the beginning of March 2020, Microsoft announced the availability of a free tier of Azure Cosmos DB (see Azure Cosmos DB Free Tier is now available).

"When free tier is enabled on an Azure Cosmos DB account, you’ll get the first 400 RU/s and 5 GB of storage for free for the lifetime of the account. Additionally, when using shared throughput databases, you can create up to 25 containers that share 400 RU/s at the database level. There’s a maximum of one free tier account per Azure subscription and you must opt-in when creating the account."

But maybe you are wondering what you can actually do with 400 RU/s? Request Units per second (RU/s) represent the "cost" of a request in terms of CPU, memory and IO. In Azure Cosmos DB you can provision "performance" upfront by setting RU/s at database level, container (collection) level or both. It is however also possible to create Azure Cosmos DB containers and databases in autoscale mode: containers and databases configured in autoscale mode will automatically and instantly scale the provisioned throughput based on your application needs, without impacting the availability, latency, throughput or performance of the workload.
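As a small sketch of how those 400 RU/s can be used, the snippet below creates a shared-throughput database provisioned at 400 RU/s with one container, using the azure-cosmos Python SDK - the account endpoint and key are placeholders, and the free tier discount itself is enabled on the account, not in code:

from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint/key - on a free tier account the first 400 RU/s and 5 GB are free
client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")

# Shared throughput database: up to 25 containers can share these 400 RU/s
database = client.create_database_if_not_exists("demo-db", offer_throughput=400)

container = database.create_container_if_not_exists(
    id="items",
    partition_key=PartitionKey(path="/category"),
)

container.upsert_item({"id": "1", "category": "demo", "name": "hello free tier"})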

I highly recommend taking a look at the Microsoft Ignite 2019 session A developer's guide to Azure Cosmos DB, from onboarding to going live in production. The code samples from the session are available on GitHub - https://github.com/deborahc/cosmos-perf-tips


Monday, April 27, 2020

Things to watch out for when configuring the export of CDS data to Azure Data Lake

At the end of October 2019 Microsoft announced the general availability of the export to Azure Data Lake functionality (previously called Project Athena) for CDS and Dynamics 365 (Sales, Customer Service and Field Service).



It is quite easy to set up by following the steps outlined in the announcement blog post or the official documentation on exporting entity data to Azure Data Lake Storage Gen2 - there are however two important prerequisites:


  • The login that you use to configure the Export to Data Lake settings needs to be an owner of the Azure storage account. If you miss this step, you will get an exception on the second screen of the configuration wizard.



Wednesday, April 22, 2020

Working with multiple time series trading data from Quandl in Jupyter Notebooks

In the previous example - Using Euronext stock data from Quandl in Jupyter notebooks - I downloaded a single dataset from Quandl. But it is also possible to download multiple datasets by passing in a list of Quandl codes.

In the example below, I downloaded the prices of a number of diversified holding companies which are traded on Euronext Brussels and compared the cumulative returns (not including dividend payments) using Jupyter Notebooks.


The Quandl Python API allows you to make a filtered time series call and request only specific columns - in this example the 'Last' (closing price) column is retrieved by specifying the index 4. In a next step I renamed the columns in the pandas dataframe to make it easier to work with the data afterwards.



Take a look at the full Python notebook at https://github.com/jorisp/tradingnotebooks/blob/master/Quandl_Belgian_Holdings-Shared.ipynb to see how this data can be used to visualize cumulative returns for these different stocks.



%matplotlib inline
import quandl
import matplotlib.pyplot as plt

quandl.ApiConfig.api_key = "<Your Key Here>"

#Retrieve Last price only for the 5 holdings (excluding mono holdings) trading on Euronext Brussels
#Data is available from February 2014 onwards - Ackermans Van Haren (ACKB), Brederode (BREB), Sofina (SOF),
#GBL and Bois Sauvage (COMB)
data = quandl.get(['EURONEXT/ACKB.4','EURONEXT/BREB.4','EURONEXT/SOF.4','EURONEXT/GBLB.4','EURONEXT/COMB.4'])

#Rename column names
data.rename(columns={'EURONEXT/ACKB - Last': 'ACKB', 'EURONEXT/BREB - Last': 'BREB','EURONEXT/SOF - Last':'SOF',
                     'EURONEXT/GBLB - Last':'GBLB','EURONEXT/COMB - Last':'COMB'},inplace=True)

#Normalize each series to its first trading day to compare cumulative returns
(data / data.iloc[0]).plot(title='Cumulative return (dividends excluded)')

Monday, April 20, 2020

Dynamics 365 and Power Platform monthly reading list March 2020

Dynamics 365 and Power Platform - 2020 Wave 1 Topics


Topics for Dynamics 365 Business Application Platform consultants, analysts, project managers and power users

Using Euronext stock data from Quandl in Jupyter notebooks

The last couple of weeks I have been learning Python and how to use it for stock and derivative trading. One of the challenges is getting trading data for European stocks (without having to pay for it). One of the first things I started with is using Jupyter notebooks to quickly visualize stock market information.

The easiest way to get started with Jupyter is using an all-in-one Python distribution - I used Anaconda since it is easy to set up and it includes a number of interesting libraries I want to use in later steps.



I like to try things out hands-on, but I did use a number of training resources to get up to speed.
To get trading data about European stocks I used Quandl. Quandl is a marketplace for financial and economic data which is either freely available or requires a paid subscription. Data is contributed by multiple data publishers like the World Bank, trading exchanges and investment research firms. Quandl provides REST API access to the available datasets, but also has specific Python and R libraries. You first need to register to get an API key. A lot of European stocks are traded on Euronext, and Quandl provides you access to Euronext data - https://www.quandl.com/data/EURONEXT-Euronext-Stock-Exchange

Install the quandl Python package using the Anaconda command prompt. It is best to set up virtual environments to manage the separate package installations that you need for a particular project, isolating them from the packages in other environments, but for simplicity I just installed it in the base environment.

Next, it is quite easy to retrieve stock data from Quandl - you first import the quandl package and then call the quandl.get() method. By default, Quandl will load the dataset into a pandas DataFrame. Since I specified no additional parameters, the entire time series was retrieved - from February 2014 until now. Afterwards I used the plot command, which uses the matplotlib library, to display a graph of the closing prices.
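The basic retrieval described above boils down to a few lines - a minimal sketch, assuming your API key is set and using the AB InBev ticker (ABI) from the notebook linked below; the 'Last' column holds the closing price in the Euronext datasets:

import quandl

quandl.ApiConfig.api_key = "<Your Key Here>"

# Retrieve the full EURONEXT/ABI time series into a pandas DataFrame
data = quandl.get("EURONEXT/ABI")

# Plot the closing prices via the DataFrame's built-in matplotlib plotting
data["Last"].plot(title="AB InBev (EURONEXT/ABI) closing price")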



For the full Jupyter notebook take a look at Github https://github.com/jorisp/tradingnotebooks/blob/master/Quandl_API_Euronext_ABI_Shared.ipynb


Thursday, April 02, 2020

Dynamics 365 and Power Platform monthly reading list February 2020

Dynamics 365 and Power Platform - 2020 Wave 1 Topics
Technical topics  (Configuration, customization and extensibility)

Sunday, March 29, 2020

Quick tip: Skype for Business and Microsoft Teams network assessment tool

To ensure that your network meets the requirements for using Skype for Business or Microsoft Teams for audio and/or video calls, you can download and run the Skype for Business Network Assessment tool.




Sunday, March 22, 2020

Generating Azure Application Insight Key in Azure DevOps pipeline

If you want to generate an Application Insights API key in your Azure DevOps pipeline, you can use the PowerShell code snippet below in an Azure PowerShell task.


Look at Automate Azure Application Insights resources using PowerShell and New-AzureRmApplicationInsightsApiKey for more details.

# Install and import the Application Insights module for AzureRM
Install-Module AzureRM.ApplicationInsights -Force -Verbose -Scope CurrentUser
Import-Module AzureRM.ApplicationInsights

$resourcegroupname = 'rg_func****'
$aicomponentname = 'func****'
$permissions = @("ReadTelemetry", "WriteAnnotations")
$apikeydescription = 'testapikey'

# The backtick continues the command on the next line
New-AzureRmApplicationInsightsApiKey -ResourceGroupName $resourcegroupname `
    -Name $aicomponentname -Description $apikeydescription -Permissions $permissions

Friday, March 20, 2020

Azure Application Insights for Dynamics 365 and Power Platform solution architects and consultants

A best practice which is quite often overlooked is enabling monitoring and logging capabilities in your applications or solutions. You should always monitor your applications and solutions, whether they run in the cloud or on-premises, because you want to know when something fails (before the users start complaining) and also understand how users are working with your solutions.



Both Dynamics 365 and Power Platform have the option to enable integration with Azure Application Insights - but what is Application Insights?

Azure Application Insights is a feature of Azure Monitor and is basically an APM (Application Performance Management) tool. Application Insights provides standard integration with a lot of Azure components, adding automatic monitoring capabilities, but you can also extend it with custom monitoring: if you get an unhandled exception in your code, Application Insights will pick it up. You can use it for on-premises and cloud applications, but the ease with which you can instrument your applications might differ.
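As a minimal sketch of such custom monitoring, assuming the (legacy) applicationinsights Python package and a placeholder instrumentation key:

from applicationinsights import TelemetryClient

# Placeholder instrumentation key - in a real app read it from configuration
tc = TelemetryClient('<your-instrumentation-key>')

tc.track_event('DemoImportStarted', {'source': 'nightly job'})

try:
    raise ValueError('something went wrong')
except ValueError:
    tc.track_exception()  # records the active exception including the stack trace

tc.flush()  # telemetry is batched, so flush before the process exits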


You can integrate Application Insights into your model-driven apps (either Dynamics 365 first-party apps which you extended or completely custom model-driven apps) by using the JavaScript SDK (see Application Insights for web pages for more details). Unfortunately the old documentation on this topic has disappeared - https://azure.microsoft.com/en-us/documentation/articles/app-insights-sample-mscrm/ now redirects to the general documentation.


Luckily @DanzMaverick wrote an interesting plugin for XrmToolBox which allows you to easily instrument your model-driven forms. Be sure to update to the latest version though, since it contains a number of fixes. If you want to keep up to date with this interesting tool, follow https://github.com/Power-Maverick/ApplicationInsightsManager




Using the Application Insights Manager plugin you can instrument a number of model-driven forms with just a few clicks - behind the scenes it wires up your forms with the required JavaScript web resources.



A recent update on the Power Platform also allows you to log telemetry for your Canvas Apps using Application Insights.

Dynamics 365 can also leverage Azure components like Azure web apps, web jobs or Azure functions, which will also require monitoring and logging capabilities. Depending on the type of component you might get out-of-the-box monitoring or you might need to use code-based monitoring - most scenarios are described in the documentation - https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview

Things I learned when using Azure Application Insights in the last couple of months:

Tuesday, March 10, 2020

Enabling C# 7.1 in Visual Studio 2017 projects

Today, when I tried to compile one of the Visual Studio projects in our solution - I received the exception "Feature 'default literal' is not available in C# 7.0. Please use language version 7.1 or greater". (For more info see default literal in C# 7.0)



To enable C# 7.1 you need to follow these steps:

  • Right click on your project and select Properties, next navigate to the build tab and click Advanced

  • Next you can either choose C# 7.1 from the list or choose C# latest minor version (latest).

Monday, March 09, 2020

Dynamics 365 and Power Platform monthly reading list January 2020

Dynamics 365 and Power Platform - 2020 Wave 1 topics

Starting April 2020, new features in Dynamics 365 and Power Platform will be activated, but you can already try out some of the new features by enabling early access - for a full list of available features see 2020 release wave 1 features available for early access.
Technical topics (Configuration, customization and extensibility)

Forms Pro Quick tip: switch between Microsoft Forms Pro and Microsoft Forms

Microsoft Forms Pro is an enterprise survey tool which leverages both Microsoft Forms and the Common Data Service (For more details see Getting started with Microsoft Forms Pro: customer feedback is the ultimate truth.)

Once you are assigned a Forms Pro license, it is not so obvious that you can still use Microsoft Forms (at least it was not for me). To switch between Microsoft Forms Pro and Microsoft Forms, select your photo in the Office 365 header and then select Switch to Forms.



Thursday, February 06, 2020

Quick tip: keep your Dynamics 365 development tools up to date using Nuget

I recently encountered an issue when I tried switching the view of the registered plugins & custom workflow activities in the Plugin Registration Tool.



Updating to the latest version using the PowerShell script listed in Download Dynamics 365 development tools from NuGet resolved the error. Lesson learned: make sure that you keep your Dynamics 365 development tools up to date.

Monday, February 03, 2020

Lessons learned about Dynamics 365 solution layering - Part 1


This is the first in a series of blog posts on Dynamics 365 solutions and solution layering in which I will cover the basics about Dynamics 365 solutions and gradually delve deeper into the mysterious world of the inner workings of Dynamics 365 solutions.

In my opinion, based on how managed solutions and solution layering work, in the majority of cases you should only use a single managed solution to avoid issues down the line. Read on to see why...


This blog post assumes that you are using managed solutions to deploy to all environments and only have unmanaged solutions in your development environment.

Dynamics 365 solutions fundamentals

If you work with Dynamics 365/CDS solutions, you should be aware of the concept of solution layering. Solution layering occurs on import of solutions, when a specific component is affected by a change within one or more solutions. Solution layering determines the behavior that a user will see in Dynamics 365.

The figure above shows how layering works: at the bottom, you have the standard CDS solutions on top of which you might have some first party apps installed like Dynamics 365 Sales/Customer Service etc... If you deploy custom managed solutions, they will go on top based on the order in which they are installed. Direct customizations and unmanaged solutions are always at the top but there is no real layering of unmanaged solutions - they all end up directly modifying the base behavior.

Solution layers describe the dependency chain of a component, from the root solution introducing it through each solution that extends or changes the component's behavior. Solution layers are created through extension of an existing component (taking a dependency on it) or through creation of a new component or version of a solution. So it is very important to understand that layers should be considered on a per-component basis.

In April 2019, Microsoft introduced a new user interface component which visualizes the different solution layers that impact the behavior of a specific component - View solution layers for a component. Below is an example of the solution layers of the case form, which is quite an interesting case.


Going beyond the basics

But what happens if two or more solutions define solution components differently? The article Introduction to solutions - conflict resolution explains the two conflict resolution strategies: Merge and Top Wins. The article is also valid for online according to Microsoft support, even though it was written for Dynamics 365 on-premises and last updated in December 2017.


When importing a managed solution with the overwrite customizations option, it is possible to override the "top wins" conflict resolution, but this does not apply to components which apply a "merge" strategy - for more details see Understand how managed solutions are merged.

So you need to be very careful when adding components which apply a merge strategy to multiple solutions with different publishers, since the merge behavior might cause unexpected side effects, such as creating a separate active layer on top which would prevent your customizations from appearing to the user.

An active layer can emerge in two scenarios: a direct customization was done in the instance, but it seems that it is also possible that a managed active layer is created when using only managed solutions, without direct customizations in the system. The latter happens when managed solutions with different publishers (the Microsoft solutions count as a different publisher) try to update the same component in one system. It is a defense mechanism that Microsoft has built in for when there are object inconsistencies across managed solutions.

The general recommendation is that you keep the number of managed solutions with common components (e.g. forms with a different form layout) to a minimum. This will avoid merging issues across the layers.

Single solution to rule them all?


Taking into account all of the above, it seems sensible to opt for a single managed solution to deploy to other environments.

Single solution approach
Pros:

  • Easier to version control
  • Avoid solution layering issues
  • Less environments needed
  • Easier to promote across environments
Cons:

  • Longer duration of solution import
  • Long list of components


I do think that a single solution makes sense in the majority of cases, but there are of course exceptions to this rule. I recommend taking a look at the Solution Lifecycle Management: Dynamics 365 for Customer Engagement Apps whitepaper, especially the section on solution boundaries, which outlines some of the other valid scenarios for having multiple solutions, e.g. deploying to multiple regional production instances with small variations (a core solution with local solutions on top), or releasing subsets of applications on a different cadence/timeline.