If you have multiple Azure subscriptions linked to the account that you use to log in, operations in Azure PowerShell are executed against the “current subscription”. You can use the Get-AzureSubscription cmdlet to list all of your subscriptions and see which one is the “current” subscription. During a PowerShell session, use the Select-AzureSubscription cmdlet to choose which subscription the other Azure cmdlets operate against, with the following syntax: Select-AzureSubscription -SubscriptionName "NameOfYourSubscription".
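A typical session might look like the sketch below (the subscription name "Contoso Production" is a placeholder for one of your own subscriptions):

```powershell
# List all subscriptions linked to the signed-in account;
# the one marked IsCurrent is used by subsequent cmdlets.
Get-AzureSubscription

# Switch the current subscription for the rest of the session.
Select-AzureSubscription -SubscriptionName "Contoso Production"

# Verify which subscription is now current.
Get-AzureSubscription -Current
```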
Sunday, June 19, 2016
Friday, April 08, 2016
But let’s first take a look at why such a resource might be needed more than ever. I think that massive Dynamics CRM deployments – in terms of number of users – are still quite rare, especially compared with SharePoint. This is quite normal given that the two products cover completely different functional domains: where SharePoint might be used by everyone in a company, Dynamics CRM users are typically still found in customer-facing departments (service management, sales and marketing). This might be one of the reasons why there hasn’t been any need to compile such an extensive list of boundaries and limits for Dynamics CRM. But in an era where winning the customer’s heart and mind is getting harder and harder, it becomes imperative that companies transform from ‘product-focused’ to ‘customer-focused’. This puts CRM more at the center of your enterprise architecture and might also trigger larger deployments of CRM in terms of users, transactions and records.
But in the enterprise market we are already encountering what you might describe as “High Volume” systems. A “High Volume” system might have one or more of the following characteristics:
- High volume data: more than 1 million records in the base tables (Contacts, Accounts, …), more than 5,000 activities per day, or more than 1 million activities per year
- High volume transactions
- High volume users/security: more than 300 concurrent users or more than 1000 teams and business units
I also got feedback from another CRM architect stating that there are no real hard limits in Dynamics CRM for on-premise deployments, only scalability guidelines that you should follow. Still, when you are doing a large CRM deployment, chances are you will call in Microsoft consultancy to assist you or audit your design, and they will point you to some specific boundaries and limits that you should adhere to. These guidelines and boundaries are defined in the Microsoft Product Line Architecture for Dynamics CRM, which unfortunately is not publicly available (see Introducing the Microsoft Product Line Architecture if you are unfamiliar with PLA). And although these are indeed not hard limits, if you don’t comply you will be asked to provide mitigating or corrective actions.
To get started you should definitely take a look at the Microsoft Dynamics CRM 2015 and Microsoft Dynamics CRM 2016 Performance and scalability documentation. This download contains the Scalable Customizations white paper, which describes how SQL Server platform-level issues can arise that cause slow Microsoft Dynamics CRM application performance or error messages for the end user, and which provides information on how to optimize a custom implementation for better performance and a better end-user experience. It also contains the Scalable security modeling with Microsoft Dynamics CRM 2015 white paper, which describes how the security modeling features related to authorization in Dynamics CRM 2015 and 2016 work at scale, the associated implications, and guidance on common and recommended usage patterns for modeling Dynamics CRM security at scale, incorporating teams as appropriate. Both white papers are updated versions of papers initially written for CRM 2011 and CRM 2013, because in essence the core architecture described in the Microsoft Dynamics CRM Reference Architecture has not changed significantly in the later versions.
Given Microsoft’s focus on Dynamics CRM Online, you might also be faced with a large deployment of Dynamics CRM Online, and you have to keep in mind that Dynamics CRM Online does contain some hard limits. I will cover these in a next blog post. The Microsoft Dynamics CRM Online patterns & principles for solution builders white paper download already provides some solid guidance specifically about building solutions using Microsoft Dynamics CRM Online. Also keep in mind that Dynamics CRM Online integrates deeply with Office 365 (Yammer, SharePoint Online, Exchange Online and Office 365 Groups), and you might also need to integrate some Microsoft Azure components, so as a solution architect you should have a solid grasp of these components as well.
Last but not least - it is a sad truth that in most deployments bad performance is not due to the number of users, transactions or records stored in Dynamics CRM but mainly caused by bad code – so also take a look at Best practices for developing with Microsoft Dynamics CRM.
- Software requirements for Microsoft Dynamics CRM 2015
- Microsoft Dynamics CRM Server 2015 hardware requirements
- Introducing the Microsoft Product Line Architecture
- Microsoft Dynamics CRM Reference Architecture
- Microsoft Dynamics CRM 2015 and Microsoft Dynamics CRM 2016 Performance and scalability documentation
Saturday, February 06, 2016
If you ever want to use filled maps with Power BI, it is important to use the correct data category. This can be quite confusing in Europe, where we have a multitude of different ways of dividing up a country that do not always correspond to the way it is done in the U.S. The data categories for locations that you can use on a filled map are Continent, Country/Region, State or Province, and County.
So when I switched the data category for my Belgian provinces to County, the filled map already looked a lot better, but not completely correct, since it showed the Grand Duchy of Luxembourg (https://en.wikipedia.org/wiki/Luxembourg) and not the province of Luxembourg (https://en.wikipedia.org/wiki/Luxembourg_(Belgium)) on the map.
So I tried it out for a number of neighboring countries - the next table gives an example of what the different data category labels (in bold in the first row) correspond to for a number of European countries.
I tested it with Belgium, Germany, the Netherlands and France. I did not get the regions to work correctly for France, but this might be because France officially merged a number of regions as of January 1st. France is also subdivided into a number of departments, and these were shown on the map as expected.
| Country | State or Province | County | Remarks |
|---|---|---|---|
| Belgium | Flanders, Walloon Region and Brussels-Capital Region | Antwerp, East Flanders, Flemish Brabant, Limburg, West Flanders, Hainaut, Liège, Luxembourg, Namur, Walloon Brabant | Province of Luxembourg not depicted correctly |
| Germany | The different Bundesländer: Baden-Württemberg, Bavaria, Berlin, Brandenburg, Bremen, Hamburg, Hesse, Lower Saxony, Mecklenburg-Vorpommern, North Rhine-Westphalia, Rhineland-Palatinate, Saarland, Saxony, Saxony-Anhalt, Schleswig-Holstein, Thuringia | | |
| France | | The different departments as outlined on https://en.wikipedia.org/wiki/List_of_French_departments_by_population | I removed the overseas departments to make it workable |
| Netherlands | The different provinces: Drenthe, Flevoland, Fryslân, Gelderland, Groningen, Limburg, North Brabant, North Holland, Overijssel, South Holland, Utrecht, Zeeland | | |
French departments with population density.
Dutch provinces with population density
Monday, January 11, 2016
In this webinar Brendan Rohrer (@_brohrer_) explains with a number of great examples some key ingredients or trade secrets of doing data science in easy to understand terms – here’s a quick recap (although I really recommend you to watch the video):
- Trade secret 1: You can’t use just any data (and you have to ask sharp questions). I really like the definition formulated by Jeff Leek (@jtleek) (taken from Data science done well looks easy, which is a big problem): “Data science is the process of formulating a quantitative question that can be answered with data, collecting and cleaning the data, analyzing the data, and communicating the answer to the question to a relevant audience.” So you first need a precise question, and then you need to look for the right data – or, as indicated in the webinar: relevant, connected, accurate and enough data. I’m not a data scientist, but this really seems like the hardest part (or, as phrased here: For Big Data scientist, ‘janitor work’ is the key hurdle to insights).
- Trade secret 2: Turn your data into a picture – check out the example used in the seminar below. It is important to understand that people effortlessly recognize and classify objects among tens of thousands of possibilities, so visualizing your data can help you make sense of it. (For an interesting scientific article on this topic, take a look at How does the brain solve visual object recognition?)
- Trade secret 3: Data science can only answer five questions: how much/how many [regression], which category does something belong to [classification], which groups exist in a dataset [clustering], is something weird [anomaly detection], and which action should you take [reinforcement learning].
- Trade secret 4: Machine learning is simple. This statement is a little exaggerated, but the analogy between mastering a foreign language and mastering machine learning is indeed apt. You need to learn the lingo: everyone probably knows tables, either in Excel or a database, but a data scientist will refer to the rows of data in a table as data points or samples. The columns in your table typically describe a specific characteristic – a data scientist will call this a feature.
- Trade secret 5: There are a lot of right ways to solve a specific problem. If you look at the Machine learning algorithm cheat sheet for Microsoft Azure Machine Learning Studio, you will notice that there are a lot of different ways to solve a specific problem (with certain nuances such as the number of features available, or the speed of calculating the model, …), but in most cases it apparently does not matter that much.
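As a toy illustration of the “is something weird” question [anomaly detection], here is a simple z-score check in Python. This is my own sketch, not an example from the webinar; the threshold of 2 is just a common rule of thumb, and the sample data is made up:

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=2.0):
    """Flag data points whose z-score exceeds the threshold."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily activity counts - the 500 is the "weird" data point.
daily_activities = [98, 102, 101, 99, 100, 97, 103, 500]
print(find_anomalies(daily_activities))  # [500]
```

Real anomaly-detection algorithms are of course more sophisticated, but the idea is the same: quantify how far a data point sits from the rest.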
To get an overview of other Microsoft webinars on similar topics check out Big Data and Advance Analytics: On-demand and upcoming live webinars
- What can data science do for me?
- Data exploration through visualization
- What types of questions can data science answer?
- Which algorithm family can answer my questions?
- Machine learning algorithm cheat sheet for Microsoft Azure Machine Learning Studio
- How to choose algorithms for Microsoft Azure Machine Learning
Apparently the Microsoft .NET thread pool settings in machine.config were changed based on the guidelines defined in Optimizing and maintaining a Microsoft Dynamics CRM 2011 Server Infrastructure.
| Setting | Value |
|---|---|
| maxconnection | 12*n (where n is the number of CPUs) |
| minWorkerThreads | 50 (manually add this parameter to the file) |
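For reference, a sketch of what these settings look like in machine.config. The values shown assume a 4-CPU server (so maxconnection = 12 * 4); adjust them to your own hardware per the guidance above:

```xml
<configuration>
  <system.net>
    <connectionManagement>
      <!-- 12 * n, e.g. 48 for a 4-CPU server -->
      <add address="*" maxconnection="48" />
    </connectionManagement>
  </system.net>
  <system.web>
    <!-- minWorkerThreads is not present by default and must be added manually -->
    <processModel autoConfig="false" minWorkerThreads="50" />
  </system.web>
</configuration>
```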
Thursday, January 07, 2016
Update 2016/02/06: Thanks to the Power BI team feedback – I managed to get this working correctly – check out Using filled maps in Microsoft Power BI for provinces, regions and counties in European countries for the explanation.
A couple of weeks ago I wanted to try out the new filled map functionality (also referred to as a choropleth) in Power BI (see Tutorial: Filled Maps (Choropleths) in Power BI). I wanted to start with a very simple data set:
| Province | Dutch name | French name | Capital | Surface (km²) | Population |
|---|---|---|---|---|---|
| Flemish Brabant | Vlaams-Brabant | Brabant flamand | Leuven | 2106 | 1114299 |
Unfortunately I could not get the filled map to display correctly – I tried the province names in three different languages but nothing seemed to work.
According to Bing Maps Geographic Coverage, geocoding precision for Belgium should be fairly good. What are your experiences with this – do filled maps work correctly for provinces/regions outside of the U.S.? Leave a comment.
Wednesday, January 06, 2016
Using Microsoft Power BI Desktop to build Dynamics CRM Online Reports – Part 5 – Refreshing data and custom visuals
- Using Microsoft Power BI Desktop to build Dynamics CRM Online reports – Part 1 – Introduction
- Using Microsoft Power BI Desktop to build Dynamics CRM Online reports – Part 2 – Using option sets in reports
- Using Microsoft Power BI Desktop to build Dynamics CRM Online reports – Part 3 – Relationships and the map control
- Using Microsoft Power BI Desktop to build Dynamics CRM Online reports – Part 4 - Sharing and collaborating
But before you can define the refresh schedule, Power BI needs to be able to access the Dynamics CRM Online OrganizationData.svc service; fortunately this service supports certain authentication capabilities found in the OAuth 2.0 protocol. The OAuth 2.0 authorization framework – per the definition in the spec at the Internet Engineering Task Force (IETF) – enables a third-party application to obtain limited access to an HTTP service, either on behalf of a resource owner by orchestrating an approval interaction between the resource owner and the HTTP service, or by allowing the third-party application to obtain access on its own behalf. So OAuth is one of the industry standards for federated identity. Its main goal is to eliminate the need to give system A your user name and password for accessing system B, and it allows you to determine what system B can get from system A once it has been allowed access. In simple terms: OAuth allows Power BI to talk to Dynamics CRM Online using the access token that you got back when you first authenticated using the screen below, so Power BI does not need to store your user name and password.
You have to make sure that the credentials for the different data sources are up to date before you can set up the refresh schedule, so specify the credentials and make sure that you use OAuth as the authentication method.
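In practice, “using the access token” simply means attaching it as a bearer token to every HTTP request instead of sending a user name and password. A minimal Python sketch of the idea (the token value is a truncated placeholder, not a real credential):

```python
def make_authenticated_request_headers(access_token):
    """Build the HTTP headers for an OAuth 2.0 bearer-token request."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Accept": "application/json",
    }

# Placeholder token - in reality this comes back from the OAuth sign-in flow.
headers = make_authenticated_request_headers("eyJ0eXAi...")
print(headers["Authorization"])  # Bearer eyJ0eXAi...
```

This is why the token approach is safer: the token is scoped and revocable, while a stored password would grant full, indefinite access.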
In Power BI Standard edition you then have the option to schedule a daily or weekly refresh – for an hourly data refresh you will need to upgrade to Power BI Pro. The table below lists the different available refresh options and the required subscription of Power BI (Source: Data Refresh in Power BI).
| Data Refresh | Power BI (free) | Power BI Pro |
|---|---|---|
| Datasets scheduled to refresh | Daily | Hourly |
| Streaming data in your dashboards and reports using the Microsoft Power BI REST API or Microsoft Stream Analytics | 10K rows/hour | 1M rows/hour |
| Live data sources with full interactivity (Azure SQL Data Warehouse, Spark on HDInsight) | Not supported | Supported |
| On-premises data sources requiring the Power BI Personal Gateway, and on-premises SQL Server Analysis Services requiring the Analysis Services Connector | Not supported | Supported |
D3 allows you to bind arbitrary data to a Document Object Model (DOM), and then apply data-driven transformations to the document. For example, you can use D3 to generate an HTML table from an array of numbers. Or, use the same data to create an interactive SVG bar chart with smooth transitions and interaction.
So what does this look like from a visualization designer’s perspective? First, you can take a look at the Power BI Visual Gallery for some example custom visuals. There you will need to download your custom visual definition file – in this example I will use the TadPole Spark Grid Plus. When you click on the download link, you will see that it downloads a pbiviz file.
As a data source I will start from the sales and marketing sample, which you can download from the Power BI industry samples (Excel workbooks). You can import this Excel file into Power BI Desktop, and Power BI Desktop will try to import the Power Query queries, Power Pivot models and Power View worksheets, which you can later refine using Power BI Desktop. (See Import Excel workbooks into Power BI Desktop for more details.)
Next I will create a new report page using data from the sales fact table (Total Units and Sales $) per manufacturer and per year. Afterwards you will need to import the definition file for your custom visual by selecting File > Import > Power BI Custom Visual, or by clicking the three dots in the visualizations pane, and then selecting the pbiviz file that you just downloaded. Next you can apply your visualization to the report data.
As you see in the example below, it shows a spark line (for sales in units and dollars) with colored and thickened line segments. Black segments mean the value has gone up since the last period (desirable); red segments mean the value has gone down (undesirable). (This behavior is configurable using the properties of the visualization.)
In a next post I will take a look at how you can embed Power BI reports in other web applications as well as Dynamics CRM.
Tuesday, December 29, 2015
I have been using Windows Live Writer since the first beta in 2006, and it has been my preferred blogging tool ever since. So it was great to hear that there finally is a successor – welcome, Open Live Writer.
Open Live Writer is an open source application enabling users to author, edit, and publish blog posts. It is based on a fork of the well-loved but no longer actively developed Windows Live Writer code. Open Live Writer is provided under an MIT license.
Also check out @Shanselman’s post – Announcing Open Live Writer – An Open Source Fork of Windows Live Writer – for some more background information.
Sometimes you don’t want a password of an Office 365 user to expire e.g. when you use it as a system account. You can’t do this using the Office 365 Admin center but it is quite easy using the Azure Active Directory Module for Windows PowerShell.
- Connect to Office 365 using your Office 365 global admin credentials by running the following cmdlet: Connect-MsolService
- To set the password of one user to never expire, run the following cmdlet: Set-MsolUser -UserPrincipalName <email@example.com> -PasswordNeverExpires $true
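Putting the steps together, a session might look like the sketch below (the user principal name is a made-up example – replace it with the actual account; the final Get-MsolUser call is just to confirm the change):

```powershell
# Connect with your Office 365 global admin credentials
Connect-MsolService

# Set the password of a single user (e.g. a system account) to never expire
Set-MsolUser -UserPrincipalName "svc-crm@contoso.onmicrosoft.com" -PasswordNeverExpires $true

# Verify the setting
Get-MsolUser -UserPrincipalName "svc-crm@contoso.onmicrosoft.com" |
    Select-Object UserPrincipalName, PasswordNeverExpires
```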
For an extensive walkthrough, check out the links below