Thursday, April 30, 2020

Running the Jupyter stock trading notebooks in Azure Notebooks

In the previous blog posts - Using Euronext stock data from Quandl in Jupyter notebooks and Working with multiple time series trading data from Quandl in Jupyter Notebooks - I showed how you can use Jupyter notebooks to analyse stock trading data, running the Anaconda distribution on my local machine. The notebooks are available on Github.

It is however also possible to run them in the Microsoft cloud using Azure Notebooks. Azure Notebooks is a free hosted service to develop and run Jupyter notebooks in the cloud, with supporting packages for Python, R and F#. You can just log in and get started without having to set up or install anything, and run the code within your browser.

On the home page you will see a number of Jupyter notebook projects which you can clone into your own personal library.

By default, projects run on the Free Compute tier, which is limited to 4GB of memory and 1GB of data. You can bypass these limitations and increase compute power by using a different virtual machine that you've provisioned in an Azure subscription. For more information see Use Azure Data Science Virtual Machines.

To be able to run the Jupyter notebook on Azure Notebooks, the only thing you need to do is install the Quandl Python module by adding !pip install quandl in a new code cell at the top.
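In a notebook cell this is a single line; the leading ! tells Jupyter to run the rest of the line as a shell command, so the equivalent on a plain command prompt is simply:

```shell
pip install quandl
```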

The free compute tier of Azure Notebooks runs a custom version of Anaconda, but not necessarily the latest one - if you depend on specific Python modules, be aware that the installed versions might be outdated.
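You can check the exact interpreter version from a code cell before relying on newer features - a minimal sketch using only the standard library:

```python
import sys
import platform

# The Free Compute tier ships a pinned Anaconda image, so the interpreter
# and preinstalled packages may lag behind the latest releases.
print(platform.python_version())   # e.g. '3.6.9' on an older image
print(sys.version_info >= (3, 7))  # False if the image is still on 3.6
```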

Updates to Dynamics 365 release schedule, unified interface transition and team member license enforcement

As outlined in Our commitment to customers to help ensure business continuity, Microsoft has delayed some of the mandatory upgrades and changes to existing Dynamics 365 environments - below is a short overview:
  • Existing Dynamics 365 environments will receive the 2020 release wave 1 update one month later, starting at the beginning of May. See the Dynamics 365 release schedule for the exact dates for your geo.
  • Deprecation of the legacy web client and the mandatory transition to the Unified Interface is postponed from October 1 to December 1, 2020.
  • Deprecation of Dynamics 365 for Outlook (the Outlook COM add-in) is scheduled for October 2020 - see The future of Outlook integration for more details.
  • Technical enforcement of Dynamics 365 Team Members licenses purchased or transitioned after October 1, 2018 will come into effect on January 31, 2021 (extended from September 30, 2020 - initially planned for April 1, 2020).

Tuesday, April 28, 2020

Using Power Automate Flows to send daily RSS digest e-mails of Power Platform documentation updates

It is quite a challenge to keep up with the rapid updates of the Microsoft documentation on the Power Platform, but did you know that you can use an RSS feed based on a search in the Microsoft documentation? Such a search returns an RSS feed with all recently updated pages about the Power Platform.

Power Automate contains an RSS feed to e-mail template, but that flow sends out an e-mail every time a page is updated, which will flood your mailbox. So I built my own flow in Power Automate to receive a daily digest e-mail instead.

Here are some of the things I learned on the way (to be honest I don't use Power Automate flows that often):

  • I trigger the Power Automate Flow on a daily basis, so I use the List all RSS feed items action to retrieve the RSS feed items published since a specific date (use the formatDateTime function for the expected date format).

  • Some days no new RSS feed items are published, so you need to check that the RSS body is not empty, otherwise you will send out empty e-mails.
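In Python terms, the "since" value the daily trigger needs is simply an ISO 8601 timestamp for one day ago - this is a sketch of the shape only; in the flow itself you build it with the formatDateTime expression, and the exact format string here is an assumption:

```python
from datetime import datetime, timedelta

# Build "one day ago" as an ISO 8601 timestamp, the shape the
# 'List all RSS feed items' action expects for its since parameter
since = (datetime.utcnow() - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
print(since)
```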

DM me on Twitter (@jopxtwits) if you are interested in receiving an export of the Power Automate Flow.

What can you do with the Azure Cosmos DB free tier?

At the beginning of March 2020, Microsoft announced the availability of a free tier of Azure Cosmos DB (see Azure Cosmos DB Free Tier is now available).

"When free tier is enabled on an Azure Cosmos DB account, you’ll get the first 400 RU/s and 5 GB of storage for free for the lifetime of the account. Additionally, when using shared throughput databases, you can create up to 25 containers that share 400 RU/s at the database level. There’s a maximum of one free tier account per Azure subscription and you must opt-in when creating the account."

But maybe you are wondering what you can actually do with 400 RU/s? Request Units per second (RU/s) represent the "cost" of a request in terms of CPU, memory and IO. In Azure Cosmos DB you can provision "performance" upfront by setting RU/s at database level, collection level or both. It is however also possible to create Azure Cosmos DB containers and databases in autoscale mode. Containers and databases configured in autoscale mode automatically and instantly scale the provisioned throughput based on your application needs, without impacting the availability, latency, throughput or performance of the workload.
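As a back-of-envelope sketch of what 400 RU/s buys you - assuming the commonly quoted baselines of roughly 1 RU for a 1 KB point read and roughly 5 RU for a 1 KB write; real costs depend on item size, indexing and consistency level:

```python
# Free tier throughput budget
free_tier_rus = 400

# Assumed per-operation costs for 1 KB items (see the caveats above)
point_read_ru = 1
write_ru = 5

print(free_tier_rus // point_read_ru)  # 400 point reads per second
print(free_tier_rus // write_ru)       # 80 writes per second
```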

I can highly recommend taking a look at the Microsoft Ignite 2019 session - A developer's guide to Azure Cosmos DB, from onboarding to going live in production. The code samples from the session are available on Github.


Monday, April 27, 2020

Things to watch out for when configuring the export of CDS data to Azure Data Lake

At the end of October 2019, Microsoft announced the general availability of the export to Azure Data Lake functionality (previously called Project Athena) for CDS and Dynamics 365 (Sales, Customer Service and Field Service).

It is quite easy to set up by following the steps outlined in the announcement blog post or the official documentation on exporting entity data to Azure Data Lake Storage Gen2 - there are however two important prerequisites:

  • The login that you use to configure the Export to Data Lake settings needs to be an owner of the Azure Storage account. If you miss this step, you will get an exception on the second screen of the configuration wizard.
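One way to grant this role upfront is with the Azure CLI - an illustrative sketch only; the assignee and scope are placeholders for your own user and storage account:

```shell
# Make the configuring user an Owner of the storage account
az role assignment create \
  --assignee user@contoso.com \
  --role "Owner" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```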


Wednesday, April 22, 2020

Working with multiple time series trading data from Quandl in Jupyter Notebooks

In the previous example - Using Euronext stock data from Quandl in Jupyter notebooks I downloaded a single dataset from Quandl. But it is also possible to download multiple datasets by passing in a list of Quandl codes.

In the example below, I downloaded the prices of a number of diversified holding companies which are traded on Euronext Brussels and compared the cumulative returns (not including dividend payments) using Jupyter Notebooks.

The Quandl Python API allows you to make a filtered time series call and request only specific columns - in this example the 'Last' (closing) price is retrieved by specifying column index 4. In a next step I renamed the columns in the pandas dataframe to make it easier to work with the data.

Take a look at the full Python notebook to see how this data can be used to visualize cumulative returns for these different stocks.

%matplotlib inline
import quandl
import matplotlib.pyplot as plt

quandl.ApiConfig.api_key = "<Your Key Here>"

#Retrieve Last price only (column index 4) for the 5 holdings (excluding mono holdings) trading on Euronext Brussels
#Data is available from February 2014 onwards - Ackermans Van Haren (ACKB), Brederode (BREB), Sofina (SOF),
#GBL (GBLB) and Bois Sauvage (COMB)
data = quandl.get(["EURONEXT/ACKB.4", "EURONEXT/BREB.4", "EURONEXT/SOF.4",
                   "EURONEXT/GBLB.4", "EURONEXT/COMB.4"])

#Rename column names
data.rename(columns={'EURONEXT/ACKB - Last': 'ACKB', 'EURONEXT/BREB - Last': 'BREB','EURONEXT/SOF - Last':'SOF',
                     'EURONEXT/GBLB - Last':'GBLB','EURONEXT/COMB - Last':'COMB'},inplace=True)
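Once the columns are renamed, the cumulative return calculation itself is a one-liner on the dataframe. A minimal sketch with made-up prices for two of the tickers (the notebook runs the same expression on the live Quandl data):

```python
import pandas as pd

# Synthetic closing prices for two of the tickers (values are made up)
prices = pd.DataFrame(
    {"ACKB": [100.0, 102.0, 99.0], "SOF": [200.0, 210.0, 220.0]},
    index=pd.date_range("2014-02-03", periods=3),
)

# Cumulative return relative to the first trading day (dividends excluded)
cumulative = prices / prices.iloc[0] - 1
print(cumulative)
# ACKB ends at -0.01 (-1%), SOF at 0.10 (+10%)
```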

Monday, April 20, 2020

Dynamics 365 and Power Platform monthly reading list March 2020

Dynamics 365 and Power Platform - 2020 Wave 1 Topics

Topics for Dynamics 365 Business Application Platform consultants, analysts, project managers and power users

Using Euronext stock data from Quandl in Jupyter notebooks

The last couple of weeks I have been learning about Python and how to use it for stock and derivative trading. One of the challenges is getting stock trading data for European stocks (without having to pay for it).  One of the first things I started with is using Jupyter notebooks to quickly visualize stock market information.

The easiest way to get started with Jupyter is using an all-in-one Python distribution - the one I used is Anaconda, since it is easy to set up and it includes a number of interesting libraries I want to use in later steps.

I like to try out things hands-on, but I did use a number of training resources to get up to speed.

To get trading data about European stocks I used Quandl. Quandl is a marketplace for financial and economic data which is either freely available or requires a paid subscription. Data is contributed by multiple data publishers like the World Bank, trading exchanges and investment research firms. Quandl provides REST API access to the available data sets but also has specific Python and R libraries. You first need to register to get an API key. A lot of European stocks are traded on Euronext, and Quandl provides you access to Euronext data.

Install the quandl Python package using the Anaconda command prompt. It is best to set up virtual environments to manage the separate package installations you need for a particular project, isolating them from the packages in other environments, but for simplicity I just installed it in the base environment.
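Setting up such an isolated environment from the Anaconda command prompt would look roughly like this (the environment name and Python version are just illustrations):

```shell
# Create and activate a dedicated environment, then install quandl into it
conda create -n stocktrading python=3.7
conda activate stocktrading
pip install quandl
```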

Next, it is quite easy to retrieve stock data from Quandl - you first import the quandl package and then call the quandl.get() method. By default, Quandl retrieves the dataset into a pandas DataFrame. Since I specified no additional parameters, the entire time series dataset was retrieved - from February 2014 until now. Afterwards I used the plot command, which uses the matplotlib library, to display a graph of the closing prices.
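quandl.get() returns a pandas DataFrame indexed by date, so the plotting step is just a method call on the 'Last' column. A runnable sketch with made-up values standing in for the live Quandl data (the real notebook calls data['Last'].plot() on the retrieved frame):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; in a notebook use %matplotlib inline
import matplotlib.pyplot as plt

# Stand-in for the frame returned by quandl.get("EURONEXT/<code>")
data = pd.DataFrame(
    {"Last": [55.1, 54.8, 56.2, 57.0]},
    index=pd.date_range("2014-02-03", periods=4, freq="B"),
)

# Plot the closing prices and save the figure to a file
ax = data["Last"].plot(title="Closing price")
plt.savefig("closing-price.png")
```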

For the full Jupyter notebook, take a look at Github.

Thursday, April 02, 2020

Dynamics 365 and Power Platform monthly reading list February 2020

Dynamics 365 and Power Platform - 2020 Wave 1 Topics
Technical topics  (Configuration, customization and extensibility)