Monday, November 21, 2022

Reminders for the occasional Power BI creator

I am not a regular Power BI user, but every couple of months I will create some Power BI reports (e.g., Storage capacity management for Dynamics 365/Dataverse) because Power BI is simply great for visualizing and analyzing data.


Unfortunately, I tend to forget how to perform specific tasks in Power BI - things that are probably obvious to frequent Power BI users.

So here it goes - my list of reminders for the occasional Power BI creator.




Wednesday, November 16, 2022

Visualize S&P 500 data in Power BI using Azure Synapse Serverless SQL Pool

In Explore and analyze stock ticker data in Azure data lake with Azure Synapse serverless SQL Pool, I showed how you can download stock ticker data from Yahoo Finance, store it in Azure Data Lake and retrieve it using standard T-SQL in Azure Synapse Studio. In this post, I will show how easy it is to consume that data from Synapse SQL Serverless using Power BI.


For the standard visual with the evolution of the S&P 500 closing price, I connected directly to the SP500 external table in Synapse SQL. You can connect to Synapse SQL Serverless using either the Azure SQL Database or the Azure Synapse Analytics SQL connector; you will need to enter the serverless SQL endpoint, which looks something like this: <yoursynapse>-ondemand.sql.azuresynapse.net.


With the second report I wanted to visualize the S&P 500 yearly return and the average return since December 1927. To make it easier, I created a separate view on top of the external table which calculates the yearly returns.
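The view itself is not shown here, but the calculation is easy to mirror in pandas if you want to sanity-check the numbers - a minimal sketch, assuming the SP500 CSV export from Yahoo Finance with Date and Close columns:

    import pandas as pd

    # Load the daily S&P 500 data (column names follow the Yahoo Finance CSV export)
    df = pd.read_csv("SP500.csv", parse_dates=["Date"])

    # Take the last close of each calendar year and compute the year-over-year return
    yearly_close = df.set_index("Date")["Close"].resample("Y").last()
    yearly_return = yearly_close.pct_change() * 100

    print(yearly_return.tail())                                   # most recent yearly returns
    print(f"Average yearly return: {yearly_return.mean():.1f}%")  # excluding dividends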


As you can see from the visual, returns vary quite a lot on both the negative and the positive side - in the last 20 years there was a huge drop in 2008 (-38%) and this year is not looking great either (-22%), but 2013, 2019 and 2021 all had returns above 20%. On average, the S&P 500 returned 7% (not including dividends).


For the last visual in the Power BI report, I wanted to show a histogram of the S&P 500 yearly returns. I based my approach on Power BI Histogram example using DAX, since Power BI does not have a standard histogram visual and I did not want to use a custom visual (I used the Power BI custom visuals from Pragmatic Works in the past).

Equity returns roughly follow a normal distribution or "bell curve", meaning that most values cluster near the central peak and values farther from the average are less common. Stock returns, however, have fat tails - occurrences at the extremes are far more common than a normal distribution would predict. The Great Depression (1931) and the Global Financial Crisis (2008) led to two of the largest stock market losses in the history of the S&P 500. With a loss between -20% and -30% this year, we are in the same category/bin as 1930, 1974 and 2002.
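The histogram in the report is built with DAX, but the same binning logic can be checked in pandas - a sketch reusing the yearly_return series from the snippet above (the 10% bin width is an assumption):

    import pandas as pd

    # Bin the yearly returns in 10% buckets and count the years per bucket
    bins = list(range(-60, 70, 10))
    counts = pd.cut(yearly_return.dropna(), bins=bins).value_counts().sort_index()
    print(counts)  # the (-30, -20] bucket holds 1930, 1974, 2002 and (so far) 2022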



Monday, November 14, 2022

Azure functions with Python: a getting started guide

In this post, we'll learn what Azure Functions are and how you can use VS Code to write your first Azure Function in Python.

I will show how you can create a simple Azure Function which retrieves data from Yahoo Finance (See Using Python and Pandas Datareader to retrieve financial data - part 3: Yahoo Finance) and saves the retrieved data in a CSV file in Azure blob storage. I will be using the Python v1 programming model for Azure Functions since v2 is still in preview.

Introduction to Serverless and Azure Functions

More traditional forms of cloud usage require you to provision virtual machines in the cloud, deploy your code to these VMs, manage resource usage and scaling, keep the OS and the underlying stack up to date, set up monitoring, perform backups, etc.

But if you just want to deploy a piece of code which needs to handle some kind of event, serverless compute might be the right choice for you. With serverless compute, you develop your application and deploy it to a serverless service like Azure Functions, without needing to worry about the underlying hosting architecture. Serverless compute is usually also cheaper than PaaS or IaaS hosting models.



Several versions of the Azure Functions runtime are available - see Languages by runtime version for an overview of which languages are supported in each runtime version. Python 3.7, 3.8 and 3.9 are supported by Azure Functions v2, v3 and v4.

How to create an Azure Function using Azure Portal

You can deploy an Azure Function from your local machine to Azure without leaving VS Code, but I would recommend doing it first using the Azure Portal to understand what VS Code is doing behind the scenes. 

To create your Azure Function, click the Create a resource link on the Azure Portal home page and next select Function App.



This brings us to the function creation screen, where we have to provide some configuration details before our function is created:

  • Subscription: Azure subscription in which you want to deploy your Azure Function App
  • Resource group: container that holds related resources for an Azure solution - these resources typically share the same development lifecycle, permissions and policies, ...
  • Function App Name
  • Runtime stack: Python
  • Version: choose 3.9 (the latest supported version) unless you have specific Python version dependencies.
  • Region: choose the same region as the other resources that you need to deploy, e.g., blob storage, Cosmos DB, etc.
  • Operating system: only Linux is supported
  • Plan type: leave it on Consumption (Serverless) unless you have very specific requirements, such as an execution time limit higher than 10 minutes (see Azure functions scale and hosting - function app timeout duration for more details)
In the next configuration screens, just leave the default options, but do make sure that you link an Application Insights resource to your Azure Function.

Setup your development environment

Things to set up beforehand:
  • An Azure subscription
  • A Python version supported by the Functions runtime (e.g., 3.9)
  • Visual Studio Code with the Azure Functions extension
  • Azure Functions Core Tools (to run and debug your functions locally)

Create your local Azure Function project in VS Code

Let's now see how you can create a local Azure Functions project in Python - open the Command Palette and choose Azure Functions: Create function. Next, select Python, the Python interpreter to create a virtual environment with, the template for the function (HTTP trigger) and the authorization level. Based on the provided information, Visual Studio Code will generate the different files in your project.

When you choose "HTTP trigger", the function will activate when the function app receives an HTTP call. The name that you specified for the function (jopxtickerdata) will be used to create a new directory which contains three files:
  • function.json - configuration file for our function
  • sample.dat - sample data to test the function
  • __init__.py  - main file with the code that our function runs
You can also add your own Python code files (e.g. jopxlib.py) that you can then import in __init__.py - see Azure Functions Python developer guide - Import behavior for more details.

In the root directory of your project you will also see other files and folders:
  • local.settings.json: stores app settings and connection strings when running locally
  • requirements.txt: list of Python packages the system installs when publishing to Azure
  • host.json: configuration options that affect all functions in a function app instance
  • .venv: folder which contains the Python virtual environment used for local development.
I slightly modified the generated HTTP trigger so that it accepts two query string parameters (name and startdate), added a reference to my own Python code (jopxlib) and called the writetickertoazblob function from within the main function.
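The modified trigger looks roughly like the sketch below (the parameter validation and response texts are my assumptions; jopxlib is the helper module mentioned above):

    import logging

    import azure.functions as func

    from . import jopxlib  # own helper module in the function directory


    def main(req: func.HttpRequest) -> func.HttpResponse:
        logging.info("Python HTTP trigger function processed a request.")

        # Read the two query string parameters
        name = req.params.get("name")            # ticker symbol, e.g. ^GSPC
        startdate = req.params.get("startdate")  # first date to retrieve, e.g. 2022-01-01

        if not name or not startdate:
            return func.HttpResponse(
                "Please pass a name and startdate in the query string.",
                status_code=400,
            )

        # Download the ticker data and write it to Azure blob storage
        jopxlib.writetickertoazblob(name, startdate)
        return func.HttpResponse(f"Ticker data for {name} saved to blob storage.")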


The code of writetickertoazblob is quite simple - it downloads data from Yahoo Finance into a dataframe, saves the dataframe to CSV and uploads it to Azure Blob Storage. In Azure Functions, application settings are exposed as environment variables during execution, so os.environ["AZURE_STORAGE_CONNECTION_STRING"] reads the application setting with the name AZURE_STORAGE_CONNECTION_STRING.
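A minimal sketch of what writetickertoazblob could look like (the container name and the exact Pandas Datareader call are assumptions - see the linked post for the data retrieval itself):

    import os

    import pandas_datareader.data as web
    from azure.storage.blob import BlobServiceClient


    def writetickertoazblob(name: str, startdate: str) -> None:
        # Download the ticker data from Yahoo Finance into a dataframe
        df = web.DataReader(name, "yahoo", start=startdate)

        # Application settings are exposed as environment variables at runtime
        conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
        service = BlobServiceClient.from_connection_string(conn_str)

        # Upload the dataframe as a CSV file to blob storage
        blob = service.get_blob_client(container="tickerdata", blob=f"{name}.csv")
        blob.upload_blob(df.to_csv(), overwrite=True)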


Wednesday, October 26, 2022

Creating a Dataverse instance in a specific Azure region using Power Apps Admin PowerShell module

One of the prerequisites if you need to setup Azure Synapse Link for Dataverse is that the Dataverse/Dynamics 365 environment must be created in the same Azure region as the Azure Data Lake storage account. 

So, if you know that you will be using Azure Synapse Link for Dataverse functionality and you want to avoid opening a support ticket for moving your environment (see Quick tip: finding the Azure data center for your Dynamics 365/Online environment for more details), you will need to create the Dataverse instance in the correct region. 

Unfortunately, this is not possible using the standard user interface, but luckily you can use the Power Apps admin PowerShell module (Microsoft.PowerApps.Administration.PowerShell).


I am not sure in which version the functionality to create a Dataverse environment in a specific Azure region was added, so if you already installed this PowerShell module previously, I would recommend updating to the latest version using the Update-Module command.


When using the New-AdminPowerAppEnvironment command, you can specify the location and region you need (for example -LocationName "europe" -RegionName "westeurope"). For a list of supported environment locations and regions, run Get-AdminPowerAppEnvironmentLocations.




Monday, October 24, 2022

CollabDays Belgium 2022 - a look back


CollabDays Belgium 2022 was simply amazing. It was really great to get together with some old friends and meet new people in the #Microsoft365 community. Unfortunately, there were so many excellent sessions that I could not attend them all, but I learned a lot in different sessions. The slide deck of my session Dataverse deep dive: watch out for the sharks is now available for download on GitHub.

If you are a Microsoft 365 professional working in Belgium, I highly recommend registering as a BIWUG member - it is amazing to see how this user group and community has managed to stay vibrant and relevant more than 17 years after the first BIWUG meeting in September 2005.

Tuesday, October 04, 2022

Dynamics 365 and Power Platform monthly reading list September 2022

Power Platform and Dynamics 365 release 2022 wave 2

Technical topics (Configuration, customization and extensibility)

Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users

Thursday, September 22, 2022

Quick tip: troubleshooting Jupyter notebook not starting correctly

The other day my Jupyter notebooks did not start correctly from Anaconda Navigator - luckily, the Jupyter docs have a section Jupyter - What to do when things go wrong. So I tried starting Jupyter from the Anaconda prompt, and it indeed threw an exception about an invalid path in the Jupyter config file. To find out where to look for the config file, check out Jupyter common directories and file locations.
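You can also list these locations from the command line with jupyter --paths, or from Python using the jupyter_core package - a small sketch:

    # Print the directories Jupyter searches for configuration files -
    # handy when hunting for the config file with the invalid path
    from jupyter_core.paths import jupyter_config_dir, jupyter_config_path

    print(jupyter_config_dir())   # primary config directory
    print(jupyter_config_path())  # full search path for config files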

Tuesday, September 20, 2022

Quick tip: updating Anaconda with command prompt

I recently started getting a popup to update Anaconda Navigator on my Windows machine, and I also received a warning when installing packages using the Anaconda prompt. I first started the update through the user interface, but it completely stalled and I had to do a hard reboot after almost two hours (when my patience ran out). Running the update from the Anaconda prompt worked without problems - next time I will use this method first. If conda is installed on your machine, you can update it to the most recent version and patches using conda update -n base -c defaults conda



Sunday, September 18, 2022

Speaking engagements in coming months

With all Covid bans lifted and summer holidays well over, the conference season kicks off. 



I will be speaking at a couple of events in the coming weeks and months:

  • Dataminds evening session: an in-person event on September 29th organized by dataMinds.be at the Inetum-Realdolmen offices in Kontich, together with Benni De Jagere. The first session is a little bit off the beaten track for data professionals: #dataviz for investors. The second session: the #PowerBI roadmap and an #AMA by Benni.
  • Collabdays Belgium 2022: a free community-driven event in Brussels, Belgium. The focus is Microsoft 365 with some Power Platform and Azure sprinkled on top. I am particularly excited to be speaking at this conference, which was born out of the SharePoint Saturday conferences that I helped organize many years ago. I will be delivering Dataverse Deep Dive: watch out for sharks.
  • Cloudbrew 2022: a two-day conference focusing on all things Azure on November 18-19 in Mechelen, Belgium. I will be delivering Using Python and Azure Cloud for trading and investing.

Thursday, September 01, 2022

Dataminds event - How data visualization can help you become a better long-term investor and Power BI roadmap session

I will be delivering a Dataminds session together with Benni De Jagere (@BenniDeJagere) on September 29th - for registration and info, see the Dataminds in-person event at Realdolmen.



Below is the summary of the session and a sneak peek at the opening slide (spoiler alert: investing like a sloth isn't that hard).

How data visualization can help you become a better long term investor

Summary: Whether you prefer passive investing or are more into actively managing your investments, data visualization can help you overcome the emotional biases associated with investing in the financial markets. In this session, we will explore how you can use Python, Azure and Power BI to become a better long term investor. We will start with gathering the relevant data using Python, process it with Synapse and Python, and finally finish off with building  data visualizations using Jupyter notebooks and Power BI. We will see how these visualizations can help you understand portfolio returns and risk, get a better understanding of market manias and crashes and visualize different investment strategies.

The aim of this session is not to tell you which stocks, bonds, funds or ETFs to buy, but to show you how data visualization might help you avoid falling for the irrational and manic-depressive character of Mr. Market or other pitfalls when starting to invest.

Attendees are invited to fill in this survey (max. 5 minutes) https://ecv.microsoft.com/pteCfVQ89I – survey results will be used in the session.

Friday, August 26, 2022

Detailed Power Platform request usage information in Power Platform Admin Center in preview

I wrote a post in January 2022 on the changes in Dynamics 365 and Power Platform request limits and allocations mentioning that detailed reporting was not available at that point in time. 

In the meantime, Microsoft has released Detailed Power Platform request usage information in the Power Platform admin center (Preview). For integration application users, you need to look at the Non-licensed User report, which shows request usage per day for non-licensed users and the total entitlement for the tenant. Unfortunately, for environments with a lot of integrations you might need to revert to some Excel skills or Power BI to make sense of the data (I am currently working in a tenant where we have 100K+ lines in the CSV export file).
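If you end up in Excel or Power BI territory anyway, pandas works just as well for a first look at the export - a sketch with assumed column names, so check them against the header row of your own CSV file:

    import pandas as pd

    # Load the Non-licensed User report export (file and column names are assumptions)
    usage = pd.read_csv("request_usage_export.csv", parse_dates=["Date"])

    # Total requests per integration user, highest consumers first
    per_user = usage.groupby("User")["Requests"].sum().sort_values(ascending=False)
    print(per_user.head(10))

    # Request usage per user per day, e.g. to spot runaway integrations
    per_user_day = usage.pivot_table(index="Date", columns="User",
                                     values="Requests", aggfunc="sum")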


For the moment there is no high usage enforcement for Power Platform request limits, but this might start at least six months after the reports have been made available. 




Wednesday, August 17, 2022

Dynamics 365 and Power Platform monthly reading list July 2022



Power Platform and Dynamics 365 release 2022 wave 2


Technical topics (Configuration, customization and extensibility)



Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users


Friday, August 12, 2022

Using the yFinance Python package to download financial data from Yahoo Finance - part 2

In a previous post I showed how you can download ticker data from Yahoo Finance using the yFinance Python package. I have now updated the Jupyter notebook code sample using yFinance to also show how you can retrieve additional information (sector, industry, trailing and forward earnings per share, etc.). The Ticker class in the yFinance library contains the info property, which returns a dictionary object (a collection of key-value pairs where each key is associated with a value) that allows you to access specific information about an asset.
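In its simplest form this looks like the sketch below (the exact keys in the dictionary vary per asset, so the ones shown are examples):

    import yfinance as yf

    apple = yf.Ticker("AAPL")
    info = apple.info  # dictionary with key-value pairs describing the asset

    # Using .get() avoids KeyErrors when a key is missing for an asset type
    print(info.get("sector"), "/", info.get("industry"))
    print("Trailing EPS:", info.get("trailingEps"), "- Forward EPS:", info.get("forwardEps"))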

Since I wanted to know how fast data retrieval would be, I also included the %%time magic command. Wall clock time measures how much time has passed; CPU time is how many (milli)seconds the CPU was busy.


Yahoo Finance contains data about stocks, Exchange Traded Funds (ETFs), mutual funds and stock market indices - the information that you can retrieve for each of these differs, so it is safest to check the quoteType in your code. The example below retrieves information about Apple stock, the iShares MSCI ACWI UCITS ETF (Acc) and a thematic mutual fund from KBC.

I also included a code snippet which shows how to retrieve this information for multiple assets and convert it into a Pandas dataframe.
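The snippet boils down to something like this - a sketch where the ETF and fund symbols are examples (IUSQ.DE for the iShares ACWI ETF on Xetra, VTSAX as a well-known mutual fund), not necessarily the ones used in the notebook:

    import pandas as pd
    import yfinance as yf

    # A stock, an ETF and a mutual fund (example symbols)
    symbols = ["AAPL", "IUSQ.DE", "VTSAX"]

    rows = []
    for symbol in symbols:
        info = yf.Ticker(symbol).info
        rows.append({
            "symbol": symbol,
            "quoteType": info.get("quoteType"),  # EQUITY, ETF or MUTUALFUND
            "name": info.get("longName"),
        })

    df = pd.DataFrame(rows)
    print(df)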


Tuesday, August 02, 2022

Explore and analyze stock ticker data in Azure data lake with Azure Synapse serverless SQL Pool

In this walkthrough, I will show how you can perform exploratory data analysis on stock market data using Azure Synapse serverless SQL pools. To simplify things I will just focus on daily quotes for the S&P 500. 

The S&P 500 (short for Standard & Poor's 500) tracks the performance of 500 large companies listed on exchanges in the United States. The composition of the S&P 500 is typically rebalanced four times per year. The S&P 500 is a capitalization-weighted index, meaning that stocks with a higher market capitalization have a bigger impact on changes in the index (see Top 10 S&P 500 stocks by index weight).


I downloaded all daily data for the S&P 500 stock market index (ticker symbol ^GSPC) from Yahoo Finance using the historical data tab, in CSV format. The CSV file contains the date, open, high, low, close, volume, dividends and stock splits for the S&P 500 from December 1927 (although the index in its current form was only created in 1957) until now (dividends and stock splits are not relevant here). I downloaded the file manually, but take a look at Using Python and Pandas Datareader to retrieve financial data part 3: Yahoo Finance and Using the yFinance Python package to download financial data from Yahoo Finance for ways to automate retrieving data from Yahoo Finance using Python.

Serverless SQL Pools in Azure Synapse

Serverless SQL pool is an auto-scale SQL query engine built into Azure Synapse - as the term serverless indicates, you don't need to worry about provisioning underlying hardware or software resources. Serverless SQL pool uses a pay-per-use model, so you are only charged for the queries you actually run. Like the Synapse dedicated SQL pool, serverless SQL pool distributes processing across multiple nodes using a scale-out architecture (check out the Microsoft research publication Polaris: the distributed SQL engine in Azure Synapse for an in-depth discussion).

Synapse serverless SQL enables you to query external data stored in Azure Storage (including Data Lake Gen1 and Data Lake Gen2), Cosmos DB and Dataverse. The data remains stored in Azure storage in a supported file format (CSV, JSON, Parquet or Delta) and query processing is handled by the Synapse SQL engine.

Walkthrough: analyzing S&P 500 data with Synapse serverless SQL

In this post I will not show you how to set up Azure Synapse - take a look at Quickstart: Create a Synapse Workspace for a detailed walkthrough - the Microsoft Learn learning paths which I added in the references are also quite useful.

In this post, I will primarily be using SQL to analyze the data, but this is a matter of preference (having a coding background, I prefer Python for exploratory data analysis).

After you have downloaded the data, you will need to upload the CSV file to the Azure Data Lake storage associated with your Synapse workspace (you can also use a different Azure storage account).


The OPENROWSET(BULK ...) function allows you to access files in Azure storage. The SP500.csv file has a header row specifying the different columns in use - it contains all daily ticker data since December 1927. I am using PARSER_VERSION 2.0 since it is more performant, but it has some limitations (see the Arguments section in Microsoft's OPENROWSET documentation) - also check out How do Synapse serverless SQL pools deal with different file schemas (or schema evolution) part 1 CSV for some interesting info on how schema changes are handled.
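For reference, this is roughly what the exploration looks like when wrapped in Python with pyodbc, so the result lands straight in a dataframe (the endpoint, storage URL and authentication mode are placeholders/assumptions):

    import pandas as pd
    import pyodbc

    # Serverless SQL endpoint of the Synapse workspace (placeholder values)
    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=<yoursynapse>-ondemand.sql.azuresynapse.net;"
        "Database=master;"
        "Authentication=ActiveDirectoryInteractive;"
    )

    query = """
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK 'https://<yourstorage>.dfs.core.windows.net/<container>/SP500.csv',
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0',
        HEADER_ROW = TRUE
    ) AS sp500
    """

    df = pd.read_sql(query, conn)
    print(df)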


If you will be using the data frequently, it might make more sense to use a CETAS process (CREATE EXTERNAL TABLE AS SELECT) to generate a dataset pointing to the data residing in the data lake, ready for querying. In the Synapse Studio data hub, you can simply right-click on a file and select the option to create an external table.

Next, select the database and the name of the table. You will need to create the external table by selecting "Use SQL Script", since you will need to adapt the script to skip the header row when reading data. For CSV files, you have the option to infer column names.
You will need to modify the generated script for creating the external file format so that it skips the header row. You are still able to modify the database in which you want to create the external table (1), and I added a line to indicate that the external file contains a header row, so reading data should start on row 2 (2). Once you understand the script, it is also possible to modify it to use wildcards, so that you can read from multiple files in multiple folders.
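The header-skip part of the modified script looks something like this (object names are placeholders; executed here through the pyodbc connection from the previous sketch):

    # External file format with FIRST_ROW = 2 so that reading starts below the header
    ddl = """
    CREATE EXTERNAL FILE FORMAT SP500CsvFormat
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (
            FIELD_TERMINATOR = ',',
            PARSER_VERSION = '2.0',
            FIRST_ROW = 2
        )
    )
    """
    conn.execute(ddl)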

Now let's try out some queries in Azure Synapse Studio:
  • Let's get all closing prices since 2020 ([date] > '2020-01-01') - you will notice that you can also visualize the data using some basic graphs.
  • Which were the years with the largest percentage difference between the highest and lowest close for the S&P 500? No surprises here - we find the Wall Street crash of 1929 followed by the Great Depression of the 1930s, the Financial Crisis of 2007-2008 and the Covid crash of 2020 in the top 10.
  • Which were the days with the largest difference between the day's closing price and the previous day's closing price - in other words, the days on which the market crashed? In this example I used the SQL LAG() function. Besides the 1930s, we also see Black Monday (1987) with a 20% decline in the S&P 500, which triggered a global sell-off (take a look at this Black Monday documentary (YouTube) with traders actually still working on the market floor).
  • You can also use common table expressions (CTEs) to work with temporary named result sets for more complex queries and data manipulations. In the example below, I want to find the 3-day trend for the S&P 500 (see Introduction to the SQL With clause if you are new to CTEs). The idea behind this query is to create a three-day trend variable for any given row. If the closing price on a day is greater than the closing price on the previous day, we assign that day +1; otherwise, that day gets assigned -1 (minx_close columns). If the majority of the previous 3 days are positive, the trend is positive; otherwise the trend is negative. (Example taken from Coursera: Introduction to Trading, Machine Learning & GCP.) A pandas equivalent of this logic is shown in the sketch right after this list.
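Since I prefer Python for EDA anyway, here is the same LAG/trend logic expressed in pandas - a sketch assuming the SP500 CSV from earlier with Date and Close columns:

    import pandas as pd

    df = pd.read_csv("SP500.csv", parse_dates=["Date"]).sort_values("Date")

    # LAG() equivalent: previous day's close via shift(1)
    df["prev_close"] = df["Close"].shift(1)
    df["daily_pct"] = (df["Close"] / df["prev_close"] - 1) * 100

    # +1 for an up day, -1 for a down day; the sum over the previous
    # three days gives the three-day trend (positive sum = positive trend)
    df["direction"] = (df["Close"] > df["prev_close"]).map({True: 1, False: -1})
    df["trend_3d"] = df["direction"].rolling(3).sum().shift(1)

    # The biggest one-day declines - Black Monday 1987 shows up near the top
    print(df.nsmallest(10, "daily_pct")[["Date", "Close", "daily_pct"]])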
As seen in this post, Synapse serverless SQL is quite useful for data professionals in different situations: data engineers can use it to explore data in the data lake and optimize data transformations, while data scientists and data analysts can use it to quickly carry out exploratory data analysis (EDA). Take a look at Build data analytics solutions using Azure Synapse serverless SQL pools (Microsoft Learn) if you want to learn more. In an upcoming post, I will show how easy it is to consume the data from Azure Synapse SQL Serverless in Power BI.

Friday, July 29, 2022

Thoughts about the Dynamics 365 solution architect role

I have been working as a solution architect for over 15 years now, the last 7 of them as a Dynamics 365 solution architect (from when we still referred to it as simply Dynamics CRM), and I really like it. But when I tell people that I work as a Dynamics 365 solution architect (even people who work in IT), I immediately get the impression that they don't have a clue what the role entails or what I do (and don't get me started about recruiters).



In this post I will try to shed some light on why it is so hard to clearly define the role and responsibilities of a Dynamics 365 solution architect. I will also explain what Microsoft thinks about the Dynamics 365 solution architect role and why what Microsoft prescribes is not necessarily what applies to the project that you will be working on as a solution architect. I will also list a number of lessons learned and learning resources that I have used over the last couple of years.

Context matters

How we approach architecture defines what architecture, in effect, is, at least for our system. That is, no matter what we say architecture is (for), architecture is (an outcome of) what we do. (Source: What is Software Architecture and related concerns)

For the last couple of years, I have mainly worked as a solution architect in a large team at Toyota Motor Europe (TME), so I adopted the TME way of working and the standards and boundaries set for solution architects within Toyota (I will be writing a separate blog post on lessons learned from the Toyota way).

So depending on the customer you work for, the size of the Dynamics 365 project and the associated complexity, the number of consultants working on the project, the overall maturity of the project team and the project methodology (agile vs waterfall), the tasks you take on as a Dynamics 365 solution architect and your role and responsibilities might differ quite a lot.

Whether you are working full-time on a single project (embedded architect) or supporting multiple teams (wandering/travelling architect) also makes a big difference.

Architecture is about significant design decisions


I think that if you look at the role of the Dynamics 365 solution architect from a very minimalistic viewpoint, there are three key activities and deliverables:
  • The high-level architecture visualization - remember, a picture says more than a thousand words - definitely take a look at good models like C4 from Simon Brown
  • Maintaining a list of architecture design decisions together with the project team (take a look at  Architectural Decision Records | adr.github.io for some more info)
  • Coaching and mentoring the team to create an architecture awareness (understand design trade-offs, keep track of non-functional requirements, identify architecture risks and counter measures, understanding and handling technical debt)
A solution architect has to keep the long-term use of the solution in mind and build scalability and adaptability into the solution for the future.

Success By Design and FastTrack

I learned a lot from the joint design workshops and conversations with the Microsoft FastTrack solution architects assigned to some of my projects. For those of you unfamiliar with FastTrack for Dynamics 365: this is a Microsoft implementation support program for large Dynamics 365 implementations - it covers Customer Engagement (Sales, Customer Service, Field Service, Project Operations and Marketing) as well as Dynamics 365 Finance, Commerce, Supply Chain Management and Human Resources.

The full FastTrack program is only available to a select group of customers (a threshold based on Dynamics 365 annual licensing revenue, or internal approval by the Microsoft account team) and Microsoft partners (only partners with gold or silver status in the Cloud Business Applications competency). When you participate in the FastTrack program, you are assigned a Microsoft FastTrack solution architect whom you can use as a sounding board for design choices and from whom you receive guidance on best practices and on planning for a successful roll-out.

Even if you can't join the Dynamics 365 FastTrack program, it is still useful to take a look at the Success By Design resources - especially the Dynamics 365 Implementation Guide (recently revised), which is a very extensive reference.

Also take a look at Dynamics 365 FastTrack Architecture Insights - recordings created and shared by solution architects from the Dynamics 365 engineering team.

Microsoft about the Dynamics 365 solution architect

Two years ago I took a Microsoft certification exam targeted at solution architects in the Dynamics 365 space. I really enjoyed taking the exam MB-600: Microsoft Dynamics 365/Power Platform Solution Architect (now replaced by PL-600: Microsoft Power Platform Solution Architect), since it gave great insights into how Microsoft looks at the role of a Dynamics 365 solution architect.

As preparation for this exam, I followed the Architect solutions for Dynamics 365 and Microsoft Power Platform learning path (screenshot below taken from this learning path), and together with the Microsoft FastTrack boot camp for Dynamics 365 CE solution architects, this shaped a lot of my thinking about the role of a Dynamics 365 solution architect.


A key task of the solution architect is solution envisioning. Traditionally, a development-focused architect would start with custom development and low-level Azure services. A business-applications-focused solution architect will instead start with Dynamics 365 and the Power Platform and then use third-party components, custom development and Microsoft Azure to address any gaps.

In-depth knowledge of the products in the Microsoft BizApps stack is therefore a must-have, and hands-on experience with one or more products helps you become a better architect. If you look at the videos from Gus Gonzalez on How do you become a solution architect and 3 rules for solution architects and consultants, or listen to some episodes of the CRM MVP Podcast, you will also get a feel for what other solution architects think about their role.

Microsoft seems to assume that a solution architect is embedded in the team and makes no mention of the notion of the travelling architect. The idea behind a travelling architect is that he/she supports multiple projects and helps teams who lack architectural skills by coaching them, tackling specific architecture issues himself/herself, etc. If you are working for a partner organization doing multiple smaller projects in parallel, the role of a travelling architect makes more sense.

Working with other architects 

Depending on their scope of work, different types of architects work at different levels of abstraction - so far I have worked with enterprise architects, business architects and technical architects. In my current project, I work very closely with EA to make sure that the solution blueprint aligns with the architecture principles defined by EA.

A solution architect is not the same as an enterprise architect (although you might combine both roles) - an enterprise architect has a more strategic role and handles the initiatives required within a digital transformation at the enterprise level to ensure that the business achieves its goals (business intent). It is, however, useful to have a good understanding of enterprise architecture as a function and an affinity with EA frameworks.

Get involved during the initial definition of a project/program

When I took a TOGAF training course years ago, I remember an architect being called a "best effort architect" when he is not involved in the early stages of a program or project. If you are working on the customer side and you are not involved during the initial business case creation or the initial scope definition of the project or program, you are a "best effort architect": you miss a lot of the context, and a large number of important decisions have already been made, so you can only give it your best effort.

The same goes for architects working at a consultancy or implementation partner: if you are not involved during the initial presales phase, you are constrained by (implicit) design decisions which were already made during the presales phase while answering the RFP or presenting demo cases.

Learn by doing
So if you are a Dynamics 365 solution consultant or technical consultant and you want to take the leap, there are a lot of resources to ease you into the role, but nothing beats learning by doing. Also keep in mind that for a solution architect, lifelong learning is a prerequisite for success (as it probably is for other roles too).



If you are lucky, you might find someone within the company you work for who can coach you as you grow into the role, but you can also reach out to the broader Dynamics 365 community for questions or feedback.

