Monday, December 15, 2014

Introducing Azure Stream Analytics

Azure Stream Analytics, which is currently in preview, is a fully managed real-time stream analytics service that aims to provide highly resilient, low-latency, and scalable complex event processing of streaming data for scenarios such as the Internet of Things, command and control (of devices), and real-time Business Intelligence.

Although it might look similar to Amazon Kinesis, it distinguishes itself by aiming to increase developer productivity: you author streaming jobs in a SQL-like language that specifies the necessary transformations, and it provides a range of operators for defining time-based operations such as windowed aggregations (check out the Stream Analytics Query Language Reference for more information). Listed below is an example taken from the documentation which finds all toll booths that have served more than three vehicles in the last five minutes (see Sliding Window – it slides by an epsilon and produces output at the occurrence of an event).

SELECT DateAdd(minute,-5,System.TimeStamp) AS WinStartTime, System.TimeStamp AS WinEndTime, TollId, COUNT(*) 
FROM Input TIMESTAMP BY EntryTime
GROUP BY TollId, SlidingWindow(minute, 5)
HAVING COUNT(*) > 3

This SQL-like language allows non-developers to build stream processing solutions through the Azure Portal: it makes it easy to filter, project, aggregate and join streams, combine static data (master data) with streaming data, and detect patterns within the data streams without developer intervention.

 

Azure Stream Analytics leverages cloud elasticity to scale the number of resources up or down on demand, providing a distributed, scale-out architecture with very low startup costs. You only pay for the resources you use and can add resources as needed. Pricing is calculated based on the volume of data processed by the streaming job (in GB) and the number of Streaming Units that you are using. Streaming Units provide the scale-out mechanism for Azure Stream Analytics, each offering a maximum throughput of 1 MB/s. Pricing starts as low as €0.0004/GB and €0.012/hour per streaming unit (roughly equivalent to less than €10/month). It also integrates seamlessly with other services such as Azure Event Hubs, Azure Machine Learning, Azure Storage and Azure SQL Database.
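To put the "less than €10/month" figure in perspective, here is a quick back-of-the-envelope calculation in Python (a rough sketch based on the preview prices quoted above; the 100 GB monthly data volume is a hypothetical assumption):

# Rough monthly cost estimate for a single streaming unit (preview pricing quoted above)
hours_per_month = 24 * 31                    # 744 hours in a 31-day month
unit_cost = 0.012 * hours_per_month          # ~8.93 EUR for one streaming unit
data_gb = 100                                # hypothetical monthly data volume in GB
data_cost = 0.0004 * data_gb                 # 0.04 EUR for the processed data
print(round(unit_cost + data_cost, 2))       # ~8.97 EUR, i.e. below 10 EUR/month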

References



Thursday, December 11, 2014

SharePoint deep dive exploration: SharePoint alerting

This is the second in a series of blog posts on SharePoint Server 2013, in which we will explore how e-mail alerting works in SharePoint 2013. For part 1, take a look at SharePoint deep dive exploration: looking into the SharePoint UserInfo table.

If you need to know more about how alerts work at the lowest level, take a look at the SharePoint 2003 Database tables documentation – for alerts this documentation still seems to be valid. SharePoint stores the list of events for which users have requested alerts in the EventCache table – in comparison with SharePoint 2003 there are some extra fields available (marked in bold). For some of the fields I did not find a description; these are marked with a question mark in the table below.

The other tables which are manipulated by the SharePoint alert framework are EventLog, ImmedSubscriptions, SchedSubscriptions and EventSubsMatches (for an in-depth discussion also take a look at Working with search alerts in SharePoint 2010). Every event is recorded in this table, but since the EventType and EventData columns contain the most data, these are only filled in when the list has at least one subscription.

So how does this work? There is a SharePoint timer job called the “Immediate Alerts” job which is scheduled to run every 5 minutes. It picks up the necessary event information and processes it in batches of 10,000 – if you see issues with alerts not being sent out, I recommend you take a look at SharePoint Scheduled Alerts Deconstructed.

Column Name – Description
EventTime – Time when the event record was inserted into the database
SiteId – ID of the site, available from the AllSites table
WebId – ID of the web, available from the AllWebs table
ListId – ID of the list in which the monitored item appears
ItemId – ID of the item that raised the event
DocId – ID of the document that raised the event
Guid0 – ?
Int0 – ?
Int1 – ?
ContentTypeId – ?
ItemName – Full name of the item
ItemFullUrl – Full path of the item
EventType – ItemAdded (1), ItemModified (2), ItemDeleted (4), DiscussionAdded (16), DiscussionModified (32), DiscussionDeleted (64), DiscussionClosed (128), DiscussionActivated (256), …
ObjectType –
ModifiedBy – User name of the person who raised the event
TimeLastModified – Time when the event occurred
EventData – The binary large object (BLOB) containing all of the field changes with the old and new values for an item
ACL – The ACL for the item at the time it is edited
DocClientId –
CorrelationId –

The reason why I started looking into these tables was that I got feedback from a client that all e-mail alerts being sent out contained the wrong link after we migrated their environment from SharePoint 2007 to 2013. One of the first things I did was sit next to a user who was adding documents in SharePoint, and then I noticed something strange: the user uploaded a document and, when they needed to fill in extra metadata, they immediately changed the name of the document.

After looking into how alerting works I still did not have an explanation for why the links were sent out correctly in 2007 – because this should have failed as well. So I used this PowerShell script to create an export of all the e-mail alerts/subscriptions that users had in SharePoint, and I noticed that most of the alerts were on just a couple of libraries – and then I found it.

In SharePoint 2007, these libraries had “require check out” enabled by default – this means that when the user uploaded and renamed the document, it was not yet visible to other users and the alert was not sent out. If check-out is not required, the files are immediately visible and the “New Item Added” immediate alert is fired – this was the behavior they were seeing in 2013.

So “require check out” is an interesting workaround to prevent a file from being visible before it is explicitly checked in. Since they were changing the file properties (and even the filename) before the file became visible to other users, the New Item alerts would not trigger and users would only be notified through the “Changed Item” alert once the file was checked in.

The reason why we deactivated “require check out” was that it would conflict with co-authoring, but apparently they never used that feature on the specific libraries for which these alerts were set. So the moral of the story: don’t just activate or change a specific functionality because it is available in a new version, but first look at how people are actually using it.

References:

 

BIWUG on blueprint for large scale SharePoint projects and display templates

 

On the 16th of December BIWUG (www.biwug.be) is organizing its next session – don’t forget to register for BIWUG1612 because there are some great sessions planned.

SharePoint Factory: a blueprint for large scale SharePoint projects (Speaker: Lana Khoury, Senior EIM Consultant at CGI Belgium, responsible for CGI Belgium’s Microsoft Competency Centre and the Digital Transformation Portfolio)

Large Notes-to-SharePoint transformations require a standardized approach in development and project management in order to assure delivery on time and with quality. The SharePoint Factory has been developed to allow parallel development of applications and to support all stages of the development process through standardized quality gates, test procedures and templates, for example requirements analysis templates. Essentially, the SharePoint Factory can be compared to an assembly line in the automotive industry. This approach is combined with a SharePoint PM as a Service offering, which is a blueprint for the management of large scale SharePoint projects and provides a specific PM process with SharePoint-centric artefacts, checklists and documents. The approach has been developed within a 6,500 person-day project in Germany and has already been published in the German .net Magazin, SharePoint Kompendium and the Dutch DIWUG Magazine.

Take your display template skills to the next level (Speaker: Elio Struyf, senior SharePoint consultant at Ventigrate -  http://www.eliostruyf.com/)

Once you know how search display templates work and how they can be created, it is rather easy to enhance the overall experience of your sites compared with previous versions of SharePoint. In this session I will take you to the next level of display templates, where you will learn to add grouping, sorting, loading more results, and more. This session focuses on people who already have a basic understanding of what search display templates are and how they can be created.

18:00 - 18:30 ... Welcome and snack

18:30 - 19:30 ... SharePoint factory: a blueprint for large scale SharePoint projects (Speaker: Lana Khoury)

19:30 - 19:45 ... Break

19:45 - 20:45 ... Take your display template skills to the next level (Speaker: Elio Struyf)

20:45 - …      ... SharePint!

 


Tuesday, November 25, 2014

Get early access to new features in Office 365 and provide feedback with Uservoice

Microsoft is offering a new First Release program. If you opt in, you get to test new features for Office 365, SharePoint Online, and Exchange Online a couple of weeks before they roll out to everyone else. To activate it, go to Office 365 Admin Center > Service Settings > Updates. You will get a warning stating that activation of new features might take up to 24 hours to complete – so be patient.



Over the last couple of months a number of interesting new functional modules such as Delve, Yammer Groups and the new App Launcher have been pre-released on Office 365, some of which might only become visible after you have activated First Release. Remember that there is also an Office 365 for business public roadmap available (at office.com/roadmap) where you can see which functionality is being rolled out and which is under development. For more information check out the links below.

Also remember that you can always use the Office Developer Platform UserVoice (http://officespdev.uservoice.com/) to give feedback and request changes. You can submit your feedback for a specific change and encourage others you know to support these changes by voting for them. If you want to give feedback with regard to InfoPath, there is a Microsoft Office Forms vNext UserVoice (http://officeforms.uservoice.com/) as well.

References

Tuesday, November 18, 2014

Understanding Azure Event Hubs–ingesting data at scale

Azure Event Hubs is an extension of the existing Azure Service Bus which provides hyper-scalable stream ingestion capabilities. It allows different producers (devices & sensors – possibly in the tens of thousands) to send continuous streams of data without interruption. There are a number of scenarios in which you typically see this kind of streaming data from different sensors: future-oriented scenarios such as connected cars and smart cities, but also more common scenarios such as application telemetry or industrial automation.

Event Hubs scaling is defined by Throughput Units (TUs), which are essentially a pre-allocation of resources. A single TU is able to handle up to 1 MB/s or 1,000 events per second for writes and 2 MB/s for read operations. Load in the Event Hub is spread through the creation of partitions; these partitions allow for parallel processing on both the producer and the consumer side. Next to support for common messaging scenarios such as competing consumers, it provides data retention policies and up to 84 GB of event storage per day. The current release supports up to 32 partitions, but you can log a support call to increase this to up to 1,000 partitions. Since a partition is allocated at most 1 TU, this would allow for up to 1 GB/s of data ingest per Event Hub. Messages can be sent to an Event Hub publisher endpoint via HTTPS or AMQP 1.0; consumers retrieve messages using AMQP 1.0.
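As a minimal sketch of the HTTPS path, the Python snippet below sends a single JSON event to an Event Hub using a Shared Access Signature token. The namespace, event hub name, policy name and key are placeholders you would replace with your own values:

import base64, hashlib, hmac, json, time, urllib.parse
import requests  # third-party HTTP library (pip install requests)

def make_sas_token(resource_uri, policy_name, policy_key, ttl_seconds=3600):
    # Builds a Service Bus / Event Hubs Shared Access Signature token
    expiry = str(int(time.time()) + ttl_seconds)
    string_to_sign = urllib.parse.quote_plus(resource_uri) + '\n' + expiry
    signature = base64.b64encode(
        hmac.new(policy_key.encode('utf-8'),
                 string_to_sign.encode('utf-8'),
                 hashlib.sha256).digest()).decode('utf-8')
    return ('SharedAccessSignature sr={}&sig={}&se={}&skn={}'
            .format(urllib.parse.quote_plus(resource_uri),
                    urllib.parse.quote_plus(signature),
                    expiry, policy_name))

# Placeholders - replace with your own namespace, event hub and send policy
uri = 'https://mynamespace.servicebus.windows.net/myeventhub'
token = make_sas_token(uri, 'SendPolicy', '<shared-access-key>')

event = json.dumps({'deviceId': 'sensor-42', 'temperature': 21.7})
response = requests.post(uri + '/messages',
                         headers={'Authorization': token,
                                  'Content-Type': 'application/json'},
                         data=event)
print(response.status_code)  # 201 indicates the event was accepted

In a production scenario you would typically use the .NET EventHubClient or an AMQP 1.0 client instead of raw HTTPS calls, but the REST endpoint is handy for quick tests from constrained devices.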

Building such an application architecture yourself is quite challenging, and Event Hubs allows you to leverage the elasticity of the cloud and a pay-per-use model to get started quite rapidly. Whereas current scaling of these types of systems is oriented at tens of thousands of units, expectations are that this number will increase quite rapidly: Gartner expects the number of installed IoT units to grow to 26 billion by 2020, and other estimates are even pointing at 40 billion IoT units (Internet of Things by the Numbers: estimates and forecasts).

References:

Monday, November 17, 2014

Webinar: What’s new on the Microsoft Azure Data Platform

On Thursday the 20th of November I will be delivering a webinar on the new capabilities in the Microsoft Azure Data Platform. With the recent addition of three new services - Azure Stream Analytics, Azure Data Factory and Azure Event Hubs - Microsoft is making progress in building the best cloud platform for big data solutions as well as for enabling the Internet of Things (IoT). These additions will allow you to process, manage and orchestrate data from Internet of Things (IoT) devices and sensors and turn this data into valuable insights for your business.

The above-mentioned new services extend Microsoft's existing big data offering based on HDInsight and Azure Machine Learning. HDInsight is Microsoft's offering of Hadoop functionality on Microsoft Azure; it simplifies the setup and configuration of Hadoop clusters by offering them as an elastic service. Azure Machine Learning is a new Microsoft Azure-based tool that helps organizations build predictive models using built-in machine learning algorithms, all from a web console.

In this webinar I will show what the key capabilities of these different components are, how they fit together and how you can leverage them in your own solutions.

Register for this free webinar “What’s new on the Microsoft Azure Data Platform” and get up to speed in less than one hour.

Wednesday, November 12, 2014

BIWUG on apps for SharePoint Server 2010 and data driven collaboration

 

On the 26th of November BIWUG is organizing our next session – don’t forget to register for BIWUG2611 because there are some great sessions planned.

Writing apps on SharePoint Server 2010 (Speaker: Akshay Koul, SharePoint Coordinator at Self, http://www.akshaykoul.com)

The session is geared towards developers/advanced users and explains how you can write enterprise-level applications on SharePoint 2010 without any server-side code. We will go through real-life applications and discuss the mechanisms used, the provisioning process, debugging techniques as well as best practices. The applications written are fully compatible with Office 365/SharePoint Online and SharePoint Server 2013.

Preparing for the upcoming (r)evolution from User Adoption to Data-Driven Collaboration (Speaker: Peter Van Hees, MVP Office 365/Collaboration architect, http://petervanhees.com )

As consultants we (try to) listen to our customers, (try to) address the requirements ... and finally (try to) deploy the solution. This seems like an easy job, but in reality collaboration projects - and especially SharePoint or Yammer implementations - are a little more challenging. The fast adoption of cloud computing has introduced a new currency for license-based software: User Engagement. If you can’t engage your users, your revenue stream will start to spiral downwards. It should be obvious that Office 365 (and all of its individual components) is not exempt. We all need to focus on the post-deployment phase!

This story bears its roots in my hands-on experience while trying to launch Yammer initiatives. It seems that everyone agrees that Yammer is a wonderful and viral service ... yet, the conversations seem to flatline in most organizations. We will review how you should (already) be addressing User Adoption now; but, more importantly, we will spend more time looking into the stars … a future where Data-Driven Collaboration will take User Engagement to the next level. This isn't a story about Delve. It's about ensuring you integrate data in all your projects to prepare for the future. The age of smart software …

18:00 - 18:30 ... Welcome and snack

18:30 - 19:30 ... Writing apps on SharePoint Server 2010 (Speaker: Akshay Koul)

19:30 - 19:45 ... Break

19:45 - 20:45 ... Preparing for the upcoming (r)evolution from User Adoption to Data-Driven Collaboration (Speaker: Peter Van Hees)

20:45 - …      ... SharePint!

Tuesday, October 14, 2014

Getting Virtualbox to work on Windows 8.1

Quick tip for those of you who want to try to install VirtualBox on Windows 8.1 – use one of the older versions: VirtualBox-4.3.12-93733-Win.exe worked for me (download location: Download VirtualBox Old builds). More recent versions seem to crash when you try to start a virtual image – see the screenshot below.




If you are already using Hyper-V you will also need to create a dual boot entry, since VirtualBox is not compatible with Hyper-V. You can do this using the commands listed below from an administrative command prompt (as outlined in this blog post from Scott Hanselman – Switch easily between VirtualBox and Hyper-V with a BCDEdit boot entry in Windows 8.1).
C:\>bcdedit /copy {current} /d "No Hyper-V" 
The entry was successfully copied to {ff-23-113-824e-5c5144ea}. 

C:\>bcdedit /set {ff-23-113-824e-5c5144ea} hypervisorlaunchtype off 
The operation completed successfully.

When booting you will be provided with an option to boot with Hyper-V support or without Hyper-V support.

