Thursday, February 16, 2017

In Browser Query Editor for Azure SQL

Microsoft in the News:

In July of 2009, Google announced that it was developing a partially open source OS that would operate primarily in the cloud. That OS is, of course, Chrome OS, and it is now the second most popular OS on the market, having overtaken Mac OS. With the onslaught of inexpensive Chromebooks, Microsoft is starting to pay attention.

Last year, Chromebooks enjoyed a 20% growth rate while the PC industry was shrinking. Since competition seems to bring out the best in companies, Microsoft intends to release Windows 10 Cloud, which will compete directly with Chrome OS.

Windows 10 Cloud (“W10C”) will be a scaled down version of Microsoft’s flagship Windows 10.  The intention is that W10C will only run Universal Windows apps from the Windows Store.

There hasn’t been an official announcement as to when W10C will be released; however, there have been indications that it could be as early as April of this year.

In Browser Query Editor for Azure SQL:

Microsoft seems to be setting a steady pace for constantly improving all things Azure. The latest improvement to Azure SQL is the inclusion of an in-browser query tool. This new Query Editor has just been released for public preview in Azure.

Until now, to access and query your Azure database, you had to switch to a separate tool. This can be quite cumbersome when working for clients; I sometimes have to navigate multiple connections and remote desktop sessions, with layers of security, just to run a simple query. Now, you can use the Azure Portal to write queries, execute T-SQL scripts, and manage the results of your query.
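
For context, here is a minimal sketch of the kind of external connection a quick check used to require, using Python and the pyodbc driver. The server, database, and credentials below are placeholders, not real values.

```python
import pyodbc

# Placeholder connection details for an Azure SQL database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:myserver.database.windows.net,1433;"
    "DATABASE=mydatabase;UID=myuser;PWD=mypassword;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30"
)

# The kind of quick check you can now run directly in the browser instead.
cursor = conn.cursor()
cursor.execute(
    "SELECT TOP 10 name, create_date FROM sys.tables ORDER BY create_date DESC"
)
for row in cursor.fetchall():
    print(row.name, row.create_date)

conn.close()
```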

If you are looking to jump in right away, be forewarned that this feature is just being previewed right now, so you will be asked to accept the Preview Terms prior to being allowed to use this editor.

To get started, once you have navigated to your SQL database, click the Tools menu, followed by Query Editor (preview).  Or, if you have a SQL Data Warehouse, there is a Query editor (preview) button in the Common Tasks screen.


The Query Editor is similar to SQL Server Management Studio for writing queries, but it is far more basic and offers less functionality in its graphical user interface.

This new functionality in Azure will make quick investigations and work on clients’ systems much easier. I am excited to see how this feature evolves.

Tuesday, January 31, 2017

Basic Security is a good reason to start with the cloud.

Microsoft in the news:

Artificial intelligence has been, and will continue to be, a news headline grabber for a long time to come. Over the years, AI has progressed from supervised learning platforms to the current cutting edge, reinforcement learning. The AlphaGo system that beat the top-ranked Go player is the most public example of how powerful a reinforcement-learning platform can be.

In order to leapfrog Google and Facebook, Microsoft announced on January 13 that it had purchased Maluuba. Maluuba’s deep learning AI is able to read and comprehend text with near-human capability. We can look forward to putting this AI and others to work for us via the Azure cloud.

Another AI from Microsoft uses a subset of reinforcement learning known as “contextual bandits”. A year ago, Microsoft put this AI to work on its MSN.com website, where it helped pick personalized headlines for MSN users. The result was a 25% increase in click-throughs. This successful AI tool was turned into an open source service that can be deployed on Azure, as well as other platforms. This system allows you to answer much more detailed questions than the current A/B testing model.

To access this Multiworld Testing Decision Service, follow this link: https://github.com/Microsoft/mwt-ds

Basic Security is a good reason to start with the cloud.

If you haven’t heard news reports of “ransomware”, you will soon. According to the FBI, ransomware attacks are becoming the attack du jour for cyber criminals. In the first three months of 2016, over $200 million was paid to cyber criminals because of ransomware (up from $25 million for all of 2015), and that is only what was reported. There is little doubt that many individuals and businesses have made payments without referring the matter to authorities.

Ransomware is a piece of malicious code that encrypts your data. Once it is encrypted, you are sent a message saying that if you pay the ransom, you will receive the encryption key so you can regain access to your data. The payment is made in Bitcoin, so there is no way to follow the money to the recipient. In some cases, those who have paid were hit up for additional payments. Additionally, these criminals don’t always get their encryption right and end up making your data unrecoverable. So, you may pay and get the key only to find out that it doesn’t work and not even the crooks can help you.

Becoming a cyber criminal in this industry is easy, and requires almost no technical talent.  In the underground market of crimeware-as-a-service, you can start your own ransomware campaign by simply providing a Bitcoin address for victims to send their money to.  So, this situation is going to get a lot worse long before it gets better!  This is definitely a growth industry right now.

Over this past year, hospitals seem to have been a favorite target. With people’s lives at stake, they need a quick solution, so they have little choice but to pay the ransom. Several hospitals in the US fell victim to this crime in 2016. Can you imagine being the IT professional who has to tell the CEO that ALL the company’s data is encrypted and inaccessible?

Educating your staff about spear phishing (the same hack that caught the Democrats flat-footed during the election) and other forms of attack, like finding a random USB memory stick outside the front door of the office, is a great start. However, a persistent hacker will find a way, and education will not be 100% effective. So, you need to have a “back-up plan” (double entendre intended).

Not only do you have to back up your data regularly, but it must also be kept in a secure location that is not connected to the original data. That is where the Azure cloud can help. Both Azure Site Recovery and Azure Backup can eliminate the risk of having to pay a ransom to recover your data. Backups in Azure are inherently safer because attackers need access not only to your environment but also to a secure backup vault on Azure before they are able to effectively attack your data.

Microsoft is well aware of the increasing threat of ransomware and has recently implemented additional security to address this and other threats. To begin with, since encryption creates a new encrypted file and then deletes the original, Microsoft now requires a Security PIN in order to delete backup data. In addition, Azure retains deleted backup data for fourteen days, ensuring you have multiple recovery points.


In this game of leapfrog, as security professionals and criminals continually try to outdo each other, you really need a full-time team of professionals working around the clock to protect your data. By taking advantage of Azure cloud services, that is exactly what you are getting.

Friday, January 6, 2017

Event Hubs


I am sure you have heard by now of the IoT, or the Internet of Things. With a plethora of different IoT devices sending data to a variety of applications, did you ever wonder how that data is managed? Event Hubs perform this task. They take in data from everything, from telemetry and mobile app data to gaming data, and organize it so it is easier to consume and use. An Event Hub does this by managing the flow of data as it is received.

Let’s begin with an example. We have sensors at bank branches that log when customers come and go from the branch. These sensors are the IoT devices, and they are called event publishers. The publishers send the events, entries into and exits from the branch, to the hub, where they are organized and made available for applications to use. These applications are referred to as consumers. Consumers can be a variety of different types of applications that often need different parts of the data.

In this example, all of the data comes from a single type of publisher, the sensors, but often there is more than one type of data. It could be location data, such as how close a customer lives to the branch, or basic data, such as whether the customer is currently in the branch, along with many other distinct types. An Event Hub can take in all of this data as fast as it arrives; unfortunately, in this kind of environment there are so many different types of data that, at large volume, it is essentially unusable in a traditional way. This is where the Event Hub comes in: it sorts and organizes the data in a way that the consumer can use.
How the data will be used determines how it should be organized, and Event Hubs use partitions to do this.

The formal name for how an Event Hub organizes the data is the competing consumer model. This means that multiple consumers can receive data from the same channel at the same time, which optimizes for volume and scale. As you can imagine, data streams in at a highly variable rate, in a wide variety of forms, and over a variety of protocols. As the data enters the workflow, the Event Hub uses partitions to segment it. Events are retained for a retention period that is configured at the Event Hub level; they cannot be deleted explicitly, but they expire on a time basis.

The number of partitions is set when the Event Hub is created. Partitions are the key to how the data is organized for any downstream workflow management and should be based on the degree of downstream parallelism you will require from your consumers. A good rule is to make the number of partitions equal to the expected number of concurrent consumers.
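
As a rough illustration of the publisher side described above, here is a minimal sketch using the azure-eventhub Python package (a newer SDK than the tooling available when this was written). The connection string, hub name, and branch identifier are placeholder assumptions.

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder namespace connection string and hub name.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENT_HUB_NAME = "branch-traffic"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONN_STR, eventhub_name=EVENT_HUB_NAME
)

# Using the branch ID as the partition key keeps all events for one branch
# on the same partition, so a consumer sees that branch's entries and exits in order.
batch = producer.create_batch(partition_key="branch-001")
batch.add(EventData('{"sensor": "front-door", "event": "entry"}'))
batch.add(EventData('{"sensor": "front-door", "event": "exit"}'))

producer.send_batch(batch)
producer.close()
```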

Any entity, regardless of type, that reads event data is called an event consumer. All consumers read the event stream through partitions within a consumer group. Similar to the Event Hub itself, the consumer group partitions the data for each concurrent consumer. The consumer connects to a session in which events are delivered as they become available, so it does not need to poll to determine whether data is available. The consumer group is what controls both the subscription to the Event Hub and the view of it. Each group provides a separate view of the event data stream to each of its consumers. This individualization allows consumers to process the data at a rate that is appropriate for them, working from offsets (positions in the event stream) that are individual to that consumer.
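
On the other side, a consumer reading the stream through a consumer group might look roughly like this, again using the azure-eventhub Python package with placeholder names. The default consumer group, $Default, is assumed here; each additional group would get its own independent view of the same stream.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholder namespace connection string and hub name.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENT_HUB_NAME = "branch-traffic"

def on_event(partition_context, event):
    # Each event arrives from a specific partition; this group's position
    # in the stream is tracked independently of any other consumer group.
    print(partition_context.partition_id, event.body_as_str())

consumer = EventHubConsumerClient.from_connection_string(
    conn_str=CONN_STR,
    consumer_group="$Default",
    eventhub_name=EVENT_HUB_NAME,
)

with consumer:
    # starting_position="-1" begins at the start of the retained stream.
    consumer.receive(on_event=on_event, starting_position="-1")
```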

To learn more about Event Hubs and how to set them up, check out the programming guide at this link: http://bit.ly/2hWEUob

Wednesday, November 30, 2016

Hour of Code 2016

Microsoft in the News

With Christmas fast approaching, many of you are starting to fret over what to get everyone. For this reason, I decided to bring some of you up to speed on a neat little shopping assistant that plugs into Microsoft Edge.

The Microsoft Personal Shopping Assistant makes your on-line shopping experience just a little easier to manage.  I’m not going to list all the features here, but one feature that caught my attention was that when you save an item to your favourites board, your shopping assistant will alert you when there are price changes.

This plug-in is getting great reviews and is available for free download at the Microsoft Store.

Computer Science Education

Never let schooling interfere with your education. – Mark Twain

Next week (December 5 – 11) is Computer Science Education Week and Hour of Code 2016!
Last year over 10 million students engaged in an Hour of Code. This year Microsoft is launching Minecraft: Education Edition and a brand-new coding tutorial experience. Minecraft is a game that appeals to many age groups. A child’s current love of Minecraft, and a passion to explore, is all that is needed to fuel the opportunity to learn a few technology skills and realize the depth and breadth of fun that technology can provide. Hour of Code and Microsoft are committed to reaching as many young people as possible, particularly those underrepresented in this field. Minecraft is a fantastic way to introduce them to coding, and this year there is another new tutorial for the Hour of Code.

All the Hour of Code Minecraft Tutorials can be found here

If you think you could help in your local area, they are looking for volunteers. You can schedule your own Hour of Code at your local school, or sign up here to volunteer at any time during the year.

If you ever wondered whether this is important, I offer you a few stats from Code.org.

Computer science drives innovation throughout the US economy, but it remains marginalized throughout K-12 education.
Only 33 states allow students to count computer science courses toward high school graduation.
There are currently 517,393 open computing jobs nationwide.
Last year, only 42,969 computer science students graduated into the workforce.


These are obviously American stats, but I am sure it is a similar story in nations across the Western world. Where I live in Canada, there is not a single computer science course offered within an hour’s drive. Some may suggest I have my kids do one online. We looked at that and could not find one that did not require some form of coding experience. I even sent emails to the director of the online courses approved for high school credit, asking if I could help get my child up to speed so that she could take the class. But even with multiple emails, we received no replies. At that point my child lost interest. When I asked the school about it, they said there was not enough interest to hold a regular class.
I refuse to believe that our children are so willing to ignore the future in which they will be living.  If the children are not interested, that is the fault of the adults charged with the duty of preparing children for their future.  So, I applaud Microsoft, Code.org and Hour of Code for picking up the ball that we as parents and educators have dropped.  These three organizations are making the art and science of coding easy and engaging again.
At the same school where I was told there was not enough interest, I have always had a great turnout for all my workshops.  This year will be no different.  Sometimes it is how you teach and not what you teach. 
Let’s put the engagement back into learning!


Monday, November 21, 2016

Data Catalog


Not to be confused with a “catalogue”, which is some form of ancient paper-based device, a “catalog” is a collection of metadata. It is a directory of information that describes where a data set, file, or database entity is located. Additional information about the data may also be included, such as the producer, content, quality, condition, and any other characteristic that may be pertinent. It is a tool that allows an analyst to find the data they need. There may be solutions hidden in your data; a data catalog, at the least, will tell you where to look.


In any organization, data is collected and stored across different departments, multiple databases, and in a variety of formats. In banking, for example, the customer information that a bank manager sees isn’t the same as what the Finance Department sees. In fact, the bank manager is likely not even aware that a separate and unique data source about their clients exists. Registering these sources in a catalog allows people to become aware of the existence of data they may find useful.

Suppose you are at the library and you want to hold in your hands a map with information about Hole-in-the-Wall Falls in Oregon.  You could look at numerous maps and not find anything.  The first map you pick up may be a highway map.  If the catalog you are looking at has the map descriptions, it will save you a lot of searching.  The catalog may describe the map you are looking for as a topographical map showing hydrology for the state of Oregon, with the map being located at a specific library.  Now, instead of travelling from library to library looking through a variety of maps of Oregon, you can focus your attention on tracking down this single map with the information you need.

Microsoft’s Azure Data Catalog (“ADC”) is a fully managed service.  With ADC, when you register a data source, you can point to the source of that data and ADC will automatically extract structural metadata.  The source of the data does not have to be in the cloud.

Once registered, the catalog card can be used by anyone with access.  Others can then annotate it in order to enrich the metadata.  ADC will allow for crowd sourcing of metadata in order to provide a catalog rich in details.  Tags can include, for example, descriptions of how the registered data can be used to find what otherwise might be obscure or unique solutions.

Because the source of the data is registered in the Catalog, a user can connect directly to that data source through the catalog.  If the data is such that it shouldn’t be freely shared throughout an organization, ADC will allow the registrant to restrict access by defining ownership of the data and authorization requirements for access.

Organizations produce data at an enormous rate. Storage for that data is likely to run the full gamut of places, from an individual computer to the cloud, with locations anywhere on the planet. This exponential growth of data and data sources makes a data catalog a very valuable tool for making that data useful to everyone within the organization. Through the use of ADC, you can actually find that needle in the haystack.
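
To give a feel for how a catalog search looks in practice, here is a rough sketch that calls the Azure Data Catalog REST search endpoint with Python’s requests library. The catalog name, api-version, and response shape are assumptions based on the documented REST pattern, so check the current reference before relying on them; acquiring the Azure AD access token is omitted.

```python
import requests

# Assumed values: an Azure AD token for the Data Catalog resource and the
# default catalog name. Both are placeholders for illustration only.
ACCESS_TOKEN = "<azure-ad-access-token>"
SEARCH_URL = "https://api.azuredatacatalog.com/catalogs/DefaultCatalog/search/search"

response = requests.get(
    SEARCH_URL,
    params={"searchTerms": "customer", "count": 10, "api-version": "2016-03-30"},
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
response.raise_for_status()

# Each result carries the registered metadata, including where the
# underlying data source actually lives; the exact shape may vary.
for result in response.json().get("results", []):
    print(result)
```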

Some links to get you started:


You can find a series of “how to” links at the end of this Data Catalog intro article: https://azure.microsoft.com/en-us/documentation/articles/data-catalog-what-is-data-catalog/


Wednesday, November 16, 2016

SQL Server - A historic release available now!

One of the most exciting things about being a technology professional is the constant change. Some people dislike change; some avoid it, and none more so than large corporations. As a small business owner and technology professional, embracing change is the cornerstone of my business and what excites me about going to work every day. It is also why I went into business: I wanted to work for a company that embraced change and, to paraphrase Gandhi, became the change it wanted to see in the world.

Last week I had the privilege of spending a week at the MVP Summit. It was my first time on the Microsoft campus, and it was an honor and a privilege to be there as a Data Platform MVP. I took that opportunity not only to absorb all I could from the fire hose of information that was tossed my way, but also to make connections and observe the large machine that is Microsoft. Most people who work for large companies know how slow such companies can be to change. I will admit I did not see that side of Microsoft. All the talk around technology that took place on and off the Microsoft campus was about change, and specifically about being agile. Although we often know agile as a method of software development or project management, it is basically the ability to move quickly and easily, and as technology professionals we need to embrace it. Microsoft does this to a level I have never seen or experienced in a large company before.

Today at Microsoft Connect();, the announcement of changes to the standard features of SQL Server 2016 is a classic example. This is another change in direction for Microsoft. Where once they determined for us what we needed and how to use it, they are now giving us the tools and allowing us to determine the best way to help our clients, customers, and partners.

I am aware of many clients who have not had the ability to take advantage of features like Row-Level Security, Columnstore, In-Memory OLTP, Always Encrypted, and PolyBase, and so they wondered why they should bother to upgrade to SQL Server 2016. You can find the announcement from Microsoft, with all the details on what has changed for SQL Server 2016 in SP1, here. For all those who have said, “Why do I need to be excited about the next version of SQL Server? I only have Standard edition,” Microsoft just gave you the best reason ever to be excited to upgrade!
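
If you want to confirm which edition and service pack level an instance is running before planning the upgrade, a quick check is easy to script. Here is a minimal sketch using Python and pyodbc; the connection details are placeholders.

```python
import pyodbc

# Placeholder connection string; point it at the instance you want to check.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=master;UID=myuser;PWD=mypassword"
)

# ProductLevel reports the service pack level (e.g. RTM or SP1).
row = conn.cursor().execute(
    "SELECT SERVERPROPERTY('Edition'), "
    "SERVERPROPERTY('ProductLevel'), "
    "SERVERPROPERTY('ProductVersion')"
).fetchone()

print("Edition:", row[0], "Level:", row[1], "Version:", row[2])
conn.close()
```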

Thursday, October 13, 2016

Data Factory

What is it and what is it used for?

Microsoft in the News:

As a netizen, odds are that there are few things you like more than your cat (or dog, depending on your proclivities). Some even believe that the internet was created for pet lovers to share photos, videos, and stories. So, this week I thought I’d lighten things up with something I stumbled across and thought was an interesting IoT device. All work and no play … actually kinda describes my life lately. Anyway, I want to tell you about a device that allows you to track your pet. It’s called G-Paws.

G-Paws is a device that attaches to your pet’s collar. It doesn't track them in real time, since that would mean a lot more weight and some kind of subscription service. It will, however, allow you to download the stored data to the G-Paws website, which is hosted on Azure. The download can be done through your smart phone or your computer. The G-Paws website uses Azure’s Internet of Things services to store and process the data and give you a visual presentation of what your little fluff ball has been up to. Perhaps the Internet of Things will become useful to the average person after all?

The steady stream of structured and unstructured data that comes in from all of G-Paws’ customers needs to be automatically processed and then presented back to the client in a meaningful format. In order to automate this, G-Paws set up a data factory in Azure.

Now, put on your hard hat; we are going to stroll through the factory.


As we all know, a factory is a place where a steady stream of raw material is brought in and processed in order to produce a steady stream of finished product.  The materials don’t all necessarily enter the same pipeline.  The parts to build the chassis of a car will go in one pipeline and the parts to build the motor will enter a different pipeline.  At some point within the factory, the finished product from one pipeline (the engine) is combined with the product from the other pipeline (the chassis) to produce the final output.   

A Data Factory does the same thing.  The raw material comes in initially as a stream.  With a little processing, some, most, or perhaps all of that data is fed into a specific pipeline that is directed towards one or more processes that will take place within the Data Factory. 

Other data may be fed into a different pipeline and undergo a different process. Each process may need a series of transformations, or perhaps just a single transformation. Some of the processes may be done in parallel, others in series. These are all things that you will define as you build your factory.

The data is processed through one or more pipelines, and when it reaches the end, it is combined to produce the usable finished product. The factory contains all the processes necessary to automatically produce a steady stream of finished products; in this case, processed data that is usable by the client.
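
To make the analogy concrete, here is a rough sketch of defining a one-activity copy pipeline with the azure-mgmt-datafactory Python package, which targets the current version of the service rather than the one available when this was written. The resource group, factory, and dataset names are placeholders, the datasets and linked services are assumed to exist already, and the exact model signatures may differ between package versions.

```python
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource
)

# Placeholder credentials and identifiers.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<client-secret>"
)
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")
rg_name, df_name = "my-resource-group", "my-data-factory"

# One pipeline, one copy activity: raw blob data in, processed blob data out.
copy_activity = CopyActivity(
    name="CopyRawEvents",
    inputs=[DatasetReference(reference_name="RawEventsDataset")],
    outputs=[DatasetReference(reference_name="ProcessedEventsDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(rg_name, df_name, "CopyRawEventsPipeline", pipeline)

# Kick off a run of the pipeline.
run = adf_client.pipelines.create_run(rg_name, df_name, "CopyRawEventsPipeline", parameters={})
print(run.run_id)
```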

Don’t you just love it when analogies from the real world we are all familiar with translate so nicely into the digital world?

Microsoft has a number of tutorials that will walk you through the process of building some sample Data Factories.  The really nice thing about Azure is that it provides you with all kinds of raw materials and tools to let you play for free.  You can learn to build a Data Factory knowing that there are no hazardous materials or red tape that may impede your progress.  Just some fun to be had while learning a new skill.

If you are ready to get started, here are some links to some tutorials:

Process data using Hadoop cluster: http://bit.ly/2dZWDWW
Copy data from Blob Storage to SQL: http://bit.ly/2e0PeZt
Move your data to the cloud: http://bit.ly/1RODh1h