Thursday, March 23, 2017
Microsoft in the News - World Water Day
Did you know that March 22 each year is officially World Water Day? Who knew that we have dedicated a day to the thing that makes up most of our bodies and is required for life?
Unfortunately, we humans aren’t very good at managing the things that allow us to live on this planet. In order to help us all out, Microsoft has launched the Water Risk Monetizer (WRM). This free app is built on the Azure Cloud and is designed to build a business case for taking more action to protect this essential resource.
Many of us pay for our water, but what we pay is only a fraction of its real cost. By undervaluing it, we end up disconnected from the reality of what we are doing when we use, or waste, water.
Microsoft is currently using the WRM to prioritize locations and optimize water management systems. The WRM gives them additional tools that add to design/engineering considerations, and calculate the ROI of different water management strategies.
At the Microsoft data center in San Antonio, Texas, the WRM was used to show that the true cost of water in the Leon Creek Watershed, where the data center is located, was 11 times greater than what the water bill indicated. Working with an Ecolab partner, Microsoft was able to save more than $140,000 in water costs and reduce their water usage by 58.3 million gallons of potable water per year.
Accessing your Catalog
In my previous blog, we got as far as registering a source of data for your Data Catalog. The next step is to access that data via your Catalog. From the Registration screen, you can jump directly into the portal by clicking the View Portal button here:
There are more ways to access your catalog, but this is the quickest and easiest when you are already in the Registration screen.
Once you are in your Azure portal, scroll down to the bottom of your main menu and select More services. This opens another menu. Part way down that menu, under the heading INTELLIGENCE + ANALYTICS, is the Data Catalog. If you choose the star, it will be added to your favorites.
Data Catalog will now show at the bottom of your main menu, but you can drag it closer to the top if you use it often.
In my Azure portal, when I click on my Data Catalog it gives me administrative information about it. I get the location, resource group, and subscription type, and if I scroll all the way to the right and click on the … it allows me to pin my catalog to my dashboard to make it easy to find.
With my Catalog on my dashboard it looks like this.
Choosing the Data Catalog from the Azure dashboard gives information about your catalog from an administrative point of view, and I will get into those details in a later post. The important thing to note is that the Azure dashboard for the Data Catalog is administrative; there is no Data Catalog functionality in the Azure dashboard. There is a quick start button in the dashboard that you can use to view a number of quick start options.
The top option is the Azure Data Catalog Portal.
Once inside the portal you get a data catalog dashboard which we will explore in my next blog.
Wednesday, March 22, 2017
Microsoft in the News
For you fellow SQL Server junkies, Microsoft recently announced a new Community Technology Preview (CTP) for the next version of SQL Server. When it is available, the new preview will allow you to schedule jobs using SQL Server Agent in SQL Server v.Next on Linux. You can keep up with the upcoming changes in SQL Server by visiting What’s New in SQL Server v.Next.
Register a Data Source in Azure Data Catalog
When you get started with Azure Data Catalog, the first thing you must do is register a data source. From the home screen choose Launch Application.
It will download a registration application to your local drive. You will see the familiar install screen.
How long it takes to download will depend on your connection, but since the file is only around 100MB, it shouldn’t take very long.
Accept the license and you get a handy app that will walk you through the process of registering your data sources.
Sign in to begin. Remember to use the same account you used to create your Data Catalog.
For this example, I chose a SQL Warehouse.
Enter the details of the Connection.
Here you select the objects you want to register. The metadata for the objects will be registered in the Catalog. For sources that support previews, you can choose to include a 20-record snapshot of each object’s data. I recommend including that if it is available.
The Preview allows you to see a sample of the data if you have permission to the underlying data. The Profile will gather as much detail about each object as is available. I find this feature fills in as much detail as it can find about my source and limits the amount of typing I have to do later to add that detail.
Note: Just above the REGISTER button you can add tags separated by commas. This allows you to start documenting as you load.
Note: Every comment you enter goes against all objects. Ensure the comment is relevant to all objects before you enter it.
As soon as you choose REGISTER, the service begins. Once your objects are registered, you can register more objects by simply going back to the Register page and selecting them. You can also do this in the view portal.
In my next blog, I will begin to Explore the Data Catalog.
Tuesday, March 21, 2017
Stay Safe Out There!
It is tax season and that means that tax themed malware is ramping up. Despite you all being tech professionals, you still need to be extra cautious. There have recently been several instances where even technically savvy people are being tricked into downloading malware. Typically, the tech professionals are succumbing to spear phishing attacks.
Microsoft can keep you safe, but only up to a point - Office 365 Advanced Threat Protection uses machine learning to warn you of potential threats, for example - however, the human is almost always the weakest point when it comes to security. So, you need to be extra careful about which links you follow and which email attachments you choose to double-click.
Malware is proliferating at an alarming rate and it is getting sophisticated enough to catch IT professionals. So, stay alert to the threat, and stay safe.
For those that have been to any of my in-person sessions over the last few months, you have likely heard me talk about Azure Data Catalog (ADC). I cannot seem to stop talking about it! I will happily tell everyone who will listen just how much I love it. I wrote my first blog on it back in November, shortly after I had heard about it. You can find that blog at What is Data Catalog?. Since that blog, I have had many people ask for more details, and last week I promised an entire room full of people that I would post more about Azure’s Data Catalog. This blog will be the first in a series of quick, easy steps to get you started setting up a Data Catalog for your organization.
To create a Data Catalog all you need to do is sign up.
The nice thing about this Get started page is that it allows you to get an Azure account as well as start with your Data Catalog, all from one page.
Once you have your account, choose Get Started and sign in (if you have not already). One of the keys to remember with Data Catalog is that the Azure account you use must be a corporate or student account and you must be the owner or co-owner of the subscription. I will get into more details on this when we get to administration of ADC.
Once you click Get started, you only need to tweak a few settings to complete the creation of your first Data Catalog.
You will need to create a name for your Catalog. Here I have called mine DemoCatalog. If you are signed in, the Subscription will auto fill for you. You only need to change it if you have multiple subscriptions and would like to use a different one. The Catalog Location is also defaulted, but you should change it to the location nearest to you.
You will notice that I have only chosen the free edition options. I will cover Standard Edition in a later post. My User name and Administrator designation automatically default to the account I am logged in with. Once you have made your choices and click Create Catalog, it will take only a few minutes to process, and you will have created your Data Catalog. In my next blog, we will look at the auto discovery way to publish data.
Thursday, February 23, 2017
Microsoft in the News:
Back in 2014, Microsoft purchased Minecraft for $2.5 Billion. On January 19, 2016, they introduced us to Minecraft: Education Edition. Microsoft took a game enjoyed by millions of kids and turned it into a learning experience. It is still the same fun world that attracted so many people to it in the first place. The Education Edition, however, has a new emphasis on creativity, collaboration and problem solving. In other words, it teaches the skills people need to thrive in the 21st century.
It used to be said that, “What is good for GM is good for America.” That was no doubt true in the industrial age. In the information age, we change “GM” to “Microsoft” and the statement remains true. What is good for Microsoft is a well educated population from which to draw employees. In this regard, I think that this statement is more true in the information age than it was in the industrial age. And Microsoft is prepared to do its part to help us all out.
Microsoft has spent millions over the years promoting the education of students in the areas of coding and technology. Last year Microsoft asked all of its employees to participate by being a teacher in the Hour of Code in support of computer science education. There have been almost 5,000 Microsoft employees who have participated since this call to arms.
With the success of Minecraft: Education Edition, Microsoft recently expanded their support for it by creating the Global Minecraft Mentor program. Sixty mentors representing nineteen countries will be contributing their expertise in order to support educators in their quest to bring immersive learning environments into the classroom.
It has been said that if you have made it to the top, it is your responsibility to send the ladder back down for others. If you are interested in helping prepare future generations, here is an easy way to start: https://code.org/volunteer
Twice now I have volunteered to teach a course on coding at our local high school. I was inspired to do this out of frustration with the lack of decent computer courses offered at the high school level. Code.org made it quick and easy for me by providing everything I needed.
Now there is a second easy way to help our youth. Introduce your kids, and your kid’s teachers to Minecraft Education Edition. We all have a responsibility for our country’s future. This is an easy way for you to do your part.
“Education isn’t something you can finish.” - Isaac Asimov
The world of science and technology is moving forward at an ever increasing rate. If you are standing still, education-wise, you are falling behind. This applies to both the student and the educator.
When a high school doesn’t offer inspiring and relevant computer and other science courses, we as a society are failing our youth. When a college or university isn’t providing current software for students to learn on, we are failing our youth. When adult learning courses aren’t readily available, flexible to accommodate a busy work schedule, and affordable, we are failing our society. When we, as tech professionals are not engaged in learning, we are failing ourselves. As was stated recently in the Economist, “When education fails to keep pace with technology, the result is inequality.” In other words, the failure to educate either yourself or the population in general, leads to poorer employment opportunities, and a lack of job security. As a nation, we all sink or rise with this educational tide.
As it stands now, it is mostly high achievers who are actively engaged in lifelong learning. That alone speaks volumes about its importance. But as these high achievers succeed and grow, those standing still fall behind, and the inequality within society expands.
I have always found it odd that we think nothing of spending tens of thousands of dollars (usually financed through debt) on our education as soon as we exit high school, yet, as soon as we exit university, our education budget drops to zero. We get our first job and instantly feel it is our employer’s responsibility to ensure we are qualified to work.
Don’t get me wrong; as an employer, I invest in my employees’ education. But is it really my responsibility? I want and need an educated work force, so I do what I can. Most companies are moving in the opposite direction. Most companies seem to fear that they will educate their employees and then they will leave. I fear that my employees will not be educated in current tech, and will stay!
There is a Jewish proverb that states: “If you drop gold and books, pick up first the books and then the gold.” It is a reminder that education should be your primary goal. Knowledge is weightless, so feel free to accumulate as much as possible.
Thursday, February 16, 2017
Microsoft in the News:
In July of 2009, Google announced that it was developing a partially open source OS that would operate primarily in the cloud. That OS is, of course, Chrome OS, and it is now the second most popular OS on the market, beating out Mac OS. With the onslaught of inexpensive Chromebooks, Microsoft is starting to pay attention.
Last year, Chromebooks enjoyed a 20% growth rate while the PC industry was shrinking. Since competition seems to bring out the best in companies, Microsoft intends to release Windows 10 Cloud, which will compete directly with Chrome OS.
Windows 10 Cloud (“W10C”) will be a scaled down version of Microsoft’s flagship Windows 10. The intention is that W10C will only run Universal Windows apps from the Windows Store.
There hasn’t been an official announcement as to when W10C will be released, however, there have been indications that it could be as early as April of this year.
In Browser Query Editor for Azure SQL:
There seems to be a steady pace being set by Microsoft for ensuring that all things Azure are constantly being improved. The latest improvement to Azure SQL is the inclusion of an in-browser query tool. This new Query Editor has just been released for public preview in Azure.
Until now, to access and query your Azure database, you had to switch to a separate tool. This can be quite cumbersome when working for clients; I sometimes have to navigate multiple remote desktop connections with layers of security just to run a simple query. Now, you can use the Azure Portal to write queries, execute T-SQL scripts, and manage the results.
If you are looking to jump in right away, be forewarned that this feature is just being previewed right now, so you will be asked to accept the Preview Terms prior to being allowed to use this editor.
To get started, once you have navigated to your SQL database, click the Tools menu, followed by Query Editor (preview). Or, if you have a SQL Data Warehouse, there is a Query editor (preview) button in the Common Tasks screen.
The Query editor is similar to SQL Server Management Studio for writing queries, but it is far more basic, with less functionality available in the graphical user interface.
This new functionality in Azure will make quick investigations and work on clients’ systems much easier. I am excited to see how this feature evolves.
Tuesday, January 31, 2017
Microsoft in the news:
Artificial intelligence has been and will continue to be a news headline grabber for a long time to come. Over the years, AI has progressed from a supervised learning platform to the current cutting edge, which is a reinforcement-learning platform. The AlphaGo system that beat the top ranked Go player is the most public example of how powerful the reinforcement-learning platform is.
In order to leapfrog Google and Facebook, on January 13, Microsoft announced that they purchased Maluuba. Maluuba’s deep learning AI is able to read and comprehend text with near human capability. We can look forward to putting this AI and others to work for us via the Azure Cloud.
Another AI from Microsoft uses a subset of reinforcement learning known as “contextual bandits”. A year ago, Microsoft put this AI to work on their MSN.com website, where it helped pick personalized headlines for MSN users. The result was a 25% increase in click-throughs. This successful AI tool was turned into an open source service that can be deployed on Azure, as well as other platforms. This system allows you to answer much more detailed questions than the current A/B testing model.
To access this Multiworld Testing Decision Service, follow this link: https://github.com/Microsoft/mwt-ds
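To make the contextual bandit idea concrete, here is a minimal sketch of my own (not Microsoft’s actual system or the MWT service): an epsilon-greedy learner keeps a running average reward per (user context, headline) pair, mostly shows the headline with the best estimate for that context, and occasionally explores. The user segments, headlines, and click rates are all invented for illustration.

```python
import random

class ContextualBandit:
    """Epsilon-greedy contextual bandit: one reward estimate per (context, arm)."""
    def __init__(self, arms, epsilon=0.1, seed=0):
        self.arms = arms
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = {}   # (context, arm) -> number of times shown
        self.values = {}   # (context, arm) -> running average reward

    def choose(self, context):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.arms)  # explore
        # exploit: best estimated arm for this context (unseen arms default to 0.0)
        return max(self.arms, key=lambda a: self.values.get((context, a), 0.0))

    def update(self, context, arm, reward):
        key = (context, arm)
        n = self.counts.get(key, 0) + 1
        self.counts[key] = n
        old = self.values.get(key, 0.0)
        self.values[key] = old + (reward - old) / n  # incremental average

# Hypothetical click-through rates per (user segment, headline)
ctr = {("sports_fan", "game_recap"): 0.30, ("sports_fan", "stock_news"): 0.05,
       ("investor", "game_recap"): 0.05, ("investor", "stock_news"): 0.30}

bandit = ContextualBandit(["game_recap", "stock_news"], seed=42)
sim = random.Random(1)
for _ in range(5000):
    ctx = sim.choice(["sports_fan", "investor"])
    arm = bandit.choose(ctx)
    bandit.update(ctx, arm, 1.0 if sim.random() < ctr[(ctx, arm)] else 0.0)
```

After enough traffic, the learner’s estimates favor the recap for sports fans and the stock headline for investors, which is exactly the per-user personalization that plain A/B testing cannot give you.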
Basic Security is a good reason to start with the cloud.
If you haven’t heard news reports of “ransomware” you will soon. According to the FBI, ransomware attacks are becoming the attack du jour for cyber criminals. In the first three months of 2016, over $200 million was paid to cyber criminals because of ransomware (up from $25 million for all of 2015), and that is only what was reported. There is little doubt that many individuals and businesses have made payments without referring the matter to authorities.
Ransomware is a piece of malicious code that encrypts your data. Once encrypted, you are sent a message saying that if you pay the ransom, you will receive the encryption key so you can regain access to your data. The payment is made in Bitcoin, so there is no way to follow the money to the recipient. In some cases, those who have paid were hit up for additional payments. Additionally, these criminals don’t always get their encryption right and end up making your data unrecoverable. So, you may pay and get the key only to find out that it doesn’t work and even the crooks can’t help you.
Becoming a cyber criminal in this industry is easy, and requires almost no technical talent. In the underground market of crimeware-as-a-service, you can start your own ransomware campaign by simply providing a Bitcoin address for victims to send their money to. So, this situation is going to get a lot worse long before it gets better! This is definitely a growth industry right now.
For this past year, hospitals seem to be a favorite target. With people’s lives at stake, they are in need of a quick solution, and so they have little choice but to pay the ransom. Several hospitals in the US have fallen victim to this crime in 2016. Can you imagine being the IT professional who has to tell the CEO that ALL the company’s data is encrypted and inaccessible?
Educating your staff about spear phishing (the same hack that caught the Democrats flat-footed during the election) and other forms of attack, like finding a random USB memory stick outside the front door of the office, is a great start. However, a persistent hacker will find a way, and education will not be 100% effective. So, you need to have a “back-up plan” (double entendre intended).
Not only do you have to regularly back up your data, but also, it must be in a secure location that is not connected to the original data. That is where the Azure cloud can help. Both Azure Site Recovery and Azure Backup can eliminate the risk of having to pay a ransom to recover your data. Backups in Azure are inherently safer because attackers not only need access to your environment, but also to a secure backup vault on Azure before they are able to effectively attack your data.
Microsoft is well aware of the increasing threat of ransomware and has recently implemented additional security to address this and other threats. To begin with, since encryption creates a new encrypted file and then deletes the original, Microsoft now requires a Security PIN in order to delete backup data. In addition, Azure retains deleted backup data for fourteen days, ensuring you have multiple recovery points.
In this game of leapfrog, as security professionals and criminals continually try to outdo each other, you really need a full-time team of professionals working around the clock to protect your data. By taking advantage of Azure cloud services, that is exactly what you are getting.
Friday, January 6, 2017
I am sure you have heard by now of the IoT, or the Internet of Things. With a plethora of different IoT devices sending data to a variety of applications, did you ever wonder how that data is managed? Event Hubs perform this task. They take in data from everything, from telemetry and mobile app data to gaming data, and organize it so it is easier to consume and use. Event Hubs do this by managing the flow of data as it is received.
Let’s begin with an example. We have sensors at bank branches that log when customers come and go from the branch. These sensors are the IoT devices, and they are called event publishers. The publishers send the events, entry to and exit from the branch, to the hub, where they are organized and made available for applications to use. These applications are referred to as consumers. Consumers can be a variety of different types of applications that often need different parts of the data.
In this example, all the data comes from a single type of publisher, the sensors, but often there is more than one type of data. It could be location data, such as how far a customer lives from the branch; status data, such as whether a customer is currently in the branch; and many more individually different types. Event Hubs can take in all the data as fast as it arrives, but in this kind of environment there are so many different types of data that, at large volume, it is essentially unusable in a traditional way. This is where Event Hubs come in: they sort and organize the data in a way that the consumer can use.
How the data will be used determines how it should be organized. Event Hubs use partitions to do this.
The formal name for how Event Hubs organize the data is the competing consumer model. This means that multiple consumers can receive data from the same channel at the same time, to optimize for volume and scale. As you can imagine, data streams in at a highly variable rate, in a wide variety of forms, and over a variety of protocols. As the data enters the workflow, Event Hubs use partitions to segment it. Each partition retains events for a retention period configured at the Event Hub level; events cannot be deleted, but expire on a time basis.
Partitions are set when the Event Hub is created. They are the key to the data organization for any downstream workflow management and should be based on the degree of downstream parallelism you will require for the consumers. A good rule is to make the number of partitions equal to the expected number of concurrent consumers.
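As a rough illustration of how events spread across a fixed set of partitions, here is a small Python sketch of my own (not the Azure SDK): a stable hash of a partition key picks the partition, so all events sharing a key, such as one branch’s sensor, land in the same partition and stay in order. The branch names and partition count are invented for the example.

```python
import hashlib

NUM_PARTITIONS = 4  # fixed when the Event Hub is created

def assign_partition(partition_key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a partition key to a partition index with a stable hash."""
    digest = hashlib.md5(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Route a few hypothetical branch-sensor events into partitions
partitions = {i: [] for i in range(NUM_PARTITIONS)}
events = [("branch-17", "entry"), ("branch-17", "exit"), ("branch-42", "entry")]
for key, event in events:
    partitions[assign_partition(key)].append((key, event))
```

Because the hash is stable, "branch-17" always maps to the same partition, which is why the partition count has to be chosen up front with your expected consumer parallelism in mind.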
Any entity, regardless of type, that reads event data is called an event consumer. All consumers read the event stream through partitions within a consumer group. The consumer connects to a session in which events are delivered as they become available, so the consumer does not need to poll to determine whether data is available. The consumer group controls both the subscription to the Event Hub and the view of it: each group gives its consumers a separate view of the event data stream. This individualization allows consumers to process the data at a rate that is appropriate for them, using offsets (positions in the event stream) that are individual to that consumer.
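To make the independent-offset idea concrete, here is a simplified Python sketch of my own (again, not the Event Hubs SDK): two consumer groups read the same partition log at their own pace, each tracking its own offset, and reading never removes events from the stream.

```python
# An append-only partition log; events expire by time, not by being read
partition_log = ["e0", "e1", "e2", "e3", "e4"]

class ConsumerGroupReader:
    """Each consumer group keeps its own offset into the shared partition."""
    def __init__(self):
        self.offset = 0

    def read(self, log, max_events):
        batch = log[self.offset : self.offset + max_events]
        self.offset += len(batch)  # advance only this group's position
        return batch

dashboard = ConsumerGroupReader()  # a fast consumer, reads in big batches
archiver = ConsumerGroupReader()   # a slow consumer, reads a few at a time

fast = dashboard.read(partition_log, 5)  # gets all five events at once
slow = archiver.read(partition_log, 2)   # gets only the first two
```

Both groups see the full stream; the slow archiver simply picks up where it left off on its next read, without affecting what the dashboard sees.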
To learn more about Event Hubs and how to set them up, check out the programming guide at this link: http://bit.ly/2hWEUob