Chris Presley recently kicked off his brand new Cloudscape podcast. The podcast focuses on different cloud vendors and shares their unique takes, approaches, and strategies as they relate to each vendor's area of expertise.
I was glad to be one of his guests in an episode where I shared some news on what was taking place in the world of Microsoft Azure.
In this episode, we discussed the following topics as they relate to Microsoft Azure:
- Security Center
- Managed Service Marketplace
- Data Factory v2
- New Data Features
- Analysis Service as a Service
- Meltdown vulnerability
Azure Security Center
Azure Security Center released new log analysis, new search tools and more dashboarding.
News about threats and hacks surfaces constantly, yet even though these things keep happening, some people still believe that cloud providers cannot secure their data any better than they can themselves.
The fact of the matter is that many are not aware of how vulnerable their data may be, whether they are on prem or in the cloud.
For example, I was accessing files through a VM I use for testing with SQL Server, so it serves as my own AdventureWorks database. Once I opened a Remote Desktop Protocol (RDP) port to the internet, I started receiving alerts that somebody in the Netherlands was port scanning my VM because they saw that my RDP port was open. After that, they started using the RDP port to constantly try to log into my VM by brute force. This shows how these types of attacks can easily fly under the radar without us realizing they are happening.
Luckily, I was aware of what was going on, so I set up a port range in my firewall for the VM. I chose to do this not out of fear of getting hacked, but because it was killing my machine’s CPU due to the constant logging of events.
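A rule like the one I describe can be set up with the Azure CLI on the VM's network security group. This is only a sketch: the resource group, NSG name, rule name, and source address range below are placeholders you would replace with your own values.

```shell
# Placeholder names: MyResourceGroup, MyVmNsg, and the 203.0.113.0/24
# range stand in for your own resources and trusted public IP range.
# Priority 100 is evaluated before the default rules, so RDP (3389)
# traffic from any other source still hits the default DenyAllInbound.
az network nsg rule create \
  --resource-group MyResourceGroup \
  --nsg-name MyVmNsg \
  --name AllowRdpFromTrustedRange \
  --priority 100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --source-address-prefixes 203.0.113.0/24 \
  --destination-port-ranges 3389
```

Restricting the source range this way also cuts down the flood of failed-login events that was hammering my VM's CPU.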
As I learned through that experience, Security Center is a great tool, with the main goal of making people more aware of all the threats out there. Many providers seem to be moving toward this kind of security dashboard approach for their customers.
Managed Service Marketplace
Azure Marketplace allows a user to package applications, and the latest innovation is that they have added the option to deploy a managed service solution. This will allow you to choose a solution from the marketplace that may not be a SaaS offering. It can be a product that you’re deploying with your Azure subscription, but it can also be managed by a third party.
It also enables economic relationships between the consumer and those running their cloud operation. It used to be that the marketplace was only for software, but now you can pick a VM that has licensed software and a set of services attached to it.
Azure Data Factory V2
For anyone not familiar, Azure Data Factory v2 was released a few months ago. This release is a huge leap over what Data Factory v1 offered. In the past, I would tell people that Azure didn't have a decent ETL offering until v2 came out, and now, with the visual tools, it's even easier to develop, especially for people who have an SSIS background. People who have worked with Microsoft's data stack ETL for years find working with ADFv2 simple because the two are very similar.
Essentially, it’s a drag-and-drop type of approach. The visual tools make it even easier for SSIS developers to adapt and work graphically, which many enjoy in ETL tools.
At this point, the service can host SSIS as well. For clients who are 100% in the cloud, there’s really no reason not to adopt it, which is a good thing because v1 had the unfortunate situation of making people decide whether to go with the limitations of v1 or just put up some ETL software in a VM. Now, v2 takes this issue away and users can build an entire end-to-end data platform solution with PaaS services.
New Azure Data Features: Compatibility Level Default and Analysis Services Changes
This new compatibility level default makes it so every time you create an Azure SQL database, it automatically goes to the 140 compatibility level. It’s the same compatibility level as SQL 2017. The big difference is that this is the default, so if someone were to create something and start developing, they will automatically be on that version. Once there, the database will have access to the latest T-SQL and optimizer fixes.
The interesting part is that this compatibility level was available in the cloud long before any of this was made available—it just wasn't the default. It's now a "cloud first" style of development, so when compatibility level 15 rolls out for SQL 2018, we'll likely see it offered as optional first.
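You can check which level a database landed on by querying `sys.databases`, and move an older database up explicitly. This sketch assumes the `sqlcmd` utility and uses placeholder server, database, and login names:

```shell
# Placeholders: myserver, MyDb, myadmin; PASSWORD comes from the environment.
# Check the current compatibility level of an Azure SQL database:
sqlcmd -S myserver.database.windows.net -d MyDb -U myadmin -P "$PASSWORD" \
  -Q "SELECT name, compatibility_level FROM sys.databases WHERE name = 'MyDb';"

# Databases created before the default changed can be raised explicitly:
sqlcmd -S myserver.database.windows.net -d MyDb -U myadmin -P "$PASSWORD" \
  -Q "ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 140;"
```

New databases get 140 automatically, so the explicit `ALTER` is only needed for databases created under the older default.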
The takeaway here is that you don’t have to worry so much about all the new versions and patching. These things will continuously roll out on their own without you having to take action.
Analysis Services as a Service
The Analysis Services managed offering is now deployed in even more regions around the world. You can now build end-to-end data platform solutions that are 100% platform as a service. Previously, if I needed an analytical model, I would have had to go in with a third party tool or run Analysis Services from Microsoft in a VM. Now, we have Analysis Services as a service and it runs in even more regions so people can just use the PaaS service instead of having to deploy on a VM.
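Standing up the managed offering is a single CLI call. This assumes the `az analysisservices` command group available in recent Azure CLI versions; the resource group, server name, region, tier, and admin account below are all placeholders:

```shell
# Placeholders: MyResourceGroup, myaasserver, admin@contoso.com.
# Creates a tabular-model Analysis Services server on the S0 tier
# in one of the supported regions.
az analysisservices server create \
  --resource-group MyResourceGroup \
  --name myaasserver \
  --location westus2 \
  --sku S0 \
  --admin admin@contoso.com
```

Because it's PaaS, you pick a tier rather than sizing a VM, and you can pause or scale the server without redeploying anything.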
Keep in mind, there are two ways to run Analysis Services. The first is multi-dimensional, the cube style; the second is the tabular model. So this isn't something new, but the number of people in the field who still deploy multi-dimensional cubes is surprising. The cloud service doesn't support multi-dimensional models at this point, but it is on the Microsoft team's work queue.
Meltdown Vulnerability
Azure was affected by the recent Meltdown vulnerability, just like every other cloud provider. The vulnerability exposed content from memory space that wasn't yours, so you could potentially see the memory space of a VM belonging to another customer. This was the main thing that got patched right away; the fix was quick and took less than a day.
This was a summary of the Azure topics we discussed during the podcast. Chris also welcomed Greg Baker (Amazon Web Services) and John Laham (Google Cloud Platform), who discussed topics related to their own areas of expertise.
Listen to the entire conversation here and be sure to subscribe to the podcast to be notified when a new episode has been released.