Topics of discussion included:
- Public Preview of Azure Confidential Computing
- Azure Digital Twins in Preview
- Emotion Analysis now available on the Video Indexer service
- Cognitive Services in Containers in Preview
- Static Data Masking for SQL and Azure SQL in Preview
- Accelerated Database Recovery for SQL and Azure SQL in Preview
- New SQL Server VM Azure Resource Provider
Public Preview of Azure Confidential Computing
Confidential computing is what Microsoft describes as the use of secure enclaves and the capabilities that let people take advantage of this technology. A secure enclave is a protected region of memory that no application can access except the one that is cryptographically authorized to load code into it and work with its contents.
It is some of the most secure computing you can do. Say I'm on my SQL Server running computations over data protected with something like Always Encrypted. While SQL Server is working with that data, it has already decrypted it, so in theory someone could dump the server's memory and read it. Confidential computing, through a capability called secure enclaves, prevents exactly that. Even with access to the OS, or access to the BIOS, you still could not read the clear text of that memory.
No code outside the enclave is allowed to access the data inside the enclave. You can't pull out a sensitive medical record, for example. The records stay encrypted; you put the data inside the enclave, and it gets decrypted only there. (Data decrypted inside the enclave still looks encrypted from outside it.) You work on it there, and when you're done you return only the result, whatever your model predicted, outside the enclave; the medical records themselves are never revealed outside it.
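As a conceptual sketch only, the data flow described above can be illustrated in a few lines of Python. This is a toy: real enclaves are enforced in hardware (for example Intel SGX), and the XOR "cipher" here merely stands in for real encryption so the flow is visible.

```python
# Toy illustration of the enclave data flow -- NOT real enclave technology.

def toy_encrypt(data: bytes, key: int) -> bytes:
    """Toy XOR 'cipher' standing in for real encryption (XOR is its own inverse)."""
    return bytes(b ^ key for b in data)

def process_in_enclave(ciphertext: bytes, key: int) -> int:
    """Models the enclave: decryption and computation happen only in here.
    Only the derived result leaves; the clear text never does."""
    clear = toy_encrypt(ciphertext, key)
    return len(clear.split())  # e.g. a word count over the record

record = b"patient jane doe diagnosis flu"
ciphertext = toy_encrypt(record, key=42)

# Outside the 'enclave' we only ever see ciphertext and the derived result.
result = process_in_enclave(ciphertext, key=42)
print(result)  # → 5
```

The important property is that `record` in clear text exists only within `process_in_enclave`; callers hold ciphertext and receive an aggregate.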
This is very interesting because it pushes the boundaries of security to a place where they haven’t been before.
Azure Digital Twins in Preview
This service is part of Azure's IoT story. The term "digital twin" is not owned by this service; it's a general term used widely in the IoT space these days. It means building a computational model that maps to a physical space: the digital twin of whatever you're trying to build in real life. That could be the inside of a vehicle, for example, or a building, or even a city spanning many buildings.
The Digital Twins service gives clients a pre-built model that Microsoft has fine-tuned by working with countless clients on solutions like this. Clients don't have to do the modeling themselves; they consume a model that Microsoft has operationalized. It's essentially an API: you load your entities, customize them, and start feeding data into them, while Microsoft manages the compute, the high availability, and so on. Microsoft is also adding integrations to the service: you can fire alerts from your digital twin over to an Azure Event Hub, or route them to a Logic App and react to what's happening in real time.
For example, say I have a building with 10 rooms, and each room has an IoT temperature sensor. I would create a model with the building, the rooms, and the sensors. Then, if one room's temperature fluctuates, I could route an event through an Event Hub back to the sensor to adjust it up or down.
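The building-and-rooms example can be sketched as a minimal hand-rolled model. This is not the Azure Digital Twins API; it only illustrates the idea of a digital model mirroring a physical space and flagging anomalies.

```python
# Minimal hand-rolled digital-twin sketch, NOT the Azure Digital Twins API.
from dataclasses import dataclass, field

@dataclass
class Sensor:
    temperature: float = 21.0

@dataclass
class Room:
    name: str
    sensor: Sensor = field(default_factory=Sensor)

@dataclass
class Building:
    rooms: list

    def alerts(self, low: float = 18.0, high: float = 24.0) -> list:
        """Return rooms whose readings drift outside the comfort band --
        the point where a real solution would route an event to Event Hubs."""
        return [r.name for r in self.rooms
                if not (low <= r.sensor.temperature <= high)]

building = Building(rooms=[Room(f"room-{i}") for i in range(1, 11)])
building.rooms[3].sensor.temperature = 27.5   # one sensor runs hot

print(building.alerts())  # → ['room-4']
```

The managed service replaces all of this bookkeeping with hosted entities, plus the compute and availability management described above.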
It's a service to get people going quickly with an IoT solution and to make the step of mapping it onto a real, live space easy.
Emotion Analysis now available on the Video Indexer service
Emotion Analysis is an interesting new capability added to the Video Indexer service. Video Indexer lets you tag people and then, for example, get every timestamp where a particular person appears in a video. It also identifies topics, tags the videos, and lets you jump to the moment where a given person is talking.
They have now added emotion analysis. So whenever someone is expressing happiness in the video or anger or sadness, the video indexer will actually tell you at what point in the video you are going to see that emotion.
It's interesting not only for the capability itself but for what you can build on top of it. I can see it being really useful for sentiment analysis at a marketing company. Marketers could record sessions with a group of people and, when they need a particular clip, run the footage through the indexer instead of having a human watch everything and tag it by hand.
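A sketch of the clip-lookup idea: given indexer output, pull every timestamp where a given emotion was detected. The JSON shape below is a simplified assumption for illustration; the real Video Indexer insights document differs.

```python
import json

# Hypothetical, simplified shape of an indexer result (assumption, not the
# actual Video Indexer insights schema) -- just enough to show the lookup.
insights = json.loads("""
{
  "emotions": [
    {"type": "Joy",   "instances": [{"start": "0:00:12", "end": "0:00:19"}]},
    {"type": "Anger", "instances": [{"start": "0:03:05", "end": "0:03:11"}]}
  ]
}
""")

def timestamps_for(emotion: str) -> list:
    """Return (start, end) pairs where the given emotion was detected."""
    return [(inst["start"], inst["end"])
            for e in insights["emotions"] if e["type"] == emotion
            for inst in e["instances"]]

print(timestamps_for("Anger"))  # → [('0:03:05', '0:03:11')]
```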
Cognitive Services in Containers in Preview
Cognitive Services is a set of cognitive capabilities offered as cloud APIs. They do things such as text analysis, speech analysis, and knowledge tasks like grouping questions and answers.
These services have always been offered as cloud APIs, but some folks have pointed out that, while they like the APIs and their quality, they have a lot of data on-premises that they can't easily move into the cloud. To cover this scenario, you can now deploy the services as containers on-premises. The pitch: if you have large amounts of data that you want processed through Cognitive Services and you like what Microsoft is doing, you pick the specific cognitive service you want and deploy the container for it. Your data stays on-premises, never moving to the cloud, and you get the same functionality as the cloud service.
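Once a container is running locally, the only change on the client side is the endpoint URL. A sketch of building such a request is below; the port and route are assumptions for illustration, so check the specific container's documentation for its actual endpoint.

```python
import json
import urllib.request

# Calling a locally deployed Cognitive Services container instead of the
# cloud endpoint. Port and path below are illustrative assumptions.
LOCAL_ENDPOINT = "http://localhost:5000/text/analytics/v2.1/sentiment"

payload = json.dumps({
    "documents": [{"id": "1", "language": "en",
                   "text": "The on-premises data never leaves our network."}]
}).encode("utf-8")

request = urllib.request.Request(
    LOCAL_ENDPOINT, data=payload,
    headers={"Content-Type": "application/json"})

# urllib.request.urlopen(request)  # send once the container is running
print(request.full_url)
```

Because the host is `localhost`, the documents in the payload are processed inside your own network; nothing is posted to the public cloud API.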
Static Data Masking for SQL and Azure SQL in Preview
Static Data Masking is the next evolution of the data masking feature in Azure SQL and SQL Server. Previously what we had was dynamic data masking: if I have a column of social security numbers, for example, I can define a mask in SQL Server that shows only the last four digits and replaces the rest with stars. In storage, however, the actual social security numbers remain, so somebody with a high enough level of access, or somebody who queried the data a different way, could see the full number. It was called dynamic because the mask was applied only at query time.
The difference here is between dynamic and static. Now you also have the option to make the masking static: you run a specific DDL statement to define the mask on the field, and it literally goes in and changes the values all the way down to the storage level.
Obviously, you're not going to do this in production if you still need the real numbers. However, it opens up a lot of scenarios where you might not need to keep the full numbers anymore, for example, development, testing, and demos, because you can just change the data and hand that copy over for training.
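The dynamic-versus-static distinction can be made concrete with a toy sketch. This is plain Python standing in for the database, not SQL Server itself: dynamic masking applies the mask at read time and leaves storage untouched, while static masking overwrites the stored value.

```python
# Toy illustration of dynamic vs. static masking -- not SQL Server itself.

def mask_ssn(ssn: str) -> str:
    """Show only the last four digits; star out the rest."""
    return "***-**-" + ssn[-4:]

table = [{"name": "Ada",  "ssn": "123-45-6789"},
         {"name": "Alan", "ssn": "987-65-4321"}]

# Dynamic masking: the stored value is untouched; the mask is applied
# only on the way out, so privileged access still sees the real number.
dynamic_view = [{**row, "ssn": mask_ssn(row["ssn"])} for row in table]

# Static masking: the stored value itself is overwritten -- no level of
# access can recover the original from this copy.
for row in table:
    row["ssn"] = mask_ssn(row["ssn"])

print(dynamic_view[0]["ssn"], table[0]["ssn"])  # → ***-**-6789 ***-**-6789
```

After the loop, the "storage" (`table`) holds only masked values, which is exactly why a statically masked copy is safe to hand out for development or training.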
Accelerated Database Recovery for SQL and Azure SQL in Preview
This is going to be in SQL Server 2019, and right now it is in preview in Azure SQL Database. It's a feature that makes a big difference in some really important scenarios, and it doesn't affect application functionality in any way.
I'll explain what this does with a familiar scenario. Say you're a developer who has to do a mass data update or delete, and you forget to do it in batches: you need to delete 10 million records, and instead of working through them 10,000 at a time, you write one delete covering all 10 million and press Enter. Now the whole delete runs as a single transaction. Then someone rushes in, panicking: "Oh my god! You are blocking everybody." So you say, "Okay, I'll just kill the session." At that point you're stuck in rollback, which can take a long time, possibly even longer than the delete had already been running, and meanwhile you're causing a massive blocking scenario in your database.
Or say your database restarts, or fails over, in the middle of that delete: the service has to run recovery when it comes back online, working through the whole thing again before you get full access to your database.
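The batching discipline the anecdote wishes for can be sketched against SQLite, which ships with Python. The same pattern applies to SQL Server: many small transactions instead of one huge one, so a killed session or a crash only ever rolls back the current small batch.

```python
import sqlite3

# Batched-delete pattern, sketched against SQLite (stand-in for SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)",
                 [("row",)] * 100_000)
conn.commit()

BATCH = 10_000
while True:
    cur = conn.execute(
        "DELETE FROM events WHERE id IN (SELECT id FROM events LIMIT ?)",
        (BATCH,))
    conn.commit()              # each batch commits as its own transaction
    if cur.rowcount < BATCH:   # a short batch means nothing is left
        break

remaining = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(remaining)  # → 0
```

Each `commit` caps how much work a rollback can ever have to undo, which is precisely the pain Accelerated Database Recovery removes even when you forget to batch.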
Microsoft took a hard look at how all of this is implemented in SQL Server and came up with a way to make rollback near-instantaneous: Accelerated Database Recovery.
So now, if you kill your long-running delete, you get near-instantaneous access to the data as it looked before the delete started. The same applies if you restart the server, or if you fail over: recovery completes almost instantly thanks to Accelerated Database Recovery.
I think this is a huge deal because we've lived through these situations, especially doing Managed Services, and there's nothing you can do about them. There's not much you can tell the client other than to wait it out, because the alternative is an inconsistent database.
It's going to make all of those situations a lot easier. And you don't have to do anything; that's the nicest part. It will simply work this way from now on.
New SQL Server VM Azure Resource Provider
A resource provider is how Azure Resource Manager, Azure's automation layer, operates on cloud resources. It is the infrastructure-as-code, declarative deployment mechanism underlying Azure. When you want a template that creates a VM with a virtual network and a storage account, you essentially write declarations against the compute, storage, and network resource providers. Then you deploy that as a template, Azure does its work, and poof! You have infrastructure or services running in the cloud.
The new thing here is that they are adding a SQL Server resource provider, so machines running SQL Server will have their own resource provider. This means Azure itself will be aware that machines with SQL Server inside them should be treated differently from general-purpose VMs, and that you, as an Azure user, can take advantage of that distinction.
This opens up new automation scenarios, and even new licensing scenarios. It used to be that if you deployed a machine with a certain type of license on it, you could not easily switch it to another. For example, if you used a pay-as-you-go image from the Marketplace, you couldn't easily switch it to your own license even if you purchased one after the fact; you'd have to take backups, redeploy the VM, and restore them.
With this new resource provider, they're adding the capability to just run a declarative command and tell Azure, "I am switching this VM from pay-as-you-go to my own license." It opens another scenario, too, where they can expose SQL Server features declaratively at a higher level, so people don't have to worry about the underlying mechanics.
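As a sketch of what the declarative license switch might look like, here is a minimal ARM template resource fragment. The resource name, location, and `apiVersion` are illustrative assumptions; the property values should be checked against current Azure documentation before use.

```json
{
  "type": "Microsoft.SqlVirtualMachine/sqlVirtualMachines",
  "apiVersion": "2017-03-01-preview",
  "name": "my-sql-vm",
  "location": "eastus",
  "properties": {
    "virtualMachineResourceId": "[resourceId('Microsoft.Compute/virtualMachines', 'my-sql-vm')]",
    "sqlServerLicenseType": "AHUB"
  }
}
```

Redeploying the template with a different `sqlServerLicenseType` is the declarative equivalent of saying "switch this VM to my own license," with no backup-and-redeploy cycle.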
The first capability they're working on right now is for availability groups. You're going to be able to say, "I want an availability group with three nodes, using this storage account as the witness," and deploy that declaratively. In the background, Azure will create the Windows cluster, add the nodes to it, create the availability group, and configure it in SQL Server. It's going to open up automation scenarios for SQL Server that were not possible before, or that were a lot more complex to pull off.
Listen to the full conversation to learn about more announcements we discussed. Be sure to subscribe to the podcast to be notified when a new episode has been released.