Early in my DBA career, when we were architecting systems, OLTP (transactional) systems were designed for a much higher degree of availability than any other system, because they typically served a customer or a critical internal function.
We would use higher RAID levels, mirror databases, and sometimes even mirror entire servers to another data center, depending on the criticality of the application. This was very expensive.
Any system slanted toward reporting did not get the same degree of availability. There were a few reasons for this: the main one was cost, but the secondary reason was almost always that “the business could live without reports for a while.”
I don’t think the business users I worked with ever fully agreed with that, but they either weren’t asked or rarely succeeded in making the case for the same level of availability as the transactional systems.
When developers were building their software, we’d often “outlaw” them from integrating the data warehouse into their architecture, as it wasn’t going to be “as available.”
Part of my role as a DBA was to monitor traffic, looking for developers who disobeyed this directive. We would provide the data through extracts or other means, and the developers were tasked with coding to that lower level of availability on their side. This added more complexity and more jobs to monitor (and subsequently fix), but it worked.
Times Have Changed
Reporting databases are no longer optional, and analytics data is often mission-critical. Data is an asset, after all. Gartner predicts that by 2020, 10% of organizations will have a highly profitable business unit specifically for productizing and commercializing their information assets.
Organizations clearly need specific information to make decisions. That is not a new concept, but the pace is: change happens faster today, and those changes depend on data more than ever. This dependence will not go away, although one can predict that machines will make more of our decisions (but that is another blog post).
Analytics in the Cloud
Today, we’re blessed with a plethora of cost-effective options. If your analytics data is hosted in a public cloud, it can easily be mirrored to a nearby region or even to the other side of the world, and there are more globe-spanning databases available than ever. Forbes reported, based on a recent survey, that public cloud is the preferred deployment platform for cloud Business Intelligence and analytics.
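To make the mirroring point concrete, here is a minimal sketch using boto3 against Amazon RDS; the instance names, account number, and regions are hypothetical, and other clouds offer equivalent APIs:

```python
# A minimal sketch: creating a cross-region read replica of an
# analytics database on Amazon RDS with boto3. Instance names,
# account number, and regions below are hypothetical.
import boto3

# Connect to the *destination* region where the mirror will live.
rds = boto3.client("rds", region_name="eu-west-1")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="analytics-replica",  # hypothetical name
    # Cross-region replicas reference the source by its full ARN.
    SourceDBInstanceIdentifier=(
        "arn:aws:rds:us-east-1:123456789012:db:analytics-primary"
    ),
    SourceRegion="us-east-1",  # lets boto3 presign the copy request
)
```

What once required a second data center and a mirrored server now amounts to a single API call.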
Upcoming Analytics Megatrends
There are 10 predicted megatrends on the horizon when it comes to the types of data being used for analytics and the techniques that will be used to gather that data. Knowing what these changes are, and how to adapt to them, is essential for any analytics-driven organization.
You can view the Gartner webinar on these topics here.
If you are still doing analytics on-site, no problem: the cloud can be your stand-by data center. Cost is no longer prohibitive, and neither is the effort.
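As one illustration of how little effort that stand-by can take, this sketch ships an on-site backup file to cloud object storage with boto3; the bucket name and backup path are hypothetical, and the same pattern works with any cloud’s storage SDK:

```python
# A minimal sketch: shipping a nightly on-site database backup to
# cloud object storage so the cloud can serve as a stand-by data
# center. Bucket name and backup path are hypothetical.
import datetime

import boto3

s3 = boto3.client("s3")

backup_file = "/backups/analytics_db.bak"  # produced by your existing backup job
key = f"standby/analytics_db_{datetime.date.today():%Y%m%d}.bak"

# upload_file handles multipart uploads automatically for large backups.
s3.upload_file(backup_file, "example-standby-bucket", key)
```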
By most accounts, analytics in the cloud is simply better: more cost-effective and more available. So, if you’re not performing analytics in the cloud, what’s stopping you?
Pythian’s Kick Analytics as a Service offering was specifically designed to meet the growing demand for cloud-compatible analytics solutions. Contact us today to learn more.