I followed this morning’s press conference from SAP TechEd. And I came away a bit shocked.
It was actually interesting. SAP held my attention.
Really. Chief Technology Officer Vishal Sikka talked about semantic technologies and shared conversations he had with VMware CEO Paul Maritz. The event included demos of connected health systems that do mobile checks of blood levels.
We heard explanations about private cloud computing, especially in the context of a heterogeneous stack. That was interesting, if for no other reason than it made us even more curious about the concept of the private cloud and what it actually means.
But our curiosity was piqued when the conversation turned to in-memory technology. In the past few months, in-memory technology has established itself as a fundamental capability for discovering and presenting the granular bits of information that tell a wider story about the market.
In-memory technology is a core piece of the future enterprise, especially considering the state of the market for virtualized infrastructures and the technologies required to manage big data.
But here’s the thing about big data. It can be a joy to have or it can be crippling.
You can be showered with insights or deluged by data torrents.
That duality is also the market opportunity, and it’s why the category is worth a fortune in software development for SAP, Oracle, VMware, Citrix, Red Hat, IBM and any number of others.
Two pieces that fit into that golden necklace around the enterprise come from the cloud: virtualization and predictive analytics. Together they span both the hardware and the software, networked on data center technology that extends into the virtual layer.
It’s in this virtual layer that in-memory technology does its work. Search is an inherent aspect of columnar architecture, and perhaps an answer to the failed expectations of semantic technologies over the past 20 years. This elastic stack provides an infrastructure that can support big data, mesh networks and the apps that connect people.
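To make the columnar point concrete, here’s a minimal sketch in Python of why scanning a single attribute is cheap when data is laid out by column rather than by row. It’s our own toy illustration, not SAP’s implementation; the table and field names are invented for the example.

    # Row-oriented layout: each record is stored together.
    rows = [
        {"id": 1, "region": "EMEA", "revenue": 120.0},
        {"id": 2, "region": "APJ", "revenue": 340.0},
        {"id": 3, "region": "EMEA", "revenue": 75.5},
    ]

    # Column-oriented layout: each attribute is its own contiguous array.
    columns = {
        "id": [1, 2, 3],
        "region": ["EMEA", "APJ", "EMEA"],
        "revenue": [120.0, 340.0, 75.5],
    }

    # Row-store search: every whole record is touched just to test one field.
    emea_rows = [r["revenue"] for r in rows if r["region"] == "EMEA"]

    # Column-store search: only the two relevant arrays are ever read,
    # which keeps the scan contiguous, cache-friendly and easy to compress
    # when the whole table lives in memory.
    emea_cols = [rev for reg, rev in zip(columns["region"], columns["revenue"])
                 if reg == "EMEA"]

    assert emea_rows == emea_cols == [120.0, 75.5]

At scale, that difference is the whole point: an in-memory column store can sweep a single attribute across billions of records without dragging the rest of each record through the CPU.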
SAP’s answer comes with an increasing focus on these two potential jewels. In terms of virtualization, it’s a combined play: SAP will work with hardware vendors like Dell and IBM to provide an SAP-centric stack, which they call a landscape management solution. It follows plans earlier this year to partner with EMC, VMware and Cisco. The applications and hardware that fit into this heterogeneous stack are worth billions, if not trillions, of dollars in the overall marketplace.
The question is whether this on-premise, architecture-centric approach will take hold or whether customers will migrate to a more open environment, one not dominated by proprietary technologies.
This virtualized world means that companies can add the analytics component they need to understand big data. And it’s at this point that we wonder how the real cloud story fits in.
The question is about elasticity. I asked how SAP explains the private cloud to customers. Sikka explained that a private cloud will operate in the same manner as a public cloud. So is there a distinction? Why does all this investment need to be made if the public and private cloud operate one and the same?
What we do with big data will answer some of these questions. SAP is pinning its development goals on offering in-memory analytics as an appliance inside the enterprise.
But even with that comes the question of how the data flows beyond the network. In-memory analytics are required to process the data, but understanding it requires going beyond the concept of entirely private systems. First off, there is no such thing; all systems are porous to some extent. Second, the real world is far more vast. The public cloud is part of that real world, where big data is becoming the leading driver for how the world is viewed.
Where that leaves the private cloud is anyone’s guess. In our book, it’s all becoming one and the same.