Despite its strong association with the cloud, most Big Data deployments are in private data centers, writes Wikibon Principal Research Contributor Jeff Kelly in “For Big Data to Thrive in the Cloud, More Managed Services Required”.
This creates a couple of issues. First, many Big Data use cases are built on large volumes of data that reside in the cloud, not the corporate data center, and moving that data on premises is difficult and time-consuming. Second, building out an infrastructure to support Big Data analysis beyond the proof-of-concept level is expensive, requiring large amounts of hardware that must be crammed into often already crowded data centers. Unless the enterprise has a near-constant need for Big Data analysis, much of that infrastructure will sit idle, generating little value while it depreciates.
One reason that more of these initial Big Data trials are not done in the cloud, Kelly writes, is a shortage of managed, Hadoop-based cloud platforms on which to build them. He identifies three:
- Google’s BigQuery,
- The AWS Big Data engine, and
- Treasure Data (see embedded video below).
While these are good choices, more are needed to fill out the market. He suggests that vendors build analysis platforms on top of multiple IaaS environments, such as IBM SoftLayer, which specifically includes a bare-metal environment.
One new vendor looking to enter this market is Splice Machine, the latest venture of Monte Zweben, whose previous projects include Red Pepper Software and Blue Martini. Zweben says Splice Machine, currently in private beta, is the only NewSQL database engine built on Hadoop, combining the best aspects of both technologies. Designed for near real-time transactional environments, Splice Machine is a very high-volume solution, he says, and one early user moved a very large database onto it after Oracle “hit the wall”. Splice Machine, which will enter the public market in the next few weeks, is already running on AWS, and Zweben is exploring other cloud platforms.
Kelly recommends that Big Data practitioners either explore cloud services, to free themselves from the challenges of provisioning hardware, tuning systems and maintaining performance, or build their own Big Data analysis platforms on IBM SoftLayer or other bare-metal services.
Kelly’s full analysis, like all Wikibon research, is available without charge on the Wikibon website. IT professionals are invited to apply for free membership in the Wikibon community, which allows them to post comments and questions on research, publish their own tips and analysis, and participate in Wikibon research.
photo credit: PhOtOnQuAnTiQuE via photopin cc
About Bert Latamore
Bert Latamore is a journalist and freelance writer with 30 years of experience in the IT industry, including four years at Gartner and five at META Group. He is presently the editor at Wikibon.org and associate editor at Seybold Publishing. He follows the mobile computing market, including PDAs and tablet computing, and related subjects, both as a user of PDAs and tablet computers for more than 20 years and as a strategic analyst. He was the first person at Gartner to carry a pocket computer, in 1989.