VMware enables Hadoop to run on private and public clouds
By Asia Cloud Forum 15-Jun-2012
VMware today announced a new open source project, Serengeti, to help enterprises quickly deploy, manage and scale Apache Hadoop in virtual and cloud environments.
In addition, VMware is working with the Apache Hadoop community to contribute extensions that will make key components "virtualization-aware" to support elastic scaling and further improve Hadoop performance in virtual environments.
"Apache Hadoop has the potential to transform business by allowing enterprises to harness very large amounts of data for competitive advantage," said Jerry Chen, VMware's vice president, cloud and application services. "It represents one dimension of a sweeping change that is taking place in applications, and enterprises are looking for ways to incorporate these new technologies into their portfolios. VMware is working with the Apache Hadoop community to allow enterprise IT to deploy and manage Hadoop easily in their virtual and cloud environments."
De facto standard for big data processing
Apache Hadoop is emerging as the de facto standard for big data processing. However, deployment and operational complexity, the need for dedicated hardware, and concerns about security and service-level assurance prevent many enterprises from leveraging the power of Hadoop.
By decoupling Apache Hadoop nodes from the underlying physical infrastructure, VMware aims to give Hadoop rapid deployment, high availability, optimal resource utilization, elasticity, and secure multi-tenancy in a cloud infrastructure.
Available for free download under the Apache 2.0 license, Serengeti is a "one-click" deployment toolkit that allows enterprises to leverage the VMware vSphere platform to deploy a highly available Apache Hadoop cluster in minutes, including common Hadoop components like Apache Pig and Apache Hive.
"Hadoop must become friendly with the technologies and practices of enterprise IT if it is to become a first-class citizen within enterprise IT infrastructure. The resource-intensive nature of large big data clusters makes virtualization an important piece that Hadoop must accommodate," said Tony Baer, principal analyst at Ovum.
"VMware's involvement with the Apache Hadoop project and its new Serengeti Apache project are critical moves that could provide enterprises the flexibility that they will need when it comes to prototyping and deploying Hadoop," Baer added.