Integration With Pentaho Simplifies Deployment, Operations and Scaling of Enterprise Big Data Projects
HDS unveiled the next-generation Hitachi Hyper Scale-Out Platform (HSP), which now offers native integration with the Pentaho Enterprise Platform to deliver a sophisticated, software-defined, hyper-converged platform for big data deployments. Combining compute, storage and virtualization capabilities, the HSP 400 series delivers seamless infrastructure to support big data blending, embedded business analytics and simplified data management.
Modern enterprises increasingly need to derive value from the massive volumes of information technology (IT), operational technology (OT), Internet of Things (IoT) and other machine-generated data in their environments. HSP offers a software-defined architecture to centralize these large datasets and support easy storing and processing with high availability, simplified management and a pay-as-you-grow model. Delivered as a fully configured, turnkey appliance, HSP can be installed and supporting production workloads in hours instead of months, and it simplifies creation of an elastic data lake that helps customers easily integrate disparate datasets and run advanced analytic workloads.
HSP’s scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. The architecture also includes a centralized, easy-to-use interface to automate the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).
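For a rough sense of the analytic workloads such Spark-on-Hadoop environments typically host, the sketch below blends a hypothetical OT sensor feed with IT asset records in PySpark. It is purely illustrative: the data lake paths, dataset names and columns are assumptions for this example, not features of HSP or Pentaho.

```python
# Illustrative PySpark job of the kind an HSP-hosted Spark cluster might run.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("it-ot-blend").getOrCreate()

# Read an OT sensor feed and an IT asset inventory from the shared data lake.
sensors = spark.read.parquet("hdfs:///datalake/ot/sensor_readings")
assets = spark.read.json("hdfs:///datalake/it/asset_inventory")

# Blend the two sources on a common asset_id, then roll up
# average sensor readings per site for downstream analytics.
rollup = (
    sensors.join(assets, "asset_id")
    .groupBy("site")
    .agg(F.avg("reading").alias("avg_reading"))
)

# Write the blended result back to the data lake for reporting tools.
rollup.write.mode("overwrite").parquet("hdfs:///datalake/analytics/site_rollup")
spark.stop()
```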
“Many enterprises don’t possess the internal expertise to perform big data analytics at scale with complex data sources in production environments. Most want to avoid the pitfalls of experimentation with still-nascent technologies, seeking a clear path to deriving real value from their data without the risk and complexity,” said Nik Rouda, Senior Analyst at Enterprise Strategy Group (ESG). “Enterprise customers stand to benefit from turnkey systems like the Hitachi Hyper Scale-Out Platform, which address primary adoption barriers to big data deployments by delivering faster time to insight and value, accelerating the path to digital transformation.”
“We consistently hear from our enterprise customers that data silos and complexity are major pain points — and this only gets worse in their scale-out and big data deployments. We have solved these problems for our customers for years, but we are now applying that expertise in a new architecture with Hitachi Hyper Scale-Out Platform,” said Sean Moser, senior vice president, global portfolio and product management at Hitachi Data Systems. “Our HSP appliance gives them a cloud- and IoT-ready infrastructure for big data deployments, and a pay-as-you-grow model that scales with business growth. Seamless integration with the Pentaho Platform will help them put their IT and OT data to work — faster. This is only the first of many synergistic solutions you can expect to see from Hitachi and Pentaho. Together, we are making it easy for our enterprise customers to maximize the value of their IT and OT investments and accelerate their path to digital transformation.”