A Strategic Collaboration to Accelerate H2O.ai on Intel Platforms
Accelerating H2O.ai on Intel Architectures
H2O.ai and Intel have joined forces in a major collaboration named “Project Blue Danube”. The initiative focuses on accelerating H2O machine learning algorithms and libraries on Intel platforms. These efforts include optimizing algorithms such as XGBoost on Xeon, yielding 10X performance gains over previous implementations. H2O.ai and Intel will lead new innovations to provide the world’s leading enterprises with a highly scalable, high-performance, and secure data science platform, accelerating their data science workflows on the world’s most pervasive platform.
Machine Learning at Scale
H2O.ai’s CPU-accelerated machine learning algorithms, powered by Intel, deliver optimal performance on CPUs and allow enterprises to apply data science techniques designed to solve critical problems quickly, easily, and at scale. Customers can experience seamless deployment and trust that all machine learning models can be interpreted transparently and with high accuracy. H2O Driverless AI packages the techniques of expert data scientists into an easy-to-use application, empowering data scientists to work faster with automatic machine learning on state-of-the-art Intel computing power and to accomplish in minutes tasks that used to take months.
Faster Time to Market
Intel, together with H2O.ai, provides an integrated, validated AI platform with the tools and services needed to operationalize AI and accelerate time to market. This is realized through the rich collection of H2O.ai algorithms and tools, bundled and optimized with Intel’s industry-leading AI technology. Customers can now optimize price/performance while significantly reducing the time spent evaluating infrastructure and integrating the tools and services required to build an effective AI platform. Optimized libraries and frameworks further enhance performance while reducing programming complexity, and preloaded runtime environments and libraries simplify deployment.