H2O.ai Blog
H2O.ai and Snowflake Enable Developers to Train, Deploy, and Score Containerized Software Without Compromising Data Security
H2O.ai today announced its participation as a launch partner for Snowflake’s Snowpark Container Services (available in private preview), which provides our joint customers with the flexibility to train, deploy, and score models all within their Snowflake account. This further expands the ease of use for data science teams to create machin...
Read more
How Horse Racing Predictions with H2O.ai Saved a Local Insurance Company $8M a Year
In this Technical Track session at H2O World Sydney 2022, SimplyAI’s Chief Data Scientist Matthew Foster explains his journey with machine learning and how applying the H2O framework resulted in significant success on and off the race track. Matthew Foster: I’m Matthew Foster, the Chief Data Scientist for SimplyAI. So, I’m going t...
Read more
Improving Machine Learning Operations with H2O.ai and Snowflake
Operationalizing models is critical for companies to get a return on their machine learning investments, but deployment is only one part of that operationalization process. With H2O.ai’s latest Snowflake Integration Application, authorized Snowflake users can easily deploy models, significantly reducing deployment timelines and enabling a...
Read more
Improving Manufacturing Quality with H2O.ai and Snowflake
Manufacturers are rapidly expanding their machine learning use cases by leveraging the deep integration between Snowflake’s Data Cloud and the H2O AI Cloud. Many current manufacturing quality checks require that sensor data and image data be processed and analyzed separately. Standard tooling presents challenges in storing and referencin...
Read more
H2O Integrates with Snowflake Snowpark/Java UDFs: How to better leverage the Snowflake Data Marketplace and deploy In-Database
One of the goals of machine learning is to uncover predictive features in a dataset that might not have been apparent beforehand, hidden even from subject matter experts, and to use those 3rd-party features to increase the accuracy of the model. A traditional way of doing this was to try to scrape and scour distributed, stagnant data sources on th...
Read more
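The in-database deployment this post describes pairs Snowflake Java UDFs with an exported H2O MOJO. Purely as an illustration, here is a minimal handler sketch: the class name MojoScorer, the staged file model.zip, and the feature columns age and income are assumptions made for the example, not code from the post, while the hex.genmodel calls are H2O's real MOJO-scoring API.

```java
// Hypothetical Snowflake Java UDF handler that scores an H2O MOJO in-database.
// Class, method, file path, and column names are illustrative assumptions;
// the h2o-genmodel (hex.genmodel) calls are the library's actual API.
import hex.genmodel.MojoModel;
import hex.genmodel.easy.EasyPredictModelWrapper;
import hex.genmodel.easy.RowData;
import hex.genmodel.easy.prediction.BinomialModelPrediction;

public class MojoScorer {
    // Load the MOJO once; Snowflake reuses the handler across row invocations.
    private static final EasyPredictModelWrapper MODEL;
    static {
        try {
            MODEL = new EasyPredictModelWrapper(MojoModel.load("/model.zip"));
        } catch (Exception e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // One scalar UDF call per row: feature values in, P(positive class) out.
    public double score(double age, double income) throws Exception {
        RowData row = new RowData();
        row.put("age", String.valueOf(age));
        row.put("income", String.valueOf(income));
        BinomialModelPrediction p = MODEL.predictBinomial(row);
        return p.classProbabilities[1];
    }
}
```

Registered with something like CREATE FUNCTION score(age DOUBLE, income DOUBLE) RETURNS DOUBLE LANGUAGE JAVA ... HANDLER='MojoScorer.score', such a handler lets Marketplace-enriched rows be scored with an ordinary SQL SELECT, without the data ever leaving Snowflake.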