Last week at Red Hat Summit in Boston, Sri Ambati, CEO and founder of H2O.ai, demonstrated how to use our award-winning automatic machine learning platform, H2O Driverless AI, on Red Hat OpenShift Container Platform. You can watch the replay here.
What we showed not only helps data scientists achieve results; it also enables them to scale their machine learning efforts and easily deploy their models across the enterprise. Sri walked through the five easy steps to do automatic machine learning with Driverless AI on Red Hat OpenShift.
Now let’s see how all of these come together on OpenShift. Using the H2O.ai OpenShift templates, we can train a model with one template and deploy it with the other.
The demonstration focused on sentiment analysis, which we applied to a tweet at the end of the presentation. We started with a sentiment data set that we had pre-loaded.
We can visualize the data to get a snapshot of how the data set looks. Auto Visualization, part of Driverless AI, allows us to look at the data in many ways, from detecting outliers to exploring correlations, heat maps, and more.
We want to predict whether the sentiment is positive or negative. The only thing we need to do at this point is set accuracy, time, and interpretability: the “knobs and dials” that control how complex the model will be, how long it will train, and how interpretable it will be. Driverless AI automatically detected that this is an NLP problem and applied one of our NLP recipes to it.
We saw about 93% accuracy, which is quite good for a model trained on a small dataset of only 12,000 rows.
You can look at different charts, such as a ROC curve or lift and gains, and quickly review the summary. You can see that about 352 features were created from the single feature given originally: we provided only the text column to start. Based on that text column, Driverless AI created word embeddings, and those were the features used by the model.
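To give a flavor of the idea behind this kind of text feature engineering, here is a toy sketch in Python: a raw text column becomes numeric features by averaging per-word embedding vectors. The embedding table below is entirely made up for illustration; Driverless AI learns or loads its own embeddings as part of its NLP recipes.

```python
# Toy embedding table (made-up 2-dimensional vectors for illustration only).
EMBEDDINGS = {
    "beautiful": [0.9, 0.1],
    "awesome":   [0.8, 0.2],
    "terrible":  [0.1, 0.9],
}

def embed(text, dim=2):
    """Turn a text value into a numeric feature vector by averaging
    the embedding vectors of the words we recognize."""
    vectors = [EMBEDDINGS[w] for w in text.lower().split() if w in EMBEDDINGS]
    if not vectors:
        return [0.0] * dim  # no known words: fall back to a zero vector
    return [sum(col) / len(vectors) for col in zip(*vectors)]
```

A real NLP recipe builds many such derived columns (embeddings, TF-IDF statistics, and so on), which is how one text column can fan out into hundreds of model features.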
Once the models are built, you can download the scoring pipeline or interpret the model. We’ll show how to deploy it.
Using the OpenShift console, we now have a template to deploy this MOJO (our format for deploying models), which is optimized for low-latency scoring.
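The same template-based deployment can also be sketched from the `oc` CLI. The template name, parameter names, and service name below are assumptions for illustration, not the actual names shipped with the H2O.ai templates; check the templates in your cluster.

```shell
# Instantiate a (hypothetical) MOJO scoring template in the current project.
# Template and parameter names are placeholders for illustration.
oc new-app --template=h2o-mojo-scorer \
  -p MOJO_PIPELINE_FILE=pipeline.mojo \
  -p SERVICE_NAME=sentiment-scorer

# Expose the scoring service outside the cluster with a route.
oc expose svc/sentiment-scorer
```

Processing a template this way creates the deployment, service, and any other objects the template defines in one step, which is what makes the console workflow shown in the demo so quick.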
We had a small web app running and typed in the following to see the sentiment predicted by the model:
“The Red Hat keynote was beautiful & awesome”
This resulted in a positive sentiment prediction. It was a fairly simple demonstration of Driverless AI on OpenShift, but it showed how easy and seamless it is to start an instance, build and interpret a model, and finally publish that model to score live data.
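Under the hood, a web app like this just sends the text to the deployed scorer over HTTP. Here is a hedged sketch of that interaction; the URL, request shape, and response shape are assumptions for illustration, and the real MOJO REST scorer’s contract may differ.

```python
# Hypothetical route created for the deployed scorer (placeholder URL).
SCORER_URL = "http://sentiment-scorer.example.com/score"

def build_request(text):
    """Wrap a single tweet in a one-column frame, matching the training data,
    which had only a text column."""
    return {"fields": ["text"], "rows": [[text]]}

def parse_sentiment(response_json):
    """Map an assumed [negative, positive] probability pair to a label."""
    neg, pos = response_json["score"][0]
    return "positive" if float(pos) >= float(neg) else "negative"

payload = build_request("The Red Hat keynote was beautiful & awesome")
# Against a live deployment, this payload would be POSTed to SCORER_URL,
# e.g. with urllib.request or the requests library.
```

The point of the MOJO format is exactly this kind of low-latency, request-at-a-time scoring, independent of the Driverless AI instance that trained the model.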
It was great to participate in the Red Hat Summit. We enjoyed demonstrating how H2O.ai and Red Hat are working together to democratize AI for the enterprise.