Make your data science actionable: real-time machine learning inference with stream processing.

Are you ready to take your machine learning algorithms and make them operational within your business in real time? We will walk through an architecture for taking a Machine Learning model from training into deployment for inference on an open source platform for real-time stream processing. We will discuss the typical workflow from data exploration to model training through to real-time model inference (aka scoring) on streaming data. We will also touch on important considerations for ensuring maximum flexibility for deployments that need to run in Cloud-Native, Microservices and Edge/Fog architectures.

We'll demonstrate a working example of a Machine Learning model being used on streaming data within Hazelcast Jet, an extremely powerful platform for distributed stream processing.
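To make the idea concrete, here is a minimal sketch of how a pre-trained model might be applied to streaming events inside a Hazelcast Jet pipeline. It assumes the Jet 4.x Pipeline API; the `ScoringModel` class, its `score` method and the built-in test source are illustrative stand-ins for a real trained model and a real event source, not the implementation shown in the talk.

```java
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.ServiceFactories;
import com.hazelcast.jet.pipeline.ServiceFactory;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.test.TestSources;

public class StreamingInferenceJob {

    // Hypothetical stand-in for a trained model. In practice this would wrap
    // a model exported from the training environment (e.g. PMML or ONNX).
    static class ScoringModel {
        double score(double feature) {
            return feature > 0.5 ? 1.0 : 0.0; // placeholder decision rule
        }
    }

    public static void main(String[] args) {
        // Load the model once per cluster member and share it across the stage,
        // so every event is scored against the same in-memory instance.
        ServiceFactory<?, ScoringModel> modelService =
                ServiceFactories.sharedService(ctx -> new ScoringModel());

        Pipeline pipeline = Pipeline.create();
        pipeline.readFrom(TestSources.itemStream(10))      // simulated event stream
                .withIngestionTimestamps()
                .mapUsingService(modelService, (model, event) ->
                        "event " + event.sequence() + " -> score="
                                + model.score((event.sequence() % 100) / 100.0))
                .writeTo(Sinks.logger());                   // emit scores to the log

        // When submitted with `jet submit`, this attaches to the running cluster.
        JetInstance jet = Jet.bootstrappedInstance();
        try {
            jet.newJob(pipeline).join();
        } finally {
            jet.shutdown();
        }
    }
}
```

The detail worth noting is `ServiceFactories.sharedService`, which creates the (potentially large) model once per member rather than per event, so `mapUsingService` can score each streaming record with low latency.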

Room: Edward 1-4

Speakers
Neil Stevenson, Solution Architect at Hazelcast

Neil is a solution architect for Hazelcast®, the world's leading open source in-memory data grid. In more than 25 years of work in IT, Neil has designed, developed and debugged a number of software systems for companies large and small.

Slides & Recordings

Download Slides