Streaming Data

Divide, Distribute and Conquer: Stream v. Batch

Data is flowing all around us: from phones, credit cards, sensor-equipped buildings, vending machines, thermostats, trains, buses, planes, social media posts, digital pictures and video, and more.

Simple data collection is no longer enough. Most current systems process data via nightly extract, transform, and load (ETL) operations. This approach, common in enterprise environments, requires decision makers to wait an entire day (or night) for reports to become available.

Redefining Business with Streaming Analytics

Streaming analytics has become the latest buzzword in the business world. However, most enterprises struggle to see how streaming analytics can solve business problems specific to their industry and domain. In this session, Seshika will showcase applications of streaming analytics across industries and show how it can solve specific problems that boost revenue growth and enable deeper market penetration.

Stream Processing with In-Memory Data Grids: Creating the Digital Twin

This talk is targeted at application developers who want to explore the use of in-memory computing for streaming analytics. Its goal is to describe a key limitation of current techniques (e.g., Spark Streaming), namely tracking streaming context, and to present a new approach, implementing a digital twin on an in-memory data grid, that overcomes this limitation. It explains how the object-oriented architecture of in-memory data grids makes them well suited to applications that implement digital twins.
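
The digital-twin idea can be sketched in plain Java. This is a minimal illustration only, not the API of any particular data grid: each device is assigned a twin object that accumulates per-device state, so every incoming event is processed against the full context of its source. The class and field names (`TwinGrid`, `DeviceTwin`, `onEvent`) are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the digital-twin pattern (not a specific IMDG API):
// each data source gets a twin object that keeps its streaming context.
public class TwinGrid {
    static class DeviceTwin {
        long eventCount;      // how many events this source has produced
        double lastReading;   // most recent value from this source

        void update(double reading) {
            eventCount++;
            lastReading = reading;
        }
    }

    private final Map<String, DeviceTwin> twins = new ConcurrentHashMap<>();

    // In a real data grid, the twin would live on the node that owns the key,
    // so the update runs next to the data instead of shuffling it.
    public void onEvent(String deviceId, double reading) {
        twins.computeIfAbsent(deviceId, id -> new DeviceTwin()).update(reading);
    }

    public long eventCount(String deviceId) {
        DeviceTwin t = twins.get(deviceId);
        return t == null ? 0L : t.eventCount;
    }

    public static void main(String[] args) {
        TwinGrid grid = new TwinGrid();
        grid.onEvent("sensor-1", 21.5);
        grid.onEvent("sensor-1", 22.0);
        System.out.println(grid.eventCount("sensor-1")); // prints 2
    }
}
```

The key contrast with window-based stream processors is that state is keyed by data source rather than by time window, so context survives across the whole stream.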

In-Memory Stream Processing with Hazelcast JET

The Java SE 8 Stream API is a modern, functional API for processing Java collections. Streams can process data in parallel, utilizing multi-core architectures without a single line of multithreaded code. Hazelcast JET is a distributed, high-performance stream-processing DAG engine that provides a distributed implementation of the Java 8 Stream API. This session will highlight that implementation for big-data processing across many machines, from the comfort of your Java application.
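
As a taste of the programming model, the snippet below uses only the standard `java.util.stream` API (not JET's distributed version): switching from sequential to multi-core execution is a single call to `parallel()`, with no explicit thread management.

```java
import java.util.stream.IntStream;

public class ParallelSum {
    // Sum of squares 1..n, computed in parallel across available cores.
    static long sumOfSquares(int n) {
        return IntStream.rangeClosed(1, n)
                .parallel()                      // opt in to multi-core execution
                .mapToLong(i -> (long) i * i)    // widen before multiplying to avoid overflow
                .sum();
    }

    public static void main(String[] args) {
        System.out.println(sumOfSquares(1000)); // prints 333833500
    }
}
```

A distributed Stream API applies the same declarative pipeline style, but partitions the work across many machines instead of many cores.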

Scalable Real-time Notification System with Fine-grained Policy Control using Apache Ignite

In the era of mobility, mobile push notification is one of the most powerful channels for delivering messages to users. However, how do we decide what message to send, to whom, and how often? Clearly, this what/who/when problem needs to be precisely managed by policies to maximize user experience.

In this talk, we'd like to introduce a scalable real-time notification system that supports arbitrary types of sources and sinks, with a fine-grained policy control framework built on Apache Ignite. We'll also discuss the challenges we faced along the way.

Apache Ignite: This Is Where Fast Data Meets the IoT

It's not enough to build a mesh of sensors or embedded devices to gain insights about the surrounding environment and optimize your production. Usually, your IoT solution must also be capable of transferring enormous amounts of data to storage or the cloud, where the data is processed further. Quite often, these endless streams of data have to be processed almost in real time, so that you can react to the IoT subsystem's state accordingly and on time.