How to Use Java 8 Streams to Access Existing Data with Ultra-low Latency

Many interactive applications analyze large amounts of existing data, for example in high-frequency trading, situational awareness, AI, and IoT. These analytics applications often require that a wide range of operations and aggregations can be carried out with ultra-low latency and predictable performance over time.

In this presentation, we will learn how developers can access data from existing databases with ultra-low latency using standard Java 8 streams.
 
Covered Technologies
We will cover how standard Java 8 streams can exhibit ultra-low-latency properties even though a stream may semantically span terabytes of underlying analytic data. The presentation will include topics such as off-heap storage, code generation, database synchronization, column indexing, in-place de-serialization, memory mapping, stream optimization, parallelism, and the asymptotic behavior of the supported Java 8 stream operations.
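As a minimal, illustrative sketch of the off-heap storage, memory-mapping, and in-place de-serialization ideas listed above (and not of Speedment's actual engine), the following Java snippet maps a hypothetical fixed-width data file ("trades.bin", an assumed name) outside the heap and aggregates one field directly from the mapping:

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class OffHeapAggregation {

    // Hypothetical fixed-width record layout: an 8-byte id followed by an 8-byte price.
    private static final int RECORD_SIZE = Long.BYTES + Long.BYTES;

    public static void main(String[] args) throws IOException {
        try (FileChannel channel = FileChannel.open(
                Paths.get("trades.bin"), StandardOpenOption.READ)) {

            // The mapped region lives outside the Java heap, so scanning it adds no GC pressure.
            // (This sketch assumes the file fits in a single mapping of at most 2 GB.)
            MappedByteBuffer data = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());

            long records = channel.size() / RECORD_SIZE;
            long sum = 0;
            for (long i = 0; i < records; i++) {
                // In-place "de-serialization": read the price field directly from the mapping,
                // without materializing a Java object per record.
                sum += data.getLong((int) (i * RECORD_SIZE) + Long.BYTES);
            }
            System.out.println("Total price: " + sum);
        }
    }
}

The same idea, generalized across columns and combined with indexing and code generation, is what allows large data sets to be scanned and aggregated without heap allocation or garbage-collection pauses.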
 
A Java 8 stream does not describe how data is retrieved; that is delegated to the framework that defines the pipeline's source and termination. Nothing in the design of a stream requires the data to come from a SQL query. Speedment Enterprise exploits this fact: it contains an in-JVM-memory analytics engine that lets streams connect directly to RAM instead of a remote database. The engine provides streams with exactly the same API semantics as for databases but executes queries with orders-of-magnitude lower latency.

This creates a new way to write high-performance data applications in which the actual source of truth can remain in an existing database. Terabytes of data can be provisioned in the JVM with no garbage-collection impact, because the data is stored off-heap and can optionally be mapped to SSD files. Streams can complete with a latency well under one microsecond. By comparison, in a traditional application with a database connection, the TCP round-trip delay alone is rarely under 40 microseconds even on a high-performance network, and database latency and data-transfer times come on top of that.
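To make this concrete, here is a minimal sketch of such a pipeline using only standard Java 8 stream operations over plain objects. The Film class and its sample data are illustrative stand-ins for the entity classes a tool like Speedment would generate from a database schema; nothing here is Speedment's actual generated API.

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamQueryExample {

    // Illustrative entity; in practice such a class would be generated from the schema.
    static final class Film {
        final String title;
        final int length;

        Film(String title, int length) {
            this.title = title;
            this.length = length;
        }
    }

    public static void main(String[] args) {
        // Stand-in source; with an in-JVM-memory engine the same pipeline would be fed
        // from off-heap storage rather than an on-heap collection.
        List<Film> films = Arrays.asList(
            new Film("ACADEMY DINOSAUR", 86),
            new Film("ALIEN CENTER", 46),
            new Film("DARN FORRESTER", 185)
        );

        // The query is expressed purely in standard Java 8 stream operations; nothing in
        // this code says whether it runs against a database, RAM, or memory-mapped files.
        List<String> longFilms = films.stream()
            .filter(f -> f.length > 120)
            .map(f -> f.title)
            .collect(Collectors.toList());

        System.out.println(longFilms);
    }
}

Because the pipeline carries no assumption about its source, the framework behind it is free to resolve the filter against an index, run it in parallel, or read the columns straight out of memory-mapped storage.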
 
  
Purpose of the Talk
The objective of the talk is to show how easily, and how powerfully, existing data in a database can be accessed using in-memory computing and easy-to-use Java 8 streams.
 
Target Audience
Even though deep technical details will be discussed, a top-down approach will be used in which the fundamentals of low-latency computing are explained first. The business value and an overall understanding of low-latency properties will be accessible to anyone in the audience, while the more technical parts will require moderate to advanced skills.
 
Takeaways
General considerations for low-latency applications, and the simplicity and elegance of streams combined with low-latency performance.
 
When you are chased by a bear, you don’t have to outrun the bear. You only need to outrun your travel companion. In this presentation you will learn how to run faster.
 
Conference Track
Presumably “Streaming”
 
Speaker Bio
Palo Alto-based serial entrepreneur and inventor Per Minborg is the co-founder and CTO of Speedment, Inc. He has spoken at several major Java conferences, including JavaOne, DevNexus, Silicon Valley Java User Group, and the In-Memory Computing Summit, among many others. Per is a co-author of the publication “Modern Java”, a writer for Oracle Java Magazine, and a frequent contributor to open-source projects. He has 20 years of Java coding experience and runs the blog “Minborg’s Javapot”, which has millions of views.
 

Speakers
Per Minborg
CTO at Speedment AB
Co-founder and CTO of Speedment, Inc.