The speed at which data is generated is not going to slow down; that is no longer news. How do you shorten the time between an event occurring and an action being taken in response? How complex do these decisions need to be to enable meaningful and effective responses? How can you dynamically refine them based on what you uncover from your machine learning investments? After all, what good is learning if you don't use it? This session shows how to think about data utilization patterns in an event-driven world and architecture, and how to create a continuous event response. We will discuss how to build intelligent decision making into your data ingestion process, and how the evolution of in-memory computing, in conjunction with better processors, enables solving problems at scale in a near-linear fashion.