KLM: Kappa, Lambda, and My journey from legacy to the next new thing
Lambda and Kappa architectures, and of course serverless concepts, all lead to scalable, cost-effective solutions that can be used for analytics and machine learning as well.
But what if the project you are working on has already accumulated a few thousand person-years of development work, and you are asked to integrate it with yet another project of similar size and make the result 'digitized'?
It is nothing new that large projects taking many years to complete can easily be caught up with, or even overtaken, by new developments in information technology. Choices that seemed great back then are now horribly outdated. On the other hand, some developments that seemed hip and trendy have been outlived by these projects as well. There is a fine balance in deciding how new developments can be fitted into such large projects without disrupting them.
In one customer's long-running, multi-million-dollar project, Kappa and Lambda architectures have actually proven capable of enhancing the project and making it ready for the next generation. This cannot be done with zero impact, but taken pragmatically it is a great improvement over the existing architecture concepts.
In this talk I will go over how these concepts are applied to this legacy project; how components that used to rely on terabytes of database state can now use in-memory and streaming approaches; and how your team has to be brought up to speed on this new generation of tech to prevent the proverbial reinvention of the wheel (yes, utter failures will not be avoided!).
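To make the in-memory/streaming idea concrete: in a Kappa-style design, the durable event log is the source of truth, and what used to be a large database table becomes a view that can be rebuilt by replaying events. The sketch below is a minimal, hypothetical illustration (the `OrderEvent` type and `materialize` function are invented for this example; a real system would consume from a durable log such as Kafka rather than a Python list):

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical event type; in a real system events would come from a
# durable, replayable log (e.g. Kafka), not an in-memory list.
@dataclass(frozen=True)
class OrderEvent:
    customer: str
    amount: int

def materialize(event_log):
    """Replay the event log to rebuild per-customer totals in memory.

    In a Kappa architecture there is no separate batch layer: the same
    replay that feeds live processing also rebuilds state after a code
    change, so the big database snapshot becomes a derived view.
    """
    totals = defaultdict(int)
    for event in event_log:
        totals[event.customer] += event.amount
    return dict(totals)

log = [OrderEvent("alice", 10), OrderEvent("bob", 5), OrderEvent("alice", 7)]
state = materialize(log)
print(state)  # {'alice': 17, 'bob': 5}
```

The key design point is that fixing a bug in the aggregation logic no longer requires migrating stored state: you deploy the new code and replay the log.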
So, if you think you are stuck in an old-fashioned architecture and that using new tech would mean switching to another project, I hope to prove that there is light at the end of the tunnel.