Advances in technology solve problems while creating new ones
Previously I wrote about the difference in technology today compared to when MQ first came out. One of the most notable areas is network speed and how it relates to I/O as well as reliability. Not that long ago, you probably saved changes to your Word or PowerPoint documents every few minutes just to make sure you didn't lose a day's worth of work in case of a problem. OK, let's be honest: some of us did lose a day's worth of work because we forgot to save often enough. When IBM MQ first came out, providing guaranteed delivery meant coping with the volatile, slow networks of the time. As such, it had to be very frugal in its use of resources and make sure that any changes were hardened to disk.
Today, Word and PowerPoint constantly save your work in the background, so if something bad were to happen you are almost up to date, and it's less common that something bad does happen. So it's no surprise that if you compare the recovery strategy for Kafka with the recovery strategy for MQ, there are a lot of differences. Kafka makes assumptions about the environment that would not have been possible even five years ago.
For example, Kafka leverages multiple concurrent replicas of data, maintained at very high speed, which in turn are required to provide data integrity. These techniques rely on high-speed networks and network-attached storage that is an order of magnitude faster than the fastest local storage devices that existed not that long ago.
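As a concrete illustration, replication in Kafka is configured per topic. The sketch below uses Kafka's public Java AdminClient; the broker address, topic name, and sizing are illustrative assumptions, not details from this post. It creates a topic whose data is kept concurrently on three brokers and requires at least two in-sync replicas before a write is confirmed:

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateReplicatedTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical broker address; substitute your own cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions, each replicated across 3 brokers concurrently.
            NewTopic topic = new NewTopic("orders", 6, (short) 3);
            // At least 2 replicas must hold a record before it counts as
            // committed (pairs with acks=all on the producer side).
            topic.configs(Map.of("min.insync.replicas", "2"));
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```

A producer that sets acks=all on such a topic gets delivery guarantees comparable in spirit to MQ's persistent messages, but achieved through fast, concurrent network replication rather than frugal local hardening.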
IBM MQ has continued to modernize its facilities to take advantage of technology changes. For example, multi-instance queue managers were introduced, which relied on network-attached storage but came with restrictions. Replicated data queue managers (RDQM) then removed many of those restrictions but are only available on Linux. I can envision other recovery options becoming available in the future to keep MQ on par with modern systems.
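To show what this looks like from the application side, here is a minimal sketch of a client connecting to a multi-instance queue manager using the IBM MQ classes for JMS. The host names, port, and queue manager name are hypothetical; the key idea is the connection name list, which lets the client reconnect automatically to the standby instance if the active one fails:

```java
import javax.jms.JMSContext;

import com.ibm.msg.client.jms.JmsConnectionFactory;
import com.ibm.msg.client.jms.JmsFactoryFactory;
import com.ibm.msg.client.wmq.WMQConstants;

public class MultiInstanceClient {
    public static void main(String[] args) throws Exception {
        JmsFactoryFactory ff =
                JmsFactoryFactory.getInstance(WMQConstants.WMQ_PROVIDER);
        JmsConnectionFactory cf = ff.createConnectionFactory();

        // Active and standby instances of the queue manager (hypothetical hosts).
        cf.setStringProperty(WMQConstants.WMQ_CONNECTION_NAME_LIST,
                "mqhost1(1414),mqhost2(1414)");
        cf.setStringProperty(WMQConstants.WMQ_QUEUE_MANAGER, "QM1");
        cf.setIntProperty(WMQConstants.WMQ_CONNECTION_MODE,
                WMQConstants.WMQ_CM_CLIENT);
        // Ask the MQ client to reconnect transparently after a failover.
        cf.setIntProperty(WMQConstants.WMQ_CLIENT_RECONNECT_OPTIONS,
                WMQConstants.WMQ_CLIENT_RECONNECT);

        try (JMSContext ctx = cf.createContext()) {
            ctx.createProducer().send(ctx.createQueue("queue:///APP.QUEUE"),
                    "hello from a reconnectable client");
        }
    }
}
```

With RDQM, a floating IP address can serve the same purpose, so the client reconnects to a single address regardless of which node currently hosts the queue manager.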
But these trends have also created new challenges. Since the components are distributed across a number of nodes and move dynamically, visibility requires modernized tools that can handle the dynamic nature of this environment. Nastel's products have constantly been innovating to make sure that you can do your job no matter how complex the underlying infrastructure is.