Data Mesh is indeed a hot topic these days. Is it the answer to all our big data problems? Probably not, but it will help some organizations. Is it obsolete? See what I think in this recent blog post…
In the summer of 2022, Gartner® published its Hype Cycle for Data Management, and this was not without some controversy.
In particular, it asserted that Data Mesh will be obsolete before it reaches what Gartner calls the Plateau of Productivity. It is hard to fathom how they came to that conclusion when there is such a vibrant Data Mesh Community complete with a very active Slack channel, and when DataOps.live customers like Roche and OneWeb have successfully built a Data Mesh using DataOps.live and Snowflake. But we’ll come back to that in a moment.
Data Mesh is in essence a conceptual approach and architectural framework, not a specific off-the-shelf tool or technology. One of its founding aims is to improve productivity and time to value by eliminating the bottlenecks organizations have experienced building out large-scale enterprise data warehouses and data lakes. Zhamak Dehghani, the creator of the Data Mesh concept, has described it as a decentralized sociotechnical approach – meaning it involves people, processes, and technology. By its very nature, then, no two implementations will be the same. How one organization defines productivity, based on its specific needs and outputs from its own application of Data Mesh, will differ from another. Yes, Data Mesh is (kind of) new and yes, it is maturing, but that does not mean we should ignore or discount it this early in the game.
It is hard these days to separate the hype from the reality, so check out the entire post over at the DataOps.live blog.
The Data Warrior