The Data Crunch

As AI advancements accelerate, we’re facing a "data crunch" reminiscent of the 2007 financial crisis. Just as subprime mortgages were bundled into seemingly safe packages and passed along, organisational data is often riddled with quality issues hidden behind the polished interfaces of applications and reports.

The financial crash was a systemic problem. It wasn’t caused by one bad actor but by multiple small, poor decisions that were not fully understood until the entire interconnected system collapsed. Similarly, fragmented data infrastructures - filled with unverified or poorly integrated data - pose a systemic risk to organisations as they enter the age of AI.

As the general intelligence of foundational models increases, organisations must respond by ringfencing, consolidating, and automating their specific intelligence. This specific intelligence relies on all of an organisation’s people and data working together as one interconnected system.

This is where the problems begin. Many AI applications are being layered on top of data foundations that require significant reinforcement. AI models rely on large volumes of high-quality data: the worse the data, the worse the model. These issues compound - failures aren’t caused by one bad dataset but by multiple unresolved quality issues interacting in unpredictable ways that weaken the entire system.

As AI accelerates through the economy, organisations with poorly integrated data systems will begin to show cracks. Disparate but entangled data quality issues will lead to unreliable AI insights and a loss of trust. Within a ten-year timeframe, many organisations may crumble under the strain of their fragmented infrastructures, losing relevance as their specific intelligence fades into the background intelligence of larger foundational models.

But there’s hope. Just as some foresaw the financial crash and acted, we too can recognise the approaching data crunch and prepare. While data warehouses, lakes, and meshes have laid the groundwork, we must now take the next step toward Total Data Connectivity - a state where all organisational data (structured and unstructured) is seamlessly interoperable, enriched with shared semantics, and accessible across systems.

Key strategies to achieve this include:

🔹Harnessing AI models to tackle the data integration problem.
🔹Using ontologies to formalise tribal knowledge and share semantics.
🔹Connecting data into a distributed web using URLs.
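The last two strategies can be sketched in a few lines of code. The sketch below is illustrative only: the namespaces, record identifiers, and predicate names (`example.org`, `acme-ltd`, `billedAs`) are all hypothetical, and the "graph" is a plain in-memory set of triples standing in for a real triplestore. It shows the core idea, though: once every record and every term is identified by a URL, facts from different systems become rows in one shared, queryable web of data.

```python
# Minimal sketch of shared semantics + URL-linked data (all names hypothetical).
# Each fact is a (subject, predicate, object) triple; URL-style identifiers make
# records from different systems addressable in one interconnected graph.

CRM = "https://crm.example.org/id/"          # assumed CRM system namespace
BILLING = "https://billing.example.org/id/"  # assumed billing system namespace
ONT = "https://ontology.example.org/def/"    # shared ontology namespace

graph = set()

def add(subject, predicate, obj):
    """Record one fact in the shared graph."""
    graph.add((subject, predicate, obj))

# 1. Formalise tribal knowledge as an ontology: pin down what "Customer" means.
add(ONT + "Customer", ONT + "definition",
    "A party with at least one active contract.")

# 2. Connect data across systems: the same real-world customer appears in the
#    CRM and in billing; an explicit link makes the two records interoperable.
add(CRM + "acme-ltd", ONT + "type", ONT + "Customer")
add(CRM + "acme-ltd", ONT + "billedAs", BILLING + "cust-0042")

def objects(subject, predicate):
    """Query the graph: all objects linked from subject via predicate."""
    return [o for (s, p, o) in graph if s == subject and p == predicate]

# 3. Follow the link from one system into another.
print(objects(CRM + "acme-ltd", ONT + "billedAs"))
# prints ['https://billing.example.org/id/cust-0042']
```

In practice this is what RDF triplestores and standards such as RDFS and OWL provide off the shelf; the point of the sketch is only that shared semantics plus globally unique URL identifiers are what turn disparate datasets into one system.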

The message is clear: don’t waste time chasing AI side projects. Instead, focus your energy on using today’s AI to organise and connect your data, so it’s ready for the transformative AI of the near future. By reinforcing data foundations with structured metadata, semantic clarity, and better integration, organisations can build resilience and thrive in the age of AI.

⭕Network to System: https://www.knowledge-graph-guys.com/blog/network-to-system
