The Data Deluge: Unraveling the Productivity Paradox in a Stagnant Digital Age
Data refers to numbers, indexes, symbols, and other forms of information used to describe economic activity and, more broadly, our daily lives. The earliest known writings, etched on clay tablets in ancient Mesopotamia, recorded data such as quantities of goods traded or stored. This evidence highlights data’s timeless importance: no organized society can function without it. Central authorities—whether governing a village, city, state, or empire—depend on essential facts about their domain: population size, worker counts, production volumes, and more. Such data underpins taxation, resource allocation, military recruitment, and other vital tasks.
As civilizations grew more intricate and economies multilayered, the need for detailed data intensified to maintain order and governance. Yet, it’s not just rulers who rely on it—every participant in economic life does. When commerce emerged, merchants tracked prices of goods bought and sold, logging profits or losses. Farmers selling crops to these traders likewise noted their earnings or deficits. In short, the demand for data scales with the population an entity oversees and the complexity of its operations. In our transaction-heavy society, even individuals need basic data to manage household budgets.
We undeniably inhabit a world of data. Numbers and indices permeate our existence. Governments oversee data on demographics, economics, transportation, security, law enforcement, and healthcare. Companies juggle operational records, customer insights, and financial metrics. Fortunately, Information Technology (IT) has emerged to tame this flood, aiding humanity in this daunting challenge. Computers, ubiquitous in workplaces by the 1990s, were engineered by brilliant minds to store, process, and automate data tasks.
Nearly five decades after the advent of IT, a period defined by relentless innovation, we remain inundated by data. Unlike the transformative technologies of the Industrial Revolution, which markedly boosted productivity, IT has largely failed to deliver similar gains, except for a notable surge in the 1990s, which we’ll explore later. This disconnect is termed the "productivity paradox" or "Solow Paradox," after economist Robert Solow’s famous 1987 observation: "You can see the computer age everywhere but in the productivity statistics." In this discussion, I aim to unravel this paradox, proposing three potential explanations:
1. The technology itself is inadequate for the challenge.
2. We are misapplying the technology.
3. We generate and process more data than IT can effectively manage.
My analysis draws primarily on data from the United States, though its conclusions resonate across the Global North. The U.S. stands out as the epicenter where "parasitic jobs" flourished, setting a blueprint for Western economies transitioning from industrial powerhouses to service-driven systems dominated by finance and IT.
Let’s begin with the public sector, which, in reality, operates as a sprawling public-private complex. In the U.S., this complex is enormous, consuming roughly 37% of GDP annually through government funding. IT adoption came early here for two reasons: first, the U.S. pioneered IT, particularly through its defense-industrial complex; second, its vast public sector was an ideal testing ground for processing, storing, and organizing massive datasets. Jennifer Pahlka’s book Recoding America offers an insider’s perspective on this software-driven complex. The following excerpt from the book captures much of the answer to our question:
"Each successive leadership at an agency typically receives a budget or mandate to address only the most pressing technological crises. Since tech investments are usually justified by adding new capabilities rather than renovating existing systems, each component is built using different technological paradigms from different eras. However, every new piece depends on what came before, meaning each successive layer is constrained by the limitations of earlier technologies. The system is not so much updated as it is expanded. Over time, new functionalities are added, but the system never sheds the core limitations of its foundational technologies. At the same time, it becomes increasingly complex and fragile. Updates require caution, as any change in one layer can have unforeseen consequences in others. It becomes harder to support older technologies in the lower layers, while newer layers demand constant updates and patches. Eventually, the paint cracks."
As expected, IT did not reduce complexity; instead, it increased it. This outcome was predictable because no interest group wants procedures to become fully automated. Complete automation would mean millions of people losing their jobs. The lower quality of public services is a small price to pay, a price borne mostly by the most vulnerable individuals, who have the least power to change the situation. In most cases, as the book mentions, the job gets done—or at least half-done—thanks to the heroic efforts of a few conscientious employees. The author argues that the system in the U.S. public-private sector is structured in a way that makes it completely sclerotic. Rather than being dismantled, the system is patched up every time a new problem arises, with shortcuts that compound its complexity. Increased complexity, in turn, leads to further bloating of the system. Specifically, the author notes that when the system becomes completely dysfunctional, the solution is always to throw more money at the problem. This translates to hiring more people, creating new contracts, or expanding existing ones.
While money may not always solve the problems of public services, it does address the main issue in our society: the economy’s inability to produce productive, well-paid jobs. Pahlka’s optimism—suggesting cultural shifts like "waterfall to agile" could help—feels naïve. So does her surprise at politicians’ and bureaucrats’ indifference to digital flops. But let’s be realistic. What is the true role of politicians? They may claim to be public servants, but the reality, as the author herself describes, is that their primary function is to balance the pressures exerted by various interest groups. Who are these groups? They include lobbyists representing corporate interests, who seek a larger share of state funding, and the white-collar class responsible for managing the state’s data—individuals who are part of politicians' daily lives. What do these groups ultimately want? Jobs. Thus, when digital public services falter, politicians feel neither anguish nor accountability; deep down, they know their real mission—securing jobs—has been accomplished. Glitches don’t faze them unless public outrage flares—like the Healthcare.gov mess under Obama, which botched Affordable Care Act delivery. Even then, pouring in more funds often proves a handy fix—not because it perfects the system, but because it raises the odds that someone might feel pressured to take their work seriously. In the end, politicians typically spin neglect into victories, contractors walk away wealthier, and the cycle grinds on.
Developing nations are increasingly outpacing their Global North counterparts in delivering effective digital public services. Notable examples include India’s Unified Payments Interface (UPI), a seamless real-time payment system, and Brazil’s Pix, an instant payment platform. This disparity arises because, while modern technology can manage any state’s data demands, incentives diverge sharply. In developing countries, scarce funding for hiring staff becomes a catalyst: it drives the adoption of cost-effective digital systems to replicate human-delivered services, requiring only political resolve to succeed. Western states, however, confront a stark choice—squander vast sums on parasitic jobs to maintain public services or deploy efficient IT with minimal human input. Invariably, they opt for the former, bowing to interest groups that favor job preservation over innovation.
Thus, the public-private complex misuses IT, layering complexity instead of streamlining. Worse, it spawns new data—indices like the Disability Equality Index (DEI), Gender Equality Index, and Sustainable Development Goals (SDG) Index—swelling administrative ranks in government-funded universities, hospitals, and beyond. New managerial roles emerge to wrangle this data, not boost efficiency. Productivity gains were never in the cards. While private companies within this complex may have incentives to be more productive to increase their profits, the structure—with its multiple compliance requirements, lack of coordination, and legal hurdles—makes productivity nearly impossible.
The story of the private sector is more complicated and can be divided into two distinct periods. The first period, roughly from 1990 to 2000, saw official statistics showing significant increases in productivity. As experts and a paper on the U.S. labor market I consulted explain, this was due to several factors: computers automated routine tasks (e.g., data entry, inventory tracking), reducing labor hours needed per unit of output; connectivity streamlined supply chains, enabled real-time communication, and opened new markets. Retail productivity rose as firms like Walmart used IT to optimize logistics. Computers also enabled process innovations—such as just-in-time manufacturing, which cut waste, and ERP systems (like SAP) that integrated operations. Additionally, information technology facilitated computer-based manufacturing techniques like computer numerically controlled (CNC) machines, allowing workers to program routine production tasks with incredible precision. I believe the economists are correct: IT drove these advancements, and the positive results were reflected in productivity charts.
Yet a persistent question lingers: why did growth falter afterward, despite semiconductor process nodes shrinking from 130 nm to 2 nm, unlocking vast efficiencies, and despite smartphones and cloud computing revolutionizing corporate landscapes? In our current era, the second phase of this technological wave, IT permeates everything, yet Western economies languish in persistent productivity stagnation. The 1990s demonstrated that IT could overhaul supply chains, leading some to argue that the "low-hanging fruit" of gains was harvested, leaving subsequent breakthroughs harder to achieve. This partly accounts for the slowdown, but I suspect a deeper shift: a flood of data has overwhelmed us, outpacing IT’s capacity to manage it effectively. Consider how major U.S. tech giants—Google, Meta, Amazon, and Microsoft (with Apple as an exception, thriving on hardware)—generate their profits. Their revenue streams—advertising, cloud services boasting 20-40% margins, and business intelligence (BI) tools—all depend on a single linchpin: data.
Not all data is equal, however. In the 1990s, it centered on manufacturing, logistics, and supply chains—supply-side metrics like production volumes, product specs, machinery stats, and shipping timetables. This data, efficiently harnessed, oiled the economy’s gears. Since the 2000s, the tide has turned to demand-side data, driven by a single catalyst: the internet. IT birthed this digital realm, whose reach exploded with smartphones and social media post-2010. As we browse, we leave traces—clicks, views, likes—captured as the "new data." This has eclipsed supply-side figures so thoroughly that "data" now often means just our online footprints. Branded "the new gold" since the 2010s (search "data new gold" to see for yourself), its value sparked a frenzy—big data and analytics buzz took hold. Managing it birthed empires: Google and Meta target ads with our data; Amazon, Microsoft, and Google store it in clouds, where, by my rough estimate (made with the help of Grok AI), something like 85% of stored data is demand-side; admittedly a hunch, but one I find plausible. Unstructured and vast, this "gold" demanded mining, fueling Microsoft’s ascent and a thriving analytics ecosystem.
I still find it incredible that some people have managed to create a business model centered around our online behavior; it seems so strange to me. I doubt its sales impact matches the hype; without this "new gold" fervor, this landscape might not exist—a thread for later exploration. One truth stands firm: this data era hasn’t lifted productivity. Productivity grows when better management or technology squeezes more output from the same inputs. Economy-wide, it surges when efficient tech spreads across firms, slashing prices, spurring demand, and absorbing displaced workers—think electricity or 1990s IT. That’s scaling; everyone wins.
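To make that claim concrete, here is the simple textbook notion of labor productivity I have in mind; this is a deliberate simplification (official statistics also adjust for capital deepening, labor quality, and prices), but it is enough for the argument:

\[
\text{labor productivity} = \frac{Y}{H}, \qquad g_{\text{productivity}} \approx g_Y - g_H,
\]

where \(Y\) is real output, \(H\) is total hours worked, and \(g\) denotes an annual growth rate. The 1990s fit this arithmetic: \(g_Y\) rose while hours barely grew, because IT let the same workers produce more. In the advertising-driven economy described next, neither term moves much.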
Today’s reality diverges sharply. In an age of abundance, Western firms prioritize advertising over production. Production still matters—its neglect costs us—but CEOs fixate on profits via visibility. Competition shifts from delivering customer value to securing prime digital real estate—internet ad slots. The internet, a boundless marketplace, gathers global consumers in a magical bazaar. Tech titans—swift, savvy, or ruthless—claimed this turf, now renting it at a premium to firms vying for attention. Value-driven rivalry scales economies and spawns productive jobs; this zero-sum ad war stagnates them. For two decades, Western productivity has flatlined, explaining the "productivity paradox." Economists miss it: the economy has morphed from expansive to static.
What does this have to do with data? The connection is largely artificial. Google and Meta do not sell our data; they rent out their privileged positions on the internet. Internet data tells advertisers when a given consumer passes through a specific digital space, but ultimately the timing hardly matters, since consumers will pass through eventually. The digital ad market has created a structured mechanism to manage this process. From the spam-filled internet of the 1990s, we have moved to a more controlled environment in which advertisers bid in real-time auctions based on user data, ad quality, and budget. However, the result is not so different from the spam era—the internet is still flooded with ads, though this is a topic for another discussion.
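For readers who want to see the mechanics, the sketch below is a toy version of the generalized second-price logic that ad auctions are commonly described as following: ads are ranked by bid times quality, and the winner pays just enough to hold its rank. The advertiser names, numbers, and exact pricing rule are illustrative assumptions on my part, not the actual mechanism of Google, Meta, or any specific platform.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    max_bid: float   # the most the advertiser will pay for one click
    quality: float   # the platform's estimate of ad relevance, 0 to 1
    budget: float    # remaining daily budget

def run_auction(bids):
    """Toy generalized second-price auction for a single ad slot.

    Ads are ranked by max_bid * quality; the winner pays just enough
    to outrank the runner-up, never more than its own bid or budget.
    """
    eligible = [b for b in bids if b.budget > 0]
    if not eligible:
        return None
    ranked = sorted(eligible, key=lambda b: b.max_bid * b.quality, reverse=True)
    winner = ranked[0]
    if len(ranked) > 1:
        runner_up = ranked[1]
        # smallest bid that would still beat the runner-up's rank score
        price = runner_up.max_bid * runner_up.quality / winner.quality
    else:
        price = 0.01  # nominal reserve price when there is no competition
    return winner.advertiser, round(min(price, winner.max_bid, winner.budget), 2)

# Hypothetical advertisers competing for one slot on a single page view.
print(run_auction([
    Bid("shoes_inc",  max_bid=2.00, quality=0.9, budget=50.0),
    Bid("travel_co",  max_bid=3.00, quality=0.5, budget=20.0),
    Bid("gadget_llc", max_bid=1.50, quality=0.8, budget=0.0),   # budget spent
]))  # -> ('shoes_inc', 1.67)
```

Notice where the "user data" enters: only through the quality score that fine-tunes who wins. What is actually being sold is the slot itself, which is exactly why I argue the real asset is the privileged position, not the data.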
The obsession with data is largely tied to a dominant mindset centered around information. Since the 1970s and 1980s, Western societies have idealized "good jobs" as clean, white-collar office roles reserved for their most talented, ambitious, and hardworking members. These jobs primarily involved managing information from the real world, while physical or manual labor was often stigmatized as demeaning. When technology emerged to streamline data management, it threatened these prestigious roles. To preserve them, the professional managerial class began scouring for new sources of data wherever they could be found. The internet provided the raw material they were searching for. The fact that this new data was of lower quality compared to traditional data didn’t matter. White-collar employees, having spent their entire careers immersed in data, naturally valued it as "gold." The public sector seized the opportunity to step in and champion the protection of personal data. In doing so, it legitimized the data’s value by inflating its significance. However well-meaning their intentions, bureaucrats unconsciously saw the new data as an opportunity to create more parasitic jobs for themselves and their children. In this way, a potentially transformative technology like IT was neutralized. Contrary to the conventional wisdom that technology disrupts labor markets, the structure of employment has remained largely unchanged over the past 50 years. If society does not desire change, no technology can force it to happen. Even Larry Summers, in his recent paper on the U.S. labor market, acknowledges this: "We find that—contrary to popular imagination—the pace of labor market disruption has slowed in recent decades. The changes in the structure of U.S. employment at the end of the nineteenth century were greater than in any decade of the digital era."
However, formidable challenges loom ahead. We have reached the final stage of capitalism in the Global North, where companies have shifted their focus away from production and instead devote all their energy to luring more customers. Their competition has become zero-sum. As a result, the economic pie no longer expands, and firms vie for our finite attention online. If the economy fails to regain productivity, new, well-paying jobs will not emerge. Ambitious young people will continue scrambling for the dwindling pool of prestigious yet increasingly unstable data jobs. This stems from the fact that the true asset of tech companies, as I’ve noted earlier, is not the data they harvest from us but the dominant positions they occupy on the internet. The entire data industry—cloud providers, SaaS companies, and data analytics firms—remains vulnerable because their customers are not naïve. For now, the narrative of “golden” data holds sway, largely amplified by financial speculation sparked by the unrepeatable profits of early tech giants that claimed prime digital territory when the internet was still uncharted. Yet, as the hype subsides, companies may cut back on lavish spending for cloud services and SaaS, triggering a slow erosion of private-sector data jobs. The only data roles left standing will likely be within the public-private complex. I fear a future in which everyone is fighting to get into that complex, with corruption and hypocrisy as its prevailing norms.