Artificial intelligence continues to dominate headlines, reflecting both its growing capabilities and the ongoing debate over its impact on society. Over the past year, attention centered on how AI could enhance personal and professional lives, with many organizations exploring its potential. Amid the excitement, however, alarmist predictions of existential threat also surfaced, of the kind popularized by James Cameron and his fictional Cyberdyne Systems.
The past year also brought its share of bizarre trends, including the viral wave of people using AI to turn photos of themselves into action figures, which, while amusing, highlighted how broadly the technology is now used. The current landscape reveals unprecedented growth and resource consumption, drawing comparisons to the Industrial Revolution, perhaps warranting the term “Intelligence Revolution.”
As millions of professionals lean on AI, the technology driving this shift carries significant resource demands: servers, electricity, water, memory, chips, and expansive data centers. The strain becomes evident when 50 million users tap into substantial computing power at once. And as demand escalates, so does the cost of these essentials, a textbook lesson in supply and demand.
Water consumption is one of the most pressing issues. A massive data center can consume up to 5 million gallons a day, and according to the Lincoln Institute of Land Policy, Texas data centers used approximately 50 billion gallons last year. Even smaller AI interactions, such as a 20-40 query conversation with ChatGPT, can consume the equivalent of a 16-ounce bottle of water, shedding light on the staggering scale of resource usage.
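The figures above allow a rough back-of-envelope check. This is a sketch only: the inputs are the estimates cited in the article, the 30-queries-per-session midpoint is an assumption, and the per-query figure is derived, not measured.

```python
# Back-of-envelope water math from the figures cited above.
# Assumption: a 20-40 query ChatGPT session uses ~one 16 oz bottle of water.
OZ_PER_GALLON = 128

bottle_gal = 16 / OZ_PER_GALLON           # 0.125 gallons per session
per_query_gal = bottle_gal / 30           # midpoint of 20-40 queries/session

big_center_gal_per_day = 5_000_000        # "up to 5 million gallons a day"

# How many 30-query sessions would one large center's daily water cover?
sessions_per_day = big_center_gal_per_day / bottle_gal

print(f"~{per_query_gal:.4f} gal per query")
print(f"~{sessions_per_day:,.0f} sessions' worth of water per day")
# → ~0.0042 gal per query
# → ~40,000,000 sessions' worth of water per day
```

Tiny per-query amounts, in other words, add up fast at data-center scale.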
Turning to electricity, data centers in the U.S. are estimated to account for 5-10% of the nation’s power consumption. These large facilities house servers, data storage systems, and critical networking equipment necessary for digital operations. Each email sent, show streamed, or photo saved to the cloud is facilitated by these infrastructures. Surprisingly, while Florida appears to be a natural candidate for leading data center operations, states like Texas (over 300), Virginia (over 600), and California (over 300) dominate the landscape, with roughly 5,000 data centers in the U.S. comprising 40% of the global market.
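Taken together, the counts above imply a rough global total. This is a derived estimate under the article's own numbers, not a figure the article states directly:

```python
# If ~5,000 U.S. data centers represent 40% of the global market,
# the implied worldwide total follows directly.
us_centers = 5_000
us_share = 0.40

global_centers = us_centers / us_share   # implied global count

# "Over" these counts per the article; treated here as floors.
state_counts = {"Virginia": 600, "Texas": 300, "California": 300}

print(f"Implied global data centers: ~{global_centers:,.0f}")
print(f"Top-3 states alone: {sum(state_counts.values()):,}+ centers")
# → Implied global data centers: ~12,500
# → Top-3 states alone: 1,200+ centers
```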
Indeed, many companies boast of possessing a “data center,” but simply having a server in a closet does not suffice. However, Florida may soon see the development of a significant data center, potentially enhancing its standing in this competitive arena.
The discussion extends beyond water and electricity to the essential components of memory and chips. Memory chips, integral to data management, come in two forms: RAM for short-term and Flash/ROM for long-term storage. High-bandwidth memory (HBM), crucial for AI applications, is currently experiencing a supply shortage, impacting prices significantly. Forecasts predict that prices for computer memory could increase by over 50% by early 2026, reflecting current market strains.
The complexity of HBM technology, which involves stacking multiple memory layers into a single chip, means that manufacturers like Micron must prioritize these advanced chips over conventional memory production. Consequently, consumers should brace for price hikes of around 25% when purchasing new PCs. Such fluctuations are reminiscent of past spikes caused by pandemic-related production issues, but this current scenario poses unique challenges.
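As a rough illustration of what those two forecasts mean for a buyer, the sketch below applies the cited percentages to hypothetical base prices (the dollar amounts are made-up examples, not figures from the article):

```python
# Hypothetical illustration of the cited forecasts: memory up ~50%,
# whole-PC prices up ~25%. Base prices below are invented examples.
ram_kit_today = 100.0      # hypothetical price of a RAM kit, USD
pc_today = 1_000.0         # hypothetical price of a new PC, USD

ram_kit_2026 = ram_kit_today * 1.50   # ">50% by early 2026" forecast
pc_after_hike = pc_today * 1.25       # "~25%" PC price-hike forecast

print(f"RAM kit: ${ram_kit_today:.0f} -> ${ram_kit_2026:.0f}")
print(f"PC:      ${pc_today:,.0f} -> ${pc_after_hike:,.0f}")
# → RAM kit: $100 -> $150
# → PC:      $1,000 -> $1,250
```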
Furthermore, the supply of rare earth metals, vital for many high-tech applications, remains a critical concern. China produces more than half of the global supply of these 17 essential metallic elements. In the U.S., companies are actively exploring alternative sources, such as Mosaic in Florida, which is investigating waste mining methods to reclaim rare earth elements from phosphate mining materials.
In discussions about AI’s future, Eduardo Gonzalez Loumiet, a partner at Ruvos, emphasized the urgent need for infrastructure investment. “The question isn’t whether artificial intelligence is coming; it’s already here, and its use will only continue to accelerate. As a region anchored by education, government, and an increasingly innovative private sector, we must be honest about what that reality demands,” he stated. He highlighted the importance of investing in power, connectivity, and data capacity to ensure that institutions can responsibly harness AI’s potential.
AI is not a harbinger of doom but rather a catalyst for significant resource demands. Balancing the use of AI technologies, such as ChatGPT, could mitigate the strain on resources. As society adjusts to this digital intelligence revolution, it is imperative to manage human intelligence effectively, reserving fantastical transformations for realms of fiction. The future of AI will depend on our ability to navigate these challenges pragmatically, ensuring that technology serves as a tool for progress rather than a drain on our essential resources.