Prepare for the AI Gold Rush
Why leading businesses are investing heavily in AI infrastructure and what it could mean for the future of work.
Investing ahead and spending astronomical amounts may seem futile, naive and speculative, but history has shown that businesses that get ahead of emerging technology tend to remain relevant in the long run.
The first industrial revolution saw developments in manufacturing, trade routes and railways.
The second industrial revolution saw electrical utilities, chemical, steel and oil companies, alongside automobiles and planes, take over.
The third and fourth industrial revolutions witnessed the advent of computing hardware and software, accompanied by cloud computing. Familiar enterprises such as Google, Microsoft and Apple capitalized on these opportunities and continue to thrive today as a result.
The next revolution is arguably an AI-focused world. Its future is still being written, and it holds the potential for more abundance than any previous revolution, whilst probably posing the greatest risk yet to the relevance of today’s workforce.
The fact is that today’s leaders are certain that AI is here to stay, and they want to be ready for it to avoid irrelevance at the hands of their competitors. Even though we don’t fully know how AI will improve lives, or even how it can be monetized, companies and organizations across the globe are on perhaps the largest tech spending spree the world has ever seen.
Hungry for GPUs
The leading tech platform providers, known as hyperscalers (e.g. Amazon AWS, Microsoft Azure, Google Cloud), and other major tech companies are investing heavily in AI compute, driven by insatiable demand to serve both their own internal R&D needs and those of their customers, who typically rent compute power from them.
GPU designers and manufacturers such as NVIDIA and AMD are overrun with demand for their hardware and the related software that delivers the required compute. This demand shows no sign of stopping, but what is it being used for in the context of AI, and will these new uses of AI put professional jobs at risk?
The global AI market was valued at $200 billion in 2023 and is projected to reach $2 trillion by 2028, driven by enterprise adoption of AI for automation, analytics, and customer engagement.
Large Language Models (LLMs) and generative AI need significantly more computing power for training and inference than standard cloud services.
AI models, in particular reasoning models, need over 100 times the compute resources of previous generations, pushing hyperscalers to continually expand their datacenter capacity with seemingly no end in sight.
For context, every time a ChatGPT query is run, a new AI image is created, or a complex AI problem is solved, an ever-increasing amount of compute power is needed.
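To make that concrete, here is a back-of-the-envelope sketch using the common rule of thumb that a dense transformer spends roughly 2 × parameters FLOPs per generated token at inference time. The model size, token count and accelerator throughput below are illustrative assumptions, not figures from any provider.

```python
# Rough per-query inference cost, using the ~2 * parameters FLOPs-per-token
# rule of thumb. All constants are illustrative assumptions.

PARAMS = 70e9              # hypothetical 70B-parameter model
TOKENS_PER_QUERY = 500     # prompt + response tokens for one chat query
ACCELERATOR_FLOPS = 1e15   # assumed sustained throughput of one modern GPU (FLOP/s)

flops_per_query = 2 * PARAMS * TOKENS_PER_QUERY
gpu_seconds_per_query = flops_per_query / ACCELERATOR_FLOPS

print(f"~{flops_per_query:.1e} FLOPs per query")              # ~7.0e13
print(f"~{gpu_seconds_per_query:.2f} GPU-seconds per query")  # ~0.07
```

Multiply a fraction of a GPU-second per query by billions of queries a day, plus image generation and multi-step reasoning, and the hyperscalers’ appetite for GPUs starts to look less speculative.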
Hyperscaler Investments in AI
| Company | AI Infrastructure Spend (approx.) | Why? |
| --- | --- | --- |
| Microsoft (Azure) | ~$80 billion in fiscal 2025 | AI-ready data centers for model training, Copilot, Azure AI services, OpenAI partnership |
| Amazon (AWS) | Over $100 billion in 2025 CapEx | Expanding AI infrastructure for AWS and AI customers |
| Alphabet (Google) | ~$75 billion planned in 2025 | AI infrastructure for Gemini, Vertex AI, TPU clusters |
| Meta | ~$60 billion planned in 2025 | AI compute capacity for internal tools and services |
Even nations are investing in AI infrastructure, with China estimated to be investing $138 billion and the EU committing $235.7 billion, both to reduce the risk of falling behind and to minimize reliance on US-based companies.
So, what is all this compute power currently being used for?
LLM Training and Inference
Developing AI models such as GPT-4 or Google Gemini requires significant GPU clusters. Inference, including real-time processing for user queries, is increasingly dominant in its demand for compute. AI inference demand is now estimated to be 10 times that of training.
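That 10:1 estimate becomes easier to believe with a back-of-the-envelope comparison: training is roughly a one-off cost of about 6 × parameters × training tokens FLOPs, whilst inference compute accumulates every day the model is in service. The sketch below uses purely hypothetical model and traffic figures.

```python
# Hypothetical comparison of a one-off training run versus cumulative inference,
# using common rules of thumb: ~6 * parameters FLOPs per training token and
# ~2 * parameters FLOPs per generated token. All figures are illustrative.

PARAMS = 70e9               # hypothetical 70B-parameter model
TRAINING_TOKENS = 15e12     # hypothetical training corpus size
TOKENS_PER_QUERY = 500      # prompt + response tokens per query
QUERIES_PER_DAY = 1e9       # hypothetical global query volume

training_flops = 6 * PARAMS * TRAINING_TOKENS
daily_inference_flops = 2 * PARAMS * TOKENS_PER_QUERY * QUERIES_PER_DAY

print(f"Training run:    ~{training_flops:.1e} FLOPs")         # ~6.3e24
print(f"Daily inference: ~{daily_inference_flops:.1e} FLOPs")  # ~7.0e22
print(f"Inference matches the training run after "
      f"~{training_flops / daily_inference_flops:.0f} days")   # ~90 days
```

Under these assumptions, a few months of serving traffic already rivals the entire training run, and the gap only widens as usage grows.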
Agentic and Physical AI
Nascent use cases, including robotics and autonomous vehicles, are proving to be a significant opportunity to apply AI to safety, cost and labour challenges.
These use cases place particularly heavy demands on AI workloads because of the enormous volume of vision data that must be captured and processed for physical AI.
Intelligent Document Processing
A typical example is the integration of machine learning (ML) and natural language processing (NLP) to improve the accuracy and efficiency of data extraction from documents such as invoices, contracts and forms.
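As a simple illustration of the idea (not any specific product), the sketch below uses spaCy’s pretrained English pipeline to pull structured fields out of free-text invoice wording; the document text and the entity labels extracted are purely illustrative.

```python
# Minimal sketch of ML/NLP-assisted data extraction from a document,
# using spaCy's pretrained named-entity recognition. Illustrative only.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline, installed separately

invoice_text = (
    "Invoice from Acme Ltd dated 12 March 2025. "
    "Total due: $4,250, payable to Jane Smith within 30 days."
)

doc = nlp(invoice_text)
extracted = {}
for ent in doc.ents:
    # Group recognised entities by label, e.g. ORG, DATE, MONEY, PERSON
    extracted.setdefault(ent.label_, []).append(ent.text)

print(extracted)
```

In production systems this pattern is typically combined with OCR for scanned documents and human review for low-confidence fields.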
Developer Tools
AI-enabled tools such as GitHub Copilot and AWS CodeWhisperer accelerate software development by generating code and automating developer workflows, reducing development time while improving quality.
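Under the hood, these assistants follow a familiar pattern: send a natural-language task plus context to a hosted model and get generated code back. The sketch below shows that pattern using the OpenAI Python SDK purely as an illustration; it is not how Copilot or CodeWhisperer are integrated in practice, and the model name is an assumption.

```python
# Minimal sketch of the code-generation pattern behind AI coding assistants:
# a natural-language task is sent to a hosted model, which returns code.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "Write a Python function that validates an email address."},
    ],
)

print(response.choices[0].message.content)
```

Real assistants layer IDE context, repository awareness and inline suggestions on top of this basic request/response loop.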
On-Device AI
Smartphones, tablets, wearable devices and personal computers are just some examples of devices that are starting to integrate AI in their functionality.
Whilst they can increase privacy and safety, they also contribute significantly to the accelerated demand for AI processing.
Talent Wars
Whilst a full understanding of what lies ahead remains elusive, it’s clear that the opportunity for business growth and market dominance is expected to reach levels previously unseen.
Such is the expected business opportunity that Meta has been on an epic recruitment spree, offering salaries and deals that make the pay of elite sports stars or world-class CEOs appear meagre at best.
Meta’s hiring spree
| Person / Team | Previous Affiliation | Offer | Did they take it? |
| --- | --- | --- | --- |
| Matt Deitke | Allen Institute for AI / Vercept | Initially $125M, then $250M over 4 years | Accepted |
| Researchers from OpenAI | OpenAI | Up to $300M packages (with >$100M in year one); $100M+ signing bonuses also cited | Several accepted (~7–8 joined Superintelligence Lab) |
| Thinking Machines Lab team | Thinking Machines Lab (ex-OpenAI) | Offers ranging from $200M to $1B over multiple years; one $1B offer | All declined |
| Anthropic staff | Anthropic | ~$100M signing bonuses or more | Declined |
| Unnamed OpenAI researchers | OpenAI | $100M signing bonuses or high comp | Most declined; none of OpenAI’s “best people” moved |
So, what do we see as the near-term impacts for professionals like you and me? Are there opportunities in the short term? Will our careers be replaced by AI robots?
In reality, we won’t fully know the impact until the longer-term use cases of applying AI in our everyday lives have matured. However, history has shown that whilst workforce roles were displaced, each revolution created new types of employment and business opportunities, enabling a net gain in job creation and economic prosperity.
What I’ve Been Reading
That’s it for this edition. For more delivery leadership insights, subscribe to the Change Leaders Playbook podcast series on YouTube, Spotify, Apple and Audible.
P.S. How was this article? Your feedback helps to make future posts even more relevant and useful.