AI as a Species: A New Lens on Competition and Strategy
I was listening to an episode of Ezra Klein’s podcast recently where Thomas Friedman, the New York Times columnist, casually introduced a fascinating concept. He described AI not just as a tool or invention - but as a new “species”.
He did not elaborate much on it, but I couldn’t stop thinking about this different framing, especially because he made the comparison in the context of global power dynamics. Where coal, oil, and manufacturing once defined geopolitical strength, Friedman argued that AI will become one of the next great engines of growth and influence (together with electric vehicles and clean energy).
That idea - AI as a species - might sound far-fetched at first. But the more I thought about it, the more it felt like an intriguing concept. So I decided to put some thoughts down and explore what it might actually mean, why it matters, and how it could reshape the industries I care about most - media, entertainment, and technology.
What Does “AI as a Species” Actually Mean?
In biology, a species is defined by a combination of traits. It evolves over time, adapts to its environment, develops unique behaviors, and plays a specific role in an ecosystem. Reproduction is also part of the definition - organisms that can generate new generations, often by interbreeding - but it's just one of several dimensions.
AI is not biological. It does not breathe, mutate, or reproduce in the traditional sense. But if we step back and think more conceptually, it begins to resemble something that grows, adapts, and evolves - almost like a new kind of digital life form.
Here’s how:
Reproduction and Iteration: AI systems don’t breed, but they are cloned, fine-tuned, and retrained across generations. GPT-4 is a descendant of GPT-3, which in turn built on GPT-2, and so on. Each version inherits architectures, patterns, and capabilities. In the future, we may even see models that initiate their own updates or spawn task-specific versions of themselves (little minions) - raising the possibility of AI systems that reproduce or self-improve with minimal human input. I attended a very interesting presentation last week about the Cybernetics of Emotion, where emotions were modeled based on known traits - physical, social, and cognitive needs - leading to different types of behavior. What if some of the needs coded in and prioritized by AI are survival (itself modeled by other needs like hunger and thirst) and reproduction?
Adaptation: Modern AI systems are continually updated. They are retrained on new data, adapt to shifting inputs, and optimize their outputs. Like species evolving in response to environmental changes, these systems adjust to stay relevant and effective.
Emergent Behavior and Natural Selection: One of the most fascinating aspects of AI is its ability to surprise its creators. Whether it’s discovering new chess strategies, writing complex code, or simulating emotional nuance in dialogue, AI is increasingly exhibiting behaviors that weren’t explicitly programmed - similar to how unexpected traits can emerge in evolving species. Some of these traits may improve the fitness (or accuracy) of AI in a particular environment, driving the evolution and propagation of that trait over time.
Ecosystem Roles: In nature, every species plays a role. In the digital world, AI is doing the same. It recommends what we watch, filters what we read, summarizes what we write, and powers the infrastructure behind how we search, buy, and learn (and helps me double-check this article!). It is becoming functionally essential in many environments - and its footprint keeps expanding.
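The selection dynamic described above can be made concrete with a toy simulation. This is purely an illustrative sketch of fitness-proportional selection with mutation - none of the numbers or names come from any real AI system; they are assumptions chosen just to show how a trait that improves fitness propagates across generations.

```python
import random

random.seed(0)

def select_and_mutate(population, generations=30):
    """Toy natural selection: each individual is a single 'fitness' trait
    (say, accuracy in some environment). Fitter variants are more likely
    to be copied into the next generation, and each copy mutates slightly -
    a loose analogy for iterating and fine-tuning model generations."""
    for _ in range(generations):
        # Fitness-proportional selection: better variants propagate more.
        population = random.choices(population, weights=population, k=len(population))
        # Mutation: each copy drifts a little, like a tweaked fine-tune.
        population = [min(1.0, max(0.01, p + random.gauss(0, 0.02))) for p in population]
    return population

start = [random.uniform(0.4, 0.6) for _ in range(100)]
evolved = select_and_mutate(start)
print(sum(start) / len(start), sum(evolved) / len(evolved))
```

Even though the mutations are random and unbiased, the mean fitness of the population rises over generations, because selection copies the better variants more often - the same logic by which an accidental but useful emergent trait would spread.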
I also found out that this is not a brand-new idea. Even back in 1863, Samuel Butler published Darwin Among the Machines, where he warned that machines could evolve beyond our control. In 2023, former PayPal exec Mike Todasco revisited this framing in AI: The Emergence of a New Species, arguing that AI systems might soon become independent enough to merit species-like status.
It’s a metaphor, yes - but one that might help us think more clearly about the path we are on.
Historical Parallels: From Coal to AI
Every era has had its defining resource.
The industrial revolution was powered by coal. The 20th century was defined by oil. The digital age was built on silicon, data, and connectivity.
Now we may be entering an era where intelligence - synthetic, scalable, and increasingly autonomous - is the new foundation of power.
If coal powered trains and oil fueled engines, AI will power decisions, optimize systems, and shape culture. It will not just make us faster or more efficient - it could fundamentally change how we work, learn, play, create, and govern.
And just like oil or coal, AI will restructure global alliances, concentrate power in new hands, and force entire industries to reorganize. The countries and companies that lead in AI - not those that excel at screwing tiny screws into iPhones - will shape the rules of the game.
AI Evolution: Coexistence or Competition?
If we begin to think of AI as a species, the next question is: what kind of relationship are we building with it?
In one scenario, it’s a symbiotic relationship. AI helps us work smarter, unlocks creative possibilities, and supports our decisions without replacing human judgment. In this future, it becomes a co-pilot - an extension of our capabilities.
In another, more cautionary scenario, AI becomes a rival force. Not necessarily malicious, but powerful and potentially misaligned. It evolves too fast, concentrates power among a few players, and reshapes systems in ways we do not fully understand. In this world, we risk becoming reactive - constantly trying to catch up to the very systems we created.
Neither future is inevitable. But the decisions we make now - about how we design, govern, and integrate these systems - will shape what comes next.
Challenges to the “AI as a Species” Framing
Of course, the metaphor has its flaws.
There’s a risk of anthropomorphizing AI - assigning it qualities it doesn’t actually have. This is still software. It does not feel, reflect, or desire the way humans do. Treating AI systems like organisms could lead us to misunderstand their limitations - or overestimate their autonomy.
There is also a danger of fatalism. If we talk about AI as an independent species, we may start to believe its evolution is inevitable and outside our control (hello, Skynet). But it’s not. AI reflects our decisions: what we train it on, how we deploy it, and what guardrails we put in place.
Still, useful metaphors don’t have to be perfect. This one at least forced me to think in longer arcs, ask bigger questions, and prepare for structural - not just technical - change.
Implications for Industry and Strategy
If we take seriously the idea that AI is not just a new tool but a new kind of actor (‘agent’ seems to be the word of the moment) - something that evolves, adapts, and plays a role in the economic ecosystem - then we also need to rethink how we respond to it.
This reframing forces a shift from linear, technology-driven roadmaps to adaptive, ecosystem-aware strategies. It’s not about deploying AI like software - it’s about understanding where it fits, how it grows, and what kind of dynamics it sets in motion.
For Governments and Policymakers
If AI behaves like a species, then it introduces new geopolitical dynamics. We are not just competing on tech infrastructure - we are competing on intelligence infrastructure.
National strategies may start to resemble environmental policy or public health frameworks - focused on stewardship, containment, or alignment, rather than control.
Regulations will need to evolve from binary compliance frameworks to more adaptive oversight - monitoring not just model outputs, but how those models are interacting with society at scale.
Questions of sovereignty, national security, and influence will be reshaped by who controls the most capable and aligned AI systems.
For Companies
Organizations will need to move from thinking of AI as a tactical solution to seeing it as a strategic layer - one that impacts product design, customer interaction, talent models, and competitive advantage.
Business models may shift from static pipelines to adaptive systems that respond to user behavior in real time.
Competitive moats won’t just be built around data or scale, but around feedback loops, integration depth, and how effectively a company co-evolves with its AI infrastructure.
Companies will need new governance mechanisms - ones that account for AI’s unpredictability and emergent behaviors.
For Users and Citizens
If AI evolves like a species, then we are not just its users - we are its cohabitants. We need to think about how we live with it, not just how we use it.
Reliability, transparency, and autonomy become central. Users will demand systems they can understand, influence, and, most importantly, trust - not just ones that are fast or efficient.
Social norms will shift. We may need a new type of literacy - understanding how systems learn, when they’re being manipulative, or how they might misrepresent intent.
Our digital environments may start to feel less like tools and more like organisms - responsive, personalized, and increasingly opaque.
This new frame doesn’t give us all the answers - but it invites better questions. And for anyone thinking seriously about the future of business, policy, or culture, those questions are the real competitive advantage.
Implications for Media & Entertainment
If we accept the premise that AI is evolving like a species - not just a tool, but an autonomous actor within the economic ecosystem - then we need to consider what that means for industries like media and entertainment. Not just in terms of workflow automation or personalization, but in terms of who (or what) is participating in the creative and commercial process.
Creation: An Expanding Creative Intelligence
AI is no longer just a productivity tool - it is becoming a creative force in its own right. It can write, compose, edit, and even design with increasing fluency. But what happens when a system evolves its own sense of narrative structure, comedic timing, visual style, or musical pattern?
In this framing, AI doesn’t just assist creators - it becomes a co-inhabitant of the creative space. It generates ideas not because it is told to, but because it has been trained on cultural patterns and is now capable of remixing them into original outputs.
This raises fundamental questions about originality, ownership, and agency. If a system evolves a unique creative signature, do we treat it as a tool, or as something closer to an autonomous contributor in the cultural ecosystem? Aren’t most artists, in fact, being constantly influenced by everything around them, including other artists, and repackaging those influences (even if subconsciously) under their own artistic style? How have we been treating originality and ownership when that happens?
Distribution: Adaptive and Ecosystem-Driven
Today, AI powers recommendation engines and content ranking systems. But as models become more autonomous and adaptive, we are entering an era where AI systems themselves shape the flow of culture - deciding what content surfaces, in what form, and for whom.
These systems are not just amplifying content - they are dynamically reorganizing entire attention economies based on feedback loops, engagement data, and network effects. They are behaving less like code, and more like distribution organisms, adjusting constantly to their environments.
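The feedback loop just described - engagement drives exposure, which drives more engagement - can be sketched as a tiny simulation. This is a generic rich-get-richer toy model, not how any real platform's ranking system works; the item names and numbers are invented for illustration.

```python
import random

random.seed(1)

def attention_loop(items, steps=1000):
    """Toy engagement feedback loop: the more an item has been clicked,
    the more often it is surfaced - and being surfaced earns it further
    clicks. A rich-get-richer dynamic, loosely like ranking-by-engagement."""
    clicks = {item: 1 for item in items}  # every item starts with one click
    for _ in range(steps):
        # Surface items with probability proportional to past clicks...
        shown = random.choices(items, weights=[clicks[i] for i in items])[0]
        # ...and assume exposure converts into another click.
        clicks[shown] += 1
    return clicks

result = attention_loop(["a", "b", "c", "d"])
print(sorted(result.items(), key=lambda kv: -kv[1]))
```

Run it and attention concentrates: small early differences get amplified into large gaps, which is why these loops reorganize "attention economies" rather than merely reflecting pre-existing popularity.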
In that context, questions of platform control, discoverability, and editorial voice shift dramatically. Humans are no longer the only curators.
Of course, some still believe that nothing beats the human element (or emotion?) in the curation process. But until when? Who wants to bet that AI won’t keep improving and eventually do a better job at curation?
Monetization: Intelligence as Economic Agent
Traditionally, monetization in media has been about capturing attention. But if AI is acting as an independent species, then it may start to influence - or even originate - new forms of value extraction.
AI systems already personalize offers, adapt pricing, and optimize conversions. But they’re also beginning to test, learn, and evolve their own monetization strategies in real time. These systems behave more like autonomous economic agents, responding to user behavior and shaping markets dynamically - sometimes beyond what human teams can fully explain.
If these systems grow more capable, monetization becomes less about deploying a business model - and more about living inside a system that is continuously optimizing its own economics.
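One common shape for the "test, learn, and evolve in real time" behavior described above is a bandit algorithm. The sketch below uses epsilon-greedy price testing; the price points and conversion rates are entirely made up, and this is one generic technique, not a claim about any specific platform's monetization system.

```python
import random

random.seed(2)

# Hypothetical price points mapped to conversion rates the agent cannot see.
PRICES = {4.99: 0.30, 9.99: 0.18, 14.99: 0.10}

def epsilon_greedy_pricing(rounds=5000, epsilon=0.1):
    """Epsilon-greedy bandit: mostly offer the price with the best observed
    revenue per visitor, but keep exploring other prices 10% of the time."""
    stats = {p: {"shows": 0, "revenue": 0.0} for p in PRICES}
    for _ in range(rounds):
        if random.random() < epsilon:
            price = random.choice(list(PRICES))  # explore a random price
        else:
            # Exploit: pick the best observed revenue per visitor so far.
            price = max(stats, key=lambda p: stats[p]["revenue"] / max(1, stats[p]["shows"]))
        stats[price]["shows"] += 1
        if random.random() < PRICES[price]:  # simulated purchase decision
            stats[price]["revenue"] += price
    return stats

stats = epsilon_greedy_pricing()
for price, s in stats.items():
    print(price, s["shows"], round(s["revenue"], 2))
```

The agent converges on whichever price maximizes revenue per visitor without ever being told the conversion rates - it discovers them by experimenting on live traffic, which is exactly why human teams can struggle to explain the resulting behavior.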
If AI is a species, then it is not just changing media - it is participating in it. It is not just a tool behind the curtain. It is becoming part of the cast, the crew, the critic, the curator - and in some cases, the consumer.
And that demands a complete rethinking of how media ecosystems are structured, governed, and valued.
Final Thoughts
When Friedman said AI is becoming a new species, he may not have meant it as a fully formed theory. But sometimes, the ideas that land hardest are the ones that feel slightly off - because they force us to think differently.
Maybe the biggest shift we are living through is not about faster chips or better models. It is about a new form of presence in the world - one that evolves, adapts, reproduces, and becomes increasingly woven into how our systems function and how our cultures express themselves.
We should stop asking ‘what can AI do?’ and instead ask ‘what role will we let it play - and how do we want to live with it?’