I'm Stealing From Creators Every Day With AI (And So Are You)
AI theft isn't just about visual art—it's extracting value from every creative domain. And we're all complicit.
Studio Ghibli recently discovered an AI image generator had mimicked their distinctive artistic style without permission, triggering lawsuits and outrage.
But the problem extends far beyond visual art. AI systems are quietly extracting value from writers, musicians, photographers, filmmakers, and academics—often without attribution or compensation.
Last week, I asked Claude to summarise a business book I'd never purchased.
Within minutes, I had extracted years of someone's intellectual labour, research, frameworks, and insights without paying a penny. I then incorporated these stolen concepts directly into my presentation without even remembering the author's name.
I didn't think twice about it.
These incidents are happening with increasing frequency, and the scale is difficult to comprehend. The pace of AI development and integration is outstripping our ability to adequately address the ethical implications.
This quiet "borrowing" has made me profoundly uncomfortable.
We've Been Here Before (But This Is Worse)
Pondering this, I was reminded of what happened when Google started aggregating news content.
Scott Galloway described in "The Four" how publishers like The New York Times could have acted early and limited Google's access to their material, forcing negotiation for content rights. Instead, they capitulated, allowing unfettered access that fundamentally altered media power dynamics.
As a result, publishers are now begging for views with clickbait headlines while Google's value has soared.
But there's a crucial difference between then and now.
Back then, news publishers faced a classic prisoner's dilemma: if one principled outlet blocked Google while the others folded and allowed access, it would simply lose traffic to its competitors.
With AI, something far more insidious happens:
Value extraction occurs invisibly in private conversations
No attribution trail exists
No compensation flows to creators
The extraction occurs simultaneously across all creative domains
This isn't just Google's news aggregation on steroids—it's intellectual property theft at an unprecedented scale.
The Great Creative Heist: How AI Systems Feed on Original Work
Modern AI systems function essentially as advanced autocomplete tools built on a foundation of others' intellectual property.
I've witnessed the significant impact from both sides:
At loveholidays, we've used AI to summarise research, apply frameworks from books we've never purchased, and deliver projects at 100x the pace. The productivity gains are undeniable and transformative.
On the flip side, my brother, who built an audience of over 100,000 Instagram followers based on his artistic journey, is pulling back from the platform. As he put it, "Meta is scraping all of IG to feed their plagiarism engine."
One more creative voice silenced.
One fewer person moving society forward, leaving social media increasingly to the mental-health-crisis-inducing parade of bikini-clad and TRT-augmented influencers.
The issue isn't AI assistance itself; these tools genuinely supercharge productivity. The problem is that the creators whose work is the foundation of these systems receive no recognition, compensation, or protection.
This makes little economic sense. I would willingly have paid for the Studio Ghibli-style version of a picture I took rather than effectively stealing it through ChatGPT.
The Warning Signs: Journalism's Collapse as a Harbinger
We've already seen what happens when content creation becomes economically unsustainable.
The collapse of quality journalism serves as a warning:
Once-respected news organisations now chase clicks
Sensationalist headlines replace thoughtful analysis
Listicles and AI-generated content barely pass for reporting
Quality investigative journalism becomes increasingly rare
This future awaits every creative domain if we continue down our current path. The same forces that gutted newsrooms will decimate art studios, writers' rooms, recording studios, and photography collectives.
Not through direct competition—but by siphoning away the economic value of original creation without compensation.
Ironically, AI has made the gap between idea and creation shorter than ever, yet we're undermining the very creators who drive innovation.
What will future AI models train on if creating original content becomes economically unsustainable?
Regurgitations of earlier AI outputs?
The logical endpoint is a wasteland of derivative works and diminishing originality.
This creative monoculture benefits no one—not even the AI companies themselves.
Beyond Attribution: The Intellectual Property Crisis
In a TED interview on April 11, 2025, Sam Altman was forced to address the elephant in the room: the massive intellectual property crisis brewing beneath AI's impressive capabilities.
His arsey acknowledgement of the issue ("you can clap for that all you want" 9_9) is a start, but it doesn't go nearly far enough. The solutions being proposed are insufficient for the scale of the problem.
It's not just visual artists suffering. Authors, researchers, and academics who spend years developing unique frameworks now watch as their life's work is casually summarised and repurposed without a single citation.
The business book I had Claude summarise represented thousands of hours of someone's intellectual labour. Yet within minutes, I had extracted the core value with:
Zero attribution to the author
No purchase
No acknowledgement of intellectual debt
This is intellectual strip-mining at an unprecedented scale. We're extracting valuable minerals while leaving behind an uninhabitable landscape for creators.
When authors can't make a living because AI systems give away their core insights for free, who will invest the time to develop the next generation of groundbreaking ideas?
From Attribution to Fair Compensation
The solution to this existential threat must go beyond Altman's modest proposals and simple attribution. We need comprehensive systems that ensure creators receive fair compensation when their work is accessed through AI systems.
Imagine a system where:
AI companies are transparent about which creative works are in their training data
Usage statistics track when AI draws from specific works to answer queries
A percentage of subscription revenue flows to creators based on how often their work informs AI responses (a rough calculation is sketched after this list)
Users see attribution and direct links to purchase the works they're accessing
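To make the revenue-share idea concrete, here is a minimal sketch of how a pro rata payout might be computed, assuming a hypothetical usage log that records which works informed an AI model's responses over a billing period. The function name, the log format, and the book identifiers are all illustrative assumptions, not any real AI company's API.

```python
from collections import Counter

def compute_payouts(usage_log, revenue_pool):
    """Split a share of subscription revenue across creators,
    pro rata by how often their works informed AI responses.

    usage_log: list of work IDs, one entry per time a work
               contributed to an answer (hypothetical data).
    revenue_pool: the slice of subscription revenue earmarked
                  for creators in this billing period.
    """
    counts = Counter(usage_log)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {work: revenue_pool * n / total for work, n in counts.items()}

# Illustrative only: three works "drawn on" by the model over a billing period.
log = ["good-strategy-bad-strategy", "the-four", "the-four", "thinking-fast-and-slow"]
print(compute_payouts(log, revenue_pool=1000.00))
# -> {'good-strategy-bad-strategy': 250.0, 'the-four': 500.0, 'thinking-fast-and-slow': 250.0}
```

Music streaming services already distribute royalties on a broadly similar pro rata basis, so the accounting is hard but not unprecedented; the missing piece is the usage tracking and the will to do it.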
Just as the music industry adapted after Napster decimated traditional revenue models, we need new systems that ensure value flows back to creators.
The critical difference? This time, we need to act before the ecosystem collapses, not after.
From Disruption to Sustainable Innovation
We in technology have a pattern of asking forgiveness rather than permission when disrupting established fields. AI is following the same playbook that search engines used—extracting value from content creators while offering little in return.
As a technologist and born contrarian, I see both sides:
AI tools offer genuinely transformative capabilities that shouldn't be stifled
But sustainability requires balance
Without it, the same decline that devastated journalism will spread across every artistic endeavour simultaneously, leaving all of us poorer, intellectually if not financially.
AI companies, content creators, and regulators must collaborate on a fair compensation framework that maintains innovation and ensures creators can make a living from their work.
Taking Personal Responsibility
While systemic change is necessary, individual actions matter.
I'm still going to use AI, but I am modifying my own behaviour by:
Buying every book I use or summarise, even if I disagree with it
Reading and subscribing to independent creators
Purchasing art directly from artists
Attending more live events and eating locally
These small acts of support won't fix the system alone but represent a personal commitment to valuing original creation.
I'm also continuing my resistance to derivative monoculture by refusing to acknowledge Taylor Swift as a thing, never drinking anything from Starbucks, and rejecting the Ikea nesting instinct. Cultural diversity depends on both systemic change and personal choices that push against homogenisation.
How Will You Support the Creators You're "Borrowing" From?
Are you comfortable with the invisible extraction happening through your AI tools?
Have you considered how your creative work might feed these systems without compensation?
The journalism industry's collapse should serve as a warning—once the economic foundation for creation crumbles, it's exceedingly challenging to rebuild.
We have a narrow window to get this right before we witness the same collapse across art, literature, music, and other creative domains.
What specific actions will you take this week to ensure creators are compensated for their work?
Will you buy a book you've had an AI summarise? Support an artist whose style you've admired through an AI image generator? Subscribe to a writer whose ideas you value?
The future of human creativity might depend on it.
And frankly, I'd rather not live in a world where the only new ideas come from algorithms feeding on the ghosts of creators who could no longer afford to create.
(Yes, AI assisted in the writing of this article)