Sometimes a shift happens in the world that is so quiet, so gradual, and so deceptively convenient that we only recognise its weight once it begins shaping our thinking, our behaviour, and our children.
Artificial Intelligence is one of those shifts.
People frame it as progress.
As empowerment.
As a great equaliser.
As the moment “everyone” can create, think, build, and express without limits.
But behind the convenience is a deeper story.
A story we’re not telling loudly enough.
And if we don’t reflect honestly and deeply, we will inherit a future we didn’t choose.
This isn’t an anti-technology message.
Technology has always been part of human civilisation.
It’s about alignment between tools and values, between power and justice, between curiosity and responsibility.
Because right now, something unprecedented is happening beneath our feet.
And we need to pay attention.
The Internet Is Eating Itself
Recent research revealed something startling:
52% of the internet is now AI-generated content.
In 2022, that number was 10%.
And here’s the part that should genuinely worry anyone who cares about truth:
74% of ALL new pages published online today contain AI-generated content.
The internet is no longer growing through human thought.
It’s multiplying through synthetic regurgitation.
AI systems scrape the web, train on that content, produce more of it, and then train on their own outputs again.
A feedback loop of digital self-consumption.
Researchers call it model collapse, a documented phenomenon in which:
- systems drift away from reality
- knowledge narrows
- originality dies
- truth dissolves
- the model trains on itself until it becomes unstable
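The dynamic is easy to demonstrate with a toy statistical model. The sketch below is illustrative only (a Gaussian, not a language model, and all names are mine): each "generation" is fitted solely to samples drawn from the previous generation's fit. Over many generations the estimated spread drifts toward zero, a miniature version of the narrowing that Shumailov et al. document.

```python
import random
import statistics

def simulate_collapse(n_samples=100, n_generations=2000, seed=42):
    """Fit a Gaussian to data, sample from the fit, refit, repeat.

    Each generation trains only on the previous generation's output,
    mimicking models trained on model-generated text.
    """
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation zero: real, human-made data
    spread = [sigma]
    for _ in range(n_generations):
        data = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        mu = statistics.fmean(data)      # refit the mean to synthetic data
        sigma = statistics.pstdev(data)  # refit the spread (biased slightly low)
        spread.append(sigma)
    return spread

spread = simulate_collapse()
print(f"spread at generation 0:    {spread[0]:.4f}")
print(f"spread at generation 2000: {spread[-1]:.6f}")
```

The small downward bias in each refit compounds across generations, so diversity collapses even though no single step looks alarming. That is the point: the loop fails gradually, not loudly.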
The internet, once humanity’s collective memory, is slowly becoming humanity’s collective hallucination.
We may soon live in a world where it’s impossible to distinguish:
What was written by a human?
What was written by a machine?
And what was generated by a machine trained on the outputs of another machine?
This isn’t an apocalypse scenario.
It’s data.
It’s measurable.
It’s already happening.
And it has consequences far beyond “content.”
AI Feeds Us Convenience And Steals Something Quietly
The Qur’an asks:
“Do they not reflect within themselves?”
Surah Ar-Rum (30:8)
Reflection is part of our design.
But today, we outsource our reflection:
- We start a sentence, and a model finishes it.
- We draft an idea, and a model expands it.
- We look for direction, and a model suggests it.
- We ask a question, and a model interprets it.
We call it efficiency.
But it’s actually sovereignty handed away in tiny, invisible slices.
Reflection has a speed.
Learning has a rhythm.
Mastery has a process.
Identity has a slow unfolding.
AI removes the friction — and friction is where transformation happens.
Silence is where the soul breathes.
Struggle is where meaning forms.
Time is where depth grows.
When everything becomes instant, the heart loses its grip on patience.
The mind loses the muscle of contemplation.
The self loses the ability to stand alone.
The Illusion of Creative Empowerment
One of the biggest arguments for AI is:
“Now everyone can be creative.”
People say AI removes the barriers:
“You don’t need to be a developer to code.”
“You don’t need to be a designer to design.”
“You don’t need skill to create art.”
“You don’t need experience to write.”
They say this is empowerment.
But is it?
Or is it simply moving the goalposts?
Because when you look closer:
- the interface is democratized
- but the power is centralized
A handful of corporations now own:
- the infrastructure
- the data
- the training pipelines
- the models
- the creative outputs
- and your patterns of thinking
You’re “creating,” but inside someone else’s cage.
You’re producing, but the system owns the means of production.
You feel independent, but you rely fully on:
- their servers
- their compute
- their API keys
- their updates
- their licensing
- their terms
- their models
- their weaknesses
- their hallucinations
It’s not empowerment.
It’s dependency packaged as creativity.
The “freedom to create” is an illusion.
Real creative freedom means independence, not reliance on a machine that can be switched off, restricted, or monetized at any moment.
This is not liberation.
It’s digital feudalism.
The New 9-to-5: Cognitive Dependence
For years, people criticised the old corporate 9-to-5 system.
They said:
“It limits freedom.”
“It controls people.”
“It suppresses creativity.”
“It makes society dependent on corporations.”
The solution was always the same:
Education + entrepreneurship + ownership of your skills.
Knowledge, once learned, belongs to you.
No company can take it from your mind.
But now?
AI centralizes all knowledge into datasets controlled by a few companies.
Think about the irony:
The same people who wanted to escape the 9-to-5
are becoming cognitively dependent on tools
owned by companies far more powerful
than any employer in the past century.
Before, corporations controlled your time.
Now, tech companies control your thinking process.
We’ve traded one dependency for another, but this one is deeper.
It rewires the mind.
It replaces the learning journey.
It makes the muscle of intelligence weak.
It gives the illusion of capability without the substance behind it.
The old chains were external.
The new ones are internal.
We don’t even see them.
Are We Still Learning? Neuroscience Says No
Some argue:
“With the right frameworks, AI can enhance education.”
But neuroscience doesn’t agree.
The human brain requires:
- multi-sensory engagement
- repetition over time
- embodied learning
- trial and error
- feedback loops
- real-life interaction
Screens don’t provide this.
Instant answers bypass it.
Studies show:
- fast feedback does NOT equal learning
- comprehension without struggle = shallow memory
- replacing cognitive work with automation reduces long-term retention
- children using digital tutors often perform better in the short term but worse in independent reasoning
- overstimulation increases anxiety, decreases focus, and weakens executive function
Adults in their 30s, 40s, and 50s built their cognitive frameworks before AI existed.
But children?
They’ve never lived without a shortcut.
Humans naturally choose the easy path.
So imagine what happens when an entire generation grows up believing:
Why think?
Why struggle?
Why learn?
Why develop intuition?
Why build intellectual stamina?
Why form an inner moral compass?
AI is giving people answers
but robbing them of understanding.
And the long-term effect?
We have no idea.
No long-term studies.
No generational data.
No evidence of benefit.
Plenty of evidence of harm.
In every era of human history, true learning required time.
Mind.
Memory.
Effort.
Emotion.
Experience.
Silence.
AI replaces all of these with speed.
That is not evolution.
It is cognitive disintegration.
The Internet Is Becoming a Hall of Mirrors
When synthetic content becomes the baseline, we get:
- news generated by models
- academic papers generated by models
- medical advice generated by models
- history summaries generated by models
- religious explanations generated by models
- scientific literature generated by models
And then models scrape all of that again.
Original thought becomes a rare artifact.
Human nuance becomes noise.
Culture becomes compressed.
Language becomes homogenized.
The world becomes flat.
And the truth becomes optional.
When the Prophet ﷺ warned:
“A time will come when the liar is believed and the truthful rejected.”
He described an age of moral inversion.
We are now witnessing a digital inversion.
The truthful is buried under volume.
The synthetic is multiplied by scale.
The human is drowned out.
The machine becomes the default voice of the world.
And because everything looks polished, clean, and confident…
people stop questioning.
The Hidden Cost Behind the Magic
The outputs of AI feel effortless.
But the cost is not.
Behind every “intelligent” model:
$2/hour Kenyan workers
Forced to label violent, disturbing content so your app doesn’t tell someone how to self-harm.
Many develop PTSD.
A projected 1,200 billion litres of water per year by 2030
Enough to supply 18.5 million households.
Data centres quietly target water-stressed regions.
Because those communities can’t fight back.
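A quick back-of-envelope check connects the two water figures above. Both inputs are the article's own projected numbers (from the reports cited in the references), not measurements of mine; the arithmetic simply shows what they imply per household:

```python
# Back-of-envelope check of the water figures quoted above.
litres_per_year = 1_200e9   # "1,200 billion litres of water per year"
households = 18.5e6         # "enough to supply 18.5 million households"

per_household_year = litres_per_year / households
per_household_day = per_household_year / 365

print(f"{per_household_year:,.0f} litres per household per year")
print(f"{per_household_day:.0f} litres per household per day")
```

That works out to roughly 178 litres per household per day; whether that counts as a full household's supply varies by region, but it gives the headline number a human scale.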
Energy consumption equivalent to small nations
ChatGPT alone is estimated to consume as much electricity as some small countries.
Environmental destruction through rare-earth mining
Hidden from headlines.
Psychological manipulation through design
Optimised for dependence.
The cloud is not in the sky.
It is built on the backs, bodies, and environments of people who will never benefit from it.
Saying “No” Is Also a Path
Emily Bender & Alex Hanna wrote something powerful:
“Never underestimate the power of saying no…
Our tech futures are not given to us.
They are ours to shape.”
This echoes Qur’an 13:11:
“Allah will not change the condition of a people until they change what is in themselves.”
We are not powerless.
We are not spectators.
We don’t have to accept inevitability.
We can:
- reject uncritical adoption
- demand ethical design
- limit dependence
- preserve reflection
- protect our children
- insist on transparency
- challenge harmful incentives
Saying “no” is not backward.
It is leadership.
So Where Do We Go From Here?
Here is what I’ve started doing personally:
1. Audit my AI usage
If a tool sharpens reflection, I keep it.
If it replaces reflection, I remove it.
2. Honour friction
I intentionally keep one part of my life slow.
For me, it’s gardening.
It’s soil, water, growth, and patience: things that cannot be rushed.
3. Family technology boundaries
From Maghrib to Fajr:
Screens off.
Stories on.
Presence restored.
4. Teach my children the value of struggle
They need frustration.
Failure.
Trial and error.
Real world learning.
5. Protect my intention
Before generating anything, I ask:
Am I saving effort, or am I avoiding learning?
A Closing Reflection
We are living in a moment where humanity must choose:
- depth over convenience
- truth over volume
- learning over shortcuts
- presence over automation
- reflection over prediction
- intention over momentum
If AI becomes our teacher, our mirror, our entertainer, our scholar, our memory, our creator, our thinker, our spiritual guide…
we lose the very thing that makes us human.
In Islam, every meaningful change begins inside:
“Do they not reflect within themselves?”
This is the call of our time.
To reclaim reflection.
To reclaim learning.
To reclaim our minds.
To reclaim our hearts.
To reclaim our future.
Because if we don’t shape our future…
someone else’s model will.
If this reflection sparked something in you, you might enjoy the longer, raw version I recorded for the podcast. Sometimes hearing a voice makes the ideas settle differently. And if you’d like to stay connected with these conversations about purpose, parenting, technology, faith, and the world we’re stepping into, feel free to subscribe to the newsletter or follow the podcast. No spam, no noise, just honest thoughts as they come.
🎧 Listen to the Episode
Research & Scientific Studies
1. Graphite Research (2025) – Analysis of 65,000 URLs from Common Crawl showing AI-generated content now makes up 52% of the internet.
2. Ahrefs (2025) – Study of 900,000 newly-created web pages in April 2025 showing 74% contain AI-generated content.
3. Common Crawl Foundation (2024–2025) – Monthly indexing statistics indicating 3–5 billion new web pages added each month.
4. Shumailov, Ilia et al. (2024) – “The Curse of Recursion: Training on Generated Data Makes Models Forget.” Nature, 631, 755–759.
5. Studies on digital learning & brain plasticity – Research from the fields of cognitive neuroscience indicating that learning requires multi-sensory engagement, slow repetition, and embodied struggle (various peer-reviewed neurological literature).
6. AI impact on cognition – Early studies documenting reduced cognitive performance in users heavily relying on AI for daily tasks (referenced broadly; multiple 2024–2025 studies exist across academic institutions).
⸻
Books & Academic Works
1. Emily M. Bender & Alex Hanna (2025) – The AI Con: How to Fight Big Tech’s Hype and Create the Future We Want. HarperCollins.
2. Guest, O. et al. (2025) – Against the Uncritical Adoption of ‘AI’ Technologies in Academia. Zenodo. https://lnkd.in/d9mPkpaQ
3. Tim Ferriss (2007) – The 4-Hour Workweek. Crown Publishing.
4. David Graeber (2018) – Bullshit Jobs. Simon & Schuster.
5. Anders Ericsson (1993–2016) – Research on deliberate practice forming the basis of the “10,000-hour mastery” concept.
⸻
Articles, Posts & Commentary
1. Karl Mehta – AI Risk Thread (2025) – Multi-post commentary discussing AGI, safety concerns, alignment, and long-term survivability risks.
2. Stephen Klein (2025) – “The Internet Is Eating Itself” analysis on AI-generated content, recursive training collapse, and digital information decay.
3. Monett Diaz (2025) – Commentary summarizing The AI Con and offering critical reflections on AI hype, ethics, and societal impact.
4. Global reporting on Kenyan data labellers – Exposés documenting $2/hour content moderation labour, psychological trauma, and exploitation (multiple investigative pieces, 2020–2024).
5. Reports on AI water consumption – Data centre cooling impact reports predicting 1,200 billion litres of water consumption by 2030.
⸻
Islamic Sources
1. Qur’an 30:8 – “Do they not reflect within themselves?”
2. Qur’an 34:13 – “And few of My servants are truly grateful.”
3. Qur’an 12:53 – “Indeed, the soul inclines to evil except when my Lord has mercy.”
4. Qur’an 13:11 – “Allah will not change the condition of a people until they change what is in themselves.”
5. Surah Al-Asr (103) – The passage on mankind being in loss except those who believe, act with purpose, encourage truth, and encourage patience.
6. Prophetic narrations (Sahih)
• On truth vs falsehood in the end of times
• On taking advantage of five before five
• On intention (niyyah)
(Note: All ahadith referenced were used conceptually; no fabricated narrations included.)