When the Internet Stops Seeing Us
Scroll any major app today and you can feel it: the feed is less a conversation and more a conveyor belt. We still post, watch, and react—but it’s getting harder to shake the sense that we’re feeding a machine that no longer really sees us.
Over the last decade, our online lives have been quietly rearranged around feedback loops we don’t control. Algorithms decide what we see. Advertising funds the platforms that host our attention. Cloud giants rent out the computing power that runs it all. The result is a digital economy that increasingly funds itself—by extracting more and more time, data, and emotional energy from us.
We don’t just need new regulations for this. We need a new kind of ethical framework—one designed for people living inside opaque feedback loops.
How the Feed Took Over
In 2016, Instagram moved away from a chronological feed by arguing that people were “missing 70% of all posts and almost half of posts from close friends.”[1] The “solution” was a ranked feed, tuned by machine-learning models to show what we were most likely to engage with. It worked—almost too well.
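The mechanics are simple to sketch. Here is a minimal, hypothetical illustration (the real models, features, and weights are proprietary and vastly more complex): instead of sorting posts by recency, the feed sorts them by a model's predicted-engagement score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since some epoch (illustrative)
    predicted_engagement: float  # model's guess at like/comment/share probability

posts = [
    Post("close_friend", 1000, 0.10),
    Post("viral_page", 900, 0.85),
    Post("acquaintance", 1100, 0.05),
]

# Chronological feed: newest first, regardless of predicted engagement.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Ranked feed: whatever the model predicts you are most likely to engage with.
ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])  # ['acquaintance', 'close_friend', 'viral_page']
print([p.author for p in ranked])         # ['viral_page', 'close_friend', 'acquaintance']
```

The difference in those two orderings is the whole story: the close friend's post slips below whatever the model scores highest, and the viral page jumps to the top.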
A large Meta-backed experiment during the 2020 U.S. election forced some users to see posts in chronological order instead of the usual algorithmic mix. Over three months, time spent on Instagram dropped by almost eight hours per person, while time on TikTok jumped 36% and YouTube 20%.[2] In other words, when people are given a simple, time-based feed, they spend less time in the app and quickly drift to other platforms whose algorithms keep them hooked.
Meta has seen this from the inside. When Facebook changed its News Feed in 2018 to favor “meaningful social interactions,” Mark Zuckerberg admitted the tweak reduced time spent by about 50 million hours per day.[3] That’s one line of code and an entire country’s worth of waking hours, gone.
By 2022, Meta said that AI recommendations already accounted for around 15% of what people see on Facebook, and even more on Instagram—and planned to more than double that share.[4] We are steadily moving toward feeds where what we see is mostly what the system chooses, not what the people we follow post.
For creators, this is now existential. If the recommendation engine likes you, you eat. If it doesn’t, you disappear.
The Self-Funding Ad Loop
All of this is powered by advertising. Global ad spend hit roughly US$1.1 trillion in 2024, with digital channels now taking about 73% of that—around US$790 billion.[6] Digital ad spend has more than doubled since 2019 and continues to grow at double-digit rates.[7]
A huge chunk of that money goes to a small group of firms. In the U.S., Amazon, Google, and Meta alone are expected to capture about 59% of all ad dollars in 2025.[8] Across digital channels, they take roughly 72% of the total.[9] Add Microsoft and TikTok’s owner ByteDance, and five companies together control about 65% of the U.S. ad market.[10]
This is what a self-funding loop looks like:
• Ads bankroll the platforms.
• Platforms tune the algorithms to maximize engagement.
• Engagement produces more data.
• Data improves targeting, making ads more profitable.
• Profits pay for more infrastructure, more AI, more reach.
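The steps above compound on each other. A toy simulation makes the structure visible (every coefficient here is invented for illustration; only the shape of the loop matters, not the numbers):

```python
# Toy simulation of the self-funding ad loop. All coefficients are
# invented for illustration; the point is the compounding structure.

engagement = 100.0  # arbitrary units of user attention
data = 0.0          # accumulated behavioral data
revenue = 0.0

for year in range(5):
    data += engagement                                # engagement produces data
    targeting_quality = 1 + data / 1000               # data improves ad targeting
    revenue = engagement * 0.01 * targeting_quality   # better targeting, more revenue per impression
    engagement *= 1 + revenue * 0.05                  # profits fund more reach and tuning
    print(f"year {year}: engagement={engagement:.1f}, revenue={revenue:.2f}")
```

Each pass through the loop leaves both engagement and revenue higher than the last, with no outside input required—which is exactly what "self-funding" means here.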
The FAANG companies don’t just run websites. They own the backbone: ad exchanges, cloud servers, app stores, identity systems, payment rails. Their valuations rest less on current profits than on the promise of future user growth and attention capture, an endless stream of impressions still to be monetized.
Workers, Creators, and Symbolic Exhaustion
For ordinary users, the cost is mostly invisible: more time lost, more fractured attention, more targeted nudges. For workers and creators, it shows up as a different kind of burnout.
The “creator economy” is regularly celebrated as a new path to independence. But most creators earn very little. Surveys find that a majority make below a living wage once you factor in hours spent producing content, engaging with followers, and chasing the algorithm.[11] Income is wildly volatile and depends heavily on staying in the good graces of opaque recommendation systems.
At the same time, traditional institutions—schools, workplaces, governments—feel less like sources of meaning and more like interfaces. We fill out forms, click “I agree,” complete required trainings, post internally, and yet the feedback we get is thin or contradictory. Many people report feeling lonelier, less trusting, and more worn out by digital life, even as they use it more.[12]
This is what I mean by symbolic exhaustion: we’re still performing all the rituals of participation, but the loop gives little back. The system keeps running. The signal keeps fading.
Toward a New Kind of Ethics
The old moral stories we inherited assumed a different world—one where you could see who held power, where cause and effect were clearer, and where opting out was at least imaginable. That’s not our world anymore.
We now live inside tightly coupled feedback systems:
• Feeds that learn from everything we do.
• Ad markets that optimize around our attention.
• Cloud infrastructure that makes it cheap to spin up new products aimed directly at our nervous systems.
We don’t just need rules for privacy or antitrust. We need an ethic that starts from where we actually are: inside the loop.
That ethic would ask new kinds of questions:
• Not “Is this app good or bad?” but “What does this loop return to the people inside it?”
• Not “Is this business profitable?” but “Is this model parasitic on human attention and trust?”
• Not “How do I win the algorithm?” but “What would it mean to participate only where the system still responds in a human way?”
Call it feedback ethics, loop ethics, or something else. The name matters less than the shift: away from blind faith in growth and toward active stewardship of the systems we’re already entangled in.
Because if we don’t learn to tune these loops, they will keep tuning us.
Footnotes:
[1] “Your Instagram Feed is About to Change,” PetaPixel, March 16, 2016, https://petapixel.com/2016/03/16/instagram-feed-change/.
[2] A. M. Guess et al., “How do social media feed algorithms affect attitudes and behavior in political contexts?” Science, July 27, 2023, https://www.science.org/doi/10.1126/science.abp9364.
[3] “Facebook’s U.S. user count declines as it prioritizes well-being,” TechCrunch, January 31, 2018, https://techcrunch.com/2018/01/31/facebook-time-spent/.
[4] Paresh Dave, “Meta Just Proved People Hate Chronological Feeds,” WIRED, July 27, 2023, https://www.wired.com/story/meta-just-proved-people-hate-chronological-feeds/.
[6] “Digital 2025: global advertising trends,” DataReportal / Statista Market Insights, February 5, 2025, https://datareportal.com/reports/digital-2025-sub-section-global-advertising-trends.
[7] “The Ultimate Paid Advertising Statistics Report 2025-26,” RockingWeb, November 2, 2025, https://www.rockingweb.com.au/paid-advertising-statistics.
[8] “Triopoly – Reports, Statistics & Marketing Trends,” eMarketer / Insider Intelligence, November 3, 2024, https://www.emarketer.com/topics/category/triopoly.
[9] “Top 10 Companies in the Online Advertising Market in 2025,” Emergen Research, July 29, 2025, https://www.emergenresearch.com/blog/top-10-companies-in-the-online-advertising-market.
[10] Saleah Blancaflor, “5 companies have captured an insanely large share of the U.S. ad market over the last decade,” Fast Company, October 9, 2025, https://www.fastcompany.com/91419317/big-tech-companies-us-advertising-share-growth-report.
[11] “NeoReach’s Creator Earnings Report Finds Over 50% of Creators Make Under a Living Wage,” NeoReach Editorial Staff, July 15, 2025, https://neoreach.com/creator-earnings-report/.
[12] “Redefining health through vitality: New insight into five years of loneliness trends,” The Cigna Group Newsroom, November 20, 2023, https://newsroom.thecignagroup.com/vitality-research-new-insight-into-five-years-of-loneliness.