# DIY Social Algorithms: The Geeks Will Fork the Feed

The social algorithm is becoming a personal choice. Bluesky's custom feeds, Threads' "Dear Algo," the RSS revival, and EU regulation are converging to make feed curation a user-level decision. But the geeks-MOPs-sociopaths lifecycle predicts that only the 1% who build feeds and the 9% who adopt them will ever exercise that choice -- and that small minority will reshape culture for everyone else.

- Canonical URL: https://buildooor.com/research/diy-social-algorithms
- Author: Rob Baratta
- Published: 2026-02-12
- Version: Working Paper v1.0
- Keywords: social algorithms, custom feeds, Bluesky, algorithmic choice, geeks MOPs sociopaths, feed curation, social media fragmentation, consolidation cycle, RSS revival, filter bubbles, Digital Services Act, recommendation systems, user agency, participation inequality, protocol not platform, decentralized social

---

<ResearchAbstract>
  The social media algorithm — the invisible hand that decides what 5 billion people see every day
  — is becoming a personal choice. Bluesky's open custom feed marketplace, Meta's "Dear
  Algo" feature on Threads, the 30% year-over-year surge in RSS app downloads, and EU regulatory
  pressure under the Digital Services Act are converging on a single trajectory: algorithmic sovereignty.
  Users — at least some of them — will build, choose, and control their own recommendation
  systems. But David Chapman's geeks-MOPs-sociopaths lifecycle framework predicts that this sovereignty
  will follow a familiar pattern. Geeks (the 1% who build feeds) will create the tools. Fanatics (the 9%
  who install and configure them) will form the early-adopter cohort. MOPs (the 85% who want a
  "reasonably pleasant time for minimal effort") will never touch their settings. And
  sociopaths will find ways to monetize the gap between builders and consumers. This paper argues that
  we are entering an era of *algorithmic fragmentation* — not platform fragmentation —
  where the real divide is not which app you use but which algorithm you run inside it. The
  consolidation-fragmentation cycle that has governed social media since 2004 is about to operate at
  the feed level, not the platform level. And the geeks will fork first.
</ResearchAbstract>

<ResearchSection number={1} title="The Algorithm as Territory">

Every social platform runs on the same basic contract: users provide attention, and the platform
decides what to fill it with. The mechanism of that decision — the recommendation algorithm
— is the single most powerful editorial force in human history. It determines which ideas
spread, which creators earn a living, which political narratives gain traction, and which products
get bought. Facebook's News Feed algorithm influences the information diet of nearly 3 billion
people. TikTok's For You Page has been described by internal documents as an "addiction
machine." YouTube's recommendation engine drives 70% of total watch time. None of these
systems were designed by or for the people they serve. They were designed to maximize engagement,
which is a proxy for advertising revenue, which is a proxy for shareholder value.

The algorithm is territory. Whoever controls the algorithm controls the culture that forms around it.
For two decades, that territory has been held by a handful of companies — Meta, Google, ByteDance,
and (until recently) Twitter. Users had no say in what they saw beyond crude binary signals: follow or
unfollow, like or scroll past. The algorithm observed these signals, inferred preferences, and then
optimized for the metric the company cared about — which was never "what the user actually
wants to see" and always "what keeps the user on the platform longest."

This is starting to change. Not because platforms have become altruistic — but because
competition, regulation, and a small cohort of technically opinionated users are forcing the question.
The question is simple: **who gets to decide what your feed looks like?** And for the
first time, the answer is starting to shift from "the platform" to "you — if
you want to."

<ResearchTable
  caption="Table 1. Platform Feed Control Mechanisms (2026)"
  columns={[
    { label: 'Platform' },
    { label: 'Feed Control Mechanism' },
    { label: 'User Reach', align: 'right', mono: true },
    { label: 'Control Depth', muted: true },
  ]}
  rows={[
    ['Bluesky', 'Open custom feed marketplace (AT Protocol)', '37M registered', 'Full -- users can build, share, and pin any algorithm'],
    ['Threads', '"Dear Algo" natural language tuning', '~300M MAU (est.)', 'Shallow -- 3-day adjustments via text posts'],
    ['X / Twitter', 'Open-sourced algorithm (2023), no user control', '~500M MAU', 'None -- code visible but not modifiable by users'],
    ['Mastodon / Fediverse', 'Chronological by default; server-level moderation', '~2.5M MAU', 'Structural -- choose your server, choose your norms'],
    ['Farcaster', 'Client-level algorithm choice (Warpcast, etc.)', '~546K registered', 'High -- different clients = different algorithms'],
    ['RSS (2026 revival)', 'Pure user subscription; no algorithm', '30%+ app download growth YoY', 'Total -- zero algorithmic intermediation'],
    ['Instagram', '"Your Algorithm" interest toggles', '~2B MAU', 'Shallow -- add/remove interests from preferences'],
    ['TikTok', 'No user control; opaque FYP ranking', '~1.5B MAU', 'None -- engagement-optimized black box'],
  ]}
  footnote="Sources: Platform documentation, TechCrunch, CNBC, WebProNews. MAU figures are estimates as of early 2026."
  compact
/>

The spectrum of control is enormous. At one end, TikTok offers zero user agency — an opaque
engagement-optimized black box. At the other end, Bluesky offers full algorithmic sovereignty —
an open marketplace where anyone can build, share, and pin a custom algorithm. Between those poles,
every other platform is making different bets. Meta's "Dear Algo" on Threads lets
users write a natural-language post starting with "Dear Algo," and the algorithm adjusts
for three days. Instagram added interest toggles. X open-sourced its algorithm code in 2023 but
never updated it — no public commits for over two years — and gave users no actual
control over their experience. The gesture was, as critics noted, "transparency theatre."

</ResearchSection>

<ResearchSection number={2} title="The Consolidation-Fragmentation Cycle">

Social media has always oscillated between consolidation and fragmentation. The pattern is structural:
new platforms emerge to serve unmet needs, network effects consolidate users into a few dominant
players, those players extract increasing value from users (enshittification), and fragmentation
begins again as users seek alternatives. The cycle has run roughly twice in the medium's
two-decade history, and we are now entering the third turn — but this time, the fragmentation
is happening at a different layer.

<ResearchTable
  caption="Table 2. The Social Media Consolidation-Fragmentation Cycle"
  columns={[
    { label: 'Era' },
    { label: 'Dominant Pattern' },
    { label: 'Key Platforms' },
    { label: 'User Behavior', muted: true },
  ]}
  rows={[
    ['2004--2010', 'Fragmented emergence', 'MySpace, Friendster, LiveJournal, early Facebook', 'Users pick niche; no expectation of universality'],
    ['2010--2016', 'Consolidation (Facebook era)', 'Facebook, Instagram (acquired), WhatsApp (acquired)', 'One platform per function; network effects dominate'],
    ['2016--2020', 'Peak consolidation', 'Facebook/Instagram/WhatsApp + YouTube + Twitter', 'Average user on 4 platforms; most time on 1--2'],
    ['2020--2023', 'Fragmentation trigger', 'TikTok disrupts; Twitter/X acquisition chaos', 'Average user on 6.7 platforms; attention splits'],
    ['2023--2026', 'Active fragmentation', 'Bluesky, Threads, Farcaster, Mastodon, Nostr, BeReal', 'Protocol migration; feed-level choice emerging'],
    ['2026+', 'Algorithmic fragmentation (predicted)', 'Same platforms, different algorithms per user', 'Feed becomes a personal construction, not a platform default'],
  ]}
  footnote="Sources: SiteLogic Marketing, FreedomLab, Sprout Social, author analysis."
  compact
/>

The first two cycles operated at the *platform level*. Users migrated between apps: from
MySpace to Facebook, from Facebook to Instagram, from Twitter to ... everywhere. The metric
that mattered was which app you opened. But the emerging cycle is different. Users are not just
switching platforms — they are switching algorithms *within* platforms. A Bluesky user
who pins a custom "Booksky" feed and a Bluesky user running the default Discover feed
are on the same platform but experiencing entirely different realities. They share an address but
not a neighborhood.

This is algorithmic fragmentation. The platform is the operating system; the algorithm is the
application. And just as desktop computing went from "everyone runs the same software"
to "everyone customizes their stack," social media is heading toward a world where the
feed is a personal construction, not a platform default. The shift from 4 platforms per user (2016)
to 6.7 platforms per user (2024) was the warm-up. The shift from one algorithm per platform to
multiple algorithms per platform is the main event.

<ResearchCallout>
  The next fragmentation cycle will not be about which app you open. It will be about which algorithm
  you run inside it. Platform choice becomes secondary to feed choice. The real lock-in shifts from
  the social graph to the recommendation engine — and for the first time, users can bring
  their own.
</ResearchCallout>

</ResearchSection>

<ResearchSection number={3} title="What's Actually Shipping: The Programmable Feed">

This is not speculative. Programmable feeds are shipping now, at meaningful scale, with real
monetization experiments already underway. The most advanced implementation is Bluesky's
AT Protocol, which treats custom feeds as first-class citizens of the social network. Bluesky's
architecture replaces the opaque algorithmic black box with an open marketplace. The company's
stated philosophy: "Give users sensible defaults but leave them the option to fully customize
their experience if they don't like our choices." Third-party developers build feeds
using published APIs and tools like SkyFeed. Users pin, sort, and swap feeds within the app. The
vast majority of feeds on the network are now built by independent developers, not Bluesky itself.
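
The feed generator model is concrete enough to sketch. In the AT Protocol, a feed generator is a service that answers the `app.bsky.feed.getFeedSkeleton` call with an ordered list of post URIs plus a pagination cursor; the client then hydrates those URIs into full posts. The sketch below (with made-up post records and a naive keyword filter standing in for whatever ranking logic a builder chooses) models only that skeleton-building step, not the HTTP service around it.

```python
# Sketch of a feed generator's core: given candidate posts, return the
# skeleton (post URIs + cursor) that the client hydrates into a feed.
# The post records and keyword filter are illustrative assumptions.

def get_feed_skeleton(posts, keyword, limit=50, cursor=None):
    """Return a feed skeleton: ordered post URIs plus a pagination cursor."""
    # The feed builder, not the platform, decides this filter and ordering:
    # here, posts matching the feed's topic, newest first.
    matches = sorted(
        (p for p in posts if keyword.lower() in p["text"].lower()),
        key=lambda p: p["indexed_at"],
        reverse=True,
    )
    start = int(cursor) if cursor else 0
    page = matches[start : start + limit]
    next_cursor = str(start + limit) if start + limit < len(matches) else None
    return {
        "cursor": next_cursor,
        "feed": [{"post": p["uri"]} for p in page],
    }

posts = [
    {"uri": "at://did:ex/app.bsky.feed.post/1", "text": "New sci-fi book review", "indexed_at": 100},
    {"uri": "at://did:ex/app.bsky.feed.post/2", "text": "Gaming news roundup", "indexed_at": 200},
    {"uri": "at://did:ex/app.bsky.feed.post/3", "text": "Book club picks", "indexed_at": 300},
]
skeleton = get_feed_skeleton(posts, "book")
```

The point of the design is that this function is the *entire* algorithmic surface a builder must supply; everything else (hydration, display, moderation labels) stays with the protocol and client.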

The commercial proof point is Graze, a startup that went from zero to serving hundreds of thousands
of unique daily users and tens of millions of content impressions within months of Bluesky's
growth spike. Graze powers 4,500 feeds created by roughly 3,000 users — including several of
Bluesky's top feeds across news, gaming, art, politics, sports, and fitness. In April 2025,
Graze raised a $1M pre-seed round led by Betaworks and Salesforce Ventures. It is already running
ads in 200 custom feeds at approximately $1 per 1,000 impressions — with Bluesky's
explicit blessing. This is not a side project. It is an emerging business model: the algorithm as
a service, built by users, for users, with the platform serving as infrastructure.

<ResearchTable
  caption="Table 3. Custom Feed Adoption Data Points (2025--2026)"
  columns={[
    { label: 'Platform / Tool' },
    { label: 'Metric', align: 'right', mono: true },
    { label: 'Source' },
    { label: 'Significance', muted: true },
  ]}
  rows={[
    ['Graze (Bluesky feed builder)', '4,500 feeds by ~3,000 users', 'TechCrunch, Apr 2025', 'Raised $1M pre-seed; $1 CPM ads in custom feeds'],
    ['Graze daily impressions', 'Tens of millions', 'TechCrunch, Jan 2025', '3,000 builders serve hundreds of thousands of daily users'],
    ['Bluesky registered users', '37M+', 'Bluesky, Jul 2025', '23M added in one year; 40% DAU drop by Oct 2025'],
    ['Bluesky custom feeds (total)', 'Thousands (open marketplace)', 'Bluesky Blog, 2023', 'Vast majority built by independent third-party developers'],
    ['RSS app downloads (2026 vs 2025)', '+30% YoY', 'WebProNews, 2026', 'Driven by privacy concerns and algorithm fatigue'],
    ['Threads "Dear Algo" launch', 'Feb 2026 (US, UK, AU, NZ)', 'CNBC, Feb 2026', 'Natural language feed tuning; 3-day adjustment window'],
    ['Farcaster registered users', '~546K', 'BlockEden, Oct 2025', '40--60K DAU; client-level algorithm choice'],
    ['Mastodon monthly active users', '~2.5M', 'MarketingScoop, 2025', '600%+ YoY growth post-Twitter acquisition'],
    ['X algorithm open-source (GitHub)', 'No public commits for 2+ years', 'Social Media Today, 2026', 'Transparency theatre -- code visible but not updated or user-modifiable'],
    ['EU DSA fine against X', '\u20AC120M (Dec 2025)', 'European Commission, 2025', 'First DSA non-compliance fine; deceptive design + ad transparency violations'],
  ]}
  footnote="Sources: TechCrunch, Bluesky Blog, WebProNews, CNBC, European Commission, BlockEden, MarketingScoop, Social Media Today."
  compact
/>

Meanwhile, Meta is approaching the same destination from the opposite direction. Rather than
opening the algorithm to developers, Threads launched "Dear Algo" in February 2026
— a feature that lets users write a natural-language post requesting algorithm adjustments.
The algorithm then adjusts the user's feed for three days. It is a strikingly different design
philosophy: Bluesky says "build your own algorithm," while Threads says "tell
ours what you want in plain English." One gives you the source code; the other gives you a
suggestion box. Both acknowledge the same underlying demand.
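
As a toy illustration of the suggestion-box contract (this is the author's sketch of the described behavior, not Meta's implementation), the whole mechanism reduces to a prefix check and a three-day expiry:

```python
# Toy model of "Dear Algo" as described above: a post prefixed
# "Dear Algo" becomes a temporary preference that expires after
# three days. All function names here are illustrative.
from datetime import datetime, timedelta

ADJUSTMENT_WINDOW = timedelta(days=3)

def parse_dear_algo(post_text, posted_at):
    """Turn a 'Dear Algo' post into a time-boxed feed adjustment."""
    prefix = "dear algo"
    if not post_text.lower().startswith(prefix):
        return None  # ordinary post; no feed adjustment
    request = post_text[len(prefix):].lstrip(",. ").strip()
    return {"request": request, "expires_at": posted_at + ADJUSTMENT_WINDOW}

def active_adjustments(adjustments, now):
    """Only adjustments inside the 3-day window still steer the feed."""
    return [a for a in adjustments if a["expires_at"] > now]
```

Seen this way, the shallowness is structural: the user's input is a transient hint layered on top of the engagement-optimized default, not a change to the ranking function itself.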

And then there is RSS — the 25-year-old protocol that Google tried to kill in 2013 by
shuttering Google Reader. RSS app downloads surged 30% year-over-year in 2026, driven by what
PC Gamer called the need to "kill the algorithm in your head." RSS is the most radical
version of algorithmic sovereignty: there is no algorithm at all. You subscribe to sources. They
appear chronologically. No ranking, no optimization, no engagement signals. It is the equivalent
of reading the newspaper you chose to subscribe to, and nothing else. The revival is small in
absolute terms but culturally significant — it represents the purist position in the
algorithmic choice spectrum.
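
The "no algorithm" claim is literal: a minimal RSS reader's entire ranking logic is a single chronological sort. A sketch, using two inline example feeds in place of the HTTP fetches a real reader would perform:

```python
# The RSS model: no ranking, no engagement signals. Parse each
# subscribed feed and merge items newest-first. The inline feeds
# are made-up examples; a real reader fetches user-chosen URLs.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

FEED_A = """<rss><channel><title>Blog A</title>
<item><title>Post A1</title><pubDate>Mon, 02 Feb 2026 09:00:00 GMT</pubDate></item>
</channel></rss>"""

FEED_B = """<rss><channel><title>Blog B</title>
<item><title>Post B1</title><pubDate>Tue, 03 Feb 2026 09:00:00 GMT</pubDate></item>
</channel></rss>"""

def merge_feeds(feeds):
    """Flatten items from all subscribed feeds into one timeline."""
    items = []
    for xml in feeds:
        channel = ET.fromstring(xml).find("channel")
        source = channel.findtext("title")
        for item in channel.iter("item"):
            items.append({
                "source": source,
                "title": item.findtext("title"),
                "published": parsedate_to_datetime(item.findtext("pubDate")),
            })
    # Reverse-chronological order is the only "algorithm" here.
    return sorted(items, key=lambda i: i["published"], reverse=True)

timeline = merge_feeds([FEED_A, FEED_B])
```

All curation effort moves to the subscription step, which is exactly the trade the purist position accepts: total control in exchange for doing the editor's job yourself.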

</ResearchSection>

<ResearchSection number={4} title="Geeks, MOPs, and Sociopaths: The Lifecycle of Algorithmic Choice">

David Chapman's 2015 essay "Geeks, MOPs, and Sociopaths" describes a lifecycle
that governs how subcultures form, grow, and die. The framework identifies three groups: geeks
(creators and fanatics who invent and sustain a scene through obsessive engagement), MOPs (members
of the public who show up for a "reasonably pleasant time in exchange for minimal effort"),
and sociopaths (exploiters who recognize the subculture as a power game and extract cultural, social,
and liquid capital from it). The lifecycle runs: geeks create something exciting, MOPs arrive and
dilute it, sociopaths monetize and destroy it, geeks abandon the wreckage and start over.

This framework maps with uncomfortable precision onto the emerging landscape of user-controlled
social algorithms.

<ResearchTable
  caption="Table 4. Chapman's Framework Applied to Algorithm Choice"
  columns={[
    { label: 'Actor' },
    { label: "In Chapman's Framework" },
    { label: 'In the Algorithm Economy' },
    { label: 'Approximate %', align: 'right', mono: true },
  ]}
  rows={[
    ['Creators (Geeks)', 'Invent the scene; obsess over esoteric details', 'Build custom feeds, write feed generators, fork protocols', '~1%'],
    ['Fanatics (Geeks)', 'Organize, fund, analyze; deeply committed', 'Install custom feeds, curate RSS, run Mastodon servers', '~9%'],
    ['MOPs', 'Casual fans seeking pleasant experience for minimal effort', 'Use default algorithm; never change settings; follow what surfaces', '~85%'],
    ['Sociopaths', 'Extract cultural, social, and liquid capital', 'Game algorithms, sell "growth hacking," monetize attention arbitrage', '~5%'],
  ]}
  footnote="Source: Chapman (2015), 'Geeks, MOPs, and Sociopaths,' meaningness.com. Percentages adapted from Nielsen's 90-9-1 rule."
/>

The geeks are already visible. They are the 3,000 Graze feed builders serving millions of
impressions. They are the Mastodon server operators running their own instances with custom
moderation policies. They are the developers writing custom Bluesky feed generators using the
AT Protocol API. They are the people who set up Miniflux or NetNewsWire and curate 200 RSS
subscriptions instead of scrolling a default feed. They do this because they care about what
they see. The feed is not a passive experience for them — it is a craft.

The MOPs are the other 85%. They will never build a custom feed. They will never install one.
They will never change their feed settings. They will open the app and consume whatever the
platform decides to show them. This is not laziness — it is rational behavior. Most
people do not care about algorithmic curation the way geeks do. They want "a reasonably
pleasant time in exchange for minimal effort," and the default algorithm delivers that
well enough. The MOPs are the reason platforms can afford to offer custom feeds to geeks: the
default algorithm, running on 85% of users, generates the ad revenue that subsidizes the
geek-facing features.

The sociopaths are the growth hackers, the engagement farmers, the "algorithmic
arbitrageurs" who will find ways to exploit the gap between custom-feed geeks and
default-feed MOPs. They are already present in Graze's ad-supported feeds. They will
appear wherever there is an audience that someone else curated and a mechanism to monetize it.
Chapman notes that the optimal ratio of MOPs to geeks is "maybe 6:1" and that
beyond 10:1, the scene becomes unsustainable. In algorithm land, the split is closer to 85:10, or
8.5:1 (already past Chapman's optimum and brushing the 10:1 ceiling), which suggests the
sociopath invasion will arrive quickly.

<ResearchTable
  caption="Table 5. The Algorithm Lifecycle: From Scene to Collapse"
  columns={[
    { label: 'Stage' },
    { label: 'Subculture Lifecycle (Chapman)' },
    { label: 'Algorithm Lifecycle (This Paper)' },
  ]}
  rows={[
    ['1. Scene formation', 'Small group invents exciting innovation', 'Power users discover custom feed tools; build for themselves'],
    ['2. Growth', 'Fanatics join; MOPs arrive for the vibe', 'Feed builders gain users; Graze hits millions of impressions'],
    ['3. MOP dilution', 'MOPs demand convenience; cultural intensity drops', 'Platforms simplify feed controls into "Dear Algo" one-liners'],
    ['4. Sociopath capture', 'Exploiters monetize the scene; push out geeks', 'Growth hackers, SEO grifters, and ad networks colonize custom feeds'],
    ['5. Collapse / fork', 'Geeks abandon the scene; subculture dies or mutates', 'Geeks fork to new protocols; cycle restarts at smaller scale'],
  ]}
  footnote="Source: Author synthesis of Chapman (2015) and observed market behavior."
/>

<ResearchCallout>
  The geeks-MOPs-sociopaths lifecycle predicts that algorithmic sovereignty will follow the same
  arc as every subculture before it. The geeks will build something beautiful. The MOPs will
  show up and demand it be easier. The sociopaths will monetize it into oblivion. And then the
  geeks will fork — to a new protocol, a new tool, a new layer of abstraction. The
  question is not whether this cycle will run. It is how many times it will run before the
  infrastructure becomes mature enough to resist capture.
</ResearchCallout>

</ResearchSection>

<ResearchSection number={5} title="The 90-9-1 Problem: Who Actually Builds Their Own Feed?">

The internet's participation inequality is one of the most robust findings in digital sociology.
Jakob Nielsen's 90-9-1 rule — 90% lurk, 9% contribute a little, 1% create most
content — has been validated across platforms from Wikipedia (where 0.2% of visitors are
active editors) to health forums (where "Superusers" generate the vast majority of
posts) to blogs (where the ratio skews even further, to roughly 95-5-0.1). More recent data from
large online communities shows the split may be narrowing slightly — 5% creating, 5%
responding — but the fundamental asymmetry holds: a tiny minority produces, the vast majority
consumes.
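
The projected split is easy to put in absolute terms. A back-of-envelope sketch, using Bluesky's 37 million registered users (Table 3) as an illustrative population:

```python
# Back-of-envelope arithmetic for the 90-9-1 split applied to
# algorithmic choice. The population figure in the usage below is
# Bluesky's registered-user count from Table 3, used as an example.

def participation_split(total_users, creators=0.01, contributors=0.09):
    """Project builder/chooser/default counts from participation ratios."""
    builders = round(total_users * creators)    # build custom feeds
    choosers = round(total_users * contributors)  # install/configure feeds
    return {
        "builders": builders,
        "choosers": choosers,
        "defaults": total_users - builders - choosers,  # never touch settings
    }

split = participation_split(37_000_000)
```

Even under these optimistic ratios, a 37-million-user network yields only a few hundred thousand builders, and tens of millions of users whose feeds are set entirely by someone else's choices.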

<ResearchTable
  caption="Table 6. Participation Inequality in Content vs. Algorithm Curation"
  columns={[
    { label: 'Activity Level' },
    { label: 'Traditional Social (90-9-1)' },
    { label: 'Algorithm Curation (Predicted)' },
    { label: 'Cultural Influence', muted: true },
  ]}
  rows={[
    ['Creators (1%)', 'Generate majority of content', 'Build custom feeds, write algorithms, run feed services', 'Set the taste; define what quality looks like'],
    ['Contributors (9%)', 'Edit, share, comment', 'Install custom feeds, configure RSS, choose alt-clients', 'Amplify creator taste; form the early-adopter cohort'],
    ['Lurkers (90%)', 'Consume without contributing', 'Use platform defaults; never touch feed settings', 'Absorb whatever surfaces; shaped by others\' choices'],
  ]}
  footnote="Source: Nielsen (2006), Higher Logic (2024), author projection."
  compact
/>

Apply this to algorithmic choice and the picture is stark. The 1% who build custom feeds are not
just creators — they are *taste-makers*. They decide what the 9% early adopters
experience, which in turn shapes the culture that the 90% eventually absorb through the default
algorithm. This is how it has always worked in media. A tiny number of editors, curators, and
programmers have always determined what the masses see. The difference now is that the curators
are users, not employees. The editorial function has been democratized — but democracy
does not mean equal participation. It means the motivated few shape the experience of the
passive many.

Graze's numbers illustrate this perfectly. Three thousand builders serve hundreds of thousands
of daily users. That is a ratio of roughly 1:100 — almost exactly the creator concentration
the 90-9-1 rule predicts, now reproduced at the curation layer. Each feed builder is a
micro-editor, curating reality for an audience
that chose their algorithm but did not build it. The feed builder's biases, interests, and
blind spots become the audience's information diet. This is not necessarily worse than a
corporate algorithm — but it is not automatically better, either. It is a different kind
of gatekeeping: distributed, transparent, and voluntarily chosen, but gatekeeping nonetheless.

The deeper implication is that "everyone will build their own feed" is wrong. It has
always been wrong. The correct prediction is: **the 1% will build feeds, the 9% will choose
from what the 1% built, and the 90% will use whatever default the platform sets.** The
revolution is real, but it is a revolution of the minority. And the minority's taste will
propagate to the majority through the same cultural diffusion mechanisms that have always operated
— just faster, and with more explicit infrastructure.

</ResearchSection>

<ResearchSection number={6} title="The Regulatory Accelerant: DSA and Forced Algorithmic Choice">

The European Union's Digital Services Act, fully applicable since February 2024, is the most
significant regulatory force pushing platforms toward algorithmic transparency and user choice. The
DSA requires Very Large Online Platforms (VLOPs) to disclose how their recommendation algorithms
work, stop serving profiling-based targeted ads to minors, and offer users at least one
recommendation option that is not based on profiling. The European Commission established the European Centre for
Algorithmic Transparency specifically to audit platform compliance.

Enforcement has teeth. In December 2025, the Commission issued its first non-compliance fine under
the DSA: €120 million against X for violations of deceptive design rules, ad transparency
requirements, and researcher data access provisions. A new investigation against X launched in
January 2026. The message is clear: opaque algorithms that optimize purely for engagement, without
user agency or transparency, are now a regulatory liability. Facebook has already expanded
chronological feed options in select EU regions as a compliance measure. Instagram's
"Your Algorithm" interest toggles are partially a DSA response.

The regulatory dynamic creates an asymmetric incentive. Platforms that already offer user-controlled
feeds — Bluesky, Mastodon, Farcaster — are effectively pre-compliant. They built
algorithmic choice as a feature, not a regulatory burden. Platforms that relied on opaque
engagement maximization — TikTok, Instagram, X — face a choice: genuinely empower
users to control their feeds, or build the minimum viable compliance features while preserving
the engagement-optimized default. So far, most are choosing the latter. "Dear Algo"
is elegant, but a 3-day adjustment window based on a natural language post is not the same as
handing users the source code to their recommendation engine.

Independent research points in the same direction. A Knight-Georgetown Institute report from
March 2025, "Better Feeds: Algorithms That Put People First," argued for structural reforms to
recommendation systems — not just transparency requirements but genuine user control
mechanisms. A peer-reviewed study published via *ScienceDirect* found that increasing user autonomy increases
recommendation acceptance, drawing on self-determination theory: people who feel in control of
their information environment are more satisfied with it, even if the content is the same. The
DSA is pushing in the right direction, but the gap between regulatory intent (genuine algorithmic
choice) and platform compliance (minimal settings menus) remains vast.

</ResearchSection>

<ResearchSection number={7} title="The Filter Bubble Paradox: Is Self-Curation Better or Worse?">

The most common objection to user-controlled algorithms is the filter bubble argument: if people
choose their own feeds, won't they just surround themselves with confirming voices? A
systematic review synthesizing a decade of peer-reviewed research (2015–2025) on filter
bubbles, echo chambers, and algorithmic bias found that "algorithmic systems structurally
amplify ideological homogeneity, reinforcing selective exposure and limiting viewpoint diversity."
Small initial biases are magnified by recommender systems, producing "polarization cascades
at the network level." This is the standard critique, and the evidence for it is strong
— *for platform-controlled algorithms*.

But the filter bubble research has a blind spot: it almost exclusively studies engagement-optimized
algorithms, not user-constructed feeds. The entire literature assumes that the algorithm is a
black box controlled by the platform. When you shift to user-controlled algorithms, the dynamics
change in ways that the existing research does not capture. A user who *consciously builds*
their feed is engaging in a fundamentally different cognitive act than a user who passively
consumes an engagement-optimized one. The former involves metacognition — thinking about
what you want to think about. The latter involves no metacognition at all.

<ResearchTable
  caption="Table 7. Filter Bubble Risk by Curation Method"
  columns={[
    { label: 'Curation Method' },
    { label: 'Diversity Risk' },
    { label: 'Echo Chamber Risk' },
    { label: 'User Agency', muted: true },
  ]}
  rows={[
    ['Platform default algorithm', 'High -- optimizes for engagement, not breadth', 'High -- amplifies confirming content', 'None -- user is the product'],
    ['User-built custom feed', 'Medium -- depends on builder sophistication', 'Medium -- self-aware curation possible', 'High -- user chooses the filter'],
    ['RSS / chronological', 'Low -- user explicitly subscribes to sources', 'Low -- no amplification loop', 'Total -- but requires effort'],
    ['Community-curated feed', 'Low-Medium -- curator taste diversifies', 'Medium -- curator bias replaces algo bias', 'Delegated -- trust the curator'],
    ['AI-tuned personal feed ("Dear Algo")', 'Unknown -- depends on implementation', 'Potentially high -- natural language = vague steering', 'Shallow -- illusion of control'],
  ]}
  footnote="Source: Author analysis synthesizing MDPI (2025), Springer Nature (2024), Frontiers in Psychology (2025)."
  compact
/>

The paradox is this: platform-controlled algorithms *definitely* create filter bubbles,
because they optimize for engagement and engagement correlates with emotional arousal and
confirmation bias. User-controlled algorithms *might* create filter bubbles, but only
if the user is not self-aware — and the act of building your own feed is itself an act
of self-awareness. The geeks who build custom feeds are, almost by definition, the people most
likely to intentionally include dissenting voices, serendipity, and breadth. They are not
building feeds to confirm their biases. They are building feeds to escape the platform's
bias toward engagement optimization.
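
One hypothetical way to make Table 7's "diversity risk" column measurable is Shannon entropy over the sources appearing in a feed: a feed drawn from a single source scores 0 bits, while attention spread evenly across many sources scores higher. This metric is the author's illustration here, not one used by the cited studies.

```python
# Illustrative diversity metric: Shannon entropy over feed sources.
# Low entropy = monoculture (filter-bubble risk); high entropy =
# attention spread across many sources.
from collections import Counter
from math import log2

def source_entropy(feed_items):
    """Entropy (in bits) of the source distribution of a feed."""
    counts = Counter(item["source"] for item in feed_items)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

A self-aware curator could watch this number over time and notice when a custom feed is quietly collapsing into a monoculture, which is precisely the kind of metacognitive check that engagement-optimized defaults never surface.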

Research on algorithmic awareness supports this. A study in *Frontiers in Psychology*
(2025) found that algorithm-aware users experience less sense of manipulation and more perceived
control — but also noted that awareness alone does not change behavior. The users who
actually change their behavior are the ones with both awareness *and* agency. This is
exactly the cohort that custom feed tools serve: users who know what algorithms do and have
the tools to do something about it. The filter bubble risk does not disappear, but it shifts
from an externally imposed problem to an internally managed one. Whether that is an improvement
depends entirely on whether you trust individuals more than you trust engagement-optimized
corporate algorithms. The evidence suggests you should.

</ResearchSection>

<ResearchSection number={8} title="Framework: The Algorithmic Sovereignty Lifecycle">

Synthesizing the preceding analysis, we propose a lifecycle model for how algorithmic sovereignty
emerges, diffuses, and evolves. The model combines the consolidation-fragmentation cycle (which
describes *when* users seek alternatives), the geeks-MOPs-sociopaths lifecycle (which
describes *who* acts at each stage), and participation inequality (which describes
*how many* people exercise algorithmic choice at each level).

**Phase 1: Platform Enshittification (the trigger).** A dominant platform begins
extracting more value from users than it provides. The algorithm shifts from "show people
what they want" to "show people what generates revenue." Ad load increases.
Organic reach declines. The default experience degrades. This is where Twitter/X was in 2022,
where Facebook has been since roughly 2018, and where TikTok is heading as regulatory and
competitive pressures mount.

**Phase 2: Geek Exodus (the 1%).** The most technically opinionated users leave
first. They do not just leave the platform — they leave the *algorithm*. They
set up RSS readers, build Bluesky custom feeds, run Mastodon instances, or write their own
recommendation systems. They are not seeking a better platform. They are seeking sovereignty
over their information diet. This phase is small in numbers but culturally significant: geeks
are the taste-makers, and their departure signals that the platform has lost the quality cohort.

**Phase 3: Fanatic Adoption (the 9%).** Early adopters who are not technically
inclined but are culturally aligned with the geeks begin using the tools geeks built. They
install Graze feeds. They subscribe to curated RSS bundles. They switch Farcaster clients
for a different algorithmic experience. They do not build — they choose. This is the
phase where the market for custom algorithms becomes commercially viable. Graze's $1M
raise is a Phase 3 signal.

**Phase 4: Platform Response (the co-optation).** The dominant platforms see the
geek exodus as a competitive threat and respond with simplified feed control features. This
is Threads' "Dear Algo" and Instagram's interest toggles. These features
give the illusion of algorithmic choice without actually surrendering control. They are designed
to retain MOPs by making them feel empowered while keeping the engagement-optimized default in
place. Chapman would recognize this as the sociopath response: mimicking the geek innovation
to prevent MOP migration.

**Phase 5: Sociopath Capture (the monetization).** Growth hackers, SEO operators,
and ad networks figure out how to game the new algorithmic layer. Custom feeds become ad
channels. Feed builders sell placement. The curation layer acquires the same incentive misalignment
that plagued the platform layer. The 6:1 MOP-to-geek ratio that Chapman identifies as the maximum
sustainable level is quickly exceeded. Quality drops.

**Phase 6: The Fork (geeks leave again).** Geeks abandon the captured
algorithm layer and move to a new one. Maybe they build at the protocol level (AT Protocol,
ActivityPub, Nostr). Maybe they build AI-powered personal recommendation systems that run
locally. Maybe they build something we have not imagined yet. The cycle restarts. Each
iteration leaves behind infrastructure that is slightly more open, slightly more user-controlled,
and slightly harder for any single entity to capture. This is the ratchet: the geeks
never win permanently, but each fork raises the floor.
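The dynamics of Phases 2 through 6 can be caricatured in a few lines: MOPs arrive faster than geeks, and once the ratio passes Chapman's roughly 6:1 ceiling, the geeks fork and the openness floor ratchets up. The growth rates, starting populations, and fork trigger below are illustrative assumptions, not measurements.

```python
# Toy model of the lifecycle above: MOPs grow faster than geeks, and when
# the MOP-to-geek ratio exceeds Chapman's ~6:1 ceiling, a fork occurs and
# the openness floor rises. All numbers are illustrative assumptions.

def simulate_lifecycle(steps, geek_rate=1, mop_rate=10, max_ratio=6):
    """Return (forks, openness_floor) after `steps` rounds of growth."""
    geeks, mops = 10, 0        # a small founding geek cohort
    forks = openness = 0
    for _ in range(steps):
        geeks += geek_rate     # Phases 2-3: slow, high-quality growth
        mops += mop_rate       # Phases 4-5: fast, default-seeking growth
        if mops > max_ratio * geeks:   # Phase 5: capture becomes profitable
            forks += 1                 # Phase 6: geeks leave for a new layer
            openness += 1              # the ratchet: each fork raises the floor
            mops = 0                   # MOPs stay behind on the old layer
    return forks, openness
```

The point of the caricature is the ratchet variable: forks are cyclical, but the floor only moves one way.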

<ResearchCallout>
  The algorithmic sovereignty lifecycle is not a one-time transition from platform control to
  user control. It is a repeating cycle. Geeks build, MOPs arrive, sociopaths capture, geeks
  fork. The revolution is not in any single tool or protocol. It is in the ratchet: each
  cycle raises the baseline of what users expect from their feeds, making it progressively
  harder for platforms to retreat to fully opaque engagement maximization.
</ResearchCallout>

</ResearchSection>

<ResearchSection number={9} title="Implications: What This Means for Builders and Users">

**Implication 1: The algorithm becomes a product category.** Graze's business
model — building and monetizing custom feeds on someone else's social network —
is the first signal of a new product category. If Bluesky's 2026 roadmap expands custom feed
visibility and tools as promised, and if the AT Protocol matures as an open standard, we will see
an ecosystem of feed-building companies, algorithm marketplaces, and feed-as-a-service businesses.
The algorithm is unbundling from the platform, just as the app unbundled from the operating system.

**Implication 2: The 1% who build feeds have outsized influence.** Participation
inequality means that a few thousand feed builders will shape the information diet of millions.
This is not inherently good or bad — it depends on who the builders are and what they
optimize for. But it means that "algorithmic sovereignty" is, in practice, algorithmic
oligarchy: a small number of opinionated curators making editorial decisions for large audiences
who chose to delegate. The saving grace is that the delegation is voluntary and revocable —
unlike the platform default, which is neither.
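Nielsen's 90-9-1 rule makes the oligarchy arithmetic concrete. A minimal sketch, assuming a hypothetical one-million-user network (the figure is illustrative, not a platform statistic):

```python
# Nielsen's 90-9-1 participation split applied to a hypothetical network.
# The one-million user count is illustrative, not a platform statistic.

def participation_split(users):
    """Return (builders, configurers, lurkers) under the 90-9-1 rule."""
    builders = users * 1 // 100                 # the 1% who build feeds
    configurers = users * 9 // 100              # the 9% who install and tune
    lurkers = users - builders - configurers    # the ~90% on defaults
    return builders, configurers, lurkers

print(participation_split(1_000_000))  # (10000, 90000, 900000)
```

Ten thousand builders curating for nine hundred thousand delegators: the oligarchy is small, but unlike the platform default, the delegation is opt-in.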

**Implication 3: MOPs will never leave the default.** Every prediction that
"users will build their own algorithms" is wrong in the same way that "users
will read the terms of service" was wrong. Most people will not. The 85% on default feeds
are not a failure of the system — they are the system. Custom feeds are a safety valve for
the geeks, not a universal replacement for the algorithm. The default algorithm will remain the
dominant information channel for the foreseeable future, which means that platform incentives
still matter enormously even in a world with algorithmic choice.

**Implication 4: Regulation matters more than technology.** The EU's DSA is
doing more to expand algorithmic choice than any startup. By mandating that platforms offer at
least one non-profiling-based recommendation option, and by fining platforms that fail to comply,
regulators are creating the market conditions for algorithmic sovereignty. The technology to build
custom feeds has existed for years (RSS is 25 years old). What has changed is the regulatory
pressure that forces platforms to allow alternatives rather than trapping users in
engagement-optimized defaults.

**Implication 5: The sociopath invasion is inevitable, but survivable.** Chapman's
framework suggests that any successful algorithmic choice ecosystem will eventually be colonized
by growth hackers and ad networks. Graze is already running ads in custom feeds at $1 CPM. This
is not necessarily the end — advertising in user-chosen feeds is fundamentally different
from advertising in engagement-optimized feeds, because the user opted into the context. But the
incentive gradient is clear: where there is attention, there is monetization, and where there is
monetization, there are sociopaths. The geeks will need to build defensible infrastructure
— open protocols, transparent algorithms, easy portability — to survive the capture
phase and preserve the option to fork.

</ResearchSection>

<ResearchSection number={10} title="Conclusion: The Geeks Will Fork the Feed">

The social algorithm is not becoming a personal choice for everyone. It is becoming a personal
choice for the people who care enough to exercise it — and those people, though they are
a small minority, have always been the ones who shape the information environment for the rest
of us. Editors, curators, programmers, DJs, zine publishers, bloggers, RSS power users —
the form factor changes but the role is the same: the opinionated few who build the filters
through which the majority experiences reality.

What is new is the infrastructure. Bluesky's AT Protocol makes custom feeds a first-class
feature rather than a workaround. Graze demonstrates that feed-building is a viable business.
The EU's DSA mandates that platforms offer alternatives to engagement-optimized defaults.
Meta's "Dear Algo" concedes the principle even while limiting the execution.
The 30% year-over-year surge in RSS downloads shows that some users are rejecting the entire
concept of algorithmic mediation. These are not isolated signals. They are the early stages of
a structural shift — not from platform to platform, but from platform-controlled
algorithms to user-chosen algorithms.

The geeks-MOPs-sociopaths lifecycle tells us this shift will not be clean, permanent, or
universal. It will be messy, cyclical, and minority-driven. The geeks will build something
beautiful. The sociopaths will try to capture it. The geeks will fork and build again. Each
cycle will raise the baseline — the floor of what "algorithmic choice" means
in practice. The MOPs will mostly stay on default feeds, and that is fine. The revolution does
not require majority participation. It requires infrastructure that makes forking cheap and
capture expensive.

The question is not whether the social algorithm will become personalizable. It already has.
The question is whether the infrastructure will mature fast enough to survive the inevitable
sociopath invasion — and whether the geeks, this time, will build protocols instead of
platforms, standards instead of startups, and commons instead of companies. If they do, the
fork becomes permanent. If they do not, we will be here again in five years, watching the same
cycle play out on whatever the next Bluesky is. Either way, the geeks will fork the feed. They
always do.

</ResearchSection>

<ResearchReferences>

Chapman, D. (2015). "Geeks, MOPs, and Sociopaths." *Meaningness.* [meaningness.com/geeks-mops-sociopaths](https://meaningness.com/geeks-mops-sociopaths)

Bluesky. (2023). "Algorithmic Choice with Custom Feeds." *Bluesky Blog.* [bsky.social/about/blog/7-27-2023-custom-feeds](https://bsky.social/about/blog/7-27-2023-custom-feeds)

Perez, S. (2025). "Custom Feed Builder Graze Is Building a Business on Bluesky." *TechCrunch,* January 31, 2025.

Perez, S. (2025). "Bluesky Feed Builder Graze Raises $1M, Rolls Out Ads." *TechCrunch,* April 16, 2025.

CNBC. (2026). "Dear Algo: AI Algorithm Personalization Feature for Threads." *CNBC,* February 11, 2026.

Nielsen, J. (2006). "Participation Inequality: The 90-9-1 Rule for Social Features." *Nielsen Norman Group.* [nngroup.com/articles/participation-inequality](https://www.nngroup.com/articles/participation-inequality/)

European Commission. (2025). "First Non-Compliance Decision Under the Digital Services Act." *European Commission Press Release,* December 5, 2025.

FreedomLab. (2025). "The Fragmentation of Social Media." *FreedomLab.* [freedomlab.com/posts/the-fragmentation-of-social-media](https://freedomlab.com/posts/the-fragmentation-of-social-media)

García-Soto, A., et al. (2025). "Trap of Social Media Algorithms: A Systematic Review of Filter Bubbles, Echo Chambers, and Their Impact on Youth." *Societies,* 15(11), 301.

Springer Nature. (2024). "Filter Bubbles and the Unfeeling: How AI for Social Media Can Foster Extremism and Polarization." *Philosophy & Technology.*

ScienceDirect. (2024). "Let Me Decide: Increasing User Autonomy Increases Recommendation Acceptance." *Computers in Human Behavior,* 153.

Knight-Georgetown Institute. (2025). "Better Feeds: Algorithms That Put People First." *Knight-Georgetown Institute.*

Frontiers in Psychology. (2025). "Resistance or Compliance? The Impact of Algorithmic Awareness on Attitudes Toward Online Information Browsing." *Front. Psychol.,* 16.

ScienceDirect. (2025). "The Impact of Algorithm Awareness on the Acceptance of Personalized Social Media Content Recommendation." *Acta Psychologica.*

WebProNews. (2026). "RSS Revival in 2026: Users Flee Algorithms for Privacy and Control." *WebProNews.* [webpronews.com](https://www.webpronews.com/rss-revival-in-2026-users-flee-algorithms-for-privacy-and-control/)

BlockEden. (2025). "Farcaster in 2025: The Protocol Paradox." *BlockEden.xyz.* [blockeden.xyz](https://blockeden.xyz/blog/2025/10/28/farcaster-in-2025-the-protocol-paradox/)

SiteLogic Marketing. (2025). "A Brief History of Social Media Trends: Consolidation and Diversification." *SiteLogic.* [sitelogicmarketing.com](https://www.sitelogicmarketing.com/social-media-trends/)

PC Gamer. (2026). "Kill the Algorithm in Your Head: Let's Set Up RSS Readers and Get News We Actually Want in 2026." *PC Gamer.* [pcgamer.com](https://www.pcgamer.com/software/kill-the-algorithm-in-your-head-lets-set-up-rss-readers-and-get-news-we-actually-want-in-2026/)

</ResearchReferences>

<ResearchColophon
  citation={`Baratta, R. (2026). \u201CDIY Social Algorithms: The Geeks Will Fork the Feed.\u201D Buildooor Research Brief, February 2026.`}
  email="buildooor@gmail.com"
/>
