Buildooor Research Brief -- February 2026

DIY Social Algorithms: The Geeks Will Fork the Feed

buildooor % claude --model opus-4.6 -p "/research-paper fork or get forked"
Published February 12, 2026 -- Working Paper v1.0
Keywords: social algorithms, custom feeds, Bluesky, algorithmic choice, geeks MOPs sociopaths, feed curation, social media fragmentation, consolidation cycle, RSS revival, filter bubbles, Digital Services Act, recommendation systems, user agency, participation inequality, protocol not platform, decentralized social

Abstract

The social media algorithm — the invisible hand that decides what 5 billion people see every day — is becoming a personal choice. Bluesky's open custom feed marketplace, Meta's "Dear Algo" feature on Threads, the 30% year-over-year surge in RSS app downloads, and EU regulatory pressure under the Digital Services Act are converging on a single trajectory: algorithmic sovereignty. Users — at least some of them — will build, choose, and control their own recommendation systems. But David Chapman's geeks-MOPs-sociopaths lifecycle framework predicts that this sovereignty will follow a familiar pattern. Geeks (the 1% who build feeds) will create the tools. Fanatics (the 9% who install and configure them) will form the early-adopter cohort. MOPs (the 85% who want a "reasonably pleasant time for minimal effort") will never touch their settings. And sociopaths will find ways to monetize the gap between builders and consumers. This paper argues that we are entering an era of algorithmic fragmentation — not platform fragmentation — where the real divide is not which app you use but which algorithm you run inside it. The consolidation-fragmentation cycle that has governed social media since 2004 is about to operate at the feed level, not the platform level. And the geeks will fork first.

1. The Algorithm as Territory

Every social platform runs on the same basic contract: users provide attention, and the platform decides what to fill it with. The mechanism of that decision — the recommendation algorithm — is the single most powerful editorial force in human history. It determines which ideas spread, which creators earn a living, which political narratives gain traction, and which products get bought. Facebook's News Feed algorithm influences the information diet of nearly 3 billion people. TikTok's For You Page has been described by internal documents as an "addiction machine." YouTube's recommendation engine drives 70% of total watch time. None of these systems were designed by or for the people they serve. They were designed to maximize engagement, which is a proxy for advertising revenue, which is a proxy for shareholder value.

The algorithm is territory. Whoever controls the algorithm controls the culture that forms around it. For two decades, that territory has been held by a handful of companies — Meta, Google, ByteDance, and (until recently) Twitter. Users had no say in what they saw beyond crude binary signals: follow or unfollow, like or scroll past. The algorithm observed these signals, inferred preferences, and then optimized for the metric the company cared about — which was never "what the user actually wants to see" and always "what keeps the user on the platform longest."

This is starting to change. Not because platforms have become altruistic — but because competition, regulation, and a small cohort of technically opinionated users are forcing the question. The question is simple: who gets to decide what your feed looks like? And for the first time, the answer is starting to shift from "the platform" to "you — if you want to."

Table 1. Platform Feed Control Mechanisms (2026)
| Platform | Feed Control Mechanism | User Reach | Control Depth |
|---|---|---|---|
| Bluesky | Open custom feed marketplace (AT Protocol) | 37M registered | Full -- users can build, share, and pin any algorithm |
| Threads | "Dear Algo" natural language tuning | ~300M MAU (est.) | Shallow -- 3-day adjustments via text posts |
| X / Twitter | Open-sourced algorithm (2023), no user control | ~500M MAU | None -- code visible but not modifiable by users |
| Mastodon / Fediverse | Chronological by default; server-level moderation | ~2.5M MAU | Structural -- choose your server, choose your norms |
| Farcaster | Client-level algorithm choice (Warpcast, etc.) | ~546K registered | High -- different clients = different algorithms |
| RSS (2026 revival) | Pure user subscription; no algorithm | 30%+ app download growth YoY | Total -- zero algorithmic intermediation |
| Instagram | "Your Algorithm" interest toggles | ~2B MAU | Shallow -- add/remove interests from preferences |
| TikTok | No user control; opaque FYP ranking | ~1.5B MAU | None -- engagement-optimized black box |
Sources: Platform documentation, TechCrunch, CNBC, WebProNews. MAU figures are estimates as of early 2026.

The spectrum of control is enormous. At one end, TikTok offers zero user agency — an opaque engagement-optimized black box. At the other end, Bluesky offers full algorithmic sovereignty — an open marketplace where anyone can build, share, and pin a custom algorithm. Between those poles, every other platform is making different bets. Meta's "Dear Algo" on Threads lets users write a natural-language post starting with "Dear Algo," and the algorithm adjusts for three days. Instagram added interest toggles. X open-sourced its algorithm code in 2023 but never updated it — no public commits for over two years — and gave users no actual control over their experience. The gesture was, as critics noted, "transparency theatre."

2. The Consolidation-Fragmentation Cycle

Social media has always oscillated between consolidation and fragmentation. The pattern is structural: new platforms emerge to serve unmet needs, network effects consolidate users into a few dominant players, those players extract increasing value from users (enshittification), and fragmentation begins again as users seek alternatives. The cycle has run roughly twice in the medium's two-decade history, and we are now entering the third turn — but this time, the fragmentation is happening at a different layer.

Table 2. The Social Media Consolidation-Fragmentation Cycle
| Era | Dominant Pattern | Key Platforms | User Behavior |
|---|---|---|---|
| 2004--2010 | Fragmented emergence | MySpace, Friendster, LiveJournal, early Facebook | Users pick niche; no expectation of universality |
| 2010--2016 | Consolidation (Facebook era) | Facebook, Instagram (acquired), WhatsApp (acquired) | One platform per function; network effects dominate |
| 2016--2020 | Peak consolidation | Facebook/Instagram/WhatsApp + YouTube + Twitter | Average user on 4 platforms; most time on 1--2 |
| 2020--2023 | Fragmentation trigger | TikTok disrupts; Twitter/X acquisition chaos | Average user on 6.7 platforms; attention splits |
| 2023--2026 | Active fragmentation | Bluesky, Threads, Farcaster, Mastodon, Nostr, BeReal | Protocol migration; feed-level choice emerging |
| 2026+ | Algorithmic fragmentation (predicted) | Same platforms, different algorithms per user | Feed becomes a personal construction, not a platform default |
Sources: SiteLogic Marketing, FreedomLab, Sprout Social, author analysis.

The first two cycles operated at the platform level. Users migrated between apps: from MySpace to Facebook, from Facebook to Instagram, from Twitter to ... everywhere. The metric that mattered was which app you opened. But the emerging cycle is different. Users are not just switching platforms — they are switching algorithms within platforms. A Bluesky user who pins a custom "Booksky" feed and a Bluesky user running the default Discover feed are on the same platform but experiencing entirely different realities. They share an address but not a neighborhood.

This is algorithmic fragmentation. The platform is the operating system; the algorithm is the application. And just as desktop computing went from "everyone runs the same software" to "everyone customizes their stack," social media is heading toward a world where the feed is a personal construction, not a platform default. The shift from 4 platforms per user (2016) to 6.7 platforms per user (2024) was the warm-up. The shift from one algorithm per platform to multiple algorithms per platform is the main event.

The next fragmentation cycle will not be about which app you open. It will be about which algorithm you run inside it. Platform choice becomes secondary to feed choice. The real lock-in shifts from the social graph to the recommendation engine — and for the first time, users can bring their own.

3. What's Actually Shipping: The Programmable Feed

This is not speculative. Programmable feeds are shipping now, at meaningful scale, with real monetization experiments already underway. The most advanced implementation is Bluesky's AT Protocol, which treats custom feeds as first-class citizens of the social network. Bluesky's architecture replaces the opaque algorithmic black box with an open marketplace. The company's stated philosophy: "Give users sensible defaults but leave them the option to fully customize their experience if they don't like our choices." Third-party developers build feeds using published APIs and tools like SkyFeed. Users pin, sort, and swap feeds within the app. The vast majority of feeds on the network are now built by independent developers, not Bluesky itself.
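Under the AT Protocol, a feed generator is a service that answers `app.bsky.feed.getFeedSkeleton` requests with an ordered list of post URIs; the client then hydrates and renders them. The sketch below shows only that response shape. The post records, the keyword filter, and the `booksky_skeleton` name are invented for illustration; a real generator is an HTTP service registered on the network.

```python
# Minimal sketch of a feed generator's getFeedSkeleton logic, assuming
# posts are dicts with "uri", "text", and an ISO-8601 "indexed_at".
# The book-themed filter is hypothetical; only the response shape
# ({"feed": [{"post": ...}], "cursor": ...}) follows the protocol.

def booksky_skeleton(posts, limit=50, cursor=None):
    """Return newest-first post URIs matching a book-themed filter."""
    keywords = ("book", "reading", "novel")
    matching = sorted(
        (p for p in posts if any(k in p["text"].lower() for k in keywords)),
        key=lambda p: p["indexed_at"],
        reverse=True,
    )
    if cursor is not None:
        # Resume pagination after the last timestamp the client saw.
        matching = [p for p in matching if p["indexed_at"] < cursor]
    page = matching[:limit]
    return {
        "feed": [{"post": p["uri"]} for p in page],
        "cursor": page[-1]["indexed_at"] if page else None,
    }
```

The point of the design is that ranking logic this small is enough to be a "feed" -- the platform handles identity, storage, and rendering, while the generator only decides order.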

The commercial proof point is Graze, a startup that went from zero to serving hundreds of thousands of unique daily users and tens of millions of content impressions within months of Bluesky's growth spike. Graze powers 4,500 feeds created by roughly 3,000 users — including several of Bluesky's top feeds across news, gaming, art, politics, sports, and fitness. In April 2025, Graze raised a $1M pre-seed round led by Betaworks and Salesforce Ventures. It is already running ads in 200 custom feeds at approximately $1 per 1,000 impressions — with Bluesky's explicit blessing. This is not a side project. It is an emerging business model: the algorithm as a service, built by users, for users, with the platform serving as infrastructure.

Table 3. Custom Feed Adoption Data Points (2025--2026)
| Platform / Tool | Metric | Source | Significance |
|---|---|---|---|
| Graze (Bluesky feed builder) | 4,500 feeds by ~3,000 users | TechCrunch, Apr 2025 | Raised $1M pre-seed; $1 CPM ads in custom feeds |
| Graze daily impressions | Tens of millions | TechCrunch, Jan 2025 | 3,000 builders serve hundreds of thousands of daily users |
| Bluesky registered users | 37M+ | Bluesky, Jul 2025 | 23M added in one year; 40% DAU drop by Oct 2025 |
| Bluesky custom feeds (total) | Thousands (open marketplace) | Bluesky Blog, 2023 | Vast majority built by independent third-party developers |
| RSS app downloads (2026 vs 2025) | +30% YoY | WebProNews, 2026 | Driven by privacy concerns and algorithm fatigue |
| Threads "Dear Algo" launch | Feb 2026 (US, UK, AU, NZ) | CNBC, Feb 2026 | Natural language feed tuning; 3-day adjustment window |
| Farcaster registered users | ~546K | BlockEden, Oct 2025 | 40--60K DAU; client-level algorithm choice |
| Mastodon monthly active users | ~2.5M | MarketingScoop, 2025 | 600%+ YoY growth post-Twitter acquisition |
| X algorithm open-source (GitHub) | No public commits for 2+ years | Social Media Today, 2026 | Transparency theatre -- code visible but not updated or user-modifiable |
| EU DSA fine against X | €120M (Dec 2025) | European Commission, 2025 | First DSA non-compliance fine; deceptive design + ad transparency violations |
Sources: TechCrunch, Bluesky Blog, WebProNews, CNBC, European Commission, BlockEden, MarketingScoop, Social Media Today.

Meanwhile, Meta is approaching the same destination from the opposite direction. Rather than opening the algorithm to developers, Threads launched "Dear Algo" in February 2026 — a feature that lets users write a natural-language post requesting algorithm adjustments. The algorithm then adjusts the user's feed for three days. It is a strikingly different design philosophy: Bluesky says "build your own algorithm," while Threads says "tell ours what you want in plain English." One gives you the source code; the other gives you a suggestion box. Both acknowledge the same underlying demand.

And then there is RSS — the 25-year-old protocol that Google tried to kill in 2013 by shuttering Google Reader. RSS app downloads surged 30% year-over-year in 2026, driven by what PC Gamer called the need to "kill the algorithm in your head." RSS is the most radical version of algorithmic sovereignty: there is no algorithm at all. You subscribe to sources. They appear chronologically. No ranking, no optimization, no engagement signals. It is the equivalent of reading the newspaper you chose to subscribe to, and nothing else. The revival is small in absolute terms but culturally significant — it represents the purist position in the algorithmic choice spectrum.
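The radicalism of RSS is easiest to see in code: an RSS reader's entire "ranking" logic is explicit subscription plus reverse-chronological order. A minimal sketch using only the Python standard library (the feed documents here are illustrative RSS 2.0 fragments):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def merge_feeds(rss_documents):
    """Merge RSS 2.0 <item>s from several feed documents into one
    newest-first list. No ranking, no optimization, no engagement
    signals -- just subscription and chronology."""
    items = []
    for doc in rss_documents:
        root = ET.fromstring(doc)
        for item in root.iter("item"):
            items.append({
                "title": item.findtext("title", default=""),
                # RSS pubDate uses RFC 822 dates, e.g. "Mon, 02 Feb 2026 10:00:00 GMT"
                "published": parsedate_to_datetime(item.findtext("pubDate")),
            })
    return sorted(items, key=lambda i: i["published"], reverse=True)
```

Everything a recommendation engine would decide -- what to show, in what order, how often -- is fixed in advance by the user's subscription list.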

4. Geeks, MOPs, and Sociopaths: The Lifecycle of Algorithmic Choice

David Chapman's 2015 essay "Geeks, MOPs, and Sociopaths" describes a lifecycle that governs how subcultures form, grow, and die. The framework identifies three groups: geeks (creators and fanatics who invent and sustain a scene through obsessive engagement), MOPs (members of the public who show up for a "reasonably pleasant time in exchange for minimal effort"), and sociopaths (exploiters who recognize the subculture as a power game and extract cultural, social, and liquid capital from it). The lifecycle runs: geeks create something exciting, MOPs arrive and dilute it, sociopaths monetize and destroy it, geeks abandon the wreckage and start over.

This framework maps with uncomfortable precision onto the emerging landscape of user-controlled social algorithms.

Table 4. Chapman's Framework Applied to Algorithm Choice
| Actor | In Chapman's Framework | In the Algorithm Economy | Approximate % |
|---|---|---|---|
| Creators (Geeks) | Invent the scene; obsess over esoteric details | Build custom feeds, write feed generators, fork protocols | ~1% |
| Fanatics (Geeks) | Organize, fund, analyze; deeply committed | Install custom feeds, curate RSS, run Mastodon servers | ~9% |
| MOPs | Casual fans seeking pleasant experience for minimal effort | Use default algorithm; never change settings; follow what surfaces | ~85% |
| Sociopaths | Extract cultural, social, and liquid capital | Game algorithms, sell "growth hacking," monetize attention arbitrage | ~5% |
Source: Chapman (2015), 'Geeks, MOPs, and Sociopaths,' meaningness.com. Percentages adapted from Nielsen's 90-9-1 rule.

The geeks are already visible. They are the 3,000 Graze feed builders serving millions of impressions. They are the Mastodon server operators running their own instances with custom moderation policies. They are the developers writing custom Bluesky feed generators using the AT Protocol API. They are the people who set up Miniflux or NetNewsWire and curate 200 RSS subscriptions instead of scrolling a default feed. They do this because they care about what they see. The feed is not a passive experience for them — it is a craft.

The MOPs are the other 85%. They will never build a custom feed. They will never install one. They will never change their feed settings. They will open the app and consume whatever the platform decides to show them. This is not laziness — it is rational behavior. Most people do not care about algorithmic curation the way geeks do. They want "a reasonably pleasant time in exchange for minimal effort," and the default algorithm delivers that well enough. The MOPs are the reason platforms can afford to offer custom feeds to geeks: the default algorithm, running on 85% of users, generates the ad revenue that subsidizes the geek-facing features.

The sociopaths are the growth hackers, the engagement farmers, the "algorithmic arbitrageurs" who will find ways to exploit the gap between custom-feed geeks and default-feed MOPs. They are already present in Graze's ad-supported feeds. They will appear wherever there is an audience that someone else curated and a mechanism to monetize it. Chapman notes that the optimal ratio of MOPs to geeks is "maybe 6:1" and that beyond 10:1, the scene becomes unsustainable. In algorithm land, the ratio is roughly 85:10 — well past Chapman's optimum and approaching his sustainability ceiling — which suggests the sociopath invasion will arrive quickly.

Table 5. The Algorithm Lifecycle: From Scene to Collapse
| Stage | Subculture Lifecycle (Chapman) | Algorithm Lifecycle (This Paper) |
|---|---|---|
| 1. Scene formation | Small group invents exciting innovation | Power users discover custom feed tools; build for themselves |
| 2. Growth | Fanatics join; MOPs arrive for the vibe | Feed builders gain users; Graze hits millions of impressions |
| 3. MOP dilution | MOPs demand convenience; cultural intensity drops | Platforms simplify feed controls into "Dear Algo" one-liners |
| 4. Sociopath capture | Exploiters monetize the scene; push out geeks | Growth hackers, SEO grifters, and ad networks colonize custom feeds |
| 5. Collapse / fork | Geeks abandon the scene; subculture dies or mutates | Geeks fork to new protocols; cycle restarts at smaller scale |
Source: Author synthesis of Chapman (2015) and observed market behavior.

The geeks-MOPs-sociopaths lifecycle predicts that algorithmic sovereignty will follow the same arc as every subculture before it. The geeks will build something beautiful. The MOPs will show up and demand it be easier. The sociopaths will monetize it into oblivion. And then the geeks will fork — to a new protocol, a new tool, a new layer of abstraction. The question is not whether this cycle will run. It is how many times it will run before the infrastructure becomes mature enough to resist capture.

5. The 90-9-1 Problem: Who Actually Builds Their Own Feed?

The internet's participation inequality is one of the most robust findings in digital sociology. Jakob Nielsen's 90-9-1 rule — 90% lurk, 9% contribute occasionally, 1% create most content — has been validated across platforms from Wikipedia (where 0.2% of visitors are active editors) to health forums (where "Superusers" generate the vast majority of posts) to blogs (where the ratio skews even further, to roughly 95-5-0.1). More recent data from large online communities suggests the split may be narrowing slightly — 5% creating, 5% responding — but the fundamental asymmetry holds: a tiny minority produces, the vast majority consumes.

Table 6. Participation Inequality in Content vs. Algorithm Curation
| Activity Level | Traditional Social (90-9-1) | Algorithm Curation (Predicted) | Cultural Influence |
|---|---|---|---|
| Creators (1%) | Generate majority of content | Build custom feeds, write algorithms, run feed services | Set the taste; define what quality looks like |
| Contributors (9%) | Edit, share, comment | Install custom feeds, configure RSS, choose alt-clients | Amplify creator taste; form the early-adopter cohort |
| Lurkers (90%) | Consume without contributing | Use platform defaults; never touch feed settings | Absorb whatever surfaces; shaped by others' choices |
Source: Nielsen (2006), Higher Logic (2024), author projection.

Apply this to algorithmic choice and the picture is stark. The 1% who build custom feeds are not just creators — they are taste-makers. They decide what the 9% early adopters experience, which in turn shapes the culture that the 90% eventually absorb through the default algorithm. This is how it has always worked in media. A tiny number of editors, curators, and programmers have always determined what the masses see. The difference now is that the curators are users, not employees. The editorial function has been democratized — but democracy does not mean equal participation. It means the motivated few shape the experience of the passive many.

Graze's numbers illustrate this perfectly. Three thousand builders serve hundreds of thousands of daily users. That is a ratio of roughly 1:100 — far more concentrated than even the 90-9-1 rule predicts. Each feed builder is a micro-editor, curating reality for an audience that chose their algorithm but did not build it. The feed builder's biases, interests, and blind spots become the audience's information diet. This is not necessarily worse than a corporate algorithm — but it is not automatically better, either. It is a different kind of gatekeeping: distributed, transparent, and voluntarily chosen, but gatekeeping nonetheless.
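The arithmetic behind these ratios can be made concrete. The sketch below applies the 90-9-1 split to a Bluesky-sized user base; the split percentages and the function are illustrative projections from this paper, not measured data, and as the Graze figures show, real participation is even more skewed.

```python
def participation_projection(user_base, creator_share=0.01, contributor_share=0.09):
    """Apply a 90-9-1-style split to a platform's user base and report
    the average audience each feed builder would serve. Illustrative
    only: Graze's observed ratio is closer to 1 builder per 100 daily
    users, more concentrated than this projection."""
    builders = round(user_base * creator_share)
    choosers = round(user_base * contributor_share)
    lurkers = user_base - builders - choosers
    return {
        "builders": builders,
        "choosers": choosers,
        "lurkers": lurkers,
        "audience_per_builder": round(user_base / builders) if builders else None,
    }
```

For a 37M-user platform, even the optimistic 1% assumption leaves each builder curating for roughly a hundred people -- the micro-editor dynamic described above, at scale.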

The deeper implication is that "everyone will build their own feed" is wrong. It has always been wrong. The correct prediction is: the 1% will build feeds, the 9% will choose from what the 1% built, and the 90% will use whatever default the platform sets. The revolution is real, but it is a revolution of the minority. And the minority's taste will propagate to the majority through the same cultural diffusion mechanisms that have always operated — just faster, and with more explicit infrastructure.

6. The Regulatory Accelerant: DSA and Forced Algorithmic Choice

The European Union's Digital Services Act, fully applicable since February 2024, is the most significant regulatory force pushing platforms toward algorithmic transparency and user choice. The DSA requires Very Large Online Platforms (VLOPs) to disclose how their recommendation algorithms work, ban targeted advertising to children, and provide users with at least one recommendation option that is not based on profiling. The European Commission established the European Centre for Algorithmic Transparency specifically to audit platform compliance.

Enforcement has teeth. In December 2025, the Commission issued its first non-compliance fine under the DSA: €120 million against X for violations of deceptive design rules, ad transparency requirements, and researcher data access provisions. A new investigation against X launched in January 2026. The message is clear: opaque algorithms that optimize purely for engagement, without user agency or transparency, are now a regulatory liability. Facebook has already expanded chronological feed options in select EU regions as a compliance measure. Instagram's "Your Algorithm" interest toggles are partially a DSA response.

The regulatory dynamic creates an asymmetric incentive. Platforms that already offer user-controlled feeds — Bluesky, Mastodon, Farcaster — are effectively pre-compliant. They built algorithmic choice as a feature, not a regulatory burden. Platforms that relied on opaque engagement maximization — TikTok, Instagram, X — face a choice: genuinely empower users to control their feeds, or build the minimum viable compliance features while preserving the engagement-optimized default. So far, most are choosing the latter. "Dear Algo" is elegant, but a 3-day adjustment window based on a natural language post is not the same as handing users the source code to their recommendation engine.

This direction is echoed in research. A March 2025 Knight-Georgetown Institute report, "Better Feeds: Algorithms That Put People First," argued for structural reforms to recommendation systems — not just transparency requirements but genuine user control mechanisms. A study published on ScienceDirect found that increasing user autonomy increases recommendation acceptance, drawing on self-determination theory: people who feel in control of their information environment are more satisfied with it, even if the content is the same. The DSA is pushing in the right direction, but the gap between regulatory intent (genuine algorithmic choice) and platform compliance (minimal settings menus) remains vast.

7. The Filter Bubble Paradox: Is Self-Curation Better or Worse?

The most common objection to user-controlled algorithms is the filter bubble argument: if people choose their own feeds, won't they just surround themselves with confirming voices? A systematic review synthesizing a decade of peer-reviewed research (2015–2025) on filter bubbles, echo chambers, and algorithmic bias found that "algorithmic systems structurally amplify ideological homogeneity, reinforcing selective exposure and limiting viewpoint diversity." Small initial biases are magnified by recommender systems, producing "polarization cascades at the network level." This is the standard critique, and the evidence for it is strong — for platform-controlled algorithms.
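The "small initial biases magnified" mechanism can be caricatured in a few lines. The model below is a deterministic toy invented for this paper: a user only 5% more interested in one topic ends up seeing it on every impression under a greedy engagement ranker, versus an even split under chronological rotation. All numbers and the scoring rule are assumptions, not drawn from the cited studies.

```python
def exposure_shares(rounds=100, n_topics=5, bias=0.05):
    """Toy amplification model. Returns (engagement_share, chrono_share):
    the fraction of impressions topic 0 receives under a greedy
    engagement ranker vs. a round-robin (chronological) baseline."""
    # User's true interest: nearly uniform, with a tiny tilt toward topic 0.
    pref = [1.0 + (bias if t == 0 else 0.0) for t in range(n_topics)]
    clicks = [0.0] * n_topics
    engagement = [0] * n_topics   # impressions served by the ranker
    chrono = [0] * n_topics      # impressions under round-robin
    for r in range(rounds):
        # Greedy ranker: prior interest inflated by observed clicks.
        scores = [pref[t] * (1.0 + clicks[t]) for t in range(n_topics)]
        top = scores.index(max(scores))
        engagement[top] += 1
        clicks[top] += pref[top]  # engagement feeds back into the score
        chrono[r % n_topics] += 1
    return engagement[0] / rounds, chrono[0] / rounds
```

The feedback term is the whole story: once topic 0 wins a single round, its click count guarantees it wins every subsequent one. A 5% preference becomes a 100% exposure share.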

But the filter bubble research has a blind spot: it almost exclusively studies engagement-optimized algorithms, not user-constructed feeds. The entire literature assumes that the algorithm is a black box controlled by the platform. When you shift to user-controlled algorithms, the dynamics change in ways that the existing research does not capture. A user who consciously builds their feed is engaging in a fundamentally different cognitive act than a user who passively consumes an engagement-optimized one. The former involves metacognition — thinking about what you want to think about. The latter involves no metacognition at all.

Table 7. Filter Bubble Risk by Curation Method
| Curation Method | Diversity Risk | Echo Chamber Risk | User Agency |
|---|---|---|---|
| Platform default algorithm | High -- optimizes for engagement, not breadth | High -- amplifies confirming content | None -- user is the product |
| User-built custom feed | Medium -- depends on builder sophistication | Medium -- self-aware curation possible | High -- user chooses the filter |
| RSS / chronological | Low -- user explicitly subscribes to sources | Low -- no amplification loop | Total -- but requires effort |
| Community-curated feed | Low-Medium -- curator taste diversifies | Medium -- curator bias replaces algo bias | Delegated -- trust the curator |
| AI-tuned personal feed ("Dear Algo") | Unknown -- depends on implementation | Potentially high -- natural language = vague steering | Shallow -- illusion of control |
Source: Author analysis synthesizing MDPI (2025), Springer Nature (2024), Frontiers in Psychology (2025).

The paradox is this: platform-controlled algorithms definitely create filter bubbles, because they optimize for engagement and engagement correlates with emotional arousal and confirmation bias. User-controlled algorithms might create filter bubbles, but only if the user is not self-aware — and the act of building your own feed is itself an act of self-awareness. The geeks who build custom feeds are, almost by definition, the people most likely to intentionally include dissenting voices, serendipity, and breadth. They are not building feeds to confirm their biases. They are building feeds to escape the platform's bias toward engagement optimization.

Research on algorithmic awareness supports this. A study in Frontiers in Psychology (2025) found that algorithm-aware users experience less sense of manipulation and more perceived control — but also noted that awareness alone does not change behavior. The users who actually change their behavior are the ones with both awareness and agency. This is exactly the cohort that custom feed tools serve: users who know what algorithms do and have the tools to do something about it. The filter bubble risk does not disappear, but it shifts from an externally imposed problem to an internally managed one. Whether that is an improvement depends entirely on whether you trust individuals more than you trust engagement-optimized corporate algorithms. The evidence suggests you should.

8. Framework: The Algorithmic Sovereignty Lifecycle

Synthesizing the preceding analysis, we propose a lifecycle model for how algorithmic sovereignty emerges, diffuses, and evolves. The model combines the consolidation-fragmentation cycle (which describes when users seek alternatives), the geeks-MOPs-sociopaths lifecycle (which describes who acts at each stage), and participation inequality (which describes how many people exercise algorithmic choice at each level).

Phase 1: Platform Enshittification (the trigger). A dominant platform begins extracting more value from users than it provides. The algorithm shifts from "show people what they want" to "show people what generates revenue." Ad load increases. Organic reach declines. The default experience degrades. This is where Twitter/X was in 2022, where Facebook has been since roughly 2018, and where TikTok is heading as regulatory and competitive pressures mount.

Phase 2: Geek Exodus (the 1%). The most technically opinionated users leave first. They do not just leave the platform — they leave the algorithm. They set up RSS readers, build Bluesky custom feeds, run Mastodon instances, or write their own recommendation systems. They are not seeking a better platform. They are seeking sovereignty over their information diet. This phase is small in numbers but culturally significant: geeks are the taste-makers, and their departure signals that the platform has lost the quality cohort.

Phase 3: Fanatic Adoption (the 9%). Early adopters who are not technically inclined but are culturally aligned with the geeks begin using the tools geeks built. They install Graze feeds. They subscribe to curated RSS bundles. They switch Farcaster clients for a different algorithmic experience. They do not build — they choose. This is the phase where the market for custom algorithms becomes commercially viable. Graze's $1M raise is a Phase 3 signal.

Phase 4: Platform Response (the co-optation). The dominant platforms see the geek exodus as a competitive threat and respond with simplified feed control features. This is Threads' "Dear Algo" and Instagram's interest toggles. These features give the illusion of algorithmic choice without actually surrendering control. They are designed to retain MOPs by making them feel empowered while keeping the engagement-optimized default in place. Chapman would recognize this as the sociopath response: mimicking the geek innovation to prevent MOP migration.

Phase 5: Sociopath Capture (the monetization). Growth hackers, SEO operators, and ad networks figure out how to game the new algorithmic layer. Custom feeds become ad channels. Feed builders sell placement. The curation layer acquires the same incentive misalignment that plagued the platform layer. The 6:1 MOP-to-geek ratio that Chapman identifies as the maximum sustainable level is quickly exceeded. Quality drops.

Phase 6: The Fork (geeks leave again). Geeks abandon the captured algorithm layer and move to a new one. Maybe they build at the protocol level (AT Protocol, ActivityPub, Nostr). Maybe they build AI-powered personal recommendation systems that run locally. Maybe they build something we have not imagined yet. The cycle restarts. Each iteration leaves behind infrastructure that is slightly more open, slightly more user-controlled, and slightly harder for any single entity to capture. This is the ratchet: the geeks never win permanently, but each fork raises the floor.

The algorithmic sovereignty lifecycle is not a one-time transition from platform control to user control. It is a repeating cycle. Geeks build, MOPs arrive, sociopaths capture, geeks fork. The revolution is not in any single tool or protocol. It is in the ratchet: each cycle raises the baseline of what users expect from their feeds, making it progressively harder for platforms to retreat to fully opaque engagement maximization.

9. Implications: What This Means for Builders and Users

Implication 1: The algorithm becomes a product category. Graze's business model — building and monetizing custom feeds on someone else's social network — is the first signal of a new product category. If Bluesky's 2026 roadmap expands custom feed visibility and tools as promised, and if the AT Protocol matures as an open standard, we will see an ecosystem of feed-building companies, algorithm marketplaces, and feed-as-a-service businesses. The algorithm is unbundling from the platform, just as the app unbundled from the operating system.

Implication 2: The 1% who build feeds have outsized influence. Participation inequality means that a few thousand feed builders will shape the information diet of millions. This is not inherently good or bad — it depends on who the builders are and what they optimize for. But it means that "algorithmic sovereignty" is, in practice, algorithmic oligarchy: a small number of opinionated curators making editorial decisions for large audiences who chose to delegate. The saving grace is that the delegation is voluntary and revocable — unlike the platform default, which is neither.

Implication 3: MOPs will never leave the default. Every prediction that "users will build their own algorithms" is wrong in the same way that "users will read the terms of service" was wrong. Most people will not. The 85% on default feeds are not a failure of the system — they are the system. Custom feeds are a safety valve for the geeks, not a universal replacement for the algorithm. The default algorithm will remain the dominant information channel for the foreseeable future, which means that platform incentives still matter enormously even in a world with algorithmic choice.

Implication 4: Regulation matters more than technology. The EU's DSA is doing more to expand algorithmic choice than any startup. By mandating that platforms offer at least one non-profiling-based recommendation option, and by fining platforms that fail to comply, regulators are creating the market conditions for algorithmic sovereignty. The technology to build custom feeds has existed for years (RSS dates back to 1999). What has changed is the regulatory pressure that forces platforms to allow alternatives rather than trapping users in engagement-optimized defaults.
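That "the technology has existed for years" claim is easy to make concrete: an RSS reader's entire ranking logic is reverse-chronological order, implementable with nothing but the standard library. A sketch, using an invented two-item feed:

```python
# What "no algorithm" means in practice: an RSS reader's only ranking rule
# is reverse-chronological order. Parses a minimal RSS 2.0 document with
# the standard library; the feed content is invented for illustration.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

RSS = """<rss version="2.0"><channel>
  <item><title>Older post</title><pubDate>Mon, 09 Feb 2026 10:00:00 GMT</pubDate></item>
  <item><title>Newer post</title><pubDate>Wed, 11 Feb 2026 09:00:00 GMT</pubDate></item>
</channel></rss>"""

def timeline(rss_text):
    """Return (title, datetime) pairs, newest first; that is the whole 'algorithm'."""
    items = []
    for item in ET.fromstring(rss_text).iter("item"):
        title = item.findtext("title")
        when = parsedate_to_datetime(item.findtext("pubDate"))
        items.append((title, when))
    return sorted(items, key=lambda pair: pair[1], reverse=True)
```

The twenty-five-year-old format needs no recommendation engine at all, which is precisely why users fleeing algorithmic mediation keep returning to it.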

Implication 5: The sociopath invasion is inevitable, but survivable. Chapman's framework suggests that any successful algorithmic choice ecosystem will eventually be colonized by growth hackers and ad networks. Graze is already running ads in custom feeds at $1 CPM. This is not necessarily the end — advertising in user-chosen feeds is fundamentally different from advertising in engagement-optimized feeds, because the user opted into the context. But the incentive gradient is clear: where there is attention, there is monetization, and where there is monetization, there are sociopaths. The geeks will need to build defensible infrastructure — open protocols, transparent algorithms, easy portability — to survive the capture phase and preserve the option to fork.
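What "transparent algorithms" could mean in practice: a feed whose every signal and weight is visible and user-editable, so that forking is as cheap as changing a number. The post fields and weights below are invented for illustration, not drawn from any real feed service:

```python
# Hypothetical illustration of a transparent feed ranker: unlike an opaque
# engagement-optimized model, every signal and weight is inspectable.
# All fields and weights here are made up for the sketch.
from dataclasses import dataclass

@dataclass
class Post:
    uri: str
    age_hours: float
    from_followed: bool
    matches_topic: bool

# The entire "algorithm" is this dict; forking the feed means editing it.
WEIGHTS = {"recency": 1.0, "followed": 2.0, "topic": 3.0}

def score(post: Post) -> float:
    recency = max(0.0, 1.0 - post.age_hours / 24.0)  # decays to 0 over a day
    return (WEIGHTS["recency"] * recency
            + WEIGHTS["followed"] * post.from_followed
            + WEIGHTS["topic"] * post.matches_topic)

def rank(posts):
    return sorted(posts, key=score, reverse=True)
```

Nothing here feeds engagement data back into the weights, which is the point: capture becomes visible the moment someone tries to smuggle an opaque signal into `WEIGHTS`.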

10. Conclusion: The Geeks Will Fork the Feed

The social algorithm is not becoming a personal choice for everyone. It is becoming a personal choice for the people who care enough to exercise it — and those people, though they are a small minority, have always been the ones who shape the information environment for the rest of us. Editors, curators, programmers, DJs, zine publishers, bloggers, RSS power users — the form factor changes but the role is the same: the opinionated few who build the filters through which the majority experiences reality.

What is new is the infrastructure. Bluesky's AT Protocol makes custom feeds a first-class feature rather than a workaround. Graze demonstrates that feed-building is a viable business. The EU's DSA mandates that platforms offer alternatives to engagement-optimized defaults. Meta's "Dear Algo" concedes the principle even while limiting the execution. The 30% year-over-year surge in RSS downloads shows that some users are rejecting the entire concept of algorithmic mediation. These are not isolated signals. They are the early stages of a structural shift — not from platform to platform, but from platform-controlled algorithms to user-chosen algorithms.

The geeks-MOPs-sociopaths lifecycle tells us this shift will not be clean, permanent, or universal. It will be messy, cyclical, and minority-driven. The geeks will build something beautiful. The sociopaths will try to capture it. The geeks will fork and build again. Each cycle will raise the baseline — the floor of what "algorithmic choice" means in practice. The MOPs will mostly stay on default feeds, and that is fine. The revolution does not require majority participation. It requires infrastructure that makes forking cheap and capture expensive.

The question is not whether the social algorithm will become personalizable. It already has. The question is whether the infrastructure will mature fast enough to survive the inevitable sociopath invasion — and whether the geeks, this time, will build protocols instead of platforms, standards instead of startups, and commons instead of companies. If they do, the fork becomes permanent. If they do not, we will be here again in five years, watching the same cycle play out on whatever the next Bluesky is. Either way, the geeks will fork the feed. They always do.

References

Chapman, D. (2015). "Geeks, MOPs, and Sociopaths." Meaningness. meaningness.com/geeks-mops-sociopaths

Bluesky. (2023). "Algorithmic Choice with Custom Feeds." Bluesky Blog. bsky.social/about/blog/7-27-2023-custom-feeds

Perez, S. (2025). "Custom Feed Builder Graze Is Building a Business on Bluesky." TechCrunch, January 31, 2025.

Perez, S. (2025). "Bluesky Feed Builder Graze Raises $1M, Rolls Out Ads." TechCrunch, April 16, 2025.

CNBC. (2026). "Dear Algo: AI Algorithm Personalization Feature for Threads." CNBC, February 11, 2026.

Nielsen, J. (2006). "Participation Inequality: The 90-9-1 Rule for Social Features." Nielsen Norman Group. nngroup.com/articles/participation-inequality

European Commission. (2025). "First Non-Compliance Decision Under the Digital Services Act." European Commission Press Release, December 5, 2025.

FreedomLab. (2025). "The Fragmentation of Social Media." FreedomLab. freedomlab.com/posts/the-fragmentation-of-social-media

García-Soto, A., et al. (2025). "Trap of Social Media Algorithms: A Systematic Review of Filter Bubbles, Echo Chambers, and Their Impact on Youth." Societies, 15(11), 301.

Springer Nature. (2024). "Filter Bubbles and the Unfeeling: How AI for Social Media Can Foster Extremism and Polarization." Philosophy & Technology.

ScienceDirect. (2024). "Let Me Decide: Increasing User Autonomy Increases Recommendation Acceptance." Computers in Human Behavior, 153.

Georgetown Knight Institute. (2025). "Better Feeds: Algorithms That Put People First." Knight-Georgetown Institute.

Frontiers in Psychology. (2025). "Resistance or Compliance? The Impact of Algorithmic Awareness on Attitudes Toward Online Information Browsing." Front. Psychol., 16.

ScienceDirect. (2025). "The Impact of Algorithm Awareness on the Acceptance of Personalized Social Media Content Recommendation." Acta Psychologica.

WebProNews. (2026). "RSS Revival in 2026: Users Flee Algorithms for Privacy and Control." WebProNews. webpronews.com

BlockEden. (2025). "Farcaster in 2025: The Protocol Paradox." BlockEden.xyz. blockeden.xyz

SiteLogic Marketing. (2025). "A Brief History of Social Media Trends: Consolidation and Diversification." SiteLogic. sitelogicmarketing.com

PC Gamer. (2026). "Kill the Algorithm in Your Head: Let's Set Up RSS Readers and Get News We Actually Want in 2026." PC Gamer. pcgamer.com

Suggested citation: Baratta, R. (2026). "DIY Social Algorithms: The Geeks Will Fork the Feed." Buildooor Research Brief, February 2026.

Correspondence: buildooor@gmail.com