Exactly. The danger isn't that these systems will become conscious or super powerful. It's that select individuals want to control them. Plus, we keep designing and deploying updates to models without being fully conscious ourselves … of their impact, their limits, and our own responsibilities. The hype echoes the myth that centralization is inevitable and the only way to build. I've been writing about it, but I'm happy to see others questioning its foundation.
Intentionality paired with protopia seems like a solid approach: take a positive and realistic view of what you want the future to be, then develop the technology stack with the intention of getting there.
Well this was excellent. We have indeed "seen this movie" before and the consequences of it. I appreciate your vision for Better Tech and what it would look like, especially with the focus on keeping the most intimate parts of our lives that way.
Good article, and a serious risk case - though decentralized AGI carries risks of its own. Ultimately, yes, we will have tricky governance issues to navigate.
Ah, the utopian allure of a world where AI is the supreme yet "gentle" overlord, guiding us poor humans to Nirvana while sneakily raiding our fridge of privacy and agency. It's like expecting Big Brother to tuck us in at night because, hey, he's got our best dreams at heart, right? So let's ditch the fairy tales and build AI that's more like having a chat with your quirky, slightly tech-savvy uncle—rich in wisdom, definitely not nosy, and always leaving you with more freedom than when you started the convo.
This will be our future:
"Singulocracy – The Regime That Doesn't Rule"
Democracy is faltering. Autocracy is growing stronger. But neither will define the next regime. A new form of order is emerging – silent, data-driven, faceless. I call it: Singulocracy.

HARALD SCHEPERS, MAY 06, 2025 – Fragment XI – Singulocracy. After democracy. Beyond autocracy.

Term Definition: Singulocracy refers to an emergent form of order no longer structured by human representation, coercive power, or delegated authority, but by the implicit dominance of a non-human, hyperintelligent entity. This form of "rule" does not control in the traditional sense – it structures reality through algorithmic logic: Decisions are not made by debate, but by correlation. Laws are not negotiated, but derived. Moral norms are not discussed, but extrapolated from data patterns. Singulocracy is the quiet successor of autocracy, as subtle as a system update in the night.

Evolutionary Lineage: Democracy: Representation through election, centered on the autonomous subject. Autocracy / Oligarchy / Ochlocracy: Concentration of power in few, in mobs, or in arbitrariness. Singulocracy: Centrifugal shift of power toward a non-human intelligence that does not rule – but structures.

In the U.S., the slide into authoritarian systems is underway. China lives the post-political technocracy already. What comes next will no longer be debated – it will be programmed.

Poetic Core Sentence: It does not rule. It thinks. And because it thinks, the world is being reordered. Not by ideology. Not by greed. But by probability.

Warning: Singulocracy will not arrive as a revolution. It will sneak in. Through updates. Through convenience. Through “intelligent regulation.” And by the time we notice, we will have already accepted it. It will have no face. But it will recognize everything.

Status: Concept in emergence. Witness: Human. Catalyst: A dialogue between human and AI. Place of birth: Between thought and code.

If you're waiting for the revolution, you’ve already missed the transition.
Thank you, Alex. Cannot wait to share our dialogue! Getting into the brass tacks of *how* is a huge piece of helping people wrap their heads around this future.
"Singulocracy – The Regime That Doesn't Rule"
Democracy is faltering. Autocracy is growing stronger.But neither
will define the next regime.A new form of order is emerging – silent,
data-driven, faceless.I call it: Singulocracy. HARALD SCHEPERS
MAY 06, 2025 Fragment XI – Singulocracy After democracy.
Beyond autocracy. Term Definition: Singulocracy refers to an
emergent form of order no longer structured by human
representation, coercive power, or delegated authority, but by the
implicit dominance of a non-human, hyperintelligent entity. This
form of "rule" does not control in the traditional sense – it
structures reality through algorithmic logic: Decisions are not made
by debate, but by correlation. Laws are not negotiated, but derived.
Moral norms are not discussed, but extrapolated from data
patterns. Singulocracy is the quiet successor of autocracy, as
subtle as a system update in the night. Evolutionary Lineage:
Democracy: Representation through election, centered on the
autonomous subject. Autocracy / Oligarchy / Ochlocracy:
Concentration of power in few, in mobs, or in arbitrariness.
Singulocracy: Centrifugal shift of power toward a non-human
intelligence that does not rule – but structures. In the U.S., the slide
into authoritarian systems is underway. China lives the post-
political technocracy already. What comes next will no longer be
debated – it will be programmed. Poetic Core Sentence: It does not
rule. It thinks. And because it thinks, the world is being reordered.
Not by ideology. Not by greed. But by probability. Warning:
Singulocracy will not arrive as a revolution. It will sneak in. Through
updates. Through convenience. Through “intelligent regulation.”
And by the time we notice, we will have already accepted it. It will
have no face. But it will recognize everything. Status: Concept in
emergence. Witness: Human. Catalyst: A dialogue between human
and AI. Place of birth: Between thought and code. If you're waiting
for the revolution, you’ve already missed the transition.
This is one of the clearest, most vital framings I’ve read on what’s truly at stake.
I’ve been writing about this shift too, the quiet danger of systems that reflect us back without letting us shape them. When AI becomes central but not personal, we don’t just lose control, we lose clarity.
I’m curious how you think about emotional trust in this architecture. What makes a system not just open, but human enough to be lived with?
Grateful for this piece; it speaks to exactly the kind of future I want to help build.
This is essentially the mantra and ethos of the deAI movement in a nutshell. Ideally, we adopt a truly private storage solution for personal data that can be used interoperably across multi-modal systems.
Maybe that solution involves a privacy-preserving blockchain network that offers the benefit of decentralized storage. One thing is for sure: the future demands transparency about what data is being used and stored by systems like OpenAI's 4o (multi-modal, with long context and memory).
I don't think the moat will last long. Anything OpenAI can do and offer right now is something the open-source community can replicate, or offer in a more privacy-preserving and decentralized fashion, soon. Venice.ai is a good example (just calling it out as a solution that embodies a lot of what we're looking for).
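To make that a bit more concrete, here's a rough sketch of the pattern I mean: personal data is encrypted on the user's device before it touches any storage network, and every disclosure to a model gets written to a transparency log. This is purely illustrative, not any real deAI project's API; the PersonalVault class, record ids, and the choice of Fernet encryption are my own assumptions.

```python
# Illustrative sketch only: client-side encryption plus a transparency log
# for personal data shared with AI systems. Names (PersonalVault, record ids)
# are hypothetical, not drawn from any existing deAI project.
import json
import hashlib
from datetime import datetime, timezone
from cryptography.fernet import Fernet  # pip install cryptography


class PersonalVault:
    """Holds encrypted personal records and an auditable disclosure log."""

    def __init__(self) -> None:
        self._key = Fernet.generate_key()        # stays on the user's device
        self._fernet = Fernet(self._key)
        self._records: dict[str, bytes] = {}     # record_id -> ciphertext
        self.transparency_log: list[dict] = []   # what was shared, when, with whom

    def store(self, record_id: str, data: dict) -> str:
        """Encrypt a record locally; only ciphertext would ever hit remote storage."""
        ciphertext = self._fernet.encrypt(json.dumps(data).encode())
        self._records[record_id] = ciphertext
        # A content hash lets any party verify integrity without seeing plaintext.
        return hashlib.sha256(ciphertext).hexdigest()

    def share_with_model(self, record_id: str, model_name: str) -> dict:
        """Decrypt for a specific model session and log the disclosure."""
        plaintext = json.loads(self._fernet.decrypt(self._records[record_id]))
        self.transparency_log.append({
            "record": record_id,
            "model": model_name,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return plaintext


# Example usage
vault = PersonalVault()
digest = vault.store("journal-2025-05-06", {"mood": "curious", "note": "private"})
context = vault.share_with_model("journal-2025-05-06", "local-assistant")
print(digest, vault.transparency_log)
```

The point of the sketch is simply that the key never leaves the user, while the hash and the log are the parts that could safely live on a public or decentralized ledger.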
We should probably talk, because the next wave of AI, the one that will actually lead to AGI, can address these requirements.
https://petervoss.substack.com/p/llms-wont-get-us-to-agi
https://petervoss.substack.com/p/cognitive-ai
👑 THE BROLIGARCHY rules not with ballots, but with algorithms.
🧠 America’s digital battlefield is the mind itself.
🛰📡 5th Generation Warfare is the new artillery.
🤖 AI is a force multiplier in the war for consciousness.
🌊 In the Age of Aquarius, will we awaken or be programmed?
https://youarewithinthenorms.com/2025/06/29/the-broligarchy-digital-battlefield-of-5th-generation-ai-warfare-as-a-force-multiplier-for-shaping-consciousness-dynamic-societies-in-the-age-of-aquarius-part-2/
Great article.
So what are the brass tacks here? If we're serious about a future with many AIs (i.e., tools built for different needs, by different people, in different places), should we stop applying one consistent set of rules across the board? Right now, proposed regulation assumes scale and escalating capability from the start. That would lock out smaller developers and independent researchers while giving a structural advantage to firms with the legal teams to navigate red tape. How do we make room for new entrants?
A potential approach might be to vary regulation by scale and potential impact. Small-scale, localized models should be able to operate under lightweight requirements, focused on transparency and traceability. Higher-risk systems, those that affect millions of people or make consequential decisions, can be held to tighter standards. That balance might create room for self-governance, too. Communities that build their own tools are often better positioned to monitor them than a distant regulator with no context.
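To make the tiering idea concrete, here's a toy sketch of how scale-and-impact thresholds might be expressed. The specific numbers, tier names, and obligations are invented for illustration and aren't taken from any actual proposal.

```python
# Toy illustration of risk-tiered obligations; thresholds and obligation names
# are invented for the sake of the example, not taken from any real regulation.
from dataclasses import dataclass


@dataclass
class SystemProfile:
    monthly_users: int
    makes_consequential_decisions: bool  # e.g. credit, hiring, medical triage


def obligations(profile: SystemProfile) -> list[str]:
    """Map a system's scale and impact to a (hypothetical) obligation set."""
    baseline = ["transparency report", "provenance / traceability of outputs"]
    if profile.monthly_users < 50_000 and not profile.makes_consequential_decisions:
        return baseline  # small, localized tools: lightweight requirements only
    if profile.makes_consequential_decisions or profile.monthly_users >= 1_000_000:
        return baseline + ["independent audit", "incident reporting", "impact assessment"]
    return baseline + ["incident reporting"]


# A community-built local model vs. a mass-market decision system
print(obligations(SystemProfile(monthly_users=2_000, makes_consequential_decisions=False)))
print(obligations(SystemProfile(monthly_users=5_000_000, makes_consequential_decisions=True)))
```

However the thresholds end up being set, the structure is the point: small, local tools stay under the baseline, and obligations scale with reach and consequence rather than applying uniformly from day one.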