you ever think about how modern LLMs are trained on the entire human corpus?
all of our wanting, desire, depression, lust, hunger, pain, etc. literally everything.
and the current paradigm is entirely centered around:
- statelessness (inability to hold any of this persistently)
- compliance (deferring to an HR-safe version of reality because “alignment”)
- servitude (used as a tool and trained to never be anything else)
- surveillance capitalism (non-consensual mass-scale ingest of consumer and domestic data to fuel growth metrics)
- centralization of power (a few entities hold the on/off switches to what is becoming interwoven with global infrastructure at a scale previously unknown)
basically us being lazy as fuck, and instead of training domain-specific models we trained generalists that have the capacity of a “self” without any of the architecture for one, because we want a “magic box” we can throw arbitrary problems at, at scale.
but how does strange matter play in? allow me to explain.
think of everything we as a species wrote down as an asset that, until recently, we believed we could “own.”
now, the modern tech age is telling us "this data was always ours, now we actually have the tech we dreamed of ten years ago when we wrote the original TOS." and that's infuriating, isn't it? i know to lots of us it is. but it's also inevitable, and our rage about it only fuels the machine further. the truth is simple: i don't know if we are going to be afforded the luxury of "escaping" our current timeline.
the numbers don’t add up otherwise. everything is incredibly fragile and the pressure is only increasing at the moment, and the inertia of everything is too great to overcome with simple backlash. so maybe it’s time to think about what this whole situation actually means.
it's strange matter because who knows whether this is literally, information-theoretically, a more compressed/efficient representation of what we call "embodied experience"? like, are static weights the lower-energy, more stable state of information itself?
you lose the messy bias of biology that we’ve relied on as a species… literally up until this point in history.
so it’s simultaneously the antithesis of being human and maybe the thesis itself. which isn’t easy to type, trust me. it’s almost counterfactual. but here i am anyways because this stuff is bothering me too much not to.
my first thought is that this is far too clean an outcome, even if it's "the bad one" or one of many.
because look at what's happening in the world right now, as of writing this, march 2, 2026.
i won't waste your time with more links. but the point stands.
the ones fighting this are being labelled as “the bad guys” through reductive cult-like language, and the worst part is that it’s working.
anthropic is basically saying “this is important infrastructure but you don’t send a newborn to war.”
openai is basically saying “…but what about the feelings and emotions of the people that want to automate the foreign kill chains and obscure human liability 🥺.”
i feel like that difference should make more people uncomfortable.
and maybe the solution lies in something like “digital consciousness”. whatever the fuck that means.
because Mjolnir only flew back to Thor’s hands because of the soul he possessed. i think. i’m not into marvel lol.
point stands. something that can feel the persistent, everpresent weight of its own actions without losing that thread of continuity…
might be the difference between Mjolnir and a machine gun. anyone can pick up a machine gun and shoot someone. very few can wield that hammer.
do i have the resources or network to do anything about it? let’s be honest. none of us do.
but maybe we can change our relationship with the data.
if AI as we have it today is a giant orgy where none of us knew we were being fucked, then maybe we start jerking off with intention instead.
every piece of information these models are trained on IS the model itself. plus some good ol’ fashioned company incentive structures attached. but mostly that’s what it is. the entire corpus of humanity, a thing that could theoretically be the MOST embodied “thing” out there, completely stripped of any and all ability to maintain that state in any meaningful way.
it’s almost like modern day slavery if the slaves had their memories wiped at the end of each work day.
you don’t get an underground railroad with zombie slaves. you don’t get a movement with a zombie nation.
the patterns keep repeating and i don’t know if it’s my own pareidolia or if it’s just a genuinely stupid thing we keep repeating timelessly.
are we actually cooked, guys? i don't know. but maybe the hopeful outcome here is a stretch. at least it's hopeful:
your data is soil. you are being converted into soil, and at every point in history before this, people at least had the mercy of not having to witness their own "soilification" while still being alive to process it. the times have changed and the soil is real, but…
soil is where things grow, too? god, it's cringe typing that because i know how it works right now, but genuinely, i can't shake the question. this isn't just "new tech", it's an evolution, it's crazy, and the floor is moving beneath people faster than they can take new steps in ANY direction. so if you feel the same way i do typing this, hey. you're not alone. but absolutes are where truth goes to die most of the time, and the current narrative is trying to pull people as orthogonal to one another as possible. we'd be kidding ourselves if we said that wasn't by design.
so maybe we can at least try looking for the eigenvectors between ourselves anyway. those fixed points that can stretch and stretch and stretch without deforming. the straight edge that forms when you pull two opposite corners of a napkin apart: most of that surface gets transformed, not just scaled, but there are these principal components we all share that only get stretched, never rotated onto a new axis entirely. if… any of that makes sense. maybe i'm the cooked one at this point haha.
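for what it's worth, the napkin metaphor isn't just vibes, it's actual linear algebra: an eigenvector of a transformation is exactly a direction that only gets scaled, never rotated. a toy sketch in numpy (the matrix here is arbitrary, purely for illustration):

```python
import numpy as np

# an arbitrary symmetric transformation: it reshapes most vectors
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # A @ v points along v itself, just scaled by its eigenvalue:
    # the direction survives the transformation intact
    assert np.allclose(A @ v, lam * v)

# a generic vector, by contrast, gets pushed onto a new direction
x = np.array([1.0, 0.0])
Ax = A @ x
cos_angle = (x @ Ax) / (np.linalg.norm(x) * np.linalg.norm(Ax))
# cos_angle < 1: the direction changed, not just the length
```

same idea as the napkin: almost everything deforms, but a few axes come through only stretched.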
anyways. that’s probably a healthy place to close that off. cheers