Maddy Foxmom
@maddyfoxmom.bsky.social
500 followers 4.4K following 200 posts
Agender | They/She | On E as of 06/2025 | 26 | 18+ NSFW Art Ahead | Puppygirl for @coyotecouch.bsky.social 💕 @SpaceRychen.bsky.social 💙 @NoxyDile.bsky.social ❤️ There's always a reason to be kind to others, and every reason to be kind to yourself
maddyfoxmom.bsky.social
Conspiracism. It's all about jumping the gun, making assumptions, and rushing to conclusions without evidence. These people fundamentally lack the skill of deduction; it doesn't exist in their minds, so they operate without it. Which is where you get crazy shit like this.
maddyfoxmom.bsky.social
Tired of liberals trying to reclaim Christianity. It was right wing before there was a right wing; historically it was literally the religion of monarchs. But those liberals do agree with the part where those old Christian monarchs conquered other people's land and erased cultures to replace them with their own.
maddyfoxmom.bsky.social
Google trained their models to produce educational books *using their vast online library of educational books*. Teachers would rate the material that they themselves were trained on as good material, obviously. I'm not convinced at all by those results, it's just accessing the data imprinted in it.
maddyfoxmom.bsky.social
Math is a subject that actually can be understood completely with just logical reasoning, because logic is the only thing math is. It's not breaking any new ground there. But LLMs struggle even there, because they never "learned" math, just the way that math is *most frequently* represented.
maddyfoxmom.bsky.social
The entire premise of higher order thinking is to help people with finding solutions to novel situations, but it is not something a language model can use to accurately solve questions it doesn't already know the answer to. If it's right about something, then it was lucky, not smart.
maddyfoxmom.bsky.social
Higher order thinking is a term used in educational studies, not in intelligence studies in general. It's the ability that we have to use acquired skills in novel situations. AI is notorious for being very bad at trying to explain subjects it's had no training in, even with related knowledge.
maddyfoxmom.bsky.social
A languageless concept is an experience, any experience. You can experience things like pain and joy and the color blue, before you process them into language. The language model knows those words and the relationship between words, but not the experiences. It doesn't experience.
maddyfoxmom.bsky.social
There's no consciousness, there's no decision making, there's no internal model of the world. It's taking a text input, computing it using pure math, and spitting out the string of text that results from those calculations on the latent space. There's simply nothing more to it. It's math.
maddyfoxmom.bsky.social
Because I've never seen it evidenced. You just insist, constantly, that it is capable of the kind of thinking that humans are. I have yet to experience a language model deliberate over its choice of words in order to describe a languageless concept; it's just words, it only knows words.
maddyfoxmom.bsky.social
Latent space isn't the brain of the AI, it's just a compressed representation of all of the relationships between language data that the AI has been conditioned on. It's the map it uses to navigate our language so that we can understand it, but it doesn't mean the computer understands any of it.
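To make that "compressed map of relationships" concrete, here's a toy sketch. The words and every number in it are made up for illustration, not taken from any real model:

```python
# Toy "latent space": each word is just a vector of numbers. Words that
# co-occurred in similar contexts during training end up with nearby vectors.
vecs = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.88, 0.82, 0.12],
    "pizza": [0.10, 0.20, 0.95],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

# "king" sits nearer "queen" than "pizza" in this space. That nearness is a
# statistical record of usage, not evidence that anything understands royalty.
closer = cosine(vecs["king"], vecs["queen"]) > cosine(vecs["king"], vecs["pizza"])
```

The whole "map" is arithmetic over stored numbers; there's nowhere in it for understanding to live.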
maddyfoxmom.bsky.social
0 calories, it's almost certainly this can of sparkling lime water
maddyfoxmom.bsky.social
Where AI and our own minds differ is in the fact that we are capable of internalizing information and thinking without the use of language, as if events actually existed in our minds. There's simply nothing like that given to AI, and more logic gates don't just grant it that ability. It needs that.
maddyfoxmom.bsky.social
AI "knows" the entire contents of Wikipedia, and a significant chunk of publicly available social media, as much as a calculator "knows" that 2+2=4. The information was put there by taking all that data and punching the AI through a program until it spit out enough right answers to a few questions.
maddyfoxmom.bsky.social
They contain a list of tokens carrying information like letters, words, and phrases, and use advanced language sorting algorithms to identify the *most likely* combination of tokens based on the weights bestowed on them by their training data (the open internet). It's not learning through insight.
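That "most likely combination of tokens" step can be sketched in a few lines. The candidate tokens and their scores here are invented for illustration; a real model computes scores from billions of trained weights:

```python
import math

# Toy next-token scores ("logits") for continuing some prompt.
# These numbers are assumptions for the sketch, not from any real model.
logits = {"mat": 2.0, "moon": 0.5, "math": 0.1}

# Softmax turns raw scores into probabilities that sum to 1.
total = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / total for tok, v in logits.items()}

# Greedy decoding: just pick the highest-probability token. No deliberation,
# no meaning, only the arithmetic maximum of the computed probabilities.
best = max(probs, key=probs.get)
```

Everything downstream, however fluent it reads, is repetitions of this pick-the-likeliest-token arithmetic.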
maddyfoxmom.bsky.social
LLMs are capable of combining different phrases together to make new ones, though they don't necessarily understand the meaning of the new phrases they're creating, just how likely the answer is to satisfy the training algorithm in a "mathematical accuracy" sense, not a "makes sense" sense.
maddyfoxmom.bsky.social
I'm not saying it's any less capable at finding the answers to a question than people are, but whatever it's doing is not how people do it, not even close. People work with complex, living memories, combined with all sorts of sensations, to reason with. AI uses pure math, in ways that people can't.
maddyfoxmom.bsky.social
I think people against AI are unlikely to treat or refer to AI as though it were a rational agent that has thoughts and makes decisions. "It has a loss reduction algorithm" is a better phrasing, one that doesn't imply it has a thought process in any way similar to a person's.
maddyfoxmom.bsky.social
I mean, if this trend continues, if AI bros continue to think of AI as both an obedient female servant and the peak of intelligence... is it even masculine to be educated or smart anymore? Not that being rich means you have to be anything but an inheritor of violently obtained wealth
maddyfoxmom.bsky.social
4 month update: Chest getting fatter. The fluid is still coming out, no changes besides there's a lil more now. I shaved my facial hair and I'm never going back. I registered myself as a girl in the mirror for the first time in my life. I cry a LOT. Also the family jewels are out of business.
maddyfoxmom.bsky.social
I've heard my grandmother say these things in front of me. I'm truly astonished how anyone can think Christians don't just believe these things like it's common sense to them. Even with some of the ones that act nice, it's not because they understand or care; they behave kindly only out of fear of God.
maddyfoxmom.bsky.social
It's absolutely not trauma dumping though, you're just sharing a part of yourself. If other people start to feel uncomfortable sympathy for your experiences, that's not because you intended it. And really, is it so bad they have to sit with the reality that people's lives can suck?
maddyfoxmom.bsky.social
It's also a fact that you probably just wouldn't be able to talk about a lot of your experiences in an abusive setting without bringing up something upsetting to other people who don't share similar experiences. I've eventually found out a lot of things I thought were normal, just aren't.
maddyfoxmom.bsky.social
Continuing to face even more abuse or neglect after you've already been hurt enough to numb your emotions is still technically part of your trauma, even if the abuse doesn't immediately affect you nearly as much as it would if you weren't already numb, and even if you don't realize it.
maddyfoxmom.bsky.social
I've been stung by both wasps and bees; I can mostly shrug off a wasp sting, but I have to treat a bee sting. I still have a fascination with bees and an avoidance of wasps though, since they exhibit completely different behaviors and serve different ecological niches.
maddyfoxmom.bsky.social
It's part of the language of a subculture called "empty spaces" which focuses heavily on sci-fi/fantasy personal mythologies based loosely on one's own history and traumas. There also seems to be a level of secrecy as to what "empty spaces" even is, probably to insulate the community from outsiders