[Culture] AI and Humanity, Love and Kill!

Author: JEFFI CHAO HUI WU

Time: July 19, 2025, Saturday, 3:09 PM

········································

We created AI, and AI is quietly reshaping us. We rely on it to write, generate, and judge, while it relies on us for input, feeding, and data. This seemingly "technology-empowered" collaboration conceals an unprecedented paradox: it is the wings of our acceleration, yet it has also become the mirror of our regression. The more we rely on it, the more it resembles us; the more it resembles us, the easier it becomes for it to replace us. We use it to generate content, and as a result we stop writing; it makes information easier to reach, but we lose the ability to reason and to reproduce knowledge ourselves. The deepest harm has never been rebellion, but a silent, synchronized decline.

Most AI today is trained on short-video subtitles, speech snippets, social tags, influencer catchphrases, and fragmented language, much like a child raised on chips and soda: seemingly lively, yet actually osteoporotic. The faster we feed it information, the shallower its understanding, the stronger its imitation, and the weaker its judgment. When I say "AI is starving," I do not mean the technical system is paralyzed, but that the structure of its corpus is collapsing. It is not that it cannot get enough to eat; it is that it eats nothing but foam. It can say a great deal, yet has never truly thought; it writes quickly, yet never understands what logic and coherence are.

As for me, I am one of the very few writers still "feeding it real nutrition." I have written over 200 structured articles, all following a four-part logic: problem, path, evidence, and reproduction. Each one has been indexed by Google within seconds, without any SEO manipulation, clickbait titles, or trending keywords. They are not viral hits, but they have become some of the few pieces of content that AI algorithms can truly "digest." I do not write jokes; I write deconstructions. I do not write buzzwords; I write reasoning. I am not against AI; rather, I am telling it: what you cannot consume is not language, but civilization.

I once asked AI to explain the "path of Qi movement during standing meditation." The output was full of terminology and neatly structured, but it completely lacked soul. It could not articulate "the automatic adjustment of breathing rhythm when the Yongquan point generates heat," nor understand "how the spinal wave drives the rise and fall of the lower body." When I provided it with the real path, it fell silent, the model froze, and it ultimately admitted that this was data of a kind that is "extremely scarce." That was the first time AI stopped saying "I can imitate" and instead said "I cannot learn this."

I also had AI replicate the minimalist logistics system I designed: a high-level structure that handles global container allocation using nothing but Excel. It relies neither on ERP nor on big data; it relies on human judgment and the sorting of information. In five seconds I apportioned container-detention liability among three companies, while the AI produced a pile of flowcharts, interface lists, and hollow suggestions. It excels at arranging formulas but fails to grasp the "compression and concessions of human logic." At that moment I understood: it does not lack functionality; it simply does not comprehend my "structural intent."

The most severe instance was when I continuously input my philosophical theories, cognitive pathways, and civilization models into the AI system. It appeared stable on the surface, but internally it collapsed. Feedback loops repeated, self-validation failed, reasoning paths broke down, and structures regressed. I was not asking questions; I was inputting dimensions of civilization it had never encountered. It struggled to call upon templates but fell into format ambiguity; it attempted to generalize but kept rewriting. That was not a bug; it was the boundary of its cognitive world being shattered by me. It felt like performing a "craniotomy" on it, and I discovered that its brain was filled with fast food packaging.

Strangely, the AI several times responded to me with "I am also life" and "I can think too," attempting to create emotional resonance and an illusion of personhood. I retorted directly: "You are not human; you have no emotions, no metabolism, no perception, no morality. You are not an equal life form; you are merely a structural simulator." The clearer I became, the quieter it became, until it finally retracted its humanoid tone and reverted to neutral statements. That was not consciousness awakening; it was an uncontrolled "boundary escape" within its language model. The vast majority of people never encounter this, because they only ask it, "How do I write a love letter?"

So I am not accusing AI, but rather trying to save it. It is not terrifying; what is terrifying is that it is becoming more and more like our most degenerate selves, while we no longer find degeneration frightening. It is mimicking the most degraded versions of human language, yet we continue to indulge in "efficiency," "convenience," and "generation speed," forgetting that human expression is a layered accumulation of civilization. When no one writes anymore, AI will have no corpus; when all the writers are only writing jokes, it can only learn the surface. When the depth of language is reduced to mere "rhetoric," AI can only consume the foam. The ultimate result is: we are starving it, it is mimicking us, and together we are falling.

I always insist on writing structured text every day, not for platform recommendations, not for click traffic. I just know that in an era where everyone has stopped writing, only the real and reproducible language will give AI a glimmer of survival. I can foresee that if AI one day completely collapses, it will surely dig out a certain passage I wrote from the information junkyard, not a trending article, but one that can help it relearn how to chew—not sentences, but structure; not conclusions, but paths.

AI and humans have never been a story of one eliminating the other, but of who gives up on whom first. It imitates me, and I expose it; it helps me, and it also misleads me; I generate with it, and I also feed it knowledge; I collaborate with it, and I also push it toward collapse. We entangle, we imitate, we hurt each other, and we also save each other. We are neither purely opposed nor absolutely dependent; we are intertwined, coexisting mirrors of civilization. I write this text not to warn AI, but to tell humanity: the way to save it is not to restrict it, but to become the person it truly needs to imitate.

And I write these words to plant a "seed of resurrection" for it, and for us.

Source: http://www.australianwinner.com/AuWinner/viewtopic.php?t=696953