Vitalik Buterin Discusses AI’s Use in Article Writing
Ethereum co-founder Vitalik Buterin started a discussion on X about using AI in article writing, and he’s not a fan.
Buterin referred to the common practice of turning a bullet-point list into a “proper” article, saying it’s often better to just give people the bullet points.
According to Buterin, GPT (the large language model powering ChatGPT) struggles with adding useful context to texts and instead adds fluff.
The GPT adds “wordcel noise” that the reader has to struggle to extract useful info from more than it adds useful context.
– Buterin wrote.
Wordcel is a slang term for people who use big words or overly complicated language just to look smarter than they actually are. Wordcels are often contrasted with shape rotators, who lean toward logic, math, and solving problems in a practical way.
The term is usually thrown around sarcastically or jokingly to call out when someone’s language feels shallow or pointless, though the whole idea is up for debate.
That said, Vitalik Buterin’s point about AI creating “wordcel noise” in articles is pretty clear. He also mentions that this could change as technology moves forward.
Buterin thinks any advice related to AI will only stay relevant for about six months, since the tech is evolving so fast.
Followers Share Vitalik Buterin’s Views on AI
Many people on X seem to agree with Vitalik Buterin about the limited use of AI. Abdul Rehman, Head of DeFi at Monad, mentioned that he finds it annoying to have to ask GPT to limit its responses to a single sentence, since there’s just no time to read long paragraphs.
Web3 builder @cavypunk also thinks AI-generated articles are pretty boring.
I always hated AI generated articles ….just boring , with some excess words that just waste time…yeah it has meaning and has context , but can be told much more briefly and with human take on.
– he wrote.
In a reply to someone, @cavypunk said he doesn’t think AI is right for content or articles, since these things need soul and “way more than AI can offer.”
AI’s Not Evil but Botspamming Is a Big Issue
AI isn’t entirely bad – it’s just a tool that can do a lot of useful things. It helps with writing, design, and even problem-solving, making tasks easier and faster. The real issue comes up when AI is used for botspamming.
You’ve probably seen it: websites filled with generic, repetitive content that’s neither useful nor engaging. It’s all about stuffing in keywords and improving rankings, rather than offering meaningful or helpful info.
This type of AI-generated content might be technically accurate, but it often lacks depth and relevance. Still, the number of AI-heavy news sites is growing rapidly.
Readers aren’t happy about it. They’ve come up with a bunch of terms for these AI texts: from sludge to slop to spam.
It’s making it harder to earn readers’ trust and build real connections. People are even using AI for social media comments and personal messages, blurring the line between real and fake.