When we first created FalseSolutions.org, it wasn’t out of curiosity, it was out of frustration.
Everywhere we looked, the truth was being smothered under layers of spin, marketing, and deliberate deception. Corporations sold “green” products made in sweatshops. Politicians promised “clean” fossil fuels. Tech billionaires pledged to “save the planet” with machines that quietly burned more energy than they claimed to save.
It was the constant lying, the greenwashing, the misinformation, and the corruption of science itself that pushed us to act.
FalseSolutions.org began as a way to expose these patterns. To track the lies disguised as progress. To show how money and power twist language until words like sustainability, innovation, or carbon-neutral lose all meaning.
Over time, we discovered something deeper. The problem isn’t just the liars. It’s the asymmetry of truth itself. Lies are easy; they fit neatly into headlines and hashtags. Truth is slow, detailed, and inconvenient. That imbalance has a name: the Bullshit Asymmetry Principle, also known as Brandolini’s law, which says it takes an order of magnitude more energy to refute bullshit than to produce it.
But what happens when machines start producing that bullshit for us?
When artificial intelligence, trained on the noise of the internet, becomes the new megaphone for misinformation, while the scientists, journalists, and advocates trying to correct it are buried under its weight?
We’re entering an age where deception can be automated and amplified, where the line between marketing and manipulation blurs. Yet the same technology that spreads misinformation can also help us stop it. Like any tool, AI is not inherently good or bad. It’s powerful, and how we use it will determine whether it becomes part of the problem—or part of a better solution.
Psychologists have shown that our brains are wired to believe new information, especially when it is simple, emotional, or confirms what we already think. When something feels right, our minds treat it as truth. That’s not stupidity, it’s survival. In the ancient world, quick decisions saved lives. But in today’s world, those shortcuts make us vulnerable to manipulation.
Researchers at the University of Waterloo found that people who are skilled at producing nonsense are also more likely to believe it themselves. “Those who BS, fall for BS,” they concluded [1].
The more we see a false claim repeated, the more credible it feels. Psychologists call this the illusory truth effect. It explains why slogans like “clean coal” or “carbon-neutral oil” no longer raise eyebrows.
Cognitive overload makes things worse. The constant flood of information forces us to rely on mental shortcuts. We trust what’s familiar. We side with our tribe. We react emotionally, not rationally.
When social media rewards outrage, the loudest lies rise to the top.
As Australia’s Chief Scientist Tony Haymet warns, “Our collective wellbeing depends on our capacity to distinguish credible science from persuasive fiction.” That capacity, however, is being eroded in real time [2].
Modern language models like ChatGPT don’t “know” anything. They predict the next word based on patterns in massive datasets that mix truth, rumor, and propaganda. As MIT’s Rodney Brooks said, “It just makes up stuff that sounds good.” AI isn’t logical, it’s linguistic. It produces convincing text without checking whether it’s true.
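To see why “predicting the next word” has nothing to do with truth, here is a toy sketch (a simple bigram model, far cruder than any real language model, with a made-up corpus) showing that the most repeated phrase wins the prediction regardless of accuracy:

```python
from collections import Counter, defaultdict

# Toy training data: one factual statement, one marketing slogan
# repeated three times. Purely illustrative, not real model training.
corpus = (
    "coal is polluting . clean coal is progress . "
    "clean coal is progress . clean coal is progress ."
).split()

# Count which word follows which (a bigram model, the simplest
# version of "predict the next word from patterns in the data").
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    # Return the statistically most frequent continuation,
    # whichever phrase appeared most often, true or not.
    return following[word].most_common(1)[0][0]

print(most_likely_next("is"))  # "progress": repetition beats truth
```

The model picks “progress” over “polluting” simply because the slogan outnumbers the fact in its training data. Scale that up to the whole internet and you have the core problem.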
That’s not an accident. Philosopher Harry Frankfurt wrote in On Bullshit that the bullshitter doesn’t reject truth like a liar does, he simply ignores it. That’s exactly what happens when content is generated without verification [4].
Alan Blackwell, a professor at Cambridge, put it bluntly: “AI literally produces bullshit” [3]. His argument builds on the “stochastic parrots” paper by linguists Emily Bender and Timnit Gebru, which warned that large language models imitate human expression without understanding meaning [5]. The result is text that feels true, even when it’s entirely fabricated.
Yet the point is not that AI is dangerous—it’s that AI reflects us. When trained on polluted information, it amplifies that pollution. When trained on evidence and transparency, it can help detect misinformation faster than humans ever could.
Used responsibly, AI can become an ally in exposing false solutions, detecting greenwashing, and tracing disinformation networks. The challenge is not to ban the tool, but to guide it toward truth.
If AI is the engine, greenwashing is the fuel.
Greenwashing isn’t just public relations, it’s a powerful form of false solution. It gives the illusion of progress while protecting the status quo.
The United Nations defines greenwashing as “misleading the public to make a company appear more environmentally friendly than it really is.” It’s not just a PR issue, it’s a policy issue. Greenwashing distorts markets, weakens trust, and delays climate action [8].
A 2025 Nature Water study found that water companies in England used nearly every trick in the greenwashing playbook to hide pollution, from cherry-picking data to renaming sewage as “treated effluent” [7].
In the United States, fossil fuel companies spend millions rebranding natural gas as “clean energy,” even though methane leaks can make it as bad for the climate as coal.
Fashion brands use the same tactics. H&M and Zara market “eco” lines made from recycled plastic, yet most of those garments end up in landfills or are dumped in developing countries, poisoning local ecosystems. Car companies promote “net-zero” models while lobbying against stricter emissions standards. Airlines sell “carbon offset” programs based on phantom forests and unverifiable credits.
As anthropologist David Graeber once said about “bullshit jobs,” these campaigns exist not to solve problems, but to justify them [6]. Greenwashing creates the illusion of progress so real progress can be postponed.
The TerraChoice “Seven Sins of Greenwashing” are now corporate standard operating procedure: hidden trade-offs, vague language, irrelevant claims, and false labels.
Regulators often look the other way. In Dwyer v. Allbirds, a false “sustainability” claim was dismissed as “puffery,” that is, marketing language considered too vague to regulate [10].
The law protects the liar, not the public.
Once misinformation enters the system, it doesn’t just circulate, it multiplies.
AI-generated marketing, social media algorithms, and corporate PR now reinforce one another. Each piece of misleading content feeds the next, creating what researchers call the bullshit feedback loop.
This loop turns marketing into common sense. That’s how we got terms like clean coal, renewable natural gas, and low-carbon hydrogen. Each is a false solution, designed to sound like innovation while keeping fossil fuels alive.
The biologist Carl Bergstrom, coauthor of Calling Bullshit: The Art of Skepticism in a Data-Driven World, calls this the “pollution of the information ecosystem” [11].
When truth becomes optional, power fills the void.
The consequences are not abstract. They are measured in asthma rates, poisoned rivers, and lost time.
Communities near refineries, ports, and chemical plants live with the results of corporate misinformation. Every “safe” level of emissions that turned out to be deadly. Every “temporary” permit that became permanent. Every “pilot project” that turned into a long-term disaster.
Misinformation kills. It kills when regulators believe an oil company’s self-reported data. It kills when families live beside toxic waste labeled as “remediation.” It kills when governments spend billions on false solutions like “blue hydrogen” instead of better solutions such as distributed solar, electrification, and storage.
It also kills democracy. If citizens cannot tell fact from fiction, power is accountable to no one.
That’s the real danger of the Bullshit Principle in the 21st century. Lies don’t just spread faster, they have institutional backing, algorithmic amplification, and now artificial intelligence on their side.
1. Strengthen public BS detectors.
Education must go beyond memorizing facts. It should teach how to question, verify, and detect manipulation. Ask: Who benefits if I believe this? What is the evidence? Is it peer-reviewed? The University of Washington’s “Calling Bullshit” course is a great model.
2. Reclaim science as a public good.
Trusting science doesn’t mean blind faith in experts. It means trusting the process: test, revise, repeat. Peer review, open data, and replication are our best shields against spin.
3. Demand transparency and accountability.
Every claim labeled “sustainable” or “net-zero” should come with proof, not PR. Governments must require verifiable metrics and independent audits. Greenwashing penalties must hurt enough to matter.
4. Use technology to expose, not amplify, lies.
AI can detect greenwashing, trace misinformation, and identify coordinated disinformation networks. Technology itself is not evil, but without transparency it becomes another false solution.
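Even without machine learning, the first step of greenwashing detection can be automated. This is a minimal sketch, assuming a hypothetical watch-list of vague marketing terms (the “vague language” sin from the TerraChoice playbook); real tools would pair pattern matching with verified emissions data and human review:

```python
import re

# Hypothetical watch-list of vague, unverifiable marketing terms.
# Illustrative only; a real detector would use a vetted taxonomy.
VAGUE_TERMS = [
    "eco-friendly", "carbon-neutral", "clean coal",
    "net-zero", "sustainable",
]

def flag_vague_claims(text: str) -> list[str]:
    """Return the watch-list terms found in a piece of marketing copy."""
    found = []
    for term in VAGUE_TERMS:
        # Word boundaries avoid matching inside longer words.
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            found.append(term)
    return found

ad = "Our carbon-neutral, eco-friendly fuel powers a sustainable future."
print(flag_vague_claims(ad))  # ['eco-friendly', 'carbon-neutral', 'sustainable']
```

A flagged term is not proof of deception; it is a prompt to demand the evidence behind the claim, which is exactly the shift from amplifying lies to exposing them.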
5. Support watchdogs and independent journalism.
Most truth-telling happens locally, by reporters and community groups tracking pollution and fraud. Funding them isn’t charity, it’s democracy maintenance.
6. Elevate the voices closest to harm.
Frontline communities are the first to see through false solutions. They can tell the difference between a “green corridor” and a gas pipeline. Listening to them is not optional, it’s essential.
7. Tell better stories.
Facts inform, stories move. Greenwashing wins because it offers comfort. Truth must offer courage. To counter false solutions, we must make real ones more compelling.
The Bullshit Principle explains why truth is harder, not why it is hopeless.
Refuting lies takes time, energy, and stamina. It requires institutions that protect facts, not profits. It demands citizens who are skeptical, not cynical. And it needs watchdogs who never get tired of asking questions.
The challenge before us isn’t just environmental or technological. It’s moral. Do we want a future defined by evidence, or one dictated by algorithms and advertisers? Do we measure progress in tons of carbon reduced, or in terabytes of bullshit produced?
At FalseSolutions.org, we believe truth can still win, not because it’s easy, but because it’s necessary.
Every time we expose a false solution, every time we connect a press release to a polluted river, every time we refuse to stay silent, the balance shifts, even if only by a few degrees.
Lies may travel faster, but truth lasts longer. And in the end, that’s what gives it power.