One weekend, Johanna attended her very first Edgy Dangerous Iconoclastic Truthteller (EDIT) conference. These sorts of conferences had proliferated in recent years, and each one had its own distinct culture. This one, thankfully, was somewhat left-of-center and quite gender-affirming—in fact, it had a large number of trans participants.
At the same time, many of the most prominent bloggers and writers attending this conference had made statements, at some point or another, that "trans rights activists" had gone too far. That by pushing certain beliefs that were way beyond the mainstream—trans girls' participation in sports, for instance—these trans rights activists had invited a backlash against trans rights that would regrettably end up hurting all trans people. These writers and bloggers certainly did not support the bill—just passed by the House—that would gut trans healthcare for adults, but they felt that this bill was a consequence of the wildly out-of-touch behavior by a certain segment of the trans left.
Fair enough.
The attendees at this conference were very interested in technology. In fact, virtually all of the discussion at this conference was about AI, and the ways it would shortly transform every aspect of American society, including medicine and healthcare.
However, many people at this conference were highly reluctant to discuss the fact that the current President, Donald Trump, was gutting America's scientific establishment by: a) withholding funds from large research universities that refused to let him pick whom they should hire and admit; and b) causing chaos at the NIH and NSF by canceling grants, pushing pseudoscience, upending scientific review, and unilaterally reordering long-held research priorities.
Generally speaking, Johanna didn't like to bring up the current government in casual conversation, because the topic made everyone feel so helpless and upset. But she felt surely no topic was off-limits at an EDITcon. Nonetheless, she found a lot of resistance to talking about this government.
One man said, "Oh but the NIH budget still has to be approved by Congress."
"It's not about the budget," Johanna said. "This president has claimed the power to refuse to spend money allocated by Congress. That means he can withhold money from anyone he wants, destroying their research program. And that is indeed what's happening."
But mostly Johanna's attempts to talk about this government were met with silence—a refusal to engage. This conference was for networking and fun, and discussing this government wasn't that fun. In any case, many people also felt that superintelligent AI was right around the corner, and it would either cure every disease or it would be so malevolent that disease would be the least of our concerns.
That too was fair enough.
The majority of people at this conference worked in the tech industry. Many of them had made fortunes in AI, crypto, or other tech ventures. The conference was in a former hotel in Berkeley, now privately redeveloped as a venue for various EDITcons. This venue had recently pleaded money troubles, and as a result had raised millions of dollars from various donors, many of whom probably owed their own prosperity to the tech industry.
At a second-order level, many of the ventures in this ecosystem were funded by various tech billionaires. At some point, someone mentioned that they were looking for funding from Marc Andreessen, and Johanna said, "Oh, but that guy's finished."
"What do you mean?" said this man, whom we'll call Ruben.
Johanna said, "I just mean...he supported Trump. He's done. Eventually the Democrats will come into power, and they'll take his money away."
"But...how would that be legal?" Ruben said.
"What's legal?" Johanna said. "Look at what's happening to Harvard. That's not legal. There's no law anymore. There's just pure power. Eventually, the Republicans will lose an election, and the Democrats will come back, and at that point, someone will have to pay, and it'll be these tech oligarchs."
"But...why does someone need to pay? It's not their fault...it's the American people who voted for Trump."
"Yeah, but you can't punish the American people. You need their votes to win! So you find someone else to blame. And look at what happened. These guys—Marc Andreessen, Ben Horowitz, Elon Musk—they got upset that the Democrats were trying to regulate them. They didn't like Lina Khan as head of the FTC. So they put their power behind Donald Trump. They picked a side."
"But it’s just politics as usual. They’re rich people who chose their own self-interest."
"No because...Lina Khan would've tried to break up their companies, but it would've happened within some kind of legal framework that they could've fought in court and perhaps won. Now...there's no law. The next Democrat will have all the powers that Trump has. And he'll use them to go after the oligarchs. That's what happens when you play power politics. This is exactly what oligarchs have been learning across the world. They think they’re constrained by rule of law, but actually their power relies upon rule of law. Without the protection of the law, their economic power is no match for real political power."
"Okay...but I would hope the Democrats would live and let live. If they stand for rule of law then there should be a principle that..."
This man spoke for some time about fair play in politics and not taking revenge.
"Yeah, but this goes beyond politics," Johanna said. "What Trump is doing is not normal politics. That means someone has to pay. They have to learn that you can't play this way. This can't be the system, where you put someone into office who wrecks everything, and when it’s over, you walk away rich. Someone has to pay. The American people will be out for blood."
"Out for blood? What does that mean?"
"Take away their money. Take away their companies. Take it all away."
Over the course of the weekend, Johanna developed this argument with several people. Many of them tried to argue that the tech industry wasn't really right-wing. They said actually tech generally skewed left. Certainly this conference skewed left.
And Johanna agreed with that assessment. At the same time, the reality was that the richest man in the world had bought a social network, turned it into a vehicle for pro-Trump rhetoric, and then gone to Washington and rampaged through the government, cutting valuable programs and services. It didn't look good. The optics were that tech had chosen Trump, and now they'd have to pay.
Ultimately, many of her interlocutors ended up agreeing with her. They hoped it wouldn't turn out that way, but they certainly thought it was a possibility.
At this same conference, there was a swirl of beliefs about AI. Some people had “short timelines”—they believed that AI superintelligence was coming within the next decade or two. Other people believed that AI superintelligence wouldn't happen within our lifetime, but that in any case we were about to see a lot of change in our society as a result of AI.
Most people thought AI ought to be regulated by somebody, but they disagreed about who that somebody should be and what form the regulation should take.
When Johanna asked, "Can't we just unplug the data centers?", most people felt like this somehow couldn't happen. That the potential represented by AI was so powerful that somebody would inevitably use it, and if the US didn’t use it then China would use it instead.
What's interesting is that there was also another group—albeit one poorly represented at this particular conference—that believed technological progress was actually a fitful, rare occurrence. This group believed that the past 500 years of Western Civilization represented an extremely rare break from the general thrust of human history, and that generally speaking human society was not conducive to technological development.
The existence of this “technological progress is rare” group implied one view about how AI could be stopped. After all, generally speaking, human societies have not developed AI. That’s because AI requires a lot of electricity, water, knowledge and infrastructure—it requires not just a technological society, but a society where people feel comfortable investing vast amounts of capital into a highly speculative endeavor. Most societies are not like this.
In any case, by 2025 the government regulatory apparatus had become completely dysfunctional—large companies could just do whatever they wanted—so any discussion of government regulation of AI was basically moot.
But...about four years later, a Democrat was elected. And this Democrat went after not just the AI companies, but all of the tech oligarchs, insisting that they break up their conglomerates and divest control of their businesses. When they refused, this President canceled all government contracts to these firms. This President canceled the visas of all workers employed by these firms. This President nationalized all patents held by these firms. He used various systems—originally developed to freeze the assets of terrorists—to seize control of the accounts of these firms, to prevent them from gaining credit, to basically make it impossible to function within the financial system.
Some of these firms fought it out in the courts and won, while others capitulated. But the end result was that a lot of capital fled the tech industry. Rich people knew that if they wanted to keep their wealth, they shouldn't put it into tech.
And where did that capital go? In this current world, what did generate a return? Well, crony capitalism. You put a lot of money into assembling a winning coalition to gain office, then you loot the government for as long as possible. You don't build, you don't invest, because you know eventually you'll be out of office and you'll only get to keep whatever you've hidden offshore.
Thankfully, this cycle only continued for a decade or two, until a real strongman seized power. He called himself a President, obviously, but this man was so gentle and so capable (and, most importantly, so young) that when he called for term limits to be abolished, they were, because people wanted this man to be in office for as long as possible.
This man was a tyrant, and he ruled like a tyrant. His sole aim was to retain control of the government. And that meant nobody else was allowed to gain enough power to challenge him. This was terrible for oligarchs, who feared that if they accumulated enough wealth, it would be seized and taken away. But for everyone else, it was fine. This man didn't fear poets and writers, because writers generally spoke the truth, which was that this man's rule was much preferable to the chaos that'd come before.
This man attempted to rebuild America's roads and its infrastructure, to rejuvenate its universities. But this was an Imperial system, and all life in this system was oriented towards the center. Anyone who accumulated any power in the provinces would eventually either be summoned to the capital or otherwise cut down to size. So although he longed to restore America's greatness, he was also chronically suspicious of anyone with an independent power-base.
So long as someone depended on Imperial largesse, the Emperor didn't fear them. But once they started to gain an independent reputation, the Emperor would get worried that they might unseat him. And he felt himself, correctly, to be the only thing keeping America from anarchy.
He died. Others succeeded him. They also tried and failed to maintain America's position on the world stage. Eventually this polity called America vanished, but the land didn't vanish, the people didn't vanish. They simply adopted other belief systems and came to conceptualize themselves in different ways.
And as for AI?
Well, it was a human creation: it required energy, water, expertise, and an enabling legal environment. Its creators conceptualized it as somehow surpassing humanity, but in reality it was a highly fragile machine, one that was dependent on the strength of the underlying society. And once that strength began to wane, AI was the first thing to break and disappear.
Future generations couldn’t even conceive of how this machine had been made, and even now, when folks analyze the writings from those far-away times, there is some considerable debate about whether the people of America genuinely believed AI was a possibility, or if it was perhaps only ever a metaphor or some symbolic goal not meant to be taken literally.
P.S. The two previous Johanna tales can be found here and here.
Disagree. The Republicans currently think they're doing payback for a decade of humiliation and norm-violation, but George Soros, Hollywood, and the NYT still exist and are still doing fine. Harvard is having a tough time but is far from collapsing; nobody seriously thinks "I shouldn't apply to Harvard because it might not be prestigious and important in four years". This is because, even if Trump is 10x more norm-violating than previous administrations, there's still quite a lot of rule of law left, and these organizations are hard to attack. If the Democrats decide to do their own payback for four years of humiliation and norm-violation, I doubt they will be any more ruthless or have any better luck.
Also, I think future Dems will be in-touch-with-reality enough (i.e., more so than the Trump administration) that, confronted with the possibility that kneecapping the tech industry would hurt the economy / America's geopolitical position, enough of them will back down to fatally weaken the coalition in favor.
I predict they pass a couple of laws that make life a bit harder for some tech companies, but not much more so than Lina Khan did last time around, and not in a way the average person on the ground has to worry about much. Musk might be vulnerable because SpaceX gets government contracts, but it's not like he has great competitors who the government can switch to.
I don't think it's going to go like this, but it's a compelling enough vision that I'm saving it and scheduling at least one reread 16 months from now, and maybe more down the line.