The next technological revolution is just around the corner, with artificial intelligence front and center in public debate. It wasn’t long ago that we saw another powerful innovation reshape our daily lives: social media. Nearly everyone in the United States — and billions worldwide — has heard of Facebook, Instagram, Snapchat, TikTok, and the list goes on.
In its early days, social media was pitched as lighthearted and connective. It was a way to keep up with family and friends, play games like FarmVille, and share snapshots of daily life (or at least the curated version people wanted you to see). But as the platforms matured, the picture changed. What started as fun became a vehicle for massive data collection, algorithm-driven manipulation, and enormous profits for a handful of tech giants.
One area hit especially hard has been elections. Social media has become a channel for misinformation, targeted political ads, and coordinated influence campaigns — with consequences that are still unfolding around the world. Cambridge Analytica, a political consulting firm, secretly harvested data from tens of millions of Facebook users through a personality-quiz app and then used that information to build voter profiles for targeted political advertising. The revelations raised alarms about how easily personal data could be exploited, how little oversight existed over these practices, and how social media platforms could be weaponized to influence democratic elections.
Years after the harm was done, the FTC finally charged Cambridge Analytica with harvesting data through deceptive practices. Part of the delay came from the difficulty of proving how the data was collected and used, combined with the fact that U.S. privacy laws were — and still are — far weaker than those in places like Europe. Regulators ended up relying on existing consumer protection authority rather than any clear statute written with social media in mind.
Moving to the present, it feels like we’re in the same boat all over again. Artificial intelligence is arriving in many forms — chatbots, image generators, video makers, even AI-powered podcast hosts — and the pace of change is dizzying. Yet, just as with social media, the United States has failed to put comprehensive regulations in place to guide its development and use.
The Biden administration issued an executive order in 2023 that outlined principles like testing requirements, watermarking, and privacy protections. But these were guidelines, not binding regulations. Congress has held hearings and introduced discussion drafts, but meaningful legislation has yet to move forward — leaving states to fill the void with their own rules. Meanwhile, companies such as Microsoft, Google, and OpenAI formed the “Frontier Model Forum,” a coalition that promises to set voluntary safety standards. The problem is obvious: when companies write the rules for themselves, they can just as easily rewrite them. And if the Cambridge Analytica scandal taught us anything, it’s that leaving oversight to the platforms is rarely enough.
As AI becomes a larger part of the political landscape, it carries the risk of shaping elections in many of the same ways social media once did. Generative tools could be used to spread false or outdated voting information, impersonate candidates, or flood feeds with misleading content. Without clear safeguards, AI could become yet another challenge to ensuring free and fair elections in the United States.
History has shown that when innovation moves faster than oversight, the public often bears the cost. With AI, the stakes are even higher — more powerful tools, faster adoption, and the potential to influence everything from news consumption to elections. We can’t afford to leave the rules to the platforms themselves again. Staying informed, questioning AI outputs, and urging lawmakers to create meaningful safeguards are steps each of us can take. How confident are you that the U.S. will implement effective AI regulation before major harm occurs? Vote in the poll and share your thoughts — your perspective matters.