A recently leaked draft executive order from the White House has alarmed legal experts and policymakers with its proposal to preempt state laws concerning artificial intelligence (AI) and transfer regulatory authority to the federal government. The draft, which had been expected to be signed, included provisions that would significantly empower David Sacks, a tech venture capitalist serving as Special Advisor for AI and Crypto, raising concerns about the consolidation of power over AI policy.

The draft directed various cabinet secretaries and agency heads to produce, on a tight timeline, reports and guidance on punitive measures against states that implement their own AI regulations. Notably, the Attorney General was instructed to establish a legal task force to initiate lawsuits against those states. Critics argued that this approach could undermine state sovereignty and stifle local governance and innovation in AI regulation, particularly since the order also included provisions for withdrawing federal funding from non-compliant states.

The political response has been swift, with opposition emerging from both Democratic and some Republican lawmakers who view the proposed order as a threat to state autonomy. Additionally, the draft notably excluded key agencies involved in AI policy development, such as the National Institute of Standards and Technology (NIST) and the Office of Science and Technology Policy (OSTP), raising questions about the inclusivity of the proposed regulatory framework.

Despite initial plans for the executive order, it was ultimately not signed, leading to speculation about the internal dynamics within the administration and the influence of various factions, including those advocating for stricter regulations on Big Tech. This situation highlights the ongoing tensions between federal authority and state autonomy in the rapidly evolving field of AI.

In a separate but related matter, political consultant Steve Kramer is facing legal consequences after a federal court ordered him to pay $22,500 to three voters. The order arose from a civil lawsuit brought by the League of Women Voters and came after a jury had acquitted Kramer of criminal charges of voter suppression and impersonating a candidate. In the civil case, the judge entered a default judgment against Kramer after he failed to appear in court.

Kramer orchestrated a robocall campaign that used an AI-generated voice mimicking then-President Joe Biden to mislead voters just days before the New Hampshire presidential primary. The calls falsely suggested that voting in the primary would prevent voters from casting ballots in the general election. Kramer dismissed the lawsuit as a publicity stunt, but the League of Women Voters hailed the ruling as a significant precedent against the misuse of AI in elections.

The Federal Communications Commission (FCC) had previously imposed a $6 million fine on Kramer, which he has also refused to pay. While several states have enacted laws regulating the use of AI-generated content in political campaigns, there are indications of a shift toward deregulation at the federal level under the Trump administration, raising concerns that large AI companies could operate without sufficient oversight.