The REAL Act mandates that federal officials must clearly disclose when they use generative artificial intelligence to create or alter public content.
Bill Foster
Representative
IL-11
This disclosure requires a prominent, plain-language disclaimer explaining the AI's role in generating or manipulating the content. The law establishes implementation timelines, annual compliance audits, and specific penalties for federal officials and contractors who fail to adhere to these transparency requirements.
The new Responsible and Ethical AI Labeling Act (REAL Act) is straightforward: if a federal official uses generative artificial intelligence (AI)—think ChatGPT, Midjourney, or similar tools—to create or significantly alter public content, they have to slap a clear label on it. This isn't just about text; it covers images, videos, and sound too. The goal is to make sure the public knows exactly when they are consuming AI-generated official communications.
This bill targets transparency. When a federal official publishes something—a press release, a social media graphic, a regulatory document—that was created or manipulated by generative AI, they must include a prominent, plain-language disclaimer. That disclaimer needs to state that AI was used, briefly explain how the content was generated or altered, and describe the technology used. The requirement kicks in 90 days after the bill becomes law. For you, the reader, this means if the White House releases a complex chart explaining a new policy, and that chart was made by an AI tool, you’ll see a clear notice explaining that AI was involved. This is a significant win for public trust, giving citizens the context they need to evaluate the authenticity of government information.
Not everything gets a scarlet letter. The bill carves out a few important exceptions. For instance, if a federal official uses AI for “routine text drafts” for efficiency, and that draft is reviewed by agency staff before publication, no disclosure is required. This is meant to let staff use AI tools to quickly summarize meetings or draft initial emails without slowing down the workflow. However, this is also where things get a little vague (Section 2, Exceptions). What counts as “routine,” and how substantive does the staff review have to be before it waives the disclosure requirement? If an official uses AI to write 90% of a complex policy paper, and a staffer just skims it, does that count as a reviewed “routine draft”? The effectiveness of this exception hinges on how the Office of Management and Budget (OMB) defines “routine” in the rules it must issue within 180 days.
Another exception is for “minor graphic adjustments” that don’t “materially change the meaning.” This is also subjective. Adding a text overlay to a photo is fine, but if an AI subtly changes the background of a photo to make a crowd look bigger or a landscape look more pristine, an agency could argue that change is minor even though it alters the visual message. Federal officials and contractors involved in content creation will need to walk a fine line here, as non-compliance can lead to disciplinary action under Chapter 75 of Title 5, U.S. Code, or contract termination for contractors.
The REAL Act doesn't just ask nicely; it builds in accountability. The OMB Director has 180 days to establish the specific rules for how these disclaimers must look and where they must be placed across different media. More importantly, the President, Vice President, and every agency head must submit an annual audit to Congress and the public detailing their compliance with this law. If an official violates the rule—say, they publish an AI-generated image without a label—they must retract the content, issue a corrective communication explaining the violation, and, if possible, provide a corrected version. If the agency fails to correct the problem after a violation, the Comptroller General steps in to mandate corrective actions. This mandatory audit and public reporting structure gives the bill its teeth, ensuring that agencies can't simply ignore the new requirements and that the public has a way to check their work.