When—and How—to Say Media Was Made With AI

by Claire L. Brady, EdD

So…Should We Say This Was Made With AI?

As AI-generated images, video, and music become part of everyday campus communication, a new question keeps surfacing—often quietly, sometimes urgently: How—and when—should we say something was created with AI?

This isn’t really about disclaimers. And it’s not about distancing ourselves from the work. At its core, it’s about trust. Higher education has long-standing norms around attribution, authorship, and transparency. AI doesn’t undo those values. It asks us to apply them with a bit more care in contexts that feel new and, at times, unsettled.

The goal isn’t to label everything. The goal is to be clear when AI meaningfully shapes what people see, hear, or interpret.

A Simple Place to Start

Here’s the principle I come back to: Disclose AI use when it affects representation, voice, or creative authorship.

Not every AI-assisted workflow needs a note. Spellcheck doesn’t require disclosure. Neither does using AI to brainstorm ideas behind the scenes. But when AI plays a visible or audible role in the final product—especially in public-facing work—transparency becomes part of responsible practice.

Images: When Real and Represented Can Blur

AI-generated images are now realistic enough that people may reasonably assume they are photographs. That makes disclosure more important, not less.

When images are fully or mostly AI-generated—especially when they depict people, places, or events that didn’t actually occur—viewers deserve context. Clear, neutral language does the job without creating unnecessary alarm. Phrases like “AI-generated image” or “AI-generated illustration” signal what someone is seeing without editorializing it.

What matters most is avoiding confusion, not overexplaining the tool.

Video: Where Voice and Authority Matter

Video carries additional weight because it often conveys authority, presence, and accountability. When AI-generated avatars or voices are involved, disclosure helps maintain credibility—even when the intent is purely practical.

Here, placement matters as much as wording. Disclosures should be easy to find: in a video description, closing slide, or accompanying text. When disclosure is buried, people don’t just question the tool—they question the motive.

Music: Subtle, but Still Meaningful

AI-generated music often feels less consequential, but it still shapes tone and emotion. When background music is created by AI and contributes to the experience of the content, a simple credit or note in the description is usually sufficient.

It doesn’t need to interrupt the viewing experience. It just needs to be there.

What Undermines Trust

Most trust issues don’t come from how disclosure is written. They come from inconsistency. When AI use is disclosed sometimes, hidden other times, or explained defensively, people start to wonder what’s being left out. When disclosure becomes routine and matter-of-fact, it fades into the background and does its work quietly.

A useful final check before publishing: If someone later asked how this was made, would my disclosure feel sufficient and fair? If the answer is yes, you're likely doing it right. This is careful work. And it's work worth doing well.

Checklist: Disclosing AI-Generated Images, Video, and Music

Before publishing content that includes AI-generated media, pause and review the following.

1. Initial Disclosure Check

☐ Does AI meaningfully shape what people see, hear, or interpret?

☐ Is AI visible or audible in the final product (not just behind the scenes)?

☐ Would someone reasonably assume this content was fully human-created?

If yes to any of the above, disclosure is likely appropriate.

2. Images

☐ Is the image fully or mostly AI-generated?

☐ Does it depict people, places, or events that did not actually occur?

☐ Is it used in marketing, instruction, or official communication?

Recommended language:

  • AI-generated image

  • AI-generated illustration

  • AI-generated image (not a real photograph)

Placement options:

  • Caption or credit line

  • Alt text (when appropriate)

  • Footer or end note for grouped images

3. Video

☐ Does the video use an AI avatar or synthetic presenter?

☐ Is voice narration AI-generated?

☐ Is most of the video content created by AI tools?

Recommended language:

  • This video was created using AI-generated visuals and/or voice

  • AI-assisted video production

  • Narration generated using AI

Placement options:

  • Video description or metadata

  • End credits or closing slide

  • Accompanying email, LMS post, or webpage

4. Music

☐ Is background music created using AI tools?

☐ Does the music contribute to tone or emotional context?

Recommended language:

  • Background music generated using AI

  • AI-generated music track

Placement options:

  • Video description or credits

  • Resource page or footer

5. What to Avoid

☐ Burying disclosures in fine print

☐ Over-explaining the technology

☐ Using defensive or apologetic language

☐ Applying disclosure inconsistently across platforms or units

6. Final Trust Check

Before publishing, ask:

☐ Would this disclosure feel reasonable and sufficient if someone asked how the content was created?

☐ Does the disclosure match institutional values around transparency and authorship?

If yes, the standard has likely been met.
