Ian Bogost
@ibogost.com
PREORDER The Small Stuff: How to Lead a More Gratifying Life: http://bit.ly/49gyZmp

Barbara and David Thomas Distinguished Professor at WashU; Contributing writer at The Atlantic; author of 11 books. https://bogost.com
Oh no carpenter bee
November 29, 2025 at 1:59 AM
I’m showing you three specific images. I represent to you that a few more similar ones appeared in the talk, along with a lot of other non-AI stuff.
November 29, 2025 at 1:34 AM
The audacious idea of YouTube saying that viewers want to know if what they're watching is real…
November 29, 2025 at 1:31 AM
Okay, but this is not the context we are discussing right? Or do you want me to say "This presentation contains synthetic imagery" on the title page or something?
November 29, 2025 at 1:26 AM
Given that most images I create are synthetic, what disclosure do you think would be required in this case? Or am I already sufficiently trustworthy (thanks) to be exempt?
November 29, 2025 at 1:24 AM
You know what, let's make this concrete. Attached are three of 262 slides from a presentation I once gave, in which I used AI tools in part to construct the slides (DALL-E and Photoshop, specifically).

Do you suspect my talk was less trustworthy as a result? Why or why not?
November 29, 2025 at 1:17 AM
Just to play this out: Why would it be less of a moral hazard to steal someone's work for your personal use in public performance (expressly and clearly in violation of copyright law) than to construct an image with a generative software tool (where legal precedent is at best ambiguous)?
November 29, 2025 at 1:12 AM
It costs five thousand dollars.
November 29, 2025 at 12:06 AM
But an image is an image. Not all images contain sources and numbers. This is an incomplete account of the claim focused on a specific example.
November 28, 2025 at 11:12 PM
I just can’t read. I thought you said “generated” not “drew.” I know what drawing is!
November 28, 2025 at 10:12 PM
I appreciate and endeavor to share your spirit of generosity. But I’m still not sure “just vibes,” which is sort of your summary, is a persuasive case in the abstract. (Which would be unfair if the OP hadn’t made the case in the abstract.)
November 28, 2025 at 10:08 PM
Why is it a necessary truth that an AI image in a presentation makes its user unsmart? Would a stock photo or a swiped Google image also send this signal? Beyond a simple tell like seven-fingeredness (itself on the wane), what would make it so?
November 28, 2025 at 10:06 PM
I am not a lawyer but have a lot of experience with IP law. The short answer is: It depends, but educational fair use is more limited than people sometimes believe, and this kind of usage is widespread outside education too.
November 28, 2025 at 9:57 PM
I am saying that I find it difficult to accept this premise wholesale.
November 28, 2025 at 9:06 PM
I am asking why that is the case. (I do not think it is obviously the case, and I’m not sure why I should accept it a priori.)
November 28, 2025 at 8:57 PM
Thanks for sharing that
November 28, 2025 at 8:53 PM
I understand what you mean. I’m not sure the distinctions are so clear cut. But the situation does start to feel a little circular to me. An AI image that erodes confidence is an AI image that erodes confidence etc.
November 28, 2025 at 8:53 PM
Is it a necessary truth that it looks careless?
November 28, 2025 at 8:45 PM
I think it’s the “tainted” thing that I am curious about. Bad, disingenuous clip art or stock images or stolen web images have been in presentations for ages. Presentations are terrible! Is it not thinkable that AI makes them better, in some ways and cases?
November 28, 2025 at 8:36 PM
Maybe. I mean, probably. Social media isn’t a good place to discourse but here we are.
November 28, 2025 at 8:34 PM
Totally fair correction.
November 28, 2025 at 8:34 PM
In every case? Why and how so?
November 28, 2025 at 8:33 PM
Disingenuous because it seems, on first blush, to exemplify a predetermined distaste rather than to produce it.
November 28, 2025 at 8:33 PM