Bruh. (Gaming)

by Kermit @, Raleigh, NC, Friday, March 27, 2026, 17:49 (3 hours, 32 minutes ago) @ Claude Errera
edited by Kermit, Friday, March 27, 2026, 17:58

In many cases they can. You're focusing on the artistic realm. There is much happening beyond that.


I don't find this particularly useful in this debate. If you have specific examples, great, give them. ("It can be helpful for organizing" is not what I mean.) The issues with generative AI are (very often) specific to the ethical value of what's being created; if you have examples of generative AI being used in ways that are useful to humans without stealing from them, please, by all means, share.

But the coy responses you've been giving in this thread really add nothing except an "I know something you don't know" vibe.

Claude, you're so good at projecting the vibe of "I'm just a neutral observer here." As an observer, you might also call out the broadsides weaving conspiracies or the sweeping catastrophizing statements. I was trying to say it's not that simple, and I feel a bit singled-out for not providing footnotes.

If I came across as coy, let me say this about what I know: I know enough not to be arrogant about what I know. As I've alluded to, I work for a company that provides software (some of which incorporates AI agents) to many companies, academic institutions, and government agencies. From a professional perspective, I know more about the tools than about the specifics of the deliverables. I'm aware there are ethical pitfalls, and we should support companies that care about such things (Anthropic being a recent example in the news--I like to think that my company falls into that category).

"Stealing" is a provocative and slippery concept when you're talking about publicly available data, and in many cases the data used by these AI systems are in the public domain or are proprietary to the institutions using them. I'm not an expert, but companies are increasingly relying on synthetic, proprietary, or purpose-built datasets to train AI. These are being used in medical imaging (I think stabbim mentions this in a different post), self-driving cars, fraud detection, and QA. I work in publications, for instance, and we're developing AI that will answer your questions--it will generate content untouched by human hands, yet based on content we produced. My company owns that content. I know people whose job it is to make sure that our software integrates and uses AI ethically. Privacy, for instance, is a BIG concern.

(Before someone says I'm biased because I have a professional interest: the bulk of my career is in the rearview mirror. I'm not that attached. I'm kind of glad to be toward the end of my career. Being at the beginning would be cool. Would not like to be in the middle.)

Also, I am allergic to what appear to me to be simplistic narratives. I would react the same way if someone said AI was assuredly going to save mankind. AI is definitely going to be disruptive. So was the internet. Many of the best predictions about that didn't come true, but neither did the worst--we didn't correctly anticipate the good or the bad. I lean towards optimism. Plus, I'm old. I've seen things. I'm old enough to have read FUTURE SHOCK in the 70s. My God, it's a wonder we're alive. We were supposed to be replaced by robots 35 years ago.

A friend at work who's a little younger than me told me how he brought up AI to his adult kids, and they all but held up a cross and hissed at him. The reactions on this forum have been educational in the same way. I think there is a bit of hysteria in the mix. I agree with those who say AI is problematic for creatives (though I don't agree it's of no use to them). I also agree there are ethical issues around how LLMs were trained. I don't have the answers regarding how it will or should shake out. (We all care about intellectual property across the board, right?)

To get to the issue that started this debate: I think photorealism is a problematic goal, but a proportion of gamers want it and make purchasing decisions based on it. Game developers have a budget and a deadline. It's a tough industry and expectations are through the roof. If there is a tool that gives devs control but helps them deliver photorealism (making their games more marketable) while allowing them more time to focus on story and gameplay, then I'm not against that. (Artists already use such tools: Adobe Firefly, for example, is trained on licensed content.) I want the companies that make good games to have enough success to keep making them. The demands of art and commerce are often at odds.

Finally, in this context, I guess my optimism rests on the belief that human-created art, whether made with chalk or digital tools, will always be highly valued. I believe there will always be a demand for, and therefore mechanisms to authenticate, human-created art.
