Rant about AI:

Sadly, there's no reliable way to differentiate AI content from "real" content. And regardless of your opinions on AI, there's no "stopping" it. It's a "cat's out of the bag" situation: you can run these things on your home computer with open source software, so there's no way short of an apocalypse to stop development from here.

What we do have is a lot of fighting and little effort toward solutions for living with this. And worse yet, I think many taking the anti-AI stance, especially the loudest of them, are basically making things worse, because real solutions are anathema to them (ie. anything short of an outright ban on the technology is unacceptable, so they tend to push back against even efforts to rein in AI, or talk over those who want to push those efforts).

On top of that you have the borderline predatory push of "AI detection tools" and "AI poisoning". The detection tools boil down to "How many real lives are you okay with ruining to catch a handful of bad use cases, given there is zero way to be certain about the accuracy of these tools?" Poisoning tools are a security blanket that leads to people dropping their defenses: they don't stop AI, they just slightly delay its access to your content (even the creators of those tools acknowledge that AI will quickly bypass them, at which point it makes no difference whether or not you used the tool). Worse yet, as AI gets further incorporated into search tools, poisoning can make it harder to get visibility and exposure over AI-generated content.

What we really need to be focusing on to address the problems with AI:

  • Learning how copyright works (in my experience artists tend to have a woefully bad understanding of what is or isn't covered) and making sure corporations don't lobby the government into allowing copyright on AI works (under current law they are public domain, ie. no copyright, but there has already been one case pushing the claim that "arrangements" of AI works can be copyrighted). This means that if they want to actually have a copyright on art, they've got to pay a human artist.
  • We need to push for reporting requirements/standards. One of the most toxic elements is how much AI floods spaces and crowds out human artists, especially when the prompt contains an artist's name (meaning searching for that artist can turn up more AI work than their actual work)... there needs to be a requirement that AI art be labeled. This also supports the previous point, since it's similar to being able to search for something released under Creative Commons.
  • Push for copyright responsibility in outputs rather than in training data inputs. This sounds like it's already one of the loudest arguments, but it really isn't. Most arguments I hear try to go after AI tools for copyrighted content in their training data... but if you actually learn copyright, you realize that a victory there largely hands major companies an even bigger advantage, because copyright only applies when content is copied (ie. when a training dataset is made available for smaller companies to run their own models), not when content is transformed (despite popular opinion, the vast majority of AI output does not violate copyright and qualifies as transformative work... see again learning copyright law, plus a dash of learning how these tools actually work). Responsibility in outputs means recognizing that an AI can violate copyright: if I ask an AI tool to give me the first chapter of a copyrighted book and it does so, that is a violation, and the company needs to be genuinely responsible for taking measures to prevent it. There should also be leeway for "forced violations", ie. when you bend over backwards to make it break copyright vs just saying "give me the first chapter of...".
  • Work on learning and developing responsible usage. Again, despite popular artist opinion, there genuinely are a lot of responsible use cases for all these AI technologies: LLMs can help debug code, summarize text, or prioritize lists; voice duplicators, used with the license of the original VA, can provide dynamic speech (ie. voice assistants, or actually speaking a player's name in a video game in the middle of otherwise pre-recorded output). And that's not to ignore image generators, which can be used for enhancing/repairing old photos, or for general visual effects on your own art (ie. the filters everyone uses on Instagram and the like... many of them are the exact same tech as AI image generators).
  • And as always... fighting capitalism, because the real threat of AI is the same as with any other technological advancement: if CEOs can replace you with a machine, they will, and we live in a society where no employment means risk of death.

#AI #ResponsibleAI #Rant