OpenAI just admitted it can’t identify AI-generated text. That’s bad for the internet and it could be really bad for AI models.

In January, OpenAI launched a system for identifying AI-generated text. This month, the company scrapped it.
> get sensible regulations about AI
There’s no such thing as “sensible regulations” for AI. AI is a technological advantage. Any time you regulate that advantage away, groups that aren’t bound by those regulations will fuck you over. And even if you do start talking about regulations, the corpos will take over the process and fuck you over with regulations that only hurt the little guy.
Hell, even without regulations, we’re already seeing this on the open-source vs. capitalism front. Google admitted that it lost some of its advantages because of open-source AI tools, and now these fucking cunts are trying to hold their technology as close to their chests as possible. This is technology that needs to be free and open-source, and we’re going to see a fierce battle with multi-billion-dollar capitalistic corporations clawing back whatever technological gains OSS has made, until you’re forced to spend hundreds or thousands of dollars just to use a goddamn chess bot.
GPLv3 is key here, and we need to force these fuckers into permanent copyleft licenses that they can’t revoke. OpenAI is not open, StabilityAI is not the future, and Google is not your friend.
Isn’t forcing a copyleft license exactly the kind of regulation that would be sensible, though? So why wouldn’t regulations and legislation work, if that’s your solution too?
There’s never been a bill that so much as mentioned “copyleft” or the “GNU General Public License,” and thanks to corpo lobbyists, there probably never will be. We have to be realistic here, and the only realistic option is to encourage as much protected open-source software in this space as possible.