I came across this disturbing story today from The Guardian about AI-written mushroom foraging guides being sold on Amazon: https://www.theguardian.com/technology/2023/sep/01/mushroom-pickers-urged-to-avoid-foraging-books-on-amazon-that-appear-to-be-written-by-ai

It's amazing to me that this is happening at all. This is just one topic where incorrect information, presented by regularly wrong, stupid, unthinking generative AI, can put someone's health at risk. There must be so many more. Fuck AI to death, and then may the devil take it over in Hell.

It interested me to learn that there are reliable AI detection tests available (I figured people were working on this). That gives me some hope, at least for now. It might be yet another "arms race" of sorts, though: one side always evolving, the other always trying to keep up. Much like viruses and antivirus software. I think that's a pretty apt analogy; generative AI writing is a virus that must be wiped out.

Another thought that came to mind was, "Amazon isn't running these AI detection tests on their own, as part of the publishing process?" Which is bullshit. They're in a great position to set the tone here and take a stand. They can ban these fake, fraudulent, phony, talentless losers who post their "writing" trash online for sale - fiction or non-fiction. They can say, "We're not in that business. We care about integrity. We care about quality. We care about our products and we care about our customers." This kind of rubbish needs to be pushed to the margins, to some fringe market that most people never notice. But right now, it has penetrated and polluted the mainstream.

Amazon needs to be held to account and commit to filtering out the raw sewage being pumped out by generative AI. But will they? A 20% cut of sales from radioactive dildos covered with plague fleas and anthrax is still a 20% cut, as long as people are buying them. I hope the profitability calculations end up siding against the anthrax dildos. Market, please make the right choice, if the corporations won't.
It's been a long time since Amazon said things like that and meant it. They've become another evil empire, as has happened to just about all mega-successful corporations.
That's scary, and it shows how dangerous generative AI misinformation can be. The article says it was the Guardian that arranged and paid for these books to be checked with an AI detection test on a website called originality.ai.

I can see a time in the future when being certified AI-free becomes part and parcel of selling one's book. Running your book through originality.ai as part of the publishing process, and making the results available to anyone who purchases it, is maybe where we should be heading.

Amazon's quote is not very reassuring: "We're looking into it."
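For what it's worth, here's a rough sketch of what an automated pre-publication check like that might look like. To be clear, the endpoint URL, header, payload fields, and response shape below are my guesses for the sake of illustration, not originality.ai's documented API; you'd want to check their actual docs before wiring anything up.

```python
import os
import requests

# Hypothetical pre-publication AI check -- a sketch only.
# The URL, header name, payload field, and response keys are assumptions,
# not originality.ai's documented interface.
API_URL = "https://api.originality.ai/api/v1/scan/ai"  # assumed endpoint
API_KEY = os.environ["ORIGINALITY_API_KEY"]            # assumed auth via env var


def scan_manuscript(text: str) -> dict:
    """Send manuscript text to the detection service and return its raw verdict."""
    response = requests.post(
        API_URL,
        headers={"X-OAI-API-KEY": API_KEY},  # assumed header name
        json={"content": text},              # assumed payload field
        timeout=60,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    with open("manuscript.txt", encoding="utf-8") as f:
        verdict = scan_manuscript(f.read())
    # Assumed response shape: some kind of score where higher = more likely AI.
    print(verdict)
```

The idea being that the publisher (or the author) attaches the resulting report to the listing, so buyers can see it before they hand over money.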
Yeah, they're "looking into it" to see how well this part of the plan is working, smirking behind their hands. Suddenly it's so clear to me (how did it take this long?): think about the kinds of companies that have taken over AI, pushed it out heedlessly without sufficient testing, and launched it on our society through so many portals (cropping up everywhere like poisonous mushrooms), with obviously built-in problems of very specific kinds. It's just one more way of shoving endless misinformation onto society to destroy the rationality, morality, and integrity the West is built on. Another manufactured virus against which we now need to keep building new defenses every few months.
Lol sure, whatever. The Western world is nothing but evil patriarchy and toxic whiteness, right? They ended slavery (after creating it, sure) and went from oppressing women and minorities to liberating them. The Marxist/progressive war (postmodernism, wokeness, whichever label you want to put on it) is against liberal values, and despite all its problems, the Western world is still the leading paragon of liberal values.
Bit of both. You can admit the North American colonies were built on blood and exploitation while acknowledging the virtues they forged. The US and Canada also have relatively fresh skeletons in their closets if you want to talk about international meddling for national (or just personal) gain. Humans are just messy that way: some bad with some good. The primitives who were dominated had just about as much inter-tribal diplomacy and compassion as they had blue skin, which, unfortunately, is also about as much as the average enterprising colonial nation had at the time.

Edit: shit, got to stay on topic. Humans can still copy/paste info from the web, slap it into a book, and get people eating the wrong plants and mushrooms. There won't be any test you can put the book through for that, either. The "danger" will always stem from an unscrupulous consumer.
I wouldn't expect Amazon to do much. They're not a publishing house, and acting as one probably isn't really what people want. Where I see this having a potentially big impact is on self-publishing. There are no real rules around what people can self-publish, and there are no real rules about using AI to write, help with, or generate books for people to self-publish.

When you buy a book published by a trade company, it comes with a sort of built-in stamp of approval, and you can be almost certain there was no AI involved in the writing. I'm already seeing language in my contracts for magazines and journals where I have to sign off that no AI was used in the creation of my story, including but not limited to anything from generating ideas to, of course, the text itself. It seems like publishers are very unwelcoming of this technology, rightfully so, and they make that very clear in their contracts. I can only imagine that something similar is in book publishing contracts.

But what does this do to the self-publishing industry? I think it's a little far-fetched to think self-publishers are going to start feeding their work through any kind of checker. There are some problems with that software too, I've heard. And I just wouldn't feel comfortable uploading my work to one of those things to begin with; think of the database that would be created. And I'm not sure it's fair that self-published authors somehow have to prove themselves. And to whom? Amazon is not a publishing company.

Are these sorts of things making you think about how self-publishing could be affected moving forward?