We've been through this all before, probably many times. There's always a big scare that the new technology is going to take away all the human jobs. The most relevant example was when photography basically killed off the magazine illustration market and photos replaced paintings. Guess what—there are still people making paintings, and a lot of them work for the industry. If writing is just a numbers game for somebody, then maybe they'll want to follow the statistics and get in on the ground floor collaborating with AIs to produce their art or writing. Anyone who enjoys the creative act will keep writing or drawing or painting or whatever it is they do. It's not just about the final product to me, or to most of us—we love the process of creating it. Otherwise we wouldn't do it. It isn't just looking at art or reading stories that I enjoy, it's creating them, which is a very different thing. Why should I outsource that creative work to a machine?
I am a purist when it comes to the definition of art. For me, art can only be produced by humans. I subscribe to Tolstoy's definition: "Art is a human activity consisting in this, that one man consciously, by means of certain external signs, hands on to others feelings he has lived through, and that other people are infected by these feelings and also experience them." https://home.csulb.edu/~jvancamp/361r14.html No such thing. AI doesn't create, it generates, by plagiarizing from real art created by real artists.
Why is that a bad thing? Shouldn't there be both? And I want to know whether what I'm about to read (or not) was actually written by a human being, a machine, or a collaboration of the two. We definitely need clear labeling.
Should it? Debatable. Can it? Absolutely. I've explained how; most people ignored it with knee-jerk responses of "well, when it does, it's not art," thus killing the conversation. So I won't repeat myself.
For me, and I'm sure for many writers, the idea-generating, outlining, or discovery process is engaging and is a huge part of why we love creative work. I don't want a machine to do that for me; I want to actually do the writing and the creative work. I don't see writing as labor or as a chore I don't want to do, and I don't want a "labor-saving machine" to do it for me. Sure, I can see where some people might want that. I have no problem with it, but it isn't what I want to do, nor do I care to read stories written entirely or partially by machines. The exciting thing about reading a story is engaging with the mind of the writer. There are moments of contact, where you feel the creativity of a human mind. Those are what make it worthwhile—personally, I don't want a simulacrum of a human intelligence behind any stories I read. Maybe I'll let my computer read those stories; it might enjoy them.
You haven't finished creating those things until the last word is written. So no, I'd say it's not super easy or a lot of fun. Filling in the framework is where a lot of works falter, because it tests the author's ability to translate what they imagined and to follow through conceptually. You might be able to easily daydream a plot or scene in your head, but it's wholly incomplete in ways that don't emerge until it has to be realised. And because it's in your mind, you have to be the one to do it; an AI can't read your mind. That's why an "idea" or a "concept" for a story is one of its least valuable aspects. New writers worry that a mere premise will be stolen by onlookers, but that's because they don't understand that the body is far more important than the setup.
But you can tell an AI your basic ideas and ask it to write something based on those ideas. And now a basic idea can generate a whole story in seconds; that barrier of entry writers held so precious is gone. Now you can argue the story won't be very good, but A) a human can still edit it, and B) some people won't care; people read poorly written stories all the time. The point is, idea = story is now possible. We can either be part of the discussion or we can let the people who don't care decide. And that's my point: imagine we could go back in time and shape how these types of algorithms were used, not in harmful ways but in helpful ones. What if we could have been part of the discussion about the ethical use of the technology? We have that chance with AI, and we (society) are blowing it massively.
Nobody says so. But the way you talk about AI does kind of imply that if I use any kind of AI, I must accept all other AI. Again, this is the kind of thing that confuses me when you talk about AI. Yes, I use YouTube, I guess? And perhaps other AI algorithms commonly found on the web. But the conversation here is about the new groundbreaking kinds like Large Language Models (LLMs). I don't use those for my writing. Ever.

Some publishers won't accept AI. Publishers are free to have any kind of content they want. They mostly do it because they believe that AI should never be involved in the creative process. But as I said, there are many creative markets. I see absolutely no reason why, maybe in the next five years, publishers or self-published individuals can't create a market dedicated to AI and AI-assisted works. I'm personally never going to buy anything from it, but there will be others who will. And anyway, publishers have done this kind of thing for years. They're extremely strict and specific about what they want. But as I said, it's mostly accepted in places where there is repetitive labour.

Nothing will stop kids who grew up with AI helping with their essays and such from learning to write without AI. Humans are adaptive beings. Now, if today's children grow up unable to write and do other tasks without AI, don't you find that concerning? Calculators have existed for many, many years, but kids in schools are still tested on doing math without them. Where I am from, we're not allowed calculators in math exams at all. We do things like trigonometry by hand. I wonder why?

What you're describing here is impossible. You can't restrict what others do. If there is a possibility for something to be used in a harmful way, it will be. That's just how humans work.
Oh, is that your point? I've been wondering, because so far you haven't made any kind of coherent reasoned argument for or against anything; it just feels like a bunch of disconnected statements going in all directions. Are you trying to say we should all get involved in shaping how AI is developing? Elon Musk tried that. He was involved in early prototypes, and the development was taken over by corporations and turned toward where it's at now—developed, trained, and run non-transparently by the corporations with no input from anyone else. He wasn't able to exert any influence over it; they shut him out. What makes you think a bunch of peons like us can do what he wasn't able to, with something he was heavily invested in? I think you'll ultimately just have to accept that some people don't like it and you can't change their minds.
So, what you're essentially saying is that the part where you actually sit down and write is a "barrier of entry"? Barrier of entry to what? The industry? Or the barrier of entry to calling yourself a writer? An idea and the actual piece of writing are very, very different and distinct things. If you give an AI a book idea and it writes the book entirely for you, you didn't write it. The AI did. There is not much going on here other than idea-pitching to a ghostwriter, who happens to be a machine in this case. Even if someone does edit the AI-written work, they're still not the writer. They're the editor. By the way, when people say that they've written a book and sell it to a bunch of people, and then those people find out a ghostwriter did the actual writing, they get angry. And reasonably so. I think I see the problem now. Our values toward writing differ vastly, so our views will never align no matter how hard we try. We can only agree to disagree. But I think you're right. The people will decide for themselves. Those who don't care about the art of writing and just want an endless flow of content will have no problem with AI-written works. Those who do care will reject them. There are probably going to be some in the middle, but it's impossible to tell right now what kind of views they will hold.
It's debatable to what extent they'll actually be writers. Some will just give a little vague input to an AI, which will then be the writer; some might take a little more control, maybe in editing. But it's hard to call someone a writer if they haven't gone through the struggles necessary to learn how to write—all the way through the process.
If I haven't made an argument for or against, I guess that's because my true argument is this: be cautious, but keep an open mind.
I didn't realize this was part of an existing thread; somehow I thought you had just started it today, so my bad. I went back and saw some of your earlier posts, and your position is clearer to me now. But it seems strange the way you keep referring to "the writing community," as if there's just one big club we're all in and we all feel the same way about everything. What we have in common (most of us—well, the ones who are opposed to AI anyway) is that we love the writing process, and we believe it's an art that can't be done or assisted in any large way (or in any way, for some of us) by a machine. It requires a human mind, and you can't say, "Well, I can write some parts of a story, but I have trouble turning my initial ideas into a plot, so I'll let somebody else do that part." If you do that, then you're not the writer, you're a co-writer, and should be credited that way. And if your co-writer was an AI, then under the title it should say "Written by (human name) and (AI name)." And should the names of the thousands of people who developed and programmed and trained the AI also be included? In a way it's a form of writing by massive committee. Except that the AI is a huge part of it, and it's a digital machine that imitates human output (really, it plagiarises the writing of actual human writers without crediting them, as has been said many times already).
I could explain, but you won't hear me. No, our values are the same; I am saying not everyone in the world shares our values, while you are saying they do. In my naiveté of a few hours ago, my hope was that we could be part of defining what that middle will be. After this back and forth, I realize I was very wrong. Hopefully the middle will be something we are all happy with.
But is this something anyone wants? Any idiot can now crank out several novels a day, written mostly by non-human plagiarizing machines that don't understand human emotion or struggle or anything of the sort. Flooding the market with garbage at unprecedented levels? The only people who would want that are people who want to think of themselves as writers without actually becoming one. And they don't care about writing as an artform; they have no understanding of it, because they didn't have to put in the years or decades of struggle to learn it. They don't care if they're destroying the market for actual writers. They just want to be able to say, "Hey, I'm a writer, I wrote sixteen series last week, you should check 'em out. I don't really know what's in them though."
We can't go back in time, and even if we could, we wouldn't be able to influence the development of AI. As I said, Elon Musk tried it, and he was shut out of the process. It's already been developed. Now it's just a matter of the technology improving from time to time and the corporations using it in new ways to control us, get our data, and control what kinds of stories get written. AI is probably the most powerful tool for corporate/government control since the internet itself.
You are lumping several different types of programming together under the banner of AI. A GPS is not AI: it can compute a route from point A to point B, but it cannot make decisions about factors that affect the route without the human input of preferences. It cannot decide on its own to only use major highways unless no other route exists; the same goes for toll roads, the use or avoidance of which is a preference setting on most GPS units. It cannot decide between the fastest or shortest routing without those preferences either. The accuracy is also questionable without the human intervention of updating the software. How many stories have been in the news about people getting into a bad situation by blindly following a GPS and ignoring common sense? I recall one not very long ago about a man who didn't pay attention to the posted signs telling him the bridge his GPS said to use was out.
They're not the same at all, and it's very clear from statements like those. My answer to your question is that the best of stories do. Some writers are able to reach incredible heights by tapping into their emotions and capturing them on the page. That takes skill, honesty, and humility. Surely you've come across them. By the end of the story, you're absolutely gobsmacked, and you sometimes even think about them for days. I'm not a particularly good writer, but I've managed it with a few stories. It was the most incredible thing ever. The feedback I got proved to me that somehow, others felt what I did when I wrote it. It was truly shocking. No AI model could ever write the kind of stories that I did. They're special and unique to my own creativity and mind.
I hope we're not being trolled in this thread. I'm usually the type who will try to see things from another perspective, and one who appreciates nuance, but this really is one of those black-and-white kinds of things. There's no "middle ground" to be found, in my view. If you give some program some prompts to churn out a story, you didn't write anything. That's pretty basic and obvious. If you claim that you did, you are a liar and a fraud. If you edit that output thoroughly, you're an editor of fake writing. Still not a writer. I can't imagine anyone could ever convince me otherwise. It's like going to a Michelin-starred restaurant, taking pictures of your dishes, and claiming to be a chef because you're the one who ordered everything from the menu. Wow, so very impressive. I'll never pay for some plagiarized mishmash that generative AI threw together. I wouldn't read it for free, either. At least some legislation is finally coming: https://www.pbs.org/newshour/politics/new-bipartisan-bill-would-require-labeling-of-ai-generated-videos-and-audio I haven't read the bill, but it sounds like right now it's just for labeling images, audio, and video. Hoping there will be a bill that includes labeling for writing in the near future.
This is why I don't like having discussions online. My intention was never to insult you or your writing or your love of it. It never even crossed my mind. I only said that our values differ, and that's it—not that your value of writing is bad and weak. Where exactly did you read that? Because when I go back over my posts, I see no such words. The only things I said about values are those two statements. Nowhere do they mention anything about your values being "bad." Just different. I'm not really sure why that's insulting. Values don't work in binary states. If I hold "good" values and I say that yours are different from mine, it doesn't mean that they're "bad." I honestly never intended to insult you. I just wanted to have a discussion. But you came here fully emotionally charged, and I guess I should have seen that as a red flag to bail. By the way, I never mentioned what it is exactly that I value in writing, yet you keep telling me that our values must be the same. How can you? Aren't you doing the same thing you're accusing me of? Peace is such a wonderful thing. I'm never, ever going to be part of a discussion like this again.