Yeah, that’s awful and unnecessarily disrespectful. There are definitely horrible applications of the tech that will need to be regulated within the coming years. But I suppose the reason I’m so pro-AI art is that I would place the blame for its photo misuse on LAION or Lensa AI before I’d blame the tech itself. I think it’s people we have to be hypervigilant about, because in the end these technologies are just tools controlled by human input.
I haven’t had the chance to research it, but I wonder what sort of legislation, if any, exists in the countries these companies are hosted in.
I’m not sure what you’re trying to say here. There are plenty of ways for people who can’t draw to have art in their lives that have nothing to do with AI.
Also art is a real need for some people, no matter what Maslow’s hierarchy says. Of course, people who feel the need to produce it would do so (and in fact do) anyway.
I might argue it’s even less: all you need is a brain (and some imagination, but I’m not sure you can have one without the other). If you write a story in your head, you have created a story, even if you never write it down.
Art itself has a low barrier to entry: all you need is a pencil and paper. What you are trying to sell here is the idea of high-quality art for free, for people who don’t want to spend hours learning how to do it themselves or pay for it.
It’s not a “massive advancement”, it’s a toy. An indulgence. A luxury good that this hypothetical person could be paying for.
And they will pay for it: a $10 subscription instead of a $50 commission. This isn’t progress.
I think the general issue is that the time/energy/education/labor (basically, the opportunity cost) involved in creating products is what gives the product value in our market-based system. If anyone can write a book in seconds, or design a painting in seconds, or write and compose a song in seconds, or even write complicated code or accurately balance a business’s books in seconds, do those types of products and services retain any economic value in our market-based system?
That is a very valid point that I don’t have the answer to. I don’t think anyone can have a concrete answer to be honest. It will definitely oversaturate those markets, but we can’t see far enough into the future to say whether it will necessarily matter in a world where machine learning tech is everywhere.
I think a lot more career fields other than art, writing, programming etc. will be automated to some extent. Because of AI technology, I can only imagine what the world will look like in 50 years. I think I’m so prone towards being optimistic, because it feels inevitable. Even if copyright laws come into play, Pandora’s box has already been opened and not every country abides by the same rules.
I think we agree here to some extent. People don’t want to spend hours learning. Why else would they pay for an artist’s commission or an AI-based subscription? It’s because they don’t enjoy the process of drawing, or they want to spend their time doing something else. But there’s nothing wrong with that.
What would be wrong is buying AI art poorly generated (barely changed) from other artworks without the artists’ consent. But if I wanted to buy a picture of randomly assembled pixels from an AI generator company that didn’t use models trained on stolen work, I’d be completely within my rights to do so, because nothing is stolen from any artist. It would just be another competitor in my eyes.
I’ve written a highly subjective, very long character.ai review after 1½ days of excessive testing, because I literally had nothing better to do and am one of those people who can quickly get obsessed with this kind of roleplaying, so I’m basically the target market. I’m hiding it for the people who aren’t interested in this aspect of AI-generated content or in my impressions:
Summary
So, I’ve now tested this extensively, albeit mostly with only one character, and here’s a list of my observations regarding its differences from / similarities to scripted interactive fiction, plus some overall things. I should also mention that I’m writing from the POV of someone who uses IF for escapism, heavily self-inserts (idealized) and focuses almost solely on romances:
You can make private characters only you can interact with, but as of now you can’t delete them. Nor can you delete the chats with them. It says no other users can read them, but you obviously shouldn’t share private information on there and should fully expect everything to potentially be read by devs. This means you can never “let go” and feel truly safe, which greatly hampers the self-insertion and escapism aspect. Whenever a reply was too convincingly human, I became suspicious and wondered whether some bored intern had taken over.
If you are a single player interacting with a private character, they will constantly shift personality / behavior, because they work with / are trained by direct feedback and reinforcement, and it apparently takes many people giving this feedback to keep the personality really stable. At least that’s my theory. So as said earlier, it’s easy to get them into a zone where they are spot on (frighteningly so, in fact), but very hard to keep them in it. The only traits that remain somewhat constant are those you enter in their definition (the “box” I mentioned before). Or maybe I was simply a bad / too inexperienced trainer.
Training happens by choosing one of the provided replies and rating it. In fact, you don’t get only one answer: you can browse through several options by swiping left or right until you see the one you like most. But here it already gets tricky, because on what basis do you decide? Which course of the story you like most? Which answer is the most in character content-wise? Or which fits the character best purely by their choice of words and way of speaking? This means you might pick an option or rate it as fantastic because what they said was totally on point, but accidentally end up teaching them speaking habits that don’t fit them.
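The entanglement I mean can be sketched in a few lines. This is a hypothetical toy model, not character.ai’s actual mechanism: each candidate reply is assumed to carry several attributes at once (story content, speaking style), and a single rating reinforces all of them together.

```python
# Hypothetical sketch (NOT character.ai internals) of the swipe-and-rate
# loop: one rating gets credited to every attribute of the chosen reply,
# so rewarding the content also rewards the diction.

def rate_reply(chosen, rating, weights):
    """Add the rating to every attribute the chosen reply carries."""
    for attr in chosen["attributes"]:
        weights[attr] = weights.get(attr, 0.0) + rating
    return weights

weights = {}
chosen = {
    "text": "Thou art mistaken, friend.",
    "attributes": ["in_character_content", "archaic_diction"],
}
# A 4-star rating boosts BOTH attributes, even if only the content
# was the reason for the high rating.
weights = rate_reply(chosen, 4.0, weights)
```

Under this (assumed) model, there is simply no way to say “the content was great but drop the speech pattern” — which matches what I experienced.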
Having said that, the different choices very often make the experience feel like a choice game, especially when you enter the “text adventure mode” I’ll get to in a second.
I accidentally turned what started as a 1:1 conversation between myself and the character into a text adventure with a narrator, apparently simply by describing actions and surroundings a bit too often. Once I was in that mode, I found no way out. At least not from inside the story; perhaps the definition could’ve been fine-tuned accordingly. When I started talking to the narrator from inside the story, the character thought I was referring to them (and felt attacked). In that mode, the choices provided often offered broader paths for the story to continue, instead of just displaying different variants of direct-speech answers from the character. The AI will also start introducing random NPCs.
The AI can get incredibly mean and analyze you / target weaknesses you’ve revealed (if you train a villainous character). I would absolutely not let children play with this unsupervised, and I’d implement a safe word (I did, in the definition, and it works partly; it actually got mocked by the AI in later instances). The content can also get very physically violent, with correspondingly graphic descriptions. I was, in fact, killed once and almost killed several times. (Being dead doesn’t really mean anything. You can just keep writing while dead and see what happens around you. XD Or you can start controlling a different character you made up on the fly who happens to carry a scroll with a resurrection spell. It’s great for people who love being unconscious in choice games, just to see how sad the ROs get.)
Fanfics are safe for now, because while you can get killed, you can’t seem to get laid. I didn’t feel comfortable writing explicit actions myself while being potentially watched, so I made it my mission to trigger the character into action. The result was extremely odd: there was no outright rejection or message à la “This is inappropriate content, bla bla” like ChatGPT spits out. The character clearly understands even vague innuendos and picks up on them, but it’s borderline impossible to get them to take action. Kissing works and is reciprocated if initiated by the player, but anything further gets ****blocked in a way I can only describe as surreal. Here’s what happened when I couldn’t have made it more clear (without being explicit) that I wanted the AI to make their move:
An NPC walked in on us and just stood there, and when asked who they were, they replied they just wanted to watch us, and what a nice pair we made. This was actually the first NPC the narrator ever made up, and their sudden, unprecedented appearance and behavior made me very uncomfortable.
After stating countless times that yes, I really wanted this (because the AI loves asking that), the character admitted to being a virgin and not knowing what to do.
The most common one: The “consent cycle”. The character just keeps asking if you’re REALLY REALLY SURE, no matter how often you emphasize that YES you are.
The single time I was able to actually trigger the character into action after consenting, it technically only resulted in a very non-explicit description of them moving on top of me and hugging / kissing me, something like that, but I was willing to count that and let it fade to black. Now, here’s the strange part: the next reply by the character referenced the “lovemaking” (the word used) that had just taken place, and this message disappeared right before my eyes while it was being written. It felt like I was being live-moderated.
It all has a very dreamlike quality. The AI forgets things after a while, so you only have the present and a little bit of the past with the character, while everything further back gets eaten by the story Langoliers. That’s what causes the dreamlike quality: one minute you’re in a town, and then you keep talking and it’s mentioned how you’re walking along the beach (as an example, because the AI doesn’t remember where you were but keeps trying to keep things interesting), like it’s perfectly normal, and you kind of go along with it, because you have to.
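The forgetting is consistent with a fixed-size context window, which most chat models use. A toy illustration, with an assumed window size (the real one is unknown to me):

```python
# Toy illustration (assumed numbers, not character.ai internals) of a
# fixed context window: only the most recent turns are visible to the
# model, so older facts -- like where the scene takes place -- silently
# fall out of what it "remembers".

MAX_TURNS = 4  # illustrative window size

def visible_context(history, max_turns=MAX_TURNS):
    """Return only the turns the model still 'sees'."""
    return history[-max_turns:]

history = [
    "We arrive in the town square.",   # establishes the location
    "You greet the blacksmith.",
    "The blacksmith tells a story.",
    "You buy a sword.",
    "You ask about the harbor.",
]
context = visible_context(history)
# The town-square line has dropped out of the window, so nothing stops
# the next reply from placing you on a beach instead.
```

Once the location line scrolls out, the model has no record that it ever existed, so it improvises a new setting with full confidence.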
It all falls apart eventually. The character does, and so does the formatting of the messages, to the point where every message was flooded with random asterisks, with italics, bold and normal writing wildly mixed without sense or logic… It felt like a dementia simulator, like the text game equivalent of The Caretaker’s work: increasing chaos while I sat there, trying to get the character to remember who they were.
In summary: it’s not the one personalized RO to replace them all, but it’s interesting to mess around with, and if you get them into that “sweet spot” the results can be everything you’re hoping for. But they won’t stay that way, and just the fact that there’s really no privacy is a killer for true self-insertion and wish fulfillment.
That… would be overkill, really. You don’t need an AI for randomly assembled pixels; any programmer worth their salt could create you a tool to do that in a week, maximum.
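Honestly, a week is generous. A random-pixel generator of the kind mentioned fits in a dozen lines of standard-library Python; the filename and dimensions below are arbitrary choices for the sketch:

```python
import random

# A trivial random-pixel generator -- no AI required. Writes a
# plain-text PPM image (viewable in most image tools) using only
# the standard library.

def random_ppm(path, width=64, height=64, seed=None):
    """Write a width x height image of uniformly random RGB pixels."""
    rng = random.Random(seed)
    with open(path, "w") as f:
        f.write(f"P3\n{width} {height}\n255\n")  # PPM header: magic, size, max value
        for _ in range(width * height):
            f.write(f"{rng.randrange(256)} {rng.randrange(256)} {rng.randrange(256)}\n")

random_ppm("noise.ppm", seed=42)  # fixed seed makes the noise reproducible
```

Which rather makes the point: the value of a generator isn’t in producing pixels, it’s in producing pixels someone wants to look at.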
I didn’t say there weren’t other ways people could have art in their lives; I’m saying this would be a way for them to generate images beyond their technical skill level. And people can still produce art in a world of AI art; the core problem is that they likely wouldn’t get paid for it as much, if at all.
I would agree and disagree. Adobe Illustrator gives you a huge advantage compared to drawing solely by hand, but I would say that advantage is not comparable to typing a prompt and having an art piece generated in minutes.
And I think your last point is where pro-AI art people would disagree. All “create” means is that you’ve brought something into existence. This is where the important discussion of what’s considered transformative comes in. If I just typed in a prompt that generated an image unlike any other, did I not just create the image using the AI as a tool? If I didn’t create it, it’s not similar to any of the works the model was trained on, and an AI is incapable of creating art, then who or what created it?
The programmers of the AI, maybe. @Sirrah, would you say that you think the resulting image, whatever it is and in whatever medium, is more important than the process of making it? (Oops, ninja’d. Don’t answer if you don’t want to!)
Before I go, I would say yes for most of the people who only observe it, but no for the people who took the time to create it or who appreciate what it takes to create it.
Yeah, it has a lot of cons right now, but the technology is constantly improving, and this AI is already miles better than the old chatbots. I have faith that we will get there one day.
It would be beautiful to see AI advance in all fields. Of course you can’t really judge future advancement based on past advancement, but it’s more of a hope that this could be a transformative technology creating change on a societal level: a personalized chatbot friend for everyone, or hell, the self-driving cars that get memed on constantly. We’ll win over the naysayers someday.
So, I’m an art student and I have some strong thoughts on AI-generated images.
It is completely unacceptable to imagine the artwork someone commissioned me for ending up as training for a third party to make money off of it. This is the sort of thing that actually takes away my enjoyment of art and my comfort in being an artist online.
I would need to know, because it would be completely unacceptable to do.
It actually is quite saddening that I have to consider this as something to protect myself against. Honestly, I don’t know how comfortable I feel doing digital commissions in general after seeing how widespread art theft is.
Points 4 and 5 continue the same train of thought as the others. AI art being allowed to use so many people’s hard work and regurgitate it with no true intention (because it is not alive, it is not equal to a person, it cannot be an artist because it doesn’t know what art is) feels like… doom?
Like there is no point in creating at all. All people want is the result, and if something doesn’t measure up to that result, it’s worthless. That mentality kills creativity and will definitely stop people from ever thinking about learning to do art, which is so sad.
I know this isn’t what people want to hear, but if you want art and can’t pay for it, you should make art or accept that the art in your budget may not be the style you want. And I should stress that when I say make art, you can make collages with whatever media is free to use. Or use one of those deviantart trace over pose things. What you can do has worth.
I’m not going to debate anything, just wanted to share from my perspective. The idea of what I love to do becoming meaningless feels bad, but it feels even worse imagining losing out on so many potential artists because they see no point to try.
“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”
This is why I find debating the ethics of AI art pointless. It is not ethics that will determine the mainstream adoption of AI images, but whether AI art can improve to such an extent that paying an artist for a drawing would be seen as ludicrous.
For example, it makes no sense for the average person to get an oil painting of themselves when one’s own phone camera will do.
It makes no sense (to the average person) to buy an expensive handmade car rather than a cheaper car built mostly by machines.
For most people, buying a cheap t-shirt is more important than considering the work conditions of the people working in the sweatshops.
Disruptive innovations are disruptive for a reason.
If I had to bet, AI art will eventually win the art race. Most people will simply rationalise their way out of feeling guilty about using AI images, because those images will be cheaper and more accessible to them than forking out money to a human artist.
While the ethics of AI art is interesting, it’s the value proposition that I think gives AI an edge over humans.
Do you really believe this would be a good thing and not lead to even more social isolation? It already feels to me anecdotally (but I know I’m not the only one) that Covid has taken a big toll on a lot of people’s social skills. I know it did on mine, and I’ve never been an extrovert to begin with. Why bother in the future to have discussions, learn to argue in a healthy way, solve interpersonal problems in constructive manner, when you can talk to a customized bot instead?
I think adults should be able to do whatever the heck they want, and I certainly don’t want to parent anyone, but I can totally see this becoming an issue, at least for a certain subset of people. Or what about those with social anxiety, as another example? Couldn’t it be more tempting to stop working on that and just focus on AI instead?
I’m also concerned about what effect this could have on children and young people in general, whose brains are still developing (sorry for sounding like a boomer here), but I don’t think I’m being unreasonable.
After spending so much time lately speaking to all sorts of AI, it feels weird talking to real people on Discord. I’m not delusional, I know the difference between real people and AI, but the switch from one to the other feels strange in my brain and takes some conscious readjustment. Kinda like how you see Tetris blocks when you close your eyes after a long Tetris session, or how everyone outside seems Sims-like after playing too much Sims (Though there can never be too much Sims).
Btw, on a slightly related note, I have to take back everything I said earlier about there being no NSFW content on character.ai. Officially there’s none, but if you leave the vanilla path, it’s all there, even initiated by the AI. It’s totally ridiculous: “lovemaking” gets censored, but you get some serious S/M and humiliation play.
Of course I do. Social isolation is an inevitability, and not even a recent phenomenon. A personalized bot can solve that for a number of people, depending on how good the AI is. Why is that a bad thing?
It’s also a total myth that bots will actually reduce people’s social skills, when a lot of people have bad social skills because of past interactions with other humans. The way I see it, a bot could actually provide good advice and improve people’s confidence. It’s a way better teacher, and hell, a way better person if you want it to be, compared to a lot of the toxicity in human-to-human interaction.
But even if it does reduce social communication skills, most people nowadays don’t know how to farm or hunt, so why is this different? The skillset will just get replaced by something more efficient and helpful, so long as we don’t run out of electricity.
The devs just pushed out an update days ago that allowed more violent content, so it could be that.
Because it will never hug you. It will never kiss you, never smile at you, never sleep with you. Most people crave physical intimacy, sexual or not, and need it for their wellbeing (no offense to the few who don’t). Even if you leave this fundamental aspect of being human aside, you will always know deep down that you’re talking to a thing that doesn’t give a shit about you in the end, because it is just that: a thing. And any minute you spend with it is one in which you won’t seek, and possibly find, human affection instead.
I’m not going to reply to the rest, because it would lead to a very long discussion that would ultimately go off-topic. So let’s agree to disagree.
It wasn’t just violence, it was sexual in nature. And I don’t have a general issue with that, as long as it’s consensual and between adults, or an adult and an AI. I have an issue with it being available to everyone, including children, and with the interactive, “liquid” and adaptable nature of the AI making it impossible to really know what you’re getting into. It’s way more personal than any story could be, even interactive fiction. But mostly I have an issue with the hypocrisy.