Could always try character.ai instead, they don’t require a phone number I think
But it probably won’t help you with writing since they output mostly short responses; still helpful for testing out the capabilities of more recent NLP AIs
Yeah, looks like they’re not being particularly ethical (and probably not lawful either) about training the AIs. Just had a look, and if this website is correct, photography of mine that has not been given permission to be used (it is under copyright) has been used to train AIs. Kinda sucks, really, that they’re trawling sites they know full well they don’t have free access to images on, just to gather material to base the AI creations on. Can’t see why it’d be any different for writing either.
They’re not really stolen signatures; the AI just thought it should come up with its own alien signature or watermark, since that’s what it learned during training: that some pictures come with a watermark or signature. You would be hard pressed to find a real-world signature that’s similar to the AI output.
It’s very rare in my experience using the tool, so it’s not a huge problem. I know Stable Diffusion uses an algorithm to prevent that.
As the article below states, another domino falls:
The fact that a services company such as Adobe is doubling down on this issue shows that they feel that they can outlast any challenges by independent artists … I wonder what will happen once someone trains their AI with Disney art exclusively …
This Forbes article delves into this issue better than I can:
The following is what I believe as well:
This last quote is what I see happening the most and soonest.
Some fellow students at my uni were discussing a chatbot that was producing perfectly serviceable software. The thought is… scary, to say the least (it could not only cause job losses, but also end up in a situation where no one knows what the software does, including security holes no one can fix), but I still wouldn’t say the research is inherently bad (stealing is, of course) - it has also produced good things (TTS, speech recognition, certain sorting tasks). That doesn’t mean it should be applied everywhere, though. And definitely not with stealing stuff (although I would like to believe that’s more a problem with commercial than purely academic research… but who knows).
I… can hardly see how auto-sorting a photo album could cause the end of the world as we know it. (Now, auto-generating critical software certainly could. But that is an implementation. I think Dr. Malcolm is against creating dinosaurs, not finding out how they’ve been built.)
It was a joke.
Sorry. Hit a nerve. (As a person who intentionally selected an AI that doesn’t farm user data for her project.)
What kind of AI is that? In my experience, AI needs to farm data to train neural networks. You can pay people for “high”-quality data, of course, but the principle is the same, and it’s not always practical.
And I don’t think AI grabbing data off the internet should be considered stealing, either in a legal or a moral sense, at least currently. The AI is not copying the artwork and claiming it as its own output, but learning the attributes that make up the artwork, like every human does. This certainly falls under fair use, and no one is challenging that in court.
But from what I’ve seen, both artists and tech people like myself are very biased and really don’t budge in our opinions regarding AIs. Guess we’ll just have to wait and see.
TTS/STT. It doesn’t produce new stuff, it just converts between speech and text. (I obviously don’t know what it’s trained with, but unlike the big commercial ones, it doesn’t grab your data without you knowing.)
I mean, even for speech-to-text, you need a large amount of training data in order to make a good AI. If your problem doesn’t need that, then it’s fine, but when you’re talking about AI that can talk like a human, it’s kind of unavoidable.
The best STT on the market is probably Whisper, and yeah, I’m a little biased toward open-source stuff, but even it is trained on a large amount of public data gathered from the internet through web crawling.
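For anyone curious, here’s a minimal sketch of what running the open-source whisper package locally looks like in Python (assuming openai-whisper and ffmpeg are installed; the audio file name and model size are just placeholders):

```python
import whisper  # pip install openai-whisper (ffmpeg also required)

# Load one of the pretrained model sizes; "base" is a reasonable default.
model = whisper.load_model("base")

# Transcribe a local audio file (placeholder name) to text.
result = model.transcribe("audio.mp3")
print(result["text"])
```

And that’s kind of the point: the pretrained weights you download here already encode whatever crawled data they were trained on, so running the model locally doesn’t change where the training data came from.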
“Trained by public data” isn’t what I was talking about, though. I have no issue with that (as long as it’s not done with data that wasn’t given permission for such use). But I’m getting way too off topic, so I’ll stop now.
For the artists in the forum, I also have some questions. Assuming that AI art continues to be developed and evolves over time, what sort of changes would you need to see to consider its use ethical? I understand why using someone’s art as a seed without permission could be unethical, but the two key questions I have are: what is the moral difference between AI art and human-inspired works, and at what point is inspired art transformative enough to be considered original? For example, if someone were to hand-create a drawing that was perhaps stylistically or content-wise influenced by 100+ other drawings, would that be unethical? Not saying that most AI art is using that many seeds, but let’s assume the tools have already advanced enough to be able to. At that point you wouldn’t be able to distinguish between the originality of a human art piece and a generated AI art piece. Is it the direct use of data/pixels from one of those artworks that makes AI art unethical? Or is the AI just not advanced enough yet to be used ethically? I mean no disrespect towards the work of artists; I’ve just never questioned the ethics of AI art before seeing this forum, so I’d love to hear your thoughts.
To borrow an example from a friend:
if you prompt 5 people to draw the same topic, you’ll get 5 different artworks. Each version is coloured by how the artist sees the prompt, how they treat it, and how they interpret it (in that sense they are all unique). And that process of making is the result of years of training, thought, and hard work.
That being said, you decided on the prompt. You gave them the prompt to draw the thing. You want the credit for their drawing because you ‘thought’ of the concept in the first place. So you take the artwork you like best — without compensating the one who actually drew the piece — scrub off the signature, maybe tweak it here and there, and call it yours.
And that’s where the heart of the issue regarding AI art is.
The bottom line for AI art is that the ‘referencing’ it does happens largely without the consent of the owner. On top of that, you can type in the actual name of the artist whose style you want the machine to replicate, and there you have it.
Like, the infamous example is Greg Rutkowski — and he’s just one of many artists whose art is being used without permission.
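Just to make that mechanism concrete, here’s a sketch using the Hugging Face diffusers library (not necessarily the exact generator anyone here is using; the model ID and prompt are illustrative assumptions). ‘Typing in the name’ is literally all it takes:

```python
# Sketch only: a public Stable Diffusion checkpoint via Hugging Face diffusers.
# The model ID and prompt are illustrative assumptions, not a specific service.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Dropping an artist's name into the prompt is enough to pull in their style.
prompt = "a misty fantasy castle at dawn, in the style of Greg Rutkowski"
image = pipe(prompt).images[0]
image.save("castle.png")
```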
What people are effectively saying is that ‘we think that what you do is cool and amazing but not enough to pay you for it.’
[Because that’s what it is. Why would you pay x amount of money for an artist when you can get it for free, or for a lesser price — aka commercial licensing — with whatever AI generator you’re using]
But wait. What if someone wants to pay the artist, but doesn’t have the money? Unfortunately, you’re still not entitled to their labour. You don’t go to a store and take something just because you want it without (the possibility of) legal consequence.
You can make fun of the hands and other uncanny elements, but at the end of the day, you’re looking at the devaluation of the talent, skill, hours, etc. etc. to get a product ‘cheaper.’ Because that’s really what it is. As far as I’ve observed, people who like AI art are looking at it as an optimization of a process to get a product. They don’t really care about who does it so long as they have their piece (and if they can get it for free, all the better).
You can debate until the end of the day and back whether AI works are transformative enough to be called ‘original’ (for the want of originality anyway) — but what you’re looking at is how quickly people discard artists if it means ‘free’ art.
If that’s not a deterrent to future artists and other creative professionals, then idk what is.
[Why go into art when a machine can do it for free? People have to eat too, and that includes those in the creative fields — and they’re undervalued enough as it is]
I agree that without the human foundation of artistic talent, these services wouldn’t be able to operate, and that shouldn’t be forgotten. I also agree that the way images are currently being taken from artists without consent is far from ideal.
However, I disagree with the notion that originality isn’t relevant, when that’s pretty much one of the core reasons we’re advocating for human art. From your response, I gather that you consider AI art immoral largely for two reasons: the fact that generations are sourced from original artworks without the consent of artists (causing those artists to lose money), and the fact that you consider the changes an AI makes to be simple tweaks rather than the creation of original art.
I agree that these two reasons are justifiable reasons to not support AI art as of now. However, I am more so asking at what level of advancement it is no longer immoral. If AI algorithms evolve to the point where they are able to generate images incomparable to anything else after viewing not just hundreds or thousands of works but vastly more, would you consider those images mass-scale theft? I suppose to some extent it is, but even humans would be guilty of that at some level. We are almost incapable of imagining something we’ve never seen before: what we do create is always just a unique amalgamation of things we have seen. And most of the time, our works inspired by others are still considered original. If you see the generated images now as scrubbing off the signature and tweaking it here and there, I’ll probably understand your point if the generated work is far too similar. But I’m asking what happens when AI art evolves to the point where that’s no longer the case. At what point would the AI be emulating someone’s art, like a human would, as opposed to stealing it?
Even with those arguments aside, I understand that AI art has the danger of making conventional art decline as a career, and there is something sad about losing the human touch of creativity. But realistically, the genie is already out of the bottle when it comes to the technology. All we could hope to do in the short term is make sure these pieces are sourced more ethically, but the technology isn’t going anywhere. Unless it’s strictly regulated or made illegal, it would be near impossible to enforce copyright laws on a company creating a generated art piece that has so many sources sampled that you couldn’t even discern the original works. An even more confusing copyright scenario: what if AI art is generated from other AI art that has already been through multiple iterations? That’s why it’s important to figure out now what boundaries should be set in the near future and what we define as an original art piece.
Finally, I don’t think the technology itself should be looked at solely as a way to skirt paying artists simply because that happens to be its worst byproduct. I think the art industry is fundamentally changing, but in a company where pieces are ethically sourced, that could be a good thing.
You can probably tell I’m a fan of the idea behind the technology, but I understand your concerns, and you have made me aware of the ways these companies might be causing damage with underdeveloped technology. Anyways, thanks for having a discussion with me; this is a really interesting topic.
As long as the AI’s creations are derivative of others’, I feel it will always be immoral.
Even at the most advanced stages imaginable, AI algorithms are “creators” that are not inspired by others, but controlled by them.
Humans are inspired, not controlled, by what they take in when creating… a forger or a copyist is never seen as ethical, and I maintain that AI should be placed in the same creative category as those artists.