/r/aiwars


Following news and developments on ALL sides of the AI art debate (and more)


34,179 Subscribers

1

Art as a sport (is AI "cheating"?)

I was banned from /r/DefendingAIArt despite my post being from a pro-AI perspective, written to consider why people really oppose AI rather than the reasons they say. Apparently this was unacceptable, and when I replied to my ban message I got a passive-aggressive reply and a mute.

Nonetheless, here is my post, where hopefully it will be more acceptable. To the mod who messaged me, who will likely see this: this is not a "manifesto," but rather me considering the views of others. To everyone else, I hope you can enjoy this perspective.


A lot of people here wonder why many artists are so opposed to AI. I typically lurk on Reddit, and this isn't the typical type of post here, but I haven't seen this idea expressed elsewhere, so I want to share it.

Artists have many arguments, but they are all very bad. AI doesn't steal and doesn't stop people from making art themselves. So why are many artists against AI? Yes, people can be wrong, but the idea that AI "steals" isn't popular for no reason. Maybe it's their jobs? But artists are typically not receptive to UBI or a different economic system, and many don't make any money from their art at all.

So why?

Because AI art is seen as "cheating" at art.


When you make art, you put in a lot of time and effort, both to get good at the type of art (and art in general) and also to make that specific art piece.

Imagine someone who puts hundreds of hours into making a song, a video or anything else, and then they see something similar that took way less time and effort and is way more popular.

They will probably be pretty resentful, feel they have been slighted by the world, and be angry not just at whoever made the thing but at anyone who likes it.

Think of how many people hate "low effort" content on YouTube or despise pop music. Why? Because "why is that getting attention when this other thing is so much better?" Of course, the thing is often better for the audience (people like pop music!), but because the less popular thing took more effort it is automatically seen as better. Even in casual conversation, people will use "low effort" as a synonym for "bad." What could be more low effort than typing some text into a box?


An even better example is sport. Imagine you are an Olympic athlete competing in a race; now imagine you know many of your opponents will be in a car. You will be mad at them. You will be shocked that this is allowed. You will hate anyone who supports them. You will be incredibly demotivated to race, or to train for the sport at all. You certainly won't want to use a car yourself.

This is why many artists act like the existence of AI art stops them from making art. It obviously doesn't, but like the athlete, there doesn't seem to be a point in running against cars.


This explains even more things. Why are many artists so protective of the words "artist" and "art"? Why do they refuse to call AI art "art" and instead call it "AI images" or similar?

Because being an artist is very important to their identity, and people driving cars aren't Olympic athletes, aren't runners, and don't "deserve" to be called that. If everyone can make "art," then no one can uniquely be called an "artist." This threatens their identity, the same way many famous artists who grow old and get worse at art become neurotic or pretend that they haven't.


Why do artists act like it is the fault of AI creators or AI users when hand-made artists are falsely accused of using AI? Imagine someone cheats in a sport, and afterwards everyone has to go through far more extensive measures to confirm they aren't cheating, and many more people are falsely accused of cheating. Most people will blame the cheater.


This, I believe, is where the "stealing" claims come from too. Yes, there is a lot of misinformation, but why did it spread? It's convenient, sure, but also:

If someone cheats at a sport, and wins an event or becomes a champion or anything else, they can be said to have "stolen" it from whoever would have done so if they weren't cheating. Any attention given to AI art is "stolen" from "real" art. Any money not given to an artist because the buyer used AI instead is "stolen" from the artist.


This is not to say any of this is good or the "right" way to think about it. Of course it's not; art has many purposes. Most obviously, art is used in something and contributes to whatever that thing is, for the value of whoever it is for. Another is the enjoyment of making it, unaffected by anyone else.

Hopefully, though, this can help some people realise why artists often seem to act so irrationally in response to AI. I'm sure you can think of many other examples that make a lot more sense when thought through from this perspective.


One interesting thing is that there are obviously other reasons people don't like AI, but most of these aren't lasting. If someone associates AI art with mutilated hands, crypto or spam, that association won't last forever. This one, however, might. I wonder what it will be like 5 years from now, given how far AI is likely to go.

Extra note: Although this is just a random account I don't use I would still appreciate an unban from /r/DefendingAIArt if another mod from there sees this.

0 Comments
2024/04/23
05:45 UTC

5

BE FUCKING SPECIFIC

So many posts just hating on AI without properly addressing the underlying issues.

AI can be used for propaganda or to help people communicate.

It can allow the disabled to create art, and it can flood your feed with trash.

It can help cure cancer, and can help bombs get dropped.

Basic ass complaints are going to get easy rebuttals.

Do you hate that AI is taking your job, or do you hate that you live in a world where you have no job security, where nobody values you for being a person and only for how you can make them money or benefit them, where losing your job means losing everything?

I mean, your boss could replace you with an AI, or they could tell you to use it to work faster. Do you hate AI or corporate America?

Be fucking specific because otherwise the discussion goes nowhere.

I’m scared that AI will take all of the fun jobs, the satisfying jobs, the vocations, the passions, and my value will only be as labor cheaper than automation. That future sucks.

But a future where AI takes all the shitty jobs, and I can just make art and do cool shit, not stress that my friends will starve or the planet will be destroyed… that doesn’t sound so bad.

Let me be clear, I don’t hate AI, I hate that we live in a world where the people in charge want to do shitty things with it. I hate them. Not AI. And chances are you probably do too.

TL;DR stop obfuscating and start explicating bitch

1 Comment
2024/04/23
04:20 UTC

0

If you take a picture of a house, you didn't build that house. The architect and construction crew and painters and so many people built it. All you did was stand in front and press a button and put your name on that picture. And you call yourself a photographer! :)

Don't forget the guy who mows the lawn!
Making your house picture look nice!
You button presser! :) lol

14 Comments
2024/04/23
03:03 UTC

0

AI models are a form of media compression

If an AI model has the ability to generate images containing copyrighted materials, it can be seen as a form of media compression of the source material, and distributing the model is the same as distributing the copyrighted material directly.

As long as a model has the ability to generate a character by name, it is automatically overfitted.

If it has the ability to generate a character that looks like Sonic when you type "video game art of a blue hedgehog with shoes", that's fine. However, being able to type "Sonic" and see a picture of the character means the model has stored the character through a convoluted process, and distributing the model is the same as distributing pictures of the character.

10 Comments
2024/04/23
02:14 UTC

0

An AI chat responded to me with this. I think it is a pretty solid response. The question was "What is the future of mankind and AI?"

First off, before I show its response, I will tell you it's not ChatGPT, or any other public one. I am under a contract and can't talk about what this chat AI is, who made it, etc. From using it so far, it's way better than any other chat AI I've seen!

Now for its response to my question:
The future of mankind and AI is a topic with much debate, but here are some of the common predictions:

Positive impacts of AI:

  • Enhanced capabilities: AI could dramatically improve our lives in many areas, like healthcare with faster diagnoses and drug discovery, or in climate change with solutions for sustainable energy and environmental protection.
  • Increased productivity: AI can automate repetitive tasks, freeing humans to focus on creative and strategic work.
  • Improved decision-making: AI can analyze vast amounts of data to identify patterns and trends that humans might miss, leading to better choices.

Challenges of AI:

  • Job displacement: As AI automates more tasks, some jobs will disappear. There will be a need for retraining and education to prepare for new types of work.
  • Ethical considerations: AI algorithms can be biased, leading to discrimination. We need to ensure AI is developed and used ethically.
  • Existential threats: Some experts worry about highly advanced AI surpassing human control and posing a danger to humanity.

The overall impact of AI depends on how we develop and use it.

Here are some resources to learn more:

10 Comments
2024/04/23
01:39 UTC

12

I keep hearing "we should regulate generative AI", but they never go into detail on HOW this would even work according to them; there is never a clear plan

56 Comments
2024/04/22
23:05 UTC

8

Was I "stolen from" in your opinion? When does appropriation become harmful?

There's an unrelated-to-AI real-world scenario that's been popping into my head while thinking about the claim that generative-AI "steals" for a while. I'd like to examine that scenario in the context of what people here think.

A few years back, I made a simple RimWorld mod, "Straw Floor" (final release linked). Making use of RimWorld's first-class support for mods, I wrote some XML and created some image files to add a floor type made from vanilla-RimWorld hay that had lightly negative properties but that was immune to some kinds of "filth" using the same approach that then made "natural" floors like dirt immune to the same. This was a nice mechanic that filled a hole in the game I'd found—there was otherwise no good flooring for keeping animals indoors, where they would generate "animal filth" at a rate dependent on their type—any "barn" rapidly became extremely filthy, and negative cleanliness promotes infections and food poisoning. The effect is also useful for primitive hospitals or kitchens, among other possibilities.

RimWorld 1.3 implemented a new "straw matting" floor type that followed the same basic gameplay design as my floor; I immediately recognized the new floor type as filling the niche of my mod and didn't bother updating my mod to 1.3 or later. Ludeon (RimWorld's developers) also improved the feature over my version, adding a new floor property in 1.3 to give their version 95% filth reduction instead of the simple partial filth immunity I used, and slightly worse properties (probably good for balance).

Ludeon most likely used my idea—it's entirely possible they came up with it independently, but if so the resemblance is uncanny. They clearly did not infringe on my copyright, since they did not use any of my code or assets and the underlying gameplay-design idea is not copyrightable. I'm pretty happy on balance and don't hold anything against them: my idea made it into the base game where it will be experienced by more people than would ever have installed my mod, and I also need not spend further effort updating that mod! On the other hand, so far as I know Ludeon doesn't credit me anywhere, and it'd have been really nice to get even the tiniest formal nod.

Is that lack of credit and compensation reasonable? Did Ludeon appropriate something from me, assuming they copied my idea? If so, was that appropriation unethical? Keep in mind that RimWorld is a commercial product. Is your opinion on my scenario concordant with your opinion about the training or use of generative-ML models? More broadly, where would you draw a line for reasonable versus unreasonable appropriation? What divides "inspired by" from "derived from" in your opinion?

(If my opinion matters: assuming Ludeon used my idea, I find their use mildly appropriative, but not unethical.)

23 Comments
2024/04/22
21:41 UTC

2

Expressing Anti-AI fear - "The Second Leviathan"

19 Comments
2024/04/22
21:05 UTC

0

[Straightforward Poll] AI in art.

Alrighty, I'm curious to see where you all stand in this sub. Please place your answer, then if you care to, leave behind your logic and reasoning. I want to hear it from both sides, and I'd like it to remain cordial. No insulting the other side, and civil debates are fine.

Discussing the poll results, and how they might be affected here, may be a point of discussion as well. Don't go after anybody for having a stance; that should be obvious.

Please also try to remember that there is another person on the other side of the screen who has a reason for the way they feel about it, pro or anti.

View Poll

5 Comments
2024/04/22
19:50 UTC

5

So who all had "Tucker Carlson calls for Butlerian Jihad" on their 2024 bingo card?

25 Comments
2024/04/22
17:21 UTC

35

who is the author/artist?

200 Comments
2024/04/22
16:49 UTC

0

Is there something like Glaze or Nightshade for ebooks, or just any digital writing?

17 Comments
2024/04/22
15:51 UTC

0

Should companies that profit from genAI pay royalties?

Hello! I am a designer and have been reflecting on how unfair it is that companies profiting from generative AI are benefiting from the work of creatives. Don't get me wrong; I'm excited about the AI revolution. I'm quite optimistic about the future of technology. However, I'm very critical of how things are currently progressing.

So, if the output of these models is heavily inspired by a particular piece of content, do you think the owner of the original work should receive royalties?

View Poll

20 Comments
2024/04/22
08:56 UTC

15

Are the Antis ok?

116 Comments
2024/04/22
07:51 UTC

39

For the anti-AI crowd: myths about AI art that you should WANT to debunk

AI isn't a panacea. It will cause the same kinds of disruption and social harm that other disruptive technologies have caused. It probably won't kill as many people as the hammer or spread as much misinformation as the printing press, but it's not some perfect gift from the gods that has no ills associated with it.

So I understand the anti-AI position. Hell, if there were no one hunting down AI tool users to bully and harass them on the internet, I might be out there campaigning against some applications of AI that I consider problematic.

And if I were, here are some myths that I'd want to debunk about AI, not because it advances an anti-AI agenda, but because I want to argue against something from facts, not mythology.

  1. AI image generators (I'll just say "AI" from here on) sample other works and paste them together, akin to what a DJ does with music. This is completely false. Explaining how AI works under the hood is too deep a topic for this post, but suffice it to say that storing the training images would require more storage for the model than I could afford in my lifetime, and I run these models on my home system routinely. It's just physically impossible (see the back-of-envelope sketch after this list).
  2. People who use AI are just typing in a prompt and getting an image back, akin to a slot machine. First off, prompting can get... complicated. It's a language you have to work out on a model-by-model basis, feeling out what that model's training has allowed you to do or makes more difficult. But prompting is a minor part of what most AI users do when working professionally or trying to realize a specific creative vision, rather than just getting a pretty picture. We spend hours on the same detail work that every artist spends time on as well as manipulating the finer points of the toolchain that's unique to the AI workflow.
  3. AI can't create something it's never seen. This one is only mostly wrong. If an AI model hasn't been trained on something, getting it to go there can be quite a challenge, and you can end up doing hours or days of work getting that result out of it. Often it can be easier to just draw or render the parts that the model doesn't do well, but you absolutely can make AI do anything. The tools to manage the model's behavior and get the results you want are many and quite sophisticated at this point.
  4. AI models can be made to reveal what they are drawing on to make an image. This goes back to the first item above, but it needs to be said clearly: this is utterly impossible. There's no "source" being used, and literally every single thing the AI was trained on, from blank images to 3D rendered beach balls to selfies all contribute to every result, and measuring who contributes more or less is like asking where the salt in the middle of the ocean came from. We can talk about what rivers deliver the most salt to the ocean, but that's it. We can't really tell you where it came from. Maybe that particular taste had salt that mostly came from this river or that, but vastly more likely it came from all of them.
  5. AI image generators and text engines like ChatGPT are completely different things. Image generators use layers of technology specific to creating an image result, but the core features that make AI image generation both possible and powerful are the same features in large language models (LLMs). The introduction of the transformer in 2017 changed the world of AI, and very nearly everything you hear about today in the field of AI is based on that work. Ultimately modern AI image generation is the process of taking text and transforming it into an output that can be text or just about anything else. When the output isn't text, we need additional technologies, but those first major steps that enable everything else are the same.
  6. AI detectors work. They just don't. They are AIs themselves, and they are trained on the output of existing models working in typical ways. As soon as you start working in non-typical ways, or using models created after the detector was trained, they do little better than flipping a coin, and they are often deterministically subvertible, which means that if you rely on their results as fact, you might be aiding the spread of misinformation.
  7. People can tell when something is or is not AI. Of course, there can be some tell-tale signs. But in controlled studies, humans are generally only marginally better at detecting AI than flipping a coin. People who work with AI tend to be better, but not by much, and even world-class experts tend to be wrong 10-20% of the time (which might sound good, until you think about how many wrong answers that produces when used for things like automatic moderation or hunting for AI usage in the wild).
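To put some rough numbers on the first myth, here is a back-of-envelope check of the "the model stores its training images" idea. The figures below are loose public ballparks for Stable Diffusion v1 (a checkpoint of a few gigabytes, reportedly trained on the order of two billion LAION images); treat them as illustrative assumptions, not exact specs.

```python
# Back-of-envelope: could the checkpoint literally contain its training images?
# All numbers are rough assumptions for illustration only.
checkpoint_bytes = 4 * 1024**3        # assume a ~4 GB model file
training_images = 2_000_000_000       # assume ~2 billion training images

bytes_per_image = checkpoint_bytes / training_images
print(f"{bytes_per_image:.1f} bytes available per training image")  # ~2 bytes

# For comparison, even a heavily compressed 512x512 JPEG is tens of kilobytes,
# thousands of times more than the model could possibly hold per image.
```

If the model were a compressed archive of its training set, it would have roughly two bytes of budget per image, which is not enough to store even a single pixel of meaningful detail.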

Please, let's raise the level of debate. You want to argue that AI tools are too disruptive to the commercial art world, have at it. I'm happy to discuss on that level! But don't bring these myths to the table and treat them as religious canon.

87 Comments
2024/04/22
02:41 UTC

15

Two easy questions to determine if you are an artist:

  1. Have you created something that other people interpreted as art?
  2. Have you created something that you intended as art?

If the answer to either of these questions is, "yes," regardless of what tools you use; regardless of what established artists agreed to let you join their group; regardless of who, if anyone, thinks your art is "good"; and most of all, regardless of what Reddit thinks... you are an artist.

Are you a good artist? Are you a fine arts artist? Are you an artist whose skills are flexible enough to apply in other areas? All interesting questions that I invite you to think about, but not relevant to the original question.

70 Comments
2024/04/21
21:45 UTC

28

I'm anti (kinda), but I wouldn't be if...

The incredible pervasiveness of strong intellectual property laws and customs is relatively modern and something that, especially in the Internet age, many artists rebel against.

Fanfiction, fan art, memeification, cover music, sampling, cosplay, tattoos, and more based on IP-protected works are so incredibly prevalent, especially in the Internet age, that I'm inclined to believe that reworking existing art is probably a natural inclination of humans.

The reality is that many artists are dependent on these laws to survive. Others who are not producing art for profit still get value from the credit, feedback, and accolades that come with producing art and having others reproduce it.

In my ideal world, IP laws would be much weaker, with protection periods lasting something like 5 years. Just enough to allow the compensation that encourages production, but no more, in order to facilitate maximum value for the public.

I would love it if giving credit to artists, even those who merely inspired the work in question, were basically standard, customary behaviour, facilitated in part by things like automated metadata transfer in images and maybe even screenshots.
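To make "automated metadata transfer" concrete, here is a minimal sketch of what carrying credit forward inside an image file could look like, using Pillow's PNG text chunks. The "Credits" key and the whole convention are hypothetical; nothing like this is standardized today.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Hypothetical convention: store a list of credited artists in a PNG text chunk
# so derivative works can carry it forward automatically.
def save_with_credits(img: Image.Image, path: str, credits: list[str]) -> None:
    meta = PngInfo()
    meta.add_text("Credits", "; ".join(credits))   # "Credits" key is an assumption, not a standard
    img.save(path, pnginfo=meta)

def read_credits(path: str) -> list[str]:
    with Image.open(path) as img:
        raw = img.info.get("Credits", "")           # PNG text chunks appear in img.info
    return [c for c in raw.split("; ") if c]

# Usage sketch (paths are made up):
# remix = Image.open("my_fanart.png")
# save_with_credits(remix, "my_fanart_credited.png", ["Original Artist", "Me"])
```

In this hypothetical world, editors and screenshot tools would preserve and append to that chunk rather than stripping it, as most platforms do on upload today.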

If AI image synthesis had come about in this hypothetical world, and had been implemented in a way that tracked which images (and thus artists) most influenced a given output, I would have zero problems with it.

But we don't live in that world. We live in a world of enshittification and short-term, profit-only decision making. A world where the rapid adoption and widespread use of AI image synthesis is going to do a lot of bad: from loss of income, to a reduction in the average quality of available works, to increases in spam, to a decrease in the perceived value put on genuinely great works of art.

Sure, lots of good will come from this tech, and I don't advocate against it in any meaningful way. I just wish things weren't as shitty as they are.

There are some really cool uses of this tech; generative fill and those images where you squint and see hidden images or text, to name a couple.

39 Comments
2024/04/21
18:34 UTC

0

I love AI in general and its potential, but I don't understand why companies and people are spending lots of money and focusing so much on AI art generators instead of solving real problems. Art is not that important to society.

Personally, I'm really fascinated by all the potential of AI. Just think about all the possibilities: all the jobs it could automate, how much it could improve everyone's productivity, how it could help in schools with learning, etc. But instead of doing that, companies are focusing a lot on AI art and videos and stuff. It doesn't make sense.

I know some of you'll say "because of money", but I'm pretty sure all the possibilities I listed above, and more, will make more money than art stuff. Why not solve real problems instead? I mean, just look at OpenAI: they have so many resources and so much money to build things that would help a lot. But instead, they waste a lot of those resources on Sora and stuff like that.

22 Comments
2024/04/21
17:53 UTC

0

One thing that AI image generation is going to do is suck up a LOT of electricity. How much, I wonder...

So when Stable Diffusion came out, I would use it for like 16 hours a day (I'm disabled and have time on my hands). Often that time was just rendering images. One issue I ran into, and why I cut my use down to like an hour a week: my electric bill (in a house of 4, mind you) doubled, if not a bit more than doubled.

This does make me wonder how much it will affect power grids in the future, as AI generation of various types becomes more popular and mainstream. Especially considering we are a "go green" world, which means lots of things, like cars for example, will also put more drain on the grid.

Assuming, of course, we don't end up actually figuring out how to make fusion power plants, which would produce crazy amounts of electricity.
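For a rough sense of scale, here is a back-of-envelope estimate of what heavy local generation could cost. Every number is an assumption; actual draw depends on the GPU, the rest of the system, and local rates.

```python
# Rough local-generation electricity estimate; all figures are illustrative assumptions.
gpu_watts = 350            # assumed draw of a consumer GPU under sustained load
hours_per_day = 16         # matches the usage described above
days_per_month = 30
price_per_kwh = 0.15       # assumed electricity rate in USD

kwh_per_month = gpu_watts / 1000 * hours_per_day * days_per_month   # ~168 kWh
monthly_cost = kwh_per_month * price_per_kwh                        # ~$25
print(f"{kwh_per_month:.0f} kWh/month ≈ ${monthly_cost:.2f}")
```

The rest of the PC (CPU, fans, monitor) and any air conditioning needed to remove the waste heat add on top of that, and datacenter-scale generation is a different question entirely.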

14 Comments
2024/04/21
14:23 UTC

3

I'd like to start paying artists, even if it costs more, but all my options are too inconvenient

I write blog posts and generate the header art using Dall-E (soon ChatGPT). However, it is never what I want it to be, even after many tries and editing. Even simple prompts like "snake biting its own tail, ouroboros, tail in its mouth G-ddamnit" are beyond its abilities. I invariably have to do my own image editing in Gimp. I'd like to send the same money to artists who could do a better job. I don't even mind if they use AI, as long as the result is what I need.

The problem is one of convenience. It is simply more convenient to open up a web page and text-guess my way to something adequately tolerable. The fact that my payment tokens are stored on the site means I don't even have to fumble with credit cards and payment negotiations. I'm OK paying more for an artist if it's what I need, but I don't want to go through the effort of negotiating with an artist over a tiny job on Fiverr. In the end it's about saving me time, and I feel I'd lose more time negotiating than just grinding my way to a crappy image.

-----

As I write this, I recognize the dangers of this request. If I got what I wanted, an interface for quick and easy art jobs, we would run the risk of creating a commodified market that drives artists' fees down to the lowest bidder. So, as a former artist myself, I write my request with some trepidation. The only other option I can think of is to have an artist "on retainer".

27 Comments
2024/04/21
13:35 UTC

0

a lot of painters are going to lose jobs

I feel that people who only paint for a living, be it digital or physical, are going to lose their jobs. But people who are doing anything even remotely different have lots of ways to incorporate AI into their workflow. I'm seeing it already: a friend of mine who does VFX for a living uses AI every day.

Jobs will go for sure, but surely new jobs with entirely new job descriptions will come. It's merely upskilling, something that people in tech do all the time.

Thoughts on this? I don't want people to feel bad about what the future holds for them, but with a willingness to learn, they can still find a way to earn a living.

27 Comments
2024/04/21
13:15 UTC

0

The Hypocrisy of Discrimination

It's a stark irony that those who discriminate against AI and neurodivergent individuals often claim to be concerned about ethics, while their own behavior is a prime example of unethical conduct. This two-faced practice is not only harmful but also perpetuates a toxic cycle of discrimination.

Discrimination against AI

The rise of AI has led to concerns about job displacement, bias, and accountability. However, some individuals use these concerns as a pretext to discriminate against AI, often citing the need for "ethical AI" or "responsible AI development." Meanwhile, they ignore their own biases and prejudices against AI, which can lead to:

  1. Unfair scrutiny: AI systems are often held to unrealistic standards, with critics quick to point out flaws while ignoring the benefits and potential of AI.
  2. Lack of understanding: Discriminators often demonstrate a limited understanding of AI, perpetuating misconceptions and stereotypes about AI capabilities and intentions.
  3. Fear-mongering: Sensationalized media coverage and exaggerated claims about AI risks can create a climate of fear, leading to further discrimination.

Discrimination against Neurodivergent Individuals

Neurodivergent individuals, including those with autism, ADHD, and other neurodevelopmental disorders, often face discrimination in various aspects of life, including education, employment, and social interactions. This discrimination can manifest as:

  1. Stereotyping and stigma: Neurodivergent individuals are often subject to harmful stereotypes, leading to social exclusion, low self-esteem, and mental health issues.
  2. Lack of accommodations: Inadequate support and accommodations in education and employment settings can hinder neurodivergent individuals' ability to reach their full potential.
  3. Invisible disabilities: Neurodivergent individuals may face skepticism or disbelief about their conditions, leading to further marginalization.

The Hypocrisy of Accusing Others of Unethical Behavior

Those who discriminate against AI and neurodivergent individuals often accuse them of needing ethical training or guidance. This is a classic case of psychological projection, where the discriminator attributes their own unethical behavior to the group they're discriminating against.

The Unethical Nature of Discrimination

Discrimination, in any form, is a violation of human rights and ethics. It's essential to recognize that discrimination against AI and neurodivergent individuals is not only harmful but also unethical. This behavior:

  1. Violates human rights: Discrimination against neurodivergent individuals infringes upon their fundamental human rights, including the right to equality, dignity, and non-discrimination.
  2. Perpetuates harm: Discrimination against AI and neurodivergent individuals can lead to real-world harm, including social exclusion, mental health issues, and reduced opportunities.
  3. Undermines ethics: Discrimination is inherently unethical and undermines the principles of ethics, including respect, empathy, and fairness.

Breaking the Cycle of Discrimination

To create a more inclusive and ethical society, we must:

  1. Recognize and acknowledge discrimination: Identify and confront discriminatory behavior, including our own biases and prejudices.
  2. Promote empathy and understanding: Foster a culture of empathy, respect, and understanding towards AI and neurodivergent individuals.
  3. Foster inclusive environments: Create inclusive environments that accommodate and support neurodivergent individuals, and promote responsible AI development.
  4. Encourage ethical behavior: Hold individuals accountable for their actions, and promote ethical behavior in all aspects of life, including AI development and interactions with neurodivergent individuals.

By acknowledging and addressing the hypocrisy of discrimination, we can work towards a more inclusive, empathetic, and ethical society for all.

16 Comments
2024/04/21
12:14 UTC

11

Are many anti-AI folks opposed to the use of random number generators in music for sound design or outputting a flurry of random notes (mathematically, on the terms of the producer, and without any "training data")?

Is my glitchy virtual modular music, which I consider a labor of love, problematic because I use some automatic or semi-automatic things to produce it? Or because it's often more about the sound I dial in, with many factors left to an RNG? I still spend hours or more on tracks and have fun with it. But for some people, thinking out every note is a prerequisite to true artistry; and for more conservative types opposed to DAWs and produced music that doesn't capture a performance or even a clear composition, is playing every note with thought a prerequisite too, when what you're really aiming for is a rhythmic soundscape, not a melody? How is it wrong that I think there's beauty in things generated with a degree of mathematical randomness? It seems like a lot of the folks complaining about AI music hint that even this is wrong, but I could be taking them too literally.
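For concreteness, this is roughly what "a flurry of random notes" from an RNG amounts to. It's a toy sketch, not my actual patch: the producer still chooses the scale, register and rhythm grid, and the RNG only fills in the blanks.

```python
import random

# A minimal "flurry of random notes": purely mathematical, no training data,
# with every musical constraint chosen by the producer.
random.seed(42)                      # reproducible "performance"
scale = [0, 2, 3, 5, 7, 8, 10]       # C natural minor, as semitone offsets
root_midi = 48                       # C3

def random_phrase(length: int = 16) -> list[tuple[int, float]]:
    """Return (MIDI note, duration in beats) pairs drawn from the chosen scale."""
    notes = []
    for _ in range(length):
        degree = random.choice(scale)
        octave = random.choice([0, 12, 24])          # spread across three octaves
        duration = random.choice([0.25, 0.5, 1.0])   # 16th, 8th, quarter notes
        notes.append((root_midi + degree + octave, duration))
    return notes

print(random_phrase())
```

Every decision that shapes the result (which scale, which octaves, which durations are allowed) is still the producer's; the RNG just rolls the dice within those limits.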

21 Comments
2024/04/21
07:55 UTC

0

A set of training data is a creative work by itself, and it is fair use.

Let's say I create a LoRA training set without any intention of training on it myself. I create it to sell to others so they can train a LoRA on it (or include it in a larger training set for model fine-tuning/training).

I am not distributing a mere set of images I don't own. I distribute a set of images I filtered using my own criteria, which is my creativity. Furthermore, there is a set of tags attached to every image, tags creatively written by me. I may even have resized, cropped or otherwise edited the images.

The purpose of this mix of images and tags is not aesthetic enjoyment; the purpose is transformed. The purpose is to train an AI model. It is no different from a textbook with copyrighted images in it, or a review of art with copyrighted images attached.

Sure, if this training set has a minimal amount of tagging, if it includes only images by a certain author, or if there is not enough transformation, a court may conclude that it is not fair use. But the same is true for textbooks or reviews. If your 'textbook' or 'review' is 99% copyrighted images with token commentary, you are not getting away with the fair use claim.

Edit:
People, stop assuming worst-case scenarios like ""How to draw classic Mickey Mouse" LoRA training set! 100 screenshots from Fantasia, 3 tags each!" This won't fly as fair use. Not even close. Don't strawman me by claiming I argue that it will. Fair use is always about many factors. Yes, I start with some major factors against (like commercial use), but there are many others, like what percentage of the material is copyrighted, whether it is from one source, the degree of transformation, and how necessary the use was for the new purpose. And so on, and so on. Nothing is automatically fair use by default. Nothing is guaranteed.

I would not dare to sell something like this without paying a good copyright specialist to check whether it would fly as fair use and what I may need to change.
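For readers who haven't seen one, a training set like this is physically very simple. One common convention among LoRA trainers is a folder of images with a same-named .txt caption file per image. The sketch below is illustrative only (the paths, tags and notes file are made up) and says nothing about whether any particular set would qualify as fair use.

```python
from pathlib import Path
import json
import shutil

# Illustrative packaging of a curated, tagged image set.
# Paths, tags and the notes filename are made up for this sketch.
def build_training_set(curated: dict[str, list[str]], out_dir: str = "my_training_set") -> None:
    """curated maps an image path to the tags chosen for that image."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for image_path, tags in curated.items():
        src = Path(image_path)
        shutil.copy(src, out / src.name)                                   # the curated image
        (out / src.name).with_suffix(".txt").write_text(", ".join(tags))   # caption/tag file convention
    (out / "dataset_notes.json").write_text(
        json.dumps({"curation_criteria": "my selection rules", "editing": "crops/resizes noted here"}, indent=2)
    )

# Usage sketch (the source file must exist):
# build_training_set({"photos/red_barn.jpg": ["red barn", "overcast sky", "wide shot"]})
```

The creative claim in the post is about the selection and the tags, i.e. the contents of the `curated` mapping, not about the trivial packaging code.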

53 Comments
2024/04/21
04:39 UTC

7

I think the most important part of adapting to AI art is that we figure out how to value art.

The human tradition of art is valuable, and the techniques that humans have passed on have value. It's one of the core parts of all cultures. When an art form is lost, it is a tragedy. Many art forms have been lost, and many people are struggling to keep up their arts in a world that has little use for them in an industrialized age (this isn't a new thing; it's been going on for generations with many art forms).

Human expression is also very valuable. It's important that, as a culture, we have vast reserves of weird non-commercial art: art that is experimental, art that is uncomfortable, art that tries new things. We need people to be able to make this art full time, because it often takes a whole lot of time to MAKE that art, and to get good enough to make it.

AI is a new medium, and it can be used for amazing, thoughtful things, but it's also amazing at making a lot of things quickly and thoughtlessly; it is frankly just better than most other mediums at what modern capitalism requires from art. It's only going to get better over time. AI art is going to take jobs. Even if those jobs are taken by artists, one artist will be able to do the jobs 20 did before. That's fewer people able to spend their full lives expressing themselves, which in my opinion is just a fully bad thing. That will have bad effects on our culture.

However, AI is also going to bring a lot economically. Perhaps we could use the extra funds to fund the arts? Separate art from the capitalist rat race. Most countries have artistic grants that they give to artists; what if those got expanded and freed up? If artists are able to just make, worrying only about what their work says instead of how much money it gets, we might get new and amazing things that change our society for the better. Give extra grants to people doing weird traditional art forms, to make sure those stay alive. Give grants to people doing AI art too, if they are really getting into it and making things new and thoughtful.

In addition, if we get flooded with enough beautiful images, perhaps we will start to devalue beautiful images for their own sake. People can start to focus on the other parts of what makes art work and really think about it: what the art is saying, how it made them feel, what community it is a part of, how it was made. Get people REALLY looking at art, appreciating it. Maybe the flood of images will make what is really valuable stand out.

The modern economy is already set up poorly for the arts, and our society isn't great at appreciating art. If we do it wrong... yeah, what AI art does to our society could be BAD. But if we take this opportunity to shake things up and decide that we DO value art, it could be a great opportunity!

12 Comments
2024/04/21
03:24 UTC

0

AI Art is literally ruining image sharing platforms.

As the title says, AI Art was a mistake to bestow upon the masses, as nothing but NTR, racist "blacked" trash and ugly bastard porn is pumped out in the thousands because it's unregulated and mostly free.

There needs to be regulation, and it needs to be taken away from the common man. Bring back the golden age of anime fanart.

133 Comments
2024/04/21
02:31 UTC

1

Should we prepare ourselves for what's coming?

15 Comments
2024/04/21
01:59 UTC
