• 0 Posts
  • 26 Comments
Joined 1 year ago
Cake day: June 10th, 2023

  • Can you imagine the alternate headline of “US sends peacekeeping force to Saudi Arabia to protect African migrants”? That’s a sociopolitical minefield if there ever was one. I’m sure that if the US knew, so did most everyone else, and there weren’t any other countries lining up to help. SA isn’t known for being a clandestine government, more of a “What are you gonna do about it, buy more oil?” kind of operation. The US is damned if we do, damned if we don’t. Step in and it’s Team America: World Police; don’t do anything and we’re standing by while genocidal governments slaughter entire peoples.


  • I do think it would be ethically wrong for a company to profit by mimicking someone’s style exactly. What incentive remains for the original style or work to exist if you cannot earn a living from it?

    That’s where we differ in opinion. I create art because it’s what I enjoy doing. It makes me happy. Would I like to profit from it? Sure, and I do, to some extent. However, you are conflating two ideas. Art created for profit is no longer art; it is a product. The definition fundamentally changes.

    I’m a writer, a photographer, and a cook. The first two I do for pleasure, the last for profit. If I write something that someone deems worthy to train an AI on, first, great, maybe I’m not as bad as I think I am. Second, though, it doesn’t matter, because when I wrote what I wrote, it was a reflection of something that I personally felt, and I used my own data set of experience to create that art.

    The same thing goes for photography, though slightly differently. When I’m walking around with my camera and taking shots, I do it because something has made me feel an emotion that I can capture in a camera lens. I have also done some model shoots, where I am compensated for my time and effort. In those shoots, I search for art in composition and theme because that’s what I’m paid for, but once I finish the shoot, and I give the photographs to the model, what they do with them is their own business. If they use them to train AI, then so be it. The AI might be able to make something 99% similar to what I’ve done, but it won’t have what I had in the moment. It won’t have the emotional connection to the art that I had.

    As far as the third, cooking, goes, I think it’s the most important. When I follow a recipe, I’m doing exactly what the AI does. I use my data set to create something that is a copy of something someone else has done before. Maybe I tweak it here and there, but so does AI. I do this for profit. I feed people, and they pay me. Do I owe the man who created the Caesar Salad every time I sell one? It’s his recipe. I make the dressing from scratch just like he did. I know that’s not a perfect example, but I’m sure you can see the point I’m making.

    So, when it comes to Art v. Product, there are two different sides to it, and both have different answers. If you are worried about AI copying “art,” then don’t. It can’t. Art is something that can only be created by the artist in the moment, and may be replicated, but can never truly be copied, in the same way that taking a photo of the Mona Lisa doesn’t make me da Vinci.

    However, if it’s a product, then we are talking about capitalism, and here we can see that there is no argument against AI, because it is only doing what we have been doing all along. McDonald’s may as well be the AI of fast food burgers. Pizza Hut the AI of pizza. Taco Bell the AI of TexMex. Capitalism is about finding faster, cheaper ways of producing products that people want. Supply and demand. If someone is creating a product, and their product can be manufactured faster and cheaper by the competition, then the onus is on the original creator to find a way to stand out from the competition, or lose their market share to the competitor. We can’t hamper AI just because some busker is having a hard time selling his spray-paint-on-bowl planetscape art. If you mass-produce for the sake of profit, you can’t complain when someone out-mass-produces you, AI or human. That’s the way of the world.



  • @Kichae

    Those probability distributions are based entirely on what training data has been fed into them.

    That’s the exact same thing a human does when writing a sentence. I’m starting to think that the backlash against AI is simply because it’s showing us what simple machines we humans are as far as thinking and creativity go.

    You can see what this really means in action when you call on them to spit out paragraphs on topics they haven’t ingested enough sources for. Their distributions are sparse, and they’ll spit out entire chunks of text that are pulled directly from those sources, without citation.

    Do you have an example of this? I’ve used GPT extensively for a while now, and I’ve never had it do that. If it gives me a chunk of data directly from a source, it always lists the source for me. However, I may not be digging deep enough into things it doesn’t understand. If we have a repeatable case of this, I’d love to see it so I can better understand it.
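
    The sparse-distribution claim can at least be demonstrated with a toy model. The sketch below is a hypothetical bigram (word-pair) language model, vastly simpler than GPT, but it shows the mechanism being described: when a context was only ever followed by one continuation in the training data, sampling from the learned distribution can do nothing but replay the source text verbatim.

```python
import random
from collections import defaultdict, Counter

def train_bigrams(corpus):
    """Count, for each word, how often each next word follows it."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for word, nxt in zip(words, words[1:]):
        counts[word][nxt] += 1
    return counts

def generate(counts, start, length, rng=None):
    """Sample a continuation by repeatedly drawing from the
    learned next-word distribution of the last emitted word."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length):
        dist = counts.get(out[-1])
        if not dist:  # no observed continuation: stop
            break
        choices = list(dist.keys())
        weights = list(dist.values())
        out.append(rng.choices(choices, weights=weights)[0])
    return " ".join(out)

# A "sparse" corpus: every word except "the" has exactly one observed
# continuation, so generation can only replay the training text.
corpus = "the quick brown fox jumps over the lazy dog"
model = train_bigrams(corpus)
print(generate(model, "quick", 5))  # prints "quick brown fox jumps over the"
```

    With a large and varied corpus, most contexts have many observed continuations and generation diverges from any single source; with a sparse one, the output is a verbatim copy, which is the behavior being claimed above.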

    It occurs at the point where the work is copied and used for purposes that fall outside what the work is licensed for. And most people have not licensed their words for billion dollar companies to use them in for-profit products.

    This is the meat and potatoes of it. When a work is made public, be it a book, movie, song, physical or digital, it is placed before the public and can be freely consumed, and it then becomes part of our own particular data set. However, the public, up until a year ago, wasn’t capable of doing what an AI does on such a large scale and with such ease of use. The problem isn’t that it’s using copyrighted material to create. Humans do that all the time; we just call it an “homage” or “parody” or “style”. An AI can do it much better, much more accurately, and much more quickly, though. That’s the rub, and I’m fine with updating the laws based on evolving technology, but let’s call a spade a spade. AI isn’t doing anything that humans haven’t been doing for as long as there has been verbal storytelling. The difference is that AI is so much better at it than we are, and we need to decide if we should adjust what we allow our own works to be used for. If we do, though, it must affect the AI in the same way that it does the human; otherwise this debate will never end. If we hamstring the data that an AI can learn from, a human must have the same handicap.


  • If you can truly tell me how our form of writing is any different from how an AI writes, I’ll do a backflip. Humans are pattern seekers. We do everything based on one. We can’t handle chaos. Here’s an example.

    Normal sentence:

    Jane walked to the end of the road and turned around.

    Chaotic Sentence:

    The terminal boundary of the linear thoroughfare did Jane ambulate toward, then her orientation underwent a 180-degree about-face, confounding the conventional concept of destinational progression.

    On first pass, I bet you zoned out halfway through that second sentence because there was no pattern or rhythm to it; it was word salad. It still works as a sentence, but it’s chaotic and strange to read.

    The first sentence is a generic sentence. Subject, predicate, noun, verb, etc. It follows the pattern of English writing that we are all familiar with because it’s how we were taught. An AI will do the same thing. It will generate a pattern of speech the same way that it was taught. Now, if you were taught in a public school and didn’t read a book or watch a movie for your entire life, I would let you have your argument that

    @cendawanita

    an LLM-generated textual output that is in the form of a book report or movie review looks the way it does by copying with no creative intent previous works of the genre.

    However, you can’t say that a human does any different. We are the sum of our experience and our teachings. If you get truly granular with it, you can trace the genesis of every sentence a human writes, or even every thought a human thinks, back to a point of inception, where the human learned how to write and think in the first place, and it will always be based on some sensory experience that the human has had, whether through reading, listening to music, watching a movie, or any other way we consume the data around us. The second sentence is an example of this. I thought to myself, “how would a pedantic asshat write this sentence?” and I wrote it. It didn’t come from some grand creative well of sentience that every human can draw from when they need a sentence; it came from experience and learning, just like the first, and from the same well of knowledge that an AI draws from when it writes its sentences.


  • @mack123

    Can I get an AI to eventually write another book in Terry Pratchett’s style? Would his estate be entitled to some form of compensation?

    No, that’s fair use under parody. Weird Al isn’t compensating other artists for parody, so the creators of OpenAI shouldn’t either, just because their bot can make something that sounds like Pratchett or anyone else. I wrote a short story a while back that my friend said sounded like if Douglas Adams wrote dystopian fiction. Do I owe the Adams estate if I were to publish it? The same goes for photography and art. If I take a picture of a pastel wall that happens to have an awkward person standing in front of it, do I owe Wes Anderson compensation? This is how we have to look at it. What’s good for the goose must be good for the gander. I can’t justify punishing AI research and learning for doing the same things that humans already do.


  • If I do a book report based on a book that I picked up from the library, am I violating copyright? If I write a movie review for a newspaper that tells the plot of the film, am I violating copyright? Now, if the information that they have used is locked behind paywalls and obtained illegally, then sure, fire ze missiles, but if it is readily accessible and not being reprinted wholesale by the AI, then it doesn’t seem that different from any of the other millions of ways we use data in everyday life. Just because a machine learned it instead of a human, I don’t believe that makes it inherently wrong. I am very open to discussion on this, and if anyone has a counter-argument, I’d love to hear it, because this is a new field of technology that we should all talk about and learn to understand better.

    Edit: I asked GPT-4 what it thought about this, and here is what it said:

    As an AI developed by OpenAI, I don’t access any copyrighted databases, subscription sites, confidential information, or proprietary databases for my learning or response generation. I was trained on a diverse range of internet text, which includes publicly available data in multiple languages and formats. The training also involves learning from large-scale datasets that contain parts of many kinds of texts (books, websites, scientific articles, etc.). However, I don’t know specifics about which documents were in my training set or have access to any specific documents or sources.

    Your question seems to pertain to how copyright law applies to AI. This is a complex and evolving area of law and policy, largely because AI systems like me do not “read” or “use” material in the same way a human would. A human might read a book and then consciously recall information from it when writing or speaking. In contrast, AI models do not have consciousness or awareness, they analyze statistical patterns in the data they were trained on and use these patterns to generate responses.

    When a human uses copyrighted material to create something new, the law often considers this to be a derivative work. It is typically necessary to obtain permission from the copyright holder to create derivative works. However, AI doesn’t use or recall copyrighted material directly; instead, it learns patterns from a huge amount of text data to predict or generate text based on a given prompt.

    Dissemination of copyrighted material, such as through forum posts or Q&A platforms, without proper licensing or within the bounds of fair use, can be a violation of copyright law. AI doesn’t ‘disseminate’ copyrighted material; it generates responses based on patterns it has learned.

    That said, the legal interpretation of how copyright law applies to machine learning and AI is still evolving, and there can be a lot of complexities and nuances. Therefore, the information here should not be taken as legal advice.

    As of my last update in September 2021, this was the case. However, I recommend checking the most recent legal standpoints and opinions, as this area is still evolving rapidly.