Cav-AI-t Emptor

A music manager talking about how using AI for lyrics/music could lead to copyright infringement. She discusses one tool called Suno, and apparently its licensing doesn’t cover commercial use. Fine for playing around with, but if you intend to publish your music … Cav-AI-t Emptor! (The video is just over one minute.)

https://www.tiktok.com/@lizthemusicmanager/video/7476890173482093867

1 Like

That will definitely be a gray area for quite a while, I’m sure. I’d also guess that quite a number of AI companies will be filing lawsuits in the near future. It wouldn’t surprise me.

1 Like

Suits and counter-suits. More legislation. A never-ending pursuit of nailing down jello. :upside_down_face:

While this is very different, it kind of reminds me of the years of lawsuits against bands like Led Zeppelin for ‘borrowing’ stylistically, and even lyrically, from the blues musicians who preceded them. Sensing something was ‘done’ that could be perceived as harm (‘theft’), but then having to prove it in court.

Are AI builders putting copyrighted works in the training data, or is the AI ingesting copyrighted material from the internet? How would it know if that’s appropriate or not? ‘Who’ decides? ‘Who’ is the arbiter of truth?

AI is like a dog fetching your slippers; it gets you what you want, without understanding why you want it, what you plan to use it for, or if the usage is even moral/ethical/legal. IMO.

Yes, how else does anyone or anything learn to play music? There are no abstract models to follow unless they are digital copies from human sources. I suppose you could learn an instrument without coming into contact with copyrighted material, but no one does that.

Dogs know more than we think they do; they just process and evaluate in a different way. “Are there snacks involved?” I can be bought, you know. There’s your arbiter of truth.

2 Likes

Right, there has to be a pattern to replicate. It needs to know what the expectations and ‘rules’ are so it can (try to) ‘improvise’ through Chaos Theory or something.

So that raises an interesting question: What does AI want? Does it require a reward? Does it want something? Does it like something? Is there a mutually beneficial trade-off involved?

Has it been programmed to seek reward through interaction with humans? Is it trying to ‘please’ you, to get some digital Reese’s Pieces when it gets back to the Server Ranch? Why does it do what it does - what’s the payoff? Or is a human (or dog) personification completely irrelevant?

These questions are going to keep me awake tonight. :last_quarter_moon_with_face: :face_exhaling:

In this case, at this time, this AI doesn’t ‘want’ anything. It has no awareness of self or otherwise. It performs a task without the expectation or need of reward. It is a tool. An extremely complicated and expensive hammer, of sorts. I see this more as an example of ‘machine learning’ than true Artificial Intelligence (which is a very dramatic and effective marketing term). It is a building block upon which true AI might be constructed. A piece of the puzzle.

Will this change with other AIs? Almost certainly. When it does, I think we will very quickly find out what AI ‘wants’.

1 Like

Yes, I think so too. The hype doesn’t match the capabilities, at least not yet. I see it as a “search engine on steroids” (expensive hammer). I had been averse to using AI for a long time, but finally tried it out. It can sometimes consolidate information pretty well, but it can require ‘training’ or interaction to figure out what you want. At this point, I find it interesting, but I’m not that impressed. I have yet to find it especially useful for anything except saving time on comprehensive internet searches.

I don’t know what the ‘leap’ from machine learning to something resembling ‘consciousness’ might be, but it could well be marked by an AI explicitly stating wants/wishes/requests/demands. The key is to not get so comfortable and complacent with this technology that we can’t see it gradually slipping into this type of behavior. The “boiling frog” analogy, as it were.

1 Like

It is a fascinating and impressive tool that is evolving daily. I’ve been using text-to-image generation models for a while, and some of the results can be stunning (especially for a talent-challenged person like myself). It does take some wrangling and some understanding of ‘how’ to get it to produce something you actually want. I’ve also used some of the language models to summarize documents/videos and to assist in basic coding exercises. They can be particularly helpful as an ‘interactive’ learning tool when trying out some new language, code, or process. It’s like an on-demand tutor. Helpful, but at the same time, concerning on many levels. I just accept that the ‘genie is out of the bottle’, that it’s not going back in any time soon, and that it will only become more capable over time. You have to take advantage and leverage what it can do for you while you can. Be the best frog :frog: you can be. I have no illusions that corporations or governments (they are one and the same at this point) will ‘do the right thing’ when/if they see warning signs.

2 Likes

I’ve done a bit of this myself. At the very least, I find it useful to have it create many versions, and I have been able to use it for generating ideas. It gets the gears turning for me. And that may be (and likely is) the case for those using it for lyric writing and music creation. Much of it may get tossed, but if it helps get you something to work from, I consider it a win.
However, it is quite obviously getting used as an “instead of” on a regular basis, and that just doesn’t sit right with me. But that’s my opinion.

1 Like

What I take from the music manager’s warning is that companies using AI to generate “lyrics” from a series of words input by the user will be able to sue frivolously for copyright infringement if a user’s input results in anything close to what the AI software generates. In other words, they are going to copyright phrases and ideas that are randomly generated and hope to catch people using them, hoping to find a few big fish to sue if they make any money.
At this point, you’d have to have Led Zeppelin money behind you to go to court and not get fleeced. Let’s face it, “You Shook Me” was a Willie Dixon song covered by Led Zeppelin, and his estate had to sweat it out in court to get a settlement that finally credited him for writing it. Imagine if an AI conglomerate copyrighted an algorithm that determined that a random user input generated a catch phrase that became commonly used in songwriting, such as “I’m sad” = “My baby left me”. It wouldn’t be long before you couldn’t write anything without getting sued, if your .0001 cents per stream amounted to something.

1 Like

That’s an interesting viewpoint. What thought or inspiration sparked the user input? Must the user defend their prompts? What is their justification for choosing their words?

The AI software issue is perhaps slightly more straightforward. As I understand it, whatever data the AI ‘scraped’ from internet websites, or was trained on during its development, may have been copyrighted material.
There’s always been a bit of a fine line between an artist/composer being ‘influenced’ by music they have heard, inspiring a new composition, and out-and-out copying and theft of an idea. Presumably, there is a similar situation and argument for AI ‘creativity’. How similar is too similar? The AI must be trained and learn on some sort of material. Real-world examples. Is copyrighted material off limits for the AI to learn from, even though the human composer may use it for learning and inspiration?

Some use AI for lyric generation, but the musical element is also something AI can do. Melodies, rhythms, and harmonies are all probably studied, both from real-world examples and from the music theory behind those elements.

Unfortunately, we’re in a new Wild West, where the ‘rules’ are up in the air and not well defined. She admits that “it’s a big gray area”, so nothing is settled at all. She points to her entertainment lawyer’s advice, but of course he’s going to advise against anything that could get you sued. She also points out that Suno got sued, but doesn’t go beyond that to any conclusion; people get sued all the time in frivolous lawsuits. Being sued doesn’t mean you’ve done anything wrong or will lose a judgment.

One good point she makes concerns the Terms and Conditions of the AI software. If they prohibit use in commercial projects, then that’s a potential ‘gotcha’ that should be avoided. That legalese could be used by a plaintiff to at least show ill intent. One clear stipulation amongst the muck.

But the bottom line is as you say: Is someone making money, large sums of money, off of your copyrighted work? If not, there’s really no gain in a lawsuit that may be hard to prove anyway.

A potential test case might be those Spotify accounts they found that were entirely AI generated; basically ‘fake’ artists. Each looks like a real artist, but the whole account was manufactured by AI, as was the music. And they had hundreds of thousands of listens, maybe even millions, raking in substantial income. The sad part is that those listens were (presumably) by humans, who had no idea - or didn’t care - that it was not a human artist.

That’s where we’re at.

Yup. I got concerned when ‘beats’ by themselves came to be considered music. A lot of that sounded to me like somebody screwing around on a drum machine, saving it, then running it up the flagpole to see who saluted. Now, it’s “Alexa: write me a sad song”.

1 Like

Here’s a thread where we talked about one of those cases.

1 Like

Yes, I remember seeing your thread. And I saw another article recently saying something similar (sans the legal consequences, IIRC).

The “AI naming schema” thing is kind of funny. Weird, nonsensical names. I’m seeing tons of new channels and videos on YouTube, most of which are AI generated and AI narrated. Sometimes it’s hard to tell, though YT supposedly is enforcing a policy that creators have to label the content as “digitally generated” or something like that in the video’s description. But the channel names are typically so generic and off-kilter that it’s laughable.

I don’t give AI much (if any) credit for original thinking. But as for manufacturing its own songs, if it has studied copyrighted material - especially ‘hit’ songs - it may be able to mimic human creativity. Within limits.

The stuff on Suno is pretty amazing. Scroll down to hear some examples.

https://suno.com/home

1 Like

I remember liking that song “Coffee Love”. It has a nice groove to it, and the female vocals are strangely addictive … kind of like Ulysses hearing the bewitching song of the sirens. The lyrics gave me a bit of pause, though … almost weird enough to be believable, yet still too predictable once the plot is known (much like a banal pop song - think Britney Spears) :disguised_face::

"In the morning pour it black and hot
Ooohhh, coffee my coffee
I’ll brew you when the sun comes up
Yeaaahhh coffee my coffee
I’d die for coffee
I’d kill for coffee
I’d lie and steal for coffee
It’s like oxygen for me
And now do you see

In the morning get my water hot
Ooohhh, coffee my coffee
I lose my marbles drippin’ drop by drop
Yeaaahhh coffee my coffee
I’d die for coffee
I’d kill for coffee
Goddamn I love you coffee
It’s like oxygen for me
Da da da, da da, da da da da
Da da da, da da, da da da da
Da da da, da da, da da da da"

Etc.