Hacker News
With questionable copyright claim, Jay-Z orders deepfake parodies off YouTube (waxy.org)
462 points by minimaxir on April 28, 2020 | hide | past | favorite | 210 comments


So if I understand this correctly...

YouTube took these videos down based on a copyright (DMCA?) claim. Which of course isn't a court ruling or anything.

At the same time, no court would ever uphold this removal, because it obviously falls under "parody" which is protected. (Otherwise shows like Saturday Night Live couldn't even exist.)

After all, the end result is no different from a really good vocal impersonator. In fact, a really good impersonator will definitely do better.

It's disappointing that Jay-Z (or more likely some lawyer working for him) is abusing YouTube's takedown mechanism this way.

But at the same time, I can see how this could seem particularly scary for a performer. After all, if you've spent decades creating a recognizable profitable persona... the idea that anybody with a personal computer can flood YouTube with fake lyrics that aren't yours could feel terrifyingly like losing control of everything you've built.

What if some really nasty stuff went viral and became as associated with Jay-Z as the rest of his stuff? Stuff that did serious damage to his brand, which "nobody could unhear"?

I genuinely wonder if deepfake technology will actually result in new copyright restriction law. I.e. to make it a crime to produce unlicensed deepfakes that are genuinely indistinguishable to the average person, regardless of whether they're parody or not. (While "bad" parodies like SNL will continue to be protected as always.)

I kind of feel like that's going to have to be the outcome at some point in the near future -- in fact, as soon as really convincing yet nasty deepfakes of senators and representatives start making the rounds and spreading on Facebook, I suspect a new law will get passed incredibly quickly.


>> as soon as really convincing yet nasty deepfakes of senators and representatives start making the rounds and spreading on Facebook, I suspect a new law will get passed incredibly quickly.

No. This isn't a new area of law. Legal "deepfakes" have been around a long while. There is even a Supreme Court case covering the matter: Hustler Magazine v. Falwell (1988). That involved a parody ad depicting Jerry Falwell in a fake interview, using his image and putting words in his mouth. If the 1988 Supreme Court wouldn't protect the reputation of a national religious leader, they aren't going to do so for politicians today.

https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell

This is from the movie, but it is accurate to what was said. This interaction really did change supreme court procedure, introducing a far less formal back-and-forth oral argument. Jokes were almost unheard of before this case.

https://www.youtube.com/watch?v=MeTuNES82O0

The trial court argument: https://www.youtube.com/watch?v=TsvB61mDoG8


From [1]:

> In an 8–0 decision, the Court ruled in favor of Hustler magazine, holding that a parody ad published in the magazine depicting televangelist and political commentator Jerry Falwell Sr. as an incestuous drunk, was protected speech since Falwell was a public figure and the parody could not have been reasonably considered believable.

The post you're replying to explicitly talks about convincing deepfakes. Also, Falwell was a televangelist and political commentator, not a senator. So I don't think this court case is a good example.

[1]: https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell


No, deepfakes are relatively new. It's a specific term meaning a fake video produced with ANNs, not any type of picture that's been faked or photoshopped. The case you reference is a cut-and-paste job, not a deepfake.


I don’t think the method of producing the fake is important in terms of court rulings.


It kinda is though, if the network is trained on the source material itself. For example, if I take the source material and move a few pixels, is that still the source material? What if I flip the video, which technically moves every single pixel? That's still a copyright issue even if the video is mirrored. The network is basically taking the pixels of the original content, learning from it, and creating a filtered version.
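The mirroring point can be made concrete with a toy sketch (Python/NumPy is my choice of illustration here, not anything from the thread): a horizontal flip is a pure, invertible remapping of pixel coordinates, so every original pixel value survives intact in the "derived" copy.

```python
import numpy as np

# Toy "video frame": a 4x4 grayscale image with distinct pixel values.
frame = np.arange(16, dtype=np.uint8).reshape(4, 4)

# A horizontal mirror is just a deterministic remapping of coordinates:
# output[y, x] = input[y, width - 1 - x].
mirrored = frame[:, ::-1]

# No pixel value is created or destroyed; the transform is invertible,
# so the original is fully recoverable from the mirrored copy.
restored = mirrored[:, ::-1]
assert np.array_equal(restored, frame)
```

Flipping twice recovers the original exactly, which is the sense in which a mirrored video still contains all of the source material.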


I think you misunderstood my point. I meant that it doesn’t matter if someone modified it by hand or if it was done using a neural network.


I didn't say it was, I simply corrected the definition of a word.


> If the 1988 supreme court wouldn't protect the reputation of a national religious leader, they aren't going to do so for politicians today.

Well that gave me a good chuckle! The fact that a Supreme Court justice is appointed by the sitting president and confirmed by the Senate means that they can push whatever dogma the ruling party has.


> I kind of feel like that's going to have to be the outcome at some point in the near future -- in fact, as soon as really convincing yet nasty deepfakes of senators and representatives start making the rounds and spreading on Facebook, I suspect a new law will get passed incredibly quickly.

Ugh, that is the type of free-speech infringement that people should be worried about, not all this stuff about how YouTube taking down videos is government censorship.

A really good deepfake is still fiction. Really good writing is fiction. Really poorly-written fake "news" blogs "reporting" lies as truth is fiction. These are all allowed to exist (in the USA) because of the 1st Amendment. It is the price that we pay for living in a free society, that the government cannot tell you "your fiction is not allowed."

Textual accounts of real events, and lies about real events, have existed for ages. Sometimes the lies are more convincing than the truth. Audio recording and doctored audio has existed for ages. Photography and doctored photos have existed for ages. Video recording and doctored video has existed for ages. We'll survive another flavor of doctoring just fine without the government's help.


No, copying and 'reporting on' / parodying / sampling etc. are very different things.

Writing a book that is word-for-word the same as another book is infringement.

Making a pixel-perfect copy of a photo, from scratch, is an infringement.

Copying a song, perfectly, is infringement.

Whether something is a 'lie or the truth' - or 'which one is more convincing' doesn't really matter.

With 'deepfakes' we're going to have to get serious because people will literally be putting words in other people's mouths, creating fake representations of other people, it's going to cause problems. This issue may not actually fall under copyright.


I feel like you've replied to the wrong comment - I'm not defending copyright infringement, misuse of celebrity likeness, or the content of this video.

I'm specifically addressing the notion that the government would carve out a free-speech exception for "deepfakes", like "deep fakes of Politicians / Celebrities are illegal." Which would be clearly unconstitutional, because you are forbidding creation of a creative, expressive work.

Whether the works that you are free to create, regardless of format (photoshopped image, deepfaked video), violate some other law - libel/slander - is separate as well. Yes, they certainly could, just like a photoshopped image today could be libelous.



> Really poorly-written fake "news" blogs "reporting" lies as truth is fiction. These are all allowed to exist (in the USA) because of the 1st Amendment.

Not if it’s libel. And a convincing deepfake that harms someone’s reputation could easily be interpreted as an act of defamation by the creator, which would mean it’s not protected speech.


Yes, certainly. And that's somewhat my point - the freedom to create these fictitious works is already limited by existing laws, we don't need new ones specific to this medium.

There's no reason to think libel laws wouldn't apply to deepfakes the same way they apply to a photoshopped image, or a textual lie.


Pop stars on YouTube? Fair use and freedom of speech? Deepfakes will be used to cause a war in the near future where people are killed, and who are we going to point our DMCA takedown requests at then? What will the third-world victims of some hate story stirred up by AI-doctored footage care about First Amendment protections in America? Deepfakes are a clear and present existential danger to ourselves and others, and at this point I'm afraid you are correct, there is no good solution. God help us all once we can't even believe what we see anymore.


I think you're ignoring an approximately 5,000 year history of humans using propaganda, misinformation, and outright lies to manipulate each other. Nothing new here except perhaps a little broader perception of the level of misinformation actually generated.

We (ok, mostly those in power or who want to be) constantly feed each other the most manipulative, self interested lies for personal/organizational benefit, 24/7/365.

Life has always revolved around either being manipulated, or knowing that that's the point of what you hear, and putting in the effort to find out more before forming an opinion.

Hence investigative journalists, etc etc.

> God help us all...

Umm, we've all been deceived about almost everything since forever. There's no sudden worsening here, just a newer level of verification required.


A little broader? You mean reaching everyone on the planet for almost no cost? This is a huge difference, it used to be only nation states and huge actors could spread and maintain disinformation on this level, now an individual can do it with a GPU.

One set of shared delusions is not the same as an anarchy of delusions.


It didn't take many people to create the Spanish-American War. The newspapers were not states or even large actors, though of course they did have a platform that most others did not.


> Deepfakes will be used to cause a war in the near future where people are killed, and who are we going to point our DMCA takedown requests at then?

Wars are started based on disinformation, black propaganda and/or false flags anyway.[0] Nobody (at least who wasn't already spoiling for a fight) is going to start a war just because of a deepfake posted to Youtube.

> What will the third-world victims of some hate story stirred up by AI-doctored footage care about First Amendment protections in America?

This video was taken down, rightly or wrongly, due to American law. It's quite likely deepfakes are actually 'more legal' in some countries outside the US, due to differences in legislation.

For example, depending on political/sexual content, an Iranian or North Korean video host may be less interested in taking down a deepfake of Jay-Z than Youtube.

> Deepfakes are a clear and present existential danger to ourselves and others

The real danger with deepfakes (and other technology that the rich and powerful don't like, such as encryption) is overreaction and people tripping over themselves to call for even more limits on their own free expression.

> God help us all once we can't even believe what we see anymore.

And God help us all when we have no way to speak out any more. As shown in the link below, we never could believe what we see.

[0] https://www.globalresearch.ca/the-ever-growing-list-of-admit...


Gotta fight fire with fire. Use a neural net to detect deep fakes.

http://www.arxiv-sanity.com/1811.08180v3
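The linked approach uses a neural network as a real-vs-fake classifier. As a deliberately simplified sketch of that classification setup (plain NumPy logistic regression on invented two-dimensional "artifact scores"; the features, numbers, and model here are illustrative assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in features: pretend each frame is summarized by two
# numbers (e.g. blur and warping scores). Real frames cluster near 0,
# fakes near 3. These values are invented purely for illustration.
real = rng.normal(0.0, 1.0, size=(100, 2))
fake = rng.normal(3.0, 1.0, size=(100, 2))
X = np.vstack([real, fake])
y = np.concatenate([np.zeros(100), np.ones(100)])  # 0 = real, 1 = fake

# Logistic regression trained by plain gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(fake)
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * np.mean(p - y)                # gradient step on bias

preds = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

A real detector replaces the hand-invented features with ones learned by a deep network from video frames, but the real-vs-fake decision boundary idea is the same.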


You can't believe what you see right now. I remember watching people doing insane things on Facebook back in 2011 but it was all fake. Nothing has changed.


>but it was all fake.

You mean like scenes shot with actors to imply certain events are happening? E.g. scenes of protests in some city when it's actually just a bunch of actors making it look like a protest is happening?


Exactly. Propaganda and misdirection have been a constant for a few thousand years.

The harder lesson is learning to catch yourself and see past it before getting manipulated.

The hardest lesson is not knee jerking and giving those who want to control you the tools to limit everyone's speech to just approved material. (China, Iran, Vietnam, North Korea, these ring a bell? VERY slippery slope.)


They can already cause a war just by lying about certain classes of weapons being "found" so I don't think deepfakes will help much in that regard.


Major benefit: eventually, all the news networks will be able to decide that instead of whatever each candidate said, they all actually agreed that Medicare for all was the right choice, and that it will be the center of their platform.

And once elected, they'll be stuck doing it, or the news networks will decide that it did happen, and no one will be able to figure out that it actually was not. At that point, is there any difference?


So the law will be whatever Rupert Murdoch (or Mark Zuckerberg) says it should be? That's not a world you want to live in.


> I genuinely wonder if deepfake technology will actually result in new copyright restriction law. I.e. to make it a crime to produce unlicensed deepfakes that are genuinely indistinguishable to the average person, regardless of whether they're parody or not. (While "bad" parodies like SNL will continue to be protected as always.)

You don't need a new law. It's already against the law to copy someone's appearance in an attempt to impersonate them, outside of fair use.


I think you’ve misunderstood the point of impersonation laws; they are about stealing someone’s identity, and this is not that... anyone watching these videos can see that it is NOT Jay Z in the video.


This isn't about impersonation but your copyright over your own likeness that can't be used without consent. Take a game like FIFA or Madden, they have to negotiate with the players (or their unions) in order to create realistic digital counterparts (including the names) and arguably better games often find themselves unable to compete because of this.

> anyone watching these videos can see that it is NOT Jay Z in the video.

I'm not that familiar with Jay-Z, and I could easily have been convinced and not given it a second thought. It's not obviously parody, and the only tip-off that it was even fake was "(speech synthesis)" in the title.


Is it really copyright? Based on what I know of it, it doesn't seem like copyright would apply. Trademark I could sorta see, if they registered their likeness.


Quite a few places recognise "personality rights" [1] although the details vary from place to place. Sports games tend to comply so they can be sold everywhere.

Some form of personality rights make sense to me, as otherwise Mickey Mouse would enjoy more legal protection than you and I. And obviously it's not an absolute right - you can't sue red light cameras for photographing you.

There's also the question of whether a neural net trained on a copyrighted image or video is a derivative work - in contrast to a 3D model made by an artist who merely references copyrighted images or videos.

[1] https://en.wikipedia.org/wiki/Personality_rights


I was replying to a comment saying there are laws against impersonating people, so changing the context because you think it’s not obvious that these are deepfakes creates a straw man to argue with. I’m also not sure I believe that you couldn’t tell these were AI-generated speech.


There's also defamation laws. I'm sure a lawyer could make a good argument for banning a deep fake, or suing for the existence of one if it causes defamation.


> no court would ever uphold this removal, because it obviously falls under "parody" which is protected

Not a lawyer, but...

It doesn't seem like parody to me. It also doesn't seem like it has much of anything to do with copyright at all - one of the videos taken down was reportedly a performance of an excerpt from Hamlet, which is definitely in the public domain.

On the other hand it does seem like a possible violation of trademark law, but I have no idea how that would play out in court. Which is confusing, because I thought the DMCA only applied to copyright (it's the Digital Millennium _Copyright_ Act, after all) but perhaps I'm wrong and it includes provisions about trademark as well?


> Which is confusing, because I thought the DMCA only applied to copyright (it's the Digital Millennium _Copyright_ Act, after all) but perhaps I'm wrong and it includes provisions about trademark as well?

As far as I know, a legally enforceable "DMCA takedown" can only be about copyright infringement. However, YouTube is under no obligation to not honor takedowns that aren't legally binding.


I also am not a lawyer but from what I understand the situation is actually more severe than that.

I couldn't find the reference offhand, but I've read that YouTube has no ability to adjudicate the validity of a DMCA takedown request at all. They must respond as prescribed by the DMCA.

Which makes sense because once the DMCA is involved it's ultimately a legal dispute between the person that uploaded the video and the person that claims to own the copyright. And that kind of dispute is something only the courts can decide.


DMCA is not involved in the standard YouTube removal request process - the removal request does not have to fit the criteria of DMCA, the response does not have to fit the criteria of DMCA, and as long as both Jay-Z and Youtube are satisfied, the DMCA does not get involved at all, no matter what the poster of the removed content might wish.

If Jay-Z was not satisfied with the process, he could submit a DMCA request and YouTube would have to follow the DMCA requirements in processing it, but the current process is better for both of them (not for the posters of the contested content, though).


Isn’t there some recourse? I think the content maker can assert it’s non-infringing, and then YouTube can host it again without liability: the liability falls on the content maker if it turns out to actually be infringement.


> However, YouTube is under no obligation to not honor takedowns that aren't legally binding.

That's what's ripe for abuse. Why isn't it a fiduciary responsibility to their "partners" to make sure they aren't needlessly deprived of revenue, in situations where a 1st year law student could tell them there isn't a legal basis for the takedown?


Details matter - "takedown that is not legally binding" does not mean that "there isn't a legal basis for the takedown".

The legal basis for the takedown is the language in the YouTube terms and conditions that says YouTube can take down content at its discretion. It does not matter whether that decision is caused by a non-binding takedown request from a self-asserted copyright owner, or by a drunken manager randomly clicking to remove videos; the legal arrangement between YouTube and its "partners" does not assume any fiduciary responsibility to keep your videos up unless absolutely needed, and no current law forces them to assume that responsibility.

I mean, the statement "why isn't it a fiduciary responsibility to their "partners" to make sure they aren't needlessly deprived of revenue," seems very weird from the perspective of law - why would there be such a responsibility? Do you have some law in mind that would cause it? E.g. CEOs have fiduciary responsibility to shareholders, stockbrokers have fiduciary responsibility to their clients due to specific laws, but in general there is no fiduciary responsibility between parties in a contract, the parties can be 'hostile' towards each other as much as they want.

The default legal position is that every private actor is free to act as they want for their own benefit as long as they are not breaking any laws or contracts. If I cause you some losses by doing things I am entitled to do, there's no possible liability; in order to claim damages, I have to have done something wrong. So in the absence of a specific law or contract requiring YouTube to ensure that their "partners" aren't needlessly deprived of revenue, that lost revenue is not YouTube's problem, and they're entitled to (for example) intentionally cause partners to lose that revenue if it benefits YouTube in some way and there's no statute or contract restricting them from doing so.


Because the law treats the "partner" relationship as something that was freely negotiated between equal parties, despite the obvious power imbalance. Video creators need to form a union to protect their interests, but since they're all "temporarily embarrassed millionaires" I doubt that'll happen.


Let's not forget that YouTube's Content ID system is a direct result of them being sued by Viacom et al.: https://en.wikipedia.org/wiki/Viacom_International_Inc._v._Y....

If YouTube stops taking down these videos based on this "fiduciary duty" and they get sued again, will this union pay for the legal fees and potential damages? Yeah, not likely.


Anyone can sue anyone for anything. YouTube were protected by the safe harbour rules and won a summary judgement on those grounds.


If the lawsuit was so obviously invalid (despite the first summary judgement being successfully appealed), what's your theory on why YouTube settled and agreed to spend money building Content-ID?


It's not obvious to me that it's parody. Is it actually making fun of Jay-Z? Is it parodying his songs? His voice? It's not a caricature, it's not commenting on Jay-Z's character or vocal styles or lifestyle or opinions.


Whether or not a deepfake is used for a parody is far less important than whether content using deepfakes identifies its artificial nature. Deepfake content masquerading as real content is actually dangerous and should be regulated as such.

But no, Jay-Z should not be able to take down parodies on YouTube that make it clear to the viewer that they are using deepfakes.


Deepfake used for parody is almost certainly protected by the first amendment under US law. So legally that is a very important distinction. (Morally, well, that's not a productive thing to debate on the internet usually)


If it’s convincing enough to make most people believe that it’s real, it’s not a parody.


Plenty of people are convinced that Onion articles are real, but they're still clearly parody.


The definition of parody is somewhat vague, and parody is often subtle. It seems difficult to define and enforce objectively.


The youtube definition is "Whoever has the most money is correct"


I imagine that would likely fall under a new trademark or fraud law rather than copyright.


I don't get it. The videos are still here and I can access them just fine:

https://www.youtube.com/watch?v=iyemXtkB-xk

https://www.youtube.com/watch?v=m7u-y9oqUSw

Was this an automatic takedown, and maybe they have meanwhile been made available again?


> no court would ever uphold this removal

You mean enforce? Youtube can remove whatever they want; they don't need to be in the right about it.


These concerns would be better handled by another kind of law, such as defamation, rather than copyright, patent, or trademark law. The exact definition of defamation varies between jurisdictions, though, so what constitutes defamation is another problem.


I believe the DMCA de facto REQUIRES YouTube (and others) to work like this. Once a DMCA takedown comes in, if they leave something up, they become liable if the court case goes against them. So if 1000 takedowns come in and you correctly leave 999 videos up as fair use etc., you still get demolished on the other 1. Meanwhile the company requesting 999 incorrect takedowns owes you nothing.

This is a key problem with DMCA, it shifts all the risk/cost of deciding cases onto platforms, their only logical response is to take things down, so everything gets taken down.


The DMCA process also includes a counter-claim provision: the uploader can simply contest the takedown and have the content reinstated.

But YouTube doesn't follow the DMCA process at all, they have their own procedure, with no such recourse.


My understanding is that if an uploader does that, YouTube has to decide whether to keep it up. If they do, they can then be sued by the rights holder if it does infringe. If they take it down, no one can sue them.

So that's a huge incentive to just take everything down.

Effectively the whole point of the DMCA (this part at least) is to pierce the common-carrier protection platforms have.

Am I wrong? If so, any idea why YouTube doesn't just let people counterclaim and leave everyone else to argue it out in court?


Under the DMCA (technically Section 512), they have to reinstate access to the content at least 10 days and at most 14 days after receiving the counter-notice in order to keep the safe harbor (the 10-day wait gives the rights holder time to decide whether to sue the uploader).

> any idea why YouTube doesn't just let people counterclaim and leave everyone else to argue it out in court?

Essentially YouTube in the pre-Google days was purposefully uploading infringing videos, and Viacom (and others) used that to sue them and under the settlement agreement forced YouTube to build the Content-ID system and have a more lax takedown system.


>After all, the end result is no different from a really good vocal impersonator. In fact, a really good impersonator will definitely do better.

Really? I thought if it was "so good" that people don't realize it's a parody, it loses the relevant protections against defamation and infringement on likeness rights/image/trademark/whatever.


If that were the case, then people could claim that any parody was "too good" to get it removed.


Parody must include a component of criticism or commentary. If it's just imitation, it's not fair use.


It's a very fuzzy line they'll need to draw. The Onion seems like a good example of parody done really well, to a point that people often "eat the onion" and believe it. Would it be defamation if you published a quote that wasn't actually said by someone? That might be the analog they use to go after deepfakes in the future.


The article doesn't go into detail on what happened.

If it was a [manual] Content ID claim, there's no legal reason YT has to remove the video; the Content ID system exists to provide an (arguably abusable) way for YT to scan every piece of uploaded content for copyrighted material.

If it was a DMCA takedown, YouTube has to remove the video to keep its safe-harbor protection.

But, as stated in a related comment, this is new territory in the eyes of the law. Maybe this video is a parody, but if that's not clear, it would clearly be illegal. Even an extremely good impersonator can be sued if they try to actually pass themselves off as someone else.


I wonder if there's any legal grounds in the USA for trademarking the sound of your voice, and thereby protecting it from "counterfeiting".


> I.e. to make it a crime to produce unlicensed deepfakes that are genuinely indistinguishable to the average person

automatic Turing test pass


I could actually see it as libel. You’re leading people to believe the artist did something (created a work or said something) which isn’t true. SNL and other parodies would obviously be exempt because there’s no misrepresentation. A reasonable person doesn’t actually believe Trump went on SNL and happens to look like Alec Baldwin that day.


Bette Midler sued Ford back in the day (1988) for hiring an impersonator to sing in an ad, and won. (Of course it helped that Ford had asked her to sing first.)

https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

"The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval. The appellate court reversed the district courts decision and ruled in favor of Midler, indicating her voice was protected against unauthorized use.[4][5]"

I was actually a little surprised. And this is different, but I think the fact that they're imitating a famous voice (as opposed to my voice...) means they become a target.

The fact that you can fake anyone saying anything is a little amazing, though there are people who can do it quite well:

https://en.wikipedia.org/wiki/Josh_Robert_Thompson

Thompson's Arnold Schwarzenegger impression first gained national attention during the California gubernatorial recall election of 2003. Posing as Schwarzenegger, Thompson phoned in to Fox News Channel's morning program, Fox & Friends, fooling the hosts into believing (at least for a short while) that he was, in fact, Schwarzenegger.[4]


In 2015 I had an unusual solicitation from a recruiter on LinkedIn - as a lawyer this is in my experience a rarity, unlike tech where I understand it to be commonplace - with interest in offering me a General Counsel position with Romero Britto's company.

Turns out Apple had engaged in discussions to license (or maybe commission) Britto's art, but then Apple turned around and essentially just "knocked off" Britto's style. They were looking to bring me in to file and handle a lawsuit for copyright infringement... I declined the offer and they filed suit (I believe it ended up with a settlement).

It is easy to have strong opinions one way or another about copyright law, but feelings about the law aside, my opinion was that it was in very poor taste for a company like Apple to court an artist like Britto and then big-time him the way they did... sure, art and artistic styles are a dime a dozen, but they could have used any art for their campaign; they didn't have to mimic/copy his style (I can't fathom what would even make them do that... ego, I suppose). For those who are curious, here is a link to an article that shows Apple's alleged infringing art:

https://www.miamiherald.com/entertainment/visual-arts/articl...


Out of interest, if someone contacts you in this manner, do you/they have an implicit attorney/client privilege? Or is it only after you come to an agreement to represent them? Or is it more of a rule of thumb?


It is actually an extremely complex analysis that is dependent on the facts and the jurisdiction (applicable law). It simply isn't applicable to my communications with a 3rd-party recruiter on LinkedIn, which were not regarding legal advice but potential employment; nor did my comment contain any confidential information protected by the privilege.

To address some of the other responses (common myths):

1. It does not require the exchange of $1 for the attorney-client privilege to apply. For example attorney-client privilege will generally apply to free consultations and it will equally apply in the case of pro bono (free) legal services from groups such as ACLU or EFF to their clients.

2. Attorney-client privilege does not require an agreement for representation to apply (again think of a consultation before an agreement to represent where the privilege applies).

As far as a "rule of thumb" a decent summary by the ABA is:

"Nevertheless, there are some rules that generally apply to most, if not all, jurisdictions. For attorney-client privilege to apply to a communication, the general rules require that: (1) the communication be between a client and an attorney (i.e., an individual having a law degree and bar membership, and acting as an attorney for the client) or an agent of an attorney (e.g., a tax accountant, a patent agent, a forensic investigator, a technical analyst, or an expert); (2) the communication be made by the client and contain confidential information; (3) the communication be made outside the presence of a nonprivileged third party; (4) the communication be made for the purpose of securing legal advice; and (5) the privilege has not otherwise been waived. Privileged communications can be written or oral, but only communications between or among “privileged” persons are protected.


Thanks for clarifying this. I didn't mean to imply that you'd violated any trust here, I was just curious. And obviously you're being discreet with the use of a throwaway.


There's attorney client privilege the whole time. It does not require money to change hands or an explicit agreement of representation before kicking in (as some others have said).

https://www.nolo.com/legal-encyclopedia/does-the-attorney-cl...


I don't believe that that link supports the general assertion you're making about there being attorney client privilege the entire time. In this case, if the above lawyer receives a message that is unsolicited, Apple should have no expectation of that privilege/privacy of the communications without some prior agreement. Apple has no way to know that the account is even directly run by the lawyer themself or an aide.


Privilege begins once the client and the attorney have agreed on representation.


Pop culture says that happens when the first dollar is handed over.


I lived in South Florida for a while. That art style is so recognizable that I immediately knew where the Apple ad came from, though I didn't know the artist's name.

How do you even begin to quantify the 'likeness' of one piece of art to another though? In this case it sounds easy enough since they consulted with the artist beforehand, but what if they ripped off his style without ever talking to him, how do you prove it's a copyright infringement?


>In this case it sounds easy enough since they consulted with the artist beforehand, but what if they ripped off his style without ever talking to him, how do you prove it's a copyright infringement?

As they say: when the facts are not on your side, argue the law; when the law is not on your side, argue the facts. I tend to agree that the facts of the good-faith conversations/business dealings help, if not outright make, the entire case possible. Even if there were an "obvious rip off," the artist would likely not take on Apple were it not for the prior good-faith discussions between the artist and Apple.

>How do you even begin to quantify the 'likeness' of one piece of art to another though?

It is a good question... Britto has a very recognizable style that in some sense is also somewhat generic (my opinion only; I am far from an art expert). Copyright/trademark infringement cases typically come down to questions of fact and the ultimate finder of fact (the jury), which is likely why fewer than 10% of cases go that far; parties will instead elect to settle.


Very important to note that the Midler case has very narrow interpretation. From the Opinion:

> We need not and do not go so far as to hold that every imitation of a voice to advertise merchandise is actionable. We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California.


Videos on YouTube are selling advertisements so wouldn't this apply here?


Some differences that would probably affect things in court:

1. It's parody. Parody has specific statutory protection.

2. It's clearly labelled as speech synthesis.

3. It doesn't stand in for any of the original works. The market for deep fakes is much different than that for songs or presidential speeches.


Is it a parody as understood by the law, though? How is it parodying Jay-Z? It's definitely "for fun", but it's not clear that it's making fun of or commenting on Jay-Z.

It's really not clear to me. This page cites a case where someone imitated Dr Seuss' literary voice and lost.

https://www.cotmanip.com/articles/fair-use-parody


This is the point I was going to make. For a parody fair-use defence to be valid, you need to be parodying the thing you are infringing on, and as much as people always say "oh this is parody so it's clear fair use" by the accounts I've seen (such as Penny Arcade's Strawberry Shortcake comic) that can be a difficult defence to actually mount.


The author even says that he didn't pick the texts to comment on the voice in particular, but that doing the same text (the copypasta, the Bible, etc) with each voice was just convenient.

It's one thing to impersonate someone to make fun of them, but he's not picking texts for them that way. A Bush impersonator reading the phone book isn't obviously parodying Bush unless he starts throwing in comedy Bushisms or exaggerating the voice for effect. Some Elvis impersonators are parodying Elvis but a lot are just doing a straight recapitulation of his look and voice.


"the sellers have appropriated what is not theirs"

IANAL but I think the argument can be made that YT are the sellers here and they are protected by the DMCA


No. The video in question was not. And more generally, just because an ad plays alongside a video does not mean the video itself is a commercial work, in the same way a newspaper or magazine can have ads alongside editorial content.


This doesn't seem to narrow the interpretation very much to me besides noting that imitating a random, unrecognizable stranger off the street would be a different situation.


It very much narrows it to impersonating a well known voice _in order to sell a product_, therefore the individual is _due proceeds based upon those sales_. Impersonating a well known person for other reasons (parody, political or religious expression, etc.) would not fall under this precedent.

‘In order to sell’ also may not cover every business transaction where selling is not the primary purpose, or at least that is what the defense will partly be based on. In other words, just because a video is monetized does not necessarily mean it falls under this ruling.


> It very much narrows it to impersonating a well known voice _in order to sell a product_, therefore the individual is _due proceeds based upon those sales_. Impersonating a well known person for other reasons (parody, political or religious expression, etc.) would not fall under this precedent.

Isn't that already covered by the fair use doctrine though? It wouldn't be a matter of copyright to begin with in those other cases.


Around Presidents’ Day, car ads are run that contain impersonations of past and present presidents. They are impersonated for commercial advantage; do the advertisers get prior permission before running those radio ads?


Those two cases sound rather different based on the intent.

The Ford case seemed to be Ford trying to get around Midler saying no. It's clear they wanted to imitate her for the purpose of selling their cars. The current case is a mix of satire and people playing around with new technologies (with YouTube advertising money being incidental).

If anything, this seems more like a personality-rights issue than a copyright issue.

IANAL.


Just want to point out (because a lot of people seem confused) that laws requiring permission to use your likeness are not copyright. I believe they fall under the category of moral rights. Moral rights vary dramatically from country to country so it's always tricky to figure out what you are or are not allowed to do if you work internationally.


Then how are Elvis impersonators not a copyright violation?


When you see an Elvis impersonator, do you think it is Elvis? Probably not. Elvis is dead.

If you heard an Elvis song in the background of an advertisement, would you assume it is actually Elvis? Probably, unless the ad was featuring impersonators.


They might be, but they're so small-time that it's not worth pursuing.

If an Elvis impersonator started to become famous or successful and popular, you can bet the EP estate would be in touch.


Or Elvis' hypothetical twin brother. Or clone. Or close genetic laryngeal relatives.


Or cover bands, for that matter.


Cover bands do pay a royalty. It's not much, if I recall (last century a housemate was a former Rush cover band member...)

But I'm wondering who Dread Zeppelin pays.... (a mash up of Elvis and Led Zeppelin with a reggae beat, which oddly works better than you would expect.. for a little while)


I believe cover bands can actually be in violation of copyright.

It depends upon how you do it.

If you cover the song, then that's one copyright.

However, if you hire a bunch of people because they actually look and sound like the original band, that's a completely different copyright.


TBH, I saw an AC/DC cover band in a backwater Wichita Falls, Texas bar that was way better than the original. They were far, far nuttier animals and absolutely mad.


Hey, AC/DC were young once. Their lead singer died from partying too hard ffs.


I saw AC/DC in concert too. These guys were naked, doing backflips, bouncing off the walls, and breaking shit. Unless you're a sick puppy, hard living doesn't translate to entertainment value. I don't condone or value anyone's recreational extremophilia and I think your one-sided white-knighting is off-base.


that sounds an awful lot like The Impotent Sea Snakes - https://en.wikipedia.org/wiki/Impotent_Sea_Snakes


Not exactly. They were in the spirit of and homage to the original. They might not have been original but they brought the house down.


“Midler was not seeking damages for copyright infringement of the song itself, but rather for the use of her voice, which she claimed was distinctive of her person as a singer.”

Whereas Roc Nation is making a claim that the videos infringe on Jay Z’s copyrights, as well as another claim that they impersonate him without permission.

In the Midler vs Ford case, Ford had already secured permission from the copyright holder for the performance.

It’s quite a different case.


> The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval

So it's illegal to imitate a famous person, but not a regular Joe?

We certainly are all equal, except those who are more equal of course.


I would read it as: if you make a living with your unique voice, imitation to make profit is unlawful.

The fact that they're famous is really just a tautology. Voices that make money are generally more famous than ones nobody recognizes.


I wonder if Rich Little had to get everyone's permission.


CEO of LBRY here. These videos are welcome on LBRY.

We'll have to get a real (i.e. not me commenting on HN) legal opinion should we get a take down request, but prima facie I don't see why these would be illegal.

(If they are illegal and we are notified, we would put them on the company maintained blacklist, as we cannot remove anything from the network itself.)

Edit: I wrote a rap on our perspective.

If you're having deep fake problems / I feel bad for you son

We got 99 deep fakes / And Jay-Z ain't taking down one

https://twitter.com/LBRYio/status/1255273319739293703


While I appreciate your hustle here, it might not be such a good idea to become the hub for deepfakes and the endless lawsuits they will invite. The deep ethical issues involved should make anyone pause, but taking on record companies and artists should make you pause further.

Everyone loves Napster as a history lesson. But it is doubtful that anyone would want to live it.

Your position seems to be inviting all of the controversy and lawsuits for little-to-no payoff.


There are no ethical issues in deepfake videos that weren't existent during the same infantile stages of faked and Photoshopped images on the internet. Some people could pick them out immediately. Some people are fooled even by obvious fakes. That will never change.


>There are no ethical issues in deepfake videos that weren't existent during the same infantile stages of faked and Photoshopped images on the internet

That is not the same thing as "there are no ethical issues". I'm sure some people, especially the celebrities portrayed, would feel there are the same ethical issues in all cases, and any difference is merely one of degree, not kind.


We are saying the same thing. Literally in the sentence you quoted, I said the same ethical issues exist in deepfakes that exist in altered still images.


Yes, but your comment has the potential to be read as dismissive of the ethical concern. I don't know if that was your intention, but I thought it relevant to make the distinction clearer.


I agree with pc86 on the ethical issues.

On the pragmatic ones:

- If the lawsuits ever become unbearable we can always choose to simply stop existing. One beautiful part of LBRY is that it does not depend on the company to work.

- It's not so much that we want to be thought of as the deep fake place, so much as we want to be thought of as a place for people who want to make their own choices. I think supporting this supports that goal.


If there is a lesson to be learned from Napster, it's that you should charge customers, still ignore copyright and pirate music to "bootstrap" your service. That's basically Spotify right there.


We need discussions in society about this stuff -- yes, even with people who might be too stupid for you to want to talk to them -- not just facts on the ground created by techies in ivory towers.

It used to be that jocks beating up nerds was a universally applauded activity; it's now rightfully seen as stupid and evil. The same will have to happen with people abusing their digital prowess against others, or the former will become untenable.

you can rap in tweets / but you don't want street / clown on us all / get your feet in concrete


Absolutely in love with lbry.tv! Had the desktop app for a while, didn't know the website existed. Thanks! :)


I see no about page or anything on that website. Curious, is this like a P2P youtube? Decentralized somehow?

I hope something rises up and kills YouTube with its censorship. There was Metacafe, then YouTube, next: ???

Found this: https://lbry.tech/overview

>To create a market for accessing and publishing information that is global, decentralized, robust, optimal and complete.

Hell yes, do it!

Edit: Found the subreddit for this project, and they say youtube banned these videos? https://lbry.tv/Dr.-Erickson-COVID-19-Briefing:e1 - good job guys, the gatekeeping period of the internet 2007 to 2021 will be looked back on as dark ages.


Thank you sergio! lbry.tech is the best resource to learn about what LBRY is from a technical perspective.


So your business model is built on the presumption of possessing an inviolable license to simulate the likeness of others?

I’m not so sure that’s a stable basis, given the many laws celebrities have gotten passed protecting control of their likeness.

This is a bit of a “I’m doing it on a computer so it’s different!” kind of thing that won’t necessarily legally fly.

The idea that someone can control the reproduction of their likeness is, in the general sense, settled in law. Context-shifting how the reproduction happens isn’t really much of a difference-maker.


> So your business model is built on the presumption of possessing an inviolable license to simulate the likeness of others?

No. It's based on people desiring a publishing platform that does not allow interference from intermediaries ala YouTube, Facebook, or Amazon.

> Challenging the idea that someone can’t control the reproduction of their likeness full stop is settled in the general sense in law.

You're commenting on an article that says this content is probably legal. If you think it's not, it'd probably make more sense to comment on the top-level thread.


https://www.descript.com/ and Lyrebird prevent people from making copies of voices that aren't their own, but it looks like that cat is out of the bag.

Abuse potential notwithstanding, and ignoring the complete distortion of "reality" coming soon, I'm extremely excited for this technology to become more mainstream: being able to edit audio and video like you edit a word document. Record a conversation for a couple of hours, compile the transcript, and synthesize it into something tight, all without the need for a traditional video editor. Voice synthesis for words that weren't spoken, or were misspoken; frame interpolation and morphing to prevent the jaggy YouTube cutting effect.


A YouTuber (carykh) made a video where he explains an algorithm he made to automatically process lecture videos by speeding up, condensing and removing parts of the video.

https://www.youtube.com/watch?v=DQ8orIurGxw
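The core trick in that video (classifying audio chunks as silent or sounded and speeding up the silent parts) is simple to sketch. This is a toy illustration of the idea, not carykh's actual code; the thresholds and speeds are made up:

```python
# Toy sketch of silence-driven speedup: classify fixed-size audio chunks
# as "silent" or "sounded" by peak amplitude, then assign a faster
# playback rate to the silent chunks. Not carykh's actual implementation.

def chunk_speeds(samples, chunk_size, threshold,
                 sounded_speed=1.0, silent_speed=5.0):
    """Return a playback speed for each chunk of `samples`."""
    speeds = []
    for i in range(0, len(samples), chunk_size):
        chunk = samples[i:i + chunk_size]
        peak = max(abs(s) for s in chunk)
        speeds.append(sounded_speed if peak >= threshold else silent_speed)
    return speeds

if __name__ == "__main__":
    # Fake amplitude data: speech, then silence, then speech.
    audio = [0.4, 0.5, 0.3, 0.6] + [0.01, 0.02, 0.01, 0.0] + [0.5, 0.4]
    print(chunk_speeds(audio, chunk_size=4, threshold=0.05))
    # speech chunks keep 1x; the silent chunk is assigned 5x
```

A real implementation would then re-render the video with each chunk played at its assigned speed (carykh's video covers the messier details, like smoothing transitions).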


https://s.sneak.berlin/@sneak/104054875133518950

I want a YT speed setting for “constant WPM” based on their autocaption timestamp metadata.


Don't let your dreams be dreams. YouTube data API allows downloading captions, and the iframe API allows setting playback speed.

You could build a site that inputs YouTube video URLs + WPM and then outputs the video playing at the desired WPM.
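The per-segment math is the easy part. Assuming the caption segments have already been fetched (the `rate_for_segment` helper below is hypothetical; the 0.25x–2x clamp matches the playback rates the YouTube player typically exposes):

```python
def rate_for_segment(text, duration_s, target_wpm, lo=0.25, hi=2.0):
    """Playback rate that makes one caption segment play at target_wpm,
    clamped to the 0.25x-2x range the YouTube player typically offers.
    `text` and `duration_s` come from a caption segment; this helper is
    a sketch, not part of any YouTube API."""
    words = len(text.split())
    natural_wpm = words / (duration_s / 60.0)
    rate = target_wpm / natural_wpm
    return max(lo, min(hi, rate))
```

So a segment spoken at a natural 120 WPM would be played at 1.5x to hit a 180 WPM target; a fast talker would get slowed down instead. The site would call `player.setPlaybackRate()` on each caption-segment boundary.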


That’s an amazingly obvious idea whose time has come. Who do we @ to get this on YouTube already?


Overcast does roughly that for podcasts and it works great.


Is there a similar feature in Google Podcasts? I know that they are transcribing them automatically in the same way captions are generated for YouTube videos. I think it mostly blew over but I remember Google caught some flak for that in the same way they did for Google Books v Authors Guild fiasco. This is why we can’t have nice things.


oh. my. god. This is about to change... a lot of things. Stop motion by clapping totally has me.


I was surprised to find carykh's video is over a year old now! This has so many applications, I'm surprised it's not already common.


We just posted a new post about Lyrebird alternatives now that they've been bought: https://blog.replicastudios.com/lyrebird-alternatives/


> https://www.descript.com/ and Lyrebird prevent people from making copies of voices that aren't their own

how do they check which voice is your own?


If I understand correctly, you train it live. So unless you knew what script they were going to give you, and had the words prerecorded in order, it would be hard to do. Sort of like a captcha with a timer?

I wonder if they also have signatures for famous voices they can blacklist.
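If they did, one plausible (entirely speculative) mechanism is comparing a speaker embedding of the uploaded voice against embeddings of protected voices. Everything here is hypothetical; no vendor documents such a check:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def is_blacklisted(embedding, blacklist, threshold=0.85):
    """True if the uploaded voice's embedding is too close to any
    protected voice. The embeddings, blacklist, and threshold are all
    hypothetical illustrations of how such a check might work."""
    return any(cosine_similarity(embedding, ref) >= threshold
               for ref in blacklist)
```

In practice the embeddings would come from a speaker-verification model, and the threshold would have to be tuned to avoid flagging merely similar-sounding voices.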


This was approached all wrong. The first highly publicized demos of singing generated like this should have been of dead people - Michael Jackson, John Lennon, Elvis. First, it's not clear who has the right to sue, and in what jurisdiction. Rights associated with those people have been transferred and resold enough times that there's no clear claimant. Second, many people have impersonated those voices, so there's a strong argument that this is just automating a manual process. Someone could have probably gotten a few tracks out the door and onto airplay before the first litigation.

Going up against a living performer whose main asset is their vocal style makes a weaker case. Living people have stronger publicity rights than the estates of dead ones. This could result in a decision which expands the scope of copyright. One of those weird copyright decisions, like the one where Owens-Corning trademarked PINK, as a color, for insulation. Coloring insulation was so unusual that it was held to be a valid trademark. Now there are other copyrights on colors. Bad cases make bad law.

This thing has been botched so badly that one wonders if it's a fake case from the music industry to get a losing decision.


> One of those weird copyright decisions, like the one where Owens-Corning trademarked PINK

Trademarks and copyright are NOT the same thing. That trademark would only give them rights to the branding PINK in the context of insulation. Copyright is more expansive - and I think you should be more worried about copyright than trademark.


Right, PINK is a trademark of Owens-Corning, associated with licensed copyright rights from the use of The Pink Panther cartoon character in some long-forgotten ads.[1][2]

[1] https://www.owenscorning.com/copyright [2] https://youtu.be/C5uM-QwKgW4


>Rights associated with those people have been transferred and resold enough times that there's no clear claimant

That doesn't really matter. Anyone who held any piece of the rights could submit to YouTube for a take down, and YouTube will do it. They don't go very far in trying to figure out the legitimacy of a claim, and any rights holder will be good enough for them.


Dead people's estates sue over copyright issues all the time... not sure how that would help.


Can someone explain to me how copyright remaining valid after the original owner's death promotes healthy competition? What benefit does it have?


People work not just on their own behalf, but also of their descendants. Knowing that your spouse or kids will keep getting royalties provides an extra incentive to produce new work, much like people create businesses or make investments to leave as inheritance.


How often does it profit large corporations rather than their descendants?


Rarely; it usually profits either both or neither.


It doesn't; it's just there because Disney has enough power to make it so.


It incentivizes artists to create art, because their work will support their heirs.


"... support their heirs"

...but often it's some large corporation instead.


If you see a new video with Elvis Presley today, you know it's not really him. The chance of mistaken identity is low.

If the person is still alive, then people might believe it's an original, and the person impersonated can legitimately claim that the fake is profiting from that mistaken identity.


Mistaken identity doesn't come into play here. It's copyright that got the videos removed, not publicity rights.


The point is that copyright doesn't apply to someone's image or likeness - that's trademark! Copyright only applies to a work itself (at least in the US).

A deepfake Elvis video of a song that he didn't write would be subject to copyright by the original author of the song and to trademark by the owner of Elvis's likeness. However, it seems plausible that such a trademark claim might fail because (among other things) it should be apparent to a viewer that it isn't actually Elvis.


Deepfakes (a better term is needed) are by their nature derivative works, which copyright already handles with the original's owner retaining rights which have to be acquired. I have absolutely no problem with it. If you don't own the copyright to your entire training data set, you are infringing by publishing something the model produces as I see it.

The courts or the legislature are going to have to address what rights you have over your own image being synthesized like this, but again, I have no problem with someone retaining the same sorts of original ownership rights when it comes to derivative works that use data collected from them, beyond just who owns the copyright to the works in the dataset. (voice, video, images, etc.)


> If you don't own the copyright to your entire training data set, you are infringing by publishing something the model produces as I see it.

Imagine if that were true - a human artist couldn't create anything, because everything in the world around them is their training set!

I think it has to be a likeness test: if an average person would think it's a real Jay-Z music video, then it's a derivative work. Don't muddy it with the implementation details of how it was made.


A human isn't a machine. What is in your head is sacred to the law. A neural network algorithm and its data is not.


> If you don't own the copyright to your entire training data set, you are infringing by publishing something the model produces as I see it.

I suspect that would be an absolutely terrible standard in practice. I realize you specifically have deepfakes in mind, but consider that in more general terms those are outputs from a machine learning algorithm. Such outputs will inevitably fall on a continuum regarding the degree to which they resemble a distinctive art style, voice, personality, or other metric.

It seems unreasonable to me that it should be illegal to use arbitrary input data to train a NLP translation model or an image classifier. Farther along the spectrum of outputs, style transfer GANs don't seem like they should be a violation of the law. Should TWDNE really be against the law to host? (https://www.thiswaifudoesnotexist.net/) According to your proposed standard, it would be.


I agree that deepfakes are derivatives. You're using the likeness of someone, whether visual or audio, to highlight the work; people pay attention because of that likeness. Weird Al asks for permission, and video games have to pay royalties. So Jay-Z should have been asked for permission in that sense.

If this was a parody, that could probably be argued very differently. Not sure where this particular case falls legally.

I don't agree that if you don't own the copyright, you are infringing by publishing something the model produces. It should be tied back to the likeness of the original source.

The argument would be: if I never used any copyrighted Jay-Z sources (him talking in a public setting, not giving a speech) and produced a deepfake, provably so, would that prevent me from being sued by Jay-Z & co?

Ideally no, because I'm still using his likeness to promote something.


Copyright only protects the written implementation, not ideas or likenesses. The amount of protection a likeness gets varies a lot more between states, it is a different set of laws, and outside of advertisements or defamation it often is not protected at all.


Whether something is a derivative or a new transformative work is a close question, and works are held legally transformative in a lot more cases than you would expect. Cariou v. Prince, pages 4 and 5, is the most famous example; even that minimal change was legally transformative. https://cyber.harvard.edu/people/tfisher/cx/2013_Cariou.pdf


Aren't parodies explicitly exempted from requiring a license from the copyright holder of the work being parodied?


In theory, sure... But remember that on YouTube, the vast majority of uploaders are not customers; they're the product.

Google is a business, and therefore the calculations that go into deciding whom to side with in these kinds of disputes don't _only_ factor in the law, but also the impact on the business.

In this case, they predictably sided with a fellow corporate entity, like they almost always do. There is no business reason for them to go to bat for a small non-commercial uploader, they're better off just removing the video when a deep-pocketed company complains about it.


Yes but with complexities in general, and a collision with non-copyright issues for deep fakes that starts to get into more unexplored societal/legal territory. First, parody is part of Fair Use, which means that it's an "affirmative defense": in a lawsuit, the burden is on the defendant to bring it up and prove it. That's in contrast to ordinary defenses or arguments around the facts asserted by the plaintiff, where it's up to the plaintiff to prove them to whatever the required standard of evidence is. In practice, that can mean a somewhat higher financial risk and higher chance of losing at the edges.

Second though, parody (and Fair Use) is about copyright and trademark, protecting use of such material for commentary and so on. But use of someone's likeness directly, particularly for someone famous, in order to produce new works is arguably something new that hasn't really been dealt with yet. Jay-Z making a copyright claim definitely seems dubious, and perhaps was done simply for convenience rather than legal strength; copyright disputes are the form in which most takedown systems work. I can see arguments both ways for whether copyright would apply at all: in favor, the argument would be that the ML models are being trained on copyrighted works, which in turn makes them derivatives. On the other hand, facts are not copyrightable (in the USA) regardless of effort or source. A counter-argument would be that the ML models are merely deriving facts about a person's vocal cords, facial structure and other natural physical characteristics, which then create a factual physical model which can be utilized to produce new works. In that case, all these new deep fakes would be their own brand-new copyrights (and potentially the ML models themselves not copyrightable). That'd be an interesting legal argument to see hashed out.

But even if they're new copyrights, rights to voice & likeness are issues in some jurisdictions, and it certainly could be argued they should be more so as technology makes this easier. I think factors around threat to reputation and so on are also raised in new ways with deep fakes vs. remixing and adding commentary to real, existing works (which can in turn be referenced by anyone who sees the parody). Even if there is a disclaimer on the original deep fake, as a de novo work which itself might get spread around without context, it's at least different than what we've had until now.


>First, parody is part of Fair Use, which means that it's an "affirmative defense": in a lawsuit, the burden is on the defendant to bring it up and prove it. That's in contrast to ordinary defenses or arguments around the facts asserted by the plaintiff, where it's up to the plaintiff to prove them to whatever the required standard of evidence is.

This is a misnomer.

Fair use is an authorized use, and consequently is “distinct from affirmative defenses where a use infringes a copyright, but there is no liability due to a valid excuse, e.g., misuse of a copyright.” Id. Lenz, 815 F.3d at 1152


>This is a misnomer.

No, I don't think so. Fair Use isn't merely a matter of court precedent, it specifically is in the Copyright Act (17 U.S. Code § 107 [1]), the language of which indicates it's on the plaintiff, and subsequent case law does seem to have affirmed that unless you have something further to cite? What you cited right there was Lenz v. Universal Music Corp, a 9th Circuit decision about abuse of DMCA takedowns, and in turn considering the "under penalty of perjury" aspect of the DMCA not "Fair Use" as a defense in general. The quote you gave was in the context of §512, Judge Tallman wrote that §512 "unambiguously contemplates fair use as a use authorized by the law". But again that's specific to the DMCA, and even there while the 9th seemed to want to try stemming abuse a bit, they only required the plaintiff to show a purely subjective lack of belief in infringement. Which could be without any real consideration of fair use factors at all. As well as being circuit only, analysis at the time indicated that if anything it might encourage copyright holders specifically to do as little as possible to consider fair use. Lenz did appeal to SCOTUS on that question but certiorari was not granted. Harvard Law had a fairly in-depth looking analysis [2].

In contrast for the 9th Circuit specifically in Perfect 10 v. Amazon/A9.com/Google [3] they explicitly covered Fair Use as an affirmative defense where the burden was on the plaintiffs:

>C. Fair Use Defense

>Because Perfect 10 has succeeded in showing it would prevail in its prima facie case that Google’s thumbnail images infringe Perfect 10’s display rights, the burden shifts to Google to show that it will likely succeed in establishing an affirmative defense. Google contends that its use of thumbnails is a fair use of the images and therefore does not constitute an infringement of Perfect 10’s copyright. See 17 U.S.C. § 107.

Additionally, I can find modern SCOTUS opinions such as in Campbell v. Acuff-Rose Music which support Fair Use as an affirmative defense:

>The fair use factors thus reinforce the importance of keeping the definition of parody within proper limits. More than arguable parodic content should be required to deem a would-be parody a fair use. Fair use is an affirmative defense, so doubts about whether a given use is fair should not be resolved in favor of the self-proclaimed parodist.

It'd be nice if plaintiffs were required to demonstrate as part of a suit that there was not a fair use defense for the defendants, but I really don't think that's the case nationally right now.

----

1: https://www.law.cornell.edu/uscode/text/17/107

2: https://harvardlawreview.org/2016/06/lenz-v-universal-music-...

3: http://cdn.ca9.uscourts.gov/datastore/opinions/2007/12/03/06...


I understood the Perfect 10 case to have been decided on a pretrial motion, not on the merits, and the standard for fair use there is not the same. The plaintiff will almost certainly make out a prima facie case, so an affirmative defense is needed only at the pretrial-motion stage. If the case went to trial, the statutory text (arguably) does not require fair use to be raised as an affirmative defense.

I agree that Campbell v. Acuff-Rose Music supports it as an affirmative defense, but it is not the most modern case law, even though it is the only modern Supreme Court case. Mattel v. Walking Mountain Productions, in the 9th, did not treat fair use as an affirmative defense. Cariou v. Prince, in the 2nd and denied certiorari, is a modern case that did not depend on fair use as an affirmative defense. In that appeal, whether fair use is an affirmative defense was specifically questioned.

Would you say that Congress intended fair use to be an affirmative defense?


I don't remember the specific arguments he laid out, but after watching this (42-minute, worthwhile) video by Tom Scott about copyright, my understanding changed. Now I assume that most things don't fit the narrow requirements for what counts as a protected parody.

https://www.youtube.com/watch?v=1Jwo5qc78QU


I can’t help but laugh that the opening story about music rights could have been avoided if the original Pied Piper search existed.


It's a little tricky... even Weird Al gets the original artist's permission.

"(Technically, a parodist does not need permission, but it is a legal gray area, and Weird Al prefers to have every artist in on the joke.)" https://www.nytimes.com/2020/04/09/magazine/weird-al-yankovi...

edit: the tricky bit is defining parody.


>It's a little tricky... even Weird Al gets the original artist's permission.

Weird Al's preferences don't add any evidentiary strength to any argument about the legality.


Weird Al is trying to save himself trouble, because going to court can be expensive even when you win. He can legally do parodies without consent, but since so many people are happy to give him permission, that's the easier route for him.


Most of Weird Al's songs aren't parodies in the legal sense.

"Party in the CIA" is just "Party in the USA" with different lyrics; it's not protected, just like most covers aren't. "Smells like Nirvana" is, because it directly references and parodies how the singer of the original song is barely understandable.


Where would White and Nerdy place? It's not a direct comment on the original artist, but it is intentionally playing with a reversal of the stereotypes associated with the original piece.


Only a court can decide that. When it comes down to it, parody is a defence in a copyright infringement case, which is why Weird Al just gets permission instead. There's a lot of uncertainty at the edges.

It's interesting to read about the trouble Penny Arcade had https://whatever.scalzi.com/2003/04/25/strawberry-shortcake-...


See also: https://en.wikipedia.org/wiki/You%27re_Pitiful

> While Blunt himself had no issues with Yankovic recording the parody, Blunt's record label Atlantic did; they forbade Yankovic from commercially releasing the song at the last minute. Yankovic eventually released the song online as a veritable free single; furthermore, in music videos and during live performances, Yankovic has made reference to his dispute with Atlantic. Since the initial debacle in 2006, Yankovic has occasionally reached out to Blunt and his label to see if he can release the song on compilations. However, each time that he has approached Atlantic Records, he has been denied permission.


If Blunt had said no and his publisher said yes, Yankovic wouldn't have released a song even for free. So it seems Yankovic considers asking for permission to be both a matter of courtesy and liability, with more apparent emphasis on courtesy.


Fair use is decided on a case by case basis if the owner wants to take it to court, and each time you get a new bite at the apple!


Yes, but an imitation is not necessarily a parody. There is a good Tom Scott video about this distinction and other common misconceptions about copyright: https://www.youtube.com/watch?v=1Jwo5qc78QU (42 interesting minutes)


Are you sure that it's actually parodying Jay-Z? Is it making fun of Jay-Z?

A deepfaked audiobook of a famous person reading a Brothers Grimm children's story is not obviously parody; Jay-Z has a very well-known voice, and well-known voices sell audiobooks nowadays. Just because you give it out for free doesn't make it fair use, either.

Second: no, parody is not explicitly excluded. It's one of several factors used to determine "fair use". Just because something is a parody doesn't mean it's automatically fair use, and the only way to get an explicit determination is to fight it in court.


Lots of examples of their synthesis here: https://lbry.tv/@VocalSynthesis:2?page=1 .

I've been thinking about deepfakes as a kind of computational aid to imagination. We can all simulate a random celebrity's voice reading text in our heads. How different is the ability to bring that imagination into reality?


What with Jeff Bridges in Tron and Peter Cushing in Rogue One, I'd suspect in about a decade or so we'll begin seeing Disney movies with an almost full cast of deceased, CG-recreated actors, complete with their own voices.


See The Congress: https://www.imdb.com/title/tt1821641

We should expect living actors to be signing away their likenesses for when they are dead.


I feel like I missed something--what about Jeff Bridges in Tron? Or are we talking about Tron:Legacy, and the character of CLU?


yeah, CLU is what I was referring to


But we expect this from YouTube, don't we? Just as an example of how broken it is: I have been live streaming at our local church, and when our parish priest decided to sing "Tantum ergo", which was written and set to music in (at best guess) 1264, YouTube flagged it.


For my own edification, wouldn't trademark law be more applicable?


I'm quite puzzled as well. As far as I can tell (not a lawyer) copyright law shouldn't apply at all in this case. According to the article the actual works performed were an excerpt from Hamlet (public domain) and a song by Billy Joel (so someone likely has a valid copyright claim, but probably not Jay-Z).

On the other hand, it seems like it might well be an infringement of his likeness (ie a trademark violation). I'm not sure about the nuances surrounding trademark law though - does it have to make money, do intentions matter, how obvious does it need to be that this is an impersonation, etc.


I deleted my own post to that effect, once I saw yours - so I also wonder about that.

However I’m not sure if YT has a separate DMCA process for copyright vs. trademark, or if they or people writing about this are just too lazy or uninformed to make the distinction.


One could argue that the audio used to train the deepfakes was the actual infringement, as it is essentially a copy of presumably copyrighted audio, even if not republished in its original form.

This would be similar to the Google Books scanning case. Google's original plan was to scan all the books in the world so users could search them. Though Google did provide extended samples in the search results, the core complaint was the scanning itself, which was literally a copy of the books into Google's database without permission. The Authors Guild argued this was infringement, though the courts ultimately held the copying to be fair use. [1]

Where the original sound comes from and who owns the copyright for that, I don't know. Maybe the person used CNN videos, so Time Warner should be involved as well.

1. https://www.wikipedia.org/wiki/Authors_Guild,_Inc._v._Google....


Wouldn't deepfakes have similar legal requirements as the Tupac hologram used at Coachella? http://www.ipbrief.net/2012/04/19/tupac-hologram-rocks-coach...


Now maybe it's time for society to catch up and provide some sort of digital identification method for citizens.

Sort of like keybase but from the government.

Because deepfakes should not be under assault for being parody. Instead, artists should be able to verify their own works in a secure manner.

Some countries have already started with this, mostly in the EU.


Looks like the Jay-Z deepfake rap of Shakespeare's "To Be, Or Not To Be" is still up 5 hours after this link was published on HN: https://www.youtube.com/watch?v=m7u-y9oqUSw


I know it's meant as a joke, but Jay-Z copypasta over a beat is legit cool. https://soundcloud.com/odyssey/jay-z-copypasta/s-JRXSJCBFsPu

So I get where he's coming from. If I heard this version first, I would have thought it was actually Jay-Z. The naked recording has all the artifacts of an AI-generated voice, but mixed into a composition they are gone to my ear.

And there are a few moments that are surreal: a dragged-out "s" at the end of a phrase, a change of flow from smooth to a fast staccato. Fuck, the inflection was near perfect. There is something to this AI stuff.


Maybe this will get the Streisand effect and people will make scores of videos saying all sorts of things, some not so nice...


I hope Lawful Masses with Leonard French covers this.

I think YT may not have to do it legally, but they may still do it to keep money happy.


We really need a law that forces people to label fakes when they willingly produce ones that are hard to tell from the real thing, and harshly punish the ones that omit the label.

It's ok to fake stuff, but it should be clear that you are doing so.

We already have fake cars in advertising, fake food on menus, fake news, fake expert advice... I really don't want to add perfect impersonation to the list.


The law is going to have to catch up here. Jay-Z has worked very hard to make his 'voice fingerprint' have value, and having listened to the deepfake, it's an extremely accurate copy of his voice.

To what degree does Jay-Z's voice constitute a 'work of art', and to what degree does a copy become a forgery? You could say that an impression made by a real person is a 'work of art', but if you remove the real person and use ML, is it still one?

We really need to consider the existential questions raised by primitive forms of AI, and what AGI means from a philosophical perspective, before lawyers and judges start defining what can be done with AI in society.


Update to the article: “I just heard from Vocal Synthesis’s creator that the copyright strike was removed, and both videos are back on his channel. He’s not sure if YouTube reversed its decision or Roc Nation removed the claim, but I suspect the latter.”


Sorry to dilute the topic but you’ve got to listen to Frank Sinatra singing the “Navy Seals CopyPasta” with this technology. It’s the best thing I’ve come across so far this month: https://youtu.be/8ixYcyslmSI


The video of 6 US Presidents performing NWA's "Fuck Tha Police"[1] is also pretty incredible. Some of the older presidents (JFK, Roosevelt, Reagan) are a bit rough, presumably due to less training data. But even in FDR/MC-Ren's verse you can hear echoes of his cadence from "the only thing we have to fear is fear itself".

The Obama and Trump verses also have moments that sound uncannily realistic.

[1]: https://www.youtube.com/watch?v=mAZVp-n-5TM


If your image is your income, protecting it from defamation via a deepfake, or protecting the image rights, aka copyright, is only fair. I can't understand why anyone would not back a creator. So many on here are themselves working to create something unique that they can live off of.


A human impersonator, clearly labelled as such, probably wouldn't have received the same treatment. These videos were simply computer-assisted impersonations.


A difference in quantity can become a difference in kind.

Consider a situation down the line where your phone can do this in real time, and at a quality that is hard for the untrained ear to distinguish from the genuine article.

It's not at all clear to me what the right thing to do here is.


Me neither. Companies have been sued when they use impersonators in their ads, but individual impersonators are generally left alone (provided they license any songs, etc., by those they're impersonating).

Doing things computationally allows a scale that humans alone can't achieve, and as you said that may make all the difference.


As an update, it looks like the videos are back up and the copyright strikes are removed



Does anyone have a mirror for these? They were super funny


I thought Jay-Z had trademarked his image, if so couldn't he keep a deepfake off with an argument that it is a computer generated version of his likeness?


This would be a great voice processing effect for musicians. Want that grunge Kurt Cobain sound for your vocal part? How about a Lennon?


Similar stories and even worse have been happening for years. I'm not sure why this sort of thing registers as news anymore. I'd expect more ink if YouTube actually said no to a copyright claim.


I went and found them and listened to them out of spite.

Thanks for bringing my attention to this Jay-Z!


Am I going to get sued for having a website that produces arbitrary deepfake audio (Arpabet + Tacotron + Melgan) of Trump and Biden?

My opinion is that state actors can already do this. If we train the public that "photoshop for audio and video" is a thing, then they'll learn to be skeptical.
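For those unfamiliar with that stack, a rough sketch of how the three stages named above fit together (every function body here is a hypothetical stand-in, not real model code; an actual pipeline would load trained Tacotron and MelGAN networks, and the dictionary below covers only two words):

```python
import numpy as np

# Toy grapheme-to-phoneme table; a real system uses the full CMU
# Pronouncing Dictionary plus a model for out-of-vocabulary words.
ARPABET = {"hello": ["HH", "AH", "L", "OW"], "world": ["W", "ER", "L", "D"]}

def text_to_phonemes(text: str) -> list[str]:
    """Stage 1: map input text to an ARPABET phoneme sequence."""
    return [p for word in text.lower().split() for p in ARPABET.get(word, [])]

def phonemes_to_mel(phonemes: list[str], frames_per_phoneme: int = 10,
                    n_mels: int = 80) -> np.ndarray:
    """Stage 2 (Tacotron stand-in): phonemes -> mel spectrogram of
    shape (n_mels, time_frames). Here just zeros of the right shape."""
    return np.zeros((n_mels, len(phonemes) * frames_per_phoneme))

def mel_to_waveform(mel: np.ndarray, hop_length: int = 256) -> np.ndarray:
    """Stage 3 (MelGAN stand-in): vocode mel frames into audio samples,
    upsampling each frame to hop_length samples."""
    return np.zeros(mel.shape[1] * hop_length)

phonemes = text_to_phonemes("hello world")
mel = phonemes_to_mel(phonemes)
audio = mel_to_waveform(mel)
print(len(phonemes), mel.shape, audio.shape)  # 8 (80, 80) (20480,)
```

The point is the data flow, not the models: text becomes phonemes, phonemes become a mel spectrogram, and a vocoder turns that into a waveform. The voice identity lives in the trained acoustic model's weights, which is why a few hours of a celebrity's speech is enough to clone them.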


Who is Jay-Z? Who cares?



