by talonx on 1/6/25, 5:10 PM with 412 comments
by RobertDeNiro on 1/6/25, 5:38 PM
" Hello! This was a false positive in our systems at @ChainPatrol . We are retracting the takedown request, and will conduct a full post-mortem to ensure this does not happen again.
We have been combatting a huge volume of fake YouTube videos that are attempting to steal user funds. Unfortunately, in our mission to protect users from scams, false positives (very) occasionally slip through.
We are actively working to reduce how often this happens, because it's never our intent to flag legitimate videos. We're very sorry about this! Will keep you posted on the takedown retraction. "
by tialaramex on 1/6/25, 5:34 PM
This works for actual creators, who are occasionally slightly inconvenienced but handsomely rewarded when it happens; for YouTube, which gets paid each time these "rare" mistakes occur; and for the companies "innovating" by making up nonsense and taking people's money. The "investment" might as well go to a YouTuber as to some random office park or an ad firm.
by daveguy on 1/6/25, 7:45 PM
https://xcancel.com/3blue1brown/status/1876291319955398799
This links to an independent Nitter instance that shows the full thread.
by rglover on 1/6/25, 11:15 PM
If you're in a position of influence in an organization that's losing its marbles over AI, please, at the very least encourage others to pump the brakes and think.
If there was ever a time to speak up when you know implementing something will lead to a likely disaster, it's now.
by btown on 1/6/25, 7:15 PM
And the incentives for rectifying this are skewed: video platforms only need to address individual cases with influential creators just reactively enough that collective action isn't incentivized. That is far cheaper, and avoids having to coordinate with traditional rightsholders, compared to addressing the problem systematically.
If we believe that the vision of being an independent content creator is important to humanity - and I think it's becoming vital as "a way to distinguish myself" that folks are able to dream about from an early age - then we need to seriously work to protect it. Not everybody will get their "big break" but we can at the very least start having conversations about protecting creators from an AI-driven DMCA bot arbitrarily destroying their career through automated channel-disabling rules.
by the__alchemist on 1/6/25, 5:22 PM
by the__alchemist on 1/6/25, 5:29 PM
There's a `report abuse` button on the right of that page. I used it. (Category: Bullying and Harassment)
by kepano on 1/6/25, 5:32 PM
If you're a creator, it's essential to have your own place on the web where you can host and publish anything without fear that it will be taken down for any reason, even accidentally.
As it becomes cheap to automate both creating takedown requests and processing requests, the volume of spam requests is going to skyrocket and it seems likely there will be more false positives.
by lousken on 1/6/25, 5:58 PM
If a content creator can have their channel removed this easily, the same must apply to the other side as well.
by throwaway290 on 1/6/25, 5:20 PM
Tech built on copyright abuse used for copyright trolling? Too early for peak irony of 2025!
by pbhjpbhj on 1/6/25, 5:31 PM
Continued libel that inhibits the democratic exercise of free speech seems like something the government should act on?
by malwrar on 1/6/25, 7:41 PM
We need something that frees us from this prison. I still remember when it was normal for YouTube videos to play mainstream music in the background. Now draconian enforcement has created an artificial power that people beyond the music labels can abuse, and it harms our art. It feels clear to me that we need something new, built for the present, not for the era when individual movie pirates faced six-to-seven-figure fines and jail time.
by some_random on 1/6/25, 5:40 PM
by wil421 on 1/6/25, 5:23 PM
Taco Bell won the franchise war and is the only restaurant remaining.
by chad1n on 1/6/25, 5:39 PM
by queuebert on 1/7/25, 3:05 PM
by projectileboy on 1/7/25, 11:02 AM
by gweinberg on 1/6/25, 11:56 PM
by j45 on 1/6/25, 10:37 PM
With YouTube videos being used as a proxy for credible content in search results, three relatively anonymous complaints made in bad faith can end so much learning and work. Having no evidence required and no right of reply deters people from putting great content on YouTube.
That deterrent lets bad content win, even though bad content may not hold eyeballs for advertisers as well or as broadly.
I'm not sure whether complainants should be required to contact the channel before a DMCA complaint is accepted. eBay has a built-in messaging system; maybe YouTube could too.
Further, if there are ways creators can protect their creations before posting, whether registering custom music or something else, those should be built into the workflow.
Otherwise the price of success is becoming an automated target for takedowns unless you comply or pay up.
A channel inbox might force disputes to start creator-to-creator before escalating to mechanisms that are too easily triggered.
Maybe complainants found filing too many complaints in short order, or fitting some other abuse pattern, should have to pass much stricter KYC requirements so both sides can communicate more effectively.
by vasco on 1/7/25, 8:12 AM
by mensetmanusman on 1/6/25, 7:17 PM
3b1b is basically a casualty in this war of greed.
by tomalaci on 1/6/25, 6:05 PM
That would put real consequences on users misusing platforms. Even a small fee for misbehavior would likely deter vast swathes of bad actors. It would also make companies less trigger-happy with their bots, if bots are allowed to operate in that ID framework at all (i.e. punishing an identifiable bot would mean deducting a fee from the company that runs it).
I pretty much expect that kind of system in the future; otherwise we will just retreat to private networks and private communities.
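The fee mechanism proposed above could work like a bond: a complainant posts a deposit, each ruling of a false positive deducts a fee, and an exhausted deposit blocks further filings. A toy sketch under those assumptions (the class, amounts, and rules are hypothetical, not any existing system):

```python
class ComplaintBond:
    """Toy ledger: a complainant posts a deposit; each confirmed
    false positive deducts a fee, and an empty deposit blocks filing."""

    def __init__(self, deposit_cents: int, fee_cents: int):
        self.balance = deposit_cents
        self.fee = fee_cents

    def can_file(self) -> bool:
        # filing requires enough balance to cover a potential penalty
        return self.balance >= self.fee

    def record_false_positive(self) -> int:
        # deduct the misbehavior fee, never going below zero;
        # returns the amount actually charged
        charged = min(self.fee, self.balance)
        self.balance -= charged
        return charged
```

For example, a bot operator bonded at 100 cents with a 40-cent fee is locked out after its deposit can no longer cover a penalty, which is exactly the "identifiable bot costs its operator money" incentive described above.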
by Imnimo on 1/6/25, 9:09 PM
by akino_germany on 1/8/25, 1:58 PM
by davidjhall on 1/6/25, 5:41 PM
by adamc on 1/6/25, 6:17 PM
by adas222 on 1/12/25, 10:43 PM
by nathanmills on 1/7/25, 10:49 PM
by teekert on 1/6/25, 7:44 PM
by ngriffiths on 1/6/25, 7:10 PM
I don't know, though; maybe that will prove too hard and the bot-filled platforms will win. In which case maybe the only way to be safe from the bot armies is to hire a bigger bot army yourself. Fun!
by throwaway423342 on 1/6/25, 9:06 PM
https://www.bilibili.com/video/BV1yJ41117we/?spm_id_from=333...
by grumple on 1/6/25, 6:26 PM
Why don't we build a video hosting service served from there, if such a place exists?
by dbspin on 1/7/25, 3:16 PM
by lazzlazzlazz on 1/6/25, 7:55 PM
by jvreeland on 1/7/25, 4:12 PM
by hkon on 1/6/25, 6:34 PM
by bulletmarker on 1/7/25, 12:26 PM
by TZubiri on 1/6/25, 6:49 PM
Surely there's precedent. Isn't it illegal to operate an unmonitored industrial factory?
by sneak on 1/6/25, 6:04 PM
This is just a mistaken corporate interest. What about when the state wants very much to hide something?
by Jupe on 1/7/25, 3:14 PM
by kemiller on 1/6/25, 6:32 PM
by Joel_Mckay on 1/6/25, 5:38 PM
Best of luck suing the IP racketeers =3
by dredmorbius on 1/7/25, 4:41 AM
<https://den.dev/blog/be-a-property-owner-not-a-renter-on-the...>
HN discussion: <https://www.youtube.com/watch?v=AlcDHlK_RoY>
If you keep working on Maggie's Farm, you'll keep encountering Maggie's rules.
<https://en.wikipedia.org/wiki/Maggie%27s_Farm>
Rely on YouTube for distribution, for now, but build a home base (or bases) elsewhere which you can fall back on.