by arunharidas on 3/3/22, 11:56 AM with 106 comments
I think Bitchute is using p2p to deliver videos, or is it, really?
by freefaler on 3/3/22, 12:34 PM
by blahgeek on 3/3/22, 1:20 PM
Live streaming p2p is the easy one. Live streaming content is very skewed: the top 1% of streams can cover the majority of the bandwidth, and the top room can easily have tens of thousands of users, so p2p helps a lot.
Video-on-demand is harder. But now there are also many "seed boxes" on the market: basically a custom home router with a big disk. Users buy it and put it in their home as a regular wireless router, but in the background it automatically connects to a server, caches videos, and serves those videos to other peers. The user may get some bonus from it (mostly digital points). Essentially, these companies are buying users' home internet as CDN edge nodes.
But either way, P2P is used to save some bandwidth (cost), while the performance would almost always be worse. There is always a traditional CDN as a fallback.
Some possible reasons why it's more popular in China: 1. there are lots of people here; 2. users don't care much about their privacy.
(I worked in this area at one of the largest video hosting companies in China)
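The skew claim above (top 1% of streams covering most of the bandwidth) can be sketched with a toy Zipf popularity model; the stream count and exponent here are hypothetical illustration values, not data from any real service:

```python
# Toy model: viewership across 10,000 live streams following a Zipf
# distribution (weight ~ 1/rank). Under this assumption, the top 1%
# of streams carries roughly half of all viewing traffic.
N = 10_000
weights = [1 / (rank + 1) for rank in range(N)]  # Zipf with exponent s = 1
total = sum(weights)

top_1_percent_share = sum(weights[: N // 100]) / total
print(f"Top 1% of streams carry ~{top_1_percent_share:.0%} of viewership")
```

With these assumptions the top 100 streams carry a bit over half the traffic, which is why serving just the head of the distribution over p2p already saves most of the CDN bill.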
by warrenm on 3/3/22, 1:21 PM
Case Study: BitTorrent [0]
BitTorrent is a brilliant idea: it allows everyone who has part(s) of a file to contribute to the pool of availability, so that no single central server/mirror is overwhelmed.
And it has its place (eg in file sharing)
But for streaming? Not so much
Say you're getting the "next chunk" (whatever 'chunksize' is in this context) from me, and I go offline (it's the end of my day, I need to reboot for updates, any of myriad reasons). Where does the next bit of the video come from in a way that is seamless to the viewer?
That is the fundamental problem of shared/p2p streaming protocols: every time the host of the current/next blob o' data goes offline, you waste time finding a replacement.
Even if the replacement can be found "quickly", how do you ensure they don't go offline in the middle of streaming? How do you ensure "enough" copies of every chunk are distributed that it, effectively, 'doesn't matter' how many go offline [at once], it will still stream?
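The failover problem described above can be sketched in a few lines. This is a hypothetical illustration, not any real client's code: `peer_fetch` and `cdn_fetch` are assumed callbacks, and the design point is that a CDN must sit underneath the swarm as the source of last resort:

```python
import random

def fetch_chunk(chunk_id, peers_with_chunk, peer_fetch, cdn_fetch):
    """Try each peer believed to hold this chunk, in random order.

    peer_fetch(peer, chunk_id) returns the chunk bytes, or None if the
    peer has gone offline or timed out. If every peer fails, fall back
    to the CDN, which is assumed to always have the chunk.
    """
    for peer in random.sample(peers_with_chunk, len(peers_with_chunk)):
        data = peer_fetch(peer, chunk_id)
        if data is not None:
            return data
    return cdn_fetch(chunk_id)  # guaranteed, but costs real bandwidth
```

Each retry here is wall-clock time the player must hide with its buffer; if the buffer drains before a live source is found, the viewer sees a stall. That's why practical systems over-replicate chunks and still keep the CDN fallback.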
--------------
by tsujp on 3/3/22, 12:12 PM
Essentially no one wants to host and run the infrastructure for these things, only the "diehards" do. This is akin to how only the "diehards" produce content on Wikipedia, or YouTube. Remember, this is relative to the volume of people who only consume that content.
It's the same in P2P e.g. Bittorrent. The count of seeders (those who have the content and are uploading it to peers who need to download it) is almost always lower than leechers (those who are only downloading).
This is exacerbated with Peertube because video files are big, require a lot of bandwidth, and someone has to pay for that. If there is a set of diehards who love running Matrix or IRC (think other smaller networks, not just Libera) or what have you, then how many of that already-small set can meaningfully afford to run a Peertube instance capable of serving a momentous volume of video traffic?
I wish it wasn't so but it is.
[1]: https://en.wikipedia.org/wiki/1%25_rule_(Internet_culture)
by jasode on 3/3/22, 12:39 PM
The constraint on wide adoption is literally the quantity of "peers" in both Peertube and peer-2-peer.
The underlying human incentives are not there for most people to host a peer node. This limitation applies to all types of digital domains including videos, or files (IPFS or bittorrent), or crypto (Bitcoin/Ethereum peer node).
Let's follow the trail of incentives for one Peertube node:
- see list of Peertube instances: https://joinpeertube.org/instances#instances-list
- I pick the 2nd one on the list: https://the.jokertv.eu/about/instance
- in that instance's "About" page, it says: How we will pay for keeping our instance running -- personal funds. If you feel like donating, put up your own instance instead and host some creators you find interesting.
Companies can't build a Youtube/TikTok competitor based on examples like that. The same economic incentives limit the quantity of IPFS peers, which means businesses can't use it to replace Cloudflare CDN or AWS S3 buckets.
It should be understandable why most people aren't willing to spend personal funds on hosting home nodes so businesses can freeload off of p2p and save money on bandwidth.
by CJefferson on 3/3/22, 12:33 PM
While peertube saves one big pile of money (the streaming costs), it makes it significantly harder to make any money. Videos are expensive to make, so generally creators want a wide audience, and ways to monetise that audience.
by kazinator on 3/3/22, 12:12 PM
A video hosting company needs to hold all the videos hostage so they can make money somehow, like via ads. Plus the ability to control content: the videos themselves (e.g. be able to censor or delete them), and other content like user accounts, comments and whatnot.
This is like asking, why doesn't Reddit just operate as a Usenet gateway with a Web UI?
by jstummbillig on 3/3/22, 12:46 PM
For the same reason we don't see many video hosting companies, period: YouTube. It eats the entire space for reasons that are entirely unrelated to the pros and cons of p2p. For hypothetical competitors that means solving their biggest issue – gaining users over YouTube – can also not be done by going p2p (but it might of course still be part of their hypothetical tech stack for other reasons).
by psion on 3/3/22, 4:00 PM
My two cents.
by austincheney on 3/3/22, 12:13 PM
Client/server is very easy, because there is a centralized common point of access that manages everything. In a serverless model you have to connect via an identity that has no fixed address.
by KaiserPro on 3/3/22, 1:04 PM
Basically it requires enough people with enough bandwidth to stream data to you. Not only that, but those peers ideally need to be near you so that they can react quickly enough to overcome packet loss.
So you have a choice as a provider: maintain a fast autoscaling peer list (expensive), or force long pre-caching to make up for uncertain peer reliability (bad experience and/or expensive).
P2P is not a good platform for providing realtime high-bandwidth services unless there is a strong incentive to keep your peer running for as long as possible. With video, you're going to close it as soon as you've finished watching. For an hour-long movie, that's fine; for a 30-second clip, it's terrible.
by chrisjarvis on 3/3/22, 4:40 PM
I hadn't even thought about the telemetry concerns (how do you sell ads efficiently?).
by Animats on 3/3/22, 6:30 PM
No ads. That's the big advantage over YouTube now. And it beats paying Vimeo to host obscure technical videos.
by krinchan on 3/3/22, 5:37 PM
1) The US and vast swaths of the world have metered internet connections. The benefits of P2P for your bottom line will be short lived as customers abandon you for bloating their data usage without their permission.
2) Those lower costs will never be passed down to consumers by companies. I don’t know how or why you seem to think this time will be different.
I’m inclined to feel there’s some Web3 Not-Quite-Astro-Turfing going on here. The refusal to see either of those terribly obvious points feels super fake.
by nicoburns on 3/3/22, 1:42 PM
The problem is that in most of the world, consumer internet connections are highly asymmetric, and people have terrible uplinks. Thus there often won't be enough bandwidth available for streaming, and furthermore those providing the bandwidth will find their general browsing experience degraded due to uplink saturation.
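The asymmetry point is easy to put numbers on. The figures below are hypothetical but typical of consumer cable plans, not measurements from any particular ISP:

```python
# Back-of-envelope: how many full streams can one asymmetric home
# connection serve to peers?
uplink_mbps = 5      # e.g. the upload side of a 100/5 Mbps cable plan
stream_mbps = 5      # rough 1080p video bitrate
headroom = 0.5       # leave half the uplink for the household's own use

servable_streams = (uplink_mbps * headroom) / stream_mbps
print(servable_streams)
```

With these numbers a peer can serve only half of one stream, so many peers must cooperate per viewer, and any serving at all eats into the uplink that calls, backups, and gaming depend on.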
by jrm4 on 3/3/22, 5:11 PM
If Peertube takes off (and I hope it does) it would likely be on the back of something that's expressly anti-status-quo. (Which doesn't mean it couldn't be a company per se, but I feel like would have to wear "We are the anti-Youtube" on its sleeve.)
by devoutsalsa on 3/3/22, 1:24 PM
by brudgers on 3/3/22, 2:01 PM
The average internet connection (at least in the US) doesn't have enough upload bandwidth modulo level of service to meaningfully participate in streaming peer to peer.
Watching video on a mobile phone is probably the median streaming use case.
by evancordell on 3/3/22, 12:26 PM
I couldn’t tell from a quick Google if the modern VUDU service still does anything P2P, but the service is still around running on all sorts of platforms.
by max_ on 3/3/22, 12:13 PM
P2P infrastructure is usually run on hardware designed for personal use.
I think to see real adoption of P2P technology, one has to make it work better and cheaper than the current existing cloud infrastructure.
by johannes1234321 on 3/3/22, 3:26 PM
by spookthesunset on 3/3/22, 4:42 PM
I would never want some random hypothetical YouTube / peertube / Spotify users hogging my precious upload bandwidth. I need that tiny slice of upload for backups, conference calls, and gaming.
My friend, however, lived in an area that offered fully symmetrical gigabit fiber internet. Dirt cheap too ($60/mo). We’d use his Plex server all the time to stream HD or even 4K content right from a PC in his office. It was awesome!
by superkuh on 3/3/22, 1:48 PM
by littlestymaar on 3/3/22, 12:59 PM
Using p2p to offset bandwidth cost is a really cool idea, but it doesn't come without difficulties:
- WebRTC doesn't work everywhere: for this kind of thing you really don't want to use a TURN server, and only want to work with true p2p. This means you can't use it for users behind a symmetric NAT.
- `libwebrtc` (Google's implementation, used by Chrome and its derivatives, and also by Firefox) performs very poorly with a large number of open connections (I don't remember why, but you couldn't expect to maintain more than a dozen connections on a laptop before running into CPU load issues and dropped frames. This is probably an implementation issue, but Chrome's team was uninterested in investigating it). This means you can only be connected to a small pool of peers at any given time.
- Probably related to the previous point: it drains a lot of battery on mobile devices.
- Adaptive bitrate makes things complex, since users will switch tracks at random points, meaning they will need to be grouped with a different pool of peers (since you cannot maintain a big group of peers from different tracks at all times).
- It doesn't work that well for VoD: for new videos gathering many people at the same time it works really well, but for the long tail of old videos you're often the only one watching at any given time. Unless you're YouTube-scale, of course.
- It works better for live streaming, since everyone is indeed watching the same thing, but to maximize p2p efficiency you have to add some latency (to have a bigger video buffer to work with). This isn't acceptable in every situation (sports event broadcasters don't like that at all, for instance).
- To work well (especially regarding ABR and live streaming) your system needs to be quite tightly integrated with the video player. Polyfilling XHR/fetch with your own library isn't good enough (our competitors were doing so, and their product was less efficient for that reason). And surprisingly enough, there are (or at least there were) a ton of custom video players: many companies forked dash.js or hls.js and customized them, sometimes quite heavily.
- There's a serious privacy issue: the peers you're connected to know what video you're watching right now, and can identify you by your IP address. Maybe this isn't too big a deal when watching mainstream stuff, but for things like porn it can be a bit touchy…
[1]: https://www.lumen.com/en-us/edge-computing/mesh-delivery.htm...
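The player-integration point above boils down to a scheduling decision the loader must make per segment. A minimal hypothetical sketch (the function names and the 10-second watermark are illustration choices, not any real product's logic):

```python
CDN = "cdn"
PEERS = "peers"

def choose_source(buffer_seconds, peers_available, low_watermark=10.0):
    """Pick where to fetch the next video segment from.

    When the playback buffer is low, a stall is imminent, so go straight
    to the (fast, reliable) CDN. When the buffer is healthy, the loader
    can afford slower, flakier peers and save CDN bandwidth.
    """
    if not peers_available or buffer_seconds < low_watermark:
        return CDN
    return PEERS
```

This is exactly why XHR/fetch polyfills fall short: without knowing the player's buffer level and the ABR track in play, the loader can't make this call, and it is also why live streaming works better, since added latency directly grows `buffer_seconds`.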
by new_guy on 3/3/22, 7:28 PM
How many people fall for the latest buzzwords (like 'web3') without having the first clue what it is? Even investors aren't immune.
by jaaron on 3/3/22, 6:20 PM
Joost, backed by the founders of Kazaa & Skype, tried this starting in 2007. It failed, YouTube and other streaming approaches won.
Some of that was content, but some of that was that bandwidth became cheap enough.
by difosfor on 3/3/22, 12:38 PM
by mdtrooper on 3/3/22, 2:01 PM
But public organizations such as the BBC (in the UK), RTVE (in Spain), France Télévisions, and others should use this kind of technology, because it's good for open data, saves public money, and has other benefits.
by qeternity on 3/3/22, 7:25 PM
by vfistri2 on 3/3/22, 5:42 PM
by jdrc on 3/3/22, 3:00 PM
by kmeisthax on 3/3/22, 2:22 PM
P2P is a privacy nightmare, by design. You are asking everyone who wants to watch your video to also host it, which means that everyone watching the video gets to know the IP address of everyone else who watches that video.
Back in the early days of casual online piracy, music companies were happy to be able to sue service operators like Napster and get them shut down. However, when P2P services evolved to distributed-everything, it made it a lot harder to do that[0] since the only thing the service operators did was provide software to connect to their particular swarm protocol.
So they just joined the swarm, downloaded their own music, and then sued anyone who sent it back to them.
Now, imagine all of the copyright claim and takedown fraud that happens on YouTube, except instead of censoring one creator's video, they start suing the people who watched the video. Yeah, no thanks. Centralized video services have many problems, but legal liability on individual users for using the service as intended is not one of them.
Bonus points: given recent GDPR rulings on data exports[1] I would almost surely argue that any P2P swarm violates GDPR, because it turns every viewer of the video into a GDPR data controller, and any US peer in the swarm would constitute a GDPR data export into a privacy-hostile country.
[0] Grokster was sued on an "inducement" theory of liability that SCOTUS pinched from patent law, but that relies on the conduct of how the service operators and software providers advertised themselves to users.
[1] It is illegal to use Google Fonts on an EU website because of the US CLOUD Act and the fact that any subresource provider gets your IP address when you visit a website they service.
by corobo on 3/3/22, 4:39 PM