from Hacker News

Avoid FIFA World Cup CDN Issues

by shacharz on 6/12/18, 3:24 PM with 52 comments

  • by isostatic on 6/12/18, 4:21 PM

    What amazes me is the lack of concern about latency in web streaming.

    We use the internet (and IP in general) to stream video. At high bitrates (200Mbit+) we aim for sub-100ms end to end; for compressed services we're happy with 500ms, maybe up to a second if it's something like Sydney to London over the internet.

    I was in a control room a couple of weeks ago watching some football. There were two displays: one showed the feed from the stadium, the other the feed from the web streaming service.

    There were cheers and then groans from the live end of the room. Nearly a minute later, someone on the web end started running up the field to score. Of course, I already knew it wouldn't be a goal: not only had the people watching the live feed told me, but Twitter was abuzz.

    One minute of end-to-end delivery latency is shocking for this type of program. Heck, 10 seconds is bad enough.

  • by isostatic on 6/12/18, 4:36 PM

    The BBC is doing UHD, with HDR (HLG), for the World Cup, with the top stream at 36Mbit/second [0]

    There is a limited number of "spaces" available [1] -- I think it's up to 100Gbit of output.

    Unlike the FA Cup (where the BBC did a UHD trial), the World Cup will have a lot of games during the week, when people will be watching from the office (although probably not in UHD). This will mean far higher load on distribution.

    Fortunately, England's three games are all either at the weekend or at 7PM. The second half of the tournament will really stress the UK internet, though, with both the World Cup and Wimbledon on during the working week.

    [0] https://www.bbc.co.uk/rd/blog/2018-05-uhd_hdr_world_cup_2018 [1] http://www.bbc.co.uk/mediacentre/latestnews/2018/uhd-vr-worl...
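
    (As a rough sanity check on those numbers, assuming the ~100Gbit figure: 100 Gbit/s ÷ 36 Mbit/s ≈ 2,800 concurrent top-bitrate UHD streams, which is why the number of "spaces" is so limited.)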

  • by lossolo on 6/12/18, 5:08 PM

    Some time ago I started building P2P live adaptive (DASH) video streaming using WebRTC with a distributed rate-control mechanism, and I'm planning to open source it. Basically, you could use it to build your own P2P globally distributed live adaptive video streaming CDN (or run it on a single server). Adding a supporting server (for extra bandwidth) is as easy as spawning a VM and launching a binary. It includes a distributed signaling server with geo/region-based peer distribution, full real-time statistics on the health of the whole network, and analytics, and the network automatically adapts to bandwidth shortage (if for some reason it can't sustain itself) by switching to lower-bandwidth versions of the stream. It's very easy to use: you set one config file, launch two binaries, add one JS file to your site's source, and you're ready to go (a rough sketch of the peer-first fetch path follows below).

    Anyone interested in this?
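
    (A minimal TypeScript sketch of the peer-first fetch path described above, under stated assumptions: getSegment and requestFromPeer are hypothetical names, the one-message request/response protocol is invented for illustration, and signaling/peer selection are assumed to have happened elsewhere. This is not the project's actual API.)

      // Sketch: serve DASH segments peer-first, with the CDN as fallback.
      // Assumes `peers` holds RTCDataChannels already negotiated via the
      // signaling server (geo/region-based peer selection is out of scope).
      async function getSegment(
        url: string,
        peers: RTCDataChannel[],
      ): Promise<ArrayBuffer> {
        for (const peer of peers) {
          const data = await requestFromPeer(peer, url); // try peers first
          if (data) return data;                         // a peer had it cached
        }
        const res = await fetch(url);                    // fall back to the CDN
        return res.arrayBuffer();
      }

      // Ask one peer for a segment over its DataChannel; resolve with the
      // bytes, or null if the peer doesn't answer in time.
      function requestFromPeer(
        peer: RTCDataChannel,
        url: string,
        timeoutMs = 300,
      ): Promise<ArrayBuffer | null> {
        return new Promise((resolve) => {
          const timer = setTimeout(() => resolve(null), timeoutMs);
          peer.onmessage = (ev) => {           // simplistic one-shot protocol
            clearTimeout(timer);
            resolve(ev.data instanceof ArrayBuffer ? ev.data : null);
          };
          peer.send(url);                      // request the segment by URL
        });
      }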

  • by sergiotapia on 6/12/18, 3:56 PM

    Life pro tip: just use an Ace Stream and have zero connectivity issues.

    https://www.reddit.com/r/soccerstreams/comments/82ac7j/acest...

    My brother bought the pay-per-view ticket to watch McGregor vs. Mayweather and it was a horrible experience.

    I booted up my laptop and ran an Ace Stream: boom, crystal-clear high-definition picture with zero network issues.

  • by lephty on 6/12/18, 3:53 PM

    Is there anything that can be done from the user's perspective to improve the streaming experience, beyond bandwidth?

  • by CapacitorSet on 6/12/18, 4:45 PM

    The P2P approach is interesting. Do you have data on how much bandwidth is saved at peak usage?

    It might be especially interesting when many users share the same connection, effectively achieving broadcasting - the CDN pushes the data to one client, which then rebroadcasts it to the local network.
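
    (For intuition: if N viewers share a connection and one of them seeds the other N-1, the CDN's egress for that network drops from N streams to 1, a saving of (N-1)/N, i.e. 75% for four viewers.)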

  • by petepete on 6/12/18, 5:06 PM

    I watched the UCL final via YouTube this year, and the Bet365 web app kept telling me about goals half a minute before they happened on screen. To be honest, if you're watching in isolation the delay doesn't really matter.

  • by cflat on 6/12/18, 5:04 PM

    Don't use WebRTC for live video. It doesn't scale with CDNs. Use DASH/HLS.

  • by btown on 6/12/18, 4:35 PM

    Software for scaling up live-streaming CDN points of presence (POPs) is a pretty crazy domain. For on-demand video, you can think of a CDN as a cache serving known-ahead-of-time chunks. But what about live streaming? It's not feasible to stream frame-by-frame directly from your encoding backend to all the viewers of the World Cup over something like RTMP - you'd want to use a CDN. So typically, you distribute meaty (multi-second) HLS segments as individual video files, or collections of files, to your CDN; once available, they then need to be requested by browsers/mobile clients as whole segments, over HTTP(S). This works well with existing CDN infrastructure (provided it can handle the write volume and has big enough inbound pipes)... but the huge issue is that the segment length plus round trips is a lower bound on effective latency. And when interactivity is required, multi-second delays can be horrible.
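
    (A minimal sketch of that latency floor, assuming a standard live HLS media playlist; the 6-second segment length, the 1-second polling interval, and the function names are illustrative assumptions:)

      // Sketch of why segment-based live delivery has a latency floor: a
      // naive live-HLS client only ever sees segments that are complete.
      // By the time a segment's URL appears in the playlist, its first
      // frame is already ~SEGMENT_SECONDS old, before any round trips or
      // player buffering are added on top.
      const SEGMENT_SECONDS = 6; // illustrative, not a spec requirement

      async function pollLive(
        playlistUrl: string,
        play: (segment: ArrayBuffer) => void,
      ) {
        let lastSeen = "";
        for (;;) {
          const playlist = await (await fetch(playlistUrl)).text();
          // Non-comment lines of a live media playlist are segment URIs;
          // the last one is the newest completed segment.
          const segs = playlist.split("\n").filter((l) => l && !l.startsWith("#"));
          const newest = segs[segs.length - 1];
          if (newest && newest !== lastSeen) {
            lastSeen = newest;
            const res = await fetch(new URL(newest, playlistUrl).toString());
            play(await res.arrayBuffer()); // frames already >= SEGMENT_SECONDS old
          }
          await new Promise((r) => setTimeout(r, 1000)); // re-poll the playlist
        }
      }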

    https://www.wowza.com/blog/hls-latency-sucks-but-heres-how-t... is a great writeup. Another overview of the problem, and a proposed solution, is in this excellent article from Twitter's Periscope team:

    https://medium.com/@periscopecode/introducing-lhls-media-str...

    > In HLS live streaming, for instance, the succession of media frames arriving from the broadcaster is normally aggregated into TS segments that are each a few seconds long. Only when a segment is complete can a URL for the segment be added to a live media playlist. The latency issue is that by the time a segment is completed, the first frame in the segment is as old as the segment duration... By using chunked transfer coding, on the other hand, the client can request the yet-to-be completed segment and begin receiving the segment’s frames as soon as the server receives them from the broadcaster.
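
    (In browser terms, the chunked-transfer idea in that quote looks roughly like the sketch below. The in-progress segment URL and the onChunk hand-off, e.g. appending to an MSE SourceBuffer, are assumptions; the server has to cooperate by answering with Transfer-Encoding: chunked while the segment is still being written.)

      // Sketch: request a segment that is still being produced and consume
      // its bytes as they arrive, instead of waiting for the whole file.
      async function streamGrowingSegment(
        url: string,                          // an in-progress segment (hypothetical)
        onChunk: (bytes: Uint8Array) => void, // e.g. append to a SourceBuffer
      ) {
        const res = await fetch(url);  // chunked response grows as frames land
        const reader = res.body!.getReader();
        for (;;) {
          const { done, value } = await reader.read();
          if (done) break;             // the broadcaster finished the segment
          if (value) onChunk(value);   // frames are only one network hop old
        }
      }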

    And Twitch's follow-up challenge:

    https://blog.twitch.tv/twitch-invites-you-to-take-on-the-icm...

    > This Grand Challenge is to call for signal-processing/machine-learning algorithms that can effectively estimate download bandwidth based on the noisy samples of chunked-based download throughput.
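
    (Not a solution to the challenge, but a sketch of the kind of baseline it is trying to beat: an exponentially weighted moving average over per-chunk throughput samples. The class name and the 0.2 smoothing factor are assumptions; the point is that per-chunk bytes/seconds samples are exactly the noisy input the challenge describes.)

      // Baseline bandwidth estimator: EWMA over per-chunk throughput
      // samples. With chunked transfer, a single chunk's bytes/seconds is
      // very noisy (chunks can arrive back-to-back from a buffer or trickle
      // in at the encoder's rate), which is what makes estimation hard.
      class EwmaBandwidthEstimator {
        private estimateBps = 0;
        private initialized = false;

        constructor(private readonly alpha = 0.2) {} // smoothing factor (assumed)

        addSample(bytes: number, seconds: number): void {
          if (seconds <= 0) return;       // guard against zero-duration timings
          const sampleBps = (bytes * 8) / seconds;
          this.estimateBps = this.initialized
            ? this.alpha * sampleBps + (1 - this.alpha) * this.estimateBps
            : sampleBps;
          this.initialized = true;
        }

        // Feed this into ABR logic to pick the highest sustainable rendition.
        get bitsPerSecond(): number {
          return this.estimateBps;
        }
      }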

    (IMO) If you're thinking that this is all rather silly, and that live video streaming is not something that should be done over HTTP in the first place... there are a lot of reasons why it ended up this way. All the CDN POPs are optimized for HTTP GET requests rather than stateful sessions, and Apple's smiting of Flash removed a lot of the incentive for innovation on RTMP servers. The ironic thing is that internet connectivity is fast and reliable enough nowadays that RTMP might have been able to escape its association with "buffering" spinners, and it would provide a much lower-latency experience. Hopefully there will be better standardization in the future as live video becomes more mainstream.