Here's a #question for you fedizens interested in #decentralisation and media distribution: is there a solution to do a #livestream (of audio or video) in a decentralised way, kind of like #Webtorrent but to broadcast, say, a #DJ mix or #radio show? Everyone who is listening is also a relay for the data. I can see that as something really empowering for people who want to #broadcast in a scalable way but are limited by their bandwidth and the gamble of renting a server. (boost appreciated)
@stragu i'm very interested in any solutions in terms of audio and video in a decentralized, p2p way. for example a PeerTube but for live broadcasting.
@luka I hope that, if #PeerTube supports it eventually, it also supports posting audio only / a "video" that only has a single frame (unlike YouTube, which, from what I heard, still streams a massive video even though the image is one single picture). @chocobozzz , @Framasoft , @peertube , want to chime in?
@stragu It's of no help to you but that is reportedly what Spotify does. Perhaps they patented it.
@pithi that's actually really interesting that they use p2p distribution! But do they have live features? I did a quick search and it doesn't look like it.
#WebTorrent can already stream the first blocks of data so you can start watching something before the whole file is downloaded (which I think is also what PeerTube uses), so I assume it wouldn't need too much more to support livestreams.
Maybe #LBRY is another protocol to look out for?
@pithi I found this presentation that talks about some of the details of the Spotify p2p system: http://www.cs.ox.ac.uk/people/amir.payberah/web/files/download/slides/p2p_cdn.pdf
... but it looks like they dropped it back in 2014? https://techcrunch.com/2014/04/17/spotify-removes-peer-to-peer-technology-from-its-desktop-client/
I'm confused by this statement: "[...] using peer-to-peer in addition to direct downloads actually adds a bit of overhead"
@stragu I really don't know much about it other than that they used it at the start. I would imagine coordinating between p2p and direct download causes the additional overhead they're talking about. Y'know, "does this go p2p or direct?" and "who is the fallback, and when?" would make a hybrid inefficient, since it would have to be decided on a per-block basis for the p2p side.
@pithi right, that makes sense. I guess I was simplifying "overhead" as "bandwidth" :)
@stragu I'd expect there to be some protocol overhead too, which would be a small amount of bandwidth. Also, falling back to direct from p2p would probably still leave you exposed to very late p2p blocks duplicating direct download blocks. So a modest amount of bandwidth eaten there too? TBH I find it hard to see how you would avoid such overhead with any dual algorithm channel method.
@stragu I.e. you're effectively maintaining 2+n network sessions. 1 for direct, 1 as primary / stub for p2p and n for subsequent p2p partners. There's got to be node and link overhead for both.
Am I making sense?
@pithi I think so! Although your n is supposed to be on the node itself, not on Spotify's end, right? I guess the benefit of going fully distributed is that it's "simpler", and you can flip the issue of many nodes requesting blocks into a non-issue + leverage physical proximity for speed, like IPFS and the like are planning to do.
I need to read up on technical documentation around distributed networks, really. Then I can help on some new decentralised libre music distribution software! :)
An important characteristic of the BitTorrent protocol is its tit-for-tat algorithm - you get a few chunks of the file for free, but to get more, you need to swap with another peer and give them a chunk they don't have.
This way, BitTorrent defends from free-riding and incentivizes peers to contribute to the network.
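To make the tit-for-tat idea concrete, here is a toy Python sketch of the choking logic it implies (the function name, the data shapes, and the "3 slots" figure are made up for illustration — the real BitTorrent algorithm is more involved): keep feeding the peers that have been feeding you, plus one random free slot so newcomers can get their first chunks.

```python
# Toy model of BitTorrent-style tit-for-tat: a peer "unchokes"
# (keeps uploading to) the partners that have recently uploaded
# to it, plus one "optimistic" slot so newcomers can bootstrap.
import random

def choose_unchoked(upload_from, candidates, slots=3):
    """Pick which peers to keep feeding.

    upload_from: dict peer -> bytes recently received from that peer
    candidates:  peers currently interested in our chunks
    """
    # Reward the best reciprocators first...
    by_contribution = sorted(candidates,
                             key=lambda p: upload_from.get(p, 0),
                             reverse=True)
    unchoked = by_contribution[:slots]
    # ...and give one random remaining peer a free slot
    # (the "optimistic unchoke" that lets free-riders prove themselves)
    rest = [p for p in candidates if p not in unchoked]
    if rest:
        unchoked.append(random.choice(rest))
    return unchoked

received = {"alice": 4096, "bob": 0, "carol": 10240, "dave": 512}
print(choose_unchoked(received, list(received)))
```

The point of the sketch: a peer that never uploads ("bob") only ever gets data through the random slot, which is exactly the incentive pressure the livestreaming case loses.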
The problem I see with livestreaming is that everybody wants only the latest chunk, and nobody cares about old chunks.
To balance latency with bandwidth, you'd probably want a stable spanning tree, starting from the stream source and branching through different peers.
But if peer A is downstream of B, then A can't do anything for B. So B has no reason to feed A.
You'd need someone upstream of B to punish B for not feeding A, but how would that someone know that B is not feeding A?
One way would be for peers to have some kind of stable identities, but that opens another can of worms...
I hope I missed some possibility, and that there is a way to do it in a robust and secure way.
@Wolf480pl I'm not sure if I get what the issue is with the original idea. In my (very uninformed) imagined implementation, there would be the option to keep a "recording" of the whole stream too for later sharing, or only broadcast live. Peers keep chunks they have already seen, depending on that + how many they are required to keep + how much space is allocated on their drive. The original seed keeps everything. People who arrive "late" can choose "from start" or (close-enough-to-) "live". 1/2
@Wolf480pl Using the service requires users to share chunks. Availability of chunks / smoothness of stream is guaranteed by adding a bit of latency to make a more stable sharing tree (with equidistant peers?) that, admittedly, would for the most part go one way (except: why not allow people to pause the stream if they want to? which would swap downstream and upstream). Peers with good, available upload bandwidth would share to extra downstream peers. Identity: WebTorrent uses IP addresses. 2/2
@stragu I don't quite understand the "sharing tree" part.
But if you have N peers and can accept O(N) latency, then it could do it like this:
The seed sends each chunk to a different peer. So each peer gets approx. every Nth chunk from the seed, and has to trade with the rest to fill the gap. Your latency would be determined by the longest path from you to the most distant peer.
The seed could send each chunk to 2 peers or sth, that'd make it a bit more robust.
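That round-robin scheme can be sketched in a few lines of Python (the function names, peer labels, and the "2 copies" fanout are illustrative, not any real protocol): the seed pushes chunk i to peer i mod N, so each peer gets roughly every Nth chunk for free and has to trade for the rest.

```python
# Toy round-robin seeding: the seed pushes chunk i to peer i % N,
# optionally to `copies` consecutive peers for robustness.
def seed_assignment(num_chunks, peers, copies=1):
    """Map each chunk to the peer(s) that receive it directly from the seed."""
    n = len(peers)
    return {c: [peers[(c + k) % n] for k in range(copies)]
            for c in range(num_chunks)}

def chunks_from_seed(assignment, peer):
    """Which chunks a given peer gets for free; the rest must be traded for."""
    return [c for c, owners in assignment.items() if peer in owners]

peers = ["A", "B", "C"]
a = seed_assignment(6, peers, copies=2)
print(chunks_from_seed(a, "A"))  # -> [0, 2, 3, 5]
```

With copies=2 each chunk starts on two peers, so losing any single peer never makes a chunk unobtainable — that's the robustness point above.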
>Identity: IP addresses
@stragu Live streaming appears to be on the peertube roadmap: https://wedistribute.org/2019/04/check-out-whats-on-peertubes-2019-roadmap/
@cstanhope Thanks for the info! It looks like the groundwork for it being implemented is being done: https://github.com/Chocobozzz/PeerTube/projects/4#card-16877359
Apparently, the most promising bit of technology that could be used to implement livestreaming in PeerTube is this: https://github.com/Novage/p2p-media-loader
I guess you should try WebRTC.
@lambda_meadow that's a very good point, I forgot about this obvious contender! Haven't read up on it at all just yet.
A quick ducking showed me this article - https://servicelab.org/2013/07/24/streaming-audio-between-browsers-with-webrtc-and-webaudio/
I think it's a good place to start.
I've got some JS experience and the project looks interesting, so if you make your own app, I hope I can help a little bit.
@lambda_meadow will make sure to hit you up if I get stuck into it! Thanks :)
@lain between 0 and 1, thereabouts 👌🏾
@stragu 1. You would still need the bandwidth (it would be like torrenting, but at best you only seed at ratio = 1, and you must seed fast enough) 2. If you stop seeding then nobody can continue listening to your broadcast (since the data doesn't exist anywhere else yet, everybody must get it from your machine)
There is an incompatibility between streaming/broadcasting and p2p.
@jenny definitely no way around 2; but for 1: doesn't the latency allow constructing a tree of peers? Then each peer can seed downstream to 3 people to make it sturdy enough, with up to (close to) peers^3 places to get the data from for people joining. Which for HD video is admittedly tricky, but shouldn't be an issue for just sound.
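Rough numbers for that fanout-3 tree idea, as a toy Python sketch (the formulas are just the standard geometric series for a full tree; the figures are illustrative, not from any real deployment): the audience grows geometrically with tree depth while every relay's upload stays at ~3x the stream bitrate.

```python
# Toy numbers for a fanout-3 relay tree: each peer re-seeds to `fanout`
# others, so capacity grows geometrically with depth (= hops of latency).
def tree_capacity(fanout, depth):
    """Total peers in a full tree of the given fanout and depth (root = depth 0)."""
    return (fanout ** (depth + 1) - 1) // (fanout - 1)

def depth_needed(fanout, listeners):
    """Smallest depth whose full tree holds at least this many listeners."""
    d = 0
    while tree_capacity(fanout, d) < listeners:
        d += 1
    return d

print(tree_capacity(3, 4))      # 121 peers within 4 hops of the source
print(depth_needed(3, 10_000))  # 9 hops of added latency to reach 10k listeners
```

So even a 10k-listener audience only costs ~9 relay hops of latency, which is why a little buffering makes the tree approach plausible for audio.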
@stragu well, you’ll be losing the whole point of broadcasting: synchronization between listeners. Or maybe it isn’t an issue in your case?
@stragu ofc you would save bandwidth, but you would still need a fast ISP. I think you're better off streaming from a centralized service, then archiving your streams on ipfs/BitTorrent/…
@stragu Livepeer is working on that.
@pepesza didn't know about them! Thanks for sharing!
@cblgh oh wow thank you for sharing this one! Looks amazing, but I'm concerned that it hasn't seen any commit for more than a year. Have you tried it at all?
@stragu sadly no, not recently. it's open source though and the deps it relies on are pretty stable :)