For some time, it has been self-evident to me that Livepeer and Swarm ought to work together more coherently.
This brief proposes a path towards an achievable minimum-viable integration, to act as a proof of concept to build on in future.
This position stems from considering the lifecycle of sharing video, which ultimately boils down to the following steps:
0. **Capture** - using camera and microphone, e.g.
   - cameras / microphones in smartphones, or
   - professional digital cameras / dSLRs connected to computers / encoders
   - often involves some form of performance or expression.
1. **Publish** - sharing is caring, and can help us all be radically transparent FTW.
2. **Transcode** - re-size to make the content accessible
   - use the `source` rendition (at e.g. `1080p` or `4K`) to generate "lighter" renditions (`720p`, `360p`, `144p`)
   - make it more "distributable" for slow connections, and more "playable" on weak devices.
3. **Distribute** - get the content to the people in the world who wish to watch live.
4. **Archive** - store `source` and transcoded renditions for watching later.
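As a rough illustration of step 2, here is a small sketch (plain Python; the bitrates are illustrative ballpark figures of my choosing, not Livepeer defaults) of how a rendition ladder shrinks what a viewer on a slow connection needs to download:

```python
# Illustrative rendition ladder: resolution -> approximate video bitrate (kbps).
# These numbers are ballpark figures chosen for the sake of the example.
ladder = {
    "1080p (source)": 5000,
    "720p": 2500,
    "360p": 800,
    "144p": 100,
}

source_kbps = ladder["1080p (source)"]
for name, kbps in ladder.items():
    # Percentage of the source bitrate each rendition requires.
    print(f"{name:>15}: {kbps:>5} kbps ({100 * kbps // source_kbps}% of source)")
```

On those (assumed) numbers, a `144p` viewer downloads around 2% of what the `source` would cost them - which is the whole point of the transcode step.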
Now, breaking this down:
- [x] 0. Expression and performance are abundant, smartphones with cameras / mics are ubiquitous, and digital media is thriving.
- [x] 1. Web apps such as [justcast.it](https://justcast.it/) are emerging as ways of easily publishing content.
- [x] 2. Livepeer Public Network does this already, in sub-realtime, with accounting via Probabilistic Micropayments and a Continuous Funding protocol, both running on Arbitrum Mainnet.
- [ ] 3. This currently involves "centralised alternatives" (CDNs), with some decentralised options emerging (e.g. Media Network on Solana). However, a trustable soul told me Livepeer also has something which seeks to solve this. TBC.
- [ ] 4. This also currently involves "centralised alternatives", with `go-livepeer`'s object storage functionality able to plug in to third-party APIs. Some experiments have also happened with Filecoin, and livepeer.com continues to pioneer ["Mint a Video NFT"](https://livepeer.com/docs/guides/video-nfts/mint-a-video-nft) functionality, using IPFS.
Both platforms run on EVM, with users transacting with `0x...` addresses/keys, so both projects share a core foundation from a user's point-of-view.
With Swarm starting to provide realistic Ethereum-based alternatives to IPFS, complete with minimum-viable storage incentives, the time feels right to start "leaning in" to working out how these two "baby giants" of the Ethereum ecosystem can begin to act like siblings, instead of distant relatives.
## But Chris, video is too heavy
In Prague, someone from the depths of the Swarm team told me that this kind of thing is not a good use case for Swarm: something about video being "too heavy".
I acknowledge this feedback, and agree that video is heavy. I am not defeated by it, however, for reasons which I will attempt to explain now.
When livestreaming, it is possible to configure the bitrates and framesizes to be very very very small. For example, I can publish a source stream containing `50kbps` of video data and `32kbps` of audio data, totalling `82kbps`. This might be at a framesize of `240p` (426x240 pixels) and a frame-rate of `10fps`, so the content may be slightly pixelated and even strobe-like. But it does work.
I can then use Livepeer Public Network to transcode this `240p` `source` into a "lighter" `144p` rendition, thus exercising step 2 in the process above. The streams then become available as a sequence of 2-second video segments, in `240p` and `144p`.
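To put concrete numbers on this, here is a back-of-the-envelope calculation in plain Python. The `82kbps` source figure comes from the bitrates quoted above; the `40kbps` figure for the `144p` rendition is my own illustrative assumption:

```python
# Back-of-the-envelope sizes for the 2-second segments described above.
KBPS = 1000            # kilobits per second

SEGMENT_SECONDS = 2    # the streams are segmented into 2-second chunks
SOURCE_KBPS = 50 + 32  # 240p source: 50 kbps video + 32 kbps audio = 82 kbps
RENDITION_KBPS = 40    # hypothetical 144p rendition bitrate (an assumption)

def segment_bytes(kbps: int, seconds: int = SEGMENT_SECONDS) -> int:
    """Size of one segment in bytes: kilobits/s -> bits -> bytes."""
    return kbps * KBPS * seconds // 8

print(segment_bytes(SOURCE_KBPS))     # 20500 bytes, ~20.5 KB per 240p segment
print(segment_bytes(RENDITION_KBPS))  # 10000 bytes, 10 KB per 144p segment
```

At that rate, a full hour of the `240p` source (1800 segments) totals roughly 37MB - tiny by modern storage standards.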
At this point, I would be looking to make a minimum-viable integration, whereby an Orchestrator (`livepeer/go-livepeer`) can push these 2-second video segments into Swarm. This would result in a maximum bitrate of ~100kbps, which I _really_ hope Swarm is capable of handling.
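A minimal sketch of what that push could look like, assuming a Bee node running on its default local port (`1633`) and a pre-purchased postage stamp batch (the batch ID below is a placeholder, and `video/mp2t` assumes MPEG-TS segments). The endpoint and header follow Bee's HTTP API (`POST /bzz`):

```python
# Sketch: push one transcoded video segment into Swarm via a local Bee node.
import json
import urllib.request

BEE_API = "http://localhost:1633"       # default Bee API port
POSTAGE_BATCH_ID = "<your-batch-id>"    # placeholder: a purchased postage batch

def build_upload_request(segment: bytes, name: str) -> urllib.request.Request:
    """Build the HTTP request that uploads one segment to Swarm."""
    return urllib.request.Request(
        url=f"{BEE_API}/bzz?name={name}",
        data=segment,
        method="POST",
        headers={
            "Content-Type": "video/mp2t",  # MPEG-TS, the usual HLS segment type
            "Swarm-Postage-Batch-Id": POSTAGE_BATCH_ID,
        },
    )

def upload_segment(segment: bytes, name: str) -> str:
    """Upload a segment and return its Swarm reference hash."""
    with urllib.request.urlopen(build_upload_request(segment, name)) as resp:
        return json.loads(resp.read())["reference"]

# Example (requires a running Bee node with a usable postage batch):
# ref = upload_segment(open("seg_00042.ts", "rb").read(), "seg_00042.ts")
# print(ref)  # content address, retrievable from any Bee node or gateway
```

The returned `reference` is the content address under which the segment can be retrieved from any Bee node or public gateway - which is what would make the distribute and archive steps censorship-resistant.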
Once we can prove this at a minimum-viable level, we can easily start to turn up the source bitrates, and observe where the bottlenecks occur, then fix them. Simples ;)
So, Livepeer and Swarm have been talking for years about how to work together. I haven't been involved in most of the conversations, but the tl;dr I get from them is not positive. Things appear too hard, or not priorities, or there are objections like "Swarm is on Gnosis" and "Livepeer is on Arbitrum", so bridging is needed.
This, for me, is a great shame, and a huge waste of an opportunity to make Livepeer a test-suite for Swarm. It also makes me wonder whether any evaluation of the feasibility of integration has only been done at the client level (`go-livepeer` <> `bee`), which provokes me to consider whether a more coherent integration could happen at the protocol level (`ethersphere/storage-incentives` <> `livepeer/protocol`), which could in turn rationalise a tangible integration at the client level.
So, if it were possible to connect the two together somewhere in the stack, and to deliver a coherent use-case for these two Ethereum-based sibling technologies, we would not only help build out testing infrastructure for the Swarm project, but also evolve a fully decentralised live video publishing, distribution and archiving solution for the world.
It could be used for everything from sharing art, creativity, beauty and performance, through to "little brother is watching you"-style mobile bodycams / dash-cams creating the data for things like community policing - all backed up to censorship-resistant storage, and discoverable via ENS.
OK, I acknowledge that I don't know much about the `storage-incentives` behind Swarm, but I know a fair amount about the mechanics of Livepeer. Given the shared underlying foundations (EVM) and the shared identity infrastructure (`0x...` public/private keypairs, ENS), it appears, to me at least, to be a feasible approach.
But I'm confident someone with more awareness of the Swarm project can contribute something here, which will either debunk my wild and ambitious theories about how the world of media infrastructure _could_ work, or strengthen them by bringing additional facts to this conversation.
Either way, I hope the significance of a "joined up" solution to web3 video isn't lost on you. It could serve as a censorship-resistant "audiovisual audit trail for humanity", where any device with a microphone (and camera) can publish live, with maximal accessibility, global distribution and permanent storage.
Soon enough, human souls would start to learn to only share the most beautiful and valuable content, to help them build their web3 reputations. And we might move towards a more beautiful, peaceful and creative world.
WDYT? Please leave a comment below.
## Notes

- justcast.it is a web app which simply connects to your device's camera and microphone and allows you to start publishing video and audio to a livestream. It works on smartphones, laptops, desktops, and (probably) smart TVs with cameras and microphones. The project is by @victorges and the source can be found here: https://github.com/victorges/justcast.it.
- Livepeer has a working implementation of Probabilistic Micropayments, integrated with the Livepeer Protocol. Primer: https://medium.com/livepeer-blog/a-primer-on-livepeers-probabilistic-micropayments-e16788b29331. Explainer: https://medium.com/livepeer-blog/streamflow-probabilistic-micropayments-f3a647672462, building on work by [D. Wheeler](https://www.semanticscholar.org/paper/Transactions-Using-Bets-Wheeler/6583c552f30c5a61abd7b10b88a226d377613f7b?p2df) and [R.L. Rivest](https://people.csail.mit.edu/rivest/pubs/Riv97b.pdf).
- Livepeer has a working prototype of a "continuous funding" protocol, with mechanically controlled inflation distributed to participants. This recently migrated to Arbitrum per this proposal, which was accepted by Livepeer Governance: https://forum.livepeer.org/t/a-proposed-scaling-strategy-for-the-livepeer-protocol/1583/4
- File.video was a product demo built to demonstrate a video hosting service powered by Filecoin and Livepeer. While the demo is no longer maintained, the code remains open source. See file.video or https://github.com/livepeer/file-video