Latest revision as of 17:18, 8 June 2023
How to super seed with a seedbox?
Abstract
This super seeding article is about seeding the maximum amount of data in the least possible time by utilizing multiple seedboxes, rather than the "traditional super seeding" of consecutive blocks being seeded (initial seeding). Read more at our blog: Super Seeding with Seedboxes. Here we show you how to get the maximum amount of data into the swarm, to serve thousands of users simultaneously.
We are going to use a multitude of seedboxes to achieve this.
Who is this targeted at?
Content publishers primarily, and anyone who needs to distribute a lot of data quickly. For example, Blizzard releases updates via BitTorrent and sees a huge spike of traffic upon new releases; the only concern is getting as much of the data out as fast as possible, not seeding from a single source like traditional "super seeding" is targeted to be.
This approach actually does not differ that much from what Blizzard does.
What's needed
You need multiple seedboxes in order to do this: at the very least one very fast seedbox and multiple "slave seeders". You can work with a few, up to dozens or even hundreds* of seedboxes. The type of seedbox depends on the type of data. For small data with a lot of downloaders, use small, fast ones (i.e. SSD Seedboxes or NVMe Seedboxes). For a big data set (i.e. 50+ GiB) you might need multiple HDD-based big-storage seedboxes.
The type and quantity of seedboxes depends upon your budget. The idea behind using many is to involve as many discrete I/O resources, IPs etc. as possible; all of these instances will have differing timings etc., ensuring faster connection speeds for new downloaders (leechers).
*) Ask support to help distribute your .torrent file to the hundreds of boxes; it should be scripted. The basic idea is to scp/rsync the file to the watch folder.
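A minimal sketch of such a distribution script. The usernames and hostnames are placeholders, not real slots; substitute your own. With DRY_RUN=1 it only prints the scp commands it would run, so you can verify the target list before pushing anything:

```shell
#!/bin/sh
# Push a .torrent file into the watch/ directory of every slave seedbox.
# Targets below are hypothetical examples -- replace with your own slots.

push_torrent() {    # $1 = torrent file, remaining args = user@host targets
    torrent=$1
    shift
    for target in "$@"; do
        if [ "${DRY_RUN:-0}" = "1" ]; then
            # Dry run: show the command instead of executing it.
            echo "scp $torrent $target:watch/"
        else
            scp "$torrent" "$target:watch/"
        fi
    done
}

# Preview what would be transferred (remove DRY_RUN=1 to actually copy):
DRY_RUN=1 push_torrent file.torrent \
    super@slave1.pulsedmedia.com \
    super@slave2.pulsedmedia.com \
    super@slave3.pulsedmedia.com
```

For hundreds of boxes, keep the user@host list in a text file and feed it to the function; the per-target command stays the same.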
Recommendation of resources
We recommend using many of our shared slots. This way you get access to as many discrete resources as possible with the least amount of money - you don't necessarily need a big-budget cluster of dedicated servers.
For example, get an SSD Seedbox for the initial source, then a bunch of 10Gbps Seedboxes for a single large torrent. Vice versa if you have a smaller torrent or multiple torrents.
How many resources do I need?
First we need to determine your target amount of data seeded (X) in the allotted time (Y). Also remember: the more individual instances there are, the more stable the speeds will be.
If X=5TiB and Y=1 week:
The formula goes: X in MiB / Y in seconds == bandwidth required in MiB/s
5 242 880 MiB / 604 800 s == ~8.7 MiB per second.
In this instance, you will do fine with just a single SSD Seedbox.
X=1 000TiB, Y=1 week:
1 048 576 000 MiB / 604 800 (seconds) == 1 733.8 MiB per second, or roughly 14.5 Gbps.
Since on shared slots we should account for at most 10% of the bandwidth on 10Gbps Seedboxes, we get 1Gbps per 10Gbps slot: ~110 MiB/s until the traffic limit is met, then ~10 MiB/s.
Dedicated servers in practice sustain about 50% of the bandwidth on 1Gbps, although you might be able to achieve 90% for a few days. 1Gbps dedicated: ~55 MiB/s.
In this case we should probably go for dedicated servers instead of shared seedboxes; 40x entry-level dedicated servers will do this fine.
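The arithmetic above can be sketched as a small helper, using the per-node rates quoted in the text (~110 MiB/s per shared 10Gbps slot, ~55 MiB/s per 1Gbps dedicated server):

```shell
#!/bin/sh
# Required bandwidth = data (TiB, converted to MiB) / time (seconds).

required_mibs() {   # $1 = data in TiB, $2 = time in seconds
    awk -v tib="$1" -v secs="$2" 'BEGIN { printf "%.1f", tib * 1024 * 1024 / secs }'
}

WEEK=604800   # seconds in one week

required_mibs 5 "$WEEK";    echo " MiB/s for 5 TiB in a week"      # ~8.7
required_mibs 1000 "$WEEK"; echo " MiB/s for 1000 TiB in a week"   # ~1733.8

# Minimum 1Gbps dedicated servers for the 1000 TiB case, at ~55 MiB/s each
# (rounded up; the article picks 40 for extra headroom):
awk 'BEGIN { print int((1733.8 + 54.9) / 55) " servers minimum" }'
```

32 servers is the bare minimum at those rates; the 40 suggested above leaves headroom for slow periods and traffic limits.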
SUPER SEEDING!
Torrent creation
Create your .torrent file with your favorite torrent client on the node you pick as the main node. Choose your preferred trackers etc.
There's really nothing special on this stage.
Transferring .torrent file AND warming things up
We have multiple ways to do this, here's the preferred way.
- Log in via SSH to the node where you created the torrent file
- Go to the directory containing your torrent file; we'll call it "file.torrent"
- Command to transfer the file: scp file.torrent USERNAME@SERVER:watch/
Transfer it to the 2nd node. If your username is super2 and the server name is seeder2.pulsedmedia.com, the command would be: scp file.torrent super2@seeder2.pulsedmedia.com:watch/ Watch directory: you can load torrent files here, and they are automatically loaded in a while.
At this stage we should take a short coffee break to allow the 2nd 1Gbps node to snatch the data! This way we get through the 2nd stage much quicker.
Warm up the superseeding swarm
Now repeat the SCP for all the slave seeder services, for example:
scp file.torrent super@slave1.pulsedmedia.com:watch/
scp file.torrent super@slave2.pulsedmedia.com:watch/
scp file.torrent super@slave3.pulsedmedia.com:watch/
scp file.torrent super@slave4.pulsedmedia.com:watch/
....
Continue until you have it on all nodes. The slaves should get the data at a combined rate of about ~2000MB/s or more and accelerating all the way. Check the state from the last loaded slave node.
Release the SWARM!
This depends on the size of the data. If it's just a few gigabytes, it's probably safe to release quite early on, depending on how fast you think your end users will start snatching it up. If thousands of end users will start loading it within seconds of release, it's better to wait until the slaves are at 100% or very near 100%; otherwise there will be serious hiccups at the last few % for the swarm, delaying the end users from getting their data. That's something none of us wants!
Let's assume your package is 10GiB and you expect 1000 users to snatch it up immediately; that's 10 000 gigabytes loaded up immediately. Seeding this all will take ~16hrs 21minutes, not accounting for end user to end user seeding. It's however likely you will see initial speed spikes in the range of 3000 MiB/s, so the first few users will get it much quicker. As time goes on, the seeding speed decreases because the fast peers have already finished and the slow ones keep dragging it out and consuming upload slots. You can re-boost the performance by simply restarting the torrent client on the slave(s).
The total transfer target in this example is about 100TiB: for the 10GiB package that is more than 10 000 end users, and if the package is just 1GiB, the same 100TiB accounts for 100 000 end users. That's quite a few! And that is ignoring the end user to end user seeding.
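The timing above follows from time = total data / aggregate upload rate. A small calculator, where the ~174 MiB/s aggregate rate is an assumption chosen to reproduce the ~16h21min figure in the example; plug in your own cluster's sustained rate:

```shell
#!/bin/sh
# Seeding time in hours = data (GiB, converted to MiB) / rate (MiB/s) / 3600.

seed_hours() {   # $1 = total data in GiB, $2 = aggregate upload rate in MiB/s
    awk -v gib="$1" -v rate="$2" 'BEGIN { printf "%.1f", gib * 1024 / rate / 3600 }'
}

# 10 000 GiB at an assumed sustained ~174 MiB/s aggregate:
seed_hours 10000 174; echo " hours"   # ~16.3 hours, i.e. ~16h21min
```

At the 3000 MiB/s spike rate mentioned above the same transfer would take under an hour, which is why the first wave of users finishes so much faster than the tail.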
Speeding things up even further
There are ways to speed things up even further, and we'll be happy to help you in that regard. Just contact sales, tell us your target seeding time and amount of data, and ask what we can do for you.
If you are a content publisher looking to provide super seeding for hundreds of terabytes to dozens of petabytes of data quickly, we are happy to help you achieve that goal cost-effectively. We are happy to do some research and development to make it as smooth and cost-effective as possible for you, all the way to a complete turnkey solution. Contact sales@pulsedmedia.com