I wonder: if I put together such a Frankenstein on a good consumer board with a good CPU, with 4 cards of 20x SATA3 each (~2 PiB), would the motherboard's PCIe lanes be enough to operate purely as a farmer?
Of course, the power supply would be provided by additional power supplies.
Because I suspect that if you approached it professionally and thought something like this through, the CPU/motherboard lanes wouldn't be enough?
Alternatively, assuming only three such cards working in tandem with a GPU and compressed plots - is there a chance of reasonable lookup times with low latency?
OK, I know you'll advise LSI HBAs, SFF-8087 cabling, etc.
But has anyone tried such an “amateur” composition?
I've had a card like this with 16 ports.
I think the most I had connected to it was 10 drives at a time.
It worked, but not great. If you fill it with 20 drives, I think you're going to get issues with lookup times.
I'm not even sure max throughput is at fault; it seemed to me more like a latency/queuing issue with the shitty chips on these things when trying to access multiple drives at the same time.
Besides, the cabling will be a nightmare. You'd have to run a shitload of SATA cables, and they don't exactly come in 5-meter lengths.
The main advantage of SAS in that case is that you can use expander boards (that just need power, no pcie connection), which greatly improves the cabling options.
Unfortunately, I haven't seen any PCIe 4.0 ones anywhere.
Currently, I have two of them - 8 SATA each.
Everything is fine as long as you don't want to exchange large amounts of data continuously and simultaneously across different drives.
In farming itself, lookup latency is 300 ms on average.
Cables are 1 meter.
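If you want to verify that 300 ms figure on your own farm, the harvester writes a lookup time into every eligibility line of its log. Here's a minimal sketch - it assumes the chia 1.x `debug.log` line format (`... Found N proofs. Time: 0.30125 s. Total 120 plots`), so adjust the regex if your version logs differently:

```python
import re
from statistics import mean

# Matches the "Time: X.XXXXX s" field in harvester log lines.
# NOTE: the log format is an assumption based on chia 1.x; check yours.
LOOKUP_RE = re.compile(r"Time: ([\d.]+) s")

def lookup_times(lines):
    """Extract plot-lookup durations (seconds) from harvester log lines."""
    times = []
    for line in lines:
        m = LOOKUP_RE.search(line)
        if m:
            times.append(float(m.group(1)))
    return times

# Demo on fabricated sample lines; in practice, read them from
# ~/.chia/mainnet/log/debug.log instead.
sample = [
    "harvester chia.harvester.harvester: INFO 1 plots were eligible "
    "... Found 0 proofs. Time: 0.30125 s. Total 120 plots",
    "harvester chia.harvester.harvester: INFO 0 plots were eligible "
    "... Found 0 proofs. Time: 0.25000 s. Total 120 plots",
]
times = lookup_times(sample)
print(f"avg {mean(times):.3f} s, max {max(times):.3f} s")
```

As long as the max stays well under the 5-second warning threshold, the cheap controllers aren't hurting you yet.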
I put the rest in external USB-C JBODs.
And why do I ask?
Recently I’ve had some time and I’ve been thinking about creating something steampunk.
Here, even a large number of cables would be desirable.
The stand design would take the longest.
If you arrange the disks nicely (e.g. in the shape of an accelerator) with interesting metal cooling fins, it could be one of the coolest chia farms in the world…
If you’ve got “good” components everywhere else… don’t add a bunch of craptacular SATA port multipliers to the mix and create an unreliable mess.
I don't know what you'll be holding the drives in, but starting with something as small as an 8-port SAS3 HBA, then scaling with expanders (which don't need PCIe slots, just power) and the appropriate cables, is a much more reliable and performant way to go.
I’ve got a couple machines running 10 port SATA expanders on a PCI-E 3 x4 interface.
They handle the load of farming without issues with an old Bulldozer 8xxx series 8-core in one and a 4 core A10 in the other.
Farming does NOT use a ton of bandwidth.
Keep in mind I've got a 71-drive machine being farmed ENTIRELY over a 1 Gbit LAN, no issues.
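A back-of-envelope calculation backs that up. Every constant below is an assumption (plot count, the 1/512 plot filter, ~8 random reads of ~16 KiB per passing plot), but even with generous numbers the average farming traffic is tiny:

```python
# Back-of-envelope farming bandwidth; every constant here is an assumption.
plots = 2000                 # e.g. a 71-drive farm full of plots
sp_per_day = 4608            # signage points per day (one every ~9.4 s)
plot_filter = 512            # only 1 in 512 plots is eligible per signage point
reads_per_lookup = 8         # ~7-9 random reads per quality check (assumed)
bytes_per_read = 16 * 1024   # rough read size per seek (assumed)

eligible_per_sp = plots / plot_filter
bytes_per_day = sp_per_day * eligible_per_sp * reads_per_lookup * bytes_per_read
print(f"~{bytes_per_day / 1e6:.0f} MB/day, "
      f"~{bytes_per_day / 86400 / 1e3:.1f} kB/s average")
```

A few dozen kB/s averaged across the whole farm - which is why even a single PCIe lane, or gigabit LAN, is overkill for farming lookups.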
Plotting, on the other hand, is where you want more bandwidth. This sort of setup seems OK when plotting one drive at a time, but I wouldn't trust it for reliability or speed when plotting to more than one drive at once.
Do any of you have experience with such expanders? This could add another level of "uncertainty" for those 80 SATA ports…
Then you could display these cards nicely (so they aren't covered and you can also point fans at them) and route the cables to the HDDs more sensibly. The wires would be black, without bundles - it would make the design look more aggressive.
But who can tell me whether these 80 SATA ports on 4 cards would actually work…
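Nobody can say for sure without trying, but the lane math is easy to sketch. A rough worst-case estimate, assuming ~250 MB/s sequential per HDD and ~0.985 GB/s usable per PCIe 3.0 lane (both figures are assumptions). The worst case only matters for plotting or full-disk scans - farming lookups read a few KiB at a time, so even a x1 link handles those fine:

```python
# Rough lane math for "20 SATA ports on one card"; all numbers are assumptions.
drives_per_card = 20
hdd_seq_mb_s = 250           # sustained sequential speed per HDD (assumed)
pcie3_lane_gb_s = 0.985      # usable PCIe 3.0 bandwidth per lane, GB/s (approx)

# GB/s if ALL 20 drives stream sequentially at once (plotting/scrub worst case).
aggregate = drives_per_card * hdd_seq_mb_s / 1000

for lanes in (1, 4, 8):
    budget = lanes * pcie3_lane_gb_s
    verdict = "enough" if budget >= aggregate else "bottleneck"
    print(f"x{lanes}: {budget:.1f} GB/s vs {aggregate:.1f} GB/s worst case -> {verdict}")
```

So a cheap x1 or x4 card chokes only when every drive streams at once; for pure farming, the electrical link width is the least of your worries - the queuing behavior of the cheap controller chips is.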
One $10 120 mm fan can easily cool 4-5 HDDs at low speed. That chunk of metal will initially absorb some heat but will quickly get saturated if the extra airflow isn't there. There are even $14 200 mm fans on Amazon, if those can work for your setup: https://www.amazon.com/dp/B00J0NZFIA.
From one of them (the bottom one, invisible) we would blow air with fans,
and on the visible top one there would be this radiator.
I’m getting more and more excited to… do it.
A farm with a mean, all-black look.
I need to make an estimate.
Think.
Unfortunately, I have been away from home and the workshop with my tools for 3 years.
It's a pity I'm not in Pakistan or India - I wonder who would assemble such a custom skeleton for me here.
There are a lot of talented people there who would cooperate at a reasonable price.
Additionally, computer components are expensive here (I always thought it was expensive in Europe!).
You'd definitely need to disassemble a PC case to get the motherboard mount.
Due to the weight of even just 80 HDDs, some beautiful black granite would have to be placed at the bottom as a counterweight.
Hard drives are reliable on their side OR on end (see the "end loader" designs that have been around well over a decade thanks to BackBlaze).
Non-90° angles, though, might be more of an issue.