Lane-hungry non-geek PC

I wonder: if I wanted to put together such a Frankenstein on a good consumer board with a good CPU and four 20-port SATA3 cards (~2 PiB total), would the motherboard's PCIe lanes be enough to run it purely as a farmer?
Power, of course, would come from additional power supplies.

Because I suspect that if you approached it professionally and thought it through, the CPU/motherboard lanes wouldn't be enough?

Alternatively, assuming only three such cards working together with a GPU and compressed plots, is there a chance of reasonable lookup times with low latency?

Ok, I know you can advise LSI HBAs, SFF-8087 cables, etc.
But has anyone tried such an "amateur" setup?

Are there any PCIe 4.0 cards you have found, with the future in mind? They should be backwards compatible.

Do you think PCIe 3.0 won't cut it?
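For what it's worth, a quick back-of-envelope calculation suggests lanes are not the bottleneck for farming-only traffic. This is just a sketch: the x4-slot-per-card layout and the 1 MB/s-per-drive farming figure are assumptions (real farming traffic is far lower), and ~985 MB/s usable per PCIe 3.0 lane is the usual rough number.

```python
# Back-of-envelope: do the motherboard's PCIe lanes suffice for
# farming-only traffic? All figures are rough assumptions.

PCIE3_LANE_MBPS = 985.0   # ~985 MB/s usable per PCIe 3.0 lane
LANES_PER_CARD = 4        # assumed x4 slot per 20-port card
DRIVES_PER_CARD = 20

# Farming does tiny random reads; assume a generous 1 MB/s per drive
# of sustained lookup traffic (real figures are much lower).
FARM_MBPS_PER_DRIVE = 1.0

per_card_budget_mbps = PCIE3_LANE_MBPS * LANES_PER_CARD   # ~3940 MB/s
per_card_demand_mbps = DRIVES_PER_CARD * FARM_MBPS_PER_DRIVE  # 20 MB/s

print(f"budget per card:  {per_card_budget_mbps:.0f} MB/s")
print(f"farming demand:   {per_card_demand_mbps:.0f} MB/s")
print(f"headroom:         {per_card_budget_mbps / per_card_demand_mbps:.0f}x")
```

Even a single PCIe 3.0 lane would cover farming reads many times over; the real risk is the controller chips, not the slot.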

I’ve had a card like this with 16 ports.
I think the most I had connected to it was 10 drives at a time.
It worked, but not great. If you fill it with 20 drives, I think you’re going to get issues with lookup times.

Not even sure it’s the max throughput that is at fault; it seemed to me more like a latency/queueing issue with the shitty chips on these things when trying to access multiple drives at the same time.

Besides, the cabling will be a nightmare. You would have to run a shitload of SATA cables, and they don’t exactly come in 5 meter lengths.
The main advantage of SAS in that case is that you can use expander boards (that just need power, no pcie connection), which greatly improves the cabling options.
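If you want to check whether a given card has that queueing problem before committing to 20 drives, a small sketch like this can time concurrent small reads against several drives at once. It is a rough harness, not a benchmark: point the paths at files on each mounted drive (or raw devices with root), and note that the OS page cache will flatten results unless the files are large or uncached.

```python
import os
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def read_latencies(path, reads=50, chunk=4096):
    """Median latency (ms) of small scattered reads against one path."""
    size = os.path.getsize(path)
    lat = []
    with open(path, "rb", buffering=0) as f:
        for i in range(reads):
            # pseudo-random offsets so consecutive reads are not sequential
            off = (i * 7919 * chunk) % max(size - chunk, 1)
            t0 = time.perf_counter()
            f.seek(off)
            f.read(chunk)
            lat.append((time.perf_counter() - t0) * 1000)
    return statistics.median(lat)

def hammer(paths):
    """Hit several paths at once, as a farmer with many drives would."""
    with ThreadPoolExecutor(max_workers=len(paths)) as ex:
        return dict(zip(paths, ex.map(read_latencies, paths)))
```

Run `hammer()` first with one path, then with one path per drive behind the same card; if the per-drive medians balloon as you add paths, the controller is queueing rather than serving drives in parallel.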


Just thinking of the future, if the cost is about the same…
Somebody has to have good cards out there!!

They do, they are called SAS expanders and HBAs :wink:


Unfortunately, I haven’t seen any PCIe 4.0 ones anywhere.
Currently I have two of them, 8 SATA ports each.
Everything is fine as long as you don’t want to move large amounts of data continuously and simultaneously across different drives.
For farming itself, latency is 300 ms on average.
Cables are 1 meter.
The rest I placed in external JBODs over USB-C.

And why do I ask?
Recently I’ve had some time and I’ve been thinking about creating something steampunk.
Here, even a large number of cables would be desirable.
The stand design would take the longest.
If you arrange the disks nicely (e.g. in the shape of an accelerator) with interesting metal cooling fins, it could be one of the coolest chia farms in the world…


If you’ve got “good” components everywhere else… don’t add a bunch of craptacular SATA port multipliers to the mix and create an unreliable mess.

I don’t know what you’ll be holding the drives in, but starting with something as small as an 8-port SAS3 HBA and adding expanders to scale (they don’t need PCIe slots, just power), plus the appropriate cables, is a much more reliable and performant way to go.

I’ve got a couple of machines running 10-port SATA expanders on a PCIe 3 x4 interface.
They handle the load of farming without issues with an old Bulldozer 8xxx series 8-core in one and a 4 core A10 in the other.
Farming does NOT use a ton of bandwidth.

Keep in mind I’ve got a 71-drive machine being farmed ENTIRELY via 1 Gb LAN, no issues.
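The arithmetic backs this up. The sketch below is a ballpark only: the plot count, read size, and reads-per-eligible-plot are assumed figures, and the Chia constants (a challenge roughly every 9.4 seconds, a 1/512 plot filter) are approximate.

```python
# Rough estimate of farming lookup traffic for a large farm,
# showing why a 1 Gb LAN (~125 MB/s) is plenty. Ballpark only.

PLOTS = 7100                 # assumed: ~100 plots/drive on 71 drives
SIGNAGE_INTERVAL_S = 9.4     # a new challenge roughly every ~9.4 s
PLOT_FILTER = 512            # ~1/512 of plots pass the filter per challenge
READS_PER_ELIGIBLE = 8       # quality lookups per eligible plot (approx.)
BYTES_PER_READ = 64 * 1024   # small index reads, tens of KiB (approx.)

eligible_per_challenge = PLOTS / PLOT_FILTER
bytes_per_second = (eligible_per_challenge * READS_PER_ELIGIBLE
                    * BYTES_PER_READ) / SIGNAGE_INTERVAL_S

print(f"~{eligible_per_challenge:.1f} eligible plots per challenge")
print(f"~{bytes_per_second / 1e6:.2f} MB/s average lookup traffic")
```

Under these assumptions the average lookup traffic comes out well under 1% of gigabit capacity; latency of individual lookups, not bandwidth, is what matters for farming.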

Plotting on the other hand, you want more bandwidth. This sort of a setup seems to be OK when plotting 1 drive at a time, but I’d not trust them for reliability or speed enough to handle plotting more than one drive at once.


Used this for 8 internal drives, no issues as bandwidth needs are so little for farming.

To get rid of the cable mess, I used this.

Now for 20 drives, it may be a stretch, but why not try?


Hi Guys!

Thanks for your opinions.
I found the perfect one here (I know this company) and it is PCIe x4 (better than x1, which is too little, or x16, which is too much), so with four such cards the CPU/motherboard lanes should be enough.
Moreover, it has a black PCB. BEYIMEI PCI-E 4X SATA Card 20 Ports, PCI Express SATA Controller Expansion Card, 6 Gbit/s SATA 3.0 PCIe Card Without Raid, Boot as System Disk Support HDD or SSD : Electronics
I already know that for design reasons I should use PCIe riser extenders.
Do any of you have experience with such risers? This could add another level of “uncertainty” for those 80 SATA ports…

Then you could display the cards nicely (so they aren’t covered and you can also point fans at them) and route the cables to the HDDs more sensibly. The wires would be black, without bundles; it would make the design more aggressive.
But who can tell me whether those 80 SATA ports on 4 cards would actually work…

I am using such an extender for the GPU. Works like a charm.

Thx Jacek!

And such industrial radiators made of oxidized aluminum on each HDD, which can be placed toroidally, at an acute angle…

Do HDDs work at any angle, or only at 90/180 degrees?

One $10 120 mm fan can easily cool 4-5 HDDs at low speed. That chunk of metal will initially absorb some heat but will quickly saturate if the extra airflow is not there. Actually, there are $14 200 mm fans on Amazon, if those can work for your setup.
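The 120 mm fan claim checks out with basic heat-removal physics. A quick sketch, assuming ~8 W per drive and allowing the air to warm by 5 °C as it passes through:

```python
# Sanity check on "one 120 mm fan cools 4-5 HDDs", using the basic
# heat-removal relation P = rho * cp * V_dot * dT for air.

RHO_AIR = 1.2     # air density, kg/m^3
CP_AIR = 1005.0   # specific heat of air, J/(kg*K)

def cfm_needed(watts, delta_t_c):
    """Airflow (CFM) to carry `watts` of heat at a `delta_t_c` air-temperature rise."""
    m3_per_s = watts / (RHO_AIR * CP_AIR * delta_t_c)
    return m3_per_s * 2118.88  # m^3/s -> cubic feet per minute

# Five drives at ~8 W each, air allowed to warm by 5 degrees C:
print(f"{cfm_needed(5 * 8, 5.0):.1f} CFM needed")
```

That works out to roughly 14 CFM, while even a low-speed 120 mm fan moves ~30+ CFM, so airflow is the cheap part; the decorative fins only help if air actually moves across them.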

I used such an extender back in around 2007 although it was shorter, worked a treat also.


One of them (the bottom one, not visible) would have fans blowing through it,
and the visible top one would carry this radiator.

I’m getting more and more excited to… do it.
A farm with a mean, black look.
I need to make an estimate.
Unfortunately, I have been away from home and the workshop with my tools for 3 years.
It’s a pity I’m not in Pakistan or India; I wonder who will assemble such a custom frame for me here.
There are a lot of talented people there who would cooperate at a reasonable price.
Additionally, computer components are expensive here (I always thought it was expensive in Europe!).

You’d definitely need to disassemble a PC case to get the motherboard mount.
Due to the weight of even just 80 HDDs, some beautiful black granite would have to be placed at the bottom as a counterweight.

Hard drives are reliable on their side OR on end (see the “end loader” designs that have been around well over a decade thanks to Backblaze).
Angles other than 90°, though, might be more of an issue.
