I’m going to do a bit of a write-up here, because there seem to be some misunderstandings about the cost and power usage of CPU vs GPU plotting.
A bladebit CPU plotter like the one suggested here will cost at least $2500, unless you somehow get a killer deal on the hardware.
There is not much data to be found on plot times; the only figure I was able to find was around 15 minutes. Let’s say 12 minutes for easy comparison.
A GPU plotter with better performance can be bought for $500-750. That would be with a GTX 1070; plot time, according to a test by Max, is about 6 minutes.
So you get this comparison:
CPU plotter: $2500, 12 min plots, 180 W power (CPU)
GPU plotter: $750, 6 min plots, 150 W power (GPU)
To make 1000 plots, the CPU plotter will need to run about 8 days,
the GPU plotter only about 4 days.
Energy used by the CPU for 1000 plots: ~35 kWh
Energy used by the GPU for 1000 plots: ~14 kWh
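If you want to rerun these numbers with your own hardware, here is a minimal sketch of the arithmetic; the plot times and wattages are just the assumptions from this post, so swap in your own measurements.

```python
# Rough plotting arithmetic for 1000 plots.
# Plot times and power draws are the assumptions used in this post;
# replace them with your own measurements.

PLOTS = 1000

systems = {
    "CPU plotter": {"plot_minutes": 12, "watts": 180},
    "GPU plotter": {"plot_minutes": 6, "watts": 150},
}

for name, s in systems.items():
    hours = PLOTS * s["plot_minutes"] / 60   # total runtime in hours
    days = hours / 24
    kwh = hours * s["watts"] / 1000          # energy drawn while plotting
    print(f"{name}: {days:.1f} days, {kwh:.0f} kWh for {PLOTS} plots")
```

This prints roughly 8.3 days / 36 kWh for the CPU and 4.2 days / 15 kWh for the GPU; the ~35 and ~14 kWh figures above come from rounding the runtimes down to 8 and 4 whole days first.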
So in summary, this CPU plotter is:
- 3-5 times as expensive to buy
- half the plotting speed (each plot takes twice as long)
- roughly 2.5 times the energy use for the same number of plots
Farming is a different matter; using a plotting system to farm with is a waste of energy.
My farmer for 250 TiB is an i3-6100. After I make compressed plots I will most likely keep using that.
As long as I stay at levels C1-C3 it should have no problem handling 250 TiB of compressed plots.
My estimate for the extra power usage while farming at those compression levels is between 10 and 30 watts.
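To put that 10-30 W in perspective, here is a quick back-of-the-envelope sketch of what the extra farming draw adds up to per year; the electricity price is just an assumed example value, not a figure from this post.

```python
# Yearly energy and cost of the extra farming draw for compressed plots.
# The 10-30 W range is the estimate from this post; the electricity
# price is an assumed example, adjust it to your own tariff.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.30  # assumed electricity price in $/kWh

for extra_watts in (10, 30):
    kwh_per_year = extra_watts * HOURS_PER_YEAR / 1000
    cost = kwh_per_year * PRICE_PER_KWH
    print(f"{extra_watts} W extra -> {kwh_per_year:.0f} kWh/year, ~${cost:.0f}/year")
```

At that assumed price it works out to roughly $26-79 per year, which is why the farming side matters far less than the plotting side for a farm this size.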
Farming with a GPU seems to me mostly useful for bigger farms and/or those who want to use a higher compression level. At that point a GPU will be more energy- and cost-efficient than using a CPU.