New plotter / no-GPU plotter

Hello guys, I am planning a new plotter and I just need some help with it:
1- Do you have any idea about AMD EPYC™️ 7502P Bladebit RAM disk plotting performance?
2- According to GitHub, the RAM disk needs around 440 GB of RAM. Is that enough? I mean, when the plotter starts sending plot files to the hard drive, it sometimes gets stuck and stops. How much RAM is needed so that this does not happen?

thank you.


It’s in Russian, here is the Google translation:
https://miningclub.info/threads/pro-majning-chia.80988/page-539

The culmination of all this madness is the AMD EPYC 7502P test with a standard 288 GB of RAM: something around 15 minutes per plot (k32), 96 plots per day.
The price of this processor is astronomical and completely out of proportion to a Ryzen 3900; for the cost of this processor you can buy 8 Ryzen 3900s.
Yes, there is undoubtedly a speed-up, but at what cost?!

So why would you want to buy an insanely expensive CPU to make plots with, when a month from now you can get better speed with a $100-150 graphics card? (A GTX 1070 can do something like 6-minute plots with the new MM plotter, from what I heard.)

Should be enough. Maybe it is getting stuck because the transfer speed to the HDDs is not fast enough? You should probably use an NVMe drive as a buffer.


I have an elderly Dell R910 with 1 TB of RAM and 40 cores/80 threads. I run 2 instances of Bladebit and average 1 k32 plot in 9 minutes at 0.165 kWh per plot (UK, £0.06 of electricity per plot). No GPU and more RAM than the NSA! And it is your lucky day, because it could be yours!

GPU plotting puts efficient plotting within the reach of the small farmer. Sadly, the cost of farming with a power hungry GPU puts efficient GPU farming outside the reach of the small farmer. The pleasure of purchasing a 1070 for a low price will last as long as it takes to receive your next electricity bill.

GUI or command line?
If GUI, try the command line.


Well, I’m quite sure an EPYC 7502P will burn more power if you farm with it.

But yes, farming with a GPU seems to get more interesting once you get beyond a certain farm size.
On the other hand, it is also a matter of the price you pay per kWh. If you have low energy costs, it can make sense to go with a higher compression level, and then it quickly becomes more interesting to look at a GPU that can handle that more efficiently.

Hello, thank you for the reply.
What kind of hard drive are you using?
Are you using any flags to wait for the plot to be copied?
Do you think NVMe is enough? With NVMe, will I have to wait for the plot file to arrive on disk? Do you think there is a bottleneck?

Hi,
Bladebit was just writing from RAM to the final-destination SAS drive. It was fast enough for me, so I did not bother trying to make it go faster. Also, bear in mind that when using Bladebit the output is a sequential write, so it is a bit pointless to use an SSD for that. There was a bottleneck, though, because moving ~102 GB from RAM to disk takes a few minutes.
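As a rough sanity check of that copy time, here is a sketch in Python; the ~200 MB/s sequential write speed is my assumption for a typical SAS/SATA hard drive, not a measured figure:

# Rough estimate of the RAM-to-HDD copy time for one k32 plot.
plot_size_gb = 102        # approximate size of an uncompressed k32 plot, in GB
hdd_write_mb_s = 200      # assumed sustained sequential write speed of the HDD, in MB/s

copy_time_s = plot_size_gb * 1000 / hdd_write_mb_s
print(f"~{copy_time_s / 60:.1f} minutes per plot")   # prints ~8.5 minutes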


I’m going to do a bit of a write-up here, because there seem to be some misunderstandings about the cost and power usage of CPU vs GPU plotting.

A Bladebit CPU plotter like the one suggested here will cost at least $2500, unless you somehow get a killer deal on the hardware.
There is not much data to be found on plot times; the only figure I was able to find was around 15 minutes. Let’s say 12 minutes for easy comparison.

A GPU plotter, with better performance, can be bought for $500-750. That would be with a GTX 1070; plot time, according to a test by Max, is about 6 minutes.

So you get this comparison:

CPU plotter: $2500, 12 min plots, 180W power (CPU)
GPU plotter: $ 750, 6 min plots, 150W power (GPU)

For making 1000 plots, the CPU plotter will need to run for about 8 days,
the GPU plotter only about 4 days.

Energy used by the CPU plotter for 1000 plots: ~36 kWh
Energy used by the GPU plotter for 1000 plots: ~15 kWh
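(A minimal sketch in Python of that arithmetic, using the wattages and plot times assumed above:)

# Back-of-the-envelope runtime and energy comparison for 1000 plots.
plots = 1000

def runtime_and_energy(minutes_per_plot, watts):
    hours = plots * minutes_per_plot / 60
    return hours / 24, watts * hours / 1000   # (days of runtime, kWh used)

cpu_days, cpu_kwh = runtime_and_energy(12, 180)   # EPYC Bladebit plotter
gpu_days, gpu_kwh = runtime_and_energy(6, 150)    # GTX 1070 GPU plotter

print(f"CPU: {cpu_days:.1f} days, {cpu_kwh:.0f} kWh")   # ~8.3 days, ~36 kWh
print(f"GPU: {gpu_days:.1f} days, {gpu_kwh:.0f} kWh")   # ~4.2 days, ~15 kWh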

So in summary, this CPU plotter is:

  • 3-4 times more expensive to buy
  • half the plotting speed (twice the time per plot)
  • roughly double the energy use per plot

Farming is a different matter; using a plotting system to farm with is a waste of energy.

My farmer for 250 TiB is an i3-6100. After I make compressed plots I will most likely keep using it.
As long as I stay at compression levels C1-C3, it should have no problem handling 250 TiB of compressed plots.
The estimated extra power usage for farming is, I believe, between 10 and 30 watts at those compression levels.

Farming with a GPU seems to me mostly useful for bigger farms and/or those who want to use a higher compression level. At that point a GPU will be more energy- and cost-efficient than a CPU.


Further, once integrated graphics is supported (via OpenCL), there will be very budget-friendly options for smaller farmers to farm the higher-compression plots while still keeping energy usage low.

My question is different. I am not going to plot or mine with a GPU.
I am just seeking information about RAM disk plotting.

OK, it’s your money to throw away; I’m just trying to help.

For Bladebit, the info on the GitHub page is from the maker, so that is the most trustworthy source. But 440 GB seems to be the minimum; I would always go with a little extra headroom. 440 is a weird number for RAM anyway; at that point you might as well go for 512 GB.
Getting stuck during the copy is most likely not a RAM issue. Copying to HDD takes at least 5 minutes. You can put an NVMe drive in between and use a script that copies each plot to a different hard drive (see the sketch below).
I think Bladebit will always wait for the copy to finish before starting a new plot.
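Something like this rough sketch in Python, for example. The mount points are placeholders, the round-robin split across two destination drives is just one possible strategy, and it assumes finished plots end in .plot while in-progress files use a different name, so check how your plotter names temporary files:

# Sketch: move finished plots from an NVMe buffer to several HDDs, round-robin.
import shutil
import time
from itertools import cycle
from pathlib import Path

BUFFER_DIR = Path("/mnt/nvme_buffer")                      # where the plotter writes plots (placeholder path)
HDD_DIRS = cycle([Path("/mnt/hdd1"), Path("/mnt/hdd2")])   # final destinations, used in turn (placeholder paths)

while True:
    for plot in sorted(BUFFER_DIR.glob("*.plot")):
        dest = next(HDD_DIRS)
        print(f"Moving {plot.name} -> {dest}")
        shutil.move(str(plot), str(dest / plot.name))      # copies across filesystems, then removes the source
    time.sleep(30)                                         # poll the buffer every 30 seconds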

Hello again,
I set up my server.
The plot time is 27 minutes on Windows with 480 GB of RAM.
Do you have any idea why it is as slow as that?

What are the specs of the machine (number of processors, cores, threads, etc)?

Hello,
64 cores in total, one CPU.
An AMD EPYC… :(

Firstly, I would not be using Windows to plot. I would use Arch Linux or Ubuntu Server 22.10 (my preference is Arch Linux, though) and I would use Bladebit (GitHub - Chia-Network/bladebit: A high-performance k32-only, Chia (XCH) plotter supporting in-RAM and disk-based plotting) at the command line. I would also suggest you measure your power consumption per plot to get an idea of how efficiently your machine is working.
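For example, with the average wall draw from a plug-in power meter and the observed time per plot, the energy and electricity cost per plot follow from simple arithmetic. A sketch in Python; the wattage, plot time, and price below are example values (chosen so they roughly reproduce the Dell R910 figures quoted earlier in the thread), not measurements from your machine:

# Energy and electricity cost per plot from measured wall power and plot time.
avg_watts = 1100          # example: average draw read from a wall power meter
minutes_per_plot = 9      # example: observed time per k32 plot
price_per_kwh = 0.36      # example electricity price in your local currency

kwh_per_plot = avg_watts * (minutes_per_plot / 60) / 1000
cost_per_plot = kwh_per_plot * price_per_kwh
print(f"{kwh_per_plot:.3f} kWh per plot, {cost_per_plot:.2f} per plot in electricity")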

Also, I presume you are not using Windows 10 Home, are you?