Testing the speed of plotting on NVMe vs HDD

Thanks! Yours seems to have higher write speed, but anyway I think my times are really slow. I’m starting to think my NVMe drive or my NUC isn’t working well…

With more RAM and threads and only 2 plots it improved a little bit, but nothing significant:

Plot1: 16h11m
Plot2: 16h42m

Now I’m plotting 2 more using an external HDD, staggered 1.5h, with 8GiB RAM and 4 threads. Let’s see what happens…

The network is growing really fast and my plotting speeds are frustrating… I have 24TB ready to fill but it’s going to take a lot of time :weary:

Yeah I hear ya, I have 5.5TB filled so far, 24 more to go as well (assuming I don’t buy more drives). I’m pretty reliably getting 9 plots per 24 hours, so I’ll nearly fill my drives by the time transactions open up.

You should be able to plot twice as fast as that; something is off. Try using dstat and the other tools to watch the write throughput of your drive. But even then, the speeds you’re getting are weird.
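Something like this is what I mean; dstat shows live per-disk throughput, and iostat (from sysstat) adds per-device latency and utilization. The device names are placeholders, swap in yours:

# Live read/write throughput for specific disks, refreshed every 5 seconds
dstat -d -D nvme0n1,sda 5
# Extended per-device stats: await (latency) and %util (saturation)
iostat -x 5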

I’m curious what your external drive speeds will be.

Hey again!

Maybe 9 plots/24h is not much compared to many of the Chia farmers, but to me that seems like pretty good plotting speed! :man_farmer:

Something is definitely going on with my build; the NUC recently shut down for no reason. I looked at the logs, and after almost 30h the two plots using the external HDD as temp drive were still in phase 1…

With the same external drive and a 9-year-old computer (i7-2700K and 8GB DDR3-1600 RAM) it took me about 25h to do 1 plot. It’s so weird.

I used some monitoring tools and everything seems to be working fine except the write speeds, which are extremely slow… I don’t have much experience, but I think everything is mounted correctly… So, due to this and the unexpected shutdown, Amazon will replace the NUC with a new one.
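For reference, this is roughly what I checked, in case anyone spots something off (nvme0n1 is a placeholder for the temp drive):

# Block devices with size, filesystem and mount point
lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT
# Mount options for the temp drive; an unexpected flag like 'sync' would kill write speed
mount | grep nvme0n1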

I also wrote to @storage_jm from Chia Decentral, and thanks to his advice I’m also replacing the NVMe drive, with a Corsair MP600, which should improve the plotting speed a lot.

Will report my new speeds as soon as possible (hopefully this Wednesday) with the new NUC and drive! :smiley:

Excited to hear how it goes!

I’m running another test right now using a simpler way to measure throughput; I’ll share my results here tomorrow along with the method for measuring it all. I might whip up a script to process log files and spit out pretty stats too, we’ll see.

Hey again!

Can’t wait to see those results and the script!
I’m using a simple script that returns the Total time per plot.
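Something along these lines (a minimal sketch; ~/chialogs is a placeholder for wherever the plotter logs live):

# Pull every per-plot total out of the logs and average them
grep -h "Total time" ~/chialogs/*.log \
  | awk '{ sum += $4; n++ } END { if (n) printf "%d plots, avg %.0f s (%.1f h)\n", n, sum/n, sum/n/3600 }'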

The new NUC and NVMe drive work great!! :man_farmer:
In the end I bought the Seagate FireCuda 520 1TB, because the heat sink of the Corsair was too big for the NUC.

These are the logs of the old build:

  • Total time = 58605.715 seconds. CPU (55.410%) Sat Apr 10 12:46:48 2021
  • Total time = 59537.603 seconds. CPU (53.870%) Sun Apr 11 10:59:23 2021
  • Total time = 58117.708 seconds. CPU (55.080%) Mon Apr 12 09:36:26 2021
  • Total time = 71287.644 seconds. CPU (46.610%) Tue Apr 13 05:39:24 2021
  • Total time = 63844.642 seconds. CPU (51.740%) Fri Apr 9 15:30:10 2021
  • Total time = 68287.839 seconds. CPU (48.550%) Sun Apr 11 14:55:14 2021
  • Total time = 63223.090 seconds. CPU (51.770%) Mon Apr 12 12:01:31 2021
  • Total time = 70469.802 seconds. CPU (47.090%) Tue Apr 13 07:49:22 2021
  • Total time = 67813.585 seconds. CPU (48.630%) Fri Apr 9 17:36:19 2021
  • Total time = 66445.564 seconds. CPU (49.300%) Mon Apr 12 13:55:14 2021
  • Total time = 70986.216 seconds. CPU (46.990%) Tue Apr 13 09:58:40 2021

The new NUC and NVMe:

  • Total time = 24083.250 seconds. CPU (110.530%) Tue Apr 13 21:24:02 2021
  • Total time = 24630.669 seconds. CPU (110.150%) Wed Apr 14 04:21:59 2021
  • Total time = 24369.612 seconds. CPU (110.680%) Wed Apr 14 11:15:39 2021
  • Total time = 25111.424 seconds. CPU (108.810%) Tue Apr 13 23:41:10 2021
  • Total time = 24946.530 seconds. CPU (108.750%) Wed Apr 14 06:44:24 2021
  • Total time = 24633.652 seconds. CPU (110.200%) Tue Apr 13 22:33:13 2021
  • Total time = 25520.944 seconds. CPU (108.820%) Wed Apr 14 05:46:02 2021

Between 6h30m and 7h per plot.

Happy that everything finally worked :smiley:

That’s awesome, congrats!

What are your overall goals for number of plots? Want to join the pool when we launch it in a few months?

That is awesome! You’re giving me motivation to scale my NUC a little more.

For now I will try to fill the drives I have as soon as possible. I have improved the speeds a little bit, about 22k sec per plot on average when plotting 3 in parallel. Also, I’ve just started to simultaneously plot another one using the external HDD, to compare results. Will post here.

I want to learn more about the pools, but depending on how things go, maybe I will buy some more drives to join the pool once it’s launched! I can probably add 1-2 HDDs of 10-14TB. I will try to find some good deals.

Yeah, that’s my plan with the pool: I’ll have my own approximately 50TB of plots, and I’ll set up another external drive case with about 40-60TB of new space that I’ll plot into the pool. It will be interesting to see how the coin prices shake out in the weeks leading up to pooling availability; I bet it will influence how much we all invest further.

Out of curiosity, are you running with bitfield disabled (-e) or leaving it enabled in your plots?

At first I disabled it, but now I’ve left it enabled. It seems to plot faster.
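For anyone following along: bitfield is on by default, and -e is the flag that turns it off. A minimal example with placeholder paths:

# Default: bitfield enabled
chia plots create -k 32 -t /mnt/nvme/tmp -d /mnt/farm
# With -e: bitfield disabled (the old behavior)
chia plots create -k 32 -e -t /mnt/nvme/tmp -d /mnt/farm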

Exactly! I will run the numbers, and the $XCH price will help me decide whether to scale this up or not.

Note that the USB controller chipset matters a lot, especially for faster drives. Like… a lot a lot. The best and newest in my testing is the ASMedia ASM2364, so be careful which enclosures you buy for fast drives!
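If you’re not sure which bridge chip an enclosure uses, lsusb will usually tell you. ASMedia’s USB vendor ID is 174c; the exact grep here is just a sketch:

# Look for the ASMedia bridge among the attached USB devices
lsusb | grep -i -e asmedia -e 174c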

I did a similar test (well, actually, I have one running live!) of an internal SATA SSD versus an internal NVMe SSD. It’s on a 10th-gen Intel 6c/12t NUC, and here are the results. Plot params are identical for all, 4t/16g, same machine, so the only variable is the destination drive:

C: drive (NVMe Samsung 980 Pro)

Total time = 26784.940 seconds. CPU (150.460%) Sat Apr 17 07:17:02 2021
Total time = 26626.646 seconds. CPU (150.230%) Sat Apr 17 14:48:12 2021
Total time = 26989.208 seconds. CPU (149.800%) Sat Apr 17 22:25:28 2021

avg time 26800s or 7.4 hours

D: drive (SATA Samsung 860 Pro)

Total time = 30461.661 seconds. CPU (127.230%) Sat Apr 17 04:42:48 2021
Total time = 30680.612 seconds. CPU (128.250%) Sat Apr 17 13:24:10 2021
Total time = 30355.009 seconds. CPU (132.970%) Sat Apr 17 22:00:08 2021

avg time 30488s or 8.4 hours

So NVMe vs. SATA is worth 1 hour exactly. Percentage-wise, that’s 26800s vs 30488s, so the NVMe is about 12% faster.

Ah so you are running a rig! Thanks for posting these results, very interesting.

I am definitely kicking myself for going with a slightly older, less powerful NUC. Was trying to limit my investment, but now I’m fully drinking the koolaid and browsing hard drives when my wife falls asleep haha.

Will probably upgrade in the near future.

This one is the NUC10i7FNH. It has room for 2 drives, one NVMe and one SATA, so I populated both.

It’s 6c/12t, so by my rule of thumb that’s not quite one plot per core; I am doing 5 simultaneous on this guy, 2 to the 2TB NVMe SSD (which also holds the OS) and 3 to the 4TB SATA SSD. I should probably switch it around to 3 on the NVMe if it’ll fit, and put less load on the slower SSD… I know they reduced the plotter footprint in 1.0.4 so it’ll definitely fit… 1.0.4 temp space is 256GB, so 3 temp dirs is only ~768GB of the 2TB. Let me do that now.

Yeah, you should be able to run 6-7 plots at once with some staggered start times on that 2TB NVMe. I can easily run 3 on a 1TB right now, and someone on Reddit said 4 is also doable with the right staggers, so I’m going to try that once I get home.
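If anyone wants a starting point for the staggers, a dead-simple loop looks something like this; the paths, plot count, and 1-hour delay are all placeholders to tune:

#!/usr/bin/env bash
# Launch 4 plots with staggered starts; each gets its own temp dir and log
mkdir -p ~/chialogs
for i in 1 2 3 4; do
  mkdir -p /mnt/nvme/tmp$i
  chia plots create -k 32 -r 4 -b 4096 -u 128 \
    -t /mnt/nvme/tmp$i -d /mnt/farm \
    > ~/chialogs/plot$i.log 2>&1 &
  sleep 3600   # 1 hour between starts
done
wait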

I’m doing 4-5 on my 2TB Inland Premium NVMe. 4 was around 8.3hrs a plot; that seemed worth it over doing 3 over the course of a few days (projected). This was with an hour stagger.
I think I start pushing 9.5hrs on 5… not sure, I was staggering oddly, so I’m not sure that’s accurate.

I tried to do 3 NVMe + 1 SSD and it seemed like I went into some weird time warp where everything slowed down. Not sure why.

@codinghorror - I got the exact same NUC - actually 2 of them - equipped and configured exactly the same way: same RAM, same NVMe (Aorus 2TB), same SSD (Intel DC S3710). I do 6 plots via plotman (so some staggering) on both.
One is doing it in around 45k seconds (which is not good at all), the other one in 75k (which is really bad). I can’t figure out the problem. There seems to be high I/O wait, since the CPU reported in “Total time” is around 50%.
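For anyone wanting to check the same thing, vmstat makes the I/O wait easy to see (the ‘wa’ column is the share of time the CPU sits waiting on disk; the 5-second interval is arbitrary):

# Compare the 'wa' column between the two NUCs
vmstat 5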

I haven’t played around with staggering yet. Is the general idea behind staggering to not overburden the CPU during phase 1?

@dan90266 I think the general idea is to avoid simultaneous copying to the -d destination disk.
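If two plotters do collide on the destination, one workaround is to point -d at a local staging directory and serialize the final moves yourself. A minimal sketch, with /mnt/staging and /mnt/farm as placeholder paths:

#!/usr/bin/env bash
# Move finished plots one at a time; flock guarantees a single writer to the farm disk
shopt -s nullglob
for f in /mnt/staging/*.plot; do
  flock /tmp/farm-disk.lock rsync --remove-source-files "$f" /mnt/farm/
done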

I have a decent number of plots on older hardware. These should all be on 1.0.4. They’re somewhat interesting because I’m plotting to matching (old) HDDs in 3 different systems.

Intel(R) Xeon(R) CPU E5-2620 v2 @ 2.10GHz (x2), 128GB RAM

  • 12 cores / 24 threads total
  • HDDs
    • 4x 2TB Hitachi HUA723020ALA640
    • matching -t and -d args
    • 4 parallel plots (looped)

One interesting thing to note here is that these plots shouldn’t be restricted by CPU/RAM, yet the times end up being worse than the i7 running 6 parallel plots on 4 cores. I’m guessing I’m losing time on anything compute-related due to the low clock speed of the CPUs in this system.

# -k 32 plot, 6750 MiB RAM (-b), 4 threads (-r), 128 buckets (-u); keys, paths, and logs come from shell vars
chia plots create -k 32 -n 1 -b 6750 -r 4 -u 128 \
  -f "${fpk}" -p "${ppk}" -t "${tmp}" -d "${dst}" \
    1> "${flog}" 2> "${elog}" &
Total time = 52997.309 seconds. CPU (96.950%) Fri Apr 16 09:04:49 2021
Total time = 56877.742 seconds. CPU (96.380%) Sat Apr 17 00:52:55 2021
Total time = 56320.806 seconds. CPU (95.900%) Sat Apr 17 16:31:44 2021
Total time = 55545.991 seconds. CPU (94.650%) Sun Apr 18 07:57:39 2021
Total time = 52670.217 seconds. CPU (95.940%) Thu Apr 15 17:05:43 2021
Total time = 52020.539 seconds. CPU (94.900%) Fri Apr 16 08:48:40 2021
Total time = 52893.222 seconds. CPU (96.440%) Fri Apr 16 23:30:17 2021
Total time = 52698.017 seconds. CPU (95.860%) Sat Apr 17 14:08:44 2021
Total time = 53597.958 seconds. CPU (93.830%) Sun Apr 18 05:02:05 2021
Total time = 49713.422 seconds. CPU (100.440%) Thu Apr 15 16:16:31 2021
Total time = 50808.946 seconds. CPU (101.280%) Fri Apr 16 08:28:33 2021
Total time = 49460.708 seconds. CPU (100.880%) Fri Apr 16 22:13:23 2021
Total time = 49813.684 seconds. CPU (100.090%) Sat Apr 17 12:04:32 2021
Total time = 50352.219 seconds. CPU (98.620%) Sun Apr 18 02:04:11 2021
Total time = 51927.866 seconds. CPU (96.670%) Sun Apr 18 16:30:38 2021
Total time = 50610.372 seconds. CPU (101.600%) Thu Apr 15 16:31:33 2021
Total time = 51741.478 seconds. CPU (100.030%) Fri Apr 16 08:44:09 2021
Total time = 50767.802 seconds. CPU (101.170%) Fri Apr 16 22:50:49 2021
Total time = 49919.185 seconds. CPU (102.220%) Sat Apr 17 12:42:57 2021
Total time = 52194.367 seconds. CPU (101.070%) Sun Apr 18 03:13:24 2021
  • SSDs
    • 2x Samsung SSD 850 PRO 256GB (6.0 Gb/s SATA)
    • RAID0 (see the mdadm sketch after the logs below)
    • external HDD for -d
    • 2 parallel plots, continuously looped after phase 1 is done
chia plots create -k 32 -n 1 -b 6750 -r 4 -u 128 \
  -f "${fpk}" -p "${ppk}" -t "${tmp}" -d "${dst}" \
    1> "${flog}" 2> "${elog}" &
Total time = 35647.899 seconds. CPU (133.050%) Thu Apr 15 12:21:35 2021
Total time = 37323.691 seconds. CPU (132.430%) Fri Apr 16 04:41:22 2021
Total time = 37814.833 seconds. CPU (128.800%) Fri Apr 16 15:25:13 2021
Total time = 37314.965 seconds. CPU (128.680%) Sat Apr 17 02:00:54 2021
Total time = 37424.037 seconds. CPU (130.650%) Sat Apr 17 12:39:28 2021
Total time = 37657.850 seconds. CPU (131.010%) Sat Apr 17 23:22:40 2021
Total time = 36579.381 seconds. CPU (130.450%) Sun Apr 18 09:48:27 2021
Total time = 38395.255 seconds. CPU (130.700%) Fri Apr 16 09:09:06 2021
Total time = 38520.077 seconds. CPU (131.280%) Fri Apr 16 20:05:22 2021
Total time = 38652.444 seconds. CPU (129.230%) Sat Apr 17 07:04:09 2021
Total time = 37746.034 seconds. CPU (128.400%) Sat Apr 17 17:47:24 2021
Total time = 37777.937 seconds. CPU (129.670%) Sun Apr 18 04:32:51 2021
Total time = 38249.343 seconds. CPU (130.510%) Sun Apr 18 15:26:53 2021
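For anyone wanting to replicate the RAID0 temp above, the standard Linux route is mdadm; a rough sketch with placeholder device names (note that --create wipes the drives):

# Stripe the two SATA SSDs into one fast temp device
mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sdb /dev/sdc
mkfs.xfs /dev/md0
mount /dev/md0 /mnt/ssdtmp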

Intel(R) Core™ i7-4790 CPU @ 3.60GHz, 32GB RAM

  • 4 cores / 8 threads total

This one has 2 different types of disks, so I broke it down by disk type. The machine runs 6 parallel plots (looped) using all 6 disks.

  • HDDs
    • 3x 2TB Hitachi HUA723020ALA640
    • matching -t and -d args

I had some scripting issues on this machine, so the numbers aren’t super reliable because I didn’t have 6 parallel plots all the time. I find it really interesting that the fastest plots come with some of the lowest CPU usage. I wonder if it’s counting D state as CPU time or something.

It’s also interesting that the calculated CPU usage is lower than on the last machine, which has slower plot times. I think that means I’m leaving CPU performance on the table, but I’m not sure. Anyone know?

Since I didn’t have 6 parallel plots going at first, the lower times are a reasonable estimate of the fastest plots I can do to the Hitachi HDDs on this system.

chia plots create -k 32 -n 1 -b 4096 -r 2 -u 128 \
  -f "${fpk}" -p "${ppk}" -t "${tmp}" -d "${dst}" \
    1> "${flog}" 2> "${elog}" &
Total time = 39300.386 seconds. CPU (69.620%) Fri Apr 16 11:49:05 2021
Total time = 42273.174 seconds. CPU (70.800%) Sat Apr 17 03:31:29 2021
Total time = 40857.753 seconds. CPU (75.040%) Sat Apr 17 15:29:19 2021
Total time = 42719.311 seconds. CPU (74.630%) Sun Apr 18 06:30:42 2021
Total time = 44129.339 seconds. CPU (63.370%) Fri Apr 16 13:09:43 2021
Total time = 45614.537 seconds. CPU (68.250%) Sat Apr 17 04:27:20 2021
Total time = 47091.374 seconds. CPU (71.630%) Sat Apr 17 17:37:17 2021
Total time = 46774.869 seconds. CPU (68.890%) Sun Apr 18 07:38:26 2021
Total time = 43925.360 seconds. CPU (64.770%) Fri Apr 16 13:06:26 2021
Total time = 44995.274 seconds. CPU (65.480%) Sat Apr 17 04:17:07 2021
Total time = 46374.006 seconds. CPU (68.380%) Sat Apr 17 17:22:41 2021
Total time = 48164.761 seconds. CPU (70.010%) Sun Apr 18 08:01:41 2021
  • HDDs
    • 3x 320GB Seagate Barracuda 7200.10 (ST3320620AS)
    • large internal HDD for -d

I had scripting issues for the first couple of rounds and the plots to these disks were failing, so I only have the first successful round. I can watch iostat and see that these disks are slower than the others.
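A quick way to put numbers on that, beyond watching iostat, is a direct sequential write test; the mount point is a placeholder, and oflag=direct bypasses the page cache:

# Write 4GiB sequentially and report throughput, then clean up
dd if=/dev/zero of=/mnt/seagate/ddtest bs=1M count=4096 oflag=direct
rm /mnt/seagate/ddtest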

chia plots create -k 32 -n 1 -b 4096 -r 2 -u 128 \
  -f "${fpk}" -p "${ppk}" -t "${tmp}" -d "${dst}" \
    1> "${flog}" 2> "${elog}" &
Total time = 64126.384 seconds. CPU (50.990%) Sat Apr 17 14:33:28 2021
Total time = 65534.108 seconds. CPU (50.930%) Sat Apr 17 15:11:29 2021
Total time = 65253.436 seconds. CPU (52.260%) Sat Apr 17 15:08:37 2021

Intel(R) Xeon(R) CPU E5-1620 0 @ 3.60GHz, 64GB RAM

  • 4 cores / 8 threads total
  • HDDs
    • 2TB Hitachi HUA723020ALA640
    • matching -t and -d args
    • 4 disks running in parallel with 4 more started after phase 1
    • 8 parallel plots after the first stagger
chia plots create -k 32 -n 1 -b 6750 -r 2 -u 128 \
  -f "${fpk}" -p "${ppk}" -t "${tmp}" -d "${dst}" \
    1> "${flog}" 2> "${elog}" &
Total time = 61173.757 seconds. CPU (78.220%) Thu Apr 15 22:14:12 2021
Total time = 60109.205 seconds. CPU (81.210%) Fri Apr 16 22:30:13 2021
Total time = 60177.397 seconds. CPU (81.070%) Sun Apr 18 00:52:46 2021
Total time = 61352.661 seconds. CPU (78.860%) Fri Apr 16 05:21:37 2021
Total time = 60260.062 seconds. CPU (80.560%) Sat Apr 17 08:05:10 2021
Total time = 60243.810 seconds. CPU (80.370%) Sun Apr 18 08:30:42 2021
Total time = 61178.092 seconds. CPU (77.610%) Thu Apr 15 22:14:45 2021
Total time = 62880.919 seconds. CPU (77.940%) Fri Apr 16 23:18:18 2021
Total time = 63454.978 seconds. CPU (77.550%) Sun Apr 18 02:00:22 2021
Total time = 59981.112 seconds. CPU (80.950%) Fri Apr 16 04:46:54 2021
Total time = 60137.530 seconds. CPU (80.250%) Sat Apr 17 08:14:34 2021
Total time = 60761.847 seconds. CPU (79.760%) Sun Apr 18 09:25:05 2021
Total time = 61226.161 seconds. CPU (78.130%) Thu Apr 15 22:16:14 2021
Total time = 62288.823 seconds. CPU (77.950%) Fri Apr 16 23:09:33 2021
Total time = 62666.744 seconds. CPU (77.790%) Sun Apr 18 01:58:53 2021
Total time = 60868.249 seconds. CPU (80.390%) Fri Apr 16 05:09:01 2021
Total time = 60620.490 seconds. CPU (79.800%) Sat Apr 17 08:20:21 2021
Total time = 61447.937 seconds. CPU (79.240%) Sun Apr 18 09:57:00 2021
Total time = 60719.927 seconds. CPU (78.970%) Thu Apr 15 22:08:09 2021
Total time = 61064.468 seconds. CPU (79.440%) Fri Apr 16 22:31:53 2021
Total time = 61730.494 seconds. CPU (78.930%) Sun Apr 18 01:17:45 2021
Total time = 60115.464 seconds. CPU (80.800%) Fri Apr 16 04:52:27 2021
Total time = 60441.838 seconds. CPU (80.820%) Sat Apr 17 08:03:57 2021
Total time = 60271.737 seconds. CPU (80.080%) Sun Apr 18 08:22:56 2021
Total time = 60653.026 seconds. CPU (80.010%) Fri Apr 16 15:18:05 2021
Total time = 61004.122 seconds. CPU (79.320%) Sat Apr 17 15:34:53 2021
Total time = 60979.367 seconds. CPU (80.390%) Fri Apr 16 15:19:44 2021
Total time = 60986.338 seconds. CPU (79.320%) Sat Apr 17 16:18:56 2021
Total time = 61135.941 seconds. CPU (79.130%) Fri Apr 16 15:24:16 2021
Total time = 62452.243 seconds. CPU (77.950%) Sat Apr 17 16:45:00 2021
Total time = 60635.489 seconds. CPU (80.180%) Fri Apr 16 15:03:46 2021
Total time = 60817.650 seconds. CPU (79.130%) Sat Apr 17 15:32:05 2021

I started 10 days ago and I’m going to get my 200th plot today. I’ve won 4 XCH so far which is pretty decent statistically considering I had 0 plots 10 days ago and it’s only been 3-4 days since I got everything scripted to my liking.

Awesome. Having plotting machines that are all the same would be a dream come true. Mine are all different, from old i5 Dell laptops to one of those 2013 “trash can” Mac Pros and some newer MacBook Pros. It’s been interesting and somewhat surprising to see which perform better than others.

A lot for me to chew on here. Thanks, @ryan! :clap:t5:

I realized … yet again … I am a complete dumbass. Guess what I should be using instead of the 6gbps SATA port?

The 40gbps Thunderbolt 3 port!

Yeah, that was really stupid of me. :man_facepalming:

SATA SSDs aren’t that much slower, but good lord, a quality NVMe drive connected via Thunderbolt 3 gets pretty much full bandwidth! Thunderbolt 3 tunnels a PCIe 3.0 x4 link, so it’s practically like having a second M.2 slot on the motherboard!

I’ve recalibrated my two NUC10i7FNH plotters so they are both using the TB3 port to run half their plots. Gosh, that was dumb of me. You don’t even need the SSD drive mounted in there, unless you want a speedy dump drive…

I think I got distracted by “things that are directly connected to the motherboard must be faster”, but that is not necessarily so in a world of USB 3.2 and Thunderbolt 3… and the upcoming USB4 that smooshes it all together into one giant ball of awesome!
