NVMe drive not releasing disk space after failed plot


My Windows 10 plotter has been running for 22 days.
Once every few months, some resource runs out, and the plotting on its associated NVMe drive stalls.

Today, it ran out of disk space. Why? Who knows. For 22 days it never got low on space.

As such, the plotting stalls (it keeps waiting 5 minutes for space to become available, then tries again). So I killed one of the k33 plotting sessions and deleted all of its associated tmp files. That freed up a few hundred gigabytes of space, and the remaining k34 plotter continued – sort of.

It plotted for another 20 or 30 minutes, and then it stalled. Hundreds of gigabytes were still free, and this time there were no complaints from the plotting job. It just stalled. I waited 2 hours, and it did nothing. Task Manager showed no reads/writes to that NVMe drive.

So I killed that last plotting job, and I deleted all of its associated tmp files.

This should have made 100% of the NVMe drive’s space available, but it did not.

Of the 2 TB capacity of the NVMe drive, only 1.4 TB became available, and yet there were zero files on the drive (other than the directory files, which effectively use zero bytes). No hidden or system files. Simply zero files, and 600 GB of space is missing.

This happens once every few months. Upon rebooting, I get 100% of my space back. Everything returns to normal.

I would rather not do a reboot, but I do not know any other way to “reset” the space on the drive.

I have not yet rebooted the PC, because other plotting is still running on a different NVMe drive.
When that other plotting completes, I will do a reboot.

Something is holding on to file space.
Can anyone offer a way to identify what is holding on to the space, and how I can have it release the space, thereby not forcing me to do a reboot?

Thank you.

This is not uncommon. It happens to me about once a month.

I do not know why, but the Chia plotting process sometimes leaves ghost files on NVMe plotting drives. Usually you cannot see them, and even if you can, you cannot delete them.

I am surprised that you get the space back after re-boot. I have to quick format the plotting drive whenever this happens.

Do a quick format of the NVMe plotting drive and you will be good.
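For anyone searching later, the quick format can also be done from an elevated Command Prompt (a sketch; D: is an example letter, so triple-check it before running, since formatting erases the volume):

```shell
rem Quick-format the NVMe plotting drive as NTFS (run from an elevated prompt).
rem D: is an example -- substitute your actual plotting drive's letter.
format D: /FS:NTFS /Q
```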

You mean no reboot. Just do a quick format?
That is fine. But I would prefer to identify what has taken the space hostage.

There could be stuff in memory that is associated with the connection to the unseen space usage.
Formatting the drive might not release that memory.

I will probably end up re-booting, just to start off completely fresh. And it will be an opportunity to let Windows update what stuff it wants to update.

I would like to find a way to identify the process(es) in question and terminate them. It just seems to be a cleaner solution. If nothing else, finding the culprit(s) will scratch my itch.

As mentioned, this has happened to me repeatedly. I have seen no ill effects from formatting, nor any connection to running processes associated with the ghost files.


You can see all the ghosts in the drive :slightly_smiling_face:



It is known to run on Windows 95 (IE5), Windows 98 SE, Windows ME, Windows NT4 (SP5), Windows 2000, Windows XP, Windows Vista, Windows 7, 8 and 8.1.
Server versions of Windows generally work as well, but aren’t always tested as well by us. Feel free to get in touch with us if you want to lend a hand :slight_smile:
Have you tried this on Windows 10?

@hoca05 Their web site has no information related to my issue.

How do you know that that utility will identify ghost files and their handles, and provide a way to recoup the captive storage space?

Have you used the utility for the purpose I am seeking?

Did you tick the ‘hidden files’ box in Explorer View tab?

I mostly use the command prompt, and I did check for those files.

For the command prompt, did you use attrib -r -s -h /s /d

Although, that will not touch things like the Recycle Bin, etc.

I typed dir /a
That will show everything.
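Spelled out (a sketch; D: stands in for the plotting drive):

```shell
rem List every file on the whole drive, including hidden and system files.
dir D:\ /a /s

rem Ask the volume itself how many bytes are free. A large gap between this
rem and what the file listing accounts for suggests space being held by
rem deleted-but-still-open files.
fsutil volume diskfree D:
```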

I can’t remember the last time I used the recycle bin. It has to be at least 10 years ago.
I use the “del” command.

Similar to Linux.
You use “rm”, and it is gone. (well, it is recoverable – but for freeing up disk space, it is gone).


Maybe search for Mark Russinovich sysinternals (Microsoft bought him). He wrote several system level apps. All high quality. Maybe you find something there.


I have been using those apps since www.sysinternals.com was the URL.
Unfortunately, nothing there for this.

autoruns is especially helpful (not for the disk space issue – but for its intended purpose).

I frequently use du. Although, maybe DiskView will show some info?
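One more Sysinternals tool that may apply here is handle.exe, which searches open handles by path; if a dead plotter process still held handles to deleted tmp files, that would pin the space. A sketch (D: and the PID are examples, and the prompt must be elevated):

```shell
rem List every open handle whose path contains D:\ (run from an elevated prompt).
handle.exe -nobanner D:\

rem If a stale plotter process shows up in that output, killing it should
rem release the space. 1234 is a placeholder PID taken from handle.exe's output.
taskkill /PID 1234 /F
```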

That looks promising.

You should be trimming the drive frequently to free up the space. Otherwise it will slow to a crawl. If you’ve got a 1 TB drive, you should probably be trimming it after every plot, or every other plot, created.

I thought that the drive’s controller handles the trimming, automatically?

SSDs are over-provisioned, and the controller deals with failing NAND cells accordingly by way of making use of the over-provisioned space and marking the failing cells as unavailable?

If not, how does the user perform the trimming?

My NVMEs are 2 TB, each, by the way.

Trimming and failed cells are different things. Trimming is kind of like garbage collection. Once a cell is used (written to and released), it is considered an “untrimmed” cell. The more such cells you have, the more likely you will hit such cell with a new allocation. Once you hit such cell, it is a tad slower than a “clean” cell. Although, my understanding is that the penalty is in low single digit percentage points (based on plotting times).

So, I would really not worry about that. Although, you can right click on a drive, go to Properties, Tools, Optimization, and schedule a daily optimization. If you want to be more aggressive, you can run a command-line (PowerShell) system util that will trim it (forgot the name). Just for kicks, you could run it once or twice per hour or so, and check your performance change (whether you can see any improvements or not - most likely not).

Found it:
Optimize-Volume -DriveLetter YourDriveLetter -ReTrim -Verbose

You can put multiple drive letters there (e.g., C,D,E,F)

Fully filling an SSD always slows it down. Just format the 2 TB NVMe as 1 TB if you do not need more space; the unallocated half acts as extra over-provisioning, so the controller always has clean cells available. Mine is a 2 TB drive formatted as 1 TB. It has been plotting without errors for 8 months. You have to cool it well.
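If you want to try the half-capacity layout, it can be scripted with diskpart (a sketch under assumptions: the 2 TB NVMe is disk 1 and P: is a free letter; "clean" wipes the disk, so verify the disk number with "list disk" first):

```shell
rem Save as op.txt and run:  diskpart /s op.txt
rem CAUTION: "clean" erases the selected disk entirely.
select disk 1
clean
create partition primary size=1000000
rem size is in MB, so 1000000 is roughly 1 TB; the rest stays unallocated
rem as extra over-provisioning.
format fs=ntfs quick label=PLOT
assign letter=P
```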

I thought that that is only for mechanical drives; that it makes your stored data contiguous, and that that is not possible with solid state drives?

That looks interesting. I will look into that option to see if it differs from the “Optimization” noted in your first suggestion, and to confirm that it is designed to work with SSDs.

I am being cautious, because when SSDs were introduced, I read that attempting to optimize them actually wears on them more.