How to run Harvester on a Synology NAS

Hi all, new to the forum.

I have two Synology 12-bay NAS units that I am using for plot storage. I configured the first one as a 12 x 10TB RAID-5 array and then managed to fill it with 969 plots before pools existed. However, I have been getting the ‘Looking up Qualities’ warnings that have been mentioned elsewhere here.

The second Synology was configured the same way and had around 200 pool plots on it, and it appeared to be working well. However, in light of the errors happening on the first one, I have reformatted and changed the RAID-5 to 12 x 10TB standalone basic drives. They are still in the NAS. I am now replotting to the individual disks.

I am running a Chia full node on a VM on a Dell 720XD server and am just pointing the plot directories to the NAS devices. This appears to be working; however, I would like to streamline it. I understand that I would be better off running a harvester on the NAS devices, but I have no idea how to make this work. I know it’s something to do with Docker, but it’s all gobbledygook to me.

Does anyone have any clear, simple instructions to get this working or should I leave it as it is?

Thanks!


This depends totally on the model of the NAS. If it is not a plus model, you can’t run it locally (unless you really want to jump through a bunch of hoops). If it is a plus model, you can run VMs and even Docker. Setting them up as individual drives was the right move, though: you can harvest them from a remote system without issues. Just make sure you have a dedicated harvester for each NAS, and don’t use the same network connection for harvesting as you use for copying new plots to that NAS. If you do those things, you’ll be fine.
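For anyone curious what the Docker route looks like in practice, here is a rough sketch of starting a harvester container over SSH on a plus-model Synology with the Docker (Container Manager) package installed. The image name `example/chia-harvester` and all paths are placeholders, not an official recipe; substitute your pool's documented image and your actual volume paths.

```shell
# Rough sketch only: image name and paths are placeholders.
# Mount the plot directory read-only and pass in the harvester config.
docker run -d \
  --name chia-harvester \
  --restart unless-stopped \
  -v /volume1/plots:/plots:ro \
  -v /volume1/docker/chia/config.yml:/config.yml:ro \
  example/chia-harvester
```

Keeping the plot mount read-only means a misbehaving container can't damage the plots, and `--restart unless-stopped` brings the harvester back up after a NAS reboot.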


Hi there,

Thanks for the reply. I will shortly have 3 x DS2413+ and 1 x DS2415+ containing the drives and plots.

I quite fancy the Docker route, but it all seems a little complicated to set up. Do you have any easy instructions on how to do this?

Thanks again!

Our farmer has Docker instructions in the help file, plus a ton of Synology users. But yeah, unless you’re with us I dunno if those instructions would be helpful for you. I’ve been told it’s a turnkey program.

Could I take a look?


Thanks for that and I’ll take a look. BTW, I’m using FlexPool.

Then it’ll be easier :slight_smile:

OK, I’m getting somewhere. I’ll keep fighting with it for a while.

Hi,

I’ve configured the config.yml file, but I’m getting ‘wrong farmer public key’ errors in the logs. All 69 of my plots are showing the same error. I think I’ve got the ‘Farmer Secret Key’ field wrong in the config file?

Where can I find the farmer public key or the Farmer Secret Key on a Windows installation?

Thanks!

On Windows you need to run a Python script to get it. We haven’t publicly released the Windows version yet, as it’s too complicated currently.

To do it on Linux you can hop on our Discord, or probably find it on YouTube, but for Windows you’d need to run a Python script and get the Windows version from us directly (again, on Discord).
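As a side note for the public key half of the question: assuming a standard chia-blockchain install, the bundled CLI can usually print your keys, including the farmer public key, on Windows as well (the `chia.exe` path varies by version and install location). The secret key material your pool's config needs may still require their script, as described above.

```shell
# Print the key hierarchy for the active keyring, including the
# farmer public key, on a standard chia-blockchain install.
chia keys show
```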

Woohoo! Sorted, many thanks!

When I add plots to the directories, do I have to restart the Docker app or will it pick them up automatically?

Automatic !!!

Thanks.

Last question: I can see on my firewall a connection to flexpool.io from the Synology device, but the worker isn’t showing in the account.

Does it take a bit of time?


Ignore that. I was using the wrong Mining Address. This thing looks awesome. Hope it works out!

Thanks again for your help.


Isn’t it awesome on Synology? I hope you enjoy :smiley:


Hi again,

I’ve added a second worker onto the same network. All seems initialized and it’s farming the first 5 plots. However, it doesn’t appear in the CP.

Does it take a while to appear?

Thanks!

And it’s just appeared, 2 seconds after I posted. :smiling_face_with_three_hearts:

Hi again,

Docker logs appear to be piling up. Is there any way I can turn this logging off?

Txs!
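For reference, the growing logs are usually Docker's default `json-file` log driver, which grows without bound unless capped. Both options below are standard `docker run` flags and take effect when the container is (re)created; the image name is a placeholder for whatever harvester image you are running.

```shell
# Option 1: disable container logging entirely.
docker run -d --log-driver none example/chia-harvester

# Option 2: keep logs but rotate them
# (at most 3 files of 10 MB each).
docker run -d \
  --log-opt max-size=10m \
  --log-opt max-file=3 \
  example/chia-harvester
```

Rotation is usually the safer choice, since a recent log tail is handy when the harvester misbehaves.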