My setup and opinions on the best way forward with parts/plots

So I am currently plotting some normal plots (madMAx GUI plotter) on my 11th gen i7 at home and bringing them into the office at work to sit and farm on an AMD FX 6300. I am thinking about grabbing a 35 W 10th gen i3 to throw into a board I have at the office to lower the power usage for the farmer.

My biggest question: could I make compressed plots in the Chia alpha release with the GUI plotter and then use the 10th gen i3 to CPU-farm them? I don’t have a spare video card I want to put into that system and want to keep the power draw as low as possible for farming. Maybe a GT 710 or GT 1030, but I don’t plan on dropping either my 7900 XTX or 3080 into that farmer. I am only farming to hold the coins until they get onto Coinbase or whatever. Mostly just farming for fun, with no current way (or desire) to sell them, so using as little power as possible is my goal.

Also, if I can CPU-farm compressed plots on the 10th gen i3, what compression level should I use, and what is my cap before my CPU gets overrun or hammered all day farming? I currently have about 100 TB but am slowly getting more.

Thanks for reading and giving input :slight_smile:

Also, I am familiar with Ubuntu but really don’t want to run Linux, as Windows makes remote management and daily ops easier for me.


Bladebit has a simulator command where you can benchmark the different C-levels and find out what farm size your CPU can handle. When GPU support is finally added, you will be able to simulate for that as well.
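For reference, here is a rough sketch of how such a simulation run might look. The flags and the plot path are from memory and just examples; check `bladebit simulate --help` on your build for the exact options:

```shell
# Benchmark how a plot behaves on this CPU, assuming a compressed plot
# file at /plots/plot-c5.plot (example path).
# -n: number of simulation iterations, -p: parallel decompressor threads.
bladebit simulate -n 100 -p 4 /plots/plot-c5.plot

# Some builds also have a power-measurement mode, e.g.:
# bladebit simulate --power 60 /plots/plot-c5.plot
```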

In terms of compression levels, C0 (uncompressed) is outdated; you should use at least C1 or C2, since the computation overhead is practically non-existent. The CPU sweet spot seems to be around C5.

Also make sure you understand that C-levels are not the same for Gigahorse and BladeBit: GH skips the first two levels, so C1 for GH = C3 for BB. The max level is C9 for GH and C11 for BB (and it will remain this way).


It’s only a one-level difference: C1 for GH is C2 for BB.

I suspect BB does the higher levels differently; that’s why they go to C11. I could not find any code yet to see what is happening.
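To keep the two numbering schemes straight, here is a tiny helper. The function name is my own, and it just encodes the one-level offset madMAx states (GH C1 = BB C2) plus the level caps mentioned in this thread:

```python
# Hypothetical helper: map a Gigahorse C-level to the roughly equivalent
# BladeBit C-level, using the one-level offset (GH C1 ~= BB C2).
# Caps per this thread: GH goes up to C9, BladeBit up to C11.
def gh_to_bb(gh_level: int) -> int:
    if not 1 <= gh_level <= 9:
        raise ValueError("Gigahorse levels run C1..C9")
    return gh_level + 1

assert gh_to_bb(1) == 2  # madMAx's example: C1 GH == C2 BB
```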


Thank you for the clarification.

That will be a while. The plotter and farmer are still in alpha, then going into beta, and then release.
Whether the first release will include plotting from the GUI, or CLI only, is unknown at this point (at least to me).

On the other hand, both the plotter and farmer are working atm in the alpha versions, albeit with some caveats here and there (like no plotting in Win10, which creates bad plots).

Yes, I’m using the alphas now and farming compressed plots with an i3-6100.
I have like 140 TB of C4 plots now plus 250 TB uncompressed, and the CPU is fine with that. I'm looking at an average 6-8 W power draw for the CPU, with short peaks to maybe 15 W in HWMonitor (idle it uses 5 W). So really not even breaking a sweat.

C4 seems to be the best balance for low-end CPUs; C5 is also OK but more restrictive, and C6 needs a beefier CPU.
I also like to play it safe: a year from now the plot filter will most likely change to 256, which will almost double the compute required for decompression.

So with an i3-10100 you want to look at C4 or C5: make a few of those plots, run the power simulator, and see what it says.
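The plot-filter point can be sanity-checked with a quick sketch: the expected number of plots passing the filter per challenge is roughly total plots divided by the filter constant, so halving the filter from 512 to 256 doubles the decompression work:

```python
# Rough model: each plot passes the filter with probability 1/filter,
# so the expected number of plots needing proof-quality lookups per
# challenge is total_plots / plot_filter.
def expected_passing_plots(total_plots: int, plot_filter: int) -> float:
    return total_plots / plot_filter

# Halving the filter constant roughly doubles the decompression work:
assert expected_passing_plots(1000, 256) == 2 * expected_passing_plots(1000, 512)
```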


To add to what Voodoo said: CPU efficiency is always lower than GPU efficiency, so if you’re going the CPU route you want to minimize the load on your CPU on principle.

The GPU route scales better for efficiency and suits larger farms, where the efficiency shines (multiple PiB per GPU), or more intensive farming with higher C-levels.


@aurelius Yeah, at scale the GPU makes sense, but at 100 TB or even 500 TB I am not sure the gains would really be worth the extra power draw of having one in the system. The TDP for even a 1030 is 30 W, almost doubling the draw of the system for something like 5% more than CPU farming at C4-C5. I’ll worry about that when I have enough hard drives that the GPU’s efficiency actually outweighs its added cost per kWh.
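For anyone who wants to run those numbers themselves, a quick back-of-the-envelope (the $0.12/kWh rate is just an example, and treating the TDP as continuous draw is pessimistic):

```python
# Monthly electricity cost for an always-on component.
# 730 is roughly the average number of hours in a month.
def monthly_cost_usd(watts: float, usd_per_kwh: float, hours: float = 730) -> float:
    return watts / 1000 * hours * usd_per_kwh

# A GT 1030 running at its 30 W TDP, at $0.12/kWh:
cost = monthly_cost_usd(30, 0.12)
print(round(cost, 2))  # ~2.63 USD/month
```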

TBH I had been GPU mining for years until ETH finally went to PoS and I sold the cards off; the power costs are not minimal once you add any video card to the system.


@Voodoo No plotting in Windows 10? Well, that sucks, because that is what that system is running. What about 7 or 11? I really don’t want to do an Ubuntu or other Linux distro install just for a Chia farmer, like, at all. Honestly, I am fine with the C0 plots living life in the corner with pooling for now. I also have a pretty full plate in my life, so simplicity in running, managing, and adding more hard drives is the goal here.

I have no interest in GH, only because my farm is not big enough; the plots are not universal and include a dev fee, which I would be fine with if I were large enough for absorbing those costs to make sense, but I am not.

But I would like to fill all of my drives moving forward with at least some compression and then slowly work through the backlog of old drives to get them upgraded. As long as the power draw stays truly minimal, since that is the entire point of Chia; also, not being able to sell the coins to recoup electricity costs makes a power-hungry system all the more unappealing.

But just to make life as easy and cheap as possible, I am thinking of making C4 plots with the alpha build on my i7 on Windows 11, to keep CPU usage as low as possible, and then putting them into the i3 build at the office, running the alpha on whatever OS it works on, to farm. I was also debating using FlexFarmer from Flexpool, mostly so I don’t need to keep the full blockchain on the system.

Also, I ordered the 10th gen i3-10100T today for the board and have 16 GB of DDR4 lying around, so I’ll swap that over into the FX system when the CPU gets in. I hope that will cut down the power draw pretty massively. The system right now likes to hover around 90 W from the wall with all the drives, which isn’t bad, but that seems like a lot for a system that is making me like $10 a month; that is probably break-even on power costs.


@madMAx43v3r Are you the same guy that made the plotter? If so, that’s super cool man, a living legend. Serious question though: how long until someone figures out how to just GPU-mine Chia by spoofing the plots, rendering the entire chain useless to hard drives? I feel like it is only a matter of time, kind of like how BTC went from CPU > GPU > ASIC in the early years, and every coin since has had some developer figure out how to do it with X that is 10,000 times better and faster.

Win7, no idea; Win11 is fine. Win10 seems to have some Nvidia driver related issue: some cards work, others not. It makes plots, they just don’t pass `chia plots check`.

Anyway, Win11 works for plotting, albeit slower than Linux for the moment; they haven’t gotten round to the Windows optimization yet. Seems all the devs in the Chia space are Linux fans :sweat_smile:

I totally agree with your reply.

Coming from GPU mining too, think about plot compression the same way you’d think about power and frequency: highest power (and frequency) during bull markets, lowest during bear markets.

GH works with Flexpool and FlexFarmer; if you’re interested in lower fees (1.5% on CPU) and a “hands-off” approach, it’s a good solution. Bladebit is free but still in development.

That is actually the real madMAx :slight_smile:

FPGAs and ASICs won’t be a thing. I was talking about it with Gene Hoffman on Discord a few days ago; search for “fpga” in the general channel and you’ll find the whole discussion. The quick TL;DR (since you were into GPU mining) is that it’s similar to Daggerhashimoto (high memory bandwidth required), which keeps GPUs competitive with ASICs (and thus disincentivizes their production).

Honestly, I tuned all of my cards to the highest stable memory clock, then undervolted and brought the GPU core down as low as possible before losing hashrate. I would just sit on the coins until the market spiked to deal with the swings, instead of actively adjusting the miners every day based on the market lol.

Yeah, I saw GH works on Flexpool, but I really hate plotting. I have really good, fast, high-endurance M.2 SSDs for plotting, but I still don’t like watching them get murdered in TBW. I might build a cheap plotting server in the future so I can RAM-disk plot, because replacing my 2x 2 TB M.2s to keep plotting on them costs more than getting one of the budget plotting servers up and running. If GH were just universal and worked in the GUI, I would use it. I’m not against the dev fee at all for a solid product; not saying it’s bad, it just won’t work with how I want to run my farm, and I am really not open to constantly replotting.

I know FPGAs and ASICs won’t be a thing, but plot spoofing has been a topic of discussion ever since Max first figured out RAM-disk plotting and was ripping out like 45-second plots when the coin was still fairly new, and now with GH I think they are getting dangerously close to the time needed to spoof the winning plot. I saw some discussions with, I think, Bram that said just moving X to X would prevent that, if that ever happens. That was more what I was referring to when asking Max about it.

I can bump that system up to 11 with the new chip easily enough; sucks for the extra overhead though. I’ve spent plenty of time in Linux, I just don’t like to daily-drive it, especially on a system that I want to barely touch. I don’t enjoy running command lines to do everything, or more so having to memorize them to make it not a chore, like it is 1987 again lol. But as we talk more, I really don’t care about that computer at all, so maybe I’ll throw Ubuntu on it and just see how bad it goes at some point.

So to plot, do I need a video card to make the compressed plots no matter what? Will either of the plotters work with AMD cards? My 3080 is in my water-cooled office PC on Windows 10, and I am not bumping that to Win11 anytime soon and can’t easily pull out that card.

@Voodoo @aurelius So on Windows 11, am I able to plot compressed plots with BB disk in the Chia alpha GUI with a 7900 XTX and 11th gen i7?

Nvidia only; AMD support is a long way away.

BB Diskplot will work at some point, but not right now. Fixing that is waiting on a major overhaul that probably isn’t coming until the end of June at the earliest.

You can always run Ubuntu via WSL in Windows for plotting only… work on those CLI skills :stuck_out_tongue_winking_eye:
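If you go that route, the basic setup on Windows 10/11 is a one-liner from an elevated PowerShell or CMD prompt (Ubuntu is the default distro anyway; the `-d` flag just makes it explicit):

```shell
# Install WSL with Ubuntu as the distro, then reboot when prompted
wsl --install -d Ubuntu

# Afterwards, drop into the Ubuntu shell from any terminal with:
wsl
```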

So if I run Ubuntu to plot/farm, can I make compressed plots without issue?

I mean, you know my setup well enough now to understand what I have. What options do I have to make and farm compressed plots, and what's your opinion of the best way forward?

Plotting has changed a lot since the early days. I’m not sure if you can actually plot compressed with SSDs, or with CPU for that matter; I switched to RAM plotting a long time ago, since there is no wear, and I just do GPU plotting since it’s faster.

You are referring to grinding, and that will not happen. It is already very hard to do by design (hence the large amount of memory/writing needed to make a plot), and changes are on the way to prevent it even further (read about CHIP-12 and CHIP-13).


Apparently (not tested by me) GH plots are plottable on CPU with 64 GB RAM.
