MadMax k33, k34 support now in Chia v1.2.11!

Has support for K33 and K34 been added to the MadMax standalone program?
I only ask because my plotter runs Linux and does not have Chia, any keys, or crypto installed on it; it just runs MadMax from the CLI. K33s would be of interest, and I agree with SlugPlot that they should require fewer disk operations to check plots.

The MadMax plotter has a separate executable for creating K-33 and K-34 plots on Linux: chia_plot_k34. The reason for this is that the MadMax source code uses compile-time evaluation (C++ templates, plus some #if directives) to generate an optimized executable for each plot size.


In relation to whether K-34 requires fewer disk operations to answer block challenges faster, please check out the following quotation from Chia Proof of Space Construction (version 1.1, 2020-Jul-31):

Yes, I’m aware of the quality cutoff optimization. I’m still curious about the first pass before that cutoff. It’s a curiosity to me; I don’t use a Raspberry Pi.

Edit: Also, this concern is more relevant to pooling, because a pooled harvester receives more proof requests.

After the cutoff optimization, "ProofOfSpace check -f PLOT" takes about 0.4 seconds per check in the case of K-29 and about 0.8 seconds per check in the case of K-32 (HDD, 7200 RPM).

I am not sure what you mean:

  • The number of messages sent from a farmer to a pool is controlled by the difficulty setting. Difficulty 2 starts at about 10 TiB, difficulty 10 at about 60 TiB.

  • The number of proof attempts per second is basically a predefined constant value (approximately 1 attempt every 10 seconds).


With the pooling protocol, your farmer will communicate with a specific pool server, and send proofs of space very often to that pool, in order to prove how much plotted space you have.

Pooling requires more proofs than solo.


Pooling requires sending proofs more often (partials), but the lookups stay the same.

Is there k33 support in MadMax for Windows/PowerShell?
And are the temp-space requirements different from those for k32? Is there a list somewhere?

Thanks!
