But less than the energy requirement of all the RGB lighting on our GPUs and keyboards. Or, speaking of energy, how much energy do we spend on all those aroma/oil diffusers? Or on charging those Apple Watches that everyone gets despite having a phone that does it all already?
I’m tired of people focusing on one tiny thing and saying it uses up the energy of xxx households. There are so many things we do that waste energy, money, time, etc. We either accept that or we attack them all.
You know how many forests we’ve destroyed for the sake of coffee?
How many edible crops could we have grown instead of marijuana, which we just burn? (Not to mention the massive energy footprint of grow lighting.)
How many homes could we have built if 100,000+ people were employed in housing instead of the music industry? The gun industry? The video game industry? Hell, Twitter had 7,500 employees for a platform that honestly could be created and run by 10.
How many homes or bridges we could have built instead of all the football stadiums?
Humans could easily make tiny sacrifices to make the world better, but we don't. Ask any environmentalist why they haven't gone after coffee or marijuana and they'll give you a dirty look. Ask San Francisco why they pay more taxes yet have more homeless people than elsewhere, or why they spent $1.7 million on a public toilet.
We're an imperfect society; get used to it. This massive mega-boner people have for the power use of crypto is ridiculous. So what if BTC uses up the power of some small nation? We could probably save the same amount by turning down the volume on our TVs. I'm pretty sure we spend more power on YouTube videos than most nations do.
I know what this means on a grander scale: smaller-capacity drives just became a bit more logical, if the price is right of course. Getting a decent percentage more bingo cards per drive changes the calculation of the cost to acquire the drive versus the cost to run it. SAS FTW.
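The acquire-versus-run trade-off above can be sketched with a quick back-of-the-envelope calculation. All prices, wattages, and the electricity rate below are made-up placeholder figures, not real quotes:

```python
# Back-of-the-envelope drive cost comparison: purchase price vs. lifetime
# power cost for a drive spinning 24/7. All numbers are hypothetical.

def lifetime_cost(buy_price_usd, watts, years, usd_per_kwh=0.30):
    """Total cost of buying a drive and running it continuously for `years`."""
    hours = years * 365 * 24
    power_cost = watts / 1000 * hours * usd_per_kwh
    return buy_price_usd + power_cost

# One big 18 TB drive vs. two smaller 8 TB drives (made-up figures):
big = lifetime_cost(buy_price_usd=280, watts=6, years=5)
two_small = 2 * lifetime_cost(buy_price_usd=120, watts=5, years=5)

print(f"1 x 18 TB over 5 years:  ${big:.0f}")
print(f"2 x 8 TB over 5 years:   ${two_small:.0f}")
```

With denser (compressed) plots shifting effective capacity per drive, re-running numbers like these is what decides whether smaller drives pencil out.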
I have a couple of questions which I have not yet seen answered.
In the OP, it is stated that Linux is a requirement. Will a Windows version be created? I'm a Windows man, and while I'm sure I could get it all running under Linux, I would prefer not to have to.
Will it be possible to farm NFT plots and the new MadMax plots simultaneously while replotting?
I will be building a new PC for myself in the near future, and I will overspec it with replotting in mind. I also have a 2080 Ti sitting idle which I will put into my farm once it is required, so I see no major downsides in the long term to using these new plots (apart from the current Linux requirement).
Can someone give a small TL;DR of what GPU plotting is and how we got here from, let's say, last year? I have not been following every post, so I feel a little lost…
The Windows GPU plotter is already out (for MMX); however, the performance is not as good, and in some cases pretty bad. We are still trying to figure out exactly why. The Windows CPU plotter is finished as well.
Yes; however, the fee will be static, to keep it simple.
So for smaller farmers, speed is not really what matters when creating plots. If a smaller farmer wanted to use Gigahorse primarily to create C3 or C4 compressed plots for harvesting with a Raspberry Pi, and only wanted to use a CPU to create them, what would the requirements or recommended specs be, and how long would it take to create a plot without a GPU?
Choosing a GPU in preparation for this new era of plotting and farming, what specs should be considered? Does only the total CUDA core count matter, or also GPU RAM? Is there a general guideline? (I am considering either an RTX 3070 or an A5000, as I have a maximum 300 W limit on my server riser, and I also require a blower-style cooler.)
Considering I can find some used A5000s for a good price from former ETH miners, and that their lower power requirements would not exceed my Dell R730 riser's power limit, would one be a much better investment than a 3070? (Both are under a 300 W TDP, around 220 W, and the CUDA core count difference is significant in favour of the A5000: 8,192 vs 5,888.) I currently own a 2 PB farm, and C7 seems possible with both even if I increase my netspace contribution in the next months.
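One rough way to frame that choice is CUDA cores per watt and per dollar. The core counts below are the ones quoted above; the TDPs and especially the used-market prices are assumptions for illustration, and cores-per-watt is only a crude proxy for actual plot throughput:

```python
# Crude GPU comparison for plotting: CUDA cores per watt and per dollar.
# Core counts are from the post above; TDPs and prices are assumed examples.

cards = {
    "RTX 3070": {"cuda_cores": 5888, "tdp_w": 220, "price_usd": 350},  # price is a guess
    "A5000":    {"cuda_cores": 8192, "tdp_w": 230, "price_usd": 900},  # price is a guess
}

for name, c in cards.items():
    per_watt = c["cuda_cores"] / c["tdp_w"]
    per_usd = c["cuda_cores"] / c["price_usd"]
    print(f"{name}: {per_watt:.1f} cores/W, {per_usd:.2f} cores/$")
```

On these assumed figures the A5000 wins on efficiency while the 3070 wins on price; VRAM capacity and real Gigahorse benchmarks would still need to be checked before buying either.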
Just to clarify, does this mean that compressed plots of a larger k size will take more resources to farm than compressed plots of a smaller k size? Does it track linearly with effective plot size?