Chia Power Usage

Chia has a very low potential power ceiling, given the limits of global hard drive fabrication capacity and the limited supply of older, less efficient storage.

If the price of Chia went up 1000x, power consumption would still be capped by the global supply of hard drives larger than 100 GB. ETH and BTC have no similar limit: if the price of ETH or BTC goes up 1000x, old, less efficient graphics cards and ASICs will come out of the woodwork, sucking up even more power to turn a profit.

2 Likes

@codinghorror @mike
Thank you guys for making legitimate arguments.

So now it seems to me that Chia is banking its energy savings on the up-front investment in drives (assuming we start using SSDs to farm). I don’t think we’d ever run out of drives, because people will just start selling drives at jacked-up prices. Even if the SSD price speculation does turn out to be true and Chia does save energy this way, I still think it’s perhaps not as good an alternative to PoW as the team suggests, due to the sheer amount of hardware involved.

What do you guys think about the plotting power consumption, then? The network has grown almost 4 EiB in the past month. At this rate, we are consuming 5 GWh annually (assuming 10 Wh per plot). If SSD storage capacity increases drastically in the coming years, plot rates will only increase proportionally.
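For anyone who wants to sanity-check that estimate, here's a rough sketch; the ~101.4 GiB k=32 plot size is my assumption (it isn't stated above), while the 4 EiB/month growth and 10 Wh/plot figures are from this post:

```python
# Back-of-envelope plotting energy from the netspace growth rate.
GIB = 2**30
EIB = 2**60

netspace_growth_eib_per_month = 4      # ~4 EiB of new plots per month (from the post)
plot_size_bytes = 101.4 * GIB          # nominal k=32 plot size (my assumption)
wh_per_plot = 10                       # assumed plotting energy per plot

plots_per_month = netspace_growth_eib_per_month * EIB / plot_size_bytes
gwh_per_year = plots_per_month * 12 * wh_per_plot / 1e9

print(f"{plots_per_month / 1e6:.1f} M plots/month -> {gwh_per_year:.1f} GWh/yr")
```

Under those assumptions it comes out to roughly 5 GWh annually, consistent with the figure above.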

3 Likes

My gut says it’s ultimately fixed the same way supply is. The energy cost of plotting can be amortized over the effective life of the drive/k-value. If a NUC can do 1.5 TB/day, that’s 500+ TB of plots per year, and those plots are good for 5+ years; it can’t be that bad.
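To put rough numbers on that amortization: the 10 Wh per plot, the 5-year lifetime, and ~80 plots fitting on a 5 W, 8 TB drive are all assumptions pulled from figures elsewhere in this thread, not hard data:

```python
# Amortizing the one-time plotting energy over a plot's useful life,
# then comparing it to the ongoing farming draw per plot.
wh_per_plot = 10                 # assumed one-time plotting energy
life_hours = 5 * 365 * 24        # assumed 5-year plot lifetime
plots_per_drive = 80             # ~8 TB drive / ~101 GB plots (assumption)
drive_watts = 5                  # assumed idle-ish HDD draw while farming

plotting_watts_per_plot = wh_per_plot / life_hours      # amortized draw
farming_watts_per_plot = drive_watts / plots_per_drive  # ongoing draw

print(f"amortized plotting: {plotting_watts_per_plot * 1000:.2f} mW/plot vs "
      f"farming: {farming_watts_per_plot * 1000:.1f} mW/plot")
```

Under those assumptions the amortized plotting energy is a fraction of a milliwatt per plot, two orders of magnitude below the farming draw, which supports the "it can't be that bad" intuition.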

2 Likes

Still very theoretical and speculative, but we’ll see.

But most people do this at a much smaller scale. Chia farmers have orders of magnitude higher quantities. I think the HDD/SSD use is the biggest knock on Chia’s “greenness” (assuming crypto is useful in the first place). The plotters will get thrown out; they can’t be reused. And the hard drives will be dumped en masse. Maybe not in 5 years, but it will need to be accounted for.

Backblaze might be the best source of data, if they have it, since they basically use consumer-ish drives for their storage. At one point they were shucking drives like us, lol. They publish failure rates over time, but they swap in replacements as density improves, before drives totally fail (and even Chia farmers might do this?).

Two things are missing from this argument.

(1) The intangible value of Chia. Cost means nothing by itself. Spend $100k on a car? What if it’s a Lambo? Now you’ve got a deal.

Here are some intangibles off the top of my head: honest money (limited supply), access to banking, freedom from government control/restriction, and an innovative financial ecosystem to build new ideas on.

(2) The cost of plotting. My 5950X has a 105 W TDP and runs almost 24/7 plotting. This will be a small cost compared to the 16 x 10 TB drives using 80-160 W 24/7 until they fail, but it’s a cost nonetheless.

1 Like

I can’t tell who you’re agreeing or disagreeing with.

(1) I brought the price of Chia into the argument to point out that farming power is very dependent on the price of Chia. The higher the market cap, the more farms there will be; that’s a very consistent principle.

(2) However much power plotting consumes, at 10 Wh per plot and our current rate of plotting, the plotting power alone is pretty much on par with other cryptocurrencies of the same market cap.

1 Like

One more thing to take into consideration is the number of plots that dying drives will take down with them. I don’t think anyone is going to keep backups of their plots, so with a speculative netspace of 100 EiB, what percentage of drives will die over a year, five years, a decade, etc.? Cost per plot + cost per TB that can realistically be expected to go out of service for good…

If you scroll up just a little bit, someone made the same argument. Supporters of Chia argue that we are at the dusk of HDD dominance in space-per-price, and that in ~6 years we will all be using SSDs and HDDs will be obsolete. If all plots are on SSDs, the only factors capping people from making new plots are the up-front cost of investing in an SSD and the limit of SSD hardware produced by manufacturers, since SSD farming power consumption is unimaginably low. They also claim that plotting power will slowly diminish to a certain point because of the diminishing storage growth described above.

1 Like

@Zahk My post wasn’t meant to be purely for or against; just two thoughts. But if you want to know: yes, I think Chia is WAY greener than PoW cryptocurrencies. If you want something even greener, try Helium =) One of the greenest.

Plots only take 10 W for the 6-12 hours while being created, then drop to ~5-7.5 W for farming. Let’s be conservative and say 10 W; it also makes for easy numbers. 4.8 EiB of mainnet right now means 48 million plots. That’s 480,000 kWh for farming. BTC is currently at 120,000,000,000 kWh, or 250,000x more electricity usage. The mainnet would need to grow 250,000x in size to equal Bitcoin in electricity usage. We’d need a 1.2-yottabyte mainnet…

2020 global storage was estimated at 6.8 zettabytes (0.0068 yottabytes) (Source: IDC's Global StorageSphere Forecast Shows Continued Strong Growth in the World's Installed Base of Storage Capacity). Will the Chia mainnet reach 176 times the size of today’s installed global storage? I predict no.

All very interesting… but as green as I think Chia is (and will be) compared to PoW, everything takes energy.

  • Television 200-250 W
  • Fridge 200-225 W
  • Washing machine 240-260 W
  • Clothes dryer 2500-3000 W

At the end of the day, we all prefer less electricity usage to more. That’s one reason I prefer Chia to Bitcoin. However, everything takes electricity. I guess the best argument about Chia is “it takes less.”

That’s what the numbers tell us.

1 Like

I don’t think you did the math right.

5 watts per plot is definitely way too liberal; I assumed that every 8 TB consumes 5 watts.

Assuming you really meant 10 watts per plot, that 480,000 kWh is per hour, while the 120 TWh BTC consumption is per year. Multiply 480,000 kWh by 365×24 and you get 4,200,000,000 kWh, which is 4.2 TWh. Only 30 times less power than BTC, for 1000 times less market cap.

But, like I pointed out earlier, I assume that every 8 TB consumes 5 watts. 4.8 EiB is about 600 thousand 8 TB drives. Multiply by 5 watts and you get about 3 megawatts of instantaneous consumption. Assuming every farmer farms 24/7/365, that’s about 3 MW × 365 × 24 h = 26 GWh. Scaled by market cap, Chia does turn out to be less consumptive at 32 TWh per year, but not by an order of magnitude.
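Both estimates are easy to sanity-check. This sketch assumes ~101.4 GiB per k=32 plot (not stated above) and rounds less aggressively, which is why its numbers land slightly above the 4.2 TWh and 26 GWh figures in this thread:

```python
# Re-running both annualized estimates for a 4.8 EiB netspace:
# (a) 10 W per plot, and (b) 5 W per 8 TB drive.
EIB = 2**60
netspace_bytes = 4.8 * EIB

# (a) ~101.4 GiB per k=32 plot (assumption), 10 W each, all year:
plots = netspace_bytes / (101.4 * 2**30)
per_plot_twh = plots * 10 * 8760 / 1e12

# (b) 5 W per 8 TB drive, all year:
drives = netspace_bytes / 8e12
per_drive_gwh = drives * 5 * 8760 / 1e9

mcap_ratio = 1250  # assumed BTC-to-Chia market cap ratio from the thread
print(f"(a) {per_plot_twh:.1f} TWh/yr  (b) {per_drive_gwh:.0f} GWh/yr, "
      f"or {per_drive_gwh * mcap_ratio / 1e3:.0f} TWh/yr scaled to BTC's market cap")
```

Either way the conclusion holds: at BTC's market cap, the per-drive estimate puts Chia a few times below BTC's ~120 TWh/yr, not an order of magnitude below.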

Does that check out?

1 Like

Chia Plotting is nowhere near as power intensive as GPU mining though – so that’s not a valid comparison IMO.

For example, I can’t get my 5950x to hit 100% power consumption while pumping out 14 parallel plots. Right now here’s a picture of the watt meter on it (note that it’s also farming, so it’s doing a few different things) and it’s at about 80% CPU utilization.

That’s making a LOT of Chia plots … 14 of them at once! Now how much would a similar mining rig use?

As already mentioned, the MSI GeForce RTX 3090 SUPRIM X 24G GPU is capable of around 105 MH/s for Ethereum mining out of the box at stock settings, and that means about 400 W of power usage per GPU.

400 GPU watts for coin mining is significantly greater than 236 CPU watts for Chia plotting. And once I finish the plots, I can farm them forever; I don’t have to keep plotting endlessly.

(Though it remains to be seen how big the netspace will get!)

I am starting to get annoyed by what you guys are arguing. What you said there doesn’t put everything in numbers. Those 400 watts will generate what? How much of the market will they account for?

I’ve made the argument for plotting power earlier. Decreasing disk cost will only promote increasing plot rates. Unless you think the plotting rate will decrease, we will keep plotting at 5 GWh annually, and if this coin takes off, I think plot rates will increase proportionally. 5 GWh, scaled to BTC’s market cap by a factor of 1250, is about 6 TWh annually.

1 Like

It’s true that plots are made once and for all, but judging by the trend, people are not going to stop plotting. The very greenness that @mike describes hinges on cheap and almost insatiable disk space. For years to come, plotting power will only increase as long as Chia keeps growing. Right now it’s really a race of who can plot faster, more than who has more plots.

2 Likes

Correct, it’s ~5 W per HDD. I was multitasking, and when I came back to what I was writing, I took that statement and ran with it.

Market cap comparisons are kinda meh IMO, especially for a new coin. If Chia were the price of BTC, it’d have a bigger market cap today shrug. If prices stay the same, Chia will have a bigger market cap over time as more Chia is produced shrug.

It seems there are far too many assumptions to make any meaningful predictions: price, innovation in HDDs, innovation in the Chia protocol, demand, competition from other cryptocurrencies and fiat currencies, difficulty of winning blocks, and the future availability and price of HDDs, NVMe drives, and other gear Chia can use.

What we do know is that Chia is greener by a few sensible measures, like dollars invested per watt used, or watts used per unit of physical space taken up by the gear. My entire plotting rig of [currently] 160 TB, which takes up an entire tower, uses less electricity than one of my GPUs mining ETH.

One interesting angle is the “overprovisioned space” Bram Cohen and others talk about. If I have a 16 TB HDD and I use 8 TB for my work and 8 TB for plots, just because I have the space and don’t need it yet, is that 5 W an electricity cost of Chia, or am I using overprovisioned space and electricity to do something useful instead of wasting it?

You can’t do that with GPUs mining ETH or ASICs mining BTC. When they are mining, they are burning 100% of that electricity for a single purpose.

1 Like

None of that is relevant; if you are arguing that “there will be more, therefore there will be more”… uh… okay…?

So let’s narrow the argument to plotters only, which is kind of absurd, but OK. A GPU burning away at 400 W+ is hardly equivalent to a 5950X plotting away at 250 W.

Even if you consider plotting as the only valid activity for measurement (???), you’re still saving almost 2x on power. GPUs grinding away on proof-of-work simply burn way more than Chia plotters ever could writing the plots.

Honestly I still don’t see the point of that argument though. Eventually, people have to stop plotting as they run out of disk space. You can’t spend money infinitely on more and more disks.

1 Like

That is a valid point, but I can’t imagine that anywhere near a reasonable share of Chia farms are used like that.

I agree that ASICs are purely a waste of power beyond generating SHA hashes. But GPUs have a LOT of life in them. As long as you treat them properly, they last as long as they are marginally useful; and even if they don’t, it’s not like HDDs are any better, since they don’t last as long.

In addition to what @codinghorror said, you need a motherboard, RAM, power supply, etc., all for one plotter that uses 250 W: a good motherboard, a good amount of RAM, and so on, to match your 5950X.

GPU mining rigs need only a cheap mobo, cheap CPU, and cheap PSU, and can support 6-19 GPUs each burning 100-300 W. My GPU miner motherboard was sub-$100; I got a $150 CPU and an 8 GB stick of no-name RAM.

@Zahk

This goes to the question of whether Chia will be decentralized with few or no major players, or end up like BTC with massive ASIC farms, or land somewhere in the middle. Who knows.

On GPUs: I am looking forward to my RTX 3090 paying itself off, then maybe getting another and exploring AI/ML in the future. However, I can’t use the idle energy on my GPU to mine while I do other tasks; when it mines, it mines, and it hits ~300 W. But I can plot on unused HDD space, and farming doesn’t use any more energy than I already spend on that drive: 5 watts.

1 Like

What I am saying is that it’s a necessary portion of the Chia ecosystem’s power consumption, which lower-bounds the total power consumption.

When do you think people will run out of disk space? In 6 years? When netspace reaches 100 EiB? I’d prefer that you give quantitative arguments.

Also address why you think my quantitative arguments are inaccurate, if at all.