Later this month the LA Board of Water and Power Commissioners is expected to approve a 25-year contract that will serve 7 percent of the city’s electricity demand at 1.997¢/kWh for solar energy and 1.3¢/kWh for power from batteries.
The project is 1 GW of solar, 500 MW of storage. They don’t specify the storage capacity (MWh). The source provides two contradictory statements about its ability to provide a stable supply: (a)
“The solar is inherently variable, and the battery is able to take a portion of that solar from that facility, the portion that’s variable, which is usually the top end of it, take all of that, strip that off and then store it into the battery, so the facility can provide a constant output to the grid”
And (b)
The Eland Project will not rid Los Angeles of natural gas, however. The city will still depend on gas and hydro to supply its overnight power.
Source (2) researches “Levelized cost of energy”, a term they define as
Comparative LCOE analysis for various generation technologies on a $/MWh basis, including sensitivities for U.S. federal tax subsidies, fuel prices, carbon pricing and cost of capital
It looks at the cost of power generation. Nowhere does it state the cost of reaching 90% uptime with renewables + battery. Or 99% uptime with renewables + battery. The document doesn’t mention uptime at all. Only generation, independent of demand.
To the best of my understanding, these sources don’t support the claim that renewables + battery storage are cost-effective technologies for a balanced electric grid.
But then you added the requirement of 90% uptime, which isn’t how a grid works. For example, a coal generator only has 85% uptime, yet your power isn’t out 4 hours a day every day.
Nuclear reactors are out of service every 18-24 months for refueling. Yet you don’t lose power for days, because the plant typically has two reactors and the grid is designed for those outages.
So the only issue is cost per megawatt. You need 2 reactors for nuclear to be reliable. That’s part of the cost. You need extra BESS to be reliable. That’s part of the cost.
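As a rough sketch of why redundancy, rather than per-unit uptime, is what sets grid reliability, here is a minimal calculation (Python). The 85% figure is the coal uptime quoted above; the 90% per-reactor availability and the assumption that outages are independent are illustrative, not sourced:

```python
# Availability of a redundant pair vs. a single unit.
# Assumes outages are independent; real operators do even better by
# deliberately staggering planned outages.

def fleet_availability(unit_availability: float, units: int) -> float:
    """Probability that at least one of `units` identical, independent units is online."""
    return 1 - (1 - unit_availability) ** units

print(f"single 85% unit:  {fleet_availability(0.85, 1):.1%} available")
print(f"two 90% reactors: {fleet_availability(0.90, 2):.1%} available (at least one online)")
```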
But then you added the requirement of 90% uptime, which isn’t how a grid works.
I’m referring to the uptime of the grid. Not an individual power source.
Assume we’ve successfully banned fossil fuels and nuclear, as is the goal of the green parties.
How much renewable production, and BESS, does one need to achieve 90% grid uptime? Or 99% grid uptime?
If you want a balanced grid, you don’t need to just build for the average day (in production and consumption), you need to build for the worst case in both production and consumption.
The worst-case production for renewables is close to zero for days (example). Meaning you need to size storage appropriately, in order to fairly compare to nuclear. And build sufficient production so that a surplus is generated and can be stored.
If we’re fine with a blackout 10% of the time, I can see solar + BESS beating nuclear, price-wise. If the goal instead is a reliable grid, it does not.
As an example: take Belgium. As a result of this same idea (solar/wind is cheap!) we ended up with both (1) higher greenhouse gas emissions and (2) costlier energy generation, as we now heavily rely on gas power generation (previously mainly Russian, now mainly US LNG) to balance the grid. Last winter we even had to use kerosene turbine generation to avoid a blackout.
Yes you have to build for worst case. That’s what I already said.
You are comparing overbuilt nuclear but acting like BESS can’t be overbuilt too. That’s why the cost of storage is the only important metric.
You need an absolute minimum of 2 nuclear reactors to be reliable (Belgium has 7). That doubles the cost of nuclear. But it doesn’t matter because that’s factored in when you look at levelized cost. You look at cost per MWh. How reliability is achieved doesn’t matter.
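For reference, the “cost per MWh” framing is just lifetime cost divided by lifetime energy delivered. A minimal, undiscounted sketch (all numbers are placeholders for illustration, not figures from the Lazard report, which also discounts future costs and output):

```python
# Simplified LCOE: lifetime cost / lifetime energy delivered.
# Placeholder inputs; a real LCOE calculation discounts future cash flows and output.

def simple_lcoe(capex: float, annual_opex: float, lifetime_years: int,
                capacity_mw: float, capacity_factor: float) -> float:
    """Undiscounted $/MWh over the plant's lifetime."""
    lifetime_cost = capex + annual_opex * lifetime_years
    lifetime_mwh = capacity_mw * capacity_factor * 8760 * lifetime_years
    return lifetime_cost / lifetime_mwh

# Hypothetical 1 GW plant: $10B capex, $200M/yr opex, 60-year life, 90% capacity factor
print(simple_lcoe(10e9, 200e6, 60, 1000, 0.90))  # ~46.5 $/MWh
```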
130% production on average, with excess being stored, minus losses in conversions, transport and storage = 100% demand covered all the time.
Or the longer version: For a stable grid I need to cover 100% of the demand in next to real-time. This can be achieved with enough long- and short-term storage, plus some overproduction to account for storage losses. The 115% to 130% production figures (compared to actual demand) are based on studies for Germany and vary by scenario, with the higher number for the worst case (people strongly resisting all changes to better balance consumption, and southern Germany keeping up its resistance to diversification by building only solar while blocking wind power).
The question now is: How much storage do I need? And that answer varies by a much greater amount depending on the scenario (for example, between 50 and 120 GW of electrolysis capacity for long-term storage, or battery storage between 50 GWh and 200 GWh).
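A minimal sketch of the energy balance behind those numbers (Python). The flat demand profile, the 1.3x overproduction factor and the 85% round-trip efficiency are illustrative assumptions, not figures from the German studies:

```python
# Toy hourly energy balance: does overproduction plus storage cover demand?

def fraction_of_hours_covered(demand, production, storage_capacity_mwh, efficiency=0.85):
    """Walk hour by hour and report the fraction of hours in which demand
    is met from production plus storage (charging losses applied on the way in)."""
    stored, hours_met = 0.0, 0
    for d, p in zip(demand, production):
        if p >= d:
            stored = min(storage_capacity_mwh, stored + (p - d) * efficiency)
            hours_met += 1
        elif stored >= d - p:
            stored -= d - p
            hours_met += 1
    return hours_met / len(demand)

demand = [1000] * 24                  # flat 1 GW demand, in MW
solar  = [2600] * 12 + [0] * 12       # 1.3x the daily demand, delivered in 12 daylight hours
print(fraction_of_hours_covered(demand, solar, storage_capacity_mwh=15000))  # 1.0
```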
Uptime is calculated by kWh, i.e. how many kilowatts of power you can produce for how many hours.
So it’s flexible. If you have 4kw of battery, you can produce 1kw for 4hrs, or 2kw for 2hrs, 4kw for 1hr, etc.
Nuclear is steady state. If the reactor can generate 1gw, it can only generate 1gw, but for 24hrs.
So to match a 1gw nuclear plant, you need around 12gw of storage, and 13gw of production.
This has come up before. See this comment where I break down the most recent utility-scale nuclear and solar deployments in the US. The commenter above is right, and that doesn’t take into account the huge strides in solar and battery tech we are currently making.
The 2 most recent reactors built in the US, the Vogtle reactors 3 and 4 in Georgia, took 14 years at 34 billion dollars. They produce 2.4GW of power together.
So each 1.2GW reactor works out to be 17bil. Time to build still looks like 14 years, as both were started in the same time frame, and only one is fully online now, but we will give it a pass. You could argue it took 18 years, as that’s when the first proposals for the plants were formally submitted, but I only took into account financing/build time, so let’s stick with 14.
For 17bil in nuclear, you get 1.2GW production and 1.2GW “storage” for 24hrs.
So for 17bil in solar/battery, you get 4.8GW production, and 2.85gw storage for 4hrs. Having that huge storage in batteries is more flexible than nuclear, so you can provide that 2.85gw for 4 hr, or 1.425 for 8hrs, or 712MW for 16hrs. If we are kind to solar and say the sun is down for 12hrs out of every 24, that means the storage lines up with nuclear.
The solar also goes up much, much faster. I don’t think a 7.5x larger solar array will take 7.5x longer to build, as it’s mostly parallel action. I would expect maybe 6 years instead of 2.
So, worst case, instead of nuclear, for the same cost you can build solar + battery farms that produce 4x the power, have the same steady baseline power as nuclear, and take half as long to build.
Uptime is calculated by kWh, i.e. how many kilowatts of power you can produce for how many hours.
That’s stored energy. For example: a 5 MWh battery can provide 5 hours of power at 1MW. It can provide 2 hours of power, at 2.5MW. It can provide 1 hour of power, at 5MW.
The max amount of power a battery can deliver (MW), and the max amount of storage (MWh), are independent characteristics. The first is usually limited by cooling and transformer physics. The latter usually by the amount of lithium/zinc/redox chemistry of choice.
What uptime refers to is: for how many hours a year supply matches or exceeds demand, out of the total number of hours in the year.
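A small sketch of both points: power (MW) versus energy (MWh), and uptime as the fraction of hours in which supply meets demand (the 5 MWh example is the one above; the hourly series at the end is made up):

```python
# MW vs. MWh, and uptime as the share of hours in which supply covers demand.

def discharge_hours(capacity_mwh: float, power_mw: float) -> float:
    """How long a battery of a given energy capacity lasts at a given power."""
    return capacity_mwh / power_mw

print(discharge_hours(5, 1.0))   # 5.0 h at 1 MW
print(discharge_hours(5, 2.5))   # 2.0 h at 2.5 MW
print(discharge_hours(5, 5.0))   # 1.0 h at 5 MW

def uptime(supply_mw, demand_mw):
    """Fraction of hours in which supply matches or exceeds demand."""
    met = sum(1 for s, d in zip(supply_mw, demand_mw) if s >= d)
    return met / len(demand_mw)

print(uptime([5, 5, 2, 0], [3, 3, 3, 3]))  # 0.5: demand met in 2 of 4 hours
```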
So to match a 1gw nuclear plant, you need around 12gw of storage, and 13gw of production.
This is incorrect, even under the assumption that nuclear plants are steady state (which they aren’t).
To match a 1GW nuclear plant, for one day, you need a fully charged 1GW battery, with a capacity of 24GWh.
Are you sure you understand the difference between W and Wh?
My math assumes the sun shines for 12 hours/day, so you don’t need 24 hours of storage since you produce power for 12 of them.
My math is drastically off though. I ignored the 12-hour timeline when talking about generation.
Assuming that 12 hours of sun, you just need 2Gw solar production and 12Gw of battery to supply 1Gw during the day of solar, and 1Gw during the night of solar, to match a 1Gw nuclear plants output and “storage.”
Seeing as those recent projects put that nuclear output at 17 bil dollars and a 14 year build timeline, and they put the solar equivalent at roughly 14 billion (2 billion for solar and 12 billion for storage) with a 2-6 year build timeline, nuclear cannot compete with current solar/battery tech, much less advancing solar/battery tech.
Assuming that 12 hours of sun, you just need 2Gw solar production and 12Gw of battery to supply 1Gw during the day of solar, and 1Gw during the night of solar,
Again, I think you might not understand the difference between W and Wh. W is a unit of power; Wh is a unit of energy (the SI unit of energy is the joule).
When describing a battery, you need to specify both W and Wh. It makes no sense to build a 12 GW battery if you only ever need 1 GW of output.
If you want more exact details about the batteries that array used, click on the link in my comment.
The array has a 380 MW / 1.4 GWh battery and 690 MW of solar production for 1.9 billion dollars. Splitting that roughly evenly into 1 billion for the solar and 1 billion for the battery, we get 2.1 GW of solar for 3 billion, and 12.6 GWh of battery for 9 billion.
So actually, the solar array can match the nuclear output for 12 billion, assuming 12 hours of sun.
For 17 billion, we can get 3.3 GW of generation and 15.6 GWh of battery. That means the battery array would charge in 7-8 hrs of sun and provide nearly 16 hrs of output at 1 GW, putting us at a viable array with just 8 hrs of sun.
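A sketch of that scaling argument (Python). The per-dollar figures (roughly 690 MW of solar or 1.4 GWh of battery per billion dollars, from the 50/50 split of the $1.9B project above) and the 12 hours of sun are the thread’s assumptions; round-trip losses are ignored:

```python
# Scale the quoted project costs to replace a 1 GW plant around the clock.

SOLAR_MW_PER_BILLION = 690      # ~$1B of the quoted project buys ~690 MW of solar
BATTERY_MWH_PER_BILLION = 1400  # ~$1B buys ~1.4 GWh of battery storage

def cost_to_match_nuclear(target_gw: float = 1.0, sun_hours: float = 12) -> float:
    """Rough $B to serve target_gw continuously with solar plus batteries."""
    night_mwh = target_gw * 1000 * (24 - sun_hours)       # energy the battery must carry overnight
    battery_cost = night_mwh / BATTERY_MWH_PER_BILLION    # ~$8.6B for 12 GWh
    solar_mw = target_gw * 1000 + night_mwh / sun_hours   # daytime load plus recharging: 2,000 MW
    solar_cost = solar_mw / SOLAR_MW_PER_BILLION          # ~$2.9B
    return solar_cost + battery_cost

print(f"~${cost_to_match_nuclear():.1f}B to match 1 GW around the clock")  # ~$11.5B
```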
Can solar + battery tech do what nuclear does today, but much faster, likely cheaper and with mostly no downsides? That is a clear yes. Is battery and solar tech advancing at an exponential rate while nuclear tech is not? Also a clear yes.
Nuclear was the right answer 30 years ago. Solar + battery is the right answer now.
That means the battery array would charge in 7-8 hrs of sun and provide nearly 16 hrs of output at 1 GW
How many days a year does that occur? How much additional storage and production do you need to add, to be able to bridge Dunkelflauten, as is currently happening in Germany, for example (1)?
That’s why I mentioned the 90%, 99%, etc. If you want a balanced grid, you don’t need to just build for the average day (in production and consumption), you need to build for the worst case in both production and consumption.
The worst-case production for renewables is close to zero for days on end. Meaning you need to size storage appropriately, in order to fairly compare to nuclear.
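A sketch of that sizing question: how much storage it takes to ride through several days of near-zero renewable output (the 60 GW average demand, the 10% residual output and the number of dark days are illustrative assumptions, not figures from a study):

```python
# Storage needed to bridge a multi-day lull ("Dunkelflaute") in renewable output.

def bridge_storage_gwh(avg_demand_gw: float, dark_days: float,
                       residual_output_fraction: float = 0.1) -> float:
    """Energy storage must supply while renewables deliver only a small residual."""
    shortfall_gw = avg_demand_gw * (1 - residual_output_fraction)
    return shortfall_gw * 24 * dark_days

for days in (1, 3, 7):
    print(f"{days} dark day(s): {bridge_storage_gwh(60, days):,.0f} GWh of storage")
# 1 -> 1,296 GWh, 3 -> 3,888 GWh, 7 -> 9,072 GWh
```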
So you agree that solar + battery resolves 90-99% of power needs now at a drastically reduced cost and build time than nuclear today?
I expect that 10% will get much closer to 1% in the next decade with all the versatile battery/solar tech coming onboard, but to compensate for solar fluctuations, you use wind, you use hydro, and you use the new “dig anywhere” steady state geothermal that is also being brought online today. We can run more HVDC lines to connect various parts of the country also. We are working on some now, but not enough. With a robust transmission system, solar gets 3hrs of “free” storage across our time zones. With better national connections, power flows from excess to where it’s needed, instead of being forced to be regional.
Worst case? You burn green hydrogen you made with your excess solar capacity in retrofitted natgas plants.
There are lots of answers to steady-state that are green and won’t take 15-20 years to come online like the next nuclear plant. We should keep going with them, because they can help us now and in the future.
But, as often happens, the last 10% is as hard as or harder than the first 90%. The law of diminishing returns.
There are lots of answers to steady-state that are green and won’t take 15 years
I’m aware of and have studied them. But the general public seems to greatly underestimate the scale of storage that’s needed. Germany, for example, consumes about 1.4 TWh of electrical energy a day. That’s more than the world’s current yearly battery production, and a full year of that production would not suffice to power Germany for one day.
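For scale, the back-of-the-envelope comparison looks like this (the ~1.4 TWh/day figure is the one quoted above; the global annual cell-production number is a placeholder assumption to show the arithmetic, not a sourced figure):

```python
# How many days of German electricity demand would one year of global
# battery cell production cover? Both inputs are as stated in the lead-in.

GERMANY_TWH_PER_DAY = 1.4                     # quoted above
GLOBAL_CELL_PRODUCTION_TWH_PER_YEAR = 1.0     # placeholder assumption

days_covered = GLOBAL_CELL_PRODUCTION_TWH_PER_YEAR / GERMANY_TWH_PER_DAY
print(f"One year of cell production covers ~{days_covered:.1f} days of German demand")
# ~0.7 days under these assumptions
```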
Pumped storage, if geology allows for it, seems like the only possible technology for sufficient storage.
Demand side reduction is possible as well, but that’s simply a controlled grayout. The implications for a society are huge. Ask any Cuban or South African.
Others, like lithium-ion batteries, green hydrogen, salt batteries, ammonia generation, … have been promised for decades now. Whilst the principle is there and they do store power, it simply does not scale to grid-scale needs.
The sad part is that it sets a trap, like the one we in the EU have fallen into. You get far along the way, pat yourself on the back with “this windmill powers 1,000 households” style faulty thinking. But as you can’t bridge the last gap, your reliance on fossil fuels, and total emissions, increases.
Would love to see a source for that claim. How many 9’s of uptime do they target? 90%? 99%?
This is old news now! Here’s a link from 5 years ago. https://www.forbes.com/sites/jeffmcmahon/2019/07/01/new-solar--battery-price-crushes-fossil-fuels-buries-nuclear/
This is from last year: https://www.lazard.com/research-insights/2023-levelized-cost-of-energyplus/
As to uptime, they have the same legal requirements as all utilities.
I was pro nuke until finding out solar plus grid battery was cheaper.
Source (1)
The project is 1 GW of solar, 500 MW of storage. They don’t specify the storage capacity (MWh). The source provides two contradictory statements about its ability to provide a stable supply: (a)
And (b)
Source (2) researches “Levelized cost of energy”, a term they define as
It looks at the cost of power generation. Nowhere does it state the cost of reaching 90% uptime with renewables + battery. Or 99% uptime with renewables + battery. The document doesn’t mention uptime at all. Only generation, independent of demand.
To the best of my understanding, these sources don’t support the claim that renewables + battery storage are cost-effective technologies for a balanced electric grid.
Yes.
But then you added the requirement of 90% uptime, which isn’t how a grid works. For example, a coal generator only has 85% uptime, yet your power isn’t out 4 hours a day every day.
Nuclear reactors are out of service every 18-24 months for refueling. Yet you don’t lose power for days, because the plant typically has two reactors and the grid is designed for those outages.
So the only issue is cost per megawatt. You need 2 reactors for nuclear to be reliable. That’s part of the cost. You need extra BESS to be reliable. That’s part of the cost.
I’m referring to the uptime of the grid. Not an individual power source.
Assume we’ve successfully banned fossil fuels and nuclear, as is the goal of the green parties.
How much renewable production, and BESS, does one need to achieve 90% grid uptime? Or 99% grid uptime?
If you want a balanced grid, you don’t need to just build for the average day (in production and consumption), you need to build for the worst case in both production and consumption.
The worst-case production for renewables is close to zero for days (example). Meaning you need to size storage appropriately, in order to fairly compare to nuclear. And build sufficient production so that a surplus is generated and can be stored.
If we’re fine with a blackout 10% of the time, I can see solar + BESS beating nuclear, price-wise. If the goal instead is a reliable grid, it does not.
As an example: take Belgium. As a result of this same idea (solar/wind is cheap!) we ended up with both (1) higher greenhouse gas emissions and (2) costlier energy generation, as we now heavily rely on gas power generation (previously mainly Russian, now mainly US LNG) to balance the grid. Last winter we even had to use kerosene turbine generation to avoid a blackout.
Yes you have to build for worst case. That’s what I already said.
You are comparing overbuilt nuclear but acting like BESS can’t be overbuilt too. That’s why the cost of storage is the only important metric.
You need an absolute minimum of 2 nuclear reactors to be reliable (Belgium has 7). That doubles the cost of nuclear. But it doesn’t matter because that’s factored in when you look at levelized cost. You look at cost per MWh. How reliability is achieved doesn’t matter.
BESS is $200 per MWh.
About 115% to 130%, depending on diversification of renewable sources and locations. The remainder covers losses in storage and transport, obviously.
But shouldn’t your actual question be: How much storage is needed?
For a quick summary of those questions you can look here for example…
What would 130% grid uptime even look like? 475 days a year without blackout?
I think we’re talking about different things.
130% production on average, with excess being stored, minus losses in conversions, transport and storage = 100% demand covered all the time.
Or the longer version: For a stable grid I need to cover 100% of the demand in next to real-time. This can be achieved with enough long- and short-term storage, plus some overproduction to account for storage losses. The 115% to 130% production figures (compared to actual demand) are based on studies for Germany and vary by scenario, with the higher number for the worst case (people strongly resisting all changes to better balance consumption, and southern Germany keeping up its resistance to diversification by building only solar while blocking wind power).
The question now is: How much storage do I need? And that answer varies by a much greater amount depending on the scenario (for example, between 50 and 120 GW of electrolysis capacity for long-term storage, or battery storage between 50 GWh and 200 GWh).
Uptime is calculated by kWh, i.e. how many kilowatts of power you can produce for how many hours.
So it’s flexible. If you have 4kw of battery, you can produce 1kw for 4hrs, or 2kw for 2hrs, 4kw for 1hr, etc.
Nuclear is steady state. If the reactor can generate 1gw, it can only generate 1gw, but for 24hrs.
So to match a 1gw nuclear plant, you need around 12gw of storage, and 13gw of production.
This has come up before. See this comment where I break down the most recent utility-scale nuclear and solar deployments in the US. The commenter above is right, and that doesn’t take into account the huge strides in solar and battery tech we are currently making.
That’s stored energy. For example: a 5 MWh battery can provide 5 hours of power at 1MW. It can provide 2 hours of power, at 2.5MW. It can provide 1 hour of power, at 5MW.
The max amount of power a battery can deliver (MW), and the max amount of storage (MWh), are independent characteristics. The first is usually limited by cooling and transformer physics. The latter usually by the amount of lithium/zinc/redox chemistry of choice.
What uptime refers to is: for how many hours a year supply matches or exceeds demand, out of the total number of hours in the year.
This is incorrect, even under the assumption that nuclear plants are steady state (which they aren’t).
To match a 1GW nuclear plant, for one day, you need a fully charged 1GW battery, with a capacity of 24GWh.
Are you sure you understand the difference between W and Wh?
My math assumes the sun shines for 12 hours/day, so you don’t need 24 hours of storage since you produce power for 12 of them.
My math is drastically off though. I ignored the 12-hour timeline when talking about generation.
Assuming that 12 hours of sun, you just need 2Gw solar production and 12Gw of battery to supply 1Gw during the day of solar, and 1Gw during the night of solar, to match a 1Gw nuclear plants output and “storage.”
Seeing as those recent projects put that nuclear output at 17 bil dollars and a 14 year build timeline, and they put the solar equivalent at roughly 14 billion (2 billion for solar and 12 billion for storage) with a 2-6 year build timeline, nuclear cannot compete with current solar/battery tech, much less advancing solar/battery tech.
Again, I think you might not understand the difference between W and Wh. W is a unit of power; Wh is a unit of energy (the SI unit of energy is the joule).
When describing a battery, you need to specify both W and Wh. It makes no sense to build a 12 GW battery if you only ever need 1 GW of output.
If you want more exact details about the batteries that array used, click on the link in my comment.
The array has a 380 MW / 1.4 GWh battery and 690 MW of solar production for 1.9 billion dollars. Splitting that roughly evenly into 1 billion for the solar and 1 billion for the battery, we get 2.1 GW of solar for 3 billion, and 12.6 GWh of battery for 9 billion.
So actually, the solar array can match the nuclear output for 12 billion, assuming 12 hours of sun.
For 17 billion, we can get 3.3 GW of generation and 15.6 GWh of battery. That means the battery array would charge in 7-8 hrs of sun and provide nearly 16 hrs of output at 1 GW, putting us at a viable array with just 8 hrs of sun.
Can solar + battery tech do what nuclear does today, but much faster, likely cheaper and with mostly no downsides? That is a clear yes. Is battery and solar tech advancing at an exponential rate while nuclear tech is not? Also a clear yes.
Nuclear was the right answer 30 years ago. Solar + battery is the right answer now.
How many days a year does that occur? How much additional storage and production do you need to add, to be able to bridge Dunkelflauten, as is currently happening in Germany, for example (1)?
That’s why I mentioned the 90%, 99%, etc. If you want a balanced grid, you don’t need to just build for the average day (in production and consumption), you need to build for the worst case in both production and consumption.
The worst-case production for renewables is close to zero for days on end. Meaning you need to size storage appropriately, in order to fairly compare to nuclear.
So you agree that solar + battery resolves 90-99% of power needs now at a drastically reduced cost and build time than nuclear today?
I expect that 10% will get much closer to 1% in the next decade with all the versatile battery/solar tech coming onboard, but to compensate for solar fluctuations, you use wind, you use hydro, and you use the new “dig anywhere” steady state geothermal that is also being brought online today. We can run more HVDC lines to connect various parts of the country also. We are working on some now, but not enough. With a robust transmission system, solar gets 3hrs of “free” storage across our time zones. With better national connections, power flows from excess to where it’s needed, instead of being forced to be regional.
Worst case? You burn green hydrogen you made with your excess solar capacity in retrofitted natgas plants.
There are lots of answers to steady-state that are green and won’t take 15-20 years to come online like the next nuclear plant. We should keep going with them, because they can help us now and in the future.
I’m saying you can get to 90%, yes.
But, as often happens, the last 10% is as hard as or harder than the first 90%. The law of diminishing returns.
I’m aware of and have studied them. But the general public seems to greatly underestimate the scale of storage that’s needed. Germany, for example, consumes about 1.4 TWh of electrical energy a day. That’s more than the world’s current yearly battery production, and a full year of that production would not suffice to power Germany for one day.
Pumped storage, if geology allows for it, seems like the only possible technology for sufficient storage.
Demand side reduction is possible as well, but that’s simply a controlled grayout. The implications for a society are huge. Ask any Cuban or South African.
Others, like lithium-ion batteries, green hydrogen, salt batteries, ammonia generation, … have been promised for decades now. Whilst the principle is there and they do store power, it simply does not scale to grid-scale needs.
The sad part is that it sets a trap, like the one we in the EU have fallen into. You get far along the way, pat yourself on the back with “this windmill powers 1,000 households” style faulty thinking. But as you can’t bridge the last gap, your reliance on fossil fuels, and total emissions, increases.