Capacity Expansion Planning with PyPSA
Note
As in previous tutorials, you may want to refer to the PyPSA documentation: https://pypsa.readthedocs.io.
From electricity market modelling to capacity expansion planning
Review the problem formulation of the electricity market model. Below you can find an adapted version in which the capacity limits have been promoted to decision variables, with corresponding terms in the objective function and new constraints for their expansion limits (e.g. wind and solar potentials). This is known as a capacity expansion problem.
\[
\min_{G,\, F,\, E,\, g} \quad \sum_{i,s} c_{i,s} G_{i,s}
+ \sum_{\ell} c_{\ell} F_{\ell}
+ \sum_{i,r} \left( c_{i,r,\text{charge}} G_{i,r,\text{charge}} + c_{i,r,\text{discharge}} G_{i,r,\text{discharge}} + c_{i,r} E_{i,r} \right)
+ \sum_{i,s,t} w_t \, o_{i,s} \, g_{i,s,t}
\]

such that the operational constraints of the market model hold with the capacities now as variables (e.g. \(0 \leq g_{i,s,t} \leq \hat{g}_{i,s,t} G_{i,s}\)) and, in addition, the capacities stay within their expansion limits:

\[
\underline{G}_{i,s} \leq G_{i,s} \leq \overline{G}_{i,s}, \qquad
\underline{F}_{\ell} \leq F_{\ell} \leq \overline{F}_{\ell}, \qquad
\underline{E}_{i,r} \leq E_{i,r} \leq \overline{E}_{i,r}
\]
New decision variables for capacity expansion planning:
\(G_{i,s}\) is the generator capacity at bus \(i\), technology \(s\),
\(F_{\ell}\) is the transmission capacity of line \(\ell\),
\(G_{i,r,\text{dis-/charge}}\) denotes the charge and discharge capacities of storage unit \(r\) at bus \(i\),
\(E_{i,r}\) is the energy capacity of storage \(r\) at bus \(i\).
New parameters for capacity expansion planning:
\(c_{\star}\) is the capital cost of technology \(\star\) at bus \(i\)
\(w_t\) is the weighting of time step \(t\) (e.g. number of hours it represents)
\(\underline{G}_\star, \underline{F}_\star, \underline{E}_\star\) are the minimum capacities per technology and location/connection.
\(\overline{G}_\star, \overline{F}_\star, \overline{E}_\star\) are the maximum capacities per technology and location/connection.
Note
For a full reference to the optimisation problem description, see https://pypsa.readthedocs.io/en/latest/optimal_power_flow.html
Note
If you have not yet set up Python on your computer, you can execute this tutorial in your browser via Google Colab. Click on the rocket in the top right corner and launch “Colab”. If that doesn’t work, download the .ipynb file and import it into Google Colab.
Then install the required packages by executing the following command in a Jupyter cell at the top of the notebook.
!pip install pypsa pandas matplotlib highspy
First things first! We need a few packages for this tutorial:
import pypsa
import pandas as pd
import matplotlib.pyplot as plt
plt.style.use("bmh")
Prerequisites: handling technology data and costs
We maintain a database (PyPSA/technology-data) which collects assumptions and projections for energy system technologies (such as costs, efficiencies, lifetimes, etc.) for given years, which we can load into a pandas.DataFrame. This requires some pre-processing (e.g. converting units, setting defaults, re-arranging dimensions):
year = 2030
url = f"https://raw.githubusercontent.com/PyPSA/technology-data/master/outputs/costs_{year}.csv"
costs = pd.read_csv(url, index_col=[0, 1])
# convert values given per kW to per MW
costs.loc[costs.unit.str.contains("/kW"), "value"] *= 1e3
costs.unit = costs.unit.str.replace("/kW", "/MW")
defaults = {
    "FOM": 0,
    "VOM": 0,
    "efficiency": 1,
    "fuel": 0,
    "investment": 0,
    "lifetime": 25,
    "CO2 intensity": 0,
    "discount rate": 0.07,
}
costs = costs.value.unstack().fillna(defaults)
# OCGT and CCGT burn gas: inherit fuel cost and CO2 intensity from gas
costs.at["OCGT", "fuel"] = costs.at["gas", "fuel"]
costs.at["CCGT", "fuel"] = costs.at["gas", "fuel"]
costs.at["OCGT", "CO2 intensity"] = costs.at["gas", "CO2 intensity"]
costs.at["CCGT", "CO2 intensity"] = costs.at["gas", "CO2 intensity"]
Let’s also write a small utility function that calculates the annuity factor to annualise investment costs. The formula is

\[
a(r, n) = \frac{r}{1 - (1+r)^{-n}}
\]

where \(r\) is the discount rate and \(n\) is the lifetime.
def annuity(r, n):
    return r / (1.0 - 1.0 / (1.0 + r) ** n)
annuity(0.07, 20)
0.09439292574325567
Based on this, we can calculate the short-term marginal generation costs (STMGC, €/MWh):
costs["marginal_cost"] = costs["VOM"] + costs["fuel"] / costs["efficiency"]
and the annualised investment costs (capital_cost in PyPSA terms, €/MW/a):
annuities = costs.apply(lambda x: annuity(x["discount rate"], x["lifetime"]), axis=1)
costs["capital_cost"] = (annuities + costs["FOM"] / 100) * costs["investment"]
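As a quick sanity check of these two formulas, here is the same arithmetic on assumed toy numbers (7% discount rate, 25-year lifetime, 1 M€/MW investment, 2% FOM, 4.5 €/MWh VOM, 20 €/MWh fuel at 41% efficiency — illustrative values only, not the database entries):

```python
def annuity(r, n):
    # annuity factor: converts a one-off investment into equal annual payments
    return r / (1.0 - 1.0 / (1.0 + r) ** n)

# annualised capital cost in €/MW/a (toy numbers)
a = annuity(0.07, 25)
capital_cost = (a + 0.02) * 1e6  # annuity plus 2% FOM, times investment

# short-term marginal generation cost in €/MWh_el (toy numbers)
marginal_cost = 4.5 + 20.0 / 0.41  # VOM plus fuel cost scaled by efficiency
```

With these numbers, the annualised capital cost comes out at roughly 106 k€/MW/a and the marginal cost at roughly 53 €/MWh.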
Loading time series data
We are also going to need some time series for wind, solar and load. For now, we are going to recycle the time series we used at the beginning of the course. They are given for Germany in the year 2015.
url = (
    "https://tubcloud.tu-berlin.de/s/pKttFadrbTKSJKF/download/time-series-lecture-2.csv"
)
ts = pd.read_csv(url, index_col=0, parse_dates=True)
ts.head(3)
|  | load | onwind | offwind | solar | prices |
| --- | --- | --- | --- | --- | --- |
| 2015-01-01 00:00:00 | 41.151 | 0.1566 | 0.7030 | 0.0 | NaN |
| 2015-01-01 01:00:00 | 40.135 | 0.1659 | 0.6875 | 0.0 | NaN |
| 2015-01-01 02:00:00 | 39.106 | 0.1746 | 0.6535 | 0.0 | NaN |
Let’s convert the load time series from GW to MW, the base unit of PyPSA:
ts.load *= 1e3
We are also going to adapt the temporal resolution of the time series, e.g. keep only every fourth hour, to save some computation time:
resolution = 4
ts = ts.resample(f"{resolution}h").first()
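To see what the resampling does, here is a minimal sketch on a toy hourly series (toy data, not the tutorial’s time series): `.resample("4h").first()` keeps the first value of each 4-hour block.

```python
import pandas as pd

# toy hourly series: values 0..7 over eight hours
s = pd.Series(range(8), index=pd.date_range("2015-01-01", periods=8, freq="h"))

# keep the first value of each 4-hour block, i.e. every fourth hour
s4 = s.resample("4h").first()
```

The result has two entries, the values at 00:00 and 04:00.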
Simple capacity expansion planning example
Note
See also https://model.energy.
In this tutorial, we want to build a replica of model.energy. This tool calculates the cost of meeting a constant electricity demand from a combination of wind power, solar power and storage for different regions of the world.
We deviate from model.energy by including offshore wind generation and electricity demand profiles rather than a constant electricity demand. Also, we are going to start with Germany only. You can adapt the code to other countries as an exercise.
Model Initialisation
For building the model, we start again by initialising an empty network.
n = pypsa.Network()
Then, we add a single bus…
n.add("Bus", "electricity")
…and tell the pypsa.Network object n what the snapshots of the model will be, using the utility function n.set_snapshots().
n.set_snapshots(ts.index)
n.snapshots
DatetimeIndex(['2015-01-01 00:00:00', '2015-01-01 04:00:00',
'2015-01-01 08:00:00', '2015-01-01 12:00:00',
'2015-01-01 16:00:00', '2015-01-01 20:00:00',
'2015-01-02 00:00:00', '2015-01-02 04:00:00',
'2015-01-02 08:00:00', '2015-01-02 12:00:00',
...
'2015-12-30 08:00:00', '2015-12-30 12:00:00',
'2015-12-30 16:00:00', '2015-12-30 20:00:00',
'2015-12-31 00:00:00', '2015-12-31 04:00:00',
'2015-12-31 08:00:00', '2015-12-31 12:00:00',
'2015-12-31 16:00:00', '2015-12-31 20:00:00'],
dtype='datetime64[ns]', name='snapshot', length=2190, freq='4h')
The weighting of the snapshots (e.g. how many hours they represent, see \(w_t\) in the problem formulation above) can be set in n.snapshot_weightings.
n.snapshot_weightings.head(3)
| snapshot | objective | stores | generators |
| --- | --- | --- | --- |
| 2015-01-01 00:00:00 | 1.0 | 1.0 | 1.0 |
| 2015-01-01 04:00:00 | 1.0 | 1.0 | 1.0 |
| 2015-01-01 08:00:00 | 1.0 | 1.0 | 1.0 |
n.snapshot_weightings.loc[:, :] = resolution
n.snapshot_weightings.head(3)
| snapshot | objective | stores | generators |
| --- | --- | --- | --- |
| 2015-01-01 00:00:00 | 4.0 | 4.0 | 4.0 |
| 2015-01-01 04:00:00 | 4.0 | 4.0 | 4.0 |
| 2015-01-01 08:00:00 | 4.0 | 4.0 | 4.0 |
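The role of the weighting \(w_t\) can be sketched with toy numbers (illustrative values, not taken from the model): each snapshot contributes its marginal cost times its weighted dispatch to the objective.

```python
import pandas as pd

idx = pd.date_range("2015-01-01", periods=3, freq="4h")
w = pd.Series(4.0, index=idx)                 # snapshot weightings in hours
g = pd.Series([50.0, 60.0, 55.0], index=idx)  # dispatch in MW (toy values)
o = 65.0                                      # marginal cost in €/MWh (toy value)

# each snapshot contributes w_t * o * g_t to the operating cost
opex = (w * g * o).sum()
```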
Adding Components
Then, we add all the technologies we are going to include as carriers.
carriers = [
    "onwind",
    "offwind",
    "solar",
    "OCGT",
    "hydrogen storage underground",
    "battery storage",
]
n.madd(
    "Carrier",
    carriers,
    color=["dodgerblue", "aquamarine", "gold", "indianred", "magenta", "yellowgreen"],
    co2_emissions=[costs.at[c, "CO2 intensity"] for c in carriers],
)
Index(['onwind', 'offwind', 'solar', 'OCGT', 'hydrogen storage underground',
'battery storage'],
dtype='object')
Next, we add the demand time series to the model.
n.add(
    "Load",
    "demand",
    bus="electricity",
    p_set=ts.load,
)
Let’s check whether the data was read in correctly.
n.loads_t.p_set.plot(figsize=(6, 2), ylabel="MW")
<Axes: xlabel='snapshot', ylabel='MW'>
We are going to add one dispatchable generation technology to the model. This is an open-cycle gas turbine (OCGT) with CO\(_2\) emissions of 0.2 t/MWh\(_{th}\).
n.add(
    "Generator",
    "OCGT",
    bus="electricity",
    carrier="OCGT",
    capital_cost=costs.at["OCGT", "capital_cost"],
    marginal_cost=costs.at["OCGT", "marginal_cost"],
    efficiency=costs.at["OCGT", "efficiency"],
    p_nom_extendable=True,
)
Adding the variable renewable generators works almost identically, but we also need to supply the capacity factors to the model via the attribute p_max_pu.
for tech in ["onwind", "offwind", "solar"]:
    n.add(
        "Generator",
        tech,
        bus="electricity",
        carrier=tech,
        p_max_pu=ts[tech],
        capital_cost=costs.at[tech, "capital_cost"],
        marginal_cost=costs.at[tech, "marginal_cost"],
        efficiency=costs.at[tech, "efficiency"],
        p_nom_extendable=True,
    )
So let’s make sure the capacity factors were read in correctly.
n.generators_t.p_max_pu.loc["2015-03"].plot(figsize=(6, 2), ylabel="CF")
<Axes: xlabel='snapshot', ylabel='CF'>
Model Run
Then, we can already solve the model for the first time. At this stage, the model does not have any storage or emission limits implemented. It’s going to look for the least-cost combination of variable renewables and the gas turbine to supply demand.
n.optimize(solver_name="highs")
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io: Writing time: 0.11s
INFO:linopy.solvers:Log file at /tmp/highs.log.
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 8764 primals, 19714 duals
Objective: 3.16e+10
Solver model: available
Solver message: optimal
/opt/hostedtoolcache/Python/3.11.9/x64/lib/python3.11/site-packages/pypsa/optimization/optimize.py:357: FutureWarning:
A value is trying to be set on a copy of a DataFrame or Series through chained assignment using an inplace method.
The behavior will change in pandas 3.0. This inplace method will never work because the intermediate object on which we are setting values always behaves as a copy.
For example, when doing 'df[col].method(value, inplace=True)', try using 'df.method({col: value}, inplace=True)' or df[col] = df[col].method(value) instead, to perform the operation inplace on the original object.
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper were not assigned to the network.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
9900 rows, 7714 cols, 23130 nonzeros
9900 rows, 7714 cols, 23130 nonzeros
Presolve : Reductions: rows 9900(-9814); columns 7714(-1050); elements 23130(-10864)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.25606e+09) 0s
7615 3.1624531941e+10 Pr: 0(0) 0s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 7615
Objective value : 3.1624531941e+10
HiGHS run time : 0.15
('ok', 'optimal')
Model Evaluation
The total system cost in billion Euros per year:
n.objective / 1e9
31.62453194099228
The optimised capacities in GW:
n.generators.p_nom_opt.div(1e3) # GW
Generator
OCGT 70.096357
onwind -0.000000
offwind 49.326062
solar 85.967820
Name: p_nom_opt, dtype: float64
The total energy generation by technology in TWh:
n.snapshot_weightings.generators @ n.generators_t.p.div(1e6) # TWh
Generator
OCGT 235.778879
onwind 0.000000
offwind 150.379873
solar 92.766988
Name: generators, dtype: float64
While we get the objective value through n.objective, in many cases we want to know how the costs are distributed across the technologies. We can use the statistics module for this:
(n.statistics.capex() + n.statistics.opex()).div(1e6)
component carrier
Generator OCGT 18596.014468
offwind 8613.359135
onwind NaN
solar 4415.158338
dtype: float64
We might also be interested in the total emissions (in Mt):
emissions = (
n.generators_t.p
/ n.generators.efficiency
* n.generators.carrier.map(n.carriers.co2_emissions)
) # t/h
n.snapshot_weightings.generators @ emissions.sum(axis=1).div(1e6) # Mt
113.86394645337052
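The emissions arithmetic above, spelled out on toy numbers (assumed values, not the model’s results): the electricity output is divided by the efficiency to get the fuel burned, which is then multiplied by the fuel’s CO\(_2\) intensity.

```python
# toy values: 10 MWh of electricity from a gas turbine with 41% efficiency
# and a fuel CO2 intensity of 0.2 t/MWh_th
p_el = 10.0           # MWh electric
efficiency = 0.41
co2_intensity = 0.2   # t CO2 per MWh thermal

fuel_burned = p_el / efficiency          # MWh thermal
emissions = fuel_burned * co2_intensity  # tonnes of CO2
```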
Plotting Optimal Dispatch
We define a function that takes the network object n as an argument and, optionally, a time frame. It plots the load time series together with stacked area charts for electricity feed-in and storage charging, with technologies coloured by the color defined in n.carriers.
def plot_dispatch(n, time="2015-07"):
    # stack generator dispatch by carrier (GW)
    p_by_carrier = n.generators_t.p.T.groupby(n.generators.carrier).sum().T.div(1e3)

    if not n.storage_units.empty:
        sto = n.storage_units_t.p.T.groupby(n.storage_units.carrier).sum().T.div(1e3)
        p_by_carrier = pd.concat([p_by_carrier, sto], axis=1)

    fig, ax = plt.subplots(figsize=(6, 3))

    color = p_by_carrier.columns.map(n.carriers.color)

    p_by_carrier.where(p_by_carrier > 0).loc[time].plot.area(
        ax=ax,
        linewidth=0,
        color=color,
    )

    charge = p_by_carrier.where(p_by_carrier < 0).dropna(how="all", axis=1).loc[time]

    if not charge.empty:
        charge.plot.area(
            ax=ax,
            linewidth=0,
            color=charge.columns.map(n.carriers.color),
        )

    n.loads_t.p_set.sum(axis=1).loc[time].div(1e3).plot(ax=ax, c="k")

    plt.legend(loc=(1.05, 0))
    ax.set_ylabel("GW")
    ax.set_ylim(-200, 200)
Oh dear, that was complicated. Let’s test it:
plot_dispatch(n)
Adding Storage Units
Alright, but a few important components are still missing for a system with high shares of renewables. What about short-term storage options (e.g. batteries) and long-term storage options (e.g. hydrogen storage)? Let’s add them too.
First, the battery storage. We are going to assume a fixed energy-to-power ratio of 6 hours, i.e. if fully charged, the battery can discharge at full capacity for 6 hours. For the capital cost, we have to factor in both the capacity and energy cost of the storage. We are also going to enforce a cyclic state-of-charge condition, i.e. the state of charge at the beginning of the optimisation period must equal the final state of charge.
n.add(
    "StorageUnit",
    "battery storage",
    bus="electricity",
    carrier="battery storage",
    max_hours=6,
    capital_cost=costs.at["battery inverter", "capital_cost"]
    + 6 * costs.at["battery storage", "capital_cost"],
    efficiency_store=costs.at["battery inverter", "efficiency"],
    efficiency_dispatch=costs.at["battery inverter", "efficiency"],
    p_nom_extendable=True,
    cyclic_state_of_charge=True,
)
Second, the hydrogen storage. This one is composed of an electrolyser to convert electricity to hydrogen, a fuel cell to re-convert hydrogen to electricity, and underground storage (e.g. in salt caverns). We assume an energy-to-power ratio of 168 hours, such that this type of storage can be used for weekly balancing.
capital_costs = (
    costs.at["electrolysis", "capital_cost"]
    + costs.at["fuel cell", "capital_cost"]
    + 168 * costs.at["hydrogen storage underground", "capital_cost"]
)
n.add(
    "StorageUnit",
    "hydrogen storage underground",
    bus="electricity",
    carrier="hydrogen storage underground",
    max_hours=168,
    capital_cost=capital_costs,
    efficiency_store=costs.at["electrolysis", "efficiency"],
    efficiency_dispatch=costs.at["fuel cell", "efficiency"],
    p_nom_extendable=True,
    cyclic_state_of_charge=True,
)
OK, let’s run the model again, now with storage, and see what’s changed.
n.optimize(solver_name="highs")
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io:Writing objective.
Writing constraints.: 0%| | 0/14 [00:00<?, ?it/s]
Writing constraints.: 57%|█████▋ | 8/14 [00:00<00:00, 79.77it/s]
Writing constraints.: 100%|██████████| 14/14 [00:00<00:00, 61.95it/s]
Writing continuous variables.: 0%| | 0/6 [00:00<?, ?it/s]
Writing continuous variables.: 100%|██████████| 6/6 [00:00<00:00, 155.50it/s]
INFO:linopy.io: Writing time: 0.28s
INFO:linopy.solvers:Log file at /tmp/highs.log.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 21906 primals, 50376 duals
Objective: 3.16e+10
Solver model: available
Solver message: optimal
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper, StorageUnit-ext-p_dispatch-lower, StorageUnit-ext-p_dispatch-upper, StorageUnit-ext-p_store-lower, StorageUnit-ext-p_store-upper, StorageUnit-ext-state_of_charge-lower, StorageUnit-ext-state_of_charge-upper, StorageUnit-energy_balance were not assigned to the network.
Presolving model
27420 rows, 20856 cols, 75690 nonzeros
27420 rows, 20856 cols, 75690 nonzeros
Presolve : Reductions: rows 27420(-22956); columns 20856(-1050); elements 75690(-24006)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.54924e+09) 0s
26273 3.1624531941e+10 Pr: 0(0); Du: 0(4.071e-13) 2s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 26273
Objective value : 3.1624531941e+10
HiGHS run time : 2.24
('ok', 'optimal')
n.generators.p_nom_opt # MW
Generator
OCGT 70096.356831
onwind -0.000000
offwind 49326.061998
solar 85967.819687
Name: p_nom_opt, dtype: float64
n.storage_units.p_nom_opt # MW
StorageUnit
battery storage -0.0
hydrogen storage underground -0.0
Name: p_nom_opt, dtype: float64
Nothing! The objective value is the same, and no storage is built.
Adding emission limits
The gas power plant offers sufficient and cheap backup capacity to run in periods of low wind and solar generation. But what happens if this source of flexibility disappears? Let’s model a 100% renewable electricity system by adding a CO\(_2\) emission limit as a global constraint:
n.add(
    "GlobalConstraint",
    "CO2Limit",
    carrier_attribute="co2_emissions",
    sense="<=",
    constant=0,
)
When we run the model now…
n.optimize(solver_name="highs")
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io:Writing objective.
Writing constraints.: 0%| | 0/15 [00:00<?, ?it/s]
Writing constraints.: 60%|██████ | 9/15 [00:00<00:00, 81.88it/s]
Writing constraints.: 100%|██████████| 15/15 [00:00<00:00, 65.34it/s]
Writing continuous variables.: 0%| | 0/6 [00:00<?, ?it/s]
Writing continuous variables.: 100%|██████████| 6/6 [00:00<00:00, 155.59it/s]
INFO:linopy.io: Writing time: 0.29s
INFO:linopy.solvers:Log file at /tmp/highs.log.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 21906 primals, 50377 duals
Objective: 7.28e+10
Solver model: available
Solver message: optimal
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper, StorageUnit-ext-p_dispatch-lower, StorageUnit-ext-p_dispatch-upper, StorageUnit-ext-p_store-lower, StorageUnit-ext-p_store-upper, StorageUnit-ext-state_of_charge-lower, StorageUnit-ext-state_of_charge-upper, StorageUnit-energy_balance were not assigned to the network.
25230 rows, 18665 cols, 69120 nonzeros
25230 rows, 18665 cols, 69120 nonzeros
Presolve : Reductions: rows 25230(-25147); columns 18665(-3241); elements 69120(-32766)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.44615e+09) 0s
12329 1.9800946916e+10 Pr: 4598(5.58082e+12); Du: 0(3.59385e-08) 5s
19984 7.2838284781e+10 Pr: 0(0); Du: 0(1.42109e-14) 8s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 19984
Objective value : 7.2838284781e+10
HiGHS run time : 8.50
('ok', 'optimal')
…and inspect the capacities built…
n.generators.p_nom_opt # MW
Generator
OCGT -0.000000
onwind 238501.633267
offwind 77998.676273
solar 245669.862070
Name: p_nom_opt, dtype: float64
n.storage_units.p_nom_opt # MW
StorageUnit
battery storage 33417.521270
hydrogen storage underground 59250.275573
Name: p_nom_opt, dtype: float64
n.storage_units.p_nom_opt.div(1e3) * n.storage_units.max_hours # GWh
StorageUnit
battery storage 200.505128
hydrogen storage underground 9954.046296
dtype: float64
… we see quite a bit of storage. So what does the optimised dispatch of the system look like?
plot_dispatch(n)
We are also keen to see what technologies constitute the largest cost components. For that we’re going to define a small helper function:
def system_cost(n):
    tsc = n.statistics.capex() + n.statistics.opex()
    return tsc.droplevel(0).div(1e6)  # million €/a
system_cost(n)
carrier
OCGT NaN
offwind 13620.214881
onwind 24346.303004
solar 12617.161281
battery storage NaN
hydrogen storage underground NaN
dtype: float64
We can then process this series into plots, e.g. a pie chart:
system_cost(n).plot.pie(figsize=(2, 2))
<Axes: >
or use it to compute the average cost per unit of electricity consumed (in €/MWh):
demand = n.snapshot_weightings.generators @ n.loads_t.p_set.sum(axis=1)
system_cost(n).sum() * 1e6 / demand.sum()
105.61904475056342
Finally, we can export the network to a netCDF file for later use:
n.export_to_netcdf("network-new.nc")
INFO:pypsa.io:Exported network network-new.nc has global_constraints, carriers, loads, storage_units, buses, generators
Warning
Always consider that the load data is given in units of power (MW). If your resolution is not hourly, you need to multiply by the snapshot weightings to get the energy consumed!
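A minimal sketch of this pitfall with toy numbers (illustrative values only): summing a 4-hourly load series given in MW understates the energy by the snapshot weighting.

```python
import pandas as pd

# toy load series at 4-hourly resolution, in MW
idx = pd.date_range("2015-01-01", periods=6, freq="4h")
load = pd.Series(100.0, index=idx)

power_sum = load.sum()         # sums MW values -- NOT an energy
energy_mwh = (4 * load).sum()  # weight each snapshot by the 4 hours it represents
```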
Sensitivity Analysis
Sensitivity analyses constitute a core activity of energy system modelling. Below, you can find sensitivity analyses regarding the
variation in allowed CO\(_2\) emissions
variation in solar overnight costs
variation in offshore wind potentials
sensitivity = {}
for co2 in [150, 100, 50, 25, 0]:
    n.global_constraints.loc["CO2Limit", "constant"] = co2 * 1e6
    n.optimize(solver_name="highs")
    sensitivity[co2] = system_cost(n)
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io:Writing objective.
Writing constraints.: 0%| | 0/15 [00:00<?, ?it/s]
Writing constraints.: 60%|██████ | 9/15 [00:00<00:00, 81.12it/s]
Writing constraints.: 100%|██████████| 15/15 [00:00<00:00, 65.14it/s]
Writing continuous variables.: 0%| | 0/6 [00:00<?, ?it/s]
Writing continuous variables.: 100%|██████████| 6/6 [00:00<00:00, 155.60it/s]
INFO:linopy.io: Writing time: 0.29s
INFO:linopy.solvers:Log file at /tmp/highs.log.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 21906 primals, 50377 duals
Objective: 3.16e+10
Solver model: available
Solver message: optimal
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper, StorageUnit-ext-p_dispatch-lower, StorageUnit-ext-p_dispatch-upper, StorageUnit-ext-p_store-lower, StorageUnit-ext-p_store-upper, StorageUnit-ext-state_of_charge-lower, StorageUnit-ext-state_of_charge-upper, StorageUnit-energy_balance were not assigned to the network.
Presolving model
27421 rows, 20856 cols, 77880 nonzeros
27421 rows, 20856 cols, 77880 nonzeros
Presolve : Reductions: rows 27421(-22956); columns 20856(-1050); elements 77880(-24006)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.54924e+09) 0s
26325 3.1624531941e+10 Pr: 0(0) 4s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 26325
Objective value : 3.1624531941e+10
HiGHS run time : 4.05
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io:Writing objective.
Writing constraints.: 0%| | 0/15 [00:00<?, ?it/s]
Writing constraints.: 60%|██████ | 9/15 [00:00<00:00, 82.06it/s]
Writing constraints.: 100%|██████████| 15/15 [00:00<00:00, 65.49it/s]
Writing continuous variables.: 0%| | 0/6 [00:00<?, ?it/s]
Writing continuous variables.: 100%|██████████| 6/6 [00:00<00:00, 155.89it/s]
INFO:linopy.io: Writing time: 0.29s
INFO:linopy.solvers:Log file at /tmp/highs.log.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 21906 primals, 50377 duals
Objective: 3.18e+10
Solver model: available
Solver message: optimal
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper, StorageUnit-ext-p_dispatch-lower, StorageUnit-ext-p_dispatch-upper, StorageUnit-ext-p_store-lower, StorageUnit-ext-p_store-upper, StorageUnit-ext-state_of_charge-lower, StorageUnit-ext-state_of_charge-upper, StorageUnit-energy_balance were not assigned to the network.
Presolving model
27421 rows, 20856 cols, 77880 nonzeros
27421 rows, 20856 cols, 77880 nonzeros
Presolve : Reductions: rows 27421(-22956); columns 20856(-1050); elements 77880(-24006)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.54924e+09) 0s
21410 3.0044451623e+10 Pr: 2374(7.03405e+09); Du: 0(8.7168e-07) 5s
27874 3.1764091630e+10 Pr: 0(0); Du: 0(2.42029e-14) 8s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 27874
Objective value : 3.1764091630e+10
HiGHS run time : 8.85
INFO:linopy.model: Solve problem using Highs solver
INFO:linopy.io:Writing objective.
Writing constraints.: 0%| | 0/15 [00:00<?, ?it/s]
Writing constraints.: 60%|██████ | 9/15 [00:00<00:00, 80.66it/s]
Writing constraints.: 100%|██████████| 15/15 [00:00<00:00, 65.10it/s]
Writing continuous variables.: 0%| | 0/6 [00:00<?, ?it/s]
Writing continuous variables.: 100%|██████████| 6/6 [00:00<00:00, 153.45it/s]
INFO:linopy.io: Writing time: 0.29s
INFO:linopy.solvers:Log file at /tmp/highs.log.
Running HiGHS 1.5.3 [date: 2023-05-16, git hash: 594fa5a9d-dirty]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
27421 rows, 20856 cols, 77880 nonzeros
27421 rows, 20856 cols, 77880 nonzeros
Presolve : Reductions: rows 27421(-22956); columns 20856(-1050); elements 77880(-24006)
Solving the presolved LP
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 0.0000000000e+00 Pr: 2190(1.54924e+09) 0s
13217 2.7613131005e+10 Pr: 9198(2.79115e+11); Du: 0(2.68615e-07) 5s
20423 3.4262556011e+10 Pr: 9020(3.59587e+10); Du: 0(3.0732e-07) 10s
24014 3.5768174048e+10 Pr: 0(0) 12s
24014 3.5768174048e+10 Pr: 0(0) 12s
Solving the original LP from the solution after postsolve
Model status : Optimal
Simplex iterations: 24014
Objective value : 3.5768174048e+10
HiGHS run time : 12.80
INFO:linopy.constants: Optimization successful:
Status: ok
Termination condition: optimal
Solution: 21906 primals, 50377 duals
Objective: 3.58e+10
Solver model: available
Solver message: optimal
INFO:pypsa.optimization.optimize:The shadow-prices of the constraints Generator-ext-p-lower, Generator-ext-p-upper, StorageUnit-ext-p_dispatch-lower, StorageUnit-ext-p_dispatch-upper, StorageUnit-ext-p_store-lower, StorageUnit-ext-p_store-upper, StorageUnit-ext-state_of_charge-lower, StorageUnit-ext-state_of_charge-upper, StorageUnit-energy_balance were not assigned to the network.
df = pd.DataFrame(sensitivity).T.div(1e3) # billion Euro/a
df.plot.area(
stacked=True,
linewidth=0,
color=df.columns.map(n.carriers.color),
figsize=(4, 4),
xlim=(0, 150),
xlabel=r"CO$_2$ emissions [Mt/a]",
ylabel="System cost [bn€/a]",
ylim=(0, 100),
)
plt.legend(frameon=False, loc=(1.05, 0))
Exercises#
Explore the model by changing the assumptions and available technologies. Here are a few suggestions:
Rerun the model with cost assumptions for 2050.
What if either hydrogen or battery storage cannot be expanded?
What if you can either only build solar or only build wind?
What if we had a flat electricity demand profile (like in model.energy, i.e. average the demand time series)?
Vary the energy-to-power ratio of the hydrogen storage. What ratio leads to the lowest costs?
How bad is the 4-hourly resolution used for demonstration? How does it compare to hourly or 2-hourly resolution?
On model.energy, you can download capacity factors for onshore wind and solar for any region in the world. Drop offshore wind from the model and use the onshore wind and solar profiles from another region of the world. Try a few. What changes?
Add nuclear as another dispatchable low-emission generation technology (similar to OCGT). Perform a sensitivity analysis trying to answer how low the capital cost of a nuclear plant would need to be for it to be built.