
Add demand elasticity to speed up optimization #1179

Open
pz-max opened this issue Jul 25, 2024 · 2 comments

Comments

@pz-max
Collaborator

pz-max commented Jul 25, 2024

Context

Formulating the PyPSA optimization as a QP problem and adding elasticity appears to significantly reduce the solving time. This has been observed by @fneum with a PyPSA electricity demo model. The exact reason for the performance improvement is still under discussion. Nevertheless, testing the new formulation within PyPSA-Eur seems worthwhile.

Idea

Let's incorporate the QP formulation and demand elasticity for the electricity sector, using typical realistic values. For other sectors, assuming a realistic demand elasticity value will be challenging. One possible approach to also include other sectors is to add a very steep gradient, which could speed up the solver without materially changing the model's assumptions.
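
To make this concrete, here is a minimal single-bus sketch of how elastic demand could enter the optimization as a partially sheddable load with a quadratic cost term (PyPSA's marginal_cost_quadratic attribute). All names and numbers are illustrative, not PyPSA-Eur defaults:

import pypsa

# Toy network: one bus, a conventional generator, and an "elastic demand"
# generator that can shed part of the load at a quadratically increasing cost.
n = pypsa.Network()
n.set_snapshots(range(3))

n.add("Bus", "bus")
n.add("Load", "demand", bus="bus", p_set=100.0)  # MW
n.add("Generator", "gas", bus="bus", p_nom=120.0, marginal_cost=80.0)  # EUR/MWh

# Elastic part of the demand, modelled as partial load shedding.
# With marginal_cost_quadratic = wtp / (2 * d0), the marginal cost of shedding
# rises linearly from 0 and reaches the willingness to pay wtp only when the
# full reference demand d0 is shed. This quadratic term is what turns the LP
# into a QP. For sectors without a credible elasticity estimate, wtp could be
# set very high ("steep gradient") so that the shed amount stays negligible.
wtp = 2_000.0  # EUR/MWh, illustrative willingness to pay
d0 = 100.0     # MW, reference demand
n.add(
    "Generator",
    "demand shedding",
    bus="bus",
    carrier="load",
    p_nom=d0,
    marginal_cost=0.0,
    marginal_cost_quadratic=wtp / (2 * d0),
)

n.optimize()  # solved as a QP by the chosen solver
print(n.generators_t.p)

In this toy case the optimum sheds about 4 MW per snapshot (the point where the shedding marginal cost 2 · wtp/(2·d0) · g reaches the gas cost of 80 EUR/MWh) instead of serving the full 100 MW.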

If someone is interested in making a paper out of that, ping on the PyPSA meets Earth discord.

@pz-max changed the title from "Add high elasticity gradient to each sector to speed up optimization" to "Add demand elasticity to speed up optimization" on Jul 25, 2024
@koen-vg
Contributor

koen-vg commented Sep 5, 2024

Out of curiosity I tried this out kind of quickly, but so far I'm unfortunately not having much success... Here's the essence of my implementation:

# load_shedding["willingness_to_pay"] is loaded from config, in EUR/kWh

load_by_bus = n.loads_t.p_set.T.groupby(n.loads.bus).sum().T
AC_buses = n.buses[n.buses.carrier.isin(["AC", "low voltage"])].index.intersection(
    load_by_bus.columns
)
load_by_bus = load_by_bus.loc[:, AC_buses]
load_shedding_i = load_by_bus.columns
n.madd(
    "Generator",
    load_shedding_i,
    " load shedding",
    bus=load_shedding_i,
    carrier="load",
    # Here, we convert back to EUR/MWh
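    # With this coefficient, the marginal cost of shedding (the derivative
    # 2 * coefficient * g) rises linearly from 0 and only reaches the full
    # willingness to pay when the entire load at a bus is shed.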
    marginal_cost_quadratic=(1e3 * load_shedding["willingness_to_pay"]) / (2 * load_by_bus),
    marginal_cost=0,
    p_nom=load_by_bus.max(),
)

This is added in solve_network.py as an alternative to the "regular" load shedding. I added the "willingness_to_pay" coefficient (should probably be named something better) to the config, and a wildcard option for it in order to compare results easily.

It sort of seems to work, but it's entirely possible that I made a mistake, so please correct me if I'm wrong!

The following results are from a very "default" pypsa-eur network, 40 clusters, 50seg time resolution, single planning horizon at 2040.

As expected, total system costs are lower with more elastic demand:
[Figure: total system costs across elasticity settings]
Here and below, "elastic20" (for example) means the "willingness_to_pay" config option was set to 20. The higher the number, the less elastic the demand.

However, marginal prices are not all that different:
[Figure: marginal prices across elasticity settings]

The real bummer is that solving times are worse, not better:
[Figure: solving times across elasticity settings]

In the above, I applied demand elasticity only for AC/low voltage buses. I also tried adding demand elasticity at all buses that have some load, but this made solving times significantly worse.

First of all, we can hope that I made some mistake in my implementation and that a correct implementation would actually bear fruit. If not, there are a few big differences between this example and the setup in the price formation paper:

  • One node / 40 nodes
  • Constant demand / varying demand
  • Electricity only / sector coupling
  • High temporal resolution / low temporal resolution

The best hope for still finding solving time improvements might be that these benefits only materialise at relatively high temporal resolution? I'm just checking with some higher-resolution models and will update this once the results are in.

Has anyone else tried this out and gotten similar results? Something totally different?

@koen-vg
Contributor

koen-vg commented Sep 6, 2024

So far I haven't been able to find any performance gains for larger models. In fact, for a 40-node, 1000seg Europe-wide model I got the following:

Optimize a model with 8396199 rows, 3958969 columns and 20102628 nonzeros
Model fingerprint: 0xd959d976
Model has 40000 quadratic objective terms
Coefficient statistics:
  Matrix range     [2e-03, 1e+03]
  Objective range  [2e-02, 2e+06]
  QObjective range [3e+00, 8e+03]
  Bounds range     [9e-02, 4e+10]
  RHS range        [2e-01, 9e+08]
Warning: Model contains large bounds
         Consider reformulating model or setting NumericFocus parameter
         to avoid numerical issues.
Presolve removed 4334219 rows and 330000 columns (presolve time = 5s) ...
Presolve removed 4798421 rows and 794202 columns (presolve time = 10s) ...
Presolve removed 4798421 rows and 822667 columns

Continuous model is non-convex -- solving as a MIP

Presolve removed 4396272 rows and 393057 columns (presolve time = 5s) ...
Presolve removed 4396272 rows and 393057 columns (presolve time = 10s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 15s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 20s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 25s) ...
Presolve removed 4793219 rows and 796331 columns (presolve time = 31s) ...
Presolve removed 4802407 rows and 805519 columns (presolve time = 35s) ...
Presolve removed 4805102 rows and 808661 columns (presolve time = 40s) ...
Presolve removed 4805102 rows and 808661 columns (presolve time = 45s) ...
Presolve removed 4807102 rows and 809662 columns (presolve time = 50s) ...
Presolve removed 4807102 rows and 809662 columns
Presolve time: 50.50s
Presolved: 3589099 rows, 3189308 columns, 13560273 nonzeros
Presolved model has 39999 quadratic constraint(s)
Presolved model has 1 bilinear constraint(s)
Variable types: 3189308 continuous, 0 integer (0 binary)
Root relaxation presolved: 3545114 rows, 3105650 columns, 13388930 nonzeros

Gurobi wasn't able to solve this model. ("willingness_to_pay" set to 50 EUR/kWh.)
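
One thing that might be worth ruling out (just a guess on my part): the "non-convex -- solving as a MIP" line suggests Gurobi didn't accept the quadratic objective as convex, and the coefficient willingness_to_pay / (2 * load_by_bus) from the snippet above becomes infinite wherever a bus has zero load in some snapshot; any non-finite or negative quadratic coefficient would have that effect. A quick sanity check along these lines, assuming a PyPSA version that has marginal_cost_quadratic:

import numpy as np

# Hypothetical pre-solve check on the quadratic cost coefficients of network n.
# Non-finite values (e.g. from dividing by a zero load) or negative values
# would make the quadratic objective non-convex or numerically problematic.
static_q = n.generators["marginal_cost_quadratic"].to_numpy(dtype=float)
varying_q = n.generators_t["marginal_cost_quadratic"].to_numpy(dtype=float).ravel()
q = np.concatenate([static_q, varying_q])

assert np.isfinite(q).all(), "non-finite quadratic coefficients (zero load in the divisor?)"
assert (q >= 0).all(), "negative quadratic coefficients make the QP non-convex"
print(f"quadratic coefficients in [{q.min():.3g}, {q.max():.3g}]")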
