
Callback not available when gradient is passed to VQE #10700

Closed
eliotheinrich opened this issue Aug 23, 2023 · 3 comments
Labels
bug Something isn't working mod: algorithms Related to the Algorithms module

Comments

@eliotheinrich

Environment

  • Qiskit Terra version: 0.25
  • Python version: 3.10

What is happening?

The algorithms.minimum_eigensolvers.VQE algorithm does not call callback during optimization when a gradient is explicitly provided. This seems unexpected, and even if this is the intended design, it might be a good feature to give callback access when a gradient is provided. This would allow, e.g., comparison of convergence for different gradient types.

How can we reproduce the issue?

The following code reproduces the issue: the output includes many calls to callback_func1 but only a single call to callback_func2 (I believe that single call happens when the optimizer result is built).

from qiskit.circuit.library import RealAmplitudes
from qiskit.algorithms.minimum_eigensolvers import VQE
from qiskit.primitives import Estimator
from qiskit.algorithms.gradients import FiniteDiffEstimatorGradient
from qiskit.algorithms.optimizers import ADAM
from qiskit.opflow import PauliSumOp


estimator = Estimator()
optimizer = ADAM()
ansatz = RealAmplitudes(2, reps=1, skip_final_rotation_layer=True).decompose()
op = PauliSumOp.from_list([('ZZ', 1)])

def callback_func1(eval_count, params, value, metadata):
    print(f'From vqe1: {eval_count}')

def callback_func2(eval_count, params, value, metadata):
    print(f'from vqe2: {eval_count}')

vqe1 = VQE(estimator, ansatz, optimizer, callback=callback_func1)
_ = vqe1.compute_minimum_eigenvalue(op)

gradient = FiniteDiffEstimatorGradient(estimator, epsilon=0.0001)
vqe2 = VQE(estimator, ansatz, optimizer, gradient=gradient, callback=callback_func2)
_ = vqe2.compute_minimum_eigenvalue(op)

What should happen?

The output of the two VQE runs should be the same (or as similar as possible).

Any suggestions?

A possible solution would be to invoke callback from the gradient-evaluation function that VQE._get_evaluate_gradient returns, although the Estimator values and metadata would not be accessible there.
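One way to express the idea (plain Python, purely illustrative, not Qiskit's API; all names here are made up) is to wrap the gradient callable so a user callback fires on every gradient evaluation:

```python
def with_gradient_callback(gradient_fn, callback):
    """Wrap a gradient callable so `callback` fires on each evaluation."""
    state = {"count": 0}

    def wrapped(params):
        state["count"] += 1
        grad = gradient_fn(params)
        # As noted above, the Estimator values and metadata would not be
        # available at this point; only the count, params, and gradient are.
        callback(state["count"], params, grad)
        return grad

    return wrapped

calls = []
grad = with_gradient_callback(lambda p: [2 * x for x in p],
                              lambda n, p, g: calls.append(n))
grad([1.0, 2.0])
grad([3.0, 4.0])
print(calls)  # [1, 2]
```

The same wrapping could in principle be done by a user around the gradient object's run method before passing it to VQE, without changing VQE itself.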

@eliotheinrich eliotheinrich added the bug Something isn't working label Aug 23, 2023
@woodsp-ibm
Member

woodsp-ibm commented Aug 23, 2023

I think what you are seeing here is more a behavior of ADAM than a problem with the VQE callback. If you switch to, say, SLSQP or L_BFGS_B, you will see more than one call to the second callback.

On the outputs being similar: you don't set initial_point, so each run starts from a different random point. Also, the built-in finite difference used by, say, SLSQP has a much smaller eps by default, and it computes the gradient via the objective function, which is where the callback originates. If you plot the values you can see a staircase effect: the evaluations made while computing the gradient differ only slightly. So the first callback ends up reporting every call the optimizer makes to the objective function, including any gradient computation; the objective function has no idea what it is being used for and simply fires the callback on each use. With the second, the callback only sees direct calls to the objective function itself, with whatever is done for the gradient happening off to the side, so to speak.

So what ADAM does is work from the gradient, and when it sees a minimal change in the parameters it ends and makes a single call to the objective function. You can see that here https://github.com/Qiskit/qiskit/blob/57137ff7f2cdd6d7671aa0e6889258f28a70e5a1/qiskit/algorithms/optimizers/adam_amsgrad.py#L268C37-L268C37 at the end of the minimize method. If you don't supply a gradient, it uses a custom finite-difference method built on the objective function, which is why with ADAM you see so many calls to the first callback.

The behavior you see is expected: when you use a gradient, the computation done for it does not end up in the callback from the objective function. ADAM, which works from the gradient and the parameters and only explicitly calls the objective function once to return the minimum value, makes the difference more extreme.
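The dynamic described above can be mimicked with a toy gradient-descent loop (plain Python, purely illustrative, not Qiskit code): an optimizer that finite-differences the objective triggers the objective-side callback on every gradient probe, while one given an external gradient only triggers it on direct objective evaluations.

```python
def make_objective(log):
    """Objective f(x) = (x - 3)^2 that logs every evaluation."""
    def f(x):
        log.append(x)
        return (x - 3.0) ** 2
    return f

def minimize_finite_diff(f, x, steps=5, eps=1e-6, lr=0.1):
    """Gradient descent with a finite-diff gradient built on f itself."""
    for _ in range(steps):
        g = (f(x + eps) - f(x - eps)) / (2 * eps)  # two extra f-calls per step
        x -= lr * g
    return x

def minimize_with_gradient(f, grad, x, steps=5, lr=0.1):
    """Gradient descent with an external gradient; f is barely touched."""
    for _ in range(steps):
        x -= lr * grad(x)  # gradient computed "off to the side"
    f(x)                   # single final objective evaluation
    return x

log_fd, log_gr = [], []
minimize_finite_diff(make_objective(log_fd), 0.0)
minimize_with_gradient(make_objective(log_gr), lambda x: 2 * (x - 3.0), 0.0)
print(len(log_fd), len(log_gr))  # 10 1
```

An objective-side callback (as in VQE) would fire ten times in the first run but only once in the second, even though both perform five optimization steps.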

With qiskit.algorithms being moved to qiskit_algorithms, and the tutorials being moved around as well, I can only point you to an example notebook (it has yet to be re-published as HTML, but will be soon): https://github.com/qiskit-community/qiskit-algorithms/blob/main/docs/tutorials/02_vqe_advanced_options.ipynb There you can see the staircase effect I was talking about with SLSQP. COBYLA, also shown there, is not gradient-based, so its plot looks different in that regard.

@woodsp-ibm woodsp-ibm added the mod: algorithms Related to the Algorithms module label Aug 23, 2023
@eliotheinrich
Author

I see, that makes sense. Thank you for the detailed reply. Perhaps this issue should be a feature request then: adding some form of callback access for gradient calls in these kinds of scenarios would be useful. ADAM is a popular choice for VQE and other VQAs, and if the callback is not invoked during optimization when a custom gradient is supplied, there is no obvious way to get mid-optimization convergence information.

@woodsp-ibm
Member

There is also access to that, and other internal information, from the optimizer itself. They (mostly) have a callback which you can use too. The callback in VQE is really optimizer-independent and simply fires for each use of the objective function it supplies to the optimizer. I said "mostly" because ADAM does not have callback capability, but it can store some data about what is happening if you set a snapshot_dir.
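The distinction between the two callback locations can be sketched in plain Python (illustrative names only, not Qiskit's API): an optimizer-side callback reports the optimizer's own internal state, such as its iteration count, independently of how often the objective function happens to be evaluated.

```python
def gradient_descent(f, grad, x, steps=3, lr=0.1, callback=None):
    """Toy optimizer with its own callback, reporting per-iteration state."""
    for it in range(steps):
        x -= lr * grad(x)
        if callback:
            # The optimizer knows its iteration count and current point,
            # information an objective-side callback cannot see directly.
            callback(it, x, f(x))
    return x

history = []
gradient_descent(lambda x: (x - 2.0) ** 2,
                 lambda x: 2 * (x - 2.0),
                 0.0,
                 callback=lambda it, x, fx: history.append(it))
print(history)  # [0, 1, 2]
```

With this style of callback, convergence information arrives once per optimizer iteration regardless of whether the gradient is internal or external, which is the behavior the feature request is after.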

As for any specific new feature request, that would need to be made on the qiskit_algorithms repository now. As of the latest Qiskit release, qiskit.algorithms is deprecated and new features will no longer be considered here. There is more information on the move in the prelude of the latest release notes.

As for the optimizer callback, it does depend on the optimizer, so it can be a bit more work to use if you are trying out different optimizers. For that reason, an issue was raised a while back to unify things, qiskit-community/qiskit-algorithms#60, now transferred to the new repo.

If you want to comment in the above issue, or raise something else for discussion there, please feel free. I pointed out that issue since I think it relates most closely to the sort of information you are looking for. I am going to close the issue here on the basis that anything further can be brought up on the qiskit_algorithms repo.

@woodsp-ibm woodsp-ibm closed this as not planned Won't fix, can't repro, duplicate, stale Aug 24, 2023