
Fix jacobian shape in VJP for measurements with shape and batched inputs #5986

Merged

Conversation

@majafranz (Contributor) commented Jul 11, 2024

Context:
This PR fixes a bug in computing the gradient of a circuit that combines:

  1. batching in non-trainable data
  2. a measurement with a shape (i.e. probs)
  3. more than one trainable parameter in the circuit

Minimal example that fails without the fix:

import pennylane as qml

@qml.qnode(qml.device('default.qubit'), diff_method="parameter-shift")
def circuit(x, data):
    qml.RX(x[0], 0)
    qml.RX(x[1], 0)
    qml.RY(data, 0)  # broadcasts over the batched, non-trainable data
    return qml.probs(wires=0)  # measurement with a shape

x = qml.numpy.array([0.5, 0.8], requires_grad=True)  # two trainable parameters
data = qml.numpy.array([1.2, 2.3, 3.4], requires_grad=False)  # batch of size 3
circuit(x, data)
qml.jacobian(circuit)(x, data)  # fails without the fix
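
(For reference: with the fix applied, qml.jacobian should return an array of shape (3, 2, 2), i.e. the batch dimension, then the probs dimension, then the two trainable parameters.)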

Reshaping the jacobian when computing the VJP resolves the issue.

Description of the Change:
The jacobian is reshaped to match the shape of dy when computing the VJP for measurements with a dimension (such as probs).
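
For illustration only, here is a minimal numpy sketch of the idea; compute_vjp_sketch is a hypothetical helper, not the actual code in pennylane/gradients/vjp.py:

import numpy as np

def compute_vjp_sketch(dy, jacs):
    # dy: cotangent carrying the full measurement shape,
    #     e.g. (3, 2) for a batch of 3 and probs on one wire.
    # jacs: one jacobian per trainable parameter; each may arrive with its
    #     measurement axes flattened, e.g. shape (6,) instead of (3, 2).
    vjp = []
    for jac in jacs:
        jac = np.reshape(jac, np.shape(dy))  # the fix: align jac with dy's shape
        vjp.append(np.sum(jac * dy))         # contract over all measurement axes
    return np.stack(vjp)

# usage: batch of 3, probs over one wire, two trainable parameters
dy = np.ones((3, 2))
jacs = [np.arange(6.0), np.arange(6.0)]
print(compute_vjp_sketch(dy, jacs))  # [15. 15.]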

Benefits:
One less bug :)

Possible Drawbacks:
None

Related GitHub Issues:
This PR fixes #5979

This commit uses the proper shape of the jacobian when there is
- batching in non-trainable data
- a measurement with a shape (i.e. probs)
- more than one trainable parameter in the circuit
@majafranz force-pushed the batched-probs-autograd-gradient branch from ccb773a to a84ad45 (July 11, 2024 08:46)

codecov bot commented Jul 11, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.65%. Comparing base (f9adf90) to head (f34fea3).
Report is 2 commits behind head on master.

@@            Coverage Diff             @@
##           master    #5986      +/-   ##
==========================================
- Coverage   99.66%   99.65%   -0.01%     
==========================================
  Files         430      430              
  Lines       41544    41281     -263     
==========================================
- Hits        41404    41140     -264     
- Misses        140      141       +1     


@josh146 (Member) commented Jul 16, 2024

Hi @majafranz, thank you for this PR/bugfix! Someone from the team will be able to review it shortly :)

@trbromley trbromley requested a review from a team July 16, 2024 14:28
@Alex-Preciado Alex-Preciado requested review from mudit2812 and removed request for a team July 16, 2024 15:24
@mudit2812 (Contributor) left a comment


Thanks for opening this PR @majafranz! I just have one comment; other than that, this looks great 🎉.

Also, please ignore the one failing TensorFlow test; it is unrelated to this PR and is being addressed.

Review comment on tests/gradients/core/test_vjp.py (outdated, resolved)
@trbromley (Contributor) commented

@Alex-Preciado who do you recommend as a second reviewer for this PR?

@dwierichs (Contributor) left a comment


Hi @majafranz! Thank you for fixing this bug, it's a very good catch 🎉
I basically have two comments:

  1. The test's dependency on other parts of the library (QNode and VJP->Jacobian logic) could be reduced a bit. I suggested an alternative; let me know what you think!
  2. There seems to be a similar bug in the JVP logic. Would you be interested in fixing that as well? It can be taken care of separately, but it would also fit neatly into this PR :) (see the sketch below)
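
For context on the suggested JVP fix, here is a hypothetical numpy sketch mirroring the VJP one above; a JVP contracts the jacobian with a tangent over the parameters rather than a cotangent over the outputs (again, this is not the actual pennylane/gradients/jvp.py code):

import numpy as np

def compute_jvp_sketch(tangent, jacs, meas_shape):
    # tangent: one entry per trainable parameter, e.g. shape (2,)
    # jacs: one jacobian per trainable parameter, possibly with its
    #     measurement axes flattened, e.g. (6,) instead of (3, 2)
    # meas_shape: the full measurement shape, e.g. (3, 2)
    jvp = np.zeros(meas_shape)
    for t, jac in zip(tangent, jacs):
        jvp = jvp + t * np.reshape(jac, meas_shape)  # same shape alignment as the VJP fix
    return jvp

# usage: two parameters, batch of 3, probs over one wire
tangent = np.array([1.0, 0.5])
jacs = [np.arange(6.0), np.arange(6.0)]
print(compute_jvp_sketch(tangent, jacs, (3, 2)))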

Review comments: tests/gradients/core/test_vjp.py (outdated, resolved); pennylane/gradients/vjp.py (resolved); doc/releases/changelog-dev.md (resolved)
majafranz and others added 2 commits July 24, 2024 20:35
Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
@dwierichs (Contributor) left a comment


Thanks for your contribution, @majafranz 🎉 💯

@mudit2812 (Contributor) commented

@majafranz Could you fix the formatting errors preventing this PR from being merged? Thanks!

@majafranz force-pushed the batched-probs-autograd-gradient branch from be3e44e to 81dcb26 (July 31, 2024 11:53)
@mudit2812 mudit2812 enabled auto-merge (squash) July 31, 2024 13:32
@mudit2812 mudit2812 merged commit 2892a9a into PennyLaneAI:master Jul 31, 2024
40 checks passed