
Fix jacobian shape in VJP for measurements with shape and batched inputs #5986

Merged
7 changes: 5 additions & 2 deletions doc/releases/changelog-dev.md
@@ -259,22 +259,25 @@
* `qml.AmplitudeEmbedding` has better support for features using low precision integer data types.
[(#5969)](https://github.com/PennyLaneAI/pennylane/pull/5969)

* The Jacobian shape is now correct in `qml.gradients.vjp.compute_vjp_single` for measurements with shape (e.g. `qml.probs`) and batched inputs.
  [(#5986)](https://github.com/PennyLaneAI/pennylane/pull/5986)

* `qml.lie_closure` works with sums of Paulis.
[(#6023)](https://github.com/PennyLaneAI/pennylane/pull/6023)


<h3>Contributors ✍️</h3>

This release contains contributions from (in alphabetical order):

Tarun Kumar Allamsetty,
Guillermo Alonso,
Utkarsh Azad,
Gabriel Bottrill,
Astral Cai,
Yushao Chen,
Ahmed Darwish,
Maja Franz,
Lillian M. A. Frederiksen,
Pietropaolo Frisoni,
Emiliano Godinez,
2 changes: 1 addition & 1 deletion pennylane/gradients/vjp.py
@@ -151,7 +151,7 @@ def compute_vjp_single(dy, jac, num=None):

     # Single measurement with dimension e.g. probs
     else:
-        jac = qml.math.stack(jac)
+        jac = qml.math.reshape(qml.math.stack(jac), (-1, num))
         try:
             res = jac @ dy_row
         except Exception:  # pylint: disable=broad-except
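Why the reshape is needed: with batched inputs, each per-parameter Jacobian entry carries a leading batch axis, so `qml.math.stack(jac)` produces a 3-D tensor and the matrix product with the flattened cotangent `dy_row` fails. The sketch below reproduces the shapes in plain NumPy; the sizes are illustrative and not taken from the PR, and `num`/`dy_row` mirror the names in `compute_vjp_single`.

```python
# A minimal NumPy sketch of the shape fix, not PennyLane's actual code.
# Illustrative sizes: 2 trainable parameters, batch of 3, probs of dimension 2.
import numpy as np

n_params, batch, dim = 2, 3, 2
jac = [np.ones((batch, dim)) for _ in range(n_params)]  # per-parameter, batched
dy_row = np.ones(batch * dim)  # cotangent dy flattened to one row
num = dy_row.shape[0]

stacked = np.stack(jac)  # shape (2, 3, 2): 3-D, so `stacked @ dy_row` raises a shape error
fixed = np.reshape(stacked, (-1, num))  # shape (2, 6): one row per trainable parameter
vjp = fixed @ dy_row  # shape (2,): one VJP entry per trainable parameter
print(vjp.shape)  # (2,)
```

Without batching, each Jacobian entry already has shape `(num,)`, so the stacked Jacobian is 2-D and the added reshape is a no-op; the change only affects the batched case.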
21 changes: 21 additions & 0 deletions tests/gradients/core/test_vjp.py
@@ -676,3 +676,24 @@ def test_reduction_extend(self):
        # list will correspond to a single input parameter of the combined
        # tapes.
        assert len(res) == sum(len(t.trainable_params) for t in tapes)

    def test_batched_params_probs_jacobian(self):
        """Test that the VJP gets calculated correctly when inputs are batched, multiple
        trainable parameters are used and the measurement has a shape (probs)"""
        data = np.array([1.2, 2.3, 3.4])
        x0, x1 = 0.5, 0.8
        ops = [qml.RX(x0, 0), qml.RX(x1, 0), qml.RY(data, 0)]
        tape = qml.tape.QuantumScript(ops, [qml.probs(wires=0)], trainable_params=[0, 1])
        dy = np.array([[0.6, -0.7], [0.2, -0.7], [-5.2, 0.6]])
        v_tapes, fn = qml.gradients.batch_vjp([tape], [dy], qml.gradients.param_shift)

        dev = qml.device("default.qubit")
        vjp = fn(dev.execute(v_tapes))

        # Analytically expected Jacobian and VJP
        expected_jac = [-0.5 * np.cos(data) * np.sin(x0 + x1), 0.5 * np.cos(data) * np.sin(x0 + x1)]
        expected_vjp = np.tensordot(expected_jac, dy, axes=[[0, 1], [1, 0]])
        assert qml.math.shape(vjp) == (1, 2)  # num tapes, num trainable tape parameters
        assert np.allclose(
            vjp, expected_vjp
        )  # Both parameters essentially feed into the same RX rotation
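For reference, the fixed code path can also be exercised by calling `qml.gradients.compute_vjp_single` directly. A minimal sketch, assuming the direct call takes the same multi-parameter, shaped-measurement branch as the `batch_vjp` path in the test above; the all-ones Jacobian is a placeholder, not a real circuit Jacobian:

```python
import numpy as np
import pennylane as qml

# Cotangent with a batch axis, matching the test above: (batch=3, dim=2)
dy = np.array([[0.6, -0.7], [0.2, -0.7], [-5.2, 0.6]])
# One batched Jacobian entry per trainable parameter (placeholder values)
jac = (np.ones((3, 2)), np.ones((3, 2)))

vjp = qml.gradients.compute_vjp_single(dy, jac)
print(qml.math.shape(vjp))  # expected: (2,), one entry per trainable parameter
```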