Make qchem jax compatible #6096
base: master
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff            @@
##           master    #6096    +/-   ##
=========================================
  Coverage   99.70%   99.70%
=========================================
  Files         444      444
  Lines       42236    42270    +34
=========================================
+ Hits        42113    42147    +34
  Misses        123      123
```

☔ View full report in Codecov by Sentry.
Leaving the first set of comments. More to follow later today.
pennylane/qchem/hamiltonian.py
```diff
 if len(coordinates) == len(symbols) * 3:
-    geometry_dhf = qml.numpy.array(coordinates.reshape(len(symbols), 3))
+    geometry_dhf = qml.math.array(coordinates.reshape(len(symbols), 3))
     geometry_hf = coordinates
 elif len(coordinates) == len(symbols):
-    geometry_dhf = qml.numpy.array(coordinates)
+    geometry_dhf = qml.math.array(coordinates)
     geometry_hf = coordinates.flatten()
```
Not a blocker but maybe we can just unify these two to always reshape lol.
…eally matter if someone is mixing numpy with jax/autograd/interface.
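The unification suggested in this comment could look like the following plain-NumPy sketch (a toy illustration, not the actual PennyLane code): a single reshape handles both accepted coordinate layouts, because reshaping an array that is already `(n, 3)` to `(n, 3)` is a no-op.

```python
import numpy as np

symbols = ["H", "H"]
flat = np.arange(6.0)          # length == len(symbols) * 3
grid = flat.reshape(2, 3)      # length == len(symbols)

# One reshape covers both layouts, so the if/elif branches could collapse:
for coordinates in (flat, grid):
    geometry_dhf = np.asarray(coordinates).reshape(len(symbols), 3)
    geometry_hf = np.asarray(coordinates).reshape(-1)
```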
Co-authored-by: Utkarsh <utkarshazad98@gmail.com>
```python
            if qml.math.get_interface(x) != "numpy"
        )
    ) > 1 and (alpha is not None or coeff is not None):
        warnings.warn(
            "The parameters coordinates, coeff, and alpha are not of the same interface. Please use the same interface for all 3 or there may be unintended behavior.",
```
You are allowed to mix numpy with any of the other interfaces. So using numpy + jax is considered "the same interface" but using pnp + jax is considered heresy.
```python
            UserWarning,
        )
    use_jax = any(qml.math.get_interface(x) == "jax" for x in [coordinates, alpha, coeff])
    interface_args = [{"like": "autograd", "requires_grad": False}, {"like": "jax"}][use_jax]
```
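The dispatch in this snippet can be illustrated with a self-contained toy version (the `get_interface` stand-in below is an assumption for illustration, not PennyLane's implementation): detect which interfaces the three parameters use, warn when two non-NumPy interfaces are mixed, and select defaults accordingly.

```python
import warnings
import numpy as np

def get_interface(x):
    # Toy stand-in for qml.math.get_interface: classify by the array's module.
    mod = type(x).__module__.split(".")[0]
    return {"jax": "jax", "jaxlib": "jax", "pennylane": "autograd"}.get(mod, "numpy")

def pick_interface_args(coordinates, alpha=None, coeff=None):
    params = [p for p in (coordinates, alpha, coeff) if p is not None]
    non_numpy = {get_interface(p) for p in params} - {"numpy"}
    # Mixing two non-numpy interfaces (e.g. pnp + jax) is ambiguous, so warn.
    if len(non_numpy) > 1 and (alpha is not None or coeff is not None):
        warnings.warn("coordinates, coeff, and alpha mix interfaces.", UserWarning)
    use_jax = "jax" in non_numpy
    return [{"like": "autograd", "requires_grad": False}, {"like": "jax"}][use_jax]

print(pick_interface_args(np.zeros(6)))  # plain numpy -> autograd defaults
```

Note how plain NumPy inputs never count toward the mix, matching the reviewer's point that NumPy may be freely combined with any interface.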
Oh, if you didn't set the alphas as JAX arrays, I purposefully don't coerce them; NumPy arrays are perfectly compatible with JAX.
Context:
With the deprecation of Autograd, we are moving to JAX for auto-differentiation. Since the `requires_grad` keyword is not supported by JAX arrays, we need a different solution.

Description of the Change:
We keep backwards compatibility with `pnp` by checking which interface the user is using, determined from the interface of the tensor. If they are using autograd, we stick to the old workflow and check `requires_grad` using `getattr()`. If the user inputs a JAX array for any of `[coordinates, coefficients, alpha]`, we assume the user wants JAX and define all undefined coeffs/alphas as JAX arrays. If a user decides to mix `pnp` with JAX, we don't hard-cast the rest into either interface, since we can't make that decision for them; instead, a warning about mixing the two is raised.

WHEN USING JAX:
If users wish to differentiate any of these parameters, they should mark the differentiable parameter(s) through the JAX UI, e.g. `jax.grad(..., argnums=<indice(s) of differentiable parameter(s)>)(*args)`. In our case, due to technical limitations, `*args` must always be exactly `[coordinates, coefficients, alpha]`: no other order is allowed, and none of them may be omitted. This also applies when you are NOT using `jax.grad` or another JAX function; e.g. when you call `diff_hamiltonian(...)(*args)` with JAX arrays, the args must likewise be exactly `[coordinates, coefficients, alpha]`. When you do differentiate, `jax.grad(..., argnums=1)(coordinates, coefficients, alpha)` means you want `coefficients` to be differentiable. Note that this is a departure from the UI of `qml.grad`, where you could do `qml.grad(..., argnum=0)(coefficients)` instead.

Additional notes:
The UI for `qml.grad` and everything else is unchanged for autograd and `pnp` users. However, if you are using JAX and the `args` keyword in `molecular_hamiltonian` and related Hamiltonians, you will need to define all of `[coordinates, coefficients, alpha]` as well, since it is passed downstream to `diff_hamiltonian(...)(*args)`.

Benefits:
Now JAX compatible.
Possible Drawbacks:
More changes may be needed to support JIT, and there may be performance issues. The UI differs between `qml.grad` and `jax.grad`, and the `args` keyword has different expectations for JAX arrays versus `pnp` arrays.
Related GitHub Issues:
[sc-69776] [sc-69778]
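The `argnums` convention described under "WHEN USING JAX" can be sketched with a toy function standing in for the Hamiltonian construction (the function body and values here are illustrative, not the real `diff_hamiltonian`):

```python
import jax
import jax.numpy as jnp

# Toy stand-in for a differentiable quantity built from the three parameters.
# The real entry point expects *args to be exactly
# [coordinates, coefficients, alpha], in this order, with none omitted.
def energy(coordinates, coefficients, alpha):
    return coefficients * jnp.sum(coordinates**2) + alpha

coordinates = jnp.array([0.0, 1.0, 2.0])
coefficients = jnp.array(0.5)
alpha = jnp.array(0.1)

# argnums=1 marks `coefficients` as the differentiable argument, but all three
# positional arguments must still be passed.
grad_wrt_coeff = jax.grad(energy, argnums=1)(coordinates, coefficients, alpha)
print(grad_wrt_coeff)  # d(energy)/d(coefficients) = sum(coordinates**2) = 5.0
```

Contrast this with `qml.grad(..., argnum=0)(coefficients)`, where only the differentiable argument needed to be passed.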