
Fix probabilities do not sum to 1 #528

Merged · 6 commits merged into master on Dec 27, 2021
Conversation

@mlazzarin (Contributor)

Closes #517. This PR fixes the issue by casting the state vector to double precision before computing the probabilities.
The memory overhead w.r.t. main should be around 33%.

In the meantime, I also made a small change to the NumPy cast to ensure that arrays are not duplicated unless needed. Let me know your opinion.
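The effect being fixed can be reproduced in plain NumPy (a standalone sketch, not qibo code): squaring and summing a single-precision state accumulates rounding error, so the probabilities may not sum to 1 closely enough, while casting the state up to complex128 first, as done in this PR, keeps the total accurate to double precision.

```python
import numpy as np

# Random normalized state on 20 qubits, stored in single precision
# (as a single-precision simulation backend would hold it).
nqubits = 20
rng = np.random.default_rng(0)
state64 = rng.normal(size=2**nqubits) + 1j * rng.normal(size=2**nqubits)
state64 /= np.linalg.norm(state64)
state32 = state64.astype(np.complex64)

# Probabilities computed entirely in single precision.
probs32 = np.abs(state32) ** 2

# Probabilities after casting the state to complex128 first (this PR's fix).
probs64 = np.abs(state32.astype(np.complex128)) ** 2

err32 = abs(float(np.sum(probs32)) - 1.0)  # typically much larger than err64
err64 = abs(float(np.sum(probs64)) - 1.0)  # limited by float64 rounding
```

The cast doubles the footprint of the temporary state (16 bytes per amplitude instead of 8), which is where the memory overhead mentioned above comes from.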

@mlazzarin mlazzarin added the bug Something isn't working label Dec 23, 2021
codecov bot commented Dec 23, 2021

Codecov Report

Merging #528 (9c81b47) into master (20fb08d) will not change coverage.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff            @@
##            master      #528   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           85        85           
  Lines        12439     12439           
=========================================
  Hits         12439     12439           
Flag        Coverage Δ
unittests   100.00% <100.00%> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files               Coverage Δ
src/qibo/backends/numpy.py   100.00% <100.00%> (ø)
src/qibo/core/states.py      100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 20fb08d...9c81b47.

@@ -92,7 +92,7 @@ def wrapper(self, qubits=None, measurement_gate=None):
     def probabilities(self, qubits=None, measurement_gate=None):
         unmeasured_qubits = tuple(i for i in range(self.nqubits)
                                   if i not in qubits)
-        state = K.reshape(K.square(K.abs(K.cast(self.tensor))), self.nqubits * (2,))
+        state = K.reshape(K.square(K.abs(K.cast(self.tensor, dtype="complex128"))), self.nqubits * (2,))
@stavros11 (Member) commented Dec 23, 2021
Does it work if we make the cast at the float level, after the abs?

Suggested change
-        state = K.reshape(K.square(K.abs(K.cast(self.tensor, dtype="complex128"))), self.nqubits * (2,))
+        state = K.reshape(K.cast(K.square(K.abs(self.tensor)), dtype="float64"), self.nqubits * (2,))

This way we would save some memory and avoid some performance deterioration. Not sure if it works, though. Or we could do it just before the sum, since only the sum is causing the problem, right?
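The trade-off under discussion can be sketched in plain NumPy (illustrative only; the K backend calls are replaced by their NumPy equivalents). Both orderings yield float64 probabilities, but casting after the abs only materialises a float64 temporary (8 bytes per amplitude) instead of a complex128 one (16 bytes per amplitude):

```python
import numpy as np

# Toy 3-qubit state in single precision; amplitudes 0..7 normalized
# so that the squared amplitudes sum to exactly 140/140 = 1.
state = (np.arange(8) / np.sqrt(140.0)).astype(np.complex64)

# Original fix: cast the complex state to complex128 first.
probs_a = np.abs(state.astype(np.complex128)) ** 2

# Suggested alternative: take |.|^2 in single precision,
# cast to float64 only before the (error-prone) summation.
probs_b = (np.abs(state) ** 2).astype(np.float64)

total_a = float(np.sum(probs_a))
total_b = float(np.sum(probs_b))
```

The second form trades a little single-precision rounding in the individual probabilities for a smaller temporary, while still performing the sensitive accumulation in double precision.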

@mlazzarin (Contributor, Author) commented Dec 23, 2021
I'm not sure; we may still need the first cast, because we need to cast self.tensor from cupy to numpy in the fallback case, but we could add an additional cast before the sum. I'll give it a try.

@mlazzarin (Contributor, Author) commented Dec 23, 2021
I tried a similar approach and it works; take a look at the latest commit.

@stavros11 (Member)

Yes, that's exactly what I meant. Given that K.cast is properly implemented and does not create any copies when the same type is passed, it should now work correctly, right? The sum issue is fixed since we convert to float64 beforehand, while avoiding the complex128 conversion.
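The copy-avoidance behaviour described here can be illustrated with a minimal stand-in for K.cast (a hypothetical helper, not the actual qibo implementation), relying on np.asarray returning its input unchanged when the dtype already matches:

```python
import numpy as np

def cast(x, dtype="float64"):
    # Hypothetical sketch: np.asarray returns the input array itself
    # when it already has the requested dtype, so no copy is made;
    # otherwise a new array with the converted dtype is allocated.
    return np.asarray(x, dtype=dtype)

a = np.ones(4, dtype=np.float64)
same = cast(a)                   # dtype matches: the same object comes back
converted = cast(a, "float32")   # dtype differs: a new array is allocated
```

A cast with this behaviour means large state vectors are never duplicated when no conversion is actually needed, which is the property the comment above relies on.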

@mlazzarin (Contributor, Author)

Yes, it should work fine.

@scarrazza (Member)

Can we merge this?

@mlazzarin (Contributor, Author)

Yes, I believe so.

@scarrazza scarrazza merged commit 5ebf0d1 into master Dec 27, 2021
@scarrazza scarrazza deleted the fixprobs branch January 20, 2022 14:47
Labels: bug (Something isn't working)
Projects: none yet
Development: successfully merging this pull request may close the issue "Probabilities do not sum to 1".
3 participants