
fix: don't call batch() if batch == False #417

Open

wants to merge 4 commits into base: main

Conversation

MikeWalrus

Description

Currently, creating a CKKSTensor from a PyTorch scalar throws an exception even when batch=False.

import torch
import tenseal as ts

ctx = ts.context(
    ts.SCHEME_TYPE.CKKS, 8192, coeff_mod_bit_sizes=[60, 40, 40, 60]
)
ctx.global_scale = 2**10

ts.CKKSTensor(ctx, torch.tensor(3), batch=False)

Traceback (most recent call last):
  File "a.py", line 9, in <module>
    ts.CKKSTensor(ctx, torch.tensor(3), batch=False)
  File "...python3.10/site-packages/tenseal/tensors/ckkstensor.py", line 44, in __init__
    self.data = ts._ts_cpp.CKKSTensor(context.data, tensor.data, batch)
ValueError: invalid dimension for batching

This PR fixes it by not calling TensorStorage::batch when batch == false.
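The guard can be sketched in pure Python (hypothetical names; the actual fix lives in the C++ TensorStorage construction path, not in this function):

```python
def init_tensor(shape: tuple, batch: bool) -> dict:
    """Hypothetical stand-in for the CKKSTensor construction path."""
    if batch:
        # Batching consumes the first dimension, so a 0-dim (scalar)
        # input cannot be batched.
        if len(shape) == 0:
            raise ValueError("invalid dimension for batching")
        return {"shape": shape[1:], "batch_size": shape[0]}
    # The fix: skip the batching path entirely when batch is False,
    # so a scalar input no longer raises.
    return {"shape": shape, "batch_size": None}

print(init_tensor((), batch=False))     # scalar, no batching
print(init_tensor((4, 2), batch=True))  # batch of 4 vectors of length 2
```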

Affected Dependencies

None.

How has this been tested?

  • Describe the tests that you ran to verify your changes.
  • Provide instructions so we can reproduce.
  • List any relevant details for your test configuration.

Checklist

@bcebere
Member

bcebere commented Nov 7, 2022

Thank you for the contribution!

Could you please add a test that failed before and passes with your fix? Thanks!

@MikeWalrus
Author

I haven't had the time to look at the testing code yet. I'll try to add the test someday.

@bcebere
Member

bcebere commented Jan 15, 2023

> Haven't got the time to look at the testing code yet. I'll try to add the test someday.

Thanks! A Python test is enough to cover the fix and to justify it in the future.

@jiejiejie5335
Hello, I wonder whether there is any other plain_modulus that can be used with BFV? I find that only 1032191 and 786433 work; otherwise I get the error: ValueError: encryption parameters are not valid for batching
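For context on that error: SEAL's batching requires plain_modulus to be a prime congruent to 1 modulo 2 * poly_modulus_degree, which is why only certain values work. A minimal check (the helper names here are ours, not a SEAL or TenSEAL API):

```python
def is_prime(n: int) -> bool:
    # Simple trial division; fine for moduli of this size.
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False
        i += 2
    return True

def is_batching_prime(plain_modulus: int, poly_modulus_degree: int) -> bool:
    # Batching needs plain_modulus prime and ≡ 1 (mod 2 * poly_modulus_degree).
    return (
        is_prime(plain_modulus)
        and plain_modulus % (2 * poly_modulus_degree) == 1
    )

# 786433 = 3 * 2**18 + 1, so it satisfies the congruence for
# poly_modulus_degree 4096 (2 * 4096 = 8192 divides 786432).
print(is_batching_prime(786433, 4096))  # True
```

SEAL also ships a helper that picks such primes for you (PlainModulus::Batching in C++), which is usually easier than hunting for values by hand.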
