Problem Statement:
gsutil lacks a feature for in-place compression of objects within the same bucket.
For example, if you have a bucket named 'test-bucket' and want to compress a large amount of data in that same bucket, you first have to download or transfer the data out of the bucket to somewhere else in GCP, such as a VM instance's file system, or to a local machine. You then have to run gsutil again to upload the same data back to the bucket, applying the '-z/-Z' argument for compression.
In the worst case, you have to build a pipeline for bulk compression, for example with the Dataflow service in GCP or a custom program.
It is possible that such functionality already exists and I am simply not aware of it, but if it does not, I would like to work on creating this feature based on suggestions.
Thanks!
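To illustrate, the two-step workaround described above might look roughly like the following sketch. The bucket name, staging directory, and extension list are placeholders; the script only prints the commands (a dry run) rather than executing them, since actually running them requires gsutil credentials and a real bucket:

```shell
#!/bin/sh
# Placeholders for this sketch -- substitute your own values.
BUCKET="gs://test-bucket"   # hypothetical bucket name
STAGING="./staging"         # local or VM staging directory

# Step 1: copy the objects out of the bucket to local storage.
DOWNLOAD="gsutil -m cp -r ${BUCKET}/* ${STAGING}/"

# Step 2: upload them back with -z, so gsutil gzip-compresses files with the
# listed extensions and sets Content-Encoding: gzip on the resulting objects.
UPLOAD="gsutil -m cp -z txt,csv,json ${STAGING}/* ${BUCKET}/"

# Dry run: print the commands instead of executing them.
echo "$DOWNLOAD"
echo "$UPLOAD"
```

Note that '-z' takes a comma-separated list of extensions to compress, while '-Z' compresses all files regardless of extension; either way the data makes a full round trip out of and back into the bucket.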
Requesting a new feature!