
Fix LZ4 decompression issue causing OutOfMemory errors #118


Open
wants to merge 1 commit into master

Conversation

ratatouille100

Fixes #94

This PR addresses the OutOfMemory error that some users, myself included, encountered when attempting to decompress large LZ4 images (e.g., super.img.lz4).

The problem was caused by loading the entire LZ4 file into memory at once during decompression. This approach worked for smaller files but failed with large archive files, resulting in OutOfMemory exceptions.

The solution implements streaming decompression by:

  • Reading and writing the file in chunks (8MB at a time)
  • Using a while loop with chunked reading to process the file gradually
  • Keeping only small portions of the file in memory

These changes significantly reduce memory usage when handling large LZ4 files, allowing the decompression to complete successfully without running out of memory.
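For reference, here is a minimal sketch of what this chunked approach looks like. It assumes the python-lz4 package (`lz4.frame`); the function name `decompress_lz4` and the 8 MB chunk size follow the description above rather than the project's actual code.

```python
# Sketch of streaming LZ4 decompression, assuming the python-lz4 package.
# Only one chunk (at most CHUNK_SIZE bytes of decompressed data) is held
# in memory at a time, instead of the whole file.
import lz4.frame

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB per read


def decompress_lz4(src_path: str, dst_path: str) -> None:
    """Stream-decompress src_path (e.g. super.img.lz4) into dst_path."""
    with lz4.frame.open(src_path, mode="rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(CHUNK_SIZE)  # decompress up to CHUNK_SIZE bytes
            if not chunk:                 # empty result means end of stream
                break
            dst.write(chunk)
```

With this pattern, peak memory use is bounded by the chunk size rather than by the size of the archive, which is why large images no longer trigger OutOfMemory errors.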

Development

Successfully merging this pull request may close these issues.

possible memory leak trying to unpack LZ4
1 participant