Mirror of https://github.com/webmproject/libwebp.git (synced 2025-08-29 23:32:05 +02:00)
Reasoning: Analysis showed that `bit_depths` is passed from `VP8LCreateHuffmanTree` (as `huff_code->code_lengths`) to `GenerateOptimalTree` (as `bit_depths`, with size `histogram_size` = `huff_code->num_symbols`) and then to `SetBitDepths`. The `HuffmanTreeCode` struct stores the `code_lengths` and `codes` pointers, both sized by `num_symbols`. These arrays are allocated in `GetHuffBitLengthsAndCodes` (called by `EncodeImageInternal`) based on `num_symbols`.

The fix involves:
- Annotating `HuffmanTreeCode::code_lengths` and `HuffmanTreeCode::codes` with `__counted_by(num_symbols)` in `src/utils/huffman_encode_utils.h`.
- Annotating the `bit_depths` parameter of `GenerateOptimalTree` with `__counted_by(histogram_size)` in `src/utils/huffman_encode_utils.c`.
- Annotating the `bit_depths` parameter of `SetBitDepths` with `__indexable` in `src/utils/huffman_encode_utils.c`, since the size parameter (`histogram_size`) is not directly available there, but indexing is known to be safe from the caller's logic (the indices `tree->value` lie within `[0, histogram_size - 1]`).

Bug: 432511821
Change-Id: Icfd32f15d0744983b5912d527e5bc59ac58343a5
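The annotations described above can be sketched as follows. This is a hypothetical simplification, not the exact libwebp code: the struct layout follows `huffman_encode_utils.h`, but the function parameter lists are abbreviated (e.g. the `HuffmanTree*` argument is elided), and no-op fallback macros are provided so the sketch also compiles with toolchains that lack Clang's `-fbounds-safety` attributes.

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

/* __counted_by / __indexable are Clang -fbounds-safety annotations.
 * Define no-op fallbacks so this sketch compiles everywhere; with
 * -fbounds-safety enabled, out-of-bounds accesses trap at runtime. */
#ifndef __counted_by
#define __counted_by(n)
#endif
#ifndef __indexable
#define __indexable
#endif

/* Simplified HuffmanTreeCode: both arrays are allocated with
 * num_symbols entries (in GetHuffBitLengthsAndCodes), so the count
 * field carries the bound for the two pointers. */
typedef struct {
  int num_symbols;                                  /* length of both arrays */
  uint8_t* __counted_by(num_symbols) code_lengths;  /* bit depth per symbol  */
  uint16_t* __counted_by(num_symbols) codes;        /* code per symbol       */
} HuffmanTreeCode;

/* Abbreviated prototypes mirroring the described fix (real signatures
 * take more arguments). bit_depths is bounded by histogram_size in
 * GenerateOptimalTree; SetBitDepths lacks a size parameter, so it only
 * promises valid indexing via __indexable. */
void GenerateOptimalTree(const uint32_t* histogram, int histogram_size,
                         int tree_depth_limit,
                         uint8_t* __counted_by(histogram_size) bit_depths);
void SetBitDepths(int level, uint8_t* __indexable bit_depths);
```

With `__counted_by`, the compiler can check every `code_lengths[i]` and `codes[i]` access against `num_symbols`; `__indexable` is the weaker contract used when the bound is not in scope but the caller guarantees in-range indices.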