webp-lossless-bitstream-spec: fix max_symbol definition
If ReadBits(1) == 0, the value of max_symbol is set to the alphabet size
for each symbol type. See vp8l_dec.c, ReadHuffmanCode(), which passes
alphabet_size to ReadHuffmanCodeLengths() as num_symbols, and
ReadHuffmanCodeLengths() then sets max_symbol to that.

Bug: webp:611
Change-Id: I662bd1d7f372e7f2e9c71cc86f87aefd02f36647
commit 71916726b6 (parent eac3bd5c53)
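The decoder flow described in the commit message is roughly the following. This is a sketch only; the real functions in src/dec/vp8l_dec.c carry more state, so the signatures and the ReadBits() helper below are simplified assumptions for illustration.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
extern int ReadBits(int num_bits);   // bitstream reader (assumed helper)

// Sketch of ReadHuffmanCodeLengths(): num_symbols is the alphabet size
// forwarded by ReadHuffmanCode(); it becomes the default max_symbol.
static int ReadHuffmanCodeLengths(int num_symbols) {
  int max_symbol;
  if (ReadBits(1)) {                       // explicit max_symbol follows
    const int length_nbits = 2 + 2 * ReadBits(3);
    max_symbol = 2 + ReadBits(length_nbits);
  } else {
    max_symbol = num_symbols;              // the default this commit documents
  }
  // ... read up to max_symbol code lengths here ...
  return max_symbol;
}

static int ReadHuffmanCode(int alphabet_size) {
  // alphabet_size is passed through as num_symbols.
  return ReadHuffmanCodeLengths(alphabet_size);
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~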
@@ -925,8 +925,15 @@ for (i = 0; i < num_code_lengths; ++i) {
 }
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-Next, if `ReadBits(1) == 0`, the maximum number of different read symbols is
-`num_code_lengths`. Otherwise, it is defined as:
+Next, if `ReadBits(1) == 0`, the maximum number of different read symbols
+(`max_symbol`) for each symbol type (A, R, G, B, and distance) is set to its
+alphabet size:
+
+* G channel: 256 + 24 + `color_cache_size`
+* Other literals (A, R, and B): 256
+* Distance code: 40
+
+Otherwise, it is defined as:
 
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 int length_nbits = 2 + 2 * ReadBits(3);
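Taken together, the revised text amounts to C-like pseudocode along these lines. This is a sketch in the style the spec already uses; `alphabet_size` here stands for the per-symbol-type sizes listed in the added bullet list, and the second branch comes from the existing spec lines shown as context above.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// alphabet_size is 256 + 24 + color_cache_size for the G channel,
// 256 for the other literals (A, R, and B), and 40 for the distance code.
int max_symbol;
if (ReadBits(1) == 0) {
  max_symbol = alphabet_size;
} else {
  int length_nbits = 2 + 2 * ReadBits(3);
  max_symbol = 2 + ReadBits(length_nbits);
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~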
@@ -948,11 +955,7 @@ to `max_symbol` code lengths.
 `11 + ReadBits(7)` times.
 
 Once code lengths are read, a prefix code for each symbol type (A, R, G, B, and
-distance) is formed using their respective alphabet sizes:
-
-* G channel: 256 + 24 + `color_cache_size`
-* Other literals (A, R, and B): 256
-* Distance code: 40
+distance) is formed using their respective alphabet sizes.
 
 The Normal Code Length Code must code a full decision tree, that is, the sum of
 `2 ^ (-length)` for all non-zero codes must be exactly one. There is however
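As a side note on the last context lines: the completeness condition (the sum of `2 ^ (-length)` over all non-zero code lengths must equal exactly one) can be verified in integer arithmetic by scaling by the maximum code length. The helper below is illustrative only; its name and the assumed 15-bit code-length limit are not part of the spec text quoted here.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// Illustrative check (not from the spec): returns 1 if the non-zero code
// lengths form a full decision tree, i.e. sum(2^(-length)) == 1.
// Scaled by 2^15, assuming a 15-bit maximum code length.
static int IsCompleteCode(const int code_lengths[], int num_symbols) {
  const int kMaxLength = 15;
  int total = 0;
  int i;
  for (i = 0; i < num_symbols; ++i) {
    if (code_lengths[i] != 0) {
      total += 1 << (kMaxLength - code_lengths[i]);
    }
  }
  return total == (1 << kMaxLength);
}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~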