fix bug in RefineUsingDistortion()

When try_both_modes=0 (that is: -m 0 or -m 1) and the mode is i4,
we were still sometimes falling back to the (unexplored, uninitialized)
i16 mode, which resulted in an enc/dec mismatch.
This mainly occurred for large images (when bit_limit is low enough).
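
For context, here is a condensed sketch of the early-out inside the
Intra4 refinement loop (names follow RefineUsingDistortion(), but the
surrounding loop is elided and simplified). With try_both_modes=0 the
Intra16 pass above never ran, so reverting here selects an i16 mode
that was never scored:

  if (score_i4 >= best_score || i4_bit_sum >= bit_limit) {
    is_i16 = 1;   // fall back to Intra16: unexplored when try_both_modes=0
    break;
  }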

We disable the fall-back by neutralizing bit_limit with a large MAX_COST
threshold, so the early-out can never trigger.
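
As a minimal self-contained illustration (the score_t typedef and the
MAX_COST value are assumptions mirroring cost.h, and the helper below is
hypothetical, for demonstration only):

  #include <stdint.h>

  typedef int64_t score_t;                        /* assumed, as in cost.h */
  #define MAX_COST ((score_t)0x7fffffffffffffLL)  /* assumed sentinel value */

  /* Hypothetical helper modeling the bit-budget part of the early-out. */
  static int I4EarlyOut(score_t i4_bit_sum, score_t bit_limit) {
    return i4_bit_sum >= bit_limit;
  }

With try_both_modes=0, bit_limit becomes MAX_COST, so I4EarlyOut() returns
0 for any realistic accumulated bit cost and the fall-back to the
unexplored i16 mode can no longer trigger.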

Change-Id: I0c60257595812bd813b239ff4c86703ddf63cbf8
(cherry picked from commit 0a3838ca77)
Commit:    58410cd6dc
Parent:    e168af8c6c
Author:    Pascal Massimino
Date:      2016-11-12 02:15:28 -08:00
Committer: James Zern


@@ -1155,7 +1155,8 @@ static void RefineUsingDistortion(VP8EncIterator* const it,
   const int lambda_d_uv = 120;
   score_t score_i4 = dqm->i4_penalty_;
   score_t i4_bit_sum = 0;
-  const score_t bit_limit = it->enc_->mb_header_limit_;
+  const score_t bit_limit = try_both_modes ? it->enc_->mb_header_limit_
+                                           : MAX_COST;  // no early-out allowed
   if (is_i16) {  // First, evaluate Intra16 distortion
     int best_mode = -1;