Commit aabbee2

Remove useless validity check when converting UTF-16LE -> wchar
The check ensures that the decoded codepoint is between 0x10000 and 0x10FFFF, the range of codepoints which can be encoded as a UTF-16 surrogate pair. However, just looking at the code, it's obvious that this is always true. First, 0x10000 is added to the decoded codepoint on the previous line, so it cannot be less than 0x10000. Further, even if the 20 data bits already decoded were 0xFFFFF (all ones), adding 0x10000 gives 0x10FFFF, the very top of the valid range. So the decoded codepoint cannot be more than 0x10FFFF either. The check can never fail.
1 parent f474e55 commit aabbee2

1 file changed: 1 addition, 5 deletions
ext/mbstring/libmbfl/filters/mbfilter_utf16.c

Lines changed: 1 addition & 5 deletions
@@ -290,11 +290,7 @@ int mbfl_filt_conv_utf16le_wchar(int c, mbfl_convert_filter *filter)
 	case 3:
 		filter->status = 0;
 		int n = filter->cache + ((c & 0x3) << 8) + 0x10000;
-		if (n >= MBFL_WCSPLANE_SUPMIN && n < MBFL_WCSPLANE_SUPMAX) {
-			CK((*filter->output_function)(n, filter->data));
-		} else { /* illegal character */
-			CK((*filter->output_function)((n & MBFL_WCSGROUP_MASK) | MBFL_WCSGROUP_THROUGH, filter->data));
-		}
+		CK((*filter->output_function)(n, filter->data));
 		break;
 	}
