
Commit c45ab05

[3.14] gh-140576: Fixed crash produced by lexer in case of dedented zero byte (GH-140583) (#140757)
gh-140576: Fixed crash produced by lexer in case of dedented zero byte (GH-140583)
(cherry picked from commit 8706167)
Co-authored-by: Mikhail Efimov <[email protected]>
1 parent e0f54a0 commit c45ab05

File tree

3 files changed (+6, −0 lines)


Lib/test/test_tokenize.py

Lines changed: 1 addition & 0 deletions
@@ -3183,6 +3183,7 @@ def get_tokens(string):
 f'__{
     x:d
 }__'""",
+            " a\n\x00",
         ]:
             with self.subTest(case=case):
                 self.assertRaises(tokenize.TokenError, get_tokens, case)
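
The get_tokens helper named in the hunk header is defined earlier in the test file and is not part of this hunk. As a rough, self-contained sketch of what the new case exercises, the snippet below assumes get_tokens simply drains tokenize.generate_tokens over a StringIO readline; the point is that an indented name followed by a dedented zero byte now surfaces as tokenize.TokenError instead of crashing the interpreter.

import io
import tokenize

def get_tokens(string):
    # Assumed stand-in for the helper defined earlier in test_tokenize.py:
    # drain the generator so any tokenizer error is actually raised.
    return list(tokenize.generate_tokens(io.StringIO(string).readline))

case = " a\n\x00"   # indented name, then a dedented zero byte
try:
    get_tokens(case)
except tokenize.TokenError as exc:
    print("rejected as expected:", exc)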
Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+Fixed crash in :func:`tokenize.generate_tokens` in case of
+specific incorrect input. Patch by Mikhail Efimov.

Parser/lexer/lexer.c

Lines changed: 3 additions & 0 deletions
@@ -539,6 +539,9 @@ tok_get_normal_mode(struct tok_state *tok, tokenizer_mode* current_tok, struct t
                     return MAKE_TOKEN(ERRORTOKEN);
                 }
             }
+            else if (c == EOF && PyErr_Occurred()) {
+                return MAKE_TOKEN(ERRORTOKEN);
+            }
             else {
                 break;
             }
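
The added branch covers tok_nextc() reporting EOF while a Python exception is already set, which is what the stray zero byte after the dedent produces; instead of falling through to the plain break (and letting tokenization continue with that exception still pending, presumably the source of the crash), the lexer now returns an error token immediately. A rough illustration of the user-visible effect through the public tokenize module, assuming generate_tokens is driven by a readline callable as in the test above:

import io
import tokenize

def token_names(src):
    # Hypothetical driver: run tokenize.generate_tokens over a readline
    # callable and collect the token type names.
    return [tokenize.tok_name[tok.type]
            for tok in tokenize.generate_tokens(io.StringIO(src).readline)]

# Ordinary indentation still round-trips through INDENT/DEDENT as before.
print(token_names("if x:\n    a\n"))

# The dedented zero byte is now reported as an error rather than crashing.
try:
    token_names(" a\n\x00")
except tokenize.TokenError as exc:
    print("TokenError:", exc)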

0 commit comments
