1 parent 638702f commit 2ae9ba6
pythainlp/tokenize/longest.py
@@ -1,7 +1,6 @@
 # -*- coding: utf-8 -*-
 # SPDX-FileCopyrightText: 2016-2025 PyThaiNLP Project
 # SPDX-FileType: SOURCE
-# SPDX-FileType: SOURCE
 # SPDX-License-Identifier: Apache-2.0
 """
 Dictionary-based longest-matching Thai word segmentation. Implementation is based
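As context for the file touched by this commit, here is a minimal usage sketch (not part of the diff itself) showing how the longest-matching segmenter in pythainlp/tokenize/longest.py is typically reached through PyThaiNLP's public tokenizer API; the sample Thai input string is only illustrative.

```python
# Minimal sketch: invoking the dictionary-based longest-matching
# tokenizer via PyThaiNLP's word_tokenize dispatcher.
from pythainlp.tokenize import word_tokenize

# engine="longest" selects the longest-matching segmenter implemented
# in pythainlp/tokenize/longest.py (illustrative input text).
print(word_tokenize("ทดสอบการตัดคำภาษาไทย", engine="longest"))
```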