Merged

Update #1332

6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -30,14 +30,17 @@
- 新增 HTTP 代理配置(`http_proxy`),增强在受限网络环境下的获取能力。
- 支持识别并过滤过期/无效的 EPG 数据,提高 EPG 质量。
- 支持语言切换(`language`),可选 `zh_CN` / `en`,界面与实时日志可切换语言输出。
- 新增 M3U `tvg-id` 以适配更多播放器合并频道源。

### 🐛 优化与修复

- 优化降低程序运行时的内存占用。
- 优化 CCTV 类频道别名匹配与 4K 频道识别(匹配规则改进)。
- 优化推流首播体验、转码兼容性与 Docker 推流监控。
- 优化接口冻结流程,智能管理与解冻判断。
- 更新 IP 归属库与运营商数据,提高归属地过滤准确性。
- 若干测速与过滤逻辑优化,减少误判与提升效率。
- 调整 Docker 日志实时无缓冲输出。

### ⚙️ 配置项说明(新增 / 重点变更)

@@ -97,14 +100,17 @@
- Added HTTP proxy configuration (`http_proxy`) to improve fetching in restricted network environments.
- Support identification and filtering of expired/invalid EPG data to improve EPG quality.
- Support language switching (`language`), optional `zh_CN` / `en`, enabling UI and real-time log language switching.
- Added M3U `tvg-id` to support merging channel sources in more players.

### 🐛 Optimizations & fixes

- Reduced memory usage during program runtime.
- Improved alias matching for CCTV-type channels and 4K channel recognition (matching rules refined).
- Improved first-play streaming experience, transcoding compatibility, and Docker streaming monitoring.
- Optimized the interface-freezing process with smarter management and unfreeze decisions.
- Updated IP attribution and carrier data to improve accuracy of location-based filtering.
- Several speed test and filtering logic optimizations to reduce false positives and improve efficiency.
- Adjusted Docker logs to output in real time without buffering.

### ⚙️ Configuration items (new / important changes)

1 change: 1 addition & 0 deletions Dockerfile
@@ -38,6 +38,7 @@ ENV NGINX_HTTP_PORT=8080
ENV NGINX_RTMP_PORT=1935
ENV PUBLIC_PORT=80
ENV PATH="$APP_WORKDIR/.venv/bin:/usr/local/nginx/sbin:$PATH"
ENV PYTHONUNBUFFERED=1

WORKDIR $APP_WORKDIR

5 changes: 3 additions & 2 deletions Pipfile
@@ -20,14 +20,15 @@ bs4 = "==0.0.2"
tqdm = "==4.67.1"
async-timeout = "==5.0.1"
aiohttp = "==3.13.3"
flask = "==3.1.2"
opencc-python-reimplemented = "==0.1.7"
gunicorn = "==23.0.0"
pillow = "==11.1.0"
m3u8 = "==6.0.0"
pytz = "==2025.1"
pystray = "==0.19.5"
ipip-ipdb = "==1.6.1"
urllib3 = "==2.6.3"
pillow = "==12.1.1"
flask = "==3.1.3"

[requires]
python_version = "3.13"
517 changes: 269 additions & 248 deletions Pipfile.lock

Large diffs are not rendered by default.

6 changes: 3 additions & 3 deletions config/alias.txt
@@ -5,10 +5,10 @@
# Format: Main name (corresponds to channel name in demo.txt template), alias1, alias2, alias3
# Aliases support regex matching. Aliases starting with re: are recognized as regular expressions, e\.g\. CCTV-1,re:(?i)^\s*CCTV[-\s_]*0?1(?![0-9Kk+])[\s\S]*$

CCTV-1,re:(?i)^\s*CCTV[-\s_]*0?1(?![0-9Kk+])[\s\S]*$,CCTV1,CCTV-01,CCTV-01_ITV,CCTV-01北联,CCTV-01电信,CCTV-01东联,CCTV-01高码,CCTV-01高清,CCTV-01广西,CCTV-01梅州,CCTV-01咪咕,CCTV-01汝阳,CCTV-01山东,CCTV-01上海,CCTV-01斯特,CCTV-01四川,CCTV-01太原,CCTV-01天津,CCTV-01影视,CCTV-01浙江,CCTV-01重庆,CCTV1综合,CCTV-1综合,CCTV1(B),CCTV1[1920*1080],CCTV1「IPV6」,CCTV1HD,CCTV-1HD,CCTV1-标清,CCTV1港澳牿,CCTV-1高清,CCTV1-综合,CCTV-1综合ᴴᴰ
CCTV-2,re:re:(?i)^\s*CCTV[-\s_]*0?2(?![0-9Kk+])[\s\S]*$,CCTV-02北联,CCTV-02电信,CCTV-02东联,CCTV-02高码,CCTV-02高清,CCTV-02广西,CCTV-02梅州,CCTV-02咪咕,CCTV-02汝阳,CCTV-02山东,CCTV-02上海,CCTV-02斯特,CCTV-02四川,CCTV-02太原,CCTV-02天津,CCTV-02影视,CCTV-02浙江,CCTV-02重庆,CCTV2财经,CCTV-2财经,CCTV2[1280*720],CCTV2「IPV6」,CCTV20241,CCTV22,CCTV250,CCTV2HD,CCTV-2HD,CCTV2-标清,CCTV2-财经,CCTV-2财经ᴴᴰ,CCTV-2高清
CCTV-1,re:(?i)^\s*CCTV[-\s_]*0?1(?![0-9Kk+])[\s\S]*$,CCTV1,CCTV-01,CCTV-01_ITV,CCTV-01北联,CCTV-01电信,CCTV-01东联,CCTV-01高码,CCTV-01高清,CCTV-01广西,CCTV-01梅州,CCTV-01咪咕,CCTV-01汝阳,CCTV-01山东,CCTV-01上海,CCTV-01斯特,CCTV-01四川,CCTV-01太原,CCTV-01天津,CCTV-01影视,CCTV-01浙江,CCTV-01重庆,CCTV1综合,CCTV-1综合,CCTV1(B),CCTV1[1920*1080],CCTV1「IPV6」,CCTV1HD,CCTV-1HD,CCTV1-标清,CCTV-1高清,CCTV1-综合,CCTV-1综合ᴴᴰ
CCTV-2,re:(?i)^\s*CCTV[-\s_]*0?2(?![0-9Kk+])[\s\S]*$,CCTV2,CCTV-02北联,CCTV-02电信,CCTV-02东联,CCTV-02高码,CCTV-02高清,CCTV-02广西,CCTV-02梅州,CCTV-02咪咕,CCTV-02汝阳,CCTV-02山东,CCTV-02上海,CCTV-02斯特,CCTV-02四川,CCTV-02太原,CCTV-02天津,CCTV-02影视,CCTV-02浙江,CCTV-02重庆,CCTV2财经,CCTV-2财经,CCTV2[1280*720],CCTV2「IPV6」,CCTV20241,CCTV22,CCTV250,CCTV2HD,CCTV-2HD,CCTV2-标清,CCTV2-财经,CCTV-2财经ᴴᴰ,CCTV-2高清
CCTV-3,re:(?i)^\s*CCTV[-\s_]*0?3(?![0-9Kk+])[\s\S]*$,CCTV3,CCTV-03,CCTV-03_ITV,CCTV-03北联,CCTV-03电信,CCTV-03东联,CCTV-03高码,CCTV-03高清,CCTV-03广西,CCTV-03梅州,CCTV-03咪咕,CCTV-03汝阳,CCTV-03山东,CCTV-03上海,CCTV-03斯特,CCTV-03四川,CCTV-03太原,CCTV-03天津,CCTV-03影视,CCTV-03浙江,CCTV-03重庆,CCTV3综艺,CCTV-3综艺,CCTV3[1920*1080],CCTV3「IPV6」,CCTV3HD,CCTV-3HD,CCTV3-标清,CCTV-3高清,CCTV-3-高清,CCTV3-综艺,CCTV-3综艺ᴴᴰ
CCTV-4,re:(?i)^\s*CCTV[-\s_]*0?4(?![0-9Kk+])(?!.*(?:欧洲|美洲|Europe|America|Americas))[\s\S]*$,CCTV4,CCTV-04,CCTV-04_ITV,CCTV-04北联,CCTV-04电信,CCTV-04东联,CCTV-04高码,CCTV-04高清,CCTV-04广西,CCTV-04梅州,CCTV-04咪咕,CCTV-04汝阳,CCTV-04山东,CCTV-04上海,CCTV-04斯特,CCTV-04四川,CCTV-04太原,CCTV-04天津,CCTV-04影视,CCTV-04浙江,CCTV-04重庆,CCTV4中文国际,CCTV-4 中文国际,CCTV-4 中文国际 美洲,CCTV-4 中文国际欧洲,CCTV4(美洲),CCTV4(欧洲),CCTV4[1280*720],CCTV4[1920*1080],CCTV4「IPV6」,CCTV42,CCTV450,CCTV4HD,CCTV-4HD,CCTV-4K超高渿,CCTV-4标清,CCTV-4高清,CCTV4国际,CCTV4-国际,CCTV4美洲,CCTV-4美洲,CCTV4欧洲,CCTV-4欧洲,CCTV4-中文国际,CCTV-4中文国际,CCTV-4中文国际ᴴᴰ,CCTV4中文国际美洲,CCTV-4中文国际美洲,CCTV4中文国际欧洲,CCTV-4中文国际欧洲
CCTV-4,re:(?i)^\s*CCTV[-\s_]*0?4(?![0-9Kk+])(?!.*(?:欧洲|美洲|Europe|America|Americas))[\s\S]*$,CCTV4,CCTV-04,CCTV-04_ITV,CCTV-04北联,CCTV-04电信,CCTV-04东联,CCTV-04高码,CCTV-04高清,CCTV-04广西,CCTV-04梅州,CCTV-04咪咕,CCTV-04汝阳,CCTV-04山东,CCTV-04上海,CCTV-04斯特,CCTV-04四川,CCTV-04太原,CCTV-04天津,CCTV-04影视,CCTV-04浙江,CCTV-04重庆,CCTV4[1280*720],CCTV4[1920*1080],CCTV4「IPV6」,CCTV42,CCTV450,CCTV4HD,CCTV-4HD,CCTV-4标清,CCTV-4高清
CCTV-5,re:(?i)^\s*CCTV[-\s_]*0?5(?![0-9Kk+])[\s\S]*$,CCTV5,CCTV-05,CCTV-05_ITV,CCTV-05北联,CCTV-05电信,CCTV-05东联,CCTV-05高码,CCTV-05高清,CCTV-05广西,CCTV-05梅州,CCTV-05咪咕,CCTV-05汝阳,CCTV-05山东,CCTV-05上海,CCTV-05斯特,CCTV-05四川,CCTV-05太原,CCTV-05天津,CCTV-05影视,CCTV-05浙江,CCTV-05重庆,CCTV5体育,CCTV-5 体育,CCTV-5体育(高码率),CCTV5[1920*1080],CCTV5「IPV6」,CCTV5HD,CCTV-5HD,CCTV5-标清,CCTV-5高清,CCTV-5-高清,CCTV-5高清测试,CCTV5-体育,CCTV-5体育,CCTV-5体育ᴴᴰ
CCTV-5+,re:(?i)^\s*CCTV[-\s_]*0?5\s*(?:\+|+)[\s\S]*$,CCTV5+,CCTV5+,CCTV5+ 体育赛事,CCTV-5+体育赛事,CCTV5+[1920*1080],CCTV-5+_ITV,CCTV5+「IPV6」,CCTV5+HD,CCTV-5+HD,CCTV-5+北联,CCTV-5+电信,CCTV-5+高码,CCTV-5+高清,CCTV-5+广西,CCTV-5+梅州,CCTV-5+咪咕,CCTV-5+汝阳,CCTV-5+四川,CCTV-5+太原,CCTV5+体育赛事,CCTV5+-体育赛事,CCTV-5+体育赛事ᴴᴰ,CCTV-5+天津,CCTV-5+影视,CCTV-5+浙江,CCTV-5+重庆,CCTV5+斯特,CCTV5+体育
CCTV-6,re:(?i)^\s*CCTV[-\s_]*0?6(?![0-9Kk+])[\s\S]*$,CCTV6,CCTV-06,CCTV-06_ITV,CCTV-06北联,CCTV-06电信,CCTV-06东联,CCTV-06高码,CCTV-06高清,CCTV-06广西,CCTV-06梅州,CCTV-06咪咕,CCTV-06汝阳,CCTV-06山东,CCTV-06上海,CCTV-06斯特,CCTV-06四川,CCTV-06太原,CCTV-06天津,CCTV-06影视,CCTV-06浙江,CCTV-06重庆,CCTV6电影,CCTV-6电影,CCTV6[1920*1080],CCTV6「IPV6」,CCTV650,CCTV6HD,CCTV-6HD,CCTV6-标清,CCTV6-电影,CCTV-6电影ᴴᴰ,CCTV-6高清,CCTV-6-高清,CCTV-6高清测试
4 changes: 2 additions & 2 deletions entrypoint.sh
@@ -22,6 +22,6 @@ sed -e "s/\${APP_PORT}/${APP_PORT}/g" \

nginx -g 'daemon off;' &

python $APP_WORKDIR/main.py &
python -u $APP_WORKDIR/main.py &

python -m gunicorn service.app:app -b 127.0.0.1:$APP_PORT --timeout=1000
exec python -m gunicorn service.app:app -b 127.0.0.1:$APP_PORT --timeout=1000
1 change: 1 addition & 0 deletions main.py
@@ -310,6 +310,7 @@ async def main(self):
clear_cache()
await self._run_speed_test()
else:
self.aggregator.test_results = self.channel_data
self.aggregator.is_last = True
await self.aggregator.flush_once(force=True)
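Reviewer note: when the speed test is skipped, this branch seeds the aggregator directly from the already-parsed channel data, marks it as the final batch, and forces a single flush. A minimal sketch of that flow, with `Aggregator` as an illustrative synchronous stub (the real one is async and debounced):

```python
from collections import defaultdict

class Aggregator:
    """Illustrative stub of the real aggregator's flush interface."""
    def __init__(self):
        self.test_results = defaultdict(lambda: defaultdict(list))
        self.is_last = False
        self.flush_count = 0

    def flush_once(self, force=False):
        if force or self.is_last:
            self.flush_count += 1

channel_data = {"央视频道": {"CCTV-1": [{"url": "http://example/stream"}]}}

agg = Aggregator()
agg.test_results = channel_data  # skip-speed-test branch: reuse parsed data
agg.is_last = True               # mark this as the final batch
agg.flush_once(force=True)       # write results exactly once
print(agg.flush_count)  # → 1
```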

28 changes: 19 additions & 9 deletions utils/aggregator.py
@@ -2,7 +2,7 @@
import copy
from collections import defaultdict
from logging import INFO
from typing import Any, Dict, Optional, Set, Tuple
from typing import Any, Dict, Optional, Set, Tuple, Callable, cast

import utils.constants as constants
from utils.channel import sort_channel_result, generate_channel_statistic, write_channel_to_file, retain_origin
@@ -62,7 +62,8 @@ def _ensure_debounce_task_in_loop(self, loop: asyncio.AbstractEventLoop) -> None
self._debounce_task = loop.create_task(self._debounce_loop())
except Exception:
try:
loop.call_soon_threadsafe(self._create_debounce_task_threadsafe)
cast(Any, loop).call_soon_threadsafe(
cast(Callable[[], None], self._create_debounce_task_threadsafe), *())
except Exception:
pass

@@ -73,13 +74,12 @@ def _create_debounce_task_threadsafe(self) -> None:
"""
self._debounce_task = asyncio.create_task(self._debounce_loop())

def add_item(self, cate: str, name: str, item: dict, is_channel_last: bool = False, is_last: bool = False):
def add_item(self, cate: str, name: str, item: dict, is_channel_last: bool = False, is_last: bool = False,
is_valid: bool = True):
"""
Add a test result item for a specific category and name.
"""
self.test_results[cate][name].append(item)
self._dirty = True
self._dirty_count += 1
self.is_last = is_last
self._pending_channels.add((cate, name))

@@ -92,20 +92,22 @@ def add_item(self, cate: str, name: str, item: dict, is_channel_last: bool = Fal
except Exception:
pass

if self.realtime_write:
if is_valid and self.realtime_write:
try:
self._dirty = True
self._dirty_count += 1
loop = asyncio.get_running_loop()
self._ensure_debounce_task_in_loop(loop)
if self._dirty_count >= self._min_items_before_flush:
self._dirty_count = 0
loop.call_soon(self._flush_event.set)
cast(Any, loop).call_soon(cast(Callable[[], None], self._flush_event.set), *())
except RuntimeError:
try:
loop = asyncio.get_event_loop()
self._ensure_debounce_task_in_loop(loop)
if self._dirty_count >= self._min_items_before_flush:
self._dirty_count = 0
loop.call_soon_threadsafe(self._flush_event.set)
cast(Any, loop).call_soon_threadsafe(cast(Callable[[], None], self._flush_event.set), *())
except Exception:
pass
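Reviewer note: the reworked `add_item` only marks state dirty and schedules a flush for valid results, waking a debounce task once enough items accumulate. A minimal sketch of that event-plus-threshold pattern; the class and names are illustrative stand-ins, not the real `Aggregator` API:

```python
import asyncio

class DebouncedFlusher:
    """Illustrative stand-in for the aggregator's debounce machinery."""

    def __init__(self, min_items: int = 3):
        self._flush_event = asyncio.Event()
        self._dirty_count = 0
        self._min_items = min_items
        self.flushes = 0

    async def _debounce_loop(self):
        # Sleep until woken, then perform one flush per wake-up.
        while True:
            await self._flush_event.wait()
            self._flush_event.clear()
            self.flushes += 1

    def add_item(self):
        # Mark dirty; only wake the flusher once the threshold is reached.
        self._dirty_count += 1
        if self._dirty_count >= self._min_items:
            self._dirty_count = 0
            asyncio.get_running_loop().call_soon(self._flush_event.set)

async def demo() -> int:
    flusher = DebouncedFlusher(min_items=3)
    task = asyncio.create_task(flusher._debounce_loop())
    for _ in range(3):
        flusher.add_item()
    await asyncio.sleep(0.05)  # give the debounce loop a chance to run
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    return flusher.flushes

flushes = asyncio.run(demo())
print(flushes)  # → 1
```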

@@ -223,14 +225,22 @@ async def flush_once(self, force: bool = False) -> None:
async with self._lock:
if not self._dirty and not force:
return
test_copy = copy.deepcopy(self.test_results)

pending = set(self._pending_channels)
self._pending_channels.clear()

if force:
test_copy = copy.deepcopy(self.test_results)
finished_for_flush = set(self._finished_channels)
self._finished_channels.clear()
else:
test_copy = defaultdict(lambda: defaultdict(list))
for cate, name in pending:
items = self.test_results.get(cate, {}).get(name, [])
copied_items = [it.copy() if isinstance(it, dict) else it for it in items]
if copied_items:
test_copy[cate][name] = copied_items

finished_for_flush = set(self._finished_channels & pending)
self._finished_channels.difference_update(finished_for_flush)

101 changes: 73 additions & 28 deletions utils/channel.py
@@ -7,7 +7,9 @@
import re
import tempfile
from collections import defaultdict
from itertools import chain
from logging import INFO
from typing import cast

import utils.constants as constants
from utils.alias import Alias
@@ -72,7 +74,7 @@ def format_channel_data(url: str, origin: OriginType) -> ChannelData:
"id": hash(url),
"url": url,
"host": get_url_host(url),
"origin": origin,
"origin": cast(OriginType, origin),
"ipv_type": None,
"extra_info": info
}
@@ -315,13 +317,13 @@ def append_data_to_info_data(
init_info_data(info_data, category, name)

channel_list = info_data[category][name]
existing_urls = {info["url"] for info in channel_list if "url" in info}
existing_map = {info["url"]: idx for idx, info in enumerate(channel_list) if "url" in info}

for item in data:
try:
channel_id = item.get("id") or hash(item["url"])
url = item["url"]
host = item.get("host") or get_url_host(url)
raw_url = item.get("url")
host = item.get("host") or (get_url_host(raw_url) if raw_url else None)
date = item.get("date")
delay = item.get("delay")
speed = item.get("speed")
@@ -334,20 +336,50 @@ def append_data_to_info_data(
catchup = item.get("catchup")
extra_info = item.get("extra_info", "")

if not url or url in existing_urls:
if not raw_url:
continue

if url_origin != "whitelist" and whitelist_maps and is_url_whitelisted(whitelist_maps, url, name):
url_origin = "whitelist"
normalized_url = raw_url
if url_origin not in retain_origin:
normalized_url = get_channel_url(raw_url)
if not normalized_url:
continue
if is_url_frozen(normalized_url):
continue
if blacklist and check_url_by_keywords(normalized_url, blacklist):
continue

if not url_origin:
continue
if url_origin != "whitelist" and whitelist_maps and is_url_whitelisted(whitelist_maps, normalized_url,
name):
url_origin = "whitelist"

if url_origin not in retain_origin:
url = get_channel_url(url)
if not url or is_url_frozen(url) or blacklist and check_url_by_keywords(url, blacklist):
if normalized_url in existing_map:
existing_idx = existing_map[normalized_url]
existing_origin = channel_list[existing_idx].get("origin")
if existing_origin != "whitelist" and url_origin == "whitelist":
channel_list[existing_idx] = {
"id": channel_id,
"url": normalized_url,
"host": host or get_url_host(normalized_url),
"date": date,
"delay": delay,
"speed": speed,
"resolution": resolution,
"origin": url_origin,
"ipv_type": ipv_type,
"location": location,
"isp": isp,
"headers": headers,
"catchup": catchup,
"extra_info": extra_info
}
continue
else:
continue

url = normalized_url

if url_origin not in retain_origin:
if not ipv_type:
if ipv_type_data and host in ipv_type_data:
ipv_type = ipv_type_data[host]
@@ -369,10 +401,11 @@ def append_data_to_info_data(

if isp and isp_list and not any(item in isp for item in isp_list):
continue

channel_list.append({
"id": channel_id,
"url": url,
"host": host,
"host": host or get_url_host(url),
"date": date,
"delay": delay,
"speed": speed,
Expand All @@ -385,7 +418,7 @@ def append_data_to_info_data(
"catchup": catchup,
"extra_info": extra_info
})
existing_urls.add(url)
existing_map[url] = len(channel_list) - 1

except Exception as e:
print(t("msg.error_append_channel_data").format(info=e))
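Reviewer note: replacing the old `existing_urls` set with a url-to-index map is what enables the in-place upgrade above — a duplicate URL that arrives later with `whitelist` origin overwrites the earlier non-whitelist entry instead of being silently dropped. The core pattern, with illustrative URLs and origins:

```python
channel_list = []
existing_map = {}  # url -> index into channel_list

def add(url: str, origin: str) -> None:
    if url in existing_map:
        idx = existing_map[url]
        # Upgrade in place: whitelist beats any earlier non-whitelist entry.
        if channel_list[idx]["origin"] != "whitelist" and origin == "whitelist":
            channel_list[idx] = {"url": url, "origin": origin}
        return
    channel_list.append({"url": url, "origin": origin})
    existing_map[url] = len(channel_list) - 1

add("http://a/stream", "subscribe")
add("http://a/stream", "whitelist")  # duplicate URL upgrades the entry
add("http://b/stream", "subscribe")
print([c["origin"] for c in channel_list])  # → ['whitelist', 'subscribe']
```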
@@ -583,7 +616,8 @@ def _on_task_done(task):
else:
mark_url_good(merged.get("url"))

if is_valid_speed_result(merged):
is_valid = is_valid_speed_result(merged)
if is_valid:
valid_count_by_channel[(cate, name)] += 1
if not open_full_speed_test and valid_count_by_channel[(cate, name)] >= urls_limit:
_cancel_remaining_channel_tasks(cate, name)
@@ -596,7 +630,7 @@

if on_task_complete:
try:
on_task_complete(cate, name, merged, is_channel_last, is_last)
on_task_complete(cate, name, merged, is_channel_last, is_last, is_valid)
except Exception:
pass

@@ -638,18 +672,29 @@ def sort_channel_result(channel_data, result=None, filter_host=False, ipv6_suppo
for n in names:
values = obj.get(n) or []
whitelist_result = []
test_result = (result.get(c, {}).get(n, []) if result else []).copy()

for value in values:
origin = value.get("origin")
if origin in retain or (not ipv6_support and result and value.get("ipv_type") == "ipv6"):
whitelist_result.append(value)
elif filter_host:
host = value.get("host")
merged = {**value, **(speed_lookup(host) or {})}
test_result.append(merged)

total_result = whitelist_result + sorter(test_result, ipv6_support=ipv6_support)
result_list = (result.get(c, {}).get(n, []) if result else [])

if filter_host:
merged_items = []
for value in values:
origin = value.get("origin")
if origin in retain or (not ipv6_support and result and value.get("ipv_type") == "ipv6"):
whitelist_result.append(value)
else:
host = value.get("host")
merged = {**value, **(speed_lookup(host) or {})}
merged_items.append(merged)

sorter_input = chain(result_list, merged_items) if merged_items else result_list
total_result = whitelist_result + sorter(sorter_input, ipv6_support=ipv6_support)
else:
for value in values:
origin = value.get("origin")
if origin in retain or (not ipv6_support and result and value.get("ipv_type") == "ipv6"):
whitelist_result.append(value)

total_result = whitelist_result + sorter(result_list, ipv6_support=ipv6_support)

seen_urls = set()
for item in total_result:
url = item.get("url")
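Reviewer note: in the `filter_host` branch above, the test results and the host-merged items are now fed to the sorter lazily via `itertools.chain` instead of copying the result list and appending to it. A sketch of that hand-off, with `sorter` as an illustrative stand-in for the real sorting function:

```python
from itertools import chain

def sorter(items, ipv6_support=False):
    # Illustrative stand-in for the real sorter: fastest URL first.
    return sorted(items, key=lambda it: it.get("speed") or 0, reverse=True)

result_list = [{"url": "http://a", "speed": 3.0}]
merged_items = [{"url": "http://b", "speed": 5.0}]

# Feed the sorter a lazy chain instead of building an intermediate list.
sorter_input = chain(result_list, merged_items) if merged_items else result_list
total = sorter(sorter_input)
print([it["url"] for it in total])  # → ['http://b', 'http://a']
```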