
Commit 9905634

Merge branch 'develop' into first
2 parents 6ee05d6 + a2e159a commit 9905634


1,200 files changed: +1713 / -5193 lines


_typos.toml

Lines changed: 0 additions & 10 deletions
@@ -83,16 +83,6 @@ Wether = "Wether"
 accordding = "accordding"
 accoustic = "accoustic"
 accpetance = "accpetance"
-accracy = "accracy"
-acutal = "acutal"
-apporach = "apporach"
-apporaches = "apporaches"
-arguements = "arguements"
-arguemnts = "arguemnts"
-assgin = "assgin"
-assginment = "assginment"
-auxilary = "auxilary"
-avaiable = "avaiable"
 baisc = "baisc"
 basci = "basci"
 beacuse = "beacuse"

ci_scripts/check_api_label_cn.py

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ def run_cn_api_label_checking(rootdir, files):
     for file in files:
         if should_test(file) and not check_api_label(rootdir, file):
             logger.error(
-                f"The first line in {rootdir}/{file} is not avaiable, please re-check it!"
+                f"The first line in {rootdir}/{file} is not available, please re-check it!"
             )
             sys.exit(1)
     valid_api_labels = find_all_api_labels_in_dir(rootdir)

ci_scripts/hooks/pre-doc-compile.sh

Lines changed: 109 additions & 24 deletions
@@ -5,55 +5,140 @@ SCRIPT_DIR="$( cd "$( dirname "$0" )" && pwd )"
 
 FLUIDDOCDIR=${FLUIDDOCDIR:=/FluidDoc}
 DOCROOT=${FLUIDDOCDIR}/docs
-
-
-## 1. 获取API映射文件
 APIMAPPING_ROOT=${DOCROOT}/guides/model_convert/convert_from_pytorch
 TOOLS_DIR=${APIMAPPING_ROOT}/tools
 
-# 确保tools目录存在
-mkdir -p ${TOOLS_DIR}
+# Create tools directory if not exists
+if [ ! -d "${TOOLS_DIR}" ]; then
+    echo "INFO: Creating tools directory at ${TOOLS_DIR}"
+    mkdir -p "${TOOLS_DIR}"
+    if [ $? -ne 0 ]; then
+        echo "ERROR: Failed to create directory ${TOOLS_DIR}"
+        exit 1
+    fi
+else
+    echo "INFO: Tools directory ${TOOLS_DIR} already exists"
+fi
 
-#下载的文件URL
+# Define API mapping files URLs
 API_ALIAS_MAPPING_URL="https://raw.githubusercontent.com/PaddlePaddle/PaConvert/master/paconvert/api_alias_mapping.json"
 API_MAPPING_URL="https://raw.githubusercontent.com/PaddlePaddle/PaConvert/master/paconvert/api_mapping.json"
 GLOBAL_VAR_URL="https://raw.githubusercontent.com/PaddlePaddle/PaConvert/master/paconvert/global_var.py"
 ATTRIBUTE_MAPPING_URL="https://raw.githubusercontent.com/PaddlePaddle/PaConvert/master/paconvert/attribute_mapping.json"
 
-# 下载文件
+# Define backup URLs
+BACKUP_API_ALIAS_MAPPING_URL="https://paddle-paconvert.bj.bcebos.com/api_alias_mapping.json"
+BACKUP_API_MAPPING_URL="https://paddle-paconvert.bj.bcebos.com/api_mapping.json"
+BACKUP_GLOBAL_VAR_URL="https://paddle-paconvert.bj.bcebos.com/global_var.py"
+BACKUP_ATTRIBUTE_MAPPING_URL="https://paddle-paconvert.bj.bcebos.com/attribute_mapping.json"
+
+# Check for proxy settings
 PROXY=""
 if [ -n "$https_proxy" ]; then
     PROXY="$https_proxy"
+    echo "INFO: find proxy"
 elif [ -n "$http_proxy" ]; then
     PROXY="$http_proxy"
+    echo "INFO: find proxy"
+else
+    echo "INFO: No proxy detected, downloading directly."
 fi
 
-# 构建 curl 代理参数
-CURL_PROXY_ARGS=""
+# Build curl proxy arguments
 if [ -n "$PROXY" ]; then
-    CURL_PROXY_ARGS="--proxy $PROXY"
+    CURL_PROXY_ARGS="--proxy ${PROXY}"
 else
-    echo "No proxy detected, downloading directly."
+    CURL_PROXY_ARGS=""
+fi
+
+# Download API mapping files with retry
+download_file() {
+    local url=$1
+    local dest=$2
+    local filename=$(basename "$dest")
+    local max_retries=5
+    local retry_count=0
+
+    echo "INFO: Starting download of ${filename} from ${url}"
+
+    while [ $retry_count -lt $max_retries ]; do
+        retry_count=$((retry_count + 1))
+        echo "INFO: Attempt $retry_count of $max_retries to download ${filename}"
+
+        if curl $CURL_PROXY_ARGS -o "${dest}" -s "${url}" > /dev/null 2>&1; then
+            echo "SUCCESS: Successfully downloaded ${filename} to ${dest}"
+            return 0
+        else
+            echo "WARNING: Failed to download ${filename} from ${url} (attempt $retry_count)"
+            sleep 2 # Wait for 2 seconds before next retry
+        fi
+    done
+
+    echo "ERROR: Failed to download ${filename} after $max_retries attempts"
+    return 1
+}
+
+# Download each file with detailed logging
+echo "INFO: Downloading API alias mapping file"
+if ! download_file "${API_ALIAS_MAPPING_URL}" "${TOOLS_DIR}/api_alias_mapping.json"; then
+    echo "INFO: Trying backup URL for API alias mapping file"
+    if ! download_file "${BACKUP_API_ALIAS_MAPPING_URL}" "${TOOLS_DIR}/api_alias_mapping.json"; then
+        echo "ERROR: API alias mapping download failed (both main and backup URLs). Exiting."
+        exit 1
+    fi
+fi
+
+echo "INFO: Downloading API mapping file"
+if ! download_file "${API_MAPPING_URL}" "${TOOLS_DIR}/api_mapping.json"; then
+    echo "INFO: Trying backup URL for API mapping file"
+    if ! download_file "${BACKUP_API_MAPPING_URL}" "${TOOLS_DIR}/api_mapping.json"; then
+        echo "ERROR: API mapping download failed (both main and backup URLs). Exiting."
+        exit 1
+    fi
 fi
 
-# 执行下载
-curl $CURL_PROXY_ARGS -o "${TOOLS_DIR}/api_alias_mapping.json" -s "${API_ALIAS_MAPPING_URL}"
-curl $CURL_PROXY_ARGS -o "${TOOLS_DIR}/api_mapping.json" -s "${API_MAPPING_URL}"
-curl $CURL_PROXY_ARGS -o "${TOOLS_DIR}/global_var.py" -s "${GLOBAL_VAR_URL}"
-curl $CURL_PROXY_ARGS -o "${TOOLS_DIR}/attribute_mapping.json" -s "${ATTRIBUTE_MAPPING_URL}"
+echo "INFO: Downloading global variable file"
+if ! download_file "${GLOBAL_VAR_URL}" "${TOOLS_DIR}/global_var.py"; then
+    echo "INFO: Trying backup URL for global variable file"
+    if ! download_file "${BACKUP_GLOBAL_VAR_URL}" "${TOOLS_DIR}/global_var.py"; then
+        echo "ERROR: Global variable download failed (both main and backup URLs). Exiting."
+        exit 1
+    fi
+fi
+
+echo "INFO: Downloading attribute mapping file"
+if ! download_file "${ATTRIBUTE_MAPPING_URL}" "${TOOLS_DIR}/attribute_mapping.json"; then
+    echo "INFO: Trying backup URL for attribute mapping file"
+    if ! download_file "${BACKUP_ATTRIBUTE_MAPPING_URL}" "${TOOLS_DIR}/attribute_mapping.json"; then
+        echo "ERROR: Attribute mapping download failed (both main and backup URLs). Exiting."
+        exit 1
+    fi
+fi
 
-# 检查下载是否成功
-if [ $? -ne 0 ]; then
-    echo "Error: Failed to download API mapping files"
+# Check if all files exist before proceeding
+if [ ! -f "${TOOLS_DIR}/api_alias_mapping.json" ] || \
+   [ ! -f "${TOOLS_DIR}/api_mapping.json" ] || \
+   [ ! -f "${TOOLS_DIR}/global_var.py" ] || \
+   [ ! -f "${TOOLS_DIR}/attribute_mapping.json" ]; then
+    echo "ERROR: One or more API mapping files are missing after download"
+    echo "Missing files:"
+    if [ ! -f "${TOOLS_DIR}/api_alias_mapping.json" ]; then echo " - api_alias_mapping.json"; fi
+    if [ ! -f "${TOOLS_DIR}/api_mapping.json" ]; then echo " - api_mapping.json"; fi
+    if [ ! -f "${TOOLS_DIR}/global_var.py" ]; then echo " - global_var.py"; fi
+    if [ ! -f "${TOOLS_DIR}/attribute_mapping.json" ]; then echo " - attribute_mapping.json"; fi
     exit 1
 fi
 
-## 3. Apply PyTorch-PaddlePaddle mapping using the new API mapping files
-python ${APIMAPPING_ROOT}/tools/get_api_difference_info.py
-python ${APIMAPPING_ROOT}/tools/generate_pytorch_api_mapping.py
+echo "INFO: All API mapping files successfully downloaded"
 
+echo "INFO: Running get_api_difference_info.py"
+if ! python "${APIMAPPING_ROOT}/tools/get_api_difference_info.py"; then
+    echo "ERROR: get_api_difference_info.py failed. Please check the script."
+    exit 1
+fi
 
-if [ $? -ne 0 ]; then
-    echo "Error: API mapping generate script failed, please check changes in ${APIMAPPING_ROOT}"
+echo "INFO: Running generate_pytorch_api_mapping.py"
+if ! python "${APIMAPPING_ROOT}/tools/generate_pytorch_api_mapping.py"; then
+    echo "ERROR: generate_pytorch_api_mapping.py failed. Please check the script."
     exit 1
 fi

docs/api/paddle/distributed/ParallelEnv_cn.rst

Lines changed: 6 additions & 6 deletions
@@ -6,7 +6,7 @@ ParallelEnv
 .. py:class:: paddle.distributed.ParallelEnv()
 
 .. note::
-    不推荐使用这个 API,如果需要获取 rank 和 world_size,建议使用 ``paddle.distributed.get_rank()`` 和 ``paddle.distributed.get_world_size()`` 。
+    不推荐使用这个 API,如果需要获取 rank 和 world_size,建议使用 ``paddle.distributed.get_rank()`` 和 ``paddle.distributed.get_world_size()`` 。
 
 这个类用于获取动态图模型并行执行所需的环境变量值。
 
@@ -24,7 +24,7 @@ rank
 
 当前训练进程的编号。
 
-此属性的值等于环境变量 `PADDLE_TRAINER_ID` 的值。默认值是 0。
+此属性的值等于环境变量 ``PADDLE_TRAINER_ID`` 的值。默认值是 0。
 
 **代码示例**
 
@@ -35,7 +35,7 @@ world_size
 
 参与训练进程的数量,一般也是训练所使用 GPU 卡的数量。
 
-此属性的值等于环境变量 `PADDLE_TRAINERS_NUM` 的值。默认值为 1。
+此属性的值等于环境变量 ``PADDLE_TRAINERS_NUM`` 的值。默认值为 1。
 
 **代码示例**
 
@@ -46,7 +46,7 @@ device_id
 
 当前用于并行训练的 GPU 的编号。
 
-此属性的值等于环境变量 `FLAGS_selected_gpus` 的值。默认值是 0。
+此属性的值等于环境变量 ``FLAGS_selected_gpus`` 的值。默认值是 0。
 
 **代码示例**
 
@@ -57,7 +57,7 @@ current_endpoint
 
 当前训练进程的终端节点 IP 与相应端口,形式为(机器节点 IP:端口号)。例如:127.0.0.1:6170。
 
-此属性的值等于环境变量 `PADDLE_CURRENT_ENDPOINT` 的值。默认值为空字符串""。
+此属性的值等于环境变量 ``PADDLE_CURRENT_ENDPOINT`` 的值。默认值为空字符串""。
 
 **代码示例**
 
@@ -68,7 +68,7 @@ trainer_endpoints
 
 当前任务所有参与训练进程的终端节点 IP 与相应端口,用于在 NCCL2 初始化的时候建立通信,广播 NCCL ID。
 
-此属性的值等于环境变量 `PADDLE_TRAINER_ENDPOINTS` 的值。默认值为空字符串""。
+此属性的值等于环境变量 ``PADDLE_TRAINER_ENDPOINTS`` 的值。默认值为空字符串""。
 
 **代码示例**
 
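For context, the note above steers users to ``paddle.distributed.get_rank()`` and ``paddle.distributed.get_world_size()``, which are backed by the same environment variables these properties describe. A minimal usage sketch (assuming paddle is installed; outside a distributed launch the calls fall back to rank 0 and world size 1):

    # Minimal sketch of the replacement recommended in the note above.
    # Outside a distributed launch these default to rank 0 / world size 1.
    import paddle.distributed as dist

    rank = dist.get_rank()               # index of the current training process
    world_size = dist.get_world_size()   # number of training processes
    print(f"rank={rank}, world_size={world_size}")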

docs/api/paddle/logical_xor_cn.rst

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ logical_xor
 
 返回
 ::::::::::::
-``Tensor``,维度``x`` 维度相同,存储运算后的结果。
+``Tensor``,维度 ``x`` 维度相同,存储运算后的结果。
 
 代码示例
 ::::::::::::
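To make the corrected return description concrete: ``paddle.logical_xor`` produces a boolean Tensor with the same dimensions as ``x``. A minimal sketch (assuming paddle is installed):

    import paddle

    x = paddle.to_tensor([True, False, True, False])
    y = paddle.to_tensor([True, True, False, False])
    out = paddle.logical_xor(x, y)   # element-wise XOR, same shape as x
    print(out)                       # [False, True, True, False]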

docs/api/paddle/nn/Conv1DTranspose_cn.rst

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ Conv1DTranspose
 
 一维转置卷积层(Convlution1d transpose layer)
 
-该层根据输入(input)、卷积核(kernel)和空洞大小(dilations)、步长(stride)、填充(padding)来计算输出特征大小或者通过 output_size 指定输出特征层大小。输入(Input)和输出(Output)为 NCL 或 NLC 格式,其中 N 为批尺寸,C 为通道数(channel),L 为特征长度。卷积核是 MCL 格式,M 是输出图像通道数,C 是输入图像通道数,L 是卷积核长度。如果组数大于 1,C 等于输入图像通道数除以组数的结果。转置卷积的计算过程相当于卷积的反向计算。转置卷积又被称为反卷积(但其实并不是真正的反卷积)。欲了解转置卷积层细节,请参考下面的说明和 `参考文献 <https://arxiv.org/pdf/1603.07285.pdf/>`_。如果参数 bias_attr 不为 False,转置卷积计算会添加偏置项。
+该层根据输入(input)、卷积核(kernel)和空洞大小(dilations)、步长(stride)、填充(padding)来计算输出特征大小或者通过 output_size 指定输出特征层大小。输入(Input)和输出(Output)为 NCL 或 NLC 格式,其中 N 为批尺寸,C 为通道数(channel),L 为特征长度。卷积核是 MCL 格式,M 是输出图像通道数,C 是输入图像通道数,L 是卷积核长度。如果组数大于 1,C 等于输入图像通道数除以组数的结果。转置卷积的计算过程相当于卷积的反向计算。转置卷积又被称为反卷积(但其实并不是真正的反卷积)。欲了解转置卷积层细节,请参考下面的说明和 `参考文献 <https://arxiv.org/pdf/1603.07285>`_。如果参数 bias_attr 不为 False,转置卷积计算会添加偏置项。
 
 
 输入 :math:`X` 和输出 :math:`Out` 函数关系如下:
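As a rough illustration of the NCL layout and channel bookkeeping described above, a minimal sketch (assuming paddle is installed; the channel and length values are arbitrary):

    import paddle

    # NCL input: batch N=2, channels C=4, length L=8
    x = paddle.randn([2, 4, 8])
    # Transposed 1D convolution: 4 input channels -> 6 output channels, kernel size 3
    conv = paddle.nn.Conv1DTranspose(in_channels=4, out_channels=6, kernel_size=3)
    y = conv(x)
    print(y.shape)   # [2, 6, 10] with the default stride=1, padding=0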

docs/api/paddle/put_along_axis_cn.rst

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ put_along_axis
 - **indices** (Tensor) - 索引矩阵,包含沿轴提取 1d 切片的下标,必须和 arr 矩阵有相同的维度。当 ``broadcast`` 为 ``True`` 时,需要能够 broadcast 与 arr 矩阵对齐,否则除 ``axis`` 维度,其他维度都需要小于等于 ``arr`` 与 ``values`` 的对应维度。数据类型为:int32、int64。
 - **values** (float) - 需要插入的值,当 ``broadcast`` 为 ``True`` 时,形状和维度需要能够被 broadcast 与 indices 矩阵匹配,否则各维度需大于等于 ``indices`` 的各维度。数据类型为:bfloat16、float16、float32、float64、int32、int64、uint8、int16。
 - **axis** (int) - 指定沿着哪个维度获取对应的值,数据类型为:int。
-- **reduce** (str,可选) - 归约操作类型,默认为 ``assign``,可选为 ``add``、 ``multiple``、 ``mean``、 ``amin``、 ``amax``。不同的规约操作插入值 value 对于输入矩阵 arr 会有不同的行为,如为 ``assgin`` 则覆盖输入矩阵, ``add`` 则累加至输入矩阵, ``mean`` 则计算累计平均值至输入矩阵, ``multiple`` 则累乘至输入矩阵, ``amin`` 则计算累计最小值至输入矩阵, ``amax`` 则计算累计最大值至输入矩阵。
+- **reduce** (str,可选) - 归约操作类型,默认为 ``assign``,可选为 ``add``、 ``multiple``、 ``mean``、 ``amin``、 ``amax``。不同的规约操作插入值 value 对于输入矩阵 arr 会有不同的行为,如为 ``assign`` 则覆盖输入矩阵, ``add`` 则累加至输入矩阵, ``mean`` 则计算累计平均值至输入矩阵, ``multiple`` 则累乘至输入矩阵, ``amin`` 则计算累计最小值至输入矩阵, ``amax`` 则计算累计最大值至输入矩阵。
 - **include_self** (bool,可选) - 规约时是否包含 arr 的元素,默认为 ``True``。
 - **broadcast** (bool,可选) - 是否广播 ``index`` 矩阵,默认为 ``True``。
 
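To illustrate the corrected ``reduce`` wording, a minimal sketch (assuming paddle is installed; with the default ``broadcast=True`` a partial index is broadcast across the non-``axis`` dimensions):

    import paddle

    x = paddle.to_tensor([[10, 30, 20], [60, 40, 50]], dtype="float32")
    index = paddle.to_tensor([[0]])
    # Default reduce="assign": the selected entries are overwritten with 99.
    out = paddle.put_along_axis(x, index, 99.0, axis=0)
    print(out)   # [[99., 99., 99.], [60., 40., 50.]]
    # reduce="add" would instead accumulate 99 into the selected entries.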

docs/api/paddle/scatter_cn.rst

Lines changed: 1 addition & 1 deletion
@@ -59,7 +59,7 @@ PyTorch 兼容的 scatter 函数。基于 :ref:`cn_api_paddle_put_along_axis`
 - **dim** (int) - 进行 scatter 操作的维度,范围为 ``[-input.ndim, input.ndim)``。
 - **index** (Tensor)- 索引矩阵,包含沿轴提取 1d 切片的下标,必须和 arr 矩阵有相同的维度。注意,除了 ``dim`` 维度外, ``index`` 张量的各维度大小应该小于等于 ``input`` 以及 ``src`` 张量。内部的值应该在 ``input.shape[dim]`` 范围内。数据类型可以是 int32,int64。
 - **src** (Tensor)- 需要插入的值。``src`` 是张量时,各维度大小需要至少大于等于 ``index`` 各维度。不受到 ``input`` 的各维度约束。当为标量值时,会自动广播大小到 ``index``。数据类型为:bfloat16、float16、float32、float64、int32、int64、uint8、int16。本参数有一个互斥的别名 ``value``。
-- **reduce** (str,可选)- 指定 scatter 的归约方式。默认值为 None,等效为 ``assign``。可选为 ``add``、 ``multiple``、 ``mean``、 ``amin``、 ``amax``。不同的规约操作插入值 src 对于输入矩阵 arr 会有不同的行为,如为 ``assgin`` 则覆盖输入矩阵, ``add`` 则累加至输入矩阵, ``mean`` 则计算累计平均值至输入矩阵, ``multiple`` 则累乘至输入矩阵, ``amin`` 则计算累计最小值至输入矩阵, ``amax`` 则计算累计最大值至输入矩阵。
+- **reduce** (str,可选)- 指定 scatter 的归约方式。默认值为 None,等效为 ``assign``。可选为 ``add``、 ``multiple``、 ``mean``、 ``amin``、 ``amax``。不同的规约操作插入值 src 对于输入矩阵 arr 会有不同的行为,如为 ``assign`` 则覆盖输入矩阵, ``add`` 则累加至输入矩阵, ``mean`` 则计算累计平均值至输入矩阵, ``multiple`` 则累乘至输入矩阵, ``amin`` 则计算累计最小值至输入矩阵, ``amax`` 则计算累计最大值至输入矩阵。
 - **out** (Tensor,可选) - 用于引用式传入输出值,注意:动态图下 out 可以是任意 Tensor,默认值为 None。
 
 返回

docs/design/memory/memory_optimization.md

Lines changed: 1 addition & 1 deletion
@@ -79,7 +79,7 @@ In former control flow graph, the out-edges of node 5 are 5 --> 6 and 5 --> 2, a
 
 - Uses and Defs
 
-An assignmemt to a variable or temporary defines that variable. An occurence of a variable on the right-hand side of an assginment(or in other expressions) uses the variable. We can define the *def* of a variable as the set of graph nodes that define it; or the *def* of a graph node as the set of variables that it defines; and the similarly for the *use* of a variable or graph node. In former control flow graph, *def(3)* = {c}, *use(3)* = {b, c}.
+An assignmemt to a variable or temporary defines that variable. An occurence of a variable on the right-hand side of an assignment(or in other expressions) uses the variable. We can define the *def* of a variable as the set of graph nodes that define it; or the *def* of a graph node as the set of variables that it defines; and the similarly for the *use* of a variable or graph node. In former control flow graph, *def(3)* = {c}, *use(3)* = {b, c}.
 
 - Liveness
 
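The def/use definition quoted above is easy to make concrete. A small sketch (the statements are hypothetical but chosen so that node 3 matches the document's *def(3)* = {c}, *use(3)* = {b, c}):

    # def(n): variables assigned at node n; use(n): variables read at node n.
    # Hypothetical straight-line statements; node 3 mirrors "c = c + b".
    statements = {
        1: ("a", []),          # a = 0
        2: ("b", ["a"]),       # b = a + 1
        3: ("c", ["c", "b"]),  # c = c + b
    }

    for n, (target, operands) in statements.items():
        defs, uses = {target}, set(operands)
        print(f"def({n}) = {defs}, use({n}) = {uses}")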

docs/design/phi/design_cn.md

Lines changed: 1 addition & 1 deletion
@@ -1219,7 +1219,7 @@ REGISTER_OPERATOR(sign, ops::SignOp, ops::SignOpMaker<float>,
 * The infrt declare like:
 *
 * def PDKEL_Reshape_to_CPU : Pat<
-* (PD_ReshapeOp $x, $shape_tensor, $shape_attr), // OpMaker arguements
+* (PD_ReshapeOp $x, $shape_tensor, $shape_attr), // OpMaker arguments
 * (PDKEL_ReshapeKernelAttr $x, fn($shape_attr)>; // Kernel arguments
 * def PDKEL_Reshape_to_CPU : Pat<
 * (PD_ReshapeOp $x, $shape_tensor, $shape_attr),
