
Commit 061f179 (parent: 3eb0013)

Mapping docs update (#7325)

* update docs
* update docs

15 files changed: +201 -18 lines

Lines changed: 21 additions & 0 deletions
## [Fully consistent parameters] torch.cuda.StreamContext

### [torch.cuda.StreamContext](https://pytorch.org/docs/stable/generated/torch.cuda.StreamContext.html#torch.cuda.StreamContext)

```python
torch.cuda.StreamContext(stream)
```

### [paddle.device.stream_guard](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/device/stream_guard_cn.html#stream-guard)

```python
paddle.device.stream_guard(stream)
```

The two APIs are functionally consistent and their parameters match exactly, as follows:

### Parameter mapping

| PyTorch | PaddlePaddle | Notes                      |
| ------- | ------------ | -------------------------- |
| stream  | stream       | The specified CUDA stream. |
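
For reference, a brief usage sketch of the mapping above (not from the original doc). It assumes `stream` is an already-constructed CUDA stream object on each side (for example `torch.cuda.Stream()` in PyTorch and the corresponding Paddle stream type), and that `x` is an existing GPU tensor.

```python
# PyTorch usage: ops inside the block are enqueued on the selected stream
with torch.cuda.StreamContext(stream):
    y = x * 2

# Paddle usage: paddle.device.stream_guard plays the same role
with paddle.device.stream_guard(stream):
    y = x * 2
```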

docs/guides/model_convert/convert_from_pytorch/api_difference/torch/torch._foreach_ceil_.md

Lines changed: 1 addition & 1 deletion

@@ -15,5 +15,5 @@ Paddle has no corresponding API; a composite implementation is required.
  torch._foreach_ceil_(tensors)

  # Paddle usage
- [paddle.assign(paddle.ceil(x), x) for x in tensors]
+ [x.ceil_() for x in tensors]
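
As an aside (not part of the diff), a minimal sketch of why the new rewrite is equivalent: Paddle's `Tensor.ceil_()` mutates each tensor in place, matching the in-place semantics of `torch._foreach_ceil_`, without the temporary result tensor created by `paddle.assign(paddle.ceil(x), x)`. Assumes a working Paddle installation.

```python
import paddle

tensors = [paddle.to_tensor([0.3, 1.7]), paddle.to_tensor([-2.2])]

# In-place rewrite: each tensor is updated directly, no temporary copy.
[x.ceil_() for x in tensors]

print(tensors[0])  # [1., 2.]
print(tensors[1])  # [-2.]
```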

docs/guides/model_convert/convert_from_pytorch/api_difference/torch/torch._foreach_exp_.md

Lines changed: 1 addition & 1 deletion

@@ -15,5 +15,5 @@ Paddle has no corresponding API; a composite implementation is required.
  torch._foreach_exp_(tensors)

  # Paddle usage
- [paddle.assign(paddle.exp(x), x) for x in tensors]
+ [x.exp_() for x in tensors]

docs/guides/model_convert/convert_from_pytorch/api_difference/torch/torch._foreach_floor_.md

Lines changed: 1 addition & 1 deletion

@@ -15,5 +15,5 @@ Paddle has no corresponding API; a composite implementation is required.
  torch._foreach_floor_(tensors)

  # Paddle usage
- [paddle.assign(paddle.floor(x), x) for x in tensors]
+ [x.floor_() for x in tensors]
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_frac

### [torch.\_foreach_frac](https://pytorch.org/docs/stable/generated/torch._foreach_frac.html#torch-foreach-frac)

```python
torch._foreach_frac(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_frac(tensors)

# Paddle usage
[paddle.frac(x) for x in tensors]
```
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_frac_

### [torch.\_foreach_frac_](https://pytorch.org/docs/stable/generated/torch._foreach_frac_.html#torch-foreach-frac)

```python
torch._foreach_frac_(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_frac_(tensors)

# Paddle usage
[paddle.frac_(x) for x in tensors]
```
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_reciprocal

### [torch.\_foreach_reciprocal](https://pytorch.org/docs/stable/generated/torch._foreach_reciprocal.html#torch-foreach-reciprocal)

```python
torch._foreach_reciprocal(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_reciprocal(tensors)

# Paddle usage
[paddle.reciprocal(x) for x in tensors]
```
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_reciprocal_

### [torch.\_foreach_reciprocal_](https://pytorch.org/docs/stable/generated/torch._foreach_reciprocal_.html#torch-foreach-reciprocal)

```python
torch._foreach_reciprocal_(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_reciprocal_(tensors)

# Paddle usage
[x.reciprocal_() for x in tensors]
```
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_sigmoid

### [torch.\_foreach_sigmoid](https://pytorch.org/docs/stable/generated/torch._foreach_sigmoid.html#torch-foreach-sigmoid)

```python
torch._foreach_sigmoid(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_sigmoid(tensors)

# Paddle usage
[paddle.sigmoid(x) for x in tensors]
```
Lines changed: 19 additions & 0 deletions

## [Composite implementation] torch.\_foreach_sigmoid_

### [torch.\_foreach_sigmoid_](https://pytorch.org/docs/stable/generated/torch._foreach_sigmoid_.html#torch-foreach-sigmoid)

```python
torch._foreach_sigmoid_(self)
```

Paddle has no corresponding API; a composite implementation is required.

### Conversion example

```python
# PyTorch usage
torch._foreach_sigmoid_(tensors)

# Paddle usage
[x.sigmoid_() for x in tensors]
```
