
Commit e0bd0f3

Address issue #1359
1 parent: 077780d

File tree

3 files changed: 10 additions, 4 deletions


RELEASENOTES.md

Lines changed: 6 additions & 0 deletions
@@ -4,6 +4,12 @@ Releases, starting with 9/2/2021, are listed with the most recent release at the
 
 # NuGet Version 0.102.7
 
+__Bug Fixes__:
+
+#1359 torch.nn.functional.l1_loss computes a criterion with the MSE, not the MAE.<br/>
+
+# NuGet Version 0.102.6
+
 __Breaking Changes__:
 
 When creating a tensor from a 1-D array, and passing in a shape, there is now an ambiguity between the IList and Memory overloads of `torch.tensor()`. The ambiguity is resolved by removing the `dimensions` argument if it is redundant, or by an explicit cast to IList if it is not.

build/BranchInfo.props

Lines changed: 2 additions & 2 deletions
@@ -2,8 +2,8 @@
   <PropertyGroup>
     <MajorVersion>0</MajorVersion>
     <MinorVersion>102</MinorVersion>
-    <PatchVersion>7</PatchVersion>
-    <PreviousPackageVersion>0.102.6</PreviousPackageVersion>
+    <PatchVersion>8</PatchVersion>
+    <PreviousPackageVersion>0.102.7</PreviousPackageVersion>
   </PropertyGroup>
 
 </Project>

src/Native/LibTorchSharp/THSLoss.cpp

Lines changed: 2 additions & 2 deletions
@@ -94,10 +94,10 @@ Tensor THSNN_kl_div_loss(const Tensor input, const Tensor target, const int64_t
 Tensor THSNN_l1_loss(const Tensor input, const Tensor target, const int64_t reduction)
 {
     CATCH_RETURN_Tensor(
-        auto opts = torch::nn::functional::MSELossFuncOptions();
+        auto opts = torch::nn::functional::L1LossFuncOptions();
         ApplyReduction(opts, reduction);
 
-        res = ResultTensor(torch::nn::functional::mse_loss(*input, *target, opts));
+        res = ResultTensor(torch::nn::functional::l1_loss(*input, *target, opts));
     )
 }
 
