
Commit 176b907

chore: apply linting
Signed-off-by: Naren Dasan <[email protected]>
Signed-off-by: Naren Dasan <[email protected]>
1 parent: 10cae30

2 files changed (+1 −3 lines)

py/setup.py

Lines changed: 1 addition & 2 deletions
@@ -242,8 +242,7 @@ def run(self):
             dir_path + "/../bazel-TRTorch/external/tensorrt/include",
             dir_path + "/../bazel-Torch-TensorRT/external/tensorrt/include",
             dir_path + "/../bazel-TensorRT/external/tensorrt/include",
-            dir_path + "/../bazel-tensorrt/external/tensorrt/include",
-            dir_path + "/../"
+            dir_path + "/../bazel-tensorrt/external/tensorrt/include", dir_path + "/../"
         ],
         extra_compile_args=[
             "-Wno-deprecated",

tests/core/lowering/test_module_fallback_passes.cpp

Lines changed: 0 additions & 1 deletion
@@ -20,7 +20,6 @@ TEST(Lowering, NotateModuleForFallbackWorksCorrectly) {
   std::unordered_set<std::string> mods_to_mark;
   mods_to_mark.insert("custom_models.ModuleFallbackSub");
 
-
   torch_tensorrt::core::lowering::passes::NotateModuleForFallback(mod, "", "forward", mods_to_mark);
 
   auto g = mod.get_method("forward").graph();
