
Commit 66713b1

Add documentation and minor cleanup

Signed-off-by: Dheeraj Peri <[email protected]>
1 parent a02bf4f commit 66713b1

File tree

7 files changed: +78 −126 lines

core/conversion/conversionctx/BUILD

Lines changed: 0 additions & 1 deletion

```diff
@@ -17,7 +17,6 @@ cc_library(
     ],
     deps = [
         "@tensorrt//:nvinfer",
-        "@tensorrt//:nvinferplugin",
         "//core/util:prelude",
     ] + select({
         ":use_pre_cxx11_abi": ["@libtorch_pre_cxx11_abi//:libtorch"],
```

core/conversion/conversionctx/ConversionCtx.cpp

Lines changed: 1 addition & 5 deletions

```diff
@@ -1,11 +1,7 @@
+#include "core/conversion/conversionctx/ConversionCtx.h"
 #include <iostream>
 #include <sstream>
-
-#include "NvInferPlugin.h"
-#include "NvInferPluginUtils.h"
-
 #include <utility>
-#include "core/conversion/conversionctx/ConversionCtx.h"
 
 namespace trtorch {
 namespace core {
```

core/conversion/converters/impl/batch_norm.cpp

Lines changed: 0 additions & 1 deletion

```diff
@@ -1,6 +1,5 @@
 #include "core/conversion/converters/converters.h"
 #include "core/util/prelude.h"
-// #include "core/plugins/plugin_prelude.h"
 #include "torch/torch.h"
 
 namespace trtorch {
```

core/conversion/converters/impl/instance_norm.cpp

Lines changed: 0 additions & 118 deletions
This file was deleted.

core/plugins/README.md

Lines changed: 40 additions & 0 deletions

# TRTorch Plugins

A library for plugins (custom layers) used in a network. This component of the TRTorch library builds a separate library called `libtrtorch_plugins.so`.

At a high level, the TRTorch plugin library interface does the following:

- Uses the TensorRT plugin registry as the main data structure to access all plugins.

- Automatically registers TensorRT plugins with the empty namespace.

- Automatically registers TRTorch plugins with the `"trtorch"` namespace.
Here is a brief description of each file:

- `plugins.h` - Provides a macro to register any plugin with the `"trtorch"` namespace.
- `register_plugins.cpp` - Main registry class which initializes both `libnvinfer` plugins and TRTorch plugins (`Interpolate` and `Normalize`).
- `impl/interpolate_plugin.cpp` - Core implementation of the interpolate plugin. Uses PyTorch kernels during execution.
- `impl/normalize_plugin.cpp` - Core implementation of the normalize plugin. Uses PyTorch kernels during execution.
### Converter for the plugin

A converter converts a PyTorch layer in the TorchScript graph into a TensorRT layer (in this case, a plugin layer).
A plugin is accessed via the plugin name and the namespace in which it is registered.
For example, to access the Interpolate plugin:

```cpp
auto creator = getPluginRegistry()->getPluginCreator("Interpolate", "1", "trtorch");
auto interpolate_plugin = creator->createPlugin(name, &fc); // fc is the collection of parameters passed to the plugin.
```
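Once created, the plugin instance is attached to the network being built. As a hedged sketch (the `ctx` conversion context and `in` input tensor are assumed names, not taken from this commit), the converter's next step might look like:

```cpp
// Sketch only: "ctx" and "in" are illustrative; addPluginV2 is the
// TensorRT INetworkDefinition call that inserts a plugin layer.
nvinfer1::ITensor* inputs[] = {in};
auto layer = ctx->net->addPluginV2(inputs, 1, *interpolate_plugin);
```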
### If you have your own plugin

If you'd like to compile your plugin with TRTorch:

- Add your implementation to the `impl` directory.
- Add a call `REGISTER_TRTORCH_PLUGINS(MyPluginCreator)` to `register_plugins.cpp`. `MyPluginCreator` is the plugin creator class which creates your plugin. By adding this to `register_plugins.cpp`, your plugin will be initialized and accessible (added to the TensorRT plugin registry) when the `libtrtorch_plugins.so` library is loaded.
- Update the `BUILD` file with your plugin files and dependencies.
- Implement a converter op which makes use of your plugin.
Once you've completed the above steps, upon successful compilation of the TRTorch library, your plugin should be available in `libtrtorch_plugins.so`.

A sample runtime application showing how to run a network with plugins can be found <a href="https://github.com/NVIDIA/TRTorch/tree/master/examples/sample_rt_app">here</a>.

examples/sample_rt_app/BUILD

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 package(default_visibility = ["//visibility:public"])
 
-# trtorch is the dowloaded tar file
+# trtorch is the directory extracted from dowloaded release tar file
 # It has include, lib, bin directory and LICENSE file
 cc_library(
     name = "trtorch_runtime",
```

examples/sample_rt_app/README.md

Lines changed: 36 additions & 0 deletions

# Sample application using the TRTorch runtime library and plugin library

This sample demonstrates how to use the TRTorch runtime library `libtrtorchrt.so` along with the plugin library `libtrtorch_plugins.so`.

In this demo, we convert two models, `ConvGelu` and `Norm`, to TensorRT using the TRTorch Python API and perform inference using `samplertapp`. In these models, the `Gelu` and `Norm` layers are expressed as plugins in the network.

## Generating TorchScript modules with TRT engines

The following command generates the `conv_gelu.jit` and `norm.jit` TorchScript modules, which contain TensorRT engines:

```
python network.py
```
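As a hedged sketch of what such a conversion script does (this is not the commit's actual `network.py`; the model name and compile-spec keys are assumptions), the TRTorch Python API is used roughly like this:

```python
# Sketch only: ConvGelu and the compile-spec contents are illustrative.
import torch
import trtorch

model = ConvGelu().eval().cuda()        # model containing a Gelu layer
scripted = torch.jit.script(model)

# trtorch.compile embeds TRT engines into the TorchScript module;
# the exact spec format depends on the TRTorch version.
trt_mod = trtorch.compile(scripted, {"input_shapes": [[1, 3, 5, 5]]})
torch.jit.save(trt_mod, "conv_gelu.jit")
```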
## Sample runtime app

The main goal is to use the TRTorch runtime library `libtrtorchrt.so`, a lightweight library sufficient to deploy your TorchScript programs containing TRT engines.

1) Download the latest release from the TRTorch GitHub repo and unpack the tar file.

```
cd examples/sample_rt_app
# Download latest TRTorch release tar file (libtrtorch.tar.gz) from https://github.com/NVIDIA/TRTorch/releases
tar -xvzf libtrtorch.tar.gz
```

2) Build and run `samplertapp`

`samplertapp` is a binary which loads the TorchScript modules `conv_gelu.jit` or `norm.jit` and runs the TRT engines on a random input using TRTorch runtime components. Check out the `main.cpp` and `BUILD` files for the necessary code and compilation dependencies.
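The core of such a runtime app is small. As a hedged sketch (this is not the commit's actual `main.cpp`, and the input shape is illustrative), loading and running an engine-embedded module takes only a few libtorch calls:

```cpp
#include <torch/script.h>
#include <iostream>

int main(int argc, char* argv[]) {
    // argv[1]: path to conv_gelu.jit or norm.jit
    torch::jit::script::Module module = torch::jit::load(argv[1]);

    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::randn({1, 3, 5, 5}).cuda());  // shape is illustrative

    auto out = module.forward(inputs).toTensor();
    std::cout << "Output shape: " << out.sizes() << std::endl;
    return 0;
}
```

Because the TRT engine is serialized inside the module, the app needs only `libtrtorchrt.so` (and `libtrtorch_plugins.so` for plugin layers) at load time, not the full compiler.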
To build and run the app:

```
cd TRTorch
bazel run //examples/sample_rt_app:samplertapp $PWD/examples/sample_rt_app/norm.jit
```
