docs/_sources/api/api_docs/modules/exporter.rst.txt (1 addition, 1 deletion)

@@ -8,7 +8,7 @@ exporter Module
=================================
Allows exporting a quantized model in different serialization and quantization formats.
-For more details about the export formats and options, please refer to the project's GitHub `README file <https://github.com/sony/model_optimization/tree/main/model_compression_toolkit/exporter>`_.
+For more details about the export formats and options, please refer to the project's GitHub `README file <https://github.com/SonySemiconductorSolutions/mct-model-optimization/tree/main/model_compression_toolkit/exporter>`_.
If you have any questions or issues, please open an issue in this GitHub repository.
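As a concrete illustration of what the exporter module is for, the following is a minimal sketch of exporting an MCT-quantized Keras model. The function name `keras_export_model` and its parameters are assumptions drawn from the exporter README; check the linked docs for the exact signature in your MCT version.

```python
# Hedged sketch: exporting an MCT-quantized Keras model to disk.
# `mct.exporter.keras_export_model` and its parameters are assumptions based
# on the exporter README; verify against your installed MCT version.
try:
    import model_compression_toolkit as mct
    MCT_AVAILABLE = True
except ImportError:
    MCT_AVAILABLE = False


def export_quantized_keras(model, save_path="quantized_model.keras"):
    """Serialize an already-quantized Keras model (illustrative only)."""
    if not MCT_AVAILABLE:
        raise RuntimeError("model_compression_toolkit is not installed")
    # Default serialization/quantization formats are used here; both can be
    # overridden via the exporter's format enums (see the README).
    mct.exporter.keras_export_model(model=model, save_model_path=save_path)
```

The guard around the import keeps the sketch importable even where MCT is absent; in practice the model passed in would come from one of MCT's quantization entry points.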
This can be addressed in MCT by using the target_platform_capabilities module, which can configure different
hardware-related parameters; the optimization process uses these to optimize the model accordingly.
-Models for IMX500, TFLite and qnnpack can be observed `here <https://github.com/sony/model_optimization/tree/main/model_compression_toolkit/target_platform_capabilities>`_, and can be used via the :ref:`get_target_platform_capabilities function<ug-get_target_platform_capabilities>`.
+Models for IMX500, TFLite and qnnpack can be observed `here <https://github.com/SonySemiconductorSolutions/mct-model-optimization/tree/main/model_compression_toolkit/target_platform_capabilities>`_, and can be used via the :ref:`get_target_platform_capabilities function<ug-get_target_platform_capabilities>`.
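The hunk above references `get_target_platform_capabilities`; a minimal sketch of fetching one of the bundled hardware models follows. The `"tensorflow"` and `"imx500"` argument values are assumptions based on the TPC docs, and exact names may differ between MCT releases.

```python
# Hedged sketch: fetching a built-in target-platform capabilities (TPC)
# object. Argument values ("tensorflow", "imx500") are assumptions; consult
# the target_platform_capabilities docs for the names your release expects.
try:
    import model_compression_toolkit as mct
    MCT_AVAILABLE = True
except ImportError:
    MCT_AVAILABLE = False


def load_imx500_tpc():
    """Return the IMX500 hardware model for TensorFlow (illustrative only)."""
    if not MCT_AVAILABLE:
        raise RuntimeError("model_compression_toolkit is not installed")
    # Selects one of the bundled hardware models (IMX500, TFLite, qnnpack)
    # for the given framework; the result is passed to MCT's quantization
    # entry points so the optimization respects hardware constraints.
    return mct.get_target_platform_capabilities("tensorflow", "imx500")
```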
docs/_sources/api/api_docs/modules/trainable_infrastructure.rst.txt (1 addition, 1 deletion)

@@ -9,7 +9,7 @@ trainable_infrastructure Module
The trainable infrastructure is a module containing quantization abstraction and quantizers for hardware-oriented model optimization tools.
It provides the required abstraction for trainable quantization methods such as quantization-aware training.
-It utilizes the Inferable Quantizers Infrastructure provided by the `MCT Quantizers <https://github.com/sony/mct_quantizers>`_ package, which offers the abstraction needed to emulate inference-time quantization.
+It utilizes the Inferable Quantizers Infrastructure provided by the `MCT Quantizers <https://github.com/SonySemiconductorSolutions/mct-quantization-layers>`_ package, which offers the abstraction needed to emulate inference-time quantization.
When using a trainable quantizer, each layer with quantized weights is wrapped with a "Quantization Wrapper" object,
and each activation quantizer is stored in an "Activation Quantization Holder" object.
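The wrapper/holder split described above can be sketched as follows. The class names `KerasQuantizationWrapper` and `KerasActivationQuantizationHolder` come from the MCT Quantizers package, but the call shown is an illustrative assumption, not the package's canonical usage.

```python
# Hedged sketch: wrapping a weight-quantized Keras layer with MCT Quantizers.
# Class names come from the mct_quantizers package; the exact constructor
# arguments are assumptions and may differ across package versions.
try:
    from mct_quantizers import (
        KerasActivationQuantizationHolder,
        KerasQuantizationWrapper,
    )
    MCTQ_AVAILABLE = True
except ImportError:
    MCTQ_AVAILABLE = False


def wrap_layer(layer, weight_quantizers):
    """Wrap a Keras layer so its weights are quantized at inference (sketch)."""
    if not MCTQ_AVAILABLE:
        raise RuntimeError("mct_quantizers is not installed")
    # The wrapper holds per-weight inferable quantizers; activation
    # quantizers live in a separate KerasActivationQuantizationHolder layer
    # inserted after the wrapped layer.
    return KerasQuantizationWrapper(layer, weights_quantizers=weight_quantizers)
```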