Users who need to train a LightGBM model on datasets too large to fit in memory typically train either through the Booster API, which supports two_round loading of huge datasets, or through the command line, which exports a model.txt file that can only be reconstructed as a Booster. The ONNX converter currently supports neither path.
Most of the converter's work consists of processing information from the dumped model dictionary; the remaining information it needs can be inferred from that dictionary. Here we wrap the Booster's information to facilitate the conversion.
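As a sketch of what wrapping the Booster information can look like, the class below wraps the dictionary returned by `Booster.dump_model()` and exposes the fields a converter would read. The class name `WrappedBooster` and its attributes are hypothetical illustrations, not the converter's actual interface; the dict keys (`objective`, `num_class`, `tree_info`) follow LightGBM's dump format.

```python
# Illustrative wrapper around the dict returned by Booster.dump_model().
# The class name and attribute choices are hypothetical; the dict keys
# ("objective", "num_class", "tree_info") follow LightGBM's dump format.

class WrappedBooster:
    """Expose the fields a converter needs from a dumped model dict."""

    def __init__(self, dumped_model):
        self._dumped = dumped_model
        self.objective_ = dumped_model["objective"]
        # Infer the task type from the objective string.
        if self.objective_.startswith("binary"):
            self.classes_ = [0, 1]
        elif self.objective_.startswith("multiclass"):
            self.classes_ = list(range(dumped_model["num_class"]))
        else:
            self.classes_ = None  # regression / ranking objectives

    @property
    def n_trees(self):
        # One entry per boosted tree in the dumped model.
        return len(self._dumped["tree_info"])

    def dump_model(self):
        # Converters call dump_model(); return the cached dict directly.
        return self._dumped


# Minimal stand-in for a real dump: a binary model with two trees.
dumped = {
    "objective": "binary sigmoid:1",
    "num_class": 1,
    "tree_info": [{}, {}],
}
wrapped = WrappedBooster(dumped)
print(wrapped.classes_, wrapped.n_trees)  # [0, 1] 2
```

Caching the dump and re-serving it from `dump_model()` lets the same conversion code accept either a raw Booster or this wrapper without re-dumping the model.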
Multiclass models are not supported yet: the exported ONNX model has an issue with its ZipMap node.