.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_tutorial/plot_gexternal_lightgbm.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_tutorial_plot_gexternal_lightgbm.py:

.. _example-lightgbm:

Convert a pipeline with a LightGBM classifier
=============================================

.. index:: LightGBM

:epkg:`sklearn-onnx` only converts :epkg:`scikit-learn` models into *ONNX*,
but many libraries implement the :epkg:`scikit-learn` API so that their models
can be included in a :epkg:`scikit-learn` pipeline. This example considers a
pipeline including a :epkg:`LightGBM` model. :epkg:`sklearn-onnx` can convert
the whole pipeline as long as it knows the converter associated with a
*LGBMClassifier*. Let's see how to do it.

Train a LightGBM classifier
+++++++++++++++++++++++++++

.. GENERATED FROM PYTHON SOURCE LINES 22-52

.. code-block:: Python

    import onnxruntime as rt
    from skl2onnx import convert_sklearn, update_registered_converter
    from skl2onnx.common.shape_calculator import (
        calculate_linear_classifier_output_shapes,
    )
    from onnxmltools.convert.lightgbm.operator_converters.LightGbm import (
        convert_lightgbm,
    )
    from skl2onnx.common.data_types import FloatTensorType
    import numpy
    from sklearn.datasets import load_iris
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from lightgbm import LGBMClassifier

    data = load_iris()
    X = data.data[:, :2]
    y = data.target

    ind = numpy.arange(X.shape[0])
    numpy.random.shuffle(ind)
    X = X[ind, :].copy()
    y = y[ind].copy()

    pipe = Pipeline(
        [("scaler", StandardScaler()), ("lgbm", LGBMClassifier(n_estimators=3))]
    )
    pipe.fit(X, y)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.018751 seconds.
    You can set `force_col_wise=true` to remove the overhead.
    [LightGBM] [Info] Total Bins 47
    [LightGBM] [Info] Number of data points in the train set: 150, number of used features: 2
    [LightGBM] [Info] Start training from score -1.098612
    [LightGBM] [Info] Start training from score -1.098612
    [LightGBM] [Info] Start training from score -1.098612
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
    [LightGBM] [Warning] No further splits with positive gain, best gain: -inf

.. code-block:: none

    Pipeline(steps=[('scaler', StandardScaler()),
                    ('lgbm', LGBMClassifier(n_estimators=3))])


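The script above shuffles the data through numpy's global random state without a
seed, so the training order, and therefore the exact probabilities printed below,
can vary between runs. A seeded generator makes the example reproducible; this is
an optional sketch, not part of the generated script:

```python
import numpy

# Optional tweak (not in the original script): use a seeded Generator
# instead of numpy.random.shuffle so every run sees the same order.
rng = numpy.random.default_rng(42)
ind = rng.permutation(150)  # plays the role of numpy.arange(X.shape[0]) + shuffle

# A permutation reorders the indices without dropping or duplicating any.
assert sorted(ind) == list(range(150))
```

`X[ind, :]` and `y[ind]` can then be used exactly as in the script above.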
.. GENERATED FROM PYTHON SOURCE LINES 53-64

Register the converter for LGBMClassifier
+++++++++++++++++++++++++++++++++++++++++

The converter is implemented in :epkg:`onnxmltools`:
`onnxmltools...LightGbm.py `_
and the shape calculator:
`onnxmltools...Classifier.py `_.

.. GENERATED FROM PYTHON SOURCE LINES 64-73

.. code-block:: Python

    update_registered_converter(
        LGBMClassifier,
        "LightGbmLGBMClassifier",
        calculate_linear_classifier_output_shapes,
        convert_lightgbm,
        options={"nocl": [True, False], "zipmap": [True, False, "columns"]},
    )

.. GENERATED FROM PYTHON SOURCE LINES 74-76

Convert again
+++++++++++++

.. GENERATED FROM PYTHON SOURCE LINES 76-88

.. code-block:: Python

    model_onnx = convert_sklearn(
        pipe,
        "pipeline_lightgbm",
        [("input", FloatTensorType([None, 2]))],
        target_opset={"": 12, "ai.onnx.ml": 2},
    )

    # And save.
    with open("pipeline_lightgbm.onnx", "wb") as f:
        f.write(model_onnx.SerializeToString())

.. GENERATED FROM PYTHON SOURCE LINES 89-93

Compare the predictions
+++++++++++++++++++++++

Predictions with LightGBM.

.. GENERATED FROM PYTHON SOURCE LINES 93-97

.. code-block:: Python

    print("predict", pipe.predict(X[:5]))
    print("predict_proba", pipe.predict_proba(X[:1]))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    predict [2 0 0 0 1]
    predict_proba [[0.22814003 0.31657806 0.45528191]]

.. GENERATED FROM PYTHON SOURCE LINES 98-99

Predictions with onnxruntime.

.. GENERATED FROM PYTHON SOURCE LINES 99-105

.. code-block:: Python

    sess = rt.InferenceSession(
        "pipeline_lightgbm.onnx", providers=["CPUExecutionProvider"]
    )

    pred_onx = sess.run(None, {"input": X[:5].astype(numpy.float32)})
    print("predict", pred_onx[0])
    print("predict_proba", pred_onx[1][:1])

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    predict [2 0 0 0 1]
    predict_proba [{0: 0.22814001142978668, 1: 0.3165780305862427, 2: 0.45528194308280945}]

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 0.102 seconds)
.. _sphx_glr_download_auto_tutorial_plot_gexternal_lightgbm.py:

.. only:: html

    .. container:: sphx-glr-footer sphx-glr-footer-example

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: plot_gexternal_lightgbm.ipynb `

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: plot_gexternal_lightgbm.py `

        .. container:: sphx-glr-download sphx-glr-download-zip

            :download:`Download zipped: plot_gexternal_lightgbm.zip `

.. only:: html

    .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_