5 files changed: +26 -3 lines changed

  History
  -------

+ 9.8.0.dev1 (2024-02-28)
+ -----------------------
+
+ - Documenting and partially removing the need for Node.js in Pipelines.
+
  9.8.0.dev (2024-02-19)
  ----------------------

  import os
  import logging
  import warnings
+ import subprocess

  from bigml.fields import Fields, sorted_headers, get_new_fields
  from bigml.api import get_api_connection, get_dataset_id, get_status
  from bigml.flatline import Flatline
  from bigml.featurizer import Featurizer

+ process = subprocess.Popen(['node -v'], stdout=subprocess.PIPE, shell=True)
+ out = process.stdout.read()
+ FLATLINE_READY = out.startswith(b"v")
+ if FLATLINE_READY:
+     from bigml.flatline import Flatline
+
+
  #pylint: disable=locally-disabled,bare-except,ungrouped-imports
  try:
      # avoiding tensorflow info logging
@@ -211,6 +219,10 @@ def transform(self, input_data_list):
          rows = [self._input_array(input_data) for input_data in
                  input_data_list]
          if self.transformations:
+             if not FLATLINE_READY:
+                 raise ValueError("Nodejs should be installed to handle this"
+                                  " dataset's transformations. Please, check"
+                                  " the bindings documentation for details.")
              out_headers, out_arrays = self._transform(rows)
              rows = [dict(zip(out_headers, row)) for row
                      in out_arrays]
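
Note on the hunk above: the module-level check only inspects the output of ``node -v``, and ``transform`` raises a ``ValueError`` when a dataset defines transformations and that check failed. Below is a minimal standalone sketch of the same detection using only the standard library; ``shutil.which`` is an illustration choice here, not what the patch itself uses.

.. code-block:: python

    # Standalone sketch mirroring the FLATLINE_READY detection added above.
    # shutil.which avoids spawning a shell; the patch itself runs `node -v`
    # through subprocess.Popen with shell=True.
    import shutil
    import subprocess

    def flatline_ready():
        """Return True when a Node.js interpreter is on the PATH and runs."""
        node = shutil.which("node")
        if node is None:
            return False
        result = subprocess.run([node, "-v"], capture_output=True, check=False)
        return result.stdout.startswith(b"v")

    if __name__ == "__main__":
        if not flatline_ready():
            print("Node.js not found: Flatline-based dataset transformations"
                  " will raise ValueError in transform().")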

- __version__ = '9.8.0.dev'
+ __version__ = '9.8.0.dev1'

@@ -77,6 +77,11 @@ The bindings will also use ``simplejson`` if you happen to have it
  installed, but that is optional: we fall back to Python's built-in JSON
  libraries if ``simplejson`` is not found.

+ `Node.js <https://nodejs.org/en>`_ is not installed by default, but will be
+ needed for `Local Pipelines <local_resources.html#local-pipelines>`_ to work
+ when datasets containing newly added features are part of the transformation
+ workflow.
+
  The bindings provide support to use the ``BigML`` platform to create, update,
  get and delete resources, but also to produce local predictions using the
  models created in ``BigML``. Most of them will be actionable with the basic

@@ -2218,7 +2220,9 @@ the existing BigML objects and create the prediction pipeline.
  The first obvious goal that we may have is reproducing the same feature
  extraction and transformations that were used when training our data to create
  our model. That is achieved by using a ``BMLPipeline`` object built
- on the training dataset.
+ on the training dataset. Note that, if your datasets contain features derived
+ from the original fields in your data, ``Nodejs`` has to be installed
+ beforehand for the transformations to work locally.

  .. code-block:: python

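The example belonging to the ``code-block`` directive above lies outside this hunk. Purely as a hedged illustration of the note being added, a guarded local pipeline run might look like the sketch below; the ``BMLPipeline`` import path, constructor arguments, ``transform`` call, resource ID and field names are assumptions for illustration, not taken from this diff, and valid ``BIGML_USERNAME``/``BIGML_API_KEY`` credentials would be needed.

.. code-block:: python

    # Hedged sketch: rebuild the training transformations locally and surface
    # the Node.js requirement if the ValueError added in this change is raised.
    # The import path, constructor signature, resource ID and input fields are
    # placeholders/assumptions, not taken from this diff.
    from bigml.pipeline.pipeline import BMLPipeline

    pipeline = BMLPipeline("my_pipeline", ["dataset/61d3b0d3c2a86a6e3f000000"])
    try:
        rows = pipeline.transform([{"plasma glucose": 130, "age": 50}])
    except ValueError as exc:
        # Raised when the dataset defines Flatline-derived features and no
        # local Node.js installation is available.
        print("Install Node.js to run these transformations locally:", exc)
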
@@ -2489,7 +2491,6 @@ and libraries. A new data transformer can be created by deriving the
  to cover the particulars of the functions to be used in the generation of
  new fields.

-
  Local Evaluations
  -----------------
