
Commit b6c205a

Documenting and removing partially the need for Node.js in local Pipelines

1 parent ec0f553

5 files changed: 26 additions, 3 deletions

HISTORY.rst

Lines changed: 5 additions & 0 deletions
@@ -3,6 +3,11 @@
 History
 -------

+9.8.0.dev1 (2024-02-28)
+-----------------------
+
+- Documenting and removing partially the need for Node.js in Pipelines.
+
 9.8.0.dev (2024-02-19)
 ----------------------

bigml/dataset.py

Lines changed: 12 additions & 0 deletions
@@ -21,6 +21,7 @@
 import os
 import logging
 import warnings
+import subprocess

 from bigml.fields import Fields, sorted_headers, get_new_fields
 from bigml.api import get_api_connection, get_dataset_id, get_status
@@ -30,6 +31,13 @@
 from bigml.flatline import Flatline
 from bigml.featurizer import Featurizer

+process = subprocess.Popen(['node -v'], stdout=subprocess.PIPE, shell=True)
+out = process.stdout.read()
+FLATLINE_READY = out.startswith(b"v")
+if FLATLINE_READY:
+    from bigml.flatline import Flatline
+
+
 #pylint: disable=locally-disabled,bare-except,ungrouped-imports
 try:
     # avoiding tensorflow info logging
@@ -211,6 +219,10 @@ def transform(self, input_data_list):
         rows = [self._input_array(input_data) for input_data in
                 input_data_list]
         if self.transformations:
+            if not FLATLINE_READY:
+                raise ValueError("Nodejs should be installed to handle this"
+                                 " dataset's transformations. Please, check"
+                                 " the bindings documentation for details.")
             out_headers, out_arrays = self._transform(rows)
             rows = [dict(zip(out_headers, row)) for row
                     in out_arrays]
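The commit probes for Node.js by piping ``node -v`` through a shell and checking that the output starts with ``v``. A minimal sketch of an alternative probe, assuming only the Python standard library (the ``node_available`` helper is illustrative and not part of the bindings' API): ``shutil.which`` avoids ``shell=True`` entirely, and a timeout guards against a hung executable.

```python
# Illustrative alternative to the commit's `node -v` shell probe.
# The helper name is hypothetical; only stdlib calls are used.
import shutil
import subprocess


def node_available(timeout=5):
    """Return True when a working `node` executable is on the PATH."""
    node = shutil.which("node")  # resolves the executable without a shell
    if node is None:
        return False
    try:
        result = subprocess.run([node, "-v"], capture_output=True,
                                timeout=timeout, check=True)
    except (OSError, subprocess.SubprocessError):
        # covers a broken binary, a non-zero exit, and the timeout
        return False
    # `node -v` prints a version string such as b"v20.11.1\n"
    return result.stdout.startswith(b"v")


FLATLINE_READY = node_available()
```

Note that with ``shell=True`` and a list argument, as in the committed code, only the first list element is passed to the shell on POSIX, which happens to work for the single string ``'node -v'`` but is easy to misuse.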

bigml/version.py

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-__version__ = '9.8.0.dev'
+__version__ = '9.8.0.dev1'

docs/index.rst

Lines changed: 5 additions & 0 deletions
@@ -77,6 +77,11 @@ The bindings will also use ``simplejson`` if you happen to have it
 installed, but that is optional: we fall back to Python's built-in JSON
 libraries if ``simplejson`` is not found.

+`Node.js <https://nodejs.org/en>`_ is not installed by default, but will be
+needed for `Local Pipelines <local_resources.html#local-pipelines>`_ to work
+when datasets containing newly added features are part of the transformation
+workflow.
+
 The bindings provide support to use the ``BigML`` platform to create, update,
 get and delete resources, but also to produce local predictions using the
 models created in ``BigML``. Most of them will be actionable with the basic

docs/local_resources.rst

Lines changed: 3 additions & 2 deletions
@@ -2218,7 +2218,9 @@ the existing BigML objects and create the prediction pipeline.
 The first obvious goal that we may have is reproducing the same feature
 extraction and transformations that were used when training our data to create
 our model. That is achieved by using a ``BMLPipeline`` object built
-on the training dataset.
+on the training dataset. Note that, if your datasets contain features derived
+from the original fields in your data, ``Node.js`` must be installed
+beforehand for the transformations to work locally.

 .. code-block:: python

@@ -2489,7 +2491,6 @@ and libraries. A new data transformer can be created by deriving the
 to cover the particulars of the functions to be used in the generation of
 new fields.

-
 Local Evaluations
 -----------------
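Since datasets with derived features now fail inside ``transform`` when Node.js is absent, callers may prefer to fail fast before building the pipeline. A small sketch under that assumption; the ``require_node`` helper is hypothetical and not part of the bindings:

```python
# Illustrative fail-fast check; the requirement comes from the docs
# above, but this helper is not part of the bindings' API.
import shutil


def require_node():
    """Raise early if no `node` executable is on the PATH."""
    if shutil.which("node") is None:
        raise RuntimeError(
            "Node.js is required to run local pipeline transformations "
            "on datasets that contain derived features.")
```

Calling a helper like this right before instantiating the pipeline surfaces the missing dependency immediately, instead of raising later from inside ``transform``.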
