
[BUG] InferenceServerException: [StatusCode.INTERNAL] tuple index out of range #334

@radekosmulski

Description


Bug description

Running the test `tests/unit/tf/examples/test_usecase_transformers_next_item_prediction.py` in merlin-models now results in the following error. The error does not occur on the release-23.04 branch of systems.

tests/unit/tf/examples/test_usecase_transformers_next_item_prediction.py:48:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <testbook.client.TestbookNotebookClient object at 0x7f6442feb430>
cell = {'cell_type': 'code', 'execution_count': 19, 'id': '3cad9026', 'metadata': {'execution': {'iopub.status.busy': '2023-0...ta = validation_set_dataset.compute()\ninputs = convert_df_to_triton_input(wf.input_schema, validation_data.iloc[:4])'}
kwargs = {}
cell_indexes = [38, 39, 40, 41, 42, 43, ...]
executed_cells = [{'cell_type': 'markdown', 'id': '5edc6046', 'metadata': {}, 'source': 'After we export the ensemble, we are ready to ...a = validation_set_dataset.compute()\ninputs = convert_df_to_triton_input(wf.input_schema, validation_data.iloc[:4])'}]
idx = 41

    def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
        """
        Executes a cell or list of cells
        """
        if isinstance(cell, slice):
            start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
            if cell.step is not None:
                raise TestbookError('testbook does not support step argument')

            cell = range(start, stop + 1)
        elif isinstance(cell, str) or isinstance(cell, int):
            cell = [cell]

        cell_indexes = cell

        if all(isinstance(x, str) for x in cell):
            cell_indexes = [self._cell_index(tag) for tag in cell]

        executed_cells = []
        for idx in cell_indexes:
            try:
                cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)
            except CellExecutionError as ce:
>               raise TestbookRuntimeError(ce.evalue, ce, self._get_error_class(ce.ename))
E               testbook.exceptions.TestbookRuntimeError: An error occurred while executing the following cell:
E               ------------------
E               import tritonclient.grpc as grpcclient
E
E               with grpcclient.InferenceServerClient("localhost:8001") as client:
E                   response = client.infer('executor_model', inputs)
E               ------------------
E
E               ---------------------------------------------------------------------------
E               InferenceServerException                  Traceback (most recent call last)
E               Cell In[20], line 4
E                     1 import tritonclient.grpc as grpcclient
E                     3 with grpcclient.InferenceServerClient("localhost:8001") as client:
E               ----> 4     response = client.infer('executor_model', inputs)
E
E               File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:1431, in InferenceServerClient.infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers, compression_algorithm)
E                  1429     return result
E                  1430 except grpc.RpcError as rpc_error:
E               -> 1431     raise_error_grpc(rpc_error)
E
E               File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:62, in raise_error_grpc(rpc_error)
E                    61 def raise_error_grpc(rpc_error):
E               ---> 62     raise get_error_grpc(rpc_error) from None
E
E               InferenceServerException: [StatusCode.INTERNAL] tuple index out of range

/usr/local/lib/python3.8/dist-packages/testbook/client.py:135: TestbookRuntimeError
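For context: `[StatusCode.INTERNAL] tuple index out of range` is the message of a Python `IndexError` raised server-side and surfaced to the client through gRPC. It typically means some backend code indexed a shape dimension the request tensor did not have, e.g. treating a 1-D input as a batched 2-D tensor. A minimal pure-Python illustration of that failure mode (the function and shapes are hypothetical, not the actual Merlin executor code):

```python
def batch_size_from_shape(shape):
    # Hypothetical backend logic: assumes every input tensor is
    # batched, i.e. shape == (batch, features).
    return shape[1]  # raises IndexError on a 1-D shape

batched = (4, 10)   # what such code expects
unbatched = (4,)    # what it might actually receive

print(batch_size_from_shape(batched))  # -> 10

try:
    batch_size_from_shape(unbatched)
except IndexError as exc:
    # This is the message Triton relays back to the client as
    # InferenceServerException: [StatusCode.INTERNAL] tuple index out of range
    print(exc)
```

So the mismatch is likely in how `convert_df_to_triton_input` shapes the request versus what the model on the server now expects, rather than in the client call itself.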

Steps/Code to reproduce bug

Run `tests/unit/tf/examples/test_usecase_transformers_next_item_prediction.py` in the models repository.

Expected behavior

The test passes.

Environment details

Current main across all repos, 23.02 TF container


Labels

P1, bug
