Description
Defining a GufeTokenizable with a tuple of GufeTokenizable objects raises a counterintuitive exception. The code block below demonstrates that even though the Foo class is serializable (it can generate a key when instantiated on its own), instantiating Baz, a GufeTokenizable that holds a tuple of Foo instances, reports that Foo is not JSON serializable. In contrast, instantiating the Bar GufeTokenizable, which uses a list instead of a tuple, produces the expected serializable object.
The wording of the error (traceback below) is misleading and appears to result from tuple being absent from the isinstance checks in gufe.tokenization.modify_dependencies.
This is also inconsistent with how the Python json library treats list and tuple: both are serialized to a JSON array, which is always deserialized as a Python list. I propose that instead of only checking isinstance(obj, list) in modify_dependencies, we check isinstance(obj, (list, tuple)).
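For comparison, the standard library treats the two container types identically. The snippet below is stdlib-only (not gufe code) and just shows a tuple being encoded to the same JSON array as a list and coming back as a list:

import json

# Both containers encode to the same JSON array...
assert json.dumps([1, 2, 3]) == json.dumps((1, 2, 3)) == "[1, 2, 3]"
# ...and a JSON array always decodes back to a Python list.
assert json.loads(json.dumps((1, 2, 3))) == [1, 2, 3]

The reproducer and full traceback follow.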
from gufe.tokenization import GufeTokenizable


class Foo(GufeTokenizable):
    @classmethod
    def _defaults(cls): return {}

    def _to_dict(self): return {}

    @classmethod
    def _from_dict(cls, dct): return {}


class Bar(GufeTokenizable):
    # initialize all Bar instances with a list of 3 Foo instances
    def __init__(self):
        self.foos = {"foos": [Foo(), Foo(), Foo()]}

    @classmethod
    def _defaults(cls): return {}

    def _to_dict(self): return {"foos": self.foos}

    @classmethod
    def _from_dict(cls, dct): return cls(**dct)


class Baz(GufeTokenizable):
    # initialize all Baz instances with a tuple of 3 Foo instances
    def __init__(self):
        self.foos = (Foo(), Foo(), Foo())

    @classmethod
    def _defaults(cls): return {}

    def _to_dict(self): return {"foos": self.foos}

    @classmethod
    def _from_dict(cls, dct): return cls(**dct)


if __name__ == "__main__":
    foo = Foo()
    bar = Bar()
    baz = Baz()

Traceback (most recent call last):
File "[...]/network/error.py", line 42, in <module>
baz = Baz()
File "[...]/gufe/tokenization.py", line 76, in __call__
key = instance.key
^^^^^^^^^^^^
File "[...]/gufe/tokenization.py", line 422, in key
token = self._gufe_tokenize()
File "[...]/gufe/tokenization.py", line 354, in _gufe_tokenize
return tokenize(self)
File "[...]/gufe/tokenization.py", line 1226, in tokenize
dumped = json.dumps(
obj.to_keyed_dict(include_defaults=False),
sort_keys=True,
cls=JSON_HANDLER.encoder,
)
File "[...]/lib/python3.13/json/__init__.py", line 238, in dumps
**kw).encode(obj)
~~~~~~^^^^^
File "[...]/lib/python3.13/json/encoder.py", line 200, in encode
chunks = self.iterencode(o, _one_shot=True)
File "[...]/lib/python3.13/json/encoder.py", line 261, in iterencode
return _iterencode(o, 0)
File "[...]/gufe/serialization/json.py", line 138, in default
return json.JSONEncoder.default(self, obj)
~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
File "[...]/lib/python3.13/json/encoder.py", line 180, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
f'is not JSON serializable')
TypeError: Object of type Foo is not JSON serializable
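For illustration only, here is a minimal sketch of the shape of the proposed change. The _walk function below is a hypothetical stand-in written for this issue, not the real gufe.tokenization.modify_dependencies (whose signature and recursion details differ); the only point is the widened isinstance check on the container branch.

def _walk(obj, modifier, is_mine):
    """Apply `modifier` to every nested value for which `is_mine` is True."""
    if is_mine(obj):
        return modifier(obj)
    if isinstance(obj, dict):
        return {key: _walk(value, modifier, is_mine) for key, value in obj.items()}
    if isinstance(obj, (list, tuple)):  # previously only: isinstance(obj, list)
        return [_walk(item, modifier, is_mine) for item in obj]
    return obj


if __name__ == "__main__":
    nested = {"foos": (1, 2, 3)}
    # With the widened check, the tuple is traversed (and normalized to a list),
    # mirroring how json round-trips it, instead of falling through unhandled.
    print(_walk(nested, modifier=lambda x: x * 10, is_mine=lambda x: isinstance(x, int)))
    # prints: {'foos': [10, 20, 30]}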