Protobufs are currently deeply integrated into the carball architecture, serving as the primary internal data structure for the analysis phase, not just an export format.
- Data Backbone: The `AnalysisManager` initializes a `game_pb2.Game` object immediately. As the replay is parsed and analyzed, data is populated directly into this protobuf structure.
  - Metadata: `ApiGame`, `ApiPlayer`, etc., convert parsed JSON data into protobuf fields.
  - Stats: `StatsManager` and individual stat calculators (e.g., `BoostStat`, `HitAnalysis`) write calculated statistics directly into the protobuf hierarchy (e.g., `player.stats.boost`).
  - Events: `EventsCreator` generates `Hit`, `Goal`, and `Demo` protobuf objects and appends them to the game object.
- Serialization: The `ProtobufManager` handles reading and writing these objects to files (`.pts`), which is the primary way analysis results are saved and loaded.
- Build Process: Currently, `setup.py` attempts to run `protoc` (the Protocol Buffer compiler) on the user's machine during installation (`PostInstallCommand`). This requires the user to have `protoc` installed and on their `PATH`, which is a major friction point and source of errors ("bane of existence").
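Since the generated `game_pb2` module isn't available outside a built carball checkout, the mutate-in-place pattern described above can be sketched with pure-Python stand-ins (all class and field names below are illustrative, not carball's real schema):

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for the generated game_pb2 classes; the real
# field names come from carball's .proto definitions.
@dataclass
class BoostStats:
    boost_usage: float = 0.0

@dataclass
class PlayerStats:
    boost: BoostStats = field(default_factory=BoostStats)

@dataclass
class Player:
    name: str = ""
    stats: PlayerStats = field(default_factory=PlayerStats)

@dataclass
class Game:
    players: list = field(default_factory=list)

# The analysis phase shares one game object and mutates it in place,
# mirroring how a stat calculator writes into player.stats.boost.
game = Game()
player = Player(name="Player1")
game.players.append(player)
player.stats.boost.boost_usage = 1234.5  # written by a stat calculator
print(game.players[0].stats.boost.boost_usage)  # 1234.5
```

The key point is that every stage writes into the same shared object, so the protobuf schema effectively defines carball's internal API, not just its file format.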
**Option 1: Replace Protobufs with Native Python Objects**

This option involves replacing the protobuf classes with native Python objects (e.g., dataclasses or Pydantic models). This removes the dependency on `protoc` and the `protobuf` library entirely.
**Pros:**

- Zero Build Dependencies: No need for `protoc` or complex build steps. Pure Python.
- Easier Debugging: Native Python objects are often easier to inspect and modify than generated protobuf classes.
- Flexibility: Easier to add helper methods or properties to the data classes.
**Cons:**

- High Effort: Requires a substantial refactor. Every file importing `carball.generated` (which is most of the analysis logic) needs modification.
- Performance: Protobufs are generally faster and more compact for serialization than standard JSON/pickle, though for this use case the difference is likely negligible compared to the analysis time itself.
- Breaking Change: Any external tools relying on the `.pts` (protobuf) output format will break.
**Migration Steps:**

- Define Data Models: Create a new module (e.g., `carball.models`) with Python `dataclasses` that mirror the structure of the existing `.proto` files (`Game`, `Player`, `Team`, `Stats`, etc.).
- Update Analysis Manager: Modify `AnalysisManager` to initialize the new `Game` dataclass instead of `game_pb2.Game`.
- Refactor Stats & Events: Systematically go through `carball/analysis/stats/` and `carball/analysis/events/` to update all references from `proto_game.field` to `game_model.field`.
- Update Metadata Parsers: Rewrite `ApiGame`, `ApiPlayer`, etc., to populate the dataclasses.
- Implement Serialization: Create a new `JsonManager` (or similar) to handle saving/loading the dataclasses to disk (likely using `json` or `pickle`).
- Cleanup: Remove the `api/` directory, `utils/create_proto.py`, the `setup.py` hooks, and the `protobuf` dependency.
- Fix Tests: Update the test suite to assert against the new data models.
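The data-model and serialization steps can be sketched as follows. This is a minimal illustration, not carball's real schema: the class names, the `JsonManager` API, and the field selection are all assumptions.

```python
import json
from dataclasses import asdict, dataclass, field
from typing import List

# Hypothetical dataclass mirror of a small slice of the .proto schema.
@dataclass
class BoostStats:
    boost_usage: float = 0.0

@dataclass
class PlayerStats:
    boost: BoostStats = field(default_factory=BoostStats)

@dataclass
class Player:
    name: str = ""
    stats: PlayerStats = field(default_factory=PlayerStats)

@dataclass
class Game:
    id: str = ""
    players: List[Player] = field(default_factory=list)

class JsonManager:
    """Would replace ProtobufManager: dataclasses <-> JSON on disk."""

    @staticmethod
    def write_game(game: Game, path: str) -> None:
        with open(path, "w") as f:
            json.dump(asdict(game), f)  # asdict recursively flattens nesting

    @staticmethod
    def read_game(path: str) -> Game:
        with open(path) as f:
            data = json.load(f)
        # Rebuild the nested dataclasses from plain dicts.
        players = [
            Player(name=p["name"],
                   stats=PlayerStats(boost=BoostStats(**p["stats"]["boost"])))
            for p in data["players"]
        ]
        return Game(id=data["id"], players=players)
```

The hand-written `read_game` reconstruction is the hidden cost of this option: protobuf generates that deserialization code for free, whereas with plain dataclasses it must be maintained by hand (one argument for Pydantic, which derives it from the type annotations).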
**Option 2: Keep Protobufs, Fix Generation and Packaging**

This option keeps the protobuf code but fixes the "version hell" and installation issues by changing how the code is generated and packaged.
**Pros:**

- Lower Effort: Preserves the existing logic and data structures.
- Backward Compatibility: Keeps the `.pts` format for existing tools.
- Type Safety: Generated protobuf code provides strict typing.
**Cons:**

- Retains Dependency: Still depends on the `protobuf` runtime library (though no longer on the compiler for the end user).
**Migration Steps:**

- Pre-Generate Code: Shift the responsibility of running `protoc` from the end user (at install time) to the developer/CI (at build time).
- Remove Setup Hooks: Delete `PostInstallCommand` and `PostDevelopCommand` from `setup.py`. The `pip install` process should never run `protoc`.
- Include Generated Files: Update `MANIFEST.in` to explicitly include the generated `_pb2.py` files in the source distribution (sdist) and wheels.
  - Crucial: the generated files must be present in the package uploaded to PyPI.
- Pin Dependencies: In `setup.py`, pin the `protobuf` runtime library to a version compatible with the generated code (e.g., `protobuf>=3.0.0,<4.0.0` or similar, depending on which `protoc` version is used).
- CI Automation: Configure GitHub Actions to:
  - Install `protoc`.
  - Run `utils/create_proto.py` to generate the Python files.
  - Build the package (wheel/sdist).
  - Publish to PyPI.
- Version Control (Optional): You could commit the generated `_pb2.py` files to the repo (so they are always there), or just ensure they are generated before packaging. Committing them is often the easier way to guarantee they are present.
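The CI steps above could be wired together roughly like this. The workflow name, trigger, output path of the generated files, and the `create_proto.py` invocation are assumptions based on the repo layout described above, so treat this as a sketch rather than a drop-in file:

```yaml
# .github/workflows/release.yml (sketch)
name: build-and-publish
on:
  release:
    types: [published]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      # Install protoc in CI so generation never happens on user machines
      - run: sudo apt-get update && sudo apt-get install -y protobuf-compiler
      # Generate the _pb2.py files before packaging
      - run: python utils/create_proto.py
      - run: pip install build twine
      # Build sdist + wheel; both must contain the generated files
      - run: python -m build
      - run: twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
```

For the sdist to actually pick up the generated files, `MANIFEST.in` would also need a matching line such as `recursive-include carball/generated *_pb2.py` (the exact path depends on where `create_proto.py` writes its output).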
If your primary pain point is "ridiculous version hell / `protoc` mismatch / generated files missing" during installation or usage, Option 2 is the most pragmatic and immediate fix. It solves the user-facing problem completely: users just `pip install carball` and it works, because the generated files are already inside the package.
If you fundamentally dislike Protobufs or want to remove the dependency for other reasons (e.g., binary size, complexity), Option 1 is cleaner in the long run but requires significantly more work upfront.