
Conversation

@Skyline-23

PR Summary (feat/coreml)

Overview

This branch isolates CoreML integration (no actor refactor) and makes it buildable/usable via the CLI with a CoreML-only flag. Sync APIs remain primary; async wrappers are thin bridges.

Key changes

  • Sendable fixes for CoreML path
  • Zenz and ZenzCoreMLService are marked @unchecked Sendable to unblock async bridging.
    • blockingAsync now accepts @Sendable closures via Task.detached; the CoreML personalization handle is mapped explicitly to a tuple.
  • macOS version handling for CoreML
    • When ZenzaiCoreML trait is used, macOS linker gets a -platform_version macos 15.0 15.0 hint to silence xcframework version warnings while keeping the default macOS 13 baseline for non-CoreML builds.
  • CLI installer improvements
    • Supports --zenzai-coreml (plus optional --debug).
    • Copies the required resource bundle AzooKeyKanaKanjiConverter_KanaKanjiConverterModuleWithDefaultDictionary.bundle into /usr/local/bin/.
    • For CoreML builds, skips copying llama.framework; for Zenzai/ZenzaiCPU builds, still expects it.
  • Adds a support page for English/Korean.
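The @Sendable bridging pattern from the first bullet group can be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation: the name `blockingAsync` comes from the summary above, but the signature is assumed, and the CoreML personalization-handle-to-tuple mapping is omitted.

```swift
// Minimal sketch (assumed signature) of bridging a blocking, synchronous
// API into async Swift: the @Sendable closure is hopped onto a detached
// task so the caller can await it without blocking the current executor.
func blockingAsync<T: Sendable>(_ work: @escaping @Sendable () -> T) async -> T {
    await Task.detached {
        // Runs the blocking work off the caller's executor.
        work()
    }.value
}
```

Requiring `@Sendable` here is what makes the `@unchecked Sendable` annotations on `Zenz` and `ZenzCoreMLService` necessary, since the closure captures them across the task boundary.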

How to build and install (CoreML)

sudo rm -rf .build   # clear stale caches if needed
swift build --traits ZenzaiCoreML
./install_cli.sh --zenzai-coreml --debug   # drop --debug for release
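The trait-gated linker hint mentioned under "macOS version handling" might be wired up along these lines in Package.swift. This is a sketch only: the target name is assumed, and the mechanism for restricting the flags to the ZenzaiCoreML trait (omitted here) depends on the SwiftPM traits API in use.

```swift
// Hypothetical Package.swift excerpt. The -platform_version triple tells
// the macOS linker to treat the product as targeting macOS 15.0, which
// silences the xcframework version-mismatch warnings noted above while
// the package manifest keeps its macOS 13 baseline.
.target(
    name: "KanaKanjiConverterModule",
    linkerSettings: [
        .unsafeFlags([
            "-Xlinker", "-platform_version",
            "-Xlinker", "macos",
            "-Xlinker", "15.0",
            "-Xlinker", "15.0",
        ], .when(platforms: [.macOS]))
    ]
)
```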

Known warnings / notes

  • updateIfRequired(options:) in KanaKanjiConverter is deprecated but behavior is unchanged; can be refactored later.
  • Minor warnings remain (unused intention in InputTable.swift, redundant try in ZenzContext+CoreML.swift) with no behavioral impact.
  • CoreML xcframeworks are built for macOS 15.5; the trait-specific linker hint mitigates version mismatch noise.


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines +120 to +146
#else
package func getModel(modelURL: URL) -> Zenz? {
    self.zenzStatus = "zenz-v2 model unavailable on this platform"
    return nil
}
#endif

P1 Badge Non-CoreML builds never load zenz models

When ZenzaiCoreML is not set, getModel now always marks the model as unavailable and returns nil, but the rest of the pipeline still checks zenzaiMode.enabled. On Zenzai/ZenzaiCPU builds this means the llama backend is never loaded and convertToLattice falls through to the non-zenz path, so enabling zenzaiMode no longer produces any zenz candidates or next-character predictions even though the trait and weight URL are provided—a regression from the previous behavior.
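A fix along the lines the review suggests would keep the llama-backed loader for Zenzai/ZenzaiCPU builds and only stub out platforms with no zenz backend at all. The sketch below is self-contained for illustration: `Zenz` is a stub type, and the compilation conditions mirror the package traits by name only.

```swift
// Stub standing in for the real Zenz model type.
struct Zenz { let backend: String }

#if ZenzaiCoreML
// CoreML trait: load the CoreML-backed model.
func getModel() -> Zenz? {
    Zenz(backend: "coreml")
}
#elseif Zenzai || ZenzaiCPU
// Non-CoreML zenz builds keep loading the llama backend, as before
// this PR, instead of falling through to "unavailable".
func getModel() -> Zenz? {
    Zenz(backend: "llama")
}
#else
// Only platforms with no zenz backend at all report unavailability.
func getModel() -> Zenz? {
    nil
}
#endif
```

The key point is the `#elseif` branch: without it, enabling `zenzaiMode` on a Zenzai/ZenzaiCPU build silently produces no zenz candidates, which is the regression flagged above.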


@Skyline-23 (Author) commented on Nov 25, 2025


This should be resolved in 5a07f7d.

Adds support for CoreML-based Zenzai conversion, leveraging the `ZenzCoreMLBackend` for improved performance and efficiency on compatible devices.

This includes:
- Integration of the CoreML model for stateful generation.
- A new `ZenzContext+CoreML.swift` to manage the CoreML inference process.
- Updates to `Package.swift` to include `swift-transformers` dependency.
- Conditional compilation to switch between CoreML and CPU/Metal implementations based on the `ZenzaiCoreML` flag.
- Adds tokenizer files and marks mlpackage for LFS.

The changes aim to provide a faster and more resource-efficient Zenzai conversion method on Apple platforms with Neural Engine support.
Because the swift-transformers package uses Combine, it is not compatible with Windows.
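One common way to handle that incompatibility is to make the dependency conditional per platform. The excerpt below is a hypothetical Package.swift sketch, not the PR's actual manifest; the target name is assumed, though `Transformers` is the product name swift-transformers exposes.

```swift
// Hypothetical excerpt: swift-transformers depends on Combine, so it is
// only linked on Apple platforms; Windows and Linux builds skip it.
.target(
    name: "KanaKanjiConverterModule",
    dependencies: [
        .product(
            name: "Transformers",
            package: "swift-transformers",
            condition: .when(platforms: [.macOS, .iOS])
        )
    ]
)
```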
Enables building the application with CoreML support for enhanced performance on Apple devices.

This allows users to leverage CoreML acceleration by using the `--zenzai-coreml` flag during installation.
Removes the explicit LFS bootstrap step from CI workflows as it is no longer required due to changes in how dependencies are managed. This simplifies the CI configuration and reduces build times.

Adds documentation for development setup in multiple languages, covering build instructions, testing and devcontainer usage. This enhances the developer experience by providing clear and comprehensive instructions in English, Japanese and Korean.

Improves Zenzai integration, adding CoreML support and cross-platform compatibility with Swift 6 concurrency. This allows for high-precision neural Kana-Kanji conversion on Apple devices using CoreML, while ensuring compatibility and performance on both Darwin and Linux platforms.