Releases: CodeWithKyrian/transformers-php
v0.4.1
What's Changed
- configuration.md: fix indentation of Transformers::setup() by @k00ni in #35
- PretrainedTokenizer::truncateHelper: prevent array_slice() error for flawed text input (summarization) by @k00ni in #36
- Fix bug with Download CLI - use named parameters for model construct by @CodeWithKyrian in #39
Full Changelog: 0.4.0...0.4.1
v0.4.0
This release marks a significant milestone in enhancing the performance and functionality of the Tensor class while introducing convenient tools to streamline the installation of essential dependencies. These improvements not only optimize existing operations but also pave the way for future enhancements and expanded capabilities within the project.
What's Changed
- New Inference Session: The InferenceSession has been overhauled to receive Tensor inputs directly, simplifying memory copying when converting Tensor objects to ONNX tensors.
- Overhaul Tensor Buffer Implementation: The Tensor class has been revamped to use the OpenBLAS and Rindow Matlib C shared libraries, delivering massive performance improvements in Tensor operations.
- PHP Buffer Fallback: If the C-based buffer fails for any reason, a pure-PHP buffer implementation serves as a fallback; it is slower, but prevents errors.
- OpenMP Integration: Tensor operations can be further optimized by leveraging OpenMP's parallel execution, with an optional fallback to the non-OpenMP alternatives when OpenMP isn't installed.
- New Tensor Methods: Several new methods, including `topk`, `divide`, and `slice`, have been added to the Tensor class, along with corresponding changes to existing implementations to leverage them.
- Refactor Stack Method: The `stack` method in the Tensor class has been refactored for enhanced performance.
- Move Thumbnail Method: The `thumbnail` method has been relocated from the feature extractor to the Image class for improved organization.
- Code Cleanup and Style Review: The codebase has undergone cleanup and style review to ensure consistency and readability.
- Optimize Image <-> Tensor Conversion: Conversion between Image and Tensor objects, in both directions, has been sped up, improving overall performance for image-related tasks.
- Image Driver Configuration: The image driver can still be set in the `Transformers` class, but it can now also be set directly on the `Image` class, allowing it to be used independently.
- Introduce Libraries Loader: A new library loader package has been introduced to automate the downloading of required shared libraries, such as `onnxruntime`, `openblas`, and `rindow-matlib`, during the Composer install process.
- TinyLlama Support: Add support for the TinyLlama model by @CodeWithKyrian
- Install Command Returned: The `install` command is back, serving as an alternative way of getting the shared libraries if the automatic download fails during `composer install`.
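The release notes don't show signatures for the new Tensor methods, but the semantics of a `topk` operation can be sketched in plain PHP. This is a conceptual illustration only, not the library's actual `Tensor::topk` implementation:

```php
<?php
// Conceptual sketch of a topk operation: return the k largest
// values and their original indices, ordered descending.
// NOT the actual Tensor::topk implementation, just the idea.
function topk(array $values, int $k): array
{
    arsort($values); // sort descending, preserving original keys
    $top = array_slice($values, 0, $k, preserve_keys: true);
    return [
        'values'  => array_values($top),
        'indices' => array_keys($top),
    ];
}

$result = topk([0.1, 0.7, 0.2, 0.9], 2);
// $result['values']  => [0.9, 0.7]
// $result['indices'] => [3, 1]
```

A real tensor implementation would operate on a flat typed buffer along a given axis, which is where the OpenBLAS/Matlib-backed buffers mentioned above pay off.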
New Contributors
- @das-peter made their first contribution in #30
Full Changelog: 0.3.1...0.4.0
v0.3.1
What's Changed
- Add Qwen2 model support by @CodeWithKyrian in #20
- Add chat input detection for text generation, and refactor streamer API. by @CodeWithKyrian in #21
- bugfix: Fix error that occurs when streamer is not used by @CodeWithKyrian in #22
- bugfix: Decoder sequence not calling the right method by @CodeWithKyrian in #23
Full Changelog: 0.3.0...0.3.1
v0.3.0
What's Changed
- Add Image Classification pipelines support by @CodeWithKyrian in #9
- Add Zero shot Image Classification pipelines support by @CodeWithKyrian in #9
- Add New Image Driver - VIPS by @CodeWithKyrian in #10
- Add Object Detection Pipeline by @CodeWithKyrian in #11
- Download ONNXRuntime automatically after composer install by @CodeWithKyrian in #12
- Add Zero Shot Object Detection Pipeline and OwlVit models by @CodeWithKyrian in #14
- Improve tensor performance by @CodeWithKyrian in #13
- Set [MASK] usage in prompts for default Xenova/bert-base-uncased model by @takielias in #15
- Add image feature extraction pipeline by @CodeWithKyrian in #16
- Add image to image pipeline by @CodeWithKyrian in #17
- bugfix: https slashes affected when joining paths by @CodeWithKyrian in #19
Breaking Changes
- The install command no longer exists, as the required libraries are downloaded automatically on composer install.
- New image driver configuration settings have been added, requiring either GD, Imagick, or Vips.
New Contributors
- @takielias made their first contribution in #15
Full Changelog: 0.2.2...0.3.0
v0.2.2
What's new
- bugfix: Fix the wrong argument being passed in Autotokenizer by @CodeWithKyrian in 05e5588
- feat: cache tokenizer output to improve speed in repetitive tasks, yielding a 75% speed improvement (11.7687s to 2.9687s) by @CodeWithKyrian in b115c28
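The commit referenced above isn't detailed here, but the general technique (memoizing an expensive, deterministic step by its input) can be sketched in plain PHP. The class and the stand-in "tokenization" below are hypothetical, not the library's actual code:

```php
<?php
// Conceptual sketch of output caching for repetitive tasks:
// memoize results keyed by the input string, so repeated
// tokenization of the same text skips the expensive work.
// Hypothetical example, not the actual tokenizer internals.
class MemoizedTokenizer
{
    private array $cache = [];
    public int $misses = 0;

    public function tokenize(string $text): array
    {
        if (!isset($this->cache[$text])) {
            $this->misses++;
            // Stand-in for the expensive encoding step:
            $this->cache[$text] = explode(' ', strtolower($text));
        }
        return $this->cache[$text];
    }
}

$tok = new MemoizedTokenizer();
$tok->tokenize('Hello world');
$tok->tokenize('Hello world'); // served from cache; misses stays at 1
```

Caching like this trades memory for speed, which is why it shines in repetitive tasks such as batch summarization over overlapping inputs.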
Full Changelog: 0.2.1...0.2.2
v0.2.1
What's Changed
- bugfix: Add symfony/console explicitly as a dependency by @CodeWithKyrian in #7
- bugfix: Autoload errors for `WordPieceTokenizer` on case-sensitive operating systems in 0f1fc8b
Full Changelog: 0.2.0...0.2.1
v0.2.0
What's Changed
- feat: Add ability to use chat templates in Text Generation by @CodeWithKyrian in #1
- bugfix: Autoload errors for `PretrainedModel` on case-sensitive operating systems by @CodeWithKyrian in #4
- feat: Bump OnnxRuntime PHP to 0.2.0 in b333162
- feat: Improve download and install command interfaces to show progress bar in b333162
Full Changelog: 0.1.0...0.2.0
v0.1.0
Initial Release 🎉
We are thrilled to announce the launch of Transformers PHP, a groundbreaking library that brings the power of state-of-the-art machine learning to the PHP community. Inspired by the HuggingFace Transformers and Xenova Transformers.js, Transformers PHP aims to provide an easy-to-use, high-performance toolset for developers looking to integrate advanced NLP capabilities (and potentially more in future updates) into their PHP applications.
Key Features:
- Seamless Integration: Designed to be functionally equivalent to its Python counterpart, making the transition and usage straightforward for developers familiar with the original Transformers library.
- Performance Optimized: Utilizes ONNX Runtime for efficient model inference, ensuring high performance even in demanding scenarios.
- Comprehensive Model Support: Access to thousands of pre-trained models across 100+ languages, covering a wide range of tasks including text generation, summarization, translation, sentiment analysis, and more.
- Easy Model Conversion: With 🤗 Optimum, easily convert PyTorch or TensorFlow models to ONNX format for use with Transformers PHP.
- Developer Friendly: From installation to deployment, every aspect of Transformers PHP is designed with ease of use in mind, featuring extensive documentation and a streamlined API.
Getting Started:
Installation is a breeze with Composer:
```shell
composer require codewithkyrian/transformers
```

Then initialize the library to download the necessary shared libraries for ONNX:

```shell
./vendor/bin/transformers install
```

Checkout the Documentation
For a comprehensive guide on how to use Transformers PHP, including detailed examples and configuration options, visit our documentation.
Pre-Download Models:
To ensure a smooth user experience, especially with larger models, we recommend pre-downloading models before deployment. Transformers PHP includes a handy CLI tool for this purpose:
```shell
./vendor/bin/transformers download <model_identifier>
```

What's Next?
This initial release lays the groundwork for a versatile machine learning toolkit within the PHP ecosystem. We are committed to continuous improvement and expansion of Transformers PHP, with future updates aimed at increasing supported tasks, enhancing functionality, and broadening the scope of models.
Get Involved!
We encourage feedback, contributions, and discussions from the community. Whether you're reporting bugs, requesting features, or contributing code, your input is invaluable in making Transformers PHP better for everyone.
Acknowledgments:
A huge thank you to Hugging Face for their incredible work on the Transformers library, to Xenova for inspiring this package, and to the broader machine learning community for their ongoing research and contributions. Transformers PHP stands on the shoulders of giants, and we are excited to see how it will empower PHP developers to push the boundaries of what's possible.