Releases: Blarc/ai-commits-intellij-plugin
v2.10.1
Fixed
- Common branch is not detected when most files are not associated with any branch, even though at least one file is.
- Cannot generate commit message when updating a submodule.
v2.10.0
v2.9.1
v2.9.0
Added
- More options for configuring LLM clients.
- Use the chosen LLM client icon as the generate commit message action's icon.
- Option to stop the commit message generation by clicking the action icon again.
- Setting for HuggingFace client to automatically remove prompt from the generated commit message.
- Show progress and result when refreshing models via API.
- Support for Mistral AI.
Fixed
- The progress bar for generating a commit message continues running after the user creates the commit.
v2.8.0
Added
- Support streaming mode for Gemini Google.
- Support GitHub models client.
- Theme based icons for better visibility.
Fixed
- Project-specific locale is not used when creating the prompt.
- Properties topP and topK are not used when verifying Gemini Google client configuration.
v2.7.1
Added
- Option to set top K and top P in Gemini Google client settings.
Fixed
- Unable to submit a request to Gemini Google because topK is set to 64 but the supported range is from 1 (inclusive) to 41 (exclusive).
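A minimal sketch of what such a fix might look like (the method name `clampTopK` is hypothetical, not taken from the plugin's source): clamp the configured topK into the range Gemini accepts, 1 (inclusive) to 41 (exclusive), i.e. 1..40.

```java
// Hypothetical sketch: keep a user-configured topK inside Gemini's
// supported range of 1 (inclusive) to 41 (exclusive).
public class TopKClamp {
    static int clampTopK(int configured) {
        // Clamp into [1, 40]; 41 itself is excluded by the API.
        return Math.max(1, Math.min(configured, 40));
    }

    public static void main(String[] args) {
        System.out.println(clampTopK(64)); // prints 40
        System.out.println(clampTopK(10)); // prints 10
    }
}
```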
v2.7.0
Added
- Support for Gemini Google.
- Save the size of the dialog for adding prompts, if it's resized by the user.
Changed
- Rename Gemini to Gemini Vertex.
- Use the correct icon for Gemini Vertex.
Fixed
- Project-specific prompt is not saved properly.
v2.6.0
Added
- Support streaming response.
- Support for Hugging Face.
v2.5.0
Added
- Support for Azure OpenAI.
- Sort LLM client configurations by provider name and configuration name.
Changed
- Update default prompt for generating commit messages with GitMoji.
Fixed
- OpenAI configuration setting organizationId is not used when verifying configuration.
- Gemini configuration settings projectId and location are not used when verifying configuration.
- Notification about common branch is shown after the prompt dialog is closed.
- Invalid caret position for prompt preview.
v2.4.1
Fixed
- Setting LLM client configuration or prompt as project specific does not work.