
Weird behavior with "codellama/CodeLlama-13b-hf" #113

@icnahom

Description


Suggestions are not displayed when using CodeLlama. This is not the case with StarCoder, which shows the suggestion on the line it was triggered from.

Here are the attempts:

Requesting from line 5: nothing is displayed.

[screenshot]

Requesting from line 6, with one space to the right: the suggestion is shown.

[screenshot]

Settings

{
    "llm.fillInTheMiddle.enabled": true,
    "llm.fillInTheMiddle.prefix": "<PRE> ",
    "llm.fillInTheMiddle.middle": " <MID>",
    "llm.fillInTheMiddle.suffix": " <SUF>",
    "llm.temperature": 0.2,
    "llm.contextWindow": 4096,
    "llm.tokensToClear": [
        "<EOT>"
    ],
    "llm.tokenizer": {
        "repository": "codellama/CodeLlama-13b-hf"
    },
    "llm.enableAutoSuggest": true,
    "llm.maxNewTokens": 256,
    "llm.configTemplate": "codellama/CodeLlama-13b-hf",
    "llm.modelIdOrEndpoint": "codellama/CodeLlama-13b-hf"
}
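For context, here is a minimal sketch of how settings like these are typically combined into a fill-in-the-middle prompt (the function and variable names below are my own illustration, not the extension's actual code): the text before the cursor follows the prefix token, the text after the cursor follows the suffix token, and the middle token asks the model to generate the missing span. Note that the configured tokens carry surrounding spaces ("<PRE> ", " <SUF>", " <MID>"), which is one reason cursor position and whitespace can change the result.

```go
package main

import "fmt"

// buildFIMPrompt is a hypothetical sketch of prompt assembly from the
// settings above: prefix token + code before cursor + suffix token +
// code after cursor + middle token.
func buildFIMPrompt(prefixTok, suffixTok, middleTok, before, after string) string {
	return prefixTok + before + suffixTok + after + middleTok
}

func main() {
	before := "func main() {\n\t"
	after := "\n}"
	prompt := buildFIMPrompt("<PRE> ", " <SUF>", " <MID>", before, after)
	fmt.Println(prompt)
}
```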

Example Code

package main

import "fmt"

func main() {
	// Cursor here
	func() { messages <- "ping" }()
	msg := <-messages
	fmt.Println(msg)
}
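For reference, the completion expected at the cursor is presumably a channel declaration, since the lines below send to and receive from `messages`. A complete, runnable version of the snippet (my reconstruction, not the model's actual output; the send is moved onto a goroutine so the unbuffered channel does not deadlock) would look like:

```go
package main

import "fmt"

func main() {
	// The expected suggestion at the cursor: declare the channel
	// that the lines below write to and read from.
	messages := make(chan string)

	// Send on a separate goroutine so the unbuffered send does not block main.
	go func() { messages <- "ping" }()
	msg := <-messages
	fmt.Println(msg) // prints "ping"
}
```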

VSCode

Version: 1.85.0
