
Vercel Compatibility? #2219

@Droppix

Description

Describe the bug
I am trying to use LlamaIndex with Vercel AI SDK 5 (with LLMs such as GPT-OSS, DeepSeek, Gemini, etc.), but I must say that between your documentation and the actual code, very little matches up or works.

To Reproduce
Code to reproduce the behavior:

const fs = require('fs')
const path = require('path')
const {
    createGateway,
    embed,
    embedMany,
    generateText,
    streamText,
} = require('ai')
const { createOpenAICompatible } = require('@ai-sdk/openai-compatible')
const llamaindex = await import('llamaindex')
const {
    Settings,
    Document,
    VectorStoreIndex,
    storageContextFromDefaults,
    loadIndicesFromStorage,
} = llamaindex
const Vercel = await import('@llamaindex/vercel')
const { llamaindex: llamaindexToolFactory } = Vercel

const embeddingsModule = await import('@llamaindex/huggingface')
const { HuggingFaceEmbedding } = embeddingsModule // if this path is correct according to the 

Settings.embedModel = new HuggingFaceEmbedding({
    modelType: 'intfloat/e5-large-v2',
    hfToken: '....',
})

// `options` is assumed to hold the endpoint config (baseURL, apikey)
const openai = createOpenAICompatible({
    baseURL: options.llm.baseURL,
    apiKey: options.llm.apikey,
    compatibility: 'strict',
})
const modelProvider = openai('gpt-oss-120b')

let storageContext

const persistDir = path.resolve('cache')
if (!fs.existsSync(persistDir)) {
    fs.mkdirSync(persistDir, { recursive: true })
}

storageContext = await storageContextFromDefaults({
    persistDir,
})

let index
const files = fs.readdirSync(persistDir) // persistDir was created above
if (files.length) {
    index = await VectorStoreIndex.init({ storageContext })
} else {
    const doc = new Document({ text: "Bla Bla Bla....", id_: 'transcription-1' })
    index = await VectorStoreIndex.fromDocuments([doc], {
        storageContext,
    })
}

const queryTool = llamaindexToolFactory({
    model: modelProvider,
    index,
})

let fullText = ''
const result = streamText({
    model: modelProvider,
    prompt: 'Suggest a title for this transcription',
    tools: { queryTool },
    onFinish({ response }) {
        console.log('Response:', response.messages) // log the response
    },
})
for await (const text of result.textStream) {
    fullText += text
    process.stdout.write(text)
}
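For reference, the consumption loop at the end of the repro treats result.textStream as an async iterable and accumulates the chunks. A self-contained stand-in (fakeTextStream is a stub, not part of the AI SDK) shows the accumulation pattern in isolation:

```javascript
// Stand-in for result.textStream: streamText exposes an async iterable
// of text chunks; fakeTextStream mimics that shape with fixed chunks.
async function* fakeTextStream() {
    yield 'Suggested '
    yield 'Title'
}

async function main() {
    let fullText = ''
    for await (const chunk of fakeTextStream()) {
        fullText += chunk // accumulate, exactly as the repro's loop does
    }
    console.log(fullText) // Suggested Title
}

main()
```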

Expected behavior
The query tool should answer from the indexed documents; instead, indexing is not taken into account.


Desktop (please complete the following information):

  • OS: macOS
  • Node.js 22 / Vercel AI SDK 5


Labels: bug (Something isn't working)