Use Local Models With Vercel's AI SDK
You can use Vercel's AI SDK to connect to locally running models. In fact, not just locally running models but models running at any URL.
The AI SDK has a function called `createOpenAICompatible`, which lets you communicate with any model that has an OpenAI-compatible API.
In our case, I'm using an app called LM Studio, which exposes this API on localhost:1234.
So I can install `@ai-sdk/openai-compatible` and then create an LM Studio provider.
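Here's roughly what that looks like. I'm assuming LM Studio's default OpenAI-compatible endpoint at http://localhost:1234/v1; adjust the `baseURL` if your local server is configured differently.

```ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

// Point the provider at LM Studio's local OpenAI-compatible server.
// Port 1234 and the /v1 path are LM Studio's defaults.
const lmstudio = createOpenAICompatible({
  name: "lmstudio",
  baseURL: "http://localhost:1234/v1",
});
```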
And I can use this provider to grab a model.
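Something like this, reusing the `lmstudio` provider from the previous snippet:

```ts
// Calling the provider with a model ID gives us a model
// we can pass to the rest of the AI SDK.
const model = lmstudio("");
```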
I'm using an empty string here because, if you pass an empty string, it will default to whichever model you have loaded in LM Studio.
I can then use this model by passing it into `generateText`.
I've specified `maxRetries` as zero here. By default, the SDK retries failed requests twice (so up to three attempts in total) to make things more robust and handle any network issues.
But since the model is running locally, we want the call to fail instantly if it can't reach it.
Let's give this a go. We're going to ask the LLM for a story about its grandmother.
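The call ends up looking something like this (the exact prompt wording doesn't matter much; `model` is the LM Studio model we grabbed earlier):

```ts
import { generateText } from "ai";

// `model` is the LM Studio model created above.
const { text } = await generateText({
  model,
  prompt: "Tell me a story about your grandmother.",
  // Fail immediately if LM Studio can't be reached, rather than retrying.
  maxRetries: 0,
});

console.log(text);
```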
And if we run it, we get a story about the LLM's grandmother.
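Putting it all together, here's the whole thing in one file. I've also wrapped the call in a `try/catch`, which is optional, so that if LM Studio isn't running you get a clear message on the first attempt instead of an unhandled error:

```ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { generateText } from "ai";

const lmstudio = createOpenAICompatible({
  name: "lmstudio",
  baseURL: "http://localhost:1234/v1",
});

try {
  const { text } = await generateText({
    // Empty string: use whichever model is loaded in LM Studio.
    model: lmstudio(""),
    prompt: "Tell me a story about your grandmother.",
    // No retries: surface connection errors immediately.
    maxRetries: 0,
  });
  console.log(text);
} catch (error) {
  console.error("Could not reach LM Studio on localhost:1234:", error);
}
```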
So this is a nice, simple setup showing how you can connect the Vercel AI SDK to a local model.