AI Engineering Tips

    Hot-Swap AI Models With Vercel's AI SDK

    Matt Pocock

    I wanted to demonstrate for you just how flexible the Vercel AI SDK is when it comes to model selection.


    Here we have an ask function that takes in a prompt and a model of type LanguageModel.

    We can call this function with any model the AI SDK supports.

    For instance, we can call it with an Anthropic model, or with an OpenAI one.

    import { generateText, type LanguageModel } from "ai";

    export const ask = async (
      prompt: string,
      model: LanguageModel,
    ) => {
      const { text } = await generateText({
        model,
        prompt,
      });

      return text;
    };
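    Calling it with different providers might look like this. This is a sketch: it assumes the `@ai-sdk/anthropic` and `@ai-sdk/openai` provider packages are installed, that the relevant API keys are set, and that `ask` is exported from a `./ask` module (a hypothetical path).

    ```typescript
    import { anthropic } from "@ai-sdk/anthropic";
    import { openai } from "@ai-sdk/openai";

    // Hypothetical path to the ask function defined above
    import { ask } from "./ask";

    // The same ask function, two different providers:
    const claudeAnswer = await ask(
      "Summarize TypeScript in one sentence.",
      anthropic("claude-3-5-sonnet-latest"),
    );

    const gptAnswer = await ask(
      "Summarize TypeScript in one sentence.",
      openai("gpt-4o"),
    );
    ```

    Swapping providers is a one-line change at the call site; the ask function itself never needs to know which vendor is behind the model.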

    This gives you a ton of flexibility with how you want to build your application.

    The ask function is totally decoupled from the model that it uses.

    The LanguageModel type exposed by the AI SDK lets you do dependency injection. In other words, you can inject any model into your system.
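    The same pattern can be sketched in miniature without the SDK at all. Here `TextModel` and `stubModel` are stand-ins invented for illustration (they are not part of the AI SDK), but they show why injecting the model makes testing easy: you can pass in a stub that never touches the network.

    ```typescript
    // A minimal stand-in for the SDK's LanguageModel, for illustration only
    interface TextModel {
      generate(prompt: string): Promise<string>;
    }

    // The caller depends only on the interface...
    const ask = async (prompt: string, model: TextModel): Promise<string> => {
      return model.generate(prompt);
    };

    // ...so a test can inject a stub model with no API key required:
    const stubModel: TextModel = {
      generate: async (prompt) => `stubbed reply to: ${prompt}`,
    };

    const main = async () => {
      const answer = await ask("What is TypeScript?", stubModel);
      console.log(answer); // "stubbed reply to: What is TypeScript?"
    };

    main();
    ```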

    To me, this is one of the core selling points of the AI SDK.

    In our next example, we're going to look at chat history and how you can preserve the history of a chat over time.
