
TanStack AI

The TanStack team just dropped the alpha release of TanStack AI — a framework-agnostic AI toolkit built for developers who want real control over their stack.

Today’s AI ecosystem pushes you into someone else’s platform, tools, and workflow. TanStack AI flips that. It’s open source, adapter-driven, and works with your existing stack instead of boxing you into a new one.

1. What’s Inside TanStack AI

  1. Multi-Language Server Support: Out of the gate, JavaScript/TypeScript, PHP, and Python, each supporting full agentic flows and tool calling.

  2. Adapters for Real-World Providers: TypeScript adapters for

  • OpenAI

  • Anthropic

  • Gemini

  • Ollama

  plus built-in summarization and embedding support.

  3. Open Protocol: The server-client protocol is fully documented. Use any language. Use any transport. If your backend speaks the protocol, the client works.

2. Why TanStack AI Exists

Developers deserve AI tools without:

  • vendor lock-in

  • proprietary platforms

  • ecosystem traps

Just open source, framework-agnostic, type-safe, developer-first tooling — from the same team that brought you TanStack Query, Table, Router, and more.

3. Framework Agnostic

TanStack AI supports the following frameworks:

  • Next.js - API routes and App Router

  • TanStack Start - React Start or Solid Start (recommended!)

  • Express - Node.js server

  • React Router v7 - Loaders and actions

TanStack AI lets you define a tool once and provide environment-specific implementations: use toolDefinition() to declare the tool’s input/output types, then attach the server behavior with .server() (or a client implementation with .client()). These isomorphic tools can be invoked from the AI runtime regardless of framework.

import { toolDefinition } from '@tanstack/ai'
import { z } from 'zod'

// Define a tool
const getProductsDef = toolDefinition({
  name: 'getProducts',
  inputSchema: z.object({ query: z.string() }),
  outputSchema: z.array(z.object({ id: z.string(), name: z.string() })),
})

// Create server implementation
const getProducts = getProductsDef.server(async ({ query }) => {
  return await db.products.search(query)
})

// Use in AI chat
chat({ tools: [getProducts] })
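To make the define-once, implement-per-environment idea concrete, here is a hypothetical plain-TypeScript sketch of the pattern. It is an illustration only, not the real @tanstack/ai internals: the `defineTool` helper and its shapes are invented for this example.

```typescript
// Hypothetical sketch of the define-once, implement-anywhere pattern.
// This is a plain-TypeScript illustration, not the real @tanstack/ai internals.
interface ToolDefinition<I, O> {
  name: string
  // Pair the definition with a server-side implementation
  server(fn: (input: I) => Promise<O>): ServerTool<I, O>
}

interface ServerTool<I, O> {
  name: string
  run(input: I): Promise<O>
}

function defineTool<I, O>(name: string): ToolDefinition<I, O> {
  return {
    name,
    server: (fn) => ({ name, run: fn }),
  }
}

// The definition carries the types; the environment supplies the behavior.
const echoDef = defineTool<{ text: string }, string>('echo')
const echo = echoDef.server(async ({ text }) => `echo: ${text}`)

echo.run({ text: 'hello' }).then(console.log) // prints "echo: hello"
```

The key design point is that the definition owns the contract (name and typed input/output) while each environment attaches only the behavior, so the same contract can back a server handler or a client implementation.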

4. Installation & Quick Start

You can install TanStack AI in minutes:

npm install @tanstack/ai @tanstack/ai-react @tanstack/ai-openai

Server Setup

First, create an API route that handles chat requests. Here's a simplified example:

// app/api/chat/route.ts (Next.js)
// or src/routes/api/chat.ts (TanStack Start)
import { chat, toStreamResponse } from "@tanstack/ai";
import { openai } from "@tanstack/ai-openai";

export async function POST(request: Request) {
  // Check for API key
  if (!process.env.OPENAI_API_KEY) {
    return new Response(
      JSON.stringify({
        error: "OPENAI_API_KEY not configured",
      }),
      {
        status: 500,
        headers: { "Content-Type": "application/json" },
      }
    );
  }

  const { messages, conversationId } = await request.json();

  try {
    // Create a streaming chat response
    const stream = chat({
      adapter: openai(),
      messages,
      model: "gpt-4o",
      conversationId
    });

    // Convert stream to HTTP response
    return toStreamResponse(stream);
  } catch (error: any) {
    return new Response(
      JSON.stringify({
        error: error.message || "An error occurred",
      }),
      {
        status: 500,
        headers: { "Content-Type": "application/json" },
      }
    );
  }
}
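The toStreamResponse call above turns the chat stream into an HTTP response the browser can consume. Conceptually this is plain Server-Sent Events; the following is a minimal sketch using only Web-standard APIs (TextEncoder, ReadableStream, Response), and the `{ delta }` payload shape is an assumption for illustration, not the library’s actual wire format.

```typescript
// Minimal sketch: turning an async token stream into a Server-Sent Events
// response using Web-standard APIs. Illustration only, not the actual
// toStreamResponse implementation; the { delta } payload shape is assumed.
function sseResponse(tokens: AsyncIterable<string>): Response {
  const encoder = new TextEncoder()
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const token of tokens) {
        // Each SSE event is a "data:" line followed by a blank line
        controller.enqueue(
          encoder.encode(`data: ${JSON.stringify({ delta: token })}\n\n`),
        )
      }
      controller.enqueue(encoder.encode('data: [DONE]\n\n'))
      controller.close()
    },
  })
  return new Response(body, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  })
}
```

The client-side fetchServerSentEvents connection consumes a stream of exactly this kind.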

Client Setup

To use the chat API from your React frontend, create a Chat component:

// components/Chat.tsx
import { useState } from "react";
import { useChat, fetchServerSentEvents } from "@tanstack/ai-react";

export function Chat() {
  const [input, setInput] = useState("");

  const { messages, sendMessage, isLoading } = useChat({
    connection: fetchServerSentEvents("/api/chat"),
  });

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    if (input.trim() && !isLoading) {
      sendMessage(input);
      setInput("");
    }
  };

  return (
    <div className="flex flex-col h-screen">
      {/* Messages */}
      <div className="flex-1 overflow-y-auto p-4">
        {messages.map((message) => (
          <div
            key={message.id}
            className={`mb-4 ${
              message.role === "assistant" ? "text-blue-600" : "text-gray-800"
            }`}
          >
            <div className="font-semibold mb-1">
              {message.role === "assistant" ? "Assistant" : "You"}
            </div>
            <div>
              {message.parts.map((part, idx) => {
                if (part.type === "thinking") {
                  return (
                    <div
                      key={idx}
                      className="text-sm text-gray-500 italic mb-2"
                    >
                      💭 Thinking: {part.content}
                    </div>
                  );
                }
                if (part.type === "text") {
                  return <div key={idx}>{part.content}</div>;
                }
                return null;
              })}
            </div>
          </div>
        ))}
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="p-4 border-t">
        <div className="flex gap-2">
          <input
            type="text"
            value={input}
            onChange={(e) => setInput(e.target.value)}
            placeholder="Type a message..."
            className="flex-1 px-4 py-2 border rounded-lg"
            disabled={isLoading}
          />
          <button
            type="submit"
            disabled={!input.trim() || isLoading}
            className="px-6 py-2 bg-blue-600 text-white rounded-lg disabled:opacity-50"
          >
            Send
          </button>
        </div>
      </form>
    </div>
  );
}

Ensure you set OPENAI_API_KEY in your .env file.

You now have a working chat application. The useChat hook handles:

  • Message state management

  • Streaming responses

  • Loading states

  • Error handling
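Each message’s parts array is a discriminated union that the component above branches on via part.type. The same branching can be sketched in plain TypeScript; the part shapes here mirror only the 'thinking' and 'text' cases handled in the component, and the real library may define additional part types.

```typescript
// Sketch: flattening a message's parts into a plain-text transcript.
// The part shapes mirror the cases handled in the Chat component above;
// other part types may exist in the real library.
type MessagePart =
  | { type: 'text'; content: string }
  | { type: 'thinking'; content: string }

function renderParts(parts: MessagePart[]): string {
  return parts
    .map((part) =>
      part.type === 'thinking' ? `[thinking] ${part.content}` : part.content,
    )
    .join('\n')
}

console.log(
  renderParts([
    { type: 'thinking', content: 'searching products' },
    { type: 'text', content: 'Here are the results.' },
  ]),
)
```

Branching on a discriminated union like this is what lets the UI render intermediate reasoning ('thinking') differently from the final answer text.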

5. DevTools

TanStack Devtools is a unified devtools panel for inspecting and debugging TanStack libraries, including TanStack AI. It provides real-time insights into AI interactions, tool calls, and state changes.

  • Real-time Monitoring - View live chat messages, tool invocations, and AI responses.

  • Tool Call Inspection - Inspect input and output of tool calls.

  • State Visualization - Visualize chat state and message history.

  • Error Tracking - Monitor errors and exceptions in AI interactions.

Installation

npm install -D @tanstack/react-ai-devtools @tanstack/react-devtools

Usage

Import and include the Devtools component in your application:

import { TanStackDevtools } from '@tanstack/react-devtools'
import { aiDevtoolsPlugin } from '@tanstack/react-ai-devtools'

const App = () => {
  return (
    <>
      <TanStackDevtools
        plugins={[
          // ... other plugins
          aiDevtoolsPlugin(),
        ]}
        // this config is important to connect to the server event bus
        eventBusConfig={{
          connectToServerBus: true,
        }}
      />
    </>
  )
}

That’s a quick overview of TanStack AI. For more, check the official docs: https://tanstack.com/ai/latest
