Building an Analytics Chatbot for your SaaS app in 1 day

2026/03/11

Featuring: Building an analytics chatbot for your SaaS product

TL;DR: This webinar walks through building a conversational AI layer for a SaaS product using the MotherDuck MCP server. You'll see how to give an LLM scoped, read-only access to customer data and return answers in plain English.

Why add a chatbot to your product?

Most SaaS products sit on data that users want to explore but can't easily query. A chat interface lets them ask questions in natural language and get answers from their own data — no SQL, no dashboards.

The MotherDuck MCP server is what makes this work. It connects an LLM to your data warehouse, translates natural language into SQL, and returns results. MotherDuck's hypertenancy model keeps each customer's data isolated with dedicated compute.

What the webinar covers

The session walks through the full stack: connecting the MCP server so an LLM gets scoped, read-only access to production databases, building a streaming chat backend that handles multi-step tool use, and extending the AI with custom tools for inline data visualizations.

You'll see MCP workflows in action — a user types a plain-English question, it becomes a SQL query against their specific database, and the answer streams back through a chat interface.

What you walk away with

The pattern is reusable. Once you know how to scope MCP access per customer and handle streaming tool use, the same architecture works for any SaaS product backed by a MotherDuck data warehouse.

1:03Hey there everyone. Good morning. Good afternoon, good evening, wherever you're coming in from. Welcome to our webinar today on building an AI chatbot with the MotherDuck MCP. It'll be more of a workshop — we'll go through how to build it yourselves.

1:40My name is Gerald. I'm on the marketing team here at MotherDuck. I'm joined by Ryan Boyd, one of the co-founders of MotherDuck. Jacob from our DevRel team is also here to help answer questions.

2:43Hey everyone, super excited to do this. We have had so much fun internally with the MCP server building AI capabilities into applications. For many years I've tried to build AI chatbots that were reasonable, and it never really worked out. Now we've gotten to the point where we can build one in a day.

3:41I'm going to show you the application I built for the workshop — a dashboard for e-commerce stores so that a store owner can see analytics about how their store is performing. You can consider it like the Shopify dashboard for analytics.

6:27I said I want to build a chatbot because I want to ask questions of my data. With the help of Claude I built this chatbot really quickly once our MCP server launched. You can build the basic framework with just a few prompts to your LLM.

8:40What is MotherDuck? A serverless data warehouse built on DuckDB designed for low-cost, low-friction analytics by humans, agents, and applications.

8:52A traditional data warehouse is often more like a monolith with shared resources. Those distributed architectures make it very hard to build applications on top, because applications expect response times of tens to hundreds of milliseconds, not tens of seconds.

9:56MotherDuck offers what we call hypertenancy. Each user has their own isolated compute that can scale up and scale down. You can give the CEO a larger instance than your day-to-day analyst — or in a good company, maybe you do the inverse.

10:45This isolated compute really helps with in-app analytics. It allows us to build a chatbot on top of the data while trusting that users can't affect the performance of other users or the data of other companies.

11:11DuckDB is also extremely fast because it's a columnar database. A lot of people come to us having tried Postgres for analytics and realize Postgres is a row store: amazing for transactional workloads, but not built for analytics.

12:07We now have a Postgres endpoint in our docs and UI so you can connect to MotherDuck using the Postgres protocol and interact with your MotherDuck data too.

13:04For customer-facing analytics, each customer has their own isolated database and users within those customers can access that data. If you have a Shopify-type site with hundreds of stores, each store gets its own isolated database and compute.

15:51The architecture has a single orchestrator database that stores the list of all e-commerce stores and two types of tokens per store: a read-write token for ingesting data, and a read-scaling token for queries and the analytics chatbot.
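
A minimal sketch of the per-store token split described above. The table shape and field names (`readWriteToken`, `readScalingToken`) are assumptions for illustration, not MotherDuck APIs; in the real app the row would come from the orchestrator database.

```typescript
// Sketch: the orchestrator stores two tokens per store, and the chatbot
// path must only ever see the read-scaling (read-only) one.
type StoreTokens = {
  storeId: string;
  readWriteToken: string;   // used only by the ingestion pipeline
  readScalingToken: string; // used by queries and the analytics chatbot
};

// Stand-in for a row fetched from the orchestrator database.
const stores: StoreTokens[] = [
  { storeId: "store-42", readWriteToken: "rw-secret", readScalingToken: "rs-secret" },
];

// Resolve the token the chatbot is allowed to use for a given store.
function tokenForChatbot(storeId: string): string {
  const store = stores.find((s) => s.storeId === storeId);
  if (!store) throw new Error(`unknown store: ${storeId}`);
  return store.readScalingToken;
}

console.log(tokenForChatbot("store-42")); // prints "rs-secret"
```

Keeping the lookup in one function makes it hard for a code path to accidentally hand the read-write token to the LLM.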

17:15The application is a Node.js app deployed to Vercel. It connects to Claude — Sonnet 4.6 in this case — and to the MotherDuck remote MCP server, which connects to the duckling for each individual store.

19:01You can also use the Anthropic SDK with OpenRouter to access other models. For another app I built, I used Gemini because it was fast and I could put enough in the system prompt to make it work well on the given data.

20:48The repo has two branches: main (without AI chat) and ai-chat (with the chatbot). In Claude Code I simply said: create a new branch called AI chat, create an AI chat window in my application, deploy it on a separate domain. That's the set of prompts to build the initial version.

25:36It's easy because Claude Code is awesome nowadays, and because the hypertenancy architecture isolates data per customer. With that isolation, we don't have to build a ton of security features into the chatbot to ensure users only see the right data.

26:06The backend route is 325 lines of code. The React chatbot component is 213 lines. That's about it beyond some minor import and CSS changes.

27:09Now I'm doing this live in Claude Code on the main branch. I'm creating a new branch called live-ai-chat and telling it to use the MCP server — not the Node MotherDuck client — because the MCP server provides additional context about DuckDB syntax, schema exploration, and best practices.

33:17The application converts the MCP server's tool definitions into the Anthropic tool format, mapping every tool advertised by the server into the app. Using a read-scaling token is important because some tools allow writing back to the database — and we don't want to allow that.
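
A sketch of that mapping step. The field names follow the two formats involved: MCP advertises a tool's argument schema as `inputSchema`, while the Anthropic Messages API expects `input_schema`.

```typescript
// Sketch: convert tools advertised by an MCP server into the shape
// the Anthropic Messages API accepts in its `tools` parameter.
type McpTool = {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the tool's arguments
};

type AnthropicTool = {
  name: string;
  description: string;
  input_schema: Record<string, unknown>;
};

function toAnthropicTools(mcpTools: McpTool[]): AnthropicTool[] {
  return mcpTools.map((t) => ({
    name: t.name,
    description: t.description ?? "",
    input_schema: t.inputSchema, // same JSON Schema, renamed field
  }));
}

// Example: a read-only query tool as the MCP server might advertise it.
const tools = toAnthropicTools([
  { name: "query", description: "Run a read-only SQL query", inputSchema: { type: "object" } },
]);
console.log(tools[0].name); // prints "query"
```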

37:05The app deployed to Vercel. Two environment variables: the MotherDuck token for the main store listing and the Anthropic SDK token. The app is live.

39:51What's the top category of products sold in January? It's connecting to Claude, figuring out the list of tables available, and writing SQL against those tables.

41:53Seth asks: show me a week over week analysis of the top three customers from January 1 onward. You can balance model speed against accuracy. Nicer models take more time but give better results. You'll want to keep adding to the system prompt over time.

43:27Ryan David asks: how do we increase orders next quarter? Claude can pull in real-world context — for an e-commerce business it noticed a tariff announcement probably caused a shift in orders last week. It's as smart as Claude is nowadays, which is pretty smart.

47:53For charts: the MotherDuck homepage itself uses the MCP server for live analyses with concurrent load, so there's good evidence concurrency works well. Specify charting libraries in your system prompt so the chatbot knows what to use.
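
One way to pin the charting library in the system prompt, as suggested above. The library choice (Recharts) and the exact wording are assumptions for illustration; the point is that the instruction lives in one string the backend always sends.

```typescript
// Sketch: a system prompt that scopes the assistant and names the
// charting library it should target for inline visualizations.
const systemPrompt = [
  "You are an analytics assistant for a single e-commerce store.",
  "Answer questions by querying the store's database with the provided tools.",
  "When a chart would help, emit a Recharts component spec rather than raw SVG.",
  "Only ever read data; never attempt writes.",
].join("\n");

console.log(systemPrompt.includes("Recharts")); // prints true
```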

50:33For the role of MCP: there are two architectures. One is LLM-generated SQL sent directly to MotherDuck via the DuckDB client. The other uses the MCP server — LLMs know about it natively, and it advertises schema exploration, DuckDB syntax guidance, and query best practices.

53:30For PII: the orchestrator database is never accessed by the LLM — only by SQL I've written. You can create service accounts with access to a subset of data that excludes PII. MotherDuck's team is happy to walk you through the right architecture for your scenario.

55:30Pricing: service accounts are sub-accounts within one org. On the Business plan, one $250/month fee covers unlimited service accounts, each with their own isolated data and ducklings. Ducklings are only active when you're working with that store's data.

57:35For background jobs: GitHub Actions works well for running recurring jobs with MotherDuck. For more complex orchestration, Airflow and other tools in the modern data stack integrate well.

59:57Thank you Ryan and thank you everyone for the great questions. The recording will be sent out and posted on our website. Looking forward to seeing what you build!

FAQs

How does the MotherDuck MCP server connect an LLM to my data?

The MotherDuck MCP server connects an LLM to your data warehouse. When you ask a question in plain English, the server translates it into SQL, runs the query against MotherDuck, and hands back the results. You configure it with scoped, read-only access so the LLM only sees the data you allow. See the MCP server documentation for setup details.

How does MotherDuck keep customer data isolated in a multi-tenant chatbot?

MotherDuck gives each customer their own isolated database and dedicated compute resources. When you build a chatbot on the MCP server, you scope each connection to a specific customer's database. The LLM can only query that customer's data, so there's no cross-tenant leakage.

Can the AI chatbot generate data visualizations alongside text answers?

Yes, the MCP server handles multi-step tool use, so the LLM can chain several queries and tool calls in one conversation turn. In the webinar, we show how to add custom tools that generate inline data visualizations next to the text answers. The streaming backend handles each step as it finishes, so users see results as they come in.
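
The loop behind that multi-step behavior can be sketched as follows. `fakeModel` and `runTool` are stand-ins for the Anthropic API and the MCP tools; only the loop shape (call the model, run requested tools, feed results back, repeat until the model stops asking) is the point.

```typescript
// Sketch of a multi-step tool-use loop for a chat backend.
type ToolCall = { id: string; name: string; input: unknown };
type ModelTurn =
  | { stopReason: "tool_use"; toolCalls: ToolCall[] }
  | { stopReason: "end_turn"; text: string };

// Stand-in model: asks for one query on the first turn, then answers.
let step = 0;
function fakeModel(_messages: unknown[]): ModelTurn {
  return step++ === 0
    ? { stopReason: "tool_use", toolCalls: [{ id: "1", name: "query", input: { sql: "SELECT 1" } }] }
    : { stopReason: "end_turn", text: "Top category: electronics" };
}

// Stand-in tool runner: the MCP server would execute the SQL here.
function runTool(call: ToolCall): string {
  return `result of ${call.name}`;
}

function chat(question: string): string {
  const messages: unknown[] = [{ role: "user", content: question }];
  for (;;) {
    const turn = fakeModel(messages);
    if (turn.stopReason === "end_turn") return turn.text; // model is done
    for (const call of turn.toolCalls) {
      // Append each tool result so the next model call can use it.
      messages.push({ role: "tool_result", toolId: call.id, content: runTool(call) });
    }
  }
}

console.log(chat("What's the top category?")); // prints "Top category: electronics"
```

In a streaming backend, each iteration of this loop is where partial text and tool progress get flushed to the client.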

Can I reuse this chatbot architecture for my own SaaS product?

This architecture is reusable. Connect the MotherDuck MCP server with scoped access to your customer's database, build a streaming chat backend that handles tool-use responses, and add whatever custom tools your product needs. The same pattern works whether you're building a support chatbot, an analytics assistant, or a plain-language query layer on top of a dashboard. Check out the getting started guide to set up your MotherDuck account.

Related Videos


38:23

2026-03-17

Agents That Build Tables, Not Just Query Them

See how MotherDuck's new query_rw MCP tool lets AI agents write back to your data warehouse, creating tables, storing embeddings, and saving views.

Stream

AI, ML and LLMs

MotherDuck Features

SQL


1:00:01

2026-03-12

Text-to-SQL, Data Modeling for LLMs, MCP, and Dives with Jacob Matson

Jacob Matson covers text-to-SQL accuracy, why data models matter more than LLM choice, the MotherDuck MCP workflow, and Dives on Super Data Brothers.

Interview

AI, ML and LLMs

SQL

MotherDuck Features


1:00:10

2026-02-25

Shareable visualizations built by your favorite agent

You know the pattern: someone asks a question, you write a query, share the results — and a week later, the same question comes back. Watch this webinar to see how MotherDuck is rethinking how questions become answers, with AI agents that build and share interactive data visualizations straight from live queries.

Webinar

AI, ML and LLMs

MotherDuck Features