AI has gotten really good, but you are often still the blocker for software development. Specifically, you copy-paste things: code, SQL queries, deployment errors, and then the AI's suggested fixes and actions. Run this, deploy that, get me the logs. The iteration loop is slow because you are the bottleneck, and there is no real added value here; you're mostly doing monkey copy-paste work. The AI is smart enough to run these things itself: do a deployment, check the logs, create the database, iterate until it works. That's what MCP enables.

In this video, we'll understand what MCP is and go through a quick hands-on setup. You'll see that the magic happens when you chain multiple MCPs with the knowledge of an LLM. So,
MCP stands for Model Context Protocol. It's a standard that lets AI tools connect to external services. Say you want to search and summarize all Notion pages related to a topic. Typically you would search yourself, then copy-paste the results and say, "Give me a one-pager summary." With the Notion MCP, the AI can read your documents directly and even create new ones. No downloading, no copy-paste. You just say, "Summarize all the work that has been done on a topic," for example DAX, and it does it.

Same with databases. The AI writes you a query, you run it, it fails, you copy the error back, get a fix. It's slow. With an MCP, the AI can run the query itself, see it fail, fix it, and keep iterating until it works. And it's not just about avoiding copy-paste: the AI can now act on tools directly, see what's happening, and keep trying until it succeeds. That feedback loop is the superpower. MCP was
created by Anthropic, the company behind Claude, in November 2024. The idea was simple: every AI tool was building its own Notion integration, its own Slack integration, its own everything. ChatGPT has one, Claude has one. So MCP said, let's make a standard. The community, or mostly the owner (Notion, in the Notion MCP example), builds one MCP server, and every client, Claude, ChatGPT, Cursor, whatever, can directly connect and use it as a superpower tool. Something important to mention: in December 2025, Anthropic donated MCP to the Linux Foundation, specifically the new Agentic AI Foundation, the same foundation that stewards Kubernetes and PyTorch. OpenAI, Google, Microsoft, and AWS all joined as founding members. So this is slowly becoming the industry standard for LLMs to connect to tools. But
what exactly does the protocol define? You'll want to stay with me here, because as a user you'll be granting access to various things, so better to understand which powers you give to your AI before it takes over. When you install an MCP server, you're giving your AI access to specific capabilities: instead of just writing back text, it can perform actions. These come in three flavors, and you'll see them when you authorize the connection.

First, tools. These are actions the AI can take. The Notion MCP server, for example, has tools like notion-fetch and notion-get-comments. Those are read-only tools, but of course there is also another category: write and delete. When the AI needs to do something, it calls a tool, and you can specify the default permission each one has: always allow, needs approval, or block.

Second, resources. This is data the AI can read. A file system MCP exposes your files as resources. A database MCP might expose your schema as a resource. A CRM like HubSpot might expose your contact list. Resources are read-only context: the AI can see them but can't modify them. That's what tools are for.

Third, and finally, prompts. Think of these as slash commands or shortcuts, boosts to your request that provide better context. A database MCP might offer an "analyze table" prompt that automatically structures how the AI should analyze your data: for instance, first getting the schema, and so on. Honestly, most MCP servers focus on tools; prompts are nice to have, but not essential. But now you know what you're looking at when you authorize a connection. Note also
that MCP just released a new specification called MCP Apps, but that's something else entirely. Tell us in the comments if you want a dedicated video on that.

Now here is where people get confused: there are two types of MCP servers, remote and local. Remote servers run in the cloud. MotherDuck runs theirs, Notion runs theirs, Linear, Slack, they all run their own MCP servers. You connect via HTTPS, authenticate with OAuth ("Sign in with Google"-style), and you're done. No installation, no config files. In Claude these are called connectors, and in ChatGPT they are now called apps. Yeah, don't ask me why they couldn't just call it MCP. Maybe for folks that have no idea what MCP is, but you're not one of those. Not anymore.

Anyway, local servers run on your machine. When you configure one, you're basically starting a small web server locally that translates the AI's requests into actions on your system. The file system MCP, for example, runs on your computer and lets the AI read and write files in your folders specifically. So if you're using a local model through Ollama, the data never leaves your laptop.

One important security tip: don't install MCP servers that aren't open source or backed by the main company behind the product. A local MCP runs code on your machine with your permissions. If it's a random GitHub repo with no stars and no company behind it, skip it. Stick to the official servers or well-known open-source projects that you can actually inspect.
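That run-it-yourself feedback loop from earlier is easy to picture in code. Here's a minimal sketch in plain Python with sqlite3, no real MCP or LLM involved; the `candidate_queries` list stands in for an LLM proposing a fix after reading each error:

```python
import sqlite3

# A toy database the "AI" wants to query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, title TEXT)")
conn.execute("INSERT INTO posts VALUES (1, 'DuckDB is fast')")

# Queries a model might try in sequence: the first one is wrong
# (bad column name); the second is the "fix" after seeing the error.
candidate_queries = [
    "SELECT headline FROM posts",   # fails: no such column
    "SELECT title FROM posts",      # corrected attempt
]

def run_with_feedback(queries):
    """Run each candidate; on failure, surface the error so the next
    candidate (the model's fix) can be tried -- no human copy-paste."""
    for sql in queries:
        try:
            rows = conn.execute(sql).fetchall()
            return sql, rows  # success: the loop ends
        except sqlite3.OperationalError as err:
            print(f"query failed: {err} -- feeding the error back")
    raise RuntimeError("no candidate query worked")

winning_sql, rows = run_with_feedback(candidate_queries)
print(winning_sql, rows)
```

The point isn't the SQL, it's that the error never passes through your clipboard: the loop sees the failure and retries on its own.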
Stay safe. All right, so how do you set them up? No more copy-paste monkey work. There are two ways to add an MCP server; let me show you both. One way is to go and click the like button and subscribe, and ta-da, your MCP is installed. I'm just kidding.

For remote servers, the easiest way is through the approved directory. These are servers that have been reviewed and trusted by the AI platform, be it Claude or ChatGPT. In Claude, you go to Settings, Connectors, and browse the directory. You can find MotherDuck: click connect, authenticate, done. You're granting the AI permission to use that service on your behalf. Same for Notion: browse, connect, authorize. Now I have both MotherDuck and Notion
connected. The second way to install an MCP server is through a JSON configuration. This works both for remote servers that aren't approved in the directory and for local servers. For remote servers, you'll see that in the configuration you connect to an endpoint hosted by the owner. For local servers, you're running the server directly, so it's classically an npm command or a uv command that bootstraps the local server.
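For illustration, a local-server entry in a client config (Claude Desktop's `claude_desktop_config.json`, for example) typically looks something like this; the path is a placeholder, and you should check each server's README for its exact command:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```

The `command` and `args` are literally what the client runs to start the server: one entry via npm's npx, one via uv's uvx.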
So that means you either need Node.js with npm for JavaScript servers, or Python with uv for Python servers; most MCP servers are built in one of these two. You can check mcp.so: over 17,000 servers are listed there. All right,
let's see what it actually enables. I work in DevRel at MotherDuck, and part of my job is tracking what people say about us and DuckDB in various places like Hacker News, you know, helping people when there is confusion. So normally you search, open threads, read the comments, take notes, copy... but you can actually do all of that in just one prompt. I've connected two MCP servers: MotherDuck, which has a public dataset with the entire Hacker News history, more than 50 million posts, and Notion for creating documents.
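Before the demo, here's roughly the shape of what the LLM does across the two servers, sketched in plain Python: sqlite3 stands in for the MotherDuck SQL tool, a dict stands in for the Notion page, and the toy `classify` function marks where the real model's judgment would go.

```python
import sqlite3

# Stand-in for the Hacker News dataset behind the SQL MCP server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id INTEGER, text TEXT)")
conn.executemany("INSERT INTO comments VALUES (?, ?)", [
    (1, "How does DuckDB handle S3 caching?"),
    (2, "DuckDB can't scale beyond one machine, right?"),
    (3, "Loving MotherDuck so far!"),
])

# Step 1: the "SQL tool" -- find comments mentioning DuckDB.
rows = conn.execute(
    "SELECT id, text FROM comments WHERE text LIKE '%DuckDB%'"
).fetchall()

# Step 2: the "LLM" reads each comment and decides what it is.
# (A real model classifies by understanding; this keyword check is a toy.)
def classify(text):
    if text.endswith("?") and "right?" not in text:
        return "unanswered question"
    if "can't" in text:
        return "misconception to correct"
    return "other"

# Step 3: the "Notion tool" -- assemble a report page to act on.
report = {"title": "DuckDB mentions report", "sections": {}}
for cid, text in rows:
    report["sections"].setdefault(classify(text), []).append(text)

print(report)
```

One query, one pass of judgment, one document: that's the chain, minus the two real servers doing the heavy lifting.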
So watch what happens. This is my initial prompt, not that complicated: asking it to create a report of comments that I can act on. First, it calls MotherDuck, running SQL across the 50-million-row dataset. It finds the posts mentioning DuckDB and MotherDuck, gets the top discussions, pulls the comments. Now here is where the LLM does its thing. It's not just moving the data; it's reading these comments and understanding them. "Oh, this question is about S3 caching and it's not answered. This one is a misconception about scale, we should correct it. This is a feature request about DAX." And then it calls Notion to create the report. One prompt, two MCP servers. Now you see the power of combining two MCPs with an LLM, and I have a document ready to share with my team.

That's it. Now go play around. Install one MCP, the MotherDuck one, Notion, or whatever fits you. Look at what you're doing as copy-paste and see whether there's actually an existing MCP for it. And yeah, now get the hell out of here and go build something. Cheers.