

Model Context Protocols for Non Techies


Entrepreneur, Founder, Mentor. Also runs a Registered NGO. Visit my website: sandeepgokhale.com

The AI Evolution

Ever since the dawn of the Generative Pre-trained Transformer (GPT), the world has never been the same. A lot of “everything” these days is generated by these so-called AI Assistants, be it content (multimedia), code (in many languages), designs, tests, infrastructure and more.

For the first time, these AI Assistants could generate human-like outputs at scale. AI went from being a futuristic concept to a quiet co-creator behind much of the digital world we interact with today. I would be shocked if there is someone who does not use an AI Assistant today.

But there was a problem. They could generate almost anything, yet they were trapped within the boundaries of their training data. They couldn’t fetch live information, connect to a system, or take meaningful action in the real world. This created a gap: the knowledge of the world was at their fingertips, but the ability to use that knowledge was missing.


AI Assistants

Imagine having a teammate (say, an intern) who has been trained on all the knowledge publicly available over the internet. Technical blogs, documentation, policies, code, regulations, medical knowledge, plumbing, home decor, car maintenance - name it and this intern knows it. And this intern can answer any question you ask without forgetting anything, and works non-stop. Yes - this intern is what we call an AI Assistant.

With the advent of AI Assistants like ChatGPT, Gemini, and Claude (among others), the way we work has completely changed. Give these Assistants some instructions (AKA prompts), and you will be surprised how well they can do some tasks.


LLMs

Remember the “intern” we just spoke about? A Large Language Model (LLM) is like the “brain” of this intern: something that knows a lot of things about a lot of things.

The evolution of LLMs is nothing short of brilliant. Right from models that predicted just the next word(s) (n-gram and Markov models) to neural networks, LSTMs, and finally attention and Transformers, the journey is nothing short of a Marvel movie. That is probably a blog for another time.

When LLMs were introduced, they excelled at summarizing, explaining, and answering questions by spotting patterns in language at scale. At the same time, they could NOT count how many “r”s were in the word “strawberry”. They also initially failed at basic mathematics and logical reasoning (claiming, for example, that 0.09 is bigger than 0.9).

As of today, most of these issues are “fixed”, and with every passing day these interns (LLMs) are getting better. The companies behind these LLMs claim that they have reached PhD-level intelligence. We will see in the near future if this is true. For now, we are not there yet.


Interacting with LLMs

On the 30th of November 2022, OpenAI introduced ChatGPT to the world. ChatGPT was the first widely accessible AI assistant powered by an LLM, marking a new era in how we all interact with AI.

The interaction was just via a text input (voice was added soon after). Type a prompt and these AI Assistants would give you an answer.

Prompts: The inputs we give to these AI assistants are called prompts.

Very soon the world realized that for an AI Assistant to work well, the input we provide is crucial. If we do not give good inputs/prompts, these AI Assistants will not give good answers and, in the worst case, will confidently give a wrong answer.

So, writing good prompts was the need of the hour, and thus Prompt Engineering was born.

Prompt Engineering is the practice of crafting inputs to an LLM to extract the best from it.
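To make this concrete, here is a tiny sketch (in Python, with a made-up task) of the difference between a vague prompt and an engineered one. The exact wording is only an illustration, not a recipe:

```python
# A vague prompt leaves the model guessing about audience, scope, and format.
vague_prompt = "Tell me about MCP."

# An engineered prompt states the role, the task, the constraints,
# and the expected output format.
engineered_prompt = (
    "You are a technical writer explaining concepts to non-technical readers.\n"
    "Task: Explain the Model Context Protocol (MCP) in 3 short bullet points.\n"
    "Constraints: No jargon; use one everyday analogy.\n"
    "Output format: A bulleted list."
)

print(engineered_prompt)
```

The second prompt tends to get a far more useful answer, because the model no longer has to guess what "good" looks like.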


We needed more...

It did not take much time for thinkers to realize that these LLMs, trained on so much, were limited by the inputs provided by the user. These LLMs had all the knowledge of the world but could not access anything “live”. For example: What is the current weather in Namma Bengaluru?

Whatever they were trained on, there was always something new, and users wanted the latest.


The need for APIs

Well, any time we want a system (an AI Assistant in this case) to interact with something external to it, we reach for something called an API. APIs are the bridges that let one system talk to another.
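As a sketch of that bridge idea, here is what an API interaction boils down to: a request goes out, and a structured (usually JSON) reply comes back. The "weather API" below is entirely made up - it stands in for a real HTTP call, just to show the shape of the exchange:

```python
import json

def fake_weather_api(city: str) -> str:
    """Stands in for an HTTP call like GET /weather?city=Bengaluru.
    Real weather services differ; this mock only illustrates the idea."""
    responses = {"Bengaluru": {"city": "Bengaluru", "temp_c": 24, "condition": "Cloudy"}}
    return json.dumps(responses.get(city, {"error": "unknown city"}))

# The AI assistant's side of the bridge: call the API, parse the JSON answer.
reply = json.loads(fake_weather_api("Bengaluru"))
print(f"It is {reply['temp_c']} degrees C and {reply['condition']} in {reply['city']}.")
```

Swap the mock function for a real HTTP request and you have the basic pattern every tool-using assistant relies on.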

As LLMs became popular, everyone realized the need to connect them to real-world tools and data. So began the race to develop these APIs, and with it a problem: everyone started writing these APIs in their own way.

Simply put: there was no “standard” way to write these APIs. Anthropic (the company behind Claude) realized this and came up with MCP.


Enter MCP

Remember how every phone used to have its own “type” of charging cable? The same was happening in the AI assistant world when it tried to connect to the external world.

The phone charging cable problem was resolved via USB-C. For the AI assistant, that would be MCP. Basically, instead of every company “inventing” its own way for LLMs to talk to tools, MCP, or the Model Context Protocol, provides a common framework.

With MCP, an AI assistant can connect to APIs, databases, or services through a common language and structure. That means developers don’t waste time “reinventing the wheel”.

Most importantly, MCP introduces guardrails:

  • The LLM can only access what is explicitly exposed.

  • Every interaction follows a clear schema.

  • Risky or destructive operations can be controlled by design.

In short, MCP turns the chaos of many one-off APIs into an organized ecosystem, giving LLMs a safe way to step out of their training data and interact with the real world.
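Under the hood, that "common language" is JSON-RPC 2.0. As a sketch, a tool call from an assistant to an MCP server looks roughly like this - the tool name and its arguments below are hypothetical, but the envelope (jsonrpc/id/method, and params carrying a tool name plus arguments) follows the MCP specification's tools/call shape:

```python
import json

# A hypothetical MCP tool call: ask a server to run a "get_weather" tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",              # only explicitly exposed tools can be named
        "arguments": {"city": "Bengaluru"}, # must match the tool's declared schema
    },
}

wire_message = json.dumps(request)   # what actually travels over the transport
decoded = json.loads(wire_message)
print(decoded["method"])  # tools/call
```

Because every interaction is a message with this schema, the server can validate, log, and refuse requests - which is exactly where the guardrails listed above come from.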


Building your own MCP

Now this bit is mainly for developers. I would encourage you to read on, even if you are a non-developer.

MCPs, like any other APIs, are very easy to build provided you have a good fundamental understanding of building APIs and the related protocols.

If you have written any kind of production-grade REST API, all of this should come easily to you:

  1. JSON

  2. Protocols (HTTP, TCP) and paradigms (SSE, RPC)

  3. Creating REST APIs

  4. Networking concepts (pipes, streams, ...)

  5. Authentication & Authorization

  6. Security

If the above sounds simple to you, you can get started with MCP today, and the best way is to start by doing.
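To tie the list above together, here is a minimal, standard-library-only sketch of the heart of an MCP-style server: a dispatcher that takes a JSON-RPC request, routes it to a registered tool, and returns a JSON-RPC response. A real server would use an official MCP SDK and a proper transport (stdio pipes or SSE); the `get_weather` tool here is hypothetical:

```python
import json

# Hypothetical tool: in a real MCP server this would call a live service.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

# The server exposes only what is explicitly registered - MCP's guardrail idea.
TOOLS = {"get_weather": get_weather}

def handle(raw: str) -> str:
    """Route one JSON-RPC 2.0 tool-call request to a registered tool."""
    req = json.loads(raw)
    tool = TOOLS.get(req["params"]["name"])
    if tool is None:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "Unknown tool"}})
    result = tool(**req["params"]["arguments"])
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# One request arriving over the transport (a pipe or stream in practice):
raw_request = json.dumps({"jsonrpc": "2.0", "id": 7, "method": "tools/call",
                          "params": {"name": "get_weather",
                                     "arguments": {"city": "Bengaluru"}}})
print(handle(raw_request))
```

Notice how the six prerequisites show up even in this toy: JSON for the messages, RPC as the paradigm, a stream/pipe as the transport, and a registry that is the natural place to hang authentication and security checks.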


Let's Connect

Hi, I’m Sandeep Gokhale, and I’m passionate about building high-performing teams at my company, Techvito. I write about technology, people, processes, and more.

If you’re exploring MCP or looking for a trusted technology partner to:

  • Build secure, production-ready MCP servers and APIs,

  • Guide your business through zero-downtime cloud migrations,

  • Accelerate your goals with clarity, speed, quality, and security,

  • Work with a team that values reliability, transparency, and trust,

…then my team and I are here, ready to help you make it happen.

Feel free to connect with me on LinkedIn and Twitter.

Until Next time!