OPERATION CORE PROTOCOL
⏱️ Operation Time Window:
~30 minutes: intel only, no fieldwork required
🎥 Watch the Walkthrough
Welcome, Recruit. This mission will equip you with foundational intel to understand how Copilot Studio works, and how to build intelligent agents that deliver real business value.
Before building your first agent, you need to understand the four key components that make up every custom AI agent: Knowledge, Tools, Topics, and Instructions. You'll also learn how these elements work together in the Copilot Studio orchestrator.
In this mission, you will learn what an agent is, when a custom agent beats general-purpose Copilot, and how the four core components fit together.
An agent is a specialized AI assistant you design to handle specific tasks or queries. Unlike a general-purpose chatbot, your agent is scoped to a defined job: it answers from your data, follows your instructions, and can take actions in your systems.
Because Copilot Studio is low-code, you can drag and drop prebuilt components; no deep coding skills are required. Once your agent is built, people can call on it inside Teams, Slack, or even a custom webpage to get answers or trigger workflows automatically.
While Microsoft 365 Copilot provides general AI assistance across Office apps, you'll want a custom agent when you need answers grounded in your own data sources, actions wired into your own systems, or behavior tailored to a specific business process.
Every Copilot Studio agent is built from four core components:
Below, we'll define each building block and show how they work together to make an effective agent.
Knowledge is the data and context your agent uses to answer questions accurately. It has two parts:
You write a brief description of the agent's purpose and tone. For example:
You are an IT support agent. You help employees troubleshoot common software issues, provide troubleshooting steps, and escalate urgent tickets.
During a conversation, the agent remembers previous turns so it can refer back to what was already discussed (for instance, if the user first says, "My printer is offline," then later asks, "Did you check the ink level?" the agent recalls the printer context).
!!! example
    A "Policy Assistant" agent might connect to your HR SharePoint site. If a user asks, "What is our PTO accrual rate?" the agent retrieves the exact text from the HR policy document rather than relying on a generic AI response.
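The grounding idea can be sketched in a few lines. This is a hypothetical illustration, not how Copilot Studio's retrieval actually works: real knowledge sources use semantic search over indexed documents, while this sketch matches keywords against two made-up policy snippets.

```python
# Hypothetical sketch of grounding: answer from retrieved policy text when a
# match exists, instead of generating a generic reply. The snippets and the
# naive keyword matching are illustrative assumptions only.
POLICY_SNIPPETS = {
    "pto accrual": "Employees accrue 1.25 PTO days per month (15 days/year).",
    "parental leave": "Parental leave is 12 weeks, paid.",
}

def answer(question: str) -> str:
    q = question.lower()
    for key, text in POLICY_SNIPPETS.items():
        # Match when every keyword of a snippet's key appears in the question.
        if all(word in q for word in key.split()):
            return text  # grounded: exact policy text
    return "I could not find that in the HR policy documents."

print(answer("What is our PTO accrual rate?"))
```

The point of the sketch is the fallback branch: a grounded agent should say it cannot find an answer rather than improvise one.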
Tools (Actions) define what the agent can do beyond chatting. Each action is a task the agent executes programmatically, such as sending an email, posting a Teams message, or adding an item to a SharePoint list.
Define Inputs & Outputs
- For example, a Send Email action might require:
    - `RecipientEmailAddress`
    - `SubjectLine`
    - `EmailBody`
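The input contract above can be sketched as a typed record. This is purely illustrative; Copilot Studio defines action inputs in its designer, and only the three field names come from the example above (the validation rule is an assumption).

```python
from dataclasses import dataclass

# Hypothetical model of the Send Email action's inputs as a typed record
# with minimal validation; field names are taken from the list above.
@dataclass
class SendEmailInputs:
    RecipientEmailAddress: str
    SubjectLine: str
    EmailBody: str

    def validate(self) -> bool:
        # A real action would validate addresses properly; this check is minimal.
        return "@" in self.RecipientEmailAddress and bool(self.SubjectLine)

inputs = SendEmailInputs("jane@contoso.com", "PTO request", "Approved for 3 days.")
print(inputs.validate())
```

Declaring inputs this way is the same idea as the designer's input schema: the agent knows which values it must collect from the user before it can run the action.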
Combine Actions into Workflows
- Often, fulfilling a user request involves multiple steps.
- You can sequence actions so that:
1. The agent retrieves data from a SharePoint list.
2. It generates a summary using the LLM.
3. It sends a Teams message with that summary.
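The three-step sequence above can be sketched with each action as a stand-in function. All names and data here are hypothetical; in Copilot Studio you chain prebuilt connector actions in the designer rather than writing code.

```python
# Hypothetical sketch of the workflow: retrieve -> summarize -> notify.
def retrieve_sharepoint_items():
    # Stand-in for a "get items from a SharePoint list" action.
    return ["Laptop request - open", "VPN issue - closed"]

def summarize(items):
    # Stand-in for the LLM summarization step.
    return f"{len(items)} items: " + "; ".join(items)

def send_teams_message(text):
    # Stand-in for a "post a Teams message" action.
    return {"channel": "helpdesk", "text": text}

# Sequencing: each step's output feeds the next step's input.
items = retrieve_sharepoint_items()
summary = summarize(items)
message = send_teams_message(summary)
print(message["text"])
```

The design point is data flow: sequencing actions only works if each action's outputs are declared so the next action can consume them.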
Connect to External Systems
- If you need to update a CRM or call an internal API, create a custom action to handle that.
- Copilot Studio can integrate with the Power Platform or any HTTP-based endpoint.
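As a sketch of the HTTP side, here is roughly what a custom action's call to an internal API might look like, using only the Python standard library. The URL, record shape, and PATCH semantics are assumptions for illustration, not a real Copilot Studio API.

```python
import json
import urllib.request

# Hypothetical sketch: the HTTP request a custom action might build to
# update a record in an internal CRM. Endpoint and payload are made up.
def build_crm_update(base_url: str, record_id: str, status: str) -> urllib.request.Request:
    payload = json.dumps({"id": record_id, "status": status}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/records/{record_id}",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PATCH",
    )

req = build_crm_update("https://crm.internal.example", "42", "contacted")
print(req.full_url, req.method)
# urllib.request.urlopen(req) would actually send it; omitted here.
```

In practice you would wrap a call like this in a Power Platform custom connector so the agent can invoke it as a named action.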
!!! example "An 'Expense Helper' agent could:"
    1. Listen for a "Submit Expense" request.
    2. Grab the user's expense details from a form.
    3. Use an "Add to SharePoint List" action to store the data.
    4. Trigger a "Send Email" action to notify the approver.
Topics define the conversational triggers or entry points for your agent. Each topic corresponds to a piece of functionality or a question category.
!!! example "Example of topic description"
    This topic helps users submit an IT support ticket by collecting the issue details, priority, and contact information.
!!! example
    If a user says, "I need help setting up my new laptop," the AI might match that intent to the "Submit IT Ticket" topic. The agent then asks for the laptop model and user details, and pushes a ticket into the helpdesk system automatically.
Instructions (sometimes called "Prompts" or "System Messages") guide the LLM's tone, style, and boundaries. They shape how the agent responds in any situation.
!!! example "In a 'Benefits Advisor' agent, you might include:"
    "Always reference the latest HR handbook when answering questions. If asked about enrollment deadlines, provide the specific dates from the policy. Keep answers under 150 words."
When you assemble Knowledge, Tools, Topics, and Instructions, Copilot Studio's AI orchestrator creates an agent that understands the user's intent, pulls the right knowledge, and executes the right actions in the right order.
Under the hood, the orchestrator uses a generative planning approach: it decides which steps to take, in what order, to fulfill a user request. If an action fails (for example, an email can't be sent), the agent follows your exception-handling guidelines (ask a clarifying question or report the error). Because the LLM adapts to conversation context, the agent can maintain memory over multiple turns and incorporate new information as the conversation unfolds.
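The plan-then-execute behavior with exception handling can be sketched as a loop over steps. All names here are hypothetical; the point is that a failed step triggers a guideline (report the error) instead of ending the conversation.

```python
# Hypothetical sketch of an orchestrator executing a plan step by step,
# with a fallback guideline when a step raises an error.
def send_email(to):
    # Simulated failure of the email action.
    raise RuntimeError("SMTP relay unavailable")

def run_plan(steps):
    transcript = []
    for name, step in steps:
        try:
            step()
            transcript.append(f"{name}: done")
        except Exception as exc:
            # Exception-handling guideline: report the failure to the user.
            transcript.append(f"{name}: failed ({exc}); telling the user")
    return transcript

log = run_plan([
    ("look up PTO balance", lambda: None),
    ("email approver", lambda: send_email("approver@contoso.com")),
])
print(log)
```

A real orchestrator would also replan (for example, ask a clarifying question and retry); this sketch shows only the report-and-continue branch.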
Visual Flow Example:
```mermaid
sequenceDiagram
    participant User
    participant AI
    participant Agent
    User->>AI: "Show me my PTO balance."
    AI->>AI: Match topic: "Check PTO Balance"
    Note over AI: Instructions: Apply friendly, concise tone
    AI->>Agent: Request user's PTO balance
    Note right of Agent: Knowledge: Query HR SharePoint list
    Agent-->>AI: PTO balance = 12 days
    AI->>Agent: Send message to user (Teams)
    Note right of Agent: Action: Deliver notification
    Agent-->>User: "Your current PTO balance is 12 days."
```
You've successfully completed your fundamentals briefing. You've now learned the four essential building blocks of any agent in Copilot Studio: Knowledge, Tools, Topics, and Instructions.
With these components in place, you can build a basic agent that answers questions and executes simple workflows. In the next lesson, we'll walk through a step-by-step tutorial to create a "Service Desk" agent: from connecting your first knowledge source to defining a topic and wiring up an action.
Up next: Youâll build your first declarative agent for M365 Copilot.