Building Intelligent Chatbots with Semantic Kernel 

Function Calling

 

Introduction


In this post, we’ll explore how to build a chatbot that can search and retrieve information about motivational speakers using Semantic Kernel’s function-calling capabilities. You can clone the sample code repository by following this link.

 
What is Function Calling?

Function calling bridges the gap between AI models and your application logic. Instead of the LLM only generating text responses, it can:

  • Recognize when it needs specific data to answer a question
  • Automatically invoke appropriate functions from your codebase
  • Use the returned data to formulate accurate responses

Think of it as giving your AI assistant the ability to “use tools” from your application.

The Architecture
Flow:

User: "Tell me about Tony Robbins"

↓

LLM determines it needs speaker information

↓

Calls: GetSpeaker(name: "Tony Robbins")

↓

Returns: Speaker object with bio and website

↓

LLM formats response naturally

↓

Bot: "Tony Robbins is an American motivational speaker..."

Our example consists of three main components:
  1. The Data Model

using System.Text.Json.Serialization;

public class Speaker
{
    [JsonPropertyName("id")]
    public int Id { get; set; }

    [JsonPropertyName("name")]
    public string Name { get; set; }

    [JsonPropertyName("bio")]
    public string Bio { get; set; }

    [JsonPropertyName("webSite")]
    public string WebSite { get; set; }
}

A simple model representing a motivational speaker with their biographical information.
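The [JsonPropertyName] attributes map the camelCase keys in the data file onto the PascalCase C# properties. A quick round-trip sketch (string initializers are added here only so the excerpt compiles standalone):

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

public class Speaker
{
    [JsonPropertyName("id")] public int Id { get; set; }
    [JsonPropertyName("name")] public string Name { get; set; } = "";
    [JsonPropertyName("bio")] public string Bio { get; set; } = "";
    [JsonPropertyName("webSite")] public string WebSite { get; set; } = "";
}

public static class SpeakerJson
{
    // "webSite" in the JSON binds to the WebSite property via the attribute.
    public static Speaker? Parse(string json) => JsonSerializer.Deserialize<Speaker>(json);
}
```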

  2. The Plugin – SpeakerSearchPlugin

This is where the magic happens. The plugin exposes several functions that the AI can invoke:

[KernelFunction("get_speakers")]
[Description("Gets a list of speakers")]
public IList<Speaker> GetSpeakers()

[KernelFunction("get_speaker_by_id")]
[Description("Get speaker by id")]
public Speaker GetSpeaker(int id)

[KernelFunction("get_speaker_by_name")]
[Description("Get speaker by name")]
public Speaker GetSpeaker(string name)

[KernelFunction("get_speaker_by_search_term")]
[Description("Get speaker by search term")]
public List<Speaker> GetSpeakers(string searchTerm)
  3. The Orchestration Layer

The Program.cs file ties everything together, configuring the kernel and handling the conversation flow.
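As a sketch, the conversation loop in Program.cs might look like this. The model id and environment-variable name are placeholders, not values from the sample repo; the code assumes the Microsoft.SemanticKernel and Microsoft.SemanticKernel.Connectors.OpenAI packages:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var model = "gpt-4o-mini";                                          // placeholder
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")!; // placeholder

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(model, apiKey);
var kernel = builder.Build();
kernel.ImportPluginFromType<SpeakerSearchPlugin>();

var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory("You answer questions about motivational speakers.");

while (true)
{
    Console.Write("User > ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;

    history.AddUserMessage(input);

    // With AutoInvokeKernelFunctions, any needed plugin calls happen
    // inside this single call; the returned message is the final answer.
    var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
    history.AddMessage(reply.Role, reply.Content ?? string.Empty);
    Console.WriteLine($"Bot > {reply.Content}");
}
```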


Key Concepts Explained


Kernel Functions

The [KernelFunction] attribute marks methods that can be invoked by the AI. The parameter is the function name that the AI model sees:

[KernelFunction("get_speaker_by_name")]

Descriptions Matter

The [Description] attribute is crucial. It tells the AI what each function does, helping it decide when to use each one:

[Description("Get speaker by search term")]

Good descriptions lead to better function selection by the model.
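For illustration, parameter-level descriptions sharpen things further. This is a hypothetical refinement of the sample’s search function, not code from the repo:

```csharp
using System.Collections.Generic;
using System.ComponentModel;
using Microsoft.SemanticKernel;

// The [Description] on the parameter tells the model what kind of value
// to supply, not just when to call the function.
[KernelFunction("get_speaker_by_search_term")]
[Description("Search speakers whose name, bio, or website contains the given term")]
public List<Speaker> GetSpeakers(
    [Description("A keyword to match, e.g. \"financial\" or \"leadership\"")]
    string searchTerm)
{
    // ...search logic as shown later in this post...
    return new List<Speaker>();
}
```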

Auto-Invocation

The ToolCallBehavior.AutoInvokeKernelFunctions setting enables automatic function calling:

var settings = new OpenAIPromptExecutionSettings()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

When set, the AI will:

  1. Analyze the user’s question
  2. Determine if it needs data from your functions
  3. Automatically call the appropriate function(s)
  4. Incorporate the results into its response

All of this happens transparently without manual intervention.
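For contrast, ToolCallBehavior also offers a non-automatic mode; both values exist on the type:

```csharp
// Auto-invoke: the connector executes tool calls and loops until the
// model produces a final text answer.
var auto = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// Enable-only: functions are advertised to the model, but your code must
// inspect the returned tool calls and invoke them itself.
var manual = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.EnableKernelFunctions
};
```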

How It Works in Practice

Let’s walk through a user interaction:

User asks: “Tell me about Tony Robbins”

Behind the scenes:

  1. The AI analyzes the query and recognizes it needs speaker information
  2. It decides to invoke get_speaker_by_name with parameter “Tony Robbins”
  3. The function executes and returns the Speaker object
  4. The AI receives the data and formulates a natural language response
  5. User sees: “Tony Robbins is an American motivational speaker, personal finance instructor, and self-help author. He became well known from his infomercials and self-help books: Unlimited Power, Unleash the Power Within and Awaken the Giant Within. You can learn more at http://www.tonyrobbins.com”

User asks: “Which speakers focus on financial topics?”

Behind the scenes:

  1. The AI determines it needs to search across speaker bios
  2. It invokes get_speaker_by_search_term with “financial”
  3. The function searches through names, bios, and websites
  4. Returns matches: Dave Ramsey, Tony Robbins, Robert Kiyosaki, Suze Orman
  5. The AI presents the results in a readable format


Setup and Configuration


Installing Dependencies

dotnet add package Microsoft.SemanticKernel

dotnet add package Microsoft.SemanticKernel.Connectors.OpenAI

Configuring the Kernel

var builder = Kernel.CreateBuilder();

builder.Services.AddLogging(b => b.AddConsole().SetMinimumLevel(LogLevel.Trace));

builder.AddOpenAIChatCompletion(model, apiKey);
 
var kernel = builder.Build();

Registering the Plugin

kernel.ImportPluginFromType<SpeakerSearchPlugin>();

This single line makes all your [KernelFunction] decorated methods available to the AI.
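If the plugin needs constructor arguments (a repository, an HttpClient), the object-based overload works too; both registration styles exist in Semantic Kernel:

```csharp
kernel.ImportPluginFromType<SpeakerSearchPlugin>();        // kernel creates the instance
kernel.ImportPluginFromObject(new SpeakerSearchPlugin());  // you supply a configured instance
```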

Best Practices


  1. Write Clear Function Descriptions

The AI relies on descriptions to choose the right function. Be specific:

Good: “Get speaker by exact name match”
Better: “Search for a speaker by their full name (case-insensitive)”

  2. Design Focused Functions

Rather than one complex function, create multiple focused ones. This helps the AI make better decisions and improves reliability.

  3. Handle Edge Cases

public Speaker GetSpeaker(string name)
{
    return _speakers.FirstOrDefault(x => x.Name == name);
}

Consider what happens when no match is found. Return null? Throw an exception? The AI will work with whatever you return.
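One hedged option is a case-insensitive lookup that returns null for “no match” and lets the model phrase the apology. A sketch, with Speaker trimmed to the fields the example needs:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Speaker(int Id, string Name);

public static class SpeakerLookup
{
    // Case-insensitive exact-name match; null signals "not found", which
    // the model can turn into a polite "I couldn't find that speaker".
    public static Speaker? GetSpeaker(IEnumerable<Speaker> speakers, string name)
        => speakers.FirstOrDefault(
            x => string.Equals(x.Name, name, StringComparison.OrdinalIgnoreCase));
}
```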

  4. Use Strong Typing

Return concrete types rather than dynamic objects. This improves serialization and makes the AI’s job easier.

  5. Consider Search Flexibility

The get_speaker_by_search_term function searches across multiple fields:

searchedSpeakers.AddRange(_speakers.Where(x => x.Name.Contains(searchTerm)));

searchedSpeakers.AddRange(_speakers.Where(x => x.Bio.Contains(searchTerm)));

searchedSpeakers.AddRange(_speakers.Where(x => x.WebSite.Contains(searchTerm)));

This provides flexibility but could return duplicates. Consider using Distinct() or implementing smarter ranking logic.
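A single-pass predicate achieves the same coverage without producing duplicates in the first place (a sketch of the Distinct() suggestion, with Speaker again trimmed to the needed fields):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Speaker(int Id, string Name, string Bio, string WebSite);

public static class SpeakerSearch
{
    // One Where over an OR of the three fields: each speaker is tested once,
    // so a term matching both the name and the bio still yields one result.
    public static List<Speaker> Search(IEnumerable<Speaker> speakers, string term)
        => speakers
            .Where(x =>
                x.Name.Contains(term, StringComparison.OrdinalIgnoreCase) ||
                x.Bio.Contains(term, StringComparison.OrdinalIgnoreCase) ||
                x.WebSite.Contains(term, StringComparison.OrdinalIgnoreCase))
            .ToList();
}
```

The OrdinalIgnoreCase comparison also makes the search case-insensitive, which the original Contains calls were not.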

Real-World Applications


This pattern extends far beyond speaker databases. You can create plugins for:

  • E-commerce: Product search, inventory checks, order status
  • CRM Systems: Customer lookup, ticket creation, sales data
  • DevOps: Build status, deployment queries, log analysis
  • Healthcare: Patient records, appointment scheduling, lab results
  • Financial Systems: Transaction history, balance inquiries, fraud detection

Performance Considerations


Function calling adds latency to responses:

  • The model must analyze the query
  • Functions must execute
  • Results must be processed
  • Final response must be generated

For production systems:

  • Cache frequently accessed data
  • Optimize database queries
  • Consider implementing request timeouts
  • Monitor function execution times
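For the caching point, even a small in-memory helper goes a long way. This is a hypothetical utility, not part of the sample repo: entries expire after a TTL so repeated questions in one session don’t hit the backing store every time.

```csharp
using System;
using System.Collections.Concurrent;

public sealed class TtlCache<TKey, TValue> where TKey : notnull
{
    private readonly ConcurrentDictionary<TKey, (TValue Value, DateTime Expires)> _entries = new();
    private readonly TimeSpan _ttl;

    public TtlCache(TimeSpan ttl) => _ttl = ttl;

    // Returns the cached value if it is still fresh; otherwise runs the
    // factory, stores the result with a new expiry, and returns it.
    public TValue GetOrAdd(TKey key, Func<TKey, TValue> factory)
    {
        if (_entries.TryGetValue(key, out var e) && e.Expires > DateTime.UtcNow)
            return e.Value;
        var value = factory(key);
        _entries[key] = (value, DateTime.UtcNow + _ttl);
        return value;
    }
}
```

A plugin method can then wrap its data access in `cache.GetOrAdd(searchTerm, LoadSpeakers)` so repeated tool calls for the same term are served from memory.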

 

Conclusion


Semantic Kernel’s function calling transforms static AI models into dynamic assistants that can interact with your application’s data and logic. By properly decorating your methods with [KernelFunction] and [Description] attributes, you enable the AI to intelligently decide when and how to use your functions.

The speaker search example demonstrates the fundamentals, but the pattern scales to enterprise applications. Whether you’re building customer service bots, internal tools, or consumer applications, function calling provides the bridge between AI capabilities and your business logic.

Start simple, test thoroughly, and gradually expand your plugin’s capabilities. The AI will surprise you with how effectively it can utilize the tools you provide.

Next Steps


  • Explore Semantic Kernel’s planning capabilities for multi-step operations
  • Implement authentication and authorization for sensitive functions
  • Add streaming support for real-time responses
  • Integrate with Azure OpenAI for enterprise deployments
  • Build plugins for your existing services and APIs