By now, we have seen "Chat with your documents" functionality being introduced in many Microsoft 365 applications. It is typically built by combining Large Language Models (LLMs) and vector databases.
To make the documents "chat ready", they have to be converted to embeddings and stored in a vector database like Azure AI Search. However, indexing the documents and keeping the index in sync are not trivial tasks; there are many moving pieces involved. Also, in many cases there is no need for "similarity search" or "vector search", where results are matched on the meaning of the query.
In such cases, a simple "keyword" search can do the trick. The advantage of using keyword search in Microsoft 365 applications is that the Microsoft Search indexes are already available as part of the service. APIs like the Microsoft Graph Search API and the SharePoint Search REST API give us "ready to consume" endpoints which can be used to query documents across SharePoint and OneDrive. Keeping these search indexes in sync with the changes in the documents is also handled by the Microsoft 365 service itself.
So in this post, let's have a look at how we can combine OpenAI's gpt-4o Large Language Model with the Microsoft Graph Search API to query SharePoint and OneDrive documents in natural language.
At a high level, we will use OpenAI function calling to achieve this. Our steps are going to be:
1. Define an OpenAI function and make it available to the LLM.
2. During the course of the chat, if the LLM decides that it needs to call our function to respond to the user, it will respond with the function name along with the parameters.
3. Call the Microsoft Graph Search API with the parameters provided by the LLM.
4. Send the results returned from the Microsoft Graph back to the LLM to generate a response in natural language.
So let's see how to achieve this. In this code, I have used the following NuGet packages:
https://www.nuget.org/packages/Azure.AI.OpenAI/2.1.0
https://www.nuget.org/packages/Microsoft.Graph/5.64.0
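If you are adding them through the .NET CLI, the versions used here can be installed like this:

dotnet add package Azure.AI.OpenAI --version 2.1.0
dotnet add package Microsoft.Graph --version 5.64.0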
The first thing we will look at is our OpenAI function definition:
"functions": [{ | |
"name": "search_microsoft365_documents", | |
"description": "Search the Microsfot 365 documents from user's SharePoint and OneDrive", | |
"parameters": { | |
"type": "object", | |
"required": ["searchQuery"], | |
"properties": { | |
"searchQuery": { | |
"type": "string", | |
"description": "the text to search in the documents to get the required information" | |
} | |
} | |
} | |
} | |
] |
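For reference, when the model decides that answering the user requires this function, the Chat Completions response will contain a tool call shaped roughly like this (the id and arguments below are only illustrative):

"tool_calls": [{
    "id": "call_abc123",
    "type": "function",
    "function": {
        "name": "search_microsoft365_documents",
        "arguments": "{\"searchQuery\": \"quarterly sales report\"}"
    }
}]

We will read the function name, the arguments and the tool call id from this response in the code below.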
static async Task Main(string[] args)
{
    string endpoint = "<azure-openai-endpoint>";
    string key = "<azure-openai-key>";
    string deploymentName = "gpt-4o";
    var azureOpenAIClient = new AzureOpenAIClient(new Uri(endpoint), new ApiKeyCredential(key));

    //Get the user's question from the console.
    Console.WriteLine("What would you like to search?: ");
    string userQuestion = Console.ReadLine();

    //1. Call the OpenAI Chat API with the user's question.
    var chatCompletionResponse = await CallOpenAIAPI(userQuestion, deploymentName, azureOpenAIClient);

    //2. Check if the Chat API decided that, to answer the question, a function call to the MS Graph needs to be made.
    if (chatCompletionResponse.Value.FinishReason == ChatFinishReason.ToolCalls)
    {
        string functionName = chatCompletionResponse.Value.ToolCalls[0].FunctionName;
        BinaryData functionArguments = chatCompletionResponse.Value.ToolCalls[0].FunctionArguments;
        string toolCallId = chatCompletionResponse.Value.ToolCalls[0].Id;

        Console.WriteLine($"Function Name: {functionName}, Params: {functionArguments}");

        if (functionName == "search_microsoft365_documents")
        {
            //3. If the MS Graph function call needs to be made, the Chat API will also provide the parameters to pass to the function.
            var searchParams = JsonSerializer.Deserialize<M365SearchQueryParams>(functionArguments);

            //4. Call the MS Graph with the parameters provided by the Chat API.
            var functionResponse = await ExecuteMicrosoft365SearchWithGraph(searchParams.searchQuery);
            Console.WriteLine($"Graph Response: {functionResponse}");

            //5. Call the Chat API again with the function response.
            var functionMessages = new List<OpenAI.Chat.ChatMessage>
            {
                new AssistantChatMessage(new List<ChatToolCall>() { ChatToolCall.CreateFunctionToolCall(toolCallId, functionName, functionArguments) }),
                new ToolChatMessage(toolCallId, functionResponse)
            };
            chatCompletionResponse = await CallOpenAIAPI(userQuestion, deploymentName, azureOpenAIClient, functionMessages);

            //6. Print the final response from the Chat API.
            Console.WriteLine("------------------");
            Console.WriteLine(chatCompletionResponse.Value.Content[0].Text);
        }
    }
    else
    {
        //If the LLM decided that a function call is not needed, print the response from the Chat API directly.
        Console.WriteLine(chatCompletionResponse.Value.Content[0].Text);
    }
}
There is a lot to unpack here, as this method does the heavy lifting: it handles the chat with OpenAI, calls the MS Graph, and responds back to the user based on the response from the Graph.
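One detail to note: the function arguments are deserialized into an M365SearchQueryParams object, which is not shown above. All it needs is a property matching the "searchQuery" parameter from the function schema, so a minimal version could look like this:

public class M365SearchQueryParams
{
    //Must match the "searchQuery" property name in the function's JSON arguments
    //(System.Text.Json is case-sensitive by default).
    public string searchQuery { get; set; }
}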
Next, let's have a look at the code which calls the Microsoft Graph based on the parameters provided by the LLM.
Before executing this code, you will need to create an app registration. Here is how to do that: https://learn.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app
Since we are calling the Microsoft Graph /search endpoint with delegated permissions, the app registration will need at least the User.Read and Files.Read.All delegated permissions granted. https://learn.microsoft.com/en-us/graph/api/search-query?view=graph-rest-1.0&tabs=http
private static async Task<string> ExecuteMicrosoft365SearchWithGraph(string searchQuery)
{
    // To initialize your graphClient, see https://learn.microsoft.com/en-us/graph/sdks/create-client?from=snippets&tabs=csharp
    GraphServiceClient graphClient = GetGraphClient(["User.Read", "Files.Read.All"]);

    var requestBody = new QueryPostRequestBody
    {
        Requests = new List<SearchRequest>
        {
            new SearchRequest
            {
                EntityTypes = new List<EntityType?>
                {
                    EntityType.DriveItem,
                },
                Query = new SearchQuery
                {
                    QueryString = searchQuery,
                },
                From = 0,
                Size = 25,
            },
        },
    };

    var searchResults = await graphClient.Search.Query.PostAsQueryPostResponseAsync(requestBody);

    var result = string.Empty;
    foreach (var hit in searchResults.Value[0].HitsContainers[0].Hits)
    {
        var driveItem = hit.Resource as DriveItem;
        if (driveItem != null)
        {
            //Using the summary of the search result. In production, it might be the case that the summary is not enough.
            //In that situation, the solution could be to fetch the file contents through another Graph call.
            result += hit.Summary;
        }
    }
    return result;
}
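The GetGraphClient helper is not shown here; as the inline comment says, any of the client initialization approaches from the Graph SDK docs will work. Below is a minimal sketch assuming delegated auth with the Azure.Identity package's InteractiveBrowserCredential, where the tenant and client ID placeholders come from your own app registration:

private static GraphServiceClient GetGraphClient(string[] scopes)
{
    //Assumes the Azure.Identity NuGet package; fill in the values from your app registration.
    var options = new InteractiveBrowserCredentialOptions
    {
        TenantId = "<tenant-id>",
        ClientId = "<client-id>",
        RedirectUri = new Uri("http://localhost")
    };
    var credential = new InteractiveBrowserCredential(options);

    //The Graph SDK accepts any TokenCredential along with the delegated scopes to request.
    return new GraphServiceClient(credential, scopes);
}

Finally, here is the CallOpenAIAPI method, which wraps the calls to the Azure OpenAI Chat Completions API and makes our function available to the model: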
private static async Task<ClientResult<ChatCompletion>> CallOpenAIAPI(string userQuestion, string modelDeploymentName, AzureOpenAIClient azureOpenAIClient, IList<OpenAI.Chat.ChatMessage> functionMessages = null)
{
    var chatCompletionOptions = new ChatCompletionOptions();

    var messages = new List<OpenAI.Chat.ChatMessage>
    {
        new SystemChatMessage("You are a search assistant that helps find information. Only use the functions and parameters you have been provided with."),
        new UserChatMessage(userQuestion)
    };

    //Include the assistant's tool call and the tool's result when making the follow-up call.
    if (functionMessages != null)
    {
        foreach (var functionMessage in functionMessages)
        {
            messages.Add(functionMessage);
        }
    }

    //Make the search_microsoft365_documents function available to the model on every call.
    chatCompletionOptions.Tools.Add(ChatTool.CreateFunctionTool(
        functionName: "search_microsoft365_documents",
        functionDescription: "Search the Microsoft 365 documents from the user's SharePoint and OneDrive.",
        functionParameters: BinaryData.FromString("{\"type\": \"object\",\"required\": [\"searchQuery\"],\"properties\": {\"searchQuery\": {\"type\": \"string\",\"description\": \"the text to search in the documents to get the required information\"}}}")
    ));

    var chatCompletionResponse = await azureOpenAIClient.GetChatClient(modelDeploymentName).CompleteChatAsync(messages, chatCompletionOptions);
    return chatCompletionResponse;
}