From cc892a5eed6fc786a8b39a238cbe9f0e4f51152f Mon Sep 17 00:00:00 2001
From: Aleksei Smirnov
Date: Fri, 5 Jan 2024 22:21:53 +0300
Subject: [PATCH] Fix typos in SemanticKernel README file

---
 LLama.SemanticKernel/README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/LLama.SemanticKernel/README.md b/LLama.SemanticKernel/README.md
index 71f4611a..907a9912 100644
--- a/LLama.SemanticKernel/README.md
+++ b/LLama.SemanticKernel/README.md
@@ -1,12 +1,12 @@
 # LLamaSharp.SemanticKernel
 
-LLamaSharp.SemanticKernel are connections for [SemanticKernel](https://github.com/microsoft/semantic-kernel): an SDK for intergrating various LLM interfaces into a single implementation. With this, you can add local LLaMa queries as another connection point with your existing connections.
+LLamaSharp.SemanticKernel are connections for [SemanticKernel](https://github.com/microsoft/semantic-kernel): an SDK for integrating various LLM interfaces into a single implementation. With this, you can add local LLaMa queries as another connection point with your existing connections.
 
 For reference on how to implement it, view the following examples:
 
-- [SemanticKernelChat](../LLama.Examples/NewVersion/SemanticKernelChat.cs)
-- [SemanticKernelPrompt](../LLama.Examples/NewVersion/SemanticKernelPrompt.cs)
-- [SemanticKernelMemory](../LLama.Examples/NewVersion/SemanticKernelMemory.cs)
+- [SemanticKernelChat](../LLama.Examples/Examples/SemanticKernelChat.cs)
+- [SemanticKernelPrompt](../LLama.Examples/Examples/SemanticKernelPrompt.cs)
+- [SemanticKernelMemory](../LLama.Examples/Examples/SemanticKernelMemory.cs)
 
 ## ITextCompletion
 ```csharp