LLamaSharp.SemanticKernel provides connectors for Semantic Kernel, an SDK for integrating various LLM interfaces behind a single implementation. With it, you can add local LLaMA queries as another connection point alongside your existing connectors.
For reference on how to use it, see the following examples:
```csharp
// `parameters` is a LLamaSharp ModelParams instance pointing at your local model file.
var parameters = new ModelParams("<path to your model>");
using var model = LLamaWeights.LoadFromFile(parameters);

// LLamaSharpTextCompletion accepts any ILLamaExecutor; here a StatelessExecutor is used.
var ex = new StatelessExecutor(model, parameters);

var builder = new KernelBuilder();
builder.WithAIService<ITextCompletion>("local-llama", new LLamaSharpTextCompletion(ex), true);
```
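
As a rough illustration of how the registered service can then be invoked, the sketch below builds the kernel and runs a simple semantic function against the local model. It assumes a pre-1.0 Semantic Kernel API surface (`CreateSemanticFunction`, `RunAsync`); the prompt template and input text are made-up examples, not part of this project.

```csharp
// A minimal usage sketch, assuming the builder configured above and a
// pre-1.0 Semantic Kernel API (CreateSemanticFunction / RunAsync).
var kernel = builder.Build();

// Define a trivial semantic function; the prompt template is an illustrative placeholder.
var summarize = kernel.CreateSemanticFunction("Summarize the following text:\n{{$input}}");

// Run the function through the local LLaMA text-completion service registered earlier.
var result = await kernel.RunAsync("LLamaSharp runs LLaMA models locally in .NET.", summarize);
Console.WriteLine(result);
```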
For chat completion:

```csharp
// Assumes the same `parameters` (ModelParams) as in the text-completion example above.
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);

// LLamaSharpChatCompletion requires an InteractiveExecutor, as an interactive session is the best fit for chat.
var ex = new InteractiveExecutor(context);
var chatGPT = new LLamaSharpChatCompletion(ex);
```
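
One possible way to drive the chat connector, sketched under the assumption of the pre-1.0 Semantic Kernel `IChatCompletion` interface (`CreateNewChat`, `GenerateMessageAsync`); the system prompt and messages are illustrative placeholders:

```csharp
// A minimal chat sketch, assuming the pre-1.0 Semantic Kernel IChatCompletion surface.
// The system prompt and user message are placeholder examples.
var chatHistory = chatGPT.CreateNewChat("You are a helpful assistant.");
chatHistory.AddUserMessage("Hi, can you suggest a book about local LLM inference?");

// Generate the assistant's reply with the local LLaMA model.
string reply = await chatGPT.GenerateMessageAsync(chatHistory);
chatHistory.AddAssistantMessage(reply);
Console.WriteLine(reply);
```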
LLamaSharp is an easy-to-use, high-performance LLM inference framework for C#/.NET, supporting the LLaMA and LLaVA model families.