@@ -7,7 +7,7 @@
     <Platforms>AnyCPU;x64;Arm64</Platforms>
     <AllowUnsafeBlocks>True</AllowUnsafeBlocks>
-    <Version>0.11.0</Version>
+    <Version>0.11.2</Version>
     <Authors>Rinne, Martin Evans, jlsantiago and all the other contributors in https://github.com/SciSharp/LLamaSharp/graphs/contributors.</Authors>
     <Company>SciSharp STACK</Company>
     <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
@@ -22,7 +22,7 @@
     With the higher-level APIs and RAG support, it's convenient to deploy LLMs (Large Language Models) in your application with LLamaSharp.
     </Description>
     <PackageReleaseNotes>
-    LLamaSharp 0.11.0 added support for multi-modal (LLaVA), improved the BatchedExecutor and added state management of `ChatSession`.
+    LLamaSharp 0.11.2 fixed the performance issue of LLaVA on GPU and improved log suppression.
     </PackageReleaseNotes>
     <PackageLicenseExpression>MIT</PackageLicenseExpression>
     <PackageOutputPath>packages</PackageOutputPath>
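For context (outside the diff itself), the packaging properties touched here combine as sketched below: with `<GeneratePackageOnBuild>` set to `true`, a plain `dotnet build` also produces the NuGet package, and `<PackageOutputPath>` redirects the resulting `.nupkg` into the repository's `packages/` folder instead of the default `bin/` output. The enclosing `<PropertyGroup>` element is assumed from standard csproj structure and is not shown in the diff:

```xml
<!-- Sketch only: how these properties sit together in the csproj.
     The surrounding PropertyGroup is assumed, not part of the diff. -->
<PropertyGroup>
  <Version>0.11.2</Version>
  <!-- Pack on every build instead of requiring a separate `dotnet pack` -->
  <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
  <!-- Place the produced .nupkg under ./packages rather than bin/ -->
  <PackageOutputPath>packages</PackageOutputPath>
</PropertyGroup>
```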