# ISamplingPipelineExtensions

Namespace: LLama.Sampling

Extension methods for ISamplingPipeline

```csharp
public static class ISamplingPipelineExtensions
```

Inheritance [Object](https://docs.microsoft.com/en-us/dotnet/api/system.object) → [ISamplingPipelineExtensions](./llama.sampling.isamplingpipelineextensions.md)
## Methods

### **Sample(ISamplingPipeline, SafeLLamaContextHandle, Span&lt;Single&gt;, List&lt;LLamaToken&gt;)**

Sample a single token from the given logits

```csharp
public static LLamaToken Sample(ISamplingPipeline pipeline, SafeLLamaContextHandle ctx, Span<float> logits, List<LLamaToken> lastTokens)
```

#### Parameters

`pipeline` [ISamplingPipeline](./llama.sampling.isamplingpipeline.md)<br>

`ctx` [SafeLLamaContextHandle](./llama.native.safellamacontexthandle.md)<br>
The context being sampled from

`logits` [Span&lt;Single&gt;](https://docs.microsoft.com/en-us/dotnet/api/system.span-1)<br>
The logits produced by the model

`lastTokens` [List&lt;LLamaToken&gt;](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.list-1)<br>
A list of tokens recently returned by the model

#### Returns

[LLamaToken](./llama.native.llamatoken.md)<br>
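
#### Example

A minimal usage sketch (not part of the generated reference): it assumes the `SafeLLamaContextHandle` comes from an already loaded context (for example via `LLamaContext.NativeHandle`) and that `GetLogits()` exposes the logits for the last evaluated token; exact member names may differ between LLamaSharp versions.

```csharp
using System;
using System.Collections.Generic;
using LLama.Native;
using LLama.Sampling;

static class SamplingExample
{
    // Draws one token from the model's output distribution using any
    // ISamplingPipeline implementation. Assumes `ctx` belongs to an already
    // loaded context and that GetLogits() returns the logits produced by the
    // last evaluation step (assumed API; check your LLamaSharp version).
    public static LLamaToken SampleNext(ISamplingPipeline pipeline, SafeLLamaContextHandle ctx, List<LLamaToken> lastTokens)
    {
        Span<float> logits = ctx.GetLogits();

        // Passing a List<LLamaToken> resolves to ISamplingPipelineExtensions.Sample,
        // which wraps the span-based ISamplingPipeline.Sample.
        LLamaToken next = pipeline.Sample(ctx, logits, lastTokens);

        // Keep a history of emitted tokens so repetition penalties can see them.
        lastTokens.Add(next);
        return next;
    }
}
```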