# Conversation

Namespace: LLama.Batched

A single conversation thread that can be prompted (adding tokens from the user) or inferred (extracting a token from the LLM)

```csharp
public sealed class Conversation : System.IDisposable
```

Inheritance [Object](https://docs.microsoft.com/en-us/dotnet/api/system.object) → [Conversation](./llama.batched.conversation.md)
Implements [IDisposable](https://docs.microsoft.com/en-us/dotnet/api/system.idisposable)

## Properties

### **Executor**

The executor which this conversation belongs to

```csharp
public BatchedExecutor Executor { get; }
```

#### Property Value

[BatchedExecutor](./llama.batched.batchedexecutor.md)
### **ConversationId**

Unique ID for this conversation

```csharp
public LLamaSeqId ConversationId { get; }
```

#### Property Value

[LLamaSeqId](./llama.native.llamaseqid.md)
### **TokenCount**

Total number of tokens in this conversation; cannot exceed the context length.

```csharp
public int TokenCount { get; }
```

#### Property Value

[Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
### **IsDisposed**

Indicates whether this conversation has been disposed; nothing can be done with a disposed conversation.

```csharp
public bool IsDisposed { get; }
```

#### Property Value

[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
### **RequiresInference**

Indicates whether this conversation is waiting for inference to be run on the executor. "Prompt" and "Sample" cannot be called when this is true.

```csharp
public bool RequiresInference { get; }
```

#### Property Value

[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
### **RequiresSampling**

Indicates that this conversation should be sampled.

```csharp
public bool RequiresSampling { get; }
```

#### Property Value

[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
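Taken together, these properties describe where a conversation sits in the prompt → infer → sample cycle. The sketch below shows one possible generation loop; it is a hedged illustration, not canonical usage. The `executor.Create()` and `executor.Infer()` calls are assumed from the [BatchedExecutor](./llama.batched.batchedexecutor.md) API, and the greedy `SampleToken` helper is purely illustrative (a real sampler would apply temperature, top-k, etc.):

```csharp
// Illustrative sketch of driving one Conversation through the
// prompt -> infer -> sample cycle. Helper names are assumptions,
// not part of the Conversation API documented here.
using var conversation = executor.Create();

// Add the initial prompt tokens (RequiresInference becomes true).
conversation.Prompt("The quick brown fox");

for (var i = 0; i < 32; i++)
{
    // Run one batched inference step on the executor.
    await executor.Infer();

    if (!conversation.RequiresSampling)
        continue;

    // Sample a token and feed it back in to continue generation.
    var token = SampleToken(conversation);
    conversation.Prompt(token);
}

// Span<float> locals cannot appear in an async method, so the
// logits are consumed in a separate synchronous helper.
static LLamaToken SampleToken(Conversation c)
{
    var logits = c.Sample();

    // Greedy argmax, for brevity only.
    var best = 0;
    for (var i = 1; i < logits.Length; i++)
        if (logits[i] > logits[best])
            best = i;

    return (LLamaToken)best;
}
```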
## Methods

### **Finalize()**

Finalizer for Conversation

```csharp
protected void Finalize()
```

### **Dispose()**

End this conversation, freeing all resources used by it

```csharp
public void Dispose()
```

#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)
### **Fork()**

Create a copy of the current conversation

```csharp
public Conversation Fork()
```

#### Returns

[Conversation](./llama.batched.conversation.md)

#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)
**Remarks:**

The copy shares internal state, so it consumes very little extra memory.

### **Sample()**

Get the logits from this conversation, ready for sampling

```csharp
public Span<float> Sample()
```

#### Returns

[Span<Single>](https://docs.microsoft.com/en-us/dotnet/api/system.span-1)
#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)

[CannotSampleRequiresPromptException](./llama.batched.cannotsamplerequirespromptexception.md)
Thrown if this conversation was not prompted before the previous call to `Infer()`

[CannotSampleRequiresInferenceException](./llama.batched.cannotsamplerequiresinferenceexception.md)
Thrown if `Infer()` must be called on the executor

### **Prompt(String)**

Add tokens to this conversation

```csharp
public void Prompt(string input)
```

#### Parameters

`input` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **Prompt(List<LLamaToken>)**

Add tokens to this conversation

```csharp
public void Prompt(List<LLamaToken> tokens)
```

#### Parameters

`tokens` [List<LLamaToken>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.list-1)
#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)

[AlreadyPromptedConversationException](./llama.batched.alreadypromptedconversationexception.md)
### **Prompt(ReadOnlySpan<LLamaToken>)**

Add tokens to this conversation

```csharp
public void Prompt(ReadOnlySpan<LLamaToken> tokens)
```

#### Parameters

`tokens` [ReadOnlySpan<LLamaToken>](https://docs.microsoft.com/en-us/dotnet/api/system.readonlyspan-1)
#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)

[AlreadyPromptedConversationException](./llama.batched.alreadypromptedconversationexception.md)
### **Prompt(LLamaToken)**

Add a single token to this conversation

```csharp
public void Prompt(LLamaToken token)
```

#### Parameters

`token` [LLamaToken](./llama.native.llamatoken.md)
#### Exceptions

[ObjectDisposedException](https://docs.microsoft.com/en-us/dotnet/api/system.objectdisposedexception)

[AlreadyPromptedConversationException](./llama.batched.alreadypromptedconversationexception.md)
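The `Fork()` method described earlier pairs naturally with these `Prompt` overloads: because a fork shares KV-cache state with its parent, a conversation can be branched cheaply and each branch prompted differently. A minimal sketch, assuming an existing `conversation` and `executor` and keeping in mind that a conversation cannot be forked or prompted while it is awaiting inference:

```csharp
// Illustrative sketch: branch one conversation into two continuations.
// Fork() shares internal state, so this is cheap; only tokens added
// after the fork diverge between the branches.
using var branchA = conversation.Fork();
using var branchB = conversation.Fork();

// Prompt each branch with a different continuation.
branchA.Prompt(" jumped over");
branchB.Prompt(" ran around");

// A single Infer() call on the executor evaluates both branches
// together in one batch.
await executor.Infer();
```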
### **Modify(ModifyKvCache)**

Directly modify the KV cache of this conversation

```csharp
public void Modify(ModifyKvCache modifier)
```

#### Parameters

`modifier` [ModifyKvCache](./llama.batched.conversation.modifykvcache.md)
#### Exceptions

[CannotModifyWhileRequiresInferenceException](./llama.batched.cannotmodifywhilerequiresinferenceexception.md)
Thrown if this method is called while [Conversation.RequiresInference](./llama.batched.conversation.md#requiresinference) == true
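As a concrete (hypothetical) use of `Modify`, one common pattern is rewinding a conversation by dropping its most recent tokens from the KV cache. The exact shape of the delegate is documented on the [ModifyKvCache](./llama.batched.conversation.modifykvcache.md) page; this sketch assumes it receives the current end position and a KV-cache accessor exposing a `Remove` method, and returns the new end position — verify those details against that page before relying on them:

```csharp
// Illustrative sketch: rewind this conversation by 5 tokens.
// Assumes the delegate has the shape (end, kv) => newEnd and the
// accessor exposes Remove(start, count); both are assumptions here.
conversation.Modify((end, kv) =>
{
    // Drop the last 5 positions from this conversation's KV cache.
    kv.Remove(end.Value - 5, 5);

    // Report the new end of the conversation.
    return end.Value - 5;
});
```

Per the exception above, this may only be done while `RequiresInference` is false, i.e. between inference steps.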