# LLamaModel
Namespace: LLama.OldVersion
#### Caution
The entire LLama.OldVersion namespace will be removed.
---
```csharp
public class LLamaModel : IChatModel, System.IDisposable
```
Inheritance [Object](https://docs.microsoft.com/en-us/dotnet/api/system.object) → [LLamaModel](./llama.oldversion.llamamodel.md)
Implements [IChatModel](./llama.oldversion.ichatmodel.md), [IDisposable](https://docs.microsoft.com/en-us/dotnet/api/system.idisposable)
## Properties
### **Name**
```csharp
public string Name { get; set; }
```
#### Property Value
[String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **Verbose**
```csharp
public bool Verbose { get; set; }
```
#### Property Value
[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
### **NativeHandle**
```csharp
public SafeLLamaContextHandle NativeHandle { get; }
```
#### Property Value
[SafeLLamaContextHandle](./llama.native.safellamacontexthandle.md)
## Constructors
### **LLamaModel(String, String, Boolean, Int32, Int32, Int32, Int32, Int32, Int32, Int32, Dictionary<Int32, Single>, Int32, Single, Single, Single, Single, Single, Int32, Single, Single, Int32, Single, Single, String, String, String, String, List<String>, String, String, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, Boolean, String)**
Please refer to `LLamaParams` for the meaning of each argument. Be sure to set `n_gpu_layers`; otherwise it will
load 20 layers to the GPU by default.
```csharp
public LLamaModel(string model_path, string model_name, bool verbose, int seed, int n_threads, int n_predict, int n_ctx, int n_batch, int n_keep, int n_gpu_layers, Dictionary<int, float> logit_bias, int top_k, float top_p, float tfs_z, float typical_p, float temp, float repeat_penalty, int repeat_last_n, float frequency_penalty, float presence_penalty, int mirostat, float mirostat_tau, float mirostat_eta, string prompt, string path_session, string input_prefix, string input_suffix, List<string> antiprompt, string lora_adapter, string lora_base, bool memory_f16, bool random_prompt, bool use_color, bool interactive, bool embedding, bool interactive_first, bool prompt_cache_all, bool instruct, bool penalize_nl, bool perplexity, bool use_mmap, bool use_mlock, bool mem_test, bool verbose_prompt, string encoding)
```
#### Parameters
`model_path` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
The model file path.
`model_name` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
The model name.
`verbose` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
Whether to print details when running the model.
`seed` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_threads` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_predict` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_ctx` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_batch` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_keep` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`n_gpu_layers` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`logit_bias` [Dictionary<Int32, Single>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.dictionary-2)
`top_k` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`top_p` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`tfs_z` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`typical_p` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`temp` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`repeat_penalty` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`repeat_last_n` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`frequency_penalty` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`presence_penalty` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`mirostat` [Int32](https://docs.microsoft.com/en-us/dotnet/api/system.int32)
`mirostat_tau` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`mirostat_eta` [Single](https://docs.microsoft.com/en-us/dotnet/api/system.single)
`prompt` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`path_session` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`input_prefix` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`input_suffix` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`antiprompt` [List<String>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.list-1)
`lora_adapter` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`lora_base` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`memory_f16` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`random_prompt` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`use_color` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`interactive` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`embedding` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`interactive_first` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`prompt_cache_all` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`instruct` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`penalize_nl` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`perplexity` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`use_mmap` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`use_mlock` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`mem_test` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`verbose_prompt` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **LLamaModel(LLamaParams, String, Boolean, String)**
Please refer to `LLamaParams` for the meaning of each argument. Be sure to set `n_gpu_layers`; otherwise it will
load 20 layers to the GPU by default.
```csharp
public LLamaModel(LLamaParams @params, string name, bool verbose, string encoding)
```
#### Parameters
`params` [LLamaParams](./llama.oldversion.llamaparams.md)
The LLamaModel parameters.
`name` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
The model name.
`verbose` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
Whether to print details when running the model.
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Exceptions
[RuntimeError](./llama.exceptions.runtimeerror.md)
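The following is a minimal construction sketch using the `LLamaParams` overload. The `LLamaParams` argument names used here (`model`, `n_ctx`, `n_gpu_layers`), the model file name, and the `"UTF-8"` encoding are illustrative assumptions; see the `LLamaParams` documentation for the authoritative list.
```csharp
using LLama.OldVersion;

// Illustrative values only: adjust the model path, context size and
// GPU layer count to your setup (see LLamaParams for all options).
var modelParams = new LLamaParams(model: "ggml-model-q4_0.bin", n_ctx: 512, n_gpu_layers: 32);
var model = new LLamaModel(modelParams, "my-model", verbose: true, encoding: "UTF-8");
```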
## Methods
### **WithPrompt(String, String)**
Apply a prompt to the model.
```csharp
public LLamaModel WithPrompt(string prompt, string encoding)
```
#### Parameters
`prompt` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Returns
[LLamaModel](./llama.oldversion.llamamodel.md)
#### Exceptions
[ArgumentException](https://docs.microsoft.com/en-us/dotnet/api/system.argumentexception)
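A short usage sketch: since `WithPrompt` returns the model itself, it can be chained after construction. The prompt text and encoding value are placeholders.
```csharp
// Apply an initial prompt; the returned reference is the same model,
// so the call can be chained fluently.
model = model.WithPrompt("Below is a conversation between a user and an assistant.", "UTF-8");
```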
### **WithPromptFile(String)**
Apply a prompt from a file to the model.
```csharp
public LLamaModel WithPromptFile(string promptFileName)
```
#### Parameters
`promptFileName` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Returns
[LLamaModel](./llama.oldversion.llamamodel.md)
### **InitChatPrompt(String, String)**
```csharp
public void InitChatPrompt(string prompt, string encoding)
```
#### Parameters
`prompt` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **InitChatAntiprompt(String[])**
```csharp
public void InitChatAntiprompt(string[] antiprompt)
```
#### Parameters
`antiprompt` [String[]](https://docs.microsoft.com/en-us/dotnet/api/system.string)
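A hypothetical chat-mode setup combining both initializers; the system prompt and the "User:" antiprompt are placeholders.
```csharp
// Set the initial chat prompt and the antiprompt that marks where
// generation should stop and hand control back to the user.
model.InitChatPrompt("You are a helpful assistant.", "UTF-8");
model.InitChatAntiprompt(new[] { "User:" });
```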
### **Chat(String, String, String)**
Chat with the LLaMA model in interactive mode.
```csharp
public IEnumerable<string> Chat(string text, string prompt, string encoding)
```
#### Parameters
`text` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`prompt` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Returns
[IEnumerable<String>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.ienumerable-1)
#### Exceptions
[ArgumentException](https://docs.microsoft.com/en-us/dotnet/api/system.argumentexception)
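A minimal chat loop sketch. It assumes the model was configured for interactive mode and that the prompt has already been applied (for example via `InitChatPrompt`), so an empty prompt is passed here; output is streamed as the enumerable is consumed.
```csharp
foreach (var piece in model.Chat("Hello, how are you?", prompt: "", encoding: "UTF-8"))
{
    Console.Write(piece);   // text is yielded incrementally
}
```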
### **SaveState(String)**
Save the model state to the specified path.
```csharp
public void SaveState(string filename)
```
#### Parameters
`filename` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **LoadState(String, Boolean)**
Load the model state from the specified path.
```csharp
public void LoadState(string filename, bool clearPreviousEmbed)
```
#### Parameters
`filename` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`clearPreviousEmbed` [Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
Whether to clear the model's previously embedded context before loading the state.
#### Exceptions
[RuntimeError](./llama.exceptions.runtimeerror.md)
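A sketch of persisting and later restoring the model state; the file name is arbitrary.
```csharp
// Persist the current context to disk...
model.SaveState("model_state.bin");
// ...and restore it later, clearing the current context first.
model.LoadState("model_state.bin", clearPreviousEmbed: true);
```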
### **Tokenize(String, String)**
Tokenize a string.
```csharp
public List<int> Tokenize(string text, string encoding)
```
#### Parameters
`text` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
The utf-8 encoded string to tokenize.
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Returns
[List<Int32>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.list-1)
A list of tokens.
#### Exceptions
[RuntimeError](./llama.exceptions.runtimeerror.md)
Thrown if tokenization fails.
### **DeTokenize(IEnumerable<Int32>)**
Detokenize a list of tokens.
```csharp
public string DeTokenize(IEnumerable<int> tokens)
```
#### Parameters
`tokens` [IEnumerable<Int32>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.ienumerable-1)
The list of tokens to detokenize.
#### Returns
[String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
The detokenized string.
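A round-trip sketch of `Tokenize` and `DeTokenize`; the exact token ids depend on the loaded model.
```csharp
var tokens = model.Tokenize("Hello, world!", "UTF-8");   // List<int> of model-specific token ids
var text = model.DeTokenize(tokens);                     // should reproduce the original string
```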
### **Call(String, String)**
Call the model to run inference.
```csharp
public IEnumerable<string> Call(string text, string encoding)
```
#### Parameters
`text` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`encoding` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
#### Returns
[IEnumerable<String>](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.ienumerable-1)
#### Exceptions
[RuntimeError](./llama.exceptions.runtimeerror.md)
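A minimal inference sketch using `Call`; like `Chat`, the generated text is streamed lazily while the enumerable is iterated.
```csharp
foreach (var piece in model.Call("Tell me a story about a fox.", "UTF-8"))
{
    Console.Write(piece);
}
```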
### **Dispose()**
```csharp
public void Dispose()
```
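Because `LLamaModel` implements `IDisposable`, a `using` block (or an explicit `Dispose()` call) ensures the underlying native context is released when you are finished. Reusing the illustrative `modelParams` from the constructor example above:
```csharp
using (var model = new LLamaModel(modelParams, "my-model", verbose: false, encoding: "UTF-8"))
{
    // ... run inference, chat, etc. ...
}   // Dispose() runs here, freeing the native context handle
```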