# InteractiveExecutor
Namespace: LLama
The LLama executor for interactive (chat-style) mode, which keeps the conversation state between inference calls.
```csharp
public class InteractiveExecutor : StatefulExecutorBase, LLama.Abstractions.ILLamaExecutor
```
Inheritance [Object](https://docs.microsoft.com/en-us/dotnet/api/system.object) → [StatefulExecutorBase](./llama.statefulexecutorbase.md) → [InteractiveExecutor](./llama.interactiveexecutor.md)
Implements [ILLamaExecutor](./llama.abstractions.illamaexecutor.md)
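The example below is a minimal usage sketch, assuming the synchronous `Infer` method exposed by [ILLamaExecutor](./llama.abstractions.illamaexecutor.md) and the `ModelParams`/`InferenceParams` types from `LLama.Common`; the model path and sampling values are placeholders.
```csharp
using System;
using System.Collections.Generic;
using LLama;
using LLama.Common;

// Placeholder model path; replace with a real model file.
var model = new LLamaModel(new ModelParams("path/to/model.bin"));
var executor = new InteractiveExecutor(model);

var inferenceParams = new InferenceParams
{
    Temperature = 0.6f,
    AntiPrompts = new List<string> { "User:" }
};

// Interactive mode keeps the conversation state between calls,
// so each Infer call continues the same session.
foreach (var token in executor.Infer("User: Hello!\nAssistant:", inferenceParams))
{
    Console.Write(token);
}
```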
## Properties
### **Model**
The model used by the executor.
```csharp
public LLamaModel Model { get; }
```
#### Property Value
[LLamaModel](./llama.llamamodel.md)
## Constructors
### **InteractiveExecutor(LLamaModel)**
```csharp
public InteractiveExecutor(LLamaModel model)
```
#### Parameters
`model` [LLamaModel](./llama.llamamodel.md)
## Methods
### **GetStateData()**
```csharp
public ExecutorBaseState GetStateData()
```
#### Returns
[ExecutorBaseState](./llama.statefulexecutorbase.executorbasestate.md)
### **LoadState(ExecutorBaseState)**
```csharp
public void LoadState(ExecutorBaseState data)
```
#### Parameters
`data` [ExecutorBaseState](./llama.statefulexecutorbase.executorbasestate.md)
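A short sketch of an in-memory snapshot/restore round trip, assuming `executor` is an `InteractiveExecutor` constructed as in the usage example above:
```csharp
// Capture the executor's state after some conversation turns...
var snapshot = executor.GetStateData();

// ...generate more output...

// ...then roll the executor back to the captured snapshot.
executor.LoadState(snapshot);
```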
### **SaveState(String)**
```csharp
public void SaveState(string filename)
```
#### Parameters
`filename` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
### **LoadState(String)**
```csharp
public void LoadState(string filename)
```
#### Parameters
`filename` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
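The file-based overloads persist the same state to disk. In the sketch below the file name is illustrative and the on-disk format is defined by the executor:
```csharp
// Persist the conversation state at the end of a session.
executor.SaveState("chat-session.state");

// Later (for example after an application restart), restore the
// state before continuing the conversation.
executor.LoadState("chat-session.state");
```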
### **GetLoopCondition(InferStateArgs)**
Determines whether the generation loop should continue.
```csharp
protected bool GetLoopCondition(InferStateArgs args)
```
#### Parameters
`args` [InferStateArgs](./llama.statefulexecutorbase.inferstateargs.md)
#### Returns
[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
### **PreprocessInputs(String, InferStateArgs)**
```csharp
protected void PreprocessInputs(string text, InferStateArgs args)
```
#### Parameters
`text` [String](https://docs.microsoft.com/en-us/dotnet/api/system.string)
`args` [InferStateArgs](./llama.statefulexecutorbase.inferstateargs.md)
### **PostProcess(InferenceParams, InferStateArgs, IEnumerable&lt;String&gt;&)**
Returns whether generation should stop; any additional output produced during post-processing is returned through `extraOutputs`.
```csharp
protected bool PostProcess(InferenceParams inferenceParams, InferStateArgs args, out IEnumerable<string> extraOutputs)
```
#### Parameters
`inferenceParams` [InferenceParams](./llama.common.inferenceparams.md)
`args` [InferStateArgs](./llama.statefulexecutorbase.inferstateargs.md)
`extraOutputs` [IEnumerable&lt;String&gt;](https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.ienumerable-1) (out parameter)
#### Returns
[Boolean](https://docs.microsoft.com/en-us/dotnet/api/system.boolean)
### **InferInternal(InferenceParams, InferStateArgs)**
```csharp
protected void InferInternal(InferenceParams inferenceParams, InferStateArgs args)
```
#### Parameters
`inferenceParams` [InferenceParams](./llama.common.inferenceparams.md)
`args` [InferStateArgs](./llama.statefulexecutorbase.inferstateargs.md)
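The protected members above form the inference pipeline driven by [StatefulExecutorBase](./llama.statefulexecutorbase.md): `PreprocessInputs` prepares the prompt, `GetLoopCondition` keeps the loop running, `InferInternal` performs one generation step, and `PostProcess` decides when to stop. The sketch below shows a hypothetical subclass hooking into that pipeline; it assumes these members are overridable, which this page does not state.
```csharp
using System;
using LLama;

// Hypothetical subclass that logs each prompt before the normal
// interactive preprocessing runs. Illustrative only; not part of the
// documented API surface.
public class LoggingInteractiveExecutor : InteractiveExecutor
{
    public LoggingInteractiveExecutor(LLamaModel model) : base(model) { }

    protected override void PreprocessInputs(string text, InferStateArgs args)
    {
        Console.WriteLine($"[prompt] {text}");
        base.PreprocessInputs(text, args);
    }
}
```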