* Added a `Guidance` method to `LLamaTokenDataArray` which applies classifier-free guidance (a sketch of the arithmetic follows this list)
* Factored out a safer `llama_sample_apply_guidance` method based on spans
* Created a guided sampling demo using the batched executor
* Fixed a comment: "classifier free", not "context free"
* Rebased onto master and fixed breakage due to changes in `BaseSamplingPipeline`
* The guidance demo now asks the user for the guidance weight
* Added a progress bar to the batched fork demo
* Improved the fork example (results are now shown as a tree)
* Added proper disposal of resources in batched examples
* Added more comments in `BatchedExecutorGuidance`
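
The guidance mix itself is simple. Below is a minimal sketch of the classifier-free guidance arithmetic that the new `Guidance` method and the span-based `llama_sample_apply_guidance` wrapper are built around; the class and method names here are illustrative only, and it omits the log-softmax normalisation the native sampler may apply before mixing:

```csharp
using System;

internal static class GuidanceMath
{
    // Illustrative sketch only, not the library implementation. Logits from the
    // guided (positive prompt) context are pushed away from the logits of the
    // guidance (negative prompt) context by a user-supplied weight:
    //
    //     guided[i] = guidance[i] + weight * (guided[i] - guidance[i])
    //
    // A weight of 1 leaves the logits unchanged; larger weights strengthen the effect.
    public static void Apply(Span<float> logits, ReadOnlySpan<float> guidanceLogits, float weight)
    {
        if (logits.Length != guidanceLogits.Length)
            throw new ArgumentException("Logit spans must be the same length");

        for (var i = 0; i < logits.Length; i++)
            logits[i] = guidanceLogits[i] + weight * (logits[i] - guidanceLogits[i]);
    }
}
```
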
- Added BaseSamplingPipeline which provides a base impl of `ISamplingPipeline`
- Added `DefaultSamplingPipeline` which mimics normal llama.cpp sampling (see the toy sketch after this list)
- Cleaned up comments in implementations of `IInferenceParams`
- Removed default values for all parameters in `LLamaContext.Sample` - they're never used and probably _shouldn't_ ever be used
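
A sampling pipeline boils down to two stages: mutate the raw logits (penalties, biases), then pick a token from the result. The toy class below illustrates that split; the real `ISamplingPipeline`, `BaseSamplingPipeline` and `DefaultSamplingPipeline` types differ in names, signatures and the set of samplers they support, so treat this as a sketch of the idea only.

```csharp
using System;

// Toy illustration of the two-stage shape a sampling pipeline factors out.
// This is NOT the library API.
internal sealed class ToyPipeline
{
    private readonly float _repeatPenalty;

    public ToyPipeline(float repeatPenalty) => _repeatPenalty = repeatPenalty;

    // Stage 1: adjust raw logits, e.g. penalise recently generated tokens
    // (using the divide-positive / multiply-negative rule llama.cpp applies
    // for its repetition penalty).
    public void ProcessLogits(Span<float> logits, ReadOnlySpan<int> lastTokens)
    {
        foreach (var token in lastTokens)
            logits[token] = logits[token] > 0
                ? logits[token] / _repeatPenalty
                : logits[token] * _repeatPenalty;
    }

    // Stage 2: pick a token from the adjusted logits (greedy here; a normal
    // llama.cpp-style pipeline applies top-k/top-p/temperature before sampling).
    public int Sample(ReadOnlySpan<float> logits)
    {
        var best = 0;
        for (var i = 1; i < logits.Length; i++)
            if (logits[i] > logits[best])
                best = i;
        return best;
    }
}
```
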
* Added a bool-to-sbyte `Utils` converter
To avoid using any `MarshalAs` attribute (for .NET Framework support), this `Utils` method takes a bool and returns an sbyte: 1 for true, 0 for false.
* Changed all `MarshalAs` bool fields to sbyte
Changed every field previously declared as a bool with a `MarshalAs` attribute to an sbyte, and updated the corresponding setters to use the `Utils.BoolToSignedByte()` converter method.
* Fixed the `Utils` bool converter & added an sbyte-to-bool converter
Simplified the bool converter to a plain sbyte cast (removing the unneeded sbyte array), and added an sbyte-to-bool converter for the return trip, treating any value greater than zero as true and assuming no bools are bit-packed into the single-byte integer.
* Bool to & from sbyte conversions via properties
All one-byte bools are now handled where they "sit": public properties perform the conversions, so external callers keep communicating with ordinary C# bools as before (see the sketch below).
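
A minimal sketch of that property pattern; the struct and field names here are hypothetical, but the shape matches the change: the field that crosses the native boundary stays an sbyte (no `MarshalAs` needed), and a public property converts at the edge.

```csharp
using System.Runtime.InteropServices;

// Hypothetical struct illustrating the pattern; the type and field names are
// not taken from the library.
[StructLayout(LayoutKind.Sequential)]
public struct ExampleNativeParams
{
    // One-byte field exactly as native code sees it; no MarshalAs attribute
    // required, which keeps the struct friendly to .NET Framework marshalling.
    private sbyte _useMemoryMap;

    // Public property performing the conversion where the value "sits",
    // so managed callers keep working with an ordinary bool.
    public bool UseMemoryMap
    {
        get => _useMemoryMap > 0;                    // any value above zero is treated as true
        set => _useMemoryMap = value ? (sbyte)1 : (sbyte)0;
    }
}
```
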
- Moved the repeated code that converts a `LLamaTokenDataArray` into a `LLamaTokenDataArrayNative` into a helper method (a hedged sketch of the pattern follows this list)
- Modified all call sites to dispose the `MemoryHandle`
- Eliminated one copy of the `List<LLamaTokenData>` into a `LLamaTokenData[]` in `LlamaModel`
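
The conversion helper follows the usual pin-and-wrap pattern: pin the managed candidate buffer, build the native view over it, and return the `MemoryHandle` so the call site controls when the buffer is unpinned. A hedged sketch, with member names assumed rather than taken from the library:

```csharp
using System;
using System.Buffers;

internal static class TokenDataArrayExtensions
{
    // Hedged sketch of the conversion helper; the member names used here
    // (array.data, array.sorted, the native field names) are assumptions,
    // not the library's exact API.
    public static unsafe MemoryHandle ToNative(this LLamaTokenDataArray array, out LLamaTokenDataArrayNative native)
    {
        // Pin the managed candidate buffer so the GC cannot move it while
        // native code holds a raw pointer into it.
        var handle = array.data.Pin();

        native = new LLamaTokenDataArrayNative
        {
            data = (IntPtr)handle.Pointer,     // pointer into the pinned buffer
            size = (ulong)array.data.Length,   // number of candidate tokens
            sorted = array.sorted,             // preserve the sorted flag
        };

        // The caller owns the handle and must dispose it to unpin the buffer.
        return handle;
    }
}
```

A call site would typically wrap the conversion in a `using` (e.g. `using var handle = candidates.ToNative(out var native);`) so the pin is released as soon as the native sampling call returns, which is the `MemoryHandle` disposal the second bullet above refers to.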