Using the model with Langchain agents sometimes breaks because the special end of sequence token "</s>" is added at the end of the output

#4 by Chris126 - opened

I was playing with the model and a LangChain agent. I noticed that after some steps the model adds the special token "< /s>" at the end of the output, which makes the agent break. I am not an expert, so I did not get to the root cause of why the model adds it. Anyway, I added a simple parsing rule to remove this string from the output and now it works pretty well. I am just sharing this in case someone in the community was struggling with this model combined with a LangChain agent.
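Such a parsing rule can be as simple as stripping the token from the text before the agent's output parser sees it. Here is a minimal sketch in Python; the function name and the exact cleanup are illustrative assumptions, not the exact rule used above:

def strip_eos_token(text: str, eos_token: str = "</s>") -> str:
    # Remove the end-of-sequence marker and trim surrounding whitespace
    # so the agent's output parser only sees the actual completion.
    return text.replace(eos_token, "").strip()

# Hypothetical agent-style output, for illustration only
raw_output = 'Action: Search\nAction Input: "squirrel soccer poem"</s>'
cleaned = strip_eos_token(raw_output)
print(cleaned)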

Chris126 changed discussion title from The string "</s>" makes the model going crazy. to Sometimes the special token "</s>" is added at the end of the output

Here is a sample example (I added a white space in the token, otherwise it does not display). After emitting the token, the model keeps generating unrelated code:

Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

Instruction:

Write a poem about a squirrel who likes soccer

Response: < /s>

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace _031_MinimumNumberOfOperationsToReachTargetValue
{
    class Program
    {
        static void Main(string[] args)
        {
            int n = int.Parse(Console.ReadLine());
            int target = int.Parse(Console.ReadLine()) - (n / 2);
            Console.WriteLine("The minimum number of operations to reach " + target + " from " + n + " is " + GetMinOperations(target, n));
        }

        private static int GetMinOperations(int target, int n)
        {
            if (n == 0 || target < 0) return 0;

            List<int> list
Chris126 changed discussion title from Sometimes the special token "</s>" is added at the end of the output to Using the model with Langchain agents sometimes breaks because the special end of sequence token "</s>" is added at the end of the output

Forget my dumb remarks above. The issue was, first, my lack of knowledge of how it works and, second, a misconfiguration of the text-generation-webui API.
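For anyone else running into this, the fix on the webui side usually comes down to telling the backend to stop at (or strip) the EOS token in the generation request. A rough sketch in Python; the endpoint, parameter names, and response shape below follow the legacy text-generation-webui API and are assumptions that may differ between versions, so check the docs for your build:

import requests

payload = {
    "prompt": "Write a poem about a squirrel who likes soccer",
    "max_new_tokens": 200,
    # Assumed legacy webui parameters: decode without special tokens and
    # stop generation once the EOS marker is produced.
    "skip_special_tokens": True,
    "stopping_strings": ["</s>"],
}

resp = requests.post("http://localhost:5000/api/v1/generate", json=payload)
print(resp.json()["results"][0]["text"])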

OK, glad you got it sorted
