Introduction

A predictive weak LLM that translates user chat into a specific data transformation task. The model is fine-tuned on a curated training dataset of common transformation tasks collected in the wild.

Users can interact with the model via 1) direct chat; 2) providing example pairs; 3) describing patterns; or any mix of these inputs.

The model predicts the most suitable task and returns the corresponding operator and coding instructions.

This model can classify the following data transformation tasks:

  1. Format: related to value consistency without arithmetic relation, e.g., to lower case, ABC → abc.
  2. UnitConvert: transform regular metrics using a range of measurement unit scales, e.g., Hour → Minute, Kilogram → Pound.
  3. Extract: generally driven by Regex, e.g., ABC → BC.
  4. DomainCalculate: convert cross-domain values by calculation, often observed with numeric values, e.g., Unix timestamp → Local time with timezone.
  5. DomainMap: convert cross-domain values by a mapping relation, often observed with categorical values, e.g., Color RGB → Hex.
  6. Transform: the default, if none of the above apply.
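User input is assembled into the `### Instruction ###` / `### Examples ###` prompt layout shown in the examples below. A minimal sketch of such a prompt builder (`build_prompt` is a hypothetical helper, not part of the released model):

```python
def build_prompt(instruction, example_pairs=None):
    """Assemble a chat instruction and optional example pairs into the
    ### Instruction ### / ### Examples ### prompt layout.
    (build_prompt is a hypothetical helper, not part of this model.)"""
    lines = ["### Instruction ###", instruction]
    if example_pairs:
        lines += ["", "### Examples ###"]
        for inp, out in example_pairs:
            lines += [f"Input: {inp}", f"Output: {out}"]
    return "\n".join(lines)

prompt = build_prompt(
    "kgs to pounds, one digit after the decimal, rounding",
    [("2", "4.4"), ("3", "6.6")],
)
print(prompt)
```

The same helper covers all three interaction modes: direct chat (instruction only), example pairs only (empty instruction), or both.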

Examples

User chat + example-pair

  • Unit Conversion
### Instruction ###
kgs to pounds, one digit after the decimal, rounding

### Examples ###
Input: 2
Output: 4.4
Input: 3
Output: 6.6

unit_convert(): Convert kilograms to pounds, rounding to one decimal place
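The returned instruction maps directly onto a small operator. A minimal sketch, assuming a conversion factor of 1 kg ≈ 2.20462 lb (the function name follows the `unit_convert()` operator above):

```python
def unit_convert(kg, factor=2.20462, digits=1):
    """Convert kilograms to pounds, rounding to one decimal place.
    The conversion factor is an assumption (1 kg is about 2.20462 lb)."""
    return round(kg * factor, digits)

print(unit_convert(2))  # matches the example pair: 4.4
print(unit_convert(3))  # matches the example pair: 6.6
```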

  • Month number to name
### Instruction ###
convert month number to month name

### Examples ###
Input: 7
Output: July
Input: 12
Output: December

domain_map(): Convert a month number to its corresponding month name.
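A hypothetical implementation of this `domain_map()` operator; unlike `unit_convert()`, it uses a fixed lookup table (a mapping relation) rather than arithmetic:

```python
import calendar

def domain_map(month_number):
    """Map a month number (1-12) to its English month name via the
    standard library's lookup table -- a mapping, not a calculation."""
    return calendar.month_name[month_number]

print(domain_map(7))   # matches the example pair: July
print(domain_map(12))  # matches the example pair: December
```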

Model details

  • Format: Safetensors
  • Model size: 8.03B params
  • Tensor type: BF16

Model tree for Ti-ger/llama3_lora_dt_chat

This model is an adapter (one of 673 adapters listed for its base model).