CLinker
The CLinker models are distilled language models developed by CyCraft AI Lab, designed specifically for command-line graph construction. CLinker was introduced at SINCON 2025 in a talk titled "CLINKER — An Efficient Distilled LLM Command Line Graph Constructor".
Usage
Launch an OpenAI-compatible server (e.g., vLLM)
python3 -m vllm.entrypoints.openai.api_server \
    --host 0.0.0.0 \
    --port 3000 \
    --served-model-name $model_name \
    --max-model-len $length \
    --api-key $api_key \
    --model $model_path
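Before wiring the endpoint into DSPy, you can optionally check that the server is reachable. The snippet below is a minimal sketch that assumes the openai Python client is installed and that api_key holds the same key passed to the server; it simply lists the models served at the endpoint.

from openai import OpenAI

# Point the client at the local vLLM server started above.
client = OpenAI(base_url='http://localhost:3000/v1', api_key=api_key)

# The name passed via --served-model-name should appear in this list.
for model in client.models.list():
    print(model.id)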
DSPy inference
import dspy

# Set the dspy module default LM.
# `model_name` and `api_key` must match the values passed to the vLLM server above.
lm = dspy.LM(
    model=f'openai/{model_name}',
    api_key=api_key,
    api_base='http://localhost:3000/v1',
    model_type='chat',
    temperature=0.7,
    max_tokens=4000,
    cache=False,
    num_retries=0,
)
dspy.configure(lm=lm)
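# Optional sanity check (illustrative, not part of the original card):
# dspy.LM instances are callable, so a single prompt can be sent through the
# OpenAI-compatible endpoint configured above; the call returns a list of completions.
print(lm('Reply with the single word: ok'))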
from command_parser import CmdlineParser, CoTCmdlineParser
from command_extractor import CmdlineExtractor, CoTCmdlineExtractor
cmdline = 'echo hello world'
# The reasoning model `CLinker-DeepSeek-1.5B` uses the non-chain-of-thought prompt
parser = CmdlineParser()
extractor = CmdlineExtractor()
# Non-reasoning models use the chain-of-thought prompt
parser = CoTCmdlineParser()
extractor = CoTCmdlineExtractor()
# Run inference
parser_response = parser(cmdline).toDict()
extractor_response = extractor(cmdline).toDict()
# The parser `response` field is a pydantic.BaseModel; convert it to a plain dict
parser_response['response'] = parser_response['response'].model_dump(mode='json')
print(parser_response)
print(extractor_response)
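Batching follows the same pattern; the loop below is an illustrative sketch, and the extra command lines in it are placeholders rather than examples from the model card.

# Illustrative batching sketch; the command lines below are placeholders.
cmdlines = [
    'echo hello world',
    'whoami',
]

batched = []
for c in cmdlines:
    p = parser(c).toDict()
    p['response'] = p['response'].model_dump(mode='json')
    e = extractor(c).toDict()
    batched.append({'cmdline': c, 'parser': p, 'extractor': e})

print(batched)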
Graph construction
import networkx as nx

from command_graph_builder import build_cmdline_graph

graph: nx.DiGraph = build_cmdline_graph(cmdline, parser_response, extractor_response)
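The returned object is a standard networkx DiGraph, so it can be inspected with the usual networkx accessors. The node and edge attributes depend on how build_cmdline_graph populates them, so the printout below is only illustrative.

# Inspect the constructed command-line graph.
print(graph.number_of_nodes(), 'nodes;', graph.number_of_edges(), 'edges')

# Attribute contents depend on build_cmdline_graph; print whatever is attached.
for node, attrs in graph.nodes(data=True):
    print('node:', node, attrs)
for src, dst, attrs in graph.edges(data=True):
    print('edge:', src, '->', dst, attrs)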