arxiv:2503.02152

Tabby: Tabular Data Synthesis with Language Models

Published on Mar 4 · Submitted by sonicc on Mar 5

Abstract

While advances in large language models (LLMs) have greatly improved the quality of synthetic text data in recent years, synthesizing tabular data has received relatively less attention. We address this disparity with Tabby, a simple but powerful post-training modification to the standard Transformer language model architecture, enabling its use for tabular dataset synthesis. Tabby enables the representation of differences across columns using Gated Mixture-of-Experts, with column-specific sets of parameters. Empirically, Tabby results in data quality near or equal to that of real data. By pairing our novel LLM table training technique, Plain, with Tabby, we observe up to a 44% improvement in quality over previous methods. We also show that Tabby extends beyond tables to more general structured data, reaching parity with real data on a nested JSON dataset as well.
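
To make the Gated Mixture-of-Experts idea concrete, below is a minimal PyTorch sketch of a layer with one column-specific set of expert parameters and hard gating by column index. The names (ColumnMoE, n_cols, and so on) are illustrative assumptions, not the authors' implementation, which applies this kind of modification inside the blocks of a pre-trained Transformer.

```python
# Minimal sketch (not the Tabby codebase): a Gated Mixture-of-Experts
# layer with one feed-forward expert per table column, hard-gated by
# the index of the column currently being generated.
import torch
import torch.nn as nn

class ColumnMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, n_cols: int):
        super().__init__()
        # One expert (its own parameter set) per table column.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.GELU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(n_cols)
        ])

    def forward(self, hidden: torch.Tensor, col: int) -> torch.Tensor:
        # Route every token of the current column to that column's expert.
        return self.experts[col](hidden)

# Usage: while generating column 2's tokens, gate to expert 2.
layer = ColumnMoE(d_model=768, d_ff=3072, n_cols=4)
h = torch.randn(1, 10, 768)  # (batch, seq_len, d_model)
out = layer(h, col=2)
```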

Community

Paper author · Paper submitter

Tabby is an architecture modification to pre-trained LLMs, enabling their use for tabular dataset synthesis. Our evaluations indicate that Tabby achieves SOTA synthesis quality and even reaches parity with real, non-synthetic data on 3 of 6 datasets. Please enjoy! 🐈


🧠 Blog: https://sprocketlab.github.io/posts/2025/02/tabby/
📄 Paper: https://arxiv.org/abs/2503.02152
🤗 HuggingFace checkpoint: https://huggingface.co/sonicc/tabby-distilgpt2-diabetes
👾 GitHub (with demo notebook): https://github.com/soCromp/tabby
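
The demo notebook in the GitHub repo shows the actual loading and sampling path; because Tabby modifies the model architecture, its checkpoints are not expected to load through vanilla transformers classes. As a rough orientation only, the sketch below shows the general serialize-sample-decode loop using the unmodified distilgpt2 base model; the "age:" prompt prefix is a hypothetical row-serialization format, not necessarily the one Tabby uses.

```python
# Orientation sketch only: real Tabby usage goes through the classes in
# https://github.com/soCromp/tabby (see the demo notebook). This uses
# the plain distilgpt2 base model, and the prompt format is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

prompt = "age:"  # hypothetical column prefix for a serialized row
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(
    ids,
    max_new_tokens=32,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tok.eos_token_id,  # distilgpt2 has no pad token
)
print(tok.decode(out[0], skip_special_tokens=True))  # one sampled row
```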

(Figure: overview of the Tabby method)

