---
license: openrail
datasets:
  - Binarybardakshat/SVLM-ACL-DATASET
language:
  - en
library_name: transformers
tags:
  - code
---

# SVLM: A Question-Answering Model for ACL Research Papers

This model, SVLM, is designed to answer questions based on research papers from the ACL dataset. It leverages the BART architecture to generate precise answers from scientific abstracts.

## Model Details

- **Model Architecture:** BART (Bidirectional and Auto-Regressive Transformers)
- **Framework:** TensorFlow
- **Dataset:** [Binarybardakshat/SVLM-ACL-DATASET](https://huggingface.co/datasets/Binarybardakshat/SVLM-ACL-DATASET) (see the loading sketch below)
- **Author:** @binarybard (Akshat Shukla)
- **Purpose:** The model is trained to answer questions drawn from the ACL research paper dataset.
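If you want to inspect the training data, a minimal sketch using the `datasets` library is shown below. The split and column names are assumptions; check the dataset card for the actual schema.

```python
from datasets import load_dataset

# Load the ACL question-answering dataset used to train SVLM.
dataset = load_dataset("Binarybardakshat/SVLM-ACL-DATASET")

# Inspect the available splits and an example record.
# The "train" split name is an assumption; adjust to the dataset's actual splits.
print(dataset)
print(dataset["train"][0])
```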

## Usage

To use this model with the Hugging Face `transformers` library:

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("Binarybardakshat/SVLM")
model = TFAutoModelForSeq2SeqLM.from_pretrained("Binarybardakshat/SVLM")

# Example input
input_text = "What is the main contribution of the paper titled 'Your Paper Title'?"

# Tokenize input
inputs = tokenizer(input_text, return_tensors="tf", padding=True, truncation=True)

# Generate answer
outputs = model.generate(inputs.input_ids, max_length=50, num_beams=5, early_stopping=True)
answer = tokenizer.decode(outputs[0], skip_special_tokens=True)

print("Answer:", answer)
```