Binarybardakshat committed on
Commit 3d08de3
1 Parent(s): 4f50729

Update README.md

Files changed (1)
  1. README.md +42 -3
README.md CHANGED
@@ -1,3 +1,42 @@
- ---
- license: openrail
- ---
+ ---
+ license: openrail
+ datasets:
+ - Binarybardakshat/SVLM-ACL-DATASET
+ language:
+ - en
+ library_name: transformers
+ ---
+ # SVLM: A Question-Answering Model for ACL Research Papers
+
+ This model, `SVLM`, is designed to answer questions based on research papers from the ACL dataset. It leverages the BART architecture to generate precise answers from scientific abstracts.
+
+ ## Model Details
+
+ - **Model Architecture:** BART (Bidirectional and Auto-Regressive Transformers)
+ - **Framework:** TensorFlow
+ - **Dataset:** [Binarybardakshat/SVLM-ACL-DATASET](https://huggingface.co/datasets/Binarybardakshat/SVLM-ACL-DATASET) (see the loading sketch below)
+ - **Author:** @binarybardakshat (Akshat Shukla)
+ - **Purpose:** The model is trained to answer questions about papers in the ACL research paper dataset.
+
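The dataset's split and column names are not documented on this card, so the sketch below (assuming only that the dataset is publicly available on the Hub) loads it with the `datasets` library and prints its structure for inspection:

```python
from datasets import load_dataset

# Load the ACL question-answering dataset from the Hugging Face Hub.
dataset = load_dataset("Binarybardakshat/SVLM-ACL-DATASET")

# Inspect the available splits and their column names before building inputs.
print(dataset)
```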
+ ## Usage
+
+ To use this model with the Hugging Face Transformers library:
+
+ ```python
+ from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM
+
+ # Load the model and tokenizer
+ tokenizer = AutoTokenizer.from_pretrained("binarybardakshat/SVLM")
+ model = TFAutoModelForSeq2SeqLM.from_pretrained("binarybardakshat/SVLM")
+
+ # Example input
+ input_text = "What is the main contribution of the paper titled 'Your Paper Title'?"
+
+ # Tokenize input
+ inputs = tokenizer(input_text, return_tensors="tf", padding=True, truncation=True)
+
+ # Generate answer
+ outputs = model.generate(inputs.input_ids, max_length=50, num_beams=5, early_stopping=True)
+ answer = tokenizer.decode(outputs[0], skip_special_tokens=True)
+
+ print("Answer:", answer)
+ ```