## Introduction

This is ItbearZhang/facebook-opt-125m-with-alpacadataset, a model fine-tuned from the facebook/opt-125m pretrained language model on the Stanford Alpaca dataset from tatsu-lab/stanford_alpaca (code and documentation to train Stanford's Alpaca models and generate the data).

## How to use it for inference

```python
from transformers import pipeline

# Load the fine-tuned model as a text-generation pipeline
generator = pipeline("text-generation", model="ItbearZhang/facebook-opt-125m-with-alpacadataset")

def generate_by_pipeline(instruction, inputs=""):
    # Wrap the instruction (and optional input) in the Alpaca prompt template
    if inputs == "":
        prompt = f"### Instruction:\n{instruction}\n\n### Response:"
    else:
        prompt = f"### Instruction:\n{instruction}\n\n### Input:\n{inputs}\n\n### Response:"
    return generator(prompt)[0]['generated_text']

print(generate_by_pipeline("What is the capital of China?"))
```
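If you prefer to call the model directly rather than through the pipeline, a minimal sketch along these lines should also work; the generation settings (e.g. `max_new_tokens`) are illustrative assumptions, not values recommended by this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from this checkpoint
tokenizer = AutoTokenizer.from_pretrained("ItbearZhang/facebook-opt-125m-with-alpacadataset")
model = AutoModelForCausalLM.from_pretrained("ItbearZhang/facebook-opt-125m-with-alpacadataset")

# Build the same Alpaca-style prompt used above
prompt = "### Instruction:\nWhat is the capital of China?\n\n### Response:"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens=64 is an arbitrary illustrative choice
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```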