There's a typo in Readme

#32
by Kurapika993 - opened

Model description
The README says BART is a "transformer encoder-encoder (seq2seq)" model; it should say "encoder-decoder".

"BART is a transformer encoder-encoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder."

Why hasn't this been fixed yet?
