# Model documentation & parameters
**Model type**: Type of PGT model to be used:
- `PGTGenerator`: A model for part-of-patent generation.
- `PGTEditor`: An algorithm for part-of-patent editing.
- `PGTCoherenceChecker`: An algorithm for patent coherence checking.
**Generator task**: Task to perform when the `PGTGenerator` model is used (a usage sketch follows this list). Options are:
- `title-to-abstract`
- `abstract-to-title`
- `abstract-to-claim`
- `claim-to-abstract`
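As a concrete illustration, below is a minimal sketch of how a generation task might be run through the GT4SD Python API behind this app. The import path follows the usual GT4SD convention for this algorithm family, and the keyword argument names (`task`, `input_text`) mirror the parameters documented on this page; both are assumptions, so check the GT4SD documentation of your installed version for the exact signature.

```python
# Minimal sketch (assumed API): generate a patent abstract from a title
# with the PGTGenerator configuration. Argument names mirror the parameters
# documented above and may differ in your GT4SD version.
from gt4sd.algorithms.generation.pgt import PGT, PGTGenerator

configuration = PGTGenerator(
    task="title-to-abstract",  # one of the generator tasks listed above
    input_text="Method for recycling lithium-ion battery electrodes",  # hypothetical title
)
generator = PGT(configuration=configuration)

# GT4SD generator algorithms expose a sample() iterator over generated items.
for abstract in generator.sample(1):
    print(abstract)
```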
**Editor task**: Task to perform when the `PGTEditor` model is used (a usage sketch follows this list). Options are:
- `abstract`
- `claim`
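The editing flow is analogous. Here is a hedged sketch assuming the `PGTEditor` configuration takes the part type to edit and the text to be edited; the field names `input_type` and `input_text` are assumptions.

```python
# Minimal sketch (assumed API): edit/complete a patent abstract with PGTEditor.
from gt4sd.algorithms.generation.pgt import PGT, PGTEditor

configuration = PGTEditor(
    input_type="abstract",  # editor task: "abstract" or "claim"
    input_text="The invention relates to an apparatus for water purification comprising",  # hypothetical fragment
)
editor = PGT(configuration=configuration)

for edited_abstract in editor.sample(1):
    print(edited_abstract)
```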
**Coherence task**: Task to perform when the `PGTCoherenceChecker` model is used (a usage sketch follows this list). Options are:
- `title-abstract`
- `title-claim`
- `abstract-claim`
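Coherence checking compares two parts of a patent, which correspond to the primary and secondary text prompts documented below. This sketch assumes the two inputs are passed as separate fields (`input_a`, `input_b`); the actual field names may differ in your GT4SD version.

```python
# Minimal sketch (assumed API): check whether a title and an abstract are coherent,
# i.e. plausibly belong to the same patent. Field names are assumptions.
from gt4sd.algorithms.generation.pgt import PGT, PGTCoherenceChecker

configuration = PGTCoherenceChecker(
    coherence_type="title-abstract",  # one of the coherence tasks listed above
    input_a="Rechargeable battery with solid-state electrolyte",           # primary prompt (hypothetical)
    input_b="The invention relates to an energy storage cell comprising",  # secondary prompt (hypothetical)
)
checker = PGT(configuration=configuration)

# The checker yields a coherence verdict for the pair of inputs.
for verdict in checker.sample(1):
    print(verdict)
```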
**Primary text prompt**: The main text prompt for the model.
**Secondary text prompt**: The secondary text prompt for the model (only used for `PGTCoherenceChecker`).
**Maximal length**: The maximal number of tokens in the generated sequences.
**Top-k**: Number of highest-probability tokens to keep at each sampling step.
**Top-p**: Only the smallest set of tokens whose cumulative probability reaches this value is kept (nucleus sampling; see the illustration below).
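To make the decoding parameters concrete, the following self-contained snippet (plain Python, independent of GT4SD) illustrates how top-k and top-p (nucleus) filtering restrict the candidate tokens before sampling; the maximal length simply caps how many tokens are generated.

```python
# Illustration only: top-k and top-p (nucleus) filtering over a toy next-token distribution.
token_probs = {"battery": 0.40, "cell": 0.25, "method": 0.15, "device": 0.12, "the": 0.08}

def top_k_filter(probs, k):
    """Keep only the k most probable tokens."""
    return dict(sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k])

def top_p_filter(probs, p):
    """Keep the smallest set of tokens whose cumulative probability reaches p."""
    kept, total = {}, 0.0
    for token, prob in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[token] = prob
        total += prob
        if total >= p:
            break
    return kept

print(top_k_filter(token_probs, k=2))    # {'battery': 0.4, 'cell': 0.25}
print(top_p_filter(token_probs, p=0.8))  # {'battery': 0.4, 'cell': 0.25, 'method': 0.15}
```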
# Model card -- PatentGenerativeTransformer
**Model Details**: Patent Generative Transformer (PGT), a transformer-based multitask language model trained to facilitate the patent generation process. Published by [Christofidellis et al. (*ICML 2022 Workshop KRLM*)](https://openreview.net/forum?id=dLHtwZKvJmE).
**Developers**: Dimitrios Christofidellis and colleagues at IBM Research.
**Distributors**: Model natively integrated into GT4SD.
**Model date**: 2022.
**Model type**:
- `PGTGenerator`: A model for part-of-patent generation.
- `PGTEditor`: An algorithm for part-of-patent editing.
- `PGTCoherenceChecker`: An algorithm for patent coherence checking.
**Information about training algorithms, parameters, fairness constraints or other applied approaches, and features**:
N.A.
**Paper or other resource for more information**:
The Patent Generative Transformer (PGT) [paper by Christofidellis et al. (*ICML 2022 Workshop KRLM*)](https://openreview.net/forum?id=dLHtwZKvJmE).
**License**: MIT
**Where to send questions or comments about the model**: Open an issue on the [GT4SD repository](https://github.com/GT4SD/gt4sd-core).
**Intended Use. Use cases that were envisioned during development**: N.A.
**Primary intended uses/users**: N.A.
**Out-of-scope use cases**: Production-level inference; generating patent text with harmful or misleading content.
**Metrics**: N.A.
**Datasets**: N.A.
**Ethical Considerations**: Unclear, please consult with original authors in case of questions.
**Caveats and Recommendations**: Unclear, please consult with original authors in case of questions.
Model card prototype inspired by [Mitchell et al. (2019)](https://dl.acm.org/doi/abs/10.1145/3287560.3287596).
## Citation
```bibtex
@inproceedings{christofidellis2022pgt,
  title={PGT: a prompt based generative transformer for the patent domain},
  author={Christofidellis, Dimitrios and Torres, Antonio Berrios and Dave, Ashish and Roveri, Manuel and Schmidt, Kristin and Swaminathan, Sarath and Vandierendonck, Hans and Zubarev, Dmitry and Manica, Matteo},
  booktitle={ICML 2022 Workshop on Knowledge Retrieval and Language Models},
  year={2022}
}
```