---
library_name: transformers
license: gemma
datasets:
- haesleinhuepf/bio-image-analysis-qa
---
# Model Card for gemma-2b-it-bia-proof-of-concept2
This is a proof-of-concept model. It is not properly trained. Do not use it for anything.
Gemma is provided under and subject to the Gemma Terms of Use found at [ai.google.dev/gemma/terms](https://ai.google.dev/gemma/terms/).
## Model Details
### Model Description
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** Gemma license
- **Finetuned from model [optional]:** google/gemma-2b-it
## How to Get Started with the Model
Use the code below to get started with the model.
```python
def prompt_hf(request, model="haesleinhuepf/gemma-2b-it-bia-proof-of-concept2"):
    """Send a prompt to the model and return the generated text."""
    global prompt_hf
    import transformers
    import torch

    # Create the text-generation pipeline once and cache it on the function object.
    if prompt_hf._pipeline is None:
        prompt_hf._pipeline = transformers.pipeline(
            "text-generation",
            model=model,
            model_kwargs={"torch_dtype": torch.bfloat16},
            device_map="auto",
        )

    return prompt_hf._pipeline(request)[0]['generated_text']

prompt_hf._pipeline = None

prompt_hf("What is the capital of France?")
```
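Alternatively, the model can be loaded with the lower-level `transformers` API. The sketch below is an assumption-based example: the chat template is assumed to be inherited from google/gemma-2b-it, and the example question is illustrative, not from the original card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "haesleinhuepf/gemma-2b-it-bia-proof-of-concept2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build a Gemma-style chat prompt via the tokenizer's chat template (assumed to be present).
messages = [{"role": "user", "content": "How can I open a TIF image in Python?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```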
## Training Details
### Training Data
The model was fine-tuned on the [haesleinhuepf/bio-image-analysis-qa](https://huggingface.co/datasets/haesleinhuepf/bio-image-analysis-qa) dataset.
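To inspect the training data, the dataset can be loaded with the 🤗 `datasets` library. This is a minimal sketch; the split and column names are assumptions, so check the dataset card.

```python
from datasets import load_dataset

# Load the bio-image-analysis question-answer dataset from the Hub.
ds = load_dataset("haesleinhuepf/bio-image-analysis-qa")

print(ds)              # shows the available splits and columns
print(ds["train"][0])  # assumption: a "train" split exists; print the first example
```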
## Model Card Contact
robert dot haase at uni minus leipzig dot de