---
license: other
license_name: bespoke-lora-trained-license
license_link: https://multimodal.art/civitai-licenses?allowNoCredit=True&allowCommercialUse=Image&allowDerivatives=True&allowDifferentLicense=True
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- migrated
- celebrity

base_model: black-forest-labs/FLUX.1-dev
instance_prompt: 
widget:
- text: ' , female , posing for photographer, thin, tall,, smiling, happy,'
  output:
    url: 35262296.jpeg
- text: ' , female , posing for photographer, thin, tall,, smiling, happy,'
  output:
    url: 35262293.jpeg

---

# Brooke Shields 

<Gallery />



([CivitAI](https://civitai.com/models/))

## Model description

<p>One more girl I used to crush on when I was younger.</p><p>Follow me for more!</p><p>Niko3DX</p>



## Download model

Weights for this model are available in Safetensors format.

[Download](/Keltezaa/brooke-shields/tree/main) them in the Files & versions tab.

## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)

```py
from diffusers import AutoPipelineForText2Image
import torch

# Use the GPU when available; FLUX.1-dev is large, and CPU inference will be very slow.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the FLUX.1-dev base model, then apply this LoRA on top of it.
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to(device)
pipeline.load_lora_weights('Keltezaa/brooke-shields', weight_name='brooke-shields.safetensors')

# Generate an image with the example widget prompt.
image = pipeline(' , female , posing for photographer, thin, tall,, smiling, happy,').images[0]
```

For more details, including weighting, merging, and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters).