jbloom committed
Commit f09894d
1 Parent(s): 18fcaa2

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -15,11 +15,11 @@ They are loadable using SAE via a few methods. The preferred method is to use th
  ```python
  import torch
  from transformer_lens import HookedTransformer
- from sae_lens import SparseAutoencoder, ActivationsStore
+ from sae_lens import SAE, ActivationsStore
 
  torch.set_grad_enabled(False)
  model = HookedTransformer.from_pretrained("gemma-2b")
- sae, cfg, sparsity = SparseAutoencoder.from_pretrained(
+ sae, cfg, sparsity = SAE.from_pretrained(
      "gemma-2b-it-res-jb", # to see the list of available releases, go to: https://github.com/jbloomAus/SAELens/blob/main/sae_lens/pretrained_saes.yaml
      "blocks.12.hook_resid_post" # change this to another specific SAE ID in the release if desired.
  )