oklingefjord committed on
Commit 3ef0f59 · verified · 1 Parent(s): 7b21aaf

Update README.md

Files changed (1): README.md (+12 −1)
README.md CHANGED
@@ -7,4 +7,15 @@ sdk: gradio
 pinned: false
 ---
 
-Edit this `README.md` markdown file to author your organization card.
+# Meaning Alignment Institute (MAI)
+
+### Mission
+MAI works to align artificial intelligence with human values and meaning. It focuses on developing Wise AI, which integrates practical wisdom to make decisions that benefit humanity, and on exploring post-AGI futures centered on human flourishing.
+
+### Research Highlights
+- **Moral Graph Elicitation (MGE):** A process for crowdsourcing moral wisdom and fine-tuning models accordingly.
+- **Post-AGI Mechanisms:** New institutions and LLM-driven systems that prioritize deeper human values over surface-level preferences.
+
+### Get Involved
+Visit [MAI's website](https://www.meaningalignment.org/) to explore its work or contribute.
+