|
<!--
title: OpenFactCheck
emoji: ✅
colorFrom: green
colorTo: purple
sdk: streamlit
app_file: src/openfactcheck/app/app.py
pinned: false
-->
|
|
|
<p align="center"> |
|
<img alt="OpenFactCheck Logo" src="assets/splash.png" height="120" /> |
|
<h3 align="center" style="color:SlateBlue;">OpenFactCheck</h3> |
|
<p align="center">An Open-source Factuality Evaluation Demo for LLMs</p> |
|
</p> |
|
|
|
--- |
|
|
|
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
|
![Python 3.11](https://img.shields.io/badge/python-3.11-blue.svg) |
|
|
|
## Overview |
|
OpenFactCheck is an open-source framework for evaluating and improving the factuality of responses generated by large language models (LLMs). It aims to integrate various fact-checking tools into a unified framework and to provide comprehensive evaluation pipelines.
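
Since the Space metadata above declares a Streamlit SDK with `src/openfactcheck/app/app.py` as the app file, the demo can in principle be launched locally with Streamlit. The commands below are a minimal sketch, assuming the repository has been cloned and its Python dependencies are installed; consult the project's own setup instructions for the authoritative steps.

```bash
# Install Streamlit if it is not already available
# (the project's full dependency list is assumed to be installed separately).
pip install streamlit

# From the repository root, run the demo app declared as app_file in the Space metadata.
streamlit run src/openfactcheck/app/app.py
```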
|
|