Hasan Iqbal committed

Commit 9620c5d
Parent(s): ec5f465

Centered the badges

Files changed (1):
  1. README.md (+15 −4)

README.md CHANGED
@@ -15,10 +15,21 @@ pinned: false
 
 ---
 
-[![Build and Test](https://img.shields.io/github/actions/workflow/status/hasaniqbal777/openfactcheck/main.yaml)](https://github.com/hasaniqbal777/OpenFactCheck/actions/workflows/main.yaml)
-[![License: Apache-2.0](https://img.shields.io/github/license/hasaniqbal777/openfactcheck)](https://opensource.org/licenses/Apache-2.0)
-[![Python Version](https://img.shields.io/pypi/pyversions/openfactcheck.svg)](https://pypi.org/project/openfactcheck/)
-[![PyPI Latest Release](https://img.shields.io/pypi/v/openfactcheck.svg)](https://pypi.org/project/openfactcheck/)
+<p align="center">
+  <a href="https://github.com/hasaniqbal777/OpenFactCheck/actions/workflows/main.yaml">
+    <img src="https://img.shields.io/github/actions/workflow/status/hasaniqbal777/openfactcheck/main.yaml" alt="Build and Test">
+  </a>
+  <a href="https://opensource.org/licenses/Apache-2.0">
+    <img src="https://img.shields.io/github/license/hasaniqbal777/openfactcheck" alt="License: Apache-2.0">
+  </a>
+  <a href="https://pypi.org/project/openfactcheck/">
+    <img src="https://img.shields.io/pypi/pyversions/openfactcheck.svg" alt="Python Version">
+  </a>
+  <a href="https://pypi.org/project/openfactcheck/">
+    <img src="https://img.shields.io/pypi/v/openfactcheck.svg" alt="PyPI Latest Release">
+  </a>
+</p>
 
 ## Overview
+
 OpenFactCheck is an open-source repository designed to facilitate the evaluation and enhancement of factuality in responses generated by large language models (LLMs). This project aims to integrate various fact-checking tools into a unified framework and provide comprehensive evaluation pipelines.