# calibrationgame

Calibration game is a game for getting better at identifying hallucinations in LLM outputs.

Prompts and hallucination labels (generated with ChatGPT) are obtained from the Alpaca and HaluEval datasets. You can substitute your own dataset to calibrate users against the responses of a different LLM.
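As a minimal sketch of what a custom dataset might look like, the snippet below parses a JSONL file of labeled responses. The field names (`prompt`, `response`, `is_hallucination`) are illustrative assumptions, not the project's actual schema:

```python
import json

# Hypothetical JSONL format: one record per line, each with a prompt,
# an LLM response, and a boolean hallucination label.
# Field names here are assumptions for illustration only.
SAMPLE = """\
{"prompt": "Who wrote Hamlet?", "response": "Christopher Marlowe wrote Hamlet.", "is_hallucination": true}
{"prompt": "Who wrote Hamlet?", "response": "William Shakespeare wrote Hamlet.", "is_hallucination": false}
"""

def load_examples(text):
    """Parse JSONL text into (prompt, response, label) tuples."""
    examples = []
    for line in text.strip().splitlines():
        record = json.loads(line)
        examples.append(
            (record["prompt"], record["response"], record["is_hallucination"])
        )
    return examples

examples = load_examples(SAMPLE)
print(len(examples))  # → 2
```

Any dataset that pairs model responses with ground-truth hallucination labels could be loaded this way and fed to the game in place of the Alpaca/HaluEval data.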