Hallucination Detection Service (ALPHA)

Compare LLM-generated output against evidence retrieved from Wikipedia.

A full-stack claim verification system that extracts claims from LLM output, resolves contextual references, retrieves external evidence from Wikipedia, and evaluates support using semantic similarity and numeric consistency checks.
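The final evaluation step described above can be sketched in miniature. This is a hedged, self-contained illustration, not the service's actual implementation: the real system would use a sentence-embedding model for semantic similarity, whereas here a simple bag-of-words cosine stands in for it, and the 5% numeric tolerance and 0.5 support threshold are assumed values chosen for the example.

```python
import re
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity. A stand-in for a real
    sentence-embedding model; the actual service would embed
    both texts and compare the vectors the same way."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * \
           math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

def numbers_consistent(claim: str, evidence: str, tol: float = 0.05) -> bool:
    """Numeric consistency check: every number in the claim must match
    some number in the evidence within a relative tolerance
    (5% here is an assumed default, not the service's setting)."""
    nums = lambda s: [float(n) for n in re.findall(r"\d+(?:\.\d+)?", s)]
    ev = nums(evidence)
    return all(
        any(abs(c - e) <= tol * max(abs(e), 1.0) for e in ev)
        for c in nums(claim)
    )

def claim_supported(claim: str, evidence: str, threshold: float = 0.5) -> bool:
    """A claim counts as supported only if it is both semantically
    similar to the evidence and numerically consistent with it."""
    return (cosine_similarity(claim, evidence) >= threshold
            and numbers_consistent(claim, evidence))
```

For example, the claim "The Eiffel Tower is 330 meters tall" passes both checks against evidence stating the tower is 330 metres tall, while a claim of 250 meters fails the numeric check even though the surrounding wording is similar.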

UPDATE: This project must be run locally. Render's free tier (the cloud host) does not allocate enough memory, and paid backend hosting is too expensive for a project of this scale.

I am keeping this portfolio frontend up so recruiters can see what was intended and what is already set up. However, the Analyze step will time out or return an error because it exceeds the free tier's RAM limit.

Input Text

Paste a response from ChatGPT, Gemini, Claude, or another LLM.