Hallucination Detection Service (ALPHA)
Compare LLM-generated output against evidence retrieved from Wikipedia.
A full-stack claim verification system that extracts claims from LLM output, resolves contextual references, retrieves external evidence from Wikipedia, and evaluates support using semantic similarity and numeric consistency checks.
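The pipeline described above can be sketched end to end. This is a minimal, self-contained illustration only: the real service would use an LLM for claim extraction and coreference resolution, the Wikipedia API for retrieval, and an embedding model for semantic similarity. Here, sentence splitting, bag-of-words cosine similarity, and a regex-based numeric check stand in for those components, and all function names are hypothetical.

```python
import math
import re
from collections import Counter

def extract_claims(text: str) -> list[str]:
    # Naive claim extraction: split the LLM output on sentence boundaries.
    # The production system would use an LLM extractor instead.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def semantic_similarity(claim: str, evidence: str) -> float:
    # Bag-of-words cosine similarity as a stand-in for embedding similarity.
    va = Counter(re.findall(r"\w+", claim.lower()))
    vb = Counter(re.findall(r"\w+", evidence.lower()))
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def numbers_consistent(claim: str, evidence: str) -> bool:
    # Numeric consistency check: every number stated in the claim
    # must also appear somewhere in the retrieved evidence.
    claim_nums = set(re.findall(r"\d+(?:\.\d+)?", claim))
    evidence_nums = set(re.findall(r"\d+(?:\.\d+)?", evidence))
    return claim_nums <= evidence_nums

def verify(llm_output: str, evidence: str, threshold: float = 0.5) -> list[dict]:
    # Mark each extracted claim as supported only if it is both
    # semantically similar to the evidence and numerically consistent.
    results = []
    for claim in extract_claims(llm_output):
        supported = (semantic_similarity(claim, evidence) >= threshold
                     and numbers_consistent(claim, evidence))
        results.append({"claim": claim, "supported": supported})
    return results
```

For example, `verify("The Eiffel Tower is 330 meters tall.", "The Eiffel Tower is 330 metres tall and located in Paris.")` marks the claim as supported, while a claim stating a height of 500 meters would fail the numeric check.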
Input Text
Paste a response from ChatGPT, Gemini, Claude, or another LLM.