ScriptAI Tools & APIs v1.0

Hallucination Detector

by AaaS · open-source · Last verified 2026-03-01

Detects hallucinated content in LLM outputs by cross-referencing claims against source documents and knowledge bases. Uses claim decomposition, source attribution scoring, and consistency checking to flag unsupported or fabricated statements.
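The pipeline described above (claim decomposition → source attribution scoring → flagging) can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: the function names are hypothetical, and a Jaccard token-overlap score stands in for the embedding similarity the real tool would compute with sentence-transformers.

```python
import re

def decompose_claims(text):
    """Split an LLM output into sentence-level claims (naive sentence split)."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def attribution_score(claim, sources):
    """Best token overlap between the claim and any source sentence (0..1).
    A stand-in for embedding cosine similarity."""
    claim_tokens = set(claim.lower().split())
    best = 0.0
    for src in sources:
        src_tokens = set(src.lower().split())
        if claim_tokens and src_tokens:
            jaccard = len(claim_tokens & src_tokens) / len(claim_tokens | src_tokens)
            best = max(best, jaccard)
    return best

def flag_hallucinations(output, sources, threshold=0.3):
    """Return (claim, score, supported) tuples; low-support claims are flagged."""
    results = []
    for claim in decompose_claims(output):
        score = attribution_score(claim, sources)
        results.append((claim, score, score >= threshold))
    return results

output = "Paris is the capital of France. Mozart invented the telephone in 1876."
sources = ["Paris is the capital of France.", "The Eiffel Tower opened in 1889."]
for claim, score, supported in flag_hallucinations(output, sources):
    print(f"{'OK  ' if supported else 'FLAG'} {score:.2f} {claim}")
```

The threshold and split regex are illustrative; a production detector would also need entailment checking, since surface overlap alone misses paraphrased support and contradicted numbers.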

https://aaas.blog/script/hallucination-detector
Overall grade: C+ (Average)
Adoption: C+ · Quality: A · Freshness: A · Citations: C+ · Engagement: F

Specifications

License
MIT
Pricing
open-source
Capabilities
claim-decomposition, source-attribution, consistency-checking, confidence-scoring, flagging
Integrations
openai, anthropic, langchain, sentence-transformers
Use Cases
factuality-checking, rag-quality-assurance, content-verification, output-validation
API Available
No
Language
python
Dependencies
openai, anthropic, langchain, sentence-transformers, pandas
Environment
Python 3.11+
Est. Runtime
3-15 minutes depending on output length
Tags
script, automation, hallucination, factuality, detection
Added
2026-03-17
Completeness
100%

Index Score

51
Adoption
54
Quality
82
Freshness
86
Citations
52
Engagement
0

Explore the full AI ecosystem on Agents as a Service