When generative AI retrieves the wrong entry from its vector database,
it doesn't know it's wrong.
It just generates — confidently, incorrectly.
That is hallucination.
Your model is already trained.
Don't retrain it. Just replace the matching function.
KlastroKnowledge replaces cosine similarity with Mahalanobis distance —
plug it in, and hallucination drops significantly.
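A minimal sketch of the idea: rank document embeddings by Mahalanobis distance to the query instead of cosine similarity. This is an illustration only, not KlastroKnowledge's actual implementation — the function name, the choice to estimate the covariance from the document corpus itself, and the regularization constant are all assumptions.

```python
import numpy as np

def mahalanobis_retrieve(query, docs, k=3):
    """Rank rows of `docs` by Mahalanobis distance to `query`.

    Assumes the covariance is estimated from the corpus embeddings
    themselves; a real system might fit it on a larger sample.
    """
    cov = np.cov(docs, rowvar=False)
    # Regularize so the covariance stays invertible for small corpora.
    cov += 1e-6 * np.eye(cov.shape[0])
    inv_cov = np.linalg.inv(cov)
    diff = docs - query  # shape (n_docs, dim)
    # Squared Mahalanobis distance per document:
    # d_i^2 = diff_i @ inv_cov @ diff_i
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    order = np.argsort(d2)[:k]
    return order, np.sqrt(d2[order])
```

Unlike cosine similarity, this metric accounts for the correlations and scale of each embedding dimension, so directions that vary wildly across the corpus count for less when matching.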
AGPL v3 — free for research and non-commercial use.
Commercial use requires a separate agreement.
For commercial licensing, please contact: contact@klastrovanie.com