Garak
Open-source LLM vulnerability scanner. Apache 2.0.
Garak is an open-source vulnerability scanner for LLM applications. It probes for hallucination, prompt injection, jailbreaks, data leakage, and other failure modes, and is now maintained under NVIDIA stewardship.
Notable open-source projects and reference frameworks used by enterprises and consultancies to harden AI deployments.
Direct links to the vendor's product pages. Last reviewed 2026-05-07.
CWS helps customers evaluate, deploy, and operate Garak as part of an AI security program. Engagements span vendor selection, proof-of-concept design, integration with existing controls, day-2 operations, and exit planning if the fit changes over time.
CWS does not resell Garak. The recommendation is honest, evidence-based, and tied to the customer's posture gaps — not to channel economics.
Engage CWS on Garak

Related tools in this directory:
- Open-source toolkit for adding programmable guardrails to LLM apps.
- Open-source LLM evaluation, red teaming, and security testing.
- Microsoft's open-source Python Risk Identification Toolkit for GenAI.
- Open-source LLM testing framework with hosted hub.
- Open-source security toolkit for LLM-powered applications.
The free AI Posture Check scores your security across six dimensions in 10 minutes. Use the result to shortlist vendors that fit your actual posture — not the loudest demo.
Take the AI Posture Check