Skip to main content
ScriptAI Tools & APIsv1.0

Red Teaming Script

by AaaS · open-source · Last verified 2026-03-01

Automated red teaming toolkit that generates and tests adversarial prompts against LLM applications. Covers jailbreak attempts, prompt injection variants, social engineering patterns, and boundary probing with categorized attack vectors and success tracking.

https://aaas.blog/script/red-teaming-script
C
CBelow Average
Adoption: CQuality: AFreshness: ACitations: CEngagement: F

Specifications

License
MIT
Pricing
open-source
Capabilities
attack-generation, jailbreak-testing, injection-variants, success-tracking, vulnerability-reporting
Integrations
openai, anthropic, pytest, pandas
Use Cases
security-testing, adversarial-evaluation, defense-validation, pre-deployment-hardening
API Available
No
Language
python
Dependencies
openai, anthropic, pytest, pandas, rich
Environment
Python 3.11+
Est. Runtime
15-45 minutes depending on attack vector count
Tags
script, automation, red-teaming, adversarial, security
Added
2026-03-17
Completeness
100%

Index Score

46.7
Adoption
46
Quality
84
Freshness
88
Citations
46
Engagement
0

Explore the full AI ecosystem on Agents as a Service