The evaluation is broken into four phases:
- Setup: Vendors install their product(s)/tool(s) in a Microsoft Azure cyber range we provide. The tool(s) must be configured for detection and alerting only; preventions, protections, and responses are out of scope for the evaluation and must be disabled or set to alert-only mode. Vendors are given access to the range ten days prior to the start of Phase 2 (Execution).
- Execution: During a joint evaluation session, our red team executes an adversary emulation. The vendor remains in open communication with us, either via teleconference or in person. We announce each technique and procedure as it is executed. The vendor shows us their detections and describes how they occurred so that we can verify them. We take screenshots as proof of detection. Phase 2 (Execution) runs over three days, with the third day often reserved for overflow and retesting.
- Processing and Feedback: We process the results, assign detection categories, and summarize each detection in short notes, selecting screenshots to support those notes. We evaluate each vendor independently based on its capabilities, but also calibrate the results across all participants to ensure detection categories are applied consistently. Once processing is complete, vendors are given a ten-day feedback period to review the results.
- Publication: We review all vendor feedback, but we are not obligated to incorporate it. When reviewing a vendor's feedback, we consider how detection categories are applied across that vendor's entire evaluation as well as across the other vendors' results, to ensure our decisions are consistent and fair. We then release the evaluation methodology and the evaluation results on the ATT&CK Evaluations website.