
Methodology Overview: Evaluation Process


Detection Evaluation Process

  1. Setup: Vendors install their product(s)/tool(s) in a Microsoft Azure cyber range that we provide. The tool(s) must be deployed in detect/alert-only mode. Preventions, protections, and responses are out of scope for the evaluation and must be disabled or set to alert-only mode. Vendors are advised to deploy a detection solution that is available to their end users and representative of a realistic deployment. Vendors are given access to the Azure range 10 days prior to the start of Phase 2 (Execution).
  2. Execution: During a joint evaluation session, our red team executes an adversary emulation. The vendor is in open communication with us, either via teleconference or in person. We announce the techniques and procedures as they are executed, along with the relevant details of how they were executed. The vendor shows us their detections and describes them so that we can verify them. We take screenshots to provide proof of detection. Phase 2 (Execution) occurs over three days, with the third day used for overflow and retesting. The Azure cyber ranges are suspended within 72 hours of the end of the evaluation.
  3. Processing and Feedback: We process the results, assign detection categories, and summarize detections into short notes, selecting screenshots to support those notes (a minimal data-model sketch follows this list). We consider each vendor independently based on its capabilities, and we also calibrate results across all participants to ensure detection categories are applied consistently. Once processing is complete, vendors have a 10-day feedback period to review the results.
  4. Publication: We review all vendor feedback, but we are not obligated to incorporate it. When reviewing a vendor's feedback, we consider how detection categories were applied across that vendor's entire evaluation, as well as across the other vendors' results, to ensure consistent and fair decisions. We publish the evaluation methodology and the evaluation results on the ATT&CK Evaluations website.
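
To make the Processing and Feedback step more concrete, the sketch below shows one way the per-step detection results described above could be recorded and tallied. This is a minimal illustration under assumptions, not the official ATT&CK Evaluations schema: the class name, fields, and category strings (DetectionResult, "Telemetry", "Technique") are placeholders chosen for this example.

    # Hypothetical record of a single detection result produced during
    # Processing and Feedback. Field names and category values are
    # illustrative placeholders, not the official schema.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class DetectionResult:
        technique_id: str                     # ATT&CK technique executed by the red team
        procedure: str                        # how the step was executed
        category: str                         # detection category assigned in processing
        note: str                             # short note summarizing the detection
        screenshots: List[str] = field(default_factory=list)  # proof-of-detection images

    def summarize(results: List[DetectionResult]) -> Dict[str, int]:
        """Tally how many results fall into each detection category."""
        counts: Dict[str, int] = {}
        for r in results:
            counts[r.category] = counts.get(r.category, 0) + 1
        return counts

    # Example usage with placeholder values.
    results = [
        DetectionResult("T1059", "Scripted command execution", "Telemetry",
                        "Process creation event captured", ["step1_shot1.png"]),
        DetectionResult("T1003", "Credential access attempt", "Technique",
                        "Alert mapped to the executed technique", ["step2_shot1.png"]),
    ]
    print(summarize(results))  # {'Telemetry': 1, 'Technique': 1}

Calibration across participants, as described in step 3, would then amount to reviewing records like these side by side so that the same category is applied to comparable detections.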

Protection Evaluation Process

A Protection evaluation is offered as an optional extension to the ATT&CK Evaluations Detection evaluation. The Protection evaluation process modifies the Detection evaluation process in the following ways:

  1. Setup: Vendors are provided access to an additional range during their detection setup period. They deploy their solution with a configuration that is available to their customers and representative of a realistic deployment.
  2. Execution: The Protection evaluation is conducted over an extra day in conjunction with the Detection evaluation. We execute the adversary emulation and note when and how an automated protection influenced our activity. The vendor explains the protection, and we record this explanation for inclusion in the results. Manual actions to block or interrupt the adversary emulation are not permitted during the evaluation period, nor are any configuration changes to what was deployed before the start of the exercise.
  3. Processing and Feedback: The Protection evaluation's processing and feedback mirror those of the Detection evaluation, though protection results receive their own categories to describe their performance (see the sketch after this list).
  4. Publication: Protection evaluations adhere to the same publication policy as the Detection evaluation.
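
As with the detection sketch above, the following is a minimal, assumed illustration of how a single Protection evaluation step might be recorded: whether an automated protection influenced the red team's activity, the vendor's explanation, and a protection category assigned during processing. The names and the "Blocked" category value are hypothetical, not the published schema.

    # Hypothetical record of one Protection evaluation step.
    # Field names and category values are illustrative placeholders only.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ProtectionResult:
        technique_id: str                          # ATT&CK technique attempted by the red team
        procedure: str                             # how the step was executed
        blocked: bool                              # whether an automated protection stopped the activity
        vendor_explanation: Optional[str] = None   # vendor's description of the protection
        category: Optional[str] = None             # protection category assigned in processing

    # Example: a step the product blocked automatically, with no manual
    # intervention or configuration change during the exercise.
    step = ProtectionResult(
        technique_id="T1486",
        procedure="Simulated file encryption",
        blocked=True,
        vendor_explanation="Behavioral protection terminated the process automatically",
        category="Blocked",
    )
    print(step)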