Detection as Code
- Detection Content Versioning, so that you can trace an alert back to the specific rule or model version that triggered it.
- Regression Testing of Detections: Testing for broken alerts, such as those that never fire or those that won’t fire when the intended threat materializes (a minimal test sketch follows this list).
- Quality Assurance: Testing for gaps in overall detection coverage, plus false-positive reduction and handling.
- Content Reuse and Modularity, as well as community sharing of content, just as happens for real programming languages. As a reminder, detection content does not equal rules; it covers rules, signatures, analytics, algorithms, etc.
- Cross-vendor content would be nice; after all, we don’t really program in “vendor X Python” or “big company C” (even though we used to), we just write in C or Python. In the detection realm, we have projects like Sigma and YARA (see the Sigma sketch after this list).
- Metrics: Think of things like coverage of MITRE ATT&CK techniques per IT asset or failure rates of your regression testing, but in the end it is up to you to define and use the metrics that make you better (a coverage-calculation sketch follows this list).[1]
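To make the cross-vendor point concrete, here is a sketch of what vendor-neutral detection content looks like in Sigma: structured text that can live in git, be reviewed like code, and be converted to vendor-specific queries by the Sigma toolchain. The rule logic, placeholder id, and field values are illustrative, not a vetted production rule; loading it assumes PyYAML is installed.

```python
# Illustrative Sigma rule kept as versionable text; parse it like any other config.
# Requires PyYAML (pip install pyyaml).
import yaml

SIGMA_RULE = r"""
title: Suspicious Encoded PowerShell Command Line
id: 00000000-0000-0000-0000-000000000000   # placeholder UUID
status: experimental
description: Detects -EncodedCommand usage, often used to obfuscate payloads
logsource:
  category: process_creation
  product: windows
detection:
  selection:
    Image|endswith: '\powershell.exe'
    CommandLine|contains: '-EncodedCommand'
  condition: selection
tags:
  - attack.execution
  - attack.t1059.001
level: medium
"""

rule = yaml.safe_load(SIGMA_RULE)
print(rule["title"], "->", rule["tags"])
```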
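And a minimal sketch of regression-testing a detection, assuming rules are kept in version control and expose a simple match() predicate over parsed events. The rule, field names, and sample events are hypothetical; the point is that both the "fires on the intended threat" and the "stays quiet on benign activity" cases are asserted on every change.

```python
# Hypothetical rule-as-code plus a regression test, runnable with the stdlib.
import unittest
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Rule:
    rule_id: str
    version: str                 # bumped on every change, like any other code
    techniques: List[str]        # MITRE ATT&CK techniques this rule covers
    match: Callable[[Dict[str, str]], bool]


ENCODED_POWERSHELL = Rule(
    rule_id="proc-ps-encoded",
    version="1.2.0",
    techniques=["T1059.001"],
    match=lambda e: e.get("image", "").lower().endswith("powershell.exe")
    and "-encodedcommand" in e.get("command_line", "").lower(),
)


class TestEncodedPowershell(unittest.TestCase):
    def test_fires_on_intended_threat(self):
        event = {"image": r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe",
                 "command_line": "powershell.exe -EncodedCommand SQBFAFgA"}
        self.assertTrue(ENCODED_POWERSHELL.match(event))

    def test_stays_quiet_on_benign_activity(self):
        event = {"image": r"C:\Windows\System32\notepad.exe",
                 "command_line": "notepad.exe report.txt"}
        self.assertFalse(ENCODED_POWERSHELL.match(event))


if __name__ == "__main__":
    unittest.main()
```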
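For the metrics bullet, one possible calculation is the share of ATT&CK techniques you have prioritized for an asset class that is covered by at least one deployed rule. The priority lists and rule tags below are made up for illustration; the shape of the metric is the point, not the numbers.

```python
# Illustrative ATT&CK coverage metric per asset class.
from typing import Dict, List, Set

# Techniques you have decided matter for each asset class (illustrative).
PRIORITY_TECHNIQUES: Dict[str, Set[str]] = {
    "windows_endpoint": {"T1059.001", "T1047", "T1055", "T1547.001"},
    "linux_server": {"T1059.004", "T1053.003", "T1098.004"},
}

# Technique tags carried by the rules currently deployed (illustrative).
DEPLOYED_RULE_TECHNIQUES: List[Set[str]] = [
    {"T1059.001"},
    {"T1047", "T1055"},
    {"T1059.004"},
]


def coverage(asset_class: str) -> float:
    """Share of priority techniques covered by at least one deployed rule."""
    wanted = PRIORITY_TECHNIQUES[asset_class]
    covered = wanted & set().union(*DEPLOYED_RULE_TECHNIQUES)
    return len(covered) / len(wanted)


for asset in PRIORITY_TECHNIQUES:
    print(f"{asset}: {coverage(asset):.0%} of priority techniques covered")
```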
A good example of this is FalconForge.
Relevant Note(s): Detection Engineering