Threat Hunting Metrics: The Good, The Bad and The Ugly

Kostas
6 min read · Aug 21, 2023
Photo by Luke Chesser on Unsplash

Threat hunting is a crucial aspect of information security, but measuring its effectiveness can be challenging. In this article, we will explore the good, the bad, and the ugly of threat hunting metrics, helping you understand what works and what doesn't.

Picture threat hunting as a skilled archer. Just as the archer meticulously chooses the right arrow, studies the wind, and gauges distance to hit the target, a threat hunter sifts through data, evaluates patterns, and pinpoints anomalies. But for any archer, the measure of skill isn't just hitting the target; it's understanding which shots truly matter. Enter the world of threat hunting metrics. Quantifiable indicators can guide security professionals in improving any part of an information security program, and threat hunting is no exception.

However, not all metrics are created equal. Some offer clear value (‘The Good’), while others can mislead or divert resources (‘The Bad’), and then there are those that, if misunderstood, can jeopardize an entire security strategy (‘The Ugly’). Let’s explore threat hunting metrics, separate the best from the rest, and learn to use our data effectively.

What are the key differences between good and bad threat hunting metrics?
