by Manuel Kern and Florian Skopik

In the last decade there has been a clear paradigm shift from focusing solely on prevention and protection to also including detection and response. While prevention and protection are indispensable for baseline security, it must be presumed that attackers have already compromised systems to some extent (“presumption of compromise”). It has long been known in cyber security research that professional attackers often operate in a network over long periods of time. A key pillar of a holistic security approach is therefore the early detection of attackers in the network. Still, the average time to detect attackers remains high: a study commissioned by IBM Security [L1] quantifies the average time to detect a data breach at 212 days, five days longer than the year before.

It was in late 2020 that the disastrous case of SolarWinds became public [L2]. State-sponsored attackers abused the update mechanism of a security solution to infiltrate thousands of organisations, up to the highest levels of government, and to move unnoticed through the networks of the attacked organisations for many months. Kaspersky [L3] reports that, compared to the costs of a cyberattack with immediate remediation, recovery costs are four times higher if remediation is performed after one week. At the same time, detection within one day is still about 30% more cost-effective than detection after more than one week. The insurance provider Allianz [L4] determined a total loss of 660 million euros in an evaluation of 1,736 incidents, with the largest share of costs due to operational downtime. Another emerging threat is the theft of customer data followed by blackmail.

Implementing organisation-wide detection and response to deal with this issue is a resource-intensive undertaking. Not only are the required software solutions difficult to select, deploy and maintain, but installing and operating them also demands expensive and often rare security expertise. Security experts are in high demand these days, while trends driven by economy, environment and technology have drastically accelerated digitalisation. This is also reflected in current employment figures, which show a clear lack of IT and IT security specialists worldwide. Besides the lack of human resources, detection and response needs dedicated infrastructure with high performance requirements. Efficient detection systems for large infrastructures are not an off-the-shelf product. The security aspects of ongoing system integration are complex and typically not fully considered in business decisions. Implementing infrastructure-wide detection systems can quickly consume the entire IT security budget. Consequently, there is a high probability that detection and response projects will be rejected from the outset, aborted, or carried out incompletely. This is a serious problem, since effective detection and response is required to shorten the duration of cyberattacks and to keep economic damage and impacts on human safety to a minimum.

The mission of the project SPOTTED is to counteract these problems by lowering the entry barrier for modern monitoring and detection solutions and making them more widely applicable. In recent years, a wide variety of novel methods and models for incident detection and response have been developed that enable accurate detection and efficient prediction in minimum time. Special focus is put on the fact that organisations have only limited resources to establish and operate them; thus, an optimisation problem forms the basis of the project. Cyberattacks leave traces in data sources such as log files, memory or data streams. Detection systems utilise these data sources to detect the application of specific attack techniques. Attack techniques vary considerably in terms of their effectiveness, potential impact and use by threat actors. Data sources, on the other hand, may contain traces of one or several attack techniques, and the effort to process their output may differ heavily. Not all data sources are therefore of equal value for detection, and organisations must carefully survey which sources should be analysed and which attack techniques need to be discovered.

SPOTTED developed D3TECT, a process model based on the three key elements of attack detection: techniques, data sources and algorithms. Figure 1 outlines the interactions of these key elements with an organisation’s assets, their vulnerabilities and threats. The model describes a procedure for dynamically ranking and selecting data sources suitable for detection. Its novelty is that it accounts for constraints in the selection process. For instance, if a certain data source cannot be utilised in a specific setting, e.g. due to data privacy constraints, the discovery of the most important attack techniques is still ensured by the remaining data sources. Eventually, the D3TECT approach solves the challenge of strategically selecting data sources while accounting for their varying usefulness for attack detection. The model was tested with real data, utilising the MITRE ATT&CK framework and numerous public cyber threat intelligence databases. A recent work [1] presents the ranking results and discusses their plausibility to validate D3TECT.

Figure 1: D3TECT’s attack detection in a nutshell.
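To make the selection problem concrete, the following sketch illustrates one possible formulation: a greedy weighted-coverage heuristic that picks data sources by covered technique importance per unit of effort, while honouring exclusion constraints. This is an illustrative simplification, not the actual D3TECT algorithm; all source names, technique weights and costs are invented for the example.

```python
# Hypothetical sketch of constraint-aware data-source selection, loosely
# inspired by the idea described above. All data below is invented.

def select_sources(coverage, weight, cost, budget, excluded=frozenset()):
    """Greedily pick data sources maximising covered technique weight per cost.

    coverage: dict source -> set of attack techniques it can reveal
    weight:   dict technique -> importance (e.g. prevalence x impact)
    cost:     dict source -> effort to collect and analyse the source
    budget:   total analysis effort available
    excluded: sources that cannot be used (e.g. privacy constraints)
    """
    chosen, covered, spent = [], set(), 0
    candidates = {s for s in coverage if s not in excluded}
    while candidates:
        # Marginal gain: total weight of techniques not yet covered.
        def gain(s):
            return sum(weight[t] for t in coverage[s] - covered)
        best = max(candidates, key=lambda s: gain(s) / cost[s])
        if gain(best) == 0 or spent + cost[best] > budget:
            break
        chosen.append(best)
        covered |= coverage[best]
        spent += cost[best]
        candidates.remove(best)
    return chosen, covered

# Invented example with technique IDs styled after MITRE ATT&CK.
coverage = {
    "process_creation_logs": {"T1059", "T1053"},  # execution, scheduled tasks
    "dns_logs":              {"T1071", "T1568"},  # command-and-control channels
    "auth_logs":             {"T1078", "T1110"},  # valid accounts, brute force
}
weight = {"T1059": 5, "T1053": 2, "T1071": 4, "T1568": 3, "T1078": 5, "T1110": 2}
cost = {"process_creation_logs": 3, "dns_logs": 2, "auth_logs": 2}

# Exclude DNS logs, e.g. due to a privacy constraint; remaining sources
# still ensure discovery of the most important techniques within budget.
chosen, covered = select_sources(coverage, weight, cost, budget=5,
                                 excluded={"dns_logs"})
```

Even this toy version shows the key property discussed above: when one source is excluded, the procedure re-ranks the remaining sources so the highest-weighted techniques stay covered within the available effort.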

The project SPOTTED is financially supported by the Austrian Research Promotion Agency (FFG) under grant number FO999887725. The project is carried out in the course of an industry-related PhD thesis at the AIT Austrian Institute of Technology in cooperation with the Vienna University of Technology (TU Wien).



[1] M. Kern, et al.: “Strategic selection of data sources for cyber attack detection in enterprise networks: A survey and approach”, 37th ACM/SIGAPP Symposium On Applied Computing, 2022.

Please contact:
Manuel Kern
AIT Austrian Institute of Technology, Austria
