Applying Traditional Intelligence Operations to the Fifth Domain
As the term “cyber intelligence” has emerged from the myriad of buzzwords that abound in the cybersecurity industry, its meaning has remained rather nebulous; cyber intelligence means different things to different people. To some, it is synonymous with “cyber threat intelligence”, a subdiscipline focused on analyzing an adversary’s tactics and capability to project force across the fifth domain. Cyber threat intelligence practitioners therefore collect, process, analyze and disseminate largely technical intelligence data, such as IP address lists, file hashes, domain names, or the unique system and library calls or command-line arguments leveraged by adversary malware implants and exploitation tools. While such technical analysis is vital, without an understanding of how cyber intelligence activities fit into the broader intelligence lifecycle, the end goal of providing executive decision makers with actionable insights and specific, strategic guidance is often lost.
Furthermore, this concept of cyber intelligence omits the increasing importance of open source intelligence, or “OSINT”, in supporting tactical, operational and strategic mandates. It is therefore important to also understand how cyber OSINT methods of collection and analysis, focused on metadata from the surface, deep and dark web, fit into the overall model. Unfortunately, many OSINT practitioners have not developed a codified, rigorous and structured approach to how OSINT can support traditional intelligence processes and operations.
What we aim to achieve with this article is to share specific ways that cyber intelligence analytical techniques and collection methods can be integrated into broader intelligence processes. We therefore present a more structured approach to cyber intelligence collection, processing, analysis and dissemination, discussing the cyber intelligence dimension of each phase of the intelligence lifecycle and exploring its potential benefits.
Direction and Planning
The direction and planning phase of the intelligence lifecycle defines the essential purpose and scope of activities undertaken by an intelligence organization. This is often done in two complementary ways: by defining General Intelligence Requirements and Priority Intelligence Requirements. General Intelligence Requirements (GIR) are “standing orders” to continuously collect, monitor and report on intelligence targets or areas of interest associated with the organization’s core mission, whereas Priority Intelligence Requirements (PIR) are ad-hoc requests that steer collection efforts and produce finished intelligence addressing specific problems or concerns as they emerge. Intelligence producers then develop a collection plan and strategy to address the needs of decision makers. In this process, however, traditional intelligence organizations often overlook the opportunities that cyber collection capabilities present. In developing plans to collect intelligence on the operating environment, for instance, the cyber dimension ought to be evaluated. Consider, for a moment, how the ability to monitor for abrupt changes in network traffic patterns might reveal an evolution in a target’s tactics or strategies. Similarly, OSINT collection of social media data can be invaluable for gathering situational intelligence during a crisis, ascertaining an adversary’s intentions, or defining their area of operation by dissecting the narratives they propagate through social networks. Collection management must therefore consider how cyber intelligence could augment or support incoming requirements, so that intelligence estimates and judgments are better informed by the nuanced perspectives that sometimes only cyber can provide.
Identification of Sources
Once it has been decided that cyber will be an essential component of intelligence collection and analysis, the identification of relevant sources for collection is performed. This may involve the manual use of Google “dork” queries to search for data types, file names, API endpoints, databases or accessible cloud-based object storage interfaces relevant to the intelligence requirement. During this process, cyber operators will compile lists of relevant keyword terms, or “selectors”, that will be useful to pivot from during future collection or analysis efforts. These selectors can be either “strong” or “soft”. Strong selectors are search terms that relate directly to the intelligence target, such as an e-mail address or telephone number. Soft selectors are keywords that peripherally relate to the intelligence target or their online activities and communications.
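To illustrate, the following sketch composes dork query strings from strong and soft selector lists. All selectors, domains and operators shown here are hypothetical examples, not real collection targets.

```python
# Sketch: composing Google "dork" queries from selector lists.
# Every selector and domain below is a hypothetical placeholder.

STRONG_SELECTORS = ["jdoe@example.com", "+1-555-0100"]   # tie directly to the target
SOFT_SELECTORS = ["quarterly shipment", "night convoy"]  # peripheral keywords

def build_dorks(selectors, site=None, filetype=None):
    """Return a list of dork query strings for manual or scripted search."""
    dorks = []
    for term in selectors:
        parts = [f'"{term}"']          # exact-phrase match on the selector
        if site:
            parts.append(f"site:{site}")
        if filetype:
            parts.append(f"filetype:{filetype}")
        dorks.append(" ".join(parts))
    return dorks

for query in build_dorks(STRONG_SELECTORS, site="pastebin.example", filetype="txt"):
    print(query)
```

The same function serves both strong and soft selector lists; only the input list changes, which keeps the distinction between the two a matter of tradecraft rather than tooling.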
Once sources for collection have been identified and selector lists prepared, collection efforts can transition to more automated approaches. Here, scripts and tools are developed and employed to, for instance, automatically iterate through a series of API queries to collect relevant metadata. Other methods may involve machine learning or natural language processing algorithms that expand a list of selectors by automatically identifying related terms, or that translate colloquial terms and expressions from one language to another. The goal of this phase is to broadly discover and collect, in bulk, potentially valuable data and information from relevant sources that can later be refined by analysts. This approach ensures that intelligence data is collected not only efficiently but also systematically around the information space of interest, so that relevant intelligence is not missed.
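A minimal sketch of the automated iteration described above follows. The page-fetching function is injected so the same loop works against any service; the response shape (a dict with `results` and `next` keys) is a hypothetical convention, not any particular vendor’s API.

```python
# Sketch: bulk metadata collection by iterating a paginated API.
# The endpoint, query parameters and response shape are assumptions
# for illustration only.
import json
import urllib.parse
import urllib.request

def http_fetch_page(base_url, selector, page):
    """Fetch one page of JSON results for a selector (network call)."""
    url = f"{base_url}?q={urllib.parse.quote(selector)}&page={page}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def collect(fetch_page, selectors, max_pages=5):
    """Iterate selectors and pages, accumulating raw records for later triage."""
    records = []
    for sel in selectors:
        for page in range(1, max_pages + 1):
            data = fetch_page(sel, page)
            records.extend(data.get("results", []))
            if not data.get("next"):   # the API signals the last page
                break
    return records
```

In use, `http_fetch_page` would be bound to a concrete endpoint, e.g. `collect(functools.partial(http_fetch_page, "https://api.example.com/search"), STRONG_SELECTORS)`; separating the loop from the transport also makes the collector easy to test offline.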
Processing of Intelligence Data
The processing phase of the cyber intelligence lifecycle is important to ensure that relevant data points can bubble up to the surface so that they can be more closely examined during the analysis phase. This may, for instance, involve the use of anomaly or outlier detection algorithms to identify unexpected data patterns that might indicate threat activity. Analysts may opt to use machine learning methods such as decision trees and random forests, or Bayesian changepoint detection, to identify deviations from expected baselines at scale.
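As a simple illustration of baseline deviation detection, the sketch below flags observations that drift from a rolling baseline using a z-score test. This is a deliberately minimal stand-in for the decision-tree, random-forest or Bayesian changepoint methods mentioned above; the traffic figures are invented.

```python
# Sketch: flagging deviations from an expected baseline with a rolling
# z-score. Threshold and window size are illustrative assumptions.
from statistics import mean, stdev

def flag_anomalies(series, window=5, threshold=3.0):
    """Return indices whose value deviates more than `threshold` standard
    deviations from the preceding `window` observations."""
    flagged = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # skip flat baselines (sigma == 0) to avoid division by zero
        if sigma and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# e.g. hypothetical connection counts per hour; the spike stands out
traffic = [100, 102, 98, 101, 99, 100, 450, 101]
print(flag_anomalies(traffic))   # → [6]
```

Real deployments would replace this with a model robust to trend and seasonality, but the shape of the task, comparing each new observation against an expected baseline, stays the same.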
It is also during this phase that relevant data points are first assembled into an investigative narrative so that their meaning and impact are better understood. The significance of technical data points is sometimes overlooked by analysts unless it can be articulated in narrative form. Once in narrative form, other connections and correspondences can become more readily apparent. This also ensures that intelligence data is placed in context and correctly interpreted.
Analysis
Once data has been processed, general patterns identified and data points properly labeled, the work of the intelligence analysts can begin. It is within this phase that cyber intelligence analysts comb more closely over the data structures and extract meaningful metadata that may provide deeper insight into where the information came from, who created it and when. The intelligence data is then sorted and collated, and relevant extracts are catalogued within a shared intelligence analysis platform. From there, information can be shared between analysts and evaluated. The intelligence is graded, ranked and scored, so that decision makers can later judge the reliability of the sources, the relevance of the information and the import of the intelligence provided.
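One common grading convention is the NATO Admiralty system, which scores source reliability A–F and information credibility 1–6. The sketch below catalogues an extracted record with such a grade; the record fields and class names are illustrative, not drawn from any particular analysis platform.

```python
# Sketch: cataloguing extracted metadata with a source-reliability and
# information-credibility grade, loosely modelled on the NATO Admiralty
# system. Field names are hypothetical.
from dataclasses import dataclass

RELIABILITY = set("ABCDEF")   # A = completely reliable ... F = cannot be judged
CREDIBILITY = set("123456")   # 1 = confirmed ... 6 = cannot be judged

@dataclass
class IntelRecord:
    source: str
    created: str          # when the underlying artifact was created
    summary: str
    reliability: str = "F"
    credibility: str = "6"

    def grade(self, reliability, credibility):
        """Assign and return a combined Admiralty grade such as 'B2'."""
        if reliability not in RELIABILITY or credibility not in CREDIBILITY:
            raise ValueError("invalid Admiralty grade")
        self.reliability, self.credibility = reliability, credibility
        return f"{reliability}{credibility}"

rec = IntelRecord(source="forum post", created="2023-04-01", summary="...")
print(rec.grade("B", "2"))   # B2: usually reliable, probably true
```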
Dissemination
Once a finished intelligence product has been produced, it must be distributed to all relevant stakeholders. Written reports and briefs are prepared for executive stakeholders, while technical intelligence is shared with technical stakeholders, often in the form of Indicators of Compromise (IoC) or YARA and Sigma rules. A Threat Intelligence Platform (TIP) may be leveraged that uses the STIX/TAXII standards to publish intelligence data and findings across a broader community of stakeholders.
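For a sense of what machine-readable dissemination looks like, the sketch below builds a minimal STIX 2.1 Indicator object for a file hash as a plain dict (in practice the `stix2` Python library would generate and validate these). The hash and description are placeholders.

```python
# Sketch: a minimal STIX 2.1 Indicator object for a SHA-256 file hash.
# The hash value and name below are placeholders, not real indicators.
import json
import uuid
from datetime import datetime, timezone

def make_indicator(sha256, name):
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": f"[file:hashes.'SHA-256' = '{sha256}']",
        "pattern_type": "stix",
        "valid_from": now,
    }

ioc = make_indicator("ab" * 32, "Dropper observed in hypothetical campaign")
print(json.dumps(ioc, indent=2))
```

Objects in this shape can then be bundled and pushed to a TAXII server, which is how a TIP typically shares them with the wider community.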
Consumption
In the consumption phase of the cyber intelligence lifecycle, intelligence data must be properly operationalized to fully realize its value. If intelligence consumers are unable to act on the intelligence provided, the intelligence lifecycle is broken and the intelligence program is placed at risk. Errors in judgement can ensue, and opportunities for correction and growth will be missed. It is therefore critically important that all stakeholders have the means to integrate technical indicators into SIEM platforms, network intrusion detection sensors, endpoint agents, firewalls and other cybersecurity infrastructure.
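At its simplest, such integration means matching indicator sets against event streams. The sketch below checks hypothetical log events against domain and hash indicators; the log format, field names and indicator values are all illustrative (the MD5 shown is the well-known empty-file hash, used purely as a placeholder).

```python
# Sketch: operationalizing technical indicators by matching them against
# log events before (or alongside) SIEM ingestion. Event fields and
# indicator values are hypothetical.
IOC_DOMAINS = {"bad.example.net", "c2.example.org"}
IOC_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}

def match_event(event):
    """Return a list of indicator types the event matched."""
    hits = []
    if event.get("dns_query") in IOC_DOMAINS:
        hits.append("domain")
    if event.get("file_md5") in IOC_HASHES:
        hits.append("hash")
    return hits

events = [
    {"host": "ws-17", "dns_query": "bad.example.net"},
    {"host": "ws-09", "file_md5": "d41d8cd98f00b204e9800998ecf8427e"},
    {"host": "ws-22", "dns_query": "intranet.example.com"},
]
alerts = [(e["host"], match_event(e)) for e in events if match_event(e)]
print(alerts)   # → [('ws-17', ['domain']), ('ws-09', ['hash'])]
```

Production pipelines would instead load indicators into the SIEM's own lookup tables or detection rules, but the underlying operation, a set-membership test over event fields, is the same.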
It is also important that decision makers begin to evaluate relevant courses of action. Intelligence that cannot be used to decide upon a specific course of action can hardly be called intelligence at all; intelligence is distinguished from information or data by being actionable. It is equally important for decision makers to authorize action based on good, finished intelligence, so that impacts can be properly evaluated and feedback provided to intelligence producers.
Assessment and Feedback
The final phase of the cyber intelligence lifecycle is crucially important. Once actions are taken based on intelligence, and impacts are realized, the value and accuracy of that intelligence must be evaluated and Priority Intelligence Requirements must be further refined. A closed feedback loop is essential to ensure that intelligence producers can make adjustments to their collection or analysis processes and methods. This is the only way to ensure that future intelligence collection efforts will produce more meaningful results.