Five Key Analytics Categories to Assess Cyber Exposure

By Scott Stransky

10/07/2021

Recent cyberattacks, such as those against Colonial Pipeline Co. and SolarWinds, highlight the elevated level of cyber risk and the need for businesses to have a holistic view of potential cyber vulnerabilities and exposures.

A comprehensive cyber analytics strategy is crucial to quantifying risk exposure and tracking progress toward improved cyber resiliency. As cyber analytics become more sophisticated, boards should ensure management teams are tracking new analytics vendors and regularly assessing what data should be incorporated into their companies’ analytics strategy.

Below are five categories of cyber data and analytics that are vital to tracking exposure and resilience and to informing the board’s cyber-risk dashboard.

1. Firmographics

Metrics about an organization—including, but not limited to, its industry, size, and location—form the basis of any cyber analytics exercise. (Industry can be described with text or by using coding schemes such as the North American Industry Classification System or Standard Industrial Classification; company size is typically measured by revenue or turnover and employee count.) These data also give an indication of corporate links, including parent companies, subsidiaries, and branch locations.
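
To make this concrete, below is a minimal sketch, in Python, of the kind of firmographic record such vendors assemble. The field names and the example company are hypothetical, not any vendor’s schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FirmographicRecord:
    """Illustrative firmographic profile; field names are hypothetical."""
    company_name: str
    naics_code: str               # North American Industry Classification System code
    sic_code: Optional[str]       # Standard Industrial Classification code, if available
    annual_revenue_usd: float     # size measured by revenue/turnover
    employee_count: int           # size measured by headcount
    headquarters_country: str
    parent_company: Optional[str] = None                    # corporate linkage: ultimate parent
    subsidiaries: list[str] = field(default_factory=list)   # divested entities should be removed
    branch_locations: list[str] = field(default_factory=list)

# Example record for a hypothetical company
acme = FirmographicRecord(
    company_name="Acme Manufacturing",
    naics_code="333120",
    sic_code="3531",
    annual_revenue_usd=250_000_000,
    employee_count=1_200,
    headquarters_country="US",
    subsidiaries=["Acme Parts LLC"],
)
```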

High-quality firmographic data are updated frequently and use sources such as 10-K filings to derive the most accurate information for publicly traded companies. Companies may not consider it critical to externally source this information since they feel they know their own organization best, but understanding how they are seen externally can yield important insights. For example, if there are divested subsidiaries in the corporate tree, if the employee count is far off base, or if there are other material errors, any cyber analyses may be inaccurate. Companies can reach out to firmographic vendors to rectify inaccurate information.

2. Historical Incidents

To understand the general level of cyber risk for a particular industry and revenue band, historical incident data are quite useful. Correlating historical data across attacks helps identify patterns, detect intrusions, and mitigate potential risks. Several vendors scrape 10-K filings and news articles, and file Freedom of Information Act requests, to collect broad swaths of these data, especially in the United States.

This information must be treated with care, as there are several known biases in these data sets:

  • The data sets are much less complete for non-US entities.

  • The data sets are less complete for smaller companies than for larger companies due to cyber-reporting requirements.

  • Since many cyber breaches go unreported until well after the incident, recent data are, counterintuitively, less reliable than older data.

As with firmographics, it is important for a company to understand how external parties perceive it. Additionally, these data sets are a good way to analyze potential third-party vendors for historical cyber incidents.
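
As a simple illustration of that last use case, the Python sketch below screens potential third-party vendors against an incident data set; the records, vendor names, and field names are invented for the example.

```python
# Illustrative sketch: screen potential third-party vendors against a
# historical-incident data set. Records and field names are hypothetical.
from datetime import date

incidents = [
    {"company": "Vendor A", "type": "ransomware", "disclosed": date(2019, 6, 1)},
    {"company": "Vendor B", "type": "data breach", "disclosed": date(2021, 2, 15)},
    {"company": "Vendor A", "type": "data breach", "disclosed": date(2020, 11, 3)},
]

def incident_history(vendor: str, records: list[dict]) -> list[dict]:
    """Return all disclosed incidents for a given vendor name."""
    return [r for r in records if r["company"] == vendor]

for vendor in ("Vendor A", "Vendor B", "Vendor C"):
    hits = incident_history(vendor, incidents)
    # Recent counts are a lower bound: many breaches are disclosed late.
    print(f"{vendor}: {len(hits)} disclosed incident(s)")
```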

3. Technographics

Technographic data provide information about an organization’s cybersecurity posture. There are two high-level methods of sourcing such data: outside-in and inside-out.

For “outside-in” sourcing, various types of sensors are deployed online to unobtrusively collect Internet traffic; the data can help determine which organizations are virtually connected, both intentionally for business purposes and unintentionally if a company is infected with malware and transmitting data to a malicious server. As with tracking a building’s foot traffic, collecting this data does not require the knowledge or permission of the organizations being analyzed.

From the raw data, a company can draw a map of its virtual supply chain, augmented by metrics on how much traffic is flowing to malicious destinations. A company can also determine the status of various traditional cyber defenses, such as email filtering, firewall, and security against spoofing. We have seen cases in which organizations do not realize that they control certain IP addresses or are unaware that their servers are pinging suspicious websites at various intervals. Using outside-in technology, especially prior to purchasing cyber insurance, can reduce the potential premium and lower the chances of an embarrassing claim.
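
As one small example of what an outside-in signal looks like, the Python sketch below (which assumes the third-party dnspython package) checks the SPF and DMARC records a domain publishes in DNS, two public indicators of its email anti-spoofing posture. The target domain is a placeholder.

```python
# A minimal outside-in check of public email-authentication records.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

def lookup_txt(name: str) -> list[str]:
    """Return all TXT records published for a DNS name, or [] if none."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
        return [b"".join(r.strings).decode() for r in answers]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []

def spoofing_signals(domain: str) -> dict:
    """Collect SPF and DMARC policies, two public signals of anti-spoofing posture."""
    spf = [t for t in lookup_txt(domain) if t.lower().startswith("v=spf1")]
    dmarc = [t for t in lookup_txt(f"_dmarc.{domain}") if t.lower().startswith("v=dmarc1")]
    return {"domain": domain, "spf": spf, "dmarc": dmarc}

print(spoofing_signals("example.com"))  # hypothetical target domain
```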

These data are complemented by “inside-out” sourcing, which requires a business’s cooperation and can be facilitated via a questionnaire or via an application or device installed on the company’s network. This inside-out analysis can generate a deep understanding of a company’s network, security systems, configurations, and policies, further informing the holistic view of cyber risk.

4. Scoring

Many vendors take firmographic data, historical incident data, and “outside-in” technographic data to develop indices, scores, and ratings. These ratings can illuminate a company’s relative cybersecurity exposure and provide insight into risks based on an organization’s cyber symptoms. If built and calibrated properly, these ratings can also provide a view into the likelihood of potential future cyber losses. There are now many such indices in the industry, so it is important to understand and vet a particular methodology before using it.
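
To illustrate the general idea, the sketch below maps a few signals onto a 0–100 scale. The inputs and weights are invented for the example and do not reflect any vendor’s methodology.

```python
# Illustrative security-rating sketch; weights and inputs are hypothetical,
# not any vendor's methodology.
def cyber_score(open_risky_ports: int,
                pct_traffic_to_malicious_hosts: float,
                incidents_last_5y: int,
                patch_cadence_days: float) -> float:
    """Map a few outside-in signals onto a 0-100 scale (higher is better)."""
    penalty = (
        5.0 * open_risky_ports
        + 200.0 * pct_traffic_to_malicious_hosts   # fraction, e.g. 0.02
        + 8.0 * incidents_last_5y
        + 0.2 * patch_cadence_days
    )
    return max(0.0, 100.0 - penalty)

print(cyber_score(open_risky_ports=2,
                  pct_traffic_to_malicious_hosts=0.01,
                  incidents_last_5y=1,
                  patch_cadence_days=30))  # -> 74.0
```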

Cyber insurers today do consider these scores as part of the underwriting process. Any company looking to purchase cyber insurance should learn what its scores are from various vendors and look to improve them.

5. Loss Modeling

Several vendors combine many, or all, of the previous four data sources and analytics to develop modeled loss estimates at the company level or for a portfolio of companies. Most of these models employ Monte Carlo simulation to model next year’s potential cyber events tens of thousands of times, forming a full distribution of potential loss scenarios. This distribution is then used to derive metrics such as average annual loss (a representation of the expected loss due to cyber threats) and the exceedance probability curve (which illustrates various “tail” scenarios that must be accounted for in a robust risk management strategy).
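
The sketch below illustrates the general approach with a deliberately simple model: a Poisson event frequency and a lognormal severity distribution, both with made-up parameters, simulated over tens of thousands of hypothetical years to estimate the average annual loss and a 1-in-100 exceedance-probability loss.

```python
# A minimal Monte Carlo loss-model sketch. The Poisson frequency and
# lognormal severity parameters are illustrative assumptions, not a
# calibrated model.
import numpy as np

rng = np.random.default_rng(42)
N_SIMS = 50_000                 # simulated "next years"
FREQ_MEAN = 0.3                 # expected cyber events per year
SEV_MU, SEV_SIGMA = 13.0, 1.5   # lognormal severity parameters (log-dollars)

annual_losses = np.zeros(N_SIMS)
n_events = rng.poisson(FREQ_MEAN, size=N_SIMS)
for i, n in enumerate(n_events):
    if n:
        annual_losses[i] = rng.lognormal(SEV_MU, SEV_SIGMA, size=n).sum()

# Average annual loss: the expected loss across all simulated years.
aal = annual_losses.mean()

# Exceedance probability: the loss level exceeded in only 1% of simulated
# years, one point on the "tail" of the exceedance probability curve.
ep_1pct = np.quantile(annual_losses, 0.99)

print(f"Average annual loss: ${aal:,.0f}")
print(f"1% exceedance-probability loss: ${ep_1pct:,.0f}")
```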

The more sophisticated models allow the various data sources and model parameters to be refined to better represent a particular risk or portfolio of risks. For example, some models allow the analyst to study the impact of various cyber controls on loss outcomes, determine appropriate levels of limit and retention to purchase, and evaluate the potential loss in the unfortunate event of a specific cyber incident. The most advanced organizations use third-party loss models alongside proprietary models to understand their full cyber-risk profiles.
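
Continuing the simple simulation above, the sketch below applies a hypothetical insurance layer with a $1 million retention and a $10 million limit to the simulated losses, showing how much of the modeled loss would be transferred versus retained.

```python
# Continues the Monte Carlo sketch above: apply a hypothetical insurance
# layer (a $1M retention and a $10M annual limit) to simulated losses to
# see how much risk is transferred vs. retained.
import numpy as np

rng = np.random.default_rng(42)
n_events = rng.poisson(0.3, size=50_000)
annual_losses = np.array([rng.lognormal(13.0, 1.5, size=n).sum() if n else 0.0
                          for n in n_events])

RETENTION, LIMIT = 1_000_000, 10_000_000
ceded = np.clip(annual_losses - RETENTION, 0, LIMIT)    # paid by the insurer
retained = annual_losses - ceded                        # kept by the company

print(f"Expected ceded loss:    ${ceded.mean():,.0f}")
print(f"Expected retained loss: ${retained.mean():,.0f}")
print(f"Probability the limit is exhausted: {(ceded == LIMIT).mean():.2%}")
```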

Scott Stransky
Scott Stransky leads Marsh McLennan’s Cyber Risk Analytics Center.