
11/6/2021 Compliance

Michael Petrov provided his recommendation for a risk assessment methodology: CIS RAM 2.0

POSITION ON RISK

  1. Initial risk is 100% (call it 99.9%). I argue that if you deploy a system without any controls and connect it to the internet, it will be hacked multiple times within a year.
  2. Risk = 100% - control mitigation + destabilizing events (zero-days, new vulnerabilities). A minimal sketch of this position appears after this list.
  3. We can calculate control mitigation, but we cannot predict those destabilizing events. That is the nature of the business, and it is why we cannot precisely measure risk. So we don't have to; we can just assess it.
  4. Mitigation is NOT lowering the impact but lowering the LIKELIHOOD. When there is a cybersecurity breach, it is easier to predict the maximum impact, which depends on the time of detection (controls minus destabilizing events). I would argue that within some short time, the impact could be the entire value of the business.
  5. My biggest problem with current frameworks is that they all concentrate on the initial assessment, not on the continuous process.
  6. Risk has to be re-assessed yearly, and the methodology matters more for re-assessment than for the initial assessment.
  7. Incidents should be used to adjust risks, since they are real-life data for statistical analysis for the given client. Incidents should be used to re-assess likelihood, and each incident must be bound to a risk and affect a KPI (a second sketch after this list illustrates this loop).
  8. The methodology should suggest KPI assignment.
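
Here is a minimal Python sketch of points 1-4, under my own assumptions: the names (residual_risk, control_mitigation, destabilizing_events) are illustrative placeholders, not CIS RAM 2.0 terminology, and the clamping to [0, 1] is my reading of "likelihood".

```python
# Minimal sketch of the risk position above (illustrative only; the
# function and variable names are placeholders, not CIS RAM 2.0 terms).

def residual_risk(control_mitigation: float, destabilizing_events: float,
                  baseline_risk: float = 1.0) -> float:
    """Risk = baseline (~100%) - control mitigation + destabilizing events.

    control_mitigation   : estimated likelihood reduction from deployed controls (0..1)
    destabilizing_events : unpredictable additions (zero-days, new vulnerabilities),
                           which by nature can only be assessed, not calculated
    """
    risk = baseline_risk - control_mitigation + destabilizing_events
    # Likelihood stays in [0, 1]; mitigation lowers likelihood, not impact.
    return max(0.0, min(1.0, risk))


# Example: controls are assessed to mitigate 85% of the baseline likelihood,
# but newly discovered zero-days are assessed to add 10% back.
print(residual_risk(control_mitigation=0.85, destabilizing_events=0.10))  # 0.25
```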
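Points 6-8 describe a continuous loop rather than a one-time assessment. The sketch below shows one way such a loop could look, assuming a simple in-memory data model; the Risk and Incident classes, the incident-count KPI, and the blending rule in reassess_likelihood are hypothetical choices of mine, not prescribed by any framework.

```python
# Rough sketch of incident-driven re-assessment (points 6-8). The data
# model and the blending rule are assumptions for illustration only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Incident:
    risk_id: str        # every incident must be bound to a risk
    year: int


@dataclass
class Risk:
    risk_id: str
    assessed_likelihood: float      # current assessed annual likelihood, 0..1
    kpi_incident_count: int = 0     # KPI affected by each bound incident
    incidents: List[Incident] = field(default_factory=list)

    def record_incident(self, incident: Incident) -> None:
        """Bind an incident to this risk and update its KPI."""
        assert incident.risk_id == self.risk_id
        self.incidents.append(incident)
        self.kpi_incident_count += 1

    def reassess_likelihood(self, year: int, weight: float = 0.5) -> float:
        """Yearly re-assessment: blend the assessed likelihood with the
        observed incident frequency for this client (real-life data)."""
        observed = min(1.0, sum(1 for i in self.incidents if i.year == year))
        self.assessed_likelihood = (
            (1 - weight) * self.assessed_likelihood + weight * observed
        )
        return self.assessed_likelihood


# Example: one real incident in 2021 pulls the assessed likelihood upward.
r = Risk(risk_id="R-001", assessed_likelihood=0.25)
r.record_incident(Incident(risk_id="R-001", year=2021))
print(r.reassess_likelihood(year=2021))  # 0.625
```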

This is the big picture. All I see today are ISO-style risk analyses that are initially built from industry-wide risks. The mentality needs to change, and I don't know whether that is too big a shift from the current approach.