Tag Archives: Operational risk

How to assess risk – Part I: Asking the right questions

It seems to me that self-assessment of risk is a difficult process to understand and execute, primarily because the employees who are asked to assess the risk in their business process (a) don't really understand the notion of risk and (b) don't really care. Let's face it: risk is difficult to understand, since it is a function of many different, often interdependent variables.

So the question I am going to pose today is: what is the best way to do a risk assessment?

And the answer is: start by asking the right questions.

Let's say that you have the job of collecting data for a risk assessment in your business unit. You sit down with the security and compliance manager and schedule meetings with people in the unit. You figure you're going to be less than thrilled with the quality of information you receive, and that the employees may not be excited by your standard checklist questions. However, you know that the whistleblower instinct is innate in all of us, and it's worth trying to get to first base.

Drop the compliance checklist and use an attack modeling approach instead.

Explain the notion of valuable company assets, vulnerabilities, threats that exploit vulnerabilities, and security countermeasures. It takes a few minutes, and every employee I've ever met grokked the concept immediately. For starters, ask seven questions (have you noticed how process improvement methodologies always seem to have seven steps?):

  1. What is the single most important asset in your job?
  2. What do you think is the single biggest threat to that asset?
  3. How do you think attackers cause damage to the asset?
  4. Can you give me one example of a security exploit (on condition of non-disclosure)?
  5. If you could give the risk and compliance manager one suggestion, what would it be?
  6. If you had to give the CEO one suggestion, what would it be?
  7. If you had to give President Obama one suggestion on how to reduce the threat of global terror, what would it be?
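
If you want to keep the answers in a form you can later feed into a threat model, something as simple as the sketch below is enough. This is only an illustration; the `Interview` structure and its field names are my own invention, not part of any formal methodology or of the PTA tool.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Interview:
    """One employee's answers to the seven questions, mapped to threat-model terms."""
    employee_role: str
    key_asset: str                 # Q1: the single most important asset
    biggest_threat: str            # Q2: the biggest threat to that asset
    attack_method: str             # Q3: how attackers cause damage to the asset
    exploit_example: str           # Q4: an example exploit (treat as confidential)
    suggestion_for_risk_mgr: str   # Q5
    suggestion_for_ceo: str        # Q6
    suggestion_for_policy: str     # Q7

def assets_at_risk(interviews: List[Interview]) -> Dict[str, int]:
    """Count how many interviewees named each asset - a quick way to see
    where the business unit thinks the value (and the exposure) is."""
    counts: Dict[str, int] = {}
    for i in interviews:
        counts[i.key_asset] = counts.get(i.key_asset, 0) + 1
    return counts

if __name__ == "__main__":
    answers = [
        Interview("billing clerk", "customer credit card data", "phishing",
                  "stolen credentials on the billing server", "redacted",
                  "stop sending passwords by email", "budget for awareness training",
                  "share threat data between agencies"),
    ]
    print(assets_at_risk(answers))
```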

The problem of security information sharing

(Image: Hermann von Helmholtz)

In a previous post, Sharing security information, I suggested that fragmentation of knowledge is a root cause of security breaches.

I was thinking about the problem of sharing data loss information this past week, and I realized that we are saturated with solutions, technologies, policies, security frameworks and security standards: COBIT, ISO27001 and so on.

The German physicist Helmholtz identified three stages of creativity: saturation, incubation and illumination.   We appear to be in the saturation stage right now.

Henri Poincaré identified a fourth step that follows the other three: verification, which is putting a solution into concrete form and checking it for errors and usefulness.

In the early 1960s, the American psychologist Jacob Getzels proposed that a preliminary stage of creativity involves formulating a problem. So let's start by formulating the problem of security information sharing.

People and their employers are unwilling to discuss the details of security events that happened, their security vulnerabilities, the damage in dollars that was actually caused, how the events were discovered, how the threats that exploited the vulnerabilities were mitigated and, most importantly, how well their current security products perform.

In our threat analysis work, we run into these problems daily. We offer an excellent free threat modeling tool from our colleagues at PTA Technologies called PTA (Practical Threat Analysis); I think we have over 15,000 downloads. Users sometimes have questions that require taking a closer look at their threat model, but it almost never happens because of the fear of disclosure. On one occasion, a user shared his threat model after obfuscating the data (you can download the software here: free risk assessment software).
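
Obfuscating a model before sharing it is not complicated. The sketch below shows one way to do it, assuming the model is just a dictionary of named assets with dollar values; it is not the PTA file format, only an illustration of the idea that you can strip identity while keeping the numbers that make the model worth sharing.

```python
import hashlib

def obfuscate_names(model: dict, salt: str) -> dict:
    """Replace identifying asset names with salted hashes, keeping the
    numeric values (asset value, loss estimates) that make the model useful."""
    obfuscated = {}
    for name, value in model.items():
        token = hashlib.sha256((salt + name).encode()).hexdigest()[:10]
        obfuscated[f"asset-{token}"] = value
    return obfuscated

# Example: the dollar values stay intact, the names become meaningless to outsiders.
model = {"customer database": 2_000_000, "billing server": 350_000}
print(obfuscate_names(model, salt="keep-this-secret"))
```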

Here is a possible solution to the problem we just formulated (a rough sketch of what this might look like in practice follows the list):

  • Define a language for describing a security event –  having a canonical language to describe things is a basic requirement for sharing information between people.
  • Build models of attackers, vulnerabilities, assets under attack and security countermeasures in order to describe loss events using the common language.
  • Enable people to build, maintain and share models anonymously. What is important is not the identity of the company who had the loss event, but the details of the model.
  • Use the models to measure the loss impact and the effectiveness of their security countermeasures in dollars. This provides a security metric that will enable people to look at models and compare ‘apples’ to ‘apples’ without involving marketing factors such as product features and distribution channels.
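
To make the last point concrete, here is a minimal sketch of what a shared loss-event record and a dollar-denominated metric might look like. The field names and the annualized-loss-expectancy style calculation are my assumptions for illustration, not a published standard or an existing tool's schema.

```python
from dataclasses import dataclass

@dataclass
class LossEvent:
    """A canonical, anonymous description of a single loss event."""
    asset: str                # what was attacked (an obfuscated name is fine)
    threat: str               # e.g. "phishing", "stolen laptop"
    vulnerability: str        # the weakness that was exploited
    countermeasure: str       # the control that was (or was not) in place
    single_loss_usd: float    # damage in dollars for one occurrence
    annual_frequency: float   # estimated occurrences per year

def annual_loss_expectancy(event: LossEvent) -> float:
    """ALE = single loss expectancy x annualized rate of occurrence.
    A crude but comparable dollar metric for ranking events and controls."""
    return event.single_loss_usd * event.annual_frequency

event = LossEvent("asset-3f9c1a2b7d", "phishing", "no MFA on webmail",
                  "security awareness training", 120_000, 0.5)
print(f"ALE: ${annual_loss_expectancy(event):,.0f} per year")
```

A metric like this lets two companies compare the same kind of loss event in dollars, without exposing who they are or which vendor's product was in place.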

People should be very frightened of the FSA

Fear is a good deterrent for individuals, but will it work for large corporations? I don't know, but the UK FSA certainly believes in fear. Financial Services Authority (FSA) chief executive Hector Sants pledged in a confrontational speech last week that the UK regulator would be far more "intrusive and direct" in its supervision of UK firms from now on. See the article in Op Risk and Compliance for more.
