We recently assembled a panel of experts to explore new trends and solutions in anticipating business risk, and to help organizations build better-managed processes and structures for steering clear of identifiable weaknesses.
The goal: to help enterprises deliver better risk assessment, and, one hopes, better defenses, in today's challenging cybersecurity climate and against other looming business threats. By predicting risks and potential losses accurately, IT organizations can set priorities thoughtfully, gain agility, and thereby repeatably reduce the odds of loss.
The panel consists of Jack Freund, Information Security Risk Assessment Manager at TIAA-CREF; Jack Jones, Principal at CXOWARE and an inventor of the FAIR risk analysis framework; and Jim Hietala, Vice President, Security, at The Open Group. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.
This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference to be held beginning July 15 in Philadelphia. The conference is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]
Here are some excerpts:
Freund: We're entering a phase where there is going to be increased regulatory oversight over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk management functions to answer the question of whether we're handling the right things.
Without quantifying risk, you're going to have a very hard time saying to your board of directors that you're handling the right things the way a reasonable company should.
As those regulators start to see and compare among other companies, they'll find that these companies over "here" are doing risk quantification, and you're not. You're putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.
Gardner: So you're saying that the market itself hasn’t been enough to drive this, and that regulation is required?
Freund: It’s probably a stronger driver than market forces at this point. But especially in information security, if you're not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.
Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I've spent time training people at two regulatory agencies on FAIR. So they're becoming more aware of these quantitative methods, and their level of interest is rising.
Hietala: Certainly, in the cybersecurity world in the past six or nine months, we've seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.
With what's happening, you've seen the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can be done to increase the level of cybersecurity throughout the country's critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.
Gardner: Please give us a high-level overview of FAIR, also known as Factor Analysis of Information Risk.
Jones: First and foremost, FAIR is a model for what risk is and how it works. It’s a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively in dollars and cents.
Risk quantification
You see a lot of “risk quantification” based on ordinal scales -- 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there's no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.
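To make the contrast with ordinal scales concrete, here is a minimal sketch, in Python, of the kind of quantitative analysis FAIR enables. It runs a simple Monte Carlo simulation over estimated ranges for threat event frequency, vulnerability, and loss magnitude to produce annualized loss exposure in dollars. The factor names follow FAIR's taxonomy, but the ranges and the simulation code are invented for illustration; this is not the FAIR standard's own calculation engine.

```python
import random
import statistics

def draw(low, mode, high):
    """Sample a triangular distribution given (low, most likely, high)."""
    return random.triangular(low, high, mode)  # random's argument order is (low, high, mode)

# Illustrative calibrated estimates -- invented numbers, not from any real analysis.
TEF = (2, 6, 12)                     # threat events per year: (low, most likely, high)
VULN = (0.10, 0.25, 0.50)            # probability a threat event becomes a loss event
LOSS = (50_000, 200_000, 1_000_000)  # dollars lost per loss event

def simulate_year():
    """One simulated year of loss exposure derived from the factor estimates."""
    total = 0.0
    for _ in range(round(draw(*TEF))):
        if random.random() < draw(*VULN):  # this threat event becomes a loss event
            total += draw(*LOSS)
    return total

losses = sorted(simulate_year() for _ in range(10_000))
print(f"Mean annualized loss exposure: ${statistics.mean(losses):,.0f}")
print(f"90th percentile loss: ${losses[int(0.9 * len(losses))]:,.0f}")
```

Because the output is a distribution in dollars rather than a 1-to-5 score, it can be defended mathematically and compared directly against the cost of controls.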
For example, one organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way toward a better risk position. But it was a very expensive proposition, and they didn't have money in the IT or information security budget for it.
So we did a current-state analysis using FAIR to determine how much loss exposure they had on an annualized basis. Then we said, "If you plug this solution into place, given how it affects the frequency and magnitude of loss that you'd expect to experience, here's what your new annualized loss exposure would be." It turned out to be a multimillion-dollar reduction in annualized loss exposure for a cost of a few hundred thousand dollars.
When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.
If they had gone to executive management saying, "Well, we've got high risk, and if we buy this set of stuff we'll have low or medium risk," it would have been a much less convincing and understandable business case for the executives. There's reason to expect it would have been challenging to get that sort of funding, given how tight their corporate budgets were. FAIR can be incredibly effective in those business cases.
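The arithmetic behind that kind of business case is simple once loss exposure is expressed in dollars. The figures below are hypothetical, chosen only to mirror the shape of the scenario Jones describes, not the client's actual numbers:

```python
# Hypothetical figures echoing the scenario above -- invented for illustration.
current_ale = 4_000_000  # current annualized loss exposure, from the FAIR analysis ($)
future_ale = 1_200_000   # projected exposure with the proposed controls in place ($)
control_cost = 300_000   # cost of the technology and process solution ($)

risk_reduction = current_ale - future_ale
print(f"Risk reduction: ${risk_reduction:,}")                   # $2,800,000
print(f"Net benefit:    ${risk_reduction - control_cost:,}")    # $2,500,000
print(f"Benefit/cost:   {risk_reduction / control_cost:.1f}x")  # 9.3x
```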
Gardner: There's lots going on in the IT world. Perhaps IT's very nature, the roles and responsibilities, are shifting. Is doing such risk assessment and management becoming part and parcel of IT's core competency, and is that a fairly big departure from the past?
Hietala: It's becoming kind of a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like and how do we measure them?
It's the same for things like mobile computing. You really have to look at the risks of folks carrying tablets and smartphones and understand the risks associated with them, and the same goes for big data. For any of these large-scale changes to our IT infrastructure, you've got to understand what it means from a security and risk standpoint.
Freund: We have to find a way to better embed risk assessment [into businesses]. It's really just a way to inform decision-making as we adapt all of these technological changes to improve market position and make ourselves more competitive. That's important.
Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.
Gardner: Jack Jones, how do you come down on IT's shifting role in risk assessment? It's now their responsibility; are they embracing that, or perhaps wishing it away?
Jones: Some of them would certainly like to wish it away. I don't think IT’s role in this idea for risk assessment and such has really changed. What is changing is the level of visibility and interest within the organization, the business side of the organization, in the IT risk position.
Board-level interest
Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn't happen. Now, you're getting a lot more board-level interest in IT risk, and with that visibility comes responsibility, but also a certain amount of danger. If they're doing it really badly, if they're incredibly immature in how they approach risk, they're going to look pretty foolish in front of the board. Unfortunately, I've seen that play out. It's never pretty, and it's never good news for the IT folks. So they're realizing that they need to come up to speed from a risk perspective, so that they won't look fools when they're in front of these executives.
Boards are used to seeing quantitative measures of opportunities and of operational risks of various kinds. If IT comes to the table with a red-yellow-green chart, the board is left to wonder, first, how to interpret it, and second, whether these guys really get it. I'm not sure the role has changed, but the responsibilities and level of expectations are changing.
Gardner: Is there a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis?
Are we seeing the dots being connected in these large organizations; that they can take more of what they garner from big data and business intelligence (BI) and apply that to these risk assessment activities? Is that happening yet?
Jones: It’s just beginning to. It’s very embryonic, and there are only probably a couple of organizations out there that I would argue are doing that with any sort of effectiveness. Imagine that -- they’re both using FAIR.
But when you think about BI or any sort of analytics, there are really two halves to the equation: data and models. You can have all the data in the world, but if your models stink, you can't be effective. And, of course, vice versa: if you've got a great model and zero data, you've got challenges there as well.
Being able to combine the two, good data and effective models, puts you in a much better place. As an industry, we aren't there yet. We've got some really interesting things going on, so there's a lot of potential, but people have to leverage that data effectively and make sure they're using a model that makes sense.
There are some models out there that, frankly, are so badly broken that all the data in the world isn't going to help you; the models will grossly misinform you. So people have to be careful, because data is great, but if you're applying it to a bad model, you're in trouble.
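As a sketch of what combining the two halves might look like: the "data" below is a hypothetical set of internal loss records, and the "model" is the same simple frequency-times-magnitude structure as in the earlier sketch, with magnitudes resampled from the observed losses rather than from expert ranges. Everything here is invented for illustration.

```python
import random
import statistics

# The "data" half: hypothetical internal loss records, in dollars.
observed_losses = [12_000, 30_000, 48_000, 75_000, 90_000, 150_000, 220_000]

# The "model" half: an explicit structure for how annual losses arise.
# Loss magnitudes are resampled from the records (a crude empirical
# bootstrap), so better data improves the output -- but only via the model.
def annual_loss(event_rate=4):
    events = round(random.triangular(0, 2 * event_rate, event_rate))
    return sum(random.choice(observed_losses) for _ in range(events))

simulations = [annual_loss() for _ in range(10_000)]
print(f"Modeled mean annual loss: ${statistics.mean(simulations):,.0f}")
```

Feed the same data into a broken model, or a good model with no data, and the output degrades either way; the two halves only pay off together.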
Gardner: We're coming up very rapidly on The Open Group Conference, beginning July 15. What should we expect? [ Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]
Jones: We're offering FAIR training as part of the conference. It's a two-day session, with an opportunity afterwards to take the certification exam.
If history is any indication, people will go through the training and we'll get a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They're also surprised that it's not, as one friend of mine calls it, "rocket surgery." It's relatively straightforward and intuitive stuff. It's just that, as a profession, we haven't had this framework for reference before, or some of the methods that we apply to make it practical and defensible.
So we've gotten great feedback in the past, and I think people will be pleasantly surprised at what they experience.
Freund: One of the things I always say about FAIR training is it's a real red pill-blue pill moment -- in reference to the old Matrix movies. I took FAIR training several years ago with Jack. I always tease Jack that it's ruined me for other risk assessment methods. Once you learn how to do it right, it's very obvious which are the wrong methods and why you can't use them to assess risk and why it's problematic.
It's really great and valuable training, and now I use it every day. It really does open your eyes to the problems in the risk-assessment portion of IT today, and it gives you very practical and actionable things to do to fix them and provide value to your organization.
Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?
Hietala: At The Open Group we originally published a risk taxonomy standard based on FAIR four years ago. Over time, we've seen greater adoption by large companies and we've also seen the need to extend what we're doing there. So we're updating the risk taxonomy standard, and the new version of that should be published by the end of this summer.
We also saw within the industry the need for a certification program for risk analysts trained in quantitative risk assessment using FAIR. We're working on that program, and we'll be talking more about it in Philadelphia.
Along the way, as we were building the certification program, we realized there was a missing piece in terms of the body of knowledge. So we created a second standard as a companion to the taxonomy, called the Risk Analysis Standard, which looks more at the process issues and at how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and combined, the two standards will form the body of knowledge that we'll be testing against in the certification program when it goes live later this year.
Gardner: For those organizations that are looking to get started, in addition to attending The Open Group Conference or watching some of the plenary sessions online, what tips do you have? Are there some basic building blocks that should be in place or ways in which to get the ball rolling when it comes to a better risk analysis?
Freund: A strong personality matters in this. Organizations have to have some sort of evangelist who cares enough about it to drive it through to completion, and to put a stake in the ground: "Here is where we're going to start, and here is the path we're going to go down."
Strong commitment
When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.
I spend a lot of my time integrating FAIR with other methodologies. One of the points I keep making is that what we are doing is implementing a discipline around how we choose our risk rankings. That's one of the great things about FAIR: it's universally compatible with other assessment methodologies, programs, standards, and legislation, which allows you to be consistent and precise about how you're connecting to everything else your organization cares about.
Concerns around operational-risk integration are important as well. Driving that through in an organization has a lot to do with finding sponsorship and then building the program out to completion. But absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up.
You can have groups of people who are FAIR-trained building risk analyses, or simply picking ranges -- 1, 2, 3, 4, or high, medium, low. But then, when questioned, you have the ability to say, "We think this is a medium, because it met the frequency and magnitude criteria that we've established using FAIR."
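A minimal sketch of what backing a qualitative ranking with quantitative criteria could look like is below. The dollar thresholds are invented; a real program would set its own bands:

```python
# Hypothetical banding criteria in annualized loss exposure ($).
# The point is that "medium" becomes defensible: it traces back to
# explicit frequency and magnitude criteria rather than gut feel.
BANDS = [
    (1_000_000, "high"),
    (250_000, "medium"),
    (0, "low"),
]

def qualitative_rating(annualized_loss_exposure):
    """Translate a FAIR-derived dollar figure into a ranked label."""
    for threshold, label in BANDS:
        if annualized_loss_exposure >= threshold:
            return label

print(qualitative_rating(400_000))  # -> medium
```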
Different organizations culturally are going to have different ways to implement and to structure quantitative risk analysis. In the end it's an interesting and reasonable path to get to risk utopia.
Jones: A good place to start is with the materials The Open Group has made available on the risk taxonomy, along with the soon-to-be-published risk analysis standard.
Another source I recommend to everybody I talk to about these sorts of things is the book How to Measure Anything by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman's terms some very important concepts and approaches that are tremendously helpful. That's an important resource for people to consider, too.
Within organizations, some will have a relatively mature enterprise risk-management program at the corporate level, outside of IT. It can be hit-and-miss, but there can be some very good resources there in terms of people and processes the organization has already adopted. You have to be careful, though, because some of those enterprise risk-management programs, even though they've been in place for years and one would therefore expect them to have matured, have only dug a really deep ditch of bad practices and misconceptions.
So it's worth having the conversation with those folks to gauge how clueful they are, but don't assume that just because they've been in place for a while, or have some specific title, they really understand risk at that level.