Note: This page is part of the us-cert.gov archive.

Archived Content

In an effort to keep CISA.gov current, the archive contains outdated information that may not reflect current policy or programs.

Information Sharing and Analysis Organization (ISAO) Workshop, July 30, 2015, Part 2 of 3

Description

The Workshop was a one-day event held at San Jose State University. Information obtained through interaction with Government and Industry colleagues in these Workshops was rolled up in a Workshop Readout and presented to the (future) selected Standards Organization, which will use the work toward the development of ISAO Standards.


Transcript

Executive Panel

AZZAR NADVI:  All right.  If you can all start finding your seats, we're going to go ahead and get started here.

All right.  So this portion of the workshop will be the Executive Panel, and I'd like to introduce Mr. Greg Schaffer, who is the CEO and Founder of First72 Cyber, a security firm created to help enterprises prepare for, respond to, and manage the risk of cyber events, with a special focus on the rapidly emerging area of third-party risk.  Mr. Schaffer is responsible for all aspects of the enterprise's development and execution, including product development, infrastructure build, service delivery, quality assurance, and partnerships.  So if you'd all give a welcome to Mr. Greg Schaffer and the Executive Panelists.

[Applause.]

GREG SCHAFFER:  Thank you so much.  Pleasure to be here.  I come to this space from a variety of perspectives.  My last role with government was as the Assistant Secretary for Cybersecurity and Communications at DHS.  I've also been the CSO at a number of large organizations before starting the company.

We have a great panel for you today, and I'm going to set the verbosity to low because I think you know these folks.  And their full bios are in your program, but I'll do some quick introductions.

David Turetsky is a partner with the Akin Gump firm.  He co-leads their cybersecurity practice.  He comes to us with experience from the FCC as the chief of the Public Safety and Homeland Security Bureau.  He's got a long background in antitrust and other legal issues, but he's doing a lot in the cybersecurity space these days.

We then have Mike Hamilton, who is Cybersecurity Policy Advisor to Washington State and the office of the CIO.  He has served in a variety of roles for the state.  He was the former Chief Information Security Officer.

Rick Howard is the Chief Security Officer for Palo Alto Networks, and he's been a CSO for other large organizations, comes to this with a background of being the commander of the U.S. Army's Computer Emergency Response Team and has obviously worked a variety of issues in the space.

And then John Woods, who is a partner with Baker & McKenzie firm, he's done extensive work in global cybercrime investigations, handling a number of large breach intrusions, including a number of the large cash-out events, and he was even asked to speak at Interpol to about 28 global law enforcement agencies around issues involving how law enforcement should interface with the private sector for large intrusion matters.

So you've got a panel that has really dealt with a broad range of issues related to information sharing in a variety of different contexts, and I think that they will be able to bring a lot of good discussion to the topics that we have at hand.

And we've got a number of questions that the folks at DHS and otherwise have really been thinking about in the context of how to continue to move this discussion forward, and so we want to cover those this morning and have this discussion go wherever it's going to go in that regard.  And then we'll have some time at the end of that discussion for questions and answers, so we've reserved some time for that.

So the first piece of the puzzle really goes to this developing area of ISAOs: is the new executive order, in creating and expanding the ability for a number of different kinds of organizations to get involved in information sharing, going to have an impact on the existing ISAOs and ISACs in a way that is positive or negative?  What's the intent?  How does this really impact the existing structures?  In particular, does the panel view the growth in ISAOs in this manner as something to be concerned about?  Is it something that is an enabling factor?  Where does the panel think that this introduction of new entities is likely to take us?

And so I'm going to start, John, with you.  You've had a lot of opportunities to see information sharing at a lot of large enterprises.  What do you think the impact is likely to be?

JOHN WOODS:  I don't know if you can hear me or not, but I'll speak up.  I think one of the main things, having read the comments of those that submitted comments to DHS and then just from personal experience, one of the key things in the development of the ISAOs is that you can't degrade the efficacy of the ISACs.  I think, universally, the comments that have been submitted to DHS make that point.  I think that point came out in the PwC study that Neal referenced earlier as well.

And the question that we have to really answer here is how can we integrate the ISAOs around community of interest or other enabling features to make sure that we're enhancing the effectiveness of the ISACs, not degrading their capability?

We've talked to a number of clients.  We deal generally with general counsels but also during incidents, and then in the aftermath of incidents, how to improve.  Recently, since the ISAO concept has come on board, I think one of the things that we've really seen is a desire to potentially use the ISAOs almost as threat intelligence analysis centers for midsize businesses.  We get brought in not only with very large corporations but midsize and smaller banks that really don't have the capabilities internally or the budgets to engage in, for instance, analysis of threat intelligence that's coming out of the ISACs.

So one of the questions and I think one of the areas where at least our clients are wanting to talk through is can we create ISAOs that serve as an interstitial layer between the government feeding the ISAC and then the ISACs coming back into an ISAO that really enables midsize and small businesses to have a better understanding of what particular pieces of intelligence mean to them.  So I think there are models that will enhance the effectiveness of the ISACs, but then there is the possibility, as contemplated at some of the other meetings, around for-profit or other models that may run in parallel, where economic incentives align to create potentially some competitive friction with the ISACs, and I think we have to watch for that.  We don't want to sort of disable the really great work that's been done across the ISACs.

One proposal, for instance, that we've heard discussed, not in a privileged way, but insurance is being viewed by many participants in this debate as a potential significant risk mitigant, and one of the things that has at least been talked about with a couple of insurers is, you know, is there an ISAO concept around insurance companies trying to protect their policyholders, and could an ISAO be stood up and aligned with the economic incentive that an insurance company has?  That could put some friction with the FS-ISAC, but I think there are ways to potentially have the ISAOs serve an independent role because of economic interests while at the same time not competing with the ISACs.

RICK HOWARD:  I don't think we should protect the ISACs.  Right?  This is a marketplace.  Right?  Information is valuable, right, and ISACs are really valuable right now, especially two or three of the big ones are really valuable.  And the way ISAOs can help with this is to make it more competitive.  If ISAC membership starts to go down because they're going over to ISAOs, that means that ISAC needs to deliver better service, I think.  Right?  So I don't think that's a—I don't think we should save the ISACs.

JOHN WOODS:  I'm not—but I think there's a challenge, right, between—there are things that are working, and I think when you review the comments, we've got to—you know, the old adage, let's not harm the things that are already working, needs to be—because not a lot is working, and so I think we need to be cognizant of the fact that while we don't want to create some sort of monopoly—I'll let David talk about the antitrust implications of that, but I think we need to recognize that—and a number of the commentators, both the ISAC members themselves, but a lot of the other independent commentators in what was submitted to DHS have really highlighted the fact that we have some things that are working in certain sectors, and we need to be cognizant that there will be new market participants.  And maybe ultimately the ISACs go away, but I'm not sure that in the short term, that's going to be—sort of reduce the overall threat environment by degrading, for instance, the FS-ISAC.

MICHAEL HAMILTON:  Yeah.  To me, it's a function, I think—it's a question of scale.  So I'm a bit of a one-trick pony.  Right?  I'm about the public sector, so we monitor a number of public sector organizations in real time and spread that information around through the fusion center.  So the ISAC that we are engaged with is the Multi-State ISAC, and that's a very high-functioning organization.

I think to a certain extent, what we're trying to do is we're trying to help scale what they do, and I believe that to a certain extent, it's going to be up to the standards organization to define what that interoperability looks like.

I agree with you that some of the ISACs might not keep up, and things might become more regionalized than is appropriate for some of the ISACs, but I do think that there is a very clear way to help increase the eyes and ears and protect the investment that's already been made in those ISACs and help them do their job better.

GREG SCHAFFER:  So John posited a structure, perhaps, where the ISACs are feeding the ISAOs and the ISAOs were feeding a community of interest.  We heard from HITRUST about their direct involvement with DHS and HHS.  Are those alternate views?  Are they different models, or are we talking about maybe having both of those models operative at the same time?  What's the panel's thinking about whether the ISACs should be the ISAOs or the ISAOs should be fed directly from some of the government resources?

MICHAEL HAMILTON:  I'm going to defer to the standards organization, to tell you the truth.  I mean, I don't want to opine on that.  I want to do things that are consistent with a standard because I think interoperability is really where we're trying to get with this.  Even with a lot of diversity in the ISAO organizations, I believe that there's a way that we can share information in a very standard way, and what's going to fall out of that is going to fall out of that.

I do believe in what you say about market forces and whoever is going to win is going to win.  I don't think that win and lose is really the right way to position it.  I think we're—because we're all in the same mission space here and we're just going to try and do the best job we can, and the standards organization is probably going to have to take this up and decide that.  And that's not for me to decide.

RICK HOWARD:  I don't think there's going to be any one model that wins, and for us to prescribe the model at this point is kind of silly.  Right?  What's going to win is what works for the members of that organization, and if they like the model they subscribe to, then fine.  If they don't, they'll move to another model.  So, yeah, there are going to be lots of different ways this is going to pan out in the future, I think.

GREG SCHAFFER:  Fair enough.  So one of the things that PwC posited in the document, the white paper that they put out—and we talked about it a little bit this morning as well—was this notion of having different models that might be successful, member-driven models like the existing ISACs, some of the data-driven models, as well as problem driven, where you've got like the Conficker Working Group and some risk-driven models where folks come together leveraging specific technologies.  Are those the only four models?  Are there other ways that these could be organized, or is that really the universe that the panel can think of as being potentially effective?

RICK HOWARD:  So I got tagged to answer this one first.  I think trying to find the model before we figure out what the data is that we're going to share with each other is fraught with danger.  Right?  What we're talking about is an information sharing organization, and when we get to first principles here on information sharing and cybersecurity, what is it that we're trying to share with each other, so that we are more secure?  And in my mind, when I approach this problem as a CSO, I'm looking for adversary campaign information down the kill chain, so that I can put controls in my network to stop them from being successful.  That is the sharing group that I'm going to go look for, but that may not be the sharing group that some other CSO is looking for down the road.

I think these four models are good.  I couldn't think of any others while we were talking about them, but I think it is wide open how it is going to go in the future.

MICHAEL HAMILTON:  I don't know if this is the complete taxonomy that we're going to end up with, but I agree with you.  I would rephrase that and say, to a large extent, it's going to be driven by why.  What's the outcome that we are trying to achieve, and what model makes more sense?  For us, it's looking for the doorknob twisting going on in public sector organizations that cut across the Homeland Security critical sectors, so traffic management, communication systems for law enforcement, energy delivery from public utilities, water purification, waste treatment.  That's the stuff that if it falls over, it's going to be a disaster.  When I get a letter from a credit card company, that's an annoyance, and so our why is really tied to critical infrastructure at a very local scale.  So that's what we're going to be interested in.

GREG SCHAFFER:  So what about operationalizing some of the information sharing that's happening in various places today?  So, for example, there's several large companies that have done things like CISO councils where you've got an enterprise that has a number of partners, clients, customers.  They have got the CISOs from those organizations engaged with them because they are all a community of interest.  They have got alignment because they are users of some of the technology of one entity, or they're otherwise dependent on one another in various ways.  Does an ISAO built around that common infrastructure make sense, and does anybody see someone taking the CISO councils and operationalizing them as a way of building an ISAO that might—and would that kind of approach speak to this question about long-term sustainment?  Does that model potentially have a greater opportunity to have people continue to stay in the game because they are interdependent and working together on a regular basis?

RICK HOWARD:  I would just talk about that one specifically, the CSO council.  One of my roles at Palo Alto is to go around the world and talk to CSOs, not about Palo Alto, per se, just about security.  Right?  And I've been doing it for about 2 years now, and I feel like I've got a master's degree in how people do security because everybody does it differently, and everybody has a good reason to how they do it differently.  And having that ability to get with your peers around a big steak dinner and having that conversation and not having to be attributed to you as you leave, that is a valuable service to do.

So if some ISAO comes along and says, "I'm going to provide that to our members," and that's what they want, that's a great model for drumming up membership, I think.

JOHN WOODS:  Greg, are you really talking about potentially ISAOs in certain verticals where there is a sort of commonality of infrastructure?  You as Palo Alto, Rick, could potentially start an ISAO for your key users where, as a value-added service, for your key customers or as part of the purchase of your product.  Is that sort of—is that what you're talking about Greg where there is almost a product-driven—you know, whether it's in the insurance industry, there are certain very large back-end claims processing engines, and so that might be an area where if everyone—if a thousand companies have the same claims processing, you build an ISAO around that because they're all going to have shared common vulnerabilities?

ATTENDEE:  And that's the question.  For example, in financial services, you've got a very large number of institutions that are leveraging the same banking core from various vendors.  That community of interest has common vulnerabilities.  They've got common tool sets that they leverage because there's certain things that work with certain of those offerings from the various vendors, and so there's a natural group of people that are all on a common platform.  Is that an ISAO in the making?

RICK HOWARD:  Let me jump in there because we have started a vendor sharing organization.  I'm going to talk about it this afternoon.  It's called the Cyber Threat Alliance.  It is the first time security vendors have gotten together to decide to share information with each other.  Right?  We are struggling what to call it.  We're not sure if it's an ISAO or something else.  That's why we're here today.  So come this afternoon, I can tell you all about it.  So, yes, we can do something like that around our communal community base.

ATTENDEE:  So does that mean if Symantec discovers a new threat, it's going to tell McAfee?

RICK HOWARD:  Yeah.

ATTENDEE:  Really?

RICK HOWARD:  Yeah.  It really is.  We've been doing it for about a year now.

ATTENDEE:  That's kind of cool.

RICK HOWARD:  And I'm not saying it was easy to get all that trust built, okay, but it was a lot of beer and a lot of dinner, convincing people that it was a good thing to do and keeping the CEOs in line with that idea.  But, yeah, we are absolutely doing that right now.

ATTENDEE:  That's the way the world needs to work, I think.

GREG SCHAFFER:  David, maybe you can talk about it.  I think there's some good antitrust language that came out of DOJ, and I know that's something you're personally—

DAVE TURETSKY:  Yeah.  Actually, I have a question about that.

RICK HOWARD:  Lawyers talking to CSOs knows—

[Laughter.]

DAVE TURETSKY:  So the vendors of these services are talking to one another—

RICK HOWARD:  Yeah.

DAVE TURETSKY:  —and exchanging that information, but are they going to share that without charge with the ISACs and ISAOs—

RICK HOWARD:  Yeah.  I think the difference—

DAVE TURETSKY:  —or is this aggregating a set of information that is going to make them more valuable and more necessary for others to pay for?

RICK HOWARD:   There's no cost to it, right?  The difference between the Cyber Threat Alliance and like an ISAC is whatever Symantec gives me, I dump it right into my product.  Whatever we give them, they dump it right into their product also.  Right?

DAVE TURETSKY:  That's amazing.

RICK HOWARD:  Yeah, it is amazing.  Right?  And there's eight members total right now, and we're going to talk about this, this afternoon.  I might not even have to do this panel this afternoon, okay, after this.

But I think there's a tipping point of security vendors, and I think that number is like 25 to 50, that at some point, every organization on the Internet will have at least one of us in their networks.  Right?  And all of us will have the same intelligence, okay, which is a pretty interesting idea.  It's a pretty big-area idea.

So I don't know if I answered your question.

DAVE TURETSKY:  Well, I think it's great at one level.

RICK HOWARD:  Yeah.

DAVE TURETSKY:  My question is—some of the dialogue at prior sessions has concerned whether information flows that come from vendors that people have paid for can be fed into the ISAC and what can be done with that—

RICK HOWARD:  Yeah.

DAVE TURETSKY:  —and what can be shared, and so my question is, will the feed of that, since the companies are getting smarter and better by sharing that information, is that going to still be proprietary and not be shared without cost with ISACs?

RICK HOWARD:  Yeah, that's a good question.  That is not our intent.  We have also adopted STIX and TAXII as a way to share information between the members.  Our intent is to just dump that out to a public Web server as soon as we possibly get organized.

DAVE TURETSKY:  I see, okay.

RICK HOWARD:  Right?  In the meantime, though, it's jumping right into all of our product set, so you're getting automatically updated with whatever we know across the board.  It's a pretty powerful idea.

DAVE TURETSKY:  Well, that's a terrific plus then.

RICK HOWARD:  Yeah.
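As an illustration of the STIX/TAXII-based sharing Rick Howard describes above, here is a minimal sketch of one minimized indicator.  It is written in the later STIX 2.x JSON style rather than the STIX 1.x XML in use at the time of the workshop, and every identifier, timestamp, and address in it is a placeholder rather than real data.

```python
import json

# Hypothetical, minimized indicator in the STIX 2.x style: adversary
# infrastructure only, with no victim or personal data attached.
# Field names follow the STIX 2.x Indicator object; all values are
# placeholders chosen for illustration.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": "indicator--00000000-0000-4000-8000-000000000001",   # placeholder UUID
    "created": "2015-07-30T00:00:00.000Z",
    "modified": "2015-07-30T00:00:00.000Z",
    "name": "Known command-and-control address (example)",
    "pattern": "[ipv4-addr:value = '203.0.113.7']",             # documentation-range IP
    "pattern_type": "stix",
    "valid_from": "2015-07-30T00:00:00Z",
    "labels": ["malicious-activity"],
}

# Each member publishes bundles like this over a TAXII feed; consuming
# members ingest them straight into their products, as described above.
bundle = {
    "type": "bundle",
    "id": "bundle--00000000-0000-4000-8000-000000000002",       # placeholder UUID
    "objects": [indicator],
}

print(json.dumps(bundle, indent=2))
```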

MICHAEL HAMILTON:  Yeah.  Getting back to your original questions, organizations—so there's sector-specific organizations.  There's vendor organizations.  There's regional organizations.  You know, is that an Information Sharing and Analysis Organization?  Well, yeah.  It's kind of 20th century, you know, three guys in a room talking about what happened last week, but this is now, I think, in everybody's mind.  To the extent that we know, we need to have these discussions.  These are very valuable, and we've been able to harness one in our region and specifically get it focused on their role in emergency response.  So if we have the big cyber meltdown, the emergency operations center does not really know how to handle these things yet, and so the construct of mutual aid where you go to another political jurisdiction and say, "We need to borrow resources," is not going to work because those jurisdictions don't have access to those practitioners.  Government can't afford these people.  So we have to reach out to the private sector.  So there's a very specific function that we've been able to get them tasked on and get into their charter to try and integrate with a number of things that we do in our region, one of which is real-time regional monitoring, but this readiness for response is a big part of the conversation we have.

GREG SCHAFFER:  So, essentially, a cyber EOC capability at the state level?

MICHAEL HAMILTON:  Yeah.  And, well, these are shark-filled waters.  There are issues of indemnification, credentialing, qualifications, I mean, things like this, and we're working through a lot of this.

Matt Modarelli, sitting over there, is in the emergency management department, sitting next to Agnes Kirk.  She is the CISO of the state of Washington, and these are the things that we work on, and we're feeding what we do back up to FEMA, and it's a two-way thing.  So I'm hoping to be able to share everything that we do because I think all of our regions, based on geography, need to be prepared for some things.

ATTENDEE:  Can we talk about—go ahead.

DAVE TURETSKY:  I was just going to say, which is to say that much as it is the approach of companies in the private sector, what you're describing is an overall disaster management approach with an appendix, if you will, for cyber.

MICHAEL HAMILTON:  Yeah.

DAVE TURETSKY:  And the preparation has to go to that.

And that's often what we talk to our clients about with respect to incident response plans as well as the testing and drilling and tabletops to ensure that they know how to deploy them effectively.  Often those cyber plans will be an appendix or an augmentation of an overall disaster response plan, and that's very much, when I was at the FCC, the way it was approached.  As Chief of the Public Safety and Homeland Security Bureau, we were concerned with resilience and reliability and disaster response to help the commercial sector respond after Sandy to challenges that they faced.  It would be the same if the outages, disruptions, and the like were a cyber-generated incident.

And we also oversaw the FCC's continuity of operations plan itself and were prepared for agency survivability.  So I think that model is common, not only on the government side, but also on the business side.

RICK HOWARD:  Well, that brings up something you said, too, about protection of the data you're passing around.  I want to have a very heated debate about that idea, okay, because the kneejerk reaction from everybody in this business is we have to protect our data, and we don't want to get sued later if we inadvertently give something bad out—and define what bad is.  Okay.

But let me tell you what I think we should be sharing.  Adversary information, not company information, not people information.  I'm telling you how that bad guy worms his way into your network, so that you can put controls into your network to stop him from doing that.

And I agree there's other things we could talk about in this information sharing, like context and motivation and maybe what kind of business it was, but I don't need that for everything, and I don't need it for most things.

JOHN WOODS:  Rick, I think what you're talking about, though, is exactly sort of the concerns that some folks from the privacy side have, but it's embedded in what you're saying.

RICK HOWARD:  Yeah.

JOHN WOODS:  Minimization to provide only that which is necessary in order to identify or protect against a threat actor is something that's sort of embedded in the government in a variety of different contexts, and I think what the concern is, that rather than dumping PCAP files that may have a bunch of embedded personally identifiable information in them into an ISAO, I think the principle of minimization that you've talked about here is something that from a privacy perspective is going to be very important in order to have that community, who are very concerned about that, particularly in providing it to the government, to assuage some of their concerns that this isn't just handing more information over to third parties because the handing over of that information itself to the ISAO creates risk for those individuals and consumers.

RICK HOWARD:  Yeah.  I love that word "minimization."  I'm going to steal that from now on.

ATTENDEE:  Yeah.  And we're setting standards.

RICK HOWARD:  Yeah, yeah.

MICHAEL HAMILTON:  I think the other consideration that comes up in the public sector is public disclosure, and there is a reluctance to handle anything that may be requested for public disclosure, and so we run into that a lot.  And we are really trying—when we minimize, it's source address, destination address, port, protocol, and flags.  That's it.  And we do the best we can with that and some log information, but, you know, we want to launder that out.

I agree that minimization down to exactly what we need to share adversarial or threat information—that should be it.
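A minimal sketch of the minimization Michael Hamilton describes: reduce each raw flow record to source address, destination address, port, protocol, and flags before it leaves the organization, and drop everything else.  The field names and the sample record below are hypothetical, not any ISAC's or ISAO's actual schema.

```python
# Fields the (hypothetical) sharing agreement allows out the door.
SHAREABLE_FIELDS = ("src_addr", "dst_addr", "dst_port", "protocol", "tcp_flags")

def minimize(flow_record: dict) -> dict:
    """Return only the agreed-upon threat fields from a raw flow record."""
    return {k: flow_record[k] for k in SHAREABLE_FIELDS if k in flow_record}

raw = {
    "src_addr": "198.51.100.23",   # documentation-range address, illustrative only
    "dst_addr": "203.0.113.7",
    "dst_port": 445,
    "protocol": "tcp",
    "tcp_flags": "S",
    "username": "jdoe",            # personally identifiable -- never shared
    "payload": b"...",             # raw content -- never shared
}

print(minimize(raw))   # only the five minimized fields survive
```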

RICK HOWARD:  And then if we get Congress to pass a law that throws an extra layer of protection on it just in case we do something stupid, so we don't all go to jail when we're trying to protect the country, then I think we'd be fine with that.

JOHN WOODS:  You know, to that point, I do think it's important.  The ISAO concept is actually something that was in the original DHS statute.  If you look around Sections 212, 214, there's a lot of law, and in the CFR, 6 CFR Part 29, there's already quite a bit of guidance around some of the things you just raised around protected critical infrastructure information that people, I'm not sure, have fully appreciated as part of what the ISAO could serve as a vehicle for.

We often are dealing with clients that are consumer facing and that get very concerned about litigation down the road.  I want to share.  I've got a live event, but I'm worried about sharing information and how that may come back at me in litigation.  And I think if you look hard at some of the existing regulation and statutes, there's some very helpful language that already exists that I'm not sure is fully appreciated that could be—and it has been helpful in talking to other general counsels or business leaders around, first of all, sharing facts is different than sharing legal advice.  And I think that there's some real benefit if people better understood the value that an ISAO may have under existing regulation statute because I'm not sure that's been fully internalized into this debate.

DAVE TURETSKY:  I'd like to just add one thing to what Rick said because I think it's important.  He was talking about being careful to ensure that information that may be personally identifiable or may have privacy content isn't shared, but also, I think that if an error is made in the course of trying to do that, that should not be some sort of terminal event or capital punishment for companies.

RICK HOWARD:  Right.

DAVE TURETSKY:  I think that's important, and I know it's a subject of dialogue between the privacy community and the business community in connection with the legislation that is being drafted.  And it's really very important too.  As we think about small and midsize businesses—and we, I think, all share a view that improving cybersecurity, sharing information, making it easier for them to benefit from shared information is important—they are going to be scared to death about a misstep if the process is so overly complex and the penalties are so great that the sharing of even a tiny bit erroneously of privacy information results in some kind of severe penalty.  And that's just going to have to be taken into account.

And I would just like to say in terms of the larger picture of balancing privacy and cybersecurity, if you will, I feel like I have no privacy anymore.  Anthem, I got the letter from that my information was hacked.  OPM, I got the letter from that, you know, my TS/SCI information and Lord knows what else, as a former government employee, was hacked.  There is no privacy without—

RICK HOWARD:  It's nice of you to be on both lists so they can cross-check you.  That's nice.

[Laughter.]

DAVE TURETSKY:  Yeah.  Well, I have a view of what else they may—

MICHAEL HAMILTON:  Well, your credit card is available for 5 bucks.

DAVE TURETSKY:  But I think it's important to look at this in perspective.  There is no privacy today without cybersecurity, and we need to take that into account as we strike this very important balance.  We need to be serious about all aspects of it, but if we wait until everybody thinks we've got it perfect, there is no privacy left at that point.

RICK HOWARD:  We were talking about that too, because when the ISACs first started back in '98, '99, whatever year it was, nobody, no commercial organization would admit they were breached.  Nobody would do that.  Right?  But now times have changed.  Part of the things my intel group does at Palo Alto is write the report for every public breach that happens from the news so the executives know what they're—I can't keep up with how many public breaches there are anymore.  Right?  So the stigma from getting breached right now is not what it was 15 years ago.

I agree we have to worry about getting sued because we let information go that we shouldn't have, but companies worrying about their brand reputation for getting breached, that's not as big of a deal as it used to be.

MICHAEL HAMILTON:  Yeah.  Well, I mean, part of the reason is that 47 states have the data breach reporting statutes, which really is a stick we've given ourselves to hit ourselves with.  Right?

[Laughter.]

MICHAEL HAMILTON:  But I think that one way we can help move this needle, the PCII, Protected Critical Infrastructure Information Program, is largely focused on printed materials, and as we collect data, as we monitor networks and we aggregate and we try to do those things, we have a data set that needs to be protected.  And I've tried to get that, our data set, which is many terabytes—and it's been operational since 2008—designated as PCII, and they said—and I quote—"Print it and send it to us."  Okay.

[Laughter.]

MICHAEL HAMILTON:  That's not going to work.  And they also said PCII cannot be used to insulate information from public disclosure, which is exactly what I want to do, and I don't really understand the reasoning there.  So if we can maybe work on that program a little bit, I think it's got some potential to help address this problem.

GREG SCHAFFER:  So a very interesting part of the conversation because, obviously, with the SB 1386 breach notice law and its progeny coming from California and then spreading to the rest of the world—

ATTENDEE:  Thanks, California.

GREG SCHAFFER:  —that has certainly changed the dynamic, and we do get a lot more information about breaches that have occurred, but we get information about a certain kind of breach and not necessarily other kinds of breach.  So the intellectual property theft or the, frankly, in some ways, more insidious breaches don't necessarily have to be disclosed under those laws, and so they may not.

What incentives are needed in order to get information shared through these new structures?  What are the—if we're standing up ISAOs, what are the incentives to get people standing them up?  What are the incentives needed to have people share with them?  Because we want them to share not just the disclosed breaches, but at least the threat information associated with the ones that don't necessarily have to be disclosed because those may be the ones that are, indeed, even more insidious.  So where do we have to go with that?  David?

DAVE TURETSKY:  I think the key to this is a value proposition for businesses.  They have to think that the benefits from cybersecurity information sharing outweigh the costs, and the benefits are challenging because some of them are more speculative than the hard out-of-pocket costs in terms of time and resources and the like.

There are things, I think, government can do.  One, which I think is recognized by the President's executive order, is to make the government's information available, and I think the government is seeking to do a better job of that so that not only is the information from other companies available, but whatever information the government has, it becomes available.

I think another piece of what needs to be done is addressing the legal underbrush that we've heard some about.  That's part of why the legislation that's before the Congress is important: to provide the liability protection and other shields that that legislation, which I hope will end up being passed, will provide.  I think that will provide incentives and in fact lower the costs, if you will, in that balance of costs and benefits.

The government, I think, can incentivize participation eventually through its own procurement efforts.  If ISAOs and ISACs become valuable, not only the government, but private companies may, as a matter of contract, ask their vendors to participate in them because it will increase the level of cyber health, if you will, and reduce the threats.

I think the government's funding of the standards organization is very helpful.  There may be other things that can be done in terms of government outreach and the like.

The government will also play, I think, a different role.  Some of it will be from the private sector.  Cyber insurers, I think, to the extent that ISAOs and ISACs produce real value, will lower their rates for participants, I think, in those kinds of organizations.  As I said before, vendors and service providers will expect their vendors to be part of those, and if ISACs and ISAOs are valuable, you're going to see the legal system come into play in some of the breach litigations.  It will be—

ATTENDEE:  I'm detecting a theme in your comments.

DAVE TURETSKY:  You know, if ISAOs and ISACs are considered to be important parts of the business tool kit, what you'll end up seeing in the litigations about breach and standard of care is the question "Did you open your tool kit?"  And if you're not a member of an ISAO or an ISAC, that is going to make your defense more risky and more problematic.

The FTC and other organizations that look for reasonableness after the fact—because, after all, this is risk management.  There is no silver bullet.  There is no absolute protection, but if ISAOs and ISACs are valuable and information sharing is valuable, that is going to become an element of what reasonable care is, and so there will be that aspect too, I think, to incentivize participation.

And, finally, I do think we need to try to step back and think about this problem of small and midsize businesses.  I represent, on cybersecurity matters, an association comprised of hundreds of small and midsize businesses, and one of the things we did before my colleagues put in comments in an agency proceeding about cybersecurity is interview a cross section of the members about what they were encountering.  And what was interesting was every one of them, without exception, was encountering cyber attacks of one kind or another.  The smaller guys may have been—it may have been DDoS attacks.  It may have been ransomware, but they were all experiencing something.  Some of them addressed this through outside vendors, and that may be ultimately the way some of this is handled.  But we need to think about how to educate them and how to get them into this because, you know, the story, which is now a legend, of the HVAC vendor that was the link to the Target problem.  We're going to see more and more of that, and so we need to try and make it simple to share.  We need to educate, and there is a long way to go to get small, medium-size businesses into this.

GREG SCHAFFER:  So how far down do we think the direct participation will go?  Because even in the middle market, you have organizations that have not gone to full-blown cybersecurity teams the way you have in the Fortune 500.  As you get below that sort of middle market area, it becomes increasingly challenging to have dedicated resources for cybersecurity in IT.  You know, there are some IT resources that have that responsibility but are maybe not dedicated.  How far down in the economic ranks do we think there will be direct participation?  And if we hit a point where there's not, how do those people get the benefit of these organizations?  If this helps it go down a level, does it take it all the way down to the gas station on the corner or the grocery store that's a mom-and-pop?

RICK HOWARD:  I'm going to put my vendor hat on for a second.  Right?  Consider all the IT vendors and all the security vendors on the planet and how many potential sensors that we have on the planet.  Right?  So a way to incentivize the small to medium-size businesses, authorize the vendors to collect more intelligence from their devices, because we could collect lots of different things about how bad guys attack your network.  We don't do it because customers say, "I don't want you in my network collecting that stuff."  However, if you can convey to them that "We are collecting intelligence on bad guys, and we'll use it to better protect your own systems," that might be incentive enough for them to do that.  And that's where the small company could benefit.  They're going to have something in their network, whether it's an antivirus system or something bigger like a firewall, whatever it is.  They're going to have something protecting their network, and they can benefit from a collective information sharing program.

MICHAEL HAMILTON:  Yeah.  And so, in my experience, working with organizations from the city of Seattle, which is fairly large and mature, and the Port of Seattle and a bunch of maritime ports, all the way down to very small cities, I found that there is—because of the resource shortage and the unavailability of practitioners to these organizations, they're very willing to share information, especially in the public sector.  Right?

With all due respect to Microsoft, the city of Seattle is a monopoly.  Try to get somebody else to take your sewage away.  It's not going to work.

[Laughter.]

MICHAEL HAMILTON:  And so there is a very reduced barrier to sharing information because that competitive issue isn't even in scope, and so going all the way down to the smaller organizations, they're willing to share up.  They want to hear things coming back, but I wouldn't really call them active participants.  But what they represent in terms of telemetry is really good.  There are 90,000 local governments in the United States, and when you add school districts, public utility districts, water purification, all that stuff, it's about a half-a-million places where you could put sensors out there, and you have got radar for infrastructure disruption attempts, and that I don't think would be that hard to pull off.

RICK HOWARD:  So I'm not naïve.  I realize that Edward Snowden ran not too long ago, and people are going to say, "You're going to what?  You're going to turn on sensors from all over the planet?"

MICHAEL HAMILTON:  My project is called PRISM.  Thanks, NSA, for blowing our brand.

[Laughter.]

JOHN WOODS:  I do think that given this is a security-focused group, I think a lot of people in this room are inclined to agree that—you know, I think the statement was you can't have privacy without cybersecurity.  But I do think that there is a very loud voice around concerns that if we're putting sensors in schools and things, that we just have to be mindful that there is a very mature privacy community that actually from a legal perspective—

RICK HOWARD:  Oh, yeah.

JOHN WOODS:  —is much more mature than the cybersecurity—

RICK HOWARD:  I agree.

JOHN WOODS:  —legal community.  And presenting these tradeoffs has got to be done thoughtfully because we can't ignore the privacy overlay to many of the collection items that would provide security, and I think the ISAO, as they develop, need to be cognizant of that community because they have a very strong voice in this debate.

MICHAEL HAMILTON:  Yeah, I agree.  Privacy keeps coming up, keeps coming up, and even with the little amount of information that we collected, you know, somebody could analyze that and say, "We saw the website you're going to is for AIDS treatment."  Right?  Okay.  So that's a HIPAA problem right there, and so you'd have to define that.  But we are very aware of this.

This is where, again, I think the standards organization is going to come in and break some ground on this, and then the attorneys are going to take over and fix it.

DAVE TURETSKY:  There's no question we have to design privacy in from the start as we do that, as we think about sensors and the like.

What I was saying before, picking up on what Rick was saying, it's one thing to have a system that doesn't respect privacy from the start.  It's another thing to—in some particular case where there is an emergency to have made a mistake, and there needs to just be some kind of way to ensure that the penalties, if you will, are suited to what the nature of the breach and the circumstances are.  That's all I'm saying.

RICK HOWARD:  I think transparency.  All right?  We talk about the standards bodies.  They've got to build transparency into whatever we're doing.  Right?  There needs to be reporting back to the public about whatever we're doing that says, "Here's what we did.  There's the stats we got.  We screwed this up.  We'll fix it."  That has to be part of it.  It can't be hidden in secrecy.

JOHN WOODS:  And I think there was someone who stood up from OASIS earlier, and I think the transition of STIX/TAXII development to OASIS, particularly given, for many large corporations, their global footprint, and making sure that the transition incorporates some of the tricky privacy issues that you can run into when you're trying to do a global investigation, so that large entities and then the ISAOs can make sure that they're getting information that doesn't cause compliance problems overseas, and in Europe in particular.  And I think that's going to be a real benefit because while it started in the United States, I think we have to recognize that other jurisdictions have taken fairly aggressive positions with regard to information that we here in the United States would consider run-of-the-mill investigative information around IP addresses or other information, and that can create significant barriers when you're trying to do a global investigation, and hopefully, developments like that will further enhance the efficacy of the ISAOs and the members of those ISAOs as they're getting global threat intelligence as opposed to just U.S. based.

DAVE TURETSKY:  Well, and that's going to be an issue also for businesses that operate in more than one jurisdiction, even for our ISACs and our ISAOs.  To the extent they are providing information that's threat information, is all of it coming from the United States, or is some of that information coming from their systems elsewhere?  And whether that information can be supplied and with what caveats is indeed a challenge that I think probably has to be grappled with today.  It's not tomorrow—

RICK HOWARD:  We already deal with that just as a vendor.  Welcome to our party.  Come on over, ISAOs.  I'll have a long conversation with you about that problem.

DAVE TURETSKY:  Yeah.

GREG SCHAFFER:  So let's go ahead and talk about that, the international implications of the ISAOs.  Most of the discussion has been around domestic entities, but we do have international organizations.  We have multinationals, large companies that are very engaged from a cybersecurity perspective, but we also have players who are not domestically based that have telemetry, that have lots of information, that may be able to contribute to this conversation.  Should they be allowed to participate in the ISAOs, or is this something that we think really should just be domestic players?  And then if they are domestic players with international pieces, should they be controlling what's coming in from those overseas portions of themselves, or should that—what's the panel's thought about how the international aspects of this really need to play out?

MICHAEL HAMILTON:  I've had a little conversation about this with our partners up in Canada.  We're close to Canada.  And when I showed them what we do, they were all for it, and my suggestion that we could do this cross-border was enthusiastically received.  But Canada has got a privacy statute, PIPEDA, that is very prescriptive in what can and cannot pass the border.  I think this is where governments need to get together and just have clear definitions of what's allowable, what constitutes a privacy violation, which is agreed to by multiple governments, and then we can set up some kind of information flow.  But it's going to have to be government to government before we can get there.

RICK HOWARD:  So I think this is a market thing again, okay, especially Europe with their restrictive laws on privacy.  I don't think we dictate to those folks that "You must do this."  That's not how we win this.  We need to show value, that these things are working, and they're going to want to come to us and change their own laws because we've shown value.  Right?  So those are not the people I go to first, but we absolutely need to bring them in.

In the Cyber Threat Alliance, we're all U.S.-based companies but with a huge international presence, and so that's one of the big questions: who do you invite to the Cyber Threat Alliance?  And we think we have to bring everybody in.  Right?  But I don't think you have to do the international community first.  We have to demonstrate capability.  They'll come to us.  What's the movie?  You build it; they will come.

GREG SCHAFFER:  So is there a concern if you do take it international that you may be giving information to some of the folks who are on the wrong side of this equation?

JOHN WOODS:  It's international.  If you're in the FS-ISAC and you're getting threat intelligence and you're having to make changes on your global firewalls, not all those firewalls are in the United States, and we've run into this issue around—I mean, part of what, I think, DHS and the executive order are trying to address is over-classification.  And whether we sort of get into some traffic light protocol (TLP)-like, you know, sort of information exchanges, a lot of global CERTs have that same—you know, same type of information governance structure around it.  But for large U.S. companies, are we going to have an Internet of Things or a SCADA discussion without Siemens at the table?  It can't happen.  And if we try to set that up, I think we're setting ourselves up for failure.

So we are confronting those issues today around classified threat indicators and whether—I mean, not even classified but no foreign.  We have the situation where firewalls were run out of an Indian data center with Indian nationals.  Threat intelligence came in.  The company had to fly someone, U.S. citizen, into India to do the firewall changes because we couldn't hand the signatures off to the Indian support team.

So I think some of those mechanical issues, as we're trying to work through this, have got to be built into the ISAO framework around understanding.  And maybe this is something the standards organization has to really do, that there's going to be a TLP-like information flow that comes out of the government, to the extent that there are information sharing agreements between the ISAO and the government around how that information needs to flow.  And those information flows have to reflect the fact that you have many global organizations, not just the Fortune 500, but many businesses today that are in the Fortune 1,000 to 10,000 range are outsourcing or offshoring aspects of their IT infrastructure.  So I think as we build out the standards organization, we have to take into account all of that reality.
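A rough sketch of the "TLP-like" handling John Woods mentions: gate redistribution of a shared item on its Traffic Light Protocol marking before it crosses an organizational or national boundary.  The TLP names are the standard ones of the period; the audience labels and the policy mapping are illustrative assumptions, not a prescribed rule set.

```python
# Hypothetical mapping from TLP marking to audiences that may receive the item.
REDISTRIBUTION_POLICY = {
    "TLP:WHITE": {"public", "members", "internal"},   # unlimited disclosure
    "TLP:GREEN": {"members", "internal"},             # community only
    "TLP:AMBER": {"internal"},                        # own organization only
    "TLP:RED":   set(),                               # named recipients only, no relay
}

def may_redistribute(tlp_marking: str, audience: str) -> bool:
    """Check whether an item with this TLP marking may go to this audience."""
    return audience in REDISTRIBUTION_POLICY.get(tlp_marking, set())

print(may_redistribute("TLP:GREEN", "members"))   # True
print(may_redistribute("TLP:AMBER", "members"))   # False
```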

RICK HOWARD:  So think of the dichotomy here.  Okay.  We're here.  How long have you guys been in the industry working on information sharing?  And we continue to talk about "Well, we want to restrict it."  Okay?  This is not what this is about.  The idea is to get the information out to everybody who can use it.  Right?

And I'm an old government guy, and I understand we don't want to share intelligence with our adversaries, but in the Cyber Threat Alliance, we've had the conversation:  "Do we want Kaspersky in our alliance?"  Okay?  Because we're pretty sure that whatever we give Kaspersky is going to dump it right to the Russian government.  Right?  Do we want Baidu in the Cyber Threat Alliance?  Because we're sure they're going to dump it to the Chinese government.  Right?  And then you get in this conversation about "Gee, we need to restrict it, and how do you do that?" and all these complicated processes.  And then the small to medium-size business doesn't get what it needs, and it's just a mess.

So I've come full circle to this on my old government days.  So I think we should share everything.  Okay?  Share it all.

ATTENDEE:  I agree.  We've got to get over it.

RICK HOWARD:  Yep.

MICHAEL HAMILTON:  I also think with respect to your question that preventing something like that, because it goes back—prevention is not possible.  That's collateral damage, cost of doing business.

RICK HOWARD:  They're going to get it anyways.

MICHAEL HAMILTON:  Yeah, they're going to get it.

RICK HOWARD:  Right.  So here's the thought, though.  If every security vendor knows all the indicators of compromise from every bad guy that we've been tracking and we share that around the world, okay—and we can do that on a continuous basis, not we just do it once and forget about it—we do it all the time—okay—only the highly resourced organizations can keep up with that.  Right?  So now we're taking the smaller people off the table.  They're not going to be able to keep up with this, and now we've reduced the problem set from this to this.  Right?  And that's—in my mind, that's better.

DAVE TURETSKY:  I think there's some pretty important policy issues here, and there are two different ones.  I think it's important to separate them out.  One is the classified side of this, and, you know, that would have a different set of possibilities, I think, and potentially should be viewed differently than sharing non-classified information.

RICK HOWARD:  Oh, you're talking about government classified?

DAVE TURETSKY:  Yeah.

RICK HOWARD:  Yeah.

DAVE TURETSKY:  Because you know what?  When there are foreign people working on the firewall that is going to handle classified information, you know, I don't mind that our government worries about that.  I hope they make the right decision about that.  That's a different situation than sharing information that isn't classified.

And I think the point Rick makes is an important one, and I just think we have to be transparent about it.  If we think that we're willing to take more risk that the shared information ends up in the hands of bad guys, but we think that that risk is worth it because it still has an effect of narrowing the attacks, threats, and the like, I think we just have to be transparent about that and honest about that.  That may be a risk worth taking, but the members of an ISAC or an ISAO ought to—just like a board decides for a company as to where they are on the risk management scale, that's something that members can prudently decide.  There just has to be some transparency about it.

MICHAEL HAMILTON:  And I think the government could come up with a logical equivalent of tear lines—right?  You don't have to expose your methods to say, "Here is a threat indicator that we want to distribute."  I think that's entirely possible.

ATTENDEE:  And I think they're working towards that.

RICK HOWARD:  It becomes easier too if the whole world is contributing to that pool of indicators because try to figure out where that came from if everybody is working together.

DAVE TURETSKY:  And our CERTs do have relations with CERTs in other countries and sharing arrangements.  I don't know whether we know all of those, to be honest with you.

GREG SCHAFFER:  So that's a really good point about where does the information come from and the difficulties that people may have in this new environment.  How do we ensure in a world where we have all of these new ISAOs standing up and these various configurations that we've talked about, that we don't just get the information sharing within those, but that this turns into a national capability, that it actually—in addition to informing these smaller groups that will stand up also turns into a national capability that really bangs the gong from the government's perspective and the national infrastructure protection perspective?  What needs to happen in order to ensure, Rick, that the little pieces come together and turn into something larger?

RICK HOWARD:  This is really hard because we're trying to build a national capability, and what we're really talking about is a world capability.  Right?  And so whether or not the U.S. government can feel comfortable shepherding that kind of idea along to cooperate with the world, that's an open question, I think.  Right?

I think someone mentioned before, one of the memberships—one of the characteristics of an ISAO is the government is a partner, not a controller.  So they're part of the mix, right, and to shepherd that as a group member and try to incorporate everybody on the Internet, that's the way it should go.  I don't know how comfortable any government is going to be with that, that idea.

But I think the idea is—you guys know as well as I do that a lot of security vendors have almost as good intelligence as the U.S. government on cyber matters.  In some cases it's better.  In other cases, we don't know because we don't watch those things.  So we can do pretty good intelligence sharing without having to get a lot of help from the government for our particular customers.

I don't know if I answered the questions, but did I?

GREG SCHAFFER:  I think you're touching on it.  I guess one of the things that we talked about in one of our discussions getting ready for the panel was, you know, how do we avoid a Tower of Babel?

RICK HOWARD:  Yeah.

GREG SCHAFFER:  How do we keep from getting into a situation where you've got so many people talking that actors don't know what to implement and how to implement it?  And is there a potential for conflicting information for one entity coming from multiple ISAOs, different approaches to the fix for a problem?  How do we avoid some of those kinds of situations, and is that something that the standards organization, as it stands up, should be thinking about?  Thoughts on that?

RICK HOWARD:  This goes to something that somebody said earlier this morning too.  They said they want to keep it wide open.  I understand the need for standards so we can automate a lot of this stuff, but they want to allow everybody the opportunity to give information in any way they want to.  Right?  And I think that's—I'm going to push back hard on that.  We want to develop a very small set of things that we're going to share, and yeah, we'll add the other stuff too.  But it's not going to be part of the core thing that we do.  Okay.

Minimize.  What you said, I'm going to start using that from now on.  Minimize what we decide to share, and make that valuable, and then that's how this thing gets rolling, I believe.  And agree to what those things are and agree to what the legal protections are.  That's the key.  I don't think we open it up.  Oh, you can give me—you can come through parcel post, or you can give it to me on a USB.  No.  You're going to use STIX and TAXII, and you're going to give me these 18 fields.  That's what you're going to do, whatever it is.
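As a rough illustration of that "small set of fields" idea—using hypothetical field names and Python purely as a sketch, not the actual STIX schema or any agreed ISAO record format—a submission check for a minimal core record might look like this:

REQUIRED_FIELDS = {
    "indicator_type",   # e.g., "ipv4", "domain", "file_hash" (illustrative values)
    "indicator_value",  # the observable itself
    "first_seen",       # ISO 8601 timestamp
    "confidence",       # low / medium / high
    "tlp",              # Traffic Light Protocol marking
    "sharing_source",   # contributing member, possibly pseudonymized
}

def validate_submission(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record meets the agreed core."""
    problems = [f"missing field: {name}" for name in REQUIRED_FIELDS - record.keys()]
    extras = record.keys() - REQUIRED_FIELDS
    if extras:
        problems.append(f"fields outside the agreed core (dropped by policy): {sorted(extras)}")
    return problems

example = {
    "indicator_type": "ipv4",
    "indicator_value": "203.0.113.45",
    "first_seen": "2015-07-30T14:02:00Z",
    "confidence": "high",
    "tlp": "amber",
    "sharing_source": "member-17",
}
print(validate_submission(example))  # -> []

In practice the agreed schema would be expressed in STIX and moved over TAXII; the point of the sketch is only that a small, mandatory core keeps the automation simple, and anything extra is handled separately rather than widening the core.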

JOHN WOODS:  You have already, I think, identified some of the trust issues that are embedded in many of these dialogues.  You're debating about whether or not to bring people into your group.

RICK HOWARD:  Yep.

JOHN WOODS:  And I think that, you know, by their nature, one of the core—again, going back to the comments that I've read that have been submitted—trust is at the core of effective information sharing, and I think you are going to see a proliferation of some of these entities because at a certain size—and I think PwC, in their paper, made this point very notably—once you get over 300 members, the trust factor drops off dramatically.  And I think you've got a very small group now, and even expanding to 15, you're sort of having to make some hard choices.  So the Tower of Babel, I think there is going to be a self-selection into ISAOs where there is a high trust factor, and hopefully, that in and of itself will be a factor in keeping the Tower of Babel down because if what you're getting is Babel, I don't think scarce resources are going to be dedicated to participating in an ISAO where you're not really feeling there's value added.  So it's probably going to be somewhat self-regulating based upon that—you know, what seems to be a very core factor of trust.

RICK HOWARD:  Should we talk about Tower of Babel and being able to communicate?  All the old-timers out there, remember we were arguing over TCP/IP and IPX.  Remember that?

ATTENDEE:  Oh, yeah.  IPX should have won.  Thank you.

[Laughter.]

RICK HOWARD:  And Betamax should have won and all that.

[Laughter.]

RICK HOWARD:  I'm just saying it doesn't make sense to keep up with all of these standards.  We should pick one and go for it, and it makes our lives easier.  That's how the world works anyway.

MICHAEL HAMILTON:  And I think with respect to your comment, Rick, about this needs to be an—this needs to be global, not national, I think if we break this ground here, eventually the international standards organization gets involved, and we start working these things out.  But somebody has got to demonstrate, show value, and say, "Look, I shared information.  Nobody got fired.  It's actually cool."

DAVE TURETSKY:  Trust also may have other ways of being protected besides just the concept of a single tier of membership.  It may be—especially as you think of small and midsized businesses—that there may be different tiers of membership in ISAOs and ISACs where there's a simple way of sharing that's compatible with more complex ways, and you have a tier of members who share at a different level and have different attributes of membership.  That may be a way to protect trust because the vulnerabilities, if you will, of a bigger group are less because these other members have access to less, participate in less, and the like.  They may not have the crown jewels of the most sensitive information because they can't use them.  They don't have the capacity to turn them into actionable intelligence and the like at some level.

So I think as we think about this, we need to have some flexible notion because we can't stick to these—I think we can stick to these principles, but we have to think about how we make them inclusive as well.  We need to preserve trust, but as Rick was saying, he's willing to take the risk that certain information goes to bad guys because it's worth the gain.  We need to think about that also because what we don't want is organizations that end up being exclusive—exclusive for the wrong reasons—and undermine the ability to be safer because we don't have those midsized businesses who are potentially going to be the route to infecting bigger guys anyway.

GREG SCHAFFER:  So one of the things that the various white papers and the comments at the Boston event and various others have said is we've got to have a position, and we've got to have some ability to hold people accountable.  That we've got trust, but invariably, someone will violate that trust.  And when that happens, there's got to be an ability to hold people accountable.  Is that something that each individual ISAO should be deciding on how to deal with on their own?  Is that something that we expect a standards organization to spell out?  Where do we think that issue of how to deal with violations of trust should be handled?  John, do you have thoughts on that one?

JOHN WOODS:  I think that the designation of an ISAO probably on the front end needs to—the standards-setting organization is going to have to sort of create some criteria for entry because if you don't have criteria for entry, you can't have criteria for exit, and it may be one and the same.

I do think when you actually get into the executive order, there is still a role for DHS here, which I think the Secretary is designated under the executive order with the ability to enter into security clearances and otherwise, and so to the extent that we believe that part of an ISAO's role may be to serve as that interstitial layer, there's still going to be some mechanisms that you could pull, for instance.  DHS could pull that security clearance, or that security clearance agreement in and of itself could have provisions in it that retain some control so that if a bunch of classified information is disclosed that the ISAO had because they didn't have appropriate security controls, there will be consequences for that.  And so I think—I'm trying to think of the scenarios where an ISAO could go rogue, but it may be that—you know, maybe that's the end of the bell curve.

ATTENDEE:  The Silk Road ISAO.  Right?

JOHN WOODS:  But I think there is a concern.  There is a concern that if they don't engage in appropriate risk management techniques around the information they have, which in and of itself can be sensitive, there may need to be in the standards organization, some ability to hold them accountable.  And it's clear that DHS has retained some link in through the executive order to the extent the ISAO wants to get classified information.

GREG SCHAFFER:  So there were two pieces in that.  I want to call both of them out.  There's one piece of membership in an ISAO, and if somebody doesn't obey the rules, if they violate the stoplight protocol, whatever that may be, how is that handled?  But I heard you, John, going to even a further step, which is, well, what if the whole ISAO is problematic?  What if the group—you know, to posit the Silk Road version, right, they're basically taking the information, and they're distributing it to hacker organizations around the world, and that's their primary constituency in the back end.  They've got a front end that looks like an ISAO, and the back end is distributing directly to—if that were to happen, I'm assuming that you're advocating that the government would be in a position to do something about that.

JOHN WOODS:  Yeah.  But I think you've got to be very careful there because—I think Neal gave a great example of the Conficker group that came together.  If we create significant barriers around the coming together, it is going to—you know, in exigent circumstances, one of the things that I think some of the studies have talked about is that you can have ad hoc ISAOs.  So, in some of the ATM cash-out matters that we've handled, there have been multiple victims going on at the same time, and it may be that a corporately funded ISAO to sort of bring together victim companies could be something that could get much better threat intelligence or evidentiary trails to law enforcement in the U.S. and globally to try to walk back and understand how it's done.  If you have too many barriers to entry and we put in a 30-day application window, I think we have to be flexible enough to allow these ad hoc groups to form very quickly, but at the same time, the standards organization is going to have to have some ability to cut them off.

And I do think there are benefits from being an ISAO, that you can feed information to the government with certain legal protections, and how those benefits flow to the ISAO and the ISAO members, I think the standards organization and DHS are going to have to work through because you don't want everyone necessarily being able to just stand it up without any oversight or control mechanism, but the control mechanism has to be carefully calibrated because we're going to lose a huge amount of value, like Neal was discussing earlier.

DAVE TURETSKY:  I think the government isn't establishing the ISAOs.  It's not running them, and so it needs to take care in this area.  I see the standards organization, at least from the executive order, as helping define what an ISAO is.  The President's executive order talks about setting out baselines and some guidelines.  It doesn't prevent flexibility.  I think there will be a lot of flexibility, but it's not clear to me that the standards organization is going to continue to exist 5 years from now.

And so I think that these organizations are contemplated to be membership organizations in part.  Some of them will be for profit; some of them not.  And those who participate in them are going to have to monitor them and vote with their feet.  There may be some ability if there are hundreds of ISACs to see the equivalent of a J.D. Power or a Consumer Reports that helps rate ISACs on certain bases.  It's going to be hard to do that, though, because "transparency" is not going to be the watchword, I think, at least publicly.

RICK HOWARD:  I disagree with that.  I think one of the things in the intelligence community is you want to know who the source is of the intelligence you're getting.  Right?  And the beauty of a crowdsourced intelligence feed is that you're going to be able to say, "Oh, that ISAO, this is the tenth time this month they've given me something that turned out to be wrong."  Right?  And so that starts to get down-voted or whatever it is, and so you stop paying attention to that intelligence over time.  Right?
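A minimal sketch of that down-voting idea, with a hypothetical reputation tracker and an arbitrary 0.5 accuracy cutoff—nothing here reflects an actual scoring system in use:

from collections import defaultdict

class SourceReputation:
    """Track how often each contributing source's indicators are later confirmed or refuted."""

    def __init__(self, min_accuracy: float = 0.5):
        self.min_accuracy = min_accuracy
        self.confirmed = defaultdict(int)
        self.refuted = defaultdict(int)

    def record_feedback(self, source: str, was_accurate: bool) -> None:
        if was_accurate:
            self.confirmed[source] += 1
        else:
            self.refuted[source] += 1

    def accuracy(self, source: str) -> float:
        total = self.confirmed[source] + self.refuted[source]
        return self.confirmed[source] / total if total else 1.0  # trust a source until proven otherwise

    def should_consume(self, source: str) -> bool:
        return self.accuracy(source) >= self.min_accuracy

rep = SourceReputation()
for _ in range(10):
    rep.record_feedback("isao-alpha", was_accurate=False)  # the tenth bad call this month
rep.record_feedback("isao-beta", was_accurate=True)
print(rep.should_consume("isao-alpha"), rep.should_consume("isao-beta"))  # False True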

DAVE TURETSKY:  I think there's potential for that, yeah.  I mean, I think, ultimately, the government, as somebody said earlier, is a peer.  So one of the options for the government is to stop sharing with that ISAO—

RICK HOWARD:  Oh, that's true too.

DAVE TURETSKY:  —as opposed to disband it.

But also, in a world—I hate to say this—where we're talking about profit and for-profit entities, the same array of laws and agencies that are available for for-profit entities to ensure the marketplace is one that has some integrity are going to be available here.

I assume that an ISAO that may be for profit and engages in deceptive and unfair practices may hear from the Federal Trade Commission.

JOHN WOODS:  And I do think, Greg, if you go back to the DHS Act, there is a definition of an ISAO under 212, and it has to have a particular purpose, which is reducing threats to critical infrastructure.  So if we get folks out there doing crazy things, they may lose the designation because they don't meet the terms of the statute.  So I think that's at least one—I always go back to the statute and read it, because I think that's at least a grounding principle off of which a lot of this needs to pivot.

MICHAEL HAMILTON:  And, you know, there could end up being a yearly review and re-designation as an official ISAO or something like that, but to a large extent, I think this is all going to be self-correcting.  Somebody blows it.  You know, an ISAO that goes rogue and is providing bad information or whatever, or worse, taking information and providing it to threat actors for a fee.  If that gets out, they're going to disappear, but just at the individual level, the comment was made that when we come together as a group of people and we're sharing information, that trust is easier to achieve when it's regionalized and it's smaller because it's easier to trust the people that you work with and see at local conferences and things like that.  And when one of those people goes rogue, it's only going to happen once.  Nothing focuses the mind like a public hanging.  If they're gone and everybody has got that lesson that they can take, then it's not going to happen.

RICK HOWARD:  Yeah, it gets your attention.  It's not going to happen again.

DAVE TURETSKY:  Because we're going to get it wrong at least once, right?

RICK HOWARD:  Oh, yeah.

DAVE TURETSKY:  I mean, if we're talking about hundreds and hundreds, potentially, of ISAOs, the one thing I feel pretty comfortable about is that some of them are going to get it wrong, and what will happen to them, I don't know—whether they'll merge into a different—you know, the members will leave and merge into a different ISAO, just like companies get it wrong today.  They get acquired.  They disappear.  All sorts of things happen, and that's going to happen with a world where you contemplate hundreds of ISAOs.  We need to have an orderly process for that.  We need to think about that, but that's a risk we need to take.

GREG SCHAFFER:  So, Michael, you mentioned earlier that you've participated in the Multi-State ISAC.  As ISAOs start to evolve, do you think that the state ends up being a participant in the Multi-State ISAC and one ISAO, five ISAOs?  How many are people going to participate in, do you think, at the various levels?  And I'm going to ask the same question of you, Rick.  Where is that going to go?

MICHAEL HAMILTON:  So, first of all, I don't speak for the state, and I think this needs to be worked out.  We've had some discussions about what the architecture might look like from originating organization up to an information distribution broker that shares against a taxonomy of rules that are adjudicated by the originating organization, and how that flow goes, we don't know yet.  And so the standards organization is going to be part of that.  We're partnered with Pacific Northwest National Labs, the University of Washington, and some of those hundred-pound brains are going to figure that out.  I would not say a priori how that's going to work.

I do like the idea of everybody getting informed, but I think there needs to be a controlled pathway for that and checks and balances all along the way, so that we do have a set of sharing rules, and that's respected in every step.
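One way to picture that kind of distribution broker—assuming, purely for illustration, a rule taxonomy made up of a TLP marking plus an allowed-sector list adjudicated by the originator, not any state's actual architecture—is a routing function along these lines:

from dataclasses import dataclass, field

TLP_ORDER = {"white": 0, "green": 1, "amber": 2, "red": 3}

@dataclass
class SharingRule:
    tlp: str                                         # marking applied by the originating organization
    sectors: set[str] = field(default_factory=set)   # empty set means any sector may receive it

@dataclass
class Recipient:
    name: str
    cleared_tlp: str   # highest marking this member has agreed to handle
    sector: str

def route(rule: SharingRule, recipients: list[Recipient]) -> list[str]:
    """Return the members the broker may forward an item to under the originator's rule."""
    return [
        r.name
        for r in recipients
        if TLP_ORDER[r.cleared_tlp] >= TLP_ORDER[rule.tlp]
        and (not rule.sectors or r.sector in rule.sectors)
    ]

members = [
    Recipient("water-utility-1", cleared_tlp="amber", sector="water"),
    Recipient("regional-smb", cleared_tlp="green", sector="retail"),
]
print(route(SharingRule(tlp="amber", sectors={"water"}), members))  # ['water-utility-1']

The check at each hop is the "checks and balances" piece: the rule travels with the item, and every step of the pathway applies it before forwarding.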

RICK HOWARD:  It's the commodities.  It's the marketplace.  Right?  You're going to belong to as many ISAOs as you can find value in, and some of those, you're going to pay for.  Some of those are going to be peer relationships.  Some of those, you just have to because one of your bosses made a deal.  Whatever it is, you're going to belong to as many as you find value and no more than that.

GREG SCHAFFER:  Rick, you had mentioned earlier this concept of "Hey, I want the data, STIX/TAXII.  I want these fields filled out, and I want it as fast as I can get it after it's known to the community."  If that is happening, does getting multiple feeds—if it's structured and it looks a certain way across all of the ISAOs, that sort of addresses that issue as well, right?  Because if you've got duplicate feeds coming in and they're the same, doesn't that address some of the issue of the pathing?

RICK HOWARD:  I think every ISAO defines the attributes that they're looking for.  I know what I'm looking for in the security vendor community.  I laid them out a minute ago.  Right?  But maybe for the cement ISAO, they don't have those things.  Okay?  They have something else they're worried about, and as long as we're all using the same framework to pass information, DHS might find, "Oh, the cement ISAO has something really interesting that I need to bring in.  It needs to be easy."  But I might not want it as a vendor community.  Does that make sense?

GREG SCHAFFER:  Yeah.

RICK HOWARD:  So you're going to define what you want that is going to be useful to you, I think.

GREG SCHAFFER:  Great.  Any other thoughts?

[No audible response.]

GREG SCHAFFER:  Well, we're at the end of our time.  We do have some time for questions, so we'll open the floor to questions for the panel.

RICK HOWARD:  Nobody fell asleep.  Nobody's heads are on the table.

ATTENDEE:  Some people fell asleep.

ATTENDEE:  So I just want to start by thanking the panel for providing an indicator of where we are, in both the answers you folks have and don't have and the questions.  I'll try to comment on this as little as possible, but we're at a point in time, because of the things we know and the things we don't know, where this all speaks to the issue of scale.  What this reminds me of is the firewall market in '93, where the answer among all the experts was that the only way to have a firewall is to read all the source code.  And at the time, my thought was "My mom needs one."  Not reading the source code.  So what do we do?

And, Rick, your comments were particularly to that point.  Very large scale, most of this information we can share.  One of the understandable missteps we make, misassumptions we make is because we've always kept it in very tight communities, hugely vetted, that this is the issue; this is how we fix this.  And it just can't be.

Also, one of Rick's comments, the kinds of things we need shared are that three water facilities in the western U.S. had an experience on Wednesday, not even who they are, but how do I detect that, and the fact that someone attacked three, not one but three water facilities?  So all of these scaling issues.

And the one other thing I want to throw in there, because this is among the ISAC community, like the early firewall community, it's like, "Oh, my God, there's going to be competition," but if everyone in the world needs a firewall and the total number of installed firewalls in the world is a thousand, you don't have competition.  At Webster, we've stood up an insurance ISAC, and the FS-ISAC isn't going anywhere.  It's fine, and it will be forever.  In fact, there will probably be multiple insurance because insurance is such a big deal.  And of the 200 ISAOs that Mike mentions, if you looked just at the SMBs, that's 100,000 SMB members per ISAO.  It's not going to happen.  So not only is 200 or 1,000 or 2,000 not too large, but most of them will not touch anyone.  They'll touch Palo Alto, and that's exactly as it should be.

RICK HOWARD:  That's the last mile, I think.  That's how we get to the last mile.

JAMIE CLARK:  Guys, Jamie Clark.  Again, I'm the general counsel of OASIS.  A datum and a pointed question.  Here is the datum.  When STIX and TAXII came over to us for hosting, we ended up with approximately 20 to 25 percent non-U.S. participants so far.  Still growing.  What are we at?  Like 130?  It's up to 160 as of today's e-mail.  Whatever we do with standards organizations—and we're hoping personally that DHS will find a way to take some creative work with the NTTAA and find ways to drag in nontraditional sources of communities of people we want to include, like the ABA, into this as potential standards-setting processes—but whatever they are, they've got to meet the NTTAA.  They've got to be open.  They've got to be transparent.  That game is over.  There's no more keep-away because they've got to fit within that, and that's what the executive order says.  So we can stop worrying about open versus closed, and we can stop worrying about participation from outside the United States.  The body of stuff we're talking about, according to the law, is going to be open.

Having said that, let me ask you a question about privacy.  It came up a little bit, and somebody said, "Oh, the STIX/TAXII guys will solve the privacy problem."  Okay.  We'll tell them that.

RICK HOWARD:  I'm glad you guys took that on, so thank you.

[Laughter.]

JAMIE CLARK:  Thanks.  And you're on the dang committee, so okay.  You've got yourself a new subcommittee assignment.

Seriously, this is a transport mechanism for information, and it's a set of taxonomies of information.  And there may be some relationship with privacy in the limited sense that the need to do privacy-enhancing analysis and transactions and controls might put some requirements on the metadata or structure of some transport methods, whether TAXII or anything else, so there might be a little bit of overlap, but realistically, let me come back to you guys.  This stuff about are we being privacy enhancing, that's going to come back to sharing agreements, federation terms, policy stuff.  This is going to be our friends at Akin Gump and Baker & McKenzie, not us techies, who are going to have to bring a lot of the privacy to this.

So let me ask this.  I don't see, at least today—I don't see the NIST power center and privacy people in the room, at least the ones I know don't show up.  I don't see the Center for Democracy and Technology here, the NGOs with privacy expertise.  How did you guys—two of you are involved operationally and doing information sharing now, and two of you have a lot of clients who are doing that.  Where are you guys getting your privacy voodoo now?  How did you bring privacy-enhancing requirements into this stuff you're building?  Where do you expertize it, and what can we learn from that as we try and make sure we fulfill the privacy-enhancing tenet of the executive order?

MICHAEL HAMILTON:  I can tell you I've had attorneys from the ACLU come stand on my desk a number of times and say, "What are you doing?"

ATTENDEE:  How was that?

MICHAEL HAMILTON:  Honestly, I gave them fire department T-shirts, and they went away.

[Laughter.]

MICHAEL HAMILTON:  But, you know, I mean, it's about limiting the information, and they were the ones that pointed out, "Well, a destination address, if you analyze it carefully, may have some embedded health implications, which is private," things like that.  But they found it to be a stretch, and so I think it's about limiting what you do and being transparent.  You know, I invited them to come talk to me a number of times, and I've reached out to them a number of times.  And I will also say while you might not see NIST in the room, you can bet EPIC is going to be on top of this.  Right?  The Electronic Privacy Information Center, they are going to be all over this.  When this starts to be a thing, they're going to want to be heard.
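A minimal sketch of that kind of limiting, assuming hypothetical field names and a simple whitelist with pseudonymization of the reporting organization—illustrative only, not any organization's actual practice:

import hashlib

SHAREABLE_FIELDS = {"indicator_type", "indicator_value", "first_seen", "attack_technique"}

def minimize(event: dict, salt: str = "per-isao-secret") -> dict:
    """Keep only whitelisted technical fields and pseudonymize the reporting organization."""
    shared = {k: v for k, v in event.items() if k in SHAREABLE_FIELDS}
    if "reporting_org" in event:
        digest = hashlib.sha256((salt + event["reporting_org"]).encode()).hexdigest()
        shared["reporting_org"] = digest[:12]
    return shared

raw = {
    "indicator_type": "domain",
    "indicator_value": "bad.example.net",
    "first_seen": "2015-07-29T08:15:00Z",
    "attack_technique": "credential phishing",
    "reporting_org": "example water district",
    "affected_user": "jdoe@example.org",   # dropped: personal information
    "dest_ip": "198.51.100.7",             # dropped: may carry sensitive implications
}
print(minimize(raw))  # only the whitelisted fields plus the pseudonymized source remain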

ATTENDEE:  [Speaking off mic.]

JOHN WOODS:  Yeah.  I think the privacy issue, the global legal regime—never mind the regime in the U.S.—is highly non-homogenized, and it's rapidly evolving.  And it's being implicated by developments around disclosures of sensitive documents that have gone on the last few years.  And I think one of the things, for instance, in a whole-of-government approach is that the Department of Commerce needs to be in the safe harbor negotiations that are going to go on, needs to really confront this issue because when you try to conduct investigations on a global scale, significant sums of money are spent, me working with my colleague in Frankfurt, Germany, around whether or not we can take certain data as part of a global cybercrime investigation and get it exported back to the U.S., or how we do those things.  And Germany is easy relative to the UAE and some of the other jurisdictions where the bad guys may hop around to different points.  And I think that we need to include in this debate the whole of government because, if we can't get the threat indicators out of non-U.S. jurisdictions, we've got a real problem set.

Now, I do think that the challenge has become that a lot of this was going on, and perhaps not as close attention was being paid to the privacy rules from an internal compliance perspective because, on a risk-benefit basis, many companies said there's no stick.  I think the sticks are coming if you look at the trend lines, particularly in Europe, and I think that's going to complicate this community's efforts in ways that we're not fully anticipating.

We have a lot of privacy colleagues at Baker, and I'm sure David does as well.  It is a hard turn to do cybersecurity well in the current environment—you all know that—if you really sort of stare at the rules, and I think we do need a whole-of-government approach to allow this specific kind of information, that is, the minimized threat indicators, under some of the exceptions to the data directive, to get that codified so that large global organizations like Rick's don't have to worry about the data protection authority in Hamburg, Germany, coming after that organization with potentially very large fining powers from a cultural norm that is very distinct from ours.  And I think that is going to be one of the core challenges that we as a group in this room have to come to recognize, because they are much further along in their development around thinking of privacy outside the United States, and they're much firmer in their views of that, and the dynamic hasn't been trending well for the U.S. view of privacy, given recent events.  And that's going to be something we have to overcome.

DAVE TURETSKY:  I agree with all of that, and I would just add as a process point in response to your question that the President's executive order talks about protecting the privacy and civil liberties of individuals as part of this ISAO process, and it also calls on the standards organization to engage in an "open public review and comment process" for the development of the standards.

So whoever is or isn't in the room today, you can be sure that as this is developed, they will have full and fair opportunity to participate and voice this.  So I just want to mention that is the context that we're coming together today and how this is going to move forward.  So that will be, I think, a core concern that people will have a full opportunity to address.

Beyond that, I agree with everything that others said on the panel, and in fact, the co-chair of our cyber practice alternates her time between San Francisco and one of our offices abroad in part because we recognize this as a global issue.

PATRICK COUGHLIN:  Thank you.  Can you hear me?  Patrick, TruSTAR.  I'm going to be a little deliberately dramatic here.  So I can't help but think, when our discussion about ISAOs is dominated by fuzzy phrases like "risk management processes," "institutionalizing trust relationships," "data privacy and minimization," "whole-of-government approach"—my personal favorite—that we haven't already lost, or at least conceded significant ground to the bad guys.  I think we all agree that the bad guys communicate with supreme efficiency and often with impunity.

So if the objective is really not to just share information but to share the right information at the speed at which the bad guys operate, then how or where should we be looking for opportunities to perhaps take a page from the bad guys' books and perhaps use things like data anonymization or zero knowledge platforms to eliminate barriers to sharing and use data correlation and alerting to provide real market incentives to sharing?  In other words, how do we really create a paradigm shift rather than incremental change from the ISAC model and PD-63 of the past?

JOHN WOODS:  I guess I've got the mic.  I do think that the government is trying to do some things to create those incentives, so let's just take that on.  The FFIEC just recently introduced an inherent risk model with a maturity model where they talk about the expectation that FFIEC-regulated entities at a certain level of inherent risk—if you're doing online ATM banking—need to be a part of an information sharing organization.  So you're starting to get a trend towards a little bit of the stick.

But I do think that one of the things that at least we have observed is there are going to be challenges here.  Offense wins, I think I have heard hundreds of times, but I do think there are a number of ways in which these types of informal information sharing that have gone on in communities of trust have really made a difference.

And I'll give you an example.  If you look at the number of arrests that have come out of the ATM cash-out organizations, there have been guys arrested and extradited to the United States.  They took over $300 million over a period of about 6 or 7 years; the last one, $40 million in 12 hours.  That didn't happen absent companies stepping forward and then not only collaborating with each other, but collaborating with the government.

If these ISAOs provide a mechanism to allow that to happen more quickly, with some liability protections around the information sharing and not having to worry about potentially other government regulators coming at them—and that's embedded in some of the statutes that are already on the books, potentially—they have never been tested—but those are the kinds of things that as the ISAOs stand up, they get more mature, and they understand what they can and can't do.  The next time there is a large-scale financial hack—or take the DDoS attacks.  Collaborating within the ISAO could be a huge game changer because in the financial crimes world, there are named groups that many in this room know, and if you can identify who those very skilled adversaries are and use law enforcement mechanisms to move them out of the seats that they're in, because there's a person at the computer, then I think we're going to significantly reduce crime.  If you look at the trend lines, from what I have been told, in the aftermath of Gonzalez getting arrested and his crew getting rolled up, there was a decrease in credit card theft and fraud.  Now, like everything else, new guys entered the market space, but it made a material difference for a while.  And, again, that's an example, because we were involved in those, where companies stepped forward and sometimes spent a lot of money—because the CEO was very angry about these guys doing what they did to him—to help with that.  And if you can do that in a construct where you're not having to worry as heavily about some of the liability issues, that could facilitate what we call a national capability—to throw another sort of nice pithy phrase out there—and that could be very beneficial.

DAVE TURETSKY:  You try to shift the paradigm, which is what this is all about, so that information sharing becomes the norm, and I know you're saying that's incremental.  You try to create the conditions so that innovation can happen, which is partly why they're allowing them to be for profit and the like.  And we've got companies like Palo Alto and others who are in that space and involved, and frankly, there will be shocking events that are going to help spur investment and everything else.  At least for a while, Sony was one of those.  We got calls that we had never gotten before from companies with concerns that they had but never were worth devoting any budget to before, and frankly, the thing that keeps me awake at night is the ICS stuff because I worry that this will be the good old days when we were just worrying about information being taken, and in some ways, the recall of 1.5 million cars is the reminder of the changed world that we're moving into.

MICHAEL HAMILTON:  So, if I understand your question, it's how do we move the needle without using 20th century buzzwords.  Is that—

[Laughter.]

MICHAEL HAMILTON:  So I don't even think we're close to there yet.  Everybody has got this kind of vision on how we can rapidly disseminate, "I saw this thing.  I want to tell everybody else about it so that they're better protected."  Without any seamless integration buzzwords, I will just tell you that we're not there yet, and when we come up with standards and we understand are STIX and TAXII going to win, is it CybOX, how do we package this information, how do we minimize it, how do we indemnify organizations from making mistakes, I think this is going to happen.

My experience has been we broke some ground on this, and nobody recoiled in horror.  Nobody lost their job, and I think, to paraphrase Rick, we're going to get over it, and I really do think we're going to get there.  And I don't actually think it's going to be that long.  We might hit a landmine first that congeals everybody's thought process around this.  The first time your toilet won't flush for 2 days, you're going to forget all about credit cards.

[Laughter.]

MICHAEL HAMILTON:  I'm right.  You know I'm right.  And that's going to help it along too, and hopefully, nobody dies in something like that.  But I think I'm more optimistic, and I like the buzzwords, so I'll just keep using them.  Cyber.

RICK HOWARD:  Synergy.

[Laughter.]

ATTENDEE:  Seamlessly cyber integrated.

KENT LANDFIELD:  Kent Landfield.  My question really is more around the value chain of the cyber threat life cycle.  It's nice to have a flooding model of IOCs where everybody is sending stuff to everybody about what they're seeing and the like, but the real value isn't in the IOCs.  The real value is in the processed data, where you've applied context to that information, applied the ability to actually see how you can fix something and prevent and mitigate those items from affecting you.  Are we really expecting 200 to 1,000 ISAOs to stand up analytics capabilities?  Because, honestly, I don't see that happening.  I don't see the financial side being able to afford something like that.  The most expensive part is people, and when it comes to having the type of technology and the type of expertise that we're looking at, we're not there.  We're going to have to do a lot more with automation, and all I hear here is organization, and really, automation is where we really have to focus.

RICK HOWARD:  I think the Cyber Threat Alliance does the automation piece for it because it automatically dumps it into your already-deployed controls, but I think also someone said that maybe the ISACs take up the mission of figuring out what controls need to go in place.  So maybe if there's a hundred ISAOs underneath an FS-ISAC, maybe the ISAC—I'm not saying they would do this, but maybe they would—would say, "If you have McAfee, here's the controls you're going to install.  If you have Symantec, here's the controls you're going to install."  Let them do the work, so the little guy can figure out what to do with it.  That may be an option.  I'm just saying.
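As a sketch of that "let the ISAC do the work" idea—with placeholder product names and rule text, not real vendor APIs—a translation layer from one shared indicator to per-product control changes might look like:

def to_control_actions(indicator: dict, member_products: list[str]) -> list[str]:
    """Render a shared indicator as the control change each product a member runs should apply."""
    value = indicator["indicator_value"]
    kind = indicator["indicator_type"]
    recipes = {
        ("firewall-x", "ipv4"): f"add a deny rule for {value} on the perimeter firewall",
        ("dns-filter-y", "domain"): f"add {value} to the DNS sinkhole list",
        ("endpoint-z", "file_hash"): f"block hash {value} in the endpoint policy",
    }
    return [recipes[(p, kind)] for p in member_products if (p, kind) in recipes]

print(to_control_actions(
    {"indicator_type": "ipv4", "indicator_value": "203.0.113.45"},
    ["firewall-x", "dns-filter-y"],
))  # -> ['add a deny rule for 203.0.113.45 on the perimeter firewall']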

GREG SCHAFFER:  I think we've got time for maybe one more question.

JONATHAN GOLDER:  I'm Jon Golder.  I'm here from Discover, and a couple quick points on this.  First of all, I want to go off of some of what you're saying, which is, for the small or medium guys, if they can't get ISAO in a box or ISAO as a service, it's not going to work for them.  What they're going to need is something that's prepackaged—maybe, like you mentioned, they put a box in their system that serves as a data collection point, and in response or in return for doing that, they get a feed of stuff, so they can at least participate in this.  But it's got to be completely automated for most of those guys or they're just not going to play.

RICK HOWARD:  I think we just created another avenue of vendorship.  Right?  Some vendor or some entrepreneur is going to build a box that takes the FS-ISAC feed and dumps it to your controls.

ATTENDEE:  Either that.

MICHAEL HAMILTON:  Either that, or we create the public option for monitoring.

JONATHAN GOLDER:  Right.  And there's nothing wrong with that, and that's probably the direction I think that those guys are mostly going to be interested in.  So thinking about our vendor space and what some of the smaller guys we use would want to see from us if we stood up an ISAO as a company for our vendors, they're going to want something that allows them to basically flip a switch on and then forget it, which we know isn't the ideal, but it's also what their appetite is.  And they've got to work within that.

The second piece is, you know, I guess—I'm flipping this around.  I take a threat view on things, and my take on a lot of this is our end state here isn't necessarily improving our security or any of that from the angle I'm looking at.  I'm taking it as I want to reduce the return on investment for cybercrime, because right now it's 14 to 1, give or take.  For every dollar you spend investing in cybercrime, you're getting $14 back on average.  If we can reduce, like you said, the low-hanging fruit—if we can make it to where those guys don't get that return because their automated malware development system is being countered by an automated malware countering system—then we reduce a lot of that.  And that's the big piece.  I want to take those guys out of play because I can't stop the APTs and zero-days.  That's just not going to happen without access to much better classified information than the Federal Government is going to be willing to release to the ISAOs, but I want to reduce the threat space to something that's manageable, to where my intel team can actually be chasing down, "Hey, this doesn't seem right," and has an ISAO that they can trust and go, "Hey, are you guys seeing anything on this, or do you have any bright mind somewhere that thinks they're seeing something?"  So the question becomes, how much are we sharing automated data, and how much are we sharing finalized human-consumable intelligence that informs strategic decision-making?  Because I hear two different versions on the panel right now, and is the question that one ISAO does both, or that different ISAOs focus in different areas?  How do people see that shaking out?

MICHAEL HAMILTON:  I'll just give you my thoughts on that.  I think analysts have to be involved.  When I say there needs to be a sharing broker that routes information according to a set of rules, there's got to be eyes on that that vets that, and the fidelity of that information has to be high, or people start to not trust it.  So I think a human has to be in the loop somewhere.

We've got a guy in the fusion center.  He's one guy, and we're monitoring close to 20 organizations, and he's stretched, but—

RICK HOWARD:  The Cyber Threat Alliance, though—you're getting the intel organizations of all those security vendors.  That means you don't have to do it.  Right?  So that's the difference.

ATTENDEE:  Another indicator: ISAO as a service.  That's the second time I've heard that phrase because we're doing the same thing already—literally that phrase—at Webster.  And, Rick, this is a question to you and Kent.  We are at this point where we see a lot of challenges, but we'll reach a point in time where the answers are obvious, like this one.  Most of the ISAOs will not have analysts.  They won't do those things, and that's fine.  We can share with others.

And just to finish, the question at the back I think really needs to be touched on because a lot of people are saying, "How are you, the government, going to solve this?" and as much as we love you, Mike, you're not going to.  That leads down all the failure scenarios.

RICK HOWARD:  Aren't you glad you're off the hook?

ATTENDEE:  But 20 years from now, we will have solved this, or society would have collapsed, so—

ATTENDEE:  One of those two things will happen, yeah.

ATTENDEE:  And I really think it will be the first one.

GREG SCHAFFER:  And on that cheerful note, I'd just like to have us give a round of applause for the panel.

[Applause.]

GREG SCHAFFER:  And I'll hand it back over to Mike or—breaking for lunch, I assume.

MIKE ECHOLS:  [Speaking off mic.]