
The Hidden Cost of Silence: Why Info Sharing Fails and How to Fix It

Presenter:

Sadie-Anne Jones

Transcript:

Hello, everybody. We're going to have a little intimate discussion today, so ask a lot of questions as we go. I just wanted to do a quick introduction. I'm Brian de Paulo, one of our volunteers here, and I wanted to introduce Sadie-Anne Jones.


She's going to be presenting "The Hidden Cost of Silence: Why Info Sharing Fails and How to Fix It." Sadie is a cyber threat analyst with four years of experience in cybersecurity at the ONE ISAC — am I saying ISAC right? Is that okay? — advancing collaboration and information sharing across the energy sector, specializing in emerging threats, and providing analysis to strengthen community resilience. She's also an active author.


So, in any case, I want to welcome Sadie to close out our day. I guess there is a final keynote in the General Assembly after this session, but with that, I want to hand it off to Sadie and let her jump in here.


Well, hello, everyone. I'm really excited to give this presentation. It's going to be all around information sharing: what goes wrong with it, and what you can do to fix it. We're also going to have some success stories from the executive director of the ONE ISAC, Angela Hahn, who is also here. So I'm going to be talking a lot about information sharing and analysis centers.


I work with the Oil and Natural Energy ISAC, so I just want to ask: who here has either heard of an ISAC, is familiar with one, or has been a part of one? Okay, awesome. So for those that aren't familiar with ISACs, there are a whole ton of them — pretty much for any industry you could be a part of.


There's probably an ISAC for that industry. And while right now I work with the ONE ISAC, I have previously worked with — not the Water ISAC — the Health-ISAC and the Auto-ISAC, which are both really great, and they have a lot of resources for those sectors as well. What all ISACs do, broadly, is get organizations together from a specific industry to talk about threats as they're emerging: what to do about them, how to secure your environment — pooling all this information so that you have an idea of what's out there and what you can do.


So you're never blindsided by something that's already active in your industry, because often we get pigeonholed in our own organization and we're not looking outward at what's out there. So that's a little bit about ISACs. I am a threat analyst with the ONE ISAC, which means I help take in any indicators of compromise that industry members send our way, analyze them, and put them out anonymously so that other industry members have that information.


We also host a lot of events and that sort of thing, so it's a great community experience. Next slide, please. Okay. The first thing I like to talk about when I talk about information sharing is the legal aspect, because information sharing at the very high level is often limited by that legal aspect.


A couple of years ago, when I worked with Health-ISAC, I helped with a lot of tabletop exercises and helped write reports based on what we found in them. And this came up a lot: many security professionals want to share, but they can't, just based on the legal requirements they have in place.


So this is the little stat we have here from an MIT study: 56% of respondents identified legal risks as the top barrier to openness. And that ends up limiting what can go out. Next slide, please. I'm just going to show, very briefly, some breaches that weren't shared as quickly as maybe they should have been because of that legal component.


So we have the Equifax breach, the Uber breach, and the SolarWinds breach. What all of these have in common is that a lot of customer data was compromised, and, at least in part, the delayed disclosure had to do with stock prices or brand reputation. The information could have gone out — if not super soon, then at least sooner.


And you could have gotten more information coming in, because you had shared it with either a government organization or an ISAC and gotten information back. Next slide, please. I think this is a really good example of how important it is to get information out quickly, because often it does come out eventually — it just doesn't come out when it should.


So with the WannaCry incident — there's this great little graphic here from Symantec — you can see how that incident spread like wildfire. You really would have had to get information out as soon as possible in order to help mitigate it. And this was a particularly critical time to do that, because so many hospitals were impacted by WannaCry.


And if you are part of an ISAC, or you're working closely with government agencies and others, you can get information as these incidents occur, as they're happening, to help you mitigate them. You can get information out to others, so you can potentially save lives if it's affecting a critical industry — or, at the very least, keep your business running smoothly.


Next slide, please. Okay. So we talked about information sharing at the top level. Now we're going to go down to the individual level and talk about cultural barriers there. A lot of analysts don't want to talk about their part in an incident because they're afraid of getting blamed. If I think that you're going to be upset with me, I might not want to tell you that I had such a big role in what happened.


I don't want to tell you I was the one that clicked the phishing email. In the report we have here, 88% of global IT and security leaders believe that blame culture exists in cybersecurity, with finger-pointing especially prevalent in the US. Next slide.


This ties into job security concerns. If your organization is looking for a scapegoat — if I know they want to find someone to fire to show that they dealt with the incident, and I had a part in it — I might not want to say what I did. And that keeps people from telling.


So maybe they'll tell part of it and downplay their role in the issue. And that can keep information from going out that could potentially keep others from making the same mistake in the future. Next slide.


And then we have ego and reputation. It's all very similar, but with ego and reputation I think of this as being more about those at the managerial level. Maybe you have a bunch of employees and you don't want them to think less of you because you were the one that clicked the phishing email — even though you were also the one that told everyone not to do that.


And so that can keep those individuals from going forward with information as well. Next slide please.


Okay. So we talked about the organizational top level and the individual level, and now we're going to talk about team to team. There are different priorities across teams: executives' goals are going to be different from, say, cyber teams', because executives are going to focus more on return on investment and industry growth.


Whereas those on the cyber side are probably going to be looking more at how to keep the organization from losing money, losing time, and preventing a disaster. So there's a disconnect there, especially because there's a lack of urgency: if you don't see an incident happening right now, you're not going to want to put a whole ton of funds toward it.


So if I know that if I put — random number — $50,000 over here on this project, I'm going to get $100,000 back, you're probably going to go with that rather than with the cyber thing, where maybe it's a process, maybe it's putting in an info sharing tool or getting involved in an ISAC, and that costs, let's say, the same amount, but there's no return on it that you can see.


So people aren't going to be as inclined to do that, even though it might potentially save you a million dollars if you hit a crisis at some point. And everyone gets hit eventually. We all like to think that we're the exceptions, but you will get hit, and it's just a matter of how hard that incident will be on your organization.


Next slide. So all these things come together, and they cause repeat incidents. No one disclosed their role in an incident, so you don't know what happened. You didn't report to a government agency, so they can't help develop more processes, tools, or guidance on it. And eventually this all culminates in just whatever vulnerability that was.


It gets hit again. I'm not going to ask you the question on screen out loud — just think about it in your head. Has your organization had a vulnerability hit twice? Think about whether info sharing would have helped: had you had access to information from other industry peers, or to processes you know about now, would that have either mitigated the issue completely or at least helped to some extent?


And a lot of times, that is the case. Next slide.


This brings together some other reasons why we should info share. One is that there's often a duplication of effort. A great thing I've seen with the ONE ISAC is how willing industry members are to share information — and the processes they have in place at their organization — with other organizations. At the ISAC we have committees and task forces where people can come together and create those things. Put a bunch of brilliant minds together, and you don't have to do it on your own.


I think this is also great because, instead of just looking within your organization for information, you look outward. If you are a small to midsize organization, it's a really great resource, because maybe you aren't even capable of putting these things together yourself — but now you have the minds of so many different industry members and larger organizations, and you have access to information you wouldn't have had before.


There's also the problem of a lack of shared intel. A great thing about the ISAC, as I mentioned, is that I help analyze IOCs that come in from industry members and put them back out. So if there's a threat actor targeting your industry, you're going to be among the first to know, because you are part of an information sharing community.


Next slide, please. And all these things end up impacting your industry, and then national security as a whole.


Because a lot of times, threats will affect not just your industry but other industries in the country and in the world — we saw with WannaCry how fast that spread. This impacts threat response times: you don't know what you're looking for, so you can't detect it when maybe you could have. You have fragmented visibility, which weakens your collective security, because now there are gaps.


And then you have reduced recovery capability, because maybe now you didn't have the rapid response you could have had, and you don't have the resources you would have had otherwise. When you are part of a community — when you're sharing information with government agencies and within your industry — you have all these shared capabilities, you're all feeding into industry standards, and you're able to make your recovery as quick as possible.


Next slide.


We were talking about teams, and sometimes security teams have a hard time quantifying their info sharing processes or tools to leadership when they're trying to get these things approved. It is hard to quantify. But I've read some articles that are really interesting, and more and more people are finding ways to do it.


But I think this is a good little stat to show that it does have a real impact. This was a SANS report showing that 83% of respondents reported improved security prevention, detection, and response when cyber threat intelligence was integrated into their workflows, which was wonderful. Next slide, please.


Okay, so we've gone through all the things that go wrong in information sharing. Now we're going to go over a few things you can do. So, now that you're info sharing, hopefully, what you want to do is actionable sharing. A lot of times when people first get into information sharing, they just send out everything they have.


Here is a large document of unedited research; here is a whole Excel spreadsheet of IOCs I've seen in my environment — do something with this. Well, if I got that, I wouldn't necessarily know what to do with it. I would probably have some follow-up questions. It's not super useful, because there's not anything actionable you can really do with it.


A lot of people get frustrated and just stop reading. What you want to have is actionable data. Maybe it is a long Excel sheet of IOCs, but I've told you exactly what they are: all of these IOCs are related to malware, and they need to be blocked. Or maybe it's not a lot of information.



Maybe you just send me a URL and an email address, and you say this is part of a phishing campaign. That's not a whole lot of information, but it is actionable information, because you can show people the approach that threat actor used, you can block those URLs and email addresses, and you can educate people on it in the future.
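To make the contrast concrete, here is a minimal sketch of the difference between a raw dump and an actionable submission. This is hypothetical Python — the `make_submission` helper and the field names are invented for this example, not any ISAC's actual intake format:

```python
# Hypothetical sketch: bundling indicators with the context that makes
# them actionable. Field names here are invented for illustration.

def make_submission(indicators, context, recommended_action):
    """Pair raw indicators with what they mean and what to do about them."""
    return {
        "indicators": indicators,
        "context": context,
        "recommended_action": recommended_action,
    }

# Hard to act on: a bare dump of values with no explanation.
raw_dump = ["http://login-portal.example/", "billing@invoices.example"]

# Actionable: the same two indicators, plus the campaign they belong to
# and the step the recipient should take.
submission = make_submission(
    indicators=[
        {"type": "url", "value": "http://login-portal.example/"},
        {"type": "email-addr", "value": "billing@invoices.example"},
    ],
    context="Credential-phishing campaign impersonating an invoicing portal",
    recommended_action="Block the URL and sender address; brief users on the lure",
)

print(submission["recommended_action"])
```

Even two indicators become useful once the recipient knows what they are and what to do about them.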


Next slide please.


Also, one other thing: when you are starting out, there can be a bit of a learning curve, and that's another place an ISAC can help. If you do info dump on us, we'll still have questions for you. But one thing we can do is take IOCs that maybe you weren't sure about.


You don't have hard evidence showing that something is bad, but you're pretty sure. A lot of the time you can send those to your ISAC — you can definitely do that with the ONE ISAC — and we'll take them, put them through our tools, do our own analysis, and then put the results back out so that you and other industry members have that information anonymously.


Of course. When you're building out information sharing within your organization, it's important not to take a share-first, verify-later sort of approach. Sharing is great — I'm all for it, of course — but those legal restrictions are in place for a reason, and we don't want something to get out that shouldn't.


Neither do we want those strict NDAs, because, as we talked about earlier, then you're not sharing anything. Something I suggest is to always talk with your legal team beforehand. The perfect middle ground between these two approaches is to sit down with the legal team, figure out when, how, and with whom you can share information, and get that in writing.


When an incident does happen, then, you don't have to argue about what's legal or try to figure that out on the fly. A lot of times, when people are going through an incident, they want to share — the intention is there — but they don't know if they can, and they don't want to accidentally go against the restrictions they have in place.


So knowing exactly what you can and cannot share is really important. A lot of times you can help educate your legal team on ISACs, because many people don't know about them, and legal is no different. There are a lot of legal protections when you share with ISACs, and once legal knows about that, it can really help you get info out in a way that neither hurts the organization nor limits the flow of information.


Next slide please.


And this is my favorite part of the presentation, because I adore blameless post-mortems. I think they're extremely helpful in getting away from those blame cultures we talked about earlier at the individual level. People don't feel like they're being scapegoated when they come forward with information, so they're more likely to give full accounts of their part in an incident.


Essentially, it looks at those second stories: human error is seen as the effect of systemic vulnerabilities deeper inside the organization. Saying what people should have done doesn't explain why it made sense for them to do it in the first place, and only by constantly seeking out these vulnerabilities can you keep your organization safe. This does not mean, by any means, that people are not held accountable for their actions.


If someone made a terrible mistake, they're still going to be held accountable. It only reframes things so that we're not looking just for someone to blame. If someone made a mistake, there's a very good chance that someone else will make that same mistake in the future. So looking at how and why it happened, and how a process can be put in place or improved, protects you far more than just firing everyone who clicks on a phishing email or puts a USB into the wrong computer, that sort of thing.


Next slide, please. And by the way, that little image I had was from an Etsy article — they were one of the first big organizations to start doing blameless postmortems, and now other big organizations are doing it too. I believe Google is one of them, which is really cool.


There are also process improvements you can put in place. At the ONE ISAC, we use STIX and TAXII, so within our systems we have a standardized language and transport for putting in information for others to consume. There are also internal policies for sharing, and I really like these: anything that gamifies the info sharing part of things, I think, is great.
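STIX is a JSON-based standard for describing threat information, and TAXII is the protocol for exchanging it. As a rough sketch of what a shared indicator looks like on the wire, here is a hand-built STIX-2.1-style indicator object — in practice you would normally use a library such as `stix2`, and the name, description, and pattern below are made-up example values:

```python
import json
import uuid
from datetime import datetime, timezone

# Hand-rolled sketch of a STIX 2.1 "indicator" object; the name,
# description, and pattern are invented example values.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",  # STIX IDs are type + UUID
    "created": now,
    "modified": now,
    "name": "Phishing URL",
    "description": "URL observed in a credential-phishing campaign",
    "indicator_types": ["malicious-activity"],
    "pattern": "[url:value = 'http://login-portal.example/']",
    "pattern_type": "stix",
    "valid_from": now,
}

# Serialized as JSON, this is the shape that travels over a TAXII feed,
# so every member's tooling can parse it the same way.
print(json.dumps(indicator, indent=2))
```

The value of the standard is exactly that last step: every consumer, whatever their vendor stack, reads the same fields the same way.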


I know a lot of companies are doing that — different security games and prizes — and I think that really helps motivate people; I like that sort of thing. A cool thing: this is my first year at the ONE ISAC, but every year they do an incentive challenge, which is really fun. It essentially gives out points to members who do a series of different tasks within our organization that help promote info sharing.


So, sending us indicators of compromise, giving us papers, speaking up in meetings, hosting meetings, that sort of thing. At the end of the year we tally up all the points and give out a bunch of prizes. It's a really great way to get people motivated, and we see a lot of activity because of it.


Of those here, does anyone have any internal policies that promote info sharing?


Nope? Okay. I think it's a really great way — even if you do something pretty small — to get people excited and involved.


Next slide please.


There's also tooling and automation. Make sure you share anonymized detection rules, not just logs, so that others can find the same vulnerabilities and threats in their environments. And build into existing SOAR and incident response platforms — platforms that help get information out quickly and help keep you secure. Next slide, please.
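The rules-versus-logs point can be sketched in a few lines. The rule format and event fields below are invented for illustration (real-world sharing often uses a vendor-neutral format such as Sigma): the shared artifact describes a behavior, with no victim hostnames or usernames in it, and each member runs it over their own private logs.

```python
# Sketch: share the detection rule (a behavior), not the raw logs.
# The rule format and event fields here are invented for illustration.

RULE = {
    "title": "Office app spawning a shell",
    # Suspicious parent/child process pair -- no victim-specific data.
    "condition": lambda e: e["parent"].lower() == "winword.exe"
    and e["child"].lower() in {"cmd.exe", "powershell.exe"},
}

def matches(rule, events):
    """Return the events in a member's own logs that trip the shared rule."""
    return [e for e in events if rule["condition"](e)]

# Each member applies the same rule to logs that never leave their network.
local_events = [
    {"parent": "explorer.exe", "child": "WINWORD.EXE"},
    {"parent": "WINWORD.EXE", "child": "powershell.exe"},
]

print(len(matches(RULE, local_events)))  # → 1
```

Nothing sensitive crosses the boundary, but everyone gains the detection.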


And then we have policy and advocacy. One great thing that's in place in a lot of jurisdictions is safe harbor laws — essentially legal incentives for having a plan in place in case of a breach. And then, not to be left out: I've heard a lot of talk about vendors being left out of these discussions, and we don't want them to be the weakest link.


Vendors should help build up what we already have. So it's important to require your security products to support the export of sanitized intel, so your vendors are secure as well. Next slide, please. Oh, and the best part: I want to welcome Angela Hahn, our ONE ISAC executive director, to talk about some info sharing stories in action.


Hi, everyone. I think we have just enough people in the audience to start the wave — who's ready? Who's with me here? We've got Erica. Awesome. I know it's Tuesday afternoon, but one of the fun things I get to do in today's presentation is share when it works: some specific case examples where all the things we just talked about actually got put in place and made a positive difference in the industry.


I've been at this for seven years now. I know many of you — good to see some fresh faces too. Prior to my life as the executive director of the ONE ISAC, I was a special agent with the FBI for 20 years, including building trusted relationships with the private sector here in Houston. That's where I got a chance to meet a lot of people,


and also to be that connection for reporting something to the government when it does happen. Now, we used to say "if"; now we say "when," right? So here's one that I love. I call it a quiet win, because we don't get to advertise when we save the day. You know, I used to put on a raid jacket, go put handcuffs on people, and parade them around.


Right. Not here. When we win, we keep it quiet — it's confidential, right? It's important. A few years ago, you might have heard a little story about a pipeline company that had a ransomware attack. Ransomware was really high profile in the media at the time. And we, as a group, got together as peers and said: okay, what do we know?


What's correct, what's incorrect, what's misinformation? How do we get to the truth, and how do we protect our companies and their assets? So it was a very important time, with a lot of interest in information sharing. And — you guys are not going to relate to this at all, right? — there was an incident on a Friday night. It never happens on Monday morning.


Right? It happens on Friday night. I got a call — it's 9:00 — Angela, we have something going on, how do I contact the FBI? Okay, this is while I'm at the ISAC, and I still have my contacts at the office. So I tell them how to get through to the switchboard and talk to a real person. And then, before I could even say anything, he said, we're going to get you some information.


We just have to go through legal first, right. So by Monday afternoon, we had indicators of compromise to share so that others could know: what should we be looking for in our environments? What do we need to be on the lookout for? And the coolest thing about that is they could have just kept their heads down. They could have just been working on putting out the fire.


But they put something out. And then what happens? You get something back — we were getting reporting back from our member companies saying whether they did see it or whether they didn't. How common is this? How targeted is this? That's all valuable information. And then: what do we do about it? How do we mitigate it? So they got so much more back.


They could have just kept quiet, right. And here's another interesting thing about this member company: by default, we anonymize the information that's shared with us. It's good to know that it's happening — but do you really care who it's happening to? Probably not. I just need to know that it might impact me. But this company said, Angela, you know, I think it's going to get out there anyway.


Go ahead and put our name on it, so nobody's confused — just protect it with our traffic light protocol labeling and handling caveats. Those govern who you can share something that you receive with, and that's important, because not all information is created equal. Some is more sensitive; some the whole internet already knows; and other information — hey, I'm sharing this with you because it's timely, actionable, and relevant.


And you can share back what you know, what you see, what you find. So by Wednesday, we had more IOCs; we had more information about the attack vector. They didn't wait until the investigation was over. It's so important that they were proactive — able to get information back and to help protect the community, the industry, their fellow peers.


It really set the bar: when they need information from a fellow member company in the future, that company can share it. So this went on — we continued to get updates. They came onto one of our meetings, said here's what we know, and answered questions. And afterward, they shared lessons learned. And the coolest thing is, nobody knew.


It never got out to the media. I never heard a whisper of this company's name in the press. And that's really important, because this kind of sharing is not easy — but you build trust, you build relationships, and you know your information is going to be protected, because you would want your information protected as well.


So that was a quiet win, and it really helped. I think it made companies go: wow, maybe we could share information too, and it wouldn't get out to the media, and maybe we could learn more if we did more of that. You guys want one more story? Okay, I'm going to take you back to my FBI days, when I was on the Cyber Squad.


I'm not technically trained — my peers are — but I worked on outreach and engagement, thinking I might be the only FBI agent these folks ever meet in their lives, but at least they'd know who to contact if something happened. So I got a call from one of our companies in the area, and they say: Angela, we've got this ransomware problem.


It's weird. We know how to triage it; we know what to do — that's not the point. It's a nuisance, but we think we need to figure out where it's coming from. What is behind this? Remediate, yes, but in the meantime, let's figure out what's going on. So a couple of my colleagues, cyber trained, went to the company and started going through the logs with their team.


This is not the old days of rushing in, taking everything, and walking out the door with it — they worked side by side, trying to get to the bottom of what was happening. And it wasn't too long before they figured out that all of the machines that had gone down had visited one particular site — one particular legitimate website — to register for an energy conference.


Legitimate. If we had not known that, we could not have contacted the company, and everybody going there was getting compromised. So we contacted the company. Oh no, we're good. No, you have a problem. You need to remediate. Trust us — okay, we know what we're talking about. So they got that fixed, so there would be no more victims.


We also got the word out to other energy companies: until this site is remediated, block it, or you're going to get the same ransomware problem. So that's great news. But it also told me, as a national security agent: what if this is a test? What if a nefarious actor is testing the industry to see what they can bring down by impacting a legitimate conference?


We need to know what the motivation is. Is it just opportunistic? Is it just financial in nature, or is it a nation state that wants to impact the security of our energy sector? If that contact had not reached out to me, we could not have answered those questions, and more victims would have been compromised. So it made a difference — and that company's name never got out either.


Right? But we protected so many others, and that made me feel good — I had a good day. But Sadie-Anne is absolutely right: you cannot know everything on your own. There are too many silos. Tap into the resources that others have developed, and the skills and knowledge that they are willing to share with you: best practices, mitigation strategies, how to build an OT program.


These are all things that we're talking about on a regular basis, and it's making a difference. So if you believe in the mission of protecting our critical infrastructure, then make sure that if you see something, you say something. My other favorite: sharing is caring. So anyway, those are my two stories. Thank you for listening. I like to say that special agents love to tell stories, but retired special agents really love to tell stories.


So thank you for your time today. I'm going to give it back to Sadie. And, yeah — keep doing a great job.


Well, I have one more slide just to close it out, so I can go to that. Thank you. Just a little call to action before we head out. For individuals: build strong, trusted communities, and share lessons, not just indicators. For vendors and partners: make sure you get those exportable intel formats and support cross-organization collaboration.


And then for leaders: really set the tone within your organization that sharing is a defensive multiplier. You share inside your organization, you share outside your organization, and that will come back to you — you're going to make not just your organization but the entire industry more resilient. Next slide, please. And that's it. Thank you all for attending. You're welcome to email me or connect on LinkedIn.


But otherwise, that's all I have for today. And thank you very much.


Oh, I can hand this out.


So, I just wanted to say, first off, thanks for this. I'm on the board for an ISAO — I saw you didn't mention ISAOs, but there are those as well; there's one just here in Houston. So all of this sharing is important. In my industry, on the financial side of things, we now actually have regulation that requires us to have a policy determining what we're going to do this first year.


We can decide not to share, but we need to have a policy about sharing. And as we share, everyone becomes more resilient; it is much better for us to go ahead and do those things. So, I was going to give you another story. You're probably familiar with phishing kits like Evilginx, EvilProxy, Greatness — that stuff.


We've now built a huge regex, basically, of ways to stop that. We're actually still seeing it constantly — within the ISAO we get about five examples a week, on average — but we're catching them in the quarantine boxes now, right, instead of them actually getting clicked on, et cetera. Somebody else is getting hit,


but our industry is not. Being able to share that information, and then create SOAR actions — if I see one of these 16 things and I hit four of them, then kick that account out — that kind of thing is what sharing allows us to do. I don't have enough examples internally to do that,
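That "N of these indicators, then act" pattern is easy to sketch. Everything below is hypothetical — the indicator names, session fields, and a threshold of 2 out of 4 checks (rather than the 4 of 16 described above) are invented to show the shape of the SOAR logic:

```python
# Hypothetical sketch of threshold-based SOAR logic: score a session
# against a shared indicator list and act once enough checks match.

SHARED_INDICATORS = {
    "token_replay": lambda s: s.get("token_reused_from_new_ip", False),
    "impossible_travel": lambda s: s.get("geo_jump_km", 0) > 5000,
    "known_kit_user_agent": lambda s: "badkit" in s.get("user_agent", ""),
    "mfa_fatigue": lambda s: s.get("mfa_prompts_last_hour", 0) > 10,
}

THRESHOLD = 2  # fire the action once this many indicators match

def evaluate(session):
    """Return the matched indicator names and whether to kick the account."""
    hits = [name for name, check in SHARED_INDICATORS.items() if check(session)]
    return hits, len(hits) >= THRESHOLD

session = {"token_reused_from_new_ip": True, "geo_jump_km": 8000}
hits, kick_account = evaluate(session)
print(hits, kick_account)  # → ['token_replay', 'impossible_travel'] True
```

No single member sees enough examples to tune a list like this, which is the point: the indicator set comes from the community, while the enforcement runs locally.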


But as an industry, I've got enough examples to be able to do that. So sharing is awesome. Sounds like a great story — which ISAO are you a part of? So it's NCUA — the National Credit Union ISAO. Okay, awesome. Yeah, I have actually worked with ISAOs before; I did some work with a faith-based ISAO.


So we love those too.


Okay. Well, I wanted to ask a stupid question, but I guess I won't, because I have the microphone — another question. So — and I don't think it's a new way to deploy applications — our company has a lot of SaaS applications, and recently a lot of customers with private clouds have wanted us to deploy our SaaS in their private cloud.


At that point it stops being SaaS, technically, but they expect all the same visibility into cybersecurity as if it were SaaS. They want us to handle security, but they also have full visibility into our infrastructure and all of our vulnerabilities. I was just wondering — because it's oil and gas, maybe others are doing this — whether you have any good recommendations on where the balance is: us doing all the cybersecurity, the customer doing it, something in between, or maybe some good templates for that.


It's not that we don't want to share everything with the customer, but we have our own security tools and we know how to weed out the noise. Then the customer uses something else on the same infrastructure, and now they see it as: oh, you have this critical, critical, critical — fix it. And, like, no, we're not fixing it, because we have compensating controls, you know?


So it's kind of a similar topic — it's about sharing, but it's slightly different. I just wanted to ask in person, and now that I have a microphone, I'm asking. So you're kind of talking about sharing with customers — is that it? Yeah. Okay.


Because, you know, when something is going wrong, we see the customer saying that you need to fix this because — yeah, I think there is a certain balance there, for sure.


I don't know.


Beyond that, I can't say — I think that's a really interesting question, and I haven't personally dealt with it. Do you have anything, Angela?


Thank you.


We have partnerships with certain vendor companies, and they have customers, right, and they see things going on that they will report to us. They also sell that intelligence, so they don't want to give it all away. But they make exceptions when it is a priority and actionable for the industry, because it's the right thing to do, because they are a partner, and they trust us — and trust that we have the dissemination mechanism to get the information out.


So I think the word "balance" that you used is very, very apt. And we're still figuring it out.


Any other questions or comments?


No? Great. Thank you, everyone. All right.

