
When HIPAA Is Outpaced By Technology And the Cyber-Elephant We Need to Confront: Exclusive With CEO

Discussion in 'Hospital' started by The Good Doctor, May 31, 2021.


    Mathieu Gorge is the author of The Cyber-Elephant in the Boardroom, as well as CEO and founder of VigiTrust, which provides Integrated Risk Management SaaS solutions to clients in 120 countries across various industries. He helps CEOs, CxOs, and boards of directors handle cyber accountability challenges through good cyber hygiene and proactive cybersecurity compliance programs. He is a multi-award-winning CEO and an established authority on IT security, information governance, and risk management, with more than 20 years of international experience.

    Mr. Gorge is also a prominent member of the international cybersecurity community and served as President of the French Irish Chamber of Commerce. He is the current Vice President of the Irish section of the French Foreign Trade Advisors, appointed by the French Government. He previously served as the Chairman of Infosecurity Ireland and was an Official Reviewer for ANSI.

    We had the pleasure of talking to Mr. Gorge about what happens when regulations and habits cannot keep pace with the evolution of technology, especially as cybersecurity applies to COVID vaccine passports and other sensitive data being handled today.

    This is the situation we are seeing right now in the healthcare industry, where HIPAA covers most consumer personal health information but has some significant gaps. For example, it does not cover data from Fitbits and other wearable technology, or DNA data from ancestry kits sold by sites like 23andMe.


    Alice Ferng, Medgadget: Please tell us more about yourself, your background, and VigiTrust.

    Mathieu Gorge, CEO & Founder of VigiTrust: I’m Mathieu, CEO and Founder of VigiTrust. We provide software-as-a-service (SaaS) integrated risk management (IRM) tools that enable our clients to prepare for validation and to manage continuous compliance with legal and industry frameworks and regulations like PCI, GDPR, HIPAA, NIST, ISO, and many others – in fact, we are told it covers about 150 of them. The tool is called VigiOne; it’s in use in about 120 countries, primarily in retail, healthcare, hospitality, government, semi-state, higher education, and, to a lesser extent, the transportation industry – primarily airports and airlines. We also run an advisory board, a not-for-profit think tank with 150-plus members who are C-level executives, board directors, law enforcement, regulators, researchers, and security bloggers. When you sign the charter, you get access to a portal, and you can note the membership on your LinkedIn. In fact, we’re doing a lot of updates on that at the moment as part of our own governance. And we also have a community of about 700 security professionals who are invited as guests to some of the events.

    I also created a methodology called the “Five Pillars of Security” about 12 years ago. It’s based on the idea that whenever you look at any information governance or security regulation or framework, anywhere in the world and in any industry, you always come back to five common denominators: 1) people security, 2) physical security, 3) data security, 4) infrastructure security – which is your wider infrastructure: your third parties, first parties, your franchisees, your subsidiaries, your cloud, your applications, your remote workers, and so on – and finally, 5) crisis management: what do you do when something goes wrong.

    We use the five pillars of security specifically for education of board-level and C-level folks. And in fact, it was suggested to me by members of the advisory board that I should write a book about the topic. That’s how “The Cyber-Elephant in the Boardroom” came about. So that’s my background in a nutshell. And obviously, as you can hear from my accent, I’m French, but I’ve been living in Ireland for 25 years.

    Medgadget: How or why did you get into cybersecurity? Was it something you were always interested in?

    Mr. Gorge: I started working in project management, and then, as part of the work I was doing selling project management training, I started selling training to IT organizations. Then I started working in sales in network security back in the late 90s, and I kind of, you know, “got the bug”: I thought it was an interesting industry. After working in that industry for other people for about four years, I felt around 2002 and 2003 – in fact, I started VigiTrust in 2003 – that there was a need for security education and education in data protection. Today it’s common ground, and everybody understands the concept of data protection, but back then it was a new thing. And so even though I started VigiTrust to do data protection training, for the first few years, when I went back to my old clients, they always said, “Well, look, you’re nice, but your business is young, and you’re trying to sell something new. But we’d like to help – can you sell us a firewall?” or something like that. Within three years, we were a reasonably big value-added reseller in Ireland, and we started doing assessments; eventually, we went back to clients and started doing training on data protection, up until about 2012, when we productized the training. Then, around 2016, we stopped doing consulting and pivoted into what is now VigiTrust: a provider of SaaS-based integrated risk management tools. So that’s the background.

    I’ve always been really passionate about it, because I believe in protecting your data. Data is the new currency and the new oil. But it’s also something where, at some stage, some of the data becomes you. Take the whole idea of having to tell your employer that you’ve been vaccinated: suddenly you give your employer a copy of your COVID details and health information. It’s crazy, because the amount of information that we share with other parties has gotten to a stage where there’s very little personal information that you don’t share. And I’ve always been fascinated by that.

    Medgadget: This is a very important and fascinating topic, as that touches on my main professional worlds of tech and healthcare. We’re often talking about HIPAA or GDPR, and the amount of personal data that is willingly or tacitly given away without much further thought from patients or consumers of products. When doing clinical research, one of the first things we’re so careful about is de-identifying everything and making it pretty much impossible to link the data with a specific individual, yet we see many holes in data handling procedures and protocols these days. Please elaborate more on data handling related to the COVID vaccine passport propositions and the main issues that you see with that as a professional in cybersecurity. I think a lot of folks don’t understand what data is actually getting handed over and who handles the data and how that’s stored, or what the consequences long-term may be.

    Mr. Gorge: Yes, this goes back to education around the value of your personal data, right? Everybody tends to understand the value of their credit card data, but equally, they don’t really pay too much attention to it, because if something goes wrong, they contact the credit card provider, and 90% of the time they get their money back within the same day, and 99% of the time they can get the money back. So they’re not too worried about it.

    But your health information is unique, and you can’t go to another hospital and get a different set of health data – your health data is what it is. If you’ve got whatever medical condition you’ve got, even if it’s something small, such as asthma, or if you get tired when you do XYZ, you can’t change that; it’s part of who you are.

    You cannot get a second health identity, and your health identity is unique to you. It’s very important that we convey as security professionals – or medical professionals in your case – the value of that data to people that go to a hospital or to a physician or wherever. Unfortunately, most people are not really aware. And when they become aware of that, it’s when they’re in the hospital and they’re sick, and their primary concern is to get better.

    I think that hospitals have a duty to inform people as to what data they’re going to collect, and they typically do so in the fine print, but nobody really understands that, so it is a challenge. In regards to HIPAA, if I’m not mistaken, HIPAA was enacted in August 1996, so it’s an old framework, which has advantages and disadvantages. The main advantage is that there’s a lot of data as to what works and what doesn’t work, and how you apply the rules. There’s a lot of best practice, and there’s enough jurisprudence out there that says, “Well, you know, there’s a hospital that owns the most expensive laptop in the world, because a practitioner had a laptop with information covered under HIPAA, that laptop was stolen, and they had to pay $3.5 million in fines.” People get that, and that’s the story that gets the attention. But HIPAA is a lot more nuanced than that, and I do think there’s one thing that HIPAA doesn’t cover well, and it’s why it needs to be modernized: the whole idea of software security.

    HIPAA is very good at disaster recovery and at making sure that entities keep the most up-to-date information, such that if there’s a problem, we can bring that data back up, and it will be accurate and as recent as possible. That’s great. HIPAA is good at covering third parties, business associates, and so on, and it’s good at requiring policies and procedures. Where I think HIPAA falls down is that it’s an old regulation when it comes to software security. The part that I feel is missing, at least in my experience of dealing with health systems, is this: if a hospital is using custom-made software, or integrating with custom-made software, we’re not necessarily putting in place the right checks and the right controls around it – has that software’s security architecture been checked, have there been secure coding reviews, and that kind of thing. Right now, you see a lot of attacks going deep into code, and I think that’s an area HIPAA needs to be modernized in.

    Medgadget: So let’s say I’m one of these hospitals, and I’ve just created my own new program. There are actually so many spinning up right now, even around COVID data, trying to track personal health data or do contact tracing in all sorts of ways. Many of these are not necessarily well built, because they were built in haste and often with limited resources, where the actual coder isn’t a professional coder, or at least not one familiar with security protocols. What would be your recommendations here?

    Mr. Gorge: COVID is a great example of people rushing to market with applications that allow you to collect data and use it for surveillance purposes that add value from a traceability or contact tracing perspective, or for access to data to help cure people or look after them. But unfortunately, it’s a case of rushing to market versus doing a risk assessment versus actually making sure that the solution is not increasing your risk surface.

    And rushing to market with something that handles COVID-related data – data that is inherently linked to me and to my personal information – means that if it’s not done the right way and there are no security checks, I am majorly increasing my risk exposure as a health system. So the recommendation would be, of course, to implement tools that allow you to do those things, but to make sure that the applications are pen tested, that the code is reviewed, that the applications are scanned on a regular basis, and that they’ve been risk assessed. And if there’s an issue, they should potentially be suspended for a couple of days to fix the issues, because otherwise it’s just creating a nightmare.

    And I will say that basic stuff, like the checks from OWASP or SANS, should be second nature to the people who actually create those applications. But unfortunately, it’s not, and I think that’s because HIPAA is not really focused on that; HIPAA is focused on other important areas where things are already consistent.
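
    To make the OWASP point above concrete, here is a minimal, hypothetical sketch in Python of one such basic check – validating input and using parameterized queries instead of string-built SQL – for an imagined COVID test-result intake script. The table, field names, and validation rules are illustrative assumptions, not taken from any real hospital system or from VigiTrust’s products.

    # Hypothetical sketch: one OWASP-style basic control (input validation plus
    # parameterized queries) for an imagined COVID test-result intake script.
    # Table and field names are illustrative assumptions only.
    import re
    import sqlite3

    ALLOWED_RESULTS = {"positive", "negative", "inconclusive"}

    def record_test_result(conn: sqlite3.Connection, patient_id: str, result: str) -> None:
        # Validate inputs before they touch the database.
        if not re.fullmatch(r"[A-Z0-9-]{4,32}", patient_id):
            raise ValueError("patient_id has an unexpected format")
        if result not in ALLOWED_RESULTS:
            raise ValueError("unrecognized test result")

        # Parameterized query: the driver escapes the values, so crafted input
        # cannot be interpreted as SQL (the classic injection vector).
        conn.execute(
            "INSERT INTO covid_test_results (patient_id, result) VALUES (?, ?)",
            (patient_id, result),
        )
        conn.commit()

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE covid_test_results (patient_id TEXT, result TEXT)")
        record_test_result(conn, "ABC-1234", "negative")
        print(conn.execute("SELECT * FROM covid_test_results").fetchall())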

    When I look at the way a health system is run – if you simplify “health system” to maybe a group of hospitals – there will be physicians, there might be clinics, there might even be a nursing home that links to the hospital, and so on. So you’ve got all of these interconnected little business units that exchange information. But are they exchanging information in a secure way, in a way where I can say “this is you” and “this is not you”?

    The patient was admitted that day, we took that information, we checked his insurance, we asked him a few questions, we did a COVID test, and the test came back positive. We did another test the next day, and it came back negative, so we had a false positive, and so on. Do I really want all of that data going from one business unit to another? Maybe I do, maybe I don’t, and maybe it has to go to another business unit within the health system so that I can be treated the right way. But I now have copies of all of my data in various business units. Traditionally, HIPAA requires some good controls around that. But with COVID, we’re essentially dealing with something that is so topical and so sensitive that we need to pay attention to it.

    Now going back to the idea of the COVID passport: let’s say I go into the health system and come out COVID free, I go to get my vaccine somewhere in the hospital, I’m given a CDC card, and then I’m told to go fill out an application. I go onto the application and upload all of my information – now that application has my date of birth, my name, where I live, my COVID status, and the type of vaccine that I got. So if you look at Johnson & Johnson, which was halted recently in some places – imagine that I have a vaccine passport that says “has been vaccinated by Johnson & Johnson,” and now somehow somebody gets ahold of that and starts saying, “Well, no, actually, you can’t hang out with Mathieu anymore, because he got that vaccine. That doesn’t work.” Well, from what I understand it’s one in a million, apparently, but you can see the ramifications of all that.

    I think that since the beginning of COVID, we’ve been reacting and taking a short-term approach towards cybersecurity, versus a long-term security goal of making sure that all of the additional information we collect is actually collected the right way, stored the right way, and disposed of the right way. But we’re just not really doing that, you know. We’ve seen a rise in phishing, we’ve seen a rise in ransomware, and we’ve seen a number of groups using the information immediately. We’ve also seen a number of well-known criminal groups that are probably playing the long game by infiltrating critical infrastructure, like hospitals, because hospitals are overwhelmed. Cybersecurity right now is extremely important, but they’re literally overwhelmed, and criminals have no heart, and I dread that they’re going after those targets.

    Medgadget: Yeah, it’s awful, and there’s definitely an upward trend of schemes to steal data from folks. I also agree with you and think that hospitals and healthcare data are most vulnerable right now. What would be your recommendations for healthcare systems and healthcare professionals? Clinical researchers or other hospital staff? How do they even start to learn about this? And what should they immediately implement?

    Mr. Gorge: There’s a high chance that a COVID-19-related cyber-storm is brewing, because we’ve expanded our risk surface tremendously. Whether it’s a health system or any other organization, you will find that a majority of the staff is now working from home, probably using their own devices. And suddenly, you’ve doubled or tripled your risk surface, and sometimes even multiplied it by 10.

    I think you need to go back to mapping your ecosystem. How many business units do you have? What kind of data is each business unit collecting, storing, and manipulating, and for what reason? What is it tracking? What’s the data flow between all of the different business units, and how is that data flow protected? Once you have done all of that, you can go back to the five pillars of security that I mentioned, whereby you can essentially say, “Okay, within this health system environment, how am I doing for physical security? Can somebody just walk in and steal a computer, maybe steal the hard drive of a multifunction printer, and then get access to all of the data that you provide when you check in at the hospital? How’s the people security? Who has access to what, who’s coming in, who’s coming out, and where are they going? Can I trace everything? How’s the data security? What kind of data am I dealing with?”

    So obviously, any type of normal PHI (protected health information), but in addition to that, any type of COVID vaccine information, any type of PII (personally identifiable information), or even payment data – because where things are going with modern hospitals, it’s like a five-star hotel where you can pay to get access to the internet, you can pay to have additional services, and everything is done by credit card or tokens and so on, so everything is interconnected. So where is that data? From the infrastructure perspective: is my infrastructure bulletproof? How much of it is run by third parties or first parties? And then finally, crisis management: what are we going to do if the list of patients in the COVID ward makes it to the dark web? Many have identified that as a potential scenario. My guess is that not all health systems have done that as much as they can, and this is potentially an opportunity for health systems to get their house in order, right? Because if you listen to Bill Gates and other visionaries, there will be more pandemics, and there will be more crises.

    We need to learn from our mistakes – we weren’t completely ready for that type of game-changing event, right? We need to look at architecture. If you look at the IT architecture of any business, whether it’s a hospital, a bank, or a hotel, it’s been completely turned upside down. What you believed to be the right architecture and the right model before may have been the right model at the time, but today it doesn’t work anymore, because 90% of people are working from home. And even if we go back to working in the office, we’re going to have to reorganize and re-engineer the physical space. That means reorganizing the logical access to systems and traceability, so we’re opening up a lot of new doors. We need to be on our A-game.
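
    One way to picture the ecosystem-mapping step Mr. Gorge describes in this answer – list the business units, the data each one holds, and the flows between them, then check how each flow is protected – is sketched below in Python. The business units, data categories, and flows are invented purely for illustration and are not drawn from any real health system.

    # Hypothetical sketch of an ecosystem/data-flow map: business units, the data
    # each holds, the flows between them, and a check on how each flow is protected.
    # All names and categories here are invented for illustration only.
    from dataclasses import dataclass, field

    @dataclass
    class BusinessUnit:
        name: str
        data_collected: set = field(default_factory=set)

    @dataclass
    class DataFlow:
        source: str
        destination: str
        data: set
        encrypted_in_transit: bool

    units = [
        BusinessUnit("hospital_admissions", {"PII", "insurance details", "COVID test result"}),
        BusinessUnit("clinic", {"PII", "PHI"}),
        BusinessUnit("nursing_home", {"PII", "PHI", "COVID vaccination status"}),
    ]

    flows = [
        DataFlow("hospital_admissions", "clinic", {"PII", "COVID test result"}, True),
        DataFlow("clinic", "nursing_home", {"PHI"}, False),
    ]

    # Flag any flow that moves data between units without protection in transit.
    for flow in flows:
        if flow.data and not flow.encrypted_in_transit:
            print(f"Review: {flow.source} -> {flow.destination} sends {sorted(flow.data)} unencrypted")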

    Medgadget: Taking the conversation back to any individual out there – what should folks be thinking about? Most folks don’t think about things the way a cybersecurity professional does, and many don’t realize that they’re being directly or indirectly pressured to provide different sorts of data for all sorts of reasons as they go about their day.

    Mr. Gorge: So I can give you the risk professional’s answer, which would be a long, drawn-out answer: be careful with your data, don’t share data with people you don’t know, if an application looks odd then Google it and look at the reviews, and don’t use 15 different applications to do the same thing, because you’re multiplying copies of the data.

    But I think that the easiest answer is this: if an application related to COVID is asking you to share data that you wouldn’t share with your stylist or with your best friend, be extra vigilant. Do you really need to share it? Do you really need that application? Is it really adding value to you and to protecting your health? Because if the answer is “Eh, not sure,” then you shouldn’t do it. If, on the other hand, the answer is “Yeah, I really need it,” then at that stage you need to dig a bit deeper. Nobody reads the privacy policy unless they need to, but maybe now would be a good time to do that and check that you’re happy to share that data. This is not crying wolf, but you have to ask yourself: if that company is hacked and my data ends up in the public domain, what is the impact going to be? Okay, so my date of birth – well, you can probably find my date of birth anywhere on Google. My address – it’s annoying, but you can probably get it too. My health status – that’s really annoying, and I don’t want anyone to know, because it’s very, very personal. And as I said, there’s no health status “B”; you only have one, and you need to protect it. So I’m not trying to scare people, I’m just trying to say that the real value of health data is tremendous. And yes, there’s HIPAA, and yes, it’s putting in place a lot of good controls and it’s dealing with most of the attacks. But COVID has generated loads of new attacks, and criminals have absolutely no mercy whatsoever. So now is the time to be cautious, I would say.

    Medgadget: Yes, this is just the reality these days, and it’s better to be aware now rather than once it’s too late. Our physiological status and biometrics are definitely unique to us, and this sort of information isn’t something to be trifled with because there can be all sorts of long-term consequences.

    Mr. Gorge: Absolutely. And I go back to the analogy with the credit card. It’s a pain if my credit card data is stolen and used, but there’s so much regulation and so much jurisprudence around this that I am nearly guaranteed to land on my feet within 24 hours. If somebody steals my health data, it’s a whole different ballgame.

    Medgadget: What about the other side of this, in terms of risk management – let’s say it’s too late, and somebody has already given out their data, or they’ve enabled access to something, or agreed to be tracked. What do you do to backtrack? Is that possible?

    Mr. Gorge: Again, I go back to the idea of my five pillars, the fifth pillar being crisis management. If you work with reputable health systems and so on, they will definitely have identified a number of scenarios to address the risk and to take corrective action when something goes wrong. That might be helping you to protect your identity, it might be making sure that it doesn’t happen again, that kind of thing. But the reality is that you’re hoping it won’t happen, especially for health data. And that’s why large health systems also have systems that allow them to stratify the type of data they get: they might have one level for generic personal data, another level for generic health questions, and another level for more in-depth questions, and eventually you end up with material that’s highly confidential. It goes back to the idea of data classification, with people accessing data only on a need-to-know basis. If you don’t need access to that data, you don’t get access to it – why would you? If you work as a triage officer at the reception of the hospital, you don’t need to see the results of my drug tests two days later, because your task is done. Somebody else is dealing with that, and they are the ones who need access to it.
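
    To illustrate the data-classification and need-to-know idea in that last answer, here is a minimal, hypothetical sketch in Python of a role-based check. The roles, classification levels, and clearance mapping are assumptions made up for the example, not a description of any real health system or of VigiOne.

    # Hypothetical sketch of classification-based, need-to-know access control.
    # Roles, levels, and the clearance mapping are illustrative assumptions only.
    from enum import IntEnum

    class Classification(IntEnum):
        GENERIC_PERSONAL = 1      # name, contact details
        GENERIC_HEALTH = 2        # routine health questions
        DETAILED_HEALTH = 3       # in-depth clinical records
        HIGHLY_CONFIDENTIAL = 4   # e.g. drug test results

    # The highest classification each role is allowed to read.
    ROLE_CLEARANCE = {
        "triage_officer": Classification.GENERIC_HEALTH,
        "treating_physician": Classification.HIGHLY_CONFIDENTIAL,
        "billing_clerk": Classification.GENERIC_PERSONAL,
    }

    def can_access(role: str, level: Classification) -> bool:
        # Deny by default: unknown roles get clearance 0, below every level.
        clearance = ROLE_CLEARANCE.get(role, 0)
        return level <= clearance

    # A triage officer does not need to see drug test results two days later.
    print(can_access("triage_officer", Classification.HIGHLY_CONFIDENTIAL))      # False
    print(can_access("treating_physician", Classification.HIGHLY_CONFIDENTIAL))  # True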

    Source
     
