
Why Good Cyber Security Starts with Great Data Management

Steve Rackham


In an era of zero trust, security controls should be as close to the data as possible.

The threat of cybercrime is a constant, in the best of times as well as in times of crisis. It is, nevertheless, worthy of note when British and American cybersecurity agencies join forces to issue a very rare joint statement. That’s exactly what GCHQ’s National Cyber Security Centre and the US Cybersecurity and Infrastructure Security Agency (CISA) did in early April, warning of an uptick in criminal activity – especially malware and ransomware attacks – during the first weeks of the COVID-19 lockdown.

The financial services industry contributed 62 percent of all exposed data in 2019, though it accounted for only 6.5 percent of data breaches, according to a Bitglass report. Across industries, financial services recorded the second-highest cost per breach, behind only healthcare. Within the sector, an average breach costs $210 per record, while a “mega breach” – as experienced by Capital One when an attack affected around 106 million customers – can cost up to $388 per record.

The concern is logical, even more so in the current environment. More and more people are working from home, taking sensitive company and customer information beyond traditional organizational boundaries. As my colleague Ray White wrote recently, we are seeing levels of internet traffic rarely, if ever, witnessed before, as maps from the likes of Internet Traffic Report make clear. Without the necessary safeguards, home workers are sitting targets. Banks, insurers, and other financial institutions are not adept at working from home: it is not habitual and often not practical. Yet those same institutions now realize that, in the wake of COVID-19, a working-from-home policy must be part of their business continuity planning.

Beyond the headline-grabbing numbers, there remain core principles that sensible organizations must observe. Above all else, good security management is predicated on good data management. Along every step of the security journey – from prevent to detect to respond – knowing where your data is, how to extract it, and how it interoperates across and beyond organizational boundaries is key to ensuring you protect your own and your customers’ most valuable intelligence.

At NetApp we take a data-centric approach to security.

Data, after all, is the most valuable asset you own. And data is the heart of Zero Trust, a methodology that replaces the ‘trust but verify’ approach to security with an unbending command: ‘verify and never trust’. In an industry ecosystem like financial services – where contractors are insiders and where insiders work on, or beyond, the edge – Zero Trust assumes that old notions of perimeter security are obsolete. Those who subscribe to Zero Trust – and that includes NetApp – acknowledge that security controls should be as close to the data as possible.

What does this mean in practice? As we’ve noted on these pages before, good practice demands that you:
  • Know where your organization’s data resides
  • Classify your data
  • Securely dispose of data you no longer require
  • Understand what roles should have access to which data classifications
  • Apply the principle of least privilege to enforce access controls: verify and never trust
  • Use multifactor authentication for administrative access and data access
  • Use encryption for data at rest and data in flight
  • Monitor and log all access
  • Alert on suspicious access or behaviors (see the sketch after this list)
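
The last two items – monitoring and alerting – come down to writing every access attempt to an audit trail and surfacing anomalies automatically. The sketch below is illustrative Python only: the logger name, the denial threshold, and the alert hook are hypothetical stand-ins for whatever SIEM or monitoring stack you actually run.

```python
import logging
from collections import Counter
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("audit")

# Hypothetical threshold: alert if a single user racks up too many denied
# requests. Tune this to your own environment and tooling.
DENIED_THRESHOLD = 3
denied_counts: Counter = Counter()

def record_access(user: str, resource: str, allowed: bool) -> None:
    """Log every access attempt and raise an alert on repeated denials."""
    audit.info("%s user=%s resource=%s allowed=%s",
               datetime.now(timezone.utc).isoformat(), user, resource, allowed)
    if not allowed:
        denied_counts[user] += 1
        if denied_counts[user] >= DENIED_THRESHOLD:
            # Placeholder for a real alerting integration (SIEM, pager, ticket).
            audit.warning("ALERT: %s has %d denied accesses", user, denied_counts[user])

if __name__ == "__main__":
    for _ in range(3):
        record_access("contractor-42", "customer-ledger", allowed=False)
```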

Let’s take just three of those imperatives: encryption, location, and access.

First, flexible encryption and key management solutions help guard sensitive data on premises, in the cloud, and in flight. In other words, encryption is only truly effective if it operates seamlessly regardless of infrastructure, and only if it protects data in flight as well as at rest.
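
For the at-rest half of that requirement, here is a minimal Python illustration using the open-source cryptography package; it is not how NetApp’s own encryption or key management works, just a generic sketch. In practice the key would be fetched from an external key manager rather than generated in the application, and in-flight protection would be delegated to TLS on the transport.

```python
# Illustrative only: symmetric encryption of a record at rest using the
# open-source `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    return Fernet(key).encrypt(plaintext)

def decrypt_record(token: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = Fernet.generate_key()            # stand-in for a key fetched from a KMS
    token = encrypt_record(b"IBAN GB29 ...", key)
    assert decrypt_record(token, key) == b"IBAN GB29 ..."
```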

The second imperative is location. Only by knowing where your data is can you choose what to keep, how to classify it, and how to grant access. Classification, in turn, provides a staging post for regulatory compliance. Determining which data is most sensitive – in relation to regulations as disparate as the Payment Card Industry Data Security Standard (PCI DSS) and the EU General Data Protection Regulation (GDPR) – allows you to put appropriate safeguards in place. In an era of growth and a stronger appetite for cloud across financial services, location will be a crucial element of the risk management approach, not least with regard to regulatory imperatives.
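
One simple way to make classification actionable is to attach each label to the safeguards and regimes it implies. The labels, mappings, and retention figures in the sketch below are hypothetical examples, not guidance; a real taxonomy will be specific to your organization and regulators.

```python
# Hypothetical classification labels and the regimes and safeguards they imply.
CLASSIFICATION_RULES = {
    "cardholder-data": {"regimes": ["PCI DSS"], "encrypt_at_rest": True,  "retention_days": 365},
    "personal-data":   {"regimes": ["GDPR"],    "encrypt_at_rest": True,  "retention_days": 730},
    "public":          {"regimes": [],          "encrypt_at_rest": False, "retention_days": None},
}

def required_safeguards(classification: str) -> dict:
    """Return the safeguards implied by a record's classification.

    Unknown labels are treated conservatively, as if they were personal data.
    """
    return CLASSIFICATION_RULES.get(classification, CLASSIFICATION_RULES["personal-data"])

print(required_safeguards("cardholder-data"))
# -> {'regimes': ['PCI DSS'], 'encrypt_at_rest': True, 'retention_days': 365}
```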

The final imperative is access. As I have alluded to already, access does not equate to trust. That means applying the principle of least privilege, only granting access on the basis of what’s required to carry out a particular function. NetApp Data ONTAP provides predefined access-control roles for cluster and Storage Virtual Machine (SVM) administrators. You can create additional access-control roles for the cluster or an SVM and customize their access to certain commands or command directories.
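
To illustrate the idea conceptually (this is not ONTAP’s interface or syntax), a custom role can be thought of as a mapping from command areas to access levels, with everything not explicitly granted denied by default. The role name and command areas below are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Role:
    """A least-privilege role: any command area not explicitly granted is denied."""
    name: str
    grants: Dict[str, str] = field(default_factory=dict)  # command area -> "all" | "readonly"

    def allows(self, command_area: str, operation: str) -> bool:
        level = self.grants.get(command_area, "none")
        if level == "all":
            return True
        if level == "readonly":
            return operation == "read"
        return False

# Hypothetical custom role for an operator who may inspect volumes and manage
# snapshots, but touch nothing else.
snapshot_operator = Role(
    name="snapshot-operator",
    grants={"volume": "readonly", "volume.snapshot": "all"},
)

print(snapshot_operator.allows("volume", "read"))             # True
print(snapshot_operator.allows("volume", "modify"))           # False
print(snapshot_operator.allows("volume.snapshot", "modify"))  # True
print(snapshot_operator.allows("network", "read"))            # False
```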

In an era of artificial intelligence and machine learning, many financial institutions are looking to transform their cyber security efforts. Smart organizations are applying tools to analyze data from millions of cyber incidents and using it to identify potential threats and develop defense strategies. That said, implementation of such technologies requires companies to focus on creating a data architecture that eliminates bottlenecks and facilitates faster model iteration. Designing a data architecture involves thinking holistically about the data pipeline: from data ingest and edge analytics to data preparation, training in the core data center, and archiving in the cloud. It is critical to understand the performance requirements, datasets, and data services needed.
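
One lightweight way to start that holistic thinking is simply to write the pipeline down as data, stage by stage, with the datasets and performance expectations attached, so gaps and bottlenecks become visible early. The stages, locations, and figures in the sketch below are purely illustrative assumptions, not recommendations.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PipelineStage:
    name: str
    location: str           # e.g. edge, core data center, cloud
    datasets: List[str]
    throughput_target: str  # illustrative figures only

# A purely illustrative end-to-end pipeline for security analytics.
pipeline = [
    PipelineStage("ingest",      "edge",             ["firewall logs", "auth events"], "1 GB/s sustained"),
    PipelineStage("preparation", "core data center", ["normalized events"],            "batch, hourly"),
    PipelineStage("training",    "core data center", ["labeled incidents"],            "GPU-bound"),
    PipelineStage("archive",     "cloud",            ["raw data + model artifacts"],   "cold, retrieval in hours"),
]

for stage in pipeline:
    print(f"{stage.name:12s} @ {stage.location:16s} -> {stage.throughput_target}")
```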

At NetApp, we know that data is your business. That’s why helping you to protect it is our business.

Steve Rackham

Steve Rackham leads the technical pre-sales team for Strategic Enterprise FSI accounts across EMEA. Steve began his career in technology working for Sequent Computers, spending time at Intel and StorageTek. He joined NetApp in 2006, taking a role in pre-sales. For the past 13 years he has worked with customers across the FSI vertical, including heading up a global pre-sales team for a large, multinational bank before moving to his current leadership role. He has been enhancing relationships with customers and strategic partners alike, helping them adopt their own Data Fabric utilizing NetApp’s Data Management solutions. Steve has also been exploring how rapid advances in the use of AI are impacting organizations.


