Let’s start with the basics: what is a data lake?
Below is an oversimplified definition –
This raises another important question:
We know that data is an asset for any organization: it can be used to understand customers better and to extract valuable insights that enhance the business.
So, organizations and businesses that have recognized the value of data use a data lake to store raw data in large volumes. They then process this data with various analytics and machine learning tools. This allows them to use their data assets more effectively and extract value from them.
Let’s consider a curious case of an online delivery platform. We all love to order tasty food from the comfort of our homes. All the data from these orders is available on the platform, which can use a data lake to store it. It can then do some ninja analytics to identify which dish is the most popular, or the most ordered, in a local area. The platform can then slightly increase the price of that dish, knowing it won’t affect demand, and in return increase its share of profit. This simple price change can bring in millions in revenue.
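As a sketch of the kind of analytics described above (the order records, area names, and dish names below are made up purely for illustration):

```python
from collections import Counter

# Hypothetical order records such as a delivery platform might keep in its data lake.
orders = [
    {"area": "Andheri", "dish": "Margherita Pizza"},
    {"area": "Andheri", "dish": "Paneer Tikka"},
    {"area": "Andheri", "dish": "Margherita Pizza"},
    {"area": "Bandra",  "dish": "Sushi Platter"},
]

def most_ordered_dish(orders, area):
    """Return the most frequently ordered dish in a given area."""
    counts = Counter(o["dish"] for o in orders if o["area"] == area)
    dish, _ = counts.most_common(1)[0]
    return dish

print(most_ordered_dish(orders, "Andheri"))  # → Margherita Pizza
```

At real scale the same aggregation would run in a distributed query engine over the lake, but the logic is the same: group orders by dish and pick the top one.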
The core idea behind data lake security is to protect data both in transit and at rest. The large number of tools involved in day-to-day operations also creates plenty of room for security misconfigurations.
So, organizations need a strong cybersecurity posture and strategy to mitigate all these risks.
Amazon, Microsoft, Oracle, Cloudera, and Teradata all have popular data lake options. Each has different mechanisms and processes for implementing a data lake, yet the security issues they face remain the same.
We advise considering the following challenges and mitigations when dealing with a data lake.
Again and again, we have seen in the wild the threats posed by access control issues, and the data lake is no exception. It is essential to have proper access control policies to avoid unauthorized access to data resources. A good approach is to use the built-in Identity and Access Management (IAM) controls provided by the cloud vendor.
Together, these controls act as a layer of defense.
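To make the least-privilege idea concrete, here is a minimal sketch of an IAM-style policy in AWS’s JSON policy grammar, along with a rough check for unintended write access. The bucket name, prefixes, and statement IDs are placeholders, not real resources:

```python
import json

# Sketch of a read-only analyst policy: allow reads on the curated zone,
# explicitly deny everything on the raw zone. Names are hypothetical.
analyst_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadOnlyAnalytics",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-data-lake",
                "arn:aws:s3:::example-data-lake/curated/*",
            ],
        },
        {
            "Sid": "DenyRawZoneAccess",
            "Effect": "Deny",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::example-data-lake/raw/*",
        },
    ],
}

def grants_write(policy):
    """Rough audit: does any Allow statement include a write-style action?"""
    for stmt in policy["Statement"]:
        if stmt["Effect"] != "Allow":
            continue
        actions = stmt["Action"]
        if isinstance(actions, str):
            actions = [actions]
        if any(a.split(":")[1].startswith(("Put", "Delete", "*")) for a in actions):
            return True
    return False

print(json.dumps(analyst_policy, indent=2))
print(grants_write(analyst_policy))  # read-only policy → False
```

In practice the cloud vendor’s policy engine evaluates these documents; the point of the sketch is that write access should be an explicit, auditable grant rather than a default.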
Encrypting stored sensitive data is a requirement of most compliance standards. Fortunately, for enterprises using Amazon S3, Azure ADLS, or other cloud data lake offerings, encryption at rest is available as a built-in service.
Encryption is only as secure as the key used to encrypt and decrypt, so the encryption keys themselves must be strong and well protected. AES with 256-bit keys provides strong encryption. Though encryption is desired and often required, it is not a complete solution on its own, and it should be implemented in a way that does not noticeably impact performance.
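The shape of authenticated encryption with a 256-bit data key can be sketched with the standard library alone. Note the loud caveat: the SHA-256 counter keystream below is a toy stand-in for a real cipher. In production you would use AES-256-GCM from a vetted library or let the cloud provider’s key management service (KMS) handle it:

```python
import hashlib
import hmac
import secrets

# TOY SKETCH ONLY: the keystream below stands in for AES-256-GCM so the
# example stays stdlib-only. Never use this construction for real data.

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key + nonce (cipher stand-in)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity tag
    return nonce + tag + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, tag, ct = blob[:16], blob[16:48], blob[48:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("integrity check failed: wrong key or tampered data")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

data_key = secrets.token_bytes(32)  # 256-bit key, as the article recommends
blob = encrypt(data_key, b"customer record")
assert decrypt(data_key, blob) == b"customer record"
```

The integrity tag illustrates why key management matters: decrypting with the wrong key does not silently produce garbage, it fails loudly, which is the behavior you want before corrupted data flows into analytics.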
Monitoring is the live review of application and security logs. It is essential for detecting intrusions, for retaining logs for forensic analysis and investigations, and for satisfying regulatory compliance requirements. Ensure data security by understanding what kind of data is in the data lake and who is accessing it.
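A minimal sketch of what such monitoring might look for, assuming hypothetical access-log entries (real ones would come from an audit trail such as AWS CloudTrail or the storage service’s own logs):

```python
from collections import Counter

# Hypothetical audit-log entries; field names are made up for illustration.
events = [
    {"user": "etl-service", "action": "GetObject",  "result": "Allowed"},
    {"user": "intern-42",   "action": "GetObject",  "result": "AccessDenied"},
    {"user": "intern-42",   "action": "ListBucket", "result": "AccessDenied"},
    {"user": "intern-42",   "action": "GetObject",  "result": "AccessDenied"},
]

def flag_suspicious(events, threshold=3):
    """Flag principals with `threshold` or more denied requests."""
    denied = Counter(e["user"] for e in events if e["result"] == "AccessDenied")
    return [user for user, n in denied.items() if n >= threshold]

print(flag_suspicious(events))  # → ['intern-42']
```

Repeated denials are a crude but useful signal: they often indicate either a probing attacker or a misconfigured service, and both deserve a look.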
Over the last decade, we have seen major data leaks from top organizations caused by simple misconfigurations. Applying proper access control and running regular data security audits is one way to avoid this. Another is to apply obfuscation, such as masking or tokenization, whenever sensitive data is in transit.
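A regular audit can be as simple as scanning storage configurations for the classic mistakes: public exposure and missing encryption at rest. The bucket records below are hypothetical; a real audit would pull them from the cloud provider’s API:

```python
# Hypothetical bucket configurations for illustration.
buckets = [
    {"name": "lake-raw",     "public": False, "encrypted": True},
    {"name": "lake-exports", "public": True,  "encrypted": True},
    {"name": "lake-scratch", "public": False, "encrypted": False},
]

def audit(buckets):
    """Return a list of (bucket, issue) findings for common misconfigurations."""
    findings = []
    for b in buckets:
        if b["public"]:
            findings.append((b["name"], "publicly accessible"))
        if not b["encrypted"]:
            findings.append((b["name"], "encryption at rest disabled"))
    return findings

for name, issue in audit(buckets):
    print(f"{name}: {issue}")
```

Running a check like this on a schedule, and alerting on any finding, catches the “someone flipped a bucket to public for a demo” class of leak before it becomes a headline.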
It is essential to ensure that cloud vendors have security controls in place. This can be anything from FedRAMP certification to HIPAA compliance. The system should be designed so that it complies with industry and data privacy regulations.