Cryptogals

Demystifying
Cyber Security

Big data means big opportunities that come with big risks, demanding big measures

Big data analytics in the financial sector around the world has become increasingly crucial for improving business efficiency, reducing operational costs and addressing long-standing business challenges.

The term big data was coined in the early 1990s and, as the name suggests, it is “big.” Not just in size, but also in terms of speed of generation and diversity, which is why traditional computer algorithms are often not able to process big data as efficiently as conventional data.

For example, a massive increase in the number of sales records for revenue calculation in one file does not necessarily make data “big.” On the other hand, a large number of sales records combined with real-time data on customer requests, past purchases and market trends, all continually updated from multiple sources in a variety of formats, can constitute big data that serves goals such as predicting customer behavior (and, like the data, the goals can also evolve).

Technology and methods for the encryption of data at rest and in transit have evolved over the years, while encryption of data during processing is still nascent. The challenge of protecting data during processing becomes more severe with big data because, unlike conventional data, big data is not local (processing depends heavily on shared computing environments), is processed continually (not just when a machine and program are switched on) and has greater longevity, ever evolving and reincarnating.
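To make the distinction concrete, here is a toy Python sketch (not the article's method, and absolutely not production cryptography) contrasting the two ideas: a simple XOR cipher stands in for encryption of data at rest, and textbook RSA's multiplicative homomorphism illustrates how some schemes allow limited computation on data while it stays encrypted. The key sizes and cipher choices here are illustrative assumptions only.

```python
# Toy illustration only: contrast "encryption at rest" with
# "processing data while it remains encrypted".
import secrets

# --- Data at rest: one-time-pad-style XOR cipher ---
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

record = b"sales:1200"
key = secrets.token_bytes(len(record))
ciphertext = xor_cipher(record, key)
assert xor_cipher(ciphertext, key) == record  # decryption restores the data

# --- Processing encrypted data: textbook RSA is multiplicatively
# homomorphic: E(a) * E(b) mod n decrypts to a * b.
# Tiny primes for readability; real RSA uses 2048+ bit moduli. ---
p, q, e = 61, 53, 17
n = p * q                         # public modulus (3233)
d = pow(e, -1, (p - 1) * (q - 1)) # private exponent (Python 3.8+)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
product_cipher = (enc(a) * enc(b)) % n  # computed without ever decrypting
assert dec(product_cipher) == a * b     # 42
```

The XOR example shows why at-rest encryption is easy but inert: nothing useful can be computed on the ciphertext. The RSA example shows, in miniature, the property that fully homomorphic encryption generalizes, which is what makes protecting data during processing an active research area rather than a solved one.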

We identified five key principles to bear in mind when implementing big data analytics.

You can refer to our article on this topic at the International Association of Privacy Professionals (IAPP).
