
Security Challenges Facing Big Data – Part 1

1. What is big data?

Big data refers to collections of data that cannot be captured, managed, and processed with conventional software tools within an acceptable time frame. These are massive, fast-growing, and diverse information assets that require new processing models in order to deliver greater decision-making power, insight discovery, and process optimization capabilities.

The research firm Gartner gives a similar definition: “big data” consists of massive, high-growth, and diverse information assets that require new processing models in order to provide stronger decision-making power, insight discovery, and process optimization capabilities.

2. The research background and significance

After cloud computing and the Internet of Things, big data is the latest focus of innovation in the information industry and a new growth area for industrial policy and national security. In the big data context, information security faces many challenges; in particular, existing information security tools cannot meet the actual security requirements of the big data era. Studying the information security problems of this era therefore has important research and application value. Big data has also attracted the attention of national government departments and has become an important direction of strategic planning.

The strategic importance of big data technology lies not in possessing huge amounts of data, but in the specialized processing of the meaningful information those data contain. In other words, if big data is compared to an industry, then the key to profitability in that industry is improving the “processing capability” of data, so that “processing” achieves the “value-add” of the data.

From a technical point of view, big data and cloud computing are as closely related as the two sides of a coin. Big data cannot necessarily be processed on a single computer; it must adopt a distributed architecture, and its defining feature is distributed data mining over massive data sets. It therefore depends on the distributed processing, distributed databases, cloud storage, and virtualization technologies of cloud computing.
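The distributed processing the paragraph above describes is often expressed as a map-reduce pattern: split the data into chunks, process each chunk independently, then merge the partial results. As a minimal sketch (using Python's `multiprocessing` to stand in for a real cluster; systems such as Hadoop or Spark distribute the same pattern across machines rather than local processes):

```python
from multiprocessing import Pool
from collections import Counter
from functools import reduce

def map_count(chunk: str) -> Counter:
    """Map step: each worker counts words in its own chunk of the data set."""
    return Counter(chunk.split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    """Reduce step: merge the partial counts produced by two workers."""
    return a + b

if __name__ == "__main__":
    # Toy "data set" split into chunks, one per worker.
    chunks = ["big data cloud", "data mining cloud cloud", "big data"]
    with Pool(processes=3) as pool:
        partials = pool.map(map_count, chunks)   # distributed map
    totals = reduce(reduce_counts, partials)     # merge on one node
    print(totals["data"])   # 3
    print(totals["cloud"])  # 3
```

Because each chunk is processed independently, the map step scales out simply by adding workers; only the (cheap) merge step is centralized, which is why this architecture tolerates data volumes a single machine cannot hold.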

Big data requires special techniques to efficiently process large volumes of data within a tolerable elapsed time. Techniques applicable to big data include massively parallel processing (MPP) databases, data mining, distributed file systems, distributed databases, cloud computing platforms, the Internet, and scalable storage systems.

3. Big data structure

Big data includes structured, semi-structured, and unstructured data, with unstructured data increasingly becoming the main component. According to an IDC survey, 80% of enterprise data is unstructured, and it grows exponentially at roughly 60% per year. Big data is simply a symptom, or feature, of the current stage of the Internet's development; there is no need to mythologize it or hold it in awe.
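A 60% annual growth rate compounds quickly. A short calculation (the 100 TB starting figure is illustrative, not from the IDC survey) shows what that rate implies over five years:

```python
def project_growth(initial_tb: float, annual_rate: float, years: int) -> float:
    """Project data volume under compound annual growth."""
    return initial_tb * (1 + annual_rate) ** years

# Illustrative: 100 TB of unstructured data growing 60% per year.
start_tb = 100.0
for year in range(1, 6):
    print(f"year {year}: {project_growth(start_tb, 0.60, year):.1f} TB")
# After 5 years: 100 * 1.6**5 ≈ 1048.6 TB — roughly a tenfold increase.
```

This compounding is why storage and processing strategies designed for today's volumes become inadequate within a few years, motivating the scalable architectures discussed above.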

With cloud computing and related technologies setting off a wave of innovation, data that once seemed difficult to collect and use is now easily put to work. Through continuous innovation in every industry, big data will gradually create more value for humanity.

To build a systematic understanding of big data, it must be broken down comprehensively and in detail, expanding across three levels:

The first level is theory. Theory is the necessary path to cognition and the widely accepted baseline for dissemination. Here, the definition of big data's characteristics gives an understanding of the industry's overall picture of big data; an analysis of its value reveals where the preciousness of big data lies; its development trends are surveyed; and the special and important long-term perspective of big data privacy frames the game between people and data.

The second level is technology. Technology is the means by which the value of big data is realized and the cornerstone of its progress. Here, the development of cloud computing, distributed processing, storage technology, and sensing technology describes the whole process of big data, from collection through processing to the storage of results.

The third level is practice. Practice is the ultimate embodiment of the value of big data. Here, four aspects (Internet big data, government big data, enterprise big data, and personal big data) describe both the promising picture big data has already shown and the blueprint that remains to be realized.
