Big Data


Source : LeBigData.com

According to this website, ‘Every 48 hours we create as much data as was created from 2003 to today.’

Amazon is a prominent example of a company leveraging Big Data. With millions of customers and an abundance of data, such companies have to invest in data warehouses and employ staff who can regularly input and analyse this data so it can be used for decision-making and feed business intelligence. Information exchange can be almost instant, and customers are served faster than ever.

Some common big data concepts and terms are briefly explained in this article.

Analytics 1.0 (Business Intelligence) – data about production processes, purchases, sales and customer history was recorded and analysed. Gradually, information systems and data warehouses became paramount.
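The kind of roll-up reporting that characterised Analytics 1.0 can be sketched in a few lines; the records and field names below are hypothetical, standing in for internal sales data:

```python
# Analytics 1.0 sketch: a classic BI roll-up over internal sales records.
# The sample records and field names are illustrative, not from the article.
from collections import defaultdict

sales = [
    {"product": "widget", "region": "EU", "amount": 120.0},
    {"product": "widget", "region": "US", "amount": 80.0},
    {"product": "gadget", "region": "EU", "amount": 200.0},
]

def revenue_by_product(records):
    """Aggregate total revenue per product, the staple BI report."""
    totals = defaultdict(float)
    for record in records:
        totals[record["product"]] += record["amount"]
    return dict(totals)

print(revenue_by_product(sales))
```

In practice this aggregation would run as SQL against a data warehouse rather than in application code, but the shape of the computation is the same.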

Analytics 2.0 (Big Data) – big data became distinguished from small data, and it was sourced not just internally but also externally: from the internet, audio and video recordings, different types of sensors, and so on. This was a mix of structured and unstructured data such as mobile data, logs, and social media. Big data that couldn’t be analyzed locally was handled by third-party software such as Hadoop. For unstructured data, companies turned to NoSQL, a new class of databases supporting document, graph, columnar and geospatial data.
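The processing model Hadoop popularized, MapReduce, splits work into a map phase that emits key–value pairs and a reduce phase that combines them. A toy word count in plain Python (the example documents are made up) illustrates the idea without a cluster:

```python
# Toy MapReduce-style word count in plain Python, illustrating the
# model Hadoop uses for data too large to analyze on one machine.
from collections import Counter
from itertools import chain

documents = ["big data big insight", "data at scale"]

def map_phase(doc):
    # Emit (word, 1) pairs, as a Hadoop mapper would for each record.
    return [(word, 1) for word in doc.split()]

def reduce_phase(pairs):
    # Sum the counts per word, as a reducer would after shuffling.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

result = reduce_phase(chain.from_iterable(map_phase(d) for d in documents))
```

On a real cluster the map and reduce phases run in parallel across many machines, with the framework grouping pairs by key in between; the single-process version above preserves only the logical structure.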

Analytics 3.0 (IoT) – connecting increasing amounts of data generated at the edge, on devices and through activities, with data stored in data centres. This means much of the analysis has to be done closer to the edge, combining traditional business intelligence, big data and the Internet of Things (IoT). With advanced analytics and optimization in near real-time, businesses can benefit customers and monetize their data faster.
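One common reason analysis moves to the edge is bandwidth: rather than ship every raw sensor reading to the data centre, a device can pre-aggregate a window of readings and send only a compact summary. A minimal sketch, with hypothetical temperature readings and summary fields:

```python
# Edge-side pre-aggregation sketch: reduce a window of raw sensor
# readings to a small summary before sending it upstream.
# The readings and summary fields are illustrative assumptions.
from statistics import mean

def summarize_window(readings):
    """Collapse a window of raw readings into a compact summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
    }

# One window of (hypothetical) temperature readings from a device.
window = [21.4, 21.9, 22.1, 21.7]
summary = summarize_window(window)
```

The data centre then works with these summaries for dashboards and alerts, requesting raw data only when a summary looks anomalous.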