Posts

Showing posts from March, 2021

Characteristics of big data analysis

8. What are the characteristics of big data analysis (including visualisations)?

5 Characteristics of Big Data + Visualisation:

Volume
If we see big data as a pyramid, volume is the base. The volume of data that companies manage skyrocketed around 2012, when they began collecting more than three million pieces of data every day. "Since then, this volume doubles about every 40 months."

Velocity
In addition to managing data, companies need that information to flow quickly, as close to real time as possible. So much so that the MetLife executive stressed that: "Velocity can be more important than volume because it can give us a bigger competitive advantage. Sometimes it's better to have limited data in real time than lots of data at a low speed."

Variety
The third V of big data is variety. A company can obtain data from many different sources: from in-house devices to smartphone GPS technology or what people are saying on social networks. The importance of these sources of …

Limitations of traditional data analysis

7. What are the limitations of traditional data analysis?

Traditional statistics is about analysing and summarising data, and is suited to processing smaller amounts of linear, repeatable data to find a solution. It is very useful in environments where the data, and the relationships between the data, are relatively stable. It is still by far the most commonly used methodology and continues to be good for mid- to long-term forecasting based on prior sales history, where a few hundred or even fewer data points may generate a reasonable forecast.

The important limitations of statistics are:

· Statistics laws are true on average. Statistics are aggregates of facts, so a single observation is not a statistic.
· Statistics deal with groups and aggregates only.
· Statistical methods are best applicable to quantitative data.
· Statistics cannot be applied to heterogeneous data.
· If sufficient care is not exercised in collecting, analysing and interpreting the data, statistical results might be misleading …
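A minimal sketch of the first limitation above, using hypothetical monthly sales figures and Python's standard statistics module: the aggregate is what carries meaning, while any single observation can deviate widely from it.

```python
import statistics

# Hypothetical monthly sales; 400 is a deliberate outlier month.
monthly_sales = [120, 95, 130, 110, 400, 105]

mean = statistics.mean(monthly_sales)
median = statistics.median(monthly_sales)
spread = statistics.stdev(monthly_sales)

print(f"mean={mean:.1f} median={median:.1f} stdev={spread:.1f}")
# The outlier month (400) says little on its own, yet it pulls the mean
# well above the median; only the aggregate summarises the group.
```

This illustrates why "statistics laws are true on average": the single value 400 is not a statistic, but it distorts the summary of the group it belongs to.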

Traditional data analysis

6. What is traditional data analysis (descriptive and inferential)?

Traditional data is the structured data that is maintained by all types of businesses, from very small firms to big organizations. In a traditional database system, a centralized database architecture is used to store and maintain the data in a fixed format, or fields in a file. Structured Query Language (SQL) is used for managing and accessing the data.

3 characteristics of traditional data analysis:

· Traditional analytics is static. It typically relies on dashboards composed of visualizations. These dashboards are based on common business questions and are predefined well in advance.

· Traditional analytics answers "what." It answers a series of "what" questions and leaves the user to determine "why" through their own analyses.

· Traditional analytics is driven by hypotheses. Dashboards are typically predefined based on common …
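As a concrete sketch of the description above, here is a fixed-schema table queried with SQL, using Python's built-in sqlite3 module as a stand-in for a centralized relational store. The table name and values are hypothetical; the query poses the kind of predefined "what" question a dashboard would answer.

```python
import sqlite3

# In-memory database standing in for a centralized relational store.
conn = sqlite3.connect(":memory:")

# Data lives in fixed fields, defined up front.
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 100.0), ("South", 250.0), ("North", 150.0)],
)

# A predefined "what" question: what were total sales per region?
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)
conn.close()
```

The query answers "what" (totals per region); determining "why" one region outsold another is left to the analyst, which is exactly the limitation the post describes.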

The value of data

5. What is the value of data (including future value)?

Today's wealth lies in data. There will be 4.8 billion internet users by 2022, up from 3.4 billion in 2017.

There are a few frameworks for trying to put a value on data. James E. Short of The Sloan Management Review defines data value as the combination of a few types of approaches to determining value:

1. The asset, or stock, value
2. The activity value
3. The expected, or future, value
4. The prudent value

Some facts about the value of data:

· Bad data costs the US $3.1 trillion annually.
· Data investments in the financial services industry will account for nearly $9 billion in 2018 alone.
· AI's impact on marketing is growing, predicted to reach nearly $40 billion by 2025.
· The salaries of data scientists are rapidly increasing with demand.
· IoT will save consumers and businesses $1 trillion a year by 2022.

Reference: https://www.dataversity.net

Reasons for the growth of data

4. What are the reasons for the growth of data?

The rapidly increasing volume and complexity of data are due to growing mobile data traffic, cloud-computing traffic, and the burgeoning development and adoption of technologies including IoT and AI, which are driving the growth of the big data analytics market. Over 2.5 quintillion bytes of data are generated every day. Data is created by every click, swipe, share, search, and stream, proliferating the demand for big data analytics globally.

5 reasons big data is growing so fast:

1. We have more data now than we ever had in the past, and even more is expected in the future.
2. Today we have more computing power and better, more economical ways to harness it.
3. There are a lot of innovations and developments in software for unstructured data.
4. Analytical algorithms are advancing in several areas, most importantly with machine learning.
5. The uptake and evolution of the Internet of Things (IoT) has made a subs…
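A quick sanity check on the "2.5 quintillion bytes per day" figure, taking quintillion in the short-scale sense (10^18) and converting through the usual decimal (SI) data units:

```python
# 2.5 quintillion bytes per day, expressed in decimal (SI) units,
# where each step up is a factor of 1000.
DAILY_BYTES = 2.5e18

units = ["B", "KB", "MB", "GB", "TB", "PB", "EB"]
value, unit = DAILY_BYTES, units[0]
for next_unit in units[1:]:
    if value < 1000:
        break
    value /= 1000
    unit = next_unit

print(f"{value:g} {unit}")  # 2.5 EB: 2.5 quintillion bytes is 2.5 exabytes
```

So the daily figure quoted above amounts to roughly 2.5 exabytes per day.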

Growth of the data

3. Give examples of the growth of data (including measures of data)

Historical development of big data

2. Give a description of the historical development of big data

90% of the available data has been created in the last two years. The earliest records of using data to track and control businesses date back 7,000 years, to when accounting was introduced in Mesopotamia in order to record the growth of crops and herds.

This is the historical timeline of big data:

1943 – The UK developed the first data-processing machine to decipher Nazi codes.
1945 – ENIAC, the first electronic general-purpose computer, was completed.
1954 – The first fully transistorised computer was built, using transistors and diodes and no vacuum tubes.
1964 – The IBM System/360 family of mainframe computer systems was launched.
1971 – Intel's 4004 became the first general-purpose programmable processor.
1973 – Xerox unveiled the first desktop system to include a graphical user interface and internal memory storage.
1977 – ARCnet introduced the first LAN at Chase Manhattan Bank, connecting 25…

Define Big Data

1. Define Big Data

Big data is a term that describes the large volume of data, both structured and unstructured, that inundates a business on a day-to-day basis. But it's not the amount of data that's important; it's what organizations do with the data that matters. Big data can be analysed for insights that lead to better decisions and strategic business moves.

The three Vs of big data:

Volume: The amount of big data; sometimes big companies don't even know the amount of data produced on a daily basis.

Velocity: The fast rate at which data is received and acted on.

Variety: The many types of data that are available.

Reference: https://www.oracle.com/uk/big-data/what-is-big-data/