Big Data refers to large and rapidly changing sets of information whose rates of production and variation keep growing. Part of this data is generated automatically by machines, and the rest is generated by users.
Big Data can be defined by the following characteristics:
Volume – The amount of data that is generated and stored. Data size is key to determining its value and potential; if the data is small, it is not considered Big Data.
Variety – The type and nature of the data. Better categorization of data types leads to better analysis.
Velocity – The rate at which data is generated. A high rate of data generation creates challenges in storing and processing the data.
Variability – Inconsistency in the data can hinder the processes that handle and manage it.
Veracity – The quality of the collected data can affect the accuracy of its analysis.
If we want to define a Big Data set, we can describe it as a data set whose size exceeds what standard software tools and routines can capture, receive, store, manage, and process.
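The characteristics above can be summarized in a minimal sketch. The profile fields, threshold values, and the `looks_like_big_data` check below are all illustrative assumptions introduced for this example, not standard definitions:

```python
from dataclasses import dataclass

# Hypothetical profile of a data set along the characteristics above.
# The thresholds used later are illustrative assumptions, not standards.
@dataclass
class DatasetProfile:
    size_tb: float          # Volume: amount of data generated and stored
    source_types: int       # Variety: number of distinct data types/sources
    events_per_sec: float   # Velocity: rate at which new data arrives
    schema_changes: int     # Variability: inconsistencies over time
    error_rate: float       # Veracity: fraction of low-quality records

def looks_like_big_data(p: DatasetProfile) -> bool:
    # A data set is treated as "big" here when it exceeds what standard
    # tools could comfortably capture, store, manage, and process.
    return (p.size_tb > 1.0
            and p.source_types > 3
            and p.events_per_sec > 10_000)

profile = DatasetProfile(size_tb=50.0, source_types=7,
                         events_per_sec=250_000.0,
                         schema_changes=12, error_rate=0.02)
print(looks_like_big_data(profile))  # prints True for this example
```

Modeling the characteristics as explicit fields makes the definition concrete: a data set qualifies only when several of these dimensions exceed what ordinary tools can handle, not merely when it is large.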