Big Data Cleaning Algorithms in Cloud Computing
DOI: https://doi.org/10.3991/ijoe.v9i3.2765
Keywords: big data, cleaning algorithms, cloud computing, data cleaning, Map-Reduce
Abstract
Big data cleaning is an important research issue in cloud computing theory. Existing data cleaning algorithms assume that all data can be loaded into main memory at once, which is infeasible for big data. To address this, a knowledge-base-driven data cleaning algorithm for cloud computing is proposed and implemented with Map-Reduce. The algorithm first extracts atomic knowledge from the selected nodes, then analyzes the relations among the extracted items, removes duplicate objects, builds an atomic knowledge sequence ordered by weight, and finally cleans the data according to that sequence. Experimental results show that the algorithm is effective and feasible in the cloud computing environment and scales well.
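The abstract only outlines the cleaning flow, so the following is a minimal sketch of how such a Map-Reduce style pipeline could look, not the authors' implementation. It assumes records arrive as (node_id, attribute, value) triples treated as atomic knowledge, and uses attribute frequency as a purely illustrative stand-in for the paper's weights.

```python
# Sketch of a Map-Reduce style cleaning flow: map emits atomic facts,
# shuffle groups them, reduce drops duplicates and weights survivors,
# and the weighted facts are ordered into a cleaning sequence.
# Record format and weight function are assumptions for illustration.
from collections import defaultdict

def map_phase(records):
    """Emit (atomic_fact, 1) pairs; each (node, attribute, value) triple is one fact."""
    for node_id, attribute, value in records:
        yield (node_id, attribute, value), 1

def shuffle(pairs):
    """Group counts by fact, as the Map-Reduce framework would between phases."""
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reduce_phase(groups):
    """Collapse duplicate facts and assign an illustrative weight (attribute frequency)."""
    attr_freq = defaultdict(int)
    for (_, attribute, _), counts in groups.items():
        attr_freq[attribute] += sum(counts)
    # One surviving fact per key; its weight is the frequency of its attribute.
    return [(fact, attr_freq[fact[1]]) for fact in groups]

def build_sequence(weighted_facts):
    """Order the atomic knowledge by descending weight to drive the cleaning pass."""
    return [fact for fact, _ in sorted(weighted_facts, key=lambda fw: -fw[1])]

if __name__ == "__main__":
    raw = [
        ("n1", "name", "Alice"),
        ("n1", "name", "Alice"),   # exact duplicate, removed in the reduce step
        ("n2", "name", "Bob"),
        ("n2", "age", "31"),
    ]
    sequence = build_sequence(reduce_phase(shuffle(map_phase(raw))))
    print(sequence)  # cleaning rules would then be applied in this order
```

In an actual cluster deployment the map and reduce functions would be submitted to a framework such as Hadoop rather than chained in memory; the sketch keeps everything local so the data flow of the four steps is easy to follow.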
Published: 2013-06-11
How to Cite
Zhang, F., Xue, H.-F., Xu, D.-S., Zhang, Y.-H., & You, F. (2013). Big Data Cleaning Algorithms in Cloud Computing. International Journal of Online and Biomedical Engineering (iJOE), 9(3), pp. 77–81. https://doi.org/10.3991/ijoe.v9i3.2765
Section: Papers