Data Quality Indicators Composition and Calculus: Engineering and Information Systems Approaches


Bibliographic Details
Main Authors: Leon REZNIK, Sergey Edward LYSHEVSKI
Format: Article
Language: English
Published: IFSA Publishing, S.L., 2015-02-01
Series: Sensors & Transducers
Online Access: http://www.sensorsportal.com/HTML/DIGEST/february_2015/Vol_185/P_2612.pdf
Description
Summary: The Big Data phenomenon is a result of novel technological developments in sensor, computer and communication technologies. Nowadays more and more data are produced by nanoscale photonic, optoelectronic and electronic devices, but their quality characteristics can be very low. The paper proposes new methods for managing huge data volumes, based on associating data quality indicators with each data entity. To achieve this goal, one needs to define the composition of the data quality indicators and to develop a calculus for their integration. As data quality evaluation involves multi-disciplinary research, various metrics have been investigated. The paper describes two major approaches to assigning data quality indicators and developing their integration calculus. The information systems approach employs traditional high-level metrics such as data accuracy, consistency and completeness. The engineering approach utilizes signal characteristics processed with a probability-based calculus. The composition and calculus of the data quality metrics are discussed, the tools developed to automate the metrics selection and calculus procedures are presented, and examples of the user-friendly interface are provided.
ISSN: 2306-8515, 1726-5479
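
The following minimal Python sketch illustrates the core idea from the abstract: attaching quality indicators to each data entity and combining them into a single integrated score. The class names, the 0..1 scaling, the signal_validity field and the weighted-sum integration rule are illustrative assumptions; the paper's actual metric composition and probability-based calculus are defined in the full text.

    from dataclasses import dataclass

    @dataclass
    class QualityIndicators:
        # Information-systems metrics named in the abstract (all scaled to 0..1).
        accuracy: float      # closeness of the recorded value to the true value
        completeness: float  # fraction of expected fields or samples present
        consistency: float   # agreement with related records and constraints
        # Engineering-approach indicator: probability that the raw sensor signal
        # is valid (an assumed stand-in for the probability-based calculus).
        signal_validity: float

        def integrated_score(self, weights=(0.3, 0.2, 0.2, 0.3)):
            # Illustrative integration rule: a weighted sum of the indicators.
            values = (self.accuracy, self.completeness,
                      self.consistency, self.signal_validity)
            return sum(w * v for w, v in zip(weights, values))

    @dataclass
    class DataEntity:
        value: float                # the measurement itself
        quality: QualityIndicators  # quality indicators attached to the entity

    # Example: a sensor reading tagged with its quality indicators.
    reading = DataEntity(
        value=21.7,
        quality=QualityIndicators(accuracy=0.95, completeness=1.0,
                                  consistency=0.90, signal_validity=0.88),
    )
    print(f"integrated quality = {reading.quality.integrated_score():.3f}")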