
Automating Data Quality Monitoring In Machine Learning Pipelines

© 2023 by IJACT

Volume 1 Issue 3

Year of Publication : 2023

Author : Naveen Edapurath Vijayan

DOI : 10.56472/25838628/IJACT-V1I3P111

Citation :

Naveen Edapurath Vijayan, 2023. "Automating Data Quality Monitoring In Machine Learning Pipelines", ESP International Journal of Advancements in Computational Technology (ESP-IJACT), Volume 1, Issue 3: 104-111.

Abstract :

This paper addresses the critical role of automated data quality monitoring in Machine Learning Operations (MLOps) pipelines. As organizations increasingly rely on machine learning models for decision-making, ensuring the quality and reliability of input data becomes paramount. The paper explores various types of data quality issues, including missing values, outliers, data drift, and integrity violations, and their potential impact on model performance. It then examines automated detection methods, such as statistical analysis, machine learning-based anomaly detection, rule-based systems, and data profiling. The integration of data quality monitoring into different stages of the MLOps pipeline is discussed, emphasizing continuous monitoring at data ingestion, pre-training validation, post-deployment drift detection, and feedback loops for model retraining. The paper also addresses key challenges in implementing automated data quality monitoring, including balancing precision and recall in anomaly detection, handling high-dimensional and unstructured data, managing false positives and alert fatigue, and adapting to evolving data distributions. By providing a comprehensive framework for automating data quality monitoring in MLOps pipelines, this paper aims to equip practitioners with the knowledge and strategies necessary to enhance the reliability and performance of machine learning systems in production environments.
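To make the detection methods named in the abstract concrete, the following is a minimal, illustrative sketch (not taken from the paper itself): rule-based checks for missing values and z-score outliers, plus a two-sample Kolmogorov-Smirnov test for distribution drift between a reference dataset and an incoming batch. All function names and thresholds here are assumptions chosen for the example.

```python
import numpy as np
from scipy import stats


def check_quality(batch, max_missing_frac=0.05, z_thresh=4.0):
    """Rule-based checks: flag excessive missing values and z-score outliers.

    Returns a list of human-readable issue descriptions (empty if clean).
    """
    issues = []
    arr = np.asarray(batch, dtype=float)

    # Check 1: fraction of missing (NaN) values against a tolerance.
    missing_frac = np.isnan(arr).mean()
    if missing_frac > max_missing_frac:
        issues.append(
            f"missing fraction {missing_frac:.2%} exceeds {max_missing_frac:.0%}"
        )

    # Check 2: simple z-score outlier detection on the non-missing values.
    clean = arr[~np.isnan(arr)]
    z = np.abs((clean - clean.mean()) / clean.std())
    n_outliers = int((z > z_thresh).sum())
    if n_outliers:
        issues.append(f"{n_outliers} value(s) beyond {z_thresh} standard deviations")

    return issues


def detect_drift(reference, current, alpha=0.01):
    """Two-sample Kolmogorov-Smirnov test for distribution drift.

    Returns (drift_detected, ks_statistic); drift is flagged when the
    p-value falls below the significance level alpha.
    """
    statistic, p_value = stats.ks_2samp(reference, current)
    return p_value < alpha, statistic
```

In a pipeline, `check_quality` would run at data ingestion and `detect_drift` post-deployment, comparing serving data against the training distribution; the paper's broader framework layers alert management and retraining feedback loops on top of such primitives.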

References :

[1] Polyzotis, N., Roy, S., Whang, S. E., & Zinkevich, M. (2018). Data lifecycle challenges in production machine learning: A survey. ACM SIGMOD Record, 47(2), 17-28.

[2] Schelter, S., Lange, D., Schmidt, P., Celikel, M., Biessmann, F., & Grafberger, A. (2018). Automating large-scale data quality verification. Proceedings of the VLDB Endowment, 11(12), 1781-1794.

[3] Breck, E., Cai, S., Nielsen, E., Salib, M., & Sculley, D. (2017). The ML test score: A rubric for ML production readiness and technical debt reduction. 2017 IEEE International Conference on Big Data (Big Data), 1123-1132.

[4] Renggli, C., Karlaš, B., Ding, B., Liu, F., Schawinski, K., Wu, W., & Zhang, C. (2019). Continuous integration of machine learning models with ease.ml/ci: Towards a rigorous yet practical treatment. SysML Conference.

[5] Baylor, D., Breck, E., Cheng, H. T., Fiedel, N., Foo, C. Y., Haque, Z., ... & Zinkevich, M. (2017). TFX: A TensorFlow-based production-scale machine learning platform. Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1387-1395.

[6] Sculley, D., Holt, G., Golovin, D., Davydov, E., Phillips, T., Ebner, D., ... & Dennison, D. (2015). Hidden technical debt in machine learning systems. Advances in Neural Information Processing Systems, 28.

[7] Amershi, S., Begel, A., Bird, C., DeLine, R., Gall, H., Kamar, E., ... & Zimmermann, T. (2019). Software engineering for machine learning: A case study. 2019 IEEE/ACM 41st International Conference on Software Engineering: Software Engineering in Practice (ICSE-SEIP), 291-300.

[8] Paleyes, A., Urma, R. G., & Lawrence, N. D. (2020). Challenges in deploying machine learning: A survey of case studies. arXiv preprint arXiv:2011.09926.

[9] Miao, H., Li, A., Davis, L. S., & Deshpande, A. (2017). Towards unified data and lifecycle management for deep learning. 2017 IEEE 33rd International Conference on Data Engineering (ICDE), 571-582.

[10] Schelter, S., Biessmann, F., Januschowski, T., Salinas, D., Seufert, S., & Szarvas, G. (2018). On challenges in machine learning model management. IEEE Data Engineering Bulletin, 41(4), 5-15.

[11] Karlaš, B., Interlandi, M., Renggli, C., Wu, W., Zhang, C., Mukunthu, D., ... & Weimer, M. (2020). Building continuous integration services for machine learning. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2407-2415.

[12] Zaharia, M., Chen, A., Davidson, A., Ghodsi, A., Hong, S. A., Konwinski, A., ... & Xin, R. S. (2018). Accelerating the machine learning lifecycle with MLflow. IEEE Data Engineering Bulletin, 41(4), 39-45.

[13] Renggli, C., Rimanic, L., Hollenstein, N., & Zhang, C. (2021). A data quality-driven view of MLOps. IEEE Data Engineering Bulletin.

[14] Böse, J. H., Flunkert, V., Gasthaus, J., Januschowski, T., Lange, D., Salinas, D., ... & Wang, Y. (2017). Probabilistic demand forecasting at scale. Proceedings of the VLDB Endowment, 10(12), 1694-1705.

[15] Vartak, M., Subramanyam, H., Lee, W. E., Viswanathan, S., Husnoo, S., Madden, S., & Zaharia, M. (2016). ModelDB: A system for machine learning model management. Proceedings of the Workshop on Human-In-the-Loop Data Analytics, 1-3.

Keywords :

MLOps, Data Quality Monitoring, Automated Detection, Machine Learning, Data Drift, Anomaly Detection, Data Integrity, Scalable Solutions, Real-Time Monitoring, Data Validation, Model Performance, Alert Management, High-Dimensional Data, Concept Drift, Production ML Systems.