Ensuring Data Integrity in Large-Scale Migration Projects

Authors

  • Hakim Ahmad, Department of Computer Science, Universiti Malaysia Pahang
  • Liyana Salleh, Department of Computer Science, Universiti Malaysia Sabah

Keywords

Data Integrity, Migration Projects, ETL Tools, Apache Kafka

Abstract

This research investigates the critical factors influencing data integrity during large-scale data migration projects, emphasizing the importance of data integrity for decision-making, operational efficiency, and regulatory compliance. Data integrity, encompassing accuracy, consistency, and reliability, is paramount in such projects to prevent data loss, corruption, and extended downtime. The study evaluates challenges such as data mapping, cleansing, and validation, and examines best practices and strategies for maintaining integrity throughout the migration process. It also explores the role of technology and automation in enhancing data integrity, assessing the effectiveness of various tools and methods, from ETL platforms to streaming systems such as Apache Kafka. Through qualitative and quantitative methods, including case studies, interviews, and surveys, the research aims to provide actionable insights and recommendations for successful data migration. The findings are expected to benefit both industry and academia by informing best practices, improving data migration processes, and contributing to the development of new data management frameworks and tools.
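To make the validation practices mentioned in the abstract concrete, the sketch below illustrates one common automated check applied after a migration: comparing row counts and per-row checksums between the source and target tables. This is a minimal illustration, not the method studied in the paper; the function names (validate_migration, row_checksum), the customers table, and the use of in-memory SQLite databases as stand-ins for real source and target systems are all assumptions made for the example.

```python
import hashlib
import sqlite3

def row_checksum(row):
    """Stable checksum of one row: hash its values, in column order."""
    payload = "|".join(str(v) for v in row).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def validate_migration(src_conn, dst_conn, table, key_col):
    """Compare row counts and per-row checksums between source and target.

    Assumes key_col is the first column of the table; returns the keys of
    rows that are missing on either side or whose contents differ.
    """
    query = f"SELECT * FROM {table} ORDER BY {key_col}"
    src = {row[0]: row_checksum(row) for row in src_conn.execute(query)}
    dst = {row[0]: row_checksum(row) for row in dst_conn.execute(query)}
    if len(src) != len(dst):
        print(f"row count mismatch: {len(src)} source vs {len(dst)} target")
    return sorted(k for k in src.keys() | dst.keys() if src.get(k) != dst.get(k))

# Demo: two in-memory databases stand in for the source and target systems.
src_db, dst_db = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src_db, dst_db):
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Aisha"), (2, "Badrul")])
dst_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Aisha"), (2, "Bdrul")])  # value corrupted in transit
print(validate_migration(src_db, dst_db, "customers", "id"))  # -> [2]
```

A common refinement of this pattern is to hash batches or partitions rather than individual rows, so mismatches can be localized without re-reading entire tables.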

Author Biography

Liyana Salleh, Department of Computer Science, Universiti Malaysia Sabah

Published

2021-12-01

How to Cite

Hakim Ahmad, & Liyana Salleh. (2021). Ensuring Data Integrity in Large-Scale Migration Projects. Sage Science Review of Educational Technology, 4(2), 69–92. Retrieved from https://journals.sagescience.org/index.php/ssret/article/view/192