Implementation of change data capture in ETL process for data warehouse using HDFS and Apache Spark

Authors: Denny; Atmaja, I. Putu Medagia; Saptawijaya, Ari; Aminah, Siti
Information
Journal: Proceedings - IWBIS 2017: 2017 International Workshop on Big Data and Information Security (IWBIS)
Publisher: Institute of Electrical and Electronics Engineers Inc. (IEEE)
Volume & Issue: Vol. 2018-January
Pages: 49-55
Publication Year: 2017
ISBN: 978-153862038-0
Source Type: Scopus
Citations
Scopus: 3
Google Scholar: 3
PubMed: 3
Abstract
This study aims to increase ETL process efficiency and reduce processing time by applying the method of Change Data Capture (CDC) in a distributed system using Hadoop Distributed File System (HDFS) and Apache Spark in the data warehouse of the Learning Analytics system of Universitas Indonesia. Usually, increases in the number of records in the data source result in an increase in ETL processing time for the data warehouse system. This condition occurs as a result of an inefficient ETL process using the full load method. Using the full load method, ETL has to process the same number of records as the number of records in the data sources. The proposed ETL model design with the application of the CDC method using HDFS and Apache Spark can reduce the amount of data in the ETL process. Consequently, the process becomes more efficient and the ETL processing time is reduced by approximately 53% on average. © 2017 IEEE.
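The efficiency argument in the abstract — pass only new or changed rows to the ETL process instead of reloading the full source — can be sketched as follows. This is an illustrative reconstruction of the CDC row-diffing idea in plain Python, not the authors' HDFS/Spark implementation; all function and field names here are hypothetical.

```python
# Sketch of the Change Data Capture (CDC) idea: fingerprint each row of the
# previous snapshot, then keep only current rows whose key is new or whose
# fingerprint changed. (Illustrative only; the paper implements this on HDFS
# with Apache Spark, not in plain Python.)
import hashlib


def row_hash(row: dict) -> str:
    """Stable fingerprint of a row's non-key columns."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row) if k != "id")
    return hashlib.sha256(payload.encode()).hexdigest()


def capture_changes(previous: list, current: list) -> list:
    """Return only rows that are new or modified since the last load."""
    seen = {row["id"]: row_hash(row) for row in previous}
    return [row for row in current if seen.get(row["id"]) != row_hash(row)]


old_snapshot = [{"id": 1, "score": 80}, {"id": 2, "score": 75}]
new_snapshot = [
    {"id": 1, "score": 80},  # unchanged: skipped by CDC
    {"id": 2, "score": 90},  # modified: captured
    {"id": 3, "score": 60},  # new: captured
]
delta = capture_changes(old_snapshot, new_snapshot)
# Downstream ETL now processes 2 rows instead of 3, which is the source of
# the processing-time reduction the abstract reports.
```

A full-load ETL would transform all three rows on every run; the CDC step shrinks the working set to just the delta, which is where the reported ~53% average time reduction comes from.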