AUTOMATIC WELL TEST MODELS IDENTIFICATION USING CONVOLUTIONAL NEURAL NETWORK (CNN)

BACHELOR THESIS

Rocky Yan Classica
12219047

Submitted as partial fulfillment of the requirements for the degree of
BACHELOR OF ENGINEERING
in the Petroleum Engineering study program

PETROLEUM ENGINEERING STUDY PROGRAM
FACULTY OF MINING AND PETROLEUM ENGINEERING
INSTITUT TEKNOLOGI BANDUNG
2023

Approved by:

Thesis Adviser I,
Dr. Ir. Dedy Irawan, S.T., M.T., I.P.M.
NIP. 197511052010121004

Thesis Adviser II,
Pahala Dominicus Sinurat, S.T., M.Sc., Ph.D.
NIP. 197705252012121002

AUTOMATIC WELL TEST MODELS IDENTIFICATION USING CONVOLUTIONAL NEURAL NETWORK (CNN)

Rocky Yan Classica*, Dr. Ir. Dedy Irawan, S.T., M.T., I.P.M.**, and Pahala Dominicus Sinurat, S.T., M.Sc., Ph.D.**

Copyright 2023, Institut Teknologi Bandung

Abstract

This study discusses the development of a Pressure Transient Analysis (PTA) method that uses the Convolutional Neural Network (CNN) algorithm to identify well test models automatically. A well test model is a combination of wellbore, reservoir, and boundary models. Although PTA is currently performed with commercial well testing software, that workflow is relatively inefficient because it requires long manual iterations. In this study, the CNN program is trained on a database of pressure and pressure-derivative response images generated through a test design process in commercial well testing software. The database consists of 8 classes with a total of 1,000 images. For training and testing the CNN program, 70% of the data is used as training data and the remaining 30% as test data.
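The 70/30 train/test split described above can be sketched as follows. This is a minimal illustration only: the filenames and the class labels are placeholders, since the actual image database is produced with commercial well testing software and is not reproduced here.

```python
import random

# Hedged sketch of the 70/30 split of the 1,000-image, 8-class database.
# The (filename, class) pairs below are placeholders, not real data.
random.seed(42)
dataset = [(f"image_{i:04d}.png", i % 8) for i in range(1000)]

random.shuffle(dataset)                      # randomize before splitting
split = int(0.7 * len(dataset))              # 70% of the data for training
train_set, test_set = dataset[:split], dataset[split:]

print(len(train_set), len(test_set))         # 700 300
```

A fixed random seed is used here so the split is reproducible; in practice a stratified split (equal class proportions in both sets) is often preferred for balanced image databases like this one.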
The CNN network comprises 12 layers: 5 convolution layers, 5 pooling layers, and 2 fully connected layers. The convolution layers and the first fully connected layer use the Exponential Linear Unit (ELU) activation function, while the second fully connected layer uses the Softmax activation function.
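The architecture described above can be sketched in PyTorch as shown below. The layer counts, activation functions, and 8-class Softmax output follow the abstract; the input image size (64×64 grayscale), the channel progression, and the width of the first fully connected layer are assumptions made only so the sketch is runnable, and are not taken from the thesis.

```python
import torch
import torch.nn as nn

class WellTestCNN(nn.Module):
    """Sketch of the 12-layer CNN: 5 conv + 5 pooling + 2 fully connected,
    ELU in the conv layers and first FC layer, Softmax on the output.
    Channel counts and input size (64x64 grayscale) are assumptions."""

    def __init__(self, num_classes: int = 8):
        super().__init__()
        chans = [1, 8, 16, 32, 64, 64]  # assumed channel progression
        blocks = []
        for c_in, c_out in zip(chans[:-1], chans[1:]):
            blocks += [
                nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
                nn.ELU(),            # ELU activation after each convolution
                nn.MaxPool2d(2),     # each conv layer is paired with pooling
            ]
        self.features = nn.Sequential(*blocks)
        # Five 2x poolings reduce a 64x64 input to 2x2 feature maps.
        self.fc1 = nn.Linear(64 * 2 * 2, 128)   # first FC layer (ELU)
        self.fc2 = nn.Linear(128, num_classes)  # second FC layer (Softmax)

    def forward(self, x):
        x = self.features(x).flatten(1)
        x = nn.functional.elu(self.fc1(x))
        return torch.softmax(self.fc2(x), dim=1)

model = WellTestCNN()
probs = model(torch.randn(4, 1, 64, 64))  # batch of 4 dummy images
print(probs.shape)                        # each row is an 8-class distribution
```

In a real training setup the Softmax would usually be folded into the loss function (e.g. `nn.CrossEntropyLoss` applied to raw logits) for numerical stability; it is written out explicitly here to mirror the layer list in the abstract.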