St. Mary's University Institutional Repository

Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/7879
Title: Development of Bidirectional Amharic-Tigrinya Machine Translation using Recurrent Neural Networks
Authors: Ephrem, Metages
Keywords: Machine Translation, Neural Network, Long Short-Term Memory, Gated Recurrent Unit, BLEU, Amharic, Tigrinya
Issue Date: Jan-2024
Publisher: St. Mary's University
Abstract: Machine translation employs Artificial Intelligence (AI) to autonomously convert text from one language to another, eliminating the need for human intervention. Contemporary machine translation goes beyond word-to-word conversion, aiming to convey the overall meaning of the source text in the target language by analyzing all textual elements and the relationships between words. Its advantages include automated translation assistance, cost-effectiveness, rapid processing, and scalability. Although Neural Machine Translation (NMT) has advanced considerably, little research has been conducted on Ethiopian language pairs. This research aims to determine which recurrent neural network (RNN) architecture is best suited for bidirectional Amharic-Tigrinya machine translation, judged by Bilingual Evaluation Understudy (BLEU) score. The evolution of machine translation has progressed through rule-based, statistical, hybrid, and neural network approaches; among neural network models, RNNs play a significant role and offer a diverse array of architectures. In this study, the researcher used a dataset of 34,350 parallel Amharic and Tigrinya sentences with an 80/20 split for training and testing, respectively. The investigation compared Long Short-Term Memory (LSTM), LSTM with attention, Bidirectional LSTM (BiLSTM), BiLSTM with attention, Gated Recurrent Unit (GRU), GRU with attention, Bidirectional GRU (BiGRU), and BiGRU with attention. Hyper-parameters, including the number of units, layers, and epochs, were first tuned for LSTM and GRU; the optimal values were then applied to the respective models, and the results were compared on BLEU scores. Among the models considered, BiGRU with attention emerged as the most effective for bidirectional Amharic-Tigrinya machine translation: for the Amharic-to-Tigrinya direction it achieved a loss of 0.0775, an accuracy of 0.9786, and a BLEU score of 3.3415. To conclude, this research systematically investigated the experimental setup, hyper-parameter tuning, and model construction processes, providing a comprehensive overview of Amharic-Tigrinya NMT. Each chapter contributes to a nuanced understanding of the specific challenges posed by this linguistic context. The evaluation of the various RNN models underscores the significance of attention mechanisms in improving BLEU scores. Notably, the BiGRU model with attention was the top performer, achieving the highest BLEU score of 3.3415 and thereby substantiating its efficacy in improving translation quality for the Amharic-Tigrinya language pair.
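To make the pipeline described in the abstract concrete, the following is a minimal sketch (in Python/Keras) of a BiGRU encoder-decoder with attention, the family of model reported as the top performer. Everything in it is an illustrative assumption rather than the thesis's actual configuration: the vocabulary sizes, sequence lengths, unit count, and the dot-product attention layer are placeholders.

    import tensorflow as tf
    from tensorflow.keras import layers, Model

    SRC_VOCAB, TGT_VOCAB = 8000, 8000   # assumed vocabulary sizes (placeholders)
    SRC_LEN, TGT_LEN = 30, 30           # assumed maximum sentence lengths
    UNITS = 256                         # assumed hidden size

    # Encoder: source embeddings fed through a bidirectional GRU.
    enc_in = layers.Input(shape=(SRC_LEN,), name="source_tokens")
    enc_emb = layers.Embedding(SRC_VOCAB, UNITS, mask_zero=True)(enc_in)
    enc_out, fwd_h, bwd_h = layers.Bidirectional(
        layers.GRU(UNITS, return_sequences=True, return_state=True)
    )(enc_emb)
    enc_state = layers.Concatenate()([fwd_h, bwd_h])  # shape: (batch, 2*UNITS)

    # Decoder: a unidirectional GRU initialized from the encoder state,
    # trained with teacher forcing (target tokens shifted right as input).
    dec_in = layers.Input(shape=(TGT_LEN,), name="target_tokens")
    dec_emb = layers.Embedding(TGT_VOCAB, UNITS, mask_zero=True)(dec_in)
    dec_out = layers.GRU(2 * UNITS, return_sequences=True)(
        dec_emb, initial_state=enc_state
    )

    # Dot-product attention over the encoder outputs, concatenated with
    # the decoder states before the output projection.
    context = layers.Attention()([dec_out, enc_out])
    logits = layers.Dense(TGT_VOCAB, activation="softmax")(
        layers.Concatenate()([dec_out, context])
    )

    model = Model([enc_in, dec_in], logits)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Training would call model.fit([src_ids, tgt_in_ids], tgt_out_ids, ...)
    # on the 80/20 train/test split mentioned in the abstract.

For BLEU evaluation, a common corpus-level setup uses NLTK's corpus_bleu. This page does not specify the thesis's exact scoring configuration, so the snippet below is only one plausible arrangement, with hypothetical placeholder tokens.

    from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

    # Placeholder tokenized data: one list of references per hypothesis.
    references = [[["selam", "alem"]]]   # hypothetical tokenized reference
    hypotheses = [["selam", "alem"]]     # hypothetical model output

    smooth = SmoothingFunction().method1  # smoothing helps on short sentences
    print(f"BLEU: {corpus_bleu(references, hypotheses, smoothing_function=smooth):.4f}")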
URI: http://hdl.handle.net/123456789/7879
Appears in Collections: Master of Computer Science

Files in This Item:
File: 1. Metages Ephram.pdf (2.61 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.