Papers by Dekera K Kwaghtyo

Application of machine learning techniques for supply chain demand forecasting
International Journal of Advanced Research, 2019
European Journal of Operational Research, 2008
Full collaboration in supply chains is an ideal that the participant firms should try to achieve. However, a number of factors hamper real progress in this direction. Therefore, there is a need for forecasting demand by the participants in the absence of full information about other participants' demand. In this paper we investigate the applicability of advanced machine learning techniques, …
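The abstract above is truncated, but the setting it describes (forecasting demand without full visibility into partners' data) is typically cast as supervised learning on lagged demand. Below is a minimal illustrative sketch using support vector regression; the synthetic series, the four-lag window, and the hold-out split are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: forecast next-period demand from the last four
# observed periods with support vector regression. The sinusoidal
# series and lag count are illustrative assumptions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
demand = 100 + 10 * np.sin(np.arange(120) / 6) + rng.normal(0, 2, 120)

LAGS = 4  # assumed window: predict period t from periods t-4 .. t-1
X = np.array([demand[i:i + LAGS] for i in range(len(demand) - LAGS)])
y = demand[LAGS:]

model = SVR().fit(X[:-12], y[:-12])  # hold out the last 12 periods
print(model.predict(X[-12:]))        # forecasts for the held-out periods
```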
Nigerian Annals of Pure and Applied Science, 2021
Social media provides opportunities for individuals to anonymously communicate and express hateful feelings and opinions from the comfort of their rooms. This anonymity has become a shield for many individuals or groups who use social media to express deep hatred for other individuals or groups, tribes or races, religions, genders, and belief systems. In this study, a comparative analysis is performed using Long Short-Term Memory and Convolutional Neural Network deep learning techniques for hate speech classification. The analysis shows that the Long Short-Term Memory classifier achieved an accuracy of 92.47%, while the Convolutional Neural Network classifier achieved an accuracy of 92.74%. These results show that deep learning techniques can effectively distinguish hate speech from normal speech.
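As a rough illustration of the two architectures compared in the study, the Keras sketch below builds a Long Short-Term Memory classifier and a Convolutional Neural Network classifier for a binary hate/normal label. The vocabulary size, embedding dimension, and layer widths are assumptions; the abstract does not specify them.

```python
# Illustrative Keras sketches of the two compared classifiers.
# VOCAB_SIZE, EMBED_DIM and layer widths are assumptions.
from tensorflow.keras import layers, models

VOCAB_SIZE, EMBED_DIM = 20000, 128

def build_lstm_classifier():
    # Embedding -> LSTM -> sigmoid for a binary hate/normal label.
    return models.Sequential([
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.LSTM(64),
        layers.Dense(1, activation="sigmoid"),
    ])

def build_cnn_classifier():
    # Embedding -> 1D convolution over token windows -> global max pooling.
    return models.Sequential([
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.Conv1D(128, 5, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(1, activation="sigmoid"),
    ])

for build in (build_lstm_classifier, build_cnn_classifier):
    model = build()
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    # model.fit(padded_token_ids, labels, ...) would follow on real data
```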

International Journal of Advanced Trends in Computer Science and Engineering, 2021
Big data is traditionally associated with distributed systems, and this is understandable given that the volume dimension of Big Data appears to be best accommodated by the continuous addition of resources over a distributed network rather than the continuous upgrade of a central storage resource. Based on this implementation context, non-distributed relational database models are considered volume-inefficient, and a departure from their usage has been contemplated by the database community. Distributed systems depend on data partitioning to determine chunks of related data and where in storage they can be accommodated. In existing Database Management Systems (DBMS), data partitioning is automated, which, in the opinion of this paper, does not give the best results, since partitioning is an NP-hard problem in terms of algorithmic time complexity. The cost of this NP-hardness is shown to be reduced by a partitioning strategy that relies on the discretion of the programmer, which is more effective and flexible, though it requires extra coding effort; NP-hard problems are solved more effectively with a measure of human discretion than by full automation. In this paper, the partitioning process is reviewed and a programmer-based partitioning strategy implemented for an application with a relational DBMS backend. By doing this, the relational DBMS is made adaptive in the volume dimension of big data, and the ACID properties (atomicity, consistency, isolation, and durability) of the relational database model, which constitute a major attraction especially for applications that process transactions, are thus harnessed. On a more general note, the results of this research suggest that databases can be made adaptive in the areas of their weaknesses, as a one-size-fits-all database management system may no longer be feasible.
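A minimal sketch of the idea of programmer-directed partitioning, assuming a simple year-keyed schema (the table names and the year-based routing rule are illustrative, not the paper's actual design): the application code, rather than the DBMS, decides which partition table each row belongs to.

```python
# Programmer-directed horizontal partitioning: the application, not the
# DBMS, routes each row to a partition table. Table names and the
# year-based routing rule are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")

# One physical table per programmer-defined partition.
for part in ("orders_2020", "orders_2021"):
    conn.execute(f"CREATE TABLE {part} (id INTEGER, year INTEGER, amount REAL)")

def partition_for(year: int) -> str:
    # The routing rule lives in application code, at the programmer's
    # discretion, instead of being chosen automatically by the DBMS.
    return f"orders_{year}"

def insert_order(order_id: int, year: int, amount: float) -> None:
    conn.execute(f"INSERT INTO {partition_for(year)} VALUES (?, ?, ?)",
                 (order_id, year, amount))

insert_order(1, 2020, 9.50)
insert_order(2, 2021, 12.00)

# A query that carries the partitioning key touches only one partition.
print(conn.execute("SELECT * FROM orders_2021").fetchall())
```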

Journal of Basic Physical Research, 2021
Big Data has been traditionally associated with distributed systems, the reason being that the volume dimension of Big Data, it appears, can be best accommodated by the continuous addition of inexpensive resources. It is within this implementation context that non-distributed database models such as the relational database model have been faulted, and a departure from their usage contemplated by the database community. The atomicity, consistency, isolation, and durability (ACID) properties of the relational database model, however, constitute a major attraction, especially for applications that process transactions. A transaction-laden application may demand a lot of the ACID properties of a database so as to maintain data integrity, while requiring that the ever-increasing volume of data also be accommodated. This means that a one-size-fits-all database, as proposed by several researchers, may end up as a mirage; the current trend suggests that databases be made adaptive in the areas of their weakness rather than thrown out with the bathwater. This paper recognizes that query time is negatively impacted as data volume increases in a relational database, and therefore proposes a Big Data model of the relational database that partitions a relation, thereby allowing volume to grow within partitions rather than within a single relation. The results of the experiments performed show that query time improves as more data is accommodated in the partitions.
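The sketch below illustrates the intuition behind the reported result, assuming a synthetic table and a modulo split into four partitions (both assumptions, not the paper's experimental setup): a lookup that carries the partitioning key only has to scan one partition, so it touches a fraction of the rows.

```python
# Synthetic illustration: the same rows stored once as a single relation
# and once split across four partitions; a keyed lookup then scans only
# one partition. Row counts and the modulo split are assumptions.
import sqlite3
import time
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (k INTEGER, v TEXT)")
for p in range(4):
    conn.execute(f"CREATE TABLE part_{p} (k INTEGER, v TEXT)")

rows = [(k, f"v{k}") for k in range(100_000)]
conn.executemany("INSERT INTO big VALUES (?, ?)", rows)

buckets = defaultdict(list)
for k, v in rows:
    buckets[k % 4].append((k, v))  # the partitioning key decides the bucket
for p, chunk in buckets.items():
    conn.executemany(f"INSERT INTO part_{p} VALUES (?, ?)", chunk)

def timed(sql, args=()):
    start = time.perf_counter()
    conn.execute(sql, args).fetchall()
    return time.perf_counter() - start

key = 67_890
t_full = timed("SELECT v FROM big WHERE k = ?", (key,))
t_part = timed(f"SELECT v FROM part_{key % 4} WHERE k = ?", (key,))
print(f"full relation: {t_full:.4f}s, one partition: {t_part:.4f}s")
```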

Journal of Data Analysis and Information Processing, 2021
The ability of machine learning techniques to make accurate predictions is increasing. The aim of this work is to apply machine learning techniques such as Support Vector Machine, Naïve Bayes, Decision Tree, Logistic Regression, and K-Nearest Neighbour algorithms to predict the shelf life of Okra. Predicting the shelf life of Okra is important because Okra becomes harmful for human consumption if consumed after its shelf life. Okra parameters such as weight loss, firmness, titratable acidity, total soluble solids, vitamin C (ascorbic acid) content, and pH were used as inputs to these machine learning techniques. Support Vector Machine, Naïve Bayes, and Decision Tree each predicted the shelf life of Okra with an accuracy of 100%, while Logistic Regression and K-Nearest Neighbour achieved accuracies of 88.89% and 88.33%, respectively. These results show that machine learning techniques, especially Support Vector Machine, Naïve Bayes, and Decision Tree, can be effectively applied to the prediction of Okra shelf life.
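A minimal sketch of the comparison described above, using scikit-learn. The six feature columns follow the abstract (weight loss, firmness, titratable acidity, total soluble solids, ascorbic acid content, and pH), but the data here is a random stand-in and the default hyperparameters are assumptions, since neither is given in the abstract.

```python
# Sketch of the five-classifier comparison; random stand-in data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((180, 6))      # six measured Okra parameters (assumed scale)
y = rng.integers(0, 2, 180)   # 1 = within shelf life, 0 = expired

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```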

Journal of Information Security, 2021
Video evidence is usually admissible in courts of law all over the world. However, individuals manipulate these videos to defame or incriminate innocent people, while others indulge in video tampering to escape the wrath of the law for their misconduct. One way impostors can forge these videos is through inter-frame video forgery, so the integrity of such videos is under threat: digital forgeries seriously debase the credibility of video content as a definite record of events. This leads to increasing concern about the trustworthiness of video content, which in turn affects the social and legal systems, forensic investigations, intelligence services, and security and surveillance systems. The problem of inter-frame video forgery grows as more video-editing software continues to emerge. These video editing tools can easily manipulate videos without leaving obvious traces, and the tampered videos become viral. Alarmingly, even beginner users of these editing tools can alter the contents of digital videos in a manner that renders them practically indistinguishable from the original by mere observation. This paper leverages the concept of correlation coefficients to produce a more elaborate and reliable inter-frame video forgery detection method to aid forensic investigations, especially in Nigeria. The model employs a threshold to efficiently distinguish forged videos from authentic ones. Benchmark and locally manipulated video datasets were used to evaluate the proposed model, and experimentally our approach performed better than existing methods: accuracy, recall, precision, and F1-score were all 100%. The proposed method, implemented in the MATLAB programming language, has proven effective at detecting inter-frame forgeries.
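The paper's detector is implemented in MATLAB; the Python sketch below only re-expresses the core idea under stated assumptions: consecutive frames of an untampered video are highly correlated, so a dip in the correlation coefficient below a threshold can flag a candidate insertion or deletion point. The threshold value, OpenCV decoding, and function names are illustrative, not the paper's implementation.

```python
# Sketch: flag frame pairs whose Pearson correlation falls below a
# threshold as candidate inter-frame tampering boundaries.
import cv2
import numpy as np

def correlation_dips(video_path: str, threshold: float = 0.9):
    cap = cv2.VideoCapture(video_path)
    prev, dips, idx = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64).ravel()
        if prev is not None:
            # Pearson correlation coefficient between consecutive frames.
            r = np.corrcoef(prev, gray)[0, 1]
            if r < threshold:
                dips.append((idx, r))  # candidate tampering boundary
        prev, idx = gray, idx + 1
    cap.release()
    return dips

# Example usage (hypothetical file name):
# for i, r in correlation_dips("suspect.mp4"):
#     print(f"possible tampering before frame {i}: r = {r:.3f}")
```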

International Journal of Computer Science and Mobile Computing, 2019
The certificate issued by educational institutions is one of the most important documents for a graduate. It is proof of the graduate's qualification and can be used anywhere. However, due to advances in printing and photocopying technologies, fake certificates can be created easily, and the quality of a fake certificate can now be as good as the original. Certificates issued by many institutions have been forged, and these forgeries are difficult to detect. Moreover, many factors have led to reduced operational efficiency in student services in many institutions; one of the most significant is the verification process for educational certificates and related documents. Certificate verification is necessary to ensure that the holder of the certificate is genuine and that the certificate itself comes from a real source, yet verification is a challenge for the verifier (the prospective employer who wants to verify the certificate). To address this issue, a Certificate Verification System for Institutions (CVSI), built using the top-down design approach with an iterative model, is proposed. The system uses a NoSQL database (MongoDB) for certificate storage and Hypertext Preprocessor (PHP) for the front-end design. The university, the graduate, and the verifier are the three parties involved in the proposed solution to accomplish accurate certificate verification. Several benefits can be obtained by using the proposed model, including improved work processes, ease of use and maintenance by the university for the verification process, and long-term operation due to the use of MongoDB, a NoSQL database that supports horizontal scaling.
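The proposed CVSI uses PHP for its front end; the Python/pymongo sketch below only illustrates the storage-and-lookup idea behind the verification step, assuming a local MongoDB instance. The collection name, field names, and the SHA-256 digest check are illustrative assumptions, not the paper's actual schema.

```python
# Sketch: the university registers certificate records in MongoDB; the
# verifier recomputes a digest and checks it against the stored record.
import hashlib
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
certs = client["cvsi"]["certificates"]             # assumed names

def register_certificate(cert_no: str, holder: str, degree: str) -> None:
    # The issuing university stores the record plus a digest of its fields.
    digest = hashlib.sha256(f"{cert_no}|{holder}|{degree}".encode()).hexdigest()
    certs.insert_one({"cert_no": cert_no, "holder": holder,
                      "degree": degree, "digest": digest})

def verify_certificate(cert_no: str, holder: str, degree: str) -> bool:
    # The verifier recomputes the digest from the presented certificate
    # and compares it with what the university stored.
    doc = certs.find_one({"cert_no": cert_no})
    if doc is None:
        return False
    digest = hashlib.sha256(f"{cert_no}|{holder}|{degree}".encode()).hexdigest()
    return doc["digest"] == digest
```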