Papers by Chandrasekar Shastry
Simulation and Modelling of Task Migration in Distributed Systems Using SimGrid
Smart Innovation, Systems and Technologies, 2024
Audio Signal Processing Using MATLAB
2023 International Conference on Network, Multimedia and Information Technology (NMITCON)
A new approach for global task scheduling in volunteer computing systems
International Journal of Information Technology, Sep 3, 2022

International Journal of Information Technology, Nov 9, 2019
Plant identification plays a crucial role in sustaining the balance of the environment and protecting the biodiversity of a region. Recognizing different species of plants by conventional methods for conservation purposes is a tedious task. Today there is a cumulative effort by computer scientists and botanists to automate the entire process of plant identification, with the leaf being a key feature for distinguishing different species of plants. With the advancement of relevant technologies such as digital and mobile cameras, and newer techniques in image processing, pattern recognition, and machine learning, automation of this process has become a reality. In this paper we review the current status of research on computer vision methodologies for the taxonomical identification of plants, and we also focus on research challenges such as the diversity of the taxa to be identified, morphological variation among plants belonging to the same species, small interspecies variations, and the challenges of acquiring high-quality images and standard datasets. Future trends in the use of new technologies, the creation of standard databases, and the interdisciplinary aspects of this research are also discussed.

IET Computers & Digital Techniques, Mar 18, 2021
The test data volume (TDV) increases with increased target compression in scan compression and adds to the test cost. The increased TDV is the result of a dependency across scan flip-flops (SFFs) introduced by the compression architecture, which is absent in scan mode. SFFs that hold the uncompressible values logic-0 and logic-1 in many or most of the patterns contribute to the TDV. In the proposed new scan compression (NSC) architecture, SFFs are analysed from Automatic Test Pattern Generation (ATPG) patterns generated in scan mode. The identification of SFFs to be moved out of the compression architecture is carried out by ranking the SFFs according to the specified values present in the test patterns. The SFFs with the most specified values are moved out of the compression architecture and placed in an outside scan chain, so the NSC is a combination of scan compression and scan mode. The method decides the percentage of SFFs to be moved out of the compression architecture, which is kept below 0.5% of the total SFFs in the design to achieve a better result. The NSC reduces dependencies across the SFFs remaining in the test compression architecture, which reduces the TDV and the test application time. The results show a significant reduction in the TDV, up to 78.14%, for the same test coverage.
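To make the ranking step concrete, here is a minimal Python sketch, assuming each ATPG pattern is represented as a mapping from flip-flop name to '0', '1', or 'X' (don't care); the function name, data layout, and `budget` parameter are hypothetical illustrations, not details from the paper.

```python
from collections import Counter

def rank_sffs(patterns, budget=0.005):
    """Rank SFFs by how often they carry a specified (non-X) value
    across ATPG patterns, and pick the top fraction to move out of
    the compression architecture (the paper keeps this below 0.5%)."""
    specified = Counter()
    all_sffs = set()
    for pat in patterns:
        for sff, val in pat.items():
            all_sffs.add(sff)
            if val in ('0', '1'):   # specified, uncompressible value
                specified[sff] += 1
    k = max(1, int(len(all_sffs) * budget))
    return [sff for sff, _ in specified.most_common(k)]

# Toy usage with hypothetical flip-flop names:
pats = [{'ff1': '1', 'ff2': 'X', 'ff3': '0'},
        {'ff1': '0', 'ff2': 'X', 'ff3': '1'}]
print(rank_sffs(pats, budget=0.7))  # -> ['ff1', 'ff3']
```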

International Journal for Multidisciplinary Research
Electroencephalography, or EEG for short, is a technique used to record the electrical activity of the brain, and it can detect abnormalities that affect how the human brain functions. This method is the most commonly used for recording brain activity in laboratory research, clinical investigations, patient health monitoring, diagnostics, and a variety of other applications because of its non-invasiveness and favourable cost-benefit ratio. Most EEG recordings are contaminated by a variety of irregularities, including those caused by electrode displacement, motion, and ocular and muscular (EMG) activity. These unwanted artifacts can make it difficult to distinguish genuine information from them, in addition to confounding the underlying brain activity. EEG signal artifacts can be removed in a variety of ways; among the most popular artifact reduction techniques are PCA, pure EEG, and the wavelet transform. The study provides a thorough evaluation of cu...
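As an illustration of the wavelet-transform approach mentioned above, here is a minimal sketch using the PyWavelets library with a soft universal threshold; the wavelet choice, decomposition level, and threshold rule are common defaults assumed here, not the settings evaluated in the study.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(eeg, wavelet='db4', level=4):
    """Suppress artifacts in a 1-D EEG trace by soft-thresholding
    its wavelet detail coefficients."""
    coeffs = pywt.wavedec(eeg, wavelet, level=level)
    # Universal threshold, with noise scale estimated from the
    # finest detail band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(eeg)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft')
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[:len(eeg)]

# Toy usage: a 10 Hz rhythm corrupted by broadband noise.
t = np.linspace(0, 2, 512)
noisy = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(512)
clean = wavelet_denoise(noisy)
```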

Periodica Polytechnica Electrical Engineering and Computer Science
Massive computations in today's computer applications necessitate the use of high-performance computing environments. Unfortunately, high costs and power management must be addressed while operating these environments. Volunteer computing (VC) enables the creation of a global network of computing devices capable of pooling their computing power to outperform any supercomputer. VC refers to the use of underutilized computing resources donated by thousands of volunteers who want to actively participate in solving common research problems. However, VC systems experience unexpected and sudden losses of connection between volunteers' computing resources and the main server. In this case, the server must redistribute the work to new devices as they become available. This process is known as task migration, and it is already used in various volunteer frameworks to address the unavailability of computing resources. However, there is a tendency to limit the number of migrations ...
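A minimal sketch of what server-side task migration can look like, assuming a simple heartbeat model in which volunteers report periodically; the timeout value, class layout, and priority scheme are hypothetical and not taken from the paper.

```python
import heapq, time

HEARTBEAT_TIMEOUT = 60.0  # seconds of silence before a volunteer is presumed lost

class MigrationServer:
    def __init__(self):
        self.pending = []    # min-heap of (priority, task_id)
        self.assigned = {}   # task_id -> (volunteer_id, last_heartbeat)

    def heartbeat(self, volunteer_id, task_id):
        self.assigned[task_id] = (volunteer_id, time.time())

    def requeue_lost_tasks(self):
        """Migrate tasks whose volunteers have stopped responding."""
        now = time.time()
        for task_id, (vol, seen) in list(self.assigned.items()):
            if now - seen > HEARTBEAT_TIMEOUT:
                del self.assigned[task_id]
                # Priority 0 so migrated tasks are reassigned first.
                heapq.heappush(self.pending, (0, task_id))

    def assign(self, volunteer_id):
        """Hand the next pending task to a newly available volunteer."""
        if not self.pending:
            return None
        _, task_id = heapq.heappop(self.pending)
        self.assigned[task_id] = (volunteer_id, time.time())
        return task_id
```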
Task Migration in Volunteer Computing Systems
2022 4th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N), Dec 16, 2022

International Journal of Advanced Research in Science, Communication and Technology
Oil well monitoring has metamorphosed over the years from dump well completions and the era of permanent gauges, through the era of hydraulic control wells, to the present era of intelligent well completion. These efforts are geared towards an era in which well data can be collected and interpreted with no human intervention. The aim is to improve recovery (optimization), minimize OPEX and CAPEX, and improve general efficiency. However, intelligent monitoring by virtue of intelligent well completion is still an expensive venture. This paper presents an efficient IoT-based monitoring system in which an ESP32 microcontroller and sensors monitor the well pressure, temperature, level, and flow rate in real time. The data from the oil well is available to the user at any remote location because the sensor data is sent over the MQTT protocol to a cloud service on the internet, and it can also be viewed in an Android mobile app built with MIT App Inventor.
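A minimal sketch of the telemetry path described above, written in Python against the paho-mqtt 1.x client API rather than the ESP32 firmware; the broker address, topic name, and simulated readings are illustrative placeholders, not values from the paper.

```python
import json, time, random
import paho.mqtt.client as mqtt

BROKER = "test.mosquitto.org"          # public test broker (placeholder)
TOPIC = "oilwell/site1/telemetry"      # hypothetical topic name

client = mqtt.Client()                  # paho-mqtt 1.x style constructor
client.connect(BROKER, 1883)
client.loop_start()

while True:
    # Simulated sensor readings standing in for the real transducers.
    reading = {
        "pressure_psi": round(random.uniform(1800, 2200), 1),
        "temperature_c": round(random.uniform(60, 90), 1),
        "level_m": round(random.uniform(1.0, 3.0), 2),
        "flow_bpd": round(random.uniform(400, 600), 1),
    }
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(5)
```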
International Journal for Research in Applied Science and Engineering Technology
The oil and gas surface processing plant, or gathering station, is where the crude oil is separated into its constituent streams of oil, water, and gas. This paper reports the sensor data capture, the process operation, and the SCADA design of the processing unit using Wonderware InTouch software. The system monitors and controls various process variables, including pressure, temperature, level, and flow rates, as well as the emergency shutdown system based on a Cause & Effect chart.
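A Cause & Effect chart maps trip conditions (causes) to shutdown actions (effects). Here is a minimal sketch of how such a chart can drive an emergency shutdown scan; the tag names, trip limits, and actions are invented for illustration, not taken from the plant's actual safety documentation.

```python
# Each cause pairs a (tag, trip-predicate) with the effects it triggers.
CAUSE_EFFECT = {
    "separator high pressure": (("PT-101", lambda v: v > 250.0),
                                ["close ESD valve", "sound alarm"]),
    "separator high level":    (("LT-102", lambda v: v > 2.5),
                                ["stop inlet pump", "sound alarm"]),
    "heater high temperature": (("TT-103", lambda v: v > 120.0),
                                ["cut heater power"]),
}

def scan(readings):
    """Evaluate every cause against live readings; return triggered effects."""
    actions = []
    for cause, ((tag, trips), effects) in CAUSE_EFFECT.items():
        if tag in readings and trips(readings[tag]):
            actions.extend(f"{cause} -> {e}" for e in effects)
    return actions

print(scan({"PT-101": 263.4, "LT-102": 1.9, "TT-103": 118.0}))
# -> ['separator high pressure -> close ESD valve',
#     'separator high pressure -> sound alarm']
```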

Zenodo (CERN European Organization for Nuclear Research), Mar 11, 2023
Operators of telecommunications networks are sitting on a gold mine. They produce enormous amounts of data each day, up to billions of CDRs and events, and these data may be user-, network-, or customer-related. Effectively gathering, storing, processing, and analyzing this amount of data can be very difficult for the operators: the infrastructure must have ample storage space and computational power, and it needs the adaptability to handle various data formats. It is therefore crucial to create the best architecture possible in order to overcome these technical difficulties and satisfy commercial needs. In this paper, we use the seven layers of implementation described in the previous work and implement a potential use case: churn analysis of telecom customers. We also analyze various other use cases, along with case studies, and show how our open-source data pipeline architecture would help the telecommunications sector implement and analyze those use cases.
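A minimal sketch of the churn-analysis step at the end of such a pipeline, assuming per-customer aggregates have already been derived from the raw CDRs; the file name, feature columns, and classifier choice are illustrative stand-ins, not the paper's actual pipeline components.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical per-customer aggregates exported by the pipeline.
df = pd.read_csv("cdr_customer_aggregates.csv")
features = ["calls_per_day", "avg_call_secs", "data_mb", "complaints"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```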
General Purpose Computing on Graphics Processing Units: From Fixed-Function Pipelines to Programmable Cores
2022 4th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N)
Design of WSN Model with NS2 for Animal Tracking and Monitoring
Procedia Computer Science

Scalable Computing: Practice and Experience
Wireless sensor networks (WSNs) have been exploited for countless application domains, most notably the surveillance of environments and habitats, which has already become a critical mission. As a result, WSNs have been deployed to monitor animal care and track animal health status. However, excessive energy utilization and the communication traffic of packet transmissions lead to system deterioration, especially when information captured in the monitoring area is transferred to the access point over multiple dynamic sinks. To manage these energy and data transmission issues, an energy-consumption- and location-aware routing protocol has been architected on the wireless nano sensor nodes. In this article, a novel hybrid energy- and location-aware routing protocol for a cloud-enabled, IoT-based wireless sensor network for animal health monitoring and tracking is proposed. The proposed data routing protocol incorporates the trace file for path selection for ...
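A minimal sketch of energy- and location-aware next-hop selection, assuming each node knows its residual energy and coordinates; the cost function and its weights are illustrative stand-ins for the paper's protocol, not a reproduction of it.

```python
import math

def next_hop(current, sink, neighbors, w_energy=0.5, w_dist=0.5):
    """Pick the neighbor with the best trade-off between residual
    energy and geographic progress toward the sink."""
    def dist(a, b):
        return math.hypot(a["x"] - b["x"], a["y"] - b["y"])

    best, best_cost = None, float("inf")
    for n in neighbors:
        if n["energy"] <= 0:
            continue  # depleted nodes cannot forward
        cost = (w_energy / n["energy"]
                + w_dist * (dist(current, n) + dist(n, sink)))
        if cost < best_cost:
            best, best_cost = n, cost
    return best

sink = {"x": 100, "y": 100}
nbrs = [{"x": 40, "y": 50, "energy": 0.9},
        {"x": 60, "y": 70, "energy": 0.2}]
print(next_hop({"x": 30, "y": 40}, sink, nbrs))
```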
Ewe Health Monitoring Using IoT Simulator
2022 IEEE International Conference on Data Science and Information System (ICDSIS)

International Journal of Electrical and Electronics Research
Electricity use and access to electricity are correlated with the economic development of any country. Economically, electricity cannot be stored, and for the stability of an electrical network a balance between generation and consumption is necessary. Electricity demand depends on various factors such as temperature, everyday activities, the time of day, and the day of the week (working days versus holidays). These factors have led to price volatility and huge spikes in electricity prices. This research work proposes a short-term load prediction model for the LT2 (residential) and LT3 (commercial) consumers of the Karnataka State Electricity Board using a Cascaded Feed-Forward Neural Network (CFNN). MATLAB software is utilized to design and test the forecasting model for predicting power consumption. Furthermore, a shallow feed-forward neural network-based prediction model is constructed and evaluated for performance comparison. The performance metrics include Mean Absolute Percentage Error (MAPE) and Mean Squared Error (MSE)...
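A minimal numpy sketch of the cascade-forward idea, in which the input connects directly to the output layer in addition to the hidden layer (the topology behind MATLAB's cascadeforwardnet), together with the MAPE metric; the weights are untrained placeholders and the feature vector is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8                     # e.g. [temp, hour, weekday, lagged load]

W_ih = rng.normal(size=(n_hidden, n_in))  # input  -> hidden
W_ho = rng.normal(size=(1, n_hidden))     # hidden -> output
W_io = rng.normal(size=(1, n_in))         # input  -> output (the cascade path)

def predict(x):
    h = np.tanh(W_ih @ x)
    return (W_ho @ h + W_io @ x).item()

def mape(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return 100 * np.mean(np.abs((y_true - y_pred) / y_true))

x = np.array([28.5, 18.0, 1.0, 950.0])    # one hypothetical feature vector
print(predict(x))
print(mape([950.0, 1010.0], [940.0, 1035.0]))  # ~1.76 (%)
```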
International Journal of Health Sciences
Peak power management is one of the demand-side management methods aiming at regulating energy usage throughout the day. This paper discusses peak load, which refers to a consumer's peak demand during specific hours of the day, and how to manage it. Shifting loads from peak-demand hours to off-peak hours relieves the pressure on utilities to balance demand and supply while also lowering costs for consumers. The Peak Load Management Model offers a more effective framework for lowering peak loads and moving loads from peak to off-peak hours. In this paper, a cascaded artificial neural network is utilised to construct a demand-side management strategy for managing peak electricity in residential buildings. The results of the peak load control model are highlighted and discussed in the simulation results and performance review.
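A minimal sketch of the load-shifting idea itself, independent of the neural-network controller the paper builds; the 24-hour profile, deferrable loads, and threshold are invented numbers for illustration.

```python
def shift_loads(profile, deferrable, threshold):
    """Move deferrable load out of hours exceeding `threshold`
    into the currently least-loaded hours."""
    profile = list(profile)
    for hour, kw in deferrable:
        if profile[hour] > threshold:
            profile[hour] -= kw
            target = min(range(24), key=lambda h: profile[h])
            profile[target] += kw
    return profile

base = [2, 2, 2, 2, 2, 3, 4, 5, 6, 6, 5, 5,
        5, 5, 5, 6, 7, 9, 9, 8, 6, 4, 3, 2]    # kW drawn in each hour
deferrable = [(17, 2.0), (18, 1.5)]             # (hour, shiftable kW)
print(shift_loads(base, deferrable, threshold=7.0))
```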