Papers by Shraddha Phansalkar

PARKTag: An AI–Blockchain Integrated Solution for an Efficient, Trusted, and Scalable Parking Management System
Technologies
The imbalance between parking availability and demand has led to a rise in traffic challenges in many cities. The adoption of technologies like the Internet of Things and deep learning algorithms has been extensively explored for building automated smart parking systems in urban environments. Non-human-mediated, scalable smart parking systems built on decentralized blockchain systems further enhance transparency and trust in this domain. The presented work, PARKTag, integrates a blockchain-based system with computer vision models to detect free on-field parking slots, efficiently navigate vehicles to those slots, and automate the computation of parking fees. This approach aims to enhance the efficiency, scalability, and convenience of parking management by leveraging and integrating advanced technologies for real-time slot detection, navigation, and secure, transparent fee calculation with blockchain smart contracts. PARKTag was evaluated through implementation...
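The fee-automation step lends itself to a small illustration. Below is a minimal, hypothetical sketch of time-based parking-fee computation of the kind a smart contract could enforce; the rate, the ParkingSession structure, and the pro-rata billing rule are illustrative assumptions, not details from the PARKTag paper.

# Minimal sketch of automated parking-fee computation as it might run inside
# a smart contract. Names (RATE_PER_HOUR, ParkingSession) are hypothetical
# illustrations, not taken from the PARKTag paper.
import time
from dataclasses import dataclass, field

RATE_PER_HOUR = 2.0  # assumed flat hourly rate, in token units

@dataclass
class ParkingSession:
    vehicle_id: str
    slot_id: int
    entry_ts: float = field(default_factory=time.time)

    def fee(self, exit_ts: float) -> float:
        """Fee is proportional to occupancy time; partial hours billed pro rata."""
        hours = max(0.0, exit_ts - self.entry_ts) / 3600.0
        return round(hours * RATE_PER_HOUR, 2)

session = ParkingSession(vehicle_id="MH12AB1234", slot_id=7, entry_ts=0.0)
print(session.fee(exit_ts=5400.0))  # 1.5 h at 2.0/h -> 3.0
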
Blockchain-Enabled IoT Security in Automotive Supply Chain
Lecture Notes in Networks and Systems, Springer, 2021
Secure and Transparent Election System for India using Block chain Technology
2018 IEEE Punecon, 2018
Data security, data reliability, and data availability are challenges the world has always been solving. Election and voter data are crucial for a democratic nation. India, in this respect, brings its own challenges. Substantial financial resources are spent on conducting elections. At times, people are unable to cast their vote because they cannot travel back to their constituency. Either way, regular public life is disrupted for days in a row. Notwithstanding the best efforts, the whole process is susceptible to unauthorized access and unauthorized deeds. This paper proposes the use of blockchain technology, powered by Ethereum, to overcome the challenges in the existing voting system.
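The abstract does not include the contract itself; the sketch below mirrors, in plain Python, the invariants an Ethereum voting contract would typically enforce (one vote per registered voter, append-only public tally). All names are hypothetical.

# Minimal sketch of the ballot logic an Ethereum voting contract might enforce.
# Hypothetical illustration; the paper's actual contract is not shown in the abstract.
class Ballot:
    def __init__(self, candidates, registered_voters):
        self.tally = {c: 0 for c in candidates}
        self.registered = set(registered_voters)
        self.voted = set()

    def vote(self, voter_id: str, candidate: str) -> None:
        if voter_id not in self.registered:
            raise PermissionError("unregistered voter")
        if voter_id in self.voted:
            raise PermissionError("double voting rejected")
        if candidate not in self.tally:
            raise ValueError("unknown candidate")
        self.voted.add(voter_id)        # on-chain this would be a state write
        self.tally[candidate] += 1      # tally remains publicly auditable

b = Ballot(["A", "B"], ["v1", "v2"])
b.vote("v1", "A")
print(b.tally)  # {'A': 1, 'B': 0}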

Indian agriculture GDP and non performing assets: A regression model
IOP Conference Series, 2021
Agriculture is one of India’s crucial sectors in terms of its contribution to employment and the country’s Gross Domestic Product (GDP). It has emerged as an essential, growing sector in the global economy since independence. However, the non-realization of reasonable prices for agricultural crop production leads to the introduction of loan waivers, which impact the credit culture and weaken the farming economy and growth. The presented work performs exploratory data analytics on public-domain agricultural GDP data, with feature engineering on the factors affecting agricultural GDP over the period 1961 to 2019. It further builds a multi-linear prediction model to forecast the agriculture sector’s economic performance in terms of GDP and the non-performing assets (NPAs) generated by the sector, using machine learning techniques. Keywords: Multiple Linear Regression, Ordinary Least Squares, Agricultural GDP, Non-Performing Assets, Indian Economy.
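Since the abstract names multiple linear regression fit by ordinary least squares, a minimal sketch follows; the data are synthetic and the three predictors are hypothetical stand-ins for the engineered features, not the paper's dataset.

# Multiple linear regression via ordinary least squares, as named in the
# abstract. Data and feature names are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 59  # e.g. annual observations, 1961-2019
X = rng.normal(size=(n, 3))            # hypothetical: rainfall, credit, sown area
y = 1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(scale=0.1, size=n)

model = LinearRegression().fit(X, y)   # OLS fit
print(model.coef_, model.intercept_)   # estimated feature effects on the GDP proxy
print(model.score(X, y))               # R^2 goodness of fit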

Secured Communication in Vehicular Adhoc Networks (VANETs) using Blockchain
IOP Conference Series: Materials Science and Engineering, 2021
In the last decade, the vehicular ad-hoc network (VANET) field has grown rapidly. In a VANET, communication is the central concern, since all data transmission takes place over it. The transmitted data may take the form of multimedia messages, notifications, announcements, or warning messages, and exchanges among vehicles can include audio and video files within the formed network. These multimedia messages must be transmitted within a fraction of a second, as the network is built for temporary communication. Vehicles involved in this communication must be trustworthy; otherwise, other vehicles in the network could be misguided by an intruder. To achieve security and integrity in the constituted network, the proposed system incorporates a blockchain security environment. Blockchain induces high-end communication in the ad-hoc network...
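The abstract leaves the blockchain construction unspecified; the sketch below shows the core integrity mechanism such a design relies on, a hash-chained message log in which tampering with any earlier message invalidates every later link. Field names are illustrative.

# Minimal hash chain over VANET messages: each block commits to the previous
# block's hash, so altering an earlier message breaks verification downstream.
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_message(chain: list, sender: str, payload: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "sender": sender, "payload": payload}
    block["hash"] = block_hash({k: block[k] for k in ("prev", "sender", "payload")})
    chain.append(block)

def verify(chain: list) -> bool:
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(
            {k: b[k] for k in ("prev", "sender", "payload")}
        ):
            return False
        prev = b["hash"]
    return True

chain: list = []
append_message(chain, "vehicle-17", "accident ahead, lane 2")
append_message(chain, "vehicle-04", "ack, rerouting")
print(verify(chain))          # True
chain[0]["payload"] = "all clear"
print(verify(chain))          # False: tampering detected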

IEEE Access
The advancement and widespread adoption of blockchain technology has led to significant progress in data-critical areas such as DeFi, the health sector, defense, supply chains, and beyond. Alongside this growth, however, there has been a marked rise in both the prevalence and diversity of security vulnerabilities. Recent attacks have exploited these vulnerabilities to target different layers of the blockchain architecture. This paper focuses on the security vulnerabilities, attacks, and challenges faced by blockchain systems, presenting an exhaustive taxonomical classification of blockchain layered security and classifying diverse security issues across the different layers of the blockchain architecture. With the widespread adoption of blockchain in enterprise applications, smart contracts have become a prime target for attackers, leading to significant financial and data losses. Additionally, as diverse applications are deployed across multiple blockchains, the demand for cross-chain contracts is becoming progressively more pressing. Investigating the security of smart contracts and cross-chain interactions is important for defending decentralized applications against vulnerabilities, ensuring the secure transfer of assets among blockchains, and fostering trust and stability within the wider blockchain ecosystem. The paper also examines tools and techniques available in the literature for improving blockchain security, including techniques for vulnerability detection and the growing impact of deep learning in vulnerability detection. This work presents useful insights into the current state of blockchain security and offers a forward-looking perspective on the capability of intelligent, high-dimensional deep learning-based solutions to counter future threats in the blockchain security space.
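As a toy illustration of the learning-based vulnerability detection the survey covers, the sketch below classifies contracts from opcode n-grams; the traces, labels, and tiny model are placeholders for the large labeled corpora and deeper networks used in the surveyed work.

# Illustrative sketch of learning-based smart-contract vulnerability detection:
# contracts are represented by opcode sequences and a classifier flags
# vulnerable patterns. Toy data and toy labels only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

opcode_traces = [
    "CALL SSTORE JUMPI",          # reentrancy-like: external call before state write
    "SSTORE CALL JUMPI",          # state write before call
    "CALL CALL SSTORE",
    "SSTORE SSTORE JUMP",
]
labels = [1, 0, 1, 0]             # 1 = vulnerable (toy labels)

X = CountVectorizer(ngram_range=(1, 2)).fit_transform(opcode_traces)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
print(clf.predict(X))             # recovers the toy labels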

Selective Data Consistency Model in No-SQL Data Store
Advances in Information Security, Privacy, and Ethics Book Series, 2017
Contemporary web applications are deployed on cloud data stores to realize requirements like low latency and high scalability. Although cloud-based database applications achieve high performance on these metrics, they settle for weaker consistency levels. Rationing the consistency guarantees of an application is a necessity to achieve the augmented metrics of application performance. The proposed work is a paradigm shift from monotonic transaction consistency to selective data consistency in web database applications. The selective data consistency model enforces consistency for critical data objects and leaves the consistency of non-critical data objects to the underlying cloud data store; this is called selective consistency, and it results in better performance of cloud-based applications. The consistency of a data object is defined from the user's perspective with a user-friendly consistency metric called the Consistency Index (CI). The selective data consistency model is implemented on a cloud data store with an OLTP workload, and its performance is gauged.
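The Consistency Index admits a direct illustration: the fraction of correct (fresh) reads among all reads on a data object in a given window. A minimal sketch with a toy read log follows.

# Consistency Index (CI) as described: correct reads / total reads observed
# on a data object in a time window. The read-log format is illustrative.
def consistency_index(reads):
    """reads: iterable of (observed_value, latest_committed_value) pairs."""
    reads = list(reads)
    correct = sum(1 for observed, latest in reads if observed == latest)
    return correct / len(reads) if reads else 1.0

read_log = [(10, 10), (10, 12), (12, 12), (12, 12)]  # one stale read of 10
print(consistency_index(read_log))  # 0.75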

Journal of Cloud Computing, Jun 10, 2015
Tunable consistency guarantees in big data stores help achieve optimized consistency guarantees with improved performance. Commercial data stores offer tunable consistency guarantees at the transaction level, where the user specifies the desired level of consistency in terms of the number of replicas participating in read and write consensus. The selective data consistency model applies strict consistency to a subset of data objects. The consistency guarantees of data attributes or objects are measured using an application-independent metric called the consistency index (CI). Our consistency model is predictive and helps express data consistency as a function of known database design parameters, like workload characteristics and the number of replicas of the data object. This work extends the causal relationships presented in our earlier work and presents adaptive consistency guarantees for this consistency model. The adaptive consistency guarantees are implemented with a consistency tuner, which probes the consistency index of an observed replicated data object in an online application. The tuner uses statistically derived threshold values of an optimum time gap which, when padded into a workload stream, guarantees a desired value of the consistency index for the observed data object. The tuner thus works like a workload scheduler for the replicated data object and pads only the required time delay between requests, such that the desired level of consistency is achieved with minimal effect on performance metrics like response time.
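The tuner's scheduling idea can be sketched simply: pad a derived minimum time gap between successive requests so reads land after replica convergence. The threshold below is a placeholder for the statistically derived value in the paper.

# Sketch of the consistency tuner as a workload scheduler: delay each request
# just enough to honour a minimum inter-request time gap.
import time

class ConsistencyTuner:
    def __init__(self, min_gap_s: float):
        self.min_gap_s = min_gap_s      # statistically derived in the paper
        self.last_request_ts = 0.0

    def schedule(self, request):
        """Pad only the required delay, then forward the request."""
        now = time.monotonic()
        wait = self.min_gap_s - (now - self.last_request_ts)
        if wait > 0:
            time.sleep(wait)            # the padded time gap
        self.last_request_ts = time.monotonic()
        return request()                # forward to the data store

tuner = ConsistencyTuner(min_gap_s=0.05)
for i in range(3):
    print(tuner.schedule(lambda: f"read-{i}"))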

IEEE Access, 2021
Interoperability in Electronic Health Records (EHR) is significant for the seamless sharing of information amongst different healthcare stakeholders. Interoperability in EHR aims to devise agreements on its interpretation, access, and storage with security, privacy, and trust. A study and survey of the state-of-the-art literature, prototypes, and projects in standardization of the EHR structure, privacy preservation, and EHR sharing is essential. The presented work conducts a systematic literature review to address four research questions. 1) What are the different standards for common interpretation, representation, and modeling of EHR to achieve semantic interoperability? 2) What are the different privacy-preservation techniques and security standards for EHR data storage? 3) How mature is blockchain technology for building interoperable, privacy-preserving solutions for EHR storage and sharing? 4) What is the state of the art in cross-chain interoperability for EHR sharing? An exhaustive study of these questions establishes the potential of a blockchain-based EHR management framework for privacy preservation, access control, and efficient storage. The study also unveils challenges in the adoption of blockchain in EHR management, given the current maturity of cross-chain interoperable solutions for sharing EHR amongst stakeholders on different blockchain platforms. The research gaps culminate in a proposed blockchain-based EHR framework called MyBlockEHR, with privacy preservation and access control design. The proposed framework partitions EHR data into on-chain and off-chain storage for performance guarantees, with retrieval of valid off-chain data. The framework is deployed on the Ethereum test network with Solidity smart contracts. It is observed that, across different test cases on the partitioning of EHR data, the framework yielded better read-write throughput and effective gas price than fully on-chain storage. Index Terms: blockchain, cross-chain, EHR, interoperability, partitioning.
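The partitioning idea admits a small sketch: bulky EHR content lives off-chain while the chain keeps only its hash, so retrieved data can be validated. The dictionaries below are stand-ins for Ethereum contract state and an off-chain store; names are hypothetical, not MyBlockEHR's actual design.

# Sketch of on-chain/off-chain EHR partitioning with integrity-checked retrieval.
import hashlib, json

on_chain: dict = {}    # record_id -> hash (what a contract would store)
off_chain: dict = {}   # record_id -> full document

def store_record(record_id: str, document: dict) -> None:
    blob = json.dumps(document, sort_keys=True).encode()
    off_chain[record_id] = document
    on_chain[record_id] = hashlib.sha256(blob).hexdigest()

def retrieve_record(record_id: str) -> dict:
    document = off_chain[record_id]
    blob = json.dumps(document, sort_keys=True).encode()
    if hashlib.sha256(blob).hexdigest() != on_chain[record_id]:
        raise ValueError("off-chain data failed on-chain integrity check")
    return document

store_record("ehr-001", {"patient": "p-42", "notes": "routine checkup"})
print(retrieve_record("ehr-001"))   # verified against the on-chain hash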

Consistency is a qualitative measure of database performance. The Consistency Index (CI) quantifies the consistency of a data unit as the percentage of correct reads out of the total reads observed on that data unit in a given time. The consistency guarantee of a replicated database logically depends on the number of reads, the number of updates, the number of replicas, and the workload distribution over time. The objective of our work is to establish this dependency and find the level of interaction of these factors with consistency guarantees, in order to develop a predictor model for CI. We implemented the Transaction Processing Performance Council's TPC-C online transactions benchmark on Amazon SimpleDB, which is used as big-data storage. We controlled the database design parameters and measured CI over 100 samples of workload and database design. The findings helped us implement a prototype of a CI-based consistency predictor using statistical predictive techniques: a) a regression model and b) a multilayer perceptron neural network...
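A minimal sketch of such a CI predictor follows, fitting both a linear regression and a small neural network on synthetic samples that stand in for the paper's 100 measured workload/design points.

# Sketch of the CI predictor: regress the consistency index on database design
# parameters (reads, updates, replicas). Synthetic data only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform([10, 1, 1], [1000, 100, 9], size=(100, 3))  # reads, updates, replicas
ci = np.clip(1.0 - 0.004 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.02, 100), 0, 1)

linear = LinearRegression().fit(X, ci)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=1).fit(X, ci)
probe = np.array([[500, 20, 3]])                 # a hypothetical design point
print(linear.predict(probe), mlp.predict(probe)) # predicted CI under each model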

Adaptation of SQL-Isolation Levels to No-SQL Consistency Metrics
Proceedings of the 3rd International Symposium on Big Data and Cloud Computing Challenges (ISBCC ’16), Smart Innovation, Systems and Technologies, 2016
Big-data applications are deployed on cloud data stores for augmented performance metrics like availability, scalability, and responsiveness. However, they assure higher performance at the cost of lower consistency guarantees. Commercial cloud data stores provide unassured, lower consistency guarantees, which are measured with different metrics. For traditional applications deployed on relational databases with strong, assured consistency guarantees, SQL isolation levels have served as the measure by which users specify their consistency requirements. Migrating these applications to No-SQL data stores necessitates a mapping of the changed levels of consistency from SQL isolation levels to standard No-SQL consistency metrics. This work gives the user insight into the adaptation from SQL isolation levels to a No-SQL consistency metric, and into the changed levels and guarantees of consistency.
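The shape of such a mapping can be sketched as a lookup from isolation level to replica read/write consensus settings; the entries below are illustrative placeholders, not the correspondence derived in the paper.

# Shape of an isolation-level-to-consistency-metric mapping. Placeholder entries.
SQL_TO_NOSQL = {
    # SQL isolation level -> read/write consensus settings over N replicas
    "READ UNCOMMITTED": {"R": 1, "W": 1},
    "READ COMMITTED":   {"R": 1, "W": "QUORUM"},
    "REPEATABLE READ":  {"R": "QUORUM", "W": "QUORUM"},
    "SERIALIZABLE":     {"R": "ALL", "W": "ALL"},
}

def nosql_settings(isolation_level: str) -> dict:
    """Look up replica read/write consensus settings for a SQL isolation level."""
    return SQL_TO_NOSQL[isolation_level.upper()]

print(nosql_settings("repeatable read"))  # {'R': 'QUORUM', 'W': 'QUORUM'}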

Sustainability
Biomedical text summarization (BTS) is proving to be an emerging area of work and research, driven by the need for sustainable healthcare applications such as evidence-based medicine (EBM) practice and telemedicine, which help effectively support the healthcare needs of society. However, with the rapid growth of the biomedical literature and the diversity in its structure and resources, it is becoming challenging to carry out effective text summarization for better insights. The goal of this work is to conduct a comprehensive systematic literature review of significant, high-impact literary work in BTS, with a deep understanding of its major artifacts such as databases, semantic similarity measures, and semantic enrichment approaches. In the systematic literature review conducted, we applied search filters to find high-impact literature in the biomedical text summarization domain from the IEEE, SCOPUS, Elsevier, EBSCO, and PubMed databases. The systematic literature review (SLR) yielded 81...
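One building block such pipelines share is a semantic similarity measure; the sketch below ranks sentences by cosine similarity to a document centroid, with TF-IDF standing in for the richer biomedical representations the review covers. Sentences are made-up examples.

# Extractive ranking by semantic similarity: score each sentence against the
# document-level centroid and surface the most central one.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "Aspirin reduces the risk of myocardial infarction.",
    "The trial enrolled 200 patients across three sites.",
    "Low-dose aspirin lowered cardiac event rates in the cohort.",
]
vectors = TfidfVectorizer().fit_transform(sentences)
centroid = np.asarray(vectors.mean(axis=0))          # document-level representation
scores = cosine_similarity(vectors, centroid).ravel()
ranked = sorted(zip(scores, sentences), reverse=True)
print(ranked[0][1])   # the most central sentence: a one-line extractive summary
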
Online Transaction Processing (OLTP) applications are business applications characterized by high-frequency, short-lived data transactions. In the cloud domain, applications are expected to be highly responsive and low cost, with optimized levels of consistency. Cloud data stores rely on an appropriate data partitioning scheme to achieve promising levels of responsiveness and scalability. This work presents a novel, transaction-aware, static, vertical data partitioning scheme based on denormalization, which performs well for OLTP applications in the cloud domain. The scheme is implemented and tested on contemporary cloud data stores, i.e., Amazon SimpleDB and Hadoop HBase. Our work also proposes a mathematical specification model for TAVPD-based data partitioning and suggests appropriate evaluation factors for a data partitioning scheme in a cloud database.
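A greedy sketch of the transaction-aware idea: attributes that transactions touch together land in the same vertical fragment. This is a simplification; TAVPD's actual scheme and its specification model are more involved, and the workload below is hypothetical.

# Transaction-aware vertical partitioning by attribute co-access grouping.
from collections import defaultdict
from itertools import combinations

transactions = [                       # hypothetical OLTP attribute-access sets
    {"order_id", "status", "total"},
    {"order_id", "status"},
    {"customer", "address"},
    {"customer", "address", "phone"},
]

co_access = defaultdict(int)
for txn in transactions:
    for a, b in combinations(sorted(txn), 2):
        co_access[(a, b)] += 1         # how often two attributes appear together

def partition(attributes, threshold=2):
    """Union attributes whose co-access count reaches the threshold."""
    groups = {a: {a} for a in attributes}
    for (a, b), count in co_access.items():
        if count >= threshold and groups[a] is not groups[b]:
            merged = groups[a] | groups[b]
            for attr in merged:
                groups[attr] = merged
    return {frozenset(g) for g in groups.values()}

attrs = sorted({a for t in transactions for a in t})
print(partition(attrs))   # fragments such as {order_id, status} and {customer, address}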

Electronics
Electronic Health Records (EHR) serve as solid documentation of health transactions and as a vital resource of information for healthcare stakeholders. EHR integrity and security issues, however, continue to be intractable. Blockchain-based EHR architectures address the issues of integrity very effectively. In this work, we propose decentralized patient-centered healthcare data management (PCHDM) with a blockchain-based EHR framework to address issues of confidentiality, access control, and record privacy. This patient-centric architecture keeps the patient at the center of control for secure storage of EHR data. It is effective in a storage environment built on the InterPlanetary File System (IPFS) and blockchain technology. To control unauthorized users, the proposed secure password authentication-based key exchange (SPAKE) implements smart-contract-based access control over EHR transactions and access policies. The experimental setup comprises four Hyperledger...
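The patient-centric access control can be sketched as a policy check gating release of the off-chain content reference; the names, the example CID, and the policy store below are hypothetical, and the SPAKE key exchange itself is not shown.

# Sketch of patient-centric access control: the patient grants or revokes a
# stakeholder's right to a record, and every read is checked against the
# policy before the content reference (e.g. an IPFS CID) is released.
policies: dict = {}    # (patient, record_id) -> set of authorized stakeholders
records: dict = {"rec-1": "QmExampleCidForRec1"}   # record_id -> content address

def grant(patient: str, record_id: str, stakeholder: str) -> None:
    policies.setdefault((patient, record_id), set()).add(stakeholder)

def revoke(patient: str, record_id: str, stakeholder: str) -> None:
    policies.get((patient, record_id), set()).discard(stakeholder)

def read(patient: str, record_id: str, requester: str) -> str:
    if requester not in policies.get((patient, record_id), set()):
        raise PermissionError(f"{requester} is not authorized by {patient}")
    return records[record_id]           # release the off-chain content address

grant("p-42", "rec-1", "dr-lee")
print(read("p-42", "rec-1", "dr-lee"))  # QmExampleCidForRec1
revoke("p-42", "rec-1", "dr-lee")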

Computational Intelligence and Neuroscience
Social media platforms play a key role in fostering the outreach of extremism by influencing the views, opinions, and perceptions of people. These platforms are increasingly exploited by extremist elements for spreading propaganda, radicalizing, and recruiting youth. Hence, research on extremism detection on social media platforms is essential to curb its influence and ill effects. A study of the existing literature on extremism detection reveals that it is restricted to a specific ideology, to binary classification with limited insights into extremist text, and to manual data validation methods for checking data quality. Because existing research studies have used datasets limited to a single ideology, they face serious issues such as class imbalance, limited insights from class labels, and a lack of automated data validation methods. A major contribution of this work is a balanced extremism text dataset, versatile across multiple ideologies and verified by robust data validation methods...
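As a contrast to the binary, single-ideology setups the abstract critiques, the sketch below trains a multi-class text classifier on toy, sanitized examples; a real pipeline would train on the paper's dataset.

# Multi-class (multi-ideology) text classification sketch. Toy data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "join our cause and fight the enemy",
    "community meetup for charity this weekend",
    "our movement will reclaim what is ours",
    "photos from the local gardening club",
]
labels = ["ideology_a", "neutral", "ideology_b", "neutral"]  # multi-class, not binary

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["rally against the enemy tonight"]))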