Papers by Mahadevan Subramaniam

Frontiers in Microbiology, Mar 12, 2024
Data-driven Artificial Intelligence (AI)/Machine Learning (ML) image analysis approaches have gained a lot of momentum in analyzing microscopy images in bioengineering, biotechnology, and medicine. The success of these approaches crucially relies on the availability of high-quality microscopy images, which is often a challenge due to the diverse experimental conditions and modes under which these images are obtained. In this study, we propose the use of recent ML-based image super-resolution (SR) techniques to improve the quality of microscopy images, incorporate them into multiple ML-based image analysis tasks, and describe a comprehensive study investigating the impact of SR techniques on the segmentation of microscopy images. The impacts of four Generative Adversarial Network (GAN)- and transformer-based SR techniques on microscopy image quality are measured using three well-established quality metrics. These SR techniques are incorporated into multiple deep network pipelines using supervised, contrastive, and non-contrastive self-supervised methods to semantically segment microscopy images from multiple datasets. Our results show that the image quality of microscopy images has a direct influence on ML model performance and that both supervised and self-supervised network pipelines using SR images perform better by %-% in comparison to baselines not using SR. Based on our experiments, we also establish that the image quality improvement threshold range [ - ] for the complemented Perception-based Image Quality Evaluator (PIQE) metric can be used as a pre-condition by domain experts to incorporate SR techniques to significantly improve segmentation performance. A plug-and-play software platform developed to integrate SR techniques with various deep networks using supervised and self-supervised learning methods is also presented.
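
Illustrative only: a minimal Python sketch of the quality-gated SR pre-condition the abstract describes. A variance-of-Laplacian sharpness score stands in for the PIQE metric, and nearest-neighbour upscaling stands in for a GAN- or transformer-based SR model; all names here are hypothetical.

```python
import numpy as np

def sharpness_score(img: np.ndarray) -> float:
    """Variance-of-Laplacian proxy for perceptual quality (stand-in for PIQE)."""
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def upscale2x(img: np.ndarray) -> np.ndarray:
    """Nearest-neighbour 2x upscaling as a placeholder for a learned SR model."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def preprocess(img: np.ndarray, quality_threshold: float = 50.0) -> np.ndarray:
    """Apply SR only when the quality score falls below the chosen threshold."""
    if sharpness_score(img) < quality_threshold:
        img = upscale2x(img)
    return img  # downstream: feed into the segmentation network
```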
Machine Learning-Assisted Optical Detection of Multilayer Hexagonal Boron Nitride for Enhanced Characterization and Analysis

Quantum key distribution (QKD) will most likely be an integral part of any practical quantum network setup in the future. However, not all QKD protocols can be used in today's networks because of the lack of single photon emitters and noisy intermediate quantum hardware. Attenuated photon transmission typically used to simulate single photon emitters severely limits the achievable transmission distances and the integration of QKD into existing classical networks that use tens of thousands of photons per bit of transmission. Furthermore, it has been found that different protocols perform differently in different network topologies. In order to remove the reliance of QKD on single photon emitters and increase transmission distances, it is worthwhile exploring QKD protocols that do not rely on single-photon transmissions for security, such as the 3-stage QKD protocol; the 3-stage protocol can tolerate multiple photons in each burst without leakage of information. This paper compares and contrasts the 3-stage QKD protocol and its efficiency in different network topologies and conditions. Further, we establish a mathematical relationship between achievable key rates and increasing transmission distances in various topologies. Our results provide insights to network engineers designing the QKD networks of the future.
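
For intuition, a toy numerical sketch of the three-stage exchange (not the paper's implementation): a qubit is modelled as a real 2-vector, and because planar rotations commute, each party can apply a secret rotation and later remove it, so the bit is never exposed in transit.

```python
import numpy as np

def rotation(theta: float) -> np.ndarray:
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(7)
bit = 1
state = np.array([0.0, 1.0]) if bit else np.array([1.0, 0.0])

Ua = rotation(rng.uniform(0, np.pi))   # Alice's secret rotation
Ub = rotation(rng.uniform(0, np.pi))   # Bob's secret rotation

s1 = Ua @ state    # stage 1: Alice -> Bob, state hidden by Ua
s2 = Ub @ s1       # stage 2: Bob -> Alice, hidden by both rotations
s3 = Ua.T @ s2     # Alice removes Ua (orthogonal, and rotations commute)
out = Ub.T @ s3    # stage 3: Bob removes Ub and recovers the original state

assert np.allclose(out, state)
```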
A Framework for an Intelligent Adaptive Education Platform for Quantum Cybersecurity

arXiv (Cornell University), Feb 16, 2010
Live sequence charts (LSCs) have been proposed as an interobject scenario-based specification and visual programming language for reactive systems. In this paper, we introduce a logic-based framework to check the consistency of an LSC specification. An LSC simulator has been implemented in logic programming, utilizing a memoized depth-first search strategy, to show how a reactive system specified in LSCs would respond to a set of external event sequences. A formal notation is defined to specify external event sequences, extending regular expressions with a parallel operator and a testing control. The parallel operator allows interleaved parallel external events to be tested in LSCs simultaneously, while the testing control provides users with a new approach to specifying and testing certain temporal properties (e.g., CTL formulas) in the form of an LSC. Our framework further provides either a state transition graph or a failure trace to justify the consistency checking results.
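
A small sketch of the parallel operator's semantics as described: the shuffle product of two external event sequences, memoized in the spirit of the memoized depth-first search the abstract mentions (the encoding is an assumption, not the paper's notation).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def interleavings(a: tuple, b: tuple) -> tuple:
    """All interleaved merges (shuffle product) of event sequences a and b."""
    if not a:
        return (b,)
    if not b:
        return (a,)
    return (tuple((a[0],) + rest for rest in interleavings(a[1:], b))
            + tuple((b[0],) + rest for rest in interleavings(a, b[1:])))

# e.g. interleavings(('click',), ('tick', 'tock')) yields 3 orderings
```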

Lecture Notes in Computer Science, 1996
A methodology for mechanically verifying a family of parameterized multiplier circuits, including many well-known multiplier circuits such as the linear array, the Wallace tree and the 7-3 multiplier, is proposed. A top level specification for these multipliers is obtained by abstracting the commonality in their behavior. The behavioral correctness of any multiplier in the family can be mechanically verified by a uniform proof strategy. Proofs of properties are done by rewriting and induction using the automated theorem prover RRL (Rewrite Rule Laboratory). The behavioral correctness of the circuits is established with respect to addition and multiplication on numbers. The automated proofs involve minimal user intervention in terms of intermediate lemmas required. Generic hardware components are used to segregate the specification and the implementation aspects, enabling verification of circuits in terms of behavioral constraints that can be realized in different ways. The use of generic components aids reuse of proofs and helps modularize the correctness proofs, allowing verification to go hand in hand with the hardware design process in a hierarchical fashion.
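
For flavour of the behavioral correctness claim: RRL proves it for all numbers by rewriting and induction, whereas the bounded check below is only an illustrative stand-in over a bit-level shift-and-add multiplier (the linear-array scheme in spirit).

```python
def shift_add_mul(a: int, b: int) -> int:
    """Bit-level shift-and-add multiplication of non-negative integers."""
    acc = 0
    while b:
        if b & 1:          # add the shifted multiplicand when the bit is set
            acc += a
        a <<= 1
        b >>= 1
    return acc

# bounded sanity check; the inductive proof covers all numbers
assert all(shift_add_mul(x, y) == x * y for x in range(64) for y in range(64))
```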
Leveraging Weak Annotations for Deep Learning Tasks on Biofilm Images
2022 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
BioMDSE: A Multimodal Deep Learning-Based Search Engine Framework for Biofilm Documents Classifications
2022 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
Frontiers in Microbiology, Dec 1, 2022
Subramaniam M (2022) An AI-based approach for detecting cells and microbial byproducts in low volume scanning electron microscope images of biofilms.
SSRN Electronic Journal
Porous polymer microspheres are employed in biotherapeutics, tissue engineering, and regenerative medicine. Porosity dictates cargo carriage and release that are aligned with the polymer physicochemical properties. These include material tuning, biodegradation, and cargo encapsulation. How uniformity of pore size affects therapeutic delivery remains an area of active investigation. Herein, we characterize six branched aliphatic hydrocarbon-based porogens produced to create pores in single and multilayered microspheres. The porogens are composed of biocompatible polycaprolactone, poly(lactic-co-glycolic acid), and polylactic acid polymers within porous multilayered microspheres. These serve as controlled effective drug and vaccine delivery platforms.

Preserving Consistency of Runtime Monitors across Protocol Changes
10th IEEE International Conference on Engineering of Complex Computer Systems (ICECCS'05)
Protocols governing communication among the components of a complex system are frequently changed during the design process. To enable faster verification turnaround time, it is important that the existing verification infrastructure continues to be consistent with the changed protocol. In this paper, an approach to identify the effects of protocol changes on runtime monitors is proposed. Runtime monitors are commonly used to observe and verify the dynamic protocol behaviors. Protocols as well as the monitors are modeled using communicating finite state machines. Addition/deletion/replacement of transitions in one or more protocol components may result in similar changes to the monitor transitions. A notion of consistency of a monitor relative to a protocol is introduced. Conditions under which a protocol change necessitates a change to the monitor to preserve relative consistency are identified. Automatic procedures to synthesize new monitors that are guaranteed to be consistent with the changed protocol are described.
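
A minimal sketch, under assumed representations, of the relative-consistency idea: state machines as dicts mapping a state to (event, successor) pairs, with the monitor required to follow every event the changed protocol can emit. This is a generic reachability check, not the paper's synthesis procedure.

```python
def accepts_all(protocol: dict, monitor: dict, p0, m0) -> bool:
    """protocol, monitor: state -> list of (event, next_state) pairs.
    True iff the monitor can follow every event sequence the protocol emits
    (for simplicity, a deterministic monitor is assumed)."""
    seen, stack = set(), [(p0, m0)]
    while stack:
        p, m = stack.pop()
        if (p, m) in seen:
            continue
        seen.add((p, m))
        for event, p2 in protocol.get(p, []):
            moves = [m2 for e, m2 in monitor.get(m, []) if e == event]
            if not moves:
                return False  # changed protocol emits an event the monitor lacks
            stack.extend((p2, m2) for m2 in moves)
    return True
```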

Virtual Interactive Construction Education (VICE) using BIM Tools
Training and process analysis in the construction industry has not taken full advantage of new technologies such as building information modeling (BIM). The purpose of this research is to develop a framework for a virtual interactive construction education system using three-dimensional technologies. The modules will simulate the construction process for a facility from start to finish using information drawn from real projects in the built environment. These modules can be used as training tools for new employees, who attempt to optimize time and cost in a virtual environment given a limited set of equipment, time, and employee options. They can also be used as a process analysis tool for new construction, where a number of situational variables can change, exposing potential risk. These modules would be particularly useful for repetitive construction, where the initial project is analyzed for optimization and risk mitigation. This paper describes the framework ...

An Approach for Selecting Tests on Extended Finite State Machines with Provable Guarantees
Building high confidence regression test suites to validate the changes performed during system evolution and maintenance is a challenging problem. This paper describes a formal approach that selects from a given test suite every test guaranteed to exercise a given change, and discards the others, without actually running the tests to build a regression test suite. Systems are modeled as extended finite state machines (EFSMs) supporting several commonly used data types including booleans, numbers, arrays, queues, and records. Changes add/delete/replace EFSM transitions. Tests are sequences of input and expected output messages with concrete parameter values over the supported data types. Fully-observable tests, whose descriptions contain all the information about the transitions executed when a test is run, are introduced. An invariant characterizing fully-observable tests is formulated such that a test is fully-observable whenever the invariant is a satisfiable formula. Incr...
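
A sketch of the selection step for fully-observable tests, assuming each test's description exposes its transition trace (the representation here is hypothetical): a test is kept exactly when its trace intersects the changed transitions.

```python
def select_tests(tests: dict, changed: set) -> list:
    """tests: name -> sequence of transition ids exercised (fully observable);
    changed: ids of added/deleted/replaced transitions."""
    return [name for name, trace in tests.items() if changed.intersection(trace)]

# e.g. select_tests({'t1': ['s0->s1', 's1->s2'], 't2': ['s0->s3']}, {'s1->s2'})
# returns ['t1']
```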

Software Engineering and Formal Methods, Sep 7, 2005
While verifying complex protocols, it is often fruitful to consider all protocol contexts in which an interesting set of transitions may appear. The contexts are represented as yet another protocol, called the observable protocol, that may be further analyzed. An efficient approach based on static analysis is described for computing an over-approximated protocol that includes all the runs of an observable protocol. The approach uses dominator relations over state and message dependency graphs. An over-approximation of the transitions that occur with an interesting transition in any run is produced, from which a transition relation of the over-approximated protocol is automatically generated. To facilitate systematic state space exploration of the over-approximated protocol, it is shown how a series of under-approximations can be generated by identifying parallelism among the transitions using dominators. The effectiveness of the proposed approach is illustrated by model checking several examples, including several coherence protocols.
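
Since the approach rests on dominator relations, here is a standard iterative dominator computation over a successor-map graph (a generic textbook algorithm, not the paper's code):

```python
def dominators(succ: dict, entry):
    """Dominator sets for a graph given as node -> list of successors."""
    nodes = set(succ) | {v for vs in succ.values() for v in vs}
    pred = {n: set() for n in nodes}
    for u, vs in succ.items():
        for v in vs:
            pred[v].add(u)
    dom = {n: set(nodes) for n in nodes}
    dom[entry] = {entry}
    changed = True
    while changed:  # iterate to the greatest fixed point
        changed = False
        for n in nodes - {entry}:
            common = (set.intersection(*(dom[p] for p in pred[n]))
                      if pred[n] else set())
            new = {n} | common
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# e.g. dominators({'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}, 'a')
# gives dom['d'] == {'a', 'd'}: every run reaching d passes through a
```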
Lecture Notes in Computer Science, 1996
Using linear arithmetic procedure for generating induction schemes
Lecture Notes in Computer Science, 1994

Model-Based Test Generation Using Evolutional Symbolic Grammar
2012 Sixth International Symposium on Theoretical Aspects of Software Engineering, 2012
We present a new model-based test generation approach using an extended symbolic grammar, which serves as a formal notation for enumerating test cases for communication and reactive systems. Our model-based test generation approach takes as input a reactive system model, in Live Sequence Charts (LSCs), and a general symbolic grammar serving as a preliminary test coverage criterion, performs an automatic simulation for consistency testing on the LSC model specification, and eventually generates an evolved symbolic grammar with refined test coverage criteria. The evolved symbolic grammar can either be used to generate practical test cases for software testing, or be further refined by applying our model-based test generation approach again with additional test coverage criteria.
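
A toy sketch of using a grammar to enumerate test sequences, which is the role the (evolved) symbolic grammar plays here; the grammar encoding and depth bound are assumptions, and real symbolic grammars carry constraints this sketch omits.

```python
import itertools

def derive(grammar: dict, symbol: str, depth: int):
    """Yield terminal sequences derivable from symbol within a depth bound."""
    if symbol not in grammar:            # terminal symbol
        yield (symbol,)
        return
    if depth == 0:                       # cut off unbounded recursion
        return
    for production in grammar[symbol]:
        parts = [list(derive(grammar, s, depth - 1)) for s in production]
        for combo in itertools.product(*parts):
            yield tuple(x for part in combo for x in part)

g = {"S": [("req", "S", "ack"), ("req", "ack")]}  # toy request/ack protocol
tests = sorted(set(derive(g, "S", depth=4)))       # nested req...ack sequences
```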
Proceedings of the 2011 workshop on Knowledge discovery, modeling and simulation - KDMS '11, 2011

Proceedings of the 1996 ACM/IEEE conference on Supercomputing, 1996
We present an analytical performance model for Panda, a library for synchronized i/o of large multidimensional arrays on parallel and sequential platforms, and show how the Panda developers use this model to evaluate Panda's parallel i/o performance and guide future Panda development. The model validation shows that system developers can simplify performance analysis, identify potential performance bottlenecks, and study the design trade-offs for Panda on massively parallel platforms more easily than by conducting empirical experiments. More importantly, we show that the outputs of the performance model can be used to help make optimal plans for handling application i/o requests, the first step toward our long-term goal of automatically optimizing i/o request handling in Panda. 1 Introduction Panda (URL y.cs.uiuc.edu/CDR/panda/) is an i/o library motivated by the needs of high-performance SPMD scientific applications that must input and output multidimensional arrays on distributed memory parallel platforms or networks of workstations. Panda supports collective i/o operations where all the processors used by an application are closely synchronized.
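
A deliberately simple analytical cost model in the spirit the abstract describes (the parameters and formula are illustrative assumptions, not Panda's actual model): a synchronized collective write completes when the slowest processor finishes.

```python
def io_time(bytes_per_proc: list, startup_s: float = 0.01,
            bandwidth_Bps: float = 50e6) -> float:
    """Synchronized collective i/o: each processor pays a fixed startup cost
    plus transfer time; the operation ends when the slowest one finishes."""
    return max(startup_s + b / bandwidth_Bps for b in bytes_per_proc)

# e.g. 16 processors each writing 8 MB of a distributed array
print(io_time([8e6] * 16))
```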

Journal of Automated Reasoning, 1996
Zhang, Kapur, and Krishnamoorthy introduced a cover set method for designing induction schemes for automating proofs by induction from specifications expressed as equations and conditional equations. This method has been implemented in the theorem prover Rewrite Rule Laboratory (RRL) and a proof management system Tecton built on top of RRL, and it has been used to prove many nontrivial theorems and reason about sequential as well as parallel programs. The cover set method is based on the assumption that a function symbol is defined by using a finite set of terminating (conditional or unconditional) rewrite rules. The termination ordering employed in orienting the rules is used to perform proofs by well-founded induction. The left sides of the rules are used to design different cases of an induction scheme, and recursive calls to the function made in the right sides can be used to design appropriate instantiations for generating induction hypotheses. A weakness of this method is that it relies on syntactic unification for generating an induction scheme for a conjecture. This paper goes a step further by proposing semantic analysis for generating an induction scheme for a conjecture from a cover set. We discuss the use of a decision procedure for Presburger arithmetic (the quantifier-free theory of numbers with the addition operation and relational predicates >, <, =, ≥, ≤) for performing semantic analysis about numbers. The decision procedure is used to generate appropriate induction schemes for a conjecture by using cover sets of functions taking numbers as arguments. This extension of the cover set method automates proofs of many theorems that otherwise require human guidance and hints. The effectiveness of the method is demonstrated using some examples that commonly arise in reasoning about specifications and programs. It is also shown how semantic analysis using a Presburger arithmetic decision procedure can be used for checking the completeness of a cover set of a function defined by using operations such as + and - on numbers. With this check, many function definitions used in a proof of the prime factorization theorem, stating that every number can be factored uniquely into prime factors, which had to be checked manually, can now be checked automatically in RRL. The use of the decision procedure for guiding generalization, for generating conjectures, and for merging induction schemes is also illustrated.
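
A toy illustration of the completeness check the paper automates with a Presburger decision procedure: for cover sets whose patterns are constants or x + c over the naturals, completeness is decidable; the bounded check below merely stands in for the real decision procedure.

```python
def covers_naturals(constants: set, offsets: set, limit: int = 1000) -> bool:
    """Patterns: k in constants matches exactly k; c in offsets means x + c,
    which matches every n >= c. Bounded stand-in for a Presburger decision."""
    lo = min(offsets) if offsets else limit + 1
    return all(n in constants or n >= lo for n in range(limit))

assert covers_naturals({0}, {1}) is True    # f(0), f(x+1): complete
assert covers_naturals({0}, {2}) is False   # f(0), f(x+2): misses 1
```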