Springer Proceedings in Mathematics & Statistics, 2013
Sensitivity analysis (SA) is a procedure for studying how sensitive the output results of large-scale mathematical models are to uncertainties in the input data. The models are described by systems of partial differential equations, which often contain a large number of input parameters. It is clearly important to know how sensitive the solution is to uncontrolled variations or uncertainties in the input parameters of the model. Algorithms based on the analysis of variance (ANOVA) technique for calculating numerical indicators of sensitivity, together with computationally efficient Monte Carlo integration techniques, have recently been developed by the authors. They have been successfully applied to sensitivity studies of air pollution levels calculated by the Unified Danish Eulerian Model (UNI-DEM) with respect to several important input parameters. In this paper a comprehensive theoretical and experimental study of the Monte Carlo algorithm based on symmetrised shaking of Sobol sequences is carried out. It is proven that this algorithm has an optimal rate of convergence, in terms of probability and mean square error, for functions with continuous and bounded second derivatives. Extensive numerical experiments with Monte Carlo, quasi-Monte Carlo (QMC) and scrambled quasi-Monte Carlo algorithms based on Sobol sequences are performed to support the theoretical studies and to analyse the applicability of the algorithms to various classes of problems. The numerical tests show that the Monte Carlo algorithm based on symmetrised shaking of Sobol sequences gives reliable results for the multidimensional integration problems under consideration.
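A minimal sketch of the general idea behind combining random scrambling ("shaking") of Sobol points with symmetrisation, here realised as the simple reflection x -> 1 - x, applied to a smooth test integrand via SciPy's qmc module. The exact symmetrised-shaking construction analysed in the paper may differ; the integrand and parameters are assumptions for illustration.

```python
# Sketch only: scrambled ("shaken") Sobol points combined with symmetrisation
# (using both x and 1 - x) for smooth multidimensional integration.
# This is NOT the paper's exact construction; integrand and sizes are assumed.
import numpy as np
from scipy.stats import qmc

def integrand(x):
    # Smooth test function with continuous, bounded second derivatives;
    # its exact integral over the unit cube is 1 after the normalisation below.
    return np.exp(np.sum(x, axis=1)) / (np.e - 1.0) ** x.shape[1]

def symmetrised_scrambled_sobol(dim, m, seed=0):
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)  # random "shaking"
    pts = sampler.random_base2(m)                         # 2**m points in [0, 1)^dim
    return np.vstack([pts, 1.0 - pts])                    # symmetrised point set

if __name__ == "__main__":
    pts = symmetrised_scrambled_sobol(dim=4, m=12)
    print(f"estimate = {integrand(pts).mean():.6f} (exact value is 1.0)")
```

Reflecting every point about the centre of the cube is the standard way to build a symmetrised point set; for smooth integrands this tends to cancel the leading odd error components, which is the intuition behind symmetrisation.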
Highlights:
- A PCR-based electroporation screen yielded an improved voltage indicator, ASAP3
- ASAP3 shows larger voltage responses than other fluorescent protein-based sensors
- Ultrafast local volume excitation (ULoVE) boosts random-access two-photon signals
- ASAP3 and ULoVE report subthreshold and spiking potentials in deep brain regions
ABSTRACT: Imaging of transmembrane voltage deep in brain tissue with cellular resolution has the potential to reveal information processing by neuronal circuits in living animals with minimal perturbation. Multi-photon voltage imaging in vivo, however, is currently limited by the speed and sensitivity of both indicators and imaging methods. Here, we report the engineering of an improved genetically encoded voltage indicator, ASAP3, which exhibits up to 51% fluorescence responses in the physiological voltage range, sub-millisecond activation kinetics, and full responsivity under two-photon illumination. We also introduce an ultrafast local volume excitation (ULoVE) two-photon scanning method to sample ASAP3 signals in awake mice at kilohertz rates with increased stability and sensitivity. ASAP3 and ULoVE allowed continuous single-trial tracking of spikes and subthreshold events for minutes in deep locations, with subcellular resolution, and with repeated sampling over multiple days. By ima...
Proceedings of the National Academy of Sciences, 2011
After 35 years, the hunt for improved anthracycline antibiotics is unabated but has yet to achieve the levels of clinical success desired. Electrochemical techniques provide a large amount of kinetic and thermodynamic information, but the use of such procedures is hindered by issues of sensitivity and selectivity. This work demonstrates how, by harnessing the mechanism of catalytic reduction of oxygen by the quinone functionality present within the anthracycline structure, it is possible to study the reactive moiety at nanomolar concentration. This methodology allows electrochemical investigation of the intercalation of quinizarin into DNA and, in particular, of the quinone oxidation and degradation mechanism. The reversible reduction of quinizarin, which in the presence of oxygen leads to the formation of reactive oxygen species, is found to occur at -0.535 V (vs. SCE) at pH 6.84, and the irreversible oxidation leading to the molecule's degradation occurs at +0.386 V (vs. SCE) at pH 6.84.
We present a stochastic approach for solving the quantum-kinetic equation introduced in Part I. A Monte Carlo method based on backward time evolution of the numerical trajectories is developed. The computational complexity and the stochastic error are investigated numerically. Variance reduction techniques are applied, which demonstrate a clear advantage with respect to the approaches based on symmetry transformation. A parallel implementation is realized on a GRID infrastructure.
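As context for the backward-trajectory idea, here is a minimal sketch of a backward Monte Carlo (Neumann-series) estimator for a generic one-dimensional integral equation u(x) = f(x) + ∫ k(x, y) u(y) dy. The kernel, source term and termination probability are illustrative assumptions and do not reproduce the quantum-kinetic equation, its backward time evolution, or the variance reduction techniques studied in the paper.

```python
# Sketch only: backward random walks estimating u(x0) for
# u(x) = f(x) + integral over [0, 1] of k(x, y) u(y) dy.
import numpy as np

rng = np.random.default_rng(0)

def f(x):                       # source term (assumed for illustration)
    return np.sin(np.pi * x)

def k(x, y):                    # kernel (assumed for illustration), sup-norm < 1
    return 0.5 * np.exp(-(x - y) ** 2)

def backward_estimate(x0, n_walks=20000, p_stop=0.5):
    total = 0.0
    for _ in range(n_walks):
        x, weight, score = x0, 1.0, 0.0
        while True:
            score += weight * f(x)                 # collision estimator contribution
            if rng.random() < p_stop:              # Russian-roulette termination
                break
            y = rng.random()                       # next backward state, uniform on [0, 1]
            weight *= k(x, y) / (1.0 - p_stop)     # transition weight (uniform proposal density = 1)
            x = y
        total += score
    return total / n_walks

print(backward_estimate(0.3))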
Annals of Computer Science and Information Systems
This paper introduces a sophisticated multidimensional sensitivity analysis incorporating cutting-edge stochastic methods for air pollution modeling. The study focuses on a large-scale model of the long-range transport of air pollutants, the Unified Danish Eulerian Model (UNI-DEM). This mathematical model plays a pivotal role in understanding the detrimental impacts of heightened levels of air pollution, and with this research our intent is to employ it to tackle crucial questions related to environmental protection. We propose advanced Monte Carlo and quasi-Monte Carlo methods, leveraging specific lattice and digital sequences to enhance the computational effectiveness of multidimensional numerical integration. Moreover, we further refine the existing stochastic methodologies for digital ecosystem modeling. The main aspect of our investigation is to analyze the sensitivity of the UNI-DEM model output to changes in the input emissions of human-induced pollutants and in the rates of a number of chemical reactions. The developed algorithms are used to calculate global Sobol sensitivity measures for various input parameters, and we also assess their influence on key air pollutant concentrations in different European cities, taking their diverse geographical locations into account. The overarching goal of this research is to broaden our understanding of the factors influencing air pollution and to inform effective strategies to alleviate its negative impacts on the environment.
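A minimal sketch of how first-order Sobol sensitivity indices can be estimated with a pick-freeze Monte Carlo estimator on scrambled Sobol points, assuming SciPy's qmc module. The model function below is a hypothetical stand-in, since UNI-DEM itself is a large PDE code; the paper's specific lattice and digital sequences are not reproduced.

```python
# Sketch only: first-order Sobol indices via a pick-freeze estimator.
import numpy as np
from scipy.stats import qmc

def model(x):
    # Hypothetical smooth model output depending on 4 normalised inputs.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2] * x[:, 3]

def first_order_sobol(model, dim, m=14, seed=1):
    sampler = qmc.Sobol(d=2 * dim, scramble=True, seed=seed)
    ab = sampler.random_base2(m)              # 2**m joint samples for blocks A and B
    A, B = ab[:, :dim], ab[:, dim:]
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))    # total output variance estimate
    indices = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                   # replace ("pick") only input x_i
        fABi = model(ABi)
        indices.append(np.mean(fB * (fABi - fA)) / var)   # Saltelli-type estimator of S_i
    return np.array(indices)

print(first_order_sobol(model, dim=4))
```

The same layout generalises to total-effect indices by swapping which block is frozen; here only the first-order estimator is shown.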
Sampling a given solid angle is a fundamental operation in realistic image synthesis, where the rendering equation describing light propagation in closed domains is solved. Monte Carlo methods for solving the rendering equation sample the solid angle subtended by the unit hemisphere or unit sphere in order to perform the numerical integration of the rendering equation. In this work we consider the problem of generating uniformly distributed random samples over the hemisphere and sphere. Our aim is to construct and study a parallel sampling scheme for the hemisphere and sphere. First we apply the symmetry property to partition the hemisphere and sphere: the solid angle subtended by a hemisphere is divided into a number of equal sub-domains, each representing the solid angle subtended by an orthogonal spherical triangle with fixed vertices and computable parameters. We then introduce two new algorithms for sampling orthogonal spherical triangles. Both algorithms are based on a transformation of the unit square and, similarly to Arvo's algorithm for sampling an arbitrary spherical triangle, accommodate stratified sampling. We derive the necessary transformations for the algorithms. The first sampling algorithm generates a sample by mapping the unit square onto the orthogonal spherical triangle. The second algorithm directly computes the unit radius vector of a sampling point inside the orthogonal spherical triangle. The sampling of the total hemisphere or sphere is performed in parallel for all sub-domains simultaneously, using the symmetry property of the partitioning. The applicability of the corresponding parallel sampling scheme to Monte Carlo and quasi-Monte Carlo solution of the rendering equation is discussed.
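The paper's two algorithms for orthogonal spherical triangles are not reproduced here; as background, the sketch below shows the classical transformation of the unit square into uniformly distributed directions on the upper hemisphere, which is the kind of unit-square mapping the new algorithms build on.

```python
# Sketch only: classical unit-square-to-hemisphere mapping (not the paper's
# spherical-triangle algorithms). (u, v) in [0, 1)^2 maps to a direction that
# is uniformly distributed over the solid angle of the upper hemisphere.
import numpy as np

def sample_hemisphere_uniform(u, v):
    # For a uniform distribution in solid angle, cos(theta) is uniform on [0, 1]
    # and phi is uniform on [0, 2*pi).
    cos_theta = u
    sin_theta = np.sqrt(1.0 - cos_theta ** 2)
    phi = 2.0 * np.pi * v
    return np.stack([sin_theta * np.cos(phi),
                     sin_theta * np.sin(phi),
                     cos_theta], axis=-1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dirs = sample_hemisphere_uniform(rng.random(100_000), rng.random(100_000))
    # Sanity check: the mean z-component of a uniform hemisphere distribution is 0.5.
    print(dirs[:, 2].mean())
```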
Recently, single-cell molecular analysis has been leveraged to achieve unprecedented levels of biological investigation. However, a lack of simple, high-throughput single-cell methods has hindered in-depth population-wide studies with single-cell resolution. We report a microwell-based cytometric method for simultaneous measurements of gene and protein expression dynamics in thousands of single cells. We quantified the regulatory effects of transcriptional and translational inhibitors on cMET mRNA and cMET protein in cell populations. We studied the ...
Induction methods for locating electrically conductive media were implemented experimentally. Local and integral techniques are considered. Relations between the local level probe (LLP) signal, its position, and the frequency of the alternating current are presented. The dependences of the integral level probe (ILP) signal on the frequency of the alternating current are obtained. The methods were tested in laboratory experiments, in which the metal level could be determined with an error not exceeding 1 mm. The LLP is better suited to small changes in the liquid-metal boundary, comparable to the size of the probe. The ILP has advantages in installations where the change in the liquid-metal level is greater than the pitch at which the measuring coils are installed.
It is becoming increasingly apparent that some individuals are more susceptible to disease than others and, more importantly, that some patients respond to prescribed therapies better than others. One of the main reasons for differences in disease susceptibility and the effectiveness of drug treatment lies in the genetic makeup of the patient. In addition to many environmental factors, genetic variations such as mutations, DNA polymorphisms and epigenetic gene regulation are the key players involved in the fate of a person's health. Recent advances in genomics and proteomics are providing novel insights into the complex biological processes of disease. These insights will ultimately help to tailor personalised approaches to the treatment of disease based upon individual molecular "blueprints" of the patient's genome and proteome. Personalised medicine extends beyond the traditional medical approach to treating patients, as it aims to identify and target molecular factors contributing to the illness of individual patients. The personalised medicine approach is already playing a significant role in the way we treat and monitor disease: as many as 10 of the 36 anti-cancer drugs approved by the European Union in the last 10 years are considered to be personalised medicines. Breast cancer is one of the best examples of a personalised medical approach, in which the expression status of the oestrogen receptor ESR1 is detected in the nucleus of breast cancer cells. Approximately 70% of breast cancer patients overexpress this protein, which is an important prognostic and predictive marker. Outcomes for these patients have been significantly improved by targeting ESR1 using a hormonal treatment known as Tamoxifen. Interestingly, this is the most commonly prescribed anticancer treatment in the world, highlighting the importance of a personalised approach in the management of disease. Microscale technologies are emerging as an enabling platform for the development of novel personalised medicines and their broad accessibility. Miniaturised devices have the potential to process minute clinical samples and perform extensive genetic, molecular and cellular analyses directly on a microfluidic chip. The integration of pre-analytical sample handling with subsequent sample analysis on a single microfluidic device will help to achieve the highest reproducibility of results and minimise inter-laboratory bias and operators
The transcytosis of lipids through enterocytes occurs through the delivery of lipid micelles to the microvilli of enterocytes, consumption of lipid derivatives by the apical plasma membrane (PM), and then their delivery to the membrane of the smooth ER (SER) attached to the basolateral PM. The SER forms immature chylomicrons (iChMs) in the ER lumen. iChMs are delivered to the Golgi complex (GC), where they are subjected to additional glycosylation resulting in their maturation. Mature ChMs are secreted into the intercellular space and delivered into the lumen of lymphatic capillaries (LCs). Overloading enterocytes with lipids induces the formation of lipid droplets inside the lipid bilayer of the ER membranes, and transcytosis becomes slower. Here, we examined components of the enterocyte-to-lymphatic barriers in newborn rats before and after the first feeding. In contrast to adult animals, enterocytes of newborn rats exhibited apical endocytosis and a well-developed subapical endoso...
Atherosclerosis is a complex non-monogenic disease related to endothelial damage in elastic-type arteries and incorrect feeding. Here, using cryodamage of endothelial cells (ECs) of the rat abdominal aorta, we examined the role of the EC basement membrane (BM) in re-endothelialization (endothelial regeneration) and its ability to capture low-density lipoproteins (LDLs). Regeneration of the endothelium induced thickening of the EC BM. Secretion of the BM components occurred in the G2 phase. Multiple regenerations, as well as arterial hypertension and aging, also led to thickening of the BM. Under these conditions, the speed of re-endothelialization increased. The thick BM captured more LDLs. LDLs formed after overloading of rats with lipids acquired a higher affinity to the BM, presumably due to the prolonged transport of chylomicrons through neuraminidase-positive endo-lysosomes. These data provide new molecular and cellular mechanisms of atherogenesis.
Overloading intestinal enterocytes with lipids induced alteration of the Golgi complex (GC; Sesorova et al., 2020) and could cause glycosylation errors. Here, using differentiated Caco-2 cells with an established 0[I] blood group phenotype (no expression of the blood antigens A and B [AgA, AgB] under normal conditions) as a model of human enterocytes, we examined whether overloading these cells with lipids could cause errors in Golgi-dependent glycosylation. We demonstrated that under these conditions there were alterations of the GC and the appearance of lipid droplets in the cytoplasm. Rare cells produced AgA and AgB. This suggests that, after overloading of enterocytes with lipids, AgA was mistakenly synthesized in individual enterocytes by the Golgi glycosyltransferases. These mistakes could explain why, in the absence of AgA and AgB, antibodies against them exist in the blood.
In order to precisely determine the magnesium level in a titanium reduction retort by inductive methods, many interfering influences have to be considered. By using a look-up-table method, the magnesium level can be reliably identified while taking into account the interfering effects of the titanium sponge rings forming at the walls, whose geometrical and electrical parameters are unknown. This new method uses a combination of numerical simulations and measurements, whereby the simulation model is calibrated so that it represents the experimental setup as closely as possible. Previously, purely theoretical studies of this method were presented. Here, the practical feasibility of the method is demonstrated by performing measurements on a model experiment. The method is not limited to the production of titanium but can also be applied to other processes in metal production and processing.
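To make the look-up-table idea concrete, the sketch below tabulates simulated probe signals against known magnesium levels and inverts a measured signal by interpolation. The calibration numbers, the one-dimensional table and the function name are illustrative assumptions only; the actual method calibrates a full simulation model against the experimental setup and also accounts for the sponge-ring parameters.

```python
# Sketch only: inverting a hypothetical simulation-based calibration table.
import numpy as np

# Hypothetical calibration table from the numerical simulation:
# magnesium level (mm) vs. simulated induction-probe signal (arbitrary units).
levels_mm = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0])
signals   = np.array([0.12, 0.31, 0.47, 0.60, 0.71, 0.80])

def level_from_signal(measured_signal):
    # np.interp requires the x-grid (signals) to be increasing, which holds here.
    return np.interp(measured_signal, signals, levels_mm)

print(level_from_signal(0.55))   # interpolated magnesium level for a measured signal
```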
Nowadays, flaked products have become quite popular. The aim of the present paper is to study the effect of the flaking of einkorn (Triticum monococcum L.) on some basic chemical properties, the biologically active substances and the antioxidant activity of the flaked product. The chemical parameters (moisture, ash and fat contents) were determined according to ISO standard methods. Protein content was determined by the method of Lowry. The following biologically active compounds were also determined: total polyphenols, antioxidant activity (% DPPH) and total carotenoids. The analyses carried out showed that flaking has a certain, although small, effect on the properties of the flaked einkorn studied. The moisture content and the total amount of carotenoids were found to decrease, while the amounts of fats, proteins and total polyphenols increased. The results obtained for the flaked product were compared to those for wholegrain einkorn flour, and the differences were considered immaterial, although some of them were statistically significant. It was also found that the process of flaking does not affect the amount of mineral substances or the antioxidant activity of the flaked einkorn compared to einkorn flour.
Journal of Siberian Federal University. Engineering & Technologies, 2019
The purpose of this work was to study the chemical changes taking place during the initial stage of the interaction of butadiene rubber and its vulcanized product with nitric acid by means of differential IR spectroscopy. In this way, some mechanistic knowledge associated with the complex nature of this process could be obtained.
Introduction. The research aim was to determine the effect of two herbal mixtures on the properties of herb bread. The influence of the herbs on the total phenolic content and antioxidant activities of the herbal mixtures, herbal-flour mixtures and herb breads was established. Materials and methods. Two herbal mixtures (1: thyme, oregano and lemon balm; 2: thyme, oregano, lemon balm and fenugreek) were used with wheat flour for herb bread production. Total polyphenol content was determined following the Folin-Ciocalteu method. The antioxidant activities of sample extracts were evaluated by four methods: the ABTS•+, CUPRAC, FRAP and DPPH assays. Results and discussion. Of all the herbs investigated, oregano showed the highest total phenolic content (30.43 mg GAE/g dw); herbal mixtures 1 and 2 showed 19.18 mg GAE/g dw and 17.47 mg GAE/g dw, respectively. In the herbal-flour mixtures and prepared breads, the total phenolic content ranged from 0.31 mg GAE/g dw to 0.37 mg GAE/g dw; therefore, the content of these bioactive compounds did not change significantly during the baking process. The highest antioxidant activities of the herbal mixtures, herbal-flour mixtures and breads were obtained by two of the methods used, the ABTS and FRAP assays. The highest antioxidant potential was shown by herbal mixture 1, consisting of 3 herbs (16829.73 mM TE/100 g dw), followed by herbal mixture 2 with 4 herbs (14693.75 mM TE/100 g dw), both evaluated by the ABTS method. For the FRAP method, the antioxidant activity values were 15997.65 mM TE/100 g dw for herbal mixture 1 with 3 herbs and 14136.82 mM TE/100 g dw for herbal mixture 2 with 4 herbs. Conclusions. Herbs added to flour increased the total phenolic content and antioxidant values of the flour mixtures and breads. Insignificant differences in antioxidant potential were observed between the breads with three and four herbs.