Papers by Anusuriya Devaraju
This workshop was provided within Task 4.3 concerning certification support for digital repositories. Through this first certification support workshop, FAIRsFAIR provides support and capacity building, including materials, training and advice, for repository managers to improve their knowledge of activities related to the preparations for CoreTrustSeal self-assessments. Participants in this workshop were representatives of the ten repositories selected for the Certification Support initiative, part of FAIRsFAIR Work Package 4 on FAIR Certification.

The overall goal of FAIRsFAIR is to accelerate the realization of the goals of the European Open Science Cloud (EOSC) by compiling and disseminating all knowledge, expertise, guidelines, implementations, new trajectories, training and education on FAIR matters. FAIRsFAIR work package 4 (WP4) will support the provision of practical solutions for implementing the FAIR principles through the co-development and implementation of certification schemes for trusted data repositories enabling FAIR research data in the EOSC, and the provision of organizational support and outreach activities. One of the objectives of WP4 is to develop requirements (e.g., metrics) and tools to pilot the FAIR assessment of digital objects, in particular research data objects in trustworthy digital repositories (TDRs). This report presents the first results of work carried out towards achieving this objective. We outline the context for our activities by summarising related work both performed in other work pack...
This paper is milestone 4.1 of the FAIRsFAIR task 4.1 (Capability Maturity models towards FAIR Certification) within the FAIR Certification work package (WP4). This document presents the first iterative step in aligning the characteristics of FAIR digital objects with the repositories that 'enable' FAIRness, through the CoreTrustSeal Trustworthy Data Repository Requirements and the application of a capability/maturity evaluation approach.

Weber, T., McPhee, M.J. and Anderssen, R.S. (eds) MODSIM2015, 21st International Congress on Modelling and Simulation, 2015
Various portals have been developed to provide an easy way to discover and access public research data sets from various organizations. Data sets are made available with descriptive metadata based on common (e.g., OGC, CUAHSI, FGDC, INSPIRE, ISO, Dublin Core) or proprietary standards to facilitate better understanding and use of the data sets. Provenance descriptions may be included as part of the metadata and are specified from a data provider's perspective. These can include, for example, the different entities and activities involved in a data creation flow, such as sensing platforms, personnel, and data calculation and transformation processes. Moving beyond provider-centric descriptions, data provenance may be complemented with forward provenance records supplied by data consumers. These records may be gathered via a user-driven feedback approach. Feedback from data consumers gives valuable insights into the application and assessment of published data sets. This might include descriptions of a scientific analysis in which the data sets were used, a corrected version of an actual data set, or any discovered issues and suggestions concerning the quality of the published data sets. Data providers might then use this information to handle erroneous data and to improve existing metadata and their data collection and processing methods. Contributors can use the feedback channel to share their scientific analyses. Data consumers can learn more about data sets based on other people's experiences, and potentially save time by avoiding the need to interpret or clean data sets. The goals of the study are to capture feedback from data users on published research data sets, link this feedback to the actual data sets, and finally support search and discovery of research data using feedback information. This paper reports preliminary results addressing these goals.
We provide a summary of current practices for gathering feedback from end-users on research data portals, and discuss their relevance and limitations. Examples from the Earth Science domain of how commentaries from data users might be useful in practice are also included. We then present a data model representing key aspects of user feedback and propose a system architecture to gather and manage feedback from end-users. We describe how the core PROV model may be used to represent the provenance of user feedback information. Technical solutions for linking feedback to existing data portals are also specified.
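The paper's exact vocabulary is not reproduced here, but the idea of describing feedback provenance with the W3C PROV model can be sketched as follows. This is an illustration only; identifiers such as `ex:feedback-42` and the field layout are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: a user-feedback record on a data set expressed as
# PROV-style (subject, predicate, object) triples. The "ex:" identifiers
# are invented for illustration.
PROV = "http://www.w3.org/ns/prov#"

def prov(term):
    """Expand a PROV term to its full IRI."""
    return PROV + term

# A feedback entry is a prov:Entity attributed to the data consumer who
# wrote it and derived from the data set it comments on.
triples = [
    ("ex:feedback-42", "rdf:type", prov("Entity")),
    ("ex:feedback-42", prov("wasAttributedTo"), "ex:data-consumer-7"),
    ("ex:feedback-42", prov("wasDerivedFrom"), "ex:dataset-123"),
    ("ex:feedback-42", prov("generatedAtTime"), "2015-06-01T12:00:00Z"),
]

def objects(triples, subject, predicate):
    """Simple lookup: all objects for a given subject and predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Which data set does this feedback refer to?
print(objects(triples, "ex:feedback-42", prov("wasDerivedFrom")))
```

A triple store or rdflib graph would replace the plain list in practice; the point is only that attribution and derivation links connect feedback back to the published data set.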

The convergence of the internet and wireless communication has led to the popularity of handheld devices. People have started demanding services that can be delivered anytime, anywhere, called Location Based Services (LBS). This paper deals with the development of location-based services on handheld devices that apply to emergency services. Handheld devices suffer from serious constraints in three areas: memory size, processor speed and screen size. The application uses the client-server concept within a wireless internet environment. A positioning service such as GPS is used to determine the position of the user. The objective of this research is to display spatial queries on the required spatial information within handheld devices running different operating systems such as WinCE, Palm OS and Symbian, which is a key strength of the proposed system. The system thus assists people, e.g. in an emergency, to find the shortest path to the nearest hospital. The application will...

Temporal constraints play an important role in the specification and implementation of clinical trial protocols, and subsequently, in the querying of the generated trial data. Protocols specify a temporal schedule of clinical trial activities such as tests, procedures, and medications. The schedule includes temporal constraints on the sequence of these activities, on their duration, and on potential cycles. In this paper, we present our approach to formally represent temporal constraints found in clinical trials. We have identified a representative set of temporal constraints found in protocols to study immune tolerance. Our research group has developed a temporal constraint ontology that allows us to formulate the temporal constraints to the extent required to support clinical trials management. We use this ontology to provide temporal annotation of clinical activities in an encoded clinical trial protocol. We have developed a temporal model that represents time-stamped data and fa...

Data Science Journal, 2021
Funders and policy makers have strongly recommended the uptake of the FAIR principles in scientific data management. Several initiatives are working on the implementation of the principles and on standardized applications to systematically evaluate data FAIRness. This paper presents practical solutions, namely metrics and tools, developed by the FAIRsFAIR project to pilot the FAIR assessment of research data objects in trustworthy data repositories. The metrics are mainly built on the indicators developed by the RDA FAIR Data Maturity Model Working Group. The tools' design and evaluation followed an iterative process. We present two applications of the metrics: an awareness-raising self-assessment tool and an automated FAIR data assessment tool. Initial results of testing the tools with researchers and data repositories are discussed, and future improvements are suggested, including the next steps to enable FAIR data assessment in the broader research data ecosystem.
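To make the idea of automated FAIRness checks concrete, a toy metric evaluation over a metadata record might look like the following. The metric names and pass criteria are simplified stand-ins invented for this sketch, not the FAIRsFAIR metrics themselves.

```python
# Illustrative sketch only: a toy FAIR-style assessment over a metadata
# record. Field names, metric names, and pass criteria are assumptions.
def assess(metadata):
    """Return a dict of metric name -> pass/fail plus an overall score."""
    results = {
        # F: the record carries a globally unique, persistent identifier.
        "F1_persistent_identifier": str(
            metadata.get("identifier", "")
        ).startswith("https://doi.org/"),
        # R: a machine-readable licence is stated.
        "R1_license": bool(metadata.get("license")),
        # I: a recognised metadata standard is referenced.
        "I1_metadata_standard": metadata.get("schema") in {"datacite", "dublin_core"},
    }
    results["score"] = sum(results.values())
    return results

record = {
    "identifier": "https://doi.org/10.5281/zzz.example",  # hypothetical DOI
    "license": "CC-BY-4.0",
    "schema": "datacite",
}
print(assess(record)["score"])  # 3 of 3 toy metrics pass
```

A real assessment tool would resolve the identifier, parse embedded metadata, and grade maturity levels rather than returning a binary pass/fail per metric.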
Integrating data and analysis technologies within leading environmental research infrastructures: Challenges and approaches
Ecological Informatics, 2021
The Implementation of Questionnaire Design Principles Via Online Questionnaire Builder
Online Questionnaire Builder (OQB) is web-based survey software that provides a complete set of tools for users to conduct the overall survey process, from questionnaire design and distribution to the presentation of the survey results. This paper describes the implementation of a comprehensive set of guidelines for the design of online questionnaires via our survey software. The guidelines are drawn from disparate existing studies. The implementation of the design principles mainly concerns the survey structure, layout, navigation, formatting, response format and question types. The design principles are incorporated within the survey creation software to guide questionnaire design according to best practice, while retaining the benefits of online questionnaire delivery.
Älyä Havaintojen Yhdistämiseen

ISPRS International Journal of Geo-Information, 2015
The worldwide Sensor Web comprises observation data from diverse sources. Each data provider may process and assess datasets differently before making them available online. This information is often invisible to end users. Publishing observation data with quality descriptions is therefore vital, as it helps users assess the suitability of data for their applications. It is also important to capture contextual information concerning data quality, such as provenance, to trace incorrect data back to its origins. The Open Geospatial Consortium (OGC)'s Sensor Web Enablement (SWE) framework currently lacks a practically applicable approach for systematically representing these aspects and making them accessible. This paper presents Q-SOS, an extension of the OGC's Sensor Observation Service (SOS) that supports retrieval of observation data together with quality descriptions. These descriptions are represented in an observation data model covering various aspects of data quality assessment. The service and the data model have been developed based on open standards and open source tools, and are in productive use sharing observation data from the TERENO observatory infrastructure. We discuss the advantages of deploying the presented solutions from data provider and consumer viewpoints. Enhancements applied to the related open-source developments are also introduced.
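The consumer-side benefit of quality-annotated observations can be sketched with a small example. The field names and flag vocabulary below are invented for illustration; they do not reproduce the Q-SOS data model.

```python
# Hypothetical sketch: observations carrying quality descriptions, as a
# quality-aware observation service might expose them. Field names and
# flag values ("ok", "suspect") are assumptions for this example.
observations = [
    {"time": "2014-01-01T00:00Z", "value": 2.4, "quality_flag": "ok"},
    {"time": "2014-01-01T01:00Z", "value": -999.0, "quality_flag": "suspect"},
    {"time": "2014-01-01T02:00Z", "value": 2.7, "quality_flag": "ok"},
]

def usable(obs, accepted=("ok",)):
    """Keep only observations whose quality flag is acceptable."""
    return [o for o in obs if o["quality_flag"] in accepted]

print(len(usable(observations)))  # 2
```

Without the flags, a consumer would have to guess that -999.0 is a fill value; with them, filtering is explicit and reproducible.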
Building Location-based Service System with Java Technologies
International Conference on Information Technology in Asia, 2005
The growing use of Java in Location-based Services provides an opportunity to find solutions to problems and challenges in the rapidly changing telecommunications environment. This paper describes the development of location-based service components using Java technologies, including J2ME, Servlet, Java Server Pages (JSP) and the XML Java Binding Tool. The developed components are the location server simulator, the location service application and the device client application. This study supports BT's launch of the User Location Service on the prototype ERICA mobile application platform by supporting the testing and validation of the platform components.
Affordances as Qualities
International Conference on Formal Ontology in Information Systems, 2010

Incorporating Quality Control Information in the Sensor Web
The rapid development of sensing technologies has led to the creation of large amounts of heterogeneous environmental observations. The Sensor Web provides wider access to sensors and observations via common protocols and specifications. Observations typically go through several levels of quality control and aggregation before they are made available to end-users. Raw data are usually inspected, and related quality flags are assigned. Data are gap-filled, and errors are removed. New data series may also be derived from one or more corrected data sets. Until now, it has been unclear how this kind of information can be captured in the Sensor Web Enablement (SWE) framework. Apart from the quality measures (e.g., accuracy, precision, tolerance, or confidence), the levels of observational series, the changes applied, and the methods involved must be specified. It is important that this quality control information is well described and communicated to end-users to allow for better usage and interpretation of data products. In this paper, we describe how quality control information can be incorporated into the SWE framework. First, we introduce TERENO (TERrestrial ENvironmental Observatories), an initiative funded by the large research infrastructure programme of the Helmholtz Association in Germany. The main goal of the initiative is to facilitate the study of long-term effects of climate and land use changes. The TERENO Online Data RepOsitORry (TEODOOR) is a software infrastructure that supports the acquisition, provision, and management of observations within TERENO via SWE specifications and several other OGC web services. Next, we specify the changes made to the existing observational data model to incorporate quality control information, and describe the underlying TERENO data policy in terms of provision and maintenance issues.
We present the data levels and their implementation within TEODOOR. The data levels are adapted from those used by other similar systems such as CUAHSI, EarthScope and WMO. Finally, we outline recommendations for future work.

One of the most promising areas of education is the development of computer-based teaching materials, especially interactive multimedia programs. Interactive multimedia allows independent and interactive learning, and presents learning information to learners in engaging and meaningful new ways. This paper delivers the theoretical concepts and design of a multimedia courseware called 'MyLexic', the first learning tool to nurture interest in basic Malay-language reading among preschool dyslexic children in Malaysia. The theoretical framework proposed in the study is based on research in dyslexia theory together with Dual Coding Theory, Structured Multi-sensory Phonic Teaching and the Scaffolding instructional technique. Detailed explanations of its learning content are also discussed. It is hoped that the courseware will contribute significantly to the development of technology in Malay-language education for dyslexics in Malaysia.

TERENO-MED: Terrestrial Environmental Observatories in the Mediterranean Region
The Mediterranean region is one of the most imperilled regions in the world concerning present and future water scarcity. The region is delicately positioned at the crossroads between East and West, interlinking Europe, Asia and Africa. Societal and economic changes driving population growth, industrialisation and urbanisation lead to significant increases in food, water and energy demand. Hence, natural resources such as water and soils, as well as ecosystems, are put under pressure, and water availability and quality will be severely affected in the future. At the same time, climate and extreme event projections from climate models for the Mediterranean are, unlike for most regions worldwide, consistent in their trends across various scenarios. This consistency shows that the Mediterranean will face some of the most severe increases in dryness worldwide (based on consecutive dry days and soil moisture), and the projections indicate a decrease of up to 50 % in available water resources within the next 50-100 years. These developments are accentuated by the fact that in many Mediterranean countries, natural renewable water resources are already fully exploited or over-exploited today, mainly due to agricultural irrigation but also to tourism. At the same time, the Mediterranean region is a global hot spot of freshwater biodiversity, with a high proportion of endemic and endangered species. While trend projections for water availability and climate change derived from global studies are consistent, regional patterns and heterogeneities, as well as local adaptation measures, will largely determine the functioning of societies and the health of ecosystems. However, a lack of environmental data prohibits the development of sustainable adaptation measures to water scarcity on a scientific basis.
Building on the experience gained in the national TERENO network, a Mediterranean observatory network will be set up, coordinated by two Helmholtz Centres and jointly operated with local partners across the Mediterranean region. In a number of Mediterranean mesoscale hydrological catchments, TERENO-MED will investigate the long-term effects of global change on the quality and dynamics of water resources in human-influenced environments under water scarcity. The Helmholtz Centres UFZ (overall coordinator) and FZJ have therefore initiated the set-up of a network of global change observatories in 5-10 Mediterranean river catchments. The TERENO-MED observatories will:
- investigate societally relevant water problems in the context of 'typical' Mediterranean environments,
- provide long-term, quality-controlled data to the scientific community,
- be operated and maintained through local research institutes and universities,
- establish common monitoring platforms and foster synergies between research organizations,
- provide solutions to pressing local and regional water problems by building partnerships between scientific partners and regional authorities.

Observations produced by environmental sensors and humans signify naturally occurring events. While technology is available for inferring events from sensor streams, knowledge about event inference rules and event construction principles is often hidden in application logic or needs to be informally specified by domain experts. Semantic formalisms support the description of spatial event inference procedures and provide a way to generalize them across implementation frameworks. Furthermore, there is a lack of algorithms flexible enough to capture both spatial and in-situ event inference, where spatial events extend beyond a single in-situ sensor. In this paper, we demonstrate how spatial events can be formally specified as bounded wholes connected by a process simulator. This formal blueprint can be used to describe different event inference methods. Unlike simple event rules, the approach accounts for recursive conditions of spatial and temporal identity of events, e.g., lag times, and allows inference of event completeness. We propose corresponding event inference algorithms that can be used to compute process graphs and to generate and publish events as RDF. We evaluate our approach using an officially published blizzard data set.
International Journal of Geographical Information Science, 2014
The Sensor Web provides wider access to sensors and their observations via the Web. A key challenge is to infer information about geographic events from these observations. A systematic approach to the representation of domain knowledge is vital when reasoning about events due to heterogeneous observational sources. This paper delivers a formal model capturing the relations between observations and events. The model is exploited with a rule-based mechanism to infer information about events from in-situ observations. The paper also describes how the model's vocabularies are used to formulate spatiotemporal queries. A use case for reasoning about blizzard events based on real time series illustrates the formal model.
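The flavour of rule-based event inference from in-situ time series can be sketched with a minimal blizzard rule. The thresholds below follow the common meteorological definition of a blizzard (sustained wind of roughly 15 m/s or more, visibility below about 400 m, lasting at least 3 hours); they are stated as assumptions, not the paper's actual rules, and consecutive-hour handling is simplified.

```python
# Minimal sketch of rule-based event inference over hourly in-situ
# observations. Thresholds are assumed, not taken from the paper, and
# the duration check counts matching readings rather than requiring
# strict consecutiveness, for brevity.
def blizzard_hours(series):
    """Return timestamps at which the blizzard rule fires."""
    return [
        obs["time"]
        for obs in series
        if obs["wind_ms"] >= 15 and obs["visibility_m"] < 400
    ]

def is_blizzard(series, min_hours=3):
    """A blizzard event requires the rule to hold for min_hours readings."""
    return len(blizzard_hours(series)) >= min_hours

series = [
    {"time": t, "wind_ms": w, "visibility_m": v}
    for t, w, v in [(0, 16, 300), (1, 17, 250), (2, 18, 200), (3, 9, 900)]
]
print(is_blizzard(series))  # True: the rule fires for 3 readings
```

In the formal model, such a rule would be expressed against the model's vocabularies and evaluated by a reasoner rather than hard-coded, allowing the same spatiotemporal query to run over heterogeneous observation sources.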
Semantic Web and Beyond, 2011
Georeferencing and semantic annotations improve the findability of geoinformation because they exploit relationships to existing data and hence facilitate queries. Unlike georeferencing, which grounds location information in reference points on the earth's surface, semantic annotations often lack relations to entities of shared experience. We suggest an approach to semantically reference geoinformation based on underlying observations, relating data to observable entities and actions. After discussing an ontology for an observer's domain of experience, we demonstrate our approach through two use cases. First, we show how to distinguish geosensors based on observed properties and abstracting from technical implementations. Second, we show how to complement annotations of volunteered geographic information with observed affordances.