Towards an Ontology for Propaganda Detection in News Articles
2021
https://doi.org/10.1007/978-3-030-80418-3_35
24 pages
Abstract
This paper proposes an ontology for detecting propaganda techniques in news articles, focusing on the identification of various rhetorical strategies and the classification of these techniques within a structured framework. The ontology aims to model the socio-political context surrounding propaganda and establish a data-driven, iterative approach to enhance the quality and precision of propaganda detection, while evaluating its effectiveness through expert assessments and predefined metrics.
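As a purely illustrative sketch of how such a structured framework might organise techniques, consider a tiny two-level taxonomy with a lookup from technique to category. The category names and groupings below are hypothetical and not taken from the paper's ontology:

```python
# Illustrative sketch only: a tiny two-level taxonomy of propaganda
# techniques. Category names and groupings are hypothetical, NOT the
# paper's actual ontology classes.
TAXONOMY = {
    "Emotional appeal": [
        "Loaded Language",
        "Appeal to fear/prejudice",
        "Flag-waving",
    ],
    "Logical fallacy": [
        "Causal Oversimplification",
        "Black-and-white Fallacy, Dictatorship",
        "Misrepresentation of Someone's Position (Straw Man)",
    ],
    "Attention diversion": [
        "Presenting Irrelevant Data (Red Herring)",
    ],
}

def parent_class(technique):
    """Return the higher-level category a technique belongs to, or None."""
    for category, techniques in TAXONOMY.items():
        if technique in techniques:
            return category
    return None

print(parent_class("Loaded Language"))  # -> Emotional appeal
```

An ontology adds more than a dictionary (axioms, relations, reasoning support), but even this flat structure shows the classification step the abstract describes: mapping a detected rhetorical strategy to its place in the framework.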
Related papers
2011
The National Institute for Standards and Technology sponsored a workshop in October 2007 on the subject of ontology evaluation. An international group of invited experts met for two days to discuss problems in measuring ontology quality. The workshop highlighted several divisions among ontology developers regarding approaches to ontology evaluation. These divisions were generally reflective of the opinions of the participants. However, the workshop documented a paucity of empirical evidence in support of any particular position. Given the importance of ontologies to every knowledge-intensive human activity, there is an urgent need for research to develop an empirically derived knowledge base of best practices in ontology engineering and methods for assuring ontology quality over time. This is a report of the workshop discussion and brainstorming by the participants about what such a research program might look like.
Several persistent problems motivated the workshop: the lack of a systematic method for evaluating ontologies, inadequate techniques for verification and validation, the lack of standard methods for comparing ontologies, and a paucity of real-world applications demonstrating the effectiveness of ontologies. To address these issues, a workshop was held at the National Institute of Standards and Technology on October 26th and 27th, 2007 to generate a research plan for the development of systematic methods for evaluating ontologies. The co-chairs of the workshop were Ram D. Sriram (National Institute of Standards and Technology), Mark A. Musen (Stanford University), and Carol A. Bean (National Institutes of Health). The topics for the workshop included the following:
- Representation. The language in which an ontology is expressed (its metalanguage) should be used according to its intended syntax and semantics, to ensure that the ontology is properly understood by the user community and by computer-based tools. This topic addresses how to check that an ontology is using its metalanguage properly.
- Accuracy. A well-constructed ontology is not very useful if its content is not accurate. This topic concerns methods to ensure that an ontology reflects the latest domain knowledge.
- Reasoners. An ontology can support automatic computation of knowledge that is otherwise not obvious in the ontology. This topic addresses how to determine that automatically deduced information is consistent and valid.
- Performance metrics. Reasoners and other computational services are not very useful if they consume too many resources, including compute time. This topic concerns the bounds that users should expect from various kinds of computational services.
- Tools and Testbeds. Ontology evaluation is a complex task that can be facilitated by testing environments, graphical tools, and automation of some aspects of evaluation. This topic addresses computer-aided ontology evaluation.
- Certification. Ontologies that pass rigorous evaluation should be recognized by the community, to encourage the development and adoption of those of higher quality. This topic concerns methods for official recognition of ontologies meeting high standards.
Of particular concern is the role of social engineering in developing practices and tools that support the routine assessment and review of ontologies by the people who use them. The workshop had several presentations and breakout sessions, which this report summarizes. In our report of the discussions following each presentation, we use the abbreviation AM to denote an audience member, unless otherwise specified. Additional resources related to the workshop, including slides from each of the presentations, are available at http://sites.google.com/a/cme.nist.gov/workshop-on-ontology-evaluation/Home/. Summaries of the presentations, except Michael Uschold's talk entitled "Evaluating Ontologies based on Requirements," are provided below.
2nd Interdisciplinary …, 2009
ABSTRACT. After almost two decades of the design, development, justification, and testing/implementation of a natural language ontology in the Direct Meaning Access (DMA) ontological semantic school, the paper revisits the crucial issues of the relationship of its on- ...
Informatica (slovenia), 2010
This paper addresses the process of ontology extension for a selected domain of interest, defined by keywords and a glossary of relevant terms with descriptions. A new methodology for semiautomatic ontology extension, aggregating elements of text-mining and user-dialog approaches to ontology extension, is proposed and evaluated. We conduct a set of ranking, tagging, and illustrative question-answering experiments using the Cyc ontology and a collection of business news. We evaluate the importance of using the textual content and structure of an ontology concept in the process of ontology extension. The experiments show that the best results are obtained by giving more weight to ontology concept content and less weight to ontology concept structure. (Povzetek, translated from Slovenian: The paper describes the process of extending an existing concept ontology.)
2004
The evaluation of ontologies is vital for the growth of the Semantic Web. We consider a number of problems in evaluating a knowledge artifact like an ontology. We propose in this paper that one approach to ontology evaluation should be corpus or data driven. A corpus is the most accessible form of knowledge and its use allows a measure to be derived of the 'fit' between an ontology and a domain of knowledge. We consider a number of methods for measuring this 'fit' and propose a measure to evaluate structural fit, and a probabilistic approach to identifying the best ontology.
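One simple way to operationalise a corpus-driven 'fit' measure of the kind this abstract describes is lexical coverage: the fraction of ontology concept labels attested in the corpus. This is a sketch under that assumption, not the authors' actual measure:

```python
import re

def concept_coverage(concept_labels, corpus_text):
    """Fraction of ontology concept labels fully attested in a corpus.

    A crude proxy for ontology/domain 'fit': a label counts as covered
    when every word of the label occurs in the corpus vocabulary.
    """
    vocab = set(re.findall(r"[a-z]+", corpus_text.lower()))
    if not concept_labels:
        return 0.0
    covered = sum(
        1 for label in concept_labels
        if all(word in vocab for word in label.lower().split())
    )
    return covered / len(concept_labels)

labels = ["news article", "propaganda technique", "quantum chromodynamics"]
text = "Each news article is annotated with the propaganda technique it uses."
print(round(concept_coverage(labels, text), 2))  # -> 0.67
```

A probabilistic variant, as the abstract suggests, would weight concepts by corpus frequency rather than scoring bare presence, and a structural-fit measure would additionally compare the ontology's relations against co-occurrence patterns in the text.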
2013
Building an ontology for a specific domain can start from scratch (Cristani and Cuel, 2005) or by modifying an existing ontology (Gómez-Pérez and Rojas-Amaya, 1999). In both cases, techniques for evaluating the characteristics and the validity of the ontology are necessary. Not only might such techniques be useful during the
The aim of natural language ontology is to uncover the ontological categories and structures that are implicit in the use of natural language, that is, that a speaker accepts when using a language. This talk aims to clarify how natural language ontology relates to other projects in metaphysics, what sorts of linguistic data it should and should not take into account and why natural language ontology is important.
Research trends in geographic information science, 2009
The chapter shows how minimal assumptions on difficult philosophical questions suffice for an engineering approach to the semantics of geospatial information. The key idea is to adopt a conceptual view of information system ontologies with a minimal but firm grounding in reality. The resulting constraint view of ontologies suggests mechanisms for grounding, for dealing with uncertainty, and for integrating folksonomies. Some implications and research needs beyond engineering practice are discussed.
Proceedings, 2017
Ontologies have emerged as a common way of representing knowledge. Recently, people with minimal domain background or ontology-engineering experience have been developing ontologies, leading to a corpus of informal and under-evaluated ontologies. Existing ontology evaluation approaches require rigorous application of formal methods and the knowledge of domain experts, which can be cumbersome or tedious. We propose a lightweight approach for evaluating the sufficiency of ontologies based on Natural Language Processing techniques. The approach consists of verifying the extent of coverage of the concepts and relationships of ontologies against words in a domain corpus. As a case study, we applied our approach to evaluate the sufficiency of ontologies in two example domains - Education (Curriculum) and Security (Phishing). We show that our approach yields promising results, is less effort-intensive, and is comparable with existing evaluation methods.
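A minimal sketch of this kind of lightweight sufficiency check, run in the opposite direction of coverage: which frequent corpus words have no counterpart in the ontology at all. The function name, stop-word list, and tokenisation are illustrative assumptions, not the authors' method:

```python
import re
from collections import Counter

def missing_terms(ontology_terms, corpus_text, top_n=5):
    """Most frequent corpus words with no counterpart in the ontology.

    A lightweight sufficiency check: if salient domain words never appear
    among the ontology's terms, the ontology may not cover the domain
    sufficiently. Stop-word list and tokenisation are crude by design.
    """
    onto_vocab = {w for term in ontology_terms for w in term.lower().split()}
    stop = {"the", "a", "an", "of", "and", "to", "is", "in", "that"}
    words = [
        w for w in re.findall(r"[a-z]+", corpus_text.lower())
        if w not in stop and w not in onto_vocab
    ]
    return [w for w, _ in Counter(words).most_common(top_n)]

onto = ["propaganda technique", "news article"]
text = ("Loaded language dominates the corpus; loaded language "
        "and slogans recur in every article.")
print(missing_terms(onto, text, top_n=2))  # -> ['loaded', 'language']
```

A fuller version along the lines the abstract describes would also stem or lemmatise tokens and check relationship labels, not just concept labels, against the corpus.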
2006
Recent years have seen rapid progress in the development of ontologies as semantic models intended to capture and represent aspects of the real world. There is, however, great variation in the quality of ontologies. If ontologies are to become progressively better in the future, more rigorously developed, and more appropriately compared, then a systematic discipline of ontology evaluation must be created to ensure quality of content and methodology. Systematic methods for ontology evaluation will take into account representation of individual ontologies, performance and accuracy on tasks for which the ontology is designed and used, degree of alignment with other ontologies and their compatibility with automated reasoning. A sound and systematic approach to ontology evaluation is required to transform ontology engineering into a true scientific and engineering discipline. This chapter discusses issues and problems in ontology evaluation, describes some current strategies, and suggests some approaches that might be useful in the future.

Propaganda techniques (https://propaganda.qcri.org/semeval2020-task11/):
- Presenting Irrelevant Data (Red Herring)
- Exaggeration or Minimisation
- Misrepresentation of Someone's Position (Straw Man)
- Causal Oversimplification
- Obfuscation, Intentional vagueness, Confusion
- Appeal to authority
- Black-and-white Fallacy, Dictatorship
- Name calling or labeling
- Loaded Language
- Flag-waving
- Appeal to fear/prejudice
- Slogans
- Thought-terminating cliché
- Bandwagon
- Reductio ad hitlerum
- Repetition
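These techniques come from the SemEval-2020 Task 11 shared task linked above, whose training labels are distributed as tab-separated spans (article id, technique name, start and end character offsets). The parsing sketch below assumes that layout:

```python
from dataclasses import dataclass

@dataclass
class TechniqueSpan:
    """One labelled propaganda span. Fields assume the tab-separated
    layout of the SemEval-2020 Task 11 label files: article id,
    technique name, start offset, end offset."""
    article_id: str
    technique: str
    start: int
    end: int

def parse_labels(tsv_text):
    """Parse TSV label lines into TechniqueSpan records."""
    spans = []
    for line in tsv_text.strip().splitlines():
        article_id, technique, start, end = line.split("\t")
        spans.append(TechniqueSpan(article_id, technique, int(start), int(end)))
    return spans

sample = "111111111\tLoaded Language\t10\t24\n111111111\tRepetition\t40\t52"
print(parse_labels(sample)[1].technique)  # -> Repetition
```

Character offsets rather than token indices let a span cross sentence boundaries, which matters for techniques like Repetition that operate over longer stretches of text.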