Dialog Systems and their Inputs
https://doi.org/10.1007/978-3-642-39476-8_121…
Abstract
One of the main limitations of existing domain-independent conversational agents is that their general and linguistic knowledge is limited to what the agents' developers explicitly defined. A system that analyses user input at a deeper level of abstraction and backs its knowledge with common-sense information should therefore be capable of providing more adequate responses, which in turn yield a better overall user experience. From this premise, a framework was proposed and a working prototype was implemented upon it. These make use of various natural language processing tools, online and offline knowledge bases, and other information sources to comprehend user input and construct relevant responses.
Related papers
Human Factors and Ergonomics, 2009
The design and development of natural interactive systems requires that the specific aspects of natural communication are taken into account. Besides the capability to understand and generate linguistic expressions, natural language use includes cooperation and the planning of complex actions on the basis of observations of the communicative context, i.e., communicative competence. This chapter discusses natural language dialogue interfaces and develops the view of interactive systems as communicating agents which can cooperate with the user on a shared task. Natural interaction is considered an approach to interface design which attempts to empower different users in various everyday situations to exploit the strategies they have learnt in human-human communication, with the ultimate aim of constructing intelligent and intuitive interfaces that are aware of the context and the user's individual needs. The notion of natural interaction thus refers to the spoken dialogue system's ability to support functionality that the user finds intuitive and easy, i.e., the interface should afford natural interaction. The view is supported by an evaluation study concerning a multimodal route navigation system.
2013
We present Dialog Moves Markup Language (DMML): an extensible markup language (XML) representation of the modality-independent communicative acts of automated conversational agents. In our architecture, DMML is the interface to and from conversational dialog managers for user interactions through any channel or modality. The use of a common XML interface language across different channels promotes high cost efficiency for the business. DMML itself has no application- or domain-specific elements; DMML elements embed elements representing application business logic. DMML captures the abstractions necessary to represent arbitrary multi-agent dialogs and to build cost-efficient, sophisticated natural language dialog systems for business applications.
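The key idea above — a channel-independent communicative act expressed as XML, with business logic embedded inside rather than part of the markup language itself — can be sketched as follows. This is a minimal illustration only; the element and attribute names are our own assumptions, not the actual DMML schema.

```python
import xml.etree.ElementTree as ET

def make_act(act_type, channel, payload):
    """Build a channel-independent communicative act as XML.

    Element names here are illustrative, not the real DMML schema.
    """
    act = ET.Element("communicativeAct", {"type": act_type})
    # The same act could later be rendered for any channel or modality.
    ET.SubElement(act, "channel").text = channel
    # Application business logic is embedded, not part of the markup itself.
    logic = ET.SubElement(act, "businessLogic")
    for key, value in payload.items():
        ET.SubElement(logic, key).text = value
    return ET.tostring(act, encoding="unicode")

xml_str = make_act("request", "web", {"slot": "departureCity"})
print(xml_str)
```

A dialog manager emitting such acts stays decoupled from the rendering layer: only the per-channel renderer needs to know how a `request` act looks on the web versus a voice channel.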
1993
This paper describes a method for the development of dialogue managers for natural language interfaces. A dialogue manager is presented designed on the basis of both a theoretical investigation of models for dialogue management and an analysis of empirical material. It is argued that for natural language interfaces many of the human interaction phenomena accounted for in, for instance, plan-based models of dialogue do not occur. Instead, for many applications, dialogue in natural language interfaces can be managed from information on the functional role of an utterance as conveyed in the linguistic structure. This is modelled in a dialogue grammar which controls the interaction. Focus structure is handled using dialogue objects recorded in a dialogue tree which can be accessed through a scoreboard by the various modules for interpretation, generation and background system access. A sublanguage approach is proposed. For each new application the Dialogue Manager is customized to meet the needs of the application. This requires empirical data which are collected through Wizard of Oz simulations. The corpus is used when updating the different knowledge sources involved in the natural language interface. In this paper the customization of the Dialogue Manager for database information retrieval applications is also described.
Artificial Intelligence, 1977
GUS is the first of a series of experimental computer systems that we intend to construct as part of a program of research on language understanding. In large measure, these systems will fill the role of periodic progress reports, summarizing what we have learned, assessing the mutual coherence of the various lines of investigation we have been following, and suggesting where more emphasis is needed in future work. GUS (Genial Understander System) is intended to engage a sympathetic and highly cooperative human in an English dialog, directed towards a specific goal within a very restricted domain of discourse. As a starting point, GUS was restricted to the role of a travel agent in a conversation with a client who wants to make a simple return trip to a single city in California. There is good reason for restricting the domain of discourse for a computer system which is to engage in an English dialog. Specializing the subject matter that the system can talk about permits it to achieve some measure of realism without encompassing all the possibilities of human knowledge or of the English language. It also provides the user with specific motivation for participating in the conversation, thus narrowing the range of expectations that GUS must have about the user's purposes. A system restricted in this way will be more able to guide the conversation within the boundaries of its competence.
1995
This paper presents an action scheme for dialogue management for natural language interfaces. The scheme guides a dialogue manager which directs the interface's dialogue with the user, communicates with the background system, and assists the interpretation and generation modules. The dialogue manager was designed on the basis of an investigation of empirical material collected in Wizard of Oz experiments. The empirical investigations revealed that in dialogues with database systems users specify an object, or a set of objects, and ask for domain concept information, e.g. the value of a property of that object or set of objects. The interface responds by performing the appropriate action, e.g. providing the required information or initiating a clarification subdialogue. The action to be carried out by the interface can be determined, based on how objects and properties are specified, from information in the user utterance, the dialogue context, and the response from the background system and its domain model. (Jönsson)
Natural Language Engineering, 1997
Natural language interfaces require dialogue models that allow for robust, habitable and efficient interaction. This paper presents such a model for dialogue management for natural language interfaces. The model is based on empirical studies of human-computer interaction in various simple service applications. It is shown that for applications belonging to this class the dialogue can be handled using fairly simple means. The interaction can be modeled in a dialogue grammar with information on the functional role of an utterance as conveyed in the linguistic structure. Focusing is handled using dialogue objects recorded in a dialogue tree representing the constituents of the dialogue. The dialogue objects in the dialogue tree can be accessed by the various modules for interpretation, generation and background system access. Focused entities are modeled in entities pertaining to objects or sets of objects, and related domain concept information; properties of the domain objects. A sim...
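The dialogue-tree mechanism described in the two abstracts above — dialogue objects tagged with a functional role, recorded in a tree whose open path yields the current focus for the interpretation and generation modules — can be sketched roughly as follows. All class and field names are our own illustration, not the papers' implementation.

```python
class DialogueObject:
    """One node of the dialogue tree: an utterance with its functional role."""
    def __init__(self, role, utterance, focused_entity=None):
        self.role = role                  # e.g. "question", "clarification"
        self.utterance = utterance
        self.focused_entity = focused_entity
        self.children = []                # nested subdialogues

class DialogueTree:
    """Records dialogue objects; other modules query it for the current focus."""
    def __init__(self):
        self.root = DialogueObject("root", "")
        self.stack = [self.root]          # path from root to the open subdialogue

    def open(self, obj):
        """Attach a dialogue object under the current node and descend into it."""
        self.stack[-1].children.append(obj)
        self.stack.append(obj)

    def close(self):
        """End the current subdialogue and return to its parent."""
        if len(self.stack) > 1:
            self.stack.pop()

    def focus(self):
        """Most recently established focused entity along the open path."""
        for obj in reversed(self.stack):
            if obj.focused_entity is not None:
                return obj.focused_entity
        return None

tree = DialogueTree()
tree.open(DialogueObject("question", "Which flights go to Malmo?",
                         focused_entity="flights-to-Malmo"))
tree.open(DialogueObject("clarification", "Which day?"))
print(tree.focus())
```

The point of the shared tree is that a clarification subdialogue inherits its parent's focus, so interpretation can resolve elliptical answers ("Tuesday") against the entity the question established.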
This demo paper describes our Artificial Intelligent Dialogue Agent (AIDA), a dialogue management and orchestration platform under development at the Institute for Infocomm Research. Among other features, it integrates different human-computer interaction engines across multiple domains and communication styles such as command, question answering, task-oriented dialogue and chat-oriented dialogue. The platform accepts both speech and text as input modalities by either direct microphone/keyboard connections or by means of mobile device wireless connection. The output interface, which is supported by a talking avatar, integrates speech and text along with other visual aids.
Information Modeling in the New Millennium, 2001
Intelligent Systems are served by Intelligent User Interfaces, which aim to improve the efficiency, effectiveness and adaptation of the interaction between the user and the computer by representing, understanding and implementing models. The Intelligent User Interface Model (IUIM) helps to design and develop Intelligent Systems by considering their architecture and behavior. It places the Interaction and Dialogue between User and System at the heart of an Intelligent Interactive System. The IUIM comprises an architectural model, which defines the components of the model, and a conceptual model, which relates to its contents and behavior. The conceptual model defines three elements: an Adaptive User Model (including components for building and updating the user model), a Task Model (including general and domain-specific knowledge) and an Adaptive Discourse Model (assisted by an intelligent help and a learning module). We show an implementation of the model by describing an application named Stigma, A STereotypical Intelligent General Matching Agent for Improving Search Results on the Internet. Finally, we compare the new model with others, stating the differences and the advantages of the proposed model.
2004
Conversational agents integrate computational linguistics techniques with the communication channel of the Web to interpret and respond to statements made by users in ordinary natural language. Web-based conversational agents deliver high volumes of interactive, text-based dialogs. Recent years have seen significant activity in enterprise-class conversational agents.
2009 IEEE International Conference on Multimedia and Expo, 2009
The importance of dialog management systems has increased in recent years. Dialog systems are created for domain-specific applications, creating a high demand for a flexible dialog system framework. There are two basic approaches to dialog management: a rule-based approach and a statistical approach. In this paper, we combine both methods to form a hybrid dialog management system in a scalable agent-based framework. To decide the next dialog step, two independent systems are used: the Java Rule Engine (JESS) as an expert system for rule-based solutions, and a Partially Observable Markov Decision Process (POMDP) as a model-based solution for more complex dialog sequences. Using a speech recognizer and text-to-speech systems, the user can be guided through a dialog of approximately ten steps.
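The hybrid control flow described above — consult the rule engine first, fall back to the statistical policy for states the rules do not cover — can be sketched in a few lines. This is a stand-in only: the state names are invented, the "policy" is a hand-written score table rather than a solved POMDP, and the paper's actual system uses JESS for the rule branch.

```python
# Rule branch: deterministic mappings a rule engine (e.g. JESS) would cover.
RULES = {
    "greeting": "greet_back",
    "goodbye": "close_dialog",
}

# Model-based branch: stand-in for a learned policy, giving an expected
# value for each candidate action in a given dialog state.
POLICY_SCORES = {
    "ambiguous_request": {"ask_clarification": 0.8, "guess_intent": 0.3},
}

def next_step(state):
    """Pick the next dialog step: rules first, statistical policy second."""
    if state in RULES:                       # rule-based branch
        return RULES[state]
    scores = POLICY_SCORES.get(state, {})    # model-based branch
    if scores:
        return max(scores, key=scores.get)   # highest-value action
    return "fallback_prompt"                 # neither system covers the state

print(next_step("greeting"))
print(next_step("ambiguous_request"))
```

In a real POMDP the system would maintain a belief distribution over hidden dialog states and maximize expected long-term reward, rather than looking up a single observed state as this sketch does.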
