Papers by Thomas Pederson

I would like to thank my supervisor Lars-Erik Janlert for giving me the freedom to define my own topic, for acting as an excellent sounding board for ideas, and for coming up with many others to complement my own. Your joy and curiosity in discussing the state of the world "as it appears to be", and your constant striving for clarity and simplicity when writing academic text, are two out of many aspects of the good research attitude I will do my best to bring with me and foster as a result of this six-year period of learning and cooperation. Speaking of cooperation, my closest colleagues Anders Broberg and Björn Bengtsson have played important roles both as critics and supporters, as has the Cognitive Computing Group (CCG) as a whole. Many are the research ideas that have been born, tested, and mercilessly killed in this forum. I also want to thank Lennart Edblom for being the coolest head of department at the University, always willing to spend five minutes of his valuable time, keen and open to discussion, and of course, most important of all, playing with unparalleled enthusiasm in the red floorball team every Wednesday. Go Big Red! As for the specific work on this thesis, a number of people have kindly chosen to spend time and effort on proofreading drafts. Apart from my most diligent reader Lars-Erik Janlert, Håkan Gulliksson, Michael Minock, Annabella Loconsole, and Bill Buxton have also provided valuable input. Pointers to relevant information during the writing process have also been provided by Wendy Mackay, Tim Kindberg, and Abigail Sellen. Thank you all for improving the quality of this thesis (from whatever level of quality it started...)! The theoretical part of this thesis is greatly inspired by the Magic Touch system, developed mostly by computing science students as part of their master's thesis projects. Without that work and without Magic Touch, this dissertation would be a completely different thing. Thank you
Pools and Satellites - Intimacy in the City. Katja Battarbee | University of Art and Design Helsinki
This paper addresses the issue of mediating intimacy in order to support city communities. What is intimacy, and how can it be mediated through the introduction of new technology in a community? The paper illustrates the discussion by describing two explorative information and communication technology concepts and scenarios.
A Cup of Tea & a Piece of Cake: Integration of Virtual Information Workspaces Inspired by the Way We May Think
This thesis concerns the design of an information navigation and retrieval system where some navigation mechanisms have inherited functionality from concepts found in cognitive science. These mechanisms are intended to improve navigation and information retrieval in large information spaces such as the World Wide Web, as well as to enhance the possibilities for creative thinking. A model describing the nature of different information spaces and how they could be integrated is developed, and a proposal for a 3D user interface based on the model is introduced.
Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society, 2020
The InvisibleAI (InvAI'20) workshop aims to systematically discuss a growing class of interactive systems that invisibly shift some decision-making tasks from humans to machines, based on recent advances in artificial intelligence (AI), data science, and sensor or actuation technology. While interest in the affordances as well as the risks of hidden pervasive AI is high on the agenda in public debate, discussion on the topic is also needed within the human-computer interaction (HCI) community.
Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, Sep 11, 2017
[Table: system functionalities (highlighting vs. hiding information/objects) versus human perceivability (unavoidable vs. subtle stimuli); example techniques include highlighting in augmented reality, diminished reality, subliminal cueing for highlighting or distracting, and change blindness for highlighting or hiding.]

EAI Endorsed Transactions on Pervasive Health and Technology, 2017
In this paper, we present our body-and-mind-centric approach for the design of wearable personal assistants (WPAs), motivated by the fact that such devices are likely to play an increasing role in everyday life. We also report on the utility of such a device for orthopedic surgeons in hospitals. A prototype of the WPA was developed on Google Glass for supporting surgeons in three different scenarios: (1) touch-less interaction with medical images, (2) tele-presence during surgeries, and (3) mobile access to Electronic Patient Records (EPR) during ward rounds. We evaluated the system in a clinical simulation facility and found that while the WPA can be a viable solution for touch-less interaction and remote collaboration during surgeries, using the WPA in ward rounds might interfere with social interaction between clinicians and patients. Finally, we present our ongoing exploration of gaze and gesture as alternative input modalities for WPAs, inspired by the hospital study.

There is a growing consensus within the field of Human-Computer Interaction (HCI) that the keyboard, mouse, and visual display of the PC era have to be replaced with something more mobile and more adaptable to situations where interaction with computers has until now simply not been possible. Moreover, the presence of interactive computing power literally everywhere implies that computing systems have to take the physical context of their users into account. Open issues such as these are core problem areas in fields such as Augmented Reality, Tangible User Interfaces, Context Awareness, and Ubiquitous and Wearable Computing. Most existing efforts investigating and designing for this new kind of everyday computing tend to be severely hampered by the absence of a framework that could define the roles of objects in dynamically reconfigured mixed-reality environments. This position paper describes our ongoing work in developing such a framework, incorporating, among other things, the idea of emerging physical-virtual "applications" based on how collections of everyday objects are exposed to a specific human agent in the course of everyday activities.
Proceedings of the 4th International Conference on Tangible and Embedded Interaction 2010, Cambridge, MA, USA, January 24-27, 2010
International Conference on Tangible and Embedded Interaction, 2010

Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design - NordiCHI '12, 2012
Nowadays, several commercial and academic solutions are available for measuring and monitoring people's health status. However, most of these applications have remained in research laboratories and are not used pervasively, even though our survey points to a high demand for this kind of solution in society. In this paper, we examine some of the practical challenges in developing health monitoring systems by designing, developing, and evaluating a simple wearable mobile health monitoring system for kids. The project started with a survey among parents to understand user requirements; we also interviewed a doctor as a domain expert, and finally a wearable prototype of the system was developed and evaluated. Our findings show that downsizing the wearable sensors increases the acceptability of these devices by children; however, placing different sensors at a single point increases noise in the observed data, and removing outliers, smoothing the data, and using domain experts' knowledge help increase the reliability of the system's results.
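To make the data-cleaning step mentioned above concrete, here is a minimal, hypothetical Python sketch (not code from the project; the z-score threshold, window size, and sample values are illustrative assumptions) of removing outliers from noisy wearable-sensor readings and then smoothing them with a moving average:

import statistics

def clean_signal(samples, z_thresh=2.0, window=5):
    # Drop samples that deviate strongly from the mean (outlier removal),
    # then smooth the remainder with a trailing moving average.
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples) or 1.0
    kept = [s for s in samples if abs(s - mean) / stdev <= z_thresh]
    smoothed = []
    for i in range(len(kept)):
        lo = max(0, i - window + 1)
        smoothed.append(sum(kept[lo:i + 1]) / (i + 1 - lo))
    return smoothed

# Hypothetical heart-rate readings (beats per minute) with one spurious spike
readings = [82, 84, 83, 180, 85, 86, 84, 83]
print(clean_signal(readings))

The actual system would of course combine several sensors and domain-expert knowledge, as the abstract describes; the sketch only illustrates the outlier-removal and smoothing idea.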
An egocentric approach towards ubiquitous multimodal interaction
Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers - UbiComp '15, 2015
In this position paper we present our take on the possibilities that emerge from a mix of recent ideas in interaction design, wearable computers, and context-aware systems, which taken together could allow us to get closer to Mark Weiser's vision of calm computing. Multisensory user experience plays an important role in this approach.
IEEE Pervasive Computing, 2015
Shahram Jalaliniya is a PhD fellow at the IT University of Copenhagen, where he is a member of the Pervasive Interaction Technology (PIT) Lab. His research interests include wearable computing, HCI, pervasive computing, and multimodal interaction. Jalaliniya holds master's degrees in information systems from Lund University and in software and technology from the IT University of Copenhagen.

The main part of this position paper presents our emerging design framework "egocentric interaction", aimed at helping structure the design of support systems for personal everyday activities. The particularity of the proposed model is a) the choice to centre the activity modelling around a specific human body rather than a computing device or other artefact, and b) the attempt to cover object manipulation performed by human individuals both in the real world and in the virtual world (i.e. taking place "inside" interactive computing devices). The idea of complementing the egocentric interaction framework with existing concepts from the area of Tangible User Interfaces is raised, motivated by the need to model everyday object manipulation in more detail. The last section of the paper relates our design approach, which focuses on the "what" rather than the "how", to more technology-driven design approaches.

Egocentric Interaction. The egocentric interaction framework differs from more classical HCI models by explicitly ignoring the input and output devices of interactive computers such as PCs, PDAs, and cellular phones, seeing them as completely transparent mediators for accessing virtual objects. Doing so permits modelling real-world and digital entities as if they were situated in the same Euclidean space. We believe this is advantageous when modelling everyday mobile computing applications, where the interaction complexity vastly surpasses what can be adequately described by a classical human-computer dialogue model. By viewing the physical and the virtual worlds as equally important for human activity, the proposed physical-virtual perspective models the relationship between the physical and virtual worlds in a completely different way than is typical in, for instance, context-awareness research, where the physical world is almost always treated as mere context to the virtual world. The term 'egocentric' has been chosen to signal that it is the body and mind of a specific human individual that (sometimes literally, as will be shown later) act as the centre of reference to which all interaction modelling and activity support is anchored.
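As a rough illustration of the modelling idea, the following sketch (purely hypothetical, not part of the published framework; entity names, the reach threshold, and the flat position representation are assumptions) treats physical and virtual objects uniformly as entities positioned in one Euclidean space anchored to a specific human body, so that a question like "what is currently within reach?" can be answered without caring whether an object is real or digital:

from dataclasses import dataclass
import math

@dataclass
class Entity:
    name: str
    kind: str        # "physical" or "virtual"
    position: tuple  # (x, y, z) in metres, relative to the person's body

def within_reach(entities, reach_m=0.8):
    # Return all entities, physical or virtual, inside the person's action space.
    return [e for e in entities
            if math.dist((0.0, 0.0, 0.0), e.position) <= reach_m]

scene = [
    Entity("coffee cup", "physical", (0.3, 0.1, 0.4)),
    Entity("draft report", "virtual", (0.5, 0.0, 0.3)),
    Entity("bookshelf", "physical", (2.5, 0.0, 1.0)),
]
for e in within_reach(scene):
    print(e.name, e.kind)

The point of the sketch is only to show the body, rather than any device, acting as the centre of reference to which both physical and virtual objects are anchored.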
Egocentric Interaction—A Design and Modelling Framework for Situative Physical-Virtual Applications
ABSTRACT
This paper presents a general modeling approach intended to facilitate the design of physical-virtual environments. Although the model is based on elements found in typical office environments, care has been taken to keep open the possibility of modeling more diverse settings with minimal ontological changes. The design approach finds inspiration in the technology-driven areas of Ubiquitous/Pervasive Computing (Weiser, 1991) and Graspable/Tangible User Interfaces (Fitzmaurice, Ishii & Buxton, 1995) as well as more empirical and theoretical research on
Object Location Modeling in Office Environments - First Steps
ABSTRACT
Computers, embedded in the "background" as well as in more obtrusive artefacts (e.g. PCs, PDAs, cellular phones), play an increasingly important role in human activity. However, there are still things that most people would prefer to do "off-screen" in the physical (real) world, such as having parties, reading long text documents, or spending vacations. I argue that there exists a class of activities that are neither physical nor virtual, but "physical-virtual" [2]. People frequently do parts of an activity in the physical world (e.g. proofreading a text document under construction) and parts in the virtual world (e.g. adjusting paragraphs within "the same" document in a word processing environment). This behaviour is likely to become more common. Hence, future environments should be designed with such physical-virtual activities in mind.
Physical-Virtual Instead of Physical or Virtual—Designing Artifacts for Future Knowledge Work Environments
ABSTRACT

Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication, 2013
Sterility restrictions in surgical settings make touch-less interaction an interesting solution for surgeons to interact directly with digital images. In this demo, we present a system for gesture-based interaction with medical images based on a wristband inertial sensor and capacitive floor sensors, allowing for hand and foot gesture input. Hand gesture commands have been designed for interacting with 3D and 2D medical images on two different displays, while foot gestures can enable, disable, and switch interaction between different systems. The gestures are recognized in real time with the help of a neural network classifier, which is trained on a given training set and extracts different features from the accelerometer and gyroscope data. For displaying the medical images, a simple image viewer is used for 2D images, while 3D images are presented with the open-source software InVesalius.
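As a hedged sketch of the kind of pipeline the demo describes (the concrete features, window length, and network layout are not given in the abstract and are assumptions here), one could extract simple statistics from windows of accelerometer and gyroscope samples and feed them to a small neural-network classifier, for example with scikit-learn:

import numpy as np
from sklearn.neural_network import MLPClassifier

def extract_features(window):
    # window: (n_samples, 6) array of accelerometer (x, y, z) and gyroscope (x, y, z)
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.abs(window).max(axis=0)])

# Hypothetical training data: one feature vector per recorded gesture window
rng = np.random.default_rng(0)
windows = rng.normal(size=(40, 50, 6))   # 40 gesture recordings, 50 samples each
X = np.array([extract_features(w) for w in windows])
y = rng.integers(0, 3, size=40)          # 3 gesture classes (e.g. rotate, zoom, switch)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))

A real deployment would train on labelled recordings of the actual hand and foot gestures and run the classifier on a sliding window of live sensor data.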