Evaluation methods for creativity support environments

2013, CHI '13 Extended Abstracts on Human Factors in Computing Systems

https://doi.org/10.1145/2468356.2479670

Abstract

Creativity refers to the human processes that underpin sublime forms of expression and fuel innovation. Creativity support environments (CSEs) address diverse areas, including education, science, business, programming, design, art, performance, and everyday life. An environment may consist of a desktop application, or involve specialized hardware, networked topologies, and mobile devices. CSEs may address temporal-spatial aspects of collaborative work. This workshop will gather a community of researchers developing and evaluating creativity support environments. We will share approaches, engage in dialogue, and develop best practices. The outcome will not be a single prescription, but rather a landscape of routes, an ontology of methodologies with consideration of how they map to creative activities, and an emerging consensus on the range of expectations for rigorous evaluation to shape the field of CSE research. The workshop will organize an open repository of CSE evaluation methods and test data.

Key takeaways

  1. The workshop aims to establish an ontology of methodologies for evaluating creativity support environments (CSEs).
  2. CSEs encompass diverse domains and require context-sensitive evaluation methods for effective assessment.
  3. Researchers have developed a range of quantitative and qualitative evaluation methods for assessing creativity.
  4. The repository will include various CSE evaluation methodologies and test data to support future research.
  5. Evaluating CSEs involves understanding factors that both promote and hinder creativity during creative processes.
