Evaluation Methods for Creativity Support Environments
Important Dates
7 May Submissions due
14 May Notification of acceptance
6 June 9:00am-12:30pm Workshop

Workshop Aim:

Creativity refers to the human cognitive processes that underpin sublime forms of expression and fuel design and innovation. While a 2005 NSF workshop framed the development and evaluation of creativity support tools as a nascent field, we take the position that researchers are now developing sophisticated methods that have progressed well beyond infancy. We broaden the scope from 'tools' to 'environments', a superset.

Creativity support environments (CSEs) span and integrate diverse domains and types of systems, including software, hardware, instrumented spaces, networked topologies, and mobile devices. CSEs may involve temporal-spatial dimensions of collaborative work, requiring evaluation methods that address synchronous, asynchronous, co-located, and distributed interaction.

We seek to gather the community of researchers designing, developing, and evaluating CSEs, to share approaches, engage in dialogue, and develop best practices. We seek papers and presentations that develop in-depth methods for CSE evaluation. Methods must be explained with sufficient clarity and detail that others can apply them. They should be grounded by showing how they have been applied in the study of particular CSEs. Authors should motivate the types of CSEs for which a particular evaluation method is well-suited.

The workshop will coalesce the community involved in developing these methods and set the stage for discussion and debate about the value of particular methods in situated contexts. The expected outcome is not a single prescription, but a landscape of routes: an ontology of methodologies, consideration of how they map to creative activities, and an emerging consensus on the range of expectations for rigorous evaluation of CSE research.

Workshop Chairs
Andruid Kerne (andruid@ecologylab.net)
Celine Latulipe (clatulip@uncc.edu)
Andrew M. Webb (andrew@ecologylab.net)
Erin Carroll (e.carroll@uncc.edu)
Program Committee
  • Andruid Kerne, Texas A&M University
  • Celine Latulipe, University of North Carolina
  • Steven M. Smith, Texas A&M University
  • Mary Czerwinski, Microsoft Research
  • Mary Lou Maher, University of Maryland HCIL
  • Jill Fantauzzacoffin, Georgia Tech
  • David A. Shamma, Yahoo! Research
  • Brian Bailey, UIUC
  • Andrew M. Webb, Texas A&M University
  • Erin Carroll, University of North Carolina
Submission information:
Papers should be 2-4 pages long, formatted in DCC Format. The best papers will be invited for presentation in 10-minute slots, with 5 minutes of discussion each. Other papers may be invited for presentation as posters.

The submission site is open. Submit your paper here.

Workshop format:

9:00 - 9:45
Survey of the CSE Evaluation Landscape

9:45 - 10:30
3 Paper Presentations (10+5 each)

10:30 - 10:45
Coffee Break and Posters

10:45 - 11:30
3 Paper Presentations

11:30 - 11:50
Small Group Discussions

11:50 - 12:30
Full Workshop Discussion

Workshop attendees must register, either as an addition to the DCC'12 conference registration at a cost of $25, or, if not registered for the conference, at a cost of $50. Please go to the main DCC'12 conference page and then to Registration to register.
