Supporting Creative Work with Crowd Feedback Systems

04/20/2020
by Jonas Oppenlaender, et al.
University of Oulu

Crowd feedback systems have the potential to support creative workers with feedback from the crowd. In this position paper for the Workshop on Designing Crowd-powered Creativity Support Systems (DC2S2) at CHI '19, we present three creativity support tools in which we explore how creative workers can be assisted with crowdsourced formative and summative feedback. For each of the three crowd feedback systems, we provide one idea for future research.

1. Introduction

Supporting human creativity has been considered one of the grand challenges of Human-Computer Interaction (HCI) (Shneiderman2009). Crowdsourcing (howe2006) is a powerful approach for tapping into the collective insights of a diverse crowd of people. The combination of crowdsourcing and creativity support is a promising area of research (p22-kittur.pdf; Shneiderman2009) and builds on a long line of work on augmenting human creativity and intellect (Doug_Engelbart-AugmentingHumanIntellect.pdf; GuilfordPresidentialAddress). More recently, research on supporting creativity has sparked growing interest in the fields of Human-Computer Interaction (2019_chi-paper.pdf; dc2s22019) and design (cc19workshop).

[Figure 1: ArticleBot user interface for faceted filtering of ideas.]
[Figure 2: CrowdUI user interface to provide justifications for manipulations of the website layout.]
[Figure 3: High-level architecture of the situated feedback system.]

Crowd feedback systems (luther-crowdcrit-cscw2015.pdf; p1433-xu.pdf) are computer-mediated systems that enable creative individuals to collect feedback from a large number of people online. These systems provide an opportunity for asynchronously sourcing feedback, decision support, and critique from a crowd with diverse backgrounds.

Researchers have investigated ways of increasing the perceived value of crowdsourced feedback (crowdInnovationCourse-hcomp2015.pdf; a063-krishna-kumaran.pdf), for instance by using rubrics to structure the feedback (Posts_paper_3.pdf; luther-crowdcrit-cscw2015.pdf) and interactive guidance to scaffold the feedback process (paper55.pdf; mg_critiki_CaC2015_CameraReady.pdf). With such techniques, distributed critique crowdsourced from novices may be of comparable quality to expert critique (luther-crowdcrit-cscw2015.pdf; Posts_paper_3.pdf).
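
To make the idea of rubric-structured feedback concrete, here is a minimal sketch in Python, assuming a simple dimension/prompt schema of our own devising, not the actual instruments used in the cited systems:

```python
# A minimal sketch of rubric-structured feedback collection. The rubric
# dimensions and prompts are illustrative assumptions, not the schemas
# used in the cited systems.
RUBRIC = [
    {"dimension": "composition", "prompt": "How balanced is the layout?"},
    {"dimension": "readability", "prompt": "How easy is the text to scan?"},
]

def structured_feedback(answers):
    """Pair each 1-5 rating with a free-text justification so critiques
    stay tied to one rubric dimension rather than being generic praise."""
    return [
        {"dimension": item["dimension"], "rating": rating, "comment": comment}
        for item, (rating, comment) in zip(RUBRIC, answers)
    ]

feedback = structured_feedback([
    (4, "Strong grid, but the footer feels heavy."),
    (2, "Body text is too dense to scan on mobile."),
])
```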

In the past year, we explored the design space and limitations of crowd feedback systems with three systems that support different types of creative work: writing, web design, and art critique. In the following sections, we give a brief overview of each system and one idea for future research building on it.

2. ArticleBot: Supporting Exploratory Writing

We designed and tested an intervention to support writers in finding and exploring different ideas and topics (ArticleBotINTERACT19). The lightweight system was adapted from Hosio et al. (Hosio:2016:LWC:3056355.3056393). It provides decision support in the form of short textual ideas that can be sorted according to different criteria (see Figure 1). The ideas, the criteria, and the ratings of each idea across all criteria are all provided by the crowd.
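
As an illustration of the faceted sorting this enables, the following sketch ranks ideas by a weighted sum of their mean crowd ratings. The data model and criterion names are our assumptions, not ArticleBot's actual implementation:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Idea:
    text: str
    # Crowd ratings per criterion, e.g. {"novelty": [4, 5, 3]}.
    ratings: dict = field(default_factory=dict)

    def score(self, criterion):
        """Mean crowd rating on one criterion (0 if nobody rated it yet)."""
        values = self.ratings.get(criterion, [])
        return mean(values) if values else 0.0

def rank_ideas(ideas, weights):
    """Sort ideas by a weighted sum of mean ratings over the criteria
    the writer selected -- the essence of faceted sorting."""
    return sorted(
        ideas,
        key=lambda idea: sum(w * idea.score(c) for c, w in weights.items()),
        reverse=True,
    )

ideas = [
    Idea("Interview local artists", {"novelty": [5, 4], "relevance": [3]}),
    Idea("Summarise prior surveys", {"novelty": [2], "relevance": [5, 4]}),
]
for idea in rank_ideas(ideas, {"novelty": 0.7, "relevance": 0.3}):
    print(idea.text)
```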

Currently, the system collects all data up-front, either serendipitously from visitors of a website or in a controlled fashion from crowd workers. While this database of ideas, criteria, and ratings holds value, it is expensive to create in terms of time and money, and it is scoped to a single topic. Future work could support writers with data collected from the crowd in real time: instead of waiting for data to accrue organically from website visitors, the system could actively crowdsource information from online labour marketplaces such as Amazon Mechanical Turk.
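
A minimal sketch of what such real-time elicitation might look like, using boto3's MTurk client against the requester sandbox. The task form, reward, and assignment counts are illustrative assumptions on our part, not part of ArticleBot:

```python
import boto3

# Connect to the MTurk requester sandbox (no real payments are made there).
mturk = boto3.client(
    "mturk",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

QUESTION_XML = """\
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <p>Rate this writing idea on a 1-5 scale: {idea}</p>
    <!-- A production form would post the rating back via MTurk's
         externalSubmit endpoint; omitted here for brevity. -->
  ]]></HTMLContent>
  <FrameHeight>300</FrameHeight>
</HTMLQuestion>"""

def post_rating_task(idea):
    """Create one HIT asking three workers to rate an idea in near real time."""
    response = mturk.create_hit(
        Title="Rate a short writing idea",
        Description="Rate how promising this idea is for an article.",
        Reward="0.05",  # illustrative reward, not a recommendation
        MaxAssignments=3,
        LifetimeInSeconds=3600,
        AssignmentDurationInSeconds=300,
        Question=QUESTION_XML.format(idea=idea),
    )
    return response["HIT"]["HITId"]
```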

3. CrowdUI: Supporting Web Design

CrowdUI (CrowdUI) is a system that allows the community of a website to provide visual feedback, using the website itself as a canvas. Website visitors can directly manipulate (move, delete, resize) the elements of the webpage. CrowdUI’s multi-step process includes a tutorial to familiarise the user with the affordances of the system and a peer review of other users’ creations. The system allows designers to inspect each user's individual adaptations of the user interface. It also provides further decision support by visually aggregating user modifications of the website’s user interface into heatmaps. The system was evaluated in a remotely administered user study with 48 users, with promising results.
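
To illustrate how such heatmaps might be computed, here is a small sketch that rasterises the bounding boxes of modified elements onto a coarse grid. The representation is our assumption, not CrowdUI's actual aggregation code:

```python
import numpy as np

def modification_heatmap(boxes, page_w, page_h, cell=20):
    """Rasterise the bounding boxes of user-modified elements onto a
    coarse grid; brighter cells mean more users changed that region.
    `boxes` holds (x, y, width, height) tuples, one per modification."""
    grid = np.zeros((page_h // cell + 1, page_w // cell + 1))
    for x, y, w, h in boxes:
        grid[y // cell:(y + h) // cell + 1,
             x // cell:(x + w) // cell + 1] += 1
    return grid

# Two users moved the hero image; one user resized the navigation bar.
heatmap = modification_heatmap(
    [(100, 50, 300, 200), (120, 60, 300, 200), (0, 0, 800, 40)],
    page_w=800, page_h=600,
)
```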

[Figure 4: Prototype of the situated feedback elicitation system. Artwork credit: Stéphan Valentin.]
[Figure 5: Example of a type of feedback in the situated feedback system. Artwork credit: Aman Ravi.]

Web designers and developers face many decisions during the website design process. One idea for future work involves crowdsourcing preference judgements in a web-based game with a purpose (GWAP) to help the developer make informed decisions and explore the space of possible design alternatives. The game would automatically tweak variables in the design space (e.g., changing the font style of a headline to bold). It would then elicit pairwise comparisons of the design alternatives from its players, resulting in a ranking of individual attributes. Similar web-based games, for instance Can’t Unsee (https://cantunsee.space), have been found to be highly engaging.
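
One standard way to turn such pairwise preference judgements into a ranking is an Elo-style update (a Bradley-Terry model would work similarly). The sketch below is our illustration, not part of any system described here:

```python
def elo_update(ratings, winner, loser, k=32):
    """One Elo-style update after a player prefers `winner` over `loser`.
    Over many pairwise judgements the ratings converge towards a ranking
    of the design alternatives."""
    expected = 1 / (1 + 10 ** ((ratings[loser] - ratings[winner]) / 400))
    ratings[winner] += k * (1 - expected)
    ratings[loser] -= k * (1 - expected)

# Hypothetical headline font-style alternatives, all seeded at 1000.
ratings = {"bold": 1000.0, "regular": 1000.0, "italic": 1000.0}
judgements = [("bold", "regular"), ("bold", "italic"), ("regular", "italic")]
for winner, loser in judgements:
    elo_update(ratings, winner, loser)
print(sorted(ratings, key=ratings.get, reverse=True))  # best-ranked first
```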

4. SIMPLEX: Eliciting Art Critique

In our third system (Simplex), we explored situated crowdsourcing and public displays for supporting artists with summative feedback. The system enables the situated crowd to provide feedback to artists via an installation that consists of two public displays (see Figures 3 and 4). A digital artwork is displayed on the main screen of the installation. Feedback is given on the smaller touch screen positioned in front of the main screen. Once feedback is given, a new artwork is displayed on the main screen. In a needfinding study with two artists and a user study with 12 participants, we evaluated eight different types of feedback (see the example in Figure 5).
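
For illustration, the installation's control flow can be sketched as a loop that blocks on feedback before advancing to the next artwork. All names here are our assumptions, not SIMPLEX's actual implementation:

```python
import itertools
import queue

# Playlist of artworks and a queue fed by the touch-screen client.
artworks = itertools.cycle(["artwork_01.png", "artwork_02.png"])
feedback_queue = queue.Queue()

def run_installation(show, store):
    """Show one artwork on the main display, block until a visitor submits
    feedback on the touch screen, persist it, then advance the playlist."""
    for artwork in artworks:
        show(artwork)                    # render on the main display
        feedback = feedback_queue.get()  # blocks until feedback arrives
        store(artwork, feedback)         # e.g. append to a database
```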

Future work will evaluate the ecological feasibility of the SIMPLEX crowd feedback system in a longitudinal field study. The system will be evaluated using a mixed-methods approach, combining quantitative analysis of interaction data with qualitative insights from semi-structured interviews. Further evaluation could also apply the system to a specific use case, e.g., evaluating architectural designs from different perspectives in a design-oriented university course.

5. Conclusion

In this position paper, we presented three crowd feedback systems that support creative work. Our exploration of different modes and types of feedback highlighted that there are many opportunities, beyond mere text-based critique, to support creative work with crowd feedback.

Our ongoing work, a structured literature review, will guide the development of a taxonomy of activities that creative individuals engage in when they evaluate feedback from the crowd. This work will culminate in a conceptual framework highlighting critical processes that affect the sensemaking of crowdsourced feedback. We envision that this framework will create a basis for discussing challenges and issues in the design of interactive feedback elicitation systems.

References