The Impact of Task Design on Citizen Science Results
Sunday, 15 February 2015
Exhibit Hall (San Jose Convention Center)
Over the last 5 to 10 years, citizen science has grown to include projects spanning a range of scientific issues and disciplines. These projects can be defined not only by the discipline they involve but also by the different psychophysical tasks and judgements they require of the user. These two methods of differentiation can often be at odds with each other; for instance, it could be argued that classifying a galaxy type against a known catalogue is very similar to identifying a hieroglyph against the known alphabet, even though the disciplines of astronomy and ancient Egyptian history are clearly different.

A hierarchical task analysis of twelve Zooniverse citizen science projects was carried out, comparing the types of task, the user judgements, the task complexity, and the user freedom involved in each. These factors were found to vary considerably across the projects, independent of the discipline involved. Although part of this variance can be explained by the needs of the specific science case and the associated data requirements, it is clear that the citizen science community does not yet have a ‘best practice’ framework for task workflow design. Furthermore, comparing these findings with website visitor analytics for each project revealed relationships between task workflow design factors and user behaviour measures such as the number of return visits, the time spent classifying, and the total person-hours spent on the site.

To investigate these relationships further, a study was performed using the Zooniverse's Planet Four project to examine the effect of task workflow design on both the user and the scientific results. Results show that participants found the more autonomous interface, which offered greater task variety, faster to learn (p < .05) and easier to use (p < .05) than the other interfaces, while more rigid and less complex interfaces resulted in more time being spent on each analysis and more results being collected (p < .01). Beyond the initial motivations for visiting a citizen science platform, keeping citizen scientists engaged over time requires that platform developers and science teams pay attention to the site's task workflow design, specifically the types of tasks, the judgements required, and the degree of user autonomy, and to how these affect both the user experience and the scientific results collected.