{
  "pk": 49830,
  "title": "Measuring and predicting variation in the difficulty of questions about data visualizations",
  "subtitle": null,
  "abstract": "Understanding what is communicated by data visualizations is a critical component of scientific literacy in the modern era. However, it remains unclear why some tasks involving data visualizations are more difficult than others. Here we administered a composite test composed of five widely used tests of data visualization literacy to a large sample of U.S. adults (N=503 participants). We found that items in the composite test spanned the full range of possible difficulty levels, and that our estimates of item-level difficulty were highly reliable. However, the type of data visualization shown and the type of task involved only explained a modest amount of variation in performance across items, relative to the reliability of the estimates we obtained. These results highlight the need for finer-grained ways of characterizing these items that predict the reliable variation in difficulty measured in this study, and that generalize to other tests of data visualization understanding.",
  "language": "eng",
  "license": {
    "name": "",
    "short_name": "",
    "text": null,
    "url": ""
  },
  "keywords": [
    {
      "word": "Education; Psychology; Reasoning; Spatial cognition; Quantitative Behavior"
    }
  ],
  "section": "Papers with Poster Presentation",
  "is_remote": true,
  "remote_url": "https://escholarship.org/uc/item/4s10354p",
  "frozenauthors": [
    {
      "first_name": "Arnav",
      "middle_name": "",
      "last_name": "Verma",
      "name_suffix": "",
      "institution": "Stanford University",
      "department": ""
    },
    {
      "first_name": "Judith",
      "middle_name": "E.",
      "last_name": "Fan",
      "name_suffix": "",
      "institution": "Stanford University",
      "department": ""
    }
  ],
  "date_submitted": null,
  "date_accepted": null,
  "date_published": "2025-01-01T18:00:00Z",
  "render_galley": null,
  "galleys": [
    {
      "label": "PDF",
      "type": "pdf",
      "path": "https://journalpub.escholarship.org/cognitivesciencesociety/article/49830/galley/37792/download/"
    }
  ]
}