Armed and Ready (for Quality Science Journalism)
Exploring some taxpayer-funded tools and toolkits
Hi there! This is TITLE-ABS-KEY(“science journalism”), your favorite nerd-out newsletter. In the previous issue, I (a science journalist) read a preprint on science journalism co-authored by a science journalist! This may have inadvertently created an Incursion event in the multiverse — because I could not find the next paper to read and comment on even after I postponed this issue (so that the Russian and English sections of my newsletter do not go live during the same week, temporarily breaking my brain every other Sunday).
That is why today we will do something different: discuss a whole toolkit for quality science journalism, one the European Union paid for! (For the record, it did not pay me to discuss this toolkit, and I was not involved in the project in any capacity.)

Today’s non-paper: Results in Brief and some more in-depth materials from the EU QUality and Effectiveness in Science and Technology communication (QUEST) project. Link: Read all about it: a toolkit for quality science journalism
Why this non-paper: after the accidental Incursion event, it was either this relatively recent update from QUEST or going back to older papers (did not want to skip an issue).
Abstract, sort of: A series of toolkits for those working in science communication has been created by the EU-funded QUEST project, with advice for academics and journalists alike.
Naturally, I am rather nonplussed (always wanted to use this word!) by this apparent interchangeability of science communication and science journalism in the headline and lede of this message. There’s nothing quite like spending a semester arguing to multiple classes of students that these are not the same thing, and then seeing them treated as one in a million-euro project. I sure hope that’s on whoever wrote that EU page and not on the project team.
But okay, I suppose I can look past this glaring faux pas and keep reading. I love toolkits, so I am actually quite excited to see which cool and shiny tools science journalists now get to use thanks to some investment in infrastructure. And the next part of this update kind of makes it clear that there is a difference? Anyway, I’ll take it.
Digital media has opened up a multidirectional information flow in science communication. Citizens have more access to science, from increasingly diverse sources. This connectivity could elicit higher levels of engagement between science and society. But it also poses risks in terms of the quality of the information being shared. Against this backdrop, the EU-funded QUEST (QUality and Effectiveness in Science and Technology communication) project investigated quality across the whole science communication ecosystem – from scientists and (R&I) stakeholders, through traditional journalism, social media and in museums, to engagement with policymakers and citizens. QUEST focused on three areas: climate change, vaccines and artificial intelligence. The ultimate goal was to offer citizens more effective and reliable communication on scientific topics that generally have a significant impact on their daily lives.
Gotta say the EU loves its acronyms! Of the three areas QUEST focused on, I have recently gone on the record calling two (climate change and AI) “the main stories” for science journalists in the foreseeable future. My rationale was quite close to what I quoted above: as more and more people in our audiences encounter these (exceedingly complex) topics in their daily lives, they need our reporting to help them make informed decisions.
So, what are the toolkits I was promised?
Explainers and suggestions for journalists: on scientific concepts, statistical terms, scientific findings, statistics for journalists, and understanding data visualization;
Guidelines for quality science journalism, plus a toolkit (a slide deck) built on them;
Something (not immediately clear from the text description) called JECT.ai.
I believe the appropriate scientific term for this is a bonanza of toolkits. Let’s dig in!
I want to start with the more general guidelines and a toolkit, in part because recently I’ve been moderately obsessed with checking whether experts are themselves practicing what they preach — that is, whether all those explainers also follow the presented guidelines and recommendations.
This resource provides a set of guidelines for journalists, editors and media professionals. It was developed on the basis of indicators and metrics identified in consultation with science journalists, science-focussed media, and members of the public. It serves as a checklist covering all the different aspects of quality science journalism. The guidelines are presented in the form of a series of questions intended to stimulate reflection and to guide you through all stages of content development.
The self-check questions are clustered around three issues: trustworthiness and scientific rigour, presentation & style, and connection to society. (No idea why the sudden urge for an ampersand, and in the section on presentation and style, no less.) Overall, these are reasonable, and there are two mentions of deadline pressure in the questions on trustworthiness and rigour, so the authors understand the practical constraints (rather than just assuming credulity or maliciousness).
My only potential gripe with these guidelines is that they seem to be written in the established scicomm/science journalism lingo, meaning the journalists who may need them the most may also find them rather esoteric. I am looking at the first question in the guidelines, “How clear are you that established principles of scientific research properly underpin the content of your story?”, and thinking back to my 21-year-old self in my first journalism job, trying to decipher what these principles are and how to tell whether they Properly Underpin the content of my story. Answering that question, I would have been moderately clear at best, and not because my stories were bad. Not all the questions are like that, but some do use phrasing clearly aimed at mid-career (and probably not ESL) journalists.
The toolkit, which is a slide deck, does much better on plain language where it doesn’t just quote the guidelines — but it does mix up science communication and science journalism after all. And I was slightly puzzled by the intro slide asking me “Why is quality science news reporting so important?” in the headline and immediately answering: “European Union’s Horizon 2020 QUEST project devised tools and guidelines to boost the effectiveness of science communication – to encourage people in Europe to become more scientifically literate.”
Um, sorry, but if this is a self-explanatory presentation, then you shouldn’t use the most misplaced of all press-release tactics, where you lead not with what’s important for your audience (the why) but with what’s important for you (the European Union’s Horizon 2020 QUEST project). Is quality science news reporting important because there was a Horizon 2020 project?
The slide deck also raises the curtain on the mysterious JECT.ai, a tool “to support research into science-based stories.” Wow, do I need some support for my research into science-based stories! (No sarcasm whatsoever, I really do.) So I’m going to the website and signing up.

The first thing I have to tell you is that I will be coming back to this tool outside the newsletter format: like many of these “smarter search” products, it’s clearly a bit clunky and clumsy — but I gave it somewhat of a trick question, and the response I had been looking for was literally the first item on its results page. So it knows its… stuff. If your search query is right, at least. But I also need to interrogate its “AI-powered” claim further.
JECT.ai looks at its database of media stories and research papers and pulls out metadata: spokespeople, angles (which look more like keywords, but I’ll take it), and topics (which do not seem as useful). See, for example, my query and the output (links to the specific items it has found are below these summaries):

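For intuition about what this kind of angle-and-spokesperson mining can look like, here is a deliberately naive Python sketch. It is purely my own illustration: JECT.ai’s actual corpus, schema, and methods are not documented in these materials, so every name and heuristic below is a stand-in.

```python
import re
from collections import Counter

# Toy corpus standing in for a database of media stories.
# (Not JECT.ai's real data or code; just my illustration.)
STORIES = [
    "Dr. Maria Neira of the WHO warned that climate change is already "
    "straining health systems, citing heatwave mortality data.",
    "Economist Nicholas Stern argued that spending on climate change "
    "adaptation pays for itself within a decade.",
    "A new preprint on vaccine hesitancy, discussed by Heidi Larson, "
    "links misinformation exposure to lower vaccine uptake.",
]

STOPWORDS = {"the", "a", "an", "of", "that", "is", "on", "to", "by",
             "for", "and", "its", "in", "with", "new", "already"}

def spokespeople(text: str) -> list[str]:
    # Naive heuristic: two or more capitalized words in a row look like
    # a name. A real tool would use named-entity recognition instead.
    return re.findall(r"(?:[A-Z][a-z]+ )+[A-Z][a-z]+", text)

def angles(texts: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    # "Angles" here are just the most frequent non-stopword terms,
    # which is why they end up looking a lot like keywords.
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 3)
    return counts.most_common(top_n)

for story in STORIES:
    print("spokespeople:", spokespeople(story))
print("angles:", angles(STORIES))
```

Even this toy version makes the limitation visible: everything it can suggest is, by construction, already somewhere in the corpus.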
While this looks like a useful bird’s-eye service, akin to MediaCloud, my concern is with what it promises to do: how new exactly will those new angles, voices, and content be if you’re pulling them from what has already been written? I mean, sure, it can be a good start, especially with multi-language search, but one has to be careful: going outside one’s reporting bubble may just land you in a larger and more diverse reporting bubble.
(I was also rather confused by the pricing options for JECT.ai, which offered me subscriptions but had no information on the terms of service, on what the free version I was apparently using included, etc. But that’s about the packaging, not the content and functionality of the product itself.)
And finally, we have the explainers. Well, coming back to my point above on practicing what you preach — these explainers actually look much friendlier to beginner reporters than the guidelines do. They are heavy on dealing with numbers and numeracy, which is not a bad thing for journalists and, ultimately, for audiences. And they can be useful to journalism instructors such as myself (I saved a couple for future classes).
Okay, do these tools help science journalists? Yes, I think so. They could be more accessible to journalists on other beats, who increasingly face science-driven stories spilling over into their domains. But overall, I’d be content if I were an EU taxpayer (I actually am one now, but that’s a very recent development, so no money from Olga was spent on this project).
That’s it! If you enjoyed this issue, let me know. If you also have opinions or would like to suggest a paper for me to read in one of the next issues, you can leave a comment or just respond to the email.
Cheers! 👩‍🔬