This article is authored by Mark Stencel, a mentor for the Ankara module of the Factory programme.
Ankara reminds me a lot of Washington, D.C. The city's core is newish, historically speaking, and largely defined by its huge governmental workforce. It also has a thriving community of technological innovators, 30 of whom sacrificed a recent weekend for a two-day workshop to help the Ankara-based fact-checker Teyit think of new ways to intercept online misinformation.
The participants were an impressive group with a mix of experience and expertise -- in digital media and journalism, art and design, data science, programming, social science, psychology and business. They assembled early on a Saturday at Kıvılcım (or "Spark"), a flexible workspace provided by the Technology Development Foundation of Turkey as a "hub for industrial creativity." The room on the fifth floor of one of the high-rise Cyberpark buildings at Bilkent University was furnished with rolling tables and chairs, whiteboards and comfortable beanbag seats -- all the set-pieces needed to evoke the atmosphere of a digital startup.
The purpose of this gathering was an exercise the organizers called "Factory" -- a Google-inspired "design sprint" to develop and quickly prototype ideas for solving some of the most vexing challenges that fact-checkers like Teyit struggle with.
An experienced team of local facilitators and experts was on hand to guide and review the work. As an American journalism instructor who studies fact-checking around the world, my role was to be a "mentor." I was there to describe solutions that other fact-checkers and similar journalism-related projects have tried for related problems. I also shared some of the specific solutions we are working on at Duke University's Reporters' Lab through our Tech & Check Cooperative.
The participants broke into five groups to consider five key challenges:
* The speed at which misinformation spreads online to large numbers of people.
* The limited number of internet users that any verification platform can actually reach.
* The low levels of media literacy among people who use the internet, as well as the limited understanding of how to recognize potentially false information.
* The slow pace of verification processes and the limited access to resources that can help verify or debunk information that spreads online.
* The challenge of finding revenue models and other financial resources that can support the production of quality content.
The workshop participants selected these challenges from a longer list, and the issues are clearly interrelated. Fact-checkers need to crank up their verification processes to respond quickly to misinformation that spreads rapidly. They also must help the public identify false information on its own -- which might mean the public could ultimately help fact-checkers identify falsehoods and disseminate accurate information. But taking on any of this important work would require new sources of financial support.
Much as the issues were interrelated, so were the solutions that each of the five groups devised. Several involved engaging the public directly -- through "gamification" or the use of other kinds of social media or interactive interfaces. The ideas included tools that might help ordinary internet users identify reliable sources of information and call out potential inaccuracies, or even identify the people who spread them. These interfaces could also both tap into databases of previous fact-checks and alert fact-checkers to inaccurate content that requires their scrutiny.
As for paying for these kinds of innovations, the group that looked at financial opportunities considered ways to get other media companies to help underwrite some of these efforts, but realized that might be tricky given the state of their own finances. So instead the group focused on potential subscription models to monitor misinformation that might be highly relevant to certain business sectors, as well as options for using fact-checkers' reporting and research abilities to help non-governmental organizations collect and authenticate reliable data that could be used in public debate -- about health, economics, and other topics.
As the groups developed their ideas, two talented translators helped me share related work I knew of. For instance, I encouraged the revenue group to look at the work of Brazilian fact-checker Aos Fatos, which produces custom reports and data sets for non-partisan companies and civil society groups that pay for the work. (In some cases those reports are for the client's internal use. In others, the work might be published by the fact-checker, but with notes to the public making clear how the work was commissioned and paid for.)
For the groups that were focused on mobilizing the public, I shared some of the cautionary guidance that Mevan Babakar of the U.K.'s Full Fact offered in a 2018 essay. I also showed an example of how a local TV station in Texas produces a series of "road trips" in which viewers who have questions about issues in the news investigate the matter side-by-side with one of the station's reporters. In these reports, the viewer may not always come to the same conclusion as the journalist, but even then the process reveals to a broader audience why the issue is controversial and the subject of debate and disagreement.
For the group that focused on "media literacy," I suggested a 2018 report by the American Press Institute on what it calls "media fluency." The report suggested examples of questions that could potentially help news readers, viewers and listeners identify information gaps in the content they regularly consume.
By the afternoon of Sunday, March 10, each group had devised a "rapid prototype" to explain their proposed solutions to the five problems. Some were in the form of storyboards that illustrated how their process might work. Others involved functional demos of working code. All of them were impressive. Having attended many such creative workshops over the years, I know there almost always are well-intentioned efforts that are unlikely to move forward in the real world. But in this case, there were excellent ideas from each group -- as well as enough overlap to suggest how some of the ideas might mix and merge.
Developing these prototypes and presenting them to a big group in such an early, unpolished form was a new experience for some of the participants. So on the morning of the second day, I dug out some old slides from my previous work at National Public Radio to show how we'd made these kinds of exercises an important part of our product development process -- quickly turning ideas into experiments that we could try out and evolve with our audience.
Image: Bill Adair, Duke Reporters’ Lab
More directly to the point of misinformation, I shared with the whole group a picture that came from the early days of a well-known U.S. fact-checking project. In fact, it was a large piece of poster board with a label that said "The Fact-Checking Project" across the top. This not-very-beautiful flow chart was the work of an editor in the Washington, D.C., office of a Florida newspaper -- the Tampa Bay Times (then known as the St. Petersburg Times). The chart included short chunks of text and images pasted onto the cardboard, with hand-drawn arrows showing how one piece of information or content would connect with the next.
When I showed this image at the Ankara workshop, I pointed out to the participants how remarkable it was that this crude illustration was enough to convince the newspaper's top editors to embark on a very bold national news project in time for the 2008 U.S. presidential election campaign, which was just getting started. But that's exactly what happened. The Times' bureau chief, Bill Adair (now my colleague at Duke), got the go-ahead to build the website that would become PolitiFact, now one of the leading U.S. fact-checkers. And then, just two years after Adair made that storyboard for his bosses, PolitiFact won the Pulitzer Prize.
I assured the participants at Teyit's Ankara workshop that if a very rough, handmade poster could point the way for a regional newspaper to win one of journalism's leading awards in just two years, then the work they were doing that weekend could have just as much impact on the fight against misinformation.
After seeing the workshop's results, I was sure I was right. And I am just as sure now that the groups gathering in Istanbul in May for a second round of prototyping can generate equally ambitious ideas and help solve challenging issues for fact-checkers -- in Turkey and beyond.