Reflections after Four Weeks at Full Fact.

When I started at Full Fact, I wanted to build a program to carry out their first ever automated factcheck.

Four weeks later, that’s what I finished with: so far, the automated factchecker can check ‘Employment is rising’. It reads the word ‘employment’, goes to the Office for National Statistics’ labour force data, and runs a couple of simple tests to get an idea of whether the number of people in work really is rising.

Beyond that one example, the code I wrote shows real promise. It understands what kind of data needs to be looked up when presented with an example sentence.
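The two steps described above can be sketched roughly as follows. This is a minimal illustration, not Full Fact’s actual code: the keyword table, function names and sample figures are all made up for the example.

```python
# Step 1: work out which statistic a claim refers to.
# Step 2: run a couple of simple tests on that data.
# All names and figures below are illustrative, not Full Fact's code.

# Hypothetical mapping from claim keywords to a statistics series.
KEYWORD_TO_SERIES = {
    "employment": "ONS labour force: people in work",
    "unemployment": "ONS labour force: people out of work",
}


def find_series(claim):
    """Return the dataset a claim refers to, using simple word matching."""
    words = claim.lower().split()
    for keyword, series in KEYWORD_TO_SERIES.items():
        if keyword in words:
            return series
    return None


def is_rising(values):
    """Two simple tests: is the latest figure above the previous one,
    and above the average of the whole series?"""
    latest, previous = values[-1], values[-2]
    return latest > previous and latest > sum(values) / len(values)


# Illustrative quarterly figures (millions in work), not real ONS data.
sample = [31.1, 31.3, 31.4, 31.6]

claim = "Employment is rising"
print(find_series(claim))   # which dataset to consult
print(is_rising(sample))    # do the simple tests agree the claim holds?
```

Matching on whole words rather than substrings matters even in a toy version: otherwise a claim about ‘unemployment’ would wrongly match the ‘employment’ keyword.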

Weeks 2-3 at Full Fact.

The automated factchecking project is split into two parts: scanning text and checking its validity. When I started a few weeks ago, I intended to spend equal amounts of time on each. However, after a first week spent with some very rudimentary programming tools, it became clear that it would be much more worthwhile to explore new avenues and come up with new ideas, so as to produce work that will be useful in the long term. And the long term is important, because I am kicking off something that will be built on in the coming years.

Week 1 at Full Fact.

My internship is at Full Fact, the UK’s independent factchecking organisation. Ahead of this year’s referendum, they worked with ITV and Sky News to correct factual errors made in live debates, and they have asked for and secured corrections in all the national newspapers. They play an ever-growing role in the effort to hold the media and politicians accountable for their claims.

Many assertions made in public debate come up again and again; at Full Fact they call them “zombie claims”, because they just don’t die. Claims like ‘poverty increased in the past six months’ or ‘unemployment decreased last year’. Factcheckers spend valuable time finding and interpreting government data on poverty or unemployment every time new datasets are released.