Nine questions about news automation

The other day we received some excellent questions from Peter Carson at Edinburgh Napier University. Carson is currently collecting empirical material for his dissertation, which focuses on the impact of artificial intelligence on journalism. We decided to publish our thoughts on this blog.

Could automated journalism lead to the deskilling of journalists and the loss of jobs?
Considering what these systems are capable of today, we are not too concerned about this. All new technologies come with challenges and may require new skillsets, whilst other skills may become redundant. Automation may put increased pressure on the sense-making skills of journalists, which might lead to a higher level of specialisation for reporters and the material they produce. At the same time, if the media industry is willing to invest in in-depth journalism, automation can free up resources for higher-quality stories. The role of journalists will certainly change, but the gap in sophistication between human journalists and machines is still so wide that journalists will be needed for quite some time. In the optimistic scenario, journalists will focus on the specific areas and topics that machines cannot handle, while text generators take on the more mundane routine tasks.

If AI (artificial intelligence) becomes a dominating force in journalism production, is there a potential that journalists will only be essential in an editorial capacity?
Algorithms do not currently operate in a vacuum; they need creators and managers to function properly, so these positions will still be required. The timeframe for full automation of creative work is probably longer than one would expect, and it will likely take years before we see newsrooms consisting only of editors. We suggest looking into Dörr (2015) and Van Dalen’s (2012) respective work for more thoughts on the possible changes to the professional role of journalists.

Are there dangers in having AI-algorithms curate news on social media feeds without some sort of overseeing regulatory body?
This depends on the capabilities of the machine learning involved. Algorithms are currently as biased as their creators: just as people make mistakes and poor judgements, automation can make the same errors. The example of Microsoft’s Tay shows that we need to think each step through carefully and learn from our mistakes as we go along. It is also important to keep in mind that social media algorithms learn from our behaviours, meaning that we influence their actions.

Do we need more agreement on the ethics of data collection, when mining data on unaware individuals, to use for advertising or newsgathering purposes?
Yes. We expect this to be a major topic of discussion within international bodies in the coming years. We think all media needs to be overseen by an ethics committee or a similar body, and computational journalism and algorithmic text generators are no different from traditional media outlets in this respect. However, we need to invest resources in this area, since the sheer volume of content that algorithms can produce makes them extremely difficult to monitor.

Are personalised news feeds curated by AI drawing people into “bubbles” of information that shield them from new or challenging views?
Whether this is actually happening, and whether it is even a new phenomenon, is currently under debate. Even if we sense that we are in a bubble, we can actively try to shape it by choosing to “teach” our social media algorithms that we also want to see other kinds of content.


With the advent of “fake news”, is AI the answer to upholding values of truth in journalism and preventing the spread of misinformation?
Fake news is nothing new; nonetheless, it is beneficial for our society that there is an ongoing debate on the information that shapes our opinions. Using algorithmic intelligence to locate and filter false information could definitely be beneficial. At the same time, this raises questions about how such a system would be governed, and who would have the right to make those decisions.

Do we need to establish rules about transparency and accountability when articles are written by algorithms?
Yes, ethical rules and guidelines are always beneficial. At the same time, it is not clear that a top-down approach is the best one, as the field evolves rapidly.

How do we prevent hidden biases in AI-generated news stories?
That is the million-dollar question, as we cannot even manage that in traditional human-created news. Are humans even able to be unbiased? There are also examples of algorithms helping us expose our inherent biases, as in the case of Google Image search predominantly returning images of male CEOs.

Do we need to ensure that new journalism students have a degree of code literacy?
Based on discussions and interviews with data scientists and data journalists, we think the best results are achieved when journalists collaborate with the people who develop the technological aspects of automation (programmers and software engineers), rather than journalists trying to be programmers or programmers trying to be journalists. What is essential, however, is that journalists adopt a computational way of thinking in their professional role, in order to better understand the possibilities and added value that algorithms and computation can bring to newsrooms. At the same time, literacy in the foundations of journalistic norms and values is equally relevant for the people who work on the technological side of the media industry.

Further reading on computational thinking:
http://www.cs.cmu.edu/~wing/publications/Wing08a.pdf

Peter Carson’s questions were discussed and answered by project lead Carl-Gustav Lindén, a journalism and media researcher; journalist and PhD student Stefanie Sirén-Heikel, who focuses on journalistic verification, newsroom innovations, and media management; and research assistant and journalist Laura Klingberg.
