AI gets cool as bots handle little details and big data
Thursday, October 10, 2019 5:58 am
The theory and practice of artificial intelligence came under the microscope when a panel at WAN-Ifra's Berlin Publishing Days discussed the 'coolness' of AI and how it had been thought to be the answer to most human problems.
IBM Watson marketing manager Stefan Pfeiffer said that while the question-answering system could write a complete issue of the Washington Post, "it is not good at commenting on the news or doing creative stuff, thank God, as that's our role."
RADAR editor-in-chief Gary Rogers agrees. His news agency uses AI to process data and produce content, supplied by subscription to 400 news outlets across the UK.
After a launch in September 2017 and a full UK trial last June, an editorial team of six now "bring together human journalism and machines" to produce two strong 300-400-word, ready-to-print data stories a day for each publication, he said. The process starts with open data, "lots" of which is available in the UK. "We are so lucky, with daily releases, so we can track upcoming subjects with wide appeal, such as childhood obesity, traffic issues and sport.
"We want plenty of granularity, so we can adapt all stories for each of our subscribing locations. A reporter looks at the dataset, uses NLG tools to write in algorithms for the 'how, when, where' of the story, and then gets comments and research for the 'who and why'.
"Multiple checking at all stages of the content creation is vital because we say the stories are print-ready on receipt."
Rogers says distribution is a major issue, with local geographies frequently matching those of each title. Subscribers might take just one story or multiple versions for all the titles in a large group.
And instead of just creating content for digital publication, much is now going into print with a newspaper's own journalists adding comment and local colour. Stories are bylined by the reporter who wrote the template, and the methodology can work in any language given the availability of open data.
Swedish group Mittmedia's head of content development Li L'Estrade says robots play a vital role in delivering "super local" content. Mittmedia is one of Sweden's largest groups, with 28 local newspapers, 21 websites and apps, and 90 per cent of its content behind paywalls.
Delivering super local content has helped the publisher - which has 500 journalists - halt churn. "This is where our robots are so vital - we simply could not do what is needed with only human journalists.
"They want super local sport - there are 480 teams in 59 series playing each week and we cover them all with Rosalinda, our sports robot launched in 2016 with Swedish tech company United Robots."
In 2017 the publisher launched a real-estate robot that enables it to cover every house sale in its areas: who sold the house, who bought it, and how much they paid.
"More than 400,000 homes are sold each year and our readers want to know all about it, with an image, as we used to do when our group was launched more than 100 years ago," she says.
"Last year, we added a bot for business, mostly bankruptcies and finance, and use data analysis for the basis. Distribution also needs to be personalised to the person who reads, rather than to a household.
"This year we have carried around 25,000 stories written by bots on our websites, delivering 4.4 million page views behind our paywalls. We know readers are fine with our bot stories as we ask them, and know that they just want to know the info, not who wrote it."
Along the way, Mittmedia has won 618 new subscribers, which it sees as a bonus.
In answer to a question, she confirmed that while there had been some scepticism in the newsroom, "we have been using bots for some years now and this leaves them free to write the major, important stories."
Pictured: Mittmedia head of content development Li L'Estrade