Would you care if a story you read in a newspaper or online was "written" by a machine rather than a stressed-out hack? Would you even be able to tell the difference? Welcome to the world of "robo journalism" - and it's coming faster than you think.
Squirrelled away at the Press Association's (PA) headquarters in London is a small team of journalists and software engineers.
They're working on a computer system that can do the work of multiple human beings, picking out interesting local data trends - everything from crime statistics to how many babies are being born out of wedlock.
As part of a trial, the PA has begun emailing selected machine-generated stories, no more than a few paragraphs in length, to local newspapers that might want to use them.
"We've just been emailing them samples of stories we've produced and they've been using a reasonable number of them," says Peter Clifton, editor-in-chief.
Sometimes human journalists will rewrite or add to the algorithms' copy, but quite often, he says, it is published verbatim. These automated stories have found their way both online and into print.
This "robo-journalism" is becoming increasingly popular throughout the world's newsrooms, as publishers struggle to cope with dwindling newspaper circulations and the switch to online advertising.
Mr Clifton hopes to be distributing 30,000 of these stories every month by the end of April. The project, called Radar, is a partnership with Urbs Media and is funded by a €706,000 (£620,000) grant from Google.
But how much of a journalist's workload can really be automated? And are jobs ultimately at risk?
Mr Clifton points out that, at this stage, the system simply amplifies the work human journalists do, some of whom are involved in developing the system's output. The automated part is currently limited to trawling through the data, something that would take humans far longer to do.
Nevertheless, stories churned out by machines are becoming more and more common, particularly in the US.
The LA Times' earthquake alerts, based on data from the US Geological Survey (USGS), have been automated since 2014.
But the risks of such systems became clear last June when the newspaper published a report about a 6.8-magnitude quake off the coast of California - it was actually a record of a 1925 earthquake that had been published by the USGS in error.
The LA Times' automated story had appeared just a minute after the USGS published its outdated report. In this case, being first to the news was definitely a disadvantage.
The Washington Post, meanwhile, has used its in-house Heliograf software to automate coverage of local high school American football games. "The stories will be automatically updated each week using box-score data submitted by high school football coaches," an article on the scheme explained.
A survey from Oxford University's Reuters Institute for the Study of Journalism found that many publishers are using automation to release interesting data quickly - from election results to official figures on social issues.
While AI is undoubtedly going to become more prevalent in newsrooms, Joshua Benton at Harvard University's Nieman Journalism Lab doesn't think it yet poses a serious threat to jobs. There are far greater pressures, such as falling advertising revenues, he believes.
And he also says the really difficult and most highly scrutinised part of what professional journalists do - carefully weighing information and presenting balanced, contextualised stories - will be very hard for machines to master.
"Good journalism is not just a matter of inputs and outputs, there is a craft that, however imperfect, has evolved over decades," he explains.
"I'm not saying that machines will never get there, but I think they're still a pretty long way away."
Source: BBC News