Apple takes a very un-tech approach to solving fake news: human editors
We all agree there's too much disinformation on the Web.
Russian operatives and bots played havoc with Facebook and Twitter during the 2016 election, because nearly anyone can join and post on social media networks, and much of the ad buying is automated.
This week Apple signaled it was serious about tackling the issue, buying a magazine subscription app with popular titles (including Consumer Reports, Vanity Fair and the New Yorker) to beef up its News app for the iPhone and iPad.
Meanwhile the YouTube video network, fighting a backlash against conspiracy videos that anyone can post to its platform, went in a totally different direction to battle fakeness—turning to crowd-sourced Wikipedia to provide viewers with links to what it hopes are authoritative viewpoints.
Let's take a look at the two approaches.
Apple's News app ships on the home screen of every iPhone and iPad, and promises users a curated, personalized view of the news, with bigger pictures and fonts than they'd see reading stories in other apps or in the Safari browser. Apple said in 2016 that the News app had over 70 million users, and while it hasn't updated that figure, it says the app has grown substantially since.
At the South by Southwest conference in Austin, Texas, this week, Apple senior vice president Eddy Cue said his app differs from what we see on Facebook and Twitter in that it's vetted by human curators and is thus more authentic.
"We want the best articles, we want them to look amazing and we want them to be from trusted sources," Cue said, per Deadline. "So we don't have a lot of the issues going around."
On Apple News, publishers big and small can sign up to have their work included in the app for free. But unlike on Facebook, would-be News publishers must submit at least three articles to Apple before being approved.
Not so at YouTube, where some 400 hours of video are uploaded every minute, without prior oversight.
The video network has been waging a two-year battle against conspiracy and extremist videos masquerading as legitimate news. It has taken several steps to fix the issue, including demonetizing offending videos by removing their ads, and now, also at South by Southwest, it has announced a new strategy.
YouTube will add links from Wikipedia, the crowd-sourced online encyclopedia, to conspiracy videos. If you're scratching your head over this, so are we.
Wikipedia is even easier to game than the YouTube algorithm. Anyone can edit an entry and add their spin to it, whether that's the biography of President Donald Trump, the history of the Santa Monica Pier or the page about blogger Lance Ulanoff.
The longtime tech journalist woke up one morning to find that his entry had been updated, falsely, to say he was "a member of the French Foreign Legion."
The entry was eventually corrected, but Ulanoff doesn't think YouTube will find much success with its Wikipedia fix.
"It will be like a band-aid that will work at first, and within a few days it will be tattered and dirty," he says.
The problem, he notes, is that once Wikipedia is used to fact-check bogus videos on YouTube, the same people who make and post those videos will descend on Wikipedia to sync the "facts" with their point of view. "It's a very slippery slope."
The bottom line: kudos to Apple for a terrific app that has so far avoided the issues plaguing Twitter, Facebook and YouTube. As for the video network: nice idea, but it's back to the drawing board. Next?
©2018 USA Today
Distributed by Tribune Content Agency, LLC.