The Official Site of the Vancouver Canucks
Canucks Community

[NYT] Google and Facebook Take Aim at Fake News sites


Ossi Vaananen

Recommended Posts

I felt it necessary to post this one, as much of the election thread was derailed by some very fictitious news. If it's true that some of the degen Trump supporters actually believed this stuff, then it may have had an adverse effect on the election. I think a lot of people now choose where they get their news, and much of it is politically charged. So here's a New York Times article, though I'm sure some will find a way of attacking even them.

 

Article: http://www.nytimes.com/2016/11/15/technology/google-will-ban-websites-that-host-fake-news-from-using-its-ad-service.html?smid=tw-nytimes&smtyp=cur&_r=0

 

Quote

Over the last week, two of the world’s biggest internet companies have faced mounting criticism over how fake news on their sites may have influenced the presidential election’s outcome.

On Monday, those companies responded by making it clear that they would not tolerate such misinformation by taking pointed aim at fake news sites’ revenue sources.

Google kicked off the action on Monday afternoon when the Silicon Valley search giant said it would ban websites that peddle fake news from using its online advertising service. Hours later, Facebook, the social network, updated the language in its ad policy, which already says it will not display ads in sites that show misleading or illegal content, to include fake news sites.

“We have updated the policy to explicitly clarify that this applies to fake news,” a Facebook spokesman said in a statement. “Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance.”

Taken together, the decisions were a clear signal that the tech behemoths could no longer ignore the growing outcry over their power in distributing information to the American electorate.

Facebook has been at the epicenter of that debate, accused by some commentators of swinging some voters in favor of President-elect Donald J. Trump through misleading and outright wrong stories that spread quickly via the social network. One such false story claimed that Pope Francis had endorsed Mr. Trump.

Google did not escape the glare, with critics saying the company gave too much prominence to false news stories. On Sunday, the site Mediaite reported that the top result on a Google search for “final election vote count 2016” was a link to a story on a website called 70News that wrongly stated that Mr. Trump, who won the Electoral College, was ahead of his Democratic challenger, Hillary Clinton, in the popular vote.

By Monday evening, the fake story had fallen to No. 2 in a search for those terms. Google says software algorithms that use hundreds of factors determine the ranking of news stories.

“The goal of search is to provide the most relevant and useful results for our users,” Andrea Faville, a Google spokeswoman, said in a statement. “In this case, we clearly didn’t get it right, but we are continually working to improve our algorithms.”

Facebook’s decision to clarify its ad policy language is notable because Mark Zuckerberg, the social network’s chief executive, has repeatedly fobbed off criticism that the company had an effect on how people voted. In a post on his Facebook page over the weekend, he said that 99 percent of what people see on the site is authentic, and only a tiny amount is fake news and hoaxes.

“Over all, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other,” Mr. Zuckerberg wrote.

Yet within Facebook, employees and executives have been increasingly questioning their responsibilities and role in influencing the electorate, The New York Times reported on Saturday.

Facebook’s ad policy update will not stem the flow of fake news stories that spread through the news feeds that people see when they visit the social network.

Facebook has long spoken of how it helped influence and stoke democratic movements in places like the Middle East, and it tells its advertisers that it can help sway its users with ads. Facebook reaches 1.8 billion people around the globe, and the company is one of the largest distributors of news online. A Pew Research Center study said that nearly half of American adults rely on Facebook as a news source.

Google’s decision on Monday relates to the Google AdSense system that independent web publishers use to display advertising on their sites, generating revenue when ads are seen or clicked on. The advertisers pay Google, and Google pays a portion of those proceeds to the publishers. More than two million publishers use Google’s advertising network.

For some time, Google has had policies in place prohibiting misleading advertisements from its system, including promotions for counterfeit goods and weight-loss scams. Google’s new policy, which it said would go into effect “imminently,” will extend its ban on misrepresentative content to the websites its advertisements run on.

“Moving forward, we will restrict ad serving on pages that misrepresent, misstate or conceal information about the publisher, the publisher’s content or the primary purpose of the web property,” Ms. Faville said.

Ms. Faville said that the policy change had been in the works for a while and was not in reaction to the election.

It remains to be seen how effective Google’s new policy on fake news will be in practice. The policy will rely on a combination of automated and human reviews to help determine what is fake. Although satire sites like The Onion are not the target of the policy, it is not clear whether some of them, which often run fake news stories written for humorous effect, will be inadvertently affected by Google’s change.

 

Link to comment
Share on other sites

A little late. Fake news affected the US election, in my opinion. I'm not a fan of censorship in any way, but the election showed there are downfalls to allowing anyone to say anything. People will take things as fact pretty quickly if something confirms their opinion.

Link to comment
Share on other sites

i am against censorship

if this is a real issue, and i think it is

the better solution is to add a tag to the fake news story to alert the reader

so the reader can use their own critical skills when reading the article

stories on the edge of real news often have value in their viewpoints despite their weak factual basis

or even if they do not, it is important to know what people are thinking out there

i think trump knew more about how people think than the establishment did

the rest of us need to know that too

Link to comment
Share on other sites

This shouldn't be so difficult for Google if they were to build something similar to PageRank and their other algorithms focusing on veracity (TruthRank?) and work it into how they return search results. Basically have a team manually go through the top few thousand news sources assigning scores based on fact checking, then build a bot out of that that crawls the internet and scores other sites based on how they connect to these primary sites. I can't imagine it would take them more than a year to have it fully implemented, and that's assuming they're starting from complete scratch. I'd assume it's at least something that they've explored in the past in some form.
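The seed-and-propagate idea above can be sketched in a few lines. This is a minimal illustration, not anything Google has described: the site names, scores, and damping value are all made up, and a real system would need far more signals than raw links. Manually fact-checked sites get fixed seed scores, and every other site iteratively inherits a blend of a neutral prior and the average trust of the sites linking to it, in the style of personalized PageRank.

```python
# Toy "TruthRank" sketch: propagate manually assigned veracity scores
# across a link graph. All site names and numbers are hypothetical.

def propagate_trust(links, seed_scores, damping=0.85, iterations=20):
    """links: dict mapping each site to the list of sites it links to.
    seed_scores: hand-assigned veracity scores in [0, 1] for reviewed sites."""
    # Unreviewed sites start at a neutral 0.5.
    scores = {site: seed_scores.get(site, 0.5) for site in links}
    for _ in range(iterations):
        new_scores = {}
        for site in links:
            # Sites that link *to* this site.
            inbound = [s for s, targets in links.items() if site in targets]
            if inbound:
                neighbour_avg = sum(scores[s] for s in inbound) / len(inbound)
            else:
                neighbour_avg = 0.5
            # Blend a neutral prior with the inbound neighbourhood's trust.
            blended = (1 - damping) * 0.5 + damping * neighbour_avg
            # Manually reviewed seeds keep their assigned score.
            new_scores[site] = seed_scores.get(site, blended)
        scores = new_scores
    return scores

links = {
    "factchecked.example": ["blog.example"],
    "blog.example": ["hoax.example"],
    "hoax.example": ["hoax2.example"],
    "hoax2.example": ["hoax.example"],
}
seeds = {"factchecked.example": 0.95, "hoax2.example": 0.05}
scores = propagate_trust(links, seeds)
```

In this toy run, the unreviewed blog inherits high trust from the fact-checked seed linking to it, while the site caught between the blog and the low-scored hoax ring ends up in the middle. The hard part in practice is exactly what the post glosses over: hostile sites can manufacture inbound links, so link structure alone can't carry the whole score.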

 

I'm much more skeptical about Facebook, Twitter, etc. and their willingness to work a similar idea into their platforms, mostly based on track record, but ... it's a start.

 

Unfortunately, one of the big takeaways from the election is proof that a lot of people will choose something that confirms their opinions over something that disproves them through proper methodology. I mean, it's not a new concept, but it's still depressing any time it gets demonstrated.

Link to comment
Share on other sites

ha, and right after posting that I come across this fun Guardian article:

 

Bursting the Facebook bubble: we asked voters on the left and right to swap feeds

 

excerpt

 

Quote

 

To test the effects of political polarization on Facebook we asked ten US voters – five conservative and five liberal – to agree to take a scroll on the other side during the final month of the campaign.

We created two Facebook accounts from scratch. “Rusty Smith”, our right-wing avatar, liked a variety of conservative news sources, organizations, and personalities, from the Wall Street Journal and The Hoover Institution to Breitbart News and Bill O’Reilly. “Natasha Smith”, our left-wing persona, preferred The New York Times, Mother Jones, Democracy Now and Think Progress. Rusty liked Tim Tebow and the NRA. Natasha liked Colin Kaepernick and 350.org.

Our liberals were given log-ins to the conservative feed, and vice versa, and we asked our participants to limit their news consumption as much as possible to the feed for the 48 hours following the third debate, the reopening of the Hillary Clinton email investigation, and the election.

Not all of our participants made it through to election day. “You might as well have been waterboarding a brother,” said one of the participants, Alphonso Pines, after his first exposure to the right-wing feed.

 

Link to comment
Share on other sites

I don't like it.  

I'm just worried about some local amateur doing some good ol' detective work on some issue.  Maybe it's completely legit stuff, but since it's difficult to verify (due to the lack of resources, etc.)... they could end up censored.  The last thing you want is some large organization, accountable to pretty much no one (save shareholders), controlling whatever narrative they think is best for the people.  

 

I think they should just let everything go unhindered.  If someone is dumb enough to completely believe whatever pops up in their Twitter, Facebook, whatever feed... that's kind of their own fault.  

 

 

Link to comment
Share on other sites

5 minutes ago, Lancaster said:

I don't like it.  

I'm just worried about some local amateur doing some good ol' detective work on some issue.  Maybe it's completely legit stuff, but since it's difficult to verify (due to the lack of resources, etc.)... they could end up censored.  The last thing you want is some large organization, accountable to pretty much no one (save shareholders), controlling whatever narrative they think is best for the people.  

 

I think they should just let everything go unhindered.  If someone is dumb enough to completely believe whatever pops up in their Twitter, Facebook, whatever feed... that's kind of their own fault.  

 

 

You mean believing what you see and read on CNN, etc?

I agree, but these sheeple are the majority and that will be our undoing.

The more you learn about how things are, the more you are in danger of becoming a target of the zombies. We all know what zombies crave most...

Link to comment
Share on other sites

Archived

This topic is now archived and is closed to further replies.
