Product Promotion Network


The Cambridge Analytica affair reveals Facebook’s ‘Transparency Paradox’

The scandal surrounding Cambridge Analytica, the data-mining company that Facebook has banned from its platform for violating data use policies, gets bigger by the day. On Monday, a UK television program aired undercover footage of Alexander Nix, the company’s CEO, boasting about how it could use spies, entrapment techniques and fake news to influence elections. Britain’s information commissioner, Elizabeth Denham, said she would seek a court order to look at Cambridge Analytica’s servers and databases.

The same day, Facebook said it had hired forensic auditors to examine whether Cambridge Analytica had held on to data allegedly passed to it by an academic at Cambridge University, Aleksandr Kogan, who had gained access to Facebook’s platform for research purposes several years ago.

Kogan, who used that access to glean information about 50 million Facebook users, was also blocked from the platform, as was Christopher Wylie, a former Cambridge Analytica employee turned whistleblower. The post-mortem into the affair, and into how the data were used in Cambridge Analytica’s efforts to help Donald Trump win the 2016 US presidential election, is in full swing. It will add to the turmoil at Facebook, which lost $37 billion in market cap during the day’s trading and may also lose Alex Stamos, its chief security officer, who has reportedly clashed with the network’s leadership over how transparent Facebook should be about Russia’s use of the platform to spread disinformation.

One really important question is whether all this could harm researchers’ efforts to shed more light on the immense influence that social networks now have over our lives. To explore the issue, we spoke with Sinan Aral, a social media expert who is a professor at MIT’s Sloan School of Management.

Were you surprised by the fact that a researcher could access so much data and allegedly pass it to a third party in violation of Facebook’s data use rules?

I wasn’t surprised they could access that much data. Facebook has been pursuing research questions with qualified researchers for some time. What is surprising is that an academic researcher could so flagrantly violate the spirit and the terms of the data-sharing policies Facebook has in place by taking that data and giving it to a firm that was never authorized to have it in the first place, for the purposes of political targeting.

Are you very concerned that this episode could have a chilling effect on social networks’ willingness to share data with researchers?

Yes, I am. Facebook is facing what I call a ‘transparency paradox’.

On the one hand, it’s under tremendous pressure to be more transparent: to reveal more about how targeted advertising works, how its News Feed algorithms work, how its trending algorithms work, and how Russia or anyone else can spread propaganda and false news on the network. So there’s this very strong pressure to be more transparent and to share data with trusted third parties. But on the other hand, there’s really strong pressure to increase the security of the data that it does reveal, to make sure that it doesn’t get into the wrong hands, and to protect users’ privacy.

This transparency paradox is at the core of Facebook’s existential crisis today, and there’s a real risk that the Cambridge Analytica story will make it more conservative in what it shares, which would affect the research of hundreds of good scientists who are working with the social network every day, without breaching its terms of service, to understand how Facebook is affecting our society.

Is the current ad hoc process for giving researchers access to social network data, governed by non-disclosure agreements, working well, or could it be improved?

I think each request needs to be individually vetted by Facebook and other networks.

I wouldn’t want them to give blanket access to just certain narrow kinds of data. There are a lot of different aspects of Facebook that are important to understand, and it needs to be able to consider the costs and benefits of each proposed data release.

One suggestion that’s been floated is to create a portal where registered researchers could query anonymized data from social network users who have consented to share their information.

What do you think of that idea?

That kind of a portal could be useful, but it’s only one very small solution to the transparency paradox. There needs to be much more openness than just a single set of data.

There needs to be lots of different ways that Facebook and other networks work with the scientific community to help us all understand how social media is affecting our democracy, our businesses, and our public health.

Is the Cambridge Analytica story a signal that we should be more broadly concerned about the power of these platforms and what they are doing with our data?

I think the story is about a researcher who flagrantly violated the likely terms of any data-sharing agreement he had with Facebook for research purposes, and about the company, Cambridge Analytica, that either knowingly or unknowingly used the data for potentially nefarious purposes without vetting the source of that data and any restrictions associated with it.

That’s the real story here. We need to better understand the threat of bad actors who may use access to data to help them spread fake news or propaganda on social platforms. And the only way we’re going to get a handle on that is if Facebook can find a way to resolve its transparency paradox effectively by becoming more open and more secure at the same time.

Facebook’s security chief is leaving after clash over Russian misinformation

Facebook’s chief security officer, Alex Stamos, will leave the company later this year, according to The New York Times. His departure reportedly comes as a result of disagreements over how to handle the spread of misinformation on the social network.

As part of Stamos leaving, Facebook has reportedly broken up and reassigned his security team. Almost all of its 120 employees have now been moved to product and infrastructure teams, according to the report; it’s unclear if Facebook maintains some other dedicated security team, or if this means security is now integrated into other departments.

Stamos’ departure was reportedly decided on last year, but the company decided to keep him on until August to help transition his duties to others — and so that it wouldn’t look quite as bad for Facebook amid continued discoveries about Russia’s abuse of the platform during the 2016 US election.

With the news breaking days after the Cambridge Analytica scandal, it actually looks worse.

The report indicates that Stamos wanted to be more open about security issues than other executives inside Facebook. He reportedly began investigating Russian activity in July 2016 and later pushed the company, sometimes unsuccessfully, to be open about its findings. According to the report, Facebook executives appear to be unhappy with Stamos’ approach and seem to blame much of the recent backlash against the company on the decision to disclose information about Russian election tampering, rather than stay quiet.

In a tweet, Stamos said his role has changed inside Facebook but that he remains “fully engaged.” However, he did not deny that he would be leaving. “I’m currently spending more time exploring emerging security risks and working on election security,” Stamos wrote. Reuters also reported that he would be departing in August and that his responsibilities had been “taken away.”

Facebook said in an emailed statement that Stamos remains the company’s chief security officer, without acknowledging any change in his role.

But like Stamos’ tweet, the statement doesn’t deny that he’ll be leaving. “Alex Stamos continues to be the Chief Security Officer (CSO) at Facebook,” a spokesperson said. “He has held this position for nearly three years and leads our security efforts especially around emerging security risks. He is a valued member of the team and we are grateful for all he does each and every day.”

Stamos started at Facebook in 2015. Prior to that, he was the chief information security officer at Yahoo. According to Reuters, he resigned after about a year, having discovered that Yahoo had secretly built a program to scan all incoming email for the NSA or FBI.

In recent months, Stamos has been among the Facebook executives willing to talk about the company and its ongoing problems on Twitter.

Over the weekend, he took issue with the characterization of Cambridge Analytica’s use of Facebook info as a “data breach,” since hackers did not penetrate any systems. In fact, Facebook was set up to allow third parties to misuse data without any such difficulty.

There are a lot of big problems that the big tech companies need to be better at fixing. We have collectively been too optimistic about what we build and our impact on the world.

Believe it or not, a lot of the people at these companies, from the interns to the CEOs, agree.

— Alex Stamos (@alexstamos) March 17, 2018

eBay’s Android app has a new AR feature to help you pick the correct box size for shipping

eBay has updated its Android app to include a new augmented reality feature that helps you figure out what box size you’ll need to use to ship your stuff, via Android Police.

The AR feature is pretty simple: point your phone at whatever it is you’re trying to ship, and it’ll superimpose an augmented reality box over it so you can figure out if it’ll fit. It’s got a bunch of standard box sizes built in. The idea is that you’ll know that you need to get, say, a medium USPS flat rate box from the post office instead of overshooting and having to go back for a bunch of bubble wrap (to pick a purely hypothetical example).
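The logic behind the feature is easy to sketch. Below is a minimal, hypothetical Python version of the fit check, with made-up box names and approximate dimensions standing in for whatever catalog the app actually uses (eBay hasn’t published its implementation): the item’s measured dimensions are tested against each standard box in every axis-aligned orientation, and the smallest box that fits wins.

```python
from itertools import permutations

# Illustrative box dimensions in inches (length, width, height). These are
# assumptions for the sketch; the real app presumably ships with exact,
# carrier-specific dimensions such as the USPS flat-rate lineup.
STANDARD_BOXES = {
    "small flat rate": (8.69, 5.44, 1.75),
    "medium flat rate": (11.25, 8.75, 6.0),
    "large flat rate": (12.25, 12.25, 6.0),
}

def fits(item, box):
    """True if the item fits the box in some axis-aligned orientation."""
    return any(
        all(i <= b for i, b in zip(orientation, box))
        for orientation in permutations(item)
    )

def smallest_box(item_dims):
    """Name the smallest-volume standard box the measured item fits into."""
    candidates = [(name, dims) for name, dims in STANDARD_BOXES.items()
                  if fits(item_dims, dims)]
    if not candidates:
        return None  # too big for every standard box; needs custom packaging
    name, _ = min(candidates, key=lambda c: c[1][0] * c[1][1] * c[1][2])
    return name

# A 10" x 7" x 4" item is too big for the small box but fits the medium one.
print(smallest_box((10, 7, 4)))  # -> medium flat rate
```

The AR overlay, presumably, just replaces the tape measure: instead of typing dimensions in, you get the same pass/fail answer visually from the virtual box superimposed on the item.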

eBay says that the feature will only work on a few devices, although it hasn’t given a list (Android Police speculates that it’s probably limited to ARCore-compatible phones).

You can try it out for yourself by updating the eBay app today.
