Tag Archives: data privacy

Insider Q&A: Facebook VP of Messenger discusses privacy


Government officials worry about Facebook’s plans to extend end-to-end encryption to Messenger

SAN FRANCISCO —
At Facebook, Stan Chudnovsky oversees the Messenger chat app that’s used by well over 1 billion people each month. He’s playing a key role in helping Facebook integrate that app with its other chat tools, WhatsApp and Instagram Direct.

The massive project has already gotten pushback from regulators worried about Facebook’s size and power. Government officials also worry about Facebook’s plans to extend end-to-end encryption to Messenger. Once that happens, Facebook wouldn’t be able to respond to law enforcement subpoenas because it wouldn’t have a way to unscramble messages.
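To see why encryption changes what Facebook can hand over, here is a minimal sketch of the end-to-end idea, not Messenger's actual protocol (which is based on the Signal protocol): the two endpoints derive a shared key themselves, so a relaying server only ever sees public keys and ciphertext it cannot decrypt. The example uses Python's third-party cryptography package; the derive_key helper and message text are illustrative.

```python
# Minimal illustration (not Facebook's implementation): two endpoints agree on
# a key via X25519 Diffie-Hellman, so the relaying server only ever handles
# public keys and ciphertext it cannot decrypt. Requires the `cryptography` package.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_key(my_private, their_public):
    """Derive a 256-bit AES key from an X25519 shared secret."""
    shared_secret = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e demo").derive(shared_secret)


# Each endpoint keeps its private key; only public keys cross the wire.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

key_alice = derive_key(alice_private, bob_private.public_key())
key_bob = derive_key(bob_private, alice_private.public_key())
assert key_alice == key_bob  # both sides derive the same secret independently

# Alice encrypts; the server relays only (nonce, ciphertext) and cannot read it.
nonce = os.urandom(12)
ciphertext = AESGCM(key_alice).encrypt(nonce, b"meet at noon", None)

# Bob decrypts with the key he derived himself.
print(AESGCM(key_bob).decrypt(nonce, ciphertext, None))
```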

Chudnovsky, who moved to the U.S. from Russia in 1994, joined Facebook in 2015. He spoke with The Associated Press recently about his work and views on privacy. Questions and answers have been edited for length and clarity.

Q: What are the biggest roadblocks in bringing end-to-end encryption?

A: It’s technologically hard to move from the system that is alive and functioning and has billions of messages being sent every day to where it’s done completely differently architecturally. We also need to figure out how to do as much as we can on safety, while being the leaders on privacy. We are trying to go through that process slowly and very responsibly while talking to everyone.

Most messages in the U.S., where (Apple’s) iMessage is leading, are already end-to-end encrypted. We want to make sure that we get to the point when we lead very strongly and we do as much on safety as we possibly can given the constraints of end-to-end encryption.

Q: How do you ensure that people are safe when you can’t see bad things people are doing?

A: We are going to continue to work very closely with law enforcement on whatever we can provide. We also have connectivity to social networks. Whoever is a bad player on social networks, we will be able to see if those bad players exist on messaging services.

I don’t want to go into details on how we are thinking about approaching that stuff. But we’re just going to invest heavily in identifying threats earlier.

Q: You can send things in a private message that you can’t post on Facebook, right?

A: Definitely. You should be able to send whatever you want to send in a private message.

Q: What if it’s illegal or hurting someone?

A: Generally we believe that conversation between people should be private. We don’t make a difference between the conversations that are happening in the living room or on the phone and conversations that are happening in a private chat.

Q: What if you try to sell a gun, despite Facebook’s ban?

A: If you’re trying to sell a gun, you are probably trying to sell a gun to many people. When someone reports that and someone provides the messages that from the point of that person are illegal, then definitely we will be able to look at that.

Q: What are the biggest things that you have to figure out before interoperability becomes reality?

A: Generally, just a features compatibility in the sense that, if I “like” some message on one app, how does it manifest itself in another? Or will I be able to also call people, not only send messages?

Q: Do you think scrutiny of Facebook will ease any time soon?

A: We have a lot of responsibility. And the criticism, sometimes it’s accurate. Sometimes it’s not accurate. At the end of the day, what it means if everyone’s talking about you positively or negatively or both, is that you’re important. We just need to continue to deliver value to people. And as long as we are building products that people like, I think it’s going to be fine.


EU competition chief hints at new data rules for tech companies


The European Union’s powerful competition chief indicated Friday that she’s looking at expanding rules on personal data, dropping an initial hint about how she plans to use new powers against tech companies.

Margrethe Vestager said that while Europeans have control over their own data through the EU’s world-leading data privacy rules, those rules don’t address problems stemming from the way companies use other people’s data, “to draw conclusions about me or to undermine democracy.”

“So we may need broader rules to make sure that the way companies collect and use data doesn’t harm the fundamental values of our society,” Vestager said in a speech in Copenhagen.

Vestager spoke days after she was appointed to a second term as the EU’s competition commissioner. She was also given new powers as an executive vice president to shape the bloc’s digital policies.

The Danish politician earned a reputation as the world’s strongest technology regulator after issuing multibillion-dollar penalties to Silicon Valley tech giants including Apple and Google.

She is currently investigating Amazon over whether it is exploiting data from independent retailers for an unfair advantage.

With her new powers to make “Europe fit for the Digital Age,” Vestager appears ready to build on the EU’s General Data Protection Regulation, tough new standards for personal data that took effect in 2018.

Vestager said she was concerned about how online platforms “manipulate the way we see the world, in ways that we often don’t even notice,” which can affect how people make choices.

Data is becoming more important to how people “think and act,” Vestager said. She didn’t identify specific companies that could be targeted, but her comments signal that she’ll keep up the pressure on the U.S. tech giants.

“When a few companies control a lot of data about us,” Vestager said, “that can also help them influence the choices we make.”


Q&A: Ex-Googler Harris on how tech ‘downgrades’ people


Tristan Harris wants to reverse the harmful effects he believes technology has had on all of us.

Harris, a former Google design ethicist, first rose to national consciousness after a presentation he gave inside Google in 2013 spread throughout the industry. In it, he argued that many tech products were designed to be addictive, causing people to spend too much time on them and distracting them from living their lives. He urged designers to change their approach.

Harris spent more than two years pushing for change inside Google, but says he couldn’t get traction. So he quit and started a movement called Time Well Spent, which eventually pushed companies such as Apple and Google to build screen time usage metrics and tools into their phones.

He has since widened his focus, having decided that many issues facing society today are actually connected and can be traced, at least partly, to the design of the technologies we use every day.

The goal of his organization, the Center for Humane Technology, is to reverse human “downgrading,” or the idea that technology is shortening our attention spans, pushing people toward more extreme views and making it harder to find common ground. In short: technology has caused humanity to get worse, and Harris wants to help fix it.

Harris recently spoke to The Associated Press about his work, the tech industry’s progress so far, and why all hope isn’t lost. This interview has been condensed and edited for clarity.

Q: Could you tell us the important ideas of your work?

A: This isn’t about addiction, and it isn’t about time. It’s about what we call “human downgrading.” It’s a phrase that we came up with to describe something we don’t think people are acknowledging as a connected system.

Technology is causing a set of seemingly disconnected problems: shortening of attention spans, polarization, outrage-ification of culture, mass narcissism, election engineering, addiction to technology. These seem like separate problems, and we’re actually saying that these are all predictable consequences of a race between technology companies to figure out how to scoop attention out of your brain.

Q: Where is the central place to fight this multifaceted problem that you’ve outlined?

A: Much like when you say, “How do you solve climate change?” Do you just get people to turn off their light bulbs? No. Do you pass some policy? Yes. But is that enough? No. Do you have to work collaboratively with the oil companies to change what they’re doing? Yes. Do you have to pass laws and mandates and bans?

You have to do all of these things. You have to have a mass cultural awareness. You have to have everybody wake up.

This is like the social climate change of culture. So we work on internal advocacy, having people on the inside of tech companies feel, frankly, guilty, and ask, “What’s my legacy in this thing that’s happening to society?”

We work on the internal advocacy. We work on public pressure and policy.

Q: How do you work with companies, and how are they taking to your vision?

A: Doing it from the inside didn’t do anything when the cultural catch-up wasn’t there. But now, in a world post-Cambridge Analytica, post the success of Time Well Spent, post more whistleblowers coming out and talking about the problem, we do have conversations with people on the inside who I think begrudgingly accept or respect this perspective.

I think that there might be some frustration from some of the people who are at the YouTubes and Facebooks of the world, whose business models are completely at odds with the things we’re advocating for. But we’ve also gotten Facebook, Instagram, YouTube, Apple and Android to launch Time Well Spent features through some form of advocacy with them.

Q: Is there a path that you try to help map out for these companies?

A: They’re not going to do it voluntarily. But with a lot of external pressure, shareholder activism, a public that realizes it has been lied to by the companies, that all starts to change.

There are a lot of business models; subscription is one.

Would you pay $8 a month to a Facebook that didn’t have any interest in manipulating your brain, basically making you as vulnerable as possible to advertisers, who are their true customers? I think people might pay for that.

So our policy agenda is to make the current business model more expensive and to make the alternatives less expensive.

Q: Washington is now in a huge debate about privacy and data and misinformation. Will that process deal with the causes that you care about by default?

A: I actually worry that we’re so mindlessly following the herd on privacy and data being the principal problems, when the actual problems are the ones affecting the felt sense of your life: where your time goes, where your attention goes, where democracy goes, where teen mental health goes, where outrage goes. Those problems are much more consequential to the outcomes of elections and what culture looks like.

These issues linked together need to be named as an impact area of technology. There needs to be regulation that addresses that.

My concern about how the policy debate is going is that everyone is just angry at Big Tech. And that’s not actually productive, because it isn’t just the bigness that’s the problem. We have to name that the business model is the problem.

Q: Don’t people have individual agency? Are we really in the thrall of tech companies and their software?

A: There’s this view that we should have more self-control or that people are responsible for whatever they see.

That hides an asymmetry of power. Like when you think, “I’ll go to Facebook just to look at this one post from a friend,” and then you find yourself scrolling for two hours.

In that moment, Facebook wakes up a voodoo doll-like version of you in a supercomputer. The voodoo doll of you is based on all the clicks you’ve ever made, all the likes you’ve ever done, all the things you’ve ever watched. The idea is that as this becomes a better and more accurate model of you, I know you better than you know yourself.

We always borrow this from E. O. Wilson, the sociobiologist: the problem with humans is that we have Paleolithic brains, medieval institutions and godlike technology. Our medieval institutions can only stay in control of what’s happening at a slow clock rate of every four years. Our primitive brains are getting hijacked and are super primitive compared to godlike tech.

Q: Do you feel there’s an awareness (inside tech companies) that you wouldn’t have thought existed two years ago?

A: There was a sea change. For four years, I was watching how nobody was really accepting or working on or addressing any of these issues. And then suddenly in the last two years things shifted, because of the Cambridge Analytica scandal, because of “60 Minutes,” because of Roger McNamee’s book “Zucked.” I would have never suspected that Chris Hughes, the co-founder of Facebook, would be saying it’s time to break up Facebook.

I’ve seen an enormous amount of change in the last three years, and I can only bank on the fact that the clip at which things are starting to change is accelerating. I just want to give you hope that I would have never anticipated so much to start changing that is now changing. And we just need that pressure to continue.


$5 billion fine doesn’t mark the end of Facebook’s troubles


Facebook will pay a $5 billion fine for privacy violations and will be subject to broader oversight, but ongoing probes in Europe and the U.S. could present even bigger headaches for the company.

The FTC fine is by far the largest the agency has levied on a technology company. The settlement also comes with restrictions and government oversight.

Facebook for a decade had largely been trusted to regulate itself and keep its 2.4 billion users’ interests at heart. Then came Russian meddling in the 2016 elections, fake news and the Cambridge Analytica scandal, in which a political data mining firm affiliated with the 2016 presidential campaign of Donald Trump improperly accessed the personal data of as many as 87 million users.

Regulators in Europe and the U.S. took notice. Facebook now faces the prospect of not only billions of dollars in additional fines, but also new restrictions.

Following are some of the ongoing investigations and potential legal threats involving Facebook.

— U.S. Securities and Exchange Commission

Facebook disclosed Wednesday that it will pay a separate $100 million fine to the SEC to settle charges that it made misleading disclosures about the risk of misuse of Facebook user data.

— U.S. Justice Department

The U.S. Department of Justice on Tuesday said it had opened a sweeping antitrust investigation of major technology companies and whether their online platforms have hurt competition, suppressed innovation or otherwise harmed consumers.

— U.S. Federal Trade Commission

Though the privacy matter is settled, Facebook disclosed Wednesday that the FTC is investigating the company separately over antitrust issues. Facebook said it was informed of the investigation in June.

— Irish Data Protection Commission

Ireland’s data regulator has launched an investigation of Facebook over the Cambridge Analytica data leak last year. At issue is whether the company complied with strict European data protection rules that went into effect in May 2018. Under the new rules, companies can be hit with fines equal to 4 percent of annual global turnover for the most serious violations.

The probe could potentially cost Facebook more than $2.3 billion in fines based on its 2018 revenue, or more if it makes more money this year, which is likely. The commission, which handles online data regulation for the European Union, has nearly a dozen open investigations into Facebook, including its subsidiaries WhatsApp and Instagram. Facebook says it is cooperating.

— U.S. Housing and Urban Development

The U.S. government charged Facebook with high-tech housing discrimination in March for allegedly allowing landlords and real estate brokers to systematically exclude groups such as non-Christians, immigrants and minorities from seeing ads for houses and apartments.

The civil charges filed by the Department of Housing and Urban Development could cost the social network millions of dollars in penalties. More important, they are already affecting the company’s business model: its ability to target ads with near-surgical precision. By its nature, this kind of targeting excludes some people and includes others. And that’s not always legal.

The charges came despite changes Facebook announced just a week earlier to its ad targeting system. The company had agreed to overhaul its targeting system and abandon some of the practices singled out by HUD to prevent discrimination, not just in housing listings but in credit and employment ads as well. The move was part of a settlement with the American Civil Liberties Union and other activists. But HUD did not join the settlement. Facebook says it continues to work with civil rights experts on the issues.

 

— Canada’s privacy czar

In further fallout from Cambridge Analytica, Canada’s privacy chief announced in April that he is taking Facebook to court after finding that lax privacy practices allowed personal information to be used for political purposes.

A joint report from privacy commissioner Daniel Therrien and his British Columbia counterpart said major shortcomings were uncovered in Facebook’s procedures. It called for stronger laws to protect Canadians. Facebook says it is taking the investigation seriously.

 

— U.K., Belgium, Germany

In October, British regulators slapped Facebook with a fine of 500,000 pounds ($644,000), the maximum possible, for failing to protect the privacy of its users in the Cambridge Analytica scandal. The company said it is appealing the fine, so the matter is still, technically, unresolved.

The Belgian Data Protection Authority and Germany’s Federal Cartel Office are also looking into Facebook’s data collection practices.

 

— Washington, D.C., and state attorneys general

If the federal investigations weren’t enough, Facebook has been sued by Washington, D.C.’s attorney general for unfair and deceptive trade practices, while states including California and New York are investigating it. The company’s unauthorized collection of 1.5 million users’ contact lists is under scrutiny by New York’s attorney general. Facebook said the collection was unintentional and that it is cooperating with the other attorneys general in their probes.


Did Facebook data help Trump? ‘Great Hack’ explores scandal


The new documentary “The Great Hack” captures how Facebook’s cavalier handling of user data in the Cambridge Analytica scandal posed a threat to democracy.

But it doesn’t prove claims in the movie that the ill-gotten data helped elect Donald Trump.

The movie, out on Netflix and in some theaters Wednesday, follows former Cambridge Analytica executive Brittany Kaiser around the world, from the Burning Man festival in Nevada to a pool at a hideout in Thailand to a flight from New York to testify in Robert Mueller’s investigation of 2016 election interference. She reveals internal emails, calendar entries and video sales pitches, though the movie doesn’t quite connect the dots on what the documents really say.

Instead, the movie is mostly a recap of what has already been reported by various news outlets. If you’ve never heard of Cambridge Analytica, or you aren’t steeped in all the details of the scandal that landed Mark Zuckerberg in front of Congress and put his company under major federal investigations, “The Great Hack” offers a good overview of the way companies like Facebook collect and use data to influence your thinking. It’s also worth watching for a reminder of the enormous power and menace of Big Data.

The movie’s release coincides with the Federal Trade Commission announcing a record $5 billion fine against Facebook stemming from its investigation into the Cambridge Analytica scandal. The FTC also sued the British firm, which has filed for bankruptcy.

Cambridge Analytica drew data through a Facebook app that purported to be a psychological research tool. Roughly 270,000 people downloaded the app and shared personal details with it. Under Facebook’s policies at the time, the app was able to draw information from those users’ friends as well, even though those friends never consented. Facebook said as many as 87 million people might have had their data accessed.

The app was designed by then-University of Cambridge researcher Aleksandr Kogan. Cambridge Analytica, whose clients included Trump’s 2016 general election campaign, paid Kogan for a copy of the data, even though the firm was not authorized to have that information. Cambridge Analytica shifted the blame to Kogan, who in turn accused Facebook of trying to deflect attention from what he called its own negligent and systematic exposure of user data. The scandal broke in March 2018 after newspapers reported that Cambridge Analytica still had data it had promised to delete after learning of its questionable origins.

Listening to Kaiser, a self-described whistleblower, you might think Cambridge Analytica won the election for Trump. Kaiser, who was the firm’s business development director, explained that the data helped Cambridge Analytica identify “persuadable voters.” She said the firm targeted blogs, websites, articles, videos and ads specifically at them “until they saw the world the way we wanted them to.”

David Carroll, a Parsons School of Design professor who is also heavily featured in the movie, said that given how close the election was in certain states, just turning a “tiny slice of the population” was enough.

Federal election records show that the Trump campaign paid Cambridge Analytica roughly $6 million. Cambridge Analytica said it never used Kogan’s data in its work for Trump. The Trump campaign also denied using the firm’s data.

Experts say Cambridge Analytica’s influence was plausible but inconclusive.

“They had the data, (but) it’s not quite clear how it was fully rolled out,” Jennifer Grygiel, a Syracuse University communications professor, told The Associated Press. “It looks like they did take some sort of action. We just don’t have enough detail to see what kind of influence it had.”

But she said Cambridge Analytica’s work can’t be taken in isolation.

Not until 12 minutes before the credits roll does the movie mention other factors at play, including a Russian-led misinformation campaign centered on fake posts and ads to sow discontent in the U.S. electorate. It is then that Kaiser expresses doubt: “Maybe I wanted to believe that Cambridge Analytica was just the best. It’s a convenient story to believe.”

Kaiser told the U.K. Parliament last year that Cambridge had also worked with Brexit supporters. Among other things, “The Great Hack” shows footage of Kaiser on stage during the Leave.EU campaign launch. It also shows Leave.EU’s online statement on hiring the firm. But Cambridge Analytica has denied involvement in the campaign for the U.K. to leave the European Union.

It’s not surprising that Cambridge Analytica’s marketing pitches, as disclosed by Kaiser and through undercover footage captured by Britain’s Channel 4, would boast of the company’s capabilities. And it’s not surprising that the company would seek to minimize its role once caught. The truth is likely somewhere in between, but just where, the movie doesn’t explore.

The original Cambridge Analytica whistleblower, Chris Wylie, told the U.K. Parliament that it doesn’t really matter whether the firm succeeded.

“When you’re caught in the Olympics doping, there’s not a debate about how much illegal drug you took, right? Or, ‘Well, he probably would have come in first anyway,’” Wylie said in a snippet included in the movie. “If you’re caught cheating, you lose your medal.”

He was discussing the potential role Cambridge Analytica played in Brexit, but his sentiment could just as easily have applied to Trump. In other words, it’s bad enough that this was going on at all, regardless of whether it worked.

The movie could have left it there. Instead, it tries to suggest a larger influence, without fully exploring those dynamics.

British investigative journalist Carole Cadwalladr, who broke the initial stories on the scandal for The Guardian newspaper, noted in the movie that Cambridge Analytica “actually points to this much bigger, more worrying story, which is that our personal data is out there and being used against us in ways we don’t understand.”

The movie tries to illustrate that through Carroll’s quest to get the information Cambridge Analytica had on him. His efforts were ultimately rebuffed, and the filmmakers didn’t learn more on their own. Nor did the movie explore Facebook’s own attitudes toward data, or what Syracuse professor Grygiel described as a fake news environment for Cambridge Analytica to exploit.

“If I were to make a movie today, it would not be about Cambridge Analytica,” Grygiel said. “It would be about Facebook Inc. and the depth of their influence.”


Canada privacy watchdog taking Facebook to court


Canada’s privacy czar said Thursday that he is taking Facebook to court after finding that lax practices at the social media giant allowed personal information to be used for political purposes.

A joint report from privacy commissioner Daniel Therrien and his British Columbia counterpart said major shortcomings were uncovered in Facebook’s procedures. It called for stronger laws to protect Canadians.

The commissioners expressed dismay that Facebook had rebuffed their findings and recommendations.

Facebook insisted it took the investigation seriously. The company said it offered to enter into a compliance agreement.

The Canadian report comes as Ireland’s privacy regulator is investigating Facebook over the company’s recent revelation that it had left hundreds of millions of user passwords exposed.

The Canadian probe followed reports that Facebook let an outside organization use an app to access users’ personal information and that some of the data was then passed to others. Recipients of the information included the firm Cambridge Analytica.

The app, at one point known as “This is Your Digital Life,” encouraged users to complete a personality quiz but collected much more information about those who installed the app as well as data about their Facebook friends, the commissioners said.

About 300,000 Facebook users worldwide added the app, leading to the potential disclosure of the personal information of approximately 87 million others, including more than 600,000 Canadians, the report said.

The commissioners concluded that Facebook broke Canada’s privacy law governing companies by failing to obtain valid and meaningful consent of installing users and their friends and that it had “inadequate safeguards” to protect user information.

Despite its public acknowledgment of a “major breach of trust” in the Cambridge Analytica scandal, Facebook disputes the report’s findings and refuses to implement recommendations, the commissioners said.

“Facebook’s refusal to act responsibly is deeply troubling given the vast amount of sensitive information people have entrusted to this company,” Therrien said. “The company’s privacy framework was empty.”

Therrien reiterated his longstanding call for the Canadian government to give him authority to issue binding orders to companies and levy fines for non-compliance with the law. In addition, he wants powers to inspect the practices of organizations.

The office of Innovation Minister Navdeep Bains, the Cabinet member responsible for Canada’s private-sector privacy law, said the government would take concrete actions on privacy in coming weeks.

Facebook Canada spokeswoman Erin Taylor said the company was disappointed that Therrien considers the issues unresolved.

“There’s no evidence that Canadians’ data was shared with Cambridge Analytica, and we’ve made dramatic improvements to our platform to protect people’s personal information,” Taylor said.

“We understand our responsibility to protect people’s personal information, which is why we’ve proactively taken important steps toward tackling a number of issues raised in the report.”

If the application to Federal Court is successful, it could lead to modest fines and an order for Facebook to revamp its privacy practices, Therrien said.

Also on Thursday, the New York State Attorney General’s Office announced that it is investigating the company’s unauthorized collection of the email contacts of 1.5 million users. Facebook has previously acknowledged that it unintentionally uploaded the contacts.

The Menlo Park, California, company did not immediately return a message for comment on the New York investigation.


Facebook CEO Mark Zuckerberg testifies on data scandal for a 2nd day before Congress



Facebook CEO Mark Zuckerberg faces a second day of testimony in front of the House Energy and Commerce Committee amid concerns over privacy on the …
