A Facebook psychology researcher who previously helped harvest millions of Facebook users’ profiles for the controversial Trump campaign contractor Cambridge Analytica has left the tech giant.
In December 2015, when it said it first learned that a pair of Cambridge University researchers had sold harvested user data to Cambridge Analytica, Facebook began an investigation and demanded the data be deleted. Around the same time, it also hired Joseph Chancellor, one of the two researchers.
On Wednesday, a Facebook spokesperson declined to explain when or why Chancellor had left the company, or to detail the results of any investigation into his work. “I can confirm that Joseph Chancellor is no longer employed by Facebook, and we wish him all the best,” she wrote in an email. Chancellor did not respond to a request for comment.
Facebook has faced tough questions from lawmakers about Chancellor, who Facebook said worked as a quantitative researcher on its User Experience Research team. After the scandal broke, he was put on leave by the company as it investigated his role, BuzzFeed reported in May. As recently as June, Facebook said it was still “investigating Mr. Chancellor’s prior work,” according to written responses to Senators’ questions.
His departure was mentioned by 60 Minutes on Sunday in an addendum to an April report about Aleksandr Kogan, Chancellor’s senior collaborator, who has become a focus of the controversy. In March, when the story broke, Facebook said it was suspending Kogan and Cambridge Analytica from the platform for violating its terms of service. Chancellor, however, still retained a Facebook account and, as of Wednesday, a company webpage. That page was no longer active on Thursday.
Kogan, a research associate in the university’s department of psychology who has himself previously collaborated with Facebook on research, officially led the data harvest. But he said he “did everything with” Chancellor. The two were co-founders and equal co-owners of Global Science Research, or GSR, the company that Cambridge Analytica hired to gather the user data and analyze it for psychological traits.
Joseph Chancellor’s Facebook Research page as it appeared on September 5.
The exact nature of Chancellor’s work at Facebook is not clear. The company employs many in-house social scientists to better understand the psychology of its users, drawing on an unprecedented amount of data. When he was hired at Facebook Research in November 2015, according to a now-deleted LinkedIn profile, Chancellor had been researching happiness and so-called pro-social behavior. At some point, he was assigned to virtual reality research, a hot topic for the Oculus-owning tech giant.
In April 2017, The Intercept was the first to report on Chancellor’s role at Facebook. At the time, the company said in a statement: “The work that he did previously has no bearing on the work that he does at Facebook.” As for Cambridge Analytica, Facebook told The Intercept then that it believed the harvested data had been deleted, and that an ongoing investigation had “not uncovered anything that suggests wrongdoing.”
Facebook officials have also not made clear when the company became aware of Chancellor’s role at GSR or that he had violated its terms of service. The company also has not said what it may have learned from him about the data harvest, either before or after the scandal erupted. Correspondence between Kogan and Facebook in 2014 does not mention Chancellor. But in an interview with BuzzFeed, Kogan said that Chancellor had informed the company of his role at GSR in 2015 while he was being interviewed for the Facebook position.
During testimony to the British Parliament in April, Facebook’s chief technology officer Mike Schroepfer initially said Facebook had only learned of Chancellor’s role in 2017. Later in the hearing, he revised his answer. “In the recruiting process, people hiring him probably saw a CV and may have known he was part of GSR,” said Schroepfer. “Is it possible that someone knew about this and the right other people in the organization didn’t know about it? That is possible.”
Damian Collins, the chair of the Parliamentary committee investigating fake news and data in elections, noted the difficulty of Facebook’s position when he questioned Kogan that month.
“When Facebook’s response from their deputy general counsel described your work as ‘a scam and a fraud’ … and they singled you out to say that ‘you’d lied to us and violated our platform policies,’ those remarks must apply to Joseph Chancellor as well,” Collins said.
When asked about the matter in May, a spokesperson for Facebook said that the company didn’t know about Chancellor’s role at GSR at the time he was hired.
“Joseph works in our VR team, and there are probably dozens of other Cambridge University grads that work here at Facebook,” the spokesperson said in a phone call. “When we found out about Kogan’s app back in 2015, and the fact that they might be sharing data with Cambridge Analytica, we didn’t know about Chancellor’s involvement, because Kogan identified himself as the sole developer of the app itself, and didn’t disclose [Chancellor’s] involvement.”
Kogan did not respond to requests for comment. He signed a non-disclosure agreement with Facebook in 2016 related to the issue, but Facebook has said it has since released him from that agreement. Chancellor hasn’t signed an NDA with Facebook related to the data harvest, but since he joined Facebook, he has been bound by an employee NDA.
While at Cambridge University, where Chancellor worked as a postdoctoral researcher in Kogan’s psychology lab, the two formed Global Science Research in May 2014. They built and distributed Facebook apps that offered users personality predictions, based on pioneering research using Facebook “likes” at the university’s Psychometrics Center. In exchange, users turned over their personal data and more limited information about their friends, including their “likes” and in some cases their private messages, if their settings allowed it.
Just weeks after founding the company, the researchers accepted an $800,000 contract from Cambridge Analytica to cover costs related to the apps, including paying users $1 to $2 to install them, and to provide the election firm with data on Americans and their psychometric scores. They managed to amass data on more than 87 million users, even though fewer than 1 percent of those people installed the apps to begin with.
Chancellor and Kogan were equal partners at GSR, but Chancellor resigned his directorship in September 2015, company records indicate. Chancellor began his job at Facebook that November, and The Guardian first reported on the data harvest the following month, which is when Facebook says it first learned of it.
Concerns about GSR’s work first emerged in 2014 at the university. Other researchers there had made a separate tranche of Facebook data available to Kogan and Chancellor “explicitly for academic research purposes only.” But Michal Kosinski, who was then deputy director of the university’s Psychometrics Center, told Fast Company last November that he couldn’t be sure that the center’s own data hadn’t been improperly used by GSR.
“Alex and Joe collected their own data,” Kosinski wrote in an email. “It is possible that they stole our data, but they also spent several hundred thousand [recruiting participants on Amazon’s] Mechanical Turk and data providers—enough to collect much more than what is available in our sample.”
Kogan, whose contract with Cambridge University expires this month, has insisted no academic data was used by GSR. He has also complained of having been “scapegoated” by Facebook, which he said knew what he and Chancellor were doing at the time and had effectively permitted “thousands” of other apps to similarly harvest data.
A university spokesperson said the school continues to investigate the issue and has contacted Facebook requesting “all relevant evidence in their possession.”
Cambridge Analytica, with backing from the billionaire Mercer family and Steve Bannon, helped run the Trump campaign’s social media effort alongside representatives from Facebook known as campaign “embeds.” SCL Group, Cambridge Analytica’s parent company, has said the Facebook data wasn’t used during its work for Trump, and some analysts have said the company’s analysis was largely useless. But its role in elections is now the subject of multiple investigations, and amid the scrutiny, executives have said they are closing the firm.
Facebook, which also faces investigations by the Federal Trade Commission and the Securities and Exchange Commission, has since overhauled the systems and policies that allowed Chancellor and Kogan to amass torrents of data, and it has also sought to stem false news, discriminatory ad targeting, and hateful content. But these problems, and Facebook’s responses to them, help illustrate some of the risks that users will continue to face online, and the apparent difficulty internet giants have in preventing abuse on their platforms.
Chancellor’s fate is now clearer, if not his precise role, but lawmakers, whose patience and technological understanding have already been stretched thin, have other questions for the social network. As it faces Europe’s General Data Protection Regulation and a new privacy law in California, Facebook and other tech companies are starting to push for a federal law that better meets their wishes. That effort is likely to be shaped by how Facebook answers those lingering questions, even ones as seemingly minuscule as those about a single researcher.
This story has been updated to reflect Facebook’s removal on Thursday of Chancellor’s company homepage.