It’s Not Always AI That Sifts Through Your Sensitive Info

It’s increasingly unremarkable for consumers to use artificial intelligence tools in their daily lives. Machine learning algorithms power your smart assistants, organize your vacation photos, and even analyze your health data. But human beings pick up the slack for those automated technologies more often than you might realize. And that means that real people can sometimes access user data that customers thought would only be seen by machines. In one particularly glaring case, that included detailed, potentially sensitive information culled from expense reports.

Hidden human workforces have long been a crucial component of creating and maintaining AI-driven services, but last week, business management company Expensify set off a firestorm with listings in the crowdsourced labor marketplace Mechanical Turk seeking people to review and transcribe customer receipts.

“I wonder if Expensify SmartScan users know MTurk workers enter their receipts. I’m looking at someone’s Uber receipt with their full name, pick up, and drop off addresses,” Rochelle LaPlante, a Mechanical Turk worker who is also a co-administrator of the MTurk Crowd forum, wrote on Twitter.

Expensify Reports

Expensify aims to ease the hassle of filing expense reports and other benefit submissions by automatically scanning user-submitted documents, and then extracting the data to fill out forms. This necessarily involves placing some trust in Expensify. Customers choose to expose information to the tool in exchange for an automated service. And Expensify says that, since 2012, it has used an internal team of “SmartScan agents” to review any submissions that its automated process can’t handle for whatever reason.

But from the time Expensify launched in 2009, up until 2012, it used third-party Mechanical Turk workers to help process the receipts, reimbursement forms, and benefit claims. This fall, the company returned to Turk in a limited capacity, according to a blog post from Expensify founder and CEO David Barrett.

Ironically, Expensify says it went back to Mechanical Turk to quietly test a new privacy feature called Private SmartScan. The feature lets Expensify clients set up a customized team of Mechanical Turk data reviewers if they want more control over who can see their data. The company started testing the feature on September 20, using only receipts and documents from Expensify employees. Then on November 15, it started processing 10 percent of human review cases from its free customers through Mechanical Turk (Expensify offers tiers of paid and free service).

Throughout that trial period, Expensify says that only its own SmartScan agents who had registered as Turkers were viewing the data. Then, on November 22, the company opened the testing to all Mechanical Turk workers. It pulled the tasks back the next day after the uproar. Expensify did not respond to a request from WIRED for further clarification about the incident.

“Once approved by Turk, then you enter our SmartScan system as a new agent,” Barrett wrote, describing the additional vetting Mechanical Turk workers were going to go through to do Expensify tasks. “At this point we don’t know anything about your quality, so we begin testing you with sample receipts … Failure to process them at high quality means you are banned from the system. Accordingly, the only way to continue to obtain access to more receipts is if you’ve correctly processed the historical receipts.”

That benchmark fails to ease the concerns of skeptics, though. “A worker having high accuracy and being approved to do more work for them doesn’t provide any kind of assurance that this worker is not a bad actor,” says LaPlante. “In fact, bad actors might intentionally pass this testing/keep high accuracy in order to have a continuous access to a stream of personal data off these receipts.”

Expensify argues that this type of attack wouldn’t be worth the time, and the company emphasizes that Mechanical Turk workers are bound by confidentiality clauses that Expensify claims are readily enforceable. The service’s Participation Agreement says that registered workers “may only use information or other data acquired from your use of the Site solely as necessary to use the Site and for no other purpose.”

Academic researchers have found, though, that other techniques that limit, segment, and systematically control what data individual workers can see during a task are more effective safeguards than confidentiality clauses in dense service agreements. And in practice, some research has even shown that data extraction attacks from crowdsourced labor systems can be effective.

In one chilling example, a team from Microsoft Research posted tasks on Mechanical Turk that involved fake user data. Then they set up a second task offering to pay Turkers to complete the first tasks, copy the data they saw, and submit it through the second task. Essentially, the researchers showed that they could pay Turkers to steal data, so long as the theft was presented as a legitimate task.

“Every product that uses AI also uses people,” says Jeffrey Bigham, a researcher at Carnegie Mellon University who studies crowdsourced work forces. “I wouldn’t even say it’s a backstop so much as a core part of the process. People definitely believe their technology is powered only by AI when it seems intelligent, and there’s every incentive for the companies to perpetuate that myth.”

Turks in the Machine

The Expensify incident is hardly unique. Similar services, like Ibotta and Receipt Hog, also use crowdsourced labor for receipt transcription, and they take different approaches to maintaining user privacy. “If you purchase any specific items that you don’t want to make visible to Receipt Hog, simply mark over them before taking pictures of the receipt so that they cannot be read. You may also simply not submit a receipt at any time for any reason—you are always in control of what information you share with us,” Receipt Hog says.

For consumers who don’t realize that a human might see their data, though, and envision a totally digital, internal AI system, it’s not necessarily obvious that the onus to protect data largely lies in the initial decision to share. And though companies do set up internal human review teams to process data in a more controlled environment than a public crowdsourcing platform, cost and the challenges of scaling these groups lead many companies to seek intermediaries like Mechanical Turk, or more tailored services like CloudFactory and CrowdFlower.

“Usually you don’t even get to see this,” Bigham says. “Companies won’t use Mechanical Turk for something like this, they’ll hire a more private crowd. They want flexible access to labor, there’s a huge cost to bringing everything truly in-house and they want access to places where labor is cheaper and they can scale pretty easily. But companies do not want their users to know the extent to which their information could be viewed by a crowdworker.”

In that sense, Expensify is less an outlier than it is a window into just how human so many automated—and sensitive—tasks really are. And a warning to think twice before trusting those services with information you hold dear.