Salon
Politics
Rae Hodge

A big win on abortion privacy

In a significant victory for the privacy of people seeking abortion in the U.S., the Federal Trade Commission has issued a groundbreaking ban on the sale of individuals' medical location data. The FTC settlement announced on Tuesday is the first of its kind and concludes the agency’s investigation into the smartphone surveillance company Outlogic (formerly X-Mode Social), which bills itself as the “second largest U.S. data company” and tracks people through more than 100 smartphone apps. Outlogic is now prohibited from selling or sharing any person’s private location data that could be used to track someone to a health care facility, domestic abuse shelter or place of worship. 

The FTC accepted the consent agreement on a 3-0 vote. In a joint statement, FTC chair Lina Khan pointed to the U.S. Supreme Court ruling that constitutional protections against unchecked government surveillance extend to geolocation tracking “even when the data is originally collected by private companies.”

“The explosion of business models that monetize people’s personal information has resulted in routine trafficking and marketing of Americans’ location data. As the FTC has stated, openly selling a person’s location data to the highest bidder can expose people to harassment, stigma, discrimination or even physical violence,” Khan said Tuesday.

Under its former name, X-Mode, Outlogic is also the company that was exposed in a 2020 investigation by Motherboard’s Joseph Cox for selling users’ personal data to the U.S. military. Notably, the company’s precise location-tracking technology was deployed through a Muslim prayer app that had been installed 98 million times. X-Mode CEO Joshua Anton told CNN at the time that the company was using its 400 or so apps to track 25 million devices inside the U.S. every month, along with 40 million devices elsewhere.

Although Outlogic didn’t directly sell abortion-specific data, the FTC complaint noted how easy it would be to hunt down an abortion seeker with the highly specific location data the company sells.

“The location data could be used to track consumers who have visited women’s reproductive health clinics and as a result, may have had or contemplated sensitive medical procedures such as an abortion or in vitro fertilization. Using the data X-Mode has made available, it is possible for third parties to target consumers visiting such healthcare facilities and trace that mobile device to a single-family residence,” the commission said in its filing.

Khan’s remarks are no exaggeration. One of X-Mode/Outlogic’s previous clients is HYAS, a private intelligence firm that has boasted about tracking people “down to their physical doorstep” using location data bought from X-Mode.

“X-Mode must also delete all the sensitive location data it unlawfully collected, as well as any model or algorithm derived from this unlawfully collected location data,” Khan said in a series of tweets, adding that the agency “will continue to use all our tools to safeguard Americans' sensitive data from unchecked corporate surveillance.”

https://twitter.com/linakhanFTC/status/1744776434696982601 

As reported by Politico, an Outlogic spokesperson contended that the company had not misused user location data, saying it would comply with the FTC without making major changes. 

“Adherence to the FTC’s newly introduced policy will be ensured by implementing additional technical processes and will not require any significant changes to businesses or products,” the company said. 

The FTC order is also likely to have significant implications for advertising schemes on social media platforms, which often use users’ geolocation data to serve up strategically targeted ads. The pandemic-driven rise in virtual medical care, along with the spread of unregulated video-conferencing and health-based apps, has also widened the doorway for patient privacy breaches.

Meanwhile, data brokers are buying and selling patient mental health data with nearly zero oversight, putting user privacy at greater risk as AI-enabled tools outpace federal regulation and drive new and disturbing forms of online harassment. In the vacuum of federal regulation, websites and data brokers — especially social media companies — now face an increasingly complex obstacle course of state data privacy laws. Establishing protections for minors has become a rare bipartisan theme in many legislative debates on tech, sparked in large part by the ongoing teen mental health epidemic.

Forty-one states are currently suing Facebook and Instagram parent company Meta, alleging the company lured teens with addictive and harmful content, and then profited off data it collected from them, using geolocation and other sensitive personal data to serve targeted ads. In November, the social media giant filed a lawsuit against the FTC in a bid to dodge the $5 billion fine it has owed the government since 2019. 

Ahead of the FTC settlement on Tuesday morning, Meta debuted its latest effort to get ahead of regulation, rolling out some modest content restrictions for teens on its apps. Children’s advocacy group Fairplay called Meta’s last-minute move “desperate” and urged Congress to take action.

“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram. If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?” Fairplay executive director Josh Golin said in a statement.

“The FTC and state AGs are doing their part to hold Meta accountable for their abuses of young people,” he continued. “Now it’s Congress’s turn: toss Meta’s press release in the recycling bin and pass the Kids Online Safety Act and COPPA 2.0.”

Facebook and Instagram aren’t the only platforms catching heat for how they collect and use kids’ data, however. TikTok, used by up to two-thirds of American teenagers, also ran afoul of ethics standards last year when its algorithm was caught serving extremist and violent content to 13-year-olds. 

The collection of kids’ geolocation data looks even more alarming in light of government contractors like Palantir, which helped Homeland Security and Customs and Border Protection agents track and surveil immigrant families using a combination of biometric and non-public databases.

The throughline connecting the FTC’s X-Mode settlement and the weaponization of social media surveillance wasn’t lost on Sen. Ron Wyden, D-Ore., who lauded the FTC and echoed Fairplay’s calls for legislative action. 

"I commend the FTC for taking tough action to hold this shady location data broker responsible for its sale of Americans' location data,” Wyden said in a statement

“In 2020, I discovered that the company had sold Americans’ location data to U.S. military customers through defense contractors. While the FTC’s action is encouraging, the agency should not have to play data broker Whack-a-Mole. Congress needs to pass tough privacy legislation to protect Americans’ personal information and prevent government agencies from going around the courts by buying our data from data brokers.”
