Popular fertility apps are engaging in widespread misuse of data, including on sex, periods and pregnancy

New research reveals serious privacy flaws in fertility apps used by Australian consumers – emphasising the need for urgent reform of the Privacy Act.

Fertility apps provide a number of features. For instance, they may help users track their periods, identify a “fertile window” if they’re trying to conceive, track different stages and symptoms of pregnancy, and prepare for parenthood up until the baby’s birth.

These apps collect deeply sensitive data about consumers’ sex lives, health, emotional states and menstrual cycles. And many of them are intended for use by children as young as 13.

My report published today analysed the privacy policies, messages and settings of 12 of the most popular fertility apps used by Australian consumers (excluding apps that require a connection with a wearable device).

This analysis uncovered a number of concerning practices by these apps including:

confusing and misleading privacy messages
a lack of choice in how data are used
inadequate de-identification measures when data are shared with other organisations
retention of data for years even after a consumer stops using the app, exposing them to unnecessary risk from potential data breaches.


The data collected

The apps in this study collect intimate data from consumers, such as:

their pregnancy test results
when they have sex and whether they had an orgasm
whether they used a condom or “withdrawal” method
when they have their period
how their moods change (including anxiety, panic and depression)
and if they have health conditions such as polycystic ovary syndrome, endometriosis or uterine fibroids.

Some ask for unnecessary details, such as when a user smokes and drinks alcohol, their education level, whether they struggle to pay their bills, if they feel safe at home, and whether they have stable housing.

They also track which support groups you join, what you add to your “to-do list” or “questions for doctor”, and which articles you read. All of this creates a more detailed picture of your health, family situation and intentions.

Confusing or misleading privacy messages

Consumers should expect the clearest information about how such data are collected, used and disclosed. Yet we found some of the messaging is highly confusing or misleading.

Some apps say “we will never sell your data”. But the fine print of the privacy policy contains a term that allows them to sell all your data as part of the sale of the app or database to another company.

This possibility is not just theoretical. Of the 12 apps included in the study, one was previously taken over by a drug development company, and another two by a digital media company.

Other apps explain privacy settings using language that makes it almost impossible for a consumer to understand what they are choosing, or obscure the privacy settings by placing them numerous clicks and scrolls away from the home screen.

Keeping sensitive data for too long

The major data breaches of the past six months highlight the risks of companies holding onto personal data longer than necessary.

Breaches of highly sensitive information about health and sexual activities could lead to discrimination, exploitation, humiliation or blackmail.

Most of the apps we analysed keep user data for at least three years after the user quits the app – or seven years in the case of one brand. Some apps give no indication of when user data will be deleted.

Can’t count on ‘de-identification’

Some apps also give consumers no choice about whether their “de-identified” health data will be sold or transferred to other companies for research or business purposes. Others opt consumers in to these extra uses by default, putting the onus on users to opt out.

Moreover, some of these data are not truly de-identified. For example, removing your name and email address and replacing them with a unique number is not de-identification for legal purposes. Someone would only need to work out the link between your name and that number to connect your whole record with you.

When supposedly de-identified Medicare records were published in 2016, University of Melbourne researchers showed how just a few data points can connect a de-identified record to a unique individual.
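This failure mode is easy to sketch. The following minimal, hypothetical Python example (all IDs, emails and conditions are invented for illustration) shows how a persistent user ID left in a “de-identified” dataset lets anyone holding an auxiliary list of IDs and identities rejoin the records:

```python
# Hypothetical illustration: pseudonymisation alone is not de-identification.
# The "de-identified" records keep a persistent user ID in place of name/email.
deidentified_records = [
    {"user_id": "u-4821", "condition": "endometriosis", "cycle_length": 31},
    {"user_id": "u-9034", "condition": "PCOS", "cycle_length": 35},
]

# An auxiliary dataset (for example, a marketing list or a breached login
# table) links the same persistent IDs back to real identities.
auxiliary = {
    "u-4821": "alice@example.com",
    "u-9034": "bob@example.com",
}

# A simple join re-identifies every record.
reidentified = [
    {**record, "email": auxiliary[record["user_id"]]}
    for record in deidentified_records
    if record["user_id"] in auxiliary
]

print(reidentified[0]["email"])  # alice@example.com
```

Genuine de-identification requires removing or breaking that linkage, not just swapping a name for a stable identifier.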


Need for reform

This research highlights the unfair and unsafe data practices consumers are subjected to when they use fertility apps. And these findings reinforce the need for Australia’s privacy laws to be updated.

We need improvements in what data are covered by the Privacy Act, what choices consumers can make about their data, what data uses are prohibited, and what security systems companies must have in place.

The government is seeking submissions on potential privacy law reforms until March 31.

In the meantime, if you’re using a fertility app, there are some steps you can take to help reduce some of the privacy risks:

when launching the app for the first time, don’t agree to tracking of your data; on iPhone, you can also limit ad tracking in the device settings
don’t log in via a social media account
don’t answer questions or add data you don’t need to for your own purposes
don’t share your Apple Health or FitBit data
if the app provides privacy choices, opt out of tracking and having your data sold or used for research, and delete your data when you stop using the app
bear in mind that every article you read (and how long you spend on it), and every group you join and comment you make there, may be added to a profile about you.


The Conversation

Katharine Kemp receives funding from The Allens Hub for Technology, Law and Innovation. She is a Member of the Advisory Board of the Future of Finance Initiative in India, and the Australian Privacy Foundation.