3 in 4 people experience abuse on dating apps. How do we balance prevention with policing?

A 2022 survey by the Australian Institute of Criminology found three in four app users had experienced online abuse or harassment when using dating apps. This included image-based abuse and abusive or threatening messages. A further third had experienced in-person or off-app abuse from people they met on apps.

These figures set the scene for a national roundtable convened on Wednesday by Communications Minister Michelle Rowland and Social Services Minister Amanda Rishworth.

Experiences of abuse on apps are strongly gendered and reflect preexisting patterns of marginalisation. Those targeted are typically women and members of LGBTIQA+ communities, while perpetrators are commonly men. People with disabilities, Aboriginal and Torres Strait Islander people, and people from migrant backgrounds report being directly targeted based on their perceived differences.

What do these patterns tell us? That abuse on apps isn’t new or specific to digital technologies. It reflects longstanding trends in offline behaviour. Perpetrators simply exploit the possibilities dating apps offer. With this in mind, how might we begin to solve the problem of abuse on dating apps?

Trying to find solutions

Survivors of app-related abuse and violence say apps have been slow to act and have failed to offer meaningful responses. In the past, users who reported abusive behaviour have been met only with a chatbot. And blocking or reporting an abusive user doesn't automatically reduce in-app violence; it simply leaves the abuser free to target someone else.

Wednesday’s roundtable considered how app-makers can work better with law enforcement agencies to respond to serious and persistent offenders. Although no formal outcomes have been announced, it has been suggested that app users should provide 100 points of identification to verify their profiles.

But this proposal raises privacy concerns. It would create a database of the real-world identities of people in marginalised groups, including LGBTIQA+ communities. If these data were leaked, it could cause untold harm.




Prevention is key

Moreover, even if the profile verification process were bolstered, regulators could still respond only to the most serious cases of harm, and only after abuse has already occurred. That's why prevention is vital when it comes to abuse on dating apps. And this is where research into everyday patterns and understandings of app use adds value.

Often, abuse and harassment are fuelled by stereotypical beliefs about men having a “right” to sexual attention. They also play on widely held assumptions that women, queer people and other marginalised groups do not deserve equal levels of respect and care in all their sexual encounters and relationships – from lifelong partnerships to casual hookups.

In response, app-makers have engaged in PSA-style campaigns seeking to change the culture among their users. For example, Grindr has a long-running “Kindr” campaign that targets sexual racism and fatphobic abuse among the gay, bisexual and trans folk who use the platform.

Match Group is one of the largest dating app companies. It owns Tinder, Match.com, Meetic, OkCupid, Hinge and PlentyOfFish, among others.

Other apps have sought to build safety for women into the app itself. For instance, on Bumble only women are allowed to initiate a chat in a bid to prevent unwanted contact by men. Tinder also recently made its “Report” button more visible, and provided users safety advice in collaboration with WESNET.

Similarly, the Alannah & Madeline Foundation’s eSafety-funded “Crushed But Okay” intervention offers young men advice about responding to online rejection without becoming abusive. This content has been viewed and shared more than one million times on TikTok and Instagram.

In our research, app users told us they want education and guidance for antisocial users – not just policing. This could be achieved by apps collaborating with community support services, and advocating for a culture that challenges prevailing gender stereotypes.

Policy levers for change

Apps are widely used because they promote opportunities for conversation, personal connection and intimacy. But they are a for-profit enterprise, produced by multinational corporations that generate income by serving advertising and monetising users’ data.


Taking swift and effective action against app-based abuse is part of their social license to operate. We should consider stiff penalties for app-makers who violate that license.

The United Kingdom is about to pass legislation that would allow prison time for social media executives who knowingly expose children to harmful content. Similar penalties that make a dent in app-makers' bottom line may present more of an incentive to act.

In the age of widespread data breaches, app users already have good reason to mistrust demands to supply their personal identifying information. They will not necessarily feel safer if they are required to provide more data.

Our research indicates users want transparent, accountable and timely responses from app-makers when they report conduct that makes them feel unsafe or unwelcome. They want more than chatbot-style responses to reports of abusive conduct. At a platform policy level, this could be addressed by hiring more local staff who offer transparent, timely responses to complaints and concerns.

And while prevention is key, policing can still be an important part of the picture, particularly when abusive behaviour occurs after users have taken their conversation off the app itself. App-makers need to be responsive to police requests for access to data when this occurs. Many apps, including Tinder, already have clear policies regarding cooperation with law enforcement agencies.



The Conversation

Kath Albury receives funding from the Australian Research Council; FORTE, the Swedish Research Council for Health, Working Life and Welfare; and VicHealth. She has previously received funds from the eSafety Commission, leading the research and evaluation arm of the 'Crushed But Okay' project, in collaboration with the Alannah & Madeline Foundation.


Daniel Reeders is a PhD candidate at the Australian National University. They worked as a consultant on the safety in hookup apps research study mentioned in the article, led by Prof Kath Albury.