'Dystopian' | Workplace surveillance more likely to target young, female & black staff

Worker surveillance is as old as work itself, but new technology is making it easier and cheaper than ever.

Technologies such as webcam, movement and email monitoring exploded in popularity during the pandemic. While there are legitimate uses for this monitoring, such as ensuring health and safety and meeting certain regulatory requirements, excessive surveillance can harm workers’ wellbeing, increase staff turnover and lead to counter-productive behaviours such as company sabotage.

And now, a new report from the Institute for Public Policy Research (IPPR) shows that these ‘dystopian’ worker surveillance techniques disproportionately affect young people, women and minority employees.

Workers in non-unionised, ‘low autonomy’ and low-skilled jobs are more likely to be surveilled at work, IPPR argues, and people aged 16-29 are the most likely to be in such jobs.

According to the IPPR, women are also at higher risk of surveillance, with non-unionised women 52% more likely to be monitored. Black workers are likewise 52% more likely to face surveillance, according to the findings.

Being surveilled at work has significant negative consequences for employees. One worker told IPPR that even going to the toilet made them “feel like someone’s watching you”, and said that, as a result, “I don’t stand up and I just stay in my seat the whole time, and you’re just really paranoid.”

A union representative said that surveillance was used selectively “as a form of retribution” to intimidate and discipline staff. This was confirmed by a worker who said their manager would often threaten: “We’ll get the cameras on you.”


However, there are also negative consequences for employers, including increased staff turnover, a greater likelihood of worker sabotage, and an over-emphasis on ‘measurable activity’ rather than genuine outcomes.

The pandemic saw a huge increase in worker surveillance, and this has shown little sign of abating.

Previous research shows that the number of online searches for ‘how to monitor employees working at home’ is 383% higher than before the pandemic, while searches for ‘best employee monitoring software’ are up 201%.

Common types of surveillance include:

  • Tracking employees’ physical movements, for example in warehouses, to see how much time they spend in the bathroom or talking to colleagues, and even monitoring their heart rates

  • Using webcam technology to monitor workers’ screen time, concentration and facial expressions to interpret mood, both in the office and at home

  • Vehicle monitoring and dash cameras to track workers by the minute in jobs that involve driving

Companies often use AI to analyse surveillance data automatically, but this can lead to unfair decisions due to algorithmic bias. The complex algorithms behind these decisions are not transparent and may benefit certain groups over others. The report points out that these algorithms may be used to assign tasks, create work schedules or determine how pay and promotions are awarded.

IPPR’s recommendations for policymakers, aimed at preventing a permanent power shift from workers to employers and reducing the risks of algorithmic bias, include:

  • Government should consider outlawing practices such as keystroke monitoring which are unlikely to ever be acceptable

  • Employers should share data collected with employees, to empower workers and their representatives in negotiations on workplace conditions

  • Government should strengthen protections against automated decision-making, including giving workers the right to a personalised explanation of how an algorithm reached a decision

  • Government should introduce a statutory right to disconnect so that every employee can ‘switch off’ from work contact outside contracted working hours

Henry Parkes, senior economist at IPPR and the report’s author, said: “Dystopian worker surveillance techniques have exploded in popularity since the pandemic, becoming normalised and seeping into an increasing number of industries. However, regulation to safeguard employees has not kept up with the pace of this.

“Young people, women and black workers are likely to be disproportionately affected negatively by worker surveillance and as it stands, the law is not keeping up with reality. This could have disastrous consequences for the mental and physical wellbeing of the workforce. The government must urgently review what is acceptable.”


