Personalisation vs Discrimination

The trail of data each of us leaves is an occasionally confronting representation of our online history and, indeed, our inner psyche.

While scrolling through Facebook this morning, an ad popped up for European Walking Tours for Single Females Over 65. More than a little bemused by this unwelcome snapshot of my future, I wondered what kind of algorithm had made me the target of this message, and a waste of that company’s advertising spend.

Guess What?

Algorithms cast a taut, strategic lasso around our target audiences and allow us to implement the most effective possible solutions to draw them in.

But when does personalisation tip over into discrimination? Concerns have been raised that algorithms designed to match users with “interesting content” are accidentally reflecting, and even amplifying, historical discrimination. A new study by US researchers even suggests Google AdWords is more likely to show ads for high-income jobs to men than to women.

Instead of being shown content about postgraduate study or new tablets, as someone who has ticked the box marked female I’m matched with ads for 30-day tea-toxes or said spinster-friendly walking tours. Similarly, a quick survey at Lexer HQ reveals a gay colleague is bombarded with ads for men’s underwear and protein powder, a single middle-aged woman gets info on IVF and anti-ageing treatments, and even a man of 40 is regularly shown the merits of assisted, walk-in baths.

There’s a fine line between personalisation and discrimination, and Lexer are very conscious of how the data we provide our clients might be used. We are deeply involved in the strategic and commercial use of our data, so that any misuse or bias is recognised early and reviewed appropriately.