
Comment | Tackling Unwanted Bias in Technology

Firas Khnaisser



Firas Khnaisser, Chair of DMA Scotland, examines ‘unwanted bias’, one of the biggest challenges facing the tech sector today.

One of the biggest challenges facing the tech sector today, and wider society as a whole, is how we deal with ‘unwanted bias’. When I use the term ‘unwanted bias’, I mean the favouring of one thing over another in a manner that is considered unfair.

As we increase automation and allow AI algorithms to run critical services, this will have a significant impact on individuals’ wellbeing and liberties.

AI algorithms are now widely used in sensitive social spheres (credit scoring, employment, education, policing, criminal justice, mental health). None of us wants to be inaccurately judged or unfairly favoured based on our gender, race or beliefs, but these unwanted biases exist.

We now know that black people are particularly disadvantaged by these algorithms due to inherent bias in the historical data used to build them. For example, predictive policing systems that flag ‘at risk’ neighbourhoods are driven by biased data – using information on societal habits and structures to inform decisions.

If these algorithms are not examined, historical injustices done to black communities will not only persist in the present, but will continue into the future.

It’s up to people and organisations in the private and public sphere to collectively tackle these injustices by waking up to their responsibilities. But this issue is complex and is not something that AI or the latest technology can solve.

A colleague of mine, Olivia Gambelin, Founder and CEO of Ethical Intelligence, believes the problem of systemic bias lies deeper, driven by the two key inputs that create it – data and people. She provides further clarity on this in our webinar, ‘Leading with Data – Tackling Bias’.

Humanising data

More often than not, when we talk about data, we’re actually talking about people – their values, lifestyle and behaviour. Yet perceptions of data couldn’t be further from being human.

This means we’re often scared, bored or mistrustful of data. The word ‘data’ has come to encompass so many things that we don’t know what people mean when they say it. Pundits tout it as the solution to everything, so it’s no wonder people are sceptical. It’s time we shifted the emphasis from data back to people.

AI won’t solve the issue of unwanted bias; it has to be addressed in the data and the values that guide it first. If a bias already exists then a system will only perpetuate it, as technology tends to amplify bias in societal habits.

To remove bias, we must first reflect on how systems are being deployed and start asking questions. Who have they been designed for? Who is benefitting? Who is being excluded? If a particular group is regularly over-represented, under-represented or misrepresented, then there will be unwanted bias.
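The questions above can be made concrete with a very simple audit. As an illustration only (the dataset, group labels and decision field here are entirely hypothetical), this sketch compares each group’s share of the data with its rate of favourable outcomes – a first check for over/under-representation and skewed results:

```python
# Hypothetical records of (group, favourable_decision) pairs,
# e.g. loan approvals broken down by a protected attribute.
from collections import Counter

records = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", False), ("B", True),
]

# Who is in the data at all? (representation per group)
counts = Counter(group for group, _ in records)
shares = {g: n / len(records) for g, n in counts.items()}

# Who receives favourable outcomes? (selection rate per group)
rates = {
    g: sum(1 for grp, ok in records if grp == g and ok) / n
    for g, n in counts.items()
}

print(shares)  # group A makes up 4 of 7 records, group B only 3
print(rates)   # group A is favoured 75% of the time, group B 33%
```

A large gap between groups’ selection rates is a common first warning sign of unwanted bias; in employment contexts, for instance, regulators have long used rules of thumb such as the ‘four-fifths rule’ to flag exactly this kind of disparity.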

Democratising data talent

If I gave two identical, clean, relevant datasets to two separate teams, would Team One derive the same insights from the data as Team Two? Would Team Two share the insights in the same way, allowing for improved decision-making or even operational efficiency?

Would Team One be able to act on its insights? And if it did, would it have the right people to make that happen?

All of those things are dependent on skill sets, outlooks and experience – attributes that people bring. Yet for too long we’ve had a very linear focus on those attributes – defining teams based on their skillsets and not necessarily on their outlook and experience.

To be truly people-centric when working with data (or should I say people?), you need diverse teams that reflect society at large. Only by democratising talent can we truly democratise data and make sure we’re doing the right thing by society.

I’m not underplaying the role of having good data. I’m trying to illustrate that the value doesn’t necessarily lie in the data itself but how you use it.

This reminds me of David McCandless’ reference to ‘data as the new soil’. I love that positioning because it’s active, not passive. The much more famous notion of data being ‘the new oil’ is passive and assumes that by finding this magical resource all your wishes will come true, which we all know couldn’t be further from the truth.

The soil analogy puts the onus back on people: we ultimately reap what we sow, and that’s very important.

Regulation will always play catch-up

There’s a lot of talk nowadays about ethics in AI. AI, and technology more broadly, is moving at such a fast pace that regulation cannot keep up. Regulation tells you what you have to do, whereas ethics helps you understand what you should do.

So, in the absence of regulation, we have to fall back on ethics for answers, which also translates to falling back on us, the people and the organisations we work for.

Regulation has a huge role in holding organisations to account, but alone the lawmakers cannot drive change and remove bias. We need people within organisations to make ethical decisions and to be empowered by organisations to be brave and question the norm.

For ideas to flow, people must be praised rather than penalised when asking questions and trying to do better. If organisations develop a top-down culture that rewards an inquisitive personality, they are setting themselves up for success.

Customers also have a key role to play. Collectively, they have the power to demand change; we have seen this permeate through society over the past decade with the proliferation of ethical values, products and services.

People have made concerted efforts to protect the environment and our general wellbeing, and this has been mirrored by organisations realigning their values accordingly.

We are all in this together

I’ve been the chair of DMA Scotland for two years now and we lead the DMA’s Value of Data campaign. We’re trying to help organisations find where the value is in their data, to sustain their investment in this asset and more importantly the people that support it.

I always get asked, ‘Have you been successful at attributing a value to data yet?’ My answer is simple. I’ve known the answer all along. You guessed it…it’s in the people.

So, what are your responsibilities? What are your values as an individual? And what are the values of the organisation that you work for? What change are you able to bring in your role? How does that affect your business and how does that reflect on your customers?

If we are to remove unwanted bias, we must start asking ourselves these questions. AI algorithms are representative of the people and data that drive them – regulation alone cannot fix this, so it is time to bring values and ethics to the forefront when developing technology.

Join the Debate: MarTech Virtual Summit

The evolution of marketing and the place of artificial intelligence in the sector will be a key area of discussion at the upcoming MarTech Virtual Summit on 24th February.

Hear from leading experts from across the marketing landscape and explore the crucial issues facing frontline practitioners.

Register your free place now at:

Firas Khnaisser

Chair of DMA Scotland

