Data trail

Writing in the Huffington Post, Dr David Stillwell of Cambridge Judge Business School argues that companies should be more transparent about how they use our data.


Dr David Stillwell

Many of us have had the unfortunate experience of being refused credit, turned down for a mortgage, or even missing out on a job opportunity. Mortgage decisions, for example, used to be made on human intuition, based on a customer’s relationship with their bank manager. This was deemed a biased process, which led banks to rely entirely on data from application forms to make objective predictions.

“Now, not just banks, but employers, advertisers and retailers know that there is data out there about us that will help them to make even more accurate decisions,” David Stillwell, University Lecturer in Big Data Analytics & Quantitative Social Science, writes in the Huffington Post.

Every time we use a social network, search engine, web browser, cellphone, or credit card, we leave a trail of data behind us that can be collected, stored, shared, and used to make predictions.

His research has shown that even Facebook Likes can be used to accurately predict personality, IQ, political views, religious views, and sexuality. So, imagine the predictions that could be made from the whole of someone’s data trail.

David believes this can have benefits: it can help insurance companies, for example, offer lower premiums to drivers whose shopping habits suggest they are less impulsive and less prone to anger. But the downside is that companies are already making predictions about us while consumers remain unaware of how their data is being used – so someone could be denied a loan without ever knowing that the decision resulted from data analysis of their social media profile.

“Companies should tell us when they make predictions and explain them in an understandable way, so when you see an advertisement you should be able to find out why you’ve been targeted,” David writes.