An enemy used to be easy to distinguish from an ally. Now some of our biggest enemies are those who collect our data points, harvesting them in order to warp our political opinions and manipulate election results. During the summer school, I learnt more about this from David Carroll – the investigator from Netflix's The Great Hack.
You may remember the overriding question that Professor Carroll asked in the Netflix documentary: do I have access to my data? In the documentary, he did not gain access to it. A year later, he did. In his lecture, he presented us with some of the stats that Cambridge Analytica had held on him. The file ranked the probability of his future actions. Most were general rankings about his societal position in relation to which direction he would vote. There was a ranking on how likely Professor Carroll was to buy a gun, and even on whether he posed a threat to national security. All of these things were taken into account for the ads that US voters were fed through their timelines during the Trump campaign. If a voter was deemed easy to manipulate, they were worth spending advertising money on.
With this in mind, my group and I set to work on our final presentation. Hana, Eric, Joaquin and I met in our own time on Zoom to do this. We decided to design and pitch an educational project to equip youngsters with knowledge of what tech companies do with their data. We chose an educational project over a public campaign because it would be easier to implement worldwide; Eric reminded us that a campaign would be more challenging in China because it would need government approval.
Our proposal was to introduce safeguarding for children aged 12, in the form of education. Having examined a Finnish education model during the course, we understood that Finland has a unique way of teaching school pupils how to use technology. However, no country that we are aware of prepares children for the manipulation they will be subjected to once they turn 13 – the age at which it is legal (in most countries) to have a Facebook or YouTube account. In our presentation, Hana explained what tech education each member of our group had received in our respective countries: China, Argentina, the Czech Republic and the UK.
Eric demonstrated how our data is turned into models that enable companies like Cambridge Analytica to predict our behaviour and our choices. Data is collected on us every time we use our devices, even when they're in our pockets. Our favourite food at our favourite restaurant. The number of steps we take every day. Where we live and where we work. How we commute, and what we listen to on the way. Our favourite genres, and subgenres. With that information, it's easy to form a whole picture of somebody. Seeing a library suited to our preferences on Spotify or Netflix relies on those services receiving our data.
And what's the problem with this? Tailored content is convenient, and there's certainly a purpose to it. But companies aren't doing this to be kind; it happens so that we become addicted. If Netflix didn't appeal to us, we'd cancel our subscriptions. YouTube needs us to spend longer watching content so that we see more ads. Facebook needs us to tell it more about ourselves so that it can target ads at us too. Data models can even be used to predict spikes and dips in our mood. This not only enables an advert to be shown to a susceptible person; it allows the advert to be shown at the right time – when they are most receptive to its content.
In our presentation, I explained the dangers of this using the 2019 YouTube scandal as a case study. YouTube (owned by Google) had illegally collected the data of children under 13. It was fined by the Federal Trade Commission, and the case has long since been filed into the dodgy deals of data privacy history. However, collecting that data becomes perfectly legal as soon as a child turns 13. And unlike the bombardment of ads that a kid watches on a television network, YouTube's ads are tailored.
I created an activity to help understand the vast scale at which data is collected. Try visiting a few websites in your internet browser. Each time you visit a new one, take a screenshot of the data privacy agreement. For me, it was difficult to stop myself from agreeing to them. Instinctively, my finger clicks 'accept' to clear my screen of the irritating popup box. And in one tap of my finger, I have granted a website access to observe my every action.
We think of YouTube as being free. In fact, it comes at a very high price. In the age of Big Data, we are the commodity: the knowledge that tech companies have on us is being sold.
Finally, Joaquin presented our solution to this problem: alert children to data collection. We want to encourage children to continue using technology, but to be mindful of the ways in which these sites can 'use' their users. Streamlining what we're exposed to narrows our options; arguably, it takes away our free will.
The internet is a force to be reckoned with, and it is larger than any of us. The irony of these blog posts proves it: in all three, my aim has been to highlight an international experience at a time when travel is impossible. And it was all made possible by the internet and technology.
The internet is all-encompassing, omnipotent and ubiquitous. We can’t resist the technological world on our own, but we can be vigilant. Collectively, we can counteract the internet being used in a harmful way.
Ultimately it is, and should remain, a tool. The summer school with Cologne University has taught me to incorporate it rather than run away from it. I am not scared; I am excited. I must use this tool so that it benefits me. I must not let it use me.
On that note, I will draw the curtains and close my screen…