Radio Sputnik discussed whether users should be more aware of the possible dangers of sharing personal data with Paul Levy, author of the book Digital Inferno.
Some people may have had mental health issues and are showing how they've changed, trying to be educational; other people have been doing it for fun. But what she was pointing at is what the images could be used for: one of the possibilities is that if it's two images side by side, very conveniently, we're doing the sourcing of those images ourselves, and they can be used with algorithms for facial recognition.
It doesn't mean they are being used that way, but with trust in social media networks already low after some of the stories of the past, like Cambridge Analytica, she's suggesting we should just be a little bit savvier and more mindful before we rush to share our data online.
So I think it's been a good wake-up call, but I'm not quite sure how many people will wake up.
Sputnik: It's funny how we are basically doing their work for them. I've seen so many people desperately looking for 10-year-old photos of themselves, and if they can't find them on Facebook they turn to other social media; they're actually putting in a lot of effort to get those photos out there. But one wonders how an algorithm of this sort could actually be used?
Paul Levy: I think that's what's being revealed through this. Some of the algorithms are not yet able to do what some of these organisations would like them to do.
It's not that easy with a hundred, a thousand or even a few thousand photos you've posted; there may be timestamps, but you may have uploaded a picture on one day that actually shows you five years earlier. So this challenge very simply gets you to organise that data yourself, and it could then be used in lots of ways.
It was suggested that in facial recognition it could be used to show how you've changed, or whether you're posting truthful images of yourself, and so it reveals a lot about you that algorithms can't easily pick up from the chaos of thousands of images. So at times they're getting us to do the work, you know, when the algorithms they'd like aren't quite there yet or efficient enough, and that worries me.
Sputnik: Surely it's just a matter of time before algorithms of this kind are out there and being used; it's kind of inevitable, isn't it?
Paul Levy: It is, but some of it is not necessarily bad. We're going through a phase of evolution in this field, a social evolution in some ways, where maybe all the scandals will mean we come to a better place, to better governance. There is pushback going on here, and there are court cases.
My own view, as you've probably heard before, is a bit pessimistic, simply because as an academic I think pessimism is never a bad thing; a bit of caution enables us to stay awake and scrutinise things. And at the moment corporations are doing the innovation, and our ability to work out the impact is running slower than the speed of change, and that's damaging and dangerous if we don't stay awake.
Sputnik: Is it necessary to obtain consent before using private or public photos of users on social networks for any sort of research?
Paul Levy: I think the problem is the whole nature of the free model, which depends on this happening. And so many people, if they were surveyed and asked "do you like these platforms?", would still want them to exist, even if the platforms couldn't operate so easily once we put all those rules in place.
There's nothing really wrong with it per se if we are empowered and savvy, but we simply are not, and because of that you might think governments and organisations have a duty of care, as in any other industry, to look after us.
Sputnik: Of course, every time we talk about social networks we ask the same question: how can users protect the data they post on social networks? Is there a way to do this? Some say it's simply safer to avoid posting personal information, but I don't think that's going to happen; it's become a part of people's lives, hasn't it?
Paul Levy: The assumption is that you're sharing, the assumption is transparency, and then you have certain switches to turn things off. But the Internet has a long memory: it finds it much harder to forget than to keep remembering, so people are shocked when they google themselves and find out what's there.
For one thing, having some of your stuff removed is problematic, as we've seen with Google. You're giving a lot of permission from the moment you press, as I always joke, the submit button; submit means surrender.
Sputnik: Do you think anything is going to change in the future, or in the next few years are we just going to carry on with what's happening on social networks right now, with people being pretty oblivious? Or will something happen, however unlikely, that will make people more aware?
Paul Levy: Certainly real steps have been taken, and some messaging apps do this, towards properly encrypting your data. And certainly there is the possibility, put forward by some of the founders of the Internet, that the Internet will need to become premium: for those people who want pure privacy, that is a product, and something they will have to pay for.
A closed-doors Internet for the average person like us isn't really being offered properly yet, but it may very well come with innovation.
Views and opinions expressed in the article are those of Paul Levy and do not necessarily reflect those of Sputnik.