‘Big Brother is already watching’

Dirk Helbing, Professor of Computational Social Science at ETH Zurich, explains why he considers ethics to be a principle of success.

Dirk Helbing, Professor of Computational Social Science at ETH Zurich. Picture: Davide Caenaro

What developments will have the biggest impact on our lives over the next three to five years?

A lot of data is collected about us. This information can be used to manipulate people, and it can also be exploited by hackers. It can influence not only our buying behavior but also our political behavior and our opinions; in effect, people are being remotely controlled. This is also known as neuro-marketing. Such personalized marketing is more successful because it is tailored to the individual, and we are not consciously aware of 95 percent of the information we take in. The problem is that some of those who use this method do not do so in a democratically legitimate way.

What are the ethical challenges of big data?

We are increasingly being controlled from the outside and are losing individual control of our thoughts, feelings, actions and lives. We also face the problem of hacker attacks on companies and critical infrastructure; these attacks amount to an infiltration of society. It is becoming increasingly apparent that votes are being manipulated at the national level, and not only in and by Russia, as is often suggested.

How should society respond to this?

There is a great deal of helplessness. Another problem is ‘fake news’, which is now being advanced as an argument for introducing censorship laws. But that is not compatible with democracy. There are also efforts to set up government agencies that would decide what counts as fake news – ‘ministries of truth’, if you like. When you compare different national positions, such as those in the US and Europe, you can see that this is really about elites retaining their status as opinion leaders, even though citizens no longer trust them.

What do you make of the risks around data protection?

The risks are high, but at the moment you cannot effectively protect yourself. As soon as you use a browser, cookies record what you click on, and this information is then resold. All these links contain a great deal of information about us. Big Brother is already watching. We cannot protect ourselves against it, because if we refuse cookies we cannot use the services. It is untenable that we are profiled so shamelessly; it makes us vulnerable. Apparently, 300 Facebook clicks are enough for someone to know us better than our friends and partners do. I don’t think it can continue like this. Society is being manipulated. People are increasingly living in filter bubbles and losing the ability to understand people with a different outlook or from another culture.

What should be done?

We need a new approach, and this will hopefully come with the new data protection law. We need to integrate the values of our society into technological systems. Today’s information systems are incredibly powerful. Either they will destroy our society as we know it today, or the current technological approach must change. There is a great danger that human rights could be threatened: engineers have to learn to integrate and protect human values in their systems. This is why the Institute of Electrical and Electronics Engineers (IEEE) is currently developing a guide called ‘Ethically Aligned Design’.

Who determines these values?

There is currently a consultation process; people can get in contact and offer their views. Much of it is laid down in the constitution, and there are various attributes that characterize democracy, such as the separation of powers, pluralism, the protection of minorities, participation, and transparency. These characteristics must also be guaranteed in information systems; the question we must ask ourselves is how to implement them. Today’s artificial intelligence systems are still very far from such an implementation. They could do a lot of damage, for example through a Citizen Score that rates people’s behavior.

What consequences does this have for society?

The digital revolution will fundamentally change our society, and it is essential that we work actively to shape this transformation. This isn’t easy, because it’s not always clear exactly who is involved. There are also people who believe that from 2050 robots will take over our daily lives and humans will no longer be needed. That may be an extreme position, and one I don’t share, but it shows that we will be facing significant challenges.

It also raises the question of what is classified as censorship, and how to deal with censorship requests from companies.

The fact that algorithms can be discriminatory has now been recognized. Some companies are already embracing the idea of democracy in the design of their information platforms.

It is also a question of ethics.

Many people see ethics as an obstacle, but in fact it provides principles that help us build a peaceful and sustainable society: principles of success!
