Are you interested in the collection and analysis of ‘big data’? Are you concerned about the impact the collection of data could have on protecting human rights?
Are you affronted and seriously worried about the amount of data that the government (or ‘the state’) holds about you? Or are you, like me, much more concerned by the amount of data that your bank and supermarket have about you, your habits, and your life?
Are you horrified by the experience of reading your online newspaper or magazine, only to find yourself being targeted with adverts relating to other internet searches you have recently completed?
Do you believe, as I do, that if we are serious about controlling migration and about ensuring that only those entitled to do so get access to our public services and benefits, it is inevitable that the UK (like nearly every other country) will end up with each of us having some form of ID entitlement card? In fact, most of us have one already; it’s called a passport.
More recently, we have all started to gain an inkling about the extent to which private companies (and especially the big technology companies) are collecting, aggregating, analysing and selling data about us. This has not just been to other advertisers but, as recent investigations – some criminal, some by investigative journalists, some by democratic institutions – have revealed, our data has been sold to political agents, wealthy ideological obsessives and, almost certainly, to foreign governments who don’t have our best interests at heart.
Since the Facebook/Cambridge Analytica scandal broke last year, the role of the big tech companies such as Facebook, Google and Apple in protecting the right to privacy has increasingly come under scrutiny, with growing pressure for regulation. The increasingly rapid development of Artificial Intelligence presents some of the most challenging ethical and social questions – in both the public and private sectors.
Now, the all-party Joint Committee on Human Rights – it has members from both the House of Commons and the House of Lords – has embarked on an inquiry into whether new safeguards to regulate the collection, use, tracking, retention and disclosure of personal data by private companies are needed in the new digital environment to protect human rights.
The key human right at risk is the right to private and family life (Article 8 ECHR), but freedom of expression (Article 10 ECHR), freedom of association (Article 11 ECHR), and non-discrimination (Article 14 ECHR) are also at risk.
The Committee is seeking written evidence on the threats posed to human rights by the collection, use and storage of personal data by private companies, and examples of where those rights have been breached. In particular, it is interested in the following questions:
- Are some uses of data by private companies so intrusive that states would be failing in their duty to protect human rights if they did not intervene?
- If so, what uses are too intrusive, and what rights are potentially at issue?
- Are consumers and individuals aware of how their data is being used, and do they have sufficient real choice to consent to this?
- What regulation is necessary and proportionate to protect individual rights without interfering unduly with freedom to use and develop new technology?
- If action is needed, how much can be done at national level, and how much needs international cooperation?
- To what extent do international human rights standards, such as the UN Guiding Principles on Business and Human Rights, have a role to play in preventing private companies from breaching individuals’ rights to privacy?
Written submissions should be no more than 3,000 words, and the deadline for submissions is Thursday 31 January 2019.
There is an online submission form at: