Clearly when data is held by a third party, a different set of risks exists – from concerns about foreign government access to the use of the data by the third party for other purposes. Patients understand their information will be held by the NHS, but do they expect it to end up on a server in California, run by companies whose business model is built on knowing more about people? That is perhaps what is most troubling about the revelation that PA Consulting uploaded the entire NHS England hospital patient database to Google.
The point was highlighted by Sarah Wollaston MP, a member of the Health Select Committee, who tweeted: “So HES [hospital episode statistics] data uploaded to ‘google’s immense army of servers’, who consented to that?”
We have warned for many months that the new NHS database is deeply flawed. Not only does it centralise data into what cyber-security experts call a ‘honeypot’, it also puts patient privacy at risk, both from abuse and from later re-identification.
We’ve highlighted how patients still don’t know what is going on, and remain convinced that a national leaflet drop is simply inadequate to ensure people know about a fundamental change to how their medical records are used.
However, it seems the NHS is equally confused about the risks. Compare and contrast:
February 2, 2013: Tim Kelsey, national director for patients and information at the NHS Commissioning Board, said that data sharing was vital for improving the NHS: “This does not put patient confidentiality at any risk. Data quality in the NHS needs to improve: it is no longer acceptable that at a given moment no one can be sure exactly how many patients are currently receiving chemotherapy, for example.”
And today: Mark Davies, the centre’s public assurance director, told the Guardian there was a “small risk” certain patients could be “re-identified” because insurers, pharmaceutical groups and other health sector companies had their own medical data that could be matched against the “pseudonymised” records. “You may be able to identify people if you had a lot of data. It depends on how people will use the data once they have it. But I think it is a small, theoretical risk,” he said.
So is there risk or not?
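The “small, theoretical risk” Davies describes is what researchers call a linkage attack: pseudonymisation removes names, but quasi-identifiers such as birth year and postcode district remain, and anyone holding a named dataset with the same fields can join the two. The sketch below illustrates the idea with invented data – the field names, values, and the `reidentify` helper are ours, not drawn from real HES records:

```python
# Sketch of a linkage (re-identification) attack on pseudonymised records.
# All data below is invented for illustration; real HES records differ.

# Pseudonymised hospital records: the name is replaced by an opaque ID,
# but quasi-identifiers (birth year, postcode district) remain.
hospital_records = [
    {"pseudo_id": "a91f", "birth_year": 1948, "postcode": "TQ9", "treatment": "chemotherapy"},
    {"pseudo_id": "c22b", "birth_year": 1985, "postcode": "SW1", "treatment": "physiotherapy"},
]

# A hypothetical insurer's own customer file, with names attached.
insurer_customers = [
    {"name": "Alice Example", "birth_year": 1948, "postcode": "TQ9"},
    {"name": "Bob Example",   "birth_year": 1990, "postcode": "SW1"},
]

def reidentify(pseudonymised, identified, keys=("birth_year", "postcode")):
    """Match pseudonymised rows to named rows on shared quasi-identifiers."""
    matches = []
    for p in pseudonymised:
        candidates = [c for c in identified
                      if all(c[k] == p[k] for k in keys)]
        if len(candidates) == 1:  # a unique match re-identifies the patient
            matches.append((candidates[0]["name"], p["treatment"]))
    return matches

print(reidentify(hospital_records, insurer_customers))
# A unique birth-year/postcode combination links "Alice" to her treatment.
```

The point is that the attack needs no cryptography and no breach: the more attributes the two datasets share, and the rarer a patient’s combination of attributes, the more likely the match is unique – which is why the risk grows as data holders accumulate data, exactly as Davies concedes.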
If you would like to opt-out, you can use the form here. Let us know if you have any problems or feedback from your GP.
We are barely into 2014, yet we are already faced with another serious data protection breach involving a public sector computer. On this occasion, a police officer has been charged with stealing thousands of accident victims’ details from her police force’s computer and selling them to law firms.
This case alone highlights the serious need for our courts to issue much tougher penalties for unlawfully obtaining or disclosing personal information; otherwise these cases will continue to occur.
A court has heard that Sugra Hanif accessed Thames Valley Police’s command and control computer to note down the personal details of members of the public involved in road traffic accidents, including the unique reference number each incident was given.
As the new school term gets underway, now is the time for parents to check if their children are among the hundreds of thousands of pupils who are using biometric technology.
Today we have published our latest report looking at the use of biometric technology in secondary schools and academies which, based on data from the 2012-13 academic year, makes clear that fingerprints were taken from more than one million pupils.
You can read the report here.
Our research, gathered from Freedom of Information Requests to more than 3,000 schools, shows that at the start of the academic year 2012-13:
- An estimated 40% of schools in England are using biometric technology
- An estimated 31% of schools did not consult parents before enrolling children into a biometric system prior to the Protection of Freedoms Act 2012 becoming law
December 3, 2013
Today, the editor of the Guardian gives evidence to the Home Affairs select committee, as part of the committee’s work on counter terrorism.
Perhaps that might prompt the committee to question why Parliament learned of much of GCHQ’s activity from the newspaper, rather than from Ministers. Indeed, on current evidence it seems that will remain the case – as the Lords found on 20 November, when they were told they could not even be informed which law authorised Project Tempora.
Lord Richard: My Lords, of course the Minister cannot go into details on these very sensitive matters. We all accept that. However, for the life of me, I do not see why she cannot answer a straightforward Question about which Minister authorised the project and why the existence of the project was not disclosed to the Joint Committee on the Draft Communications Data Bill. These are not sensitive issues. They are pure matters of fact, surely capable of being answered.
Baroness Warsi: It is interesting that the noble Lord interprets it in that way but I think he would also accept that it would be inappropriate for me to comment on intelligence matters, which includes any comments on the project.
We have been repeatedly assured that it would be unacceptable to build a central database of communications – both by those in Government and by those seeking to be.
Tesco’s new scanner sounds harmless enough – a camera that just works out whether you’re male or female, and roughly how old you are.
The advertisements shown on the screen change accordingly, and I’m sure you’ll quickly see cases of men with long hair being mistaken for women, to much hilarity from their friends.
There are two fundamental problems here – not least the fact that the only way you can ensure your face is not scanned is to not go into the shop at all.
Firstly, should we really be increasing the amount of surveillance we’re under so some companies can sell more advertising?
Secondly, the technology isn’t going to stay the same and be used in the same way.
The Care Quality Commission (CQC) has announced plans to install hidden cameras and ‘mystery shoppers’ in care homes in a bid to increase the regulation of social care. Care homes and social care premises are home to some of society’s most vulnerable people. To subject them to covert surveillance where there is no reasonable cause for suspicion would be an attack on both their privacy and their dignity.
In a signposting document which has been published today ahead of a full public consultation, Andrea Sutcliffe, Britain’s first Chief Inspector of Adult Social Care, said: “We would like to have an open conversation with people about the use of mystery shoppers and hidden cameras, and whether they would contribute to promoting a culture of safety and quality, while respecting people’s rights to privacy and dignity.”
The National Crime Agency (NCA) has been launched today by the Home Office with announcements that it will have access to some of the most high-tech surveillance tools available but will also promote an environment of transparency and openness. Yet, with an exemption from the Freedom of Information Act and regulation under outdated legislation, how accountable will the Agency be?
The NCA has billed itself as being more open and transparent than its predecessors, yet it won’t be subject to the Freedom of Information Act (FOI) on the basis that this would “jeopardise its operational effectiveness and ultimately result in lower levels of protection for the public.” Considering the Agency will have highly intrusive surveillance techniques at its disposal, it is remarkable that it is allowed to use them behind a cloak of secrecy. FOI would not prevent intelligence sharing or the protection of sources of information, nor would it expose police tactics and technology. Indeed, every police force in the country and the Association of Chief Police Officers manage to operate with FOI applying to them.
Responding to the statement from the French data protection authority (CNIL):
“The concern is that regulators do not have the tools to bring multinational companies to the table, let alone punish them. Trivial financial penalties are at risk of being seen as the cost of doing business, rather than a meaningful sanction. Whether through consumer notices, restrictions on public sector contracts or treating each user affected as an individual breach, regulators need to think long and hard about how they resolve this situation to ensure users’ privacy is respected and the law upheld.
“This case is a critical test of the legal framework that protects citizens’ privacy, and regulators must ensure that the outcome does not set a precedent for other companies to exploit our data with impunity.”
If CCTV cameras are about public protection, why are they bringing in £300m in revenue from parking enforcement?
Firstly – and this goes to the heart of what Big Brother Watch has been campaigning on – the public are never, ever told that this is part of the deal when they accept greater CCTV surveillance. The rhetoric is always about violent crime, anti-social behaviour and catching criminals. Would the public be as willing to accept yet more cameras if they had the full facts about how cameras are used?
If anyone can find us an example of a council justifying its need for greater CCTV on parking problems, we’d love to hear about it.