First published: 30th October 2019
I was honoured to be invited to speak as a foreign expert at the International Anti-Virus Conference 2019 (IAVC 2019) at the Tianjin Meijiang Convention Center, Tianjin, China in September. The theme of IAVC 2019 was "Cope with New Challenges of Anti-virus and Promote Personal Privacy Protection" and I started to prepare my speech around hot topics in information security and personal data privacy, with particular reference to my home city, Hong Kong. However, I was advised that certain parts of my speech were a sensitive topic in China and asked to remove them. I presented my speech without the sensitive topic, but I believe that the only way we can successfully face difficult situations is with the best available information, discussed rationally. Hong Kong is a part of China that, under the Basic Law, enjoys the freedom of speech, of the press and of publication, therefore I present here the topic that was deemed too sensitive for Tianjin. This is based on an early draft of my speech for IAVC 2019, edited for the medium and with the addition of relevant later events:
Personal privacy has had legal protection in Hong Kong since 1996, when the Personal Data (Privacy) Ordinance came into force, and the Office of the Privacy Commissioner for Personal Data is an independent statutory body set up to oversee enforcement. Each year in May it holds Privacy Awareness Week, with various educational events for all sectors of society.
However, there have been serious data breaches.
In October 2018, Cathay Pacific reported a data breach that exposed the personal data of 9.4 million customers, including me. Cathay Pacific offered the victims a free identity monitoring service, provided by Experian, a company that suffered its own data breach between 2013 and 2015. In November 2018, Cathay Pacific faced a panel of Hong Kong's Legislative Council (LegCo) to answer questions about the breach. At that meeting, they reported that the attacker used previously unknown malware and utilities in the attack, which Cathay's up-to-date anti-virus system did not detect. Cathay had detection and monitoring systems in place to detect APTs, and in March 2018 they also implemented an advanced endpoint detection and response system.
I think this data breach shows us that personal data is an attractive target for criminals. The data potentially has multiple uses, not all of them criminal. The most sensitive information, such as passport or credit card numbers, can be used in fraud, but other information can be used by marketers to target ads. Even a list of valid email addresses can be sold multiple times, and where is the proof that it came from a data breach? A company will have strong protection for its financial systems and research and development, and access will be limited to a small team. Personal data will have "ordinary" protection, and be accessible to multiple teams: customer service, marketing and so on. If you steal top secret plans, you will need to find a dishonest competitor to sell them to, but personal data can be sold in many more ways.
Medical data is very sensitive. In June this year, it was revealed that patient data in Hong Kong's public hospitals was insecure. The problem lay in a program called AEIS, the Accident and Emergency Department Clinical Information System, used in Accident and Emergency departments. A user could bypass the normal login via a shortcut on the computer's start menu: a black window popped up and launched AEIS. From there, patients' details could be examined and printed.
Privacy Commissioner for Personal Data Stephen Wong Kai-yi said on Monday his office had launched a compliance check to determine whether a login was required to gain access to patients' personal information at emergency wards in hospitals under the Authority.
Dr Chung Kin-lai, the Hospital Authority's director of quality and safety, admitted that logging in to the system was not required, but stressed it had never authorised anyone to print patient data for police. He said patient information would be given to police in only two circumstances: when a patient list would help the police account for injured or missing people, and when a hospital needed police help to contact a patient's family.
The Hospital Authority has announced that a task force will be set up to identify ways to protect patients' medical information.
In the 2017-18 reporting year, the Privacy Commissioner received 1,619 complaints, of which 16 were referred to the Police for criminal investigation; but between 14 June 2019 and 26 July 2019 alone, 430 cases of Online Disclosure of Personal Data were referred for criminal investigation.
The massive rise is related to the current social unrest in Hong Kong. In a few cases, student leaders and their families have been threatened anonymously with violence. Most cases have involved Police officers, Government officers and other public figures, and their families.
"Doxing (from dox, abbreviation of documents) is the Internet-based practice of researching and broadcasting private or identifying information (especially personally identifying information) about an individual or organization."
During the protests, Police officers in the special tactical unit have been observed without number badges. A vice-chair of the Independent Police Complaints Council (IPCC), has said that those officers may be hiding their identification documents over privacy concerns. However, this raises an important difference between personal privacy and personal responsibility. The badge numbers are there so that officers can be held responsible for their actions while on duty. The only link between the number and other aspects of their identity, such as their name, address or family members, is held by the Police. Revealing the badge number does not put their sensitive personal data at risk. This can be compared to the Hong Kong ID card number, which is widely used for many purposes. It is used by schools as the student number, at the Companies Registry to identify Directors, for bank accounts, and it is recorded at the security desk of some buildings. Some ISPs and phone companies use it as a default password. Therefore, disclosure of a Hong Kong ID card number allows many other pieces of personal data to be linked from different sources.
On 23 October, the Hong Kong Junior Police Officers' Association won an application for an injunction barring the public from checking personal details on the voters' register. On 25 October, the Department of Justice and the Police Commissioner were granted an interim injunction banning the release of any personal information of police officers. On 29 October, the Police announced that frontline officers will wear white identification tags with "operational callsigns" which are unique to each officer. These changes introduce some clarity about the difference between a unique identifier and protected personal information, and limit the exposure of sensitive personal information of officers and their families. However, this leaves the question of why Police officers are granted additional protection when other members of the public who may be more vulnerable are not. Also, there should be a full discussion of the implications of restricting access to the voters' register. An accessible voters' register supports free and fair elections by allowing people qualified to vote to verify that they are correctly registered before an election takes place, and it allows potentially fraudulent registrations to be identified, for example, if an enquirer notices that there are additional registrations for people not resident at their address, or registrations for voters who are deceased. It should be noted that the Online Voter Information Enquiry System (OVIES) remains functional. The OVIES regulations claim that it only allows an elector to check his or her own registered particulars. In reality, the enquirer must know the HK ID number and full name of a person, and be able to correctly identify two items of their address from a list. This then reveals the constituencies in which the person is registered to vote.
There is a tendency for organisations to collect ever greater amounts of personal data, but these cases in Hong Kong show some of the dangers. Even when the data you are collecting has a legitimate and useful purpose, it is necessary to consider how it could be linked to other collections and how the combined data could be mis-used. Even if your data store has limitations and security in place, could poor security on another data store be leveraged against yours? We need to think more about how to break up personal data and limit how it can be combined.
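One way to break up personal data is to avoid storing a universal identifier, like the HK ID card number, directly. As a minimal sketch (my own illustration, not any system described above), each system can derive its own token from the identifier using a keyed hash with a key that system alone holds, so that records cannot be joined across systems without the keys:

```python
import hmac
import hashlib

def pseudonymise(identifier: str, context_key: bytes) -> str:
    """Derive a context-specific token from a direct identifier.

    The same identifier produces different tokens under different keys,
    so two data stores cannot link their records without sharing keys.
    """
    return hmac.new(context_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Hypothetical HK ID number and two independently keyed systems
hkid = "A123456(7)"
billing_token = pseudonymise(hkid, b"key-held-by-billing-system")
clinic_token = pseudonymise(hkid, b"key-held-by-clinic-system")

# Same person, yet the stored tokens are unlinkable without both keys
assert billing_token != clinic_token
```

Each token is stable within its own system (the same person always maps to the same token), so normal record-keeping still works; only cross-system combination is blocked.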
In the UK, many organisations, including the Police and commercial operators such as shopping malls, have been deploying facial recognition for several years. However, there are increasing calls for debate and restrictions on how it is used.
This month, the Ada Lovelace Institute released a report on public attitudes to facial recognition technology in the UK.
One of the key findings was that the ability to consent, or opt-out of, facial recognition technology was seen as an important safeguard.
Another was that people see a trade-off between public benefit and the normalisation of surveillance. While they might accept its use in criminal investigations, use in schools or public transport was much less acceptable.
Facial Recognition Concerns
The report also highlighted some concerns about the limitations of facial recognition technology.
One was that live trials of the technology by Police resulted in 90% incorrect matches. Supporters of the technology point out that, while this is a high error rate, the procedure would be to stop and question the person, and let them go as soon as it was clear they were not the wanted person. However, I think that the errors are unlikely to be evenly distributed. A few people, who happen to be similar to wanted people, will find that they are frequently stopped and questioned.
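A high proportion of incorrect matches is exactly what base-rate arithmetic predicts when wanted people are rare in the scanned population. As an illustration (the function and all the numbers here are my own assumptions, not figures from the Police trials), Bayes' rule gives the fraction of alerts that are genuine:

```python
def match_precision(base_rate: float,
                    true_positive_rate: float,
                    false_positive_rate: float) -> float:
    """Fraction of alerts that are genuine matches (Bayes' rule)."""
    hits = base_rate * true_positive_rate
    false_alarms = (1 - base_rate) * false_positive_rate
    return hits / (hits + false_alarms)

# Assumed numbers: 1 wanted person per 10,000 faces scanned,
# an 80% chance of spotting them, and a 0.1% false-positive
# rate per innocent face.
p = match_precision(1 / 10_000, 0.80, 0.001)
print(f"{p:.1%} of alerts are correct")  # about 7.4%
```

Even with a seemingly tiny false-positive rate, more than nine out of ten alerts are wrong under these assumptions, simply because innocent faces vastly outnumber wanted ones.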
A second concern was that the technology tended to be less accurate for minorities. This is probably simply a consequence of how the systems are developed: the set of training data favours the majority. Although the explanation is innocent, the result could be serious discrimination against minorities where they are subject to more frequent misidentification.
Personal Data and Recognition
Another complexity is that identifying a person in a photo or video, whether by eye or by technology, might change the images into Personal Data, protected by law. After an incident in an MTR station, the MTR Corporation refused to release video from surveillance cameras on the grounds of personal data privacy. However, David Webb pointed out that in a landmark case in 2000, the Court of Appeal noted that photographs taken and published of people whom the publisher does not identify (or even know the identity of) are not "personal data" within the meaning of the PDPO.