Minister of Public Security Khemraj Ramjattan, never one for great reflection on our liberal values, was unsurprisingly in full rapturous mode at the launch of the Safe City surveillance system in Georgetown last week. “I am just so thrilled,” he was quoted as saying, “this is exactly what Guyana needs.” One can only hope that his euphoria was subsequently tempered by correspondence which appeared in this newspaper, more particularly that from Messrs Darshanand Khushial and Sherwood Lowe.
As we reported, the Safe City system, designed by the Chinese company Huawei, consists of 102 Intelligent Video Surveillance sites, each of which has three or four cameras, as well as two systems which can be mounted on vehicles and three different types of radio and body cameras. There will, of course, be a command centre, located at Liliendaal. The cameras will have various features, but the one causing the greatest public concern is the facial recognition and tracking system, which will have 24-hour surveillance capability. For the moment it is sections of Georgetown and the East Bank which will be monitored, but the intention is to expand the system right across the country.
The system is being tried out by various nations, but it is in authoritarian states where it has been most unquestioningly embraced, in particular in China, which is employing it against the Muslim Uighurs in the Xinjiang region. In the West, the cities of San Francisco and Oakland have placed a temporary ban on its implementation until a legal framework is created to cater for it, while the tests carried out by London’s Metropolitan Police and South Wales Police have brought a flood of criticism from civil rights organisations. In the case of a face-recognition trial in Romford, for example, the police arrested four people for avoiding the cameras, and one of them, who had his face scanned over his objections and did not match anyone in the ‘criminal’ database, was fined £90 for telling an officer to “piss off”.
A case has been brought in the courts against the South Wales Police, on the grounds that their use of facial recognition technology is a breach of the Human Rights Act. According to the Financial Times, the outcome of the case could set a precedent from the US to India and Australia, where the system is being quietly tested. It might be noted that Guyana could potentially be added to that list. The Executive Director of Big Brother Watch, a civil rights organisation which has brought a case against the Met in London, was quoted as saying: “We are not aware of anywhere live facial recognition is being used for general public surveillance, except in China … It’s really alarming for Britain to go down this path and set this precedent not only for other democracies, but certainly for less liberal states. It’s being used to track ethnic minorities in China; the possibilities are chilling.”
So what is the difference, the average member of the public might ask, between ordinary CCTV, which has been around for years and to which no one has objected, and the new 24-hour face-recognition technology? The answer is that the old video cameras did not know what they were looking at; it was only when an incident occurred that the footage would be retrieved and the event replayed to see if anyone was recognisable or some insight could be gleaned into what had occurred. With the new technology, however, machine-learning algorithms are trained to recognise specific people, objects or unusual behaviour, and the new cameras to all intents and purposes ‘see’ on their own account. A policeman, therefore, could theoretically sit and twiddle his thumbs, and the face-recognition system would automatically match (or not, as the case may be) a person or behaviour against someone or something on the watch list.
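For readers who want a concrete sense of what that automatic matching involves, the core step can be sketched in a few lines of Python. The sketch below is purely illustrative and is not drawn from the Safe City system itself: it assumes that faces captured on camera have already been reduced to lists of numbers (so-called embeddings) by a machine-learning model, and the watch-list names and the matching threshold are entirely hypothetical.

import numpy as np

# Hypothetical watch list: each entry maps a name to a face "embedding",
# a vector of numbers produced by a machine-learning model.
watch_list = {
    "suspect_a": np.random.rand(128),
    "suspect_b": np.random.rand(128),
}

def cosine_similarity(a, b):
    """Similarity between two embeddings (closer to 1.0 means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_match(camera_embedding, threshold=0.9):
    """Compare a face seen by the camera against every watch-list entry.

    Returns the best-matching name if its similarity clears the threshold,
    otherwise None -- no human has to review the footage for this to happen.
    """
    best_name, best_score = None, 0.0
    for name, stored in watch_list.items():
        score = cosine_similarity(camera_embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A frame from the street feed, already converted to an embedding.
passerby = np.random.rand(128)
print(flag_match(passerby))  # prints a name, or None, automatically

The point of the sketch is simply that once a watch list exists, every face that passes the camera is compared against it continuously and automatically, with no officer ever needing to look at the footage.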
It is the watch lists which are at the heart of many of the objections to the system. In a large society, hypothetically, millions of innocent people could be on one. That must certainly be the case in China’s Xinjiang region, for example, or perhaps Beijing. But what about Guyana? Who precisely will be on the watch list? Everyone who passes by one of the cameras? In a politically divided society such as this, will a government in office ensure that its political opponents or critics or activists or protestors are on the list, and will it track them or use the information against them as it sees fit? Even the lives of ordinary people could be monitored, from the time they go to work to when they go to the rumshop, and so on; an extensive record of even a harmless existence could be in the hands of the authorities. It will be the death of privacy.
In countries like the UK, the technology is also in private hands, such as those of supermarkets and bars, and no one knows exactly who is on those lists. Furthermore, according to the Guardian, the software is freely available and cheap, and is distributed all over the internet.
There are still problems with the technology, such as how it functions on ‘gloomy’ days, and, more important from our point of view, the fact that it is not so good at recognising people of colour. One presumes that the technology will soon evolve to address these problems, but in the meantime Mr Khushial has identified what he says is another serious problem arising in our case. He has written that one of the things required to build a facial recognition system is a large set of photos. Given the constraints which he enumerates, he suggests that Guyana’s system was most likely developed offshore by Huawei, and that since the system will probably require regular tuning, a continuous stream of images of Guyanese might have to be sent abroad. These images, he theorises, may derive from drivers’ licences. If so, are we then ceding control to a foreign power? he asks.
It is not an irrelevant question. Mr Ramjattan, and, since this is a technical issue, Ms Cathy Hughes as well, need to explain exactly how our facial recognition system has been developed, where the images originated, and whether they have been sent outside the country. Guyanese are entitled to know the details.
Mr Lowe had some fundamental questions of his own for the authorities: if the system is meant to track criminal suspects, just who is a criminal suspect? He also commented on the potential for police abuse. He asked what the other purposes of data collection were, and whether these extended to traffic violations and misdemeanours. He went on to pose the critical question of who should have access to the data and what the security controls would be. He also enquired about the schedule for disposing of data after use and how independent oversight would be exercised.
It might be commented that there is no evidence the government ever considered independent oversight, let alone implemented it. Whatever the case, the public should be given answers to the questions raised by Mr Lowe. The government has approached this project in a very cavalier fashion, even though it goes to the heart of privacy and citizens’ rights. There needs to be a public discussion about it now, and consideration of what kind of regulations it should attract. We should not be in the vanguard of the planet’s Big Brother societies.