New York’s 15,000 surveillance cameras are watching everyone, but they are disproportionately trained on the movement of people in non-white neighborhoods.
Someone to watch over me
One of my all-time favorite songs is “Someone to Watch Over Me,” and in New York, the police are very much watching your every move. To some, knowing the police have eyes on every corner of the city makes them feel more secure.
But to the Black community, it is another sign that the deck is stacked against them. In Brooklyn alone there are 8,000 of these cameras. With roughly 800,000 people of color in Brooklyn, that works out to one surveillance camera for every 100 non-white residents.
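That ratio is a single division; a quick back-of-the-envelope check, using the figures quoted in this piece rather than official counts:

```python
# Camera density in Brooklyn, using the figures quoted above
# (8,000 cameras, roughly 800,000 people of color).
cameras = 8_000
nonwhite_residents = 800_000

residents_per_camera = nonwhite_residents / cameras
print(residents_per_camera)  # 100.0, i.e. one camera per 100 residents
```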
Are these cameras installed to make you feel safer if you are a white person?
Facial Recognition AI Struggles to Correctly Identify Black Faces
It wasn’t very long ago that Google labeled Black people as gorillas.
In the past, when the technology wasn’t as advanced as it is now, data sets could be manipulated and fall victim to our human biases; a case in point is when Google mistakenly labeled Black people as gorillas. — Excerpt from my story: Artificial Intelligence Creates Fake Faces — and They Are as Real as It Gets
“You are never anonymous,” says Matt Mahmoudi, the AI researcher leading the project.
From the report:
The Amnesty International team found that the cameras are often clustered in majority nonwhite neighborhoods. NYC’s most surveilled neighborhood is East New York, Brooklyn, where the group found 577 cameras in less than 2 square miles. More than 90 percent of East New York’s residents are nonwhite, according to city data. — Excerpt, The All-Seeing Eyes of New York’s 15,000 Surveillance Cameras
When the police watch over a community where most residents are non-white, what does that say about the program? It exists to track the movements of people suspected of a crime without any evidence beyond what these surveillance cameras capture. And we know that, in many past incidents, police have used unwarranted force against Black people they suspected of being criminals, even without provocation; that is what gave rise to the Black Lives Matter movement.
But it is not only the police who see surveillance as a necessary evil. When asked, only one mayoral candidate said she would ban the police’s use of surveillance and of any facial recognition artificial intelligence software used to match faces caught by the cameras against their database.
“I support a city-wide ban on government facial recognition and biometric surveillance technologies.” — Dianne Morales, former nonprofit executive
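The kind of matching the candidates are debating typically reduces each face to an embedding vector and flags any database entry within some distance threshold. A minimal sketch of that idea, with invented names and toy vectors (real systems use deep networks to produce the embeddings, and a loose threshold is exactly what raises the risk of a false match):

```python
import math

# Toy sketch of database face matching. Each face is represented by an
# embedding vector; a probe face "matches" any enrolled face whose
# cosine distance falls under a threshold. All names and numbers here
# are invented for illustration.

def cosine_distance(a, b):
    """1 - cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

# Hypothetical enrolled database of face embeddings.
database = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

def match(probe, threshold=0.05):
    # Return every enrolled identity within the threshold. Loosening
    # the threshold returns more candidates and more false matches.
    return [name for name, emb in database.items()
            if cosine_distance(probe, emb) < threshold]

print(match([0.88, 0.12, 0.31]))  # close to person_a's embedding
```

The threshold is the whole game: set it tight and real suspects slip through; set it loose and innocent people get flagged, and the error rates are not evenly distributed across demographic groups.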
Politicians love the idea of surveillance
People expect the government to protect them from harm and from becoming victims of crime. So there must be a way to do that while remaining mindful of the racial issues that come with installing surveillance cameras, especially in areas known to be non-white communities.
It is immoral and wrong to be clueless about how surveillance is perceived in these communities. Furthermore, it creates an underlying tension between the police and the Black community.
We all know of the recent cases involving police and their Black suspects. It has been only a year since George Floyd died at the hands of police officers.
For eight minutes and 46 seconds, police officer Derek Chauvin kneeled on the neck of George Floyd, a 46-year-old Black man, on the street in Minneapolis on May 25 last year. Another officer, Alexander Kueng, had his knees on Floyd’s upper legs while their colleague J. Thomas Lane gripped Floyd’s handcuffed arms.
“Please, please, please, I can’t breathe,” Floyd gasped, pleading about 20 times. His final words reminded many of those of Eric Garner, who died during a police chokehold in 2014. — Excerpt: How George Floyd’s death reignited a worldwide movement
A few weeks ago, I wrote about Ledell Lee, a Black Man Who Was Executed Because Arkansas Didn’t Want the Lethal Injection to Expire.
America has become a dangerous place if you are non-white, and there is no other way of saying it.
And New York isn’t helping heal wounds that run so deep; a surveillance program by its police is like putting salt on an open wound.
Artificial intelligence for identifying images isn’t perfect, and when it falsely identifies a Black person, what happens next can cost an innocent person their life, something that, as we all know, has happened before.
So a citywide surveillance program has to be reexamined. The good should always outweigh the bad.
America is already hurting, and we don’t need to make things harder for the Black community by making them feel targeted, watched not to make their own communities safe but to make predominantly white communities feel safe.
America is for everyone, and it should stay that way!
- The All-Seeing Eyes of New York’s 15,000 Surveillance Cameras
- Only One NYC Mayor Candidate Is Promising to Ban Facial Recognition
- When Private Security Cameras Are Police Surveillance Tools
- New York City’s Surveillance Battle Offers National Lessons
- The Best Algorithms Struggle to Recognize Black Faces Equally