Should Facial Recognition Technology Be Incorporated Into Educational Institutions?
Facial recognition technology uses statistical measurements of people’s facial features to digitally identify them in photos, in video, or in real time.
Customs and Border Protection (CBP) uses facial recognition at several airports in the United States to check passengers arriving on international flights. In Florida, the police use the technology to identify suspects by scanning databases of photographs, including driver’s license photos. It has become a convenient feature that lets people unlock their phones. Taylor Swift, the pop star, is reported to use facial recognition to detect stalkers.
A school district in New York recently implemented the technology as part of a safety initiative. Do you believe facial recognition will make our schools more secure, or will it infringe on our right to privacy?
In “Facial Recognition Moves Into a New Front: Schools,” Davey Alba reports:
Jim Shultz tried everything he could think of to keep facial recognition technology out of the public schools in Lockport, a small city 20 miles east of Niagara Falls. He shared his concerns with members of a Facebook group called Lockportians. He published an opinion piece in The New York Times. He petitioned the superintendent of the district, where his daughter is in high school.
But he lost a few weeks ago. The Lockport City School District turned on facial recognition technology to monitor who is on the grounds of its eight schools, making it the first known public school district in New York to adopt the technology, and one of the first in the country.
Mr. Shultz, 62, said the district had “converted our children into lab rats in a high-tech experiment in privacy invasion.”
Only a few cities, such as San Francisco and Somerville, Mass., have barred their governments from using the technology, and they are the outliers. More than 600 law enforcement agencies have started using the technology of a single company, Clearview AI, in the past year alone. It has also been deployed at airports and other public venues, including Madison Square Garden in Manhattan.
Schools are a more recent battleground, and the debate in Lockport illustrates the fervor that has erupted around the technology. Proponents see it as a crucial crime-fighting tool that can help prevent mass shootings and stop sexual predators. Robert LiPuma, the director of technology for the Lockport City School District, believes that if the technology had been in place at Marjory Stoneman Douglas High School in Parkland, Fla., the deadly 2018 attack there might have been prevented.
“You had an expelled student who would have been put into the system, because they were not supposed to be on school grounds,” Mr. LiPuma said. “They snuck in through an open door. The moment they snuck in, the system would have identified that person.”
However, opponents such as Mr. Shultz argue that the privacy, accuracy, and racial bias issues regarding face recognition are much more concerning when it comes to minors.
“Subjecting 5-year-olds to this technology will not make anyone safer, and we cannot allow invasive surveillance to become the norm in our public spaces,” said Stefanie Coyle, deputy director of the Education Policy Center at the New York Civil Liberties Union. “Callously reminding people of their worst fears is a disappointing tactic, meant to divert attention from the fact that this product is discriminatory, unethical and untrustworthy.”
When the system is turned on, Mr. LiPuma said, the software analyzes the faces captured by the district’s hundreds of cameras and determines whether they match a “persons of interest” list created by school officials.
That list includes people who have committed sex offenses in the area, people barred from contact with students by restraining orders, former employees who are prohibited from visiting the schools, and others deemed “credible threats” by law enforcement.
If the Aegis software identifies someone on the list, the system sends an alert to one of Lockport’s 14 rotating part-time and full-time security monitors, Mr. LiPuma said. The human monitor then compares the image captured on camera with the photograph in the database and either “confirms” or “rejects” the match.
If the operator rejects the match, the alert is dismissed. If the match is confirmed, a second alert goes to a small group of district officials, who decide what action to take.
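The passage above describes a fairly standard detect-review-escalate pipeline. The following Python sketch is only an illustration of that flow under stated assumptions: the function names, the similarity threshold, and the toy “embeddings” are invented for the example and do not reflect the actual Aegis software, whose internals are not publicly documented.

```python
# A minimal, hypothetical sketch of the alert workflow described above.
# All names, thresholds, and data structures are illustrative assumptions.
import math

MATCH_THRESHOLD = 0.8  # assumed similarity cutoff; the real value is not public


def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def review_match(person_id, similarity):
    """Stand-in for the human monitor who confirms or rejects a match."""
    print(f"Review requested: {person_id} (similarity {similarity:.2f})")
    return similarity > 0.9  # placeholder decision rule for this sketch


def handle_frame(face_embedding, watch_list):
    """Compare one captured face against the 'persons of interest' list."""
    for person_id, stored_embedding in watch_list.items():
        similarity = cosine_similarity(face_embedding, stored_embedding)
        if similarity >= MATCH_THRESHOLD:
            if review_match(person_id, similarity):
                # Only a confirmed match is escalated to district officials.
                print(f"Escalating confirmed match: {person_id}")
            # A rejected match is dismissed and goes no further.


# Example with toy three-dimensional "embeddings".
watch_list = {"former-employee-17": [0.9, 0.1, 0.3]}
handle_frame([0.88, 0.12, 0.31], watch_list)
```

In a real deployment, the reliability of such a pipeline hinges on the similarity threshold and the quality of the stored photos, which is part of why the human review step carries so much weight.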
Facial recognition systems have been criticized for their potential for bias and abuse.
Opponents of the technology, such as Mr. Shultz and the New York Civil Liberties Union, point to mounting evidence of racial bias in facial recognition systems.
A federal study released in December, one of the largest of its kind, found that most commercial facial recognition systems were biased, falsely identifying African-American and Asian faces at rates 10 to 100 times higher than Caucasian faces. Another government study found a higher rate of false matches among children.
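For context on how such disparities are measured: a false-match rate is typically calculated separately for each demographic group by counting how often the system wrongly reports a match between photos of two different people. The short sketch below uses invented toy data, not the federal study’s, purely to illustrate the arithmetic behind a “10 to 100 times higher” comparison.

```python
# Toy illustration of per-group false-match rates; the data are invented.
# Each entry is (demographic_group, was_a_false_match) for one comparison
# between photos of two *different* people.
trials = [
    ("group_a", False), ("group_a", False), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", True), ("group_b", True), ("group_b", True),
]


def false_match_rate(trials, group):
    """Share of impostor comparisons wrongly reported as a match."""
    outcomes = [hit for g, hit in trials if g == group]
    return sum(outcomes) / len(outcomes)


rate_a = false_match_rate(trials, "group_a")
rate_b = false_match_rate(trials, "group_b")
print(f"group_a: {rate_a:.2f}  group_b: {rate_b:.2f}  ratio: {rate_b / rate_a:.1f}x")
```

Dividing one group’s rate by another’s produces the kind of ratio the study reported; a real evaluation uses vastly more comparisons than this handful.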
Black students in Lockport are disciplined at disproportionate rates. According to data from the U.S. Department of Education, 25 percent of the students suspended in the district during the 2015-16 school year were black, even though black students made up only about 12 percent of the district’s population.
Putting students on the “persons of interest” list could have unintended consequences, according to Jason Nance, a law professor at the University of Florida.
If suspended students are placed on the watch list, he said, they will be scrutinized more closely, which could increase the likelihood that they end up in contact with the criminal justice system.
Jayde McDonald, a political science major at Buffalo State College, grew up as one of the few black students in the Lockport public schools. She said she believed it would be too risky for the district to install a facial recognition system that would automatically alert the police when a student was identified.
“Because the rates of false matches are so high, this could result in very dangerous and entirely preventable situations,” Ms. McDonald said.
Police officers, she added, will “do whatever it takes to stop a suspicious person,” even if that person is a young student.
Students, read the entire article, then tell us:
Should facial recognition technology be used in schools? Why or why not? If it is used, what restrictions, if any, should be placed on it?
Have you ever used facial recognition technology, whether to verify your identity at the airport, unlock your smartphone, sort and tag photos on social media, or for some other purpose?
What are some possible benefits of using facial recognition technology in schools? Robert LiPuma, the director of technology for the Lockport City School District, believes that if the technology had been in place at the time of the school shooting in Parkland, Fla., the tragedy might not have happened. What do you think of his claim?
What are some possible risks of using facial recognition in schools? Jim Shultz, a parent, says the Lockport City School District’s decision to use the technology has “converted our children into lab rats in a high-tech experiment in privacy invasion.” How persuasive do you find his arguments against the district’s decision?
Studies have found that many widely used facial recognition systems are biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces and producing a higher rate of false matches for children. How concerned should we be about bias, and about what Jason Nance calls unintended consequences, when facial recognition is used in schools?
How safe do you feel at your school? What security measures are currently in place? Assemblywoman Monica Wallace said:
“We all want to keep our children safe at school, but there are more effective, proven methods to do so that are less expensive.”
School districts, she said, could instead take smaller measures such as improving entrances and exits, hiring school resource officers, and investing in counselors and social workers.
What do you think of her argument? What safety measures, if any, would you recommend for your school?
Finally, do you think your school should use facial recognition technology? Would you feel safer if it did? Why or why not?
Correction: An earlier version of this article misstated a statistic from the MIT Media Lab’s research. The figure has been corrected.
Despite widespread public resistance and studies that cast doubt on the technology’s effectiveness, more K-12 administrators are choosing to implement facial recognition technology in their schools.
According to research from the MIT Media Lab, facial recognition software can be wrong up to 34.4 percent of the time when scanning images of women with darker skin tones.
Human Rights Watch, an international non-governmental organization, expressed concern about the use of facial recognition technology in schools, saying, “With faulty facial recognition technology, children of color could soon find themselves forced to deal with a security guard or be singled out in other ways simply for going to school.”
The Lockport City School District in New York, however, is continuing to fine-tune its facial recognition system, according to WGRZ-TV. The district’s superintendent told the station that the district decided not to store student photos in its facial recognition database in order to comply with the wishes of the State Education Department.
According to Forbes, Lockport started testing this technology more than a year ago in order to monitor sex offenders, suspended employees, and other people who might pose a danger.
According to Governing, the Putnam City School District in Oklahoma recently completed a five-month trial of facial recognition software that works alongside other security technologies to quickly detect and isolate threats. “It’s been received well by the public,” the district’s police chief said, according to the publication.
Fulton County Schools in Georgia is using facial recognition software as part of a single sign-on program that manages instructional materials and applications, according to a report published earlier this year by District Administration.
The district began with a pilot among its staff. “The whole process will take time and effort, but after we test the technology in different settings across our district, we will go through with the rollout,” said Serena E. Sacks, the district’s chief information officer.
According to Wired, facial recognition technology at Texas City High School was used this summer to quickly identify an expelled student at a graduation ceremony after officials had banned him from all school campuses. The Texas City ISD’s executive director of security said authorities questioned and removed the student from the stadium less than 30 seconds after he sat down.
“Studies have shown that reaction time is crucial in saving lives,” Michelle Bradley, superintendent of the Lockport City School District, recently told the Des Moines Register. “By being notified that a possible danger may be present in a facility, issues may be handled more quickly and effectively before they manifest themselves.”
Students 13 and older are invited to comment. All comments are moderated by the Learning Network staff, but please keep in mind that once your comment is accepted, it will be made public.