Amazon has banned police use of its face-recognition technology for a year.
It follows IBM's recent decision to leave the facial recognition business amid concerns such technology could be used for mass surveillance and racial profiling.
The technology is used to identify and locate people from a digital image or a video frame.
It's used by many countries around the world, most notably for border checks, but also to prevent knife and gun crime and child sexual exploitation.
Amazon is the latest tech giant to step back from letting police use its technology, which has faced criticism for incorrectly identifying people with darker skin.
But the company said that organisations that use its tool, called Rekognition, to help find children who are missing or sexually exploited, will still have access to the technology.
Ongoing protests following the death of George Floyd have focused attention on racial injustice and how police use technology to track people.
Amazon itself expressed support for the fight against "the brutal treatment of Black people".
George Floyd died on May 25 after a white Minneapolis police officer pressed his knee into the handcuffed black man’s neck for several minutes even after Floyd stopped moving and pleaded for air.
Law enforcement agencies use facial recognition to identify suspects, but critics say it can be misused.
A number of US cities have banned its use by police and other government agencies, led by San Francisco last year.
IBM made its own announcement on Tuesday, saying it would get out of the facial recognition business over concerns about how the technology can be used for mass surveillance and racial profiling.
It’s not clear at the moment if the ban on police use includes federal law enforcement agencies.
Civil rights groups and Amazon's own employees have pushed the company to stop selling its technology to government agencies, saying that it could be used to invade privacy and target people of colour.
In a blog post on Wednesday, Amazon said that it hoped Congress would put in place stronger regulations for facial recognition.
Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, said: "Amazon’s decision is an important symbolic step, but this doesn’t really change the face recognition landscape in the United States since it’s not a major player."
Garvie's research found only two US agencies are using or testing Rekognition.
The Orlando police department tested it but chose not to implement it, she said. The Washington County Sheriff's Office in Oregon, which has been the most public about using Rekognition, said it was suspending its use indefinitely after Amazon's announcement.
Studies led by MIT researcher Joy Buolamwini found racial and gender disparities in facial recognition software.
Those findings spurred Microsoft and IBM to improve their systems but irked Amazon, which last year publicly attacked her research methods.
A group of artificial intelligence scholars last year launched a spirited defence of her work and called on Amazon to stop selling its facial recognition software to police.
A study carried out last year by a US agency affirmed concerns about the technology's flaws.
The National Institute of Standards and Technology tested leading facial recognition systems, though not Amazon's, which didn't submit its algorithms, and found that they often performed unevenly based on a person's race, gender or age.
Buolamwini on Wednesday called Amazon's move a "welcomed though unexpected announcement".
“Microsoft also needs to take a stand,” she said. “More importantly our lawmakers need to step up” to rein in harmful deployments of the technologies.
This week's announcements by Amazon and IBM follow a push by Democratic lawmakers to pass a sweeping police reform package in Congress that could include restrictions on the use of facial recognition, especially in police body cameras.
Such real-time monitoring is not commonly used in the US, but the possibility of cameras that could scan crowds and identify people on the spot has attracted bipartisan concern.