With Clearview AI releasing a potentially game-changing facial recognition app, what responsibility do tech companies have to back away from products that could lead to abuse?
newsletter.lawtrades.com
One legal expert told The New York Times that Clearview AI's facial recognition app had endless “weaponization possibilities.” Google, meanwhile, has declined to build a similar product despite having had the capability for years.
But with the app already in use by 600 local and federal law enforcement agencies in the US, this technology has become a reality. What responsibility do tech companies have to back away from products that could lead to abuse? And how can GCs take the lead in making these decisions?