GANPaint Studio is a tool from MIT CSAIL that modifies images in certain categories using a generative adversarial network. Using the software, you can take an input image of something like a kitchen or a church and paint over an area you want to change. You can then tell it to draw extra chairs or windows, or different rooftops or trees, and GANPaint Studio will do its best to realistically fill the areas you marked with your desired objects. More:
Police body camera company Axon has banned the use of facial recognition technology on its cameras.
This is the less shiny side of productized artificial intelligence: biased AI systems are being deployed in sensitive areas like policing, where they are likely to reinforce existing societal inequalities and (racial, gender, sexual orientation, …) discrimination. As Ben Evans wrote earlier this year:
[The] scenario for AI bias causing harm that is easiest to imagine is probably not one that comes from leading researchers at a major institution. Rather, it is a third tier technology contractor or software vendor that bolts together something out of open source components, libraries and tools that it doesn’t really understand and then sells it to an unsophisticated buyer that sees ‘AI’ on the sticker and doesn’t ask the right questions, gives it to minimum-wage employees and tells them to do whatever the ‘AI’ says.
Indeed, it’s not hard to imagine Evans’ scenario playing out in the police body cam setting: nothing is stopping a budget-constrained police department from trying to use body cams to automatically find criminals, without realizing that current commercially available facial recognition software is much more likely to misrecognize, for example, a person of color than a white person.
That’s why it’s refreshing to see a body cam manufacturer stepping up to take responsibility for the problem. Charlie Warzel for the New York Times:
According to [Axon’s independent] ethics board report, in early conversations about facial recognition, Axon initially argued that it “could not dictate to customers how products were used, nor its customers’ policies, and that it could not feasibly patrol misuse of its product.” That’s Big Tech’s version of “guns don’t kill people, people kill people.” And it’s a view that’s very widely held across the industry.
Mr. Friedman hopes that Axon’s pledge will force other vendors to think about where the new technology might be headed and how it could impact the most vulnerable. “We want them to remember that just because you can build it, doesn’t mean you should.”