Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned the tech giant had briefly helped the US military develop AI to study drone footage. In 2020, he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even if they weren't specifically tied to the Pentagon project. "I don't control any of the future outcomes that this will enable," Mohandas thought. "So now, shouldn't I be more responsible?"
Mohandas, who taught himself programming and is based in Bengaluru, India, decided he wanted to develop an alternative service for storing and sharing photos that is open source and end-to-end encrypted. Something "more private, wholesome, and trustworthy," he says. The paid service he designed, Ente, is profitable, and the company says it has over 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to articulate to wider audiences why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then one weekend in May, an intern at Ente came up with an idea: Give people a sense of what some of Google's AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google's technology against itself. People can upload any photo they want to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
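The underlying flow is simple to sketch. Here is a minimal, hypothetical version of it, assuming the Gemini API via Google's google-generativeai Python SDK; the article says only that Ente uses "a Google Cloud computer vision program," so the model name, prompt wording, and file name below are illustrative, not Ente's actual implementation:

```python
# Hypothetical sketch of Theyseeyourphotos' upload-and-describe flow,
# assuming Google's Gemini API via the google-generativeai SDK.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice

# A detail-hungry prompt in the spirit of the one the article describes
prompt = (
    "Describe this photo in three paragraphs. Document small details: "
    "clothing, accessories, visible text or brands, location cues, and "
    "the apparent mood and relationships of any people."
)

image = Image.open("uploaded_photo.jpg")  # the user's uploaded photo
response = model.generate_content([prompt, image])
print(response.text)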
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google's analysis was exhaustive, even documenting the specific watch model that his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. "We had to tweak the prompts to make it slightly more wholesome but still spooky," Mohandas says. Ente started asking the model to produce short, objective outputs: nothing dark.
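Ente has not published its exact wording, but a toned-down prompt in that spirit might look something like this (again purely hypothetical):

```python
# Hypothetical rewrite reflecting Ente's adjustment: still detailed,
# but short, objective, and steered away from dark inferences.
prompt = (
    "In three short paragraphs, objectively describe the scene, people, "
    "and objects in this photo, including small details like clothing "
    "and accessories. Do not speculate about crime, extremism, or threats."
)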
The same family photo uploaded to Theyseeyourphotos now returns a more generic result that includes the name of the temple and the "partly cloudy sky and lush greenery" surrounding it. But the AI still makes a number of assumptions about Mohandas and his family, like that their faces are expressing "joint contentment" and the "parents are likely of South Asian descent, middle class." It judges their clothing ("appropriate for sightseeing") and notes that "the woman's watch displays a time as approximately 2 pm, which corroborates with the image metadata."
Google spokesperson Colin Smith declined to comment directly on Ente's project. He directed WIRED to support pages that state uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, like those that analyze the age and location of photo subjects. The company says it doesn't sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can turn off some of the analysis features in Photos, but they can't prevent Google from accessing their images entirely, because the data are not end-to-end encrypted.