Point your phone’s camera at any object around you and AI Scry will spit out a description of what it sees.
A while back I shared a video of a guy who rigged his MacBook with code that could describe whatever its camera was looking at. Using the same “NeuralTalk” code, Oakland-based developer Disc Cactus recently released AI Scry, an iOS app that brings that functionality to the iPhone.
In a nutshell, AI Scry lets users point their iPhone’s camera at anything around them. From there, the app generates automatic text descriptions of the objects it sees. To get a better idea of how it works, check out the video above for a full demo.
So how does the app know exactly what it’s looking at? Does it know at all? I’ll let the creators of AI Scry explain:
“The detector system has no direct knowledge of objects in the world (donuts, bananas, skateboards, wine glasses, etc) as we know them,” according to their website. “Instead, the detector maps patterns received on the signal line to a sequence of word-choice probability distributions and assembles an output stream. The internal routing through this artificial neural network is largely meaningless to a human observer.”
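To make that description a little more concrete, here’s a minimal Python sketch (not NeuralTalk’s actual code) of the last step the creators describe: the network emits a probability distribution over a vocabulary at each step, and the caption is assembled by picking the likeliest word until an end token appears. The vocabulary and the per-step distributions below are made-up toy values.

```python
# Toy vocabulary; a real captioning model uses thousands of words.
VOCAB = ["<end>", "a", "donut", "on", "table"]

def assemble_caption(distributions):
    """Greedily pick the highest-probability word at each step,
    stopping when the end-of-sentence token wins."""
    words = []
    for dist in distributions:
        idx = max(range(len(dist)), key=dist.__getitem__)
        word = VOCAB[idx]
        if word == "<end>":
            break
        words.append(word)
    return " ".join(words)

# Hypothetical per-step distributions a trained network might emit:
steps = [
    [0.01, 0.80, 0.09, 0.05, 0.05],  # most mass on "a"
    [0.02, 0.03, 0.85, 0.05, 0.05],  # most mass on "donut"
    [0.05, 0.05, 0.05, 0.65, 0.20],  # most mass on "on"
    [0.05, 0.10, 0.05, 0.05, 0.75],  # most mass on "table"
    [0.90, 0.03, 0.03, 0.02, 0.02],  # "<end>" wins, stream stops
]
print(assemble_caption(steps))  # -> a donut on table
```

In practice, models like NeuralTalk often sample from these distributions or run a beam search rather than always taking the single likeliest word, which is part of why the app’s output can feel oddly poetic.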
The app is still a little rough around the edges, but for the most part, it’s pretty spot on. Its functionality is limited at the moment, but once the technology catches up, it could be extremely useful for people with visual impairments. To learn more, click here.