Andro Alex
Furthermore, the system reportedly factors in a degree of diversity so that search results are not filled with near-identical images. The image search can capture scenes, objects, animals, places, attractions, and clothing items, which means a photo can be located by sorting through shared images rather than relying on tags and text descriptions. As reported by Digital Trends, Lumos is the artificial intelligence platform that allows the computer to 'see' what is inside a shared image, even without any kind of text description. The same machine learning system is reportedly also behind a number of Facebook's other image-recognition features, including flagging nudity and filtering out spam.
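To make the diversity idea above concrete, here is a minimal Python sketch: photos are ranked by how well their embeddings match a query embedding, and near-duplicate results are greedily skipped. The embeddings below are random stand-ins, and the similarity threshold and vector sizes are illustrative only; Facebook has not published how Lumos actually scores or de-duplicates results.

```python
# Minimal sketch: rank photos by similarity to a query, then skip results
# that are too similar to ones already chosen. Embeddings are random
# stand-ins for vectors an image/text encoder would normally produce.
import numpy as np

rng = np.random.default_rng(0)
photo_embeddings = rng.normal(size=(100, 64))   # one vector per photo (stand-in)
query_embedding = rng.normal(size=64)           # vector for the text query (stand-in)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Rank all photos by similarity to the query.
scores = [cosine(e, query_embedding) for e in photo_embeddings]
ranked = np.argsort(scores)[::-1]

# Greedy diversity filter: accept a photo only if it is not too close to
# any photo already selected, so the results are not near-duplicates.
selected = []
for idx in ranked:
    if all(cosine(photo_embeddings[idx], photo_embeddings[j]) < 0.9 for j in selected):
        selected.append(idx)
    if len(selected) == 10:
        break

print(selected)  # indices of the top 10 diverse matches
```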
As mentioned in "Facebook Computer Vision AI Can Now Creepily Search Your Photos And Understand What's In Them":
Little did we know it, but Facebook's engineers have been experimenting with a computer vision platform that can sort through photographs and search for what's contained in them, even when they're not tagged or captioned. For that to happen, Facebook's engineers used cutting-edge deep learning techniques to process billions of photos and understand their semantic meaning. That system is now the engine for Facebook's computer vision team, with a platform called Lumos built on top of it. "We've advanced this research by designing techniques that detect and segment the objects in a given image," Facebook explains. By leveraging Lumos, Facebook is able to offer users visual search; to make sure the results are relevant, Facebook has to be able to understand the contents of a photo. Cool stuff, if not just a little bit creepy too.
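As a rough illustration of the tagging idea described above (not Facebook's actual pipeline), the sketch below runs an off-the-shelf pretrained classifier, torchvision's ResNet-50, over a few photos and builds a small label-to-photo index that a text search could consult. The file names and the index structure are made up for illustration.

```python
# Sketch: tag each photo with the top predictions of a pretrained
# classifier so plain-text queries can match photo content.
from collections import defaultdict

import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()
categories = weights.meta["categories"]  # ImageNet class names

def top_labels(path, k=3):
    """Return the k most likely class labels for one image."""
    image = Image.open(path).convert("RGB")
    with torch.no_grad():
        logits = model(preprocess(image).unsqueeze(0))
    probs = logits.softmax(dim=1)[0]
    return [categories[i] for i in probs.topk(k).indices]

# Build a tiny inverted index: label -> photos predicted to contain it.
index = defaultdict(list)
for photo in ["beach.jpg", "dog_park.jpg", "birthday.jpg"]:  # example paths
    for label in top_labels(photo):
        index[label].append(photo)

print(index.get("Labrador retriever", []))  # photos matching a text query
```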
Facebook's Search AI Now Knows What's in Your Photos
The social network's search engine is now using artificial intelligence to apply search terms to photos, not just to status updates and other text on the site. Thanks to the neural networks, the system already knows what's depicted in the billions of photos that have been uploaded to Facebook, so a search for "black shirt," for instance, would return photos of black shirts even if the person who uploaded them never added a tag that says "black shirt." The same neural networks also work in reverse: they can help translate the contents of photos into text for people who are visually impaired. The image search changes might not seem revolutionary to most users, since they simply make it easier to locate the images people expect to find anyway.
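The reverse direction mentioned above can be illustrated with a toy function that turns predicted concepts into a short description, in the spirit of the automatic alt text Facebook generates for visually impaired users; the concept list and wording here are illustrative, not Facebook's actual output.

```python
# Toy sketch: compose a readable description from predicted concepts.
def alt_text(concepts):
    """Turn a list of predicted concepts into a short text description."""
    if not concepts:
        return "No description available."
    return "Image may contain: " + ", ".join(concepts) + "."

print(alt_text(["2 people", "people smiling", "outdoor", "black shirt"]))
# Image may contain: 2 people, people smiling, outdoor, black shirt.
```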