Meta's latest AI model can detect objects in images

Hayo News
April 6th, 2023

Facebook parent company Meta has released an AI model called "Segment Anything" that can pick out individual objects in images, along with what it says is the largest image-segmentation dataset of its kind.

In a blog post, Meta's research division introduced its "Segment Anything Model" (SAM for short). SAM can recognize objects in images and videos, including objects it did not see during training. Users can select an object by clicking on it or by entering a text prompt. In one demo, typing "cat" prompted the tool to draw a box around a cat in an image.
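To make the click-to-select interaction concrete, here is a toy stand-in (not Meta's model): a simple flood fill that treats the clicked pixel as a prompt and returns the connected region of similar pixels as a mask. The function name and image format are made up for illustration; SAM itself uses a learned neural network, not flood fill.

```python
from collections import deque

def select_object(image, click, tol=0):
    """Return a boolean mask of pixels connected to `click` whose
    values are within `tol` of the clicked pixel's value.
    `image` is a list of rows of ints; `click` is (row, col)."""
    rows, cols = len(image), len(image[0])
    r0, c0 = click
    target = image[r0][c0]
    mask = [[False] * cols for _ in range(rows)]
    mask[r0][c0] = True
    queue = deque([(r0, c0)])
    while queue:                      # breadth-first flood fill
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and not mask[nr][nc]
                    and abs(image[nr][nc] - target) <= tol):
                mask[nr][nc] = True
                queue.append((nr, nc))
    return mask

# A 4x4 "image" with a bright 2x2 object in the top-left corner.
img = [
    [9, 9, 0, 0],
    [9, 9, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
mask = select_object(img, (0, 0))  # "click" on the bright object
```

The point of the sketch is only the interaction pattern: a single point prompt in, a per-pixel object mask out.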

Meta has also demonstrated generative AI capabilities that, like ChatGPT, create entirely new content rather than merely identifying and classifying data, though it has not yet announced a concrete product. Examples include tools that generate surreal videos from text prompts and produce children's book illustrations from prose. CEO Mark Zuckerberg has said that bringing such generative AI into Meta's apps as a "creative aid" is a priority this year.

In fact, Meta already uses SAM-like technology internally to tag photos, moderate prohibited content, and determine which posts to recommend to Facebook and Instagram users. Releasing SAM will broaden the use of such technology, the company said.

The SAM model and dataset will be available for download under a non-commercial license. Users who upload their own images to the accompanying prototype must also agree that the images will be used for research purposes only.

Meta plans to commercialize its proprietary generative AI for producing advertising images by the end of this year. "We've been investing in artificial intelligence for more than a decade and have one of the world's leading research institutions. We have a large research team with hundreds of people," Meta Chief Technology Officer Andrew Bosworth said in an interview.
