Discover how OpenAI's CLIP model is changing the game. Say goodbye to traditional text- or tag-based image search: now you can search images by their visual features.
For instance, in a clothing retail scenario, imagine a customer searching for "dark color clothes." Even if your database lacks that exact tag, semantic image search powered by the CLIP model can still match the query against the images' features and deliver the right results. It's a game-changer that text-based search can't replicate.
I've broken down the process step by step, complete with code examples. Take a look and share your thoughts.

https://medium.com/@vignesh865/next-gen-search-unleashed-the-fusion-of-text-and-semantics-redefining-how-we-search-c67161aaa517
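To give a rough feel for the core idea described above (not the exact code from the article), here is a minimal sketch of CLIP-based semantic image search using the Hugging Face transformers wrappers. The checkpoint name, helper functions, and image paths are assumptions made for the example; the article's own pipeline may differ.

```python
# Minimal sketch: rank product images against a free-text query in CLIP's
# shared text-image embedding space. Paths and names are illustrative.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed_images(paths):
    """Encode images into CLIP's embedding space, L2-normalized for cosine similarity."""
    images = [Image.open(p).convert("RGB") for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

def search(query, image_feats, paths, top_k=5):
    """Embed the text query and return the top-k most similar images."""
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        text_feat = model.get_text_features(**inputs)
    text_feat = text_feat / text_feat.norm(dim=-1, keepdim=True)
    scores = (image_feats @ text_feat.T).squeeze(1)          # cosine similarities
    best = scores.topk(min(top_k, len(paths))).indices.tolist()
    return [(paths[i], scores[i].item()) for i in best]

# Hypothetical usage:
#   paths = ["images/shirt_01.jpg", "images/shirt_02.jpg"]
#   feats = embed_images(paths)
#   print(search("dark color clothes", feats, paths))
```

Because both the images and the query live in the same embedding space, a query like "dark color clothes" can surface dark garments even when no such tag exists in the database.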

[–] Blakut@alien.top 1 points 11 months ago

“sun color shirts” and hit search. However, what pops up on your screen is a mishmash of shirt-related information, not the cheerful and bright yellow shade you had in mind.

because sun is not a color?