Semantic Search for Digital Commerce

Presented at Activate Product Discovery 2021. In the context of digital commerce, semantic search refers to a set of techniques for finding products by meaning, as opposed to lexical search which finds products by matching words and their variants. Learn how semantic search can be leveraged to eliminate zero-results outcomes and improve on low-performing queries.

Speaker:
Eric Redman, Lucidworks Principal Search Architect, Digital Commerce


Transcript

Eric Redman:

Hi, my name is Eric Redman, I’m the Principal Search Architect in Digital Commerce at Lucidworks. And today, I’m going to talk about semantic search for digital commerce.

So let’s start with what I mean. What do I mean by semantic search? This comes up a lot in discussions with colleagues, and clients, and so on. They see a lot of different perspectives when they’re looking at vendor websites and academic discussions and so on. So I began with semantic search as searching by meaning, but that’s not very helpful; it’s just a starting point. A response I’ll often get when I begin that way is, “Well, yeah, but where do you stand? Are you a knowledge graph adherent, or do you say that semantic search has to involve vector space search?” And I say we use both. It’s not one way, and there’s not one model to rule them all yet.

We use multiple techniques to accomplish semantic search. Where we are today is that you’re better off combining techniques to solve certain aspects of semantic search. So where did this, not disagreement, but variability in the perspective people take when they start using that phrase semantic search, where does that come from?

Well, the techniques available to accomplish semantic search have evolved quite a bit over the last 10 years, more than 10 years, really. So I took a look at the Wikipedia semantic search article and how it has evolved since around 2007. You can see the chart at the bottom of this slide showing the activity, essentially editors renegotiating that article every so often. And here are a few of the evolutions of the first sentence of that article that I’ve pulled out of the Wayback Machine. I find these interesting.

Back in 2007, it was very much focused on linked data and the web, using XML and RDF to do research over data on the web. So a very specific thing: we’re going to do a better job of organizing data on the web, and that will improve our ability to do research. Then in 2010, the first sentence of the Wikipedia article really evolved, and now it’s more focused on searcher intent and the contextual meaning of terms in a query.

Move ahead nine more years to 2019, and we see that search by meaning comes back, distinguishing semantic search from lexical search: semantic search tries to understand the overall meaning of the query rather than matching individual terms. So it’s come full circle, because historically semantic search has been about searching by meaning, and not necessarily a particular protocol or type of model.

Now, Google was also rapidly evolving its approaches, the models it used and the way it thought about search. In 2012, the Google Knowledge Graph came out, and they started using this motto: “things, not strings.” I thought that was a clever way to get to the essence of what they were trying to do. In 2015, Google announced that a thing called RankBrain was being used, a machine learning approach to do a better job of ranking pages in search results.

And then in 2019, Google announced that BERT, which they had released to the community, was being used in search; they call this neural matching. This is really where people got going with the semantic vector search I was talking about. So I think it’s useful to think of semantic search as being built today from a combination of techniques, where each technique solves specific problems. But the overall goal in combining these techniques in digital commerce is to respond to a query with products that are relevant to the task or the interest that the shopper has.

So I hear about query intent all the time: we have to understand query intent, and we’ll do a better job with search. But I don’t like that phrase. The reason is that I don’t think queries have intent; shoppers have intent. We need to use language that reminds us of that. My example here on the slide is: if I search for “00” flour, maybe I’m trying to make a pizza. That’s a big deal in at-home pizza now; in the craft, it’s all about the “00” flour. So aren’t we providing better recall, or more relevant results, if we maybe pull in some other products associated with baking or making a pizza?

So I talked about multiple techniques, and I wanna talk about two specific techniques and how we use them in combination. One being semantic vector search, and another being semantic query parsing.

Okay, semantic vector search. It’s a deep learning approach, where the model is trained on shopper behavior. What it’s gonna do is to create a shared vector space where it can position products and queries in the shared vector space, such that very similar products and queries are close together in the space, and dissimilar products and queries are far apart. And that’s kind of the job of this model. And we call this type of model an encoder, because it’s encoding product names and descriptions and queries into the shared vector space. Its job is similar things close together and dissimilar things far apart.
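[Editor's note: the shared-vector-space idea above can be sketched as a toy nearest-neighbor ranking. The three-dimensional vectors and product names below are invented for illustration; a real encoder trained on shopper signals would produce embeddings with hundreds of dimensions.]

```python
import math

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, near 0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy 3-dimensional embeddings, made up for illustration only.
query_vec = [0.9, 0.1, 0.2]  # imagined encoding of the query "00 flour"
products = {
    "pizza flour": [0.85, 0.15, 0.1],
    "pizza stone": [0.6, 0.3, 0.3],
    "motor oil":   [0.05, 0.9, 0.4],
}

# Rank products by closeness to the query in the shared space:
# similar things end up near the top, dissimilar things at the bottom.
ranked = sorted(products, key=lambda p: cosine(query_vec, products[p]),
                reverse=True)
print(ranked)  # ['pizza flour', 'pizza stone', 'motor oil']
```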

Now, what is similar and what is dissimilar, that’s what’s learned from the shopping behavior. And I like the grocery store as a real-world analogy: grocery stores work really hard at organizing products on shelves so that when you go looking for one thing, the next thing that might go with what you’re interested in is nearby. So it’s almost like if I’m looking for organic lemonade, just within my peripheral vision, there are other products that I might also be interested in.

The grocery store’s goal is not just to give you exactly what you asked for, but to interest you in leaving with two or three other items that you didn’t even think you needed when you came there. This type of model, the encoder I talked about, can learn from other types of signals too. Merchandisers do things: they categorize products so that the browse experience works right. They organize products on landing pages. Those things could be used as training data for the model too. So it’s not just shopper behavior, but that’s a real primary source.

Okay, so I said for each technique I would talk about a couple of problems it solves, and semantic vector search is really good at improving recall. The idea of recall, of course, depends on some definition of relevance. A more intuitive interpretation of relevance is products that are relevant, as I said, to the goal or the interest of the shopper, and semantic vector search is better than lexical search at accomplishing this. And it does so without curation; that’s a big deal. We can do pretty well, given enough time, with lexical search, but it requires a lot of synonym entries, spelling corrections, phrasing and other types of rules to make it work. And that curation, it’s not “get it right, phew, I’m done.” You have to keep chasing it all the time, because your product assortment changes and the vocabulary your shoppers use is changing. Semantic vector search just keeps learning from the signals coming in, and it doesn’t require all this curation.

Here’s another, related problem that semantic vector search is good at dealing with, and that is zero results. You can really slash your zero-results rate with semantic vector search. Again, that’s related to this idea of a more intuitive representation of recall, but there’s a lot of curation really specific to zero results that semantic vector search frees up, especially for long tail queries that you just never get to, because search teams and e-commerce teams will naturally focus on the zero results that occur more frequently. There’s this large pool of long tail queries generating zero results, and semantic vector search will address those without all this curation. Really, the idea is to free up the merchandiser, free up the search engineer, free the e-commerce team to focus on more important targets.
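[Editor's note: a minimal sketch of this zero-results fallback, trying lexical matching first and falling back to the vector space instead of returning an empty page. Every index, query, product and embedding below is a hypothetical stand-in, not a real Lucidworks API.]

```python
import math

class ToyVectorIndex:
    """Hypothetical stand-in for a real approximate-nearest-neighbour
    index over product embeddings learned from shopper signals."""
    def __init__(self, embed, product_vecs):
        self.embed = embed                # maps a query string to a vector
        self.product_vecs = product_vecs  # product name -> vector

    def nearest(self, query, k):
        q = self.embed(query)
        return sorted(self.product_vecs,
                      key=lambda p: math.dist(q, self.product_vecs[p]))[:k]

def search(query, lexical_hits, vector_index, k=3):
    # Lexical matching first; on zero results, fall back to the vector
    # space rather than showing the shopper an empty results page.
    hits = lexical_hits.get(query, [])
    return hits[:k] if hits else vector_index.nearest(query, k)

# Toy data: the long-tail query "citrus drink" has no lexical match,
# but its embedding lands near "organic lemonade" in the shared space.
lexical = {"lemonade": ["organic lemonade"]}
vectors = ToyVectorIndex(
    embed=lambda q: {"citrus drink": [0.9, 0.1]}.get(q, [0.0, 0.0]),
    product_vecs={"organic lemonade": [0.8, 0.2], "motor oil": [0.1, 0.9]},
)
print(search("citrus drink", lexical, vectors))
```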

Okay, semantic query parsing. We talked about semantic vector search; now I wanna talk about semantic query parsing and a couple of issues it’s good at dealing with. Semantic query parsing is basically a type of word sense disambiguation, a task that has been around for a long time, with a lot of different approaches developed to accomplish it, that is, disambiguating a term. You know, when I see the word jaguar, do I mean the car or the animal, or what? There’s lots of competition in this area for state-of-the-art status, but many of the approaches that work well are based on a knowledge graph of some type. This knowledge graph can be generated from your product data, it can come from external data like Wikidata, Wikipedia, DBpedia and so on, and it can also involve some level of curation.

So what can we do with semantic query parsing? Well, since we’re able with semantic query parsing to identify concepts in a query, I’d say mentions of concepts in a query, we can do some special processing for different concepts. We might say that some concepts are negotiable and some are not negotiable. So in this example, I’m searching for “sour cream pint.” I want a pint of sour cream, and the sour cream part is not negotiable. It’s not helpful to me that if you don’t have a pint of sour cream, you show me a pint of whipping cream. I need sour cream. Maybe the pint concept, which is of course a volume concept, maybe that’s negotiable, because if you are out of the 16 ounce sour cream, I could get two eight ounces, or get the 24 ounce and try to figure out what to do with the leftovers.

So semantic query parsing improves precision, because we can relax queries in an intelligent way: we hold on to the non-negotiable concepts and relax a little bit on the negotiable concepts. Think about this in combination with semantic vector search. Semantic vector search is really good at recall, but a little loose on precision. So you could think of semantic query parsing as adding a dose of precision to the semantic vector search results.
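[Editor's note: as a sketch, here is how a query like "sour cream pint" might be parsed into concept mentions and then relaxed. The tiny concept lexicon is invented for illustration; a real system would draw these concepts and their negotiability from a knowledge graph.]

```python
# Hypothetical concept lexicon; in practice this would be generated
# from product data, external sources like Wikidata, and curation.
CONCEPTS = {
    "sour cream": {"type": "product", "negotiable": False},
    "pint":       {"type": "volume",  "negotiable": True},
}

def parse(query):
    """Greedy longest-match tagging of concept mentions in the query."""
    found, tokens, i = [], query.lower().split(), 0
    while i < len(tokens):
        for j in range(len(tokens), i, -1):  # try longest span first
            phrase = " ".join(tokens[i:j])
            if phrase in CONCEPTS:
                found.append((phrase, CONCEPTS[phrase]))
                i = j
                break
        else:
            i += 1  # token matches no concept; skip it
    return found

def relax(parsed):
    """Keep non-negotiable concepts as hard constraints; drop the rest."""
    return [c for c, meta in parsed if not meta["negotiable"]]

print(relax(parse("sour cream pint")))  # ['sour cream'] — pint is relaxed
```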

Here’s a second idea. This is pretty closely related to what I just talked about for semantic query parsing. But since we are identifying concepts in the query, maybe some of those concepts require some special processing like a model that’s specific to the concept.

And some examples I give here: what if the query says inexpensive, or inexpensive blue jeans, or light blue top, or casual leggings, something like that? Well, maybe those concepts aren’t handled as well in one general kind of vector space model. I keep saying this: we don’t really have one model to rule them all yet. We find that we have some issues that need special attention, and I think color is one of those examples. We could use semantic query parsing to route those kinds of challenging concepts to a specialized model for further processing.

So color. When I say color, I’m talking about color names and the meaning of color names. This has been a problem for a long time; matching color based on name is really important in a lot of industries, and in apparel it’s important for sure. For more than a hundred years, people have worked on creating some sort of model of human perception, and one of the famous, highly used models is the CIELab color space. The idea is that if I locate a color name at a point in this color space, then as long as I stay near that point and don’t go too far away, most people will perceive the colors in that region as the same, or as matching the color name I just gave.

There’s also the idea of close enough. You know, a human could tell that these two colors, color A and color B, are a little bit different, but for my purposes, I’m putting together an outfit or decorating my home, whatever, they’re close enough. Well, that perception of close enough varies a lot as you move around in this three dimensional CIELab color space, and the model doesn’t do a very good job at that close enough problem.

And just to give a couple of examples: hot pink is more precise, with a tighter boundary. If we locate hot pink in this model, we can’t go very far away from it before people start saying, no, that’s not hot pink. But red has a much wider boundary.
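[Editor's note: the distance-in-color-space idea can be illustrated with the classic CIE76 delta E, which is just Euclidean distance in CIELab. The Lab coordinates below are rough illustrative values, not authoritative measurements.]

```python
import math

def delta_e_76(lab1, lab2):
    # CIE76 color difference: plain Euclidean distance in CIELab.
    # A delta E of roughly 2-3 is often cited as a just-noticeable
    # difference, but as the talk notes, the tolerance shoppers accept
    # as "close enough" varies a lot across the space.
    return math.dist(lab1, lab2)

# Rough illustrative Lab coordinates (L*, a*, b*), not authoritative:
hot_pink = (65.0, 61.0, -10.0)
red      = (53.0, 80.0, 67.0)
dark_red = (30.0, 53.0, 43.0)

# Red's boundary is wide: these two points are far apart in Lab space,
# yet many shoppers would still call both of them "red".
print(round(delta_e_76(red, dark_red), 1))
```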

Navy blue and dark blue, that’s another example of a variation in precision. And so maybe we need another approach for digital commerce, and maybe we shouldn’t worry too much about trying to directly model human perception, but instead model human behavior.

And so, back to our shopping signals: when people shop and search and include a color name in their query, that’s something we could pick up on with our semantic query parsing. We could create training data that says, here are the color names people are using to query; now, what did they actually add to cart, or what did they click through in the search results? And then perhaps we can train a vector space that works really well at associating the color vocabulary of the shopper with the catalog we’re looking at.

So what’s our opinion? I’ve given a lot of options: you could do this, you could do that. Well, it’s time to give an opinion, and here’s our take at Lucidworks. Today, we have an implementation of semantic vector search, and we have a particular solution for zero results and low performing queries that we call Never Null. Never Null is an implementation of semantic vector search, and we can help you with other applications of semantic vector search. That’s what we have today. Coming soon, we will have an implementation of semantic query parsing, and it will be integrated with semantic vector search. So, as I said, we’ll be able to add some precision guardrails around the high recall of vector search.

And then looking beyond coming soon, we’re into what research is going on, and that’s this area of how we could integrate concept specific semantic models. I used the color example, but there are many more. And I think this is also a real opportunity for collaboration, where you might bring your own model. Maybe we help you with some basic concepts, but in your business, you might have data scientists who know how to deal with specific concepts really well. And you could bring that model and plug it in. So the idea is we have semantic query parsing, which identifies concepts, and then routes those different concepts to specialized processing.

That is my presentation. Thank you for watching and please join me for some Q&A after this session.
