AI Demystified for Executives
Ever wish you had a coach to help you decipher the AI buzz and hype so you could make better-informed business decisions about what, when, and why to use AI in your business? You'll get that when you tune into the AI Demystified for Executives podcast.
Andrew bridges the gap between complex AI concepts and practical business applications. With experience in both large corporations and high-growth startups, he excels at communicating with business and technical teams alike. As an author, industry thought leader and international speaker, Andrew serves as your trusted advisor and coach on this AI journey.
AI doesn't have to be complicated!
Here is what you can expect from this podcast:
We'll explore a monthly theme with specific topics each week. You'll also receive a free cheat sheet or guide for reference.
In each podcast, our goal is to ensure that you walk away understanding:
- One key AI concept
- Its business applicability
- An actionable takeaway
Our Monday episodes are 7-10 minutes long, perfect for getting up to speed on the week's most important "aha" moments during your commute or while sipping your morning coffee or tea.
On Wednesdays, our episodes run 30-40 minutes, providing a deeper dive into the week's topic. (Maybe a bit much if you're driving!)
Once a month, during the Wednesday podcast, we'll host an interview with either a business or technical professional related to the monthly theme.
AI Demystified for Executives
#5 - Quick Bytes - Vector Database
In this episode of 'AI Demystified for Executives,' host Andrew Psaltis introduces vector databases and their significance in natural language search and conversational interfaces. The episode explains how vector databases enhance keyword search by understanding the context of words, akin to human conversation, and the role they play in providing long-term memory for large language models such as ChatGPT. The podcast alternates between Monday Quick Bytes and deeper Wednesday dives, with this episode focusing on a quick understanding of vector databases.
Dragonfly Rising's School of Data proudly presents the AI Demystified for Executives podcast. This is the podcast for executives who want to learn how to apply AI to their business. I'm your host, Andrew Psaltis.
As a reminder, this podcast is split into two parts: seven-minute Mondays, perfect for catching up on the week's most important aha moments during your commute or while sipping your morning coffee or tea, and 30-to-40-minute deep dives into the week's topics on Wednesdays. Those may be a bit much if you're driving!
Be sure to subscribe to this podcast on your favorite platform, wherever you listen to your other podcasts.
This week's topic is vector databases and why you should care. This is our Monday Quick Bytes episode. Have you ever wondered how, when you do a Google search, it returns results related to your search as if it understands the meaning of the words? Or how you can type a sentence into Spotify, for example "electric cars climate impact," and get all the podcasts related to it?
Users today are beginning to expect at least two new experiences: natural language search, as in these two examples, and conversational interfaces. Vector databases are at the heart of both expectations, so let's dig in a little deeper.
We're going to work backwards from the user's perspective. On our Wednesday podcast, we'll dig into the nuts and bolts in more detail.
First is vector search. You'll hear it talked about as natural language search, also referred to as semantic search, and as context for LLMs. Let's look at natural language search first.
In a nutshell, natural language search matches a query to the documents that are semantically correlated. Instead of needing exact word matches, it can match synonyms, paraphrases, and any other variation of natural language that expresses the same meaning. That's very much how we're used to interacting with each other: when we're having a conversation, we understand the concepts we're talking about. Traditional keyword search, on the other hand, you can think of as a book index: it maps terms to a list of document IDs or page numbers that mention the term.
A "document" in this case means whatever's being searched: a product, a job ad, a webpage, or your product documentation, maybe a 300-page PDF that provides all the details for your product. At the back of that 300-page PDF, you'd expect an index showing where each term is mentioned, just like in any book. Keyword search, which you may also hear referred to as traditional search, works just like that: you put in a keyword, and it looks for a matching document. Where you have problems is when the keyword doesn't match, and you get no results. There are lots of tricks to make keyword search work better: you could add synonyms to the index, add other phrases, and do some pretty interesting things so users most often get good results. But natural language search can capture more of the context and works differently.
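To make the "book index" analogy concrete, here's a minimal sketch of an inverted index, the data structure behind keyword search. The documents and terms here are invented purely for illustration:

```python
from collections import defaultdict

# Three invented "documents" to index.
docs = {
    1: "electric cars and their climate impact",
    2: "a history of jazz piano",
    3: "how electric vehicles change cities",
}

# Inverted index: each term maps to the set of documents containing it,
# just like a book index maps terms to page numbers.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def keyword_search(term):
    # Exact matches only: "vehicles" won't find document 1,
    # and "cars" won't find document 3, even though both are about EVs.
    return sorted(index.get(term.lower(), set()))
```

Notice the brittleness: a search for "cars" misses the document about "vehicles" unless someone manually adds synonyms to the index, which is exactly the gap semantic search closes.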
So let's imagine for a moment that we wanted to search over Jeopardy questions, and we want to get back all the questions that are about an animal.
With traditional search that relies on keywords, you'd have to write a query, or build an index, covering every question that contains "dog," contains "cat," contains "wolf," contains "lion," contains "bird": all these different entries you'd have to create. With semantic search, you could just query for the concept of animals. Imagine we were sitting at a table with a deck of cards, where each card is a Jeopardy question, and you said, "Give me back all the cards that have to do with animals." I'd be able to do that quickly because I understand the concept of animals, and vice versa; semantic search can do the same thing.
So it allows you to find these loosely similar things, similar to how we're used to operating as humans.
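Behind the scenes, semantic search works by turning each item into a vector of numbers (an embedding) and ranking items by similarity to the query's vector. Here's a toy sketch of that idea; the two-dimensional vectors and the "animals" query vector are hand-made stand-ins, since in a real system an embedding model would produce vectors with hundreds or thousands of dimensions:

```python
import math

# Toy two-dimensional "embeddings" standing in for real model output.
# Dimension 0 loosely means "animal-ness," dimension 1 "geography-ness."
cards = {
    "This canine is man's best friend": [0.90, 0.10],
    "This big cat is king of the jungle": [0.95, 0.05],
    "This river is the longest in Africa": [0.05, 0.90],
}

def cosine(a, b):
    # Cosine similarity: closer to 1.0 means "pointing the same way,"
    # i.e., closer in meaning.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, top_k=2):
    # Rank every card by similarity to the query vector.
    ranked = sorted(cards, key=lambda c: cosine(cards[c], query_vec), reverse=True)
    return ranked[:top_k]

# A made-up query vector representing the concept "animals."
animal_questions = semantic_search([1.0, 0.0])
```

Neither animal card contains the word "animal," yet both rank highest for the "animals" query: that's the concept-matching behavior described above, and a vector database is essentially this ranking done efficiently at scale.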
A more realistic example: if you use Spotify, you've experienced this. Spotify wanted to provide a better experience for users, helping them find more relevant content with less effort. When you switch from the keyword mindset to natural language search, it's less effort for your customers, and less effort for all of us as knowledge workers, to find relevant information. The example I used at the beginning was typing in the query "electric cars climate impact."
They're able to bring back podcast episode results related to your search. The terms "electric cars" and "climate impact" may not be in the title of the podcast, and may not even be in the description, but using natural language search, they're able to understand the concept.
So think about where in your business, internally or externally, there's an opportunity. Are you already using this type of search technology? If so, this should give you a better idea of what's really involved: what are your engineering teams using, and how is it working? And if you're not using it, how might you start thinking about it?
The second predominant use case for vector databases, and another reason they've become so popular since ChatGPT came on the market: you may have noticed that when you interact with any of these GPT chat interfaces, they have a memory and can continue the conversation you've been having. But that only lasts for so long.
And it's not what we'd consider long-term memory. When you have to start a new chat, you start from scratch; it carries nothing over. When you're prompted to start a new chat, again there's no memory of what you were discussing. You could think of it this way: they don't have long-term memory.
So they have the short-term, or working, memory we're used to: when we're engaged in a conversation, we have working memory of what's going on. But they don't have long-term memory of the conversation we had a year ago, or the conversation we had two years ago. That's where vector search comes into play: providing these large language models with long-term memory. These large language models are stateless; they forget what you just discussed if you don't store it, and they have no real long-term memory.
You can also think about this as a way to provide domain-specific context. You could give the large language model domain-specific business information, giving it context for what you're talking about, almost like you're supplying all of its memory as context. As humans, we have the benefit of being able to remember things for years. If you and I sat down for a conversation, we could switch contexts, and we'd also have a lot of context that developed over time as we communicated with each other. That context is always part of our conversation; it's part of the human experience. The language models don't have that.
So you provide them this context. You can think of it as providing all of this history, which improves their capabilities for things like question answering. Those are the two different ways to think about search with a vector database.
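As a rough sketch of how this "long-term memory" pattern works in practice: stored notes relevant to the question are retrieved and prepended to the prompt before it goes to the model. The notes and function name below are hypothetical; in a real system, the retrieval step would query a vector database by similarity to the question, while here everything is included for brevity:

```python
# Hypothetical "memory" for a stateless model: notes saved from
# earlier conversations. A real system would retrieve only the
# most similar notes from a vector database.
memory = [
    ("2023-04-02", "Customer prefers quarterly billing."),
    ("2023-06-15", "Customer's main product line is industrial pumps."),
]

def build_prompt(question):
    # Format the stored notes as context, then append the new question,
    # so the model "remembers" facts it was never trained on.
    context = "\n".join(f"[{date}] {note}" for date, note in memory)
    return (
        "Context from past conversations:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt("What billing cadence does this customer prefer?")
```

The model itself stays stateless; the vector database supplies the memory, one prompt at a time.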
So if you're using a large language model in your business, think about how you might use a vector database and vector search to provide it with your business context. And if you're not using one at this point, that's fine too. As you start to think about these types of technologies, keep in mind that you can use a vector database not just for natural language search, but also to provide context to a large language model. In our Wednesday deep dive, we're going to go into how these work in a bit more detail.
So please come back for that episode and give it a listen; you'll learn more about the nuts and bolts of how this works. That's our Quick Bytes for today. That's all for now.