
With OpenAI’s addition of web search to ChatGPT, it’s clear the landscape of search is evolving fast. But how does this impact enterprise search, which has very different needs? We sat down with search expert Brian Land from Lucidworks to understand how ChatGPT’s update might signal what’s to come for enterprise search.


Q: OpenAI just announced that ChatGPT can now perform web searches. How does this change things for search overall?

Brian Land: ChatGPT’s addition of web search is a big step. Now, it can pull current information from across the web while still answering in a conversational style. That’s a huge advantage for general users who want quick answers on recent events or evolving topics. But for businesses, it’s different. An enterprise search tool needs to securely access private data while keeping information within the company. ChatGPT’s web search isn’t built for that level of security, so there’s a limit to its usefulness for enterprise cases.

Q: How does ChatGPT’s conversational style compare to what businesses need in search?

B.L.: Conversational search has a lot of potential, especially for companies that want to make searching more interactive. ChatGPT can handle follow-up questions naturally, which is something people like. But in a business setting, there’s more at stake. A tool needs to access specific, often confidential data and have tight controls to ensure that information doesn’t end up where it shouldn’t. Without the right protections and access control, the system risks showing sensitive information to unauthorized users, and that’s where most AI systems fall short. 

Vendors like Microsoft, with Microsoft Copilot, are working on this, but their security controls still lack the flexibility to grant users precisely the permissions they need. Without that fine-grained control, you're open to potential data exposure and compliance risk.
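The entitlement checks Land describes are commonly enforced as a query-time filter, so the search engine never returns documents a user cannot read. A minimal sketch of that pattern, with all field names and function names hypothetical (not a real Lucidworks or Solr API):

```python
# Hypothetical sketch: enforce document-level entitlements at query time,
# so results are filtered before any LLM or user ever sees them.
# The "acl" field and this request shape are illustrative assumptions.

def build_secure_query(user_groups: list[str], query_text: str) -> dict:
    """Attach an ACL filter clause to every search request."""
    acl_filter = " OR ".join(f'acl:"{g}"' for g in user_groups)
    return {
        "q": query_text,
        # Only documents readable by at least one of the user's groups
        # survive, no matter how relevant they are to the query.
        "fq": f"({acl_filter})",
    }

request = build_secure_query(["finance", "all-employees"], "Q3 revenue forecast")
# request["fq"] == '(acl:"finance" OR acl:"all-employees")'
```

Filtering at the engine, rather than trimming an LLM's answer afterward, means sensitive text never enters the generation step at all.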

Q: What kinds of data exposure or regulatory issues are possible without the right guardrails?

B.L.: Without strong guardrails, companies face a range of risks. One of the biggest is unintentional data leakage: sensitive internal information can surface in responses if the AI isn't carefully managed, including customer data, financial details, or proprietary information. And that's an expensive problem to fix; the global average cost of a data breach is estimated at $4.45 million.

Regulatory compliance is another major factor; laws like GDPR and CCPA have strict rules on how data can be accessed and shared, especially personal or customer data. If AI search isn’t set up to respect these rules, companies can face steep fines and serious damage to customer trust. Having strict access controls and entitlements is essential to managing who can see what within an enterprise’s private data sources.

At Lucidworks, we address these challenges with our private models available on Lucidworks AI. These models are secured to ensure that no private enterprise data is leaked publicly, providing businesses with the confidence that their data stays within their own ecosystem. Our approach focuses on controlled, compliant AI search solutions that keep customer and company data safe, allowing organizations to leverage the power of private LLMs without compromising security or compliance.

Did you know? The cost of a data breach is $4.45 million (global average).

 

Q: ChatGPT uses large language models (LLMs) like GPT-4 for search. Is that enough for enterprise use cases?

B.L.: Large language models like GPT-4 are impressive, but they aren’t the best fit for every business use case. Companies we work with are finding that smaller, more specialized models can perform specific tasks faster, at lower costs, and often with better quality. In fact, ChatGPT’s search relies exclusively on OpenAI’s model, which can be limiting. For enterprise use cases, businesses should ideally have the flexibility to select the models that fit each unique task. Having that range of model options available allows a search system to adapt and fine-tune responses according to different needs.
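The "right model for each task" idea can be reduced to a simple routing table: send each request type to the smallest model that handles it well, falling back to a large general model only when needed. A minimal sketch under stated assumptions (all model names here are placeholders, not real offerings):

```python
# Hypothetical sketch: route each task to the smallest adequate model
# instead of sending everything to one large general-purpose LLM.
# Model names are illustrative placeholders.

MODEL_ROUTES = {
    "classify": "small-classifier-v1",   # fast, cheap, task-specific
    "summarize": "mid-summarizer-v2",    # mid-size, tuned for one job
    "open_qa": "large-general-llm",      # big model for open-ended questions
}

def pick_model(task: str) -> str:
    """Return the routed model, defaulting to the large general model."""
    return MODEL_ROUTES.get(task, MODEL_ROUTES["open_qa"])
```

A single fixed model, as in ChatGPT's search, is equivalent to a one-entry version of this table: no room to trade off speed, cost, or quality per task.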


Pro Tip: Specialized, smaller models can deliver faster, more relevant results at lower costs—ideal for specific enterprise needs.


Q: Tools like Perplexity are making headlines. Does this impact enterprise search?

B.L.: Yes, it shows that AI in search is going mainstream, which is great for the field as a whole. But enterprise search goes deeper than what these tools provide. It’s not just about finding info on the public web. It’s about accessing private, often sensitive information that’s locked behind company firewalls. Our job is to keep that access secure while delivering results tailored to a company’s specific data needs, which you don’t get with a tool made for general web search.

Q: What are some main challenges for enterprises thinking about using tools like ChatGPT for search?

B.L.: Security is a huge one. Companies need strict controls over who can see what. Many enterprise search tools, including Lucidworks, are built with security-first approaches to prevent data from being accessible to the wrong people. Then there’s model flexibility—not every business problem needs a huge model like GPT-4. In fact, smaller models are sometimes more accurate and cost-effective for specific tasks, so having flexibility is key.

Q: How does the user experience (UX) differ between ChatGPT’s approach and what enterprises need?

B.L.: ChatGPT’s conversational answers are useful, but people in companies are also used to more traditional search tools, with filters and facets that help them quickly refine results. This fine-tuning ability is often an essential lever in business environments. Pure conversational AI lacks that kind of structured UX, so it can be harder for business users to narrow down information and get to what they need. On the other hand, modern Gen AI search platforms are built with these familiar tools so users have the choice of interacting in the way that works best for them.
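The hybrid UX Land describes pairs a free-text question with the structured filters and facets users already know. A minimal sketch of such a request, with all parameter names hypothetical (loosely modeled on common faceted-search conventions, not a confirmed API):

```python
# Hypothetical sketch: one request that combines a conversational query
# with structured filters and the facets to render in the UI.
# Field and parameter names are illustrative assumptions.

def hybrid_search_request(question: str, filters: dict[str, str]) -> dict:
    return {
        "q": question,                                    # free-text question
        "fq": [f"{f}:{v}" for f, v in filters.items()],   # user-chosen refinements
        "facet.field": ["department", "doc_type", "year"],  # facets shown for drill-down
    }

req = hybrid_search_request("latest laptop policy", {"department": "IT"})
```

The structured `fq` clauses give users the deterministic narrowing that pure conversational follow-ups can't guarantee.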

Q: Can you talk about why enterprise taxonomies are so important for search relevance?

B.L.: Absolutely. Enterprise taxonomies, a business's unique classification of terms and topics, are so important because they set a shared language across the organization. General models like GPT-4 rely purely on broad semantic understanding, so they don't inherently know these company-specific structures. A modern Gen AI search platform, however, can inherit taxonomies and knowledge graphs from a business's existing systems. This integration makes responses much more relevant to what a user is looking for and saves time by eliminating guesswork.
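One way a taxonomy feeds search relevance is query expansion: company-specific terms are expanded with their sanctioned synonyms before retrieval, so documents match even when they use different wording. A minimal sketch, with a made-up taxonomy purely for illustration:

```python
# Hypothetical sketch: expand a query using an enterprise taxonomy so
# company-specific shorthand matches documents that use other wording.
# The taxonomy entries here are invented examples.

TAXONOMY = {
    "pto": ["paid time off", "vacation", "leave of absence"],
    "comp": ["compensation", "salary", "pay band"],
}

def expand_query(query: str) -> list[str]:
    """Return the original query plus any taxonomy synonyms it triggers."""
    terms = [query]
    for key, synonyms in TAXONOMY.items():
        if key in query.lower().split():
            terms.extend(synonyms)
    return terms
```

A general-purpose model has no way to know that "pto" and "leave of absence" are the same concept inside one particular company; the taxonomy supplies exactly that mapping.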

Taxonomy matters: For enterprises, a shared language across the organization boosts search relevance and accuracy.

 

Q: How is Lucidworks addressing these challenges differently than other providers?

B.L.: We focus on combining powerful AI with real-world needs for enterprises. Our platform blends deep natural language understanding with the flexibility to be fine-tuned for specific business contexts. Modern Gen AI search platforms, like ours, also provide capabilities like automated, prompt-generated follow-up questions, which help users uncover more insights as they search. This means we’re not only offering high-level AI but also layering it with the controls and tools that give enterprises a meaningful, relevant search experience.

Q: What’s the biggest advantage Lucidworks has in enterprise search today?

B.L.: It’s our commitment to balancing innovation with the practical needs of business. While ChatGPT is making waves with its advancements in conversational AI, Lucidworks sets itself apart by providing secure, highly relevant search solutions tailored to the unique challenges enterprises face. One of our standout features is Lucidworks AI and our hosted private models, which include cutting-edge options like Llama 3, Mistral, and other specialized models. These models are designed to be hosted privately, ensuring that enterprises can leverage powerful LLM capabilities without the risk of data leakage or security breaches.

By offering a selection of these models, we give organizations the flexibility to choose the right model for different use cases—whether that means optimizing for speed, cost efficiency, or highly specific task performance. This approach allows businesses to harness advanced AI for their search needs while maintaining strict control over their data, meeting both innovation and security standards seamlessly.

Q: Any last thoughts on OpenAI’s move and what it signals for enterprise search?

B.L.: OpenAI’s update shows that AI search is a big area of growth, and that’s exciting for all of us in this space. But enterprise search requires that extra level of control and understanding that goes beyond general AI. As more businesses look to use AI for search, we’re here to help them do it effectively and securely.

Want to stay up to date on the latest advancements in AI, search, and digital experience? Subscribe to our monthly newsletter for insights like these straight to your inbox.
