
Will startups have a chance in the enterprise AI race?

It's impossible to escape the talk of AI as the biggest tech companies rush to build, or partner with the builders of, large language models and integrate them into their software and search services. The underlying technology is advancing fast enough that there are calls for a pause in the work, and for Congress to grill technology leaders on the issue.

But while ChatGPT and other similar tools are popular, there is a less discussed side to the current AI race: the enterprise.

Recent news from Appian, a public software company, and Neeva, a startup born to build a search engine that could compete with the offerings of large companies, makes it clear that the field of participants competing to build AI tools and services for large companies is healthy. And given how lucrative selling software to large companies can be, those participants are not chasing a small market.

Today we're going to look at how generative AI can fit into the enterprise, and then we're going to delve into the latest big news to better understand the direction tech companies are taking.

Industry or company?

What makes ChatGPT and related tools so fun to use is that you can throw almost anything at them and they'll give you a response. Want a generative AI service to write you a haiku about Dream Theater's discography? Here's what ChatGPT came back with this morning:

cascading melodies,
Dreams painted with symphonies,
Time travel unfolds.

I regret to inform you that it is a better poem, produced about 1,000 times faster, than anything I could have managed given the same prompt.

But while it's incredibly exciting to use generative AI tools built from simply massive data sets, businesses have different needs and priorities than the world's humble consumer population. Different needs, different inputs and different outputs. As Ron Miller wrote last month: "What if each industry or even each company had its own model trained to understand the jargon, language and approach of the individual entity?"

Recent news highlights that Ron might have hit the nail on the head.

In its latest earnings presentation, automation company Appian discussed its efforts to integrate new AI technologies into its own software suite. For reference, Appian provides process mining and automation tools along with low-code application-building capabilities. Here's Appian CEO Matt Calkins (via a Motley Fool transcript, emphasis added):

We're announcing [a new feature] that I call low-code AI that makes it easier for customers to cultivate their own AI on Appian's connected data sets. This public-private split separates Appian from its biggest competition. By being champions of private AI, we attract buyers who prefer not to share their data assets. Our ability to gather large data sets to train private AI algorithms comes from a feature called data fabric.
Data fabric is a fancy term for a virtual database and means that we can treat data from across the enterprise as if it were together even though it remains separate. This strategy is preferable for our customers, who don't like having to relocate data. Data is the most difficult part of building and running processes, so this feature is a substantial advantage. In turn, our data fabric gives us a critical advantage in inventing the next generation of process mining.
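To make the "virtual database" idea concrete, here's a toy sketch in Python. Everything in it (the store and fabric classes, the field names) is invented for illustration; this is not Appian's API, just the general shape of federating queries over data that stays where it lives:

```python
# Toy illustration of a "data fabric": two data sets stay in their own
# stores, but a thin federation layer lets us query them as one logical
# record. All names here are hypothetical, not Appian APIs.

class CrmStore:
    """Stands in for one system of record; data never leaves it."""
    def __init__(self):
        self.rows = {"acme": {"owner": "dana"}, "globex": {"owner": "lee"}}

    def get(self, key):
        return self.rows.get(key, {})

class BillingStore:
    """A second, physically separate system of record."""
    def __init__(self):
        self.rows = {"acme": {"arr": 120_000}, "globex": {"arr": 45_000}}

    def get(self, key):
        return self.rows.get(key, {})

class DataFabric:
    """Presents the separate stores as one logical record per key,
    merging on demand instead of relocating the data."""
    def __init__(self, *stores):
        self.stores = stores

    def record(self, key):
        merged = {}
        for store in self.stores:
            merged.update(store.get(key))
        return merged

fabric = DataFabric(CrmStore(), BillingStore())
print(fabric.record("acme"))  # {'owner': 'dana', 'arr': 120000}
```

The point of the sketch is the last class: nothing is copied or relocated, yet a consumer of `DataFabric` sees a unified record, which is exactly the property Calkins says makes it easier to assemble training data for private AI.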

Appian, valued at a few billion dollars and on track to surpass $500 million in revenue this year, is not a Big Tech company. It's just a big one that went public in 2017. It believes its role as the digital connective tissue between corporate data sets, helping companies find processes that are inefficient and can be automated, gives it an advantage in providing AI services to customers who do not want to lean on mass-market tools.

This is all very welcome. Not that I have a view on who will win the enterprise AI race. But I do like a competitive market, as it tends to generate not only the fastest pace of innovation but also a greater customer surplus thanks to competitive pricing. If Appian believes it has an advantage and can bring its new technology to market quickly, it could grab a big chunk of future enterprise AI (generative enterprise AI? enterprise generative AI? generative AI for enterprises?).

It's good that Appian is going to square off against the competing products that I assume will arrive in legions from the tech giants, but what about even smaller tech companies? What about the startups themselves?

Neeva is an interesting case. This search-focused startup wanted to build a new search engine that wasn't monetized with ads. Instead, users would pay a small monthly fee and Neeva could invest that revenue into search technology serving end users rather than advertisers. The idea was good.

But over the weekend, Neeva shut down its consumer search engine. Why? Because what it originally built was an interesting take on the old, classic search model. With consumer demand and corporate search work rapidly pivoting toward using LLMs to generate answers rather than lists of relevant links, Neeva had to adapt to the new reality:

By early 2022, it became clear to us the impact that generative AI and LLMs would have. We embarked on an ambitious effort to seamlessly integrate LLMs into our search stack. We brought the Neeva team together around the vision of creating an answer engine. We are proud to be the first search engine to offer real-time, cited AI answers to most queries earlier this year.
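The shift Neeva describes, from ranked links to cited answers, can be sketched in toy form: retrieve relevant passages, then assemble a response that cites them. The keyword-overlap retrieval and the tiny corpus below are placeholders; a real answer engine would use an LLM and learned ranking:

```python
# Toy "answer engine": retrieve passages, then compose a cited answer,
# instead of returning a ranked list of links. The corpus, function
# names and scoring scheme are illustrative, not any real search stack.

DOCS = {
    "doc1": "Neeva was a subscription search engine without ads.",
    "doc2": "Large language models can generate direct answers to queries.",
    "doc3": "Classic search engines return ranked lists of links.",
}

def tokens(text):
    """Lowercase, strip periods, split into a set of words."""
    return set(text.lower().replace(".", "").split())

def retrieve(query, docs, k=2):
    """Rank documents by naive keyword overlap with the query."""
    q = tokens(query)
    ranked = sorted(
        docs.items(),
        key=lambda item: len(q & tokens(item[1])),
        reverse=True,
    )
    return ranked[:k]

def answer(query, docs):
    """Compose an 'answer' from the top passages, with citations."""
    hits = retrieve(query, docs)
    body = " ".join(text for _, text in hits)
    cites = ", ".join(doc_id for doc_id, _ in hits)
    return f"{body} [sources: {cites}]"

print(answer("search engine without ads", DOCS))
# doc1 scores highest, doc3 next; prints both passages followed by
# "[sources: doc1, doc3]"
```

Swap the keyword scorer for embeddings and the string concatenation for an LLM summarizing the retrieved passages, and you have the rough architecture of the "real-time, cited AI answers" Neeva describes.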

However, the company added that user acquisition had proven difficult. According to Neeva, it was easier to get users to pay for the service than to get them to try it in the first place. With ChatGPT and LLM-based search from big companies like Bing and Google grabbing headlines, Neeva decided to try something new (emphasis added):

In the last year, we have seen the clear and pressing need to use LLMs effectively, cheaply, safely and responsibly. Many of the techniques we've pioneered with small models, downsizing, latency reduction, and cheap deployment are things businesses really want, and need, today. We are actively exploring how we can apply our search and LLM expertise in these environments, and in the coming weeks we will provide updates on the future of our work and our team.

It's too early to know whether Neeva will be able to use its technology to build internal LLMs for companies, but the fact that it's trying is interesting. Perhaps it will gain share in a more or less new market and give Big Tech an even denser competitive landscape to try to dominate. And if Neeva can do it, maybe other startups can too.

One final thought: Neeva's pivot may seem like a move away from search. But if search is going the LLM route, and Neeva is simply taking that technology and applying it to a particular type of customer, would it be fair to say it's still pursuing search? Enterprise search, sure, but it's a question I'm asking myself. Going a step further, is Appian building an enterprise search engine? Maybe.

If we expand the definition of search to AI-powered answers, and expect those same LLMs to help us create and execute tasks, perhaps search is simply evolving into "a chat box that can answer questions, create on demand, and help execute tasks." If so, many companies will fight over the same business terrain. Let's hope some startups carve out a niche for themselves.
