Y Combinator (YC), the renowned Silicon Valley startup accelerator, recently announced its Winter 2023 cohort and, unsurprisingly, ~31% of the startups (80 out of 269) carry a self-reported AI tag. Take the exact number with a grain of salt, but the trend is clear - startups leveraging AI are now a sizable part of the YC cohort.
For this piece, I analyzed 20-25 startups from this batch to understand the larger trends, particularly among startups leveraging LLMs (large language models). The trends span how they identify problems to solve, the approaches they take to solutions, what they are doing right, and the potential risks in their approach.
But before we go into trends, let’s start with a general framework for how tech companies (small or large) can think about generating value from AI.
AI value chain
If you have been following tech news recently, you've seen an explosion of content about AI, and it's not always easy to make sense of where a given piece of news fits in the broader picture. Let's use a simplified framework to think about it.
AI is a very broad term encompassing a wide range of technologies, from regression models that predict outcomes, to computer vision that identifies objects, to, most recently, LLMs. For this discussion, we will focus on LLMs, which have been in the spotlight since OpenAI opened ChatGPT to the public and kicked off an AI race among companies.
Tech companies leveraging AI typically operate in one of these three layers:
Infrastructure - This includes hardware providers (eg. NVIDIA, which makes the GPUs that support the heavy computation AI models require), compute providers (eg. Amazon AWS, Microsoft Azure, and Google Cloud, which provide processing power in the cloud), AI models / algorithms (eg. OpenAI and Anthropic, which provide LLMs), and AI platforms (eg. TensorFlow, which provides a platform for training models)
Data platform / tooling layer - This includes platforms that enable collecting, storing, and processing data for AI applications (eg. Snowflake, which provides a cloud data warehouse, and Databricks, which provides a unified analytics platform)
Application layer - This spans all companies (startups, mid-to-large tech companies, as well as not-natively-tech companies) that are leveraging AI for specific applications
Based on where the market currently is and how similar situations have played out in the past (eg. the cloud computing market), the Infrastructure and Data Platform layers will likely converge into a handful of players with relatively commoditized offerings. For example:
Among hardware players, NVIDIA is currently the leader with its GPU offerings (its stock tripled in 2023), and we'll have to see who else catches up
The compute market has already converged, with AWS, Azure and Google Cloud owning two-thirds of the market
In the AI algorithms layer, OpenAI came out strong with its GPT models, but it is a highly competitive market with deep-pocketed players (Google with DeepMind/Google Brain, Meta with LLaMA, Anthropic, Stability AI) - see this analysis if you want a deeper take. Two things to note here:
Most of these companies have access to the same data sets, and if one company does get access to a new paid data set (eg. Reddit), it is likely that the competitors will get access to that as well
The GPT model sits at the algorithms layer but the ChatGPT product sits at the application layer (not at the algorithms layer)
Given this likely path to commoditization, the companies operating in these layers have two possible paths they can pursue:
First path is beefing up their offerings to operate across layers, as evidenced by recent M&A activity
Snowflake (data warehousing company in the data platform layer) recently acquired Neeva to strengthen their search capability as well as potentially unlock application of LLMs for enterprises
Databricks (analytics platform in the data platform layer) acquired MosaicML (in the AI algorithms layer) to make “generative AI accessible for every organization, enabling them to build, own and secure generative AI models with their own data”
Second path is moving up to the application layer
ChatGPT is a classic example - OpenAI’s strength was in the AI algorithms layer but with the launch of a consumer product, they are now the first real competitor to Google search in decades
A majority of the future value unlocked from AI and LLMs will be at the application layer, including the value generated by founding new startups, which brings us to Y Combinator.
How Y Combinator (YC) works
A brief piece of context about YC, and then we'll jump into the trends.
Most YC companies are super early stage - 52% of the batch were accepted with only an idea and 77% of the batch had zero revenue before YC.
YC is quite selective (<2% acceptance rate) but primarily plays a volume game:
They invested in 300+ companies across two batches in 2023, and 600+ companies in 2022
Companies get a standard deal ($125k funding for 7% of equity)
YC provides startups a lot of mentorship as well as access to a large network of people including YC alums, investors, etc.
Therefore, YC only needs a few HUGE hits to make money (as with any angel investing), and several alums have been wildly successful
All this to say - YC is a good “representative” list for where the early stage startup market is and where opportunities lie for startups just getting off the ground by leveraging AI. With that, we’ll dive into the big trends.
AI startups trends
1. Focus on specific problems and customers
Startups are targeting focused problems for a focused set of customers, i.e. there are fewer “generic” AI solutions.
One such example is Yuma.ai, which focuses on assisting Shopify merchants who struggle with handling customer requests and concerns. By utilizing large language models (LLMs), Yuma.ai automates the generation of responses from a knowledge base. Another startup called Speedy is dedicated to supporting small and medium-sized businesses (SMBs) that lack the time to create marketing content using generative AI. Haven aims to automate approximately 50% of resident interactions for property managers. OfOne targets large fast food drive-thrus, helping them automate order-taking processes and increase profitability.
In all of these examples, there is a singular focus on a narrow problem space and customer, and applying LLMs in that context.
2. Integrations with existing software
In addition to just picking up GPT / LLMs and exposing them through a UI, several startups are taking a step further by integrating with existing software that their customers already use.
A prime example of this is Lightski, which focuses on integrating with customer relationship management (CRM) software such as Salesforce. Their goal is to enable customers to update their CRM by simply sending a natural language message through Slack, eliminating the need to navigate through layers of user interfaces. Yuma.ai offers a one-click installation feature into help desk software, combining the power of LLMs with the customers’ own knowledge base, generating draft responses for service agents.
These integrations have been a big driver in unlocking new use cases that out-of-the-box LLM applications like ChatGPT cannot easily solve.
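To make the integration pattern concrete, here is a minimal sketch of a natural-language-to-CRM flow of the kind Lightski's Slack interface suggests. This is an illustrative assumption, not Lightski's actual implementation: the prompt template, field names, and the canned model reply are all hypothetical, and a real system would call an LLM API where the canned reply appears.

```python
import json

# Hypothetical sketch: ask an LLM to turn a chat message into a structured
# JSON update, then apply that update to an in-memory CRM store.

PROMPT_TEMPLATE = (
    "Convert this message into a JSON object with keys 'record_id' and "
    "'fields' (field name -> new value). Message: {message}"
)

def build_prompt(message: str) -> str:
    """Prompt sent to the LLM; the model is expected to reply with JSON."""
    return PROMPT_TEMPLATE.format(message=message)

def apply_update(crm: dict, llm_reply: str) -> dict:
    """Parse the model's JSON reply and apply it to the CRM record."""
    update = json.loads(llm_reply)
    record = crm.setdefault(update["record_id"], {})
    record.update(update["fields"])
    return crm

# Canned model reply stands in for a real LLM API call
crm = {"acme-01": {"stage": "Prospecting"}}
reply = '{"record_id": "acme-01", "fields": {"stage": "Closed Won", "amount": 50000}}'
apply_update(crm, reply)
print(crm["acme-01"]["stage"])  # Closed Won
```

The value-add lives in the glue code around the model: constraining the output to JSON and mapping it onto software the customer already uses, rather than in the LLM itself.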
3. Leveraging LLMs in conjunction with other AI technologies
Startups are exploring creating a differentiated product by using other AI technologies such as computer vision and predictions in conjunction with LLMs.
One such example is Automat, whose customers provide a video demonstration of a repetitive Chrome process they wish to automate. Automat then utilizes computer vision techniques applied to screen recordings along with human natural language input to create the desired automations. Another startup called Persana AI takes advantage of CRM data integrations and publicly available data to predict potential hot leads for sales teams. They then employ LLMs to draft personalized outbound messages for each identified lead, utilizing available custom data about the individual.
Incorporating a combination of technologies is helping these startups create a moat and differentiation from generic LLM applications.
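The "prediction plus LLM" pattern described above can be sketched in a few lines. This is a toy illustration, not Persana AI's actual system: the scoring heuristic, field names, and prompt wording are all assumptions, with the LLM call replaced by the prompt a real system would send.

```python
# Hypothetical sketch: a simple scoring model ranks leads, and only the top
# leads get a personalized draft prompt for the LLM.

def score_lead(lead: dict) -> float:
    """Toy lead score weighting recent activity and company size (assumed fields)."""
    return 2.0 * lead.get("site_visits_last_30d", 0) + 0.001 * lead.get("employees", 0)

def hot_leads(leads: list, top_n: int = 2) -> list:
    """Return the top-scoring leads, the ones worth drafting outreach for."""
    return sorted(leads, key=score_lead, reverse=True)[:top_n]

def draft_prompt(lead: dict) -> str:
    """The prompt a real system would send to an LLM to draft outreach."""
    return (f"Write a short outbound email to {lead['name']} at {lead['company']}, "
            f"referencing their recent interest in our product.")

leads = [
    {"name": "Ana", "company": "Acme", "site_visits_last_30d": 9, "employees": 200},
    {"name": "Bo", "company": "Beta", "site_visits_last_30d": 1, "employees": 50},
    {"name": "Cy", "company": "Core", "site_visits_last_30d": 5, "employees": 5000},
]
for lead in hot_leads(leads):
    print(draft_prompt(lead))
```

The differentiation comes from the pipeline: a predictive step decides *who* to contact, and the LLM only handles *what* to say.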
4. Customization of LLMs
Many startups offer customization based on a customer's past data and language style, tailoring the underlying LLMs to each customer.
For instance, Speedy, a platform that assists SMBs in generating marketing content, conducts branding workshops with their customers. The insights gathered from these workshops are then fed into their models, allowing Speedy to capture and incorporate the unique voice and brand identity of each business into the generated content. Along similar lines, Yuma.ai focuses on learning the writing style from previous helpdesk tickets. By analyzing the patterns and language used in these interactions, Yuma.ai is able to generate draft responses that align with the established style, ensuring consistency and personalization in customer communications.
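One plausible lightweight way to "learn" a writing style without fine-tuning is few-shot prompting: include past replies as examples so the model imitates the established voice. The sketch below is an assumption about the general technique, not Yuma.ai's actual method.

```python
# Hypothetical few-shot prompt builder: past (question, reply) pairs in the
# brand's voice are prepended so the model continues in the same style.

def style_prompt(past_tickets: list, new_question: str) -> str:
    """past_tickets: (customer question, agent reply) pairs from old helpdesk tickets."""
    lines = ["Reply in the same tone and style as these past examples.\n"]
    for question, reply in past_tickets:
        lines.append(f"Customer: {question}\nAgent: {reply}\n")
    lines.append(f"Customer: {new_question}\nAgent:")
    return "\n".join(lines)

tickets = [
    ("Where is my order?", "Thanks so much for reaching out! Your order is on its way."),
    ("Can I get a refund?", "Of course! I've started the refund for you."),
]
prompt = style_prompt(tickets, "Do you ship internationally?")
print(prompt.count("Customer:"))  # 3
```

Fine-tuning on a large ticket history (likely closer to what these startups do at scale) achieves the same goal more durably, but few-shot prompting shows why past data is the customization lever.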
5. Creative user interfaces
One of the most underrated levers startups are starting to pull is building unique and useful UIs, something most current LLM products (like ChatGPT, Bard) are not great at. These interfaces, when customized to specific use cases, can unlock a ton of new value for customers and bring into the fold users who have not yet adopted existing products because of how difficult they are to use.
Type is an interesting example - they have built a flexible, fast document editor that lets a user quickly surface powerful AI commands by hitting cmd + k as they write. Type's AI understands the context of a document and adapts its suggestions as you write more, learning your style.
A couple of other interesting examples are Lightski's use of Slack as an interface for updating CRM information, and Persana AI's use of a Chrome extension to surface outbound drafts while you are on a person's LinkedIn page.
6. High information volume, high precision use cases
A handful of startups in the batch are focused on use cases that require both processing a large amount of information and a high level of precision in the insights discovered.
SPRX ingests data directly from your payroll and accounting systems to calculate an accurate R&D credit that meets the IRS's requirements.
In the world of healthcare, Fairway Health uses LLMs to analyze long (70+ page) medical records to determine if a patient is eligible for a treatment, helping health insurance companies be more efficient.
AiFlow uses LLMs to surface quotes and data from analyzing hundreds of documents, helping private equity firms conduct due diligence.
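A 70+ page record won't fit in a single LLM context window, so a common pattern for these high-volume use cases (a general assumption here, not any of these startups' actual pipelines) is to chunk the document, query each chunk, and aggregate the findings. The keyword matcher below stands in for a per-chunk LLM extraction call.

```python
# Hypothetical chunk-and-query sketch for long-document analysis.

def chunk_text(text: str, max_chars: int = 1000, overlap: int = 100) -> list:
    """Split text into overlapping chunks so facts near boundaries aren't lost."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        if start + max_chars >= len(text):
            break
        start += max_chars - overlap
    return chunks

def find_evidence(chunks: list, keyword: str) -> list:
    """Stand-in for per-chunk LLM extraction: indices of chunks mentioning a term."""
    return [i for i, chunk in enumerate(chunks) if keyword in chunk]

# Toy "medical record" with one decisive phrase buried in the middle
doc = ("patient history " * 200) + "diagnosis: eligible " + ("notes " * 200)
chunks = chunk_text(doc)
print(len(find_evidence(chunks, "eligible")) >= 1)  # True
```

The precision these use cases demand comes from the aggregation step: every extracted claim can be traced back to the specific chunk (and therefore page) it came from.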
7. Data silo-ed, BYOD products for enterprise customers
Unlike consumers, enterprises want control over how their data is used and shared with vendors, including providers of AI software. They want to bring their own data (BYOD) to a baseline product and customize it within a silo-ed environment.
The idea for CodeComplete first came up when its founders tried to use GitHub Copilot at Meta and the request was rejected internally over data privacy concerns. CodeComplete is now an AI coding assistant that is fine-tuned on customers' own codebases to deliver more relevant suggestions, with the models deployed on-premise or in the customer's own cloud.
In a similar vein, AlphaWatch AI, an AI copilot for hedge funds, helps customers use custom LLMs that leverage both external data sources and secure private data.
Moat Risks
It's definitely exciting to see a large number of AI startups emerge that help both individual consumers and organizations be more effective. These products will undoubtedly be huge unlocks for productivity and effectiveness in solving problems.
However, a key risk for several of these startups is the potential lack of a long-term moat. It is difficult to read too much into this given their stage and the limited public information available, but it's not hard to poke holes in their long-term defensibility. For example:
If a startup is built on the premise of taking base LLMs like GPT, building integrations into helpdesk software to understand knowledge base & writing style, and then generating draft responses, what’s stopping a helpdesk software giant (think Zendesk, Salesforce) from copying this feature and making it available as part of their product suite?
If a startup is building a cool interface for a text editor that helps with content generation, what's stopping Google Docs (which is already experimenting with auto-drafting) or Microsoft Word (which is already experimenting with Copilot tools) from copying it? One step further, what's stopping them from offering a 25% worse product for free as part of an existing suite (eg. Microsoft Teams taking over Slack's market share)?
The companies that don’t have a moat could still be successful in their current form, and the nature of what they do makes them attractive acquisition targets, both from a feature add-on and from a talent perspective. However, building a moat would be critical for startups that are interested in turning these early ideas into huge successes.
One clear approach is building a full product that solves a problem space and uses AI heavily as part of its feature set (vs. an AI-only product that is an add-on on top of an existing problem space).
For example, Pair AI is focused on the problem space of helping creators build more engaging courses in a Tiktok-esque format, with some AI features as part of their offering (like conversational Q&A). KURUKURU is building a 3D engine for creating comics, and has some AI features around creating characters.
Another approach is beefing up the product offering (going from an AI feature to a broader product for the problem space) by leveraging several of the above trends in tandem - data integrations, BYOD models, customization, and combining LLMs with other AI technologies.
It's a rapidly evolving market and we're a long way from seeing how these startups pan out - it'll be interesting to watch them take shape over the next few years. And good luck to YC's W23 cohort!