Assumed audience: You’re a product manager working on an LLM-powered feature, and you’re familiar with how LLMs work. Use this as a starting point for thinking about the problems LLMs are best suited to solve, and for experimenting with those solutions.
👋 Introduction
The traditional process of identifying flows and steps still matters. Pick a core product flow or job-to-be-done, and identify the specific places where a tiny reasoning engine will improve the experience.
1️⃣ Pick the right problem
To build with LLMs, the tech stack needs independent modules for data sources, vector indexes, prompt selection, API calls, and response summaries. Building this infrastructure can be expensive, so it’s important to identify specific high-ROI applications in your product. Collaborate with engineering to ideate and to understand what LLMs can and cannot do.
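The modules above can be sketched in a few dozen lines. Everything here is illustrative: the "vector index" is a keyword-overlap stand-in for a real embedding store, and `call_llm` is a stub you would replace with an actual model API call.

```python
import re
from dataclasses import dataclass

@dataclass
class Document:
    title: str
    text: str

def tokens(s: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z]+", s.lower()))

def retrieve(query: str, docs: list[Document], k: int = 2) -> list[Document]:
    """Stand-in for a vector index: rank docs by shared keywords."""
    q = tokens(query)
    scored = sorted(docs, key=lambda d: -len(q & tokens(d.text)))
    return scored[:k]

def build_prompt(query: str, context: list[Document]) -> str:
    """Prompt-selection module: assemble the text sent to the model."""
    sources = "\n".join(f"- {d.title}: {d.text}" for d in context)
    return f"Answer using these sources:\n{sources}\n\nQuestion: {query}"

def call_llm(prompt: str) -> str:
    """Stub for the model API call (an HTTP request in production)."""
    return f"[summary based on prompt of {len(prompt)} chars]"

docs = [
    Document("Onboarding", "how new users activate the product"),
    Document("Billing", "invoices, refunds, and payment methods"),
]
query = "refunds and payment methods"
context = retrieve(query, docs)
answer = call_llm(build_prompt(query, context))
print(answer)
```

Each function maps to one of the modules named above, which is the point of keeping them independent: you can swap the retrieval stand-in for a real vector database, or the stub for a real model, without touching the rest.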
Khan Academy, for example, offers Khanmigo as a personal tutor for every student; 1:1 tutoring is an undeniably superior experience in ed-tech.
Solution as a feature
Feature-level opportunities (e.g., Linear, Notion, Grain) can take the form of tightly scoped AI features surfaced through inline context menus.
- They are context-specific.
- They let you use the right UI for the feature instead of retrofitting it into a conversational interface.
- They are lighter, making it easier to weave AI interactions into your experience.
Linear's AI feature is an option within its Filter feature. It makes an existing feature more powerful.
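A minimal sketch of the pattern this kind of feature embodies: turn a natural-language query into structured filter fields, then apply them with ordinary product code. Linear's real implementation is not public; `llm_to_filter` here is a stub with a canned response standing in for a model call prompted to emit JSON.

```python
def llm_to_filter(query: str) -> dict:
    """Stub for an LLM call prompted to return structured filter fields.

    In production this would be a model API call with a JSON-output
    prompt; here a canned response covers the example query.
    """
    canned = {
        "urgent bugs assigned to me": {
            "label": "bug", "priority": "urgent", "assignee": "me",
        },
    }
    return canned.get(query.lower(), {})

issues = [
    {"id": 1, "label": "bug", "priority": "urgent", "assignee": "me"},
    {"id": 2, "label": "feature", "priority": "low", "assignee": "me"},
]

filters = llm_to_filter("Urgent bugs assigned to me")
matches = [i for i in issues if all(i.get(k) == v for k, v in filters.items())]
print([i["id"] for i in matches])  # → [1]
```

The model only translates intent into parameters; the existing filter feature does the actual work. That division keeps the AI surface small and makes the feature more powerful without a chat interface.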