# Map your product

{% hint style="info" %}
**Assumed audience:** You’re a product manager working on an LLM-powered feature, and you’re familiar with how LLMs work. Use this as a starting point to think about the problems that LLMs are best suited to solve, and to start experimenting with these solutions.
{% endhint %}

## 👋 Introduction

The traditional process of identifying flows and steps is still important. Pick a core product flow or job-to-be-done, and identify specific places where a *tiny reasoning engine* will improve the experience.

***

## 1️⃣ Pick the right problem

To build with LLMs, your tech stack will need independent modules for sources, vector indexes, prompt selection, API calls, and fetching summaries. Building this infrastructure can be expensive, so it’s important to identify specific high-ROI applications in your product. Collaborate with engineering to ideate and to understand the capabilities of LLMs.
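To make the modules above concrete, here is a minimal sketch of how they might fit together. Every name in it is an illustrative assumption, not a prescribed API: a real stack would replace the bag-of-words retriever with an embedding model and vector index, and the stub with a call to a provider SDK.

```python
# Toy sketch of the pipeline: sources -> index -> retrieve -> prompt -> LLM call.
from collections import Counter
from math import sqrt

def load_sources() -> dict[str, str]:
    """Source module: return raw documents keyed by id (toy data)."""
    return {
        "refunds": "Customers can request a refund within 30 days.",
        "shipping": "Orders ship within two business days worldwide.",
    }

def embed(text: str) -> Counter:
    """Stand-in for an embedding: bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, index: dict[str, Counter], sources: dict[str, str]) -> str:
    """Vector-index module: pick the source closest to the query."""
    best = max(index, key=lambda k: cosine(embed(query), index[k]))
    return sources[best]

def build_prompt(query: str, context: str) -> str:
    """Prompt module: fill a template with the retrieved context."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

def call_llm(prompt: str) -> str:
    """API-call module (stubbed; a real app would call a provider here)."""
    return f"[summary of: {prompt.splitlines()[1]}]"

sources = load_sources()
index = {k: embed(v) for k, v in sources.items()}
query = "How long do refunds take?"
print(call_llm(build_prompt(query, retrieve(query, index, sources))))
```

Because each stage is its own function, a team can swap out one module (say, the retriever) without touching the rest, which is what makes this infrastructure reusable across the high-ROI applications you identify.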

> Khan Academy offers Khanmigo as a personal tutor for every student. 1:1 tutoring is an undeniably superior experience in ed-tech.

{% embed url="https://youtu.be/rnIgnS8Susg?si=Zx-6JFVZL-NBQKAe" %}

### Solution as a feature

Feature-level opportunities (e.g. Linear, Notion, Grain) take the form of tightly scoped AI features, such as inline context menus.

1. They are context specific.
2. They allow you to use the right UI for the feature, rather than retrofitting it to a conversational interface.
3. They are lighter and make it easier to weave AI interactions into your experience.

> [Linear's AI feature](https://linear.app/changelog/2023-06-01-ai-filters) is an option within its filtering UI. It makes an existing feature more powerful.

{% embed url="https://file.notion.so/f/f/a2b04ede-174b-4505-87ee-675ca7c8dac0/e93c0026-bba0-4cd9-ab77-851548a3dab7/AI_Filters.mp4?downloadName=AI+Filters.mp4&expirationTimestamp=1703224800000&id=3b556b33-cb7b-488a-92a8-0db2fe66822e&signature=a-OL4T5K6dWnXaOc1A_3BBV0MCDD06PyMwef6zvaXcs&spaceId=a2b04ede-174b-4505-87ee-675ca7c8dac0&table=block" %}

### Solution as a platform

Platform-level opportunities (e.g. Microsoft Copilot, Sidekick, Bard) take the form of a copilot or chat that works as a capability across your tool.

1. Natural language becomes far more powerful, because it is the native medium of LLMs.
2. Conversation makes it easy to maintain context as you move through a flow.
3. It’s powerful only when you genuinely need a central hub for your use cases.

> Shopify’s [Sidekick](https://www.shopify.com/in/magic) is an assistant that can take actions across different features of the product from a single interface.

{% embed url="https://www.youtube.com/embed/HVvbY7A7lIQ?si=EcYszUOODagH063a" %}

### Solution as a person

Sometimes a full-blown chat interaction is the solution. This is a good fit when you are emulating a real person.

> Intercom, Duolingo, and Khanmigo have all identified that being *conversational and personal* is a core part of their products, and leaned into it.

<figure><img src="/files/psD5YtmRSR9boKicbfzS" alt=""><figcaption></figcaption></figure>

**Now you should have clarity on a high-ROI problem that only an LLM can solve, and a solution concept. It’s time to start tinkering!**

***



# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://university.obvious.in/working-with-features/building-with-ai/map-your-product.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
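The request above can be sketched in a few lines. This is a minimal illustration, assuming only that the question must be percent-encoded into the `ask` query parameter; the helper name is hypothetical.

```python
# Build the documentation-query URL described above; urlencode handles
# percent-encoding of spaces and punctuation in the question.
from urllib.parse import urlencode

BASE = "https://university.obvious.in/working-with-features/building-with-ai/map-your-product.md"

def ask_url(question: str) -> str:
    """Return the GET URL with the question encoded as the `ask` parameter."""
    return f"{BASE}?{urlencode({'ask': question})}"

print(ask_url("Which products use a chat-first interface?"))
# An agent would then perform an HTTP GET on this URL and read the answer.
```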
