docs: improve build with AI chapter (#14259)

* docs: improve build with AI chapter

* update sidebar
Author: Shahed Nasser
Date: 2025-12-09 16:59:39 +02:00 (committed by GitHub)
Parent: cc84b9ac49
Commit: c26f5643b6
5 changed files with 111 additions and 89 deletions


@@ -18,50 +18,6 @@ export const metadata = {
In this chapter, you'll learn how you can use AI assistants and LLMs effectively in your Medusa development.
## AI Assistant in Documentation
The Medusa documentation is equipped with an AI Assistant that can answer your questions and help you build customizations with Medusa.
### Open the AI Assistant
To open the AI Assistant, either:
- Use the keyboard shortcut `Ctrl + I` for Windows/Linux, or `Cmd + I` for macOS.
- Click the <InlineIcon Icon={AiAssistent} alt="AI Assistant" /> icon in the top right corner of the documentation.
You can then ask the AI Assistant any questions about Medusa, such as:
- What is a workflow?
- How to create a product review module?
- How to update Medusa?
- How to fix this error?
The AI Assistant will provide you with relevant documentation links, code snippets, and explanations to help you with your development.
### Ask About Code Snippets
While browsing the documentation, you'll find a <InlineIcon Icon={AiAssistentLuminosity} alt="AI Assistant" /> icon in the header of code snippets. You can click this icon to ask the AI Assistant about the code snippet.
The AI Assistant will analyze the code and provide explanations, usage examples, and any additional information you need to understand how the code works.
### Ask About Documentation Pages
If you need more help understanding a specific documentation page, you can click the "Explain with AI Assistant" link in the page's right sidebar. This will open the AI Assistant and provide context about the current page, allowing you to ask questions related to the content.
### Formatting and Code Blocks
In your questions to the AI Assistant, you can format code blocks by wrapping them in triple backticks (```). For example:
~~~markdown
```
console.log("Hello, World!")
```
~~~
You can add new lines using the `Shift + Enter` shortcut.
---
## MCP Remote Server
The Medusa documentation provides a remote Model Context Protocol (MCP) server that allows you to find information from the Medusa documentation right in your IDEs or AI tools, such as Cursor.
@@ -124,6 +80,61 @@ claude mcp add --transport http medusa https://docs.medusajs.com/mcp
</TabsContentWrapper>
</Tabs>
### How to Use the MCP Server
After connecting to the Medusa MCP server in your AI tool or IDE, you can start asking questions or requesting your AI assistant to build Medusa customizations. It will fetch the relevant information from the Medusa documentation and provide you with accurate answers, code snippets, and explanations.
For example, you can ask:
1. "Create a Product Review module for Medusa. Refer to the Medusa documentation for information."
2. "How to update Medusa to the latest version?"
3. "Explain the Medusa workflow system."
4. "Integrate Medusa with X provider. Refer to the Medusa documentation for information."
---
## AI Assistant in Documentation
The Medusa documentation is equipped with an AI Assistant that can answer your questions and help you build customizations with Medusa.
### Open the AI Assistant
To open the AI Assistant, either:
- Use the keyboard shortcut `Ctrl + I` for Windows/Linux, or `Cmd + I` for macOS.
- Click the <InlineIcon Icon={AiAssistent} alt="AI Assistant" /> icon in the top right corner of the documentation.
You can then ask the AI Assistant any questions about Medusa, such as:
- What is a workflow?
- How to create a product review module?
- How to update Medusa?
- How to fix this error?
The AI Assistant will provide you with relevant documentation links, code snippets, and explanations to help you with your development.
### Ask About Code Snippets
While browsing the documentation, you'll find a <InlineIcon Icon={AiAssistentLuminosity} alt="AI Assistant" /> icon in the header of code snippets. You can click this icon to ask the AI Assistant about the code snippet.
The AI Assistant will analyze the code and provide explanations, usage examples, and any additional information you need to understand how the code works.
### Ask About Documentation Pages
If you need more help understanding a specific documentation page, you can click the "Explain with AI Assistant" link in the page's right sidebar. This will open the AI Assistant and provide context about the current page, allowing you to ask questions related to the content.
### Formatting and Code Blocks
In your questions to the AI Assistant, you can format code blocks by wrapping them in triple backticks (```). For example:
~~~markdown
```
console.log("Hello, World!")
```
~~~
You can add new lines using the `Shift + Enter` shortcut.
---
## Plain Text Documentation


@@ -122,7 +122,7 @@ export const generatedEditDates = {
"app/learn/fundamentals/api-routes/override/page.mdx": "2025-05-09T08:01:24.493Z",
"app/learn/fundamentals/module-links/index/page.mdx": "2025-05-23T07:57:58.958Z",
"app/learn/fundamentals/module-links/index-module/page.mdx": "2025-11-26T11:20:25.961Z",
"app/learn/introduction/build-with-llms-ai/page.mdx": "2025-10-02T15:10:49.394Z",
"app/learn/introduction/build-with-llms-ai/page.mdx": "2025-12-09T14:17:13.295Z",
"app/learn/installation/docker/page.mdx": "2025-10-24T08:53:46.445Z",
"app/learn/fundamentals/generated-types/page.mdx": "2025-07-25T13:17:35.319Z",
"app/learn/introduction/from-v1-to-v2/page.mdx": "2025-10-29T11:55:11.531Z",


@@ -44,20 +44,20 @@ export const generatedSidebars = [
"loaded": true,
"isPathHref": true,
"type": "link",
"title": "Architecture",
"path": "/learn/introduction/architecture",
"title": "AI Assistants and LLMs",
"path": "/learn/introduction/build-with-llms-ai",
"children": [],
"chapterTitle": "1.3. Architecture",
"chapterTitle": "1.3. AI Assistants and LLMs",
"number": "1.3."
},
{
"loaded": true,
"isPathHref": true,
"type": "link",
"title": "AI Assistants and LLMs",
"path": "/learn/introduction/build-with-llms-ai",
"title": "Architecture",
"path": "/learn/introduction/architecture",
"children": [],
"chapterTitle": "1.4. AI Assistants and LLMs",
"chapterTitle": "1.4. Architecture",
"number": "1.4."
},
{


@@ -25870,6 +25870,51 @@ The following diagram illustrates Medusa's architecture including all its layers
In this chapter, you'll learn how you can use AI assistants and LLMs effectively in your Medusa development.
## MCP Remote Server
The Medusa documentation provides a remote Model Context Protocol (MCP) server that allows you to find information from the Medusa documentation right in your IDEs or AI tools, such as Cursor.
Medusa hosts a Streamable HTTP MCP server available at `https://docs.medusajs.com/mcp`. You can add it to AI agents that support connecting to MCP servers.
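Because the endpoint uses the standard Streamable HTTP transport, any MCP-capable client can connect to it. As a rough connectivity check (a sketch based on the MCP spec's JSON-RPC `initialize` handshake, not an official Medusa example), you can hit the endpoint with `curl`:
```sh
# Send an MCP initialize request to the remote server; a JSON or event-stream
# response indicates the endpoint is reachable (protocol version is assumed).
curl -X POST https://docs.medusajs.com/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"connectivity-check","version":"0.0.0"}}}'
```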
### Cursor
<Link href="cursor://anysphere.cursor-deeplink/mcp/install?name=medusa&config=eyJ1cmwiOiJodHRwczovL2RvY3MubWVkdXNhanMuY29tL21jcCJ9" target="_blank" rel="noopener noreferrer" variant="content">Click here</Link> to add the Medusa MCP server to Cursor.
To manually connect to the Medusa MCP server in Cursor, add the following to your `.cursor/mcp.json` file or Cursor settings, as explained in the [Cursor documentation](https://docs.cursor.com/context/model-context-protocol):
```json
{
"mcpServers": {
"medusa": {
"url": "https://docs.medusajs.com/mcp"
}
}
}
```
### Claude Code
To manually connect to the Medusa MCP server in Claude Code, run the following command in your terminal:
```sh
claude mcp add --transport http medusa https://docs.medusajs.com/mcp
```
### VSCode
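VSCode can also connect to remote MCP servers through its MCP configuration. A minimal sketch, assuming VSCode's `.vscode/mcp.json` format for remote servers (not an official Medusa example):
```json
{
  "servers": {
    "medusa": {
      "type": "http",
      "url": "https://docs.medusajs.com/mcp"
    }
  }
}
```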
### How to Use the MCP Server
After connecting to the Medusa MCP server in your AI tool or IDE, you can start asking questions or requesting your AI assistant to build Medusa customizations. It will fetch the relevant information from the Medusa documentation and provide you with accurate answers, code snippets, and explanations.
For example, you can ask:
1. "Create a Product Review module for Medusa. Refer to the Medusa documentation for information."
2. "How to update Medusa to the latest version?"
3. "Explain the Medusa workflow system."
4. "Integrate Medusa with X provider. Refer to the Medusa documentation for information."
***
## AI Assistant in Documentation
The Medusa documentation is equipped with an AI Assistant that can answer your questions and help you build customizations with Medusa.
@@ -25914,40 +25959,6 @@ You can add new lines using the `Shift + Enter` shortcut.
***
## MCP Remote Server
The Medusa documentation provides a remote Model Context Protocol (MCP) server that allows you to find information from the Medusa documentation right in your IDEs or AI tools, such as Cursor.
Medusa hosts a Streamable HTTP MCP server available at `https://docs.medusajs.com/mcp`. You can add it to AI agents that support connecting to MCP servers.
### Cursor
<Link href="cursor://anysphere.cursor-deeplink/mcp/install?name=medusa&config=eyJ1cmwiOiJodHRwczovL2RvY3MubWVkdXNhanMuY29tL21jcCJ9" target="_blank" rel="noopener noreferrer" variant="content">Click here</Link> to add the Medusa MCP server to Cursor.
To manually connect to the Medusa MCP server in Cursor, add the following to your `.cursor/mcp.json` file or Cursor settings, as explained in the [Cursor documentation](https://docs.cursor.com/context/model-context-protocol):
```json
{
"mcpServers": {
"medusa": {
"url": "https://docs.medusajs.com/mcp"
}
}
}
```
### VSCode
### Claude Code
To manually connect to the Medusa MCP server in Claude Code, run the following command in your terminal:
```sh
claude mcp add --transport http medusa https://docs.medusajs.com/mcp
```
***
## Plain Text Documentation
The Medusa documentation is available in plain text format, which allows LLMs and AI tools to easily parse and understand the content.
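For example, assuming the documentation follows the common `llms.txt` convention for plain-text indexes, an LLM pipeline could fetch the content directly:
```sh
# Fetch the plain-text documentation index (assumed conventional llms.txt path)
curl https://docs.medusajs.com/llms.txt
```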


@@ -27,13 +27,13 @@ export const sidebars = [
},
{
type: "link",
title: "Architecture",
path: "/learn/introduction/architecture",
title: "AI Assistants and LLMs",
path: "/learn/introduction/build-with-llms-ai",
},
{
type: "link",
title: "AI Assistants and LLMs",
path: "/learn/introduction/build-with-llms-ai",
title: "Architecture",
path: "/learn/introduction/architecture",
},
{
type: "link",