docs: create docs workspace (#5174)

* docs: migrate ui docs to docs universe

* created yarn workspace

* added eslint and tsconfig configurations

* fix eslint configurations

* fixed eslint configurations

* shared tailwind configurations

* added shared ui package

* added more shared components

* migrating more components

* made details components shared

* move InlineCode component

* moved InputText

* moved Loading component

* Moved Modal component

* moved Select components

* Moved Tooltip component

* moved Search components

* moved ColorMode provider

* Moved Notification components and providers

* used icons package

* use UI colors in api-reference

* moved Navbar component

* used Navbar and Search in UI docs

* added Feedback to UI docs

* general enhancements

* fix color mode

* added copy colors file from ui-preset

* added features and enhancements to UI docs

* move Sidebar component and provider

* general fixes and preparations for deployment

* update docusaurus version

* adjusted versions

* fix output directory

* remove rootDirectory property

* fix yarn.lock

* moved code component

* added vale for all docs MD and MDX

* fix tests

* fix vale error

* fix deployment errors

* change ignore commands

* add output directory

* fix docs test

* general fixes

* content fixes

* fix announcement script

* added changeset

* fix vale checks

* added nofilter option

* fix vale error
This commit is contained in:
Shahed Nasser
2023-09-21 20:57:15 +03:00
committed by GitHub
parent 19c5d5ba36
commit fa7c94b4cc
3209 changed files with 32188 additions and 31018 deletions


@@ -0,0 +1,476 @@
---
description: 'Learn about the different configurations available in a Medusa backend. This includes configurations related to the database, CORS, plugins, and more.'
---
# Configure Medusa Backend
This document provides a reference of all accepted Medusa configurations in `medusa-config.js`.
## Prerequisites
This document assumes you already followed along with the [Prepare Environment documentation](./prepare-environment.mdx) and have [installed a Medusa backend](./install.mdx#create-a-medusa-backend).
---
## Medusa Configurations File
The configurations for your Medusa backend are in `medusa-config.js` located in the root of your Medusa project. The configurations include database, modules, and plugin configurations, among other configurations.
`medusa-config.js` exports an object having the following properties:
| Property Name | Description | Required |
| --- | --- | --- |
| [projectConfig](#projectconfig) | An object that holds general configurations related to the Medusa backend, such as database or CORS configurations. | Yes |
| [plugins](#plugins) | An array of plugin configurations that defines what plugins are installed and optionally specifies each of their configurations. | No |
| [modules](#modules) | An object that defines what modules are installed and optionally specifies each of their configurations. | No |
| [featureFlags](#featureflags) | An object that enables or disables features guarded by a feature flag. | No |
For example:
```js title=medusa-config.js
module.exports = {
projectConfig,
plugins,
modules,
featureFlags,
}
```
---
## Environment Variables
Many of the configurations mentioned in this guide are recommended to have their values set in environment variables and referenced within `medusa-config.js`.
During development, you can set your environment variables in the `.env` file at the root of your Medusa backend project. In production, setting the environment variables depends on the hosting provider.
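For example, a minimal `.env` file might look like the following (the values are placeholders for illustration only):

```bash
DATABASE_URL=postgres://postgres@localhost/medusa-store
JWT_SECRET=some-long-random-string
COOKIE_SECRET=another-long-random-string
```

Each variable can then be referenced in `medusa-config.js` through `process.env`, as shown in the configuration sections of this guide.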
---
## projectConfig
This section includes all configurations that belong to the `projectConfig` property in the configuration object exported by `medusa-config.js`. The `projectConfig` property is an object, and each of the configurations in this section is added to this object as key-value pairs.
### admin_cors and store_cors
The Medusa backend's endpoints are protected by Cross-Origin Resource Sharing (CORS). So, only allowed URLs, or URLs matching a specified pattern, can send requests to the backend's endpoints.
`admin_cors` is used to specify the accepted URLs or patterns for admin endpoints, and `store_cors` is used to specify the accepted URLs or patterns for store endpoints.
For both the `admin_cors` and `store_cors`, the value is expected to be a string. This string can be a comma-separated list of accepted origins. Every origin in that list can be of the following types:
1. A URL. For example, `http://localhost:8000`. The URL shouldn't end with a trailing slash.
2. A regular expression pattern that can match more than one origin. For example, `.example.com`. The regex pattern that the backend tests for is `^([\/~@;%#'])(.*?)\1([gimsuy]*)$`.
Here are some examples of common use cases:
```bash
# Allow different ports locally starting with 700
ADMIN_CORS=/http:\/\/localhost:700\d+$/
# Allow any origin ending with vercel.app. For example, storefront.vercel.app
STORE_CORS=/vercel\.app$/
# Allow all HTTP requests
ADMIN_CORS=/http:\/\/*/
```
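To illustrate the distinction between the two entry types, here's a small sketch of how an origin entry could be matched (an assumption for illustration only, not Medusa's actual implementation), using the regex pattern check mentioned above:

```js
// Sketch: decide whether a CORS origin entry is a regex pattern or a plain
// URL, then match an incoming origin against it.
const patternCheck = /^([\/~@;%#'])(.*?)\1([gimsuy]*)$/

function originMatches(entry, origin) {
  const match = patternCheck.exec(entry)
  if (match) {
    // Entries like "/vercel\.app$/" are rebuilt as regular expressions.
    return new RegExp(match[2], match[3]).test(origin)
  }
  // Plain URL entries must match exactly.
  return entry === origin
}

console.log(originMatches("/vercel\\.app$/", "https://storefront.vercel.app")) // true
console.log(originMatches("http://localhost:8000", "http://localhost:8000")) // true
```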
Typically, the value of these configurations would be set in an environment variable and referenced in `medusa-config.js`:
```js title=medusa-config.js
module.exports = {
projectConfig: {
admin_cors: process.env.ADMIN_CORS,
store_cors: process.env.STORE_CORS,
// ...
},
// ...
}
```
If you're adding the value directly within `medusa-config.js`, make sure to add an extra escaping backslash (`\`) for every backslash in the pattern. For example:
```js title=medusa-config.js
module.exports = {
projectConfig: {
admin_cors: "/http:\\/\\/localhost:700\\d+$/",
store_cors: "/vercel\\.app$/",
// ...
},
// ...
}
```
### cookie_secret
A string that is used to create cookie tokens. Although this configuration option is not required, it's highly recommended to set it for better security. It's also recommended to generate a random string.
In a development environment, if this option is not set, the default secret is `supersecret`. However, in production, if this configuration is not set, an error is thrown and your backend crashes.
Typically, the value of this configuration would be set in an environment variable and referenced in `medusa-config.js`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
cookie_secret: process.env.COOKIE_SECRET,
// ...
},
// ...
}
```
### http_compression
This configuration enables HTTP compression from the application layer. If you have access to the HTTP server, the recommended approach would be to enable it there. However, some platforms don't offer access to the HTTP layer and in those cases, this is a good alternative.
Its value is an object that has the following properties:
- `enabled`: A boolean flag that indicates whether HTTP compression is enabled. It is disabled by default.
- `level`: A `number` value that indicates the level of zlib compression to apply to responses. A higher level will result in better compression but will take longer to complete. A lower level will result in less compression but will be much faster. The default value is `6`.
- `memLevel`: A `number` value that specifies how much memory should be allocated to the internal compression state. It's an integer in the range of 1 (minimum level) and 9 (maximum level). The default value is `8`.
- `threshold`: The minimum response body size for which compression is applied, specified as a `number` of bytes or any string accepted by the [`bytes`](https://www.npmjs.com/package/bytes) module. The default value is `1024`.
If you enable HTTP compression and you want to disable it for specific endpoints, you can pass in the request header `"x-no-compression": true`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
http_compression: {
enabled: true,
level: 6,
memLevel: 8,
threshold: 1024,
},
// ...
},
// ...
}
```
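For instance, once compression is enabled, you can skip it for a single request by sending the header from the client (a sketch; adjust the URL to your backend's address):

```bash
curl -H "x-no-compression: true" localhost:9000/store/products
```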
### jwt_secret
A string that is used to create authentication tokens. Although this configuration option is not required, it's highly recommended to set it for better security. It's also recommended to generate a random string.
In a development environment, if this option is not set, the default secret is `supersecret`. However, in production, if this configuration is not set, an error is thrown and your backend crashes.
Typically, the value of this configuration would be set in an environment variable and referenced in `medusa-config.js`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
jwt_secret: process.env.JWT_SECRET,
// ...
},
// ...
}
```
### database_database
The name of the database to connect to. If the database name is provided in `database_url`, it's not necessary to include it here.
Make sure to create the PostgreSQL database before using it. You can check how to create a database in [PostgreSQL's documentation](https://www.postgresql.org/docs/current/sql-createdatabase.html).
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_database: "medusa-store",
// ...
},
// ...
}
```
### database_extra
An object that includes additional configurations to pass to the database connection. You can pass any configuration. One defined configuration to pass is `ssl`, which enables support for TLS/SSL connections.
This is useful for production databases, which can be supported by setting the `rejectUnauthorized` attribute of the `ssl` object to `false`. During development, it's recommended not to pass this option.
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_extra:
process.env.NODE_ENV !== "development"
? { ssl: { rejectUnauthorized: false } }
: {},
// ...
},
// ...
}
```
### database_logging
This configuration specifies what messages to log. Its value can be one of the following:
- (default) A boolean value that indicates whether any messages should be logged or not. By default, if no value is provided, the value will be `false`.
- The string value `all` that indicates all types of messages should be logged.
- An array of log-level strings to indicate which type of messages to show in the logs. The strings can be `query`, `schema`, `error`, `warn`, `info`, `log`, or `migration`. Refer to [Typeorm's documentation](https://typeorm.io/logging#logging-options) for more details on what each of these values means.
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_logging: [
"query", "error",
],
// ...
},
// ...
}
```
### database_schema
A string indicating the database schema to connect to. This is not necessary to provide if you're using the default schema, which is `public`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_schema: "custom",
// ...
},
// ...
}
```
### database_type
A string indicating the type of database to connect to. At the moment, only `postgres` is accepted, which is also the default value.
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_type: "postgres",
// ...
},
// ...
}
```
### database_url
A string indicating the connection URL of the database. Typically, the connection URL would be set in an environment variable, and the variable would be referenced in `medusa-config.js`.
The format of the connection URL for PostgreSQL is:
```bash
postgres://[user][:password]@[host][:port]/[dbname]
```
Where:
- `[user]`: (required) your PostgreSQL username. If not specified, the system's username is used by default. The database user that you use must have create privileges. If you're using the `postgres` superuser, then it should have these privileges by default. Otherwise, make sure to grant your user create privileges. You can learn how to do that in [PostgreSQL's documentation](https://www.postgresql.org/docs/current/ddl-priv.html).
- `[:password]`: an optional password for the user. When provided, make sure to put `:` before the password.
- `[host]`: (required) your PostgreSQL host. When run locally, it should be `localhost`.
- `[:port]`: an optional port that the PostgreSQL server is listening on. By default, it's `5432`. When provided, make sure to put `:` before the port.
- `[dbname]`: (required) the name of the database.
For example, you can set the following database URL in your environment variables:
```bash
DATABASE_URL=postgres://postgres@localhost/medusa-store
```
You can learn more about the connection URL format in [PostgreSQL's documentation](https://www.postgresql.org/docs/current/libpq-connect.html).
```js title=medusa-config.js
module.exports = {
projectConfig: {
database_url: process.env.DATABASE_URL,
// ...
},
// ...
}
```
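As an illustration of the connection URL format above, the parts can be assembled programmatically (a sketch with placeholder values):

```js
// Assemble a PostgreSQL connection URL from its parts (placeholder values).
const user = "postgres"
const password = "" // optional
const host = "localhost"
const port = 5432
const dbname = "medusa-store"

const auth = password ? `${user}:${password}` : user
const databaseUrl = `postgres://${auth}@${host}:${port}/${dbname}`
console.log(databaseUrl) // postgres://postgres@localhost:5432/medusa-store
```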
### redis_url
This configuration is used to specify the URL to connect to Redis. This is only used for scheduled jobs. If you omit this configuration, scheduled jobs will not work.
:::note
You must first have Redis installed. You can refer to [Redis's installation guide](https://redis.io/docs/getting-started/installation/).
:::
The Redis connection URL has the following format:
```bash
redis[s]://[[username][:password]@][host][:port][/db-number]
```
For a local Redis installation, the connection URL should be `redis://localhost:6379` unless you've made any changes to the Redis configuration during installation.
Typically, the value would be added as an environment variable and referenced in `medusa-config.js`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
redis_url: process.env.REDIS_URL,
// ...
},
// ...
}
```
### redis_prefix
The prefix set on all keys stored in Redis. The default value is `sess:`. If this configuration option is provided, it is prepended to `sess:`.
```js title=medusa-config.js
module.exports = {
projectConfig: {
redis_prefix: "medusa:",
// ...
},
// ...
}
```
### redis_options
An object of options to pass to ioredis. You can refer to [ioredis's RedisOptions documentation](https://redis.github.io/ioredis/index.html#RedisOptions) for the list of available options.
```js title=medusa-config.js
module.exports = {
projectConfig: {
redis_options: {
connectionName: "medusa",
},
// ...
},
// ...
}
```
### session_options
An object of options to pass to `express-session`. The object can have the following properties:
- `name`: A string indicating the name of the session ID cookie to set in the response (and read from in the request). The default value is `connect.sid`. Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#name) for more details.
- `resave`: A boolean value that indicates whether the session should be saved back to the session store, even if the session was never modified during the request. The default value is `true`. Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#resave) for more details.
- `rolling`: A boolean value that indicates whether the session identifier cookie should be force-set on every response. The default value is `false`. Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#rolling) for more details.
- `saveUninitialized`: A boolean value that indicates whether a session that is "uninitialized" is forced to be saved to the store. The default value is `true`. Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#saveUninitialized) for more details.
- `secret`: A string that indicates the secret to sign the session ID cookie. By default, the value of [cookie_secret](#cookie_secret) is used. Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#secret) for details.
- `ttl`: A number used when calculating the `Expires` attribute of the `Set-Cookie` response header. The default value is `10 * 60 * 60 * 1000` (ten hours in milliseconds). Refer to [express-session's documentation](https://www.npmjs.com/package/express-session#cookiemaxage) for details.
```js title=medusa-config.js
module.exports = {
projectConfig: {
session_options: {
name: "custom",
},
// ...
},
// ...
}
```
---
## plugins
On your Medusa backend, you can use plugins to add custom features or integrate third-party services. For example, you can install a plugin to use Stripe as a payment processor.
Aside from installing the plugin with NPM, you need to pass the plugin you installed into the `plugins` array defined in `medusa-config.js`.
The items in the array can either be:
- A string, which is the name of the plugin to add. You can pass a plugin as a string if it doesn't require any configurations.
- An object having the following properties:
- `resolve`: A string indicating the name of the plugin.
- `options`: An object that includes the plugin's options. These options vary for each plugin, and you should refer to the plugin's documentation for details on them.
For example:
```js title=medusa-config.js
module.exports = {
plugins: [
`medusa-my-plugin-1`,
{
resolve: `medusa-my-plugin`,
options: {
apiKey: `test`, // or use env variables
},
},
// ...
],
// ...
}
```
You can refer to the [Plugins Overview documentation](../../plugins/overview.mdx) for a list of available official plugins.
---
## modules
In Medusa, commerce and core logic are modularized to allow developers to extend or replace certain modules with custom implementations.
Aside from installing the module with NPM, you need to add it to the exported object in `medusa-config.js`.
The keys of the `modules` configuration object refer to the type of module. Its value can be one of the following:
1. A boolean value that indicates whether the module type is enabled.
2. A string value that indicates the name of the module to be used for the module type. This can be used if the module does not require any options.
3. An object having the following properties, though typically you'd mainly use only the `resolve` and `options` properties:
   1. `resolve`: a string indicating the name of the module.
   2. `options`: an object indicating the options to pass to the module. These options vary for each module, and you should refer to the module's documentation for details on them.
   3. `resources`: a string indicating whether the module shares the dependency container with the Medusa core. Its value can either be `shared` or `isolated`. Refer to the [Modules documentation](../modules/create.mdx#module-scope) for more details.
   4. `alias`: a string indicating a unique alias to register the module under. Other modules can't use the same alias.
   5. `main`: a boolean value indicating whether this module is the main registered module. This is useful when an alias is used.
For example:
```js title=medusa-config.js
module.exports = {
modules: {
eventBus: {
resolve: "@medusajs/event-bus-local",
},
cacheService: {
resolve: "@medusajs/cache-redis",
options: {
redisUrl: process.env.CACHE_REDIS_URL,
ttl: 30,
},
},
// ...
},
// ...
}
```
Learn more about [Modules and how to create and use them](../modules/overview.mdx).
---
## featureFlags
Some features in the Medusa backend are guarded by a feature flag. This ensures constant shipping of new features while maintaining the engine's stability.
You can specify whether a feature should or shouldn't be used in your backend by enabling its feature flag. Feature flags can be enabled through environment variables or through this configuration exported in `medusa-config.js`. If you want to use the environment variables method, learn more about it in the [Feature Flags documentation](../feature-flags/toggle.md#method-one-using-environment-variables).
The `featureFlags` configuration is an object. Its properties are the names of the different feature flags. Each property's value is a boolean indicating whether the feature flag is enabled.
You can find available feature flags and their key name [here](https://github.com/medusajs/medusa/tree/master/packages/medusa/src/loaders/feature-flags).
For example:
```js title=medusa-config.js
module.exports = {
featureFlags: {
product_categories: true,
// ...
},
// ...
}
```
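If you prefer the environment-variable method mentioned above, the same flag can typically be toggled with a variable instead (a sketch; the `MEDUSA_FF_` prefix is Medusa's naming convention for feature-flag variables, so double-check the exact key in the feature flags list linked above):

```bash
MEDUSA_FF_PRODUCT_CATEGORIES=true
```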
Learn more about [Feature flags and how to toggle them](../feature-flags/overview.mdx).
:::note
After enabling a feature flag, make sure to [run migrations](../entities/migrations/overview.mdx#migrate-command) as it may require making changes to the database.
:::


@@ -0,0 +1,177 @@
---
description: "In this document, you'll learn about the directory structure of a Medusa backend. It'll help you understand the purpose of each file and folder in your Medusa backend project."
---
# Medusa Backend Directory Structure
In this document, you'll learn about the directory structure of a Medusa backend. It'll help you understand the purpose of each file and folder in your Medusa backend project.
---
## Root Files
These are files present at the root of your Medusa backend.
### .babelrc.js
Defines Babel's configurations, which are used when running the `build` command that transpiles files from the `src` directory to the `dist` directory.
### .env
Includes the values of environment variables. This is typically only used in development. In production, you should set environment variables through your hosting provider instead.
### .env.template
Gives an example of what variables may be included in `.env`.
### .gitignore
Specifies files that shouldn't be committed to a Git repository.
### .yarnrc.yml
Ensures dependencies are always installed in `node_modules`. This ensures compatibility with pnpm.
### index.js
Defines an entry file, which is useful when starting the Medusa backend with a process manager like pm2.
### medusa-config.js
Defines the Medusa backends configurations, including the database configurations, plugins used, modules used, and more.
**Read more:** [Medusa backend configurations](./configurations.md).
### package.json
Since the Medusa backend is an NPM package, this file defines its information as well as its dependencies. It will also include any new dependencies you install.
### README.md
Provides general information about the Medusa backend.
### tsconfig.admin.json
Defines the TypeScript configurations that are used to transpile admin customization files. So, it only works for files under the [src/admin directory](#admin).
### tsconfig.json
Defines the general TypeScript configurations used to transpile files from the `src` directory to the `dist` directory.
### tsconfig.server.json
Defines the TypeScript configurations that are used to transpile Medusa backend customization files. It works for all files except for the files under the `src/admin` directory.
### tsconfig.spec.json
Defines TypeScript configurations for test files. These are files that either reside under a `__tests__` directory under `src`, or that have a file name ending with one of the following:
- `.test.ts` or `.test.js`
- `.spec.ts` or `.spec.js`
### yarn.lock or package-lock.json
An automatically generated file by `yarn` or `npm` that holds the current versions of all dependencies installed to ensure the correct versions are always installed.
If you used the `create-medusa-app` command to install the Medusa backend, it'll attempt to use `yarn` by default to install the dependencies. If `yarn` is not installed on your machine, it falls back to using `npm`.
Based on the package manager used to install the dependencies, either `yarn.lock` or `package-lock.json` will be available, or both.
---
## Root Directories
These are the directories present at the root of your Medusa backend.
### .cache
This directory will only be available if you have the Medusa admin installed and you've already started your Medusa backend at least once before. It holds all cached files related to building the Medusa admin assets.
### build
This directory will only be available if you have the Medusa admin installed and you've either built your admin files or run the Medusa backend at least once before. It holds the built files that are used to serve the admin in your browser.
### data
This directory holds a JSON file used to seed your Medusa backend with dummy data, which can be useful for demo purposes. The data is seeded automatically if you include the `--seed` option when using either the `create-medusa-app` or `medusa new` commands.
You can also seed the data by running the following command:
```bash npm2yarn
npm run seed
```
### dist
This directory holds the transpiled Medusa backend customizations. This directory may not be available when you first install the Medusa backend. It'll be available when you run the `build` command or start your Medusa backend with the `dev` command.
The files under this directory are the files that are used in your Medusa backend. So, when you make any changes under `src`, make sure the changes are transpiled into the `dist` directory. If you're using the `dev` or `medusa develop` commands, this is handled automatically whenever changes occur under the `src` directory.
### node_modules
This directory holds all installed dependencies in your project.
### src
This directory holds all Medusa backend and admin customizations. More details about each subdirectory are included in [this section](#src-subdirectories).
### uploads
This directory holds all file uploads to the Medusa backend. It's only used if you're using the [Local File Service plugin](../../plugins/file-service/local.md), which is installed by default.
---
## src Subdirectories
Files under the `src` directory hold the Medusa backend and admin customizations. These files should later be transpiled into the `dist` directory for them to be used during the backend's runtime.
If any of these directories are not available, you can create them yourself.
### admin
This directory holds all Medusa admin customizations. The main subdirectories of this directory are:
- `widgets`: Holds all [Medusa admin widgets](../../admin/widgets.md).
- `routes`: Holds all [Medusa admin UI routes](../../admin/routes.md).
### api
This directory holds all custom endpoints. You can create as many subdirectories and files holding endpoint definitions as you need, but only endpoints exported by the `index.ts` file are registered in the Medusa backend.
**Read more:** [Endpoints](../endpoints/overview.mdx)
### loaders
This directory holds scripts that run when the Medusa backend starts. For example, the scripts can define a scheduled job.
**Read more:** [Loaders](../loaders/overview.mdx)
### migrations
This directory holds all migration scripts that reflect changes on the database the Medusa backend is connected to.
**Read more:** [Migrations](../entities/migrations/overview.mdx)
### models
This directory holds all custom entities, which represent tables in your database. You can create a new entity, or customize a Medusa entity.
**Read more:** [Entities](../entities/overview.mdx)
### repositories
This directory holds all custom repositories, which provide utility methods to access and modify data related to an entity.
**Read more:** [Repositories](../entities/overview.mdx#what-are-repositories)
### services
This directory holds all custom services. Services define utility methods related to an entity or feature that can be used across the Medusa backend's resources.
**Read more**: [Services](../services/overview.mdx)
### subscribers
This directory holds all custom subscribers. Subscribers listen to emitted events and register methods to handle them.
**Read more:** [Subscribers](../events/subscribers.mdx)


@@ -0,0 +1,165 @@
---
description: 'This quickstart guide will help you set up a Medusa backend in three steps.'
addHowToData: true
---
import Feedback from '@site/src/components/Feedback';
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
import Troubleshooting from '@site/src/components/Troubleshooting'
import SaslSection from '../../troubleshooting/database-errors/_sasl.md'
import ConnectionErrorSection from '../../troubleshooting/database-errors/_connection-error.md'
import FreshInstallationSection from '../../troubleshooting/awilix-resolution-error/_fresh-installation.md'
import EaddrinuseSection from '../../troubleshooting/eaddrinuse.md'
# Install Medusa Backend
This document will guide you through setting up your Medusa backend in three steps.
## Prerequisites
Before you can install and use Medusa, you need the following tools installed on your machine:
- [Node.js v16+](./prepare-environment.mdx#nodejs)
- [Git](./prepare-environment.mdx#git)
- [PostgreSQL](./prepare-environment.mdx#postgresql)
---
## Create a Medusa Backend
:::tip
It is recommended to use [Yarn](https://yarnpkg.com/getting-started/install) for the installation process as it's much faster than using NPM.
:::
### 1. Install Medusa CLI
To install the Medusa backend, you need Medusa's CLI tool. You can install it globally or, alternatively, use it through npx with `npx @medusajs/medusa-cli <command>`.
```bash npm2yarn
npm install @medusajs/medusa-cli -g
```
:::note
If you run into any errors while installing the CLI tool, check out the [troubleshooting guide](../../troubleshooting/cli-installation-errors.mdx).
:::
### 2. Create a new Medusa project
```bash noReport
medusa new my-medusa-store # or npx @medusajs/medusa-cli new my-medusa-store
```
You'll then be asked to specify your PostgreSQL database credentials. You can choose "Continue" to use the default credentials shown in the terminal, choose "Change credentials" to specify your PostgreSQL credentials, or choose "Skip database setup" to create the database later.
:::warning
If you choose "Skip database setup", you'll need to [set the database configurations](./configurations.md#database-configuration) and [run migrations](../entities/migrations/overview.mdx#migrate-command) later.
:::
### 3. Start your Medusa backend
:::note
Make sure your PostgreSQL server is running before you run the Medusa backend.
:::
```bash noReport
cd my-medusa-store
medusa develop # or npx medusa develop
```
After these three steps and in only a couple of minutes, you now have a complete commerce engine running locally. You can test it out by sending a request using a tool like Postman or through the command line:
```bash noReport
curl localhost:9000/store/products
```
<Feedback
event="survey_server_quickstart"
question="Did you set up the backend successfully?"
positiveQuestion="Is there anything that should be improved?"
negativeQuestion="Please describe the issue you faced."
/>
---
## Troubleshooting Installation
<Troubleshooting
sections={[
{
title: 'Error: SASL: SCRAM-SERVER-FIRST-MESSAGE: Client password must be a string',
content: <SaslSection />
},
{
title: 'Error: connect ECONNREFUSED ::1:5432',
content: <ConnectionErrorSection />
},
{
title: 'Error: EADDRINUSE',
content: <EaddrinuseSection />
},
{
title: 'AwilixResolutionError: Could Not Resolve X',
content: <FreshInstallationSection />
},
]}
/>
---
## Seed Data
For better testing, you can add demo data to your Medusa backend by running the seed command in your Medusa backend directory:
```bash
medusa seed --seed-file=data/seed.json
# or npx medusa seed --seed-file=data/seed.json
```
---
## Health Route
You can access the `/health` endpoint to get the health status of your backend.
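For example (assuming the backend runs locally on the default port `9000`):

```bash noReport
curl localhost:9000/health
```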
---
## Next Steps
<DocCardList colSize={4} items={[
{
type: 'link',
href: '/development/backend/directory-structure',
label: 'Directory Structure',
customProps: {
icon: Icons['document-text-solid'],
description: 'Learn about the purpose of each file and directory in the Medusa backend.'
}
},
{
type: 'link',
href: '/development/backend/configurations',
label: 'Backend Configurations',
customProps: {
icon: Icons['tools-solid'],
description: 'Learn about configuring your backend and loading environment variables.'
}
},
{
type: 'link',
href: '/development/fundamentals/architecture-overview',
label: 'Backend Architecture',
customProps: {
icon: Icons['circle-stack-solid'],
description: 'Learn about the different resources that your Medusa backend is made of.'
}
}
]} />


@@ -0,0 +1,177 @@
---
description: 'Learn how to prepare your development environment while using Medusa. This guide includes how to install Node.js, Git, Medusa CLI tool, and PostgreSQL.'
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import MedusaCliTroubleshootingSection from '../../troubleshooting/cli-installation-errors/_reusable-section.mdx'
# Prepare Development Environment
This document includes the installation instructions for the tools required to use and run Medusa.
## Node.js
Node.js is the environment that makes it possible for Medusa to run, so you must install Node.js on your machine to start Medusa development.
:::caution
Medusa supports v16+ of Node.js. You can check your Node.js version using the following command:
```bash noReport
node -v
```
:::
<Tabs groupId="operating-systems" queryString="os">
<TabItem value="windows" label="Windows" default>
You can install the executable directly from [the Node.js website](https://nodejs.org/en/#home-downloadhead).
For other approaches, you can check out [Node.js's guide](https://nodejs.org/en/download/package-manager/#windows-1).
</TabItem>
<TabItem value="linux" label="Linux">
You can use the following commands to install Node.js on Ubuntu:
```bash
#Ubuntu
sudo apt update
sudo apt install nodejs
```
For other Linux distributions, you can check out [Node.js's guide](https://nodejs.org/en/download/package-manager/).
</TabItem>
<TabItem value="macos" label="macOS">
You can use the following commands to install Node.js on macOS:
<Tabs groupId="homebrew" isCodeTabs={true}>
<TabItem value="homebrew" label="Homebrew">
```bash
brew install node
```
</TabItem>
<TabItem value="no-homebrew" label="Without Homebrew">
```bash
curl \
"https://nodejs.org/dist/latest/node-${VERSION:-$(wget -qO- \
https://nodejs.org/dist/latest/ | sed -nE \
's|.*>node-(.*)\.pkg</a>.*|\1|p')}.pkg" \
> "$HOME/Downloads/node-latest.pkg" &&
sudo installer -store -pkg "$HOME/Downloads/node-latest.pkg" -target "/"
```
</TabItem>
</Tabs>
For other approaches, you can check out [Node.js's guide](https://nodejs.org/en/download/package-manager/#macos).
:::tip
Make sure that you have the Xcode command line tools installed; if not, run the following command to install them: `xcode-select --install`
:::
</TabItem>
</Tabs>
## Git
Medusa uses Git behind the scenes when you create a new project. So, you'll have to install it on your machine to get started.
<Tabs groupId="operating-systems" queryString="os">
<TabItem value="windows" label="Windows" default>
To install Git on Windows, you need to [download the installable package](https://git-scm.com/download/win).
</TabItem>
<TabItem value="linux" label="Linux">
For Debian/Ubuntu, you can use the following command:
```bash
apt-get install git
```
As for other Linux distributions, please check [git's guide](https://git-scm.com/download/linux).
</TabItem>
<TabItem value="macos" label="macOS">
You should already have Git installed as part of the Xcode command-line tools.
However, if for any reason you need to install it manually, you can install it with Homebrew:
```bash
brew install git
```
You can also check out [git's guide](https://git-scm.com/download/mac) for more installation options.
</TabItem>
</Tabs>
## PostgreSQL
The Medusa backend uses PostgreSQL to store your commerce system's data.
<Tabs groupId="operating-systems" queryString="os">
<TabItem value="windows" label="Windows">
You can [download the PostgreSQL Windows installer](https://www.postgresql.org/download/windows/) from their website.
</TabItem>
<TabItem value="linux" label="Linux">
If you're using Ubuntu, you can use the following commands to download and install PostgreSQL:
```bash
sudo sh -c \
'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
wget --quiet -O - \
https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
sudo apt-get update
sudo apt-get -y install postgresql
```
For other distributions, you can check out [PostgreSQL's website for more guides](https://www.postgresql.org/download/linux/).
</TabItem>
<TabItem value="macos" label="macOS">
You can download PostgreSQL on your macOS using [the installer on their website](https://www.postgresql.org/download/macosx/).
</TabItem>
</Tabs>
## (Optional) Medusa CLI
Medusa provides a CLI tool that can aid you throughout your Medusa development. You can install it globally, or you can use it through `npx`.
You can install Medusa's CLI with the following command:
```bash npm2yarn
npm install @medusajs/medusa-cli -g
```
To confirm that the CLI tool was installed successfully, run the following command:
```bash noReport
medusa -v
```
### Troubleshooting Installation
<MedusaCliTroubleshootingSection />
---
## Install Medusa Backend
Once you're done installing the necessary tools in your environment, check out the [Medusa Backend Quickstart](./install.mdx) to install your Medusa backend.


@@ -0,0 +1,449 @@
---
description: 'Learn how to create a batch job strategy in Medusa. This guide also includes how to test your batch job strategy.'
addHowToData: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Create a Batch Job Strategy
In this document, you'll learn how to create a batch job strategy in Medusa.
:::info
If you're interested in learning more about what Batch Jobs are and how they work, check out [this documentation](./index.mdx).
:::
## Overview
Batch jobs can be used to perform long tasks in the background of your Medusa backend. Batch jobs are handled by batch job strategies. An example of a batch job strategy is the Import Products functionality.
This documentation helps you learn how to create a batch job strategy. The batch job strategy used in this example changes the status of all draft products to `published`.
---
## Prerequisites
### Medusa Components
It is assumed that you already have a Medusa backend installed and set up. If not, you can follow our [quickstart guide](../backend/install.mdx) to get started. The Medusa backend must also have an event bus module installed, which is available when using the default Medusa backend starter.
---
## 1. Create a File
A batch job strategy is essentially a class defined in a TypeScript or JavaScript file. You should create this file in `src/strategies`.
Following the example used in this documentation, create the file `src/strategies/publish.ts`.
---
## 2. Create Class
Batch job strategies must extend the abstract class `AbstractBatchJobStrategy` and implement its abstract methods.
Add the following content to the file you created:
```ts title=src/strategies/publish.ts
import {
AbstractBatchJobStrategy,
BatchJobService,
} from "@medusajs/medusa"
import { EntityManager } from "typeorm"
class PublishStrategy extends AbstractBatchJobStrategy {
protected batchJobService_: BatchJobService
processJob(batchJobId: string): Promise<void> {
throw new Error("Method not implemented.")
}
buildTemplate(): Promise<string> {
throw new Error("Method not implemented.")
}
protected manager_: EntityManager
protected transactionManager_: EntityManager
}
export default PublishStrategy
```
---
## 3. Define Required Properties
A batch job strategy class must have two static properties: the `identifier` and `batchType` properties. The `identifier` must be a unique string associated with your batch job strategy, and `batchType` must be the batch job's type.
You will use the `batchType` later when you [interact with the Batch Job APIs](#test-your-batch-job-strategy).
Following the same example, add the following properties to the `PublishStrategy` class:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
static identifier = "publish-products-strategy"
static batchType = "publish-products"
// ...
}
```
---
## 4. Define Methods
### (Optional) prepareBatchJobForProcessing
Medusa runs this method before it creates the batch job to prepare the content of the batch job record in the database. It accepts two parameters: the batch job data sent in the body of the [Create Batch Job request](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs), and the request instance.
Implementing this method is optional. For example:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
async prepareBatchJobForProcessing(
batchJob: CreateBatchJobInput,
req: Express.Request
): Promise<CreateBatchJobInput> {
// make changes to the batch job's fields...
return batchJob
}
}
```
### (Optional) preProcessBatchJob
Medusa runs this method after it creates the batch job, but before it is confirmed and processed. You can use this method to perform any necessary action before the batch job is processed. You can also use this method to add information related to the expected result.
For example, this implementation of the `preProcessBatchJob` method calculates how many draft products it will publish and adds the count to the `result` attribute of the batch job:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
async preProcessBatchJob(batchJobId: string): Promise<void> {
return await this.atomicPhase_(
async (transactionManager) => {
const batchJob = (await this.batchJobService_
.withTransaction(transactionManager)
.retrieve(batchJobId))
const count = await this.productService_
.withTransaction(transactionManager)
.count({
status: ProductStatus.DRAFT,
})
await this.batchJobService_
.withTransaction(transactionManager)
.update(batchJob, {
result: {
advancement_count: 0,
count,
stat_descriptors: [
{
key: "product-publish-count",
name: "Number of products to publish",
message:
`${count} product(s) will be published.`,
},
],
},
})
})
}
}
```
The `result` attribute is an object that can hold many properties including:
- `count`: used to indicate how many items (in this case, products) the task will run on.
- `advancement_count`: used to indicate the current number of items processed at a given moment. Since the batch job isn't processed yet, you set it to zero.
- `stat_descriptors`: can be used to show human-readable messages.
### processJob
Medusa runs this method to process the batch job once it is confirmed.
For example, this implementation of the `processJob` method retrieves all draft products and changes their status to published:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
async processJob(batchJobId: string): Promise<void> {
return await this.atomicPhase_(
async (transactionManager) => {
const productServiceTx = this.productService_
.withTransaction(transactionManager)
const productList = await productServiceTx
.list({
status: [ProductStatus.DRAFT],
})
productList.forEach(async (product: Product) => {
await productServiceTx
.update(product.id, {
status: ProductStatus.PUBLISHED,
})
})
await this.batchJobService_
.withTransaction(transactionManager)
.update(batchJobId, {
result: {
advancement_count: productList.length,
},
})
}
)
}
}
```
:::note
When a batch job is canceled, the processing of the batch job doesn't automatically stop. You will have to manually check for changes in the status of the batch job. For example, you can retrieve the batch job and use the condition `batchJob.status === BatchJobStatus.CANCELED` to check if the batch job was canceled.
:::
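As a rough, self-contained sketch of that cancellation check (the types and helper below are hypothetical, not Medusa APIs — in a real strategy you'd retrieve the batch job through `this.batchJobService_` and compare its `status` to `BatchJobStatus.CANCELED`):

```typescript
// Illustrative only: process a list of items while re-checking the job's
// status between items, stopping early if the job was canceled.
type JobStatus =
  | "created"
  | "pre_processed"
  | "confirmed"
  | "canceled"
  | "completed"

async function processWithCancelCheck(
  items: string[],
  getStatus: () => Promise<JobStatus>, // in a real strategy: retrieve the batch job
  handleItem: (item: string) => Promise<void>
): Promise<number> {
  let processed = 0
  for (const item of items) {
    // stop as soon as a cancellation is observed
    if ((await getStatus()) === "canceled") {
      break
    }
    await handleItem(item)
    processed++
  }
  return processed
}
```

How often you re-check is a trade-off: checking once per item (as above) keeps the cancellation latency low at the cost of an extra database read per item.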
### buildTemplate
This method can be used in cases where you provide a template file to download, such as when implementing an import or export functionality.
If it's not necessary for your use case, you can simply return an empty string:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
async buildTemplate(): Promise<string> {
return ""
}
}
```
### (Optional) shouldRetryOnProcessingError
Medusa uses this method to decide whether it should retry the batch job if an error occurs during processing.
By default, the `AbstractBatchJobStrategy` class implements this method and returns `false`, indicating that if a batch job's processing fails, it will not be retried.
If you would like to change that behavior, you can override this method to return a different value:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
protected async shouldRetryOnProcessingError(
batchJob: BatchJob,
err: unknown
): Promise<boolean> {
return true
}
}
```
### (Optional) handleProcessingError
Medusa uses this method to handle errors that occur during processing. By default, it changes the status of the batch job to `failed` and sets the `errors` property of the batch job's `result` attribute.
You can use this method as implemented in `AbstractBatchJobStrategy` at any point in your batch job process to set the batch job as failed.
You can also override this method in your batch job strategy and change how it works:
```ts
class PublishStrategy extends AbstractBatchJobStrategy {
// ...
protected async handleProcessingError<T>(
batchJobId: string,
err: unknown,
result: T
): Promise<void> {
// different implementation...
}
}
```
---
## 5. Run Build Command
After you create the batch job and before testing it out, you must run the build command in the directory of your Medusa backend:
```bash npm2yarn
npm run build
```
---
## Test your Batch Job Strategy
This section covers how to test and use your batch job strategy. Make sure to start your backend first:
```bash
npx medusa develop
```
You must also use an authenticated user to send batch job requests. You can refer to the [authentication guide in the API reference](https://docs.medusajs.com/api/admin#authentication) for more details.
If you follow along with the JS Client code snippets, make sure to [install and set it up first](../../js-client/overview.md).
### Create Batch Job
The first step is to create a batch job using the [Create Batch Job endpoint](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs). In the body of the request, you must set the `type` to the value of `batchType` in the batch job strategy you created.
For example, this creates a batch job of the type `publish-products`:
<Tabs groupId="request-types" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```jsx
medusa.admin.batchJobs.create({
type: "publish-products",
context: { },
dry_run: true,
})
.then(( batch_job ) => {
console.log(batch_job.status)
})
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
```jsx
fetch(`<BACKEND_URL>/admin/batch-jobs`, {
method: "POST",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
type: "publish-products",
context: { },
dry_run: true,
}),
})
.then((response) => response.json())
.then(({ batch_job }) => {
console.log(batch_job.status)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/batch-jobs' \
-H 'Authorization: Bearer <API_TOKEN>' \
-H 'Content-Type: application/json' \
--data-raw '{
"type": "publish-products",
"context": { },
"dry_run": true
}'
```
</TabItem>
</Tabs>
You set the `dry_run` to `true` to disable automatic confirmation and running of the batch job. If you want the batch job to run automatically, you can remove this body parameter.
Make sure to replace `<BACKEND_URL>` with the backend URL where applicable.
### (Optional) Retrieve Batch Job
You can retrieve the batch job afterward to get its status and view details about the process in the `result` property:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```jsx
medusa.admin.batchJobs.retrieve(batchJobId)
.then(( batch_job ) => {
console.log(batch_job.status, batch_job.result)
})
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
```jsx
fetch(`<BACKEND_URL>/admin/batch-jobs/${batchJobId}`, {
credentials: "include",
})
.then((response) => response.json())
.then(({ batch_job }) => {
console.log(batch_job.status, batch_job.result)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X GET '<BACKEND_URL>/admin/batch-jobs/<BATCH_JOB_ID>' \
-H 'Authorization: Bearer <API_TOKEN>'
# <BATCH_JOB_ID> is the ID of the batch job
```
</TabItem>
</Tabs>
Based on the batch job strategy implemented in this documentation, the `result` property could be something like this:
```json noReport
"result": {
"count": 1,
"stat_descriptors": [
{
"key": "product-publish-count",
"name": "Number of products to publish",
"message": "1 product(s) will be published."
}
],
"advancement_count": 0
},
```
### Confirm Batch Job
To process the batch job, send a request to [confirm the batch job](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobsbatchjobconfirmprocessing):
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```jsx
medusa.admin.batchJobs.confirm(batchJobId)
.then(( batch_job ) => {
console.log(batch_job.status)
})
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
```jsx
fetch(`<BACKEND_URL>/admin/batch-jobs/${batchJobId}/confirm`, {
method: "POST",
credentials: "include",
})
.then((response) => response.json())
.then(({ batch_job }) => {
console.log(batch_job.status)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/batch-jobs/<BATCH_JOB_ID>/confirm' \
-H 'Authorization: Bearer <API_TOKEN>'
# <BATCH_JOB_ID> is the ID of the batch job
```
</TabItem>
</Tabs>
The batch job will start processing afterward. Based on the batch job strategy implemented in this documentation, draft products will be published.
You can [retrieve the batch job](#optional-retrieve-batch-job) at any given point to check its status.


@@ -0,0 +1,122 @@
---
description: 'Learn how to customize the import strategy in Medusa. The import strategy can be used to import entities such as products, prices in a price list, orders, or other entities.'
addHowToData: true
---
# How to Customize Import Strategy
In this document, you'll learn how to create a custom product import strategy, either by overriding the default strategy or by creating your own.
## Overview
Product Import Strategy is essentially a batch job strategy. Medusa provides the necessary mechanisms to override or create your own strategy.
Although this documentation specifically targets import strategies, you can use the same steps to override any batch job strategy in Medusa, including export strategies.
---
## Prerequisites
### Medusa Components
It's assumed that you already have a Medusa backend installed and set up. If not, you can follow our [quickstart guide](../backend/install.mdx) to get started. The Medusa backend must also have an event bus module installed, which is available when using the default Medusa backend starter.
---
## Override Batch Job Strategy
The steps required for overriding a batch job strategy are essentially the same steps required to create a batch job strategy, with a minor difference. For that reason, this documentation does not cover the basics of a batch job strategy.
If you're interested in learning more about batch job strategies and how they work, please check out the [Create Batch Job Strategy documentation](./create.mdx).
### 1. Create a File
You must store batch job strategies in the `src/strategies` directory of your Medusa backend. They are either TypeScript or JavaScript files.
So, for example, you can create the file `src/strategies/import.ts`.
### 2. Create a Class
The batch job strategy class must extend the `AbstractBatchJobStrategy` class, which you can import from Medusa's core repository.
For example, you can define the following class in the file you created:
```ts title=src/strategies/import.ts
import {
AbstractBatchJobStrategy,
BatchJobService,
} from "@medusajs/medusa"
import { EntityManager } from "typeorm"
class MyImportStrategy extends AbstractBatchJobStrategy {
protected batchJobService_: BatchJobService
protected manager_: EntityManager
protected transactionManager_: EntityManager
processJob(batchJobId: string): Promise<void> {
throw new Error("Method not implemented.")
}
buildTemplate(): Promise<string> {
throw new Error("Method not implemented.")
}
}
export default MyImportStrategy
```
:::note
This is the base implementation of a batch job strategy. You can learn about all the different methods and properties in [this documentation](./create.mdx#3-define-required-properties).
:::
### 3. Set the batchType Property
Every batch job strategy class must have the static property `batchType` defined. It determines the type of batch job this strategy handles.
Since only one batch job strategy can handle a batch job type, you can override Medusa's default batch job strategies by using the same `batchType` value in your custom strategy.
So, for example, to override the product import strategy, set the `batchType` property in your strategy to `product-import`:
```ts
class MyImportStrategy extends AbstractBatchJobStrategy {
static batchType = "product-import"
// ...
}
```
### 4. Define your Custom Functionality
You can now define your custom functionality in your batch job strategy. For example, you can create custom import logic to import products.
Refer to the [Create a Batch Job documentation](./create.mdx#3-define-required-properties) to understand what properties and methods are required in your batch job strategy and how you can use them to implement your custom functionality.
### 5. Run Build Command
Before you can test out your batch job strategy, you must run the `build` command:
```bash npm2yarn
npm run build
```
### 6. Test your Functionality
Since you didn't create a new batch job type and overwrote the functionality of the strategy, you can test out your functionality using the [same steps used with the default strategy](./create.mdx#test-your-batch-job-strategy).
Specifically, since you create batch jobs using the [Create Batch Job](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs) endpoint, which accepts the batch job type as a body parameter, you just need to send the same type you used for this field. In this documentation's example, the `type` would be `product-import`.
If you overwrote the import functionality, you can follow [these steps to learn how to import products using the Admin APIs](../../modules/products/admin/import-products.mdx).
---
## Create Custom Batch Job Strategy
If you don't want to override Medusa's batch job strategy, you can create a custom batch job strategy with a different `batchType` value. Then, use that type when you send a request to [Create a Batch Job](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs).
For more details on creating custom batch job strategies, please check out the [Create Batch Job Strategy documentation](./create.mdx).
---
## See Also
- [Use the Import Product APIs](../../modules/products/admin/import-products.mdx).


@@ -0,0 +1,95 @@
---
description: 'Learn what batch jobs in Medusa are and their flow. Batch jobs are tasks that can be performed asynchronously and iteratively in Medusa.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Batch Jobs
In this document, you'll learn what Batch Jobs are and how they work in Medusa.
## What are Batch Jobs
Batch Jobs are tasks that can be performed asynchronously and iteratively. They can be [created using the Admin API](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs), then, once confirmed, they are processed asynchronously.
Every status change of a batch job triggers an event that can be handled using [subscribers](../events/subscribers.mdx).
Medusa uses batch jobs in its core to perform some asynchronous tasks. For example, the Export Products functionality uses batch jobs.
You can also create a custom batch job or overwrite Medusa's batch jobs.
### BatchJob Entity Overview
A batch job is stored in the database as a [BatchJob](../../references/entities/classes/BatchJob) entity. Some of its important attributes are:
- `status`: A string that determines the current status of the Batch Job.
- `context`: An object that can be used to store configurations related to the batch job. For example, you can use it to store listing configurations such as the limit or offset of the list to be retrieved during the processing of the batch job.
- `dry_run`: A boolean value that indicates whether this batch job should actually produce an output. It's `false` by default, meaning the batch job produces an output. It can be used to simulate a batch job.
- `type`: A string that is used to later resolve the batch job strategy that should be used to handle the batch job.
- `result`: An object that includes important properties related to the result of the batch job. Some of these properties are:
  - `errors`: An error object that contains any errors that occur during the batch job's execution.
- `progress`: A numeric value indicating the progress of the batch job.
- `count`: A number that includes the total count of records related to the operation. For example, in the case of product exports, it is used to indicate the total number of products exported.
- `advancement_count`: A number that indicates the number of records processed so far. Can be helpful when retrying a batch job.
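For instance, `count` and `advancement_count` together describe how far along a job is. The helper below is purely illustrative (not part of Medusa — the backend tracks `progress` itself) and only shows how the two counts relate:

```typescript
// Hypothetical helper: derive a progress percentage from a batch job's
// result counts. Not a Medusa API — for illustration only.
type BatchJobResultCounts = {
  count?: number
  advancement_count?: number
}

function computeProgressPercent(result: BatchJobResultCounts): number {
  const { count = 0, advancement_count: advanced = 0 } = result
  // avoid dividing by zero when the job hasn't been pre-processed yet
  return count === 0 ? 0 : Math.round((advanced / count) * 100)
}

// For example, 1 of 4 products processed:
computeProgressPercent({ count: 4, advancement_count: 1 }) // 25
```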
---
## What are Batch Job Strategies
Batch jobs are handled by batch job strategies. A batch job strategy is a class that extends the `AbstractBatchJobStrategy` abstract class and implements the methods defined in that class to handle the different states of a batch job.
A batch job strategy must implement the necessary methods to handle the preparation of a batch job before it is created, the preparation of the processing of the batch job after it is created, and the processing of the batch job once it is confirmed.
When you create a batch job strategy, the `batchType` class property indicates the batch job type this strategy handles. Then, when you create a new batch job, you set the batch job's type to the value of `batchType` in your strategy.
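As a simplified, hypothetical sketch of that resolution (the real logic lives in Medusa's core), a batch type maps to exactly one strategy:

```typescript
// Simplified sketch of resolving a strategy by batch type. The names and
// registry below are illustrative only, not Medusa internals.
type StrategyLike = {
  batchType: string
  processJob: (batchJobId: string) => Promise<void>
}

const registeredStrategies: StrategyLike[] = [
  { batchType: "publish-products", processJob: async () => { /* ... */ } },
  { batchType: "product-import", processJob: async () => { /* ... */ } },
]

// A batch job's `type` selects the single strategy that handles it.
function resolveStrategy(batchType: string): StrategyLike | undefined {
  return registeredStrategies.find((s) => s.batchType === batchType)
}
```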
---
## How Batch Jobs Work
A batch job's flow from creation to completion is:
1. A batch job is created using the [Create Batch Job API endpoint](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobs).
2. Once the batch job is created, the batch job's status is changed to `created` and the `batch.created` event is triggered by the `BatchJobService`.
3. The `BatchJobSubscriber` handles the `created` event. It resolves the batch job strategy based on the `type` of the batch job, then uses it to pre-process the batch job. After this, the batch job's status is changed to `pre_processed`. Only when the batch job has the status `pre_processed` can it be confirmed.
4. If `dry_run` is not set in the Create Batch Job request in step one, or if it is set to `false`, the batch job will automatically be confirmed after pre-processing. Otherwise, if `dry_run` is set to `true`, the batch job can be confirmed using the [Confirm Batch Job API](https://docs.medusajs.com/api/admin#batch-jobs_postbatchjobsbatchjobconfirmprocessing) endpoint.
5. Once the batch job is confirmed, the batch job's status is changed to `confirmed` and the `batch.confirmed` event is triggered by the `BatchJobService`.
6. The `BatchJobSubscriber` handles the `confirmed` event. It resolves the batch job strategy, then uses it to process the batch job.
7. Once the batch job is processed successfully, the batch job has the status `completed`.
You can track the progress of the batch job at any point using the [Retrieve Batch Job](https://docs.medusajs.com/api/admin#batch-jobs_getbatchjobsbatchjob) endpoint.
:::info
If the batch job fails at any point in this flow, its status is changed to `failed`.
:::
![Flowchart summarizing the batch job's flow from creation to completion](https://res.cloudinary.com/dza7lstvk/image/upload/v1668001632/Medusa%20Docs/Diagrams/Qja0kAz_ns4vm8.png)
---
## Custom Development
Developers can create custom batch jobs in the Medusa backend, a plugin, or in a module. Developers can also customize existing batch job strategies in the core, such as the import strategy.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/batch-jobs/create',
label: 'Create a Batch Job Strategy',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a batch job strategy in Medusa.'
}
},
{
type: 'link',
href: '/development/batch-jobs/customize-import',
label: 'Customize Import Strategy',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to customize the import strategy in Medusa.'
}
},
]} />


@@ -0,0 +1,267 @@
---
description: 'In this document, you’ll learn how to create a cache module in Medusa, using Memcached as an example.'
---
# How to Create a Cache Module
In this document, you will learn how to build your own Medusa cache module.
## Overview
Medusa provides ready-made modules for cache, including in-memory and Redis modules. If you prefer another technology for caching in your commerce application, you can build a module locally and use it in your Medusa backend. You can also publish it to NPM and reuse it across multiple Medusa backend instances.
This document uses Memcached as an example to give you a real-life walkthrough of creating a cache module. You can follow the same general steps with any other caching system or service.
---
## Prerequisites
If you want to create the Memcached cache module as explained in this guide, you must have [Memcached](https://memcached.org/) installed and running.
---
## (Optional) Step 0: Prepare Module Directory
Before you start implementing your module, it's recommended to prepare the directory or project holding your custom implementation.
You can refer to the [Project Preparation step in the Create Module documentation](../modules/create.mdx#optional-step-0-project-preparation) to learn how to do that.
---
## Step 1: Create the Service
Create the file `src/services/memcached-cache.ts`, which will hold your cache service. It's recommended to name the file in the format `<service_name>-cache`. So, if you're not integrating `memcached`, replace the name with what's relevant for your module.
Add the following content to the file:
```ts title=src/services/memcached-cache.ts
import { ICacheService } from "@medusajs/types"
class MemcachedCacheService implements ICacheService {
async get<T>(key: string): Promise<T> {
throw new Error("Method not implemented.")
}
async set(
key: string,
data: unknown,
ttl?: number
): Promise<void> {
throw new Error("Method not implemented.")
}
async invalidate(key: string): Promise<void> {
throw new Error("Method not implemented.")
}
}
export default MemcachedCacheService
```
This creates the class `MemcachedCacheService`, which implements the `ICacheService` interface imported from `@medusajs/types`. Feel free to rename the class to what's relevant for your cache service.
In the class, you must implement three methods: `get`, `set`, and `invalidate`. You'll learn what each of these methods does in the upcoming sections.
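Before implementing the Memcached versions, the expected semantics of these three methods can be sketched with a self-contained, in-memory stand-in (illustrative only — this is not an actual Medusa cache module, just the get/set/invalidate contract):

```typescript
// Illustrative in-memory cache demonstrating the get/set/invalidate contract:
// set stores data with a time-to-live, get returns it (or null once expired
// or missing), and invalidate removes it early.
type CacheEntry = { data: unknown; expiresAt: number }

class InMemoryCache {
  private store = new Map<string, CacheEntry>()

  async set(key: string, data: unknown, ttl = 30): Promise<void> {
    // ttl is in seconds, matching the interface described below
    this.store.set(key, { data, expiresAt: Date.now() + ttl * 1000 })
  }

  async get<T>(key: string): Promise<T | null> {
    const entry = this.store.get(key)
    if (!entry || entry.expiresAt <= Date.now()) {
      // treat expired entries as missing and clean them up lazily
      this.store.delete(key)
      return null
    }
    return entry.data as T
  }

  async invalidate(key: string): Promise<void> {
    this.store.delete(key)
  }
}
```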
---
## Step 2: Implement the Methods
### constructor
The `constructor` method of a service allows you to prepare any third-party client or service necessary to be used in other methods. It also allows you to access the module's options, which are typically defined in `medusa-config.js`, as well as other services and resources in the Medusa backend using [dependency injection](../fundamentals/dependency-injection.md).
Here's an example of how you can use the `constructor` to create a Memcached instance and save the module's options:
```ts title=src/services/memcached-cache.ts
import { ICacheService } from "@medusajs/types"
import Memcached from "memcached"
const DEFAULT_CACHE_TIME = 30
export type MemcachedCacheModuleOptions = {
/**
* Time to keep data in the cache (in seconds)
*/
ttl?: number
/**
* Allow passing the configuration for Memcached client
*/
location: Memcached.Location
options?: Memcached.options
}
class MemcachedCacheService implements ICacheService {
protected readonly memcached: Memcached
protected readonly TTL: number
constructor(
{
// inject services through dependency injection
// for example you can access the logger
logger,
},
options: MemcachedCacheModuleOptions
) {
this.memcached = new Memcached(
options.location,
options.options
)
this.TTL = options.ttl || DEFAULT_CACHE_TIME
}
// ...
}
```
### get
The `get` method allows you to retrieve the value of a cached item based on its key. The method accepts a string as a first parameter, which is the key in the cache. It either returns the cached item or `null` if it doesn't exist.
Here's an example implementation of this method for a Memcached service:
```ts title=src/services/memcached-cache.ts
class MemcachedCacheService implements ICacheService {
// ...
async get<T>(cacheKey: string): Promise<T | null> {
return new Promise((res, rej) => {
this.memcached.get(cacheKey, (err, data) => {
if (err) {
res(null)
} else {
if (data) {
res(JSON.parse(data))
} else {
res(null)
}
}
})
})
}
}
```
### set
The `set` method is used to set an item in the cache. It accepts three parameters:
1. The first parameter `key` is a string that represents the key of the data being added to the cache. This key can be used later to get or invalidate the cached item.
2. The second parameter `data` is the data to be added to the cache. There's no defined type for this data.
3. The third parameter `ttl` is optional. It's a number indicating how long (in seconds) the data should be kept in the cache.
Here's an example of an implementation of this method for a Memcached service:
```ts title=src/services/memcached-cache.ts
class MemcachedCacheService implements ICacheService {
// ...
async set(
key: string,
data: Record<string, unknown>,
ttl: number = this.TTL
): Promise<void> {
return new Promise((res, rej) =>
this.memcached.set(
key, JSON.stringify(data), ttl, (err) => {
if (err) {
rej(err)
} else {
res()
}
})
)
}
}
```
### invalidate
The `invalidate` method removes an item from the cache using its key. By default, the item is removed from the cache when the time-to-live (ttl) expires. The `invalidate` method can be used to remove the item beforehand.
The method accepts a string as a first parameter, which is the key of the item to invalidate and remove from the cache.
Here's an example of an implementation of this method for a Memcached service:
```ts title=src/services/memcached-cache.ts
class MemcachedCacheService implements ICacheService {
// ...
async invalidate(key: string): Promise<void> {
return new Promise((res, rej) => {
this.memcached.del(key, (err) => {
if (err) {
rej(err)
} else {
res()
}
})
})
}
}
```
---
## Step 3: Export the Service
After implementing the cache service, you must export it so that the Medusa backend can use it.
Create the file `src/index.ts` with the following content:
```ts title=src/index.ts
import { ModuleExports } from "@medusajs/modules-sdk"
import {
MemcachedCacheService,
} from "./services/memcached-cache"
const service = MemcachedCacheService
const moduleDefinition: ModuleExports = {
service,
}
export default moduleDefinition
```
This exports a module definition, which requires at least a `service`. If you named your service something other than `MemcachedCacheService`, make sure to replace it with that.
You can learn more about what other properties you can export in your module definition in the [Create a Module documentation](../modules/create.mdx#step-2-export-module).
---
## Step 4: Test your Module
You can test your module in the Medusa backend by referencing it in the configurations.
To do that, add the module to the exported configuration in `medusa-config.js` as follows:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
cacheService: {
resolve: "path/to/custom-module/src/index.ts",
options: {
// any necessary options
ttl: 30,
location: "localhost:55000",
},
},
},
}
```
Make sure to replace the `path/to/custom-module` with a relative path from your Medusa backend to your module. You can learn more about module reference in the [Create Module documentation](../modules/create.mdx#module-reference).
You can also add any necessary options to the module. The options added in the example above are relevant to the Memcached module, and you can replace them with your own options.
Then, to test the module, run the Medusa backend, which also runs your module:
```bash npm2yarn
npx medusa develop
```
---
## (Optional) Step 5: Publish your Module
You can publish your cache module to NPM. This can be useful if you want to reuse your module across Medusa backend instances, or want to allow other developers to use it.
You can refer to the [Publish Module documentation](../modules/publish.md) to learn how to publish your module.
@@ -0,0 +1,70 @@
---
description: 'In this document, you’ll learn about the in-memory cache module and how you can install it in your Medusa backend.'
---
# In-Memory Cache Module
In this document, you'll learn about the in-memory cache module and how you can install it in your Medusa backend.
## Overview
Medusa's modular architecture allows developers to extend or completely replace the logic used for caching. You can create a custom module, or you can use the modules Medusa provides.
Medusa's default starter project uses the in-memory cache module. The in-memory cache module uses a plain JavaScript Map object to store the cache data.
This module is helpful for development or when you're testing out Medusa, but it's not recommended to be used in production. For production, it's recommended to use modules like the [Redis Cache Module](./redis.md).
This document will guide you through installing the in-memory cache module.
---
## Prerequisites
### Medusa Backend
It's assumed you already have a Medusa backend installed. If not, you can learn how to install it by following [this guide](../../backend/install.mdx).
---
## Step 1: Install the Module
In the root directory of your Medusa backend, install the in-memory cache module with the following command:
```bash npm2yarn
npm install @medusajs/cache-inmemory
```
---
## Step 2: Add Configuration
In `medusa-config.js`, add the following to the exported object:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
cacheService: {
resolve: "@medusajs/cache-inmemory",
options: {
ttl: 30,
},
},
},
}
```
This registers the in-memory cache module as the main cache service to use. You pass it the `ttl` option, which stands for time-to-live and indicates the number of seconds an item can live in the cache before it's removed.
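The behavior of a Map-based cache with a `ttl` can be pictured with a minimal sketch. This is illustrative only, and not the actual `@medusajs/cache-inmemory` implementation:

```ts
// Illustrative sketch of a Map-based cache with TTL.
// Not the actual @medusajs/cache-inmemory implementation.
type Entry = { data: unknown; expiresAt: number }

class InMemoryCacheSketch {
  private store = new Map<string, Entry>()

  constructor(private defaultTtl: number = 30) {}

  async set(key: string, data: unknown, ttl = this.defaultTtl): Promise<void> {
    // store the data along with the timestamp at which it expires
    this.store.set(key, { data, expiresAt: Date.now() + ttl * 1000 })
  }

  async get<T>(key: string): Promise<T | null> {
    const entry = this.store.get(key)
    if (!entry || entry.expiresAt <= Date.now()) {
      // expired and missing items behave the same: a cache miss
      this.store.delete(key)
      return null
    }
    return entry.data as T
  }

  async invalidate(key: string): Promise<void> {
    this.store.delete(key)
  }
}
```

Once the `ttl` elapses, `get` treats the item as a miss and returns `null`, which is the semantics the option above configures.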
---
## Step 3: Test Module
To test the module, run the following command to start the Medusa backend:
```bash npm2yarn
npx medusa develop
```
The backend should then start with no errors, indicating that the module was installed successfully.
@@ -0,0 +1,92 @@
---
description: 'In this document, you’ll learn about the Redis cache module and how you can install it in your Medusa backend.'
---
# Redis Cache Module
In this document, you'll learn about the Redis cache module and how you can install it in your Medusa backend.
## Overview
Medusa's modular architecture allows developers to extend or replace the logic used for [caching](../overview.mdx). You can create a custom module, or you can use the modules Medusa provides.
One of these modules is the Redis module, which allows you to utilize Redis for the caching functionality. This document will guide you through installing the Redis module.
---
## Prerequisites
### Medusa Backend
It's assumed you already have a Medusa backend installed. If not, you can learn how to install it by following [this guide](../../backend/install.mdx).
### Redis
You must have Redis installed and configured in your Medusa backend. You can learn how to install it from the [Redis documentation](https://redis.io/docs/getting-started/installation/).
---
## Step 1: Install the Module
In the root directory of your Medusa backend, install the Redis cache module with the following command:
```bash npm2yarn
npm install @medusajs/cache-redis
```
---
## Step 2: Add Environment Variable
The Redis cache module requires a connection URL to Redis as part of its options. If you don't already have an environment variable set for a Redis URL, make sure to add one:
```bash
CACHE_REDIS_URL=<YOUR_REDIS_URL>
```
Where `<YOUR_REDIS_URL>` is a connection URL to your Redis instance.
---
## Step 3: Add Configuration
In `medusa-config.js`, add the following to the exported object:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
cacheService: {
resolve: "@medusajs/cache-redis",
options: {
redisUrl: process.env.CACHE_REDIS_URL,
ttl: 30,
},
},
},
}
```
This registers the Redis cache module as the main cache service to use. In the options, you pass `redisUrl` with the value of the environment variable you set earlier. You also pass the `ttl` option, which stands for time-to-live and indicates the number of seconds an item can live in the cache before it's removed.
Other available options include:
- `redisOptions`: an object containing options for the Redis instance. You can learn about available options in [io-redis's documentation](https://luin.github.io/ioredis/index.html#RedisOptions). By default, it's an empty object.
- `namespace`: a string used to prefix cache keys. By default, it's `medusa`.
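For example, a configuration that sets these optional settings might look like the following. The `namespace` and `redisOptions` values here are illustrative, not recommendations:

```js title=medusa-config.js
module.exports = {
  // ...
  modules: {
    // ...
    cacheService: {
      resolve: "@medusajs/cache-redis",
      options: {
        redisUrl: process.env.CACHE_REDIS_URL,
        ttl: 30,
        // illustrative values for the optional settings
        namespace: "my-store",
        redisOptions: {
          // io-redis option: fail if Redis can't be reached in 10 seconds
          connectTimeout: 10000,
        },
      },
    },
  },
}
```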
---
## Step 4: Test Module
To test the module, run the following command to start the Medusa backend:
```bash npm2yarn
npx medusa develop
```
If the module was installed successfully, you should see the following message in the logs:
```bash noCopy noReport
Connection to Redis in module 'cache-redis' established
```
@@ -0,0 +1,65 @@
---
description: 'In this document, you’ll learn about Cache Modules in Medusa, how they work, and which modules Medusa provides.'
---
import DocCardList from '@theme/DocCardList';
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Cache
In this document, you'll learn about Cache Modules in Medusa, how they work, and which modules Medusa provides.
## Overview
Cache is used in Medusa to store the results of computations such as price selection or various tax calculations. The caching layer in Medusa is implemented as a [module](../modules/overview.mdx). This allows you to replace the underlying database/cache store in your project.
Modules are packages that export a service. The main service exported from a cache module has to implement the `ICacheService` interface exported from the `@medusajs/types` package.
During the app startup, the modules loader will load the service into the [dependency injection container](../fundamentals/dependency-injection.md) to be available as `cacheService` to the rest of the commerce application.
---
## Available Modules
Medusa's default starter project comes with the in-memory cache module (`@medusajs/cache-inmemory`). In-memory cache uses a plain JS Map object to store data, which is great for testing and development purposes.
For production environments, there is the Redis cache module package (`@medusajs/cache-redis`) that you can install.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/cache/modules/redis',
label: 'Redis Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to install and use the Redis Module'
}
},
{
type: 'link',
href: '/development/cache/modules/in-memory',
label: 'In-Memory Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to install and use the In-Memory Module'
}
},
]} />
---
## Custom Development
Developers can create custom cache modules to implement their custom cache logic. The module can be installed and used in any Medusa backend.
<DocCard item={{
type: 'link',
href: '/development/cache/create',
label: 'Create a Cache Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a cache module in Medusa.'
}
}}
/>
@@ -0,0 +1,216 @@
---
description: 'Learn how to add a middleware in Medusa. A middleware is a function that has access to the request and response objects and can be used to perform actions around an endpoint.'
addHowToData: true
---
import Troubleshooting from '@site/src/components/Troubleshooting'
import ServiceLifetimeSection from '../../troubleshooting/awilix-resolution-error/_service-lifetime.md'
import CustomRegistrationSection from '../../troubleshooting/awilix-resolution-error/_custom-registration.md'
# Middlewares
In this document, you'll learn how to add a middleware to an existing or custom route in Medusa.
## Overview
As the Medusa backend is built on top of [Express](https://expressjs.com/), Express's features can be utilized during your development with Medusa.
One feature in particular is adding a [middleware](http://expressjs.com/en/guide/using-middleware.html#using-middleware). A middleware is a function that has access to the request and response objects.
A middleware can be used to perform an action when an endpoint is called or modify the response, among other usages.
You can add a middleware to an existing route in the Medusa backend, a route in a plugin, or your custom routes.
---
## How to Add a Middleware
### Step 1: Create the Middleware File
You can organize your middlewares as you see fit, but it's recommended to create them in the `src/api/middlewares` directory, with each middleware in its own file.
Each file should export a middleware function that accepts three parameters:
1. The first one is an Express request object. It can be used to get details related to the request or resolve resources from the dependency container.
2. The second one is an Express response object. It can be used to modify the response, or in some cases return a response without executing the associated endpoint.
3. The third one is a next middleware function that ensures that other middlewares and the associated endpoint are executed.
:::info
You can learn more about middlewares and their capabilities in [Express's documentation](http://expressjs.com/en/guide/using-middleware.html#using-middleware).
:::
Here's an example of a middleware:
```ts title=src/api/middlewares/custom-middleware.ts
export function customMiddleware(req, res, next) {
// TODO perform an action
next()
}
```
### Step 2: Apply Middleware on an Endpoint
To apply a middleware on any endpoint, you can use the same router defined in `src/api/index.ts` or any other router that is used or exported by `src/api/index.ts`. For example:
:::warning
The examples used here don't apply Cross-Origin Resource Sharing (CORS) options for simplicity. Make sure to apply them, especially for core routes, as explained in the [Create Endpoint](./create.mdx#cors-configuration) documentation.
:::
```ts title=src/api/index.ts
import { Router } from "express"
import {
customMiddleware,
} from "./middlewares/custom-middleware"
export default (rootDirectory, pluginOptions) => {
const router = Router()
// custom route
router.get("/hello", (req, res) => {
res.json({
message: "Welcome to My Store!",
})
})
// middleware for the custom route
router.use("/hello", customMiddleware)
// middleware for core route
router.use("/store/products", customMiddleware)
return router
}
```
### Step 3: Building Files
Similar to custom endpoints, you must transpile the files under `src` into the `dist` directory for the backend to load them.
To do that, run the following command before running the Medusa backend:
```bash npm2yarn
npm run build
```
You can then test that the middleware is working by running the backend.
---
## Registering New Resources in Dependency Container
In some cases, you may need to register a resource to use within your commerce application. For example, you may want to register the logged-in user to access it in other services. You can do that in your middleware.
:::tip
If you want to register a logged-in user and access it in your resources, you can check out [this example guide](./example-logged-in-user.mdx).
:::
To register a new resource in the dependency container, use the `req.scope.register` method:
```ts title=src/api/middlewares/custom-middleware.ts
export function customMiddleware(req, res, next) {
// TODO perform an action
req.scope.register({
customResource: {
resolve: () => "my custom resource",
},
})
next()
}
```
You can then load this new resource within other resources. For example, to load it in a service:
<!-- eslint-disable prefer-rest-params -->
```ts title=src/services/custom-service.ts
import { TransactionBaseService } from "@medusajs/medusa"
class CustomService extends TransactionBaseService {
constructor(container, options) {
super(...arguments)
// use the registered resource.
try {
container.customResource
} catch (e) {
// avoid errors when the backend first loads
}
}
}
export default CustomService
```
Notice that you have to wrap your usage of the new resource in a try-catch block when you use it in a constructor. This is to avoid errors that can arise when the backend first loads, as the resource is not registered yet.
### Note About Services Lifetime
If you want to access new registrations in the dependency container within a service, you must set the lifetime of the service either to `Lifetime.SCOPED` or `Lifetime.TRANSIENT`. Services that have a `Lifetime.SINGLETON` lifetime can't access new registrations since they're resolved and cached in the root dependency container beforehand. You can learn more in the [Create Services documentation](../services/create-service.mdx#service-life-time).
For custom services, no additional action is required as the default lifetime is `Lifetime.SCOPED`. However, if you extend a core service, you must change the lifetime since the default lifetime for core services is `Lifetime.SINGLETON`.
For example:
<!-- eslint-disable prefer-rest-params -->
```ts
import { Lifetime } from "awilix"
import {
ProductService as MedusaProductService,
} from "@medusajs/medusa"
// extending ProductService from the core
class ProductService extends MedusaProductService {
// The default life time for a core service is SINGLETON
static LIFE_TIME = Lifetime.SCOPED
constructor(container, options) {
super(...arguments)
// use the registered resource.
try {
container.customResource
} catch (e) {
// avoid errors when the backend first loads
}
}
// ...
}
export default ProductService
```
---
## Troubleshooting
<Troubleshooting
sections={[
{
title: 'AwilixResolutionError: Could Not Resolve X',
content: <ServiceLifetimeSection />
},
{
title: 'AwilixResolutionError: Could Not Resolve X (Custom Registration)',
content: <CustomRegistrationSection />
}
]}
/>
---
## See Also
- [Store API reference](https://docs.medusajs.com/api/store)
- [Admin API reference](https://docs.medusajs.com/api/admin)
@@ -0,0 +1,953 @@
---
description: 'Learn how to create endpoints in Medusa. This guide also includes how to add CORS configurations, creating multiple endpoints, adding protected routes, and more.'
addHowToData: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# How to Create Endpoints
In this document, you'll learn how to create endpoints in Medusa.
## Overview
Custom endpoints are created under the `src/api` directory in your Medusa Backend. They're defined in a TypeScript or JavaScript file named `index` (for example, `index.ts`). This file should export a function that returns an Express router or an array of routes and middlewares.
To consume the custom endpoints in your Medusa backend, you must transpile them with the `build` command before starting your backend.
---
## Basic Implementation
To create a new endpoint, start by creating a new file in `src/api` called `index.ts`. At its basic format, `index.ts` should look something like this:
```ts title=src/api/index.ts
import { Router } from "express"
export default (rootDirectory, options) => {
const router = Router()
router.get("/hello", (req, res) => {
res.json({
message: "Welcome to My Store!",
})
})
return router
}
```
This exports a function that returns an Express router. The function receives two parameters:
- `rootDirectory` is the absolute path to the root directory that your backend is running from.
- `options` is an object that contains the configurations exported from `medusa-config.js`. If your API route is part of a plugin, then it will contain the plugin's options instead.
---
## Building Files
Custom endpoints must be transpiled and moved to the `dist` directory before you can start consuming them. When you run your backend using either the `medusa develop` or `npx medusa develop` commands, it watches the files under `src` for any changes, then triggers the `build` command and restarts the server.
However, the build isn't triggered when the backend first starts running, and it's never triggered when the `medusa start` or `npx medusa start` commands are used.
So, make sure to run the `build` command before starting the backend:
```bash npm2yarn
npm run build
```
---
## Defining Multiple Routes or Middlewares
Instead of returning an Express router in the function, you can return an array of routes and [middlewares](./add-middleware.mdx).
For example:
```ts title=src/api/index.ts
import { Router } from "express"
export default (rootDirectory, options) => {
const router = Router()
router.get("/hello", (req, res) => {
res.json({
message: "Welcome to My Store!",
})
})
// you can also define the middleware
// in another file and import it
const middleware = (req, res, next) => {
// TODO define global middleware
console.log("hello from middleware")
next()
}
const anotherRouter = Router()
anotherRouter.get("/store/*", (req, res, next) => {
// TODO perform an action for all store endpoints
next()
})
return [middleware, router, anotherRouter]
}
```
This allows you to export multiple routers and middlewares from the same file. You can also import the routers, routes, and middlewares from other files, then import them in `src/api/index.ts` instead of defining them within the same file.
---
## Endpoint Path
Your endpoint can be under any path you wish.
By Medusa's conventions:
- All Storefront REST APIs are prefixed by `/store`. For example, the `/store/products` endpoint lets you retrieve the products to display them on your storefront.
- All Admin REST APIs are prefixed by `/admin`. For example, the `/admin/products` endpoint lets you retrieve the products to display them on your Admin.
You can also create endpoints that don't reside under these two prefixes, similar to the `hello` endpoint in the previous example.
---
## CORS Configuration
If you're adding a storefront or admin endpoint and you want to access these endpoints from the storefront or Medusa admin, you need to pass your endpoints' Cross-Origin Resource Sharing (CORS) options using the `cors` package.
First, import the necessary utility functions and types from Medusa's packages, along with the `cors` package:
```ts
import {
getConfigFile,
parseCorsOrigins,
} from "medusa-core-utils"
import {
ConfigModule,
} from "@medusajs/medusa/dist/types/global"
import cors from "cors"
```
Next, in the exported function, retrieve the CORS configurations of your backend using the utility functions you imported:
```ts
export default (rootDirectory) => {
// ...
const { configModule } =
getConfigFile<ConfigModule>(rootDirectory, "medusa-config")
const { projectConfig } = configModule
// ....
}
```
Then, create an object that will hold the CORS configurations. Based on whether it's storefront or admin CORS options, you pass it the respective configuration from `projectConfig`:
<Tabs groupId="endpoint-type" isCodeTabs={true}>
<TabItem value="storefront" label="Storefront CORS" default>
```ts
const storeCorsOptions = {
origin: projectConfig.store_cors.split(","),
credentials: true,
}
```
</TabItem>
<TabItem value="admin" label="Admin CORS">
```ts
const adminCorsOptions = {
origin: projectConfig.admin_cors.split(","),
credentials: true,
}
```
</TabItem>
</Tabs>
Finally, you can either pass the `cors` middleware for a specific route, or pass it to the entire router:
<Tabs groupId="pass-type" isCodeTabs={true}>
<TabItem value="single" label="Pass to Endpoint" default>
```ts
adminRouter.options("/admin/hello", cors(adminCorsOptions))
adminRouter.get(
"/admin/hello",
cors(adminCorsOptions),
(req, res) => {
// ...
}
)
```
</TabItem>
<TabItem value="router" label="Pass to Router">
```ts
adminRouter.use(cors(adminCorsOptions))
```
</TabItem>
</Tabs>
---
## Parse Request Body Parameters
If you want to accept request body parameters, you need to pass Express middlewares that parse the payload type to your router.
For example:
```ts title=src/api/index.ts
import express, { Router } from "express"
export default (rootDirectory, pluginOptions) => {
const router = Router()
router.use(express.json())
router.use(express.urlencoded({ extended: true }))
router.post("/store/hello", (req, res) => {
res.json({
message: req.body.name,
})
})
return router
}
```
In the code snippet above, you use the following middlewares:
- `express.json()`: parses requests with JSON payloads
- `express.urlencoded()`: parses requests with urlencoded payloads.
You can learn about other available middlewares in the [Express documentation](https://expressjs.com/en/api.html#express).
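Conceptually, each parser turns the raw request body into the object made available as `req.body`. A simplified sketch of what they do, using only standard APIs (this is illustrative, not Express's actual implementation):

```ts
// Simplified sketch of what the body-parsing middlewares do: turn the
// raw request body string into the object exposed as `req.body`.
// Not Express's actual implementation.
function parseJsonBody(raw: string): Record<string, unknown> {
  // what express.json() does conceptually
  return JSON.parse(raw) as Record<string, unknown>
}

function parseUrlencodedBody(raw: string): Record<string, string> {
  // what express.urlencoded() does conceptually
  return Object.fromEntries(new URLSearchParams(raw))
}
```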
---
## Protected Routes
Protected routes are routes that should only be accessible by logged-in customers or users.
### Protect Store Routes
There are two approaches to make a storefront route protected:
- Using the `requireCustomerAuthentication` middleware, which disallows unauthenticated customers from accessing a route, and allows you to access the logged-in customer's ID.
- Using the `authenticateCustomer` middleware, which allows both authenticated and unauthenticated customers to access your route, but allows you to access the logged-in customer's ID as well.
To make a storefront route protected using either middlewares, first, import the middleware at the top of your file:
<!-- eslint-disable max-len -->
```ts
import { requireCustomerAuthentication } from "@medusajs/medusa"
// import { authenticateCustomer } from "@medusajs/medusa"
```
Then, pass the middleware to either a single route or an entire router:
<Tabs groupId="pass-type" isCodeTabs={true}>
<TabItem value="single" label="Pass to Endpoint" default>
```ts
// only necessary if you're passing cors options per route
router.options("/store/hello", cors(storeCorsOptions))
router.get(
"/store/hello",
cors(storeCorsOptions),
requireCustomerAuthentication(),
// authenticateCustomer()
async (req, res) => {
// access current customer
const id = req.user.customer_id
// if you're using authenticateCustomer middleware
// check if id is set first
const customerService = req.scope.resolve("customerService")
const customer = await customerService.retrieve(id)
// ...
}
)
```
</TabItem>
<TabItem value="router" label="Pass to Router">
```ts
storeRouter.use(requireCustomerAuthentication())
// all routes added to storeRouter are now protected
// the logged in customer can be accessed using:
// req.user.customer_id
// storeRouter.use(authenticateCustomer())
```
</TabItem>
</Tabs>
### Protect Admin Routes
To protect admin routes so that only logged-in users can access them, first, import the `authenticate` middleware at the top of the file:
<!-- eslint-disable max-len -->
```ts
import { authenticate } from "@medusajs/medusa"
```
Then, pass the middleware to either a single route or an entire router:
<Tabs groupId="pass-type" isCodeTabs={true}>
<TabItem value="single" label="Pass to Endpoint" default>
```ts
// only necessary if you're passing cors options per route
adminRouter.options("/admin/hello", cors(adminCorsOptions))
adminRouter.get(
"/admin/hello",
cors(adminCorsOptions),
authenticate(),
async (req, res) => {
// access current user
const id = req.user.userId
const userService = req.scope.resolve("userService")
const user = await userService.retrieve(id)
// ...
}
)
```
</TabItem>
<TabItem value="router" label="Pass to Router">
```ts
adminRouter.use(authenticate())
// all routes added to adminRouter are now protected
// the logged in user can be accessed using:
// req.user.userId
```
</TabItem>
</Tabs>
---
## Retrieve Medusa Config
As mentioned, the second parameter `options` includes the configurations exported from `medusa-config.js`. However, in plugins it only includes the plugin's options.
If you need to access the Medusa configuration in your endpoint, you can use the `getConfigFile` method imported from `medusa-core-utils`. It accepts the following parameters:
1. `rootDirectory`: The first parameter is a string indicating the root directory of your Medusa backend.
2. `config`: The second parameter is a string indicating the name of the config file, which should be `medusa-config` unless you change it.
The function returns an object with the following properties:
1. `configModule`: An object containing the configurations exported from `medusa-config.js`.
2. `configFilePath`: A string indicating the absolute path to the configuration file.
3. `error`: If any errors occur, they'll be included as the value of this property. Otherwise, its value will be `undefined`.
Here's an example of retrieving the configurations within an endpoint using `getConfigFile`:
```ts title=src/api/index.ts
import { Router } from "express"
import { ConfigModule } from "@medusajs/medusa"
import { getConfigFile } from "medusa-core-utils"
export default (rootDirectory) => {
const router = Router()
const { configModule } = getConfigFile<ConfigModule>(
rootDirectory,
"medusa-config"
)
router.get("/store-cors", (req, res) => {
res.json({
store_cors: configModule.projectConfig.store_cors,
})
})
return router
}
```
Notice that `getConfigFile` is a generic function. So, if you're using TypeScript, you should pass it the type `ConfigModule` imported from `@medusajs/medusa`.
If you're accessing custom configurations, you'll need to create a new type that defines these configurations. For example:
```ts title=src/api/index.ts
import { Router } from "express"
import { ConfigModule } from "@medusajs/medusa"
import { getConfigFile } from "medusa-core-utils"
type MyConfigModule = ConfigModule & {
projectConfig: {
custom_config?: string
}
}
export default (rootDirectory) => {
const router = Router()
const { configModule } = getConfigFile<MyConfigModule>(
rootDirectory,
"medusa-config"
)
router.get("/hello", (req, res) => {
res.json({
custom_config: configModule.projectConfig.custom_config,
})
})
return router
}
```
---
## Handle Errors
As Medusa uses v4 of Express, you need to manually handle errors thrown asynchronously as explained in [Express's documentation](https://expressjs.com/en/guide/error-handling.html).
You can use [middlewares](./add-middleware.mdx) to handle errors. You can also use middlewares defined by Medusa, which ensure that your error handling is consistent across your Medusa backend.
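The core of such an error-forwarding middleware is catching the rejected promise from an async handler and passing the error to `next()`, so Express's error-handling chain picks it up. A minimal sketch of this pattern (illustrative only, not Medusa's actual `wrapHandler` implementation):

```ts
// Minimal sketch of the async-wrapper pattern: catch the rejected
// promise from an async handler and forward the error to Express's
// error-handling chain via next().
// Illustrative only (not Medusa's actual wrapHandler implementation).
type NextFn = (err?: unknown) => void
type Handler = (req: unknown, res: unknown, next: NextFn) => unknown

function wrapAsync(handler: Handler): Handler {
  return (req, res, next) => {
    // async rejections are forwarded to next(); synchronous throws
    // are already caught by Express itself
    Promise.resolve(handler(req, res, next)).catch(next)
  }
}
```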
:::note
Code snippets are taken from the [full example available at the end of this document](#example-crud-endpoints).
:::
To handle errors using Medusa's middlewares, first, import the `errorHandler` middleware from `@medusajs/medusa` and apply it on your routers. Make sure it's applied after all other middlewares and routes:
```ts title=src/api/index.ts
import express, { Router } from "express"
import adminRoutes from "./admin"
import storeRoutes from "./store"
import { errorHandler } from "@medusajs/medusa"
export default (rootDirectory, options) => {
const router = Router()
router.use(express.json())
router.use(express.urlencoded({ extended: true }))
adminRoutes(router, options)
storeRoutes(router, options)
router.use(errorHandler())
return router
}
```
Then, wrap the function handler of every route with the `wrapHandler` middleware imported from `@medusajs/medusa`. For example:
```ts title=src/api/admin.ts
import { wrapHandler } from "@medusajs/medusa"
// ...
export default function adminRoutes(
router: Router,
options: ConfigModule
) {
// ...
adminRouter.get("/posts", wrapHandler(async (req, res) => {
const postService: PostService =
req.scope.resolve("postService")
res.json({
posts: await postService.list(),
})
}))
// ...
}
```
Alternatively, you can define the endpoints in different files, and import and use them in your router:
<!-- eslint-disable @typescript-eslint/no-var-requires -->
```ts title=src/api/admin.ts
import { wrapHandler } from "@medusajs/medusa"
// ...
export default function adminRoutes(
router: Router,
options: ConfigModule
) {
// ...
adminRouter.get(
"/posts",
wrapHandler(require("./list-posts").default)
)
// ...
}
```
Now all errors thrown in your custom endpoints, including in their custom services, will be caught and returned to the user.
However, if you throw errors like this:
```ts
throw new Error("Post was not found")
```
You'll notice that the endpoint returns the following object error in the response:
```json
{
"code": "unknown_error",
"type": "unknown_error",
"message": "An unknown error occurred."
}
```
To ensure the error message is relayed in the response, it's recommended to use `MedusaError` imported from `@medusajs/utils` as the thrown error instead:
```ts
import { MedusaError } from "@medusajs/utils"
// ...
throw new MedusaError(
MedusaError.Types.NOT_FOUND,
"Post was not found"
)
```
The constructor of `MedusaError` accepts the following parameters:
1. The first parameter is the type of the error. You can use one of the predefined errors under `MedusaError.Types`, such as `MedusaError.Types.NOT_FOUND` which sets the response status code to `404` automatically.
2. The second parameter is the message of the error.
3. The third parameter is an optional code, which is a string, that can be returned in the error object.
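To make these parameters concrete, here's a minimal sketch of the error's shape (illustrative only — the real class is exported from `@medusajs/utils` and also maps types like `NOT_FOUND` to response status codes):

```typescript
// Illustrative sketch of MedusaError's shape — not the actual
// implementation from @medusajs/utils.
class MedusaErrorSketch extends Error {
  static Types = {
    NOT_FOUND: "not_found",
    INVALID_DATA: "invalid_data",
  }

  type: string
  code?: string

  constructor(type: string, message: string, code?: string) {
    super(message)
    this.type = type
    this.code = code
  }
}

const err = new MedusaErrorSketch(
  MedusaErrorSketch.Types.NOT_FOUND,
  "Post was not found",
  "post_not_found" // optional code returned in the error object
)
console.log(err.type) // "not_found"
console.log(err.message) // "Post was not found"
```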
After using `MedusaError`, you'll notice that the returned error in the response provides a clearer message:
```json
{
"type": "not_found",
"message": "Post was not found"
}
```
---
## Use Other Resources
### Entities and Repositories
Your endpoints likely perform an action on an entity. For example, you may create an endpoint to retrieve a list of posts.
You can perform actions on an entity either through its [Repository](../entities/overview.mdx#what-are-repositories) or through a [service](#services). This section covers how to retrieve a repository in an endpoint, but it's recommended to use services instead.
You can retrieve any registered resource, including repositories, in your endpoint using `req.scope.resolve` passing it the resource's registration name in the [dependency container](../fundamentals/dependency-injection.md). Repositories are registered as their camel-case name. So, for example, if you have a `PostRepository`, it's registered as `postRepository`.
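As a rough illustration, the registration name is derived by lower-casing the first character of the class name (a simplified sketch of the naming convention, not Medusa's actual registration code):

```typescript
// Simplified sketch of how a class name maps to its registration
// name in the dependency container. Illustrative only — the core
// uses its own name formatting when registering resources.
function toRegistrationName(className: string): string {
  return className.charAt(0).toLowerCase() + className.slice(1)
}

console.log(toRegistrationName("PostRepository")) // "postRepository"
console.log(toRegistrationName("PostService")) // "postService"
```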
Here's an example of an endpoint that retrieves the list of posts in a store:
:::note
Posts are represented by a custom entity not covered in this guide. You can refer to the [entities documentation](../entities/create.mdx#adding-relations) for more details on how to create a custom entity.
:::
```ts
import { PostRepository } from "../repositories/post"
import { EntityManager } from "typeorm"
// ...
export default () => {
// ...
storeRouter.get("/posts", async (req, res) => {
const postRepository: typeof PostRepository =
req.scope.resolve("postRepository")
const manager: EntityManager = req.scope.resolve("manager")
const postRepo = manager.withRepository(postRepository)
return res.json({
posts: await postRepo.find(),
})
})
// ...
}
```
Notice that to retrieve an instance of the repository, you first need to retrieve Typeorm's entity manager and use its `withRepository` method.
### Services
Services in Medusa bundle a set of functionalities into one class. Typically, these functionalities are associated with an entity, such as methods to retrieve, create, or update its records.
You can retrieve any registered resource, including services, in your endpoint using `req.scope.resolve` passing it the service's registration name in the [dependency container](../fundamentals/dependency-injection.md). Services are registered as their camel-case name. So, for example, if you have a `PostService`, it's registered as `postService`.
Here's an example of an endpoint that retrieves the list of posts in a store:
:::note
`PostService` is a custom service that is not covered in this guide. You can refer to the [services](../services/create-service.mdx) documentation for more details on how to create a custom service, and find an [example of PostService](../services/create-service.mdx#example-services-with-crud-operations).
:::
```ts
storeRouter.get("/posts", async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
return res.json({
posts: await postService.list(),
})
})
```
### Other Resources
Any resource that is registered in the dependency container, such as strategies or file services, can be accessed through `req.scope.resolve`. Refer to the [dependency injection](../fundamentals/dependency-injection.md) documentation for details on registered resources.
---
## Example: CRUD Endpoints
This section serves as an example of creating endpoints that perform Create, Read, Update, and Delete (CRUD) operations. Note that all admin endpoints are placed in `src/api/admin.ts`, and store endpoints are placed in `src/api/store.ts`. You can also place each endpoint in a separate file, import it, and add it to its respective router.
You can refer to the [Entities](../entities/create.mdx#adding-relations) and [Services](../services/create-service.mdx#example-services-with-crud-operations) documentation to learn how to create the custom entities and services used in this example.
<Tabs groupId="files" isCodeTabs={true}>
<TabItem value="index" label="src/api/index.ts" default>
```ts
import express, { Router } from "express"
import adminRoutes from "./admin"
import storeRoutes from "./store"
import { errorHandler } from "@medusajs/medusa"
export default (rootDirectory, options) => {
const router = Router()
router.use(express.json())
router.use(express.urlencoded({ extended: true }))
adminRoutes(router, options)
storeRoutes(router, options)
router.use(errorHandler())
return router
}
```
</TabItem>
<TabItem value="admin" label="src/api/admin.ts">
```ts
import { Router } from "express"
import PostService from "../services/post"
import {
ConfigModule,
} from "@medusajs/medusa/dist/types/global"
import cors from "cors"
import { authenticate, wrapHandler } from "@medusajs/medusa"
import AuthorService from "../services/author"
export default function adminRoutes(
router: Router,
options: ConfigModule
) {
const { projectConfig } = options
const corsOptions = {
origin: projectConfig.admin_cors.split(","),
credentials: true,
}
const adminRouter = Router()
router.use("/admin/blog", adminRouter)
adminRouter.use(cors(corsOptions))
adminRouter.use(authenticate())
// it's recommended to define the routes
// in separate files. They're done in
// the same file here for simplicity
// list all blog posts
adminRouter.get(
"/posts",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
res.json({
posts: await postService.list(),
})
}))
// retrieve a single blog post
adminRouter.get(
"/posts/:id",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
const post = await postService.retrieve(req.params.id)
res.json({
post,
})
}))
// create a blog post
adminRouter.post(
"/posts",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
// basic validation of request body
if (!req.body.title || !req.body.author_id) {
throw new Error("`title` and `author_id` are required.")
}
const post = await postService.create(req.body)
res.json({
post,
})
}))
// update a blog post
adminRouter.post(
"/posts/:id",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
// basic validation of request body
if (req.body.id) {
throw new Error("Can't update post ID")
}
const post = await postService.update(
req.params.id,
req.body
)
res.json({
post,
})
}))
// delete a blog post
adminRouter.delete(
"/posts/:id",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
await postService.delete(req.params.id)
res.status(200).end()
}))
// list all blog authors
adminRouter.get(
"/authors",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
res.json({
authors: await authorService.list(),
})
}))
// retrieve a single blog author
adminRouter.get(
"/authors/:id",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
res.json({
post: await authorService.retrieve(req.params.id),
})
}))
// create a blog author
adminRouter.post(
"/authors",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
// basic validation of request body
if (!req.body.name) {
throw new Error("`name` is required.")
}
const author = await authorService.create(req.body)
res.json({
author,
})
}))
// update a blog author
adminRouter.post(
"/authors/:id",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
// basic validation of request body
if (req.body.id) {
throw new Error("Can't update author ID")
}
const author = await authorService.update(
req.params.id,
req.body
)
res.json({
author,
})
}))
// delete a blog author
adminRouter.delete(
"/authors/:id",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
await authorService.delete(req.params.id)
res.status(200).end()
}))
}
```
</TabItem>
<TabItem value="store" label="src/api/store.ts">
```ts
import { Router } from "express"
import {
ConfigModule,
} from "@medusajs/medusa/dist/types/global"
import PostService from "../services/post"
import cors from "cors"
import AuthorService from "../services/author"
import { wrapHandler } from "@medusajs/medusa"
export default function storeRoutes(
router: Router,
options: ConfigModule
) {
const { projectConfig } = options
const storeCorsOptions = {
origin: projectConfig.store_cors.split(","),
credentials: true,
}
const storeRouter = Router()
router.use("/store/blog", storeRouter)
storeRouter.use(cors(storeCorsOptions))
// list all blog posts
storeRouter.get(
"/posts",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
res.json({
posts: await postService.list(),
})
}))
// retrieve a single blog post
storeRouter.get(
"/posts/:id",
wrapHandler(async (req, res) => {
const postService: PostService = req.scope.resolve(
"postService"
)
res.json({
post: await postService.retrieve(req.params.id),
})
}))
// list all blog authors
storeRouter.get(
"/authors",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
res.json({
authors: await authorService.list(),
})
}))
// retrieve a single blog author
storeRouter.get(
"/authors/:id",
wrapHandler(async (req, res) => {
const authorService: AuthorService = req.scope.resolve(
"authorService"
)
res.json({
post: await authorService.retrieve(req.params.id),
})
}))
}
```
</TabItem>
</Tabs>
---
## See Also
- [Storefront API Reference](https://docs.medusajs.com/api/store)
- [Admin API Reference](https://docs.medusajs.com/api/admin)


@@ -0,0 +1,220 @@
---
description: 'In this document, you’ll see an example of how you can use middlewares and endpoints to register the logged-in user in the dependency container of your commerce application.'
addHowToData: true
---
import Troubleshooting from '@site/src/components/Troubleshooting'
import ServiceLifetimeSection from '../../troubleshooting/awilix-resolution-error/_service-lifetime.md'
import CustomRegistrationSection from '../../troubleshooting/awilix-resolution-error/_custom-registration.md'
# Example: Access Logged-In User
In this document, you'll see an example of how you can use middlewares and endpoints to register the logged-in user in the dependency container of your commerce application. You can then access the logged-in user in other resources, such as services.
This guide showcases how to register the logged-in admin user, but you can apply the same steps if you want to register the current customer.
This documentation does not explain the basics of [middlewares](./add-middleware.mdx) and [endpoints](./create.mdx). You can refer to their respective guides for more details about each.
## Step 1: Create the Middleware
Create the file `src/api/middlewares/logged-in-user.ts` with the following content:
```ts title=src/api/middlewares/logged-in-user.ts
import { User, UserService } from "@medusajs/medusa"
export async function registerLoggedInUser(req, res, next) {
let loggedInUser: User | null = null
if (req.user && req.user.userId) {
const userService =
req.scope.resolve("userService") as UserService
loggedInUser = await userService.retrieve(req.user.userId)
}
req.scope.register({
loggedInUser: {
resolve: () => loggedInUser,
},
})
next()
}
```
This middleware uses the ID of the current user to retrieve an instance of it, then registers that instance in the scope under the name `loggedInUser`.
---
## Step 2: Apply Middleware on Endpoint
If you don't have the `cors` package installed, make sure to install it first:
```bash npm2yarn
npm install cors
```
Then, create the file `src/api/routes/create-product.ts` with the following content:
```ts title=src/api/routes/create-product.ts
import cors from "cors"
import { Router } from "express"
import {
registerLoggedInUser,
} from "../middlewares/logged-in-user"
import authenticate
  from "@medusajs/medusa/dist/api/middlewares/authenticate"
const router = Router()
export default function (adminCorsOptions) {
// This router will be applied before the core routes.
// Therefore, the middleware will be executed
// before the create product handler is hit
router.use(
"/admin/products",
cors(adminCorsOptions),
authenticate(),
registerLoggedInUser
)
return router
}
```
In the example above, the middleware is applied on the `/admin/products` core endpoint. However, you can apply it on any other endpoint. You can also apply it to custom endpoints.
For endpoints that require Cross-Origin Resource Sharing (CORS) options, such as core endpoints, you must pass the CORS options to the middleware as well since it will be executed before the underlying endpoint.
:::tip
In the above code snippet, the `authenticate` middleware imported from `@medusajs/medusa` is used to ensure that the user is logged in first. If you're implementing this middleware to register the logged-in customer, make sure to use the [customer's authenticate middleware](./create.mdx#protect-store-routes).
:::
---
## Step 3: Register Endpoint in the API
Create the file `src/api/index.ts` with the following content:
```ts title=src/api/index.ts
import configLoader from "@medusajs/medusa/dist/loaders/config"
import createProductRouter from "./routes/create-product"
export default function (rootDirectory: string) {
const config = configLoader(rootDirectory)
const adminCors = {
origin: config.projectConfig.admin_cors.split(","),
credentials: true,
}
const productRouters = [
createProductRouter(adminCors),
]
return [...productRouters]
}
```
This exports an array of endpoints, one of them being the product endpoint that you applied the middleware on in the second step. You can export more endpoints as well.
---
## Step 4: Use in a Service
You can now access the logged-in user in a service. For example, to access it in a custom service:
<!-- eslint-disable prefer-rest-params -->
```ts
import { Lifetime } from "awilix"
import {
TransactionBaseService,
User,
} from "@medusajs/medusa"
class HelloService extends TransactionBaseService {
protected readonly loggedInUser_: User | null
constructor(container, options) {
super(...arguments)
try {
this.loggedInUser_ = container.loggedInUser
} catch (e) {
// avoid errors when backend first runs
}
}
// ...
}
export default HelloService
```
If you're accessing it in an extended core service, it's important to change the lifetime of the service to `Lifetime.SCOPED`. For example:
<!-- eslint-disable prefer-rest-params -->
```ts
import { Lifetime } from "awilix"
import {
ProductService as MedusaProductService,
User,
} from "@medusajs/medusa"
// extend core product service
class ProductService extends MedusaProductService {
// The default life time for a core service is SINGLETON
static LIFE_TIME = Lifetime.SCOPED
protected readonly loggedInUser_: User | null
constructor(container, options) {
super(...arguments)
this.loggedInUser_ = container.loggedInUser
}
}
export default ProductService
```
You can learn more about the importance of changing the service lifetime in the [Middlewares documentation](./add-middleware.mdx#note-about-services-lifetime).
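To see why the lifetime matters, here's a minimal, self-contained sketch of singleton vs. scoped resolution (illustrative only — Medusa uses awilix under the hood, not this code):

```typescript
type Lifetime = "SINGLETON" | "SCOPED"

// Minimal illustration of service lifetimes — not awilix itself,
// just the resolution behavior it implies.
class ContainerSketch {
  constructor(
    private registrations = new Map<
      string,
      { resolve: () => unknown; lifetime: Lifetime }
    >(),
    private singletons = new Map<string, unknown>()
  ) {}

  register(name: string, resolve: () => unknown, lifetime: Lifetime) {
    this.registrations.set(name, { resolve, lifetime })
  }

  // each request gets its own scope; singleton instances are
  // shared across scopes, scoped ones are resolved fresh
  createScope(): ContainerSketch {
    return new ContainerSketch(new Map(this.registrations), this.singletons)
  }

  resolve<T>(name: string): T {
    const reg = this.registrations.get(name)
    if (!reg) throw new Error(`Could not resolve ${name}`)
    if (reg.lifetime === "SINGLETON") {
      if (!this.singletons.has(name)) {
        this.singletons.set(name, reg.resolve())
      }
      return this.singletons.get(name) as T
    }
    return reg.resolve() as T
  }
}

const container = new ContainerSketch()
let requests = 0
// a SINGLETON service is constructed once — it would capture the
// loggedInUser of whichever request happened to come first
container.register(
  "service",
  () => ({ user: `user-${++requests}` }),
  "SINGLETON"
)
const first = container.createScope().resolve<{ user: string }>("service")
const second = container.createScope().resolve<{ user: string }>("service")
console.log(first === second) // true — later requests see a stale user
```

A `SCOPED` registration, by contrast, is resolved anew within each request's scope, which is why an extended core service that reads `loggedInUser` must not keep the default singleton lifetime.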
---
## Step 5: Test it Out
To test out your implementation, run the following command in the root directory of the Medusa backend to transpile your changes:
```bash npm2yarn
npm run build
```
Then, run your backend with the following command:
```bash npm2yarn
npx medusa develop
```
If you try accessing the endpoints you added the middleware to, you should see your implementation working as expected.
---
## Troubleshooting
<Troubleshooting
sections={[
{
title: 'AwilixResolutionError: Could Not Resolve X',
content: <ServiceLifetimeSection />
},
{
title: 'AwilixResolutionError: Could Not Resolve X (Custom Registration)',
content: <CustomRegistrationSection />
}
]}
/>


@@ -0,0 +1,83 @@
---
description: 'Learn how to extend a validator. This is useful when you want to pass additional data to endpoints in the Medusa core.'
addHowToData: true
---
# How to Extend an Endpoint Validator
In this guide, you'll learn how to extend an endpoint validator from the Medusa core.
## Overview
Request fields passed to endpoints that are defined in the Medusa core are validated to ensure that only expected fields are passed, and the passed fields are of correct types.
In some scenarios, you may need to allow passing custom fields into an existing endpoint. If a custom field is passed to an endpoint in the core, the endpoint returns an error in the response.
To allow passing custom fields into core endpoints, you must extend validators. Validators are classes used by the core to validate the request parameters of an endpoint.
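Conceptually, the validation behaves like a field whitelist (a simplified, self-contained sketch — the core actually uses `class-validator` decorators on the validator classes):

```typescript
// Simplified sketch of how request validation rejects unknown
// fields — illustrative only, not the core's validation pipeline.
function findUnknownFields(
  body: Record<string, unknown>,
  allowedFields: string[]
): string[] {
  return Object.keys(body).filter((key) => !allowedFields.includes(key))
}

// "custom_field" is not part of the core validator, so the
// request would fail until the validator is extended
const unknown = findUnknownFields(
  { title: "Shirt", custom_field: "value" },
  ["title"]
)
console.log(unknown) // ["custom_field"]
```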
This guide explains how to extend a validator to allow passing custom fields to an endpoint. You'll be extending the validator of the admin API Create Product endpoint as an example.
---
## Prerequisites
This guide assumes you already have a Medusa backend installed and configured. If not, you can check out the [backend quickstart guide](../backend/install.mdx).
---
## Step 1: Create File
You can add the code to extend the validator in any file under the `src` directory of your Medusa project, but it should be executed by `src/api/index.ts`.
For example, you can add the code in an exported function defined in the file `src/api/routes/admin/products/create-product.ts`, then import that file in `src/api/index.ts` and execute the function.
For simplicity, this guide adds the code directly in `src/api/index.ts`. Make sure to create it if it's not already created.
---
## Step 2: Extend Validator
In the file you created, which in this case is `src/api/index.ts`, add the following content to extend the validator:
<!-- eslint-disable max-len -->
```ts title=src/api/index.ts
import { registerOverriddenValidators } from "@medusajs/medusa"
import {
AdminPostProductsReq as MedusaAdminPostProductsReq,
} from "@medusajs/medusa/dist/api/routes/admin/products/create-product"
import { IsString } from "class-validator"
class AdminPostProductsReq extends MedusaAdminPostProductsReq {
@IsString()
custom_field: string
}
registerOverriddenValidators(AdminPostProductsReq)
```
In this code snippet you:
1. Import the `registerOverriddenValidators` function from the `@medusajs/medusa` package. This utility function allows you to extend validators in the core.
2. Import the `AdminPostProductsReq` class from `@medusajs/medusa` as `MedusaAdminPostProductsReq` since this guide extends the Create Product endpoint validator. If you're extending a different validator, make sure to import it instead.
3. Create a class `AdminPostProductsReq` that extends `MedusaAdminPostProductsReq` and adds a new field `custom_field`. Notice that the name of the class must be the same name of the validator defined in the core. `custom_field` has the type `string`. You can change the type or name of the field, or add more fields.
4. Call `registerOverriddenValidators` passing it the `AdminPostProductsReq` class you created. This will override the validator defined in the core to include the new field `custom_field` among the existing fields defined in the core.
:::tip
Validators are defined in the same file as the endpoint. To find the validator you need to override, find the endpoint file under `@medusajs/medusa/dist/api/routes` and import the validator in that file.
:::
---
## Step 3: Test it Out
To test out your extended validator, build and start your Medusa backend:
```bash npm2yarn
npm run build
npx medusa develop
```
Then, send a request to the endpoint you extended passing it your custom fields. To test out the example in this guide, send an [authenticated request](https://docs.medusajs.com/api/admin#authentication) to the [Create Product endpoint](https://docs.medusajs.com/api/admin#products_postproducts) and pass it the `custom_field` body parameter. The request should execute with no errors.


@@ -0,0 +1,76 @@
---
description: "Learn what endpoints are in Medusa. Endpoints are REST APIs that allow a frontend or external system to interact with the Backend."
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Endpoints
In this document, you'll learn what endpoints are in Medusa.
## Introduction
The Medusa Backend is a web server built on top of [Express](https://expressjs.com/), a Node.js web framework. This provides developers with all the functionalities available within Express during development. One of those functionalities is endpoints.
Endpoints are REST APIs that allow a frontend or an external system to interact with the Medusa Backend to retrieve and process data, or perform business logic. Endpoints are [Express routes](https://expressjs.com/en/starter/basic-routing.html).
Each [commerce module](../../modules/overview.mdx) contains a set of endpoints specific to the functionalities that it provides. Since the core package that powers the Medusa Backend acts as an orchestrator of commerce modules and exposes their endpoints, the endpoints of each of these commerce modules are available within the Medusa Backend.
The commerce modules provide two types of endpoints: Store APIs and Admin APIs. The Store APIs are typically accessed from the storefront. For example, you can use the Store APIs to show customers available products or implement a cart and checkout flow.
The Admin APIs are typically accessed from an admin dashboard. For example, you can use the Admin APIs to allow admins to manage the store's data such as products, orders, and so on.
<DocCardList colSize={6} items={[
{
type: 'link',
href: 'https://docs.medusajs.com/api/store',
label: 'Store APIs',
customProps: {
icon: Icons['server-solid'],
description: 'Check out available Store REST APIs.'
}
},
{
type: 'link',
href: 'https://docs.medusajs.com/api/admin',
label: 'Admin APIs',
customProps: {
icon: Icons['server-solid'],
description: 'Check out available Admin REST APIs.'
}
},
]} />
---
## Custom Development
Aside from using the endpoints that commerce modules provide, developers can create their own REST APIs either directly in the Medusa Backend, in a plugin, or in a custom commerce module.
:::tip
As the core Medusa package is completely customizable, developers can also extend the functionality even further to implement GraphQL endpoints.
:::
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/endpoints/create',
label: 'Create an Endpoint',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an endpoint in Medusa.'
}
},
{
type: 'link',
href: '/development/endpoints/add-middleware',
label: 'Add a Middleware',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to add a middleware in Medusa.'
}
},
]} />


@@ -0,0 +1,164 @@
---
description: 'Learn how to create an entity in Medusa. This guide also explains how to create a repository and access and delete the entity.'
addHowToData: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# How to Create an Entity
In this document, you'll learn how you can create a custom [Entity](./overview.mdx).
## Step 1: Create the Entity
To create an entity, create a TypeScript file in `src/models`. For example, here's a `Post` entity defined in the file `src/models/post.ts`:
:::note
Entities can only be placed in the top level of the `src/models` directory. So, you can't create an entity in a subfolder.
:::
```ts title=src/models/post.ts
import {
BeforeInsert,
Column,
Entity,
PrimaryColumn,
} from "typeorm"
import { BaseEntity } from "@medusajs/medusa"
import { generateEntityId } from "@medusajs/medusa/dist/utils"
@Entity()
export class Post extends BaseEntity {
@Column({ type: "varchar" })
title: string | null
@BeforeInsert()
private beforeInsert(): void {
this.id = generateEntityId(this.id, "post")
}
}
```
This entity has one column `title` defined. However, since it extends `BaseEntity` it will also have the `id`, `created_at`, and `updated_at` columns.
Medusa's core entities all have the following format for IDs: `<PREFIX>_<RANDOM>`. For example, an order might have the ID `order_01G35WVGY4D1JCA4TPGVXPGCQM`.
To generate an ID for your entity that matches the IDs generated for Medusa's core entities, you should add a `BeforeInsert` event handler. Then, inside that handler use Medusa's utility function `generateEntityId` to generate the ID. It accepts the ID as a first parameter and the prefix as a second parameter. The `Post` entity IDs will be of the format `post_<RANDOM>`.
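The behavior can be sketched roughly as follows (a simplified illustration — the actual Medusa utility generates the random part differently, e.g. as a ULID):

```typescript
import { randomBytes } from "crypto"

// Simplified sketch of generateEntityId's behavior —
// illustrative only, not the actual Medusa implementation.
function generateEntityIdSketch(id?: string, prefix?: string): string {
  // an existing ID is kept as-is
  if (id) {
    return id
  }
  const random = randomBytes(13).toString("hex")
  return prefix ? `${prefix}_${random}` : random
}

console.log(generateEntityIdSketch(undefined, "post")) // e.g. "post_9f2c…"
console.log(generateEntityIdSketch("post_existing", "post")) // "post_existing"
```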
You can learn more about what decorators and column types you can use in [Typeorm's documentation](https://typeorm.io/entities).
### Soft-Deletable Entities
If you want the entity to also be soft deletable then it should extend `SoftDeletableEntity` instead:
```ts
import { SoftDeletableEntity } from "@medusajs/medusa"
@Entity()
export class Post extends SoftDeletableEntity {
// ...
}
```
### Adding Relations
Your entity may be related to another entity. You can showcase the relation with [Typeorm's relation decorators](https://typeorm.io/relations).
For example, you can create another entity `Author` and add a `ManyToOne` relation to it from the `Post`, and a `OneToMany` relation from the `Author` to the `Post`:
<Tabs groupId="files" isCodeTabs={true}>
<TabItem value="post" label="src/models/post.ts" default>
```ts
import {
BeforeInsert,
Column,
Entity,
JoinColumn,
ManyToOne,
} from "typeorm"
import { BaseEntity } from "@medusajs/medusa"
import { generateEntityId } from "@medusajs/medusa/dist/utils"
import { Author } from "./author"
@Entity()
export class Post extends BaseEntity {
@Column({ type: "varchar" })
title: string | null
@Column({ type: "varchar" })
author_id: string
@ManyToOne(() => Author, (author) => author.posts)
@JoinColumn({ name: "author_id" })
author: Author
@BeforeInsert()
private beforeInsert(): void {
this.id = generateEntityId(this.id, "post")
}
}
```
</TabItem>
<TabItem value="author" label="src/models/author.ts">
```ts
import { BaseEntity, generateEntityId } from "@medusajs/medusa"
import {
BeforeInsert,
Column,
Entity,
OneToMany,
} from "typeorm"
import { Post } from "./post"
@Entity()
export class Author extends BaseEntity {
@Column({ type: "varchar" })
name: string
@Column({ type: "varchar", nullable: true })
image?: string
@OneToMany(() => Post, (post) => post.author)
posts: Post[]
@BeforeInsert()
private beforeInsert(): void {
this.id = generateEntityId(this.id, "auth")
}
}
```
</TabItem>
</Tabs>
Adding these relations allows you to expand them later when retrieving records of these entities with repositories.
---
## Step 2: Create and Run Migrations
Additionally, you must create a migration for your entity. Migrations are used to update the database schema with new tables or changes to existing tables.
You can learn more about Migrations, how to create or generate them, and how to run them in the [Migration documentation](./migrations/create.md).
Make sure to run the migrations before you start using the entity.
---
## Advanced Entity Definitions
With entities, you can create relationships, index keys, and more. As Medusa uses Typeorm, you can learn about using these functionalities through [Typeorm's documentation](https://typeorm.io/entities).
---
## See Also
- [How to use repositories](./repositories.md)
- [Extend an entity](./extend-entity.md)
- [Create a plugin](../plugins/create.mdx)


@@ -0,0 +1,177 @@
---
description: 'Learn how to extend a core entity in Medusa to add custom attributes.'
addHowToData: true
---
# How to Extend an Entity
In this document, you'll learn how to extend a core entity in Medusa.
## Overview
Medusa uses entities to represent tables in the database. As you build your custom commerce application, you'll often need to add your own properties to those entities. This guide explains the necessary steps to extend core Medusa entities.
This guide will use the Product entity as an example to demonstrate the steps.
### Word of Caution about Overriding
Extending entities to add new attributes or methods shouldn't cause any issues within your commerce application. However, if you extend them to override their existing methods or attributes, you should be aware that this could have negative implications, such as unanticipated bugs, especially when you try to upgrade the core Medusa package to a newer version.
---
## Step 1: Create Entity File
In your Medusa backend, create the file `src/models/product.ts`. This file will hold your extended entity.
Note that the name of the file must be the same as the name of the original entity in the core package. Since in this guide you're overriding the Product entity, it's named `product` to match the core. If you're extending the customer entity, for example, the file should be named `customer.ts`.
---
## Step 2: Implement Extended Entity
In the file you created, you can import the entity you're extending from the core package, then create a class that extends that entity. You can add the new attributes and methods in that class.
Here's an example of extending the Product entity:
```ts title=src/models/product.ts
import { Column, Entity } from "typeorm"
import {
// alias the core entity to not cause a naming conflict
Product as MedusaProduct,
} from "@medusajs/medusa"
@Entity()
export class Product extends MedusaProduct {
@Column()
customAttribute: string
}
```
---
## (Optional) Step 3: Create a TypeScript Declaration File
If you're using JavaScript instead of TypeScript in your implementation, you can skip this step.
To ensure that TypeScript is aware of your extended entity and affects the typing of the Medusa package itself, create the file `src/index.d.ts` with the following content:
```ts title=src/index.d.ts
export declare module "@medusajs/medusa/dist/models/product" {
declare interface Product {
customAttribute: string;
}
}
```
Notice that you must pass the attributes you added to the entity into the `interface`. The attributes will be merged with the attributes defined in the core `Product` entity.
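The merging itself is standard TypeScript declaration merging, which you can see in a minimal, self-contained form:

```typescript
// Illustrative sketch of declaration merging: interfaces with the
// same name in the same scope are merged into one type.
interface Product {
  title: string
}

interface Product {
  customAttribute: string
}

// both declarations' fields are part of the merged type
const product: Product = {
  title: "Shirt",
  customAttribute: "value",
}
console.log(Object.keys(product).length) // 2
```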
---
## Step 4: Create Migration
To reflect your entity changes on the database schema, you must create a migration with those changes.
You can learn how to create or generate a migration in [this documentation](./migrations/create.md).
Here's an example of a migration of the entity extended in this guide:
```ts title=src/migration/1680013376180-changeProduct.ts
import { MigrationInterface, QueryRunner } from "typeorm"
class changeProduct1680013376180 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
"ALTER TABLE \"product\"" +
" ADD COLUMN \"customAttribute\" text"
)
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
"ALTER TABLE \"product\" DROP COLUMN \"customAttribute\""
)
}
}
export default changeProduct1680013376180
```
---
## Step 5: Use Custom Entity
For changes to take effect, you must transpile your code by running the `build` command in the root of the Medusa backend:
```bash npm2yarn
npm run build
```
Then, run the following command to migrate your changes to the database:
```bash
npx medusa migrations run
```
You should see that your migration was executed, which means your changes were reflected in the database schema.
You can now use your extended entity throughout your commerce application.
---
## Access Custom Attributes and Relations in Core Endpoints
### Request Parameters
In most cases, after you extend an entity to add new attributes, you'll likely need to pass these attributes to endpoints defined in the core. By default, this causes an error, as request parameters are validated to ensure only those that are defined are passed to the endpoint.
To allow passing your custom attribute, you'll need to [extend the validator](../endpoints/extend-validator.md) of the endpoint.
### Response Fields
After you add custom attributes, you'll notice that these attributes aren't returned as part of the response fields of core endpoints. Core endpoints have a defined set of fields and relations that can be returned by default in requests.
To change that and ensure your custom attribute is returned in your request, you can extend the allowed fields of a set of endpoints in a loader file and add your attribute into them.
For example, if you added a custom attribute in the `Product` entity and you want to ensure it's returned in all the product's store endpoints (endpoints under the prefix `/store/products`), you can create a file under the `src/loaders` directory in your Medusa backend with the following content:
```ts title=src/loaders/extend-product-fields.ts
export default async function () {
const imports = (await import(
"@medusajs/medusa/dist/api/routes/store/products/index"
)) as any
imports.allowedStoreProductsFields = [
...imports.allowedStoreProductsFields,
"customAttribute",
]
imports.defaultStoreProductsFields = [
...imports.defaultStoreProductsFields,
"customAttribute",
]
}
```
In the code snippet above, you import `@medusajs/medusa/dist/api/routes/store/products/index`, which is where all the product's store endpoints are exported. In that file, there are the following defined variables:
- `allowedStoreProductsFields`: The fields or attributes of a product that are allowed to be retrieved and returned in the product's store endpoints. This would allow you to pass your custom attribute in the `fields` request parameter of the product's store endpoints.
- `defaultStoreProductsFields`: The fields or attributes of a product that are retrieved and returned by default in the product's store endpoints.
You change the values of these variables to include the name of your custom attribute. Make sure to change `customAttribute` to the actual name of your attribute.
:::tip
Before you test out the above change, make sure to build your changes before you start the Medusa backend.
:::
You can also add custom relations by changing the following defined variables:
- `allowedStoreProductsRelations`: The relations of a product that are allowed to be retrieved and returned in the product's store endpoints. This would allow you to pass your custom relation in the `expand` request parameter of the product's store endpoints.
- `defaultStoreProductsRelations`: The relations of a product that are retrieved and returned by default in the product's store endpoints.
If you want to apply this example for a different entity or set of endpoints, you would need to change the import path `@medusajs/medusa/dist/api/routes/store/products/index` to the path of the endpoints you're targeting. You also need to change `allowedStoreProductsFields` and `defaultStoreProductsFields` to the names of the variables in that file, and the same goes for relations. Typically, these names would be of the format `(allowed|default)(Store|Admin)(Entity)(Fields|Relation)`.
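Applying the same pattern to relations, a loader might look like the following. Note that `customRelation` is a placeholder for the name of your own relation:

```ts title=src/loaders/extend-product-relations.ts
export default async function () {
  const imports = (await import(
    "@medusajs/medusa/dist/api/routes/store/products/index"
  )) as any

  // allow the relation to be passed in the `expand` request parameter
  imports.allowedStoreProductsRelations = [
    ...imports.allowedStoreProductsRelations,
    "customRelation",
  ]

  // return the relation by default in the product's store endpoints
  imports.defaultStoreProductsRelations = [
    ...imports.defaultStoreProductsRelations,
    "customRelation",
  ]
}
```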
---
## Advanced Entity Definitions
With entities, you can create relationships, index keys, and more. As Medusa uses Typeorm, you can learn about using these functionalities through [Typeorm's documentation](https://typeorm.io/).
@@ -0,0 +1,131 @@
---
description: 'Learn how to extend a core repository in Medusa to add custom methods.'
addHowToData: true
---
# How to Extend a Repository
In this document, youll learn how to extend a core repository in Medusa.
## Overview
Medusa uses Typeorms Repositories to perform operations on an entity, such as retrieving or updating the entity. Typeorm already provides these basic functionalities within a repository, but sometimes you need a custom implementation that handles the logic behind these operations differently. You might also want to add custom methods related to processing entities that arent available in the default repositories.
In this guide, youll learn how to extend a repository in the core Medusa package. This guide will use the Product repository as an example to demonstrate the steps.
### Word of Caution about Overriding
Extending repositories to add new methods shouldn't cause any issues within your commerce application. However, if you extend them to override their existing methods, you should be aware that this could have negative implications, such as unanticipated bugs, especially when you try to upgrade the core Medusa package to a newer version.
---
## Step 1: Create Repository File
In your Medusa backend, create the file `src/repositories/product.ts`. This file will hold your extended repository.
Note that the name of the file must be the same as the name of the original repository in the core package. Since in this guide youre extending the Product repository, its named `product` to match the core. If youre extending the customer repository, for example, the file should be named `customer.ts`.
---
## Step 2: Implement Extended Repository
In the file you created, you must retrieve both the repository you're extending along with its entity from the core. Youll then use the data source exported from the core package to extend the repository.
:::tip
A data source is Typeorms connection settings that allows you to connect to your database. You can learn more about it in [Typeorms documentation](https://typeorm.io/data-source).
:::
Heres an example of the implementation of the extended Product repository:
```ts title=src/repositories/product.ts
import { Product } from "@medusajs/medusa"
import {
dataSource,
} from "@medusajs/medusa/dist/loaders/database"
import {
// alias the core repository to not cause a naming conflict
ProductRepository as MedusaProductRepository,
} from "@medusajs/medusa/dist/repositories/product"
export const ProductRepository = dataSource
.getRepository(Product)
.extend({
// it is important to spread the existing repository here.
// Otherwise you will end up losing core properties
...MedusaProductRepository,
/**
* Here you can create your custom function
* For example
*/
customFunction(): void {
// TODO add custom implementation
return
},
})
export default ProductRepository
```
You first import all necessary resources from the core package: the `Product` entity, the `dataSource` instance, and the cores `ProductRepository` aliased as `MedusaProductRepository` to avoid naming conflict.
You then use the `dataSource` instance to retrieve the `Product` entitys repository and extend it using the repositorys `extend` method, which is part of Typeorms Repository API and returns your extended repository.
The `extend` method accepts an object with all the methods to add to the extended repository.
You must first add the properties of the repository youre extending, which in this case is the product repository (aliased as `MedusaProductRepository`). This ensures you dont lose core methods, which could lead to the core not working as expected. You use the spread operator (`...`) with the `MedusaProductRepository` to spread its properties.
After that, you can add your custom methods to the repository. In the example above, you add the method `customFunction`. You can use any name for your methods.
---
## Step 3: Use Your Extended Repository
You can now use your extended repository in other resources such as services or endpoints.
Heres an example of using it in an endpoint:
```ts
import ProductRepository from "./path/to/product"
import { EntityManager } from "typeorm"
export default () => {
// ...
router.get("/custom-endpoint", (req, res) => {
// ...
const productRepository: typeof ProductRepository =
req.scope.resolve(
"productRepository"
)
const manager: EntityManager = req.scope.resolve("manager")
const productRepo = manager.withRepository(
productRepository
)
productRepo.customFunction()
// ...
})
}
```
---
## Step 4: Test Your Implementation
For changes to take effect, you must transpile your code by running the `build` command in the root of the Medusa backend:
```bash npm2yarn
npm run build
```
Then, run the following command to start your backend:
```bash npm2yarn
npx medusa develop
```
You should see your custom implementation working as expected.
@@ -0,0 +1,144 @@
---
description: 'Learn how to create a migration in Medusa. This guide explains how to write and run migrations.'
addHowToData: true
---
# How to Create Migrations
In this document, youll learn how to create a [Migration](./overview.mdx) using [Typeorm](https://typeorm.io) in Medusa.
## Step 1: Create Migration File
There are two ways to create a migration file: create and write its content manually, or create and generate its content.
If you're creating a custom entity, then it's recommended to generate the migration file. However, if you're extending an entity from Medusa's core, then you should create and write the migration manually.
### Option 1: Generate Migration File
:::warning
Generating migration files for extended entities may cause unexpected errors. It's highly recommended to write them manually instead.
:::
Typeorm provides a `migration:generate` command that allows you to pass it a Typeorm [DataSource](https://typeorm.io/data-source). The `DataSource` includes database connection details, as well as the path to your custom entities.
Start by creating the file `datasource.js` in the root of your Medusa backend project with the following content:
```js
const { DataSource } = require("typeorm")
const AppDataSource = new DataSource({
type: "postgres",
port: 5432,
username: "<YOUR_DB_USERNAME>",
password: "<YOUR_DB_PASSWORD>",
database: "<YOUR_DB_NAME>",
entities: [
"dist/models/*.js",
],
migrations: [
"dist/migrations/*.js",
],
})
module.exports = {
datasource: AppDataSource,
}
```
Make sure to replace `<YOUR_DB_USERNAME>`, `<YOUR_DB_PASSWORD>`, and `<YOUR_DB_NAME>` with the necessary values for your database connection.
Then, after creating your entity, run the `build` command:
```bash npm2yarn
npm run build
```
Finally, run the following command to generate a migration for your custom entity:
```bash
npx typeorm migration:generate -d datasource.js src/migrations/PostCreate
```
This will generate the migration file in the path you specify, where `PostCreate` is just an example of the name of the migration to create. The migration file must be inside the `src/migrations` directory. When you run the build command, it will be transpiled into the `dist/migrations` directory.
The `migrations run` command can only pick up migrations under the `dist/migrations` directory on a Medusa backend. This applies to migrations created in a Medusa backend, and not in a Medusa plugin. For plugins, check out the [Plugin's Structure section](../../plugins/create.mdx).
You can now continue to [step 2](#step-2-build-files) of this guide.
### Option 2: Write Migration File
With this option, you'll use Typeorm's CLI tool to create the migration file, but you'll write the content yourself.
Run the following command in the root directory of your Medusa backend project:
```bash
npx typeorm migration:create src/migrations/PostCreate
```
This will create the migration file in the path you specify, where `PostCreate` is just an example of the name of the migration to create. The migration file must be inside the `src/migrations` directory. When you run the build command, it will be transpiled into the `dist/migrations` directory.
The `migrations run` command can only pick up migrations under the `dist/migrations` directory on a Medusa backend. This applies to migrations created in a Medusa backend, and not in a Medusa plugin. For plugins, check out the [Plugin's Structure section](../../plugins/create.mdx).
If you open the file, you'll find `up` and `down` methods. The `up` method is used to reflect the changes on the database. The `down` method is used to revert the changes, which will be executed if the `npx medusa migrations revert` command is used.
In each of the `up` and `down` methods, you can write the migration either with [SQL syntax](https://www.postgresql.org/docs/current/sql-syntax.html), or using the [migration API](https://typeorm.io/migrations#using-migration-api-to-write-migrations).
For example:
<!-- eslint-disable max-len -->
```ts
import { MigrationInterface, QueryRunner } from "typeorm"
export class AddAuthorsAndPosts1690876698954 implements MigrationInterface {
name = "AddAuthorsAndPosts1690876698954"
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`CREATE TABLE "post" ("id" character varying NOT NULL, "created_at" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "updated_at" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "title" character varying NOT NULL, "author_id" character varying NOT NULL, "authorId" character varying, CONSTRAINT "PK_be5fda3aac270b134ff9c21cdee" PRIMARY KEY ("id"))`)
await queryRunner.query(`CREATE TABLE "author" ("id" character varying NOT NULL, "created_at" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "updated_at" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "name" character varying NOT NULL, "image" character varying, CONSTRAINT "PK_5a0e79799d372fe56f2f3fa6871" PRIMARY KEY ("id"))`)
await queryRunner.query(`ALTER TABLE "post" ADD CONSTRAINT "FK_c6fb082a3114f35d0cc27c518e0" FOREIGN KEY ("authorId") REFERENCES "author"("id") ON DELETE NO ACTION ON UPDATE NO ACTION`)
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`ALTER TABLE "post" DROP CONSTRAINT "FK_c6fb082a3114f35d0cc27c518e0"`)
await queryRunner.query(`DROP TABLE "author"`)
await queryRunner.query(`DROP TABLE "post"`)
}
}
```
:::warning
If you're copying the code snippet above, make sure to not copy the class name or the `name` attribute in it. Your migration should keep its timestamp.
:::
---
## Step 2: Build Files
Before you can run the migrations, you need to run the build command to transpile the TypeScript files to JavaScript files:
```bash npm2yarn
npm run build
```
---
## Step 3: Run Migration
The last step is to run the migration with the command detailed earlier:
```bash
npx medusa migrations run
```
If you check your database now, you should see that the change defined by the migration has been applied successfully.
---
## See Also
- [Create a Plugin](../../plugins/create.mdx)
@@ -0,0 +1,89 @@
---
description: 'Learn what Migrations are in Medusa and how to run them. Migrations are used to make changes to the database schema that Medusa is linked to.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Migrations
In this document, you'll learn what Migrations are in Medusa.
## What are Migrations
Migrations are scripts that are used to make additions or changes to your database schema. In Medusa, they are essential for both when you first install your backend and for subsequent backend upgrades later on.
When you first create your Medusa backend, the database schema used must have all the tables necessary for the backend to run.
When a new Medusa version introduces changes to the database schema, you'll have to run migrations to apply them to your own database.
:::tip
Migrations are used to apply changes to the database schema. However, there are some version updates of Medusa that require updating the data in your database to fit the new schema. Those are specific to each version and you should check out the version under Upgrade Guides for details on the steps.
:::
---
## Migration Commands
### Migrate Command
Using the Medusa CLI tool, you can run migrations with the following command:
```bash
npx medusa migrations run
```
This will check for any migrations that contain changes to your database schema that aren't applied yet and run them on your backend.
### Seed Command
Seeding is the process of filling your database with data that is either essential or for testing and demo purposes. In Medusa, the `seed` command will run the migrations to your database if necessary before it seeds your database with dummy data.
You can use the following command to seed your database:
```bash
npx @medusajs/medusa-cli@latest seed -f ./data/seed.json
```
This assumes that you have the file `data/seed.json` in your Medusa backend, which is available by default. It includes the demo data to seed into your database.
### Revert Migrations
To revert the last migration you ran, run the following command:
```bash
npx @medusajs/medusa-cli@latest migrations revert
```
### Other Migrations Commands
You can learn about migration commands available in the Medusa CLI tool by referring to the [Medusa CLI reference](../../../cli/reference.mdx#migrations).
---
## Custom Development
Developers can create custom entities in the Medusa backend, a plugin, or in a module, then ensure it reflects in the database using a migration.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/entities/migrations/create',
label: 'Create a Migration',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create migrations in Medusa.'
}
},
{
type: 'link',
href: '/development/entities/create',
label: 'Create an Entity',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an entity in Medusa.'
}
},
]} />
@@ -0,0 +1,127 @@
---
description: 'Learn what entities are in Medusa. There are entities defined in the Medusa backend, and developers can create custom entities.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
import LearningPath from '@site/src/components/LearningPath';
# Entities
In this document, you'll learn what Entities are in Medusa.
## What are Entities
Entities in Medusa represent tables in the database as classes. An example of this would be the `Order` entity, which represents the `order` table in the database. Entities provide a uniform way of defining and interacting with data retrieved from the database.
Aside from the entities in the Medusa core package, you can also create custom entities to use in your Medusa backend. Custom entities are TypeScript or JavaScript files located in the `src/models` directory of your Medusa backend. You then transpile these entities to be used during the backend's runtime using the `build` command, which moves them to the `dist/models` directory.
Entities are based on [Typeorms Entities](https://typeorm.io/entities) and use Typeorm decorators. Each entity also requires a [repository](./repositories.md) to be created. A repository provides basic methods to access and manipulate the entity's data.
<LearningPath pathName="entity-and-api" />
---
## Base Entities
All entities must extend either the `BaseEntity` or `SoftDeletableEntity` classes. The `BaseEntity` class holds common columns including the `id`, `created_at`, and `updated_at` columns.
The `SoftDeletableEntity` class extends the `BaseEntity` class and adds another column `deleted_at`. If an entity can be soft deleted, meaning that a row in it can appear to the user as deleted but still be available in the database, it should extend `SoftDeletableEntity`.
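For example, a minimal custom entity that supports soft deletion could be sketched as follows. The `Post` entity and its `title` column are illustrative, and concerns such as ID generation are omitted for brevity:

```ts title=src/models/post.ts
import { Column, Entity } from "typeorm"
import { SoftDeletableEntity } from "@medusajs/medusa"

@Entity()
export class Post extends SoftDeletableEntity {
  // inherits id, created_at, updated_at, and deleted_at columns
  @Column()
  title: string
}
```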
---
## metadata Attribute
Most entities in Medusa have a `metadata` attribute. This attribute is an object that can be used to store custom data related to that entity. In the database, this attribute is stored as a [JSON Binary (JSONB)](https://www.postgresql.org/docs/current/datatype-json.html#JSON-CONTAINMENT) column. On retrieval, the attribute is parsed into an object.
Some example use cases for the `metadata` attribute include:
- Store an external ID of an entity related to a third-party integration.
- Store product customization such as personalization options.
### Add and Update Metadata
You can add or update metadata entities either through the REST APIs or through create and update methods in the entity's respective [service](../services/overview.mdx).
In the [admin REST APIs](https://docs.medusajs.com/api/admin), you'll find that in create or update requests of some entities you can also set the `metadata`.
In services, there are typically `create` or `update` methods that allow you to set or update the metadata.
If you want to add a property to the `metadata` object or update a property in the `metadata` object, you can pass the `metadata` object with the properties you want to add or update in it. For example:
```json
{
// other data
"metadata": {
"is_b2b": true
}
}
```
If you want to remove a property from the `metadata` object, you can pass the `metadata` object with the property you want to delete. The property should have an empty string value. For example:
```json
{
// other data
"metadata": {
"is_b2b": "" // this deletes the `is_b2b` property from `metadata`
}
}
```
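For example, the following sketch updates a product's `metadata` through the product service in custom backend code. It assumes `productService` has been resolved from the dependency container and `productId` holds a valid product ID:

```ts
// add or update a property in the metadata object
await productService.update(productId, {
  metadata: {
    is_b2b: true,
  },
})

// passing an empty string value removes the property again
await productService.update(productId, {
  metadata: {
    is_b2b: "",
  },
})
```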
---
## Custom Development
Developers can create custom entities in the Medusa backend, a plugin, or in a module, then ensure it reflects in the database using a migration.
<DocCardList colSize={4} items={[
{
type: 'link',
href: '/development/entities/create',
label: 'Create an Entity',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an entity in Medusa.'
}
},
{
type: 'link',
href: '/development/entities/repositories',
label: 'Create a Repository',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to use a repository in Medusa.'
}
},
{
type: 'link',
href: '/development/entities/migrations/create',
label: 'Create a Migration',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create migrations in Medusa.'
}
},
]} />
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/entities/extend-entity',
label: 'Extend an Entity',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to extend a core Medusa entity.'
}
},
{
type: 'link',
href: '/development/entities/extend-repository',
label: 'Extend a Repository',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to extend a core Medusa repository.'
}
},
]} />
@@ -0,0 +1,339 @@
---
description: "In this document, you'll learn what repositories are, how to use them within your Medusa backend, and what are some of their common methods."
---
# Repositories
In this document, you'll learn what repositories are, how to use them within your Medusa backend, and what are some of their common methods.
## Overview
Repositories provide generic helper methods for entities. For example, you can use the `find` method to retrieve all entities with pagination, or `findOne` to retrieve a single entity record.
Repositories are [Typeorm repositories](https://typeorm.io/working-with-repository), so you can refer to Typeorm's documentation on all available methods.
By default, you don't need to create a repository for your custom entities. You can retrieve the default repository of an entity using the Entity Manager. You should only create a repository if you want to implement custom methods in it.
---
## Access Default Repository
If you haven't created a custom repository, you can access the default repository using the [Entity Manager](https://typeorm.io/entity-manager-api). The Entity Manager is registered in the [dependency container](../fundamentals/dependency-injection.md) under the name `manager`. So, you can [resolve it](../fundamentals/dependency-injection.md#resolve-resources) and use its method `getRepository` to retrieve the repository of an entity.
For example, to retrieve the default repository of an entity in a service:
```ts title=src/services/post.ts
import { Post } from "../models/post"
class PostService extends TransactionBaseService {
// ...
async list(): Promise<Post[]> {
const postRepo = this.activeManager_.getRepository(
Post
)
return await postRepo.find()
}
// ...
}
```
Another example is retrieving the default repository of an entity in an endpoint:
```ts title=src/api/index.ts
import { Post } from "../models/post"
import { EntityManager } from "typeorm"
// ...
export default () => {
// ...
storeRouter.get("/posts", async (req, res) => {
const manager: EntityManager = req.scope.resolve("manager")
const postRepo = manager.getRepository(Post)
return res.json({
posts: await postRepo.find(),
})
})
// ...
}
```
---
## Create Custom Repository
If you want to add custom methods or override existing methods in a repository, you can create a custom repository.
Repositories are created under the `src/repositories` directory of your Medusa backend project. The file name is the name of the repository without `Repository`.
For example, to create a repository for a `Post` entity, create the file `src/repositories/post.ts` with the following content:
```ts title=src/repositories/post.ts
import { Post } from "../models/post"
import {
dataSource,
} from "@medusajs/medusa/dist/loaders/database"
export const PostRepository = dataSource
.getRepository(Post)
.extend({
customMethod(): void {
// TODO add custom implementation
return
},
})
export default PostRepository
```
The repository is created using the `getRepository` method of the data source exported from the core package in Medusa. This method accepts the entity as a parameter. You then use the `extend` method to add a new method `customMethod`.
You can learn about available Repository methods that you can override in [Typeorm's documentation](https://typeorm.io/repository-api).
:::tip
A data source is Typeorms connection settings that allows you to connect to your database. You can learn more about it in [Typeorms documentation](https://typeorm.io/data-source).
:::
---
## Using Custom Repositories in Other Resources
### Endpoints
To access a custom repository within an endpoint, use the `req.scope.resolve` method. For example:
```ts title=src/api/index.ts
import { PostRepository } from "../repositories/post"
import { EntityManager } from "typeorm"
// ...
export default () => {
// ...
storeRouter.get("/posts", async (req, res) => {
const postRepository: typeof PostRepository =
req.scope.resolve("postRepository")
const manager: EntityManager = req.scope.resolve("manager")
const postRepo = manager.withRepository(postRepository)
return res.json({
posts: await postRepo.find(),
})
})
// ...
}
```
You can learn more about endpoints [here](../endpoints/overview.mdx).
### Services and Subscribers
As custom repositories are registered in the [dependency container](../fundamentals/dependency-injection.md#dependency-container-and-injection), they can be accessed through dependency injection in the constructor of a service or a subscriber.
For example:
```ts title=src/services/post.ts
import { PostRepository } from "../repositories/post"
class PostService extends TransactionBaseService {
// ...
protected postRepository_: typeof PostRepository
constructor(container) {
super(container)
// ...
this.postRepository_ = container.postRepository
}
async list(): Promise<Post[]> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
return await postRepo.find()
}
// ...
}
```
You can learn more about services [here](../services/overview.mdx).
### Other Resources
Resources that have access to the dependency container can access repositories just like any other resources. You can learn more about the dependency container and dependency injection in [this documentation](../fundamentals/dependency-injection.md).
---
## Common Methods
This section covers some common methods and use cases you'll use with repositories. You can refer to [Typeorm's documentation](https://typeorm.io/repository-api) for full details on available methods.
### Retrieving a List of Records
To retrieve a list of records of an entity, use the `find` method:
```ts
const posts = await postRepository.find()
```
You can also filter the retrieved items by passing an object of type [FindOption](https://typeorm.io/find-options) as a first parameter:
```ts
const posts = await postRepository.find({
where: {
id: "1",
},
})
```
In addition, you can pass `skip` and `take` properties to the object for pagination purposes. `skip`'s value is a number that indicates how many items to skip before retrieving the results, and `take` indicates how many items to return:
```ts
const posts = await postRepository.find({
skip: 0,
take: 20,
})
```
To expand relations and retrieve them as part of each item in the result, you can pass the `relations` property to the parameter object:
```ts
const posts = await postRepository.find({
relations: ["authors"],
})
```
Medusa provides a utility method `buildQuery` that allows you to easily format the object to pass to the `find` method. `buildQuery` accepts two parameters:
1. The first parameter is an object whose keys are the attributes of the entity, and their values are the value to filter by.
2. The second parameter includes the options related to pagination (such as `skip` and `take`), the relations to expand, and fields to select in each returned item.
For example:
```ts
import { buildQuery } from "@medusajs/medusa"
// ...
const selector = {
id: "1",
}
const config = {
skip: 0,
take: 20,
relations: ["authors"],
select: ["title"],
}
const query = buildQuery(selector, config)
const posts = await postRepository.find(query)
```
### Retrieving a List of Records with Count
You can retrieve a list of records along with their count using the `findAndCount` method:
```ts
const [posts, count] = await postRepository.findAndCount()
```
This method also accepts the same options object as a parameter similar to the [find](#retrieving-a-list-of-records) method.
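For example, the following sketch paginates the records while retrieving the total count. The `author_id` filter assumes your entity defines such an attribute:

```ts
const [posts, count] = await postRepository.findAndCount({
  where: {
    author_id: "1",
  },
  skip: 0,
  take: 20,
})
```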
### Retrieving a Single Record
You can retrieve one record of an entity using the `findOne` method:
```ts
const post = await postRepository.findOne({
where: {
id: "1",
},
})
```
If the record does not exist, `null` will be returned instead.
You can also pass the method an options object similar to the [find](#retrieving-a-list-of-records) method to expand relations or specify what fields to select.
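Because `findOne` can return `null`, it's good practice to guard against missing records before using the result. For example, the following sketch throws a not-found error. `MedusaError` is assumed here to be available from the `@medusajs/utils` package:

```ts
import { MedusaError } from "@medusajs/utils"

// ...

const post = await postRepository.findOne({
  where: {
    id: "1",
  },
})

if (!post) {
  // surface a typed error that endpoints can translate to a 404 response
  throw new MedusaError(
    MedusaError.Types.NOT_FOUND,
    "Post with id 1 was not found"
  )
}
```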
### Create Record
To create a new record of an entity, use the `create` and `save` methods of the repository:
```ts
const post = postRepository.create()
post.title = data.title
post.author_id = data.author_id
const result = await postRepository.save(post)
```
The `save` method is what actually persists the created record in the database.
### Update Record
To update a record of an entity, use the `save` method of the repository:
```ts
// const data = {
// title: ''
// }
// const post = await postRepository.findOne({
// where: {
// id: '1'
// }
// })
Object.assign(post, data)
const updatedPost = await postRepository.save(post)
```
### Delete a Record
To delete a record of an entity, use the `remove` method of the repository:
```ts
const post = await postRepository.findOne({
where: {
id: "1",
},
})
await postRepository.remove([post])
```
This method accepts an array of records to delete.
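Since `remove` accepts an array, you can delete several records in one call. For example, the following sketch removes all posts matching a filter, where the `author_id` attribute is assumed for illustration:

```ts
const posts = await postRepository.find({
  where: {
    author_id: "1",
  },
})

await postRepository.remove(posts)
```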
### Soft Delete a Record
If an entity extends the `SoftDeletableEntity` class, it can be soft deleted. This means that the entity won't be fully deleted from the database, but it can't be retrieved as a non-deleted entity would.
To soft-delete a record of an entity, use the `softRemove` method:
```ts
const post = await postRepository.findOne({
where: {
id: "1",
},
})
await postRepository.softRemove([post])
```
You can later retrieve that entity by passing the `withDeleted` option to methods like `find`, `findAndCount`, or `findOne`:
```ts
const posts = await postRepository.find({
withDeleted: true,
})
```


@@ -0,0 +1,260 @@
---
description: 'In this document, youll learn how to create an events module, then test and publish the module as an NPM package.'
addHowToData: true
---
# Create Events Module
In this document, youll learn how to create an events module.
## Overview
Medusa provides ready-made modules for events, including the local and Redis modules. If you prefer another technology for managing events, you can build a module locally and use it in your Medusa backend. You can also publish it to NPM and reuse it across multiple Medusa backend instances.
In this document, youll learn how to build your own Medusa events module, mainly focusing on creating the event bus service and the available methods you need to implement within your module.
---
## (Optional) Step 0: Prepare Module Directory
Before you start implementing your module, it's recommended to prepare the directory or project holding your custom implementation.
You can refer to the [Project Preparation step in the Create Module documentation](../modules/create.mdx#optional-step-0-project-preparation) to learn how to do that.
---
## Step 1: Create the Service
Create the file `services/event-bus-custom.ts` which will hold your event bus service. Note that the name of the file is recommended to be in the format `event-bus-<service_name>` where `<service_name>` is the name of the service youre integrating. For example, `event-bus-redis`.
Add the following content to the file:
```ts title=services/event-bus-custom.ts
import { EmitData, EventBusTypes } from "@medusajs/types"
import { AbstractEventBusModuleService } from "@medusajs/utils"
class CustomEventBus extends AbstractEventBusModuleService {
async emit<T>(
eventName: string,
data: T,
options: Record<string, unknown>
): Promise<void>;
async emit<T>(data: EmitData<T>[]): Promise<void>;
async emit(
eventName: unknown,
data?: unknown,
options?: unknown
): Promise<void> {
throw new Error("Method not implemented.")
}
}
export default CustomEventBus
```
This creates the class `CustomEventBus` that implements the `AbstractEventBusModuleService` class imported from `@medusajs/utils`. Feel free to rename the class to whats relevant for your event bus service.
In the class you must implement the `emit` method. You can optionally implement the `subscribe` and `unsubscribe` methods.
---
## Step 2: Implement Methods
### Note About the eventToSubscribersMap Property
The `AbstractEventBusModuleService` implements two methods for handling subscription: `subscribe` and `unsubscribe`. In these methods, the subscribed handler methods are managed within a class property `eventToSubscribersMap`, which is a JavaScript Map. The map's keys are the event names, whereas the value of each key is an array of subscribed handler methods.
In your custom implementation, you can use this property to manage the subscribed handler methods. For example, you can get the subscribers of a method using the `get` method of the map:
```ts
const eventSubscribers =
this.eventToSubscribersMap.get(eventName) || []
```
Alternatively, you can implement custom logic for the `subscribe` and `unsubscribe` events, which is explained later in this guide.
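As a standalone sketch of this idea (the names mirror the abstract class, but this is an illustration rather than Medusa's actual implementation), a Map-based registry can manage subscribers like so:

```ts
// Standalone sketch of a Map-based subscriber registry.
type Subscriber = (data: unknown) => Promise<void> | void

const eventToSubscribersMap = new Map<string, Subscriber[]>()

function addSubscriber(eventName: string, subscriber: Subscriber): void {
  // append the handler to the event's existing subscribers
  const existing = eventToSubscribersMap.get(eventName) || []
  eventToSubscribersMap.set(eventName, [...existing, subscriber])
}

function removeSubscriber(eventName: string, subscriber: Subscriber): void {
  // drop the handler from the event's subscribers
  const existing = eventToSubscribersMap.get(eventName) || []
  eventToSubscribersMap.set(
    eventName,
    existing.filter((sub) => sub !== subscriber)
  )
}
```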
### constructor
The `constructor` method of a service allows you to prepare any third-party client or service necessary to be used in other methods. It also allows you to get access to the modules options which are typically defined in `medusa-config.js`, and to other services and resources in the Medusa backend using [dependency injection](../fundamentals/dependency-injection.md).
Heres an example of how you can use the `constructor` to store the options of your module:
<!-- eslint-disable prefer-rest-params -->
```ts title=services/event-bus-custom.ts
class CustomEventBus extends AbstractEventBusModuleService {
protected readonly moduleOptions: Record<string, any>
constructor({
// inject resources from the Medusa backend
// for example, you can inject the logger
logger,
}, options) {
super(...arguments)
this.moduleOptions = options
}
// ...
}
```
### emit
The `emit` method is used to push an event from Medusa into your messaging system. Typically, the subscribers to that event would then pick up the message and execute their asynchronous tasks.
The `emit` method has two different signatures:
1. The first signature accepts three parameters. The first parameter, `eventName`, is a required string indicating the name of the event to trigger. The second parameter, `data`, is the optional data to send to subscribers of that event. The third parameter, `options`, is an optional object that can be used to pass options specific to the event bus.
2. The second signature accepts one parameter, which is an array of objects having three properties: `eventName`, `data`, and `options`. These are the same as the parameters that can be passed in the first signature. This signature allows emitting more than one event.
The `options` parameter depends on the event bus you're integrating. For example, the Redis event bus accepts the following options:
```ts
type JobData<T> = {
eventName: string
data: T
completedSubscriberIds?: string[] | undefined
}
```
You can implement your method in a way that supports both signatures by checking the type of the first input. For example:
```ts title=services/event-bus-custom.ts
class CustomEventBus extends AbstractEventBusModuleService {
// ...
async emit<T>(
eventName: string,
data: T,
options: Record<string, unknown>
): Promise<void>;
async emit<T>(data: EmitData<T>[]): Promise<void>;
async emit<T>(
eventOrData: string | EmitData<T>[],
data?: T,
options: Record<string, unknown> = {}
): Promise<void> {
const isBulkEmit = Array.isArray(eventOrData)
// emit event
}
}
```
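The `Array.isArray` check can be used to normalize both signatures into a single array of event descriptors before publishing. Here is a standalone sketch of that normalization (illustrative names, not Medusa's internal implementation):

```ts
// Standalone sketch: normalize both emit signatures into an array of
// { eventName, data, options } descriptors before publishing.
type EventDescriptor<T> = {
  eventName: string
  data?: T
  options?: Record<string, unknown>
}

function normalizeEmitInput<T>(
  eventOrData: string | EventDescriptor<T>[],
  data?: T,
  options: Record<string, unknown> = {}
): EventDescriptor<T>[] {
  if (Array.isArray(eventOrData)) {
    // bulk signature: already an array of descriptors
    return eventOrData
  }
  // single-event signature: wrap into a one-element array
  return [{ eventName: eventOrData, data, options }]
}
```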
### (optional) subscribe
As mentioned earlier, this method is already implemented in the `AbstractEventBusModuleService` class. This section explains how you can implement your custom subscribe logic if necessary.
The `subscribe` method attaches a handler method to the specified event, which is run when the event is triggered. It is typically used inside a subscriber class.
The `subscribe` method accepts three parameters:
1. The first parameter `eventName` is a required string. It indicates which event the handler method is subscribing to.
2. The second parameter `subscriber` is a required function that performs an action when the event is triggered.
3. The third parameter `context` is an optional object that has the property `subscriberId`. Subscriber IDs are useful to differentiate between handler methods when retrying a failed method. They're also useful for unsubscribing an event handler. Note that you must implement the mechanism for assigning IDs to subscribers when you override the `subscribe` method.
The implementation of this method depends on the service youre using for the event bus:
```ts title=services/event-bus-custom.ts
class CustomEventBus extends AbstractEventBusModuleService {
// ...
subscribe(
eventName: string | symbol,
subscriber: EventBusTypes.Subscriber,
context?: EventBusTypes.SubscriberContext): this {
// TODO implement subscription
}
}
```
### (optional) unsubscribe
As mentioned earlier, this method is already implemented in the `AbstractEventBusModuleService` class. This section explains how you can implement your custom unsubscribe logic if necessary.
The `unsubscribe` method is used to unsubscribe a handler method from an event.
The `unsubscribe` method accepts three parameters:
1. The first parameter `eventName` is a required string. It indicates which event the handler method is unsubscribing from.
2. The second parameter `subscriber` is a required function that was initially subscribed to the event.
3. The third parameter `context` is an optional object that has the property `subscriberId`. It can be used to specify the ID of the subscriber to unsubscribe.
The implementation of this method depends on the service youre using for the event bus:
```ts title=services/event-bus-custom.ts
class CustomEventBus extends AbstractEventBusModuleService {
// ...
unsubscribe(
eventName: string | symbol,
subscriber: EventBusTypes.Subscriber,
context?: EventBusTypes.SubscriberContext): this {
// TODO implement unsubscription
}
}
```
---
## Step 3: Export the Service
After implementing the event bus service, you must export it so that the Medusa backend can use it.
Create the file `index.ts` with the following content:
```ts title=index.ts
import { ModuleExports } from "@medusajs/modules-sdk"
import { CustomEventBus } from "./services"
const service = CustomEventBus
const moduleDefinition: ModuleExports = {
service,
}
export default moduleDefinition
```
This exports a module definition, which requires at least a `service`. If you named your service something other than `CustomEventBus`, make sure to replace it with that.
You can learn more about what other properties you can export in your module definition in the [Create a Module documentation](../modules/create.mdx#step-2-export-module).
---
## Step 4: Test your Module
You can test your module in the Medusa backend by referencing it in the configurations.
To do that, add the module to the exported configuration in `medusa-config.js` as follows:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
eventBus: {
resolve: "path/to/custom-module",
options: {
// any necessary options
},
},
},
}
```
Make sure to replace the `path/to/custom-module` with a relative path from your Medusa backend to your module. You can learn more about module reference in the [Create Module documentation](../modules/create.mdx#module-reference).
You can also add any necessary options to the module.
Then, to test the module, run the Medusa backend which also runs your module:
```bash npm2yarn
npx medusa develop
```
---
## (Optional) Step 5: Publish your Module
You can publish your events module to NPM. This can be useful if you want to reuse your module across Medusa backend instances, or want to allow other developers to use it.
You can refer to the [Publish Module documentation](../modules/publish.md) to learn how to publish your module.


@@ -0,0 +1,134 @@
---
description: 'Learn how to create a subscriber in Medusa. You can use subscribers to implement functionalities like sending an order confirmation email.'
addHowToData: true
---
# How to Create a Subscriber
In this document, youll learn how to create a [Subscriber](./subscribers.mdx) in Medusa that listens to events to perform an action.
## Implementation
A subscriber is a TypeScript or JavaScript file that is created under `src/subscribers`. Its file name, by convention, should be the class name of the subscriber without the word `Subscriber`. For example, if the subscriber is `HelloSubscriber`, the file name should be `hello.ts`.
After creating the file under `src/subscribers`, in the constructor of your subscriber, listen to events using `eventBusService.subscribe`, where `eventBusService` is a service injected into your subscriber's constructor.
The `eventBusService.subscribe` method receives the name of the event as its first parameter, and a method in your subscriber that handles the event as its second parameter.
For example, here is the `OrderNotifierSubscriber` class created in `src/subscribers/order-notifier.ts`:
```ts title=src/subscribers/order-notifier.ts
class OrderNotifierSubscriber {
constructor({ eventBusService }) {
eventBusService.subscribe("order.placed", this.handleOrder)
}
handleOrder = async (data) => {
console.log("New Order: " + data.id)
}
}
export default OrderNotifierSubscriber
```
This subscriber registers the method `handleOrder` as one of the handlers of the `order.placed` event. The method `handleOrder` will be executed every time an order is placed. It receives the order ID in the `data` parameter. You can then use the orders details to perform any kind of task you need.
:::tip
For the `order.placed` event, the `data` object contains only the ID of the order, not the rest of the order's data. You can retrieve the order's information using the `orderService`.
:::
### Subscriber ID
The `subscribe` method of the `eventBusService` accepts a third optional parameter which is a context object. This object has a property `subscriberId` with its value being a string. This ID is useful when there is more than one handler method attached to a single event or if you have multiple Medusa backends running. This allows the events bus service to differentiate between handler methods when retrying a failed one.
If a subscriber ID is not passed on subscription, all handler methods are run again. This can lead to data inconsistencies or general unwanted behavior in your system. On the other hand, if you want all handler methods to run again when one of them fails, you can omit passing a subscriber ID.
An example of using the subscribe method with the third parameter:
```ts
eventBusService.subscribe("order.placed", this.handleOrder, {
subscriberId: "my-unique-subscriber",
})
```
---
## Retrieve Medusa Configurations
Within your subscriber, you may need to access the Medusa configuration exported from `medusa-config.js`. To do that, you can access `configModule` using dependency injection.
For example:
```ts
import { ConfigModule, EventBusService } from "@medusajs/medusa"
type InjectedDependencies = {
eventBusService: EventBusService
configModule: ConfigModule
}
class OrderNotifierSubscriber {
protected readonly configModule_: ConfigModule
constructor({
eventBusService,
configModule,
}: InjectedDependencies) {
this.configModule_ = configModule
eventBusService.subscribe("order.placed", this.handleOrder)
}
// ...
}
export default OrderNotifierSubscriber
```
---
## Using Services in Subscribers
You can access any service through the dependencies injected to your subscribers constructor.
For example:
```ts title=src/subscribers/order-notifier.ts
class OrderNotifierSubscriber {
constructor({ productService, eventBusService }) {
this.productService = productService
eventBusService.subscribe(
"order.placed",
this.handleOrder
)
}
// ...
}
```
You can then use `this.productService` anywhere in your subscribers methods. For example:
```ts title=src/subscribers/order-notifier.ts
class OrderNotifierSubscriber {
// ...
handleOrder = async (data) => {
// ...
const products = await this.productService.list()
}
}
```
:::note
When using attributes defined in the subscriber, such as the `productService` in the example above, you must use an arrow function to declare the method. Otherwise, the attribute will be undefined when used.
:::
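The difference can be seen in this standalone sketch: an arrow function captures `this` when it is defined, while a regular method resolves `this` at call time, which fails once the event bus invokes the handler detached from the instance:

```ts
// Standalone sketch showing why handlers should be arrow functions.
class Example {
  protected value = "available"

  // arrow function: `this` is bound to the instance at definition time
  arrowHandler = () => this.value

  // regular method: `this` depends on how the method is called
  methodHandler() {
    return this.value
  }
}

const instance = new Example()

// an event bus stores and invokes the handler detached from the instance:
const detachedHandler = instance.arrowHandler
detachedHandler() // still returns "available" — `this` was captured
// a detached `instance.methodHandler`, by contrast, loses `this` and
// can no longer read the instance's attributes at runtime
```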
---
## See Also
- [Example: send order confirmation email](../../modules/orders/backend/send-order-confirmation.md)
- [Example: send registration confirmation email](../../modules/customers/backend/send-confirmation.md)
- [Create a Plugin](../plugins/create.mdx)

File diff suppressed because it is too large


@@ -0,0 +1,131 @@
---
description: 'Learn how the events system is implemented in Medusa. It is built on a publish-subscribe architecture. The Medusa core publishes events when certain actions take place.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Events
In this document, youll learn what events are and why theyre useful in Medusa.
## Overview
Events are used in Medusa to inform different parts of the commerce ecosystem that an event has occurred. For example, when an order is placed, the `order.placed` event is triggered, which informs notification services like SendGrid to send a confirmation email to the customer.
:::tip
If you want to implement order confirmation emails, you can check this [step-by-step guide](../../modules/orders/backend/send-order-confirmation.md).
:::
The events system in Medusa is built on a publish/subscribe architecture. The Medusa core publishes an event when an action takes place, and modules, plugins, or other forms of customizations can subscribe to that event. [Subscribers](./subscribers.mdx) can then perform a task asynchronously when the event is triggered.
Although the core implements the main logic behind the events system, youll need to use an event module that takes care of the publish/subscribe functionality such as subscribing and emitting events. Medusa provides modules that you can use both for development and production, including Redis and Local modules.
---
## Database Transactions
Transactions in Medusa ensure Atomicity, Consistency, Isolation, and Durability (ACID) guarantees for operations in the Medusa core.
In many cases, services typically update resources in the database and emit an event within a transactional operation. To ensure that these events don't cause data inconsistencies (for example, a plugin subscribes to an event to contact a third-party service, but the transaction fails) the concept of a staged job is introduced.
Instead of events being processed immediately, they're stored in the database as a staged job until they're ready. In other words, until the transaction has succeeded.
This rather complex logic is abstracted away from the consumers of the `EventBusService`, but here's an example of the flow when an API request is made:
- API request starts.
- Transaction is initiated.
- Service layer performs some logic.
- Events are emitted and stored in the database for eventual processing.
- Transaction is committed.
- API request ends.
- Events in the database become visible.
To pull staged jobs from the database, a separate enqueuer polls the database every three seconds to discover new visible jobs. These jobs are then added to the queue and processed by the subscribed handlers.
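The staged-job flow described above can be sketched in isolation like this (a conceptual illustration, not Medusa's actual implementation):

```ts
// Conceptual sketch: events are staged during a transaction and only
// become visible to the enqueuer after the transaction commits.
type StagedJob = { eventName: string; data: unknown }

class StagedJobStore {
  private staged: StagedJob[] = []
  private visible: StagedJob[] = []

  stage(job: StagedJob): void {
    // events emitted inside the transaction are held here
    this.staged.push(job)
  }

  commit(): void {
    // transaction succeeded: staged jobs become visible
    this.visible.push(...this.staged)
    this.staged = []
  }

  rollback(): void {
    // transaction failed: staged jobs are discarded, no subscribers run
    this.staged = []
  }

  poll(): StagedJob[] {
    // the enqueuer periodically pulls visible jobs onto the queue
    const jobs = this.visible
    this.visible = []
    return jobs
  }
}
```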
---
## Emitting Events
You can emit events in Medusa using the `EventBusService`. For example:
```ts
this.eventBusService.emit("custom-event", {
// attach any data to the event
})
```
You can also emit more than one event:
```ts
this.eventBusService.emit([
{
eventName: "custom-event-1",
data: {
// attach any data to the event
},
},
{
eventName: "custom-event-2",
data: {
// attach any data to the event
},
},
])
```
---
## Available Modules
Medusa's default starter project comes with the local event module (`@medusajs/event-bus-local`). For production environments, it's recommended to install and use the Redis event module package (`@medusajs/event-bus-redis`).
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/events/modules/redis',
label: 'Redis',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to install Redis events module in Medusa.'
}
},
{
type: 'link',
href: '/development/events/modules/local',
label: 'Local',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to install local events module in Medusa.'
}
},
]} />
---
## Custom Development
Developers can create custom event modules, allowing them to integrate any third-party services or logic to handle this functionality. Developers can also create and use subscribers to handle events in Medusa.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/events/create-module',
label: 'Create an Event Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an event module.'
}
},
{
type: 'link',
href: '/development/events/create-subscriber',
label: 'Create a Subscriber',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a subscriber.'
}
},
]} />


@@ -0,0 +1,68 @@
---
description: 'In this document, youll learn about the local events module and how you can install it in your Medusa backend.'
addHowToData: true
---
# Local Events Module
In this document, youll learn about the local events module and how you can install it in your Medusa backend.
## Overview
Medusas modular architecture allows developers to extend or completely replace the logic used for events. You can create a custom module, or you can use the modules Medusa provides.
One of these modules is the local events module. This module allows you to utilize Node EventEmitter for the events system in Medusa. The Node EventEmitter is limited to a single process environment. This module is useful for development and testing, but its recommended to use the [Redis events module](./redis.md) in production.
This document will guide you through installing the local events module.
---
## Prerequisites
Its assumed you already have a Medusa backend installed. If not, you can learn how to install it by following [this guide](../../backend/install.mdx).
---
## Step 1: Install the Module
In the root directory of your Medusa backend, install the local events module with the following command:
```bash npm2yarn
npm install @medusajs/event-bus-local
```
---
## Step 2: Add Configuration
In `medusa-config.js`, add the following to the exported object:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
eventBus: {
resolve: "@medusajs/event-bus-local",
},
},
}
```
This registers the local events module as the main events service to use. This module does not require any options.
---
## Step 3: Test Module
To test the module, run the following command to start the Medusa backend:
```bash npm2yarn
npx medusa develop
```
If the module was installed successfully, you should see the following message in the logs:
```bash noCopy noReport
Local Event Bus installed. This is not recommended for production.
```


@@ -0,0 +1,95 @@
---
description: 'In this document, youll learn about the Redis events module and how you can install it in your Medusa backend.'
addHowToData: true
---
# Redis Events Module
In this document, youll learn about the Redis events module and how you can install it in your Medusa backend.
## Overview
Medusas modular architecture allows developers to extend or completely replace the logic used for events. You can create a custom module, or you can use the modules Medusa provides.
One of these modules is the Redis module. This module allows you to utilize Redis for the event bus functionality. When installed, Medusa's events system is powered by BullMQ and `ioredis`. BullMQ is responsible for the message queue and worker, and `ioredis` is the underlying Redis client that BullMQ connects to for events storage.
This document will guide you through installing the Redis module.
---
## Prerequisites
### Medusa Backend
Its assumed you already have a Medusa backend installed. If not, you can learn how to install it by following [this guide](../../backend/install.mdx).
### Redis
You must have Redis installed and configured in your Medusa backend. You can learn how to install Redis in [their documentation](https://redis.io/docs/getting-started/installation/).
---
## Step 1: Install the Module
In the root directory of your Medusa backend, install the Redis events module with the following command:
```bash npm2yarn
npm install @medusajs/event-bus-redis
```
---
## Step 2: Add Environment Variable
The Redis events module requires a connection URL to Redis as part of its options. If you dont already have an environment variable set for a Redis URL, make sure to add one:
```bash
EVENTS_REDIS_URL=<YOUR_REDIS_URL>
```
Where `<YOUR_REDIS_URL>` is a connection URL to your Redis instance.
---
## Step 3: Add Configuration
In `medusa-config.js`, add the following to the exported object:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
eventBus: {
resolve: "@medusajs/event-bus-redis",
options: {
redisUrl: process.env.EVENTS_REDIS_URL,
},
},
},
}
```
This registers the Redis events module as the main event bus service to use. In the options, you pass `redisUrl` with the value being the environment variable you set. This is the only required option.
Other available options include:
- `queueName`: a string indicating the name of the BullMQ queue. By default, its `events-queue`.
- `queueOptions`: an object containing options for the BullMQ queue. You can learn about available options in [BullMQs documentation](https://api.docs.bullmq.io/interfaces/QueueOptions.html). By default, its an empty object.
- `redisOptions`: an object containing options for the Redis instance. You can learn about available options in [ioredis's documentation](https://luin.github.io/ioredis/index.html#RedisOptions). By default, it's an empty object.
---
## Step 4: Test Module
To test the module, run the following command to start the Medusa backend:
```bash npm2yarn
npx medusa develop
```
If the module was installed successfully, you should see the following message in the logs:
```bash noCopy noReport
Connection to Redis in module 'event-bus-redis' established
```


@@ -0,0 +1,45 @@
---
description: 'Learn what subscribers are in Medusa. Subscribers are used to listen to triggered events to perform an action.'
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Subscribers
In this document, you'll learn what Subscribers are in Medusa.
## What are Subscribers
Subscribers register handlers for events and allow you to perform an action when those events occur. For example, if you want to send your customer an email when they place an order, you can listen to the `order.placed` event and send the email when the event is emitted.
Medusa natively includes subscribers that handle different events. However, you can also create your own custom subscribers.
Custom subscribers are TypeScript or JavaScript files in your project's `src/subscribers` directory. Files here should export classes, which will be treated as subscribers by Medusa. By convention, the class name should end with `Subscriber` and the file name should be the kebab-case version of the class name without `Subscriber`. For example, the `WelcomeSubscriber` class is in the file `src/subscribers/welcome.ts`.
Whenever an event is emitted, the subscribers registered handler method is executed. The handler method receives as a parameter an object that holds data related to the event. For example, if an order is placed the `order.placed` event will be emitted and all the handlers will receive the order id in the parameter object.
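As a minimal sketch of this (the message format below is illustrative), a handler for the `order.placed` event receives an object holding the order's ID:

```ts
// Minimal sketch of an event handler — for `order.placed`, the data object
// contains the order's ID, which can be used to retrieve the full details.
const handleOrderPlaced = async (data: { id: string }): Promise<string> => {
  return `New order placed: ${data.id}`
}
```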
### Example Use Cases
Subscribers are useful in many use cases, including:
- Send a confirmation email to the customer when they place an order by subscribing to the `order.placed` event.
- Automatically assign new customers to a customer group by subscribing to the `customer.created` event.
- Handle custom events that you emit.
---
## Custom Development
Developers can create custom subscribers in the Medusa backend, a plugin, or in a module.
<DocCard item={{
type: 'link',
href: '/development/events/create-subscriber',
label: 'Create a Subscriber',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a subscriber in Medusa.'
}
}}
/>


@@ -0,0 +1,28 @@
---
description: "Learn what feature flags are in Medusa. Feature flags are used in Medusa to guard beta features that aren't ready for live and production applications."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Feature Flags
In this document, you'll learn what feature flags are in Medusa.
## Introduction
Feature flags are used in Medusa to guard beta features that arent ready for live and production applications. This allows the Medusa team to keep publishing releases more frequently, while also working on necessary future features behind the scenes. To use these beta features, you must enable their feature flags.
If a feature is guarded by a flag, entities, migrations, endpoints, and other resources associated with that feature are guarded by that flag as well. So, these resources will only be available to use in Medusa if you have enabled the associated feature flag.
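The guarding described above can be sketched as follows. This is an illustration of the concept; the `FlagRouter` shape and endpoint names are assumptions for this example, not Medusa's internal API:

```ts
// Illustrative sketch of flag-guarded registration: a resource is only
// wired up when its feature flag is enabled.
type FlagRouter = {
  isFeatureEnabled: (key: string) => boolean
}

function registerEndpoints(flags: FlagRouter): string[] {
  const endpoints = ["/store/products"]

  if (flags.isFeatureEnabled("tax_inclusive_pricing")) {
    // only registered when the flag is on
    endpoints.push("/store/tax-inclusive-prices")
  }

  return endpoints
}
```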
You can view a list of available feature flags that you can toggle in [the Beta Features documentation](../../beta.md).
<DocCard item={{
type: 'link',
href: '/development/feature-flags/toggle',
label: 'Toggle Feature Flags',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to enable or disable a feature flag.'
}
}} />


@@ -0,0 +1,83 @@
---
description: 'Learn how to toggle feature flags in Medusa. This guide explains the steps required to toggle a feature flag.'
addHowToData: true
---
# How to Toggle Feature Flags
In this document, you'll learn how to toggle feature flags.
:::info
If a feature flag is enabled/disabled by default, you don't need to manually enable/disable it. Only set the feature flag's value if it's different from the default.
:::
## Enable Feature Flags
:::caution
Features guarded by feature flags are experimental and beta features. Enable them with caution.
:::
There are two ways to enable a feature flag.
### Method One: Using Environment Variables
You can enable a feature by setting the value of its environment variable to `true`. You can find [the name of a feature flags environment variable in the Beta Features documentation page](../../beta.md).
For example, to enable the Tax-Inclusive Pricing beta feature, add the following environment variable:
```bash
MEDUSA_FF_TAX_INCLUSIVE_PRICING=true
```
### Method Two: Using Backend Configurations
You can enable a feature by using the backend configurations in `medusa-config.js`. You can find [the feature flag's key in the Beta Features documentation page](../../beta.md). It is defined under the property `key` in the exported object.
For example, to enable the Tax-Inclusive Pricing beta feature, add the following to the exported object in `medusa-config.js`:
```js title=medusa-config.js
module.exports = {
featureFlags: {
tax_inclusive_pricing: true,
},
// ...
}
```
### Note About Precedence
The environment variable's value has higher precedence than the backend configurations. So, if you use both methods on your backend, the value of the environment variable is used.
For example, if the value of the environment variable is set to `false`, but the value of the feature flag in the backend configurations is set to `true`, the feature flag will take the value of the environment variable and will be disabled.
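This precedence rule can be expressed as a small standalone sketch (an illustration of the rule, not Medusa's internal implementation):

```ts
// Illustrative precedence resolution: a set environment variable always
// wins over the value in medusa-config.js.
function resolveFlag(
  envValue: string | undefined,
  configValue: boolean | undefined
): boolean {
  if (envValue !== undefined) {
    // the environment variable is set, so it takes precedence
    return envValue === "true"
  }
  // fall back to the backend configuration, defaulting to disabled
  return configValue ?? false
}
```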
### Running Migrations
As feature flags generally include adding new entities or making changes to entities in the database, you must run the migrations after enabling the feature flag:
```bash
npx medusa migrations run
```
:::info
You can learn more about migrations in this documentation.
:::
---
## Disable Feature Flags
Disabling feature flags follows the same process as enabling the feature flags. All you have to do is change the value in the environment variables or the backend configurations to `false`.
Once you disable a feature flag, all endpoints, entities, services, or other related classes and functionalities are disabled.
### Revert Migrations
If you had the feature flag previously enabled, and you want to disable this feature flag completely, you might need to revert the migrations you ran when you enabled it.
You can follow [this documentation to learn how to revert the last migration you ran](../../cli/reference.mdx#migrations).


@@ -0,0 +1,472 @@
---
description: "Learn how to create a file service in a Medusa backend or a plugin."
---
# How to Create a File Service
In this document, youll learn how to create a file service in Medusa.
## Overview
In this guide, youll learn about the steps to implement a file service and the methods youre required to implement in a file service. You can implement the file service within the Medusa backend codebase or in a plugin.
The file service youll be creating in this guide will be a local file service that allows you to upload files into your Medusa backends codebase. This is to give you a realistic example of the implementation of a file service. Youre free to implement the file service as required for your case.
---
## Prerequisites
### (Optional) Multer Types Package
If youre using TypeScript, as instructed by this guide, you should install the Multer types package to resolve errors within your file service types.
To do that, run the following command in the directory of your Medusa backend or plugin:
```bash npm2yarn
npm install @types/multer
```
---
## Step 1: Create the File Service Class
A file service class is defined in a TypeScript or JavaScript file thats created in the `src/services` directory. The class must extend the `AbstractFileService` class imported from the `@medusajs/medusa` package.
Based on services naming conventions, the files name should be the slug version of the file services name without `service`, and the classs name should be the pascal case of the file services name followed by `Service`.
For example, if youre creating a local file service, the file name would be `local-file.ts`, whereas the class name would be `LocalFileService`.
:::tip
You can learn more about services and their naming convention in [this documentation](../services/overview.mdx).
:::
For example, create the file `src/services/local-file.ts` with the following content:
```ts title=src/services/local-file.ts
import { AbstractFileService } from "@medusajs/medusa"
import {
DeleteFileType,
FileServiceGetUploadStreamResult,
FileServiceUploadResult,
GetUploadedFileType,
UploadStreamDescriptorType,
} from "@medusajs/types"
class LocalFileService extends AbstractFileService {
async upload(
fileData: Express.Multer.File
): Promise<FileServiceUploadResult> {
throw new Error("Method not implemented.")
}
async uploadProtected(
fileData: Express.Multer.File
): Promise<FileServiceUploadResult> {
throw new Error("Method not implemented.")
}
async delete(fileData: DeleteFileType): Promise<void> {
throw new Error("Method not implemented.")
}
async getUploadStreamDescriptor(
fileData: UploadStreamDescriptorType
): Promise<FileServiceGetUploadStreamResult> {
throw new Error("Method not implemented.")
}
async getDownloadStream(
fileData: GetUploadedFileType
): Promise<NodeJS.ReadableStream> {
throw new Error("Method not implemented.")
}
async getPresignedDownloadUrl(
fileData: GetUploadedFileType
): Promise<string> {
throw new Error("Method not implemented.")
}
}
export default LocalFileService
```
This creates the service `LocalFileService` which, at the moment, only provides placeholder implementations of the methods defined in the abstract class `AbstractFileService`. Notice that the methods' signatures require importing types from the `@medusajs/types` package.
:::tip
If you get errors regarding the types not being available in the `@medusajs/types` package, make sure that the latest version of the package is installed:
```bash npm2yarn
npm install @medusajs/types@latest
```
:::
### Using a Constructor
You can use a constructor to access services and resources registered in the dependency container, to define any necessary clients if youre integrating a third-party storage service, and to access plugin options if your file service is defined in a plugin.
For example, the local services constructor could be useful to prepare the local upload directory:
```ts title=src/services/local-file.ts
// ...
import * as fs from "fs"
class LocalFileService extends AbstractFileService {
// can also be replaced by an environment variable
// or a plugin option
protected serverUrl = "http://localhost:9000"
protected publicPath = "uploads"
protected protectedPath = "protected-uploads"
constructor(container) {
super(container)
// for public uploads
if (!fs.existsSync(this.publicPath)) {
fs.mkdirSync(this.publicPath)
}
// for protected uploads
if (!fs.existsSync(this.protectedPath)) {
fs.mkdirSync(this.protectedPath)
}
}
// ...
}
```
Another example showcasing how to access resources using dependency injection:
<!-- eslint-disable prefer-rest-params -->
```ts title=src/services/local-file.ts
type InjectedDependencies = {
logger: Logger
}
class LocalFileService extends AbstractFileService {
// ...
protected logger_: Logger
constructor({ logger }: InjectedDependencies) {
super(...arguments)
this.logger_ = logger
// ...
}
// ...
}
```
You can access the plugin options in the second parameter passed to the constructor:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
protected serverUrl = "http://localhost:9000"
// ...
constructor(
container,
pluginOptions
) {
super(container)
if (pluginOptions?.serverUrl) {
this.serverUrl = pluginOptions?.serverUrl
}
// ...
}
// ...
}
```
---
## Step 2: Implement Required Methods
In this section, youll learn about the required methods to implement in the file service.
### upload
This method is used to upload a file to the Medusa backend. You must handle the upload logic within this method.
This method accepts one parameter, which is a [multer file object](http://expressjs.com/en/resources/middleware/multer.html#file-information). The file is uploaded to a temporary directory by default. Among the files details, you can access the files path in the `path` property of the file object.
So, for example, you can create a read stream to the files content if necessary using the `fs` library:
```ts
fs.createReadStream(file.path)
```
Where `file` is the parameter passed to the `upload` method.
The method is expected to return an object that has the following properties:
- `url`: a string indicating the full accessible URL to the file.
- `key`: a string indicating the key used to reference the uploaded file. This is useful when retrieving the file later with other methods like the [getDownloadStream](#getdownloadstream).
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async upload(
fileData: Express.Multer.File
): Promise<FileServiceUploadResult> {
const filePath =
`${this.publicPath}/${fileData.originalname}`
fs.copyFileSync(fileData.path, filePath)
return {
url: `${this.serverUrl}/${filePath}`,
key: filePath,
}
}
// ...
}
```
:::tip
This example does not account for duplicate names to maintain simplicity in this guide. So, an uploaded file can replace another existing file that has the same name.
:::
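If you wanted to guard against such collisions, one possible approach is to derive a unique name before copying the file. The helper below is an illustrative sketch (the helper name and naming scheme are not part of Medusa's file service contract):

```typescript
import * as path from "path"

// Derive a collision-resistant file name by appending a timestamp.
// (Illustrative approach — not part of Medusa's file service contract.)
function uniqueFileName(
  originalName: string,
  now: number = Date.now()
): string {
  const ext = path.extname(originalName) // e.g. ".png"
  const base = path.basename(originalName, ext) // e.g. "photo"
  return `${base}-${now}${ext}`
}
```

For example, `uniqueFileName("photo.png", 1700000000000)` returns `"photo-1700000000000.png"`, which you could then use when building `filePath` in the `upload` method.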
### uploadProtected
This method is used to upload a file to the Medusa backend, but to a protected storage. Typically, this would be used to store files that shouldnt be accessible by using the files URL or should only be accessible by authenticated users.
You must handle the upload logic and the file permissions or private storage configuration within this method.
This method accepts one parameter, which is a [multer file object](http://expressjs.com/en/resources/middleware/multer.html#file-information). The file is uploaded to a temporary directory by default. Among the files details, you can access the files path in the `path` property of the file object.
So, for example, you can create a read stream to the files content if necessary using the `fs` library:
```ts
fs.createReadStream(file.path)
```
Where `file` is the parameter passed to the `uploadProtected` method.
The method is expected to return an object that has the following properties:
- `url`: a string indicating the full accessible URL to the file.
- `key`: a string indicating the key used to reference the uploaded file. This is useful when retrieving the file later with other methods like the [getDownloadStream](#getdownloadstream).
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async uploadProtected(
fileData: Express.Multer.File
): Promise<FileServiceUploadResult> {
const filePath =
`${this.protectedPath}/${fileData.originalname}`
fs.copyFileSync(fileData.path, filePath)
return {
url: `${this.serverUrl}/${filePath}`,
key: filePath,
}
}
// ...
}
```
### delete
This method is used to delete a file from storage. You must handle the delete logic within this method.
This method accepts one parameter, which is an object that holds a `fileKey` property. The value of this property is a string that acts as an identifier of the file to delete. Typically, this would be the returned `key` property by other methods, such as the `upload` method.
For example, for local file service, it could be the file name.
You can also pass custom properties to the object passed as a first parameter.
This method is not expected to return anything.
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async delete(
fileData: DeleteFileType
): Promise<void> {
fs.rmSync(fileData.fileKey)
}
// ...
}
```
### getUploadStreamDescriptor
This method is used to upload a file using a write stream. This is useful if the file is being written through a stream rather than uploaded to the temporary directory.
The method accepts one parameter, which is an object that has the following properties:
- `name`: a string indicating the name of the file.
- `ext`: an optional string indicating the extension of the file.
- `isPrivate`: an optional boolean value indicating whether the file should be uploaded to a private bucket or location. By convention, the default value of this property should be `true`.
The method is expected to return an object having the following properties:
- `writeStream`: a write stream object.
- `promise`: a promise that resolves once the writing process is done, finishing the upload. How this is handled depends on the type of file service youre creating.
- `url`: a string indicating the URL of the file once its uploaded.
- `fileKey`: a string indicating the identifier of your file in the storage. For example, for a local file service this can be the file name.
You can also return custom properties within the object.
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async getUploadStreamDescriptor({
name,
ext,
isPrivate = true,
}: UploadStreamDescriptorType
): Promise<FileServiceGetUploadStreamResult> {
const filePath = `${isPrivate ?
this.protectedPath : this.publicPath
}/${name}.${ext}`
const writeStream = fs.createWriteStream(filePath)
return {
writeStream,
promise: Promise.resolve(),
url: `${this.serverUrl}/${filePath}`,
fileKey: filePath,
}
}
// ...
}
```
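To illustrate how a caller might consume the returned descriptor, here's a self-contained sketch using Node's `fs` module. The file path and contents are illustrative, and it writes to the system's temporary directory rather than the service's upload directories:

```typescript
import * as fs from "fs"
import * as os from "os"
import * as path from "path"

// Stand-ins for the descriptor's writeStream and promise properties.
const filePath = path.join(os.tmpdir(), "example-upload.txt")
const writeStream = fs.createWriteStream(filePath)

// The promise resolves once the stream finishes flushing to disk,
// mirroring the role of the descriptor's `promise` property.
const promise = new Promise<void>((resolve, reject) => {
  writeStream.on("finish", () => resolve())
  writeStream.on("error", reject)
})

// A caller writes data through the stream, then ends it.
writeStream.end("file contents")
```

After awaiting `promise`, the file at `filePath` contains the written data.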
### getDownloadStream
This method is used to read a file using a read stream, typically for download.
The method accepts as a parameter an object having the following properties:
- `fileKey`: a string indicating the identifier of the file. This is typically the value returned by other methods like [upload](#upload).
- `isPrivate`: an optional boolean value indicating whether the file should be downloaded from the private bucket or storage location. By convention, this should default to `true`.
The method is expected to return a readable stream.
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async getDownloadStream({
fileKey,
isPrivate = true,
}: GetUploadedFileType
): Promise<NodeJS.ReadableStream> {
const filePath = `${isPrivate ?
this.protectedPath : this.publicPath
}/${fileKey}`
const readStream = fs.createReadStream(filePath)
return readStream
}
// ...
}
```
### getPresignedDownloadUrl
The `getPresignedDownloadUrl` method is used to retrieve a download URL of the file. For some file services, such as S3, a presigned URL indicates a temporary URL to get access to a file.
If your file service doesnt perform or offer a similar functionality, you can just return the URL to download the file.
The method accepts as a parameter an object having the following properties:
- `fileKey`: a string indicating the identifier of the file. This is typically the value returned by other methods like [upload](#upload).
- `isPrivate`: an optional boolean value indicating whether the file should be downloaded from the private bucket or storage location. By convention, this should default to `true`.
The method is expected to return a string, being the URL of the file.
An example implementation of this method for the local file service:
```ts title=src/services/local-file.ts
class LocalFileService extends AbstractFileService {
async getPresignedDownloadUrl({
fileKey,
isPrivate = true,
}: GetUploadedFileType
): Promise<string> {
const filePath = `${isPrivate ?
this.protectedPath : this.publicPath
}/${fileKey}`
return `${this.serverUrl}/${filePath}`
}
// ...
}
```
---
## Step 3: Run Build Command
In the directory of the Medusa backend, run the `build` command to transpile the files in the `src` directory into the `dist` directory:
```bash npm2yarn
npm run build
```
---
## Test it Out
:::note
This section explains how to test out your implementation if the file service was created in the Medusa backend codebase. You can refer to the [plugin documentation](../plugins/create.mdx#test-your-plugin) on how to test a plugin.
:::
Run your backend to test it out:
```bash npm2yarn
npx medusa develop
```
Then, try uploading a file, for example, using the [Upload File endpoint](https://docs.medusajs.com/api/admin#uploads_postuploads). The file should be uploaded based on the logic youve implemented.
### (Optional) Accessing the File
:::note
This step is only useful if you're implementing a local file service.
:::
Since the file is uploaded to a local directory `uploads`, you need to configure a static route in express that allows accessing the files within the `uploads` directory.
To do that, create the file `src/api/index.ts` with the following content:
```ts
import express from "express"
export default () => {
const app = express.Router()
app.use("/uploads", express.static("uploads"))
return app
}
```
---
## See Also
- [How to create a plugin](../plugins/create.mdx)
- [How to publish a plugin](../plugins/publish.mdx)

View File

@@ -0,0 +1,34 @@
---
description: "Learn what a file service is in Medusa. A file service defines how files are stored in the Medusa Backend."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# File Service
In this document, youll learn what a file service is in Medusa.
## Introduction
A file service defines how files are stored in the Medusa Backend. Those files include product images and files used to import or export data.
The Medusa Backend includes a default file service that acts as a placeholder, but does not actually perform any storage functionalities. So, you must either install one of the [existing file-service plugins](../../plugins/file-service/index.mdx), such as [MinIO](../../plugins/file-service/minio.md) or [S3](../../plugins/file-service/s3.mdx), or create your own file service if you want to utilize storage functionalities.
A file service is a TypeScript or JavaScript class that extends the `AbstractFileService` class from the core `@medusajs/medusa` package. By extending this class, the file service must implement the necessary methods that take care of general upload and download functionalities. The Medusa Backend then uses these methods when necessary, for example, when a product image is uploaded.
---
## Custom Development
Developers can create a custom file service with the desired functionality directly within the Medusa Core, in a plugin, or in a module.
<DocCard item={{
type: 'link',
href: '/development/file-service/create-file-service',
label: 'Create a File Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a file service.',
}
}} />

View File

@@ -0,0 +1,27 @@
---
description: "Learn about Medusa's architecture and get a general overview of how all different tools work together."
---
# Medusa Architecture Overview
In this document, you'll get an overview of Medusa's architecture to better understand how all resources and tools work together.
## Architecture Overview
Medusa's core package `@medusajs/medusa` is a Node.js backend built on top of [Express](https://expressjs.com/). It combines all the [Commerce Modules](../../modules/overview.mdx) that Medusa provides. Commerce Modules are ecommerce features that can be used as building blocks in an ecommerce ecosystem. Product is an example of a Commerce Module.
![Medusa Core Architecture](https://res.cloudinary.com/dza7lstvk/image/upload/v1677607702/Medusa%20Docs/Diagrams/medusa-architecture-3_e385zk.jpg)
The backend connects to a database, such as [PostgreSQL](https://www.postgresql.org/), to store the ecommerce stores data. The tables in that database are represented by [Entities](../entities/overview.mdx), built on top of [Typeorm](https://typeorm.io/). Entities can also be reflected in the database using [Migrations](../entities/migrations/overview.mdx).
The retrieval, manipulation, and other utility methods related to that entity are created inside a [Service](../services/overview.mdx). Services are TypeScript or JavaScript classes that, along with other resources, can be accessed throughout the Medusa backend through [dependency injection](./dependency-injection.md).
The backend does not have any tightly-coupled frontend. Instead, it exposes [Endpoints](../endpoints/overview.mdx) which are REST APIs that frontends such as an admin or a storefront can use to communicate with the backend. Endpoints are [Express routes](https://expressjs.com/en/guide/routing.html).
Medusa also uses an [Events Architecture](../events/index.mdx) to trigger and handle events. Events are triggered when a specific action occurs, such as when an order is placed. To manage this events system, Medusa connects to a service that implements a pub/sub model, such as [Redis](https://redis.io/).
Events can be handled using [Subscribers](../events/subscribers.mdx). Subscribers are TypeScript or JavaScript classes that add their methods as handlers for specific events. These handler methods are only executed when an event is triggered.
You can create any of the resources in the backends architecture, such as entities, endpoints, services, and more, as part of your custom development without directly modifying the backend itself. The Medusa backend uses [loaders](../loaders/overview.mdx) to load the backends resources, as well as your custom resources and resources in [Plugins](../plugins/overview.mdx).
You can package your customizations into Plugins to reuse them in different Medusa backends or publish them for others to use. You can also install existing plugins into your Medusa backend.

View File

@@ -0,0 +1,735 @@
---
description: 'Learn what the dependency container is and how to use it in Medusa. Learn also what dependency injection is, and what the resources registered and their names are.'
---
# Dependency Container and Injection
In this document, youll learn what the dependency container is and how you can use it in Medusa with dependency injection.
## Introduction
### What is Dependency Injection
Dependency Injection is the act of delivering the required resources to a class. These resources are the classs dependencies. This is usually done by passing (or injecting) the dependencies in the constructor of the class.
Generally, all resources are registered in a container. Then, whenever a class depends on one of these resources, the system retrieves the resources from the container and injects them into the classs constructor.
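As a minimal, framework-agnostic sketch of the idea (illustrative only — this is not Medusa's actual container implementation):

```typescript
// Minimal sketch of constructor-based dependency injection.
// (Illustrative only — names and the container shape are examples.)
class Logger {
  log(message: string): string {
    return `[info] ${message}`
  }
}

class OrderService {
  protected logger: Logger

  // the dependency is injected through the constructor
  constructor({ logger }: { logger: Logger }) {
    this.logger = logger
  }

  placeOrder(id: string): string {
    return this.logger.log(`placed order ${id}`)
  }
}

// resources are registered in a container under a name...
const container = new Map<string, unknown>()
container.set("logger", new Logger())

// ...and resolved from the container when constructing dependent classes
const orderService = new OrderService({
  logger: container.get("logger") as Logger,
})
```

Here, `OrderService` never constructs its own `Logger`; the container supplies it, which is what makes the dependency easy to swap or mock.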
### Medusas Dependency Container
Medusa uses a dependency container to register essential resources of the backend. You can then access these resources in classes and endpoints using the dependency container.
For example, if you create a custom service, you can access any other service registered in Medusa in your services constructor. That includes Medusas core services, services defined in plugins, or other services that you create on your backend.
You can load more than services in your Medusa backend. You can load the Entity Manager, logger instance, and much more.
### MedusaContainer
To manage dependency injections, Medusa uses [Awilix](https://github.com/jeffijoe/awilix). Awilix is an NPM package that implements dependency injection in Node.js projects.
When you run the Medusa backend, a container of the type `MedusaContainer` is created. This type extends the [AwilixContainer](https://github.com/jeffijoe/awilix#the-awilixcontainer-object) object.
The backend then registers all important resources in the container, which makes them accessible in classes and endpoints.
---
## Registered Resources
The Medusa backend scans the core Medusa package, plugins, and your files in the `dist` directory and registers the following resources:
:::tip
The Lifetime column indicates the lifetime of a service. Other resources that aren't services don't have a lifetime, which is indicated with the `-` in the column. You can learn about what a lifetime is in the [Create a Service](../services/create-service.mdx) documentation.
:::
<table class="reference-table table-col-4">
<thead>
<tr>
<th>
Resource
</th>
<th>
Description
</th>
<th>
Registration Name
</th>
<th>
Lifetime
</th>
</tr>
</thead>
<tbody>
<tr>
<td>
Configurations
</td>
<td>
The configurations that are exported from `medusa-config.js`.
</td>
<td>
`configModule`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Services
</td>
<td>
Services that extend the `TransactionBaseService` class.
</td>
<td>
Each service is registered under its camel-case name. For example, the `ProductService` is registered as `productService`.
</td>
<td>
Core services by default have the `SINGLETON` lifetime. However, some have a different lifetime which is indicated in this table. Custom services, including services in plugins, by default have the `SCOPED` lifetime, unless defined differently within the custom service.
</td>
</tr>
<tr>
<td>
Entity Manager
</td>
<td>
An instance of Typeorms Entity Manager.
</td>
<td>
`manager`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Logger
</td>
<td>
An instance of Medusa CLIs logger. You can use it to log messages to the terminal.
</td>
<td>
`logger`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Single Payment Processor
</td>
<td>
An instance of every payment processor that extends the `AbstractPaymentService` or the `AbstractPaymentProcessor` classes.
</td>
<td>
Every payment processor is registered under two names:
- Its camel-case name. For example, the `StripeProviderService` is registered as `stripeProviderService`.
- `pp_` followed by its identifier. For example, the `StripeProviderService` is registered as `pp_stripe`.
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the payment processor service.
</td>
</tr>
<tr>
<td>
All Payment Processors
</td>
<td>
An array of all payment processors that extend the `AbstractPaymentService` or `AbstractPaymentProcessor` class.
</td>
<td>
`paymentProviders`
</td>
<td>
`paymentProviders` is `TRANSIENT`, and each item in it is `SINGLETON`.
</td>
</tr>
<tr>
<td>
Single Fulfillment Provider
</td>
<td>
An instance of every fulfillment provider that extends the `FulfillmentService` class.
</td>
<td>
Every fulfillment provider is registered under two names:
- Its camel-case name. For example, the `WebshipperFulfillmentService` is registered as `webshipperFulfillmentService`.
- `fp_` followed by its identifier. For example, the `WebshipperFulfillmentService` is registered as `fp_webshipper`.
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the fulfillment provider service.
</td>
</tr>
<tr>
<td>
All Fulfillment Providers
</td>
<td>
An array of all fulfillment providers that extend the `FulfillmentService` class.
</td>
<td>
`fulfillmentProviders`
</td>
<td>
`fulfillmentProviders` is `TRANSIENT`, and each item in it is `SINGLETON`.
</td>
</tr>
<tr>
<td>
Single Notification Provider
</td>
<td>
An instance of every notification provider that extends the `AbstractNotificationService` or the `BaseNotificationService` classes.
</td>
<td>
Every notification provider is registered under two names:
- Its camel-case name. For example, the `SendGridService` is registered as `sendGridService`.
- `noti_` followed by its identifier. For example, the `SendGridService` is registered as `noti_sendgrid`.
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the notification provider service.
</td>
</tr>
<tr>
<td>
All Notification Providers
</td>
<td>
An array of all notification providers that extend the `AbstractNotificationService` or the `BaseNotificationService` classes.
</td>
<td>
`notificationProviders`
</td>
<td>
`notificationProviders` is `TRANSIENT`, and each item in it is `SINGLETON`.
</td>
</tr>
<tr>
<td>
File Service
</td>
<td>
An instance of the class that extends the `FileService` class, if any.
</td>
<td>
The file service is registered under two names:
- Its camel-case name. For example, the `MinioService` is registered as `minioService`.
- `fileService`
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the file service.
</td>
</tr>
<tr>
<td>
Search Service
</td>
<td>
An instance of the class that extends the `AbstractSearchService` or the `SearchService` classes, if any.
</td>
<td>
The search service is registered under two names:
- Its camel-case name. For example, the `AlgoliaService` is registered as `algoliaService`.
- `searchService`
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the search service.
</td>
</tr>
<tr>
<td>
Single Tax Provider
</td>
<td>
An instance of every tax provider that extends the `AbstractTaxService` class.
</td>
<td>
The tax provider is registered under two names:
- Its camel-case name.
- `tp_` followed by its identifier.
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the tax provider service.
</td>
</tr>
<tr>
<td>
All Tax Providers
</td>
<td>
An array of every tax provider that extends the `AbstractTaxService` class.
</td>
<td>
`taxProviders`
</td>
<td>
`taxProviders` is `TRANSIENT`, and each item in it is `SINGLETON`.
</td>
</tr>
<tr>
<td>
Oauth Services
</td>
<td>
An instance of every service that extends the `OauthService` class.
</td>
<td>
Each Oauth Service is registered under its camel-case name followed by `Oauth`.
</td>
<td>
By default, it's `SINGLETON` unless defined differently within the Oauth service.
</td>
</tr>
<tr>
<td>
Feature Flag Router
</td>
<td>
An instance of the `FlagRouter`. This can be used to list feature flags, set a feature flags value, or check if theyre enabled.
</td>
<td>
`featureFlagRouter`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Redis
</td>
<td>
An instance of the Redis client. If Redis is not configured, a fake Redis client is registered.
</td>
<td>
`redisClient`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Single Entity
</td>
<td>
An instance of every entity.
</td>
<td>
Each entity is registered under its camel-case name followed by `Model`. For example, the `CustomerGroup` entity is stored under `customerGroupModel`.
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
All Entities
</td>
<td>
An array of all database entities that is passed to Typeorm when connecting to the database.
</td>
<td>
`db_entities`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Repositories
</td>
<td>
An instance of each repository.
</td>
<td>
Each repository is registered under its camel-case name. For example, `CustomerGroupRepository` is stored under `customerGroupRepository`.
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Single Batch Job Strategy
</td>
<td>
An instance of every class extending the `AbstractBatchJobStrategy` class.
</td>
<td>
Each batch job strategy is registered under three names:
- Its camel-case name. For example, `ProductImportStrategy` is registered as `productImportStrategy`.
- `batch_` followed by its identifier. For example, the `ProductImportStrategy` is registered under `batch_product-import-strategy`.
- `batchType_` followed by its batch job type. For example, the `ProductImportStrategy` is registered under `batchType_product-import`.
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
All Batch Job Strategies
</td>
<td>
An array of all classes extending the `AbstractBatchJobStrategy` abstract class.
</td>
<td>
`batchJobStrategies`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Tax Calculation Strategy
</td>
<td>
An instance of the class implementing the `ITaxCalculationStrategy` interface.
</td>
<td>
`taxCalculationStrategy`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Cart Completion Strategy
</td>
<td>
An instance of the class extending the `AbstractCartCompletionStrategy` class.
</td>
<td>
`cartCompletionStrategy`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Price Selection Strategy
</td>
<td>
An instance of the class implementing the `IPriceSelectionStrategy` interface.
</td>
<td>
`priceSelectionStrategy`
</td>
<td>
\-
</td>
</tr>
<tr>
<td>
Strategies
</td>
<td>
An instance of strategies that arent of the specific types mentioned above and that are under the `strategies` directory.
</td>
<td>
Its camel-case name.
</td>
<td>
\-
</td>
</tr>
</tbody>
</table>
---
## Resolve Resources
This section covers how to resolve resources from the dependency container to use them in endpoints and classes in general.
### In Endpoints
To resolve resources, such as services, in endpoints, use the `req.scope.resolve` method. The method receives the registration name of the resource as a parameter.
For example:
```ts
const logger = req.scope.resolve("logger")
```
Please note that in endpoints some resources, such as repositories, are not available. Refer to the [repositories](../entities/repositories.md) documentation to learn how you can load them.
### In Classes
In classes such as services, strategies, or subscribers, you can load resources in the constructor function using dependency injection. The constructor receives an object of dependencies as a first parameter. Each dependency in the object should use the registration name of the resource that should be injected to the class.
For example:
```ts
import { OrderService } from "@medusajs/medusa"
class OrderSubscriber {
protected orderService: OrderService
constructor({ orderService }) {
this.orderService = orderService
}
}
```
---
## See Also
- [Create services](../services/create-service.mdx)
- [Create subscribers](../events/create-subscriber.md)

View File

@@ -0,0 +1,194 @@
---
description: 'Learn how to perform local development in the Medusa monorepo. This includes how to use the dev CLI tool and perform unit, integration, and plugin tests.'
---
# Local Development of Medusa Backend and Monorepo
In this document, youll learn how to customize Medusas core and run tests.
## Overview
As an open-source platform, Medusas core can be completely customized.
Whether you want to implement something differently, introduce a new feature as part of Medusas core or any of the other packages, or contribute to Medusa, this guide helps you learn how to run Medusas integration tests, as well as test your own Medusa core in a local backend.
### Medusa Repository Overview
[Medusas repository on GitHub](https://github.com/medusajs/medusa) includes all packages related to Medusa under the [`packages` directory](https://github.com/medusajs/medusa/tree/master/packages). This includes the [core Medusa package](https://github.com/medusajs/medusa/tree/master/packages/medusa), the [JS Client](https://github.com/medusajs/medusa/tree/master/packages/medusa-js), the CLI tools, and much more.
All the packages are part of a [Yarn workspace](https://classic.yarnpkg.com/lang/en/docs/workspaces/). So, when you run a command in the root of the project, such as `yarn build`, it goes through all registered packages in the workspace under the `packages` directory and runs the `build` command in each of those packages.
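Conceptually, the workspace root's `package.json` declares the `packages` directory as a workspace, roughly like this (a simplified sketch, not the repository's exact file):

```json
{
  "private": true,
  "workspaces": ["packages/*"],
  "scripts": {
    "build": "yarn workspaces run build"
  }
}
```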
---
## Prerequisites
### Yarn
When using and developing with the Medusa repository, its highly recommended that you use [Yarn](https://yarnpkg.com/getting-started/install) to avoid any errors or issues.
### Fork and Clone Medusas Repository
To customize Medusas core or contribute to it, you must first [fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo) and then [clone](https://docs.github.com/en/get-started/quickstart/fork-a-repo#cloning-your-forked-repository) the [GitHub repository](https://github.com/medusajs/medusa).
### Install Dependencies and Build Packages
In the directory of the forked GitHub repository, run the following commands to install necessary dependencies then build all packages in the repository:
```bash
yarn install
yarn build
```
### Medusas Dev CLI tool
Medusa provides a CLI tool to be used for development. This tool facilitates testing your local installation and changes to Medusa's core without having to publish the changes to NPM.
To install Medusas dev CLI tool:
```bash npm2yarn
npm install medusa-dev-cli -g
```
### Set the Location of the Medusa Repository
In the directory of your forked GitHub repository, run the following command to specify to the dev CLI tool the location of your Medusa repository:
```bash
medusa-dev --set-path-to-repo `pwd`
```
---
## Run Tests in the Repository
In this section, youll learn how to run tests in the Medusa repository. This is helpful after you customize any of Medusas packages and want to make sure everything is still working as expected.
### Set System Environment Variables
Before you can run the tests, make sure you set the following system environment variables:
```bash
DB_HOST=<YOUR_DB_HOST>
DB_USERNAME=<YOUR_DB_USERNAME>
DB_PASSWORD=<YOUR_PASSWORD>
```
### Run Unit Tests
To run unit tests in all packages in the Medusa repository, run the following command in the root directory of the repository:
```bash
yarn test
```
This runs the `test` script defined in the `package.json` file of each package under the `packages` directory.
Alternatively, if you want to run the unit tests in a specific package, you can run the `test` command in the directory of that package.
For example, to run the unit tests of the Medusa core:
```bash
cd packages/medusa
yarn test
```
### Run API Integration Tests
API integration tests are used to test out Medusas core endpoints.
To run the API integration tests, run the following command in the root directory of the repository:
```bash
yarn test:integration:api
```
### Run Plugin Integration Tests
Plugin integration tests are used to test out Medusas official plugins, which are also stored in the `packages` directory in the repository.
To run the plugin integration tests, run the following command in the root directory of the repository:
```bash
yarn test:integration:plugins
```
---
## Test in a Local Backend
Using Medusas dev CLI tool, you can test any changes you make to Medusas packages in a local backend installation. This eliminates the need to publish these packages on NPM publicly to be able to use them.
Medusas dev CLI tool scans and finds the Medusa packages used in your Medusa backend. Then, it copies the files of these packages from the `packages` directory in the Medusa repository into the `node_modules` directory of your Medusa backend.
:::info
Medusas Dev CLI tool uses the [path you specified earlier](#set-the-location-of-the-medusa-repository) to copy the files of the packages.
:::
### Copy Files to Local Backend
To test in a local backend:
1. Change to the directory of the backend you want to test your changes in:
```bash
cd medusa-backend
```
2\. Run the following command to copy the files from the `packages` directory of your Medusa repository into `node_modules`:
```bash
medusa-dev
```
By default, Medusas dev CLI runs in watch mode. So, it copies the files when you first run it. Then, whenever you make changes in the `dist` directory of the packages in the Medusa repository, it copies the changed files again.
### Watch and Compile Changes
While the above command is running, it's recommended to run the `watch` command inside the directory of every package you're making changes to.
The combination of these two commands running at the same time will compile the package into the `dist` directory of the package, then copy the compiled changes into your local backend.
For example, if you're making changes in the `medusa` package, run the following command inside the directory of the `medusa` package:
```bash title=packages/medusa
yarn watch
```
Make sure the `medusa-dev` command is also running to copy the changes automatically.
Alternatively, you can manually run the `build` command every time you want to compile the changes:
```bash title=packages/medusa
yarn build
```
### CLI Options
Here are some options you can use to customize how Medusas dev CLI tool works:
- `--scan-once` or `-s`: Copies files only one time then stops processing. If you make any changes after running the command with this option, you have to run the command again.
```bash
medusa-dev -s
```
- `--quiet` or `-q`: Disables showing any output.
```bash
medusa-dev -q
```
- `--packages`: Only copies specified packages. It accepts at least one package name. Package names are separated by a space.
```bash
medusa-dev --packages @medusajs/medusa-cli medusa-file-minio
```
---
## See Also
- [Create a Plugin](../plugins/create.mdx)
- [Contribution Guidelines](https://github.com/medusajs/medusa/blob/master/CONTRIBUTING.md)


@@ -0,0 +1,156 @@
---
description: 'Learn about the Transaction Orchestrator used in the core Medusa package. The transaction orchestrator (TO) offers an effective way of managing transactions within an increasingly complex environment.'
---
# Transaction Orchestrator
In this document, youll learn about the Transaction Orchestrator used in the core Medusa package.
## Introduction
The transaction orchestrator (TO) offers an effective way of managing transactions within an increasingly complex environment. It supports Medusas modularity and composability by handling transactions from different modules rather than one whole system.
Medusas core package uses the transaction orchestrator to enhance the control and management of transactions and workflows across multiple services or modules. It simplifies creating and executing distributed transactions.
The transaction orchestrator supervises transaction flows and guarantees that successful transactions are executed fully or entirely rolled back in case of failure. With clearly defined steps to Invoke and Compensate actions, the transaction orchestrator follows the separation of concerns principle, providing you with improved control over transactions and workflows.
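The invoke/compensate pattern can be illustrated with a minimal, framework-agnostic sketch. This is not Medusa's actual implementation — the types and step names here are purely illustrative:

```ts
// A step pairs a forward action ("invoke") with an undo action ("compensate").
type Step<T> = {
  name: string
  invoke: (state: T) => void
  compensate: (state: T) => void
}

// Runs steps in order; if one throws, compensates the completed steps in reverse.
function runTransaction<T>(steps: Step<T>[], state: T): "done" | "rolled_back" {
  const completed: Step<T>[] = []
  for (const step of steps) {
    try {
      step.invoke(state)
      completed.push(step)
    } catch {
      for (const done of completed.reverse()) {
        done.compensate(state)
      }
      return "rolled_back"
    }
  }
  return "done"
}

// Example: the second step fails, so the first step's work is undone.
const log: string[] = []
const result = runTransaction(
  [
    {
      name: "createVariant",
      invoke: () => log.push("variant created"),
      compensate: () => log.push("variant removed"),
    },
    {
      name: "createInventoryItem",
      invoke: () => {
        throw new Error("inventory service unavailable")
      },
      compensate: () => log.push("inventory item removed"),
    },
  ],
  {}
)

console.log(result, log) // "rolled_back" [ "variant created", "variant removed" ]
```

The real orchestrator, shown later in this document, adds persistence, retries, and asynchronous steps on top of this core idea.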
---
## Why Medusa Uses the Transaction Orchestrator
The transaction orchestrator is a necessity for modular or distributed systems in which a given workflow spans different modules, for several reasons:
1. **Data consistency:** In a distributed system, maintaining data consistency across different databases becomes challenging. A transaction orchestrator ensures that if any part of the transaction fails, all the previous steps are rolled back (compensated), keeping the data consistent across all involved databases.
2. **Coordination:** The transaction orchestrator acts as a coordinator between different modules and their respective servers. It also manages the order of execution and communication between modules, ensuring that each transaction step is executed correctly and at the right time.
3. **Simplifying complex workflows:** In a distributed environment, transactions can become complex due to the need to coordinate between different services and databases. A transaction orchestrator simplifies this process by abstracting away the complexities of managing distributed transactions, allowing developers to focus on implementing the business logic.
4. **Scalability:** As a system grows, managing transactions across multiple services becomes increasingly difficult. A transaction orchestrator helps with scalability by providing a robust framework for managing distributed transactions, making it easier to maintain and expand the system.
In addition to the above, there are reasons more specific to digital commerce that make the transaction orchestrator an important addition to Medusa's toolbox. These reasons are:
- **Composable architectures:** A transaction orchestrator supports composable architectures, allowing developers to easily combine and reuse modules as needed. This enables the creation of highly customizable commerce applications, tailored to specific business requirements.
- **Adoption in legacy systems:** The transaction orchestrator can also facilitate the gradual transition of legacy systems to more modern, distributed architectures. This makes it easier for businesses to adopt and integrate new technologies without having to rebuild their entire infrastructure from scratch.
- **Unlocking infrastructure technologies:** With a transaction orchestrator, developers can leverage advanced infrastructure technologies, such as serverless and edge computing. This can lead to improved performance, reduced latency, and increased reliability for commerce applications, resulting in better user experiences and higher customer satisfaction.
---
## Example of Using the Transaction Orchestrator
To better illustrate how the transaction orchestrator works, here's an example from the multi-warehouse feature, where it coordinates a flow that creates a product variant, creates an inventory item, and finally links the two together:
```ts
const createVariantFlow: TransactionStepsDefinition = {
next: {
action: "createVariantStep",
saveResponse: true,
next: {
action: "createInventoryItemStep",
saveResponse: true,
next: {
action: "attachInventoryItemStep",
noCompensation: true,
},
},
},
}
```
The actions are handled by a single function called by the transaction orchestrator:
```ts
async function transactionHandler(
actionId: string,
type: TransactionHandlerType,
payload: TransactionPayload
) {
const command = {
createVariantStep: {
invoke: async (data: CreateProductVariantInput) => {
return await createProductVariant(data) // omitted
},
compensate: async (
data: CreateProductVariantInput,
{ invoke }
) => {
await removeProductVariant(
invoke.createVariantStep
) // omitted
},
},
createInventoryItemStep: {
invoke: async (
data: CreateProductVariantInput,
{ invoke }
) => {
return await createInventoryItem(
invoke.createVariantStep
) // omitted
},
compensate: async (
data: CreateProductVariantInput,
{ invoke }
) => {
await removeInventoryItem(
invoke.createInventoryItemStep
) // omitted
},
},
attachInventoryItemStep: {
invoke: async (
data: CreateProductVariantInput,
{ invoke }
) => {
return await attachInventoryItem( // omitted
invoke.createVariantStep,
invoke.createInventoryItemStep
)
},
},
}
return command[actionId][type](payload.data, payload.context)
}
```
Note that the implementation of each function was omitted to keep the example short; however, their names are self-explanatory.
Finally, the transaction orchestrator is instantiated and a new transaction initialized:
```ts
const strategy = new TransactionOrchestrator(
"create-variant-with-inventory", // transaction name
createVariantFlow // transaction steps definition
)
const transaction = await strategy.beginTransaction(
ulid(), // unique id
transactionHandler, // handler
createProductVariantInput // input
)
await strategy.resume(transaction)
```
---
## Achieving Cleaner Code with the Transaction Orchestrator
Utilizing a transaction orchestrator results in cleaner code and single-responsibility functions compared to manually managing distributed transactions. This is due to several factors:
1. **Abstraction:** A transaction orchestrator abstracts the complexity of managing distributed transactions by providing a standardized framework for defining transaction steps, their corresponding compensation actions, and the flow of execution.
2. **Separation of concerns:** By clearly delineating the responsibilities of each function, the transaction orchestrator enforces the separation of concerns. Each function is responsible for either the "Invoke" (execution) or "Compensate" (rollback) action, ensuring they perform a single, specific task. This makes the code more readable, maintainable, and testable.
3. **Modularity:** By using a transaction orchestrator, the code becomes more modular, as each step in the transaction is encapsulated within its own function. This allows developers to easily modify, add, or remove transaction steps without affecting the overall structure of the transaction.
4. **Error Handling:** When handling distributed transactions manually, the code can become convoluted due to intricate error handling logic. The transaction orchestrator simplifies this by automatically managing errors and retries, allowing developers to create cleaner code without the need to address error handling for each step.
5. **Reusability:** The transaction orchestrator allows you to define reusable functions for both the "Invoke" and "Compensate" actions, which can be easily reused across different transaction scenarios. This reduces code duplication and ensures that changes to a specific action only need to be made in one place.
---
## Handling Complex Workflows with the Transaction Orchestrator
The transaction orchestrator enables developers to create complex workflows for both synchronous tasks and long-running, asynchronous tasks that may take a while to receive a response. These workflows can be organized in a way that isn't necessarily transactional, but can still be effectively orchestrated.
There are several scenarios where asynchronous workflows with long-running steps can be applied using the transaction orchestrator:
1. **Order fulfillment:** In a commerce system, a workflow might involve creating an order, reserving inventory, charging the customer, generating shipping labels, and updating the shipping status. Some of these steps, such as generating shipping labels or charging the customer, could take a considerable amount of time due to external dependencies like payment gateways and shipping services.
2. **Fraud detection and prevention:** Fraud detection and prevention workflows might involve analyzing customer data, order patterns, and payment information to identify potential fraudulent activities. These workflows may include time-consuming tasks like querying external fraud detection APIs or applying machine learning models to analyze data.
3. **Returns and refunds processing:** In returns and refunds workflows, several steps may involve long-running tasks, such as receiving returned products, inspecting their condition, updating inventory, and processing refunds. Some of these steps may take a considerable amount of time, especially when dealing with external payment gateways or waiting for products to be returned and inspected.
In all these examples, workflows are valuable for handling intricate, multistep processes that include tasks taking considerable time or involving external dependencies without immediate responses.


@@ -0,0 +1,72 @@
---
description: "Learn what an idempotency key is in Medusa. An Idempotency Key is a unique key associated with an operation that allows you to safely retry requests."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Idempotency Key
In this document, you'll learn what an idempotency key is in Medusa.
## Overview
An Idempotency Key is a unique, randomly generated key associated with an operation, such as the cart completion process. The idempotency key can be passed in the header of a request to an endpoint. This allows you to safely retry requests without accidentally performing the same operation twice.
For example, if a connection error occurs while the customer is completing their cart and placing an order, you can retry from the last recovery point before the error occurred.
When an operation first starts, the idempotency key is generated using the `uuid` package's `v4` method. Then, the backend sets the following headers in the response:
```bash
Access-Control-Expose-Headers: Idempotency-Key
Idempotency-Key: <IDEM_VAL>
```
Where `<IDEM_VAL>` is the idempotency key generated.
These headers can then be passed again in subsequent retry requests and will be available on the responses of these requests as well. The value of the idempotency key remains the same across requests and responses, even if an error occurs and you retry the request.
For example, when the cart completion process starts, the Medusa backend generates the idempotency key and sets the necessary headers on the response. If an error occurs in the request, you can later retry the request by passing these same headers in your request.
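To illustrate why reusing the key makes retries safe, here's a self-contained sketch of a server-side handler that deduplicates by idempotency key. This is a toy stand-in, not the Medusa backend's implementation — all names here are illustrative:

```ts
// Toy store mapping idempotency keys to the saved response of a completed operation.
const completedResponses = new Map<string, { orderId: string }>()
let ordersPlaced = 0

// Handles a cart-completion request. If the key was seen before, the saved
// response is returned instead of placing a second order.
function completeCart(idempotencyKey: string): { orderId: string } {
  const saved = completedResponses.get(idempotencyKey)
  if (saved) {
    return saved
  }
  ordersPlaced += 1
  const response = { orderId: `order_${ordersPlaced}` }
  completedResponses.set(idempotencyKey, response)
  return response
}

// The client retries with the same key (for example, after a connection
// error), so only one order is placed.
const key = "idem_abc123" // value of the Idempotency-Key header
const first = completeCart(key)
const retry = completeCart(key)

console.log(first.orderId === retry.orderId, ordersPlaced) // true 1
```

The Medusa backend applies the same principle, but persists the key and response in the database, as described next.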
---
## IdempotencyKey Entity Overview
The idempotency key's data is stored within the `IdempotencyKey` entity. Some of its attributes include:
- `idempotency_key`: a unique string indicating the value of the idempotency key.
- `request_method`: a string indicating the method of the latest request related to the idempotency key's operation. For example, `POST`.
- `request_params`: a JSONB object indicating the parameters of the latest request related to the idempotency key's operation.
- `request_path`: a string indicating the path of the latest request related to the idempotency key's operation.
- `response_code`: a number indicating the response code of the latest request related to the idempotency key's operation.
- `response_body`: a JSONB object indicating the response body of the latest request related to the idempotency key's operation.
- `recovery_point`: a string indicating the point to continue from when retrying the request. The default value is `started`.
---
## Idempotency Key Stages
Idempotency key stages are the different recovery points that are available for an operation. Every operation has at least the `started` and `finished` stages.
For example, the cart completion operation has the following stages or recovery points:
- `started`
- `tax_lines_created`
- `payment_authorized`
- `finished`
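As an illustration of how recovery points enable resuming, here's a minimal sketch. The stage names come from the cart completion flow above, but the loop itself is hypothetical, not Medusa's code:

```ts
// Ordered stages of the cart completion operation.
const stages = [
  "started",
  "tax_lines_created",
  "payment_authorized",
  "finished",
] as const
type Stage = (typeof stages)[number]

// Runs every stage after the stored recovery point. In a real system, each
// stage's work runs in a transaction and the new recovery point is persisted,
// so a crash can resume from it.
function resumeFrom(recoveryPoint: Stage): Stage[] {
  const executed: Stage[] = []
  let index = stages.indexOf(recoveryPoint) + 1
  while (index < stages.length) {
    executed.push(stages[index]) // perform the stage's work, then persist it
    index += 1
  }
  return executed
}

// A request that failed after authorizing payment only re-runs the last step.
console.log(resumeFrom("payment_authorized")) // [ "finished" ]
console.log(resumeFrom("started")) // [ "tax_lines_created", "payment_authorized", "finished" ]
```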
---
## Custom Development
You can use the `IdempotencyKeyService` in your custom development to ensure requests can be safely retried or continued.
<DocCard item={{
type: 'link',
href: '/development/idempotency-key/use-service',
label: 'Use IdempotencyKeyService',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to use the IdempotencyKeyService in your code.'
}
}} />


@@ -0,0 +1,153 @@
---
description: 'Learn how to use the IdempotencyKeyService in Medusa to safely retry requests or continue an operation from its last recovery point.'
addHowToData: true
---
# How to Use IdempotencyKeyService
In this document, you'll learn how to use the `IdempotencyKeyService`.
## Overview
You can use the `IdempotencyKeyService` within your custom development to ensure that your custom endpoints and operations can be safely retried or continued if an error occurs. This guide is also useful if you're overriding an existing feature in Medusa that uses the `IdempotencyKeyService` and you want to maintain its usage, such as if you're overriding the cart completion strategy.
The `IdempotencyKeyService` includes methods that can be used to create and update idempotency keys, among other functionalities.
---
## Create Idempotency Key
You can create an idempotency key within an endpoint using the `create` method of the `IdempotencyKeyService`:
```ts
router.post("/custom-route", async (req, res) => {
// ...
const idempotencyKey = await idempotencyKeyService.create({
request_method: req.method,
request_params: req.params,
request_path: req.path,
})
// ...
})
```
The method requires as a parameter an object having the following properties:
- `request_method`: a string indicating the request method to be associated with the idempotency key.
- `request_params`: an object indicating the request parameters to be associated with the idempotency key.
- `request_path`: a string indicating the request path to be associated with the idempotency key.
The method handles generating the idempotency key value and saving the idempotency key with its details in the database. It returns the full idempotency key object.
Alternatively, you can use the `initializeRequest` method that allows you to retrieve an idempotency key based on the value passed in the `Idempotency-Key` header of the request if it exists, or create a new key otherwise. For example:
```ts
router.post("/custom-route", async (req, res) => {
// ...
const headerKey = req.get("Idempotency-Key") || ""
const idempotencyKey = await idempotencyKeyService
.initializeRequest(
headerKey,
req.method,
req.params,
req.path
)
// ...
})
```
The method requires the following parameters:
1. The first parameter is the key existing in the header of the request, if there's any.
2. The second parameter is the request's method.
3. The third parameter is the request's parameters.
4. The fourth parameter is the request's path.
The method returns the full idempotency key object.
---
## Perform Actions Within Idempotency Key Stages
Each [idempotency key stage](./overview.mdx#idempotency-key-stages) typically has transactions performed within it. Using the `IdempotencyKeyService`'s `workStage` method allows you to perform related functionalities in transactional isolation within each stage. You can access the stage or recovery point of an idempotency key using the `recovery_point` attribute.
The following example is taken from the `CartCompletionStrategy` implemented in the Medusa backend:
<!-- eslint-disable no-fallthrough -->
```ts
class CartCompletionStrategy
extends AbstractCartCompletionStrategy {
// ...
async complete(
id: string,
ikey: IdempotencyKey,
context: RequestContext
): Promise<CartCompletionResponse> {
// ...
let inProgress = true
let err: unknown = false
while (inProgress) {
switch (idempotencyKey.recovery_point) {
case "started": {
await this.activeManager_
.transaction(
"SERIALIZABLE",
async (transactionManager) => {
idempotencyKey =
await this.idempotencyKeyService_
.withTransaction(transactionManager)
.workStage(
idempotencyKey.idempotency_key,
async (manager) =>
await this.handleCreateTaxLines(
id,
{
manager,
}
)
)
}
)
.catch((e) => {
inProgress = false
err = e
})
break
}
case "tax_lines_created": {
// ...
}
case "payment_authorized": {
// ...
}
case "finished": {
// ...
}
default: {
// ...
}
}
}
// ...
}
}
```
The method requires the following parameters:
1. The first parameter is the idempotency key value.
2. The second parameter is a callback function to be executed. The function should return an object that is used to update the idempotency key's details. The object can include the following properties:
1. `recovery_point`: a string indicating the new recovery point associated with the idempotency key's operation. If no `recovery_point` is returned in the object, the `finished` recovery point is assigned by default.
2. `response_code`: a number indicating the latest response code of the idempotency key's operation.
3. `response_body`: an object indicating the latest response body of the idempotency key's operation.
The method returns an updated idempotency key object.


@@ -0,0 +1,86 @@
---
description: 'Learn how to create a loader in Medusa. A loader can be created in the Medusa backend codebase, in a plugin, or in a module.'
addHowToData: true
---
# How to Create a Loader
In this document, youll learn how to create a loader in Medusa. A loader can be created in the Medusa backend codebase, in a plugin, or in a module.
## Step 1: Create Loader File
Create a TypeScript or JavaScript file in the `src/loaders` directory that will hold your custom script. There are no restrictions on the name of the file.
For example, create the file `src/loaders/my-loader.ts` that will hold the loader.
---
## Step 2: Define the Loader
The loader file must export a function.
### Parameters of Loaders in Medusa Backend and Plugins
When the loader is defined in the Medusa backend or a plugin, the function receives the following parameters:
1. `container`: the first parameter, which is an `AwilixContainer` object. You can use the container to register custom resources in the dependency container or resolve resources from it.
2. `config`: the second parameter, which is an object that holds the loaders plugin options. If the loader is not created inside a plugin, the config object will be empty.
### Parameters of Loaders in Modules
When the loader is defined in a module, it receives the following parameters:
1. `container`: the first parameter, which is an `AwilixContainer` object. You can use the container to register custom resources in the dependency container or resolve resources from it.
2. `logger`: the second parameter, which is a `Logger` object. The logger can be used to log messages in the console.
3. `config`: the third parameter, which is an object that holds the loaders module options.
### Example Implementation
For example, this loader function resolves the `ProductService` and logs in the console the count of products in the Medusa backend:
```ts title=src/loaders/my-loader.ts
import { ProductService } from "@medusajs/medusa"
import { AwilixContainer } from "awilix"
export default async (
container: AwilixContainer,
config: Record<string, unknown>
): Promise<void> => {
console.info("Starting loader...")
const productService = container.resolve<ProductService>(
"productService"
)
console.info(`Products count: ${
await productService.count()
}`)
console.info("Ending loader...")
}
```
---
## Step 3: Run Build Command
In the directory of your project, run the following command to transpile the files from the `src` to the `dist` directory:
```bash npm2yarn
npm run build
```
---
## Test it Out
:::note
This section explains how to test out the loader if its created in the Medusa backend codebase. If youre creating your loader in a plugin, you can learn how to test it in the [plugins documentation](../plugins/create.mdx#test-your-plugin). Alternatively, if youre creating your loader in a module, you can learn how to test it in the [modules documentation](../modules/create.mdx#step-4-test-your-module).
:::
Run the following command to start the Medusa backend:
```bash npm2yarn
npx medusa develop
```
Your loader script should run on the Medusa backend startup. If you logged a message in the console, similar to the example above, you should see it in the console.


@@ -0,0 +1,69 @@
---
description: "Learn what loaders are in Medusa. A loader is a script that runs when the Medusa backend starts."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Loaders
In this document, youll learn what loaders are in Medusa.
## Overview
A loader is a script that runs when the Medusa backend starts. The Medusa backend uses loaders to initialize the database connection, load plugins, register resources in the dependency container, and more.
Loaders can be created within the Medusa backend codebase, in a plugin, or in a module to perform custom actions when the backend starts. The loader is created in a TypeScript or JavaScript file located in the `src/loaders` directory, then transpiled using the `build` command into the `dist/loaders` directory.
### Loader Examples
For example, the Redis Event Bus module uses a loader to establish a connection with Redis and log a message in the console.
Another example is the Algolia plugin, which uses a loader to update the index settings when the Medusa backend starts based on the plugins options.
Loaders can be used to access the dependency container and register custom resources in it.
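Conceptually, registering a resource from a loader looks like the following toy sketch, where a plain map stands in for the real Awilix container — the names here are illustrative, not Medusa's actual API:

```ts
// A toy dependency container: resources are registered by name at startup
// and resolved by name later.
const container = new Map<string, unknown>()

// A loader runs when the backend starts and can register custom resources.
function myLoader(registry: Map<string, unknown>): void {
  registry.set("greetingService", {
    greet: (name: string) => `Hello, ${name}!`,
  })
}

// Other parts of the application then resolve the resource by name.
myLoader(container)
const greetingService = container.get("greetingService") as {
  greet: (name: string) => string
}
console.log(greetingService.greet("Medusa")) // Hello, Medusa!
```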
Loaders can also be used to create [scheduled jobs](../scheduled-jobs/overview.mdx) that run at a specified interval of time. For example:
```ts
const publishJob = async (container, options) => {
const jobSchedulerService =
container.resolve("jobSchedulerService")
  jobSchedulerService.create(
    "publish-products", // unique job name
    {}, // data passed to the job
    "0 0 * * *", // cron schedule: every day at midnight
    async () => {
      // job to execute: publish all draft products
const productService = container.resolve("productService")
const draftProducts = await productService.list({
status: "draft",
})
for (const product of draftProducts) {
await productService.update(product.id, {
status: "published",
})
}
}
)
}
export default publishJob
```
---
## Custom Development
Developers can create a loader with the desired functionality directly within the Medusa Core, in a plugin, or in a module.
<DocCard item={{
type: 'link',
href: '/development/loaders/create',
label: 'Create a Loader',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a loader.'
}
}} />


@@ -0,0 +1,309 @@
---
description: 'Learn how to create a module and test the module in your Medusa backend.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# How to Create a Module
In this document, youll learn how to create a module and test the module in your Medusa backend.
This document covers the general steps of creating an internal module, but does not explore actually implementing any functionality in it. An internal module is a TypeScript/JavaScript module that is loaded by the Medusa backend as part of the commerce application to extend or replace a functionality within it, such as the cache or event bus functionality.
---
## (Optional) Step 0: Project Preparation
Before you start implementing the custom functionality in your module, it's recommended to create a directory that holds your module and prepare the following structure in it:
```
custom-module
|
|___ src
|
|___ index.ts
|
|___ services // directory
```
The directory can be an NPM project, but that is optional. `index.ts` acts as an entry point to your module; you'll learn about its content in a later step. The `services` directory will hold your custom services. If you're adding other resources, you can add other directories for them. For example, if you're adding an entity, you can add a `models` directory.
:::tip
You can use JavaScript instead of TypeScript.
:::
It's also recommended to use the following TypeScript config in `tsconfig.json`, which should be added in the root of the project holding your module:
```json
{
"compilerOptions": {
"lib": [
"es2020"
],
    "target": "es2020",
"outDir": "./dist",
"esModuleInterop": true,
"declaration": true,
"module": "commonjs",
"moduleResolution": "node",
"emitDecoratorMetadata": true,
"experimentalDecorators": true,
"sourceMap": true,
"noImplicitReturns": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"noImplicitThis": true,
"allowJs": true,
"skipLibCheck": true,
},
"include": ["src"],
"exclude": [
"dist",
"./src/**/__tests__",
"./src/**/__mocks__",
"./src/**/__fixtures__",
"node_modules"
]
}
```
---
## Step 1: Implement the Custom Functionality
This step depends on what youre actually implementing. For example, you can implement the cache or events module, or you can implement a commerce module. If what youre creating has a guide, you can refer to it while implementing the functionalities.
### Note About Project Structure
When developing your module, it's important to note that you'll later be referencing the module using a file path to an `index.ts` or `index.js` file. This file is explained in the next step and acts as an entry point and a definition of your module.
So, when implementing your module, make sure it can be easily referenced from your local Medusa backend. For example, you can develop your module in a sibling directory of the Medusa backend and reference it with `../custom-module`.
Keep in mind that when you publish the module to NPM, you'll have to move your module into a new NPM project. This is covered in the [Publish Module documentation](./publish.md).
### Recommended Guides
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/cache/create',
label: 'Create a Cache Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a cache module in Medusa.',
}
},
{
type: 'link',
href: '/development/events/create-module',
label: 'Create an Events Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an events module in Medusa.',
}
},
]} />
---
## Step 2: Export Module
After implementing the module, you must export a module object that helps the Medusa backend understand how to use this Module. This is done in the file `index.ts`.
The file must export an object with the following properties:
```ts
type ModuleExports = {
service: Constructor<any>
loaders?: ModuleLoaderFunction[]
migrations?: any[]
models?: Constructor<any>[]
runMigrations?(
options: LoaderOptions,
moduleDeclaration: InternalModuleDeclaration
): Promise<void>
revertMigration?(
options: LoaderOptions,
moduleDeclaration: InternalModuleDeclaration
): Promise<void>
}
```
:::tip
All property types such as `ModuleLoaderFunction` can be loaded from the `@medusajs/modules-sdk` package.
:::
Where:
- `service`: This is the only required property to be exported. It should be the main service your module exposes, and it must implement all the declared methods on the module interface. For example, if it's a cache module, it must implement the `ICacheService` interface exported from `@medusajs/types`.
- `loaders`: (optional) an array of [loader](../loaders/overview.mdx) functions used to perform an action while loading the module. For example, you can log a message that the module has been loaded, or if your module's scope is [isolated](#module-scope) you can use the loader to establish a database connection.
- `migrations`: (optional) an array of objects containing database migrations that should run when the `migration run` command is used with Medusa's CLI.
- `models`: (optional) an array of entities that your module creates.
- `runMigrations`: (optional) a function that can be used to define migrations to run when the `migration run` command is used with Medusa's CLI. The migrations will only run if they haven't already. This will only be executed if the module's scope is [isolated](#module-scope).
- `revertMigration`: (optional) a function that can be used to define how migrations should be reverted when the `migration revert` command is used with Medusa's CLI. This will only be executed if the module's scope is [isolated](#module-scope).
Here's an example implementation of `index.ts` from Medusa's Redis Cache module:
```ts title=index.ts
import { ModuleExports } from "@medusajs/modules-sdk"
import Loader from "./loaders"
import { RedisCacheService } from "./services"
const service = RedisCacheService
const loaders = [Loader]
const moduleDefinition: ModuleExports = {
service,
loaders,
}
export default moduleDefinition
```
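If your module manages its own migrations (for example, when its scope is isolated), the definition can also export the optional migration handlers. The following is only a sketch: the `LoaderOptions` and `InternalModuleDeclaration` types below are simplified local stand-ins for the real exports of `@medusajs/modules-sdk`, and the service is an empty placeholder.

```ts
// Simplified stand-ins for the types exported by @medusajs/modules-sdk
type LoaderOptions = { options?: Record<string, unknown> }
type InternalModuleDeclaration = { scope?: string; resources?: string }

// placeholder for the main service the module exposes
class CustomModuleService {}

const moduleDefinition = {
  service: CustomModuleService,
  async runMigrations(
    options: LoaderOptions,
    moduleDeclaration: InternalModuleDeclaration
  ): Promise<void> {
    // run any pending migrations against the module's own connection
  },
  async revertMigration(
    options: LoaderOptions,
    moduleDeclaration: InternalModuleDeclaration
  ): Promise<void> {
    // roll back the last applied migration
  },
}

export default moduleDefinition
```

The `runMigrations` and `revertMigration` handlers only need to be provided when Medusa can't run the module's migrations through the shared connection.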
---
## Step 3: Reference Module
To use your module in the Medusa backend, add your module to `medusa-config.js`:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
resolve: "module-name-or-path",
options: {
// options if necessary
},
// optional
resources: "shared",
},
},
}
```
The way you add your module depends on its type and what options it requires, if any. Note that in the above code example:
- `moduleType` is the type of your module. For example, if your module is a cache module, it should be changed to `cacheService`.
- `resolve` is used to reference the Module. Its value should be either the name of an NPM package module, or a relative file path to the module. This is explained more in the [Module Reference section](#module-reference).
- `options` should hold any options of your module, if necessary.
- `resources` is an optional property that indicates whether the module shares the same dependency container as the rest of the resources in the Medusa backend. More details are explained in the [Module Scope section](#module-scope).
### Module Reference
When the module is installed as an NPM package, the value of the `resolve` property should be the name of that package. For example:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
resolve: "custom-module",
// ...
},
},
}
```
However, when using a local module, you must reference the module using a relative file path to it from the Medusa backend.
For example, consider you have the following file structure:
```
|
|___ custom-module
| |
| |___ src
| |
| |___ index.ts
| |
| |___ services
| | |
| | |___ custom-service.ts
| |___ // more files
|
|
|___ medusa-backend
```
You can reference your module in two ways:
1\. Referencing the directory: In this case, it's assumed that the `index.ts` file that contains the module definition is in the root of the directory you referenced. Using the above example, the file path in this case would be:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
resolve: "../custom-module/src",
// ...
},
},
}
```
2\. Referencing the `index` file: In this case, it's assumed that the `index.ts` or `index.js` file you're referencing includes the module definition. Using the above example, the file path in this case would be:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
resolve: "../custom-module/src/index.ts",
// ...
},
},
}
```
### Module Scope
By default, the module shares the same dependency container used across the Medusa backend. So, the module can benefit from the core services and other resources available through [dependency injection](../fundamentals/dependency-injection.md). The module can also benefit from the same database connection.
The module's scope can be changed using the `resources` property available as part of the module's configurations:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
// other configurations
resources: "shared",
},
},
}
```
The `resources` property can have one of the following values:
- `shared`: (default) The dependency container is shared with the module, including the database connection. You don't need to establish the database connection yourself in a loader.
- `isolated`: The module receives an empty dependency container, and only its own dependencies will be registered in the container. When using this value, you must establish the database connection yourself and manage other resources within your module.
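For example, an isolated module could use a loader to set up and register its own resources. This is only a sketch: the container below is a plain `Map` standing in for the real dependency container, and the connection object is a placeholder for a real database connection such as a typeorm `DataSource`.

```ts
// Stand-in for the isolated module's (initially empty) dependency container
type ModuleContainer = Map<string, unknown>

type ModuleOptions = { databaseUrl?: string }

// loader for an isolated module: establish resources yourself
async function connectionLoader({
  container,
  options,
}: {
  container: ModuleContainer
  options?: ModuleOptions
}): Promise<void> {
  // placeholder for creating a real database connection
  const connection = {
    url: options?.databaseUrl ?? "postgres://localhost/medusa-custom",
  }
  container.set("customModuleConnection", connection)
}
```

With the `shared` scope, this step is unnecessary because the backend's own connection is already registered in the container.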
---
## Step 4: Test Your Module
Finally, to test your module, run the following command:
```bash npm2yarn
npx medusa develop
```
This starts the Medusa backend and runs your module as part of it.
---
## Next Steps
After you finish developing your module, you can publish it as an NPM package with [this guide](./publish.md).
@@ -0,0 +1,47 @@
---
description: 'Learn what Modules are and how you can use them during your custom development with Medusa.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Modules
In this document, you'll learn what Modules are and how you can use them during your custom development with Medusa.
## Overview
Modules are self-contained, reusable pieces of code that encapsulate specific functionality or features within an ecommerce application. They foster separation of concerns, maintainability, and reusability by organizing code into smaller, independent units that can be easily managed, tested, and integrated with other modules.
Modules further increase Medusa's extensibility. Commerce modules, such as the cart engine, can be extended or entirely replaced with your own custom logic. They can also run independently of the core Medusa package, allowing you to utilize a commerce module within a larger commerce ecosystem. For example, you can use the Order module as an Order Management System (OMS) without using Medusa's core.
This also applies to core logic such as caching or events systems. You can use modules to integrate any logic or third-party service to handle this logic. This gives you greater flexibility in how you choose your tech stack.
Modules are created and loaded similarly to plugins. They can be loaded from a local project, or they can be installed and loaded from an NPM package. In the Medusa backend, they're added as part of the configurations in `medusa-config.js` to use and load them within the backend.
---
## Custom Development
Developers can create their own modules and use them in their Medusa backend. They can also publish these modules to NPM to reuse them across Medusa backends or allow other developers to use them.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/modules/create',
label: 'Create a Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a module in Medusa.'
}
},
{
type: 'link',
href: '/development/modules/publish',
label: 'Publish Module',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to publish your module on NPM.'
}
},
]} />
@@ -0,0 +1,198 @@
---
description: 'Learn how to prepare and publish your custom module to NPM, then how to install it in the Medusa backend.'
---
# How to Publish a Module
In this document, you'll learn how to prepare and publish your custom module to NPM, then how to install it in the Medusa backend.
## Prerequisites
This guide assumes you've already created a custom module. If not, follow [this guide](./create.mdx) first to create a module.
You also need an [NPM account](https://www.npmjs.com/signup) to publish the module with.
---
## Step 1: Create an NPM Project
If your module isn't located in an NPM project already, you must first create one to hold your module.
To do that, run the following commands to create a directory and initialize an NPM project in it:
```bash npm2yarn
mkdir my-module
npm init
```
You'll be asked a couple of questions related to your package, such as its name or license. You can keep the defaults for now or set them right away.
Once you're done, you should have a `package.json` file created in the directory.
---
## Step 2: Changes to package.json
In your `package.json` file, add or update the following fields:
```json title=package.json
{
// other fields
"main": "dist/index.js",
"publishConfig": {
"access": "public"
},
"files": [
"dist"
],
"devDependencies": {
"@medusajs/types": "^0.0.2",
"cross-env": "^5.2.1",
"typescript": "^4.4.4"
},
"scripts": {
"watch": "tsc --build --watch",
"prepare": "cross-env NODE_ENV=production npm run build",
"build": "tsc --build",
},
"dependencies": {
"@medusajs/modules-sdk": "^0.1.0",
}
}
```
This adds the necessary dependencies for development and publishing, including the `@medusajs/modules-sdk` package. It also adds the following scripts:
- `build`: can be used to manually build your module.
- `prepare`: can be used to prepare your module for publishing on NPM.
- `watch`: (optional, for development) can be used to re-build your module whenever any changes occur without having to manually trigger the `build`.
---
## Step 3: Configure tsconfig.json
If you don't already have a `tsconfig.json` file, create one in the root of your NPM project with the following content:
```json title=tsconfig.json
{
"compilerOptions": {
"lib": [
"es2020"
],
    "target": "es2020",
"outDir": "./dist",
"esModuleInterop": true,
"declaration": true,
"module": "commonjs",
"moduleResolution": "node",
"emitDecoratorMetadata": true,
"experimentalDecorators": true,
"sourceMap": true,
"noImplicitReturns": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"noImplicitThis": true,
"allowJs": true,
"skipLibCheck": true,
},
"include": ["src"],
"exclude": [
"dist",
"./src/**/__tests__",
"./src/**/__mocks__",
"./src/**/__fixtures__",
"node_modules"
]
}
```
This allows you to use the recommended TypeScript configurations and sets the output directory to `dist`. This is essential for preparing your module for publishing.
---
## Step 4: Change Module Structure
To ensure that the files are built from the `src` directory to the `dist` directory, make sure to move the module content to a `src` directory inside the new NPM project.
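For example, after the move, the NPM project could be laid out as follows (the file and directory names here are illustrative):

```
my-module
|
|___ package.json
|
|___ tsconfig.json
|
|___ src
     |
     |___ index.ts
     |
     |___ services
          |
          |___ // service files
```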
---
## Step 5: Publish and Use Module
This section explains how to publish your module to NPM.
### Run Prepare Command
Before you publish or update your module, make sure to run the `prepare` command defined earlier:
```bash npm2yarn
npm run prepare
```
### Login
In your terminal, log in with your NPM account:
```bash
npm login
```
You'll be asked to enter your NPM email and password.
### Publish Module Package
Once you're logged in, you can publish your package with the following command:
```bash
npm publish
```
Your package is then published on NPM, and anyone can install and use it.
### Install Module
To install your published module, you can run the following command on any Medusa backend project:
```bash
npm install module-name
```
Where `module-name` is the name of your module.
### Add Module to medusa-config.js
In `medusa-config.js` on your Medusa backend, add your module to the exported configurations:
```js title=medusa-config.js
module.exports = {
// ...
modules: {
// ...
moduleType: {
resolve: "<module-name>",
options: {
// options if necessary
},
},
},
}
```
Where `<module-name>` is the name of your NPM package.
You can learn more about the available options in the [Create Module documentation](./create.mdx#step-3-reference-module).
### Update Module
To update your module at a later point, you can run the following command to change the NPM version:
```bash
npm version <type>
```
Where `<type>` indicates the type of version update you're publishing. For example, it can be `major` or `minor`. You can see the [full list of types in NPM's documentation](https://docs.npmjs.com/cli/v8/commands/npm-version).
Then, publish the new update:
```bash
npm publish
```
@@ -0,0 +1,346 @@
---
description: 'Learn how to create a notification provider in Medusa. This guide explains the different methods available in a Notification provider.'
addHowToData: true
---
# How to Create a Notification Provider
In this document, you'll learn how to create a Notification Provider in Medusa.
:::note
If you're unfamiliar with the Notification architecture in Medusa, it's recommended to check out the [architecture overview](./overview.mdx) first.
:::
## Prerequisites
Before you start creating a Notification Provider, you need to either install a [Medusa backend](../backend/install.mdx), or create it in a [plugin](../plugins/overview.mdx). The Medusa backend must also have an event bus module installed, which is available when using the default Medusa backend starter.
---
## Create a Notification Provider
Creating a Notification Provider is as simple as creating a TypeScript or JavaScript file in `src/services`. The name of the file is the name of the provider (for example, `sendgrid.ts`). A Notification Provider is essentially a Service that extends the `AbstractNotificationService` from `@medusajs/medusa`.
For example, create the file `src/services/email-sender.ts` with the following content:
```ts title=src/services/email-sender.ts
import { AbstractNotificationService } from "@medusajs/medusa"
import { EntityManager } from "typeorm"
class EmailSenderService extends AbstractNotificationService {
protected manager_: EntityManager
protected transactionManager_: EntityManager
sendNotification(
event: string,
data: unknown,
attachmentGenerator: unknown
): Promise<{
to: string;
status: string;
data: Record<string, unknown>;
}> {
throw new Error("Method not implemented.")
}
resendNotification(
notification: unknown,
config: unknown,
attachmentGenerator: unknown
): Promise<{
to: string;
status: string;
data: Record<string, unknown>;
}> {
throw new Error("Method not implemented.")
}
}
export default EmailSenderService
```
Where `EmailSenderService` is the name of your Notification Provider Service.
Notification Providers must extend the `AbstractNotificationService` class from `@medusajs/medusa`.
:::info
Following the naming convention of Services, the name of the file should be the slug name of the Notification Provider, and the name of the class should be the camel case name of the Notification Provider suffixed with “Service”. In the example above, the name of the file is `email-sender.ts`. You can learn more in the [service documentation](../services/create-service.mdx).
:::
### identifier
Notification Provider Services must have a static property `identifier`.
The `NotificationProvider` entity has 2 properties: `identifier` and `is_installed`. The value of the `identifier` property in the Service class is used when the Notification Provider is created in the database.
The value of this property is also used later when you want to subscribe the Notification Provider to events in a Subscriber.
For example, in the class you created in the previous code snippet you can add the following property:
```ts
class EmailSenderService extends AbstractNotificationService {
static identifier = "email-sender"
// ...
}
```
### constructor
You can use the `constructor` of your Notification Provider to have access to different Services in Medusa through dependency injection.
You can also use the constructor to initialize your integration with the third-party provider. For example, if you use a client to connect to the third-party provider's APIs, you can initialize it in the constructor and use it in other methods in the Service.
Additionally, if you're creating your Notification Provider as an external plugin to be installed on any Medusa backend and you want to access the options added for the plugin, you can access them in the constructor. The options are passed as a second parameter.
:::info
You can learn more about plugins and how to create them in the [Plugins](../plugins/overview.mdx) documentation.
:::
Continuing on with the previous example, if you want to use the [`OrderService`](../../references/services/classes/OrderService.md) later when sending notifications, you can inject it into the constructor:
```ts
import {
AbstractNotificationService,
OrderService,
} from "@medusajs/medusa"
import { EntityManager } from "typeorm"
class EmailSenderService extends AbstractNotificationService {
protected manager_: EntityManager
protected transactionManager_: EntityManager
static identifier = "email-sender"
protected orderService: OrderService
constructor(container, options) {
super(container)
// you can access options here in case you're
// using a plugin
this.orderService = container.orderService
}
// ...
}
```
### sendNotification
When an event is triggered that your Notification Provider is registered as a handler for, the [`NotificationService`](../../references/services/classes/NotificationService.md) in Medusa's core executes the `sendNotification` method of your Notification Provider.
In this method, you can perform the necessary operation to send the Notification. Following the example above, you can send an email to the customer when they place an order.
This method receives three parameters:
1. `event`: This is the name of the event that was triggered. For example, `order.placed`.
2. `data`: This is the data payload of the event that was triggered. For example, if the `order.placed` event is triggered, the `data` object contains the property `id`, which is the ID of the order that was placed.
3. `attachmentGenerator`: If you've previously attached a generator to the `NotificationService` using the [`registerAttachmentGenerator`](../../references/services/classes/NotificationService.md#registerattachmentgenerator) method, you have access to it here. You can use the `attachmentGenerator` to generate on-demand invoices or other documents. The default value of this parameter is `null`.
:::info
You can learn more about what events are triggered in Medusa and their data payload in the [Events List](../events/events-list.md) documentation.
:::
This method must return an object containing two properties:
1. `to`: a string that represents the receiver of the Notification. For example, if you sent an email to the customer then `to` is the email address of the customer. In other cases, it might be a phone number or a username.
2. `data`: an object that contains the data used to send the Notification. For example, if you sent an order confirmation email to the customer, then the `data` object might include the order items or the subject of the email. This `data` is necessary if the notification is resent later, as the same data can be reused.
Continuing with the previous example, you can have the following implementation of the `sendNotification` method:
```ts
class EmailSenderService extends AbstractNotificationService {
// ...
async sendNotification(
event: string,
data: any,
attachmentGenerator: unknown
): Promise<{
to: string;
status: string;
data: Record<string, unknown>;
}> {
if (event === "order.placed") {
// retrieve order
const order = await this.orderService.retrieve(data.id)
// TODO send email
console.log("Notification sent")
return {
to: order.email,
status: "done",
data: {
// any data necessary to send the email
// for example:
subject: "You placed a new order!",
items: order.items,
},
}
    }

    // throw for events this provider isn't set up to handle
    throw new Error(`Unhandled event: ${event}`)
  }
}
```
In this code snippet, you check first if the event is `order.placed`. This can be helpful if you're handling multiple events using the same Notification Provider.
You then retrieve the order using the ID and send the email. Here, the logic related to sending the email is not implemented as it is generally specific to your Notification Provider.
Finally, you return an object with the `to` property set to the customer email and the `data` property is an object that contains data necessary to send the email such as a `subject` or `items`.
:::note
The `to` and `data` properties are used in the `NotificationService` in Medusa's core to create a new `Notification` record in the database. You can learn more about the `Notification` entity in the [Architecture Overview](./overview.mdx#notification-entity-overview) documentation.
:::
### resendNotification
Using the [Resend Notification endpoint](https://docs.medusajs.com/api/admin#notifications_postnotificationsnotificationresend), an admin user can resend a Notification to the customer. The [`NotificationService`](../../references/services/classes/NotificationService.md) in Medusa's core then executes the `resendNotification` method in your Notification Provider.
This method receives three parameters:
1. `notification`: This is the original Notification record that was created after you sent the notification with `sendNotification`. You can get an overview of the entity and its attributes in the [architecture overview](./overview.mdx#notification-entity-overview), but most notably it includes the `to` and `data` attributes which are populated originally using the `to` and `data` properties of the object you return in `sendNotification`.
2. `config`: In the Resend Notification endpoint, you may specify an alternative receiver of the notification using the `to` request body parameter. For example, you may want to resend the order confirmation email to a different email address. If that's the case, you have access to it in the `config` parameter object. Otherwise, `config` will be an empty object.
3. `attachmentGenerator`: If you've previously attached a generator to the Notification Service using the [`registerAttachmentGenerator`](../../references/services/classes/NotificationService.md#registerattachmentgenerator) method, you have access to it here. You can use the `attachmentGenerator` to generate on-demand invoices or other documents. The default value of this parameter is `null`.
Similarly to the `sendNotification` method, this method must return an object containing two properties:
1. `to`: a string that represents the receiver of the Notification. You can either return the same `to` available in the `notification` parameter or the updated one in the `config` parameter.
2. `data`: an object that contains the data used to send the Notification. You can either return the same `data` in the `notification` parameter or make any necessary updates to it.
Continuing with the previous example, you can have the following implementation of the `resendNotification` method:
```ts
class EmailSenderService extends AbstractNotificationService {
// ...
async resendNotification(
notification: any,
config: any,
attachmentGenerator: unknown
): Promise<{
to: string;
status: string;
data: Record<string, unknown>;
}> {
// check if the receiver should be changed
const to: string = config.to ? config.to : notification.to
// TODO resend the notification using the same data
// that is saved under notification.data
console.log("Notification resent")
return {
to,
status: "done",
data: notification.data, // make changes to the data
}
}
}
```
In the above snippet, you check if the `to` should be changed by checking if the `config` parameter has a `to` property. Otherwise, you keep the same `to` address stored in the `notification` parameter.
You then resend the email. Here, the logic related to sending the email is not implemented as it is generally specific to your Notification Provider.
Finally, you return an object with the `to` property set to the email address (either new or old), and the `data` property set to the same data used to send the original notification. You can alternatively make changes to the data.
:::note
The `to` and `data` properties are used in the `NotificationService` in Medusa's core to create a new `Notification` record in the database. No changes are made to the original `Notification` record created after the `sendNotification` method. This new record is associated with the original `Notification` record using the `parent_id` attribute in the entity. You can learn more about the `Notification` entity in the [Architecture Overview](./overview.mdx#notification-entity-overview) documentation.
:::
---
## Create a Subscriber
After creating your Notification Provider Service, you must create a Subscriber that registers this Service as a notification handler of events.
:::note
This section will not cover the basics of Subscribers. You can read the [Subscribers](../events/create-subscriber.md) documentation to learn more about them and how to create them.
:::
Following the previous example, to make sure the `email-sender` Notification Provider handles the `order.placed` event, create the file `src/subscribers/notification.ts` with the following content:
```ts title=src/subscribers/notification.ts
class NotificationSubscriber {
constructor({ notificationService }) {
notificationService.subscribe(
"order.placed",
"email-sender"
)
}
// ...
}
export default NotificationSubscriber
```
This subscriber accesses the `notificationService` using dependency injection. The `notificationService` contains a `subscribe` method that accepts 2 parameters. The first one is the name of the event to subscribe to, and the second is the identifier of the Notification Provider that is subscribing to that event.
:::tip
Notice that the value of the `identifier` static property defined in the `EmailSenderService` is used to register the Notification Provider to handle the `order.placed` event.
:::
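Since the same identifier can be registered for several events, one provider can handle multiple notifications. The sketch below illustrates this using a minimal stand-in class for `notificationService` that only records subscriptions, not Medusa's real service, and the `order.shipment_created` event name is used purely as an example:

```ts
// Minimal stand-in for Medusa's notificationService: it only records
// which provider subscribed to which event
class FakeNotificationService {
  subscriptions: { event: string; provider: string }[] = []

  subscribe(event: string, provider: string): void {
    this.subscriptions.push({ event, provider })
  }
}

class NotificationSubscriber {
  constructor({
    notificationService,
  }: {
    notificationService: FakeNotificationService
  }) {
    // register the same provider for several events
    notificationService.subscribe("order.placed", "email-sender")
    notificationService.subscribe("order.shipment_created", "email-sender")
  }
}
```

In a real subscriber, the provider's `sendNotification` method would then branch on the event name, as shown earlier in the `sendNotification` example.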
---
## Test Sending Notifications with your Notification Provider
Make sure you have an event bus module configured in your Medusa backend. You can learn more on how to do that in the [Configurations guide](../backend/configurations.md).
Then, run the build command in the root directory of your Medusa backend:
```bash npm2yarn
npm run build
```
Next, start your Medusa backend:
```bash npm2yarn
npx medusa develop
```
Now, try to place an order either using the [REST APIs](https://docs.medusajs.com/api/store) or using the storefront.
:::tip
If you don't have a storefront installed, you can get started with the [Next.js Starter Template](../../starters/nextjs-medusa-starter.mdx) in minutes.
:::
After placing an order, you can see in your console the message “Notification sent”. If you added your own notification sending logic, you should receive an email or, alternatively, the type of notification you've set up.
---
## Test Resending Notifications with your Notification Provider
To test resending a notification, first retrieve the ID of the notification you just sent using the [List Notifications admin endpoint](https://docs.medusajs.com/api/admin#notifications_getnotifications). You can pass the `to` or `event_name` query parameters to filter out the notification you just sent.
:::tip
You must be authenticated as an admin user before sending this request. You can use the [Authenticate a User](https://docs.medusajs.com/api/admin#auth_postauth) endpoint to get authenticated.
:::
![List Notifications Request](https://res.cloudinary.com/dza7lstvk/image/upload/v1668001650/Medusa%20Docs/Screenshots/iF1rZX1_msps2t.png)
Then, send a request to the [Resend Notification](https://docs.medusajs.com/api/admin#notifications_postnotificationsnotificationresend) endpoint using the ID retrieved from the previous request. You can pass the `to` parameter in the body to change the receiver of the notification. You should see the message “Notification resent” in your console, and if you implemented your own logic for resending the notification, it will be resent.
![Resend Notifications Request](https://res.cloudinary.com/dza7lstvk/image/upload/v1668001659/Medusa%20Docs/Screenshots/0zFfPed_og7one.png)
This request returns the same notification object as the List Notifications endpoint, but it now has a new object in the `resends` array. This is the resent notification. If you supplied a `to` parameter in the request body, you should see its value in the `to` property of the resent notification object.
---
## See Also
- [Create a Plugin](../plugins/create.mdx)
@@ -0,0 +1,109 @@
---
description: 'Learn about the Notification architecture in Medusa and the automation flow. The Notification Architecture is made up of the Notification Provider and Notification.'
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Notification Architecture Overview
This document gives an overview of the notification architecture and how it works.
## Introduction
Medusa provides a Notification API mainly to handle sending and resending notifications when an event occurs. For example, sending an email to the customer when they place an order.
The Notification architecture is made up of two main components: the Notification Provider and the Notification. Simply put, the Notification Provider handles the sending and resending of a Notification.
---
## Notification Provider
A Notification Provider handles the sending and resending of notifications. You can either create and integrate your own provider or install one through a third-party plugin.
An example of a notification provider is SendGrid. When an order is placed, the SendGrid plugin sends an email to the customer.
### How Notification Provider is Created
A Notification Provider is essentially a Medusa [Service](../services/create-service.mdx) with a unique identifier, and it extends the `AbstractNotificationService` class provided by the `@medusajs/medusa` package. It can be created as part of a [Plugin](../plugins/overview.mdx), or it can be created just as a Service file in your Medusa backend.
As a developer, you mainly work with the Notification Provider when integrating a third-party service that handles notifications in Medusa.
When you run your Medusa backend, the Notification Provider is registered in your backend. If it's a new Notification Provider, it will be inserted into the `notification_provider` table in your database.
### NotificationProvider Entity Overview
The `NotificationProvider` entity only has 2 attributes: `id` and `is_installed`.
`id` is the value of the static property `identifier` defined inside the notification Service class.
`is_installed` indicates whether the Notification Provider is installed or not. When you install a Notification Provider, the value of this attribute is `true`.
If you installed a Notification Provider and then removed the Service files or plugin that registered it, the Notification Provider remains in your database, but the value of the `is_installed` attribute changes to `false`.
---
## Notification
A notification is a form of an alert sent to the customers or users to inform them of an action that has occurred. For example, if an order is placed, the notification, in this case, can be an email that confirms their order and lists the order details.
Notifications can take on other forms such as an SMS or a Slack message.
### How Notification is Created
Notifications are created in the `NotificationService` class in Medusa's core after the Notification has been handled by the Notification Provider.
The data and additional details that the Notification Provider returns to the `NotificationService` is used to fill some of the attributes of the Notification in the database.
A Notification also represents a resent notification. So, when a notification is resent, a new one is created that references the original Notification as a parent. This Notification is also created by the `NotificationService` class.
### Notification Entity Overview
The two most important properties in the [`Notification`](../../references/entities/classes/Notification.md) entity are the `to` and `data` properties.
The `to` property is a string that represents the receiver of the Notification. For example, if the Notification was sent to an email address, the `to` property holds the email address the Notification was sent to.
The `to` property can alternatively be a phone number or a chat username. It depends on the Notification Provider and how it sends the Notification.
The `data` property is an object that holds all the data necessary to send the Notification. For example, in the case of an order confirmation Notification, it can hold data related to the order.
The `data` property is useful when a notification is resent later. The same `data` can be used to resend the notification.
In the case of resent notifications, the resent notification has a `parent_id` set to the ID of the original Notification. The value of the `parent_id` property in the original Notification is `null`.
The `Notification` entity has some properties that determine the context of this Notification. This includes the `event_name` property which is the event that triggered the sending of this notification.
Additionally, the `resource_type` property is used to determine what resource this event is associated with. For example, if the `event_name` is `order.placed`, the `resource_type` is `order`.
You can also access the specific resource using the `resource_id` property, which is the ID of the resource. So, in case of the `order.placed` event, the `resource_id` is the ID of the order that was created.
The `Notification` entity also includes properties related to the receiver of the Notification. In case the receiver is a customer, the `customer_id` property is used to identify which customer.
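Putting these properties together, a stored notification for an order confirmation might look like the following plain-object sketch (all values are illustrative, not real data):

```js
// Illustrative shape of a Notification record for an order confirmation.
const orderNotification = {
  event_name: "order.placed",    // event that triggered the notification
  resource_type: "order",        // resource the event is associated with
  resource_id: "order_123",      // ID of the order that was created
  customer_id: "cus_123",        // the receiving customer
  to: "customer@example.com",    // email, phone number, or chat username
  data: { display_id: 1001 },    // everything needed to (re)send it
  parent_id: null,               // set only on resent notifications
}
```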
---
## Automating Flows with Notifications
With Medusa you can create notifications as a reaction to a wide spectrum of events, allowing you to automate communication and processes.
An example of a flow that can be implemented using Medusa's Notification API is automated return flows:
- A customer requests a return by sending a `POST` request to the `/store/returns` endpoint.
- The Notification Provider listens to the `order.return_requested` event and sends an email to the customer with a return invoice and return label generated by the Fulfillment Provider.
- The customer returns the items triggering the `return.received` event.
- The Notification Provider listens to the `return.received` event and sends an email to the customer with confirmation that their items have been received and that a refund has been issued.
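A sketch of how such a flow can be wired up is a subscriber that registers a notification provider for both return events. The class name is hypothetical and `sendgrid` is an assumed provider identifier:

```js
// Hypothetical subscriber sketch. `notificationService` is injected
// from Medusa's dependency container; `sendgrid` is an assumed
// provider identifier.
class ReturnNotifierSubscriber {
  constructor({ notificationService }) {
    // Ask the core NotificationService to route these events to the
    // `sendgrid` provider's sendNotification method.
    notificationService.subscribe("order.return_requested", "sendgrid")
    notificationService.subscribe("return.received", "sendgrid")
  }
}
```

In a plugin, a class like this would be exported as the default export of a file under `src/subscribers` so the backend registers it on startup.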
---
## Custom Development
Developers can create custom notification providers in the Medusa backend, a plugin, or in a module.
<DocCard item={{
type: 'link',
href: '/development/notification/create-notification-provider',
label: 'Create a Notification Provider',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a notification provider in Medusa.'
}
}}
/>
---
description: "Learn about development with Medusa, fundamental concepts, and more."
hide_table_of_contents: true
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Medusa Development
This part of the documentation provides you with the fundamental concepts and guides that can help you build and customize commerce applications with Medusa.
## Introduction
Medusa is a set of tools that developers can use to build digital commerce applications. Whether you want to offer unique customer experiences, create powerful automations, or build rich commerce applications like marketplaces, Medusa provides all the necessary tools.
Other ecommerce platforms offer a finite set of features accessible through an API. Medusa is different because it provides building blocks for businesses and developers to build commerce features. This means that you can extend your commerce API for your exact needs.
Medusa's building blocks ship as NPM packages of the following types:
- [Commerce modules](../modules/overview.mdx), which are isolated commerce logic for different domains. For example, an Inventory Module.
- A core package responsible for orchestrating the different commerce modules and exposing REST APIs.
---
## How Does Medusa Work
The core package is the NPM package `@medusajs/medusa`. It's a Node.js server built with Express and other tools that offer functionalities for managing events, caching, job queues, and more.
The core package has two main objectives.
### Orchestrate Commerce Modules
When you build a commerce application with Medusa, you'll typically interact with more than one commerce module. The core package manages relationships between modules and forwards calls to them at the right time during business logic execution.
For example, imagine an Inventory module that contains lightweight logic to increment and decrement stock levels for a Stock-Keeping Unit (SKU). In a commerce application, you typically want to associate the stock levels with a specific product. Medusa offers both an Inventory module and a Product module, and the core package creates associations between these modules and executes the related business logic. So, the core package contains code similar to this:
```ts
async function handler(req, res) {
// ...
// associate a product with an inventory item
const product = await productService.create(data)
const inventoryItem = await inventoryService.create(
inventoryData
)
await productVariantInventoryService.associate(
product.id,
inventoryItem.id
)
// ...
}
```
### Expose REST APIs
The goal of orchestrating the modules is to expose an API that client applications, like websites or apps, can consume. By default, Medusa's core package exposes a REST API that offers commerce functionalities similar to what other platforms give you.
The core package also holds the logic that allows developers to extend and add custom endpoints, among other available customizations.
---
## Backend First Steps
<DocCardList colSize={4} items={[
{
type: 'link',
href: '/development/backend/install',
label: 'Backend Quickstart',
customProps: {
icon: Icons['server-stack-solid'],
description: 'Learn how to install a Medusa backend and the available next steps.'
}
},
{
type: 'link',
href: '/development/backend/prepare-environment',
label: 'Prepare Environment',
customProps: {
icon: Icons['tools-solid'],
description: 'Learn how to install the necessary tools to use Medusa.'
}
},
{
type: 'link',
href: '/development/backend/configurations',
label: 'Configure Backend',
customProps: {
icon: Icons['cog-six-tooth-solid'],
description: 'Learn how to configure the Medusa backend.'
}
},
]} />
---
## Understanding Fundamental Concepts
These concepts will guide you through your development and building customization with Medusa.
<DocCardList colSize={4} items={[
{
type: 'link',
href: '/development/entities/overview',
label: 'Entities',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'A class representation of database tables to handle commerce data.'
}
},
{
type: 'link',
href: '/development/endpoints/overview',
label: 'Endpoints',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'REST APIs that frontends consume to communicate with the backend.'
}
},
{
type: 'link',
href: '/development/services/overview',
label: 'Services',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Utility classes associated with Entities or a specific functionality.'
}
},
{
type: 'link',
href: '/development/events',
label: 'Events and Subscribers',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Perform an asynchronous task when an action occurs.'
}
},
{
type: 'link',
href: '/development/scheduled-jobs/overview',
label: 'Scheduled Jobs',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Automate tasks to be executed at specified times.'
}
},
{
type: 'link',
href: '/development/plugins/overview',
label: 'Plugins',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Publish customizations as NPM packages to be reused.'
}
},
]} />
---
## Medusa Use Cases
To better understand how you can use Medusa, here are some common use cases that Medusa is the ideal solution for.
### Ecommerce Building Blocks
Developers can set up the core package and handpick the commerce modules they want to use. This gives them great flexibility in choosing the features they want to provide in their ecommerce store, while utilizing the powerful architecture in the core package.
![Ecommerce Building Blocks](https://res.cloudinary.com/dza7lstvk/image/upload/v1678954316/Medusa%20Docs/Diagrams/ecommerce-building-blocks_llgnn2.jpg)
Developers can modify and tailor the modules that Medusa offers to their use case. They can also create custom Modules to implement new features. All these modules become building blocks that shape their ecommerce system.
### Medusa in Microservices Architectures
Medusa's commerce modules can be used in isolation from the core package and within a larger ecosystem. For example, you can use Medusa's Cart module within a blog to allow readers to buy merch.
![Medusa in Microservices Architecture](https://res.cloudinary.com/dza7lstvk/image/upload/v1678954316/Medusa%20Docs/Diagrams/microservices-architecture-use-case_vubgno.jpg)
Developers can benefit from Medusa's modules that provide essential ecommerce features while maintaining the ecommerce ecosystem of their choice. Commerce modules can be installed in your setup as NPM packages.
### Vertical Ecommerce Platforms
A Vertical Ecommerce Platform is a platform that provides features and functionalities specialized for a type of business sector. For example, a platform for pharmaceuticals.
Developers can use Medusa to build a vertical ecommerce platform, as it provides building blocks that eliminate the need to reinvent basic ecommerce features while remaining customizable enough to adapt to a specialized use case.
### Out-of-Box APIs
Since Medusas commerce modules are NPM packages, they can be installed and used in any JavaScript project.
By installing a module in your project and exposing its APIs based on the framework you're using, you can get ecommerce REST APIs right within your frontend framework without having to create a separate project.
### Full-Fledged Ecommerce System
Developers can use Medusa's toolkit to create their ecommerce system. With the use of the [create-medusa-app](../create-medusa-app.mdx) command, developers can set up a Medusa Backend, Medusa admin, and a storefront.
![Full-Fledged Ecommerce System](https://res.cloudinary.com/dza7lstvk/image/upload/v1678954316/Medusa%20Docs/Diagrams/fully-fledged-ecom_qqwraq.jpg)
Developers can still benefit from the customization opportunities that Medusa provides here. This includes creating resources such as endpoints and services, creating plugins, integrating third-party services, creating a custom storefront, and more.
### Your Own Use Case
Medusa's vision is to allow developers to build anything they want using it. There are no limitations to what you can build and the ideas you can come up with. If you have an idea, you can use Medusa's tools to start building it.
---
description: 'Learn how to create a plugin in Medusa. This guide explains how to develop, configure, and test a plugin.'
addHowToData: true
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
import LearningPath from '@site/src/components/LearningPath';
# How to Create a Plugin
In this document, you'll learn how to create a plugin, along with some development tips. If you're interested in learning more about what plugins are and where to find available official and community plugins, check out the [overview document](./overview.mdx).
Alternatively, you can follow this recipe to create a plugin with step-by-step guidance.
<LearningPath pathName="plugin" />
## Prerequisites
You must have an existing Medusa project that you want to create the plugin with.
The recommended way to create a plugin is using the `new` command from Medusa CLI:
```bash
npx @medusajs/medusa-cli@latest new medusa-plugin-custom
```
Where `medusa-plugin-custom` is the name of the plugin you're creating. In Medusa, plugins are named based on their functionalities.
By convention, all plugin names start with `medusa` followed by a descriptive name of what the plugin does. For example, the Stripe plugin is named `medusa-payment-stripe`.
---
## Changes to package.json
### Package Name
By default, your package name in `package.json` will be `medusa-starter-default`. This should instead be the name of your plugin. For example, the Stripe plugin's package name is `medusa-payment-stripe`.
### Change Dependencies
A basic Medusa backend installed with the `medusa new` command has dependencies that are necessary for the backend, but not necessary for plugins.
For example, you can remove the dependencies `medusa-fulfillment-manual`, `medusa-payment-manual`, and `medusa-payment-stripe`, as they are fulfillment and payment plugins necessary for a Medusa backend but not for a plugin. The same goes for modules like `@medusajs/cache-inmemory`.
Additionally, you can remove `@medusajs/medusa-cli` as you don't need to use the Medusa CLI while developing a plugin.
You should also add `@medusajs/medusa` as a peer dependency:
```json
"peerDependencies": {
"@medusajs/medusa": "YOUR_MEDUSA_VERSION",
// other peer dependencies...
}
```
Where `YOUR_MEDUSA_VERSION` is the version you're using of the Medusa core package. You should be able to find it under `devDependencies`.
Once youre done making these changes, re-run the install command to update your `node_modules` directory:
```bash npm2yarn
npm install
```
Then, make sure to remove the plugins and modules you removed from `medusa-config.js`:
```js title=medusa-config.js
// previously had plugins
const plugins = []
// previously had modules
const modules = {}
```
### Changes for Admin Plugins
If your plugin contains customizations to the admin dashboard, it's recommended to create different `tsconfig` files for backend and admin customizations, then modify the scripts in `package.json` to handle building backend and admin customizations separately.
:::note
These changes may already be available in your Medusa project. They're included here for reference purposes.
:::
Start by updating your `tsconfig.json` with the following configurations:
```json title=tsconfig.json
{
"compilerOptions": {
"target": "es2019",
"module": "commonjs",
"allowJs": true,
"checkJs": false,
"jsx": "react-jsx",
"declaration": true,
"outDir": "./dist",
"rootDir": "./src",
"experimentalDecorators": true,
"emitDecoratorMetadata": true,
"noEmit": false,
"strict": false,
"moduleResolution": "node",
"esModuleInterop": true,
"resolveJsonModule": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true
},
"include": ["src/"],
"exclude": [
"dist",
"build",
".cache",
"tests",
"**/*.spec.js",
"**/*.spec.ts",
"node_modules",
".eslintrc.js"
]
}
```
The important changes to note here are the addition of the field `"jsx": "react-jsx"` and the addition of `"build"` and `".cache"` to `exclude`.
The `"jsx": "react-jsx"` field specifies how TypeScript should transform JSX, and excluding `build` and `.cache` ensures that TypeScript ignores build and development files.
Next, create the file `tsconfig.server.json` with the following content:
```json title=tsconfig.server.json
{
"extends": "./tsconfig.json",
"compilerOptions": {
/* Emit a single file with source maps instead of having a separate file. */
"inlineSourceMap": true
},
"exclude": ["src/admin", "**/*.spec.js"]
}
```
This is the configuration that will be used to transpile your custom backend code, such as services or entities. The important part is that it excludes `src/admin` as that is where your Admin code will live.
Then, create the file `tsconfig.admin.json` with the following content:
```json title=tsconfig.admin.json
{
"extends": "./tsconfig.json",
"compilerOptions": {
"module": "esnext"
},
"include": ["src/admin"],
"exclude": ["**/*.spec.js"]
}
```
This is the configuration that will be used when transpiling your admin code.
Finally, update the `build` scripts in your project and add a new `prepare` command:
```json title=package.json
"scripts": {
// other scripts...
"build": "cross-env npm run clean && npm run build:server && npm run build:admin",
"build:server": "cross-env npm run clean && tsc -p tsconfig.json",
"build:admin": "cross-env medusa-admin build",
"prepare": "cross-env NODE_ENV=production npm run build:server && medusa-admin bundle"
}
```
Each of these scripts does the following:
- `build`: used to build resources for both admin and backend for development. You'll typically use this script during your plugin development.
- `build:server`: used to build backend resources for development.
- `build:admin`: used to build admin resources for development.
- `prepare`: used to build resources for publishing. You'll typically use this script during plugin testing and publishing.
Furthermore, make sure to add `react` to `peerDependencies` along with `react-router-dom` if you're using it:
```json title=package.json
"peerDependencies": {
// other dependencies...
"react": "^18.2.0",
"react-router-dom": "^6.13.0"
}
```
### Delete Irrelevant Files
If you've installed the Medusa backend using the [create-medusa-app](../../create-medusa-app.mdx) command, you might find files under the `src` sub-directories that aren't necessary for your plugin development. For example, `src/model/onboarding.ts` or migrations under the `src/migrations` directory.
Make sure to delete these files if you're not using them in your plugin.
---
## Plugin Development
### Plugin Structure
While developing your plugin, you can create your TypeScript or JavaScript files under the `src` directory. This includes creating services, endpoints, migrations, and other resources.
However, before you test the changes on a Medusa backend or publish your plugin, you must transpile your files and move them either to a `dist` directory or to the root of the plugin's directory.
For example, if you have an endpoint in `src/api/index.js`, after running the `build` script [as defined earlier](#changes-for-admin-plugins), the file should be transpiled into `dist/api/index.js` in your plugin's root. You can alternatively transpile it into `api/index.js` in your plugin's root.
:::note
It was previously required to output your files into the root of the plugin's directory (for example, `api/index.js` instead of `dist/api/index.js`). As of v1.8, you can either have your files in the root of the directory or under the `dist` directory.
:::
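For example, a plugin providing an endpoint and a service might look like this before and after the build step (the file names are illustrative):

```text
medusa-plugin-custom/
├── src/
│   ├── api/
│   │   └── index.ts      # source — not loaded by the backend
│   └── services/
│       └── my.ts
└── dist/                 # created by the build script
    ├── api/
    │   └── index.js      # what the Medusa backend actually loads
    └── services/
        └── my.js
```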
### Development Resources
This guide doesn't cover how to create different files and components. If you're interested in learning how to do that, you can check out these guides:
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/entities/create',
label: 'Create an Entity',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an entity.'
}
},
{
type: 'link',
href: '/development/services/create-service',
label: 'Create a Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a service.'
}
},
{
type: 'link',
href: '/development/endpoints/create',
label: 'Create an Endpoint',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an endpoint.'
}
},
{
type: 'link',
href: '/development/events/create-subscriber',
label: 'Create a Subscriber',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a subscriber.'
}
},
{
type: 'link',
href: '/admin/widgets',
label: 'Create an Admin Widget',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an admin widget.',
badge: {
variant: 'orange',
children: 'Beta'
}
}
},
{
type: 'link',
href: '/admin/routes',
label: 'Create an Admin UI Route',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create an admin UI route.',
badge: {
variant: 'orange',
children: 'Beta'
}
}
},
]} />
If you're developing something specific, such as a payment processor plugin, you can follow one of the following guides to learn how to create different services within your plugin.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/modules/carts-and-checkout/backend/add-payment-provider',
label: 'Create a Payment Processor',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a payment processor.'
}
},
{
type: 'link',
href: '/modules/carts-and-checkout/backend/add-fulfillment-provider',
label: 'Create a Fulfillment Provider',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a fulfillment provider.'
}
},
]} />
<DocCardList colSize={4} items={[
{
type: 'link',
href: '/development/search/create',
label: 'Create a Search Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a search service.'
}
},
{
type: 'link',
href: '/development/file-service/create-file-service',
label: 'Create a File Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a file service.'
}
},
{
type: 'link',
href: '/development/notification/create-notification-provider',
label: 'Create a Notification Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a notification service.'
}
},
]} />
### Plugin Options
Plugins often allow developers that will later use them to provide their own options. For example, you can allow developers to specify the API key of a service you're integrating.
Developers that use your plugin will pass options to your plugin in the `plugins` array in `medusa-config.js`:
```js title=medusa-config.js
const plugins = [
// ...
{
resolve: `medusa-plugin-custom`,
options: {
name: "My Store",
},
},
]
```
In your plugin's services, you can access these options in the constructor. The options are passed as a second parameter to the `constructor` method.
For example:
```js title=src/service/my.ts
// In a service in your plugin
class MyService extends TransactionBaseService {
constructor(container, options) {
super(container)
// options contains plugin options
this.name = options.name
}
// ...
}
```
You can also access the options in your plugin's endpoints. The second parameter that the function declared in `src/api/index.ts` receives is an object including your plugin's configurations.
For example:
```js title=src/api/index.ts
// in an endpoint in your plugin
export default (rootDirectory, options) => {
// options contain the plugin options
const router = Router()
router.get("/hello-world", (req, res) => {
res.json({
message:
`Welcome to ${options.name ? options.name : "Medusa"}!`,
})
})
return router
}
```
:::tip
Make sure to document in your plugin's README the options that can be passed to it.
:::
### enableUI Plugin Option
All plugins accept an option named `enableUI`. This option is useful mainly if your plugin contains admin customizations. It allows users to enable or disable admin customizations in the admin dashboard.
A developer using your plugin can pass the `enableUI` option as part of the plugin's options:
```js title=medusa-config.js
const plugins = [
// ...
{
resolve: `medusa-plugin-custom`,
options: {
// other options
enableUI: true,
},
},
]
```
If you're passing your plugin options to third-party services, make sure to omit it from the plugin options you receive in your resources, such as services. The `enableUI` option will always be passed as part of your plugin options.
For example:
```js title=src/service/test.ts
// In a service in your plugin
class MyService extends TransactionBaseService {
constructor(container, options) {
super(container)
// options contains plugin options
const { enableUI, ...otherOptions } = options
// pass otherOptions to a third-party service
const client = new Client(otherOptions)
}
// ...
}
```
:::note
`enableUI`'s default value is `false` if not provided by the plugin users. This means that it must be enabled manually in a plugin's options for the customizations to appear in the admin dashboard.
:::
---
## Test Your Plugin
While you develop your plugin, you'll need to test it on an actual Medusa backend. This can be done using the [npm link](https://docs.npmjs.com/cli/v8/commands/npm-link) command.
### Step 1: Build Changes
<Tabs groupId="plugin-preference">
<TabItem value="without-admin" label="Without Admin Customizations" default>
In the root of your plugin directory, run the `build` command:
```bash
npm run build
```
</TabItem>
<TabItem value="with-admin" label="With Admin Customizations">
In the root of your plugin directory, run the `prepare` command:
```bash
npm run prepare
```
If the `prepare` script is not available in your project, you can find it in [this section](#changes-for-admin-plugins).
</TabItem>
</Tabs>
### Step 2: Link Package
In the root of your plugin directory, run the following command:
```bash npm2yarn
npm link
```
Then, in the directory of the Medusa backend you want to test the plugin on, run the following command:
```bash npm2yarn
npm link medusa-plugin-custom
```
Where `medusa-plugin-custom` is the package name of your plugin.
### Step 3: Remove Medusa Dependency
As your plugin has the `@medusajs/medusa` package installed, and the Medusa backend has `@medusajs/medusa` installed as well, this can cause dependency errors.
To avoid that, remove the `@medusajs` directory from the `node_modules` of your plugin's directory. For Unix-based operating systems you can use the following command:
```bash
rm -rf node_modules/@medusajs
```
### Step 4: Add Plugin to Configurations
In the `medusa-config.js` file of the Medusa backend you're testing the plugin on, add your custom plugin to the `plugins` array:
```js
const plugins = [
// other plugins...
{
resolve: `medusa-plugin-custom`,
options: {
// plugin options...
// if plugin has admin customizations:
enableUI: true,
},
},
]
```
Make sure to change `medusa-plugin-custom` with the name of your plugin. Also, if your plugin has admin customizations, make sure to include the [enableUI](#enableui-plugin-option) option.
### (Optional) Step 5: Run Migrations
If your plugin includes migrations, run the following command in the Medusa backend's directory:
```bash
npx medusa migrations run
```
### Step 6: Run the Medusa Backend
In the directory of the Medusa backend, start the backend with the `dev` command passing it the `--preserve-symlinks` option:
```bash npm2yarn
npm run dev -- -- --preserve-symlinks
```
### Making Changes to the Plugin
While testing your plugin, if you need to make changes, first re-install the plugin's dependencies:
```bash npm2yarn
npm install
```
Then, after making the changes, run the steps [one](#step-1-build-changes), [three](#step-3-remove-medusa-dependency), and [six](#step-6-run-the-medusa-backend) mentioned above.
### Troubleshoot Errors
#### Error: The class must be a valid service implementation
Make sure that your plugin follows the correct structure. If the error persists, try the following fix:
```bash npm2yarn
cd <BACKEND_PATH>/node_modules/medusa-interfaces
npm link
cd <BACKEND_PATH>/node_modules/@medusajs/medusa
npm link
cd <PLUGIN_PATH>
rm -rf node_modules/medusa-interfaces
rm -rf node_modules/@medusajs/medusa
npm link medusa-interfaces
npm link @medusajs/medusa
npm link
cd <BACKEND_PATH>
npm link your-plugin
```
Where `<BACKEND_PATH>` is the path to your Medusa backend and `<PLUGIN_PATH>` is the path to your plugin.
This links the `medusa-interfaces` and `@medusajs/medusa` packages from your `medusa-backend` to your plugin directory and then links your plugin to your `medusa-backend`.
#### APIs not loading
If the APIs you added to your Medusa backend aren't loading, try the following steps:
```bash npm2yarn
cd <PLUGIN_PATH>
rm -rf node_modules
cd <BACKEND_PATH>/node_modules/<PLUGIN_NAME>
npm install
cd <PLUGIN_PATH>
npm run build
cd <BACKEND_PATH>
npm run start
```
Where `<BACKEND_PATH>` is the path to your Medusa backend, `<PLUGIN_PATH>` is the path to your plugin, and `<PLUGIN_NAME>` is the name of your plugin as it appears in your plugin's `package.json` file.
---
## Publish Plugin
Once you're done with the development of the plugin, you can publish it to NPM so that other Medusa developers and users can use it.
Please refer to [this guide on required steps to publish a plugin](./publish.mdx).
---
description: 'Learn what Plugins are and how they are used in Medusa. Plugins are re-usable customizations that can be added to a Medusa backend.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
import LearningPath from '@site/src/components/LearningPath';
# Plugins
In this document, you'll get an overview of plugins in Medusa, where to find them, and how to install them. If you want to learn how to create a plugin, check out [this guide](create.mdx) instead.
## Overview
Medusa was built with flexibility and extendability in mind. All different components and functionalities in Medusa are built with an abstraction layer that gives developers the freedom to choose what services they want to use or how to implement a certain component in their ecommerce store.
Developers can use plugins to take advantage of this abstraction, flexibility, and extendability. Plugins allow developers to implement custom features or integrate third-party services into Medusa.
For example, if you want to use Stripe as a payment processor in your store, then you can install the Stripe plugin on your backend and use it.
An alternative approach is developing a custom way of handling payment on your ecommerce store. Both approaches are achievable by either creating a plugin or using an existing plugin.
Plugins run within the same process as the core Medusa backend, eliminating the need for extra backend capacity, infrastructure, and maintenance. As a result, plugins can use all other services as dependencies and access the database.
Plugins can contain customizations to the Medusa backend or the admin dashboard.
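As a minimal illustration of that access (the class and method names are hypothetical), a service defined in a plugin can receive core services through Medusa's dependency container:

```js
// Hypothetical plugin service. `orderService` is resolved from the same
// process as the core backend — no extra infrastructure is required.
class OrderSyncService {
  constructor({ orderService }) {
    this.orderService_ = orderService
  }

  async syncOrder(orderId) {
    // Full access to core services and, through them, the database.
    const order = await this.orderService_.retrieve(orderId)
    // ... forward the order to a third-party system here ...
    return order
  }
}
```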
<LearningPath pathName="plugin" />
---
## Using Existing Plugins
### Official Plugins
Medusa has official plugins that cover different aspects and functionalities such as payment, Content Management System (CMS), fulfillment, and notifications. You can check out the available plugins under the [Plugins section of this documentation](../../plugins/overview.mdx).
:::tip
To feature your plugin in our repository, you can send a pull request that adds your plugin into the `packages` directory. Our team will review your plugin and, if approved, will merge the pull request and add your plugin in the repository.
:::
### Community Plugins
You can find community plugins by [searching NPM for the `medusa` or `medusa-plugin` keywords](https://www.npmjs.com/search?q=keywords%3Amedusa%2Cmedusa-plugin).
You can also check the [Awesome Medusa repository](https://github.com/adrien2p/awesome-medusajs#plugins) for a list of community plugins among other resources.
---
## How to Install a Plugin
To install an existing plugin, in your Medusa backend run the following command:
```bash npm2yarn
npm install <plugin_name>
```
Where `<plugin_name>` is the package name of the plugin. For example, if you're installing the Stripe plugin, `<plugin_name>` is `medusa-payment-stripe`.
### Plugin Configuration
If you're installing an official plugin from the Medusa repository, its `README.md` file includes a list of configurations that are either required or optional. You can also refer to the documentation related to that plugin for more details on how to install, configure, and use it.
For community plugins, please refer to the installation instructions of that plugin to learn about any required configurations.
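As an illustration, a plugin's configurations are typically passed in `medusa-config.js` within the `plugins` array. The plugin name and option below are placeholders, not a real plugin's configuration — refer to the plugin's own documentation for the options it actually accepts:

```js title=medusa-config.js
const plugins = [
  // other plugins...
  {
    // placeholder plugin name
    resolve: `medusa-plugin-example`,
    options: {
      // placeholder option — check the plugin's README
      // for its actual required and optional options
      api_key: process.env.EXAMPLE_API_KEY,
    },
  },
]
```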
### enableUI Plugin Option
All plugins accept an option named `enableUI`. This option allows you to disable admin customizations from appearing in the admin dashboard.
:::note
`enableUI` defaults to `false` if the plugin user doesn't provide it. This means that it must be enabled manually in a plugin's configuration for the customizations to appear in the admin dashboard.
:::
You can set the `enableUI` value by passing it as part of the plugin's configurations:
```js title=medusa-config.js
const plugins = [
// ...
{
resolve: `medusa-plugin-custom`,
options: {
// other options
enableUI: true,
},
},
]
```
---
## Custom Development
Developers can create plugins and reuse them across different Medusa backends. They can also share them with the community to help out other developers.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/plugins/create',
label: 'Create a Plugin',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create plugins in Medusa.'
}
},
{
type: 'link',
href: '/development/plugins/publish',
label: 'Publish a Plugin',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to publish a plugin to NPM.'
}
},
]} />
---
description: 'Learn how to publish a Medusa plugin to NPM. This guide lists some check lists to ensure you have implemented before publishing, as well as required steps.'
addHowToData: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# How to Publish a Plugin
In this document, you'll learn how to publish a Medusa plugin to NPM and what requirements to keep in mind before publishing. Afterwards, your plugin will be published on the [Medusa Plugins page](https://medusajs.com/plugins/).
## Prerequisites
If you haven't created a plugin yet, please check [this guide to learn how to create a plugin](./create.mdx).
---
## Prepare Plugin
### package.json Checklist
Before publishing your plugin, make sure you've set the following fields in your plugin's `package.json`:
- `name`: The name of your plugin. By convention, all plugin names start with `medusa` followed by a descriptive name of what the plugin does. For example, `medusa-payment-stripe`.
- `description`: A short description of what the plugin does.
- `author`: Your name or your company's name.
- `repository`: This includes details about the repository that holds the source code of the plugin. It's an object that holds the following properties:
- `type`: Should be `git`.
- `url`: The URL to the repository (for example, the GitHub repository holding the code of your plugin).
- `keywords`: An array of keywords that are related to the plugin. All Medusa plugins are required to use the keyword `medusa-plugin`. Other recommended keywords are:
- `medusa-plugin-analytics`: For plugins that add analytics functionalities or integrations.
- `medusa-plugin-cms`: For plugins that add CMS functionalities or integrations.
- `medusa-plugin-notification`: For plugins that add notification functionalities or integrations.
- `medusa-plugin-payment`: For plugins that add payment functionalities or integrations.
- `medusa-plugin-search`: For plugins that add search functionalities or integrations.
- `medusa-plugin-shipping`: For plugins that add shipping functionalities or integrations.
- `medusa-plugin-storage`: For plugins that add a file service or storage integration.
- `medusa-plugin-source`: For plugins that help migrate or import data into Medusa from another platform.
- `medusa-plugin-storefront`: For storefronts that can be integrated with a Medusa backend.
- `medusa-plugin-admin`: For plugins that include only admin customizations.
- `medusa-plugin-other`: For any other type of plugin.
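Putting the checklist together, the relevant fields in a plugin's `package.json` might look like the following — the name, author, and repository URL are placeholders:

```json title=package.json
{
  "name": "medusa-plugin-custom",
  "description": "A short description of what the plugin does.",
  "author": "Your Name",
  "repository": {
    "type": "git",
    "url": "https://github.com/your-username/medusa-plugin-custom"
  },
  "keywords": ["medusa-plugin", "medusa-plugin-other"]
}
```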
### Scripts in package.json
<Tabs groupId="plugin-preference">
<TabItem value="without-admin" label="Without Admin Customizations" default>
Make sure you add the following `build` and `prepare` scripts to your `scripts` field:
```json title=package.json
"scripts": {
// other scripts...
"build": "cross-env npm run clean && tsc -p tsconfig.json",
"prepare": "cross-env NODE_ENV=production npm run build"
}
```
The `build` script ensures that the plugin's built files are placed as explained in the [plugin structure](./create.mdx#plugin-structure) section of the Create Plugin documentation.
The `prepare` script facilitates your publishing process. You would typically run this script before publishing your plugin.
</TabItem>
<TabItem value="with-admin" label="With Admin Customizations">
First, make sure to change `tsconfig` files as recommended in the [create guide](./create.mdx#changes-for-admin-plugins).
Then, add the following `build:server` and `prepare` scripts to your `scripts` field:
```json title=package.json
"scripts": {
// other scripts...
"build:server": "cross-env npm run clean && tsc -p tsconfig.json",
"prepare": "cross-env NODE_ENV=production npm run build:server && medusa-admin bundle"
}
```
The `build:server` script builds the resources of the backend for development and ensures they are placed as explained in the [plugin structure](./create.mdx#plugin-structure) section of the Create Plugin documentation.
The `prepare` script creates a production build of both backend and admin resources.
</TabItem>
</Tabs>
### Plugin Structure
Make sure your plugin's structure is as described in the [Create Plugin](./create.mdx#plugin-structure) documentation. If you've made the changes to the scripts in `package.json` mentioned in [the section above](#scripts-in-packagejson), you should have the correct structure when you run the `prepare` command.
### NPM Ignore File
Not all files that you use while developing your plugin need to be published.
For example, the files you add in the `src` directory are compiled to the root of the plugin directory before publishing. Then, when a developer installs your plugin, they'll just be using the files in the root.
So, you can ignore files and directories like `src` from the final published NPM package.
To do that, create the file `.npmignore` with the following content:
```bash title=.npmignore
/lib
node_modules
.DS_store
.env*
/*.js
!index.js
yarn.lock
src
.gitignore
.eslintrc
.babelrc
.prettierrc
build
.cache
.yarn
uploads
# These are files that are included in a
# Medusa project and can be removed from a
# plugin project
medusa-config.js
Dockerfile
medusa-db.sql
develop.sh
```
---
## Publish Plugin
This section explains how to publish your plugin to NPM.
Before you publish a plugin, you must [create an account on NPM](https://www.npmjs.com/signup).
### Step 1: Run Prepare Command
Before you publish or update your plugin, make sure to run the `prepare` command [defined earlier](#scripts-in-packagejson):
```bash npm2yarn
npm run prepare
```
### Step 2: Publish Plugin Package
You can publish your package with the following command:
```bash
npm publish
```
If you haven't logged in with your NPM account before, you'll be asked to log in first.
Your package is then published on NPM, and anyone can install and use it.
---
## Install Plugin
To install your published plugin, run the following command on any Medusa backend project:
```bash npm2yarn
npm install medusa-plugin-custom
```
Where `medusa-plugin-custom` is your plugin's package name.
---
## Update Plugin
If you make changes to your plugin and you want to publish those changes, run the following command to change the NPM version:
```bash
npm version <type>
```
Where `<type>` indicates the type of version update you're publishing. For example, it can be `major` or `minor`. You can see the [full list of types in NPM's documentation](https://docs.npmjs.com/cli/v8/commands/npm-version).
Then, publish the new update:
```bash
npm publish
```
---
description: 'Learn how to implement publishable API key functionalities for admins in Medusa using the REST APIs. This includes how to list, create, update, and delete a publishable API key.'
addHowToData: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# How to Manage Publishable API Keys
In this document, you'll learn how to manage publishable API keys using the admin APIs.
## Overview
Publishable API keys allow you to associate a set of resources with an API key. You can then pass that API key in storefront requests to guarantee that processed or returned data is within the scope you defined when creating the API key.
Currently, publishable API keys can only be associated with sales channels.
Using the Admin APIs, you can manage your Publishable API Keys.
:::note
Publishable API keys are only for client-side use. They can be publicly accessible in your code, as they are not authorized for the Admin API.
:::
### Scenario
You want to use or implement the following admin functionalities:
- Manage publishable keys including listing, creating, updating, and deleting them.
- Manage sales channels associated with a publishable API key including listing, adding, and deleting them from the publishable API key.
---
## Prerequisites
### Medusa Components
It is assumed that you already have a Medusa backend installed and set up. If not, you can follow the [quickstart guide](../../backend/install.mdx) to get started.
### JS Client
This guide includes code snippets to send requests to your Medusa backend using Medusa's JS Client, among other methods.
If you follow the JS Client code blocks, it's assumed you already have [Medusa's JS Client](../../../js-client/overview.md) installed and have [created an instance of the client](../../../js-client/overview.md#configuration).
### Medusa React
This guide also includes code snippets to send requests to your Medusa backend using Medusa React, among other methods.
If you follow the Medusa React code blocks, it's assumed you already have [Medusa React installed](../../../medusa-react/overview.mdx) and have [used MedusaProvider higher in your component tree](../../../medusa-react/overview.mdx#usage).
### Authenticated Admin User
You must be an authenticated admin user before following along with the steps in the tutorial.
You can learn more about [authenticating as an admin user in the API reference](https://docs.medusajs.com/api/admin#authentication).
---
## List Publishable API Keys
You can retrieve a list of publishable API keys by sending a request to the [List Publishable API Keys](https://docs.medusajs.com/api/admin#publishable-api-keys_getpublishableapikeys) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.list()
.then(({ publishable_api_keys, count, limit, offset }) => {
console.log(publishable_api_keys)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { PublishableApiKey } from "@medusajs/medusa"
import { useAdminPublishableApiKeys } from "medusa-react"
const PublishableApiKeys = () => {
  const { publishable_api_keys, isLoading } =
    useAdminPublishableApiKeys()
  return (
    <div>
      {isLoading && <span>Loading...</span>}
      {publishable_api_keys && !publishable_api_keys.length && (
        <span>No Publishable API Keys</span>
      )}
      {publishable_api_keys &&
        publishable_api_keys.length > 0 && (
          <ul>
            {publishable_api_keys.map(
              (publishableApiKey: PublishableApiKey) => (
                <li key={publishableApiKey.id}>
                  {publishableApiKey.title}
                </li>
              )
            )}
          </ul>
        )}
    </div>
  )
}
export default PublishableApiKeys
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
```ts
fetch(`<BACKEND_URL>/admin/publishable-api-keys`, {
credentials: "include",
})
.then((response) => response.json())
.then(({ publishable_api_keys, count, limit, offset }) => {
console.log(publishable_api_keys)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X GET '<BACKEND_URL>/admin/publishable-api-keys' \
-H 'Authorization: Bearer <API_TOKEN>'
```
</TabItem>
</Tabs>
This request does not require any path parameters. You can pass it optional query parameters related to expanding fields and pagination, which you can check out in the [API reference](https://docs.medusajs.com/api/admin#publishable-api-keys_getpublishableapikeys).
This request returns the following data in the response:
- `publishable_api_keys`: An array of publishable API keys.
- `limit`: The maximum number of keys that can be returned.
- `offset`: The number of keys skipped in the result.
- `count`: The total number of keys available.
:::note
You can learn more about pagination in the [API reference](https://docs.medusajs.com/api/admin#pagination).
:::
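For example, when using the Fetch API, pagination parameters can be appended to the request URL as query parameters. This is a minimal sketch — the `limit` and `offset` values are arbitrary, and `<BACKEND_URL>` is a placeholder for your backend's URL:

```ts
// build the pagination query string
const params = new URLSearchParams({
  limit: "20",  // maximum number of keys to return
  offset: "40", // number of keys to skip
})

// the resulting request URL to pass to fetch
const url = `<BACKEND_URL>/admin/publishable-api-keys?${params.toString()}`
```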
---
## Create a Publishable API Key
You can create a publishable API key by sending a request to the [Create Publishable API Key](https://docs.medusajs.com/api/admin#publishable-api-keys_postpublishableapikeys) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.create({
title: "Web API Key",
})
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { useAdminCreatePublishableApiKey } from "medusa-react"
const CreatePublishableApiKey = () => {
const createKey = useAdminCreatePublishableApiKey()
// ...
const handleCreate = (title: string) => {
createKey.mutate({
title,
})
}
// ...
}
export default CreatePublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
```ts
fetch(`<BACKEND_URL>/admin/publishable-api-keys`, {
method: "POST",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
title: "Web API Key",
}),
})
.then((response) => response.json())
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/publishable-api-keys' \
-H 'Authorization: Bearer <API_TOKEN>' \
-H 'Content-Type: application/json' \
--data-raw '{
"title": "Web API Key"
}'
```
</TabItem>
</Tabs>
This request requires a body parameter `title`, which is the title of the Publishable API Key.
It returns the created publishable API key in the response.
---
## Update a Publishable API Key
You can update a publishable API key's details by sending a request to the [Update Publishable API Key](https://docs.medusajs.com/api/admin#publishable-api-keys_postpublishableapikyspublishableapikey) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.update(publishableApiKeyId, {
title: "Web API Key",
})
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { useAdminUpdatePublishableApiKey } from "medusa-react"
const UpdatePublishableApiKey = () => {
const updateKey = useAdminUpdatePublishableApiKey(
publishableApiKeyId
)
// ...
const handleUpdate = (title: string) => {
updateKey.mutate({
title,
})
}
// ...
}
export default UpdatePublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}`, {
method: "POST",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
title: "Web API Key",
}),
})
.then((response) => response.json())
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>' \
-H 'Authorization: Bearer <API_TOKEN>' \
-H 'Content-Type: application/json' \
--data-raw '{
"title": "Web API Key"
}'
```
</TabItem>
</Tabs>
This request requires the ID of the publishable API key as a path parameter. In its body, it optionally accepts the new `title` of the publishable API key.
This request returns the updated publishable API key object in the response.
---
## Revoke a Publishable API Key
Revoking a publishable API key doesn't remove it, but it prevents the key from being used in future requests.
You can revoke a publishable API key by sending a request to the [Revoke Publishable API Key](https://docs.medusajs.com/api/admin#publishable-api-keys_postpublishableapikeyspublishableapikeyrevoke) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.revoke(publishableApiKeyId)
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { useAdminRevokePublishableApiKey } from "medusa-react"
const PublishableApiKey = () => {
const revokeKey = useAdminRevokePublishableApiKey(
publishableApiKeyId
)
// ...
const handleRevoke = () => {
revokeKey.mutate()
}
// ...
}
export default PublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(
`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}/revoke`,
{
method: "POST",
credentials: "include",
}
)
.then((response) => response.json())
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>/revoke' \
-H 'Authorization: Bearer <API_TOKEN>'
```
</TabItem>
</Tabs>
This request requires the ID of the publishable API key as a path parameter. It returns the updated publishable API key in the response.
---
## Delete a Publishable API Key
You can delete a publishable API key by sending a request to the [Delete Publishable API Key](https://docs.medusajs.com/api/admin#publishable-api-keys_deletepublishableapikeyspublishableapikey) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.delete(publishableApiKeyId)
.then(({ id, object, deleted }) => {
console.log(id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { useAdminDeletePublishableApiKey } from "medusa-react"
const PublishableApiKey = () => {
const deleteKey = useAdminDeletePublishableApiKey(
publishableApiKeyId
)
// ...
const handleDelete = () => {
deleteKey.mutate()
}
// ...
}
export default PublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}`, {
method: "DELETE",
credentials: "include",
})
.then((response) => response.json())
.then(({ id, object, deleted }) => {
console.log(id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X DELETE '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>' \
-H 'Authorization: Bearer <API_TOKEN>'
```
</TabItem>
</Tabs>
This request requires the ID of the publishable API key as a path parameter.
It returns the following data in the response:
- `id`: The ID of the deleted publishable API key.
- `object`: A string indicating the type of object deleted. By default, its value is `publishable_api_key`.
- `deleted`: A boolean value indicating whether the publishable API key was deleted or not.
---
## Manage Sales Channels of Publishable API Keys
This section covers how to manage sales channels in a publishable API key. This doesn't affect the sales channels and their data, only their association with the publishable API key.
### List Sales Channels of a Publishable API Key
You can retrieve the list of sales channels associated with a publishable API key by sending a request to the [List Sales Channels](https://docs.medusajs.com/api/admin#sales-channels_getsaleschannels) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys
.listSalesChannels(publishableApiKeyId)
.then(({ sales_channels }) => {
console.log(sales_channels)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import { SalesChannel } from "@medusajs/medusa"
import {
useAdminPublishableApiKeySalesChannels,
} from "medusa-react"
const SalesChannels = () => {
const { sales_channels, isLoading } =
useAdminPublishableApiKeySalesChannels(
publishableApiKeyId
)
return (
<div>
{isLoading && <span>Loading...</span>}
{sales_channels && !sales_channels.length && (
<span>No Sales Channels</span>
)}
{sales_channels && sales_channels.length > 0 && (
<ul>
{sales_channels.map((salesChannel: SalesChannel) => (
<li key={salesChannel.id}>{salesChannel.name}</li>
))}
</ul>
)}
</div>
)
}
export default SalesChannels
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(
`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}/sales-channels`,
{
credentials: "include",
}
)
.then((response) => response.json())
.then(({ sales_channels }) => {
console.log(sales_channels)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X GET '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>/sales-channels' \
-H 'Authorization: Bearer <API_TOKEN>'
```
</TabItem>
</Tabs>
This request requires the ID of the publishable API key as a path parameter.
It returns an array of sales channels associated with the publishable API key in the response.
### Add Sales Channels to Publishable API Key
You can add a sales channel to a publishable API key by sending a request to the [Add Sales Channels](https://docs.medusajs.com/api/admin#publishable-api-keys_postpublishableapikeysaleschannelschannelsbatch) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.addSalesChannelsBatch(
publishableApiKeyId,
{
sales_channel_ids: [
{
id: salesChannelId,
},
],
}
)
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import {
useAdminAddPublishableKeySalesChannelsBatch,
} from "medusa-react"
const PublishableApiKey = () => {
const addSalesChannels =
useAdminAddPublishableKeySalesChannelsBatch(
publishableApiKeyId
)
// ...
const handleAdd = (salesChannelId: string) => {
addSalesChannels.mutate({
sales_channel_ids: [
{
id: salesChannelId,
},
],
})
}
// ...
}
export default PublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(
`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}/sales-channels/batch`,
{
method: "POST",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
sales_channel_ids: [
{
id: salesChannelId,
},
],
}),
}
)
.then((response) => response.json())
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X POST '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>/sales-channels/batch' \
-H 'Authorization: Bearer <API_TOKEN>' \
-H 'Content-Type: application/json' \
--data-raw '{
"sales_channel_ids": [
{
"id": "<SALES_CHANNEL_ID>"
}
]
}'
```
</TabItem>
</Tabs>
This request requires passing the ID of the publishable API key as a path parameter.
In its body parameters, it's required to pass the `sales_channel_ids` array parameter. Each item in the array is an object containing an `id` property, with its value being the ID of the sales channel to add to the publishable API key.
You can add more than one sales channel in the same request by passing each of them as an object in the array.
This request returns the updated publishable API key in the response.
### Delete Sales Channels from a Publishable API Key
You can delete a sales channel from a publishable API key by sending a request to the [Delete Sales Channels](https://docs.medusajs.com/api/admin#publishable-api-keys_deletepublishableapikeysaleschannelschannelsbatch) endpoint:
<Tabs groupId="request-type" isCodeTabs={true}>
<TabItem value="client" label="Medusa JS Client" default>
```ts
medusa.admin.publishableApiKeys.deleteSalesChannelsBatch(
publishableApiKeyId,
{
sales_channel_ids: [
{
id: salesChannelId,
},
],
}
)
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="medusa-react" label="Medusa React">
```tsx
import {
useAdminRemovePublishableKeySalesChannelsBatch,
} from "medusa-react"
const PublishableApiKey = () => {
const deleteSalesChannels =
useAdminRemovePublishableKeySalesChannelsBatch(
publishableApiKeyId
)
// ...
const handleDelete = (salesChannelId: string) => {
deleteSalesChannels.mutate({
sales_channel_ids: [
{
id: salesChannelId,
},
],
})
}
// ...
}
export default PublishableApiKey
```
</TabItem>
<TabItem value="fetch" label="Fetch API">
<!-- eslint-disable max-len -->
```ts
fetch(
`<BACKEND_URL>/admin/publishable-api-keys/${publishableApiKeyId}/sales-channels/batch`,
{
method: "DELETE",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
sales_channel_ids: [
{
id: salesChannelId,
},
],
}),
}
)
.then((response) => response.json())
.then(({ publishable_api_key }) => {
console.log(publishable_api_key.id)
})
```
</TabItem>
<TabItem value="curl" label="cURL">
```bash
curl -L -X DELETE '<BACKEND_URL>/admin/publishable-api-keys/<PUBLISHABLE_API_KEY>/sales-channels/batch' \
-H 'Authorization: Bearer <API_TOKEN>' \
-H 'Content-Type: application/json' \
--data-raw '{
"sales_channel_ids": [
{
"id": "<SALES_CHANNEL_ID>"
}
]
}'
```
</TabItem>
</Tabs>
This request requires the ID of the publishable API key as a path parameter.
In its body parameters, it's required to pass the `sales_channel_ids` array parameter. Each item in the array is an object containing an `id` property, with its value being the ID of the sales channel to delete from the publishable API key.
You can delete more than one sales channel in the same request by passing each of them as an object in the array.
This request returns the updated publishable API key in the response.
---
## See Also
- [Use publishable API keys in client requests](../storefront/use-in-requests.md)
---
description: 'Learn what publishable API keys are and how they can be used in the Medusa backend. Publishable API keys can be used to scope API calls with an API key.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Publishable API Keys
In this document, you'll learn about Publishable API Keys and their architecture.
## Introduction
While using Medusa's APIs, you might have to pass certain query parameters with most or all requests to some resources.
Taking Sales Channels as an example, you have to pass the Sales Channel's ID as a query parameter to all the necessary endpoints, such as the List Products endpoint.
This is a tedious and error-prone process. This is where Publishable API Keys are useful.
Publishable API Keys can be used to scope API calls with an API key, determining what resources are retrieved when querying the API. Currently, they can be associated only with Sales Channels.
For example, you can associate an API key with a B2B channel, then, on the storefront, retrieve only products available in that channel using the API key.
---
## PublishableApiKey Entity Overview
The `PublishableApiKey` entity represents a publishable API key that is stored in the database. Some of its important attributes include:
- `id`: The ID of the publishable API key. This is the API key you'll use in your API requests.
- `created_by`: The ID of the user that created this API key.
- `revoked_by`: The ID of the user that revoked this API key. A revoked publishable API key cannot be used in requests.
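As a rough sketch, these attributes can be pictured with the following shape. This is a simplified illustration, not the actual entity definition — the exact types and the example values below are assumptions:

```ts
// simplified sketch of a publishable API key's shape (not the actual entity class)
type PublishableApiKeySketch = {
  id: string                // the API key used in API requests
  created_by: string | null // ID of the user that created the key
  revoked_by: string | null // ID of the user that revoked the key, if revoked
}

// hypothetical example values
const exampleKey: PublishableApiKeySketch = {
  id: "pk_123",
  created_by: "usr_123",
  revoked_by: null, // null means the key hasn't been revoked
}
```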
---
## Relation to Other Entities
### Sales Channels
A publishable API key can be associated with more than one sales channel, and a sales channel can be associated with more than one publishable API key.
The relation is represented by the entity `PublishableApiKeySalesChannel`.
---
## Custom Development
Developers can manage Publishable API Keys and use them when making requests to the Store APIs.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/publishable-api-keys/admin/manage-publishable-api-keys',
label: 'Admin: Manage Publishable API Keys',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to manage publishable API keys using Admin APIs.'
}
},
{
type: 'link',
href: '/development/publishable-api-keys/storefront/use-in-requests',
label: 'Storefront: Use in Requests',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to use publishable API keys in a storefront.'
}
},
]} />
---
description: 'Learn how to use Publishable API Keys in Client Requests using Medusa JS Client, Medusa React, or other methods.'
---
# Use Publishable API Keys in Client Requests
In this document, you'll learn how to use Publishable API Keys in client requests.
:::note
[Publishable API keys](../index.mdx) are only for client-side use. They can be publicly accessible in your code, as they are not authorized for the Admin API.
:::
## Default Behavior in Product Store Endpoints
If you don't pass a publishable API Key for the store endpoints `/store/products` and `/store/products/{product_id}`, the default sales channel of the store is assigned to the request.
---
## Using Medusa JS Client
When using [Medusa's JS Client](../../../js-client/overview.md), you can pass the publishable API key to the client only once, when you create the instance of the client:
```ts
const medusa = new Medusa({
maxRetries: 3,
baseUrl: "https://api.example.com",
publishableApiKey,
})
```
This adds the API key in the header parameter `x-publishable-api-key` on all requests.
You can also use the `setPublishableKey` method to set it at a later point:
```ts
const medusa = new Medusa({
// ...
})
// at a later point
medusa.setPublishableKey(publishableApiKey)
```
---
## Using Medusa React
You can pass the publishable API key to the `MedusaProvider` component:
```tsx
const App = () => {
return (
<MedusaProvider
queryClientProviderProps={{ client: queryClient }}
baseUrl="http://localhost:9000"
// ...
publishableApiKey={publishableApiKey}
>
<MyStorefront />
</MedusaProvider>
)
}
```
Then, the API key will be passed in the header parameter `x-publishable-api-key` of every request.
---
## Using Other Methods
For other ways of sending requests to your Medusa backend, such as using the Fetch API, you must pass `x-publishable-api-key` in the header of every request. Its value is the publishable API key's `id`.
```ts
fetch(`<BACKEND_URL>/store/products`, {
credentials: "include",
headers: {
"x-publishable-api-key": publishableApiKey,
},
})
```
---
## See Also
- [Manage publishable keys as an admin](../admin/manage-publishable-api-keys.mdx)
---
description: 'Learn how to create a scheduled job in Medusa. The scheduled job in this example will simply change the status of draft products to published.'
addHowToData: true
---
# How to Create a Scheduled Job
In this document, you'll learn how to create a scheduled job in Medusa.
## Overview
Medusa allows you to create scheduled jobs that run at specific times during your backend's lifetime. For example, you can synchronize your inventory with an Enterprise Resource Planning (ERP) system once a day.
This guide explains how to create a scheduled job on your Medusa backend. The scheduled job in this example will simply change the status of draft products to `published`.
---
## Prerequisites
### Medusa Components
It is assumed that you already have a Medusa backend installed and set up. If not, you can follow the [quickstart guide](../backend/install.mdx) to get started. The Medusa backend must also have an event bus module installed, which is available when using the default Medusa backend starter.
---
## 1. Create a File
Each scheduled job should reside in a [loader](../loaders/overview.mdx), which is a TypeScript or JavaScript file located under the `src/loaders` directory.
Start by creating the `src/loaders` directory. Then, inside that directory, create the JavaScript or TypeScript file that you'll add the scheduled job in. You can use any name for the file.
For the example in this tutorial, you can create the file `src/loaders/publish.ts`.
---
## 2. Create Scheduled Job
To create a scheduled job, add the following code in the file you created, which is `src/loaders/publish.ts` in this example:
```ts title=src/loaders/publish.ts
import { AwilixContainer } from "awilix"
const publishJob = async (
container: AwilixContainer,
options: Record<string, any>
) => {
const jobSchedulerService =
container.resolve("jobSchedulerService")
jobSchedulerService.create(
"publish-products",
{},
"0 0 * * *",
async () => {
// job to execute
const productService = container.resolve("productService")
const draftProducts = await productService.list({
status: "draft",
})
for (const product of draftProducts) {
await productService.update(product.id, {
status: "published",
})
}
}
)
}
export default publishJob
```
:::info
The service taking care of background jobs was renamed in v1.7.1. If you are running a previous version, use `eventBusService` instead of `jobSchedulerService`.
:::
This file should export a function that accepts `container` and `options` parameters. `container` is the dependency container that you can use to resolve services, such as the `JobSchedulerService`. `options` holds the plugin's options if this scheduled job is created in a plugin.
You then resolve the `JobSchedulerService` and use the `jobSchedulerService.create` method to create the scheduled job. This method accepts four parameters:
- The first parameter is a unique name to give to the scheduled job. In the example above, you use the name `publish-products`.
- The second parameter is an object which can be used to [pass data to the job](#pass-data-to-the-scheduled-job).
- The third parameter is the scheduled job expression pattern. In this example, it will execute the scheduled job once a day at 12 AM.
- The fourth parameter is the function to execute. This is where you add the code to execute once the scheduled job runs. In this example, you retrieve the draft products using the [ProductService](../../references/services/classes/ProductService.md) and update the status of each of these products to `published`.
:::tip
You can see examples of scheduled job expression patterns on [crontab guru](https://crontab.guru/examples.html).
:::
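For reference, a cron expression like `0 0 * * *` consists of five space-separated fields: minute, hour, day of month, month, and day of week. The following hypothetical helper (for illustration only, not part of Medusa's API) makes the fields explicit:

```ts
// Illustration only — not part of Medusa's API.
// Splits a cron expression into its five named fields.
type CronFields = {
  minute: string
  hour: string
  dayOfMonth: string
  month: string
  dayOfWeek: string
}

function parseCronFields(expression: string): CronFields {
  const parts = expression.trim().split(/\s+/)
  if (parts.length !== 5) {
    throw new Error(`Expected 5 cron fields, got ${parts.length}`)
  }
  const [minute, hour, dayOfMonth, month, dayOfWeek] = parts
  return { minute, hour, dayOfMonth, month, dayOfWeek }
}
```

Reading `0 0 * * *` this way shows why the example job fires once a day at 12 AM: minute `0`, hour `0`, and wildcards for the remaining fields.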
### Scheduled Job Name
As mentioned earlier, the first parameter of the `create` method is the name of the scheduled job. By default, if another scheduled job has the same name, your custom scheduled job will replace it.
If you want to ensure both scheduled jobs are registered and used, you can pass as a fifth parameter an options object with a `keepExisting` property set to `true`. For example:
```ts
jobSchedulerService.create(
"publish-products",
{},
"0 0 * * *",
async () => {
// ...
},
{
keepExisting: true,
}
)
```
### Pass Data to the Scheduled Job
To pass data to your scheduled job, add it to the object passed as a second parameter, under the `data` property. This is helpful if you use one function to handle multiple scheduled jobs.
For example:
```ts
jobSchedulerService.create("publish-products", {
data: {
productId,
},
}, "0 0 * * *", async (job) => {
console.log(job.data) // {productId: 'prod_124...'}
// ...
})
```
---
## 3. Run Medusa Backend
:::info
Scheduled Jobs only run while the Medusa backend is running.
:::
Before you run the Medusa backend, make sure to build your code that's under the `src` directory into the `dist` directory with the following command:
```bash npm2yarn
npm run build
```
Then, run the following command to start your Medusa backend:
```bash npm2yarn
npx medusa develop
```
If the scheduled job was registered successfully, you should see a message similar to this logged on your Medusa backend:
```bash
Registering publish-products
```
Where `publish-products` is the unique name you provided to the scheduled job.
When it's time to run your scheduled job based on its expression pattern, the job runs and you can see it logged on your Medusa backend.
For example, the above scheduled job will run at 12 AM and, when it runs, you can see the following logged on your Medusa backend:
```bash noReport
info: Processing scheduled job: publish-products
```
If you log anything in the scheduled job, for example using `console.log`, or if any errors are thrown, it'll also be logged on your Medusa backend.
:::tip
To test the previous example out instantly, you can change the scheduled job expression pattern passed as the third parameter to `jobSchedulerService.create` to `* * * * *`. This will run the scheduled job every minute.
:::
---
## See Also
- [Create a Plugin](../plugins/create.mdx)


@@ -0,0 +1,40 @@
---
description: "Learn what scheduled jobs are in Medusa. Scheduled jobs (also known as cron jobs) are tasks performed at a specific time in the Medusa Backend."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Scheduled Jobs
In this document, you'll learn what scheduled jobs are in Medusa.
## Introduction
Scheduled jobs (also known as cron jobs) are tasks performed at a specific time while the Medusa Backend is running. They're used to perform asynchronous tasks in the background.
For example, you can synchronize your inventory with an Enterprise Resource Planning (ERP) system once a day using a scheduled job.
In the Medusa Backend, the scheduled jobs queue is implemented using [Redis](https://redis.io/) and [Bull](https://www.npmjs.com/package/bull). So, for scheduled jobs to work, you must have [Redis installed and enabled](../../development/backend/configurations.md#redis).
:::tip
Future versions of Medusa will allow switching out Redis and using a different pub/sub service.
:::
---
## Custom Development
Developers can create an unlimited number of scheduled jobs within the Medusa Backend, a plugin, or a custom module.
<DocCard item={{
type: 'link',
href: '/development/scheduled-jobs/create',
label: 'Create a Scheduled Job',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a scheduled job.'
}
}} />


@@ -0,0 +1,416 @@
---
description: 'Learn how to create a search service in Medusa. You can create the search service directly in your Medusa backend codebase, in a plugin, or in a module.'
addHowToData: true
---
# How to Create a Search Service
In this document, you'll learn how to create a search service in Medusa. You can create the search service directly in your Medusa backend codebase, in a plugin, or in a module.
## Prerequisites
### Medusa Utils Package
A search service must extend the `AbstractSearchService` class imported from the `@medusajs/utils` package. If you don't already have the package installed, run the following command to install it within your project:
```bash npm2yarn
npm install @medusajs/utils
```
---
## Step 1: Create Search Service Class
A search service class should be defined in a TypeScript or JavaScript file created in the `src/services` directory. The class must extend the `AbstractSearchService` class imported from the `@medusajs/utils` package.
Based on the service naming conventions, the file's name should be the slug version of the search service's name without `service`, and the class's name should be the pascal case of the search service's name followed by `Service`.
For example, if you're creating an algolia search service, the file name would be `algolia.ts`, whereas the class name would be `AlgoliaService`.
:::note
You can learn more about services and their naming convention in [this documentation](../services/overview.mdx).
:::
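As a rough illustration of this convention (hypothetical helpers, not Medusa utilities), deriving the class and registration names from a file name could look like:

```ts
// Illustration only — Medusa performs this registration internally.
// "my-search" → class "MySearchService", registered as "mySearchService"
function toClassName(fileName: string): string {
  const pascalCase = fileName
    .split("-")
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join("")
  return `${pascalCase}Service`
}

function toRegistrationName(fileName: string): string {
  const className = toClassName(fileName)
  return className.charAt(0).toLowerCase() + className.slice(1)
}
```

So `algolia.ts` maps to the class `AlgoliaService`, and `my-search.ts` maps to `MySearchService`, registered as `mySearchService`.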
For example, create the file `src/services/my-search.ts` with the following content:
```ts title=src/services/my-search.ts
import { AbstractSearchService } from "@medusajs/utils"
class MySearchService extends AbstractSearchService {
isDefault = false
createIndex(indexName: string, options: Record<string, any>) {
throw new Error("Method not implemented.")
}
getIndex(indexName: string) {
throw new Error("Method not implemented.")
}
addDocuments(
indexName: string,
documents: Record<string, any>[],
type: string
) {
throw new Error("Method not implemented.")
}
replaceDocuments(
indexName: string,
documents: Record<string, any>[],
type: string
) {
throw new Error("Method not implemented.")
}
deleteDocument(
indexName: string,
document_id: string | number
) {
throw new Error("Method not implemented.")
}
deleteAllDocuments(indexName: string) {
throw new Error("Method not implemented.")
}
search(
indexName: string,
query: string,
options: Record<string, any>
) {
return {
message: "test",
}
}
updateSettings(
indexName: string,
settings: Record<string, any>
) {
throw new Error("Method not implemented.")
}
}
export default MySearchService
```
This creates the service `MySearchService` which, at the moment, only provides placeholder implementations of the methods defined in the abstract class `AbstractSearchService`.
### Using a Constructor
You can use a constructor to access services and resources registered in the dependency container, to define any necessary clients if you're integrating a third-party search service, and to access plugin options if your search service is defined in a plugin.
For example:
<!-- eslint-disable prefer-rest-params -->
```ts title=src/services/my-search.ts
// ...
import { ProductService } from "@medusajs/medusa"
type InjectedDependencies = {
productService: ProductService
}
class MySearchService extends AbstractSearchService {
// ...
protected readonly productService_: ProductService
constructor({ productService }: InjectedDependencies) {
// @ts-expect-error prefer-rest-params
super(...arguments)
this.productService_ = productService
}
// ...
}
```
You can access the plugin options in the second parameter passed to the constructor:
<!-- eslint-disable prefer-rest-params -->
```ts title=src/services/my-search.ts
// ...
class MySearchService extends AbstractSearchService {
// ...
protected readonly pluginOptions: Record<string, any>
constructor({
productService,
}: InjectedDependencies, pluginOptions) {
// @ts-expect-error prefer-rest-params
super(...arguments)
// ...
this.pluginOptions = pluginOptions
}
// ...
}
```
### isDefault Property
The `isDefault` property is mainly used to pinpoint the default search service defined in the Medusa core. For custom search services, the `isDefault` property should be `false`.
---
## Step 2: Implement Required Methods
In this section, you'll learn about the required methods to implement in the search service.
:::note
The Medusa backend mainly uses the `addDocuments`, `deleteDocument`, and `search` methods in different scenarios that are explained for each of the methods. Other methods can be helpful based on the search engine you're integrating.
:::
### createIndex
This method is used to create an index in the search engine.
The method accepts two parameters:
1. `indexName`: the first parameter is a string indicating the name of the index to create.
2. `options`: the second parameter is typically an object used to pass any necessary options to the method. This parameter does not have any defined format.
The method does not require any specific data type to be returned.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
createIndex(indexName: string, options: Record<string, any>) {
return this.client_.initIndex(indexName)
}
}
```
### getIndex
This method is used to retrieve an index's results from the search engine.
The method accepts one parameter, which is a string indicating the name of the index. The method does not require any specific data type to be returned.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
getIndex(indexName: string) {
return this.client_.getIndex(indexName)
}
}
```
### addDocuments
This method is used to add a document to an index in the search engine.
This method is used when the Medusa backend loads, indexing all products available in the Medusa backend. It's also used whenever a new product is added or a product is updated.
The method accepts the following parameters:
- `indexName`: the first parameter is a string indicating the name of the index to add the document to.
- `documents`: the second parameter is typically an array of objects to index. For example, it can be an array of products to index.
- `type`: the third parameter is a string indicating the type of object being indexed. For example, when indexing products, the type would be `products`.
The method should return the response of saving the documents in the search engine, but there's no required format of the response.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async addDocuments(
indexName: string,
documents: Record<string, any>[],
type: string
) {
return await this.client_
.addDocuments(indexName, documents)
}
}
```
### replaceDocuments
This method is used to replace the existing documents in the search engine of an index with new documents.
The method accepts the following parameters:
- `indexName`: the first parameter is a string indicating the name of the index to replace the documents in.
- `documents`: the second parameter is typically an array of objects to index. For example, it can be an array of products to index. This would be the new documents to add to the index.
- `type`: the third parameter is a string indicating the type of object being indexed. For example, when indexing products, the type would be `products`.
The method should return the response of saving the documents in the search engine, but there's no required format of the response.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async replaceDocuments(
indexName: string,
documents: Record<string, any>[],
type: string
) {
await this.client_
.removeDocuments(indexName)
return await this.client_
.addDocuments(indexName, documents)
}
}
```
### deleteDocument
This method is used to delete a document from an index.
When a product is deleted in the Medusa backend, this method is used to delete the product from the search engine's index.
The method accepts the following parameters:
- `indexName`: the first parameter is a string indicating the name of the index this document belongs in.
- `document_id`: the second parameter is a string or a number indicating the ID of the document to delete. When a product is deleted, the product's ID is passed as the value of this parameter.
The method should return the response of deleting the document in the search engine, but there's no required format of the response.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async deleteDocument(
indexName: string,
document_id: string | number
) {
return await this.client_
.deleteDocument(indexName, document_id)
}
}
```
### deleteAllDocuments
This method is used to delete all documents from an index.
The method accepts one parameter, which is a string indicating the name of the index to delete its documents.
The method should return the response of deleting the documents of that index in the search engine, but there's no required format of the response.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async deleteAllDocuments(indexName: string) {
return await this.client_
.deleteDocuments(indexName)
}
}
```
### search
This method is used to search through an index by a query.
In the Medusa backend, this method is used within the [Search Products endpoint](https://docs.medusajs.com/api/store#products_postproductssearch) to retrieve the search results.
This method accepts the following parameters:
1. `indexName`: the first parameter is a string indicating the index to search through. When using the Search Products endpoint, the index is the default index defined in the `IndexName` static property of the `ProductService`, which is `products`.
2. `query`: the second parameter is a string indicating the query to use to search through the documents.
3. `options`: the third parameter is typically an object that can be used to pass any necessary options to the search engine.
Although there's no required data format or type the method must return, it's recommended to return an object with a `hits` property whose value is an array of results. Each result can be an object of any format. This is the convention followed within Medusa's official search plugins.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async search(
indexName: string,
query: string,
options: Record<string, any>
) {
const hits = await this.client_
.search(indexName, query)
return {
hits,
}
}
}
```
### updateSettings
This method is used to update the settings of an index within the search engine. This can be useful if you want to update the index settings when the plugin options change.
For example, in the Algolia plugin, a loader, which runs when the Medusa backend loads, is used to update the settings of indices based on the plugin options. The loader uses this method to update the settings.
The method accepts the following parameters:
1. `indexName`: the first parameter is a string indicating the index that should be updated.
2. `settings`: the second parameter is typically an object that holds the settings of the index. There's no defined format for this parameter.
The method should return the response of updating the index in the search engine, but there's no required format of the response.
An example implementation, assuming `client_` would interact with a third-party service:
```ts title=src/services/my-search.ts
class MySearchService extends AbstractSearchService {
// ...
async updateSettings(
indexName: string,
settings: Record<string, any>
) {
return await this.client_
.updateSettings(indexName, settings)
}
}
```
---
## Step 3: Run Build Command
In the directory of the Medusa backend, run the build command to transpile the files in the `src` directory into the `dist` directory:
```bash npm2yarn
npm run build
```
---
## Test it Out
:::note
This section explains how to test out your implementation if the search service was created in the Medusa backend codebase. You can refer to [the plugin documentation](../plugins/create.mdx#test-your-plugin) on how to test a plugin.
:::
Run your backend to test it out:
```bash npm2yarn
npx medusa develop
```
You can then send a request to the [Search Products endpoint](https://docs.medusajs.com/api/store#products_postproductssearch) to see if your search service returns any results.
---
## See Also
- [How to create a plugin](../plugins/create.mdx)
- [How to publish a plugin](../plugins/publish.mdx)


@@ -0,0 +1,44 @@
---
description: "Learn what a search service is and how it's used in Medusa. A search service is used to manage search indices of searchable items, such as products, and provide results for search operations."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Search Service
In this document, you'll learn what a search service is and how it's used in Medusa.
## Overview
A search service is used to manage search indices of searchable items, such as products, and provide results for search operations. Although the Medusa core provides basic search functionalities through its endpoints, a search service allows you to integrate third-party services for an optimized search experience and rich search functionalities.
A search service is a service class that is defined in a TypeScript or JavaScript file, which is created in the `src/services` directory of your Medusa backend codebase or plugin. The class must extend the `AbstractSearchService` class imported from the `@medusajs/utils` package.
Using the [dependency container and injection](../fundamentals/dependency-injection.md), the Medusa backend will then use and resolve the search service within the backend's search operations, such as when the [Search Products](https://docs.medusajs.com/api/store#products_postproductssearch) endpoint is used. You can also [resolve the service](../services/create-service.mdx#use-a-service) within your resources to trigger the search where necessary.
Medusa provides official plugins that you can install and use in your Medusa backend. Check out available search plugins [here](../../plugins/search/index.mdx).
### Hierarchy of Search Services
Medusa provides a default search service that doesn't actually perform any indexing or searching, but acts as a placeholder search service. Only one search service is registered in the dependency container under the `searchService` name.
If you install a search plugin, the search service within it will be registered in the dependency container and used throughout the Medusa backend.
If you create your own search service in the Medusa backend codebase, it will be registered in the dependency container and used throughout the Medusa backend.
---
## Custom Development
Developers can create a custom search service with the desired functionality or third-party integration either directly within the Medusa Core, in a plugin, or in a module.
<DocCard item={{
type: 'link',
href: '/development/search/create',
label: 'Create a Search service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a search service.'
}
}} />


@@ -0,0 +1,677 @@
---
description: 'Learn how to create a service in Medusa. This guide also includes how to use services in other services, subscribers, and endpoints.'
addHowToData: true
---
import Troubleshooting from '@site/src/components/Troubleshooting'
import ServiceLifetimeSection from '../../troubleshooting/awilix-resolution-error/_service-lifetime.md'
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# How to Create a Service
In this document, you'll learn how you can create a [Service](./overview.mdx) and use it across your Medusa backend just like any of the core services.
## Basic Service Implementation
To create a service, create a TypeScript or JavaScript file in `src/services` to hold the service. The name of the file should be the name of the service without `Service`. This is essential as the file name is used when registering the service in the [dependency container](../fundamentals/dependency-injection.md), and `Service` is appended to the camel-case version of the file name automatically.
For example, if you want to create a service `PostService`, eventually registered as `postService`, create the file `post.ts` in `src/services` with the following content:
```ts title=/src/services/post.ts
import { TransactionBaseService } from "@medusajs/medusa"
class PostService extends TransactionBaseService {
getMessage() {
return `Welcome to My Store!`
}
}
export default PostService
```
This service will be registered in the [dependency container](../fundamentals/dependency-injection.md) as `postService`. It contains a single sample method `getMessage`.
---
## Build Files
Custom services must be transpiled and moved to the `dist` directory before you can start consuming them. When you run your backend using either the `medusa develop` or `npx medusa develop` commands, it watches the files under `src` for any changes, then triggers the `build` command and restarts the server.
However, the build isn't triggered when the backend first starts running, and it's never triggered when the `medusa start` or `npx medusa start` commands are used.
So, make sure to run the `build` command before starting the backend:
```bash npm2yarn
npm run build
```
---
## Service Constructor
As the service extends the `TransactionBaseService` class, all resources registered in the dependency container can be accessed through [dependency injection](../fundamentals/dependency-injection.md). This includes services, repositories, and other resources in the Medusa core, as well as your custom services and resources.
So, if you want your service to use another service, add it as part of your constructor's dependencies and assign it to a field inside your service's class:
```ts title=/src/services/post.ts
import { ProductService } from "@medusajs/medusa"
import { PostRepository } from "../repositories/post"
class PostService extends TransactionBaseService {
private productService: ProductService
constructor(container) {
super(container)
this.productService = container.productService
}
// ...
}
```
Then, you can use that service anywhere in your custom service. For example:
```ts title=/src/services/post.ts
class PostService extends TransactionBaseService {
// ...
async getProductCount() {
return await this.productService.count()
}
}
```
---
## Use Repositories
As your service provides helper methods related to one or more entities in your backend, you'll need to perform operations on that entity. To do that, you need to use the entity's repository.
Repositories, just like services, are registered in the dependency container and can be accessed with [dependency injection](../fundamentals/dependency-injection.md).
However, to actually get an instance of the repository within the service's methods, you need to use the service's `activeManager_`, which is declared in the parent class `TransactionBaseService`. `activeManager_` is an instance of Typeorm's `EntityManager` and has a method `withRepository` which allows you to retrieve an instance of the repository.
For example:
```ts title=/src/services/post.ts
import { PostRepository } from "../repositories/post"
class PostService extends TransactionBaseService {
// ...
protected postRepository_: typeof PostRepository
constructor(container) {
super(container)
// ...
this.postRepository_ = container.postRepository
}
async list(): Promise<Post[]> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
return await postRepo.find()
}
// ...
}
```
Refer to the [repositories](../entities/repositories.md) documentation to learn about its different methods.
---
## Transactions
Transactions ensure that when an error occurs, all data manipulation within the transaction is reverted. As services are likely to include methods that manipulate data, such as creating or updating a post, it's very useful to wrap that logic within a transaction block.
Since services extend the `TransactionBaseService` class, you can use its `atomicPhase_` method. The `atomicPhase_` method allows you to wrap code within a transactional block.
It accepts as a parameter a function, which includes the logic to be performed inside the transactional block. The function accepts as a parameter a transaction manager, which is Typeorm's `EntityManager`. You can use it within the function to retrieve repositories, among other functionalities.
The data returned by the function passed as a parameter to the `atomicPhase_` method is also returned by the `atomicPhase_` method.
For example, the `PostService`'s `create` method with the `atomicPhase_` method:
```ts title=/src/services/post.ts
class PostService extends TransactionBaseService {
protected postRepository_: typeof PostRepository
// ...
async create(
data: Pick<Post, "title" | "author_id">
): Promise<Post> {
return this.atomicPhase_(async (manager) => {
const postRepo = manager.withRepository(
this.postRepository_
)
const post = postRepo.create()
post.title = data.title
post.author_id = data.author_id
const result = await postRepo.save(post)
return result
})
}
}
```
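To build intuition for the guarantee described above, here's a simplified, self-contained sketch of transactional semantics (hypothetical, not Typeorm's actual mechanics): changes made inside the wrapped function become visible only if it completes without throwing.

```ts
// Simplified illustration of transactional semantics — not Medusa's implementation.
type State = { posts: string[] }

async function atomicPhase<T>(
  state: State,
  work: (draft: State) => Promise<T>
): Promise<T> {
  // operate on a copy, so a thrown error leaves the original state untouched
  const draft: State = { posts: [...state.posts] }
  const result = await work(draft)
  // commit only after the work succeeds
  state.posts = draft.posts
  return result
}
```

If `work` throws, the commit line is never reached and `state` keeps its previous value, mirroring how `atomicPhase_` reverts all data manipulation on error.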
---
## Service Life Time
As the dependency container in Medusa is built on top of [awilix](https://github.com/jeffijoe/awilix), you can specify the [Lifetime](https://github.com/jeffijoe/awilix#lifetime-management) of a service. The lifetime is added as a static property to the service.
There are three lifetime types:
1. `Lifetime.TRANSIENT`: when used, a new instance of the service is created every time it is resolved in other resources from the dependency container.
2. `Lifetime.SCOPED`: (default for custom services) when used, an instance of the service is created and reused in the scope of the dependency container. So, when the service is resolved in other resources that share that dependency container, the same instance of the service will be returned.
3. `Lifetime.SINGLETON`: (default for core services) when used, the service is always reused, regardless of the scope. An instance of the service is cached in the root container.
You can set the lifetime of your service by setting the `LIFE_TIME` static property:
```ts title=/src/services/post.ts
import { TransactionBaseService } from "@medusajs/medusa"
import { Lifetime } from "awilix"
class PostService extends TransactionBaseService {
static LIFE_TIME = Lifetime.SCOPED
// ...
}
```
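The practical difference between these lifetimes can be seen with a tiny, hypothetical container (a sketch, not awilix itself): a transient registration yields a fresh instance on every resolution, while a singleton is created once and cached.

```ts
// Minimal sketch of lifetime semantics — not awilix.
type ServiceLifetime = "TRANSIENT" | "SINGLETON"
type Registration = { factory: () => unknown; lifetime: ServiceLifetime }

class MiniContainer {
  private cache = new Map<string, unknown>()

  constructor(private registrations: Record<string, Registration>) {}

  resolve<T>(name: string): T {
    const registration = this.registrations[name]
    if (!registration) {
      throw new Error(`Unknown registration: ${name}`)
    }
    if (registration.lifetime === "SINGLETON") {
      // singletons are created once and reused on every resolution
      if (!this.cache.has(name)) {
        this.cache.set(name, registration.factory())
      }
      return this.cache.get(name) as T
    }
    // transients get a new instance every time
    return registration.factory() as T
  }
}
```

In this sketch, `SCOPED` would behave like `SINGLETON` within one container scope: the instance is cached per scope rather than globally.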
---
## Retrieve Medusa Configurations
Within your service, you may need to access the Medusa configuration exported from `medusa-config.js`. To do that, you can access `configModule` using dependency injection.
For example:
```ts title=/src/services/post.ts
import {
ConfigModule,
TransactionBaseService,
} from "@medusajs/medusa"
class PostService extends TransactionBaseService {
protected readonly configModule_: ConfigModule
constructor(container) {
super(container)
// ...
this.configModule_ = container.configModule
}
getConfigurations() {
return this.configModule_
}
// ...
}
export default PostService
```
---
## Pagination, Filtering, and Relations
Often, your service will provide methods that retrieve a list of items. In these methods, it can be helpful to support filtering and pagination parameters that endpoints or any other resources utilizing this service can use.
The `@medusajs/medusa` package provides the following generic types that you can use to create the signature of your method that accepts filtering and pagination parameters:
1. `Selector`: Can be used to accept the attributes of an entity as possible filtering parameters, based on each attribute's type.
2. `FindConfig`: Can be used to provide pagination parameters such as `skip`, `take`, and `relations`. `skip` indicates how many items to skip before retrieving the results, `take` indicates how many results to return, and `relations` indicates which relations to expand and include in the returned objects.
The `@medusajs/medusa` package also provides a `buildQuery` method that allows you to pass two parameters, the first of type `Selector` and the second of type `FindConfig`, to build the object that should be passed to the repository.
So, for example, to create a method that retrieves a list of posts and the total number of posts available:
```ts title=src/services/post.ts
import {
FindConfig,
Selector,
TransactionBaseService,
buildQuery,
} from "@medusajs/medusa"
class PostService extends TransactionBaseService {
// ...
async listAndCount(
selector?: Selector<Post>,
config: FindConfig<Post> = {
skip: 0,
take: 20,
relations: [],
}): Promise<[Post[], number]> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
const query = buildQuery(selector, config)
return postRepo.findAndCount(query)
}
}
```
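Any resource that uses this method can then pass filters and pagination options. For example — the `author_id` value below is a hypothetical ID used for illustration:

```ts
const [posts, count] = await postService.listAndCount(
  {
    author_id: "author_123",
  },
  {
    skip: 0,
    take: 10,
  }
)
```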
In addition, you can expand relations when retrieving a single item with the help of `FindConfig` and `buildQuery`.
For example, to create a method that retrieves a single post:
```ts title=src/services/post.ts
import {
FindConfig,
TransactionBaseService,
buildQuery,
} from "@medusajs/medusa"
import { MedusaError } from "@medusajs/utils"
class PostService extends TransactionBaseService {
// ...
async retrieve(
id: string,
config?: FindConfig<Post>
): Promise<Post> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
const query = buildQuery({
id,
}, config)
const post = await postRepo.findOne(query)
if (!post) {
throw new MedusaError(
MedusaError.Types.NOT_FOUND,
"Post was not found"
)
}
return post
}
}
```
Then, any other resources such as endpoints or services that use this method can specify which relations to expand in the second parameter:
```ts
await postService.retrieve(id, {
relations: ["authors"],
})
```
---
## Throwing Errors
When you need to throw errors in your service methods, it's recommended to use `MedusaError` imported from `@medusajs/utils`. That way, when an error is thrown within a request, the error is returned in the response in the same consistent format as Medusa's errors.
:::note
This assumes you're handling errors in your custom endpoints as explained [here](../endpoints/create.mdx#handle-errors).
:::
For example:
```ts title=src/services/post.ts
import { MedusaError } from "@medusajs/utils"
class PostService extends TransactionBaseService {
// ...
async retrieve(
id: string,
config?: FindConfig<Post>
): Promise<Post> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
const query = buildQuery({
id,
}, config)
const post = await postRepo.findOne(query)
if (!post) {
throw new MedusaError(
MedusaError.Types.NOT_FOUND,
"Post was not found"
)
}
return post
}
}
```
---
## Use a Service
In this section, you'll learn how to use services throughout your Medusa backend. This includes both Medusa's services and your custom services.
:::note
Before using your service, make sure you run the [build command](#build-files).
:::
### In a Service
To use your custom service in another custom service, you can access it in the dependencies injected into the constructor of your service:
```ts
class MyService extends TransactionBaseService {
constructor(container) {
super(container)
this.postService = container.postService
}
// ...
}
```
### In an Endpoint
To use your custom service in an endpoint, you can use `req.scope.resolve`, passing it the service's registration name:
```ts
const postService = req.scope.resolve("postService")
res.json({
  posts: await postService.list(),
})
```
### In a Subscriber
To use your custom service in a subscriber, you can access it in the dependencies injected into the constructor of your subscriber:
```ts
class MySubscriber {
constructor({ postService, eventBusService }) {
this.postService = postService
}
// ...
}
```
---
## Troubleshooting
<Troubleshooting
sections={[
{
title: 'AwilixResolutionError: Could Not Resolve X',
content: <ServiceLifetimeSection />
}
]}
/>
---
## Example: Services with CRUD Operations
In this section, you'll find a full example of the `PostService` and `AuthorService` that implement Create, Read, Update, and Delete (CRUD) operations.
You can refer to the [Entities](../entities/create.mdx#adding-relations) documentation to learn how to create the custom entities used in this example.
<Tabs groupId="files" isCodeTabs={true}>
<TabItem value="post" label="src/services/post.ts" default>
```ts
import {
FindConfig,
Selector,
TransactionBaseService,
buildQuery,
} from "@medusajs/medusa"
import { PostRepository } from "../repositories/post"
import { Post } from "../models/post"
import { MedusaError } from "@medusajs/utils"
class PostService extends TransactionBaseService {
protected postRepository_: typeof PostRepository
constructor(container) {
super(container)
this.postRepository_ = container.postRepository
}
async listAndCount(
selector?: Selector<Post>,
config: FindConfig<Post> = {
skip: 0,
take: 20,
relations: [],
}): Promise<[Post[], number]> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
const query = buildQuery(selector, config)
return postRepo.findAndCount(query)
}
async list(
selector?: Selector<Post>,
config: FindConfig<Post> = {
skip: 0,
take: 20,
relations: [],
}): Promise<Post[]> {
const [posts] = await this.listAndCount(selector, config)
return posts
}
async retrieve(
id: string,
config?: FindConfig<Post>
): Promise<Post> {
const postRepo = this.activeManager_.withRepository(
this.postRepository_
)
const query = buildQuery({
id,
}, config)
const post = await postRepo.findOne(query)
if (!post) {
throw new MedusaError(
MedusaError.Types.NOT_FOUND,
"Post was not found"
)
}
return post
}
async create(
data: Pick<Post, "title" | "author_id">
): Promise<Post> {
return this.atomicPhase_(async (manager) => {
const postRepo = manager.withRepository(
this.postRepository_
)
const post = postRepo.create()
post.title = data.title
post.author_id = data.author_id
const result = await postRepo.save(post)
return result
})
}
async update(
id: string,
data: Omit<Partial<Post>, "id">
): Promise<Post> {
return await this.atomicPhase_(async (manager) => {
const postRepo = manager.withRepository(
this.postRepository_
)
const post = await this.retrieve(id)
Object.assign(post, data)
return await postRepo.save(post)
})
}
async delete(id: string): Promise<void> {
return await this.atomicPhase_(async (manager) => {
const postRepo = manager.withRepository(
this.postRepository_
)
const post = await this.retrieve(id)
await postRepo.remove([post])
})
}
}
export default PostService
```
</TabItem>
<TabItem value="author" label="src/services/author.ts">
```ts
import {
FindConfig,
Selector,
TransactionBaseService,
buildQuery,
} from "@medusajs/medusa"
import { EntityManager } from "typeorm"
import AuthorRepository from "../repositories/author"
import { Author } from "../models/author"
import { MedusaError } from "@medusajs/utils"
class AuthorService extends TransactionBaseService {
protected manager_: EntityManager
protected transactionManager_: EntityManager
protected authorRepository_: typeof AuthorRepository
constructor(container) {
super(container)
this.authorRepository_ = container.authorRepository
}
async listAndCount(
selector?: Selector<Author>,
config: FindConfig<Author> = {
skip: 0,
take: 20,
relations: [],
}): Promise<[Author[], number]> {
const authorRepo = this.activeManager_.withRepository(
this.authorRepository_
)
const query = buildQuery(selector, config)
return authorRepo.findAndCount(query)
}
async list(
selector?: Selector<Author>,
config: FindConfig<Author> = {
skip: 0,
take: 20,
relations: [],
}): Promise<Author[]> {
const [authors] = await this.listAndCount(selector, config)
return authors
}
async retrieve(
id: string,
config?: FindConfig<Author>
): Promise<Author> {
const authorRepo = this.activeManager_.withRepository(
this.authorRepository_
)
const query = buildQuery({
id,
}, config)
const author = await authorRepo.findOne(query)
if (!author) {
throw new MedusaError(
MedusaError.Types.NOT_FOUND,
"Author was not found"
)
}
return author
}
async create(
data: Pick<Author, "name" | "image">
): Promise<Author> {
return this.atomicPhase_(async (manager) => {
const authorRepo = manager.withRepository(
this.authorRepository_
)
const author = authorRepo.create(data)
const result = await authorRepo.save(author)
return result
})
}
async update(
id: string,
data: Omit<Partial<Author>, "id">
): Promise<Author> {
return await this.atomicPhase_(async (manager) => {
const authorRepo = manager.withRepository(
this.authorRepository_
)
const author = await this.retrieve(id)
Object.assign(author, data)
return await authorRepo.save(author)
})
}
async delete(id: string): Promise<void> {
return await this.atomicPhase_(async (manager) => {
const authorRepo = manager.withRepository(
this.authorRepository_
)
const author = await this.retrieve(id)
await authorRepo.remove([author])
})
}
}
export default AuthorService
```
</TabItem>
</Tabs>
---
## See Also
- [Create a Plugin](../plugins/create.mdx)

---
description: 'Learn how to create a service in Medusa. This guide also includes how to use services in other services, subscribers, and endpoints.'
addHowToData: true
---
import Troubleshooting from '@site/src/components/Troubleshooting'
import ServiceLifetimeSection from '../../troubleshooting/awilix-resolution-error/_service-lifetime.md'
# How to Extend a Service
In this document, you'll learn how to extend a core service in Medusa.
## Overview
Medusa's core services cover a wide range of functionalities related to each domain or entity. You can extend these services to add custom methods or override existing methods.
### Word of Caution about Overriding
Extending services to add new methods shouldn't cause any issues within your commerce application. However, if you extend them to override their existing methods, you should be aware that this could have negative implications, such as unanticipated bugs, especially when you try to upgrade the core Medusa package to a newer version.
---
## Step 1: Create the Service File
In your Medusa backend, create the file `src/services/product.ts`. This file will hold your extended service.
Note that the name of the file must be the same as the name of the original service in the core package. So, if you're extending the `ProductService`, the file's name should be `product.ts`. On the other hand, if you're extending the `CustomerService`, the file's name should be `customer.ts`.
---
## Step 2: Implementing the Service
In the file, you can import the original service from the Medusa core, then create your service that extends the core service.
For example, to extend the Product service:
```ts title=src/services/product.ts
import {
ProductService as MedusaProductService,
} from "@medusajs/medusa"
class ProductService extends MedusaProductService {
// TODO add customizations
}
export default ProductService
```
Notice that you alias the `ProductService` of the core to avoid naming conflicts.
Within the service, you can add new methods or extend existing ones.
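For example, here's a rough sketch of adding a new method — `countByTitle` and its filter are hypothetical, not part of the core API, though `listAndCount` is inherited from the core `ProductService`:

```ts title=src/services/product.ts
import {
  ProductService as MedusaProductService,
} from "@medusajs/medusa"

class ProductService extends MedusaProductService {
  // Hypothetical custom method: counts products matching a title,
  // reusing the inherited `listAndCount` method.
  async countByTitle(title: string): Promise<number> {
    const [, count] = await this.listAndCount({ title })

    return count
  }
}

export default ProductService
```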
You can also change the lifetime of the service:
```ts title=src/services/product.ts
import { Lifetime } from "awilix"
import {
ProductService as MedusaProductService,
} from "@medusajs/medusa"
class ProductService extends MedusaProductService {
// The default life time for a core service is SINGLETON
static LIFE_TIME = Lifetime.SCOPED
// ...
}
export default ProductService
```
You can learn more details about the service lifetime and other considerations when creating a service in the [Create Service documentation](./create-service.mdx).
---
## Step 3: Test it Out
To test out your customization, start by transpiling your files by running the following command in the root directory of the Medusa backend:
```bash npm2yarn
npm run build
```
Then, start the backend:
```bash npm2yarn
npx medusa develop
```
You should see the customizations you made in effect.
---
## Troubleshooting
<Troubleshooting
sections={[
{
title: 'AwilixResolutionError: Could Not Resolve X (Service Lifetime)',
content: <ServiceLifetimeSection />
}
]}
/>

---
description: 'Learn what Services are in Medusa. Services represent bundled helper methods that you want to use across your commerce application.'
---
import DocCardList from '@theme/DocCardList';
import Icons from '@theme/Icon';
# Services
In this document, you'll learn about what Services are in Medusa.
## What are Services
Services in Medusa represent bundled helper methods that you want to use across your commerce application. By convention, they represent a certain entity or functionality in Medusa.
For example, you can use Medusa's `productService` to get the list of products, as well as perform other functionalities related to products. There's also an `authService` that provides functionalities like authenticating customers and users.
In the Medusa backend, custom services are TypeScript or JavaScript files located in the `src/services` directory. Each service should be a class that extends the `TransactionBaseService` class from the core Medusa package `@medusajs/medusa`. Each file you create in `src/services` should hold one service and export it.
The file name is important as it determines the name of the service when you need to use it elsewhere. The name of the service will be registered in the dependency container as the camel-case version of the file name with `Service` appended to the end of the name. Other resources, such as other services or endpoints, will use that name when resolving the service from the dependency container.
For example, if the file name is `hello.ts`, the service will be registered as `helloService` in the dependency container. If the file name is `hello-world.ts`, the service will be registered as `helloWorldService`.
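This naming convention can be illustrated with a small, self-contained sketch — the `registrationName` helper below is hypothetical and only mirrors the convention; it's not part of Medusa's API:

```ts
// Hypothetical helper mirroring how a registration name is derived
// from a file name: strip the extension, camel-case the kebab-case
// base name, then append the "Service" suffix.
function registrationName(fileName: string): string {
  const base = fileName.replace(/\.(ts|js)$/, "")
  const camel = base.replace(/-([a-z])/g, (_, char: string) =>
    char.toUpperCase()
  )

  return `${camel}Service`
}

console.log(registrationName("hello.ts")) // "helloService"
console.log(registrationName("hello-world.ts")) // "helloWorldService"
```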
:::note
You can learn more about the dependency container and how it works in the [dependency injection](../fundamentals/dependency-injection.md) documentation.
:::
The service must then be transpiled using the `build` command, which outputs the transpiled files to the `dist` directory, to be used across your commerce application.
:::tip
If you're creating a service in a plugin, learn more about the required structure [here](../plugins/create.mdx#plugin-structure).
:::
---
## Custom Development
Developers can create custom services in the Medusa backend, a plugin, or in a module.
<DocCardList colSize={6} items={[
{
type: 'link',
href: '/development/services/create-service',
label: 'Create a Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to create a service in Medusa.'
}
},
{
type: 'link',
href: '/development/services/extend-service',
label: 'Extend a Service',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to extend a core Medusa service.'
}
},
]} />

---
description: "Learn how to override a strategy in a Medusa backend or plugin."
addHowToData: true
---
# How to Override a Strategy
In this document, you'll learn how to override a strategy in a Medusa backend or plugin.
## Overview
The Medusa core defines and uses strategies for certain functionalities, which allows developers to override these functionalities and implement them as fits for their use case.
For example, the cart completion process is implemented within a `CartCompletionStrategy` that is used inside the Complete Cart endpoint. If you need to change the cart completion process, you can override the `CartCompletionStrategy` and implement your own strategy. The Medusa backend will then use your strategy instead of the one defined in the core.
### Hierarchy of Strategy Resolution
When the Medusa backend starts, the strategies defined in the core are registered in the dependency container first.
If a strategy is overridden in a plugin, it will be registered in the dependency container in place of the original strategy.
If a strategy is overridden in the Medusa backend codebase, it will be registered in the dependency container in place of the original strategy.
So, if a strategy is overridden within a plugin and in the Medusa backend codebase, the one in the codebase is registered and used instead of the one in the plugin.
:::tip
You can learn more about the dependency container and injection [here](../fundamentals/dependency-injection.md).
:::
### Existing Resources
This guide explains how to override a strategy in general. It doesn't explain how to override specific strategies.
There are other resources that provide steps specific to a strategy type:
- [How to override the Cart Completion Strategy](../../modules/carts-and-checkout/backend/cart-completion-strategy.md)
- [How to override the Tax Calculation Strategy](../../modules/taxes/backend/tax-calculation-strategy.md)
- [How to override the Price Selection Strategy](../../modules/price-lists/backend/override-price-selection-strategy.md)
- [How to override a Batch Job Strategy](../batch-jobs/customize-import.md)
---
## Step 1: Create Strategy Class
A strategy class is created in a TypeScript or JavaScript file in the `src/strategies` directory of your Medusa backend or plugin.
The class must extend or implement a strategy interface or class from within the core, depending on what strategy you're overriding. For example, if you're overriding the `CartCompletionStrategy`, you must extend the `AbstractCartCompletionStrategy`.
---
## Step 2: Implement Functionalities
Each strategy class is required to implement certain methods defined by the interface it implements or the abstract class it extends. Depending on what the strategy is used for, you must implement the functionalities in each of the required methods.
For example, if you're overriding the `CartCompletionStrategy`, you must implement the `complete` method using the signature defined in `AbstractCartCompletionStrategy`.
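As a rough sketch, a custom cart completion strategy could start like this — the types follow the general shape of the abstract class, but verify the exact signature of `complete` against `AbstractCartCompletionStrategy` in your installed version of `@medusajs/medusa`:

```ts title=src/strategies/cart-completion.ts
import {
  AbstractCartCompletionStrategy,
  CartCompletionResponse,
  IdempotencyKey,
  RequestContext,
} from "@medusajs/medusa"

class CartCompletionStrategy extends AbstractCartCompletionStrategy {
  async complete(
    cartId: string,
    idempotencyKey: IdempotencyKey,
    context: RequestContext
  ): Promise<CartCompletionResponse> {
    // TODO: implement your custom cart completion logic here.
    // The placeholder response below is for illustration only.
    return {
      response_code: 200,
      response_body: { message: "cart completed" },
    }
  }
}

export default CartCompletionStrategy
```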
---
## Step 3: Run Build Command
In the directory of the Medusa backend, run the `build` command to transpile the files in the `src` directory into the `dist` directory:
```bash npm2yarn
npm run build
```
---
## Test it Out
Run your backend to test it out:
```bash npm2yarn
npx medusa develop
```
You can now test whether your strategy is working by performing the actions that run your strategy.
For example, if you're overriding the `CartCompletionStrategy`, you can simulate the cart completion flow and see if your strategy is being used.

---
description: "Learn what a Strategy is in Medusa. A strategy is an isolated piece of business logic that can be overridden and customized."
---
import DocCard from '@theme/DocCard';
import Icons from '@theme/Icon';
# Strategy
In this document, you'll learn what a Strategy is in Medusa.
## Introduction
A strategy is an isolated piece of business logic that can be overridden and customized. It's a TypeScript or JavaScript class that doesn't have to follow a specific definition, but performs only a single functionality.
For example, in the core `@medusajs/medusa` package, strategies are used to implement functionalities like cart completion and product import.
These strategy classes are then resolved in endpoints, services, or wherever needed using dependency injection and used to perform their designated functionality.
For example, the `CartCompletionStrategy` is resolved in the Complete Cart endpoint that is defined in the core `@medusajs/medusa` package. It's then used to complete the cart and place the order:
```ts
export default async (req, res) => {
// ...
const completionStrat: AbstractCartCompletionStrategy =
req.scope.resolve(
"cartCompletionStrategy"
)
const {
response_code,
response_body,
} = await completionStrat.complete(
id,
idempotencyKey,
req.request_context
)
res.status(response_code).json(response_body)
}
```
When a strategy is overridden, dependency injection then resolves the strategy (in the code example above, `CartCompletionStrategy`) to the custom strategy that the developer created. Then, the method (in the code example above, `complete`) of the custom strategy will be executed instead of the one defined in the core package.
## Custom Development
Developers can override strategies defined in the core to customize their functionality within their Medusa backend codebase or within a plugin.
<DocCard item={{
type: 'link',
href: '/development/strategies/override-strategy',
label: 'Backend: Override a Strategy',
customProps: {
icon: Icons['academic-cap-solid'],
description: 'Learn how to override a strategy.',
}
}} />