The PR for the Products section is growing quite large, so I would like to merge this PR, which contains a lot of the groundwork, before moving on to finalizing the rest of the domain.
**Note**
Since the PR contains changes to the core that the dashboard depends on, the staging env will not work. To preview this PR, you will need to run it locally.
## `@medusajs/medusa`
**What**
- Adds missing query params to `GET /admin/products/:id/variants`
- `options.values` has been added to the default relations of admin product endpoints.
## `medusa-react`
**What**
- Adds missing hook for `GET /admin/products/:id/variants`
## `@medusajs/dashboard`
- Adds base implementation for `DataGrid` component (formerly `BulkEditor`) (WIP)
- Adds `/products` overview page
- Adds partial `/products/create` page for creating new products (WIP - need to go over design w/ Ludvig before continuing)
- Adds `/products/:id` details page
- Adds `/products/:id/gallery` page for inspecting a product's images in fullscreen.
- Adds `/products/:id/edit` page for editing the general information of a product
- Adds `/products/:id/attributes` page for editing the attributes information of a product
- Adds `/products/:id/sales-channels` page for editing which sales channels a product is available in
- Fixes a bug in `DataTable` where a table with two fixed columns would not display correctly
For the review, it's not important to test the `DataGrid`, as it is still WIP, and I need to go through some minor changes to the behaviour with Ludvig, since virtualizing it adds some constraints.
## `@medusajs/icons`
**What**
- Pulls latest icons from Figma
## TODO in next PR
- [ ] Fix the typing of `POST /admin/products/:id`, as it is currently not possible to delete any of the nullable fields once they have been added. Be aware of this when reviewing this PR.
- [ ] Wrap up `/products/create` page
- [ ] Add `/products/:id/media` page for managing media associated with the product.
- [ ] Add `/products/:id/options` for managing product options (need Ludvig to rethink this, as the current API is very limited and we cannot implement the current design as is).
- [ ] Add `/products/:id/variants/:id` page for editing a variant. (Possibly concat all of these into one BulkEditor page?)
**What**
- Adds Tax rules to allow overrides of tax rates for certain products, product types or shipping options.
**Punted to future PR**
- Currently, the creation methods only support bulk operations. A later PR will include support for singular operations, too.
- It should be possible to add products, types, and shipping options to a tax rate after creating it. Add, remove, update will come in a later PR.
**What**
Add missing support for updating countries on the `POST /admin/regions/:id` route.
Furthermore, the PR refactors how normalization and validation are handled in both create and update scenarios.
Finally, the PR adds a unique index on the country entity, which ensures at the DB level that the same country cannot appear more than once for a single region.
**What**
Default ordering. By default, ordering is applied only to the top-level entity using its primary keys; relations are ordered by their foreign keys by default.
The PR also includes test fixes for deterministic data ordering.
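As a rough illustration of that default, here is a sketch of the option shapes involved. The type and field names below are assumptions for illustration, not the actual implementation:
```ts
// Hypothetical option shapes illustrating the default ordering described above:
// the top-level entity is ordered by its primary key, while relations fall
// back to being ordered by their foreign keys.
type Direction = "ASC" | "DESC"

type FindOptions = {
  order?: Record<string, Direction>
  relationOrder?: Record<string, Record<string, Direction>>
}

const defaultFindOptions: FindOptions = {
  order: { id: "ASC" }, // top level: primary key
  relationOrder: { items: { order_id: "ASC" } }, // relation: foreign key
}
```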
**What**
Add `POST /admin/regions`
Add `POST /admin/regions/:id`
Add `DELETE /admin/regions/:id`
All are added for v2 using API Routes and workflows.
In follow-up PRs, I will add support for passing countries in update and create.
Update: First follow-up PR is #6372
**What**
> [!NOTE]
> I can see this PR becoming huge, so I'd like to get this partial one merged 👍
- Fixes shared connection usage (MikroORM compares the instance against its own package, so the provided connection was not truly being reused; this exhausted the connection pool, as multiple connections were created under the hood and not all of them were destroyed properly. Discovered in my integration tests.)
- Implementation of the create shipping options method
- DTO definitions and service interface update
- Integration tests
- Rework of the indexes using the updated index util
- Test runner utils to remove a big chunk of the boilerplate from the packages' integration tests
FIXES CORE-1742
**What**
- add accept invite endpoint
**Invite accept flow**
- Authenticate using the auth endpoint to create an auth user
- Invoke the accept endpoint with the invite token and user info to create the user
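A rough client-side sketch of the flow above; the endpoint paths, payload shapes, and header names are assumptions for illustration, not the final API:
```ts
// Hypothetical sketch of the invite accept flow. Paths and payloads are
// assumptions, not the actual endpoints.
const baseUrl = "http://localhost:9000"

// 1. Authenticate via the auth endpoint to create an auth user and get a token.
const authRes = await fetch(`${baseUrl}/auth/admin/emailpass`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ email: "jane@acme.com", password: "supersecret" }),
})
const { token } = await authRes.json()

// 2. Invoke the accept endpoint with the invite token and user info.
await fetch(`${baseUrl}/admin/invites/accept`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${token}`,
  },
  body: JSON.stringify({
    invite_token: "<invite-token>",
    first_name: "Jane",
    last_name: "Doe",
  }),
})
```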
**What**
- Adds a TaxRegion entity.
**For context: High-level design of the Tax module**
- A TaxRegion scopes tax rates to a geographical place.
- You can define tax regions on two levels: country level and province level (the latter corresponds to a state in US contexts)
- Each Tax Region can have a default Tax Rate.
- [not yet done] - Each Tax Region can also have granularly defined tax rates for different products and shipping rates. For example, California can have a base rate for default products, but a reduced rate for groceries.
- Tax Rates specify whether they can be combined with other rates; it's always the lowest-level rate that wins.
The above allows a merchant to define their tax settings along the lines of this:
- Denmark (Region)
  - Default rate: 25% (TaxRate)
- Germany (Region)
  - Default rate: 19% (TaxRate)
  - Reduced rate (books): 9% (TaxRate w. rule)
- United States (Region)
  - Default rate: 0% (TaxRate)
  - California (Region)
    - Default rate: 7.25% (TaxRate)
  - Arkansas (Region)
    - Default rate: 6.5%
    - Reduced rate (groceries): 0.125% (TaxRate w. rule)
The TaxModule can then receive a list of products and the shipping address to determine what tax rates apply to the line items.
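As a sketch of that interaction (the method name and shapes below are illustrative assumptions, not the module's actual interface):
```ts
// Illustrative shapes only: the module receives line items and a shipping
// address and resolves the applicable rate per item.
type TaxableItem = { id: string; product_id: string; unit_price: number }
type TaxContext = { address: { country_code: string; province_code?: string } }
type ItemTaxLine = { item_id: string; rate: number; name: string }

declare const taxModuleService: {
  getTaxLines(items: TaxableItem[], context: TaxContext): Promise<ItemTaxLine[]>
}

// Using the example setup above: Germany has a 19% default rate and a
// 9% reduced rate for books.
const taxLines = await taxModuleService.getTaxLines(
  [
    { id: "item_book", product_id: "prod_book", unit_price: 1500 },
    { id: "item_shirt", product_id: "prod_shirt", unit_price: 2500 },
  ],
  { address: { country_code: "de" } }
)
// => item_book resolves to the reduced 9% rate, item_shirt to the 19% default
```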
**What**
The internal service abstraction should return idempotently (as a no-op) when an empty array is given, in order to prevent a full soft deletion of all the data.
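A minimal sketch of the guard this implies (names are illustrative, not the actual service code):
```ts
// Illustrative guard: an empty input must be a no-op, otherwise an empty
// filter could end up matching, and soft deleting, every record.
async function softDeleteEntities(ids: string[]): Promise<string[]> {
  if (!ids.length) {
    // Return idempotently instead of issuing an unfiltered soft delete.
    return []
  }

  // ... perform the soft delete for the provided ids only
  return ids
}
```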
**What**
Workflow Engine API.
Endpoints for:
- List workflow executions
- Run a workflow
- Set async steps as success or failure
- Retrieve the details of a workflow run
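A rough sketch of how these endpoints might be called; the paths, payload shapes, and identifiers below are assumptions for illustration only:
```ts
// Hypothetical calls against the Workflow Engine API. Paths, payload shapes,
// and identifiers are assumptions.
const base = "http://localhost:9000/admin/workflows-executions"

// List workflow executions
await fetch(base)

// Run a workflow
await fetch(`${base}/my-workflow/run`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ input: { foo: "bar" } }),
})

// Set an async step as success (a sibling endpoint would mark failure)
await fetch(`${base}/my-workflow/steps/success`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ transaction_id: "trx_123", step_id: "step_1" }),
})

// Retrieve the details of a workflow run
await fetch(`${base}/my-workflow/trx_123`)
```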
## What
This PR adds support for generating OAS in the docblock generator tool.
## How
As OAS are generated in a different manner/location than regular TSDocs, it requires a new type of generator within the tool. As such, the existing docblock generator now only handles files that aren't under the `packages/medusa/src/api` and `packages/medusa/src/api-v2` directories. The new generator handles files under these directories. However, it only considers a node to be an API route if it's a function having two parameters of types `MedusaRequest` and `MedusaResponse` respectively. So, only new API Routes are considered.
The new generator runs the same way as the existing docblock generator with the same method. The generators will detect whether they can run on the file or not and the docblocks/oas are generated based on that. I've also added a `--type` option to the CLI commands of the docblock generator tool to further filter and choose which generator to use.
When the OAS generator finds an API route, it will generate its OAS under the `docs-util/oas-output/operations` directory in a TypeScript file. I chose to generate in TS files rather than YAML files to maintain the functionality of `medusa-oas` without major changes.
Schemas detected in the OAS operation, such as the request and response schemas, are generated as OAS schemas under the `docs-util/oas-output/schemas` directory and referenced in operations and other resources.
The OAS generator also handles updating OAS. When you run the same command on a file/directory and an API route already has OAS associated with it, its information and associated schemas are updated instead of generating new schemas/operations. However, summaries and descriptions aren't updated unless they're missing or their value is the default `SUMMARY` placeholder.
## API Route Handling
### Request and Response Types
The tool extracts the type of request/response schemas from the type arguments passed to the `MedusaRequest` and `MedusaResponse` respectively. For example:
```ts
export const POST = async (
  req: MedusaRequest<{
    id: string
  }>,
  res: MedusaResponse<ResponseType>
) => {
  // ...
}
```
If these types aren't provided, the request/response is considered empty.
### Path Parameters
Path parameters are extracted from the file's path name. For example, for `packages/medusa/src/api-v2/admin/campaigns/[id]/route.ts` the `id` path parameter is extracted.
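For illustration, the extraction can be thought of as pulling the bracketed segments out of the file path (a sketch, not the tool's actual code):
```ts
// Illustrative only: derive path parameters from bracketed path segments.
const filePath = "packages/medusa/src/api-v2/admin/campaigns/[id]/route.ts"

const pathParams = [...filePath.matchAll(/\[(\w+)\]/g)].map((match) => match[1])
// => ["id"], emitted as the `id` path parameter in the generated OAS
```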
### Query Parameters
The tool extracts the query parameters of an API route based on the type of `request.validatedQuery`. Once we narrow down how we're typing query parameters, we can revisit this implementation.
## Changes to Medusa Oas CLI
I added a `--v2` option to the Medusa OAS CLI to support loading OAS from the `docs-util/oas-output` directory rather than the `medusa` package. This will output the OAS in `www/apps/api-reference/specs`, wiping out old OAS. This is only helpful for testing purposes, to check how the new OAS looks in the API reference. It also allows us to slowly start adapting the new OAS.
## Other Notes and Changes
- I've added a GitHub action that creates a PR for generated OAS when Version Packages is merged (similar to regular TSDocs). However, this will only generate the OAS in the `docs-util/oas-output` directory and will not affect the existing OAS in the API reference. Once we're ready to include those OAS, we can talk about next steps.
- I've moved the base YAML from the `medusa` package to the `docs-util/oas-output/base` directory and changed the `medusa-oas` tool to load them from there.
- I added a `clean:oas` command to the docblock generator CLI tool that removes unused OAS operations, schemas, and tags from `docs-util/oas-output`. The tool also supports updating OAS operations and their associated schemas. However, I didn't add a specific mechanism to update schemas on their own, as that's a bit tricky and would require the help of typedoc. I believe running the tool on the `api-v2` directory whenever there's a new release should be enough to keep the associated schemas updated, but if we find that's not enough, we can revisit updating schemas individually.
- Because the `clean:oas` command makes changes to tags (removing existing ones; more details on this later), I've added new base YAML under `docs-util/oas-output/base-v2`. This is used by the tool when generating/cleaning OAS, and by the Medusa OAS CLI when the `--v2` option is used.
## Testing
### Prerequisites
To test with request/response types, I recommend minimally modifying `packages/medusa/src/types/routing.ts` to allow type arguments of `MedusaRequest` and `MedusaResponse`:
```ts
import type { NextFunction, Request, Response } from "express"
import type { Customer, User } from "../models"
import type { MedusaContainer } from "./global"

export interface MedusaRequest<T = unknown> extends Request {
  user?: (User | Customer) & { customer_id?: string; userId?: string }
  scope: MedusaContainer
}

export type MedusaResponse<T = unknown> = Response

export type MedusaNextFunction = NextFunction

export type MedusaRequestHandler = (
  req: MedusaRequest,
  res: MedusaResponse,
  next: MedusaNextFunction
) => Promise<void> | void
```
You can then add type arguments to the routes in `packages/medusa/src/api-v2/admin/campaigns/[id]/route.ts`. For example:
```ts
import {
  deleteCampaignsWorkflow,
  updateCampaignsWorkflow,
} from "@medusajs/core-flows"
import { ModuleRegistrationName } from "@medusajs/modules-sdk"
import { CampaignDTO, IPromotionModuleService } from "@medusajs/types"
import { MedusaRequest, MedusaResponse } from "../../../../types/routing"

interface ResponseType {
  campaign: CampaignDTO
}

export const GET = async (
  req: MedusaRequest,
  res: MedusaResponse<ResponseType>
) => {
  const promotionModuleService: IPromotionModuleService = req.scope.resolve(
    ModuleRegistrationName.PROMOTION
  )

  const campaign = await promotionModuleService.retrieveCampaign(
    req.params.id,
    {
      select: req.retrieveConfig.select,
      relations: req.retrieveConfig.relations,
    }
  )

  res.status(200).json({ campaign })
}

export const POST = async (
  req: MedusaRequest<{
    id: string
  }>,
  res: MedusaResponse<ResponseType>
) => {
  const updateCampaigns = updateCampaignsWorkflow(req.scope)
  const campaignsData = [
    {
      id: req.params.id,
      ...(req.validatedBody || {}),
    },
  ]

  const { result, errors } = await updateCampaigns.run({
    input: { campaignsData },
    throwOnError: false,
  })

  if (Array.isArray(errors) && errors[0]) {
    throw errors[0].error
  }

  res.status(200).json({ campaign: result[0] })
}

export const DELETE = async (
  req: MedusaRequest,
  res: MedusaResponse<{
    id: string
    object: string
    deleted: boolean
  }>
) => {
  const id = req.params.id
  const manager = req.scope.resolve("manager")
  const deleteCampaigns = deleteCampaignsWorkflow(req.scope)

  const { errors } = await deleteCampaigns.run({
    input: { ids: [id] },
    context: { manager },
    throwOnError: false,
  })

  if (Array.isArray(errors) && errors[0]) {
    throw errors[0].error
  }

  res.status(200).json({
    id,
    object: "campaign",
    deleted: true,
  })
}
```
### Generate OAS
- Install dependencies in the `docs-util` directory
- Run the following command in the `docs-util/packages/docblock-generator` directory:
```bash
yarn dev run "../../../packages/medusa/src/api-v2/admin/campaigns/[id]/route.ts"
```
This will generate the OAS operation and schemas as necessary and update the base YAML to include the new tags.
### Generate OAS with Examples
By default, the tool will only generate cURL examples for OAS operations. To generate templated JS Client and (placeholder) Medusa React examples, add the `--generate-examples` option to the command:
```bash
yarn dev run "../../../packages/medusa/src/api-v2/admin/campaigns/[id]/route.ts" --generate-examples
```
> Note: the command will update the existing OAS you generated in the previous test.
### Testing Updates
To test updating OAS, you can try updating request/response types, then running the command, and the associated OAS/schemas will be updated.
### Clean OAS
The `clean:oas` command will remove any unused operations, tags, or schemas. To test it out, you can try the following:
- Remove an API Route => this removes its associated operation and schemas (if not referenced anywhere else).
- Remove all references to a schema => this removes the schema.
- Remove all operations in `docs-util/oas-output/operations` associated with a tag => this removes the tag from the base YAML.
```bash
yarn dev clean:oas
```
> Note: when running this command, existing tags in the base YAML (such as Products) will be removed since there are no operations using them. As it runs on the base YAML under `base-v2`, this doesn't affect the base YAML used for the API reference.
### Medusa Oas CLI
- Install and build dependencies in the root of the monorepo
- Run the following command to generate reference OAS for v2 API Routes (must have generated OAS previously using the docblock generator tool):
```bash
yarn openapi:generate --v2
```
- This wipes out existing OAS in `www/apps/api-reference/specs` and replaces them with the new ones. At this point, you can view the new API routes in the API reference by running the `yarn dev` command in `www/apps/api-reference` (although not necessary for testing here).
- Run the command again without the `--v2` option:
```bash
yarn openapi:generate
```
The specs in `www/apps/api-reference/specs` are reverted to the old routes.
**What**
- always return an index expression, also for unique constraints
**Why**
- constraints can't be partial, meaning `UNIQUE ... WHERE` is not possible with constraints
- constraints are indices under the hood, so it doesn't change the behavior of the system when we're not using constraint-specific features but just using them for `UNIQUE`
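For illustration, a unique index expression can carry a partial condition where a constraint cannot. The migration and identifier names below are hypothetical:
```ts
// Hypothetical MikroORM migration showing why an index expression is used:
// a UNIQUE constraint can't take a WHERE clause, but a unique index can,
// e.g. to enforce uniqueness only among non-soft-deleted rows.
import { Migration } from "@mikro-orm/migrations"

export class ExampleUniqueIndexMigration extends Migration {
  async up(): Promise<void> {
    this.addSql(
      `CREATE UNIQUE INDEX IF NOT EXISTS "IDX_example_code_unique"
         ON "example" ("code")
         WHERE "deleted_at" IS NULL;`
    )
  }

  async down(): Promise<void> {
    this.addSql(`DROP INDEX IF EXISTS "IDX_example_code_unique";`)
  }
}
```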
**What**
Fix the test utils database to truly run the migrations; the migration path is deprecated and not used anymore, as umzug infers the path under the hood. It also fixes the schema handling so that it is possible to specify a different schema if needed.
The index helper can now create named constraints for uniqueness.
Extracted from [#6381](https://github.com/medusajs/medusa/pull/6381).
**Dashboard**
- Adds different views for managing manual/custom gift cards (not associated with a product gift card).
- Cleans up several table implementations to use new DataTable component.
- Minor cleanup of translation file.
**Medusa**
- Adds missing query params for list endpoints in the following admin domains: `/customers`, `/customer-groups`, `/collections`, and `/gift-cards`.
**UI**
- Adds new sizes for Badge component.
**Note for review**
Since this PR contains updates to the translation keys, it touches a lot of files. The parts relevant for review are: the `/gift-cards` domain of admin; the table overviews of collections, customers, and customer groups; and the changes to the list endpoints in the core.
> This is a proposal - not necessarily the end result - to kick off the discussion about the implementation of the new totals utilities
### What
Introduces a BigNumber class implementation, enabling us to work with high-precision numeric values.
**Scope**
- Introduce the BigNumber class
- Remain somewhat backward-compatible (in behavior)
- Establish a foundation for handling high-precision values in more complex scenarios
**Not in scope**
- The implementation will not address complex use cases. However, the concept introduced now should be open for extensibility, so this can be added later without major changes to the calculation logic
### How
There are significant changes to three areas in this PR:
- Schemas
- (De)-Serialization
- Totals calculations
**Schemas**
Domains that need high-precision values will have two DB columns for each value in the database: a standard numeric column and a raw value column.
The standard column is for basic operations like sorting and filtering in the database and is what should be publicly exposed in our API.
The raw value is initially used solely for precise calculations and is stored as a JSONB column. Keeping it as JSONB is flexible and will allow us to extend the concept in future iterations. As of now, the raw value will only require a single property `value`.
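As a sketch, a monetary property could be modeled along these lines (illustrative MikroORM definitions; entity name, fields, and column types are assumptions, not the exact entities in this PR):
```ts
// Illustrative only: the dual-column pattern described above.
import { Entity, PrimaryKey, Property } from "@mikro-orm/core"

@Entity()
class LineItem {
  @PrimaryKey({ columnType: "text" })
  id!: string

  // Standard numeric column: used for sorting/filtering and exposed in the API.
  @Property({ columnType: "numeric" })
  unit_price!: number

  // Raw value column: JSONB holding at least { value }, used for precise math.
  @Property({ columnType: "jsonb" })
  raw_unit_price!: { value: string }
}
```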
**(De)-Serialization**
We cast the raw JSONB value to a `BigNumberRawValue` when reading from the database.
We serialize the standard value to a `BigNumber` when reading from the database.
We use the standard numeric value to construct the raw value upon writing to the database.
For example, the unit price and raw unit price on line items will be inserted as follows:
```ts
@BeforeCreate()
onCreate() {
  this.id = generateEntityId(this.id, "cali")

  const asBigNumber = new BigNumber(this.raw_unit_price ?? this.unit_price)

  this.unit_price = asBigNumber.numeric
  this.raw_unit_price = asBigNumber.raw
}
```
**Totals calculations**
For totals calculations, we will use the [`bignumber.js`](https://github.com/MikeMcl/bignumber.js/) library. The library ships with a `BigNumber` class with arithmetic methods for precise calculations.
When we need to perform a calculation, we construct the BigNumber class from the library using the raw value from the database.
Let's have a look at an oversimplified example:
```ts
// create cart with line items
const [createdCart] = await service.create([
  {
    currency_code: "eur",
    items: [
      // li_1234
      {
        title: "test",
        quantity: 2,
        unit_price: 100,
      },
      // li_4321
      {
        title: "test",
        quantity: 3,
        // raw price creation
        unit_price: 200,
      },
    ],
  },
])
```
```ts
// calculating line item totals
import BN from "bignumber.js"
const lineItem1 = await service.retrieveLineItem("li_1234")
const lineItem2 = await service.retrieveLineItem("li_4321")
const bnUnitPrice1 = new BN(lineItem1.unit_price.raw)
const bnUnitPrice2 = new BN(lineItem2.unit_price.raw)
const line1Total = bnUnitPrice1.multipliedBy(lineItem1.quantity)
const line2Total = bnUnitPrice2.multipliedBy(lineItem2.quantity)
const total = line1Total.plus(line2Total)
```
**A note on backward compatibility**
Our BigNumber implementation is built to support the existing behavior of numeric values in the database. So even though we serialize the value to a BigNumber, you will still be able to treat it as a standard number, as we've always done.
For example, the following works perfectly fine:
```ts
const lineItem = await service.createLineItem({
  title: "test",
  quantity: 2,
  unit_price: 100,
})

console.log(lineItem.unit_price) // will print `100`
```
However, the type of `unit_price` will be `number | BigNumber`.
**What**
Initialize work on the fulfillment module entities.
This PR also includes the indexes; as I was working on some utilities, I thought it would make sense to test them directly.
This PR also adds a new utility to generate proper indexes for our entity properties. It also includes a new monkey patch for the migration generator to handle most of the if exists/not exists cases. The monkey patch is a workaround for the fact that the transpilation doesn't work well with the ECMA version used by MikroORM, so we end up with a constructor issue when MikroORM tries to instantiate the custom generator class extension.
**Comment**
- The rule part will likely evolve when we reach the point of rule-based data filtering, so no need for a detailed review, I believe
FIXES CORE-1714
FIXES CORE-1715
FIXES CORE-1718
FIXES CORE-1722
FIXES CORE-1723
Current schema diagram
