* chore(): Accept an extra argument 'all-or-nothing' on the migrate command
* Create rich-camels-brush.md
* chore(): Accept an extra argument 'all-or-nothing' on the migrate command
* chore(): Accept an extra argument 'all-or-nothing' on the migrate command
* chore(): Accept an extra argument 'all-or-nothing' on the migrate command
* chore(): fix broken down migrations
* chore(): update changeset
**What**
This PR introduces experimental Hot Module Replacement (HMR) for the Medusa backend, enabling developers to see code changes reflected immediately without restarting the server. This significantly improves the development experience by reducing iteration time.
### Key Features
- Hot reload support for:
  - API Routes
  - Workflows & Steps
  - Scheduled Jobs
  - Event Subscribers
  - Modules
- IPC-based architecture: The dev server runs in a child process, communicating with the parent watcher via IPC. When HMR fails, the child process is killed and restarted, ensuring clean resource cleanup.
- Recovery mechanism: Automatically recovers from broken module states without manual intervention.
- Graceful fallback: When HMR cannot handle a change (e.g., `medusa-config.ts`, `.env`), the server restarts completely.
### Architecture
```mermaid
flowchart TB
  subgraph Parent["develop.ts (File Watcher)"]
    W[Watch Files]
  end
  subgraph Child["start.ts (HTTP Server)"]
    R[reloadResources]
    R --> MR[ModuleReloader]
    R --> WR[WorkflowReloader]
    R --> RR[RouteReloader]
    R --> SR[SubscriberReloader]
    R --> JR[JobReloader]
  end
  W -->|"hmr-reload"| R
  R -->|"hmr-result"| W
```
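For illustration, here is a minimal sketch of the parent/child IPC loop described above, assuming a chokidar-style watcher; the function names and message shapes are illustrative, not the actual `develop.ts`/`start.ts` implementation (only the `hmr-reload`/`hmr-result` message names come from the diagram).

```ts
// develop.ts (parent watcher) — illustrative sketch only
import { fork, type ChildProcess } from "node:child_process"
import { watch } from "chokidar"

let child: ChildProcess

function startServer() {
  child = fork("./start.js") // the HTTP server runs in a child process
  child.on("message", (msg: { type: string; success: boolean }) => {
    // The child reports back whether the hot reload succeeded
    if (msg.type === "hmr-result" && !msg.success) {
      restartServer()
    }
  })
}

function restartServer() {
  // Killing the child discards broken module state and guarantees clean resource cleanup
  child.kill()
  startServer()
}

startServer()

watch("src").on("change", (file) => {
  // Ask the child to hot-reload the changed resource over IPC
  child.send({ type: "hmr-reload", file })
})
```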
### How to enable it
Backend HMR is behind a feature flag. Enable it by setting:
```ts
// medusa-config.ts
module.exports = defineConfig({
  featureFlags: {
    backend_hmr: true
  }
})
```
or
```bash
export MEDUSA_FF_BACKEND_HMR=true
```
or
```
# .env
MEDUSA_FF_BACKEND_HMR=true
```
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
RESOLVES CORE-1153
**What**
- This PR mainly lays the foundation for the caching layer. It ships with a cache module (a built-in in-memory cache) and a Redis provider.
- Applies caching to a few touch points as a first test (see the sketch below).
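As a rough illustration only, a cache module with a Redis provider might be registered like this; the package paths and option names below are assumptions, not confirmed by this PR.

```ts
// medusa-config.ts — hypothetical sketch; module/provider paths are assumed
import { defineConfig } from "@medusajs/framework/utils"

export default defineConfig({
  modules: [
    {
      resolve: "@medusajs/medusa/caching", // assumed module path
      options: {
        providers: [
          {
            resolve: "@medusajs/medusa/caching-redis", // assumed provider path
            id: "redis",
            options: { redisUrl: process.env.CACHE_REDIS_URL },
          },
        ],
      },
    },
  ],
})
```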
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
* chore(): Upgrade mikro orm
* handle 'null' value for big number props
* 6.5.2
* remove only
* fix pricing module rule value
* switch select in strategy for balances
* revert to select in strategy for order module
* fix defining DML ManyToOne
* fix define relationship
* test fix
* more fixes
* change order strategy to balanced
* change order strategy to balanced
* prevent unnecessary manager fork
* revert generated www changes
* remove unnecessary changes
* Create real-cobras-deny.md
* address feedback
---------
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
What:
- Added query planning to the Remote Joiner, enabling phased and parallel execution of data aggregation.
- Replaced object deletes with non-enumerable property hiding to improve performance.
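A minimal, generic illustration of the second point (not the Remote Joiner's actual code): instead of `delete`, which is comparatively slow in V8, the property is redefined as non-enumerable so it disappears from enumeration and serialization.

```ts
// Hide a property without deleting it: it is skipped by JSON.stringify,
// Object.keys, and for...in, while the object keeps its shape.
function hideProperty(obj: Record<string, unknown>, key: string): void {
  Object.defineProperty(obj, key, {
    value: obj[key],
    enumerable: false,
    configurable: true,
    writable: true,
  })
}

const row: Record<string, unknown> = { id: "prod_123", __temporary_ref: "x" }
hideProperty(row, "__temporary_ref")
console.log(JSON.stringify(row)) // {"id":"prod_123"}
```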
**What**
- Allow auto-loaded Medusa files to export a config object.
- Currently supports `isDisabled` to control loading.
- A new `FeatureFlag` instance is exported by `@medusajs/framework/utils`.
- `feature-flags` is now a supported folder for Medusa projects, modules, providers, and plugins. Flags defined there are loaded and registered on `FeatureFlag`.
**Why**
- Enables conditional loading of routes, migrations, jobs, subscribers, workflows, and other files based on feature flags.
```ts
// /src/feature-flags
import { FlagSettings } from "@medusajs/framework/feature-flags"
const CustomFeatureFlag: FlagSettings = {
key: "custom_feature",
default_val: false,
env_key: "FF_MY_CUSTOM_FEATURE",
description: "Enable xyz",
}
export default CustomFeatureFlag
```
```ts
// /src/modules/my-custom-module/migration/Migration20250822135845.ts
import { Migration } from "@mikro-orm/migrations"
// defineFileConfig is assumed to be exported from the same utils entry point as FeatureFlag
import { FeatureFlag, defineFileConfig } from "@medusajs/framework/utils"

export class Migration20250822135845 extends Migration {
  override async up() {}
  override async down() {}
}

// The migration is skipped entirely while the flag is off
defineFileConfig({
  isDisabled: () => !FeatureFlag.isFeatureEnabled("custom_feature"),
})
```
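The same pattern applies to other auto-loaded files; below is a hedged sketch of an API route, assuming `defineFileConfig` is importable from `@medusajs/framework/utils` like `FeatureFlag`.

```ts
// /src/api/admin/custom/route.ts — illustrative sketch
import { FeatureFlag, defineFileConfig } from "@medusajs/framework/utils"
import type { MedusaRequest, MedusaResponse } from "@medusajs/framework/http"

export async function GET(req: MedusaRequest, res: MedusaResponse) {
  res.json({ message: "only reachable when custom_feature is enabled" })
}

// The whole file is skipped at load time while the flag is off
defineFileConfig({
  isDisabled: () => !FeatureFlag.isFeatureEnabled("custom_feature"),
})
```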
* feat: add view_configurations feature flag
- Add feature flag provider and hooks to admin dashboard
- Add backend API endpoint for feature flags
- Create view_configurations feature flag (disabled by default)
- Update order list table to use legacy version when flag is disabled
- Can be enabled with MEDUSA_FF_VIEW_CONFIGURATIONS=true env var
* fix: naming
* fix: feature flags unauthenticated
* fix: add test
* feat: add settings module
* fix: deps
* fix: cleanup
* fix: add more tests
* fix: rm changelog
* fix: deps
* fix: add settings module to default modules list
* fix: pr comments
* fix: deps,build
* fix: alias
* fix: tests
* fix: update snapshots
**What**
- Add a missing index for the query that falls into the select-in strategy, specifically the heaviest one filtering with variant `IN`
- Allow forcing the strategy sent to the entry-point module when using the graph API. This is useful with big datasets where filtering without pagination is enough and select-in can offer better performance
e.g.
```ts
await query.graph({
  entity: 'product',
  fields,
  filters,
  strategy: 'select-in' // <-- forces the module to receive this value and apply it accordingly
})
```
* feat(index): add filterable fields to link definition
* rm comment
* break recursion
* validate read only links
* validate filterable
* gql schema array
* link parents
* isInverse
* push id when not present
* Fix circular relationships and add tests to ensure proper behaviour (part 1)
* log and fallback to entity.alias
* cleanup and fixes
* cleanup and fixes
* cleanup and fixes
* fix get attributes
* gql type
* unit test
* array inference
* rm only
* package.json
* package.json
* fix link retrieval on duplicated entity type and aliases + tests
* link parents as array
* Match only parent entity
* rm comment
* remove hard coded schema
* extend types
* unit test
* test
* types
* pagination type
* type
* fix integration tests
* Improve performance of in selection
* use @@ to filter property
* escape jsonPath
* add Event Bus by default
* changeset
* rm postgres analyze
* estimate count
* new query
* parent aliases
* inner query w/ filter and sort relations
* address comments
---------
Co-authored-by: adrien2p <adrien.deperetti@gmail.com>
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
* feat: add createMultiple flag to enforce inApp link uniqueness
* changes
* mocks
* default
* many to many
---------
Co-authored-by: Carlos R. L. Rodrigues <rodrigolr@gmail.com>
* fix(): Allow to query deleted records from graph API
* fix(): Allow to query deleted records from graph API
* handle empty q value
* update staled at sync
* rename integration tests file
* Create strong-houses-marry.md
* try to fix flaky tests
* fix pricing context
* update changeset
* update changeset
* fix import
* skip test for now
---------
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
**What**
- Add the index engine feature flag
- Apply it to the `store/products` endpoint as well as `admin/products`
- Various query builder fixes
- Search capabilities over the full data of every entity. The `q` search is applied to all joined tables involved in the selection/where clauses (see the sketch after this list)
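For example, the behaviour can be exercised against the flagged endpoint with a plain request; the search semantics behind `q` are handled by the index engine. The env var name below is illustrative.

```ts
// Hitting the store endpoint with a q search; the publishable key header is
// the usual store-API requirement and unrelated to this PR.
const res = await fetch("http://localhost:9000/store/products?q=shirt", {
  headers: { "x-publishable-api-key": process.env.MEDUSA_PUBLISHABLE_KEY! },
})
const { products } = await res.json()
console.log(products.length)
```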
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
What:
- `query.index` helper. It queries the Index Module and aggregates the rest of the requested fields/relations if needed, like `query.graph`.
Not covered in this PR:
- Hydrate only the sub-entities returned by the query. Example: if 1 out of 5 variants is returned, only that single entity's data should be hydrated; currently all of the product's variants are merged.
- Generate types for the indexed data
Example:
```ts
const query = container.resolve(ContainerRegistrationKeys.QUERY)
await query.index({
  entity: "product",
  fields: [
    "id",
    "description",
    "status",
    "variants.sku",
    "variants.barcode",
    "variants.material",
    "variants.options.value",
    "variants.prices.amount",
    "variants.prices.currency_code",
    "variants.inventory_items.inventory.sku",
    "variants.inventory_items.inventory.description",
  ],
  filters: {
    "variants.sku": { $like: "%-1" },
    "variants.prices.amount": { $gt: 30 },
  },
  pagination: {
    order: {
      "variants.prices.amount": "DESC",
    },
  },
})
```
This query returns all products where at least one variant has an SKU ending in `-1` and at least one price greater than `30`.
The Index Module only holds the data used to paginate and filter, so the object it returns is:
```json
{
  "id": "prod_01JKEAM2GJZ14K64R0DHK0JE72",
  "title": null,
  "variants": [
    {
      "id": "variant_01JKEAM2HC89GWS95F6GF9C6YA",
      "sku": "extra-variant-1",
      "prices": [
        {
          "id": "price_01JKEAM2JADEWWX72F8QDP6QXT",
          "amount": 80,
          "currency_code": "USD"
        }
      ]
    }
  ]
}
```
All the rest of the fields will be hydrated from their respective modules, and the final result will be:
```json
{
  "id": "prod_01JKEAY2RJTF8TW9A23KTGY1GD",
  "description": "extra description",
  "status": "draft",
  "variants": [
    {
      "sku": "extra-variant-1",
      "barcode": null,
      "material": null,
      "id": "variant_01JKEAY2S945CRZ6X4QZJ7GVBJ",
      "options": [
        {
          "value": "Red"
        }
      ],
      "prices": [
        {
          "amount": 20,
          "currency_code": "CAD",
          "id": "price_01JKEAY2T2EEYSWZHPGG11B7W7"
        },
        {
          "amount": 80,
          "currency_code": "USD",
          "id": "price_01JKEAY2T2NJK2E5468RK84CAR"
        }
      ],
      "inventory_items": [
        {
          "variant_id": "variant_01JKEAY2S945CRZ6X4QZJ7GVBJ",
          "inventory_item_id": "iitem_01JKEAY2SNY2AWEHPZN0DDXVW6",
          "inventory": {
            "sku": "extra-variant-1",
            "description": "extra variant 1",
            "id": "iitem_01JKEAY2SNY2AWEHPZN0DDXVW6"
          }
        }
      ]
    }
  ]
}
```
Co-authored-by: Adrien de Peretti <25098370+adrien2p@users.noreply.github.com>
Closes: FRMW-2892, FRMW-2893
**What**
Wired up the building blocks that we merged previously in order to manage data synchronization. The flow is as follows (a sketch of the per-entity loop follows the list):
- On application start
  - Build the schema object representation from the configuration
  - Check for configuration changes
  - If new entities are configured
    - The data synchronizer initializes the orchestrator and starts the sync
    - For each entity
      - acquire a lock
      - mark existing data as staled
      - sync all data by batch
        - mark synced rows as no longer staled
        - acknowledge each processed batch and renew the lock
        - update the metadata with the last synced cursor for entity X
      - release the lock
      - remove all remaining staled data
  - If any entities were removed from the last configuration
    - remove their index data and relations
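A rough sketch of the per-entity loop above, under the assumption of a generic lock/cursor API; the interfaces are placeholders, not the actual data synchronizer types.

```ts
// Illustrative sketch of the sync loop; not the actual implementation.
interface Lock {
  renew(): Promise<void>
  release(): Promise<void>
}

interface Batch {
  rows: unknown[]
  cursor: string
}

interface SyncDeps {
  acquireLock(entity: string): Promise<Lock>
  lastCursor(entity: string): Promise<string | null>
  saveCursor(entity: string, cursor: string): Promise<void>
  batches(entity: string, cursor: string | null): AsyncIterable<Batch>
  markAsStaled(entity: string): Promise<void>
  upsert(entity: string, rows: unknown[]): Promise<void>
  removeStaled(entity: string): Promise<void>
}

async function syncEntity(entity: string, deps: SyncDeps): Promise<void> {
  const lock = await deps.acquireLock(entity)
  try {
    await deps.markAsStaled(entity) // mark existing index data as staled
    let cursor = await deps.lastCursor(entity)
    for await (const batch of deps.batches(entity, cursor)) {
      await deps.upsert(entity, batch.rows) // synced rows are no longer staled
      cursor = batch.cursor
      await deps.saveCursor(entity, cursor) // acknowledge the batch, persist the cursor
      await lock.renew() // keep the lock alive between batches
    }
    await deps.removeStaled(entity) // whatever is still staled was removed upstream
  } finally {
    await lock.release()
  }
}
```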
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
Fixes: FRMW-2858
This PR merges the modules exported by plugins with the modules defined within the user config. As a result, all modules get loaded without changing the internals of the loader.
However, you cannot disable the module of a plugin by re-adding it to the `modules` array. That is something we should handle separately.
We've added the breaking change label because of the following fix:
We broke the ability to completely disable modules in past PRs; this PR re-adds the ability to disable a module so that it does not get loaded at all. ([here](6dd164f783))
Co-authored-by: Adrien de Peretti <25098370+adrien2p@users.noreply.github.com>
* feat(auth-google,auth-github): Allow passing a custom callbackUrl to oauth providers
* feat: Add state management in auth providers
* chore: Changes based on PR review
* chore: Update modules providers configuration with 'identifier' and 'PROVIDER'
* update check
* fix tests
* type
* normalize auth provider
* emailpass
* providers
---------
Co-authored-by: Carlos R. L. Rodrigues <rodrigolr@gmail.com>
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
What:
* Removes the resource type ("shared" or "isolated") from internal modules.
* Modules can have an isolated database connection by providing a database config as part of their options in `medusa-config` (a hedged sketch follows below).
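As a hedged sketch only: the option key and shape for a module-level database config are assumptions here, not taken from this PR.

```ts
// medusa-config.ts — hypothetical; the `database` option shape is assumed
import { defineConfig } from "@medusajs/framework/utils"

export default defineConfig({
  modules: [
    {
      resolve: "./src/modules/hello",
      options: {
        // With an isolated connection, this module talks to its own database
        database: {
          clientUrl: process.env.HELLO_MODULE_DATABASE_URL,
        },
      },
    },
  ],
})
```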
CLOSES: FRMW-2593
Fixes: FRMW-2742
In this PR, we fix the build output of the backend source code, which eliminates a lot of magic between the development and production environments.
Right now, we only compile the source files from the `src` directory and write them within the `dist` directory.
**Here's what the `src` directory with a custom module looks like**
```
src
├── modules
│   └── hello
│       ├── index.ts
```
**Here's the build output**
```
dist
├── modules
│   └── hello
│       ├── index.js
```
Let's imagine a file at the root of your project (maybe the `medusa-config.js` file) that wants to import the `modules/hello/index` file. How can we ensure that the import will work in both the development and production environments?
If we write the import targeting the `src` directory, it will break in production because it should target the `dist` directory.
## Solution
The solution is to compile everything within the project and mimic the file structure in the build output, not just the `src` directory.
**Here's what the fixed output should look like**
```
dist
├── src
│   ├── modules
│   │   └── hello
│   │       ├── index.js
├── medusa-config.js
├── yarn.lock
├── package.json
```
If you notice carefully, we also have `medusa-config.js`, `yarn.lock`, and `package.json` within the `dist` directory. We do so to create a standalone built application, something you can copy/paste to your server and run without relying on the original source code.
- This results in smaller containers since you are not copying unnecessary files.
- Clear distinction between the development and the production code. If you want to run the production server, then `cd` into the `dist` directory and run it from there.
## Changes in the PR
- Breaking: Remove the `dist` and `build` folders. Instead, write the production artefacts within the `.medusa` directory as `.medusa/admin` and `.medusa/server`.
- Breaking: Change the output of the `.medusa/server` folder to mimic the root project structure.
- Refactor: Remove the `Symbol.for("ts-node.register.instance")` check used to determine where to load the source code from.
- Refactor: Use `tsc` for creating the production build. This ensures we respect `tsconfig` settings when creating the build and also perform type-checking.
Co-authored-by: Adrien de Peretti <25098370+adrien2p@users.noreply.github.com>