**What**
- Fix the product category repository not passing the context down to lower-level methods
- Ensure the base repository, when getting the active manager, returns a fresh one if possible instead of the global one, to prevent an accidentally shared entity map (see the sketch below)
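A minimal sketch of the idea, assuming MikroORM-style entity managers; the class and method names are illustrative, not the actual base repository code:

```ts
// Illustrative only: prefer a forked entity manager over the globally shared
// one so each call gets its own identity (entity) map.
import { EntityManager } from "@mikro-orm/core"

type SharedContext = { transactionManager?: EntityManager; manager?: EntityManager }

class BaseRepository {
  constructor(protected readonly manager_: EntityManager) {}

  protected getActiveManager_(context: SharedContext = {}): EntityManager {
    // Prefer the manager carried by the shared context (e.g. an open
    // transaction), and only then fall back to a fresh fork of the global
    // manager instead of returning the global manager itself.
    return context.transactionManager ?? context.manager ?? this.manager_.fork()
  }
}
```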
* chore: Update modules providers configuration with 'identifier' and 'PROVIDER'
* update check
* fix tests
* type
* normalize auth provider
* emailpass
* providers
---------
Co-authored-by: Carlos R. L. Rodrigues <rodrigolr@gmail.com>
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
What:
* removes the resources type "shared" or "isolated" from internal modules.
* modules can have an isolated database connection by providing a database config as part of their options in `medusa-config` (see the sketch below)
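As a rough illustration (the module key, resolve path, and option names such as `database` and `clientUrl` are assumptions, not the confirmed API), a module-level database config could look like this in `medusa-config`:

```ts
// medusa-config (illustrative only)
export default {
  projectConfig: {
    databaseUrl: process.env.DATABASE_URL,
  },
  modules: {
    helloModuleService: {
      resolve: "./modules/hello",
      options: {
        // A database config inside the module options gives this module its
        // own isolated connection instead of the shared one.
        database: {
          clientUrl: process.env.HELLO_MODULE_DATABASE_URL,
          schema: "public",
        },
      },
    },
  },
}
```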
CLOSES: FRMW-2593
* chore: add compare_at_unit_price when price list price is retrieved
* chore: add test for update item + more fixes along the way
* chore: fix tests
* chore: add refresh spec
* Apply suggestions from code review
Co-authored-by: Adrien de Peretti <adrien.deperetti@gmail.com>
* chore: use undefined checker
* chore: switch to map
---------
Co-authored-by: Adrien de Peretti <adrien.deperetti@gmail.com>
**What**
- validate that variants are unique with respect to options on product update/create and variant update/create (see the sketch after this list)
- validate that the product has options upon creation
- ensure variants have the same number of option values as the product has options
- admin error handling
- update tests
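To make the new rules concrete, here is a minimal sketch of the validation described above; the input shapes and error messages are simplified assumptions, not the actual implementation:

```ts
// Illustrative sketch of the validation rules: every variant must provide a
// value for each product option, and no two variants may share the same
// combination of option values.
type VariantInput = { title: string; options: Record<string, string> }
type ProductInput = { options: { title: string }[]; variants: VariantInput[] }

function validateVariantOptions(product: ProductInput): void {
  const optionCount = product.options.length
  const seen = new Set<string>()

  for (const variant of product.variants) {
    const values = Object.entries(variant.options)

    // Variants must have the same number of option values as the product has options.
    if (values.length !== optionCount) {
      throw new Error(
        `Variant "${variant.title}" must define ${optionCount} option value(s)`
      )
    }

    // Variants must be unique with respect to their option values.
    const key = values
      .sort(([a], [b]) => a.localeCompare(b))
      .map(([name, value]) => `${name}=${value}`)
      .join("|")

    if (seen.has(key)) {
      throw new Error(
        `Variant "${variant.title}" duplicates an existing combination of options`
      )
    }
    seen.add(key)
  }
}
```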
---
FIXES FRMW-2707 CC-556
**What**
- remove cascade delete of inventory items on product delete
- implement inventory deletion in product/variant delete workflows with the following checks:
  - a product/variant cannot be deleted if there are reservations associated with its inventory items
  - an inventory item is cascade-deleted if it is not used by other variants (that are not being deleted in the current flow); see the sketch below
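A simplified outline of how these checks could fit together; the link shape, reservation lookup, and function name are assumptions for illustration, not the workflow's actual code:

```ts
// Hypothetical outline of the checks performed in the delete workflows.
interface InventoryLink { variant_id: string; inventory_item_id: string }

function resolveInventoryItemsToDelete(
  deletedVariantIds: string[],
  links: InventoryLink[],
  reservedInventoryItemIds: Set<string>
): string[] {
  const deleted = new Set(deletedVariantIds)

  // Group links by inventory item to see which variants still reference it.
  const usage = new Map<string, string[]>()
  for (const link of links) {
    const list = usage.get(link.inventory_item_id) ?? []
    list.push(link.variant_id)
    usage.set(link.inventory_item_id, list)
  }

  const toDelete: string[] = []
  for (const [inventoryItemId, variantIds] of usage) {
    const onlyUsedByDeletedVariants = variantIds.every((id) => deleted.has(id))
    if (!onlyUsedByDeletedVariants) {
      continue // still used by surviving variants, keep it
    }
    if (reservedInventoryItemIds.has(inventoryItemId)) {
      // Reservations exist: the product/variant delete should fail instead.
      throw new Error(
        `Cannot delete: inventory item ${inventoryItemId} has reservations`
      )
    }
    toDelete.push(inventoryItemId)
  }
  return toDelete
}
```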
---
FIXES CC-581 CC-582
**What**
The module service name casing has changed. The hash was generated from the non-lowercased version of the name, while everything after the hash generation relies on the lowercased version of the table name generated from the service names, which leads to a different hash computation. This PR updates the link migration table to adjust the to/from module values to their new values, and generates the hash from the lowercased version of the computed table name.
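A rough illustration of the intended behavior; the helper names and the choice of hash function are assumptions, not the actual Medusa internals:

```ts
// The important part: the hash is computed from the lowercased table name,
// not from the raw (possibly re-cased) service names.
import { createHash } from "crypto"

function computeLinkTableName(fromService: string, toService: string): string {
  return `${fromService}_${toService}`.toLowerCase()
}

function computeLinkHash(fromService: string, toService: string): string {
  const tableName = computeLinkTableName(fromService, toService)
  // Hashing the lowercased name keeps the result stable regardless of how
  // the service names are cased.
  return createHash("md5").update(tableName).digest("hex")
}
```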
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
Fixes: FRMW-2742
In this PR, we fix the build output of the backend source code, which eliminates a lot of magic between the development and production environments.
Right now, we only compile the source files from the `src` directory and write them within the `dist` directory.
**Here's what the `src` directory with a custom module looks like**
```
src
├── modules
│   └── hello
│       ├── index.ts
```
**Here's the build output**
```
dist
├── modules
│   └── hello
│       ├── index.js
```
Let's imagine a file at the root of your project (maybe the `medusa-config.js` file) that wants to import the `modules/hello/index` file. How can we ensure that the import will work in both the development and production environments?
If we write the import targeting the `src` directory, it will break in production because it should target the `dist` directory.
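For illustration only (hypothetical paths, not project code):

```ts
// A file at the project root, e.g. medusa-config.js

// Resolves in development (TypeScript sources), breaks in production:
const helloFromSrc = require("./src/modules/hello")

// Resolves in production (compiled output), breaks in development:
const helloFromDist = require("./dist/modules/hello")
```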
## Solution
The solution is to compile everything within the project and mimic the file structure in the build output, not just the `src` directory.
**Here's what the fixed output looks like**
```
dist
├── src
│   ├── modules
│   │   └── hello
│   │       ├── index.js
├── medusa-config.js
├── yarn.lock
├── package.json
```
If you look carefully, we also have `medusa-config.js`, `yarn.lock`, and `package.json` within the `dist` directory. We do so to create a standalone built application: something you can copy to your server and run without relying on the original source code.
- This results in smaller containers, since you are not copying unnecessary files.
- Clear distinction between the development and the production code. If you want to run the production server, then `cd` into the `dist` directory and run it from there.
## Changes in the PR
- Breaking: Remove the `dist` and `build` folders. Instead, write the production artefacts within the `.medusa` directory as `.medusa/admin` and `.medusa/server`.
- Breaking: Change the output of the `.medusa/server` folder to mimic the root project structure.
- Refactor: Remove the `Symbol.for("ts-node.register.instance")` check used to determine where to load the source code from.
- Refactor: Use `tsc` for creating the production build. This ensures we respect `tsconfig` settings when creating the build and also perform type-checking.
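A conceptual sketch of the build step under these assumptions (the actual implementation differs in details):

```ts
// Compile the project with tsc so tsconfig settings are respected and
// type-checking runs, then copy the root files that make the build a
// standalone application.
import { execSync } from "child_process"
import { copyFileSync, mkdirSync } from "fs"
import { join } from "path"

const outDir = join(process.cwd(), ".medusa", "server")
mkdirSync(outDir, { recursive: true })

// The output mimics the root project structure (e.g. src/... stays under src/).
execSync(`npx tsc --outDir "${outDir}"`, { stdio: "inherit" })

for (const file of ["medusa-config.js", "package.json", "yarn.lock"]) {
  copyFileSync(join(process.cwd(), file), join(outDir, file))
}
```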
Co-authored-by: Adrien de Peretti <25098370+adrien2p@users.noreply.github.com>
What
- Store result of cart-completion workflow for three days by default
- This enables the built-in idempotency mechanism to kick in, provided the same transaction ID is used across workflow executions (see the example after this list)
- Return order from cart-completion workflow if the cart has already been completed
- In case the transaction ID is not reused across workflow executions, we still want to complete a cart only once
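As an illustration of the idempotency behavior (the exact shape of the run options, in particular how the transaction ID is passed, is an assumption here, not the confirmed API):

```ts
import { completeCartWorkflow } from "@medusajs/core-flows"

export async function completeCartOnce(container: any, cartId: string) {
  // Reusing the cart ID as the transaction ID lets retries resolve to the
  // stored workflow result instead of re-running the completion, and an
  // already-completed cart simply returns its order.
  const { result } = await completeCartWorkflow(container).run({
    input: { id: cartId },
    transactionId: cartId,
  })
  return result
}
```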
FIXES: FRMW-2727
MikroORM (with version 5.9) has [hardcoded the TypeScript module](https://github.com/mikro-orm/mikro-orm/blob/5.x/packages/core/src/utils/ConfigurationLoader.ts#L138-L139) system to `commonjs`, which makes it incompatible with the module system we are using, ie `Node16`.
So, in order to continue using the Mikro ORM CLI within our modules, we will have to monkey-patch the block of code responsible for configuring `ts-node`. However, the monkey-patching must be done before their CLI gets booted.
As a result of this, we have to create a wrapper CLI on top of Mikro ORM CLI that performs the following steps.
- Monkey-patch the relevant code
- Register Mikro ORM CLI as the second step.
Due to this, we will have to use this new wrapper CLI within the modules, which is exposed as `medusa-db`. Maybe `medusa-db` is not a great name, so please send your suggestions.
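To make the idea concrete, here is a rough sketch of the wrapper, assuming the patched method is `ConfigurationLoader.registerTsNode` and using a placeholder for the real CLI entry point (this is not the actual `medusa-db` implementation):

```ts
#!/usr/bin/env node
// Step 1: monkey-patch the code that hardcodes `module: "commonjs"` for
// ts-node, registering ts-node ourselves without overriding the module system.
const { ConfigurationLoader } = require("@mikro-orm/core")

ConfigurationLoader.registerTsNode = async function () {
  require("ts-node").register({ transpileOnly: true })
  return true
}

// Step 2: boot the real Mikro ORM CLI only after the patch is in place
// (the require path below is a placeholder, not the actual entry point).
require("@mikro-orm/cli")
```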
**What**
- hide disabled fulfilment providers on the admin
- check if the fulfilment provider has an identifier when loading providers
---
FIXES CC-549
**Why**
- the price set update uses `upsertWithReplace`, so price list prices were being removed, since we "ignore" them in regular price set operations and manage them through the price list methods instead
**What**
- preserve price list prices when updating price sets
---
FIXES CC-516
What
- Require `code` on Tax Rates
- Update dashboard to account for non-nullable code on Tax Rates
- Include `automatic_taxes` in API Route response
Closes CC-524 CC-525