**What**
- Reworks how admin extensions are loaded from plugins.
- Reworks how extensions are managed internally in the dashboard project.
**Why**
- Previously we loaded extensions from plugins the same way we do for extensions found in a user's application: scanning the source code in `.medusa/server/src/admin` for possible extensions and including any discovered extensions in the final virtual modules.
- This caused issues with how Vite optimizes dependencies and would lead to CJS/ESM errors. The exact root cause is unclear, but the issue was pinpointed to Vite not correctly registering which dependencies to optimize when they were loaded through the virtual module from a plugin in `node_modules`.
**What changed**
- To circumvent the above issue, we have switched to a different strategy for loading extensions from plugins. The changes are the following:
- We now build plugins slightly differently: if a plugin has admin extensions, we build them to `.medusa/server/src/admin/index.mjs` and `.medusa/server/src/admin/index.js` for the ESM and CJS builds respectively.
- When determining how to load extensions from a source, we follow these rules (a minimal sketch follows the list):
- If the source has a `medusa-plugin-options.json` or is the root application, we treat it as a `local` extension source and load its extensions through a virtual module, as before.
- If it has neither of the above but has a `./admin` export in its `package.json`, we treat it as a `package` extension source and update the dashboard entry point to import the package and pass its extensions along to the dashboard manager.
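As a minimal sketch of these rules (the type and helper names below are hypothetical, not taken from the actual implementation):

```ts
// Hypothetical sketch of the resolution rules above; all names are illustrative.
type ExtensionSourceType = "local" | "package"

interface ExtensionSource {
  isRootApplication: boolean
  hasMedusaPluginOptions: boolean // a `medusa-plugin-options.json` file is present
  hasAdminExport: boolean // a `./admin` entry exists in the package.json `exports`
}

function resolveExtensionSourceType(
  source: ExtensionSource
): ExtensionSourceType | null {
  if (source.isRootApplication || source.hasMedusaPluginOptions) {
    // loaded through the virtual module, as before
    return "local"
  }
  if (source.hasAdminExport) {
    // imported directly from the package's `./admin` export and passed
    // along to the dashboard manager
    return "package"
  }
  return null
}
```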
**Changes required by plugin authors**
- This is not a breaking change, but it requires plugin authors to update their plugin's `package.json` to also include a `./admin` export. It should look like this:
```json
{
  "name": "@medusajs/plugin",
  "version": "0.0.1",
  "description": "A starter for Medusa plugins.",
  "author": "Medusa (https://medusajs.com)",
  "license": "MIT",
  "files": [
    ".medusa/server"
  ],
  "exports": {
    "./package.json": "./package.json",
    "./workflows": "./.medusa/server/src/workflows/index.js",
    "./.medusa/server/src/modules/*": "./.medusa/server/src/modules/*/index.js",
    "./modules/*": "./.medusa/server/src/modules/*/index.js",
    "./providers/*": "./.medusa/server/src/providers/*/index.js",
    "./*": "./.medusa/server/src/*.js",
    "./admin": {
      "import": "./.medusa/server/src/admin/index.mjs",
      "require": "./.medusa/server/src/admin/index.js",
      "default": "./.medusa/server/src/admin/index.js"
    }
  }
}
```
* ../../core/types/src/dml/index.ts
* ../../core/types/src/dml/index.ts
* fix: relationships mapping
* handle nullable foreign keys types
* handle nullable foreign keys types
* handle nullable foreign keys types
* continue to update product category repository
* fix all product category repositories issues
* fix product category service types
* fix product module service types
* fix product module service types
* fix repository template type
* refactor: use a singleton DMLToMikroORM factory instance
Since the MikroORM MetadataStorage is global, the DML-to-MikroORM entity conversion has to use a global bucket as well
* refactor: update product module to use DML in tests
* wip: tests
* WIP product linkable fixes
* continue type fixing and start test fixing
* test: fix more tests
* fix repository
* fix pivot table computation + fix mikro orm repository
* fix many to many management and configuration
* fix many to many management and configuration
* fix many to many management and configuration
* update product tag relation configuration
* Introduce experimental dml hooks to fix some issues with categories
* more fixes
* fix product tests
* add missing id prefixes
* fix product category handle management
* test: fix more failing tests
* test: make it all green
* test: fix breaking tests
* fix: build issues
* fix: build issues
* fix: more breaking tests
* refactor: fix issues after merge
* refactor: fix issues after merge
* refactor: suppress types error
* test: fix DML failing tests
* improve many to many inference + tests
* Wip fix columns from product entity
* remove product model before create hook and manage handle validation and transformation at the service level
* test: fix breaking unit tests
* fix: product module service to not update handle on product update
* fix define link and joiner config
* test: fix joiner config test
* test: fix joiner config test
* fix joiner config primary keys
* Fix joiner config builder
* Fix joiner config builder
* test: remove only modifier from test
* refactor: remove hooks usage from product collection
* refactor: remove hooks usage from product-option
* refactor: remove hooks usage for computing category handle
* refactor: remove hooks usage from productCategory model
* refactor: remove hooks from DML
* refactor: remove cruft
* order dml
* cleanup
* re-add foreign key indexes
* wip
* chore: remove unused types
* wip
* changes
* rm raw
* autoincrement
* wip
* rel
* refactor: cleanup
* migration and models configuration adjustments
* cleanup
* number searchable
* fix random ordering
* fix
* test: fix product-category tests
* test: update breaking DML tests
* test: array assertion to not care about ordering
* fix: temporarily apply id ordering for products
* types
* wip
* WIP type improvements
* update order models
* partially fix types temporarily
* rel
* fix: recursive type issue
* improve type inference breaks
* improve type inference breaks
* update models
* rm nullable
* default value
* repository
* update default value handling
* fix unit tests
* WIP
* toMikroORM
* fix relations
* cascades
* fix
* experimental dml hooks
* rm migration
* serial
* nullable autoincrement
* fix model
* model changes
* fix one to one DML
* order test
* fix addresses
* fix unit tests
* Re-align dml entity name inference
* update model table name config
* update model table name config
* revert
* update return relation
* WIP
* hasOne
* models
* fix
* model
* initial commit
* cart service
* order module
* utils unit test
* index engine
* changeset
* merge
* fix hasOne with fk
* update
* free text filter per entity
* tests
* prod category
* property string many to many
* fix big number
* link modules migration set names
* merge
* shipping option rules
* serializer
* unit test
* fix test mikro orm init
* fix test mikro orm init
* Maintain merge object properties
* fix test mikro orm init
* prevent unit test from connecting to db
* wip
* fix test
* fix test
* link test
* schema
* models
* auto increment
* hook
* model hooks
* order
* wip
* orm version
* request return field
* fix return configuration on order model
* workflows
* core flows
* unit test
* test
* base repo
* test
* base repo
* test fix
* inventory move
* locking inventory
* test
* free text fix
* rm timeout mock
* migrate fulfillment values
* v6.4.3
* cleanup
* link-modules update sql
* revert test
* remove fake timers
---------
Co-authored-by: adrien2p <adrien.deperetti@gmail.com>
Co-authored-by: Harminder Virk <virk.officials@gmail.com>
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
What:
* removes resources type "shared" or "isolated" from internal modules.
* modules can have an isolated database connection by providing a database config as part of their options in `medusa-config` (see the sketch below)
CLOSES: FRMW-2593
Fixes: FRMW-2742
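As a hedged illustration of the second point, a module with an isolated connection could be configured roughly like this in `medusa-config`; the keys under `options.database` are assumptions for illustration, not confirmed by this change.

```ts
// medusa-config — hedged sketch; the `database` option keys are illustrative
// assumptions, not the actual configuration schema.
export default {
  modules: {
    helloModule: {
      resolve: "./modules/hello",
      options: {
        // providing a database config gives this module its own isolated
        // connection instead of the shared application connection
        database: {
          clientUrl: process.env.HELLO_MODULE_DATABASE_URL,
        },
      },
    },
  },
}
```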
In this PR, we fix the build output of the backend source code, which eliminates a lot of magic between the development and production environments.
Right now, we only compile the source files from the `src` directory and write them within the `dist` directory.
**Here's what the `src` directory with a custom module looks like**
```
src
├── modules
│   └── hello
│       ├── index.ts
```
**Here's the build output**
```
dist
├── modules
│   └── hello
│       ├── index.js
```
Let's imagine a file at the root of your project (maybe the `medusa-config.js` file) that wants to import the `modules/hello/index` file. How can we ensure that the import will work in both the development and production environments?
If we write the import targeting the `src` directory, it will break in production because it should target the `dist` directory.
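For instance, a root-level import like the following hypothetical snippet (the `HelloModule` name is illustrative) only resolves correctly in one of the two environments:

```ts
// Hypothetical root-level file (think `medusa-config`). In development this
// import resolves to ./src/modules/hello/index.ts, but after the build the
// compiled file lives under ./dist/modules/hello/index.js, so the same
// specifier cannot work unchanged in both environments.
import { HelloModule } from "./src/modules/hello"
```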
## Solution
The solution is to compile everything within the project and mimic the file structure in the build output, not just the `src` directory.
**Here's how the fixed output should look**
```
dist
├── src
│   ├── modules
│   │   └── hello
│   │       ├── index.js
├── medusa-config.js
├── yarn.lock
├── package.json
```
If you look carefully, we also have `medusa-config.js`, `yarn.lock`, and `package.json` within the `dist` directory. We do so to create a standalone built application, something you can copy/paste to your server and run without relying on the original source code.
- This results in smaller containers since you are not copying unnecessary files.
- Clear distinction between the development and the production code. If you want to run the production server, then `cd` into the `dist` directory and run it from there.
## Changes in the PR
- Breaking: Remove the `dist` and `build` folders. Instead, write the production artefacts within the `.medusa` directory as `.medusa/admin` and `.medusa/server`.
- Breaking: Change the output of the `.medusa/server` folder to mimic the root project structure.
- Refactor: Remove the `Symbol.for("ts-node.register.instance")` check used to determine where to load the source code from.
- Refactor: Use `tsc` for creating the production build. This ensures we respect `tsconfig` settings when creating the build and also perform type-checking.
Co-authored-by: Adrien de Peretti <25098370+adrien2p@users.noreply.github.com>
**What**
- Create main workflow
- Update create pricing rule types step to be idempotent
- Update remote joiner to support granular isList configuration on field aliases
- Add full workflow integration tests
**What**
- Better error handling and error messages
- Update deps management and dynamic import/require
- Pass a new flag to the module loaders so they can act depending on it; the module can then determine what should or should not be run. E.g. in the Redis workflow engine, when we are only partially loading the module, we do not want to set up the distributed transaction storage (a rough sketch follows below).
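As a rough illustration of the last point (the flag name `isPartialLoading` and the loader argument shape below are assumptions, not the actual framework API), a module loader could branch on such a flag like this:

```ts
// Hedged sketch — only the idea of branching on a "partial loading" flag comes
// from this change; the names here are hypothetical.
type LoaderArgs = {
  container: unknown
  options?: Record<string, unknown>
  isPartialLoading?: boolean // hypothetical flag set by the framework
}

export default async function redisWorkflowEngineLoader({
  isPartialLoading,
}: LoaderArgs): Promise<void> {
  if (isPartialLoading) {
    // the module is only partially loaded: skip setting up the
    // distributed transaction storage
    return
  }
  // ...full setup of the distributed transaction storage would happen here
}
```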
**What**
- Fixes tsconfig to ensure a declaration file is included in the build.
**Why**
- Ensure tsserver can understand things from medusa-test-utils
- Get rid of "Could not find a declaration file for module" warning in editors.
**What**
Update all transform middleware to support the new API
- deprecate `defaultRelations`
- deprecate `allowedRelations`
- Add `defaults` and `allowed` as replacements for `defaultFields` and `allowedFields` respectively
- In `defaults` it is possible to specify a field such as `*variants` so that it is recognized as a relation only, without specifying any of its properties
- Add support for a `remoteQueryConfig` assigned to the req, like we have for `listConfig` and `retrieveConfig`
- Add support for overriding `allowed`/`allowedFields` if a previous middleware has set it up on `req.allowed`
- The API now accepts `fields` as the only way to manage the requested props and relations; the `expand` property has been deprecated. New symbols are supported to complement the fields (examples follow below):
- `+` (e.g. `/store/products?fields=+description`) to specify that `description` should be added to the returned data alongside the other defined fields
- `-` (e.g. `/store/products?fields=-description`) to specify that `description` should be removed from the returned data
- `*` (e.g. `/store/products?fields=*variants`) to specify that the `variants` relation should be added to the returned data alongside the other defined fields, without having to specify which variant properties should be returned. This symbol can also be used in the `defaults` config of the transform middleware
- When no symbol is provided, the value replaces the default fields, meaning only the specified fields are returned
Regarding the allowed validation, all fields in the `defaults` configuration must be present in the `allowed` configuration.
If the `defaults` contains a full relation selection (e.g. `*product.variants`), it should be present in `allowed` as `product.variants`. If the `defaults` includes `product.variants.id`, it is allowed if the `allowed` configuration contains either `product.variants.id` as a full match or `product.variants`, since that allows all properties of `product.variants`.
Also, support for `*` selection on the remote query/joiner has been added.
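To make the new `fields` behavior concrete, here is a hedged sketch of a query config using `defaults` and `allowed` together with example requests; the config object shape is an illustrative assumption, not lifted from this change.

```ts
// Hedged sketch — the exact config shape consumed by the transform middleware
// is an assumption for illustration.
const listProductsQueryConfig = {
  // `*variants` marks a relation without listing its properties
  defaults: ["id", "title", "*variants"],
  // every default must be covered here; `variants` covers `*variants`
  allowed: ["id", "title", "description", "variants"],
}

// Example requests:
// GET /store/products?fields=+description  -> add `description` to the default fields
// GET /store/products?fields=-description  -> remove `description` from the returned data
// GET /store/products?fields=*variants     -> also return the `variants` relation
// GET /store/products?fields=id,title      -> no symbol: return only `id` and `title`
```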
**Note**
All v2 endpoint refactoring can be done separately