RESOLVES CORE-1153
**What**
- This PR mainly lays the foundation for the caching layer. It comes with a module (a built-in in-memory cache) and a Redis provider.
- Applies caching to a few touch points to test it.
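The sketch below illustrates the general shape of such a layer. It is a minimal sketch only: the `ICacheProvider` interface, method names, and the in-memory provider are assumptions for illustration, not the actual module API introduced in this PR.

```ts
// Illustrative only: interface and names are assumptions, not the real module API.
interface ICacheProvider {
  get<T>(key: string): Promise<T | null>
  set<T>(key: string, value: T, ttlSeconds?: number): Promise<void>
  invalidate(key: string): Promise<void>
}

// Minimal in-memory provider sketch with per-entry TTL.
class InMemoryCacheProvider implements ICacheProvider {
  private store = new Map<string, { value: unknown; expiresAt: number }>()

  async get<T>(key: string): Promise<T | null> {
    const entry = this.store.get(key)
    if (!entry || entry.expiresAt < Date.now()) {
      this.store.delete(key)
      return null
    }
    return entry.value as T
  }

  async set<T>(key: string, value: T, ttlSeconds = 60): Promise<void> {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 })
  }

  async invalidate(key: string): Promise<void> {
    this.store.delete(key)
  }
}
```

A Redis provider would implement the same interface, which is what lets touch points stay agnostic of the backing store.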
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
* test(): test dynamic max workers
* Clarify test description and improve CI
* chore(): Upgrade mikro orm
* handle 'null' value for big number props
* 6.5.2
* remove only
* fix pricing module rule value
* switch select in strategy for balances
* revert to select in strategy for order module
* fix defining DML ManyToOne
* fix define relationship
* test fix
* more fixes
* change order strategy to balanced
* change order strategy to balanced
* prevent unnecessary manager fork
* revert generated www changes
* remove unnecessary changes
* Create real-cobras-deny.md
* address feedback
---------
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
Glob 7 uses the `inflight` module, which leaks memory. All other Medusa packages already use glob 10+, so the version used by the framework has been upgraded as well.
Fixes: FRMW-2972
Fixes: FRMW-2960
This PR adds support for processing large CSV files by breaking them into chunks and processing one chunk at a time. This is how it works in a nutshell (a rough sketch of the chunking step follows the list):
- The CSV file is read as a stream and each chunk of the stream is one CSV row.
- We read up to 1000 rows (plus a few more to ensure that the variants of a product are not split across multiple chunks).
- Each chunk is then normalized using the `CSVNormalizer` and validated using Zod schemas. If there is an error, the entire process is aborted and the existing chunks are deleted.
- Each chunk is written to a JSON file, so that we can process it later (after the user confirms) without re-parsing or re-validating the CSV file.
- The confirmation process will start consuming one chunk at a time and create/update products using the `batchProducts` workflow.
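The following is a rough sketch of the chunking step described above, assuming Node streams. The `readCsvRows` helper, the row shape, and the file naming are illustrative, not the PR's actual code; real CSV parsing also has to handle quoting, which the naive split below does not.

```ts
// Sketch of the chunking idea only; names and thresholds are illustrative.
import { createReadStream } from "node:fs"
import { writeFile } from "node:fs/promises"
import { createInterface } from "node:readline"

type Row = { productHandle: string; [key: string]: string }

async function* readCsvRows(filePath: string): AsyncGenerator<Row> {
  const rl = createInterface({ input: createReadStream(filePath) })
  let header: string[] | null = null
  for await (const line of rl) {
    const cells = line.split(",") // naive split; real parsing handles quoted fields
    if (!header) {
      header = cells
      continue
    }
    yield Object.fromEntries(cells.map((cell, i) => [header![i], cell])) as Row
  }
}

export async function splitIntoChunks(filePath: string, chunkSize = 1000) {
  let chunk: Row[] = []
  let chunkIndex = 0
  let lastHandle: string | undefined

  for await (const row of readCsvRows(filePath)) {
    // Flush only once we are past the threshold AND the product handle changes,
    // so the variants of a single product never straddle two chunks.
    if (chunk.length >= chunkSize && row.productHandle !== lastHandle) {
      await writeFile(`chunk-${chunkIndex++}.json`, JSON.stringify(chunk))
      chunk = []
    }
    chunk.push(row)
    lastHandle = row.productHandle
  }

  if (chunk.length) {
    await writeFile(`chunk-${chunkIndex}.json`, JSON.stringify(chunk))
  }
}
```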
## To resume or not to resume chunk processing
Let's imagine that, while processing chunks, chunk 3 leads to a database error. By this point the first two chunks have already been processed. How do we deal with this situation? The options are:
- We store which chunk failed and then, during the re-upload, skip the chunks before the failed one (a rough sketch of this bookkeeping follows the list). In my conversation with @olivermrbl we discovered that resuming would have to rely on certain assumptions if we decide to implement it.
- What if a user updates CSV rows that are part of already processed chunks? Those changes would be ignored, and the user would never notice.
- Resuming only works if the file name stays the same. If the user makes changes and saves the file under a new name ("Save as"), we will process the entire file anyway.
- We would have to fetch the old workflow from the workflow engine using some `ilike` search, so that we can see at which chunk the last run failed for the given file.
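For illustration only, the resume option discussed above could reduce to bookkeeping like the following. This is a hypothetical sketch of the idea, not something implemented in this PR, and it inherits the caveats listed above (stale edits are lost, and a renamed file starts from scratch).

```ts
// Hypothetical resume bookkeeping; not implemented in the PR.
type ImportProgress = { fileName: string; lastCompletedChunk: number }

const progressByFile = new Map<string, ImportProgress>()

function shouldProcessChunk(fileName: string, chunkIndex: number): boolean {
  const progress = progressByFile.get(fileName)
  // No record (new file, or file saved under a new name): process everything.
  if (!progress) return true
  // Skip chunks that completed in the previous run; edits to those rows are lost.
  return chunkIndex > progress.lastCompletedChunk
}
```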
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>
**What**
- Bumps the versions of Vite across the entire stack, to prevent an issue similar to what is described here: https://github.com/vitejs/vite/discussions/18271
Not entirely sure what was happening, as I couldn't reproduce the issue, but Adrien hit it yesterday when working with local versions of our packages. It does appear that the range we had before could allow a version of Vite with said bug to be installed.
* ../../core/types/src/dml/index.ts
* ../../core/types/src/dml/index.ts
* fix: relationships mapping
* handle nullable foreign keys types
* handle nullable foreign keys types
* handle nullable foreign keys types
* continue to update product category repository
* fix all product category repositories issues
* fix product category service types
* fix product module service types
* fix product module service types
* fix repository template type
* refactor: use a singleton DMLToMikroORM factory instance
Since the MikroORM MetadataStorage is global, the DML-to-MikroORM entity conversion also has to use a global bucket.
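As a rough illustration of that singleton idea (class and method names below are assumptions, not the actual factory in the codebase): a single module-level instance keeps one cache of converted entities, mirroring MikroORM's global MetadataStorage.

```ts
// Illustrative sketch; names are assumptions, not the real factory.
class DmlToMikroOrmFactory {
  private entities = new Map<string, unknown>()

  // Build the MikroORM entity for a DML entity once and reuse it afterwards.
  getOrBuild<T>(entityName: string, build: () => T): T {
    if (!this.entities.has(entityName)) {
      this.entities.set(entityName, build())
    }
    return this.entities.get(entityName) as T
  }
}

// Single module-level instance shared by all callers.
export const dmlToMikroOrmFactory = new DmlToMikroOrmFactory()
```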
* refactor: update product module to use DML in tests
* wip: tests
* WIP product linkable fixes
* continue type fixing and start test fixing
* test: fix more tests
* fix repository
* fix pivot table computation + fix mikro orm repository
* fix many to many management and configuration
* fix many to many management and configuration
* fix many to many management and configuration
* update product tag relation configuration
* Introduce experimental dml hooks to fix some issues with categories
* more fixes
* fix product tests
* add missing id prefixes
* fix product category handle management
* test: fix more failing tests
* test: make it all green
* test: fix breaking tests
* fix: build issues
* fix: build issues
* fix: more breaking tests
* refactor: fix issues after merge
* refactor: fix issues after merge
* refactor: suppress type errors
* test: fix DML failing tests
* improve many to many inference + tests
* Wip fix columns from product entity
* remove product model before-create hook and manage handle validation and transformation at the service level
* test: fix breaking unit tests
* fix: product module service to not update handle on product update
* fix define link and joiner config
* test: fix joiner config test
* test: fix joiner config test
* fix joiner config primary keys
* Fix joiner config builder
* Fix joiner config builder
* test: remove only modifier from test
* refactor: remove hooks usage from product collection
* refactor: remove hooks usage from product-option
* refactor: remove hooks usage for computing category handle
* refactor: remove hooks usage from productCategory model
* refactor: remove hooks from DML
* refactor: remove cruft
* order dml
* cleanup
* re-add foreign key indexes
* wip
* chore: remove unused types
* wip
* changes
* rm raw
* autoincrement
* wip
* rel
* refactor: cleanup
* migration and models configuration adjustments
* cleanup
* number searchable
* fix random ordering
* fix
* test: fix product-category tests
* test: update breaking DML tests
* test: array assertion to not care about ordering
* fix: temporarily apply id ordering for products
* types
* wip
* WIP type improvements
* update order models
* partially fix types temporarily
* rel
* fix: recursive type issue
* improve type inference breaks
* improve type inference breaks
* update models
* rm nullable
* default value
* repository
* update default value handling
* fix unit tests
* WIP
* toMikroORM
* fix relations
* cascades
* fix
* experimental dml hooks
* rm migration
* serial
* nullable autoincrement
* fix model
* model changes
* fix one to one DML
* order test
* fix addresses
* fix unit tests
* Re-align dml entity name inference
* update model table name config
* update model table name config
* revert
* update return relation
* WIP
* hasOne
* models
* fix
* model
* initial commit
* cart service
* order module
* utils unit test
* index engine
* changeset
* merge
* fix hasOne with fk
* update
* free text filter per entity
* tests
* prod category
* property string many to many
* fix big number
* link modules migration set names
* merge
* shipping option rules
* serializer
* unit test
* fix test mikro orm init
* fix test mikro orm init
* Maintain merge object properties
* fix test mikro orm init
* prevent unit test from connecting to db
* wip
* fix test
* fix test
* link test
* schema
* models
* auto increment
* hook
* model hooks
* order
* wip
* orm version
* request return field
* fix return configuration on order model
* workflows
* core flows
* unit test
* test
* base repo
* test
* base repo
* test fix
* inventory move
* locking inventory
* test
* free text fix
* rm timeout mock
* migrate fulfillment values
* v6.4.3
* cleanup
* link-modules update sql
* revert test
* remove fake timers
---------
Co-authored-by: adrien2p <adrien.deperetti@gmail.com>
Co-authored-by: Harminder Virk <virk.officials@gmail.com>
Co-authored-by: Oli Juhl <59018053+olivermrbl@users.noreply.github.com>
FIXES FRMW-2879
**What**
The exclusion regexp was broken, so we instead use the `ignore` option of glob's sync function to properly ignore definition files.
In the framework package we are directly importing several dependencies that are not installed as direct dependencies. This goes unnoticed when the package is available as a transitive dependency, but `glob` is not a transitive dependency of any production dependency, and hence it fails.
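To illustrate the `ignore` approach mentioned above with glob 10's API (the patterns and `cwd` here are illustrative; the real loader's globs may differ):

```ts
import { globSync } from "glob"

// Collect loadable source files, letting glob exclude TypeScript declaration
// files via its own `ignore` option instead of a hand-rolled exclusion regexp.
const files = globSync("**/*.{js,ts}", {
  cwd: process.cwd(), // illustrative; the real loader targets the module's source dir
  ignore: ["**/*.d.ts"],
})
```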
Fixes: #11044
Fixes: FRMW-2877