RESOLVES FRMW-2978
**What**
Add a retry mechanism to database connection management to prevent failures when the server starts before the database connection becomes available.
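A minimal sketch of the idea, assuming a hypothetical `connectWithRetry` helper with illustrative attempt and delay values:

```ts
// Hypothetical helper illustrating the retry idea; the attempt count
// and delay are illustrative, not the actual defaults.
async function connectWithRetry(
  connect: () => Promise<void>,
  attempts = 5,
  delayMs = 1000
): Promise<void> {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      await connect()
      return
    } catch (err) {
      if (attempt === attempts) {
        throw err
      }
      // Wait before retrying, giving the database time to become available.
      await new Promise((resolve) => setTimeout(resolve, delayMs))
    }
  }
}
```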
FIXES SUP-1824
**What**
The Medusa internal service update should always return data in the expected shape described by the interface. The Medusa service should not have to handle the reshaping.
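A minimal sketch of the principle, with the record interface, the `persistUpdate` helper, and the field names all hypothetical:

```ts
// Hypothetical types and helper; they illustrate the principle only.
interface ServiceRecord {
  id: string
  updated_at: Date
}

// Stand-in for the internal persistence call that returns raw rows.
declare function persistUpdate(data: { id: string }): Promise<any[]>

async function update(data: { id: string }): Promise<ServiceRecord[]> {
  const rows = await persistUpdate(data)
  // Reshape inside the service, so every caller receives the
  // interface-described shape and never reshapes it themselves.
  return rows.map((row) => ({
    id: row.id,
    updated_at: new Date(row.updated_at),
  }))
}
```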
* docs: add missing export for CreateBundledProduct component
This commit adds a missing `export` statement to the `CreateBundledProduct` component in the "Create Form Component" example.
Without the `export`, readers copying the example wouldn't be able to use the component elsewhere in their project, which could be confusing. This ensures the example is complete and ready to use as-is.
* updated to a default export for the `CreateBundledProduct` component instead
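For reference, the corrected example ends up along these lines (the component body is elided and illustrative):

```tsx
const CreateBundledProduct = () => {
  // ... the form logic from the docs example goes here ...
  return null
}

// The previously missing piece: a default export, so readers can import
// the component elsewhere in their project.
export default CreateBundledProduct
```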
FIXES CLO-524
**What**
Add a hidden stepDefinition object as part of the step argument, and ensure the runAsStep handlers rely on the latest definition when config is used on the returned step, in order to guarantee async configuration propagation and nested configuration.
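For context, this is the kind of usage the fix targets, sketched with the workflows SDK; the step names, step logic, and exact import path are assumptions:

```ts
import {
  createStep,
  createWorkflow,
  StepResponse,
  WorkflowResponse,
} from "@medusajs/framework/workflows-sdk"

// An illustrative step; the fix ensures its handler reads the latest
// step definition when config is applied to the returned step below.
const greetStep = createStep("greet", async (name: string) => {
  return new StepResponse(`Hello, ${name}`)
})

const greetWorkflow = createWorkflow("greet-workflow", (input: string) => {
  const first = greetStep(input)
  // Re-running the same step under a new name via config; the handler
  // must pick up this latest definition for configuration to propagate.
  const second = greetStep(input).config({ name: "greet-again" })
  return new WorkflowResponse([first, second])
})
```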
Corrected an inaccurate example in the useQueryGraphStep tip within the review workflow documentation.
Updated the entity reference from "cart's promotions" to "product" to align with the actual code context.
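A sketch of the corrected direction of the tip, querying the `product` entity; the fields, filters, and workflow wrapper here are illustrative, not the documentation's exact example:

```ts
import {
  createWorkflow,
  WorkflowResponse,
} from "@medusajs/framework/workflows-sdk"
import { useQueryGraphStep } from "@medusajs/medusa/core-flows"

const retrieveProductWorkflow = createWorkflow(
  "retrieve-product",
  (input: { id: string }) => {
    // Query the "product" entity, matching the corrected tip.
    const { data: products } = useQueryGraphStep({
      entity: "product",
      fields: ["id", "title"],
      filters: { id: input.id },
    })

    return new WorkflowResponse(products)
  }
)
```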
* [feat] add Indonesian json translation file
* [feat] include and export the Indonesian translation in the index file
* [feat] export the Indonesian language object from languages.ts (see the sketch below)
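A sketch of the wiring these commits describe; the file paths, export style, and language-object shape are assumptions based on the described pattern:

```ts
// i18n/translations/index.ts — re-export the new translation file.
export { default as id } from "./id.json"

// i18n/languages.ts — register the language object; the exact shape
// of the entry is assumed.
export const indonesian = {
  code: "id",
  display_name: "Bahasa Indonesia",
}
```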
---------
Co-authored-by: luky <luzion1508@gmail.com>
Fixes: FRMW-2960
This PR adds support for processing large CSV files by breaking them into chunks and processing one chunk at a time. Here is how it works in a nutshell (a minimal sketch follows the list):
- The CSV file is read as a stream and each chunk of the stream is one CSV row.
- We read up to 1,000 rows (plus a few more, to ensure the variants of a single product are not split across multiple chunks).
- Each chunk is then normalized using the `CSVNormalizer` and validated using zod schemas. If there is an error, the entire process will be aborted and the existing chunks will be deleted.
- Each chunk is written to a JSON file so that we can process it later (after the user confirms) without re-processing or re-validating the CSV file.
- The confirmation process will start consuming one chunk at a time and create/update products using the `batchProducts` workflow.
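A minimal sketch of the chunking step described above; the chunk file naming is illustrative, and the real implementation's CSV parsing, `CSVNormalizer` pass, zod validation, and variant-grouping logic are omitted:

```ts
import { createReadStream } from "node:fs"
import { writeFile } from "node:fs/promises"
import { createInterface } from "node:readline"

const CHUNK_SIZE = 1000

// Reads the CSV as a stream (one line per row), buffers up to CHUNK_SIZE
// rows, and writes each chunk to a JSON file for later processing.
async function chunkCsv(filePath: string): Promise<number> {
  const lines = createInterface({ input: createReadStream(filePath) })
  let rows: string[] = []
  let chunkIndex = 0

  for await (const line of lines) {
    rows.push(line)
    if (rows.length >= CHUNK_SIZE) {
      await writeFile(`chunk-${chunkIndex++}.json`, JSON.stringify(rows))
      rows = []
    }
  }

  if (rows.length) {
    await writeFile(`chunk-${chunkIndex++}.json`, JSON.stringify(rows))
  }

  return chunkIndex
}
```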
## To resume or not to resume processing of chunks
Let's imagine that, while processing chunks, chunk 3 leads to a database error. By that point, we have already processed the first two chunks. How do we deal with this situation? The options are:
- We store which chunk failed, and during the re-upload we ignore the chunks before it. In my conversation with @olivermrbl we discovered that resuming will have to work with certain assumptions if we decide to implement it:
  - What if a user updates CSV rows that are part of the already processed chunks? Those changes will be ignored, and the user will never notice it.
  - Resuming works only if the file name stays the same. What if they made changes and saved the file under a new name via "Save as"? In that case we will process the entire file anyway.
  - We will have to fetch the old workflow from the workflow engine using some `ilike` search, so that we can see at which chunk the last run failed for the given file.
Co-authored-by: Carlos R. L. Rodrigues <37986729+carlos-r-l-rodrigues@users.noreply.github.com>