chore(): start moving some packages to the core directory (#7215)
parent fdee748eed
commit bbccd6481d
@@ -0,0 +1,4 @@
node_modules
/dist
yarn.lock
my-medusa-store
@@ -0,0 +1,188 @@
# Change Log

## 1.2.8

### Patch Changes

- [#6027](https://github.com/medusajs/medusa/pull/6027) [`fe1d3a4a78`](https://github.com/medusajs/medusa/commit/fe1d3a4a78385bb356539d6f9cba5ce2012f5da8) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): Add a `--verbose` option.
## 1.2.7

### Patch Changes

- [#6729](https://github.com/medusajs/medusa/pull/6729) [`fbc369705d`](https://github.com/medusajs/medusa/commit/fbc369705d4feae21d77ea2fc59173ac9519cee6) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): Add `--v2` option
## 1.2.6

### Patch Changes

- [#5983](https://github.com/medusajs/medusa/pull/5983) [`f86877586`](https://github.com/medusajs/medusa/commit/f86877586147ecedbf7f56a1c57f37ef0c33286c) Thanks [@kasperkristensen](https://github.com/kasperkristensen)! - fix(create-medusa-app,medusa-core-utils): Use NodeJS.Timeout instead of NodeJS.Timer as the latter was deprecated in v14.

  chore(icons): Update icons to latest version.
## 1.2.5

### Patch Changes

- [#5547](https://github.com/medusajs/medusa/pull/5547) [`09ab1d1be`](https://github.com/medusajs/medusa/commit/09ab1d1be6ec0c50e3fcc828c740265debd2222b) Thanks [@egormkn](https://github.com/egormkn)! - feat(create-medusa-app): print error message for failed database connection
## 1.2.4

### Patch Changes

- [#5474](https://github.com/medusajs/medusa/pull/5474) [`03959c3e3`](https://github.com/medusajs/medusa/commit/03959c3e3a38c155607647e70668ee250b76fda9) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): improve spinner style
## 1.2.3

### Patch Changes

- [#5404](https://github.com/medusajs/medusa/pull/5404) [`a1807aea8`](https://github.com/medusajs/medusa/commit/a1807aea83a5bf82c4221839a1fa57b41d2cc6ac) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): add tracking for selected options
## 1.2.2

### Patch Changes

- [#5189](https://github.com/medusajs/medusa/pull/5189) [`18a05dee8`](https://github.com/medusajs/medusa/commit/18a05dee86f55366c92a2669eacbccf526373ff5) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(create-medusa-app): fix inconsistency in checking errors in migrations
## 1.2.1

### Patch Changes

- [#5061](https://github.com/medusajs/medusa/pull/5061) [`5b09f816c`](https://github.com/medusajs/medusa/commit/5b09f816cbb44d457c8cbdee803d9f84bd3fed0d) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(create-medusa-app): fix command for windows OS
## 1.2.0

### Minor Changes

- [#4968](https://github.com/medusajs/medusa/pull/4968) [`240b03800`](https://github.com/medusajs/medusa/commit/240b038006924d4872de068c98e2d6862145cb52) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): add install Next.js storefront option
## 1.1.1

### Patch Changes

- Updated dependencies [[`c58588904`](https://github.com/medusajs/medusa/commit/c58588904c5631111603b15afacf7cdc4c738cc4)]:
  - medusa-telemetry@0.0.17
## 1.1.0

### Minor Changes

- [#4794](https://github.com/medusajs/medusa/pull/4794) [`c684d16ec`](https://github.com/medusajs/medusa/commit/c684d16ec012a547a9981480db8b4b96f5f22904) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): remove `--stable` option and change to clone default branch

### Patch Changes

- [#4733](https://github.com/medusajs/medusa/pull/4733) [`30ce35b16`](https://github.com/medusajs/medusa/commit/30ce35b163afa25f4e1d8d1bd392f401a3b413df) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app, utils, medusa-cli): add database options + remove util from `@medusajs/utils`
## 1.0.4

### Patch Changes

- [#4567](https://github.com/medusajs/medusa/pull/4567) [`15e87a810`](https://github.com/medusajs/medusa/commit/15e87a8100c9fe66f6d120423ef0351f4c657e7e) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): add `stable` option + add URI encoding to database string

- Updated dependencies [[`131477faf`](https://github.com/medusajs/medusa/commit/131477faf0409c49d4aacf26ea591e33b2fa22fd)]:
  - @medusajs/utils@1.9.3
## 1.0.3

### Patch Changes

- [#4543](https://github.com/medusajs/medusa/pull/4543) [`f32588122`](https://github.com/medusajs/medusa/commit/f32588122733a4dca54d06a53d18e14ad8ed021b) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(create-medusa-app): improved error messages
## 1.0.2

### Patch Changes

- [#4493](https://github.com/medusajs/medusa/pull/4493) [`4b4296dc1`](https://github.com/medusajs/medusa/commit/4b4296dc1651d325bd7e64eb0fdb0584989828f7) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): remove .git directory in the project

- [#4369](https://github.com/medusajs/medusa/pull/4369) [`d363da2b7`](https://github.com/medusajs/medusa/commit/d363da2b72820defe90b12b0dc741087f3b21090) Thanks [@adrien2p](https://github.com/adrien2p)! - chore(create-medusa-app): Cleanup the main script for readability and maintainability

- Updated dependencies [[`499c3478c`](https://github.com/medusajs/medusa/commit/499c3478c910c8b922a15cc6f4d9fbad122a347f), [`9dcdc0041`](https://github.com/medusajs/medusa/commit/9dcdc0041a2b08cc0723343dd8d9127d9977b086), [`9760d4a96`](https://github.com/medusajs/medusa/commit/9760d4a96c27f6f89a8c3f3b6e73b17547f97f2a)]:
  - @medusajs/utils@1.9.2
## 1.0.1

### Patch Changes

- [#4339](https://github.com/medusajs/medusa/pull/4339) [`6d5da9166`](https://github.com/medusajs/medusa/commit/6d5da9166f609017f92c8c4c34c8eeca53699e3d) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): support admin onboarding experience
## 1.0.0

### Major Changes

- [`b779040ac`](https://github.com/medusajs/medusa/commit/b779040acbaded8596db697a9f8dbe7451cfb9ca) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore: Bump create-medusa-app to major

### Minor Changes

- [#4215](https://github.com/medusajs/medusa/pull/4215) [`c04d93cd0`](https://github.com/medusajs/medusa/commit/c04d93cd041a0080af46c6068d6cf4724600af02) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app): update command for a better onboarding experience

### Patch Changes

- [#4273](https://github.com/medusajs/medusa/pull/4273) [`f98ba5bde`](https://github.com/medusajs/medusa/commit/f98ba5bde83ba785eead31b0c9eb9f135d664178) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app,medusa-cli): Allow clearing project

- [#4321](https://github.com/medusajs/medusa/pull/4321) [`5ad734740`](https://github.com/medusajs/medusa/commit/5ad7347408746fd717354c99a530aaab5e86ffa2) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(create-medusa-app): disable opening browser

- [#4328](https://github.com/medusajs/medusa/pull/4328) [`e1f43fd20`](https://github.com/medusajs/medusa/commit/e1f43fd207621deec885c06f704c2e4c5d7c559b) Thanks [@shahednasser](https://github.com/shahednasser)! - chore(create-medusa-app): change boxen version

- [#4247](https://github.com/medusajs/medusa/pull/4247) [`4b5b7b514`](https://github.com/medusajs/medusa/commit/4b5b7b51483bae8996235a37e75c0671f9e2994f) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(create-medusa-app): remove seed command from create-medusa-app and improve success message

- Updated dependencies [[`f98ba5bde`](https://github.com/medusajs/medusa/commit/f98ba5bde83ba785eead31b0c9eb9f135d664178), [`14c0f62f8`](https://github.com/medusajs/medusa/commit/14c0f62f84704a4c87beff3daaff60a52f5c88b8)]:
  - @medusajs/utils@1.9.1
## 0.0.10

### Patch Changes

- [#3376](https://github.com/medusajs/medusa/pull/3376) [`88392e017`](https://github.com/medusajs/medusa/commit/88392e017653b53069793bd2ee451c2e6b7bcf17) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore(create-medusa-app): Remove admin + Gatsby starter from npx
## 0.0.10-rc.0

### Patch Changes

- [#3376](https://github.com/medusajs/medusa/pull/3376) [`88392e017`](https://github.com/medusajs/medusa/commit/88392e017653b53069793bd2ee451c2e6b7bcf17) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore(create-medusa-app): Remove admin + Gatsby starter from npx
## 0.0.9

### Patch Changes

- [#3217](https://github.com/medusajs/medusa/pull/3217) [`8c5219a31`](https://github.com/medusajs/medusa/commit/8c5219a31ef76ee571fbce84d7d57a63abe56eb0) Thanks [@adrien2p](https://github.com/adrien2p)! - chore: Fix npm packages files included
## 0.0.8

### Patch Changes

- [#3185](https://github.com/medusajs/medusa/pull/3185) [`08324355a`](https://github.com/medusajs/medusa/commit/08324355a4466b017a0bc7ab1d333ee3cd27b8c4) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore: Patches all dependencies + minor bumps `winston` to include a [fix for a significant memory leak](https://github.com/winstonjs/winston/pull/2057)
## 0.0.7

### Patch Changes

- [#2069](https://github.com/medusajs/medusa/pull/2069) [`ad717b953`](https://github.com/medusajs/medusa/commit/ad717b9533a0500e20c4e312d1ee48b35ea9d5e1) Thanks [@olivermrbl](https://github.com/olivermrbl)! - Remove deprecated dependency `@hapi/joi`
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

## [0.0.6](https://github.com/medusajs/medusa/compare/create-medusa-app@0.0.5...create-medusa-app@0.0.6) (2022-01-11)

### Features

- **create-medusa-app:** add medusa.express ([#981](https://github.com/medusajs/medusa/issues/981)) ([bbc16d6](https://github.com/medusajs/medusa/commit/bbc16d6b115fb389ee0fe58d909e74a162686163))
## [0.0.5](https://github.com/medusajs/medusa/compare/create-medusa-app@0.0.3...create-medusa-app@0.0.5) (2021-10-18)

**Note:** Version bump only for package create-medusa-app

## [0.0.4](https://github.com/medusajs/medusa/compare/create-medusa-app@0.0.3...create-medusa-app@0.0.4) (2021-10-18)

**Note:** Version bump only for package create-medusa-app
## [0.0.3](https://github.com/medusajs/medusa/compare/create-medusa-app@0.0.1...create-medusa-app@0.0.3) (2021-09-15)

### Bug Fixes

- flip seed flag to default to true ([#398](https://github.com/medusajs/medusa/issues/398)) ([dd1025f](https://github.com/medusajs/medusa/commit/dd1025fd5369eb264d7d5f5d6db41c888259d786))
- versioning ([3b8901e](https://github.com/medusajs/medusa/commit/3b8901ebc2fb41dc8d5372a808c5eaafd7d32646))

## 0.0.1 (2021-09-14)

### Features

- adds create-medusa-app ([#377](https://github.com/medusajs/medusa/issues/377)) ([ec6d16e](https://github.com/medusajs/medusa/commit/ec6d16e945f4b8a99e9dcc8ae2e92a2318fbc709))
@@ -0,0 +1,58 @@
<p align="center">
  <a href="https://www.medusajs.com">
    <img alt="Medusa" src="https://user-images.githubusercontent.com/7554214/153162406-bf8fd16f-aa98-4604-b87b-e13ab4baf604.png" width="100" />
  </a>
</p>
<h1 align="center">
  create-medusa-app
</h1>

<h4 align="center">
  <a href="https://docs.medusajs.com">Documentation</a> |
  <a href="https://www.medusajs.com">Website</a>
</h4>

<p align="center">
  An open source composable commerce engine built for developers.
</p>
<p align="center">
  <a href="https://github.com/medusajs/medusa/blob/master/LICENSE">
    <img src="https://img.shields.io/badge/license-MIT-blue.svg" alt="Medusa is released under the MIT license." />
  </a>
  <a href="https://circleci.com/gh/medusajs/medusa">
    <img src="https://circleci.com/gh/medusajs/medusa.svg?style=shield" alt="Current CircleCI build status." />
  </a>
  <a href="https://github.com/medusajs/medusa/blob/master/CONTRIBUTING.md">
    <img src="https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat" alt="PRs welcome!" />
  </a>
  <a href="https://www.producthunt.com/posts/medusa"><img src="https://img.shields.io/badge/Product%20Hunt-%231%20Product%20of%20the%20Day-%23DA552E" alt="Product Hunt"></a>
  <a href="https://discord.gg/xpCwq3Kfn8">
    <img src="https://img.shields.io/badge/chat-on%20discord-7289DA.svg" alt="Discord Chat" />
  </a>
  <a href="https://twitter.com/intent/follow?screen_name=medusajs">
    <img src="https://img.shields.io/twitter/follow/medusajs.svg?label=Follow%20@medusajs" alt="Follow @medusajs" />
  </a>
</p>
## Overview

Using this NPX command, you can set up a Medusa backend and admin, along with a PostgreSQL database, in a few simple steps.

---

## Usage

Run the following command in your terminal:

```bash
npx create-medusa-app@latest
```

Then, answer the prompted questions to set up your PostgreSQL database and Medusa project. Once the setup is done, the Medusa admin dashboard opens in your default browser.
### Options

| Option             | Description                                                | Default value                                        |
| ------------------ | ---------------------------------------------------------- | ---------------------------------------------------- |
| `--repo-url <url>` | Create the Medusa project from a different repository URL. | `https://github.com/medusajs/medusa-starter-default` |
| `--seed`           | Seed the database with demo data.                          | `false`                                              |
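The options above can be combined with the base command to skip prompts you have already decided on. A hypothetical invocation (the flag values shown are only illustrative; the repository URL is the documented default):

```bash
# Seed demo data and install from the default starter repository
npx create-medusa-app@latest --seed --repo-url https://github.com/medusajs/medusa-starter-default
```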
@@ -0,0 +1,68 @@
{
  "name": "create-medusa-app",
  "version": "1.2.8",
  "description": "Create a Medusa project using a single command.",
  "type": "module",
  "exports": "./dist/index.js",
  "bin": "dist/index.js",
  "license": "MIT",
  "scripts": {
    "dev": "ts-node --esm src/index.ts",
    "build": "tsc",
    "watch": "tsc --watch",
    "prepublishOnly": "cross-env NODE_ENV=production tsc --build"
  },
  "dependencies": {
    "boxen": "^5",
    "chalk": "^5.2.0",
    "commander": "^10.0.1",
    "glob": "^7.1.6",
    "inquirer": "^9.2.2",
    "medusa-telemetry": "^0.0.17",
    "nanoid": "^4.0.2",
    "node-emoji": "^2.0.2",
    "node-fetch": "^3.3.1",
    "open": "^9.1.0",
    "ora": "^6.3.0",
    "pg": "^8.10.0",
    "slugify": "^1.6.6",
    "uuid": "^9.0.0",
    "validator": "^13.9.0",
    "wait-on": "^7.0.1",
    "winston": "^3.9.0"
  },
  "devDependencies": {
    "@types/chalk": "^2.2.0",
    "@types/commander": "^2.12.2",
    "@types/configstore": "^6.0.0",
    "@types/inquirer": "^9.0.3",
    "@types/node-emoji": "^1.8.2",
    "@types/pg": "^8.6.6",
    "@types/uuid": "^9.0.1",
    "@types/validator": "^13.7.17",
    "@types/wait-on": "^5.3.1",
    "@typescript-eslint/eslint-plugin": "^6.19.0",
    "@typescript-eslint/parser": "^6.19.0",
    "configstore": "^6.0.0",
    "eslint": "^8.40.0",
    "eslint-config-google": "^0.14.0",
    "eslint-config-prettier": "^8.8.0",
    "eslint-plugin-prettier": "^4.2.1",
    "prettier": "^2.8.8",
    "ts-node": "^10.9.1",
    "typescript": "^5.0.4"
  },
  "engines": {
    "node": ">=14.16"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/medusajs/medusa.git",
    "directory": "packages/create-medusa-app"
  },
  "author": "Medusa",
  "publishConfig": {
    "access": "public"
  },
  "gitHead": "41a5425405aea5045a26def95c0dc00cf4a5a44d"
}
@@ -0,0 +1,342 @@
import inquirer from "inquirer"
import slugifyType from "slugify"
import chalk from "chalk"
import { getDbClientAndCredentials, runCreateDb } from "../utils/create-db.js"
import prepareProject from "../utils/prepare-project.js"
import startMedusa from "../utils/start-medusa.js"
import open from "open"
import waitOn from "wait-on"
import ora, { Ora } from "ora"
import fs from "fs"
import path from "path"
import isEmailImported from "validator/lib/isEmail.js"
import logMessage from "../utils/log-message.js"
import createAbortController, {
  isAbortError,
} from "../utils/create-abort-controller.js"
import { track } from "medusa-telemetry"
import boxen from "boxen"
import { emojify } from "node-emoji"
import ProcessManager from "../utils/process-manager.js"
import { nanoid } from "nanoid"
import { displayFactBox, FactBoxOptions } from "../utils/facts.js"
import { EOL } from "os"
import { runCloneRepo } from "../utils/clone-repo.js"
import {
  askForNextjsStarter,
  installNextjsStarter,
  startNextjsStarter,
} from "../utils/nextjs-utils.js"

const slugify = slugifyType.default
const isEmail = isEmailImported.default
export type CreateOptions = {
  repoUrl?: string
  seed?: boolean
  // commander passes --no-boilerplate as `boilerplate: false`
  boilerplate?: boolean
  skipDb?: boolean
  dbUrl?: string
  browser?: boolean
  migrations?: boolean
  directoryPath?: string
  withNextjsStarter?: boolean
  verbose?: boolean
  v2?: boolean
}
export default async ({
  repoUrl = "",
  seed,
  boilerplate,
  skipDb,
  dbUrl,
  browser,
  migrations,
  directoryPath,
  withNextjsStarter = false,
  verbose = false,
  v2 = false,
}: CreateOptions) => {
  track("CREATE_CLI_CMA")

  const spinner: Ora = ora()
  const processManager = new ProcessManager()
  const abortController = createAbortController(processManager)
  const factBoxOptions: FactBoxOptions = {
    interval: null,
    spinner,
    processManager,
    message: "",
    title: "",
    verbose,
  }
  const dbName = !skipDb && !dbUrl ? `medusa-${nanoid(4)}` : ""
  let isProjectCreated = false
  let isDbInitialized = false
  let printedMessage = false
  let nextjsDirectory = ""
  processManager.onTerminated(async () => {
    spinner.stop()
    // prevent an error from occurring if
    // the client hasn't been declared yet
    if (isDbInitialized && client) {
      await client.end()
    }

    // the SIGINT event is triggered twice once the backend runs;
    // this check ensures the message isn't printed twice to the user
    if (!printedMessage && isProjectCreated) {
      printedMessage = true
      showSuccessMessage(projectName, undefined, nextjsDirectory)
    }

    return
  })

  const projectName = await askForProjectName(directoryPath)
  const projectPath = getProjectPath(projectName, directoryPath)
  const adminEmail =
    !skipDb && migrations ? await askForAdminEmail(seed, boilerplate) : ""
  const installNextjs = withNextjsStarter || (await askForNextjsStarter())
  let { client, dbConnectionString } = !skipDb
    ? await getDbClientAndCredentials({
        dbName,
        dbUrl,
        verbose,
      })
    : { client: null, dbConnectionString: "" }
  isDbInitialized = true

  track("CMA_OPTIONS", {
    repoUrl,
    seed,
    boilerplate,
    skipDb,
    browser,
    migrations,
    installNextjs,
    verbose,
  })

  logMessage({
    message: `${emojify(
      ":rocket:"
    )} Starting project setup, this may take a few minutes.`,
  })
  spinner.start()

  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    title: "Setting up project...",
  })

  try {
    await runCloneRepo({
      projectName: projectPath,
      repoUrl,
      abortController,
      spinner,
      verbose,
      v2,
    })
  } catch {
    return
  }

  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    message: "Created project directory",
  })
  nextjsDirectory = installNextjs
    ? await installNextjsStarter({
        directoryName: projectPath,
        abortController,
        factBoxOptions,
        verbose,
      })
    : ""

  if (client && !dbUrl) {
    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      title: "Creating database...",
    })
    client = await runCreateDb({ client, dbName, spinner })

    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      message: `Database ${dbName} created`,
    })
  }
  // prepare project
  let inviteToken: string | undefined = undefined
  try {
    inviteToken = await prepareProject({
      directory: projectPath,
      dbConnectionString,
      admin: {
        email: adminEmail,
      },
      seed,
      boilerplate,
      spinner,
      processManager,
      abortController,
      skipDb,
      migrations,
      onboardingType: installNextjs ? "nextjs" : "default",
      nextjsDirectory,
      client,
      verbose,
      v2,
    })
  } catch (e: any) {
    if (isAbortError(e)) {
      process.exit()
    }

    spinner.stop()
    logMessage({
      message: `An error occurred while preparing the project: ${e}`,
      type: "error",
    })

    return
  } finally {
    // close the db connection
    await client?.end()
  }
  spinner.succeed(chalk.green("Project Prepared"))

  if (skipDb || !browser) {
    showSuccessMessage(projectPath, inviteToken, nextjsDirectory)
    process.exit()
  }
  // start backend
  logMessage({
    message: "Starting Medusa...",
  })

  try {
    startMedusa({
      directory: projectPath,
      abortController,
    })

    if (installNextjs && nextjsDirectory) {
      startNextjsStarter({
        directory: nextjsDirectory,
        abortController,
        verbose,
      })
    }
  } catch (e) {
    if (isAbortError(e)) {
      process.exit()
    }

    logMessage({
      message: `An error occurred while starting Medusa`,
      type: "error",
    })

    return
  }

  isProjectCreated = true

  await waitOn({
    resources: ["http://localhost:9000/health"],
  }).then(async () =>
    open(
      inviteToken
        ? `http://localhost:7001/invite?token=${inviteToken}&first_run=true`
        : "http://localhost:7001"
    )
  )
}
async function askForProjectName(directoryPath?: string): Promise<string> {
  const { projectName } = await inquirer.prompt([
    {
      type: "input",
      name: "projectName",
      message: "What's the name of your project?",
      default: "my-medusa-store",
      filter: (input) => {
        return slugify(input).toLowerCase()
      },
      validate: (input) => {
        if (!input.length) {
          return "Please enter a project name"
        }
        const projectPath = getProjectPath(input, directoryPath)
        return fs.existsSync(projectPath) &&
          fs.lstatSync(projectPath).isDirectory()
          ? "A directory already exists with the same name. Please enter a different project name."
          : true
      },
    },
  ])
  return projectName
}
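The `filter` in `askForProjectName` normalizes whatever the user types into a lowercase, directory-safe slug before it's used as a path. A minimal sketch of that behavior, using a hand-rolled stand-in for the `slugify` package (the real package handles many more cases, such as accents and special symbols):

```typescript
// Hand-rolled stand-in for `slugify(input).toLowerCase()`; illustration only.
const toSlug = (input: string): string =>
  input.trim().replace(/\s+/g, "-").toLowerCase()

console.log(toSlug("My Medusa Store")) // "my-medusa-store"
```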
async function askForAdminEmail(
  seed?: boolean,
  boilerplate?: boolean
): Promise<string> {
  const { adminEmail } = await inquirer.prompt([
    {
      type: "input",
      name: "adminEmail",
      message: "Enter an email for your admin dashboard user",
      default: !seed && boilerplate ? "admin@medusa-test.com" : undefined,
      validate: (input) => {
        return typeof input === "string" && input.length > 0 && isEmail(input)
          ? true
          : "Please enter a valid email"
      },
    },
  ])

  return adminEmail
}
function showSuccessMessage(
  projectName: string,
  inviteToken?: string,
  nextjsDirectory?: string
) {
  logMessage({
    message: boxen(
      chalk.green(
        // eslint-disable-next-line prettier/prettier
        `Change to the \`${projectName}\` directory to explore your Medusa project.${EOL}${EOL}Start your Medusa app again with the following command:${EOL}${EOL}npx @medusajs/medusa-cli develop${EOL}${EOL}${inviteToken ? `After you start the Medusa app, you can set a password for your admin user with the URL ${getInviteUrl(inviteToken)}${EOL}${EOL}` : ""}${nextjsDirectory?.length ? `The Next.js Starter storefront was installed in the \`${nextjsDirectory}\` directory. Change to that directory and start it with the following command:${EOL}${EOL}npm run dev${EOL}${EOL}` : ""}Check out the Medusa documentation to start your development:${EOL}${EOL}https://docs.medusajs.com/${EOL}${EOL}Star us on GitHub if you like what we're building:${EOL}${EOL}https://github.com/medusajs/medusa/stargazers`
      ),
      {
        titleAlignment: "center",
        textAlignment: "center",
        padding: 1,
        margin: 1,
        float: "center",
      }
    ),
  })
}
function getProjectPath(projectName: string, directoryPath?: string) {
  return path.join(directoryPath || "", projectName)
}

function getInviteUrl(inviteToken: string) {
  return `http://localhost:7001/invite?token=${inviteToken}&first_run=true`
}
@@ -0,0 +1,53 @@
#!/usr/bin/env node
import { program } from "commander"
import create from "./commands/create.js"

program
  .description("Create a new Medusa project")
  .option("--repo-url <url>", "URL of the repository to use to set up the project.")
  .option("--seed", "Seed the created database with demo data.")
  .option(
    "--no-boilerplate",
    "Install a Medusa project without the boilerplate and demo files."
  )
  .option(
    "--skip-db",
    "Skips creating the database, running migrations, and seeding, and subsequently skips opening the browser.",
    false
  )
  .option(
    "--db-url <url>",
    "Skips database creation and sets the database URL to the provided URL. Throws an error if it can't connect to the database. Will still run migrations and open the admin after project creation."
  )
  .option(
    "--no-migrations",
    "Skips running migrations, creating the admin user, and seeding. If used, it's expected that you pass the --db-url option with the URL of a database that has all necessary migrations. Otherwise, unexpected errors will occur.",
    true
  )
  .option(
    "--no-browser",
    "Disables opening the browser at the end of the project creation and only shows the success message.",
    true
  )
  .option(
    "--directory-path <path>",
    "Specify the directory path to install the project in."
  )
  .option(
    "--with-nextjs-starter",
    "Install the Next.js starter along with the Medusa backend.",
    false
  )
  .option(
    "--verbose",
    "Show all logs of underlying commands. Useful for debugging.",
    false
  )
  .option(
    "--v2",
    "Install Medusa with the V2 feature flag enabled. WARNING: Medusa V2 is still in development and shouldn't be used in production.",
    false
  )
  .parse()

void create(program.opts())
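Commander's `--no-<name>` convention, used by `--no-boilerplate`, `--no-migrations`, and `--no-browser` above, surfaces each flag in `program.opts()` under the positive, camelCased name, set to `false` when the negated form is passed. A hand-rolled sketch of that mapping (a stand-in for illustration only; the real CLI relies on commander, and this sketch ignores value-taking options like `--db-url <url>`):

```typescript
// Minimal stand-in for commander's boolean-flag handling; illustration only.
function parseBooleanFlags(argv: string[]): Record<string, boolean> {
  const opts: Record<string, boolean> = {}
  for (const arg of argv) {
    const match = arg.match(/^--(no-)?([a-z][a-z-]*)$/)
    if (!match) continue
    // camelCase the flag name: "skip-db" -> "skipDb"
    const key = match[2].replace(/-([a-z])/g, (_, c: string) => c.toUpperCase())
    // "--no-browser" yields { browser: false }; "--seed" yields { seed: true }
    opts[key] = !match[1]
  }
  return opts
}

console.log(parseBooleanFlags(["--seed", "--skip-db", "--no-browser"]))
// { seed: true, skipDb: true, browser: false }
```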
@@ -0,0 +1 @@
declare module "medusa-telemetry"
@@ -0,0 +1,25 @@
import fs from "fs"
import glob from "glob"
import path from "path"

export function clearProject(directory: string) {
  const adminFiles = glob.sync(path.join(directory, `src`, `admin/**/*`))
  const onboardingFiles = glob.sync(
    path.join(directory, `src`, `**/onboarding/`)
  )
  const typeFiles = glob.sync(path.join(directory, `src`, `types`))
  const srcFiles = glob.sync(
    path.join(directory, `src`, `**/*.{ts,tsx,js,jsx}`)
  )

  const files = [...adminFiles, ...onboardingFiles, ...typeFiles, ...srcFiles]

  files.forEach((file) =>
    fs.rmSync(file, {
      recursive: true,
      force: true,
    })
  )
  // add an empty TypeScript file to avoid build errors
  fs.openSync(path.join(directory, "src", "index.ts"), "w")
}
@@ -0,0 +1,82 @@
import execute from "./execute.js"
import { Ora } from "ora"
import { isAbortError } from "./create-abort-controller.js"
import logMessage from "./log-message.js"
import fs from "fs"
import path from "path"

type CloneRepoOptions = {
  directoryName?: string
  repoUrl?: string
  abortController?: AbortController
  verbose?: boolean
  v2?: boolean
}

const DEFAULT_REPO = "https://github.com/medusajs/medusa-starter-default"
const V2_BRANCH = "feat/v2"

export default async function cloneRepo({
  directoryName = "",
  repoUrl,
  abortController,
  verbose = false,
  v2 = false,
}: CloneRepoOptions) {
  await execute(
    [
      `git clone ${repoUrl || DEFAULT_REPO}${
        v2 ? ` -b ${V2_BRANCH}` : ""
      } ${directoryName}`,
      {
        signal: abortController?.signal,
      },
    ],
    { verbose }
  )
}

export async function runCloneRepo({
  projectName,
  repoUrl,
  abortController,
  spinner,
  verbose = false,
  v2 = false,
}: {
  projectName: string
  repoUrl: string
  abortController: AbortController
  spinner: Ora
  verbose?: boolean
  v2?: boolean
}) {
  try {
    await cloneRepo({
      directoryName: projectName,
      repoUrl,
      abortController,
      verbose,
      v2,
    })

    deleteGitDirectory(projectName)
  } catch (e) {
    if (isAbortError(e)) {
      process.exit()
    }

    spinner.stop()
    logMessage({
      message: `An error occurred while setting up your project: ${e}`,
      type: "error",
    })
  }
}

function deleteGitDirectory(projectDirectory: string) {
  fs.rmSync(path.join(projectDirectory, ".git"), {
    recursive: true,
    force: true,
  })
}
@@ -0,0 +1,16 @@
import ProcessManager from "./process-manager.js"

export default (processManager: ProcessManager) => {
  const abortController = new AbortController()
  processManager.onTerminated(() => abortController.abort())
  return abortController
}

// guard against non-object errors, since the `in` operator throws on primitives
export const isAbortError = (e: any) =>
  typeof e === "object" && e !== null && "code" in e && e.code === "ABORT_ERR"

export const getAbortError = () => {
  return {
    code: "ABORT_ERR",
  }
}
@@ -0,0 +1,184 @@
import { EOL } from "os"
import pg from "pg"
import postgresClient from "./postgres-client.js"
import inquirer from "inquirer"
import logMessage from "./log-message.js"
import formatConnectionString from "./format-connection-string.js"
import { Ora } from "ora"
import { getCurrentOs } from "./get-current-os.js"

type CreateDbOptions = {
  client: pg.Client
  db: string
}

export default async function createDb({ client, db }: CreateDbOptions) {
  await client.query(`CREATE DATABASE "${db}"`)
}

export async function runCreateDb({
  client,
  dbName,
  spinner,
}: {
  client: pg.Client
  dbName: string
  spinner: Ora
}): Promise<pg.Client> {
  let newClient = client

  try {
    // create the postgres database
    await createDb({
      client,
      db: dbName,
    })

    // create a new connection with the database selected
    await client.end()
    newClient = await postgresClient({
      user: client.user,
      password: client.password,
      database: dbName,
    })
  } catch (e) {
    spinner.stop()
    logMessage({
      message: `An error occurred while trying to create your database: ${e}`,
      type: "error",
    })
  }

  return newClient
}

async function getForDbName({
  dbName,
  verbose = false,
}: {
  dbName: string
  verbose?: boolean
}): Promise<{
  client: pg.Client
  dbConnectionString: string
}> {
  let client!: pg.Client
  let postgresUsername = "postgres"
  let postgresPassword = ""

  try {
    client = await postgresClient({
      user: postgresUsername,
      password: postgresPassword,
    })
  } catch (e) {
    if (verbose) {
      logMessage({
        message: `The following error occurred when connecting to the database: ${e}`,
        type: "verbose",
      })
    }
    // ask for the user's postgres credentials
    const answers = await inquirer.prompt([
      {
        type: "input",
        name: "postgresUsername",
        message: "Enter your Postgres username",
        default: "postgres",
        validate: (input) => {
          return typeof input === "string" && input.length > 0
        },
      },
      {
        type: "password",
        name: "postgresPassword",
        message: "Enter your Postgres password",
      },
    ])

    postgresUsername = answers.postgresUsername
    postgresPassword = answers.postgresPassword

    try {
      client = await postgresClient({
        user: postgresUsername,
        password: postgresPassword,
      })
    } catch (e) {
      logMessage({
        message: `Couldn't connect to PostgreSQL because of the following error: ${e}.${EOL}${EOL}Make sure you have PostgreSQL installed and the credentials you provided are correct.${EOL}${EOL}You can learn how to install PostgreSQL here: https://docs.medusajs.com/development/backend/prepare-environment?os=${getCurrentOs()}#postgresql${EOL}${EOL}If you keep running into this issue despite having PostgreSQL installed, please check out our troubleshooting guidelines: https://docs.medusajs.com/troubleshooting/database-error`,
        type: "error",
      })
    }
  }

  // format the connection string
  const dbConnectionString = formatConnectionString({
    user: postgresUsername,
    password: postgresPassword,
    host: client!.host,
    db: dbName,
  })

  return {
    client,
    dbConnectionString,
  }
}

async function getForDbUrl({
  dbUrl,
  verbose = false,
}: {
  dbUrl: string
  verbose?: boolean
}): Promise<{
  client: pg.Client
  dbConnectionString: string
}> {
  let client!: pg.Client

  try {
    client = await postgresClient({
      connectionString: dbUrl,
    })
  } catch (e) {
    if (verbose) {
      logMessage({
        message: `The following error occurred when connecting to the database: ${e}`,
        type: "verbose",
      })
    }
    logMessage({
      message: `Couldn't connect to PostgreSQL using the database URL you passed. Make sure it's correct and try again.`,
      type: "error",
    })
  }

  return {
    client,
    dbConnectionString: dbUrl,
  }
}

export async function getDbClientAndCredentials({
  dbName = "",
  dbUrl = "",
  verbose = false,
}: {
  dbName?: string
  dbUrl?: string
  verbose?: boolean
}): Promise<{
  client: pg.Client
  dbConnectionString: string
}> {
  if (dbName) {
    return await getForDbName({
      dbName,
      verbose,
    })
  } else {
    return await getForDbUrl({
      dbUrl,
      verbose,
    })
  }
}
@@ -0,0 +1,72 @@
import { exec, spawnSync, SpawnSyncOptions } from "child_process"
import util from "util"
import { getAbortError } from "./create-abort-controller.js"

const promiseExec = util.promisify(exec)

type ExecuteOptions = {
  stdout?: string
  stderr?: string
}

type VerboseOptions = {
  verbose?: boolean
  // Since spawn doesn't allow both retrieving the output and
  // streaming it live without using events, enabling this option
  // (only useful when `verbose` is `true`) defers the output until
  // after the process has finished instead of streaming it in real
  // time, which is the default. In other words, it prioritizes
  // retrieving the output over showing it live.
  needOutput?: boolean
}

type PromiseExecParams = Parameters<typeof promiseExec>
type SpawnParams = [string, SpawnSyncOptions]

const execute = async (
  command: SpawnParams | PromiseExecParams,
  { verbose = false, needOutput = false }: VerboseOptions
): Promise<ExecuteOptions> => {
  if (verbose) {
    const [commandStr, options] = command as SpawnParams
    const childProcess = spawnSync(commandStr, {
      ...options,
      shell: true,
      stdio: needOutput
        ? "pipe"
        : [process.stdin, process.stdout, process.stderr],
    })

    if (childProcess.error) {
      throw childProcess.error
    }

    if (
      childProcess.signal &&
      ["SIGINT", "SIGTERM"].includes(childProcess.signal)
    ) {
      throw getAbortError()
    }

    if (needOutput) {
      console.log(
        childProcess.stdout?.toString() || childProcess.stderr?.toString()
      )
    }

    return {
      stdout: childProcess.stdout?.toString() || "",
      stderr: childProcess.stderr?.toString() || "",
    }
  } else {
    const childProcess = await promiseExec(...(command as PromiseExecParams))

    return {
      stdout: childProcess.stdout as string,
      stderr: childProcess.stderr as string,
    }
  }
}

export default execute
@@ -0,0 +1,136 @@
import boxen from "boxen"
import chalk from "chalk"
import { emojify } from "node-emoji"
import { Ora } from "ora"
import ProcessManager from "./process-manager.js"

export type FactBoxOptions = {
  interval: NodeJS.Timeout | null
  spinner: Ora
  processManager: ProcessManager
  message?: string
  title?: string
  verbose?: boolean
}

const facts = [
  "Plugins allow you to integrate third-party services for payment, fulfillment, notifications, and more.",
  "You can specify a product's availability in one or more sales channels.",
  "Payment and shipping options and providers can be configured per region.",
  "Tax-inclusive pricing allows you to set prices for products, shipping options, and more without having to worry about calculating taxes.",
  "Medusa provides multi-currency and region support, with full control over prices for each currency and region.",
  "You can organize customers by customer groups and set special prices for them.",
  "You can specify the inventory of products per location and sales channel.",
  "Publishable API keys allow you to send requests to the backend within a scoped resource.",
  "You can create custom endpoints by creating a TypeScript file under the src/api directory.",
  "You can listen to events to perform asynchronous actions using Subscribers.",
  "An entity represents a table in the database. You can create a table by creating a custom entity and migration.",
  "Medusa's store endpoint paths are prefixed by /store. The admin endpoints are prefixed by /admin.",
  "Medusa provides a JavaScript client and a React library that you can use to build a storefront or a custom admin.",
  "Services are classes with methods related to an entity or functionality. You can create a custom service in a TypeScript file under src/services.",
  "Modules allow you to replace an entire functionality with your custom logic.",
  "The event bus module is responsible for triggering events and relaying them to subscribers.",
  "The cache module is responsible for caching data that requires heavy computation.",
]

export const getFact = () => {
  const randIndex = Math.floor(Math.random() * facts.length)

  return facts[randIndex]
}

export const showFact = ({
  spinner,
  title,
  verbose,
}: Pick<FactBoxOptions, "spinner" | "verbose"> & {
  title: string
}) => {
  if (verbose) {
    spinner.stopAndPersist({
      symbol: chalk.cyan("⠋"),
      text: title,
    })
  } else {
    const fact = getFact()
    spinner.text = `${title}\n${boxen(fact, {
      title: chalk.cyan(`${emojify(":bulb:")} Medusa Tips`),
      titleAlignment: "center",
      textAlignment: "center",
      padding: 1,
      margin: 1,
    })}`
  }
}

export const createFactBox = ({
  spinner,
  title,
  processManager,
  verbose,
}: Pick<FactBoxOptions, "spinner" | "processManager" | "verbose"> & {
  title: string
}): NodeJS.Timeout => {
  showFact({ spinner, title, verbose })
  const interval = setInterval(() => {
    showFact({ spinner, title, verbose })
  }, 10000)

  processManager.addInterval(interval)

  return interval
}

export const resetFactBox = ({
  interval,
  spinner,
  successMessage,
  processManager,
  newTitle,
  verbose,
}: Pick<
  FactBoxOptions,
  "interval" | "spinner" | "processManager" | "verbose"
> & {
  successMessage: string
  newTitle?: string
}): NodeJS.Timeout | null => {
  if (interval) {
    clearInterval(interval)
  }

  spinner.succeed(chalk.green(successMessage)).start()
  let newInterval = null
  if (newTitle) {
    newInterval = createFactBox({
      spinner,
      title: newTitle,
      processManager,
      verbose,
    })
  }

  return newInterval
}

export function displayFactBox({
  interval,
  spinner,
  processManager,
  title = "",
  message = "",
  verbose = false,
}: FactBoxOptions): NodeJS.Timeout | null {
  if (!message) {
    return createFactBox({ spinner, title, processManager, verbose })
  }

  return resetFactBox({
    interval,
    spinner,
    successMessage: message,
    processManager,
    newTitle: title,
    verbose,
  })
}
@@ -0,0 +1,29 @@
type ConnectionStringOptions = {
  user?: string
  password?: string
  host?: string
  db: string
}

export function encodeDbValue(value: string): string {
  return encodeURIComponent(value)
}

export default ({ user, password, host, db }: ConnectionStringOptions) => {
  let connection = `postgres://`

  if (user) {
    connection += encodeDbValue(user)
  }

  if (password) {
    connection += `:${encodeDbValue(password)}`
  }

  if (user || password) {
    connection += "@"
  }

  connection += `${host}/${db}`

  return connection
}
@@ -0,0 +1,17 @@
import Configstore from "configstore"

let config: Configstore

export const getConfigStore = (): Configstore => {
  if (!config) {
    config = new Configstore(
      `medusa`,
      {},
      {
        globalConfigPath: true,
      }
    )
  }

  return config
}
@@ -0,0 +1,10 @@
export const getCurrentOs = (): string => {
  switch (process.platform) {
    case "darwin":
      return "macos"
    case "linux":
      return "linux"
    default:
      return "windows"
  }
}
@@ -0,0 +1,27 @@
import chalk from "chalk"
import { program } from "commander"
import { logger } from "./logger.js"

type LogOptions = {
  message: string
  type?: "error" | "success" | "info" | "warning" | "verbose"
}

export default ({ message, type = "info" }: LogOptions) => {
  switch (type) {
    case "info":
      logger.info(chalk.white(message))
      break
    case "success":
      logger.info(chalk.green(message))
      break
    case "warning":
      // winston's default npm levels expose `warn`, not `warning`
      logger.warn(chalk.yellow(message))
      break
    case "verbose":
      logger.info(`${chalk.bgYellowBright("VERBOSE LOG:")} ${message}`)
      break
    case "error":
      program.error(chalk.bold.red(message))
  }
}
@@ -0,0 +1,10 @@
import winston from "winston"

const consoleTransport = new winston.transports.Console({
  format: winston.format.printf((log) => log.message),
})
const options = {
  transports: [consoleTransport],
}

export const logger = winston.createLogger(options)
@@ -0,0 +1,116 @@
import inquirer from "inquirer"
import { exec } from "child_process"
import execute from "./execute.js"
import { FactBoxOptions, displayFactBox } from "./facts.js"
import fs from "fs"
import path from "path"
import { customAlphabet } from "nanoid"
import { isAbortError } from "./create-abort-controller.js"
import logMessage from "./log-message.js"

const NEXTJS_REPO = "https://github.com/medusajs/nextjs-starter-medusa"

export async function askForNextjsStarter(): Promise<boolean> {
  const { installNextjs } = await inquirer.prompt([
    {
      type: "confirm",
      name: "installNextjs",
      message: `Would you like to create the Next.js storefront? You can also create it later`,
      default: false,
    },
  ])

  return installNextjs
}

type InstallOptions = {
  directoryName: string
  abortController?: AbortController
  factBoxOptions: FactBoxOptions
  verbose?: boolean
}

export async function installNextjsStarter({
  directoryName,
  abortController,
  factBoxOptions,
  verbose = false,
}: InstallOptions): Promise<string> {
  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    title: "Installing Next.js Storefront...",
  })

  let nextjsDirectory = `${directoryName}-storefront`

  if (
    fs.existsSync(nextjsDirectory) &&
    fs.lstatSync(nextjsDirectory).isDirectory()
  ) {
    // append a random suffix to the directory name
    nextjsDirectory += `-${customAlphabet(
      // npm throws an error if the directory name has an uppercase letter
      "123456789abcdefghijklmnopqrstuvwxyz",
      4
    )()}`
  }

  try {
    await execute(
      [
        `npx create-next-app -e ${NEXTJS_REPO} ${nextjsDirectory}`,
        {
          signal: abortController?.signal,
          env: {
            ...process.env,
            npm_config_yes: "yes",
          },
        },
      ],
      { verbose }
    )
  } catch (e) {
    if (isAbortError(e)) {
      process.exit()
    }

    logMessage({
      message: `An error occurred while installing the Next.js storefront: ${e}`,
      type: "error",
    })
  }

  fs.renameSync(
    path.join(nextjsDirectory, ".env.template"),
    path.join(nextjsDirectory, ".env.local")
  )

  displayFactBox({
    ...factBoxOptions,
    message: `Installed Next.js Starter successfully in the ${nextjsDirectory} directory.`,
  })

  return nextjsDirectory
}

type StartOptions = {
  directory: string
  abortController?: AbortController
  verbose?: boolean
}

export function startNextjsStarter({
  directory,
  abortController,
  verbose = false,
}: StartOptions) {
  const childProcess = exec(`npm run dev`, {
    cwd: directory,
    signal: abortController?.signal,
  })

  if (verbose) {
    childProcess.stdout?.pipe(process.stdout)
    childProcess.stderr?.pipe(process.stderr)
  }
}
@@ -0,0 +1,17 @@
import pg from "pg"
const { Client } = pg

type PostgresConnection = {
  user?: string
  password?: string
  connectionString?: string
  database?: string
}

export default async (connect: PostgresConnection) => {
  const client = new Client(connect)

  await client.connect()

  return client
}
@@ -0,0 +1,292 @@
import chalk from "chalk"
import fs from "fs"
import path from "path"
import { Ora } from "ora"
import execute from "./execute.js"
import { EOL } from "os"
import { displayFactBox, FactBoxOptions } from "./facts.js"
import ProcessManager from "./process-manager.js"
import { clearProject } from "./clear-project.js"
import type { Client } from "pg"

type PrepareOptions = {
  directory: string
  dbConnectionString: string
  admin?: {
    email: string
  }
  seed?: boolean
  boilerplate?: boolean
  spinner: Ora
  processManager: ProcessManager
  abortController?: AbortController
  skipDb?: boolean
  migrations?: boolean
  onboardingType?: "default" | "nextjs"
  nextjsDirectory?: string
  client: Client | null
  verbose?: boolean
  v2?: boolean
}

export default async ({
  directory,
  dbConnectionString,
  admin,
  seed,
  boilerplate,
  spinner,
  processManager,
  abortController,
  skipDb,
  migrations,
  onboardingType = "default",
  nextjsDirectory = "",
  client,
  verbose = false,
  v2 = false,
}: PrepareOptions) => {
  // initialize execution options
  const execOptions = {
    cwd: directory,
    signal: abortController?.signal,
  }

  const npxOptions = {
    ...execOptions,
    env: {
      ...process.env,
      npm_config_yes: "yes",
    },
  }

  const factBoxOptions: FactBoxOptions = {
    interval: null,
    spinner,
    processManager,
    message: "",
    title: "",
    verbose,
  }

  // initialize the invite token to return
  let inviteToken: string | undefined = undefined

  if (!skipDb) {
    let env = `DATABASE_TYPE=postgres${EOL}DATABASE_URL=${dbConnectionString}${EOL}MEDUSA_ADMIN_ONBOARDING_TYPE=${onboardingType}${EOL}STORE_CORS=http://localhost:8000,http://localhost:7001`
    if (v2) {
      env += `${EOL}POSTGRES_URL=${dbConnectionString}`
    }
    if (nextjsDirectory) {
      env += `${EOL}MEDUSA_ADMIN_ONBOARDING_NEXTJS_DIRECTORY=${nextjsDirectory}`
    }
    // add the connection string to the project
    fs.appendFileSync(path.join(directory, `.env`), env)
  }

  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    spinner,
    title: "Installing dependencies...",
    processManager,
  })

  await processManager.runProcess({
    process: async () => {
      try {
        await execute([`yarn`, execOptions], { verbose })
      } catch (e) {
        // yarn isn't available, so fall back to npm
        await execute([`npm install --legacy-peer-deps`, execOptions], {
          verbose,
        })
      }
    },
    ignoreERESOLVE: true,
  })

  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    message: "Installed Dependencies",
  })

  if (!boilerplate) {
    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      title: "Preparing Project Directory...",
    })
    // delete files and directories related to onboarding
    clearProject(directory)
    displayFactBox({
      ...factBoxOptions,
      message: "Prepared Project Directory",
    })
  }

  factBoxOptions.interval = displayFactBox({
    ...factBoxOptions,
    title: "Building Project...",
  })

  await processManager.runProcess({
    process: async () => {
      try {
        await execute([`yarn build`, execOptions], { verbose })
      } catch (e) {
        // yarn isn't available, so fall back to npm
        await execute([`npm run build`, execOptions], { verbose })
      }
    },
    ignoreERESOLVE: true,
  })

  displayFactBox({ ...factBoxOptions, message: "Project Built" })

  if (!skipDb && migrations) {
    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      title: "Running Migrations...",
    })

    // run migrations
    await processManager.runProcess({
      process: async () => {
        const proc = await execute(
          ["npx @medusajs/medusa-cli@latest migrations run", npxOptions],
          { verbose, needOutput: true }
        )

        if (client) {
          // check that the migrations table exists in the database
          // to ensure that migrations actually ran
          let errorOccurred = false
          try {
            const migrations = await client.query(
              `SELECT * FROM "${v2 ? "mikro_orm_migrations" : "migrations"}"`
            )
            errorOccurred = migrations.rowCount == 0
          } catch (e) {
            // the query throws if the migrations table doesn't exist
            errorOccurred = true
          }

          // ensure that migrations actually ran in case of an uncaught error
          if (errorOccurred && (proc.stderr || proc.stdout)) {
            throw new Error(
              `An error occurred while running migrations: ${
                proc.stderr || proc.stdout
              }`
            )
          }
        }
      },
    })

    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      message: "Ran Migrations",
    })
  }

  if (admin && !skipDb && migrations && !v2) {
    // create an admin user
    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      title: "Creating an admin user...",
    })

    await processManager.runProcess({
      process: async () => {
        const proc = await execute(
          [
            `npx @medusajs/medusa-cli@latest user -e ${admin.email} --invite`,
            npxOptions,
          ],
          { verbose, needOutput: true }
        )

        // get the invite token from stdout
        const match = (proc.stdout as string).match(
          /Invite token: (?<token>.+)/
        )
        inviteToken = match?.groups?.token
      },
    })

    factBoxOptions.interval = displayFactBox({
      ...factBoxOptions,
      message: "Created admin user",
    })
  }

  if (!skipDb && migrations) {
    if (seed || !boilerplate) {
      factBoxOptions.interval = displayFactBox({
        ...factBoxOptions,
        title: "Seeding database...",
      })

      // check if a seed file exists in the project
      if (!fs.existsSync(path.join(directory, "data", "seed.json"))) {
        spinner
          ?.warn(
            chalk.yellow(
              "Seed file was not found in the project. Skipping seeding..."
            )
          )
          .start()
        return inviteToken
      }

      await processManager.runProcess({
        process: async () => {
          await execute(
            [
              `npx @medusajs/medusa-cli@latest seed --seed-file=${path.join(
                "data",
                "seed.json"
              )}`,
              npxOptions,
            ],
            { verbose }
          )
        },
      })

      displayFactBox({
        ...factBoxOptions,
        message: "Seeded database with demo data",
      })
    } else if (
      fs.existsSync(path.join(directory, "data", "seed-onboarding.json"))
    ) {
      // seed the database with the onboarding seed
      factBoxOptions.interval = displayFactBox({
        ...factBoxOptions,
        title: "Finishing preparation...",
      })

      await processManager.runProcess({
        process: async () => {
          await execute(
            [
              `npx @medusajs/medusa-cli@latest seed --seed-file=${path.join(
                "data",
                "seed-onboarding.json"
              )}`,
              npxOptions,
            ],
            { verbose }
          )
        },
      })
    }

    displayFactBox({ ...factBoxOptions, message: "Finished Preparation" })
  }

  return inviteToken
}
@@ -0,0 +1,60 @@
type ProcessOptions = {
  process: () => Promise<void> | void
  ignoreERESOLVE?: boolean
}

export default class ProcessManager {
  intervals: NodeJS.Timeout[] = []
  static MAX_RETRIES = 3

  constructor() {
    this.onTerminated(() => {
      this.intervals.forEach((interval) => {
        clearInterval(interval)
      })
    })
  }

  onTerminated(fn: () => Promise<void> | void) {
    process.on("SIGTERM", () => fn())
    process.on("SIGINT", () => fn())
  }

  addInterval(interval: NodeJS.Timeout) {
    this.intervals.push(interval)
  }

  // when running commands with npx or npm, they sometimes
  // terminate unexpectedly with an EAGAIN error.
  // this utility retries the process when EAGAIN occurs
  // and otherwise rethrows whatever error occurred.
  async runProcess({ process, ignoreERESOLVE }: ProcessOptions) {
    let processError = false
    let retries = 0
    do {
      // reset the flag so a successful retry exits the loop
      processError = false
      ++retries
      try {
        await process()
      } catch (error) {
        if (
          typeof error === "object" &&
          error !== null &&
          "code" in error &&
          error?.code === "EAGAIN"
        ) {
          processError = true
        } else if (
          ignoreERESOLVE &&
          typeof error === "object" &&
          error !== null &&
          "code" in error &&
          error?.code === "ERESOLVE"
        ) {
          // ignore ERESOLVE errors
        } else {
          throw error
        }
      }
    } while (processError && retries <= ProcessManager.MAX_RETRIES)
  }
}
@@ -0,0 +1,21 @@
import { exec } from "child_process"

type StartOptions = {
  directory: string
  abortController?: AbortController
}

export default ({ directory, abortController }: StartOptions) => {
  const childProcess = exec(`npx @medusajs/medusa-cli@latest develop`, {
    cwd: directory,
    signal: abortController?.signal,
    env: {
      ...process.env,
      OPEN_BROWSER: "false",
      npm_config_yes: "yes",
    },
  })

  childProcess.stdout?.pipe(process.stdout)
  childProcess.stderr?.pipe(process.stderr)
}
@@ -0,0 +1,18 @@
{
  "compilerOptions": {
    "target": "ESNext",
    "module": "Node16",
    "moduleResolution": "node16",
    "outDir": "./dist",
    "esModuleInterop": true,
    "strict": true,
    "skipLibCheck": true,
    "resolveJsonModule": true
  },
  "include": ["src"],
  "ts-node": {
    "esm": true,
    "experimentalSpecifierResolution": "node",
    "transpileOnly": true
  }
}
@@ -0,0 +1,5 @@
.env
dist/
node_modules/
@@ -0,0 +1,423 @@
|
||||
# Change Log
|
||||
|
||||
## 1.3.22
|
||||
|
||||
### Patch Changes
|
||||
|
||||
- [#5701](https://github.com/medusajs/medusa/pull/5701) [`6975eacb3`](https://github.com/medusajs/medusa/commit/6975eacb338874b976c14aae030c74362d57410c) Thanks [@adrien2p](https://github.com/adrien2p)! - feat(medusa, medusa-cli): Improve add line item + cluster starting with medusa cli
|
||||
|
||||
- Updated dependencies [[`079f0da83`](https://github.com/medusajs/medusa/commit/079f0da83f482562bbb525807ee1a7e32993b4da), [`8f25ed8a1`](https://github.com/medusajs/medusa/commit/8f25ed8a10fe23e9342dc3d03545546b4ad4d6da)]:
  - @medusajs/utils@1.11.2

## 1.3.21

### Patch Changes

- [#4968](https://github.com/medusajs/medusa/pull/4968) [`240b03800`](https://github.com/medusajs/medusa/commit/240b038006924d4872de068c98e2d6862145cb52) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(medusa-cli): remove .git directory in `new` command

- Updated dependencies [[`460161a69`](https://github.com/medusajs/medusa/commit/460161a69f22cf6d561952e92e7d9b56912113e6), [`fcb6b4f51`](https://github.com/medusajs/medusa/commit/fcb6b4f510dba2757570625acb5da9476b7544fd), [`4d16acf5f`](https://github.com/medusajs/medusa/commit/4d16acf5f096b5656b645f510f9c971e7c2dc9ef), [`87bade096`](https://github.com/medusajs/medusa/commit/87bade096e3d536f29ddc57dbc4c04e5d7a46e4b), [`4d16acf5f`](https://github.com/medusajs/medusa/commit/4d16acf5f096b5656b645f510f9c971e7c2dc9ef)]:
  - @medusajs/utils@1.10.0

## 1.3.20

### Patch Changes

- Updated dependencies [[`c58588904`](https://github.com/medusajs/medusa/commit/c58588904c5631111603b15afacf7cdc4c738cc4)]:
  - medusa-telemetry@0.0.17

## 1.3.19

### Patch Changes

- [#4733](https://github.com/medusajs/medusa/pull/4733) [`30ce35b16`](https://github.com/medusajs/medusa/commit/30ce35b163afa25f4e1d8d1bd392f401a3b413df) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app, utils, medusa-cli): add database options + remove util from `@medusajs/utils`

- Updated dependencies [[`3f3a84262`](https://github.com/medusajs/medusa/commit/3f3a84262ce9cbd911923278a54e301fbe9a4634), [`30ce35b16`](https://github.com/medusajs/medusa/commit/30ce35b163afa25f4e1d8d1bd392f401a3b413df)]:
  - @medusajs/utils@1.9.6

## 1.3.18

### Patch Changes

- [#4696](https://github.com/medusajs/medusa/pull/4696) [`03fb0479c`](https://github.com/medusajs/medusa/commit/03fb0479c0c5389d2569d622b6bec188fcee0ad6) Thanks [@pevey](https://github.com/pevey)! - Feat/allow logging to file

- Updated dependencies [[`5c60aad17`](https://github.com/medusajs/medusa/commit/5c60aad177a99574ffff5ebdc02ce9dc86ef9af9), [`4073b7313`](https://github.com/medusajs/medusa/commit/4073b73130c874dc7d2240726224a01b7b19b1a1)]:
  - @medusajs/utils@1.9.5

## 1.3.17

### Patch Changes

- [#4420](https://github.com/medusajs/medusa/pull/4420) [`6f1fa244f`](https://github.com/medusajs/medusa/commit/6f1fa244fa47d4ecdaa7363483bd7da555dbbf32) Thanks [@adrien2p](https://github.com/adrien2p)! - chore(medusa-cli): Cleanup plugin setup + include Logger type update which is used across multiple packages

- Updated dependencies [[`499c3478c`](https://github.com/medusajs/medusa/commit/499c3478c910c8b922a15cc6f4d9fbad122a347f), [`9dcdc0041`](https://github.com/medusajs/medusa/commit/9dcdc0041a2b08cc0723343dd8d9127d9977b086), [`9760d4a96`](https://github.com/medusajs/medusa/commit/9760d4a96c27f6f89a8c3f3b6e73b17547f97f2a)]:
  - @medusajs/utils@1.9.2

## 1.3.16

### Patch Changes

- [#4273](https://github.com/medusajs/medusa/pull/4273) [`f98ba5bde`](https://github.com/medusajs/medusa/commit/f98ba5bde83ba785eead31b0c9eb9f135d664178) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(create-medusa-app,medusa-cli): Allow clearing project

- [#4192](https://github.com/medusajs/medusa/pull/4192) [`8676ee7a2`](https://github.com/medusajs/medusa/commit/8676ee7a2e3da9fecc2e81db3ab12693d02f63f4) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(medusa,medusa-cli): Added an invite option to the create user command, and allow seeding publishable api keys

- [#4214](https://github.com/medusajs/medusa/pull/4214) [`260dc55b6`](https://github.com/medusajs/medusa/commit/260dc55b6f122351f8e1d75a4b5ca797745735be) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore(medusa,medusa-cli): Clean up new command + fix CI

- [#4276](https://github.com/medusajs/medusa/pull/4276) [`afd1b67f1`](https://github.com/medusajs/medusa/commit/afd1b67f1c7de8cf07fd9fcbdde599a37914e9b5) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore: Use caret range

- Updated dependencies [[`f98ba5bde`](https://github.com/medusajs/medusa/commit/f98ba5bde83ba785eead31b0c9eb9f135d664178), [`14c0f62f8`](https://github.com/medusajs/medusa/commit/14c0f62f84704a4c87beff3daaff60a52f5c88b8)]:
  - @medusajs/utils@1.9.1

## 1.3.15

### Patch Changes

- [#4026](https://github.com/medusajs/medusa/pull/4026) [`a91987fab`](https://github.com/medusajs/medusa/commit/a91987fab33745f9864eab21bd1c27e8e3e24571) Thanks [@olivermrbl](https://github.com/olivermrbl)! - feat(medusa): Remove sqlite support

- Updated dependencies [[`a91987fab`](https://github.com/medusajs/medusa/commit/a91987fab33745f9864eab21bd1c27e8e3e24571), [`db4199530`](https://github.com/medusajs/medusa/commit/db419953075e0907b8c4d27ab5188e9bd3e3d72b), [`c0e527d6e`](https://github.com/medusajs/medusa/commit/c0e527d6e0a67d0c53577a0b9c3d16ee8dc5740f)]:
  - @medusajs/utils@1.9.0

## 1.3.14

### Patch Changes

- Updated dependencies [[`cdbac2c84`](https://github.com/medusajs/medusa/commit/cdbac2c8403a3c15c0e11993f6b7dab268fa5c08), [`6511959e2`](https://github.com/medusajs/medusa/commit/6511959e23177f3b4831915db0e8e788bc9047fa)]:
  - @medusajs/utils@1.8.5

## 1.3.13

### Patch Changes

- Updated dependencies [[`1ea57c3a6`](https://github.com/medusajs/medusa/commit/1ea57c3a69a5377a8dd0821df819743ded4a222b)]:
  - @medusajs/utils@1.8.4

## 1.3.12

### Patch Changes

- Updated dependencies [[`0e488e71b`](https://github.com/medusajs/medusa/commit/0e488e71b186f7d08b18c4c6ba409ef3cadb8152), [`d539c6fee`](https://github.com/medusajs/medusa/commit/d539c6feeba8ee431f9a655b6cd4e9102cba2b25)]:
  - @medusajs/utils@1.8.3

## 1.3.11

### Patch Changes

- Updated dependencies [[`af710f1b4`](https://github.com/medusajs/medusa/commit/af710f1b48a4545a5064029a557013af34c4c100), [`491566df6`](https://github.com/medusajs/medusa/commit/491566df6b7ced35f655f810961422945e10ecd0)]:
  - @medusajs/utils@1.8.2

## 1.3.10

### Patch Changes

- Updated dependencies [[`654a54622`](https://github.com/medusajs/medusa/commit/654a54622303139e7180538bd686630ad9a46cfd), [`abdb74d99`](https://github.com/medusajs/medusa/commit/abdb74d997f49f994bff49787a396179982843b0)]:
  - @medusajs/utils@1.8.1

## 1.3.9

### Patch Changes

- [#3688](https://github.com/medusajs/medusa/pull/3688) [`a0c919a8d`](https://github.com/medusajs/medusa/commit/a0c919a8d01ca5edf62336de48e9a112e3822f38) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore(medusa-cli): Add missing utils dep

- [#3603](https://github.com/medusajs/medusa/pull/3603) [`cd54c7dca`](https://github.com/medusajs/medusa/commit/cd54c7dca9f7444dd2bfa91b4e3e3359dc6658cf) Thanks [@kasperkristensen](https://github.com/kasperkristensen)! - fix(medusa-cli): Add direct dependency of `semver`

- Updated dependencies [[`121b42acf`](https://github.com/medusajs/medusa/commit/121b42acfe98c12dd593f9b1f2072ff0f3b61724), [`8ddb3952c`](https://github.com/medusajs/medusa/commit/8ddb3952c045e6c05c8d0f6922f0d4ba30cf3bd4), [`aa690beed`](https://github.com/medusajs/medusa/commit/aa690beed775646cbc86b445fb5dc90dcac087d5), [`a0c919a8d`](https://github.com/medusajs/medusa/commit/a0c919a8d01ca5edf62336de48e9a112e3822f38), [`74bc4b16a`](https://github.com/medusajs/medusa/commit/74bc4b16a07f78668003ca930bf2a0d928897ceb), [`54dcc1871`](https://github.com/medusajs/medusa/commit/54dcc1871c8f28bea962dbb9df6e79b038d56449), [`77d46220c`](https://github.com/medusajs/medusa/commit/77d46220c23bfe19e575cbc445874eb6c22f3c73), [`4e9d257d3`](https://github.com/medusajs/medusa/commit/4e9d257d3bf76703ef5be8ca054cc9f0f7339def)]:
  - medusa-core-utils@1.2.0
  - @medusajs/utils@1.8.0

## 1.3.9-rc.2

### Patch Changes

- [#3688](https://github.com/medusajs/medusa/pull/3688) [`a0c919a8d`](https://github.com/medusajs/medusa/commit/a0c919a8d01ca5edf62336de48e9a112e3822f38) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore(medusa-cli): Add missing utils dep

- Updated dependencies [[`a0c919a8d`](https://github.com/medusajs/medusa/commit/a0c919a8d01ca5edf62336de48e9a112e3822f38)]:
  - @medusajs/utils@0.0.2-rc.2

## 1.3.9-rc.1

### Patch Changes

- [#3603](https://github.com/medusajs/medusa/pull/3603) [`cd54c7dca`](https://github.com/medusajs/medusa/commit/cd54c7dca9f7444dd2bfa91b4e3e3359dc6658cf) Thanks [@kasperkristensen](https://github.com/kasperkristensen)! - fix(medusa-cli): Add direct dependency of `semver`

## 1.3.9-rc.0

### Patch Changes

- Updated dependencies [[`121b42acf`](https://github.com/medusajs/medusa/commit/121b42acfe98c12dd593f9b1f2072ff0f3b61724), [`aa690beed`](https://github.com/medusajs/medusa/commit/aa690beed775646cbc86b445fb5dc90dcac087d5), [`54dcc1871`](https://github.com/medusajs/medusa/commit/54dcc1871c8f28bea962dbb9df6e79b038d56449), [`77d46220c`](https://github.com/medusajs/medusa/commit/77d46220c23bfe19e575cbc445874eb6c22f3c73)]:
  - medusa-core-utils@1.2.0-rc.0

## 1.3.8

### Patch Changes

- [#3217](https://github.com/medusajs/medusa/pull/3217) [`8c5219a31`](https://github.com/medusajs/medusa/commit/8c5219a31ef76ee571fbce84d7d57a63abe56eb0) Thanks [@adrien2p](https://github.com/adrien2p)! - chore: Fix npm packages files included

- Updated dependencies [[`8c5219a31`](https://github.com/medusajs/medusa/commit/8c5219a31ef76ee571fbce84d7d57a63abe56eb0)]:
  - medusa-core-utils@1.1.39

## 1.3.7

### Patch Changes

- [#3185](https://github.com/medusajs/medusa/pull/3185) [`08324355a`](https://github.com/medusajs/medusa/commit/08324355a4466b017a0bc7ab1d333ee3cd27b8c4) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore: Patches all dependencies + minor bumps `winston` to include a [fix for a significant memory leak](https://github.com/winstonjs/winston/pull/2057)

- Updated dependencies [[`08324355a`](https://github.com/medusajs/medusa/commit/08324355a4466b017a0bc7ab1d333ee3cd27b8c4)]:
  - medusa-core-utils@1.1.38

## 1.3.6

### Patch Changes

- [#3025](https://github.com/medusajs/medusa/pull/3025) [`93d0dc1bd`](https://github.com/medusajs/medusa/commit/93d0dc1bdcb54cf6e87428a7bb9b0dac196b4de2) Thanks [@adrien2p](https://github.com/adrien2p)! - fix(medusa): test, build and watch scripts

- Updated dependencies [[`93d0dc1bd`](https://github.com/medusajs/medusa/commit/93d0dc1bdcb54cf6e87428a7bb9b0dac196b4de2)]:
  - medusa-telemetry@0.0.16

## 1.3.5

### Patch Changes

- Updated dependencies [[`a76762418`](https://github.com/medusajs/medusa/commit/a76762418877e675977540dc95e095492873af44)]:
  - medusa-telemetry@0.0.15

## 1.3.4

### Patch Changes

- Updated dependencies [[`cfb24d72f`](https://github.com/medusajs/medusa/commit/cfb24d72fa303a6755e8579c46d3c7f36278b120)]:
  - medusa-telemetry@0.0.14

## 1.3.3

### Patch Changes

- [#2069](https://github.com/medusajs/medusa/pull/2069) [`ad717b953`](https://github.com/medusajs/medusa/commit/ad717b9533a0500e20c4e312d1ee48b35ea9d5e1) Thanks [@olivermrbl](https://github.com/olivermrbl)! - Remove deprecated dependency `@hapi/joi`

All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

## [1.3.1](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.3.0...@medusajs/medusa-cli@1.3.1) (2022-07-05)

### Features

- **medusa-cli:** Allow to revert migrations from the CLI ([#1353](https://github.com/medusajs/medusa/issues/1353)) ([012513b](https://github.com/medusajs/medusa/commit/012513b6a1e90169e9e0e53f7a59841a34fbaeb3))

# [1.3.0](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.2.1...@medusajs/medusa-cli@1.3.0) (2022-05-01)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.2.1](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.27...@medusajs/medusa-cli@1.2.1) (2022-02-28)

### Features

- new tax api ([#979](https://github.com/medusajs/medusa/issues/979)) ([47588e7](https://github.com/medusajs/medusa/commit/47588e7a8d3b2ae2fed0c1e87fdf1ee2db6bcdc2)), closes [#885](https://github.com/medusajs/medusa/issues/885) [#896](https://github.com/medusajs/medusa/issues/896) [#911](https://github.com/medusajs/medusa/issues/911) [#945](https://github.com/medusajs/medusa/issues/945) [#950](https://github.com/medusajs/medusa/issues/950) [#951](https://github.com/medusajs/medusa/issues/951) [#954](https://github.com/medusajs/medusa/issues/954) [#969](https://github.com/medusajs/medusa/issues/969) [#998](https://github.com/medusajs/medusa/issues/998) [#1017](https://github.com/medusajs/medusa/issues/1017) [#1110](https://github.com/medusajs/medusa/issues/1110)

# [1.2.0](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.27...@medusajs/medusa-cli@1.2.0) (2022-02-25)

### Features

- new tax api ([#979](https://github.com/medusajs/medusa/issues/979)) ([c56660f](https://github.com/medusajs/medusa/commit/c56660fca9921a3f3637bc137d9794781c5b090f)), closes [#885](https://github.com/medusajs/medusa/issues/885) [#896](https://github.com/medusajs/medusa/issues/896) [#911](https://github.com/medusajs/medusa/issues/911) [#945](https://github.com/medusajs/medusa/issues/945) [#950](https://github.com/medusajs/medusa/issues/950) [#951](https://github.com/medusajs/medusa/issues/951) [#954](https://github.com/medusajs/medusa/issues/954) [#969](https://github.com/medusajs/medusa/issues/969) [#998](https://github.com/medusajs/medusa/issues/998) [#1017](https://github.com/medusajs/medusa/issues/1017) [#1110](https://github.com/medusajs/medusa/issues/1110)

## [1.1.27](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.26...@medusajs/medusa-cli@1.1.27) (2022-01-11)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.26](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.25...@medusajs/medusa-cli@1.1.26) (2021-12-29)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.25](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.24...@medusajs/medusa-cli@1.1.25) (2021-12-17)

### Features

- add medusa-react ([#913](https://github.com/medusajs/medusa/issues/913)) ([d0d8dd7](https://github.com/medusajs/medusa/commit/d0d8dd7bf62eaac71df8714c2dfb4f204d192f51))

## [1.1.24](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.23...@medusajs/medusa-cli@1.1.24) (2021-12-08)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.23](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.22...@medusajs/medusa-cli@1.1.23) (2021-11-11)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.22](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.21...@medusajs/medusa-cli@1.1.22) (2021-10-18)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.21](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.20...@medusajs/medusa-cli@1.1.21) (2021-10-18)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.20](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.18...@medusajs/medusa-cli@1.1.20) (2021-10-18)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.19](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.18...@medusajs/medusa-cli@1.1.19) (2021-10-18)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.18](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.17...@medusajs/medusa-cli@1.1.18) (2021-09-15)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.17](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.16...@medusajs/medusa-cli@1.1.17) (2021-09-14)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.16](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.15...@medusajs/medusa-cli@1.1.16) (2021-08-17)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.15](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.14...@medusajs/medusa-cli@1.1.15) (2021-08-05)

### Bug Fixes

- canary assist ([b988b67](https://github.com/medusajs/medusa/commit/b988b67118553c88ef6c6d53ae99ef1ad9d67305))

### Features

- medusa-telemetry ([#328](https://github.com/medusajs/medusa/issues/328)) ([cfe19f7](https://github.com/medusajs/medusa/commit/cfe19f7f9d3bb17425348362b148a0b4b7a649ef))

## [1.1.14](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.13...@medusajs/medusa-cli@1.1.14) (2021-07-26)

### Features

- CLI + local linking ([#313](https://github.com/medusajs/medusa/issues/313)) ([f4a7138](https://github.com/medusajs/medusa/commit/f4a7138a5888e69e19bebe8f4962afc42e9a945d)), closes [#320](https://github.com/medusajs/medusa/issues/320)

## [1.1.13](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.11...@medusajs/medusa-cli@1.1.13) (2021-07-15)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.12](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.11...@medusajs/medusa-cli@1.1.12) (2021-07-15)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.11](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.10...@medusajs/medusa-cli@1.1.11) (2021-07-02)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.10](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.9...@medusajs/medusa-cli@1.1.10) (2021-06-22)

### Bug Fixes

- release assist ([668e8a7](https://github.com/medusajs/medusa/commit/668e8a740200847fc2a41c91d2979097f1392532))

## [1.1.9](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.8...@medusajs/medusa-cli@1.1.9) (2021-06-09)

### Bug Fixes

- better cli ([953747f](https://github.com/medusajs/medusa/commit/953747f3d2409cef82faf926ad316a384e6667b4))
- setup to allow login to Medusa Cloud ([bbd2f02](https://github.com/medusajs/medusa/commit/bbd2f02d549330df160c76cf1f4e4d5e7d08f246))

### Features

- **cli:** adds seed script ([5136c77](https://github.com/medusajs/medusa/commit/5136c7740137afcda52393131ef931eb76ea9f5d))

## [1.1.8](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.8...@medusajs/medusa-cli@1.1.8) (2021-06-09)

### Bug Fixes

- better cli ([953747f](https://github.com/medusajs/medusa/commit/953747f3d2409cef82faf926ad316a384e6667b4))
- setup to allow login to Medusa Cloud ([bbd2f02](https://github.com/medusajs/medusa/commit/bbd2f02d549330df160c76cf1f4e4d5e7d08f246))

### Features

- **cli:** adds seed script ([5136c77](https://github.com/medusajs/medusa/commit/5136c7740137afcda52393131ef931eb76ea9f5d))

## [1.1.7](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.8...@medusajs/medusa-cli@1.1.7) (2021-06-09)

### Bug Fixes

- better cli ([953747f](https://github.com/medusajs/medusa/commit/953747f3d2409cef82faf926ad316a384e6667b4))
- setup to allow login to Medusa Cloud ([bbd2f02](https://github.com/medusajs/medusa/commit/bbd2f02d549330df160c76cf1f4e4d5e7d08f246))

### Features

- **cli:** adds seed script ([5136c77](https://github.com/medusajs/medusa/commit/5136c7740137afcda52393131ef931eb76ea9f5d))

## [1.1.6](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.8...@medusajs/medusa-cli@1.1.6) (2021-06-09)

### Bug Fixes

- better cli ([953747f](https://github.com/medusajs/medusa/commit/953747f3d2409cef82faf926ad316a384e6667b4))
- setup to allow login to Medusa Cloud ([bbd2f02](https://github.com/medusajs/medusa/commit/bbd2f02d549330df160c76cf1f4e4d5e7d08f246))

### Features

- **cli:** adds seed script ([5136c77](https://github.com/medusajs/medusa/commit/5136c7740137afcda52393131ef931eb76ea9f5d))

## [1.1.5](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.8...@medusajs/medusa-cli@1.1.5) (2021-06-08)

### Bug Fixes

- better cli ([953747f](https://github.com/medusajs/medusa/commit/953747f3d2409cef82faf926ad316a384e6667b4))
- setup to allow login to Medusa Cloud ([bbd2f02](https://github.com/medusajs/medusa/commit/bbd2f02d549330df160c76cf1f4e4d5e7d08f246))

### Features

- **cli:** adds seed script ([5136c77](https://github.com/medusajs/medusa/commit/5136c7740137afcda52393131ef931eb76ea9f5d))

## [1.1.8](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.7...@medusajs/medusa-cli@1.1.8) (2021-05-05)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.7](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.4...@medusajs/medusa-cli@1.1.7) (2021-04-28)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.6](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.5...@medusajs/medusa-cli@1.1.6) (2021-04-20)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.5](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.4...@medusajs/medusa-cli@1.1.5) (2021-04-20)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.4](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.3...@medusajs/medusa-cli@1.1.4) (2021-03-17)

**Note:** Version bump only for package @medusajs/medusa-cli

## [1.1.3](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.0...@medusajs/medusa-cli@1.1.3) (2021-03-17)

### Features

- dev cli ([#203](https://github.com/medusajs/medusa/issues/203)) ([695b1fd](https://github.com/medusajs/medusa/commit/695b1fd0a54a247502cb48ffb73d060356293b76)), closes [#199](https://github.com/medusajs/medusa/issues/199)

## [1.1.2](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.1.0...@medusajs/medusa-cli@1.1.2) (2021-03-17)

### Features

- dev cli ([#203](https://github.com/medusajs/medusa/issues/203)) ([695b1fd](https://github.com/medusajs/medusa/commit/695b1fd0a54a247502cb48ffb73d060356293b76)), closes [#199](https://github.com/medusajs/medusa/issues/199)

# [1.1.0](https://github.com/medusajs/medusa/compare/@medusajs/medusa-cli@1.0.11...@medusajs/medusa-cli@1.1.0) (2021-01-26)

**Note:** Version bump only for package @medusajs/medusa-cli

## 1.0.11 (2020-11-24)

## 1.0.10 (2020-09-09)

### Bug Fixes

- ignore files ([eca1e00](https://github.com/medusajs/medusa/commit/eca1e006a77472c9402cd85bb879f08134af200b))
- updates license ([db519fb](https://github.com/medusajs/medusa/commit/db519fbaa6f8ad02c19cbecba5d4f28ba1ee81aa))

## 1.0.1 (2020-09-05)

# 1.0.0 (2020-09-03)

# 1.0.0-alpha.21 (2020-08-25)

# 1.0.0-alpha.3 (2020-08-20)

# 1.0.0-alpha.2 (2020-08-20)

# 1.0.0-alpha.1 (2020-08-20)

# 1.0.0-alpha.0 (2020-08-20)

## [1.0.10](https://github.com/medusajs/medusa/compare/v1.0.9...v1.0.10) (2020-09-09)

### Bug Fixes

- ignore files ([eca1e00](https://github.com/medusajs/medusa/commit/eca1e006a77472c9402cd85bb879f08134af200b))
- updates license ([db519fb](https://github.com/medusajs/medusa/commit/db519fbaa6f8ad02c19cbecba5d4f28ba1ee81aa))

Executable
+4
@@ -0,0 +1,4 @@
#!/usr/bin/env node

require("dotenv").config()
require("./dist/index.js")
@@ -0,0 +1,13 @@
module.exports = {
  globals: {
    "ts-jest": {
      tsconfig: "tsconfig.spec.json",
      isolatedModules: false,
    },
  },
  transform: {
    "^.+\\.[jt]s?$": "ts-jest",
  },
  testEnvironment: `node`,
  moduleFileExtensions: [`js`, `jsx`, `ts`, `tsx`, `json`],
}
@@ -0,0 +1,71 @@
{
  "name": "@medusajs/medusa-cli",
  "version": "1.3.22",
  "description": "Command Line interface for Medusa Commerce",
  "main": "dist/index.js",
  "bin": {
    "medusa": "cli.js"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/medusajs/medusa",
    "directory": "packages/medusa-cli"
  },
  "publishConfig": {
    "access": "public"
  },
  "files": [
    "dist",
    "cli.js"
  ],
  "engines": {
    "node": ">=16"
  },
  "scripts": {
    "test": "jest --passWithNoTests src",
    "build": "rimraf dist && tsc",
    "watch": "tsc --watch",
    "prepublishOnly": "cross-env NODE_ENV=production tsc --build"
  },
  "author": "Sebastian Rindom",
  "license": "MIT",
  "devDependencies": {
    "@types/yargs": "^15.0.15",
    "cross-env": "^5.2.1",
    "jest": "^25.5.4",
    "rimraf": "^5.0.1",
    "ts-jest": "^25.5.1",
    "typescript": "^4.9.5"
  },
  "dependencies": {
    "@medusajs/utils": "^1.11.2",
    "axios": "^0.21.4",
    "chalk": "^4.0.0",
    "configstore": "5.0.1",
    "core-js": "^3.6.5",
    "dotenv": "^16.4.5",
    "execa": "^5.1.1",
    "fs-exists-cached": "^1.0.0",
    "fs-extra": "^10.0.0",
    "glob": "^7.1.6",
    "hosted-git-info": "^4.0.2",
    "inquirer": "^8.0.0",
    "is-valid-path": "^0.1.1",
    "meant": "^1.0.3",
    "medusa-core-utils": "^1.2.0",
    "medusa-telemetry": "^0.0.17",
    "open": "^8.0.6",
    "ora": "^5.4.1",
    "pg": "^8.11.0",
    "pg-god": "^1.0.12",
    "prompts": "^2.4.2",
    "regenerator-runtime": "^0.13.11",
    "resolve-cwd": "^3.0.0",
    "semver": "^7.3.8",
    "stack-trace": "^0.0.10",
    "ulid": "^2.3.0",
    "winston": "^3.8.2",
    "yargs": "^15.3.1"
  },
  "gitHead": "81a7ff73d012fda722f6e9ef0bd9ba0232d37808"
}
@@ -0,0 +1,16 @@
const chalk = require(`chalk`)

const showSuccessMessage = () => {
  console.log(chalk.green(`Success!\n`))
  console.log(chalk.cyan(`Welcome to the Medusa CLI!`))
}

try {
  // check if it's a global installation of medusa-cli
  const npmArgs = JSON.parse(process.env[`npm_config_argv`])
  if (npmArgs.cooked && npmArgs.cooked.includes(`--global`)) {
    const createCli = require(`../dist/create-cli`)
    showSuccessMessage()
    createCli(`--help`)
  }
} catch (e) {}
@@ -0,0 +1,657 @@
|
||||
/*
|
||||
* Adapted from https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-cli/src/init-starter.ts
|
||||
*/
|
||||
|
||||
import { execSync } from "child_process"
|
||||
import execa from "execa"
|
||||
import { sync as existsSync } from "fs-exists-cached"
|
||||
import fs from "fs-extra"
|
||||
import hostedGitInfo from "hosted-git-info"
|
||||
import isValid from "is-valid-path"
|
||||
import sysPath from "path"
|
||||
import prompts from "prompts"
|
||||
import { Pool } from "pg"
|
||||
import url from "url"
|
||||
import { createDatabase } from "pg-god"
|
||||
import { track } from "medusa-telemetry"
|
||||
import inquirer from "inquirer"
|
||||
|
||||
import reporter from "../reporter"
|
||||
import { getPackageManager, setPackageManager } from "../util/package-manager"
|
||||
import { PanicId } from "../reporter/panic-handler"
|
||||
import { clearProject } from "../util/clear-project"
|
||||
import path from "path"
|
||||
|
||||
const removeUndefined = (obj) => {
|
||||
return Object.fromEntries(
|
||||
Object.entries(obj)
|
||||
.filter(([_, v]) => v != null)
|
||||
.map(([k, v]) => [k, v === Object(v) ? removeUndefined(v) : v])
|
||||
)
|
||||
}
|
||||
|
||||
const spawnWithArgs = (file, args, options) =>
|
||||
execa(file, args, { stdio: `inherit`, preferLocal: false, ...options })
|
||||
|
||||
const spawn = (cmd, options) => {
|
||||
const [file, ...args] = cmd.split(/\s+/)
|
||||
return spawnWithArgs(file, args, options)
|
||||
}
|
||||
// Checks the existence of yarn package
|
||||
// We use yarnpkg instead of yarn to avoid conflict with Hadoop yarn
|
||||
// Refer to https://github.com/yarnpkg/yarn/issues/673
|
||||
const checkForYarn = () => {
|
||||
try {
|
||||
execSync(`yarnpkg --version`, { stdio: `ignore` })
|
||||
return true
|
||||
} catch (e) {
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
const isAlreadyGitRepository = async () => {
|
||||
try {
|
||||
return await spawn(`git rev-parse --is-inside-work-tree`, {
|
||||
stdio: `pipe`,
|
||||
}).then((output) => output.stdout === `true`)
|
||||
} catch (err) {
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
// Initialize newly cloned directory as a git repo
|
||||
const gitInit = async (rootPath) => {
|
||||
reporter.info(`Initialising git in ${rootPath}`)
|
||||
|
||||
return await spawn(`git init`, { cwd: rootPath })
|
||||
}
|
||||
|
||||
// Create a .gitignore file if it is missing in the new directory
|
||||
const maybeCreateGitIgnore = async (rootPath) => {
|
||||
if (existsSync(sysPath.join(rootPath, `.gitignore`))) {
|
||||
return
|
||||
}
|
||||
|
||||
const gignore = reporter.activity(
|
||||
`Creating minimal .gitignore in ${rootPath}`
|
||||
)
|
||||
await fs.writeFile(
|
||||
sysPath.join(rootPath, `.gitignore`),
|
||||
`.cache\nnode_modules\npublic\n`
|
||||
)
|
||||
reporter.success(gignore, `Created .gitignore in ${rootPath}`)
|
||||
}
|
||||
|
||||
// Create an initial git commit in the new directory
|
||||
const createInitialGitCommit = async (rootPath, starterUrl) => {
|
||||
reporter.info(`Create initial git commit in ${rootPath}`)
|
||||
|
||||
await spawn(`git add -A`, { cwd: rootPath })
|
||||
// use execSync instead of spawn to handle git clients using
|
||||
// pgp signatures (with password)
|
||||
try {
|
||||
execSync(`git commit -m "Initial commit from medusa: (${starterUrl})"`, {
|
||||
cwd: rootPath,
|
||||
})
|
||||
} catch {
|
||||
// Remove git support if initial commit fails
|
||||
reporter.warn(`Initial git commit failed - removing git support\n`)
|
||||
fs.removeSync(sysPath.join(rootPath, `.git`))
|
||||
}
|
||||
}
|
||||
|
||||
// Executes `npm install` or `yarn install` in rootPath.
|
||||
const install = async (rootPath) => {
|
||||
const prevDir = process.cwd()
|
||||
|
||||
reporter.info(`Installing packages...`)
|
||||
console.log() // Add some space
|
||||
|
||||
process.chdir(rootPath)
|
||||
|
||||
const npmConfigUserAgent = process.env.npm_config_user_agent
|
||||
|
||||
try {
|
||||
if (!getPackageManager()) {
|
||||
if (npmConfigUserAgent?.includes(`yarn`)) {
|
||||
setPackageManager(`yarn`)
|
||||
} else {
|
||||
setPackageManager(`npm`)
|
||||
}
|
||||
}
|
||||
if (getPackageManager() === `yarn` && checkForYarn()) {
|
||||
await fs.remove(`package-lock.json`)
|
||||
await spawn(`yarnpkg`, {})
|
||||
} else {
|
||||
await fs.remove(`yarn.lock`)
|
||||
await spawn(`npm install`, {})
|
||||
}
|
||||
} finally {
|
||||
process.chdir(prevDir)
|
||||
}
|
||||
}
|
||||
|
||||
// Filter used by fs.copy: returns true for paths that should be copied,
// i.e. everything except .git and .hg directories.
const ignored = (path) => !/^\.(git|hg)$/.test(sysPath.basename(path))

// Copy starter from file system.
const copy = async (starterPath, rootPath) => {
  // Chmod with 755.
  // 493 = parseInt('755', 8)
  await fs.ensureDir(rootPath, { mode: 493 })

  if (!existsSync(starterPath)) {
    throw new Error(`starter ${starterPath} doesn't exist`)
  }

  if (starterPath === `.`) {
    throw new Error(
      `You can't create a starter from the existing directory. If you want to
      create a new project in the current directory, the trailing dot isn't
      necessary. If you want to create a project from a local starter, run
      something like "medusa new my-medusa-store ../local-medusa-starter"`
    )
  }

  reporter.info(`Creating new site from local starter: ${starterPath}`)

  const copyActivity = reporter.activity(
    `Copying local starter to ${rootPath} ...`
  )

  await fs.copy(starterPath, rootPath, { filter: ignored })

  reporter.success(copyActivity, `Created starter directory layout`)
  console.log() // Add some space

  await install(rootPath)

  return true
}

// Clones starter from URI.
const clone = async (hostInfo, rootPath, v2 = false) => {
  let url
  // Let people use private repos accessed over SSH.
  if (hostInfo.getDefaultRepresentation() === `sshurl`) {
    url = hostInfo.ssh({ noCommittish: true })
    // Otherwise default to normal git syntax.
  } else {
    url = hostInfo.https({ noCommittish: true, noGitPlus: true })
  }

  const branch = v2
    ? [`-b`, "feat/v2"]
    : hostInfo.committish
    ? [`-b`, hostInfo.committish]
    : []

  const createAct = reporter.activity(`Creating new project from git: ${url}`)

  const args = [
    `clone`,
    ...branch,
    url,
    rootPath,
    `--recursive`,
    `--depth=1`,
  ].filter((arg) => Boolean(arg))

  await execa(`git`, args, {})
    .then(() => {
      reporter.success(createAct, `Created starter directory layout`)
    })
    .catch((err) => {
      reporter.failure(createAct, `Failed to clone repository`)
      throw err
    })

  await fs.remove(sysPath.join(rootPath, `.git`))

  await install(rootPath)
  const isGit = await isAlreadyGitRepository()
  if (!isGit) await gitInit(rootPath)
  await maybeCreateGitIgnore(rootPath)
  if (!isGit) await createInitialGitCommit(rootPath, url)
}

const getMedusaConfig = (rootPath) => {
  try {
    const configPath = sysPath.join(rootPath, "medusa-config.js")
    if (existsSync(configPath)) {
      const resolved = sysPath.resolve(configPath)
      const configModule = require(resolved)
      return configModule
    }
    throw Error()
  } catch (err) {
    console.log(err)
    reporter.warn(
      `Couldn't find a medusa-config.js file; please double check that you have the correct starter installed`
    )
  }
  return {}
}

const getPaths = async (starterPath, rootPath, v2 = false) => {
  let selectedOtherStarter = false

  // If no args are passed, prompt the user for a path and starter.
  if (!starterPath && !rootPath) {
    const response = await prompts.prompt([
      {
        type: `text`,
        name: `path`,
        message: `What is your project called?`,
        initial: `my-medusa-store`,
      },
      !v2 && {
        type: `select`,
        name: `starter`,
        message: `What starter would you like to use?`,
        choices: [
          { title: `medusa-starter-default`, value: `medusa-starter-default` },
          { title: `(Use a different starter)`, value: `different` },
        ],
        initial: 0,
      },
    ])

    // Exit gracefully if responses aren't provided.
    if ((!v2 && !response.starter) || !response.path.trim()) {
      throw new Error(
        `Please provide both a starter package and a project name, along with a path (if it's not in the root)`
      )
    }

    selectedOtherStarter = response.starter === `different`
    starterPath = `medusajs/${v2 ? "medusa-starter-default" : response.starter}`
    rootPath = response.path
  }

  // Set defaults if no root or starter has been set yet.
  rootPath = rootPath || process.cwd()
  starterPath = starterPath || `medusajs/medusa-starter-default`

  return { starterPath, rootPath, selectedOtherStarter }
}

const successMessage = (path) => {
  reporter.info(`Your new Medusa project is ready for you! To start developing run:

  cd ${path}
  medusa develop
`)
}

const defaultDBCreds = {
  user: process.env.USER || "postgres",
  database: "postgres",
  password: "",
  port: 5432,
  host: "localhost",
}

const verifyPgCreds = async (creds) => {
  const pool = new Pool(creds)
  return new Promise((resolve, reject) => {
    pool.query("SELECT NOW()", (err, res) => {
      pool.end()
      if (err) {
        reject(err)
      } else {
        resolve(res)
      }
    })
  })
}

const interactiveDbCreds = async (dbName, dbCreds = {}) => {
  const credentials = Object.assign({}, defaultDBCreds, dbCreds)

  const collecting = true

  while (collecting) {
    const result = await inquirer
      .prompt([
        {
          type: "list",
          name: "continueWithDefault",
          message: `

Will attempt to setup Postgres database "${dbName}" with credentials:
  user: ${credentials.user}
  password: ***
  port: ${credentials.port}
  host: ${credentials.host}
Do you wish to continue with these credentials?

`,
          choices: [`Continue`, `Change credentials`, `Skip database setup`],
        },
        {
          type: "input",
          when: ({ continueWithDefault }) =>
            continueWithDefault === `Change credentials`,
          name: "user",
          default: credentials.user,
          message: `DB user`,
        },
        {
          type: "password",
          when: ({ continueWithDefault }) =>
            continueWithDefault === `Change credentials`,
          name: "password",
          default: credentials.password,
          message: `DB password`,
        },
        {
          type: "number",
          when: ({ continueWithDefault }) =>
            continueWithDefault === `Change credentials`,
          name: "port",
          default: credentials.port,
          message: `DB port`,
        },
        {
          type: "input",
          when: ({ continueWithDefault }) =>
            continueWithDefault === `Change credentials`,
          name: "host",
          default: credentials.host,
          message: `DB host`,
        },
      ])
      .then(async (answers) => {
        const collectedCreds = Object.assign({}, credentials, {
          user: answers.user,
          password: answers.password,
          host: answers.host,
          port: answers.port,
        })

        switch (answers.continueWithDefault) {
          case "Continue": {
            const done = await verifyPgCreds(credentials).catch((_) => false)
            if (done) {
              return credentials
            }
            return false
          }
          case "Change credentials": {
            const done = await verifyPgCreds(collectedCreds).catch((_) => false)
            if (done) {
              return collectedCreds
            }
            return false
          }
          default:
            return null
        }
      })

    if (result !== false) {
      return result
    }

    console.log("\n\nCould not verify DB credentials - please try again\n\n")
  }

  return
}

const setupDB = async (dbName, dbCreds = {}) => {
  const credentials = Object.assign({}, defaultDBCreds, dbCreds)

  const dbActivity = reporter.activity(`Setting up database "${dbName}"...`)
  await createDatabase(
    {
      databaseName: dbName,
      errorIfExist: true,
    },
    credentials
  )
    .then(() => {
      reporter.success(dbActivity, `Created database "${dbName}"`)
    })
    .catch((err) => {
      if (err.name === "PDG_ERR::DuplicateDatabase") {
        reporter.success(
          dbActivity,
          `Database ${dbName} already exists; skipping setup`
        )
      } else {
        reporter.failure(dbActivity, `Skipping database setup.`)
        reporter.warn(
          `Failed to set up the database; install PostgreSQL or make sure to manage your database connection manually`
        )
        console.error(err)
      }
    })
}

const setupEnvVars = async (rootPath, dbName, dbCreds = {}) => {
  const templatePath = sysPath.join(rootPath, ".env.template")
  const destination = sysPath.join(rootPath, ".env")
  if (existsSync(templatePath)) {
    fs.renameSync(templatePath, destination)
  }

  const credentials = Object.assign({}, defaultDBCreds, dbCreds)
  let dbUrl = ""
  if (
    credentials.user !== defaultDBCreds.user ||
    credentials.password !== defaultDBCreds.password
  ) {
    dbUrl = `postgres://${credentials.user}:${credentials.password}@${credentials.host}:${credentials.port}/${dbName}`
  } else {
    dbUrl = `postgres://${credentials.host}:${credentials.port}/${dbName}`
  }

  fs.appendFileSync(destination, `DATABASE_URL=${dbUrl}\n`)
}

const runMigrations = async (rootPath) => {
  const migrationActivity = reporter.activity("Applying database migrations...")

  const cliPath = sysPath.join(
    `node_modules`,
    `@medusajs`,
    `medusa-cli`,
    `cli.js`
  )

  return await execa(cliPath, [`migrations`, `run`], {
    cwd: rootPath,
  })
    .then(() => {
      reporter.success(migrationActivity, "Database migrations completed.")
    })
    .catch((err) => {
      reporter.failure(
        migrationActivity,
        "Failed to migrate the database; you must complete the migration manually before starting your server."
      )
      console.error(err)
    })
}

const attemptSeed = async (rootPath) => {
  const seedActivity = reporter.activity("Seeding database")

  const pkgPath = sysPath.resolve(rootPath, "package.json")
  if (existsSync(pkgPath)) {
    const pkg = require(pkgPath)
    if (pkg.scripts && pkg.scripts.seed) {
      const proc = execa(getPackageManager(), [`run`, `seed`], {
        cwd: rootPath,
      })

      // Useful for development
      // proc.stdout.pipe(process.stdout)

      await proc
        .then(() => {
          reporter.success(seedActivity, "Seed completed")
        })
        .catch((err) => {
          reporter.failure(seedActivity, "Failed to complete seed; skipping")
          console.error(err)
        })
    } else {
      reporter.failure(
        seedActivity,
        "Starter doesn't provide a seed command; skipping."
      )
    }
  } else {
    reporter.failure(seedActivity, "Could not find package.json")
  }
}

/**
 * Main function that clones or copies the starter.
 */
export const newStarter = async (args) => {
  track("CLI_NEW")

  const {
    starter,
    root,
    skipDb,
    skipMigrations,
    skipEnv,
    seed,
    useDefaults,
    dbUser,
    dbDatabase,
    dbPass,
    dbPort,
    dbHost,
    v2,
  } = args

  const dbCredentials = removeUndefined({
    user: dbUser,
    database: dbDatabase,
    password: dbPass,
    port: dbPort,
    host: dbHost,
  })

  const { starterPath, rootPath, selectedOtherStarter } = await getPaths(
    starter,
    root,
    v2
  )

  const urlObject = url.parse(rootPath)

  if (selectedOtherStarter) {
    reporter.info(
      `Find the url of the Medusa starter you wish to create and run:

  medusa new ${rootPath} [url-to-starter]

`
    )
    return
  }

  if (urlObject.protocol && urlObject.host) {
    const isStarterAUrl =
      starter && !url.parse(starter).hostname && !url.parse(starter).protocol

    if (/medusa-starter/gi.test(rootPath) && isStarterAUrl) {
      reporter.panic({
        id: PanicId.InvalidProjectName,
        context: {
          starter,
          rootPath,
        },
      })
      return
    }

    reporter.panic({
      id: PanicId.InvalidProjectName,
      context: {
        rootPath,
      },
    })
    return
  }

  if (!isValid(rootPath)) {
    reporter.panic({
      id: PanicId.InvalidPath,
      context: {
        path: sysPath.resolve(rootPath),
      },
    })
    return
  }

  if (existsSync(sysPath.join(rootPath, `package.json`))) {
    reporter.panic({
      id: PanicId.AlreadyNodeProject,
      context: {
        rootPath,
      },
    })
    return
  }

  const hostedInfo = hostedGitInfo.fromUrl(starterPath)
  if (hostedInfo) {
    await clone(hostedInfo, rootPath, v2)
  } else {
    await copy(starterPath, rootPath)
  }

  track("CLI_NEW_LAYOUT_COMPLETED")

  let creds = dbCredentials

  const dbName = `medusa-db-${Math.random().toString(36).substring(2, 7)}` // generate a random 5-character suffix

  if (!useDefaults && !skipDb && !skipEnv) {
    creds = await interactiveDbCreds(dbName, dbCredentials)
  }

  if (creds === null) {
    reporter.info(
      "Skipping automatic database setup. Please note that you need to create a database and run migrations before you can run your Medusa backend"
    )
  } else {
    if (!skipDb) {
      track("CLI_NEW_SETUP_DB")
      await setupDB(dbName, creds)
    }

    if (!skipEnv) {
      track("CLI_NEW_SETUP_ENV")
      await setupEnvVars(rootPath, dbName, creds)
    }

    if (!skipMigrations) {
      track("CLI_NEW_RUN_MIGRATIONS")
      await runMigrations(rootPath)
    }

    if (seed) {
      track("CLI_NEW_SEED_DB")
      await attemptSeed(rootPath)
    }
  }

  if (!selectedOtherStarter) {
    reporter.info("Final project preparations...")
    // Remove demo files.
    clearProject(rootPath)
    // Remove the .git directory.
    fs.rmSync(sysPath.join(rootPath, ".git"), {
      recursive: true,
      force: true,
    })
  }

  successMessage(rootPath)
  track("CLI_NEW_SUCCEEDED")
}

@@ -0,0 +1,415 @@
import { sync as existsSync } from "fs-exists-cached"
import { setTelemetryEnabled } from "medusa-telemetry"
import path from "path"
import resolveCwd from "resolve-cwd"

import { didYouMean } from "./did-you-mean"
import { getLocalMedusaVersion } from "./util/version"

import { newStarter } from "./commands/new"
import reporter from "./reporter"

const yargs = require(`yargs`)

const handlerP =
  (fn) =>
  (...args) => {
    Promise.resolve(fn(...args)).then(
      () => process.exit(0),
      (err) => console.log(err)
    )
  }

function buildLocalCommands(cli, isLocalProject) {
  const defaultHost = `localhost`
  const defaultPort = `9000`
  const directory = path.resolve(`.`)

  const projectInfo = { directory }
  const useYarn = existsSync(path.join(directory, `yarn.lock`))

  if (isLocalProject) {
    projectInfo["sitePackageJson"] = require(path.join(
      directory,
      `package.json`
    ))
  }

  function resolveLocalCommand(command) {
    if (!isLocalProject) {
      cli.showHelp()
    }

    try {
      const cmdPath = resolveCwd.silent(
        `@medusajs/medusa/dist/commands/${command}`
      )!
      return require(cmdPath).default
    } catch (err) {
      if (!process.env.NODE_ENV?.startsWith("prod")) {
        console.log("--------------- ERROR ---------------------")
        console.log(err)
        console.log("-------------------------------------------")
      }
      cli.showHelp()
    }
  }

  function getCommandHandler(command, handler) {
    return (argv) => {
      const localCmd = resolveLocalCommand(command)
      const args = { ...argv, ...projectInfo, useYarn }

      return handler ? handler(args, localCmd) : localCmd(args)
    }
  }

  cli
    .command({
      command: `new [root] [starter]`,
      builder: (_) =>
        _.option(`seed`, {
          type: `boolean`,
          describe: `If flag is set the command will attempt to seed the database after setup.`,
          default: false,
        })
          .option(`y`, {
            type: `boolean`,
            alias: "useDefaults",
            describe: `If flag is set the command will not interactively collect database credentials`,
            default: false,
          })
          .option(`skip-db`, {
            type: `boolean`,
            describe: `If flag is set the command will not attempt to complete database setup`,
            default: false,
          })
          .option(`skip-migrations`, {
            type: `boolean`,
            describe: `If flag is set the command will not attempt to complete database migration`,
            default: false,
          })
          .option(`skip-env`, {
            type: `boolean`,
            describe: `If flag is set the command will not attempt to populate .env`,
            default: false,
          })
          .option(`db-user`, {
            type: `string`,
            describe: `The database user to use for database setup and migrations.`,
          })
          .option(`db-database`, {
            type: `string`,
            describe: `The database to use for database setup and migrations.`,
          })
          .option(`db-pass`, {
            type: `string`,
            describe: `The database password to use for database setup and migrations.`,
          })
          .option(`db-port`, {
            type: `number`,
            describe: `The database port to use for database setup and migrations.`,
          })
          .option(`db-host`, {
            type: `string`,
            describe: `The database host to use for database setup and migrations.`,
          })
          .option(`v2`, {
            type: `boolean`,
            describe: `Install Medusa with the V2 feature flag enabled. WARNING: Medusa V2 is still in development and shouldn't be used in production.`,
            default: false,
          }),
      desc: `Create a new Medusa project.`,
      handler: handlerP(newStarter),
    })
    .command({
      command: `telemetry`,
      describe: `Enable or disable collection of anonymous usage data.`,
      builder: (yargs) =>
        yargs
          .option(`enable`, {
            type: `boolean`,
            description: `Enable telemetry (default)`,
          })
          .option(`disable`, {
            type: `boolean`,
            description: `Disable telemetry`,
          }),

      handler: handlerP(({ enable, disable }) => {
        const enabled = Boolean(enable) || !disable
        setTelemetryEnabled(enabled)
        reporter.info(`Telemetry collection ${enabled ? `enabled` : `disabled`}`)
      }),
    })
    .command({
      command: `seed`,
      desc: `Migrates and populates the database with the provided file.`,
      builder: (_) =>
        _.option(`f`, {
          alias: `seed-file`,
          type: `string`,
          describe: `Path to the file where the seed is defined.`,
          required: true,
        }).option(`m`, {
          alias: `migrate`,
          type: `boolean`,
          default: true,
          describe: `Flag to indicate if migrations should be run prior to seeding the database`,
        }),
      handler: handlerP(
        getCommandHandler(`seed`, (args, cmd) => {
          process.env.NODE_ENV ??= `development`
          return cmd(args)
        })
      ),
    })
    .command({
      command: `migrations [action]`,
      desc: `Manage migrations from the core and your own project`,
      builder: {
        action: {
          demand: true,
          choices: ["run", "revert", "show"],
        },
      },
      handler: handlerP(
        getCommandHandler(`migrate`, (args, cmd) => {
          process.env.NODE_ENV = process.env.NODE_ENV || `development`
          return cmd(args)
        })
      ),
    })
    .command({
      command: `develop`,
      desc: `Start development server. Watches files and rebuilds when something changes`,
      builder: (_) =>
        _.option(`H`, {
          alias: `host`,
          type: `string`,
          default: defaultHost,
          describe: `Set host. Defaults to ${defaultHost}`,
        }).option(`p`, {
          alias: `port`,
          type: `string`,
          default: process.env.PORT || defaultPort,
          describe: process.env.PORT
            ? `Set port. Defaults to ${process.env.PORT} (set by env.PORT) (otherwise defaults to ${defaultPort})`
            : `Set port. Defaults to ${defaultPort}`,
        }),
      handler: handlerP(
        getCommandHandler(`develop`, (args, cmd) => {
          process.env.NODE_ENV = process.env.NODE_ENV || `development`
          cmd(args)
          // Return an empty promise to prevent handlerP from exiting early.
          // The development server shouldn't ever exit until the user directly
          // kills it so this is fine.
          return new Promise((resolve) => {})
        })
      ),
    })
    .command({
      command: `start`,
      desc: `Start development server.`,
      builder: (_) =>
        _.option(`H`, {
          alias: `host`,
          type: `string`,
          default: defaultHost,
          describe: `Set host. Defaults to ${defaultHost}`,
        }).option(`p`, {
          alias: `port`,
          type: `string`,
          default: process.env.PORT || defaultPort,
          describe: process.env.PORT
            ? `Set port. Defaults to ${process.env.PORT} (set by env.PORT) (otherwise defaults to ${defaultPort})`
            : `Set port. Defaults to ${defaultPort}`,
        }),
      handler: handlerP(
        getCommandHandler(`start`, (args, cmd) => {
          process.env.NODE_ENV = process.env.NODE_ENV || `development`
          cmd(args)
          // Return an empty promise to prevent handlerP from exiting early.
          // The development server shouldn't ever exit until the user directly
          // kills it so this is fine.
          return new Promise((resolve) => {})
        })
      ),
    })
    .command({
      command: `start-cluster`,
      desc: `Start development server in cluster mode (beta).`,
      builder: (_) =>
        _.option(`H`, {
          alias: `host`,
          type: `string`,
          default: defaultHost,
          describe: `Set host. Defaults to ${defaultHost}`,
        })
          .option(`p`, {
            alias: `port`,
            type: `string`,
            default: process.env.PORT || defaultPort,
            describe: process.env.PORT
              ? `Set port. Defaults to ${process.env.PORT} (set by env.PORT) (otherwise defaults to ${defaultPort})`
              : `Set port. Defaults to ${defaultPort}`,
          })
          .option(`c`, {
            alias: `cpus`,
            type: `number`,
            default: process.env.CPUS,
            describe:
              "Set number of cpus to use. Defaults to max number of cpus available on the system (set by env.CPUS)",
          }),
      handler: handlerP(
        getCommandHandler(`start-cluster`, (args, cmd) => {
          process.env.NODE_ENV = process.env.NODE_ENV || `development`
          cmd(args)
          // Return an empty promise to prevent handlerP from exiting early.
          // The development server shouldn't ever exit until the user directly
          // kills it so this is fine.
          return new Promise((resolve) => {})
        })
      ),
    })
    .command({
      command: `user`,
      desc: `Create a user`,
      builder: (_) =>
        _.option(`e`, {
          alias: `email`,
          type: `string`,
          describe: `User's email.`,
        })
          .option(`p`, {
            alias: `password`,
            type: `string`,
            describe: `User's password.`,
          })
          .option(`i`, {
            alias: `id`,
            type: `string`,
            describe: `User's id.`,
          })
          .option(`invite`, {
            type: `boolean`,
            describe: `If flag is set, an invitation will be created instead of a new user and the invite token will be returned.`,
            default: false,
          }),
      handler: handlerP(
        getCommandHandler(`user`, (args, cmd) => {
          cmd(args)
          // Return an empty promise to prevent handlerP from exiting early.
          // The command shouldn't exit until user creation has finished.
          return new Promise((resolve) => {})
        })
      ),
    })
}

function isLocalMedusaProject() {
  let inMedusaProject = false

  try {
    const { dependencies, devDependencies } = require(path.resolve(
      `./package.json`
    ))
    inMedusaProject = !!(
      (dependencies && dependencies["@medusajs/medusa"]) ||
      (devDependencies && devDependencies["@medusajs/medusa"])
    )
  } catch (err) {
    // ignore
  }

  return inMedusaProject
}

function getVersionInfo() {
  const { version } = require(`../package.json`)
  const isMedusaProject = isLocalMedusaProject()
  if (isMedusaProject) {
    let medusaVersion = getLocalMedusaVersion()

    if (!medusaVersion) {
      medusaVersion = `unknown`
    }

    return `Medusa CLI version: ${version}
Medusa version: ${medusaVersion}
  Note: this is the Medusa version for the site at: ${process.cwd()}`
  } else {
    return `Medusa CLI version: ${version}`
  }
}

export default (argv) => {
  const cli = yargs()
  const isLocalProject = isLocalMedusaProject()

  cli
    .scriptName(`medusa`)
    .usage(`Usage: $0 <command> [options]`)
    .alias(`h`, `help`)
    .alias(`v`, `version`)
    .option(`verbose`, {
      default: false,
      type: `boolean`,
      describe: `Turn on verbose output`,
      global: true,
    })
    .option(`no-color`, {
      alias: `no-colors`,
      default: false,
      type: `boolean`,
      describe: `Turn off the color in output`,
      global: true,
    })
    .option(`json`, {
      describe: `Turn on the JSON logger`,
      default: false,
      type: `boolean`,
      global: true,
    })

  buildLocalCommands(cli, isLocalProject)

  try {
    cli.version(
      `version`,
      `Show the version of the Medusa CLI and the Medusa package in the current project`,
      getVersionInfo()
    )
  } catch (e) {
    // ignore
  }

  return cli
    .wrap(cli.terminalWidth())
    .demandCommand(1, `Pass --help to see all available commands and options.`)
    .strict()
    .fail((msg, err, yargs) => {
      const availableCommands = yargs.getCommands().map((commandDescription) => {
        const [command] = commandDescription
        return command.split(` `)[0]
      })
      const arg = argv.slice(2)[0]
      const suggestion = arg ? didYouMean(arg, availableCommands) : ``

      if (!process.env.NODE_ENV?.startsWith("prod")) {
        console.log("--------------- ERROR ---------------------")
        console.log(err)
        console.log("-------------------------------------------")
      }

      cli.showHelp()
      reporter.info(suggestion)
      reporter.info(msg)
    })
    .parse(argv.slice(2))
}

@@ -0,0 +1,18 @@
import meant from "meant"

export function didYouMean(scmd, commands): string {
  const bestSimilarity = meant(scmd, commands).map((str) => {
    return ` ${str}`
  })

  if (bestSimilarity.length === 0) return ``
  if (bestSimilarity.length === 1) {
    return `\nDid you mean this?\n ${bestSimilarity[0]}\n`
  } else {
    return (
      [`\nDid you mean one of these?`]
        .concat(bestSimilarity.slice(0, 3))
        .join(`\n`) + `\n`
    )
  }
}

@@ -0,0 +1,44 @@
#!/usr/bin/env node

import "core-js/stable"
import "regenerator-runtime/runtime"
import os from "os"
import util from "util"
import createCli from "./create-cli"

const useJsonLogger = process.argv.slice(2).some((arg) => arg.includes(`json`))

if (useJsonLogger) {
  process.env.GATSBY_LOGGER = `json`
}

// Ensure stable runs on Windows when started from different shells (i.e. c:\dir vs C:\dir)
if (os.platform() === `win32`) {
  // ensureWindowsDriveLetterIsUppercase()
}

// Check if update is available
// updateNotifier({ pkg }).notify({ isGlobal: true })

const MIN_NODE_VERSION = `10.13.0`

process.on(`unhandledRejection`, (reason) => {
  // This will exit the process in newer Node anyway, so let's be consistent
  // across versions and crash.

  // reason can be anything, it can be a message, an object, ANYTHING!
  // We convert it to an error object, so we don't crash on structured error validation.
  if (!(reason instanceof Error)) {
    reason = new Error(util.format(reason))
  }

  console.log(reason)
  // report.panic(`UNHANDLED REJECTION`, reason as Error)
})

process.on(`uncaughtException`, (error) => {
  console.log(error)
  // report.panic(`UNHANDLED EXCEPTION`, error)
})

createCli(process.argv)

@@ -0,0 +1,24 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP

exports[`Reporter handles "Error" signature correctly 1`] = `
Object {
  "level": "error",
  "message": "Error",
  "stack": Any<Array>,
}
`;

exports[`Reporter handles "String" signature correctly 1`] = `
Object {
  "level": "error",
  "message": "Test log",
}
`;

exports[`Reporter handles "String, Error" signature correctly 1`] = `
Object {
  "level": "error",
  "message": "Test log",
  "stack": Any<Array>,
}
`;

@@ -0,0 +1,49 @@
import logger, { Reporter } from "../"

describe(`Reporter`, () => {
  const winstonMock = {
    log: jest.fn(),
  }

  const reporter = new Reporter({
    logger: winstonMock,
    activityLogger: {},
  })

  const getErrorMessages = (fn) =>
    fn.mock.calls
      .map(([firstArg]) => firstArg)
      .filter((structuredMessage) => structuredMessage.level === `error`)

  beforeEach(() => {
    winstonMock.log.mockClear()
  })

  it(`handles "String" signature correctly`, () => {
    reporter.error("Test log")

    const generated = getErrorMessages(winstonMock.log)[0]

    expect(generated).toMatchSnapshot()
  })

  it(`handles "String, Error" signature correctly`, () => {
    reporter.error("Test log", new Error("String Error"))

    const generated = getErrorMessages(winstonMock.log)[0]

    expect(generated).toMatchSnapshot({
      stack: expect.any(Array),
    })
  })

  it(`handles "Error" signature correctly`, () => {
    reporter.error(new Error("Error"))

    const generated = getErrorMessages(winstonMock.log)[0]

    expect(generated).toMatchSnapshot({
      stack: expect.any(Array),
    })
  })
})

@@ -0,0 +1,332 @@
import stackTrace from "stack-trace"
import { ulid } from "ulid"
import winston from "winston"
import ora from "ora"
import { track } from "medusa-telemetry"

import { panicHandler } from "./panic-handler"
import * as Transport from "winston-transport"

const LOG_LEVEL = process.env.LOG_LEVEL || "silly"
const LOG_FILE = process.env.LOG_FILE || ""
const NODE_ENV = process.env.NODE_ENV || "development"
const IS_DEV = NODE_ENV.startsWith("dev")

let transports: Transport[] = []

if (!IS_DEV) {
  transports.push(new winston.transports.Console())
} else {
  transports.push(
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.cli(),
        winston.format.splat()
      ),
    })
  )
}

if (LOG_FILE) {
  transports.push(
    new winston.transports.File({
      filename: LOG_FILE,
    })
  )
}

const loggerInstance = winston.createLogger({
  level: LOG_LEVEL,
  levels: winston.config.npm.levels,
  format: winston.format.combine(
    winston.format.timestamp({
      format: "YYYY-MM-DD HH:mm:ss",
    }),
    winston.format.errors({ stack: true }),
    winston.format.splat(),
    winston.format.json()
  ),
  transports,
})

export class Reporter {
  protected activities_: Record<string, any>
  protected loggerInstance_: winston.Logger
  protected ora_: typeof ora

  constructor({ logger, activityLogger }) {
    this.activities_ = {}
    this.loggerInstance_ = logger
    this.ora_ = activityLogger
  }

  panic = (data) => {
    const parsedPanic = panicHandler(data)

    this.loggerInstance_.log({
      level: "error",
      details: data,
      message: parsedPanic.message,
    })

    track("PANIC_ERROR_REACHED", {
      id: data.id,
    })

    process.exit(1)
  }

  /**
   * Determines if the logger should log at a given level.
   * @param {string} level - the level to check if logger is configured for
   * @return {boolean} whether we should log
   */
  shouldLog = (level) => {
    level = this.loggerInstance_.levels[level]
    const logLevel = this.loggerInstance_.levels[this.loggerInstance_.level]
    return level <= logLevel
  }

  /**
   * Sets the log level of the logger.
   * @param {string} level - the level to set the logger to
   */
  setLogLevel = (level) => {
    this.loggerInstance_.level = level
|
||||
}
|
||||
|
||||
/**
|
||||
* Resets the logger to the value specified by the LOG_LEVEL env var. If no
|
||||
* LOG_LEVEL is set it defaults to "silly".
|
||||
*/
|
||||
unsetLogLevel = () => {
|
||||
this.loggerInstance_.level = LOG_LEVEL
|
||||
}
|
||||
|
||||
/**
|
||||
* Begin an activity. In development an activity is displayed as a spinner;
|
||||
* in other environments it will log the activity at the info level.
|
||||
* @param {string} message - the message to log the activity under
|
||||
* @param {any} config
|
||||
* @returns {string} the id of the activity; this should be passed to do
|
||||
* further operations on the activity such as success, failure, progress.
|
||||
*/
|
||||
activity = (message, config = {}) => {
|
||||
const id = ulid()
|
||||
if (IS_DEV && this.shouldLog("info")) {
|
||||
const activity = this.ora_(message).start()
|
||||
|
||||
this.activities_[id] = {
|
||||
activity,
|
||||
config,
|
||||
start: Date.now(),
|
||||
}
|
||||
|
||||
return id
|
||||
} else {
|
||||
this.activities_[id] = {
|
||||
start: Date.now(),
|
||||
config,
|
||||
}
|
||||
this.loggerInstance_.log({
|
||||
activity_id: id,
|
||||
level: "info",
|
||||
config,
|
||||
message,
|
||||
})
|
||||
|
||||
return id
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Reports progress on an activity. In development this will update the
|
||||
* activity log message, in other environments a log message will be issued
|
||||
* at the info level. Logging will include the activityId.
|
||||
* @param {string} activityId - the id of the activity as returned by activity
|
||||
* @param {string} message - the message to log
|
||||
*/
|
||||
progress = (activityId, message) => {
|
||||
const toLog = {
|
||||
level: "info",
|
||||
message,
|
||||
}
|
||||
|
||||
if (typeof activityId === "string" && this.activities_[activityId]) {
|
||||
const activity = this.activities_[activityId]
|
||||
if (activity.activity) {
|
||||
activity.text = message
|
||||
} else {
|
||||
toLog["activity_id"] = activityId
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
} else {
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Logs an error. If an error object is provided the stack trace for the error
|
||||
* will also be logged.
|
||||
* @param {String | Error} messageOrError - can either be a string with a
|
||||
* message to log the error under; or an error object.
|
||||
* @param {Error?} error - an error object to log message with
|
||||
*/
|
||||
error = (messageOrError, error = null) => {
|
||||
let message = messageOrError
|
||||
if (typeof messageOrError === "object") {
|
||||
message = messageOrError.message
|
||||
error = messageOrError
|
||||
}
|
||||
|
||||
const toLog = {
|
||||
level: "error",
|
||||
message,
|
||||
}
|
||||
|
||||
if (error) {
|
||||
toLog["stack"] = stackTrace.parse(error)
|
||||
}
|
||||
|
||||
this.loggerInstance_.log(toLog)
|
||||
|
||||
// Give stack traces and details in dev
|
||||
if (error && IS_DEV) {
|
||||
console.log(error)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Reports failure of an activity. In development the activity will be udpated
|
||||
* with the failure message in other environments the failure will be logged
|
||||
* at the error level.
|
||||
* @param {string} activityId - the id of the activity as returned by activity
|
||||
* @param {string} message - the message to log
|
||||
* @returns {object} data about the activity
|
||||
*/
|
||||
failure = (activityId, message) => {
|
||||
const time = Date.now()
|
||||
const toLog = {
|
||||
level: "error",
|
||||
message,
|
||||
}
|
||||
|
||||
if (typeof activityId === "string" && this.activities_[activityId]) {
|
||||
const activity = this.activities_[activityId]
|
||||
if (activity.activity) {
|
||||
activity.activity.fail(`${message} – ${time - activity.start}`)
|
||||
} else {
|
||||
toLog["duration"] = time - activity.start
|
||||
toLog["activity_id"] = activityId
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
} else {
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
|
||||
if (this.activities_[activityId]) {
|
||||
const activity = this.activities_[activityId]
|
||||
return {
|
||||
...activity,
|
||||
duration: time - activity.start,
|
||||
}
|
||||
}
|
||||
|
||||
return null
|
||||
}
|
||||
|
||||
/**
|
||||
* Reports success of an activity. In development the activity will be udpated
|
||||
* with the failure message in other environments the failure will be logged
|
||||
* at the info level.
|
||||
* @param {string} activityId - the id of the activity as returned by activity
|
||||
* @param {string} message - the message to log
|
||||
* @returns {Record<string, any>} data about the activity
|
||||
*/
|
||||
success = (activityId, message) => {
|
||||
const time = Date.now()
|
||||
const toLog = {
|
||||
level: "info",
|
||||
message,
|
||||
}
|
||||
|
||||
if (typeof activityId === "string" && this.activities_[activityId]) {
|
||||
const activity = this.activities_[activityId]
|
||||
if (activity.activity) {
|
||||
activity.activity.succeed(`${message} – ${time - activity.start}ms`)
|
||||
} else {
|
||||
toLog["duration"] = time - activity.start
|
||||
toLog["activity_id"] = activityId
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
} else {
|
||||
this.loggerInstance_.log(toLog)
|
||||
}
|
||||
|
||||
if (this.activities_[activityId]) {
|
||||
const activity = this.activities_[activityId]
|
||||
return {
|
||||
...activity,
|
||||
duration: time - activity.start,
|
||||
}
|
||||
}
|
||||
|
||||
return null
|
||||
}
|
||||
|
||||
/**
|
||||
* Logs a message at the info level.
|
||||
* @param {string} message - the message to log
|
||||
*/
|
||||
debug = (message) => {
|
||||
this.loggerInstance_.log({
|
||||
level: "debug",
|
||||
message,
|
||||
})
|
||||
}
|
||||
|
||||
/**
|
||||
* Logs a message at the info level.
|
||||
* @param {string} message - the message to log
|
||||
*/
|
||||
info = (message) => {
|
||||
this.loggerInstance_.log({
|
||||
level: "info",
|
||||
message,
|
||||
})
|
||||
}
|
||||
|
||||
/**
|
||||
* Logs a message at the warn level.
|
||||
* @param {string} message - the message to log
|
||||
*/
|
||||
warn = (message) => {
|
||||
this.loggerInstance_.warn({
|
||||
level: "warn",
|
||||
message,
|
||||
})
|
||||
}
|
||||
|
||||
/**
|
||||
* A wrapper around winston's log method.
|
||||
*/
|
||||
log = (...args) => {
|
||||
if (args.length > 1) {
|
||||
// @ts-ignore
|
||||
this.loggerInstance_.log(...args)
|
||||
} else {
|
||||
let message = args[0]
|
||||
this.loggerInstance_.log({
|
||||
level: "info",
|
||||
message,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const logger = new Reporter({
|
||||
logger: loggerInstance,
|
||||
activityLogger: ora,
|
||||
})
|
||||
|
||||
export default logger
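The level comparison in `shouldLog` relies on winston's npm level numbering, where more severe levels have lower numbers. A minimal standalone sketch of that comparison (the level table below is winston's documented npm default):

```javascript
// winston's default npm levels: a lower number means more severe.
const npmLevels = { error: 0, warn: 1, info: 2, http: 3, verbose: 4, debug: 5, silly: 6 }

// A message is emitted when its level is at least as severe as the
// configured level, i.e. its number is <= the configured level's number.
function shouldLog(messageLevel, configuredLevel) {
  return npmLevels[messageLevel] <= npmLevels[configuredLevel]
}

console.log(shouldLog("info", "silly")) // true: info is more severe than silly
console.log(shouldLog("debug", "warn")) // false: debug is less severe than warn
```

With the default level of `"silly"`, every message passes the check; setting the level to `"warn"` silences info and below.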
@@ -0,0 +1,35 @@
export type PanicData = {
  id: string
  context: {
    rootPath: string
    path: string
  }
}

export enum PanicId {
  InvalidProjectName = "10000",
  InvalidPath = "10002",
  AlreadyNodeProject = "10003",
}

export const panicHandler = (panicData: PanicData = {} as PanicData) => {
  const { id, context } = panicData
  switch (id) {
    case PanicId.InvalidProjectName:
      return {
        message: `Looks like you provided a URL as your project name. Try "medusa new my-medusa-store ${context.rootPath}" instead.`,
      }
    case PanicId.InvalidPath:
      return {
        message: `Could not create project because ${context.path} is not a valid path.`,
      }
    case PanicId.AlreadyNodeProject:
      return {
        message: `Directory ${context.rootPath} is already a Node project.`,
      }
    default:
      return {
        message: "Unknown error",
      }
  }
}
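For illustration, here is how a caller might exercise `panicHandler` with a `PanicData` payload, re-sketched in plain JavaScript with only the invalid-path branch reproduced (the path below is made up):

```javascript
// Plain-JS sketch of the panicHandler switch above (one branch only).
const panicHandler = ({ id, context } = {}) => {
  switch (id) {
    case "10002": // PanicId.InvalidPath
      return {
        message: `Could not create project because ${context.path} is not a valid path.`,
      }
    default:
      return { message: "Unknown error" }
  }
}

// Reporter.panic logs the resolved message at the error level and exits.
const { message } = panicHandler({
  id: "10002",
  context: { rootPath: "/tmp", path: "not//a//valid//path" },
})
console.log(message)
```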
@@ -0,0 +1,25 @@
import fs from "fs"
import glob from "glob"
import path from "path"

export function clearProject(directory: string) {
  const adminFiles = glob.sync(path.join(directory, `src`, `admin/**/*`))
  const onboardingFiles = glob.sync(
    path.join(directory, `src`, `**/onboarding/`)
  )
  const typeFiles = glob.sync(path.join(directory, `src`, `types`))
  const srcFiles = glob.sync(
    path.join(directory, `src`, `**/*.{ts,tsx,js,jsx}`)
  )

  const files = [...adminFiles, ...onboardingFiles, ...typeFiles, ...srcFiles]

  files.forEach((file) =>
    fs.rmSync(file, {
      recursive: true,
      force: true,
    })
  )

  // add an empty TypeScript file to avoid build errors
  fs.openSync(path.join(directory, "src", "index.ts"), "w")
}
@@ -0,0 +1,15 @@
import ConfigStore from "configstore"
import reporter from "../reporter"

const config = new ConfigStore(`medusa`, {}, { globalConfigPath: true })

const packageManagerConfigKey = `cli.packageManager`

export const getPackageManager = () => {
  return config.get(packageManagerConfigKey)
}

export const setPackageManager = (packageManager) => {
  config.set(packageManagerConfigKey, packageManager)
  reporter.info(`Preferred package manager set to "${packageManager}"`)
}
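A sketch of the preference helpers with a `Map` standing in for ConfigStore (the real module persists the value to a global config file on disk):

```javascript
// In-memory stand-in for ConfigStore.
const store = new Map()
const packageManagerConfigKey = "cli.packageManager"

const getPackageManager = () => store.get(packageManagerConfigKey)
const setPackageManager = (pm) => store.set(packageManagerConfigKey, pm)

setPackageManager("yarn")
console.log(getPackageManager()) // yarn
```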
@@ -0,0 +1,5 @@
import { getMedusaVersion } from "medusa-core-utils"

export const getLocalMedusaVersion = (): string => {
  return getMedusaVersion()
}
@@ -0,0 +1,29 @@
{
  "compilerOptions": {
    "lib": ["es2020"],
    "target": "es2020",
    "outDir": "./dist",
    "esModuleInterop": true,
    "declaration": true,
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "sourceMap": true,
    "noImplicitReturns": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "noImplicitThis": true,
    "allowJs": true,
    "skipLibCheck": true,
    "downlevelIteration": true // to use ES5 specific tooling
  },
  "include": ["src"],
  "exclude": [
    "dist",
    "./src/**/__tests__",
    "./src/**/__mocks__",
    "./src/**/__fixtures__",
    "node_modules"
  ]
}
@@ -0,0 +1,5 @@
{
  "extends": "./tsconfig.json",
  "include": ["src"],
  "exclude": ["node_modules"]
}
@@ -0,0 +1,33 @@
# Logs
logs
*.log

# Runtime data
pids
*.pid
*.seed

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage

# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# node-waf configuration
.lock-wscript

# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release

# Dependency directory
# https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git
node_modules

decls
dist

# verdaccio local storage
verdaccio
@@ -0,0 +1,157 @@
# Change Log

## 0.0.32

### Patch Changes

- [#4421](https://github.com/medusajs/medusa/pull/4421) [`5a4580b6a`](https://github.com/medusajs/medusa/commit/5a4580b6a8c305b7803c8d929843ca1f39677404) Thanks [@adrien2p](https://github.com/adrien2p)! - chore(medusa-dev-cli): Cleanup plugin setup

## 0.0.31

### Patch Changes

- [#3293](https://github.com/medusajs/medusa/pull/3293) [`e8e7d7bb5`](https://github.com/medusajs/medusa/commit/e8e7d7bb5355877b3d0b85a079b6b57cfe7391c9) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - fix(medusa-dev): include packages/ subdirectories in discovery

## 0.0.31-rc.0

### Patch Changes

- [#3293](https://github.com/medusajs/medusa/pull/3293) [`e8e7d7bb5`](https://github.com/medusajs/medusa/commit/e8e7d7bb5355877b3d0b85a079b6b57cfe7391c9) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - fix(medusa-dev): include packages/ subdirectories in discovery

## 0.0.30

### Patch Changes

- [#3217](https://github.com/medusajs/medusa/pull/3217) [`8c5219a31`](https://github.com/medusajs/medusa/commit/8c5219a31ef76ee571fbce84d7d57a63abe56eb0) Thanks [@adrien2p](https://github.com/adrien2p)! - chore: Fix npm packages files included

## 0.0.29

### Patch Changes

- [#3185](https://github.com/medusajs/medusa/pull/3185) [`08324355a`](https://github.com/medusajs/medusa/commit/08324355a4466b017a0bc7ab1d333ee3cd27b8c4) Thanks [@olivermrbl](https://github.com/olivermrbl)! - chore: Patches all dependencies + minor bumps `winston` to include a [fix for a significant memory leak](https://github.com/winstonjs/winston/pull/2057)

## 0.0.28

### Patch Changes

- [#3025](https://github.com/medusajs/medusa/pull/3025) [`93d0dc1bd`](https://github.com/medusajs/medusa/commit/93d0dc1bdcb54cf6e87428a7bb9b0dac196b4de2) Thanks [@adrien2p](https://github.com/adrien2p)! - fix(medusa): test, build and watch scripts

## 0.0.27

### Patch Changes

- [#2360](https://github.com/medusajs/medusa/pull/2360) [`196595cb6`](https://github.com/medusajs/medusa/commit/196595cb651d058b7da8711604ba57e5c0ee8a55) Thanks [@srindom](https://github.com/srindom)! - Avoid dev cli auth

All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

## [0.0.26](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.24...medusa-dev-cli@0.0.26) (2022-07-05)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.25](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.24...medusa-dev-cli@0.0.25) (2022-07-05)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.24](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.23...medusa-dev-cli@0.0.24) (2021-12-08)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.23](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.22...medusa-dev-cli@0.0.23) (2021-11-11)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.22](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.20...medusa-dev-cli@0.0.22) (2021-10-18)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.21](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.20...medusa-dev-cli@0.0.21) (2021-10-18)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.20](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.19...medusa-dev-cli@0.0.20) (2021-09-15)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.19](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.18...medusa-dev-cli@0.0.19) (2021-09-14)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.18](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.17...medusa-dev-cli@0.0.18) (2021-07-26)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.17](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.15...medusa-dev-cli@0.0.17) (2021-07-15)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.16](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.15...medusa-dev-cli@0.0.16) (2021-07-15)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.15](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.14...medusa-dev-cli@0.0.15) (2021-07-02)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.14](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.13...medusa-dev-cli@0.0.14) (2021-06-22)

### Bug Fixes

- release assist ([668e8a7](https://github.com/medusajs/medusa/commit/668e8a740200847fc2a41c91d2979097f1392532))

## [0.0.13](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.12...medusa-dev-cli@0.0.13) (2021-06-09)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.12](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.11...medusa-dev-cli@0.0.12) (2021-06-09)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.11](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.10...medusa-dev-cli@0.0.11) (2021-06-09)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.10](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.9...medusa-dev-cli@0.0.10) (2021-06-09)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.9](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.8...medusa-dev-cli@0.0.9) (2021-06-08)

### Bug Fixes

- babel ([81960d5](https://github.com/medusajs/medusa/commit/81960d51812f093e04271f50ffe5de9bce17c06b))

## [0.0.8](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.5...medusa-dev-cli@0.0.8) (2021-04-28)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.7](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.6...medusa-dev-cli@0.0.7) (2021-04-20)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.6](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.5...medusa-dev-cli@0.0.6) (2021-04-20)

**Note:** Version bump only for package medusa-dev-cli

## [0.0.5](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.4...medusa-dev-cli@0.0.5) (2021-04-13)

### Bug Fixes

- working ([a9b1d75](https://github.com/medusajs/medusa/commit/a9b1d75074d2786df6dfca9064b3d9657a664d6d))
- yarn lon ([3c49467](https://github.com/medusajs/medusa/commit/3c4946762c25220c18913f46537f777a55a209ec))

## [0.0.4](https://github.com/medusajs/medusa/compare/medusa-dev-cli@0.0.3...medusa-dev-cli@0.0.4) (2021-03-17)

**Note:** Version bump only for package medusa-dev-cli

## 0.0.3 (2021-03-17)

### Features

- dev cli ([#203](https://github.com/medusajs/medusa/issues/203)) ([695b1fd](https://github.com/medusajs/medusa/commit/695b1fd0a54a247502cb48ffb73d060356293b76)), closes [#199](https://github.com/medusajs/medusa/issues/199)

## 0.0.2 (2021-03-17)

### Features

- dev cli ([#203](https://github.com/medusajs/medusa/issues/203)) ([695b1fd](https://github.com/medusajs/medusa/commit/695b1fd0a54a247502cb48ffb73d060356293b76)), closes [#199](https://github.com/medusajs/medusa/issues/199)
@@ -0,0 +1,74 @@
# medusa-dev-cli

A command-line tool for local Medusa development. When doing development work
on Medusa core, this tool lets you copy your changes to the various Medusa
packages into Medusa projects.

## Install

`npm install -g medusa-dev-cli`

## Configuration / First time setup

The medusa-dev-cli tool needs to know where your cloned Medusa repository is
located. You typically only need to configure this once.

`medusa-dev --set-path-to-repo /path/to/my/cloned/version/medusa`

## How to use

Navigate to the project you want to link to your forked Medusa repository and
run:

`medusa-dev`

The tool will then scan your project's package.json to find its Medusa
dependencies and copy the latest source from your cloned version of Medusa into
your project's node_modules folder. A watch task is then created to re-copy any
modules that might change while you're working on the code, so you can leave
this program running.
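The dependency scan described above boils down to filtering package.json dependency names. A heuristic sketch (the dependency map and the matching rule below are illustrative, not the CLI's actual logic):

```javascript
// Illustrative dependency map, as it might appear in a project's package.json.
const dependencies = {
  "@medusajs/medusa": "^1.0.0",
  "medusa-interfaces": "^1.0.0",
  react: "^18.0.0",
}

// Keep names that look like Medusa packages (heuristic).
const medusaDeps = Object.keys(dependencies).filter(
  (name) => name.startsWith("@medusajs/") || name.startsWith("medusa-")
)

console.log(medusaDeps) // [ '@medusajs/medusa', 'medusa-interfaces' ]
```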
Typically you'll also want to run `npm run watch` in the Medusa repo to set up
watchers that build the Medusa source code.

## Revert to current packages

If you've recently run `medusa-dev`, your `node_modules` will be out of sync
with the currently published packages. To undo this, remove the `node_modules`
directory or run:

```shell
git checkout package.json; yarn --force
```

or

```shell
git checkout package.json; npm install --force
```

### Other commands

#### `--packages`

You can skip the automatic dependency scan and instead specify a list of
packages you want to link by using the `--packages` option:

`medusa-dev --packages @medusajs/medusa medusa-interfaces`

#### `--scan-once`

With this flag, the tool does an initial scan and copy and then quits. This is
useful for setting up automated testing/builds of Medusa projects from the
latest code.

#### `--quiet`

Don't output anything except for a success message when used together with
`--scan-once`.

#### `--copy-all`

Copy all modules/files under packages/ in the Medusa source repo.

#### `--force-install`

Disables copying files into node_modules and forces usage of the local npm
registry instead.
@@ -0,0 +1,13 @@
module.exports = {
  globals: {
    "ts-jest": {
      tsconfig: "tsconfig.spec.json",
      isolatedModules: false,
    },
  },
  transform: {
    "^.+\\.[jt]s$": "ts-jest",
  },
  testEnvironment: `node`,
  moduleFileExtensions: [`js`, `jsx`, `ts`, `tsx`, `json`],
}
@@ -0,0 +1,54 @@
{
  "name": "medusa-dev-cli",
  "description": "CLI helpers for contributors working on Medusa",
  "version": "0.0.32",
  "author": "Sebastian Rindom <skrindom@gmail.com>",
  "bin": {
    "medusa-dev": "./dist/index.js"
  },
  "files": [
    "dist"
  ],
  "dependencies": {
    "chokidar": "^3.5.3",
    "configstore": "^5.0.1",
    "del": "^6.0.0",
    "execa": "^4.1.0",
    "find-yarn-workspace-root": "^2.0.0",
    "fs-extra": "^9.0.1",
    "glob": "^8.1.0",
    "got": "^11.8.6",
    "is-absolute": "^1.0.0",
    "lodash": "^4.17.21",
    "signal-exit": "^3.0.7",
    "verdaccio": "^4.10.0",
    "yargs": "^15.4.1"
  },
  "devDependencies": {
    "cross-env": "^7.0.3",
    "jest": "^25.5.4",
    "ts-jest": "^25.5.1",
    "typescript": "^4.4.4"
  },
  "homepage": "https://github.com/medusajs/medusa/tree/master/packages/medusa-dev-cli#readme",
  "keywords": [
    "medusa"
  ],
  "license": "MIT",
  "main": "index.js",
  "repository": {
    "type": "git",
    "url": "https://github.com/medusajs/medusa.git",
    "directory": "packages/medusa-dev-cli"
  },
  "scripts": {
    "prepublishOnly": "cross-env NODE_ENV=production tsc --build",
    "test": "jest --passWithNoTests -- src",
    "build": "tsc",
    "watch": "tsc --watch"
  },
  "engines": {
    "node": ">=16"
  },
  "gitHead": "41a5425405aea5045a26def95c0dc00cf4a5a44d"
}
@@ -0,0 +1,178 @@
import path from "path"
import glob from "glob"
import fs from "fs"
import Configstore from "configstore"
import { kebabCase, snakeCase } from "lodash"
import { featureFlagTemplate } from "./template"
import pkg from "../../package.json"

export const buildFFCli = (cli) => {
  cli.command({
    command: `ff`,
    desc: "Manage Medusa feature flags",
    builder: (yargs) => {
      yargs
        .command({
          command: "create <name>",
          desc: "Create a new feature flag",
          builder: {
            name: {
              demandOption: true,
              coerce: (name) => kebabCase(name),
              description: "Name of the feature flag",
              type: "string",
            },
            description: {
              alias: "d",
              demandOption: true,
              description: "Description of the feature flag",
              type: "string",
            },
          },
          handler: async (argv) => {
            const medusaLocation = getRepoRoot()
            const featureFlagPath = buildPath(argv.name, medusaLocation)

            if (fs.existsSync(featureFlagPath)) {
              console.error(`Feature flag already exists: ${featureFlagPath}`)
              return
            }

            const flagSettings = collectSettings(argv.name, argv.description)
            writeFeatureFlag(flagSettings, featureFlagPath)
          },
        })
        .command({
          command: "list",
          desc: "List available feature flags",
          handler: async () => {
            const medusaLocation = getRepoRoot()
            const flagGlob = buildFlagsGlob(medusaLocation)

            const featureFlags = glob.sync(flagGlob, {
              ignore: ["**/index.*"],
            })
            const flagData = featureFlags.map((flag) => {
              const flagSettings = readFeatureFlag(flag)
              return {
                ...flagSettings,
                file_name: path.basename(flag, ".js"),
              }
            })

            console.table(flagData)
          },
        })
        .command({
          command: "delete <name>",
          desc: "Delete a feature flag",
          builder: {
            name: {
              demandOption: true,
              coerce: (name) => kebabCase(name),
              description: "Name of the feature flag",
              type: "string",
            },
          },
          handler: async (argv) => {
            const medusaLocation = getRepoRoot()
            const featureFlagPath = buildPath(argv.name, medusaLocation)

            if (fs.existsSync(featureFlagPath)) {
              fs.unlinkSync(featureFlagPath)
            }

            console.log(`Feature flag deleted: ${featureFlagPath}`)
          },
        })
        .demandCommand(1, "Please specify an action")
    },
  })
}

const getRepoRoot = () => {
  const conf = new Configstore(pkg.name)
  const medusaLocation = conf.get(`medusa-location`)

  if (!medusaLocation) {
    console.error(
      `
You haven't set the path to your cloned version of medusa yet.
Do so now by running:

  medusa-dev --set-path-to-repo /path/to/my/cloned/version/medusa
`
    )
    process.exit()
  }

  return medusaLocation
}

const readFeatureFlag = (flagPath) => {
  const flagSettings = require(flagPath).default
  return flagSettings
}

const buildFlagsGlob = (repoRoot) => {
  return path.join(
    repoRoot,
    "packages",
    "medusa",
    "dist",
    "loaders",
    "feature-flags",
    `*.js`
  )
}

const buildPath = (kebabCaseName, repoRoot) => {
  return path.join(
    repoRoot,
    "packages",
    "medusa",
    "src",
    "loaders",
    "feature-flags",
    `${kebabCaseName}.ts`
  )
}

const collectSettings = (name, description) => {
  const snakeCaseName = snakeCase(name)
  return {
    key: snakeCaseName,
    description: description,
    defaultValue: false,
    envKey: `MEDUSA_FF_${snakeCaseName.toUpperCase()}`,
  }
}

const writeFeatureFlag = (settings, featureFlagPath) => {
  const featureFlag = featureFlagTemplate(settings)
  fs.writeFileSync(featureFlagPath, featureFlag)
  logFeatureFlagUsage(featureFlagPath, settings)
}

const logFeatureFlagUsage = (flagPath, flagSettings) => {
  console.log(`Feature flag created: ${flagPath}`)
  console.log(`
To use this feature flag, add the following to your medusa-config.js:

  {
    ...,
    featureFlags: {
      ${flagSettings.key}: true
    }
  }

or set the environment variable:

  export ${flagSettings.envKey}=true

To add guarded code, use the featureFlagRouter:

  if (featureFlagRouter.isEnabled("${flagSettings.key}")) {
    // do something
  }
`)
}
@@ -0,0 +1,13 @@
export const featureFlagTemplate = ({
  key,
  description,
  defaultValue,
  envKey,
}) => {
  return `export default {
  key: "${key}",
  description: "${description}",
  default_value: ${defaultValue},
  env_key: "${envKey}",
}`
}
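Rendering the template above for a hypothetical flag shows the generated module source (the flag name, description, and env key below are made up):

```javascript
// Same template as above, re-stated so this sketch is self-contained.
const featureFlagTemplate = ({ key, description, defaultValue, envKey }) => {
  return `export default {
  key: "${key}",
  description: "${description}",
  default_value: ${defaultValue},
  env_key: "${envKey}",
}`
}

const output = featureFlagTemplate({
  key: "tax_inclusive_pricing",
  description: "Enable tax-inclusive prices",
  defaultValue: false,
  envKey: "MEDUSA_FF_TAX_INCLUSIVE_PRICING",
})

console.log(output)
```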
@@ -0,0 +1,184 @@
#!/usr/bin/env node

const Configstore = require(`configstore`)
const pkg = require(`../package.json`)
const _ = require(`lodash`)
const yargs = require(`yargs/yargs`)
const path = require(`path`)
const os = require(`os`)
const fs = require(`fs-extra`)
const glob = require("glob")
const watch = require(`./watch`)
const { getVersionInfo } = require(`./utils/version`)
const { buildFFCli } = require("./feature-flags")

const cli = yargs()

cli.command({
  command: `*`,
  description: `Start the Medusa dev CLI`,
  builder: (yargs) => {
    yargs
      .usage(`Usage: medusa-dev [options]`)
      .alias(`q`, `quiet`)
      .nargs(`q`, 0)
      .describe(`q`, `Do not output copy file information`)
      .alias(`s`, `scan-once`)
      .nargs(`s`, 0)
      .describe(`s`, `Scan once. Do not start file watch`)
      .alias(`p`, `set-path-to-repo`)
      .nargs(`p`, 1)
      .describe(
        `p`,
        `Set path to medusa repository.
You typically only need to configure this once.`
      )
      .nargs(`force-install`, 0)
      .describe(
        `force-install`,
        `Disables copying files into node_modules and forces usage of the local npm repository.`
      )
      .nargs(`external-registry`, 0)
      .describe(
        `external-registry`,
        `Run 'yarn add' commands without the --registry flag.`
      )
      .alias(`C`, `copy-all`)
      .nargs(`C`, 0)
      .describe(
        `C`,
        `Copy all contents in packages/ instead of just medusa packages`
      )
      .array(`packages`)
      .describe(`packages`, `Explicitly specify packages to copy`)
      .help(`h`)
      .alias(`h`, `help`)
      .nargs(`v`, 0)
      .alias(`v`, `version`)
      .describe(`v`, `Print the currently installed version of the Medusa Dev CLI`)
  },
  handler: async (argv) => {
    const conf = new Configstore(pkg.name)

    if (argv.version) {
      console.log(getVersionInfo())
      process.exit()
    }

    let pathToRepo = argv.setPathToRepo

    if (pathToRepo) {
      if (pathToRepo.includes(`~`)) {
        pathToRepo = path.join(os.homedir(), pathToRepo.split(`~`).pop())
      }
      conf.set(`medusa-location`, path.resolve(pathToRepo))
      process.exit()
    }

    const havePackageJsonFile = fs.existsSync(`package.json`)

    if (!havePackageJsonFile) {
      console.error(`Current folder must have a package.json file!`)
      process.exit()
    }

    const medusaLocation = conf.get(`medusa-location`)

    if (!medusaLocation) {
      console.error(
        `
You haven't set the path to your cloned
version of medusa yet. Do so now by running:

medusa-dev --set-path-to-repo /path/to/my/cloned/version/medusa
`
      )
      process.exit()
    }

    // get list of directories to crawl for package declarations
    const monoRepoPackagesDirs = []
    try {
      const monoRepoPkg = JSON.parse(
        fs.readFileSync(path.join(medusaLocation, "package.json"))
      )
      for (const workspace of monoRepoPkg.workspaces.packages) {
        if (!workspace.startsWith("packages")) {
          continue
        }
        const workspacePackageDirs = glob.sync(workspace, {
          cwd: medusaLocation.toString(),
        })
        monoRepoPackagesDirs.push(...workspacePackageDirs)
      }
    } catch (err) {
      console.error(
        `Unable to read and parse the workspace definition from the medusa package.json`
      )
      process.exit(1)
    }

    // get list of packages from the monorepo
    const packageNameToPath = new Map()
    const monoRepoPackages = monoRepoPackagesDirs.map((dirName) => {
      try {
        const localPkg = JSON.parse(
          fs.readFileSync(path.join(medusaLocation, dirName, `package.json`))
        )

        if (localPkg?.name) {
          packageNameToPath.set(
            localPkg.name,
            path.join(medusaLocation, dirName)
          )
          return localPkg.name
        }
      } catch (error) {
        // fall back to the directory name
      }

      packageNameToPath.set(dirName, path.join(medusaLocation, dirName))
      return dirName
    })

    const localPkg = JSON.parse(fs.readFileSync(`package.json`))
    // intersect dependencies with monoRepoPackages to get the list of packages that are used
    const localPackages = _.intersection(
      monoRepoPackages,
      Object.keys(_.merge({}, localPkg.dependencies, localPkg.devDependencies))
    )

    if (!argv.packages && _.isEmpty(localPackages)) {
      console.error(
        `
You don't have any medusa dependencies in your current package.json.
You probably want to pass in a list of packages to start
developing on! For example:

medusa-dev --packages medusa medusa-js

If you prefer to place them in your package.json dependencies instead,
medusa-dev will pick them up.
`
      )
      if (!argv.forceInstall) {
        process.exit()
      } else {
        console.log(
          `Continuing installation of other dependencies due to the "--force-install" flag`
        )
      }
    }

    watch(medusaLocation, argv.packages, {
      localPackages,
      quiet: argv.quiet,
      scanOnce: argv.scanOnce,
      forceInstall: argv.forceInstall,
      monoRepoPackages,
      packageNameToPath,
      externalRegistry: argv.externalRegistry,
    })
  },
})

buildFFCli(cli)

cli.parse(process.argv.slice(2))
@@ -0,0 +1,19 @@
const signalExit = require(`signal-exit`);

const cleanupTasks = new Set();

exports.registerCleanupTask = (taskFn) => {
  cleanupTasks.add(taskFn);
  return () => {
    const result = taskFn();
    cleanupTasks.delete(taskFn);
    return result;
  };
};

signalExit(() => {
  if (cleanupTasks.size) {
    console.log(`Process exited in the middle of publishing - cleaning up`);
    cleanupTasks.forEach((taskFn) => taskFn());
  }
});
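The registration contract above can be sketched without `signal-exit`: the disposer returned by registration runs the task once and removes it from the pending set, so the exit handler never re-runs it. This is an illustrative restatement (names are hypothetical), not the module itself:

```javascript
// Minimal sketch of the cleanup-task contract, under the assumption that
// disposing runs the task and unregisters it in one step.
const tasks = new Set()

const register = (taskFn) => {
  tasks.add(taskFn)
  return () => {
    const result = taskFn() // run the cleanup work
    tasks.delete(taskFn) // so an exit handler won't run it again
    return result
  }
}

let runs = 0
const dispose = register(() => ++runs)
dispose()
```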
@@ -0,0 +1,79 @@
const startVerdaccio = require(`verdaccio`).default

const fs = require(`fs-extra`)
const _ = require(`lodash`)

let VerdaccioInitPromise = null

const { verdaccioConfig } = require(`./verdaccio-config`)
const { publishPackage } = require(`./publish-package`)
const { installPackages } = require(`./install-packages`)

const startServer = () => {
  if (VerdaccioInitPromise) {
    return VerdaccioInitPromise
  }

  console.log(`Starting local verdaccio server`)

  // clear storage
  fs.removeSync(verdaccioConfig.storage)

  VerdaccioInitPromise = new Promise((resolve) => {
    startVerdaccio(
      verdaccioConfig,
      verdaccioConfig.port,
      verdaccioConfig.storage,
      `1.0.0`,
      `medusa-dev`,
      (webServer, addr, pkgName, pkgVersion) => {
        webServer.listen(addr.port || addr.path, addr.host, () => {
          console.log(`Started local verdaccio server`)

          resolve()
        })
      }
    )
  })

  return VerdaccioInitPromise
}

exports.startVerdaccio = startServer

exports.publishPackagesLocallyAndInstall = async ({
  packagesToPublish,
  localPackages,
  packageNameToPath,
  ignorePackageJSONChanges,
  yarnWorkspaceRoot,
  externalRegistry,
  root,
}) => {
  await startServer()

  const versionPostFix = Date.now()

  const newlyPublishedPackageVersions = {}

  for (const packageName of packagesToPublish) {
    newlyPublishedPackageVersions[packageName] = await publishPackage({
      packageName,
      packagesToPublish,
      packageNameToPath,
      versionPostFix,
      ignorePackageJSONChanges,
      root,
    })
  }

  const packagesToInstall = _.intersection(packagesToPublish, localPackages)

  await installPackages({
    packagesToInstall,
    yarnWorkspaceRoot,
    newlyPublishedPackageVersions,
    externalRegistry,
  })
}
@@ -0,0 +1,163 @@
const path = require(`path`)
const fs = require(`fs-extra`)

const { promisifiedSpawn } = require(`../utils/promisified-spawn`)
const { registryUrl } = require(`./verdaccio-config`)

const installPackages = async ({
  packagesToInstall,
  yarnWorkspaceRoot,
  newlyPublishedPackageVersions,
  externalRegistry,
}) => {
  console.log(
    `Installing packages from local registry:\n${packagesToInstall
      .map((packageAndVersion) => ` - ${packageAndVersion}`)
      .join(`\n`)}`
  )
  let installCmd
  if (yarnWorkspaceRoot) {
    // this is very hacky - given the root, we run `yarn workspaces info`
    // to get the list of all workspaces and their locations, and manually
    // edit the package.json files of the packages we want to install
    // to make sure there are no mismatched versions of the same package
    // in workspaces, which should preserve the node_modules structure
    // (packages being mostly hoisted to top-level node_modules)

    const { stdout: yarnVersion } = await promisifiedSpawn([
      `yarn`,
      [`--version`],
      { stdio: `pipe` },
    ])
    const workspaceCommand = !yarnVersion.startsWith("1") ? "list" : "info"

    const { stdout } = await promisifiedSpawn([
      `yarn`,
      [`workspaces`, workspaceCommand, `--json`],
      { stdio: `pipe` },
    ])

    let workspacesLayout
    try {
      workspacesLayout = JSON.parse(JSON.parse(stdout).data)
    } catch (e) {
      /*
      Yarn 1.22 doesn't output pure JSON - it has leading and trailing text:
      ```
      $ yarn workspaces info --json
      yarn workspaces v1.22.0
      {
        "z": {
          "location": "z",
          "workspaceDependencies": [],
          "mismatchedWorkspaceDependencies": []
        },
        "y": {
          "location": "y",
          "workspaceDependencies": [],
          "mismatchedWorkspaceDependencies": []
        }
      }
      Done in 0.48s.
      ```
      So we need to do some sanitization. We find the JSON by matching the
      substring that starts with `{` and ends with `}`.
      */

      const regex = /^[^{]*({.*})[^}]*$/gs
      const sanitizedStdOut = regex.exec(stdout)
      if (sanitizedStdOut?.length >= 2) {
        // pick the content of the first (and only) capturing group
        const jsonString = sanitizedStdOut[1]
        try {
          workspacesLayout = JSON.parse(jsonString)
        } catch (e) {
          console.error(
            `Failed to parse "sanitized" output of the "yarn workspaces info" command.\n\nSanitized string: "${jsonString}"`
          )
          // not exiting here, because there is a general check below for `workspacesLayout` being set
        }
      }
    }
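The sanitization step above can be exercised in isolation. This sketch uses a made-up yarn 1.22-style transcript to show how the regex strips the human-readable banner and footer and leaves a parseable JSON payload:

```javascript
// Illustrative input mimicking `yarn workspaces info --json` on yarn 1.22:
// a version banner before the JSON and a "Done in …" footer after it.
const noisyStdout = [
  "yarn workspaces v1.22.0",
  '{ "z": { "location": "z" } }',
  "Done in 0.48s.",
].join("\n")

// Same pattern as above: skip everything before the first `{`,
// capture through the last `}`, allow a trailing non-`}` tail.
const match = /^[^{]*({.*})[^}]*$/gs.exec(noisyStdout)
const layout = JSON.parse(match[1])
```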

    if (!workspacesLayout) {
      console.error(
        `Couldn't parse output of the "yarn workspaces info" command`,
        stdout
      )
      process.exit(1)
    }

    const handleDeps = (deps) => {
      if (!deps) {
        return false
      }

      let changed = false
      Object.keys(deps).forEach((depName) => {
        if (packagesToInstall.includes(depName)) {
          deps[depName] = `medusa-dev`
          changed = true
        }
      })
      return changed
    }

    Object.keys(workspacesLayout).forEach((workspaceName) => {
      const { location } = workspacesLayout[workspaceName]
      const pkgJsonPath = path.join(yarnWorkspaceRoot, location, `package.json`)
      if (!fs.existsSync(pkgJsonPath)) {
        return
      }
      const pkg = JSON.parse(fs.readFileSync(pkgJsonPath, `utf8`))

      let changed = false
      changed |= handleDeps(pkg.dependencies)
      changed |= handleDeps(pkg.devDependencies)
      changed |= handleDeps(pkg.peerDependencies)

      if (changed) {
        console.log(`Changing deps in ${pkgJsonPath} to use the medusa-dev dist-tag`)
        fs.outputJSONSync(pkgJsonPath, pkg, {
          spaces: 2,
        })
      }
    })

    // package.json files are changed - so we just want to install
    // using the verdaccio registry
    const yarnCommands = [`install`]

    if (!externalRegistry) {
      yarnCommands.push(`--registry=${registryUrl}`)
    }

    installCmd = [`yarn`, yarnCommands]
  } else {
    const yarnCommands = [
      `add`,
      ...packagesToInstall.map((packageName) => {
        const packageVersion = newlyPublishedPackageVersions[packageName]
        return `${packageName}@${packageVersion}`
      }),
      `--exact`,
    ]

    if (!externalRegistry) {
      yarnCommands.push(`--registry=${registryUrl}`)
    }

    installCmd = [`yarn`, yarnCommands]
  }

  try {
    await promisifiedSpawn(installCmd)

    console.log(`Installation complete`)
  } catch (error) {
    console.error(`Installation failed`, error)
    process.exit(1)
  }
}

exports.installPackages = installPackages
@@ -0,0 +1,158 @@
const fs = require(`fs-extra`)
const path = require(`path`)

const { promisifiedSpawn } = require(`../utils/promisified-spawn`)
const { registryUrl } = require(`./verdaccio-config`)

const NPMRCContent = `${registryUrl.replace(
  /https?:/g,
  ``
)}/:_authToken="medusa-dev"`

const {
  getMonorepoPackageJsonPath,
} = require(`../utils/get-monorepo-package-json-path`)
const { registerCleanupTask } = require(`./cleanup-tasks`)

/**
 * Edit package.json to:
 * - adjust the version to a temporary one
 * - change version selectors for dependencies that
 *   will be published, to make sure that yarn
 *   installs them in the local site
 */
const adjustPackageJson = ({
  monoRepoPackageJsonPath,
  packageName,
  versionPostFix,
  packagesToPublish,
  ignorePackageJSONChanges,
  packageNameToPath,
}) => {
  // we need to check if the package depends on any other packages that will be
  // published, and adjust their version selectors to point to the dev versions
  // so the local registry is used for dependencies.

  const monorepoPKGjsonString = fs.readFileSync(
    monoRepoPackageJsonPath,
    `utf-8`
  )
  const monorepoPKGjson = JSON.parse(monorepoPKGjsonString)

  monorepoPKGjson.version = `${monorepoPKGjson.version}-dev-${versionPostFix}`
  packagesToPublish.forEach((packageThatWillBePublished) => {
    if (
      monorepoPKGjson.dependencies &&
      monorepoPKGjson.dependencies[packageThatWillBePublished]
    ) {
      const currentVersion = JSON.parse(
        fs.readFileSync(
          getMonorepoPackageJsonPath({
            packageName: packageThatWillBePublished,
            packageNameToPath,
          }),
          `utf-8`
        )
      ).version

      monorepoPKGjson.dependencies[
        packageThatWillBePublished
      ] = `${currentVersion}-dev-${versionPostFix}`
    }
  })

  const temporaryMonorepoPKGjsonString = JSON.stringify(monorepoPKGjson)

  const unignorePackageJSONChanges = ignorePackageJSONChanges(packageName, [
    monorepoPKGjsonString,
    temporaryMonorepoPKGjsonString,
  ])

  // change version and dependency versions
  fs.outputFileSync(monoRepoPackageJsonPath, temporaryMonorepoPKGjsonString)

  return {
    newPackageVersion: monorepoPKGjson.version,
    unadjustPackageJson: registerCleanupTask(() => {
      // restore original package.json
      fs.outputFileSync(monoRepoPackageJsonPath, monorepoPKGjsonString)
      unignorePackageJSONChanges()
    }),
  }
}
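The temporary versioning scheme above can be sketched with illustrative values: every package published in one run shares the same `-dev-<timestamp>` suffix, so a package and its rewritten dependency selectors always agree:

```javascript
// Sketch of the version rewrite (values are hypothetical; the real flow
// uses Date.now() as the shared postfix for the whole publish run).
const versionPostFix = 1700000000000
const baseVersion = "1.2.3"

const devVersion = `${baseVersion}-dev-${versionPostFix}`
// A dependency on that package is rewritten to the exact same string,
// so yarn resolves it from the local registry instead of npm.
```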

/**
 * Anonymous publishing requires a dummy .npmrc.
 * See https://github.com/verdaccio/verdaccio/issues/212#issuecomment-308578500
 * This is an `npm publish` (as in the linked comment) and `yarn publish`
 * requirement. It is not a verdaccio restriction.
 */
const createTemporaryNPMRC = ({ pathToPackage, root }) => {
  const NPMRCPathInPackage = path.join(pathToPackage, `.npmrc`)
  fs.outputFileSync(NPMRCPathInPackage, NPMRCContent)

  const NPMRCPathInRoot = path.join(root, `.npmrc`)
  fs.outputFileSync(NPMRCPathInRoot, NPMRCContent)

  return registerCleanupTask(() => {
    fs.removeSync(NPMRCPathInPackage)
    fs.removeSync(NPMRCPathInRoot)
  })
}

const publishPackage = async ({
  packageName,
  packagesToPublish,
  versionPostFix,
  ignorePackageJSONChanges,
  packageNameToPath,
  root,
}) => {
  const monoRepoPackageJsonPath = getMonorepoPackageJsonPath({
    packageName,
    packageNameToPath,
  })

  const { unadjustPackageJson, newPackageVersion } = adjustPackageJson({
    monoRepoPackageJsonPath,
    packageName,
    packageNameToPath,
    versionPostFix,
    packagesToPublish,
    ignorePackageJSONChanges,
  })

  const pathToPackage = path.dirname(monoRepoPackageJsonPath)

  const uncreateTemporaryNPMRC = createTemporaryNPMRC({ pathToPackage, root })

  // npm publish
  const publishCmd = [
    `npm`,
    [`publish`, `--tag`, `medusa-dev`, `--registry=${registryUrl}`],
    {
      cwd: pathToPackage,
    },
  ]

  console.log(
    `Publishing ${packageName}@${newPackageVersion} to local registry`
  )
  try {
    await promisifiedSpawn(publishCmd)

    console.log(
      `Published ${packageName}@${newPackageVersion} to local registry`
    )
  } catch (e) {
    console.error(`Failed to publish ${packageName}@${newPackageVersion}`, e)
    process.exit(1)
  }

  uncreateTemporaryNPMRC()
  unadjustPackageJson()

  return newPackageVersion
}

exports.publishPackage = publishPackage
@@ -0,0 +1,33 @@
const path = require(`path`)
const os = require(`os`)

const verdaccioConfig = {
  storage: path.join(os.tmpdir(), `verdaccio`, `storage`),
  port: 4873, // default
  max_body_size: `1000mb`,
  web: {
    enable: true,
    title: `medusa-dev`,
  },
  logs: [{ type: `stdout`, format: `pretty-timestamped`, level: `warn` }],
  packages: {
    "**": {
      access: `$all`,
      publish: `$all`,
      proxy: `npmjs`,
    },
  },
  uplinks: {
    npmjs: {
      url: `https://registry.npmjs.org/`,
      // the default max_fails is 2 - on flaky networks that causes a lot of failed installations
      max_fails: 10,
    },
  },
}

exports.verdaccioConfig = verdaccioConfig

const registryUrl = `http://localhost:${verdaccioConfig.port}`

exports.registryUrl = registryUrl
@@ -0,0 +1,50 @@
const { getDependantPackages } = require(`../get-dependant-packages`)

function createMockPackageNameToPath(packageNames) {
  const packageNameToPath = new Map()

  for (const packageName of packageNames) {
    packageNameToPath.set(packageName, `/test/${packageName}`)
  }

  return packageNameToPath
}

describe(`getDependantPackages`, () => {
  it(`handles deep dependency chains`, () => {
    const packagesToPublish = getDependantPackages({
      packageName: `package-a-dep1-dep1`,
      depTree: {
        "package-a-dep1": new Set([`package-a`]),
        "package-a-dep1-dep1": new Set([`package-a-dep1`]),
        "not-related": new Set([`also-not-related`]),
      },
      packageNameToPath: createMockPackageNameToPath([
        `package-a`,
        `package-a-dep1`,
        `package-a-dep1-dep1`,
        `not-related`,
        `also-not-related`,
      ]),
    })

    expect(packagesToPublish).toEqual(
      new Set([`package-a`, `package-a-dep1`, `package-a-dep1-dep1`])
    )
  })

  it(`doesn't get stuck in circular dependency loops`, () => {
    const packagesToPublish = getDependantPackages({
      packageName: `package-a`,
      depTree: {
        "package-a": new Set([`package-b`]),
        "package-b": new Set([`package-a`]),
      },
      packageNameToPath: createMockPackageNameToPath([
        `package-a`,
        `package-b`,
      ]),
    })
    expect(packagesToPublish).toEqual(new Set([`package-a`, `package-b`]))
  })
})
@@ -0,0 +1,78 @@
const path = require(`path`)

const { traversePackagesDeps } = require(`../traverse-package-deps`)

jest.doMock(
  path.join(...`<monorepo-path>/packages/package-a/package.json`.split(`/`)),
  () => {
    return {
      dependencies: {
        "unrelated-package": `*`,
        "package-a-dep1": `*`,
      },
    }
  },
  { virtual: true }
)

jest.doMock(
  path.join(
    ...`<monorepo-path>/packages/package-a-dep1/package.json`.split(`/`)
  ),
  () => {
    return {
      dependencies: {
        "package-a-dep1-dep1": `*`,
      },
    }
  },
  { virtual: true }
)

jest.doMock(
  path.join(
    ...`<monorepo-path>/packages/package-a-dep1-dep1/package.json`.split(`/`)
  ),
  () => {
    return {
      dependencies: {},
    }
  },
  { virtual: true }
)

describe(`traversePackagesDeps`, () => {
  it(`handles deep dependency chains`, () => {
    const monoRepoPackages = [
      `package-a`,
      `package-a-dep1`,
      `package-a-dep1-dep1`,
      `package-not-used`,
    ]
    const packageNameToPath = new Map()
    for (const packageName of monoRepoPackages) {
      packageNameToPath.set(
        packageName,
        path.join(...`<monorepo-path>/packages/${packageName}`.split(`/`))
      )
    }

    const { seenPackages, depTree } = traversePackagesDeps({
      root: `<monorepo-path>`,
      packages: [`package-a`, `doesnt-exist`],
      monoRepoPackages,
      packageNameToPath,
    })

    expect(seenPackages).toEqual([
      `package-a`,
      `package-a-dep1`,
      `package-a-dep1-dep1`,
    ])

    expect(depTree).toEqual({
      "package-a-dep1": new Set([`package-a`]),
      "package-a-dep1-dep1": new Set([`package-a-dep1`]),
    })
  })
})
@@ -0,0 +1,193 @@
const fs = require(`fs-extra`)
const _ = require(`lodash`)
const {
  getMonorepoPackageJsonPath,
} = require(`./get-monorepo-package-json-path`)
const got = require(`got`)

function difference(object, base) {
  function changes(object, base) {
    return _.transform(object, function (result, value, key) {
      if (!_.isEqual(value, base[key])) {
        result[key] =
          _.isObject(value) && _.isObject(base[key])
            ? changes(value, base[key])
            : value
      }
    })
  }
  return changes(object, base)
}
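For the flat dependency maps this module compares, `difference` boils down to "keys whose values changed or were added". A dependency-free sketch of that flat case (the real helper above also recurses into nested objects via lodash):

```javascript
// Illustrative flat-map variant of `difference` (hypothetical helper,
// no lodash): keep keys of `object` whose values differ from `base`.
const flatDifference = (object, base) => {
  const result = {}
  for (const key of Object.keys(object)) {
    if (object[key] !== base[key]) {
      result[key] = object[key]
    }
  }
  return result
}

// Only the changed selector survives; identical ones are dropped.
const diff = flatDifference(
  { medusa: "1.20.0", lodash: "4.17.21" },
  { medusa: "1.19.0", lodash: "4.17.21" }
)
```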

/**
 * Compare dependencies of the installed package and the monorepo package.
 * It will skip dependencies that are removed in the monorepo package.
 *
 * If the local package is not installed, it will check unpkg.com.
 * This allows medusa-dev to skip publishing unnecessarily and
 * lets packages be installed from the public npm registry if nothing changed.
 */
exports.checkDepsChanges = async ({
  newPath,
  packageName,
  monoRepoPackages,
  isInitialScan,
  ignoredPackageJSON,
  packageNameToPath,
}) => {
  let localPKGjson
  let packageNotInstalled = false
  try {
    localPKGjson = JSON.parse(fs.readFileSync(newPath, `utf-8`))
  } catch {
    packageNotInstalled = true
    // there is no local package - so we still need to install deps
    // this is nice because devs won't need to do the initial package installation - we can handle this.
    if (!isInitialScan) {
      console.log(
        `'${packageName}' doesn't seem to be installed. Restart medusa-dev to publish it`
      )
      return {
        didDepsChanged: false,
        packageNotInstalled,
      }
    }

    // if the package is not installed, we do an http GET request to
    // unpkg to check if the dependencies of the package published in the
    // public npm registry are different

    // this allows us to not publish to the local registry
    // and save some time/work
    try {
      const version = getPackageVersion(packageName)
      const url = `https://unpkg.com/${packageName}@${version}/package.json`
      const response = await got(url)
      if (response?.statusCode !== 200) {
        throw new Error(`No response or non-200 code for ${url}`)
      }
      localPKGjson = JSON.parse(response.body)
    } catch (e) {
      console.log(
        `'${packageName}' doesn't seem to be installed and is not published on npm. Error: ${e.message}`
      )
      return {
        didDepsChanged: true,
        packageNotInstalled,
      }
    }
  }

  const monoRepoPackageJsonPath = getMonorepoPackageJsonPath({
    packageName,
    packageNameToPath,
  })
  const monorepoPKGjsonString = fs.readFileSync(
    monoRepoPackageJsonPath,
    `utf-8`
  )
  const monorepoPKGjson = JSON.parse(monorepoPKGjsonString)
  if (ignoredPackageJSON.has(packageName)) {
    if (ignoredPackageJSON.get(packageName).includes(monorepoPKGjsonString)) {
      // we are in the middle of publishing and the content of package.json is
      // the one set during the publish process, so we must not cause false positives
      return {
        didDepsChanged: false,
        packageNotInstalled,
      }
    }
  }

  if (!monorepoPKGjson.dependencies) monorepoPKGjson.dependencies = {}
  if (!localPKGjson.dependencies) localPKGjson.dependencies = {}

  const areDepsEqual = _.isEqual(
    monorepoPKGjson.dependencies,
    localPKGjson.dependencies
  )

  if (!areDepsEqual) {
    const diff = difference(
      monorepoPKGjson.dependencies,
      localPKGjson.dependencies
    )

    const diff2 = difference(
      localPKGjson.dependencies,
      monorepoPKGjson.dependencies
    )

    let needPublishing = false
    let isPublishing = false
    const depChangeLog = _.uniq(Object.keys({ ...diff, ...diff2 }))
      .reduce((acc, key) => {
        if (monorepoPKGjson.dependencies[key] === `medusa-dev`) {
          // if we are in the middle of publishing to the local registry - ignore
          isPublishing = true
          return acc
        }

        if (localPKGjson.dependencies[key] === `medusa-dev`) {
          // monorepo packages will restore the version, but after installation
          // in the local site it will use the `medusa-dev` dist-tag - we need
          // to ignore those changes
          return acc
        }

        if (
          localPKGjson.dependencies[key] &&
          monorepoPKGjson.dependencies[key]
        ) {
          // Check only for version changes in packages
          // that are not from the medusa repo.
          // Changes in medusa packages will be copied over
          // from the monorepo - and if those contain other dependency
          // changes - they will be covered
          if (!monoRepoPackages.includes(key)) {
            acc.push(
              ` - '${key}' changed version from ${localPKGjson.dependencies[key]} to ${monorepoPKGjson.dependencies[key]}`
            )
            needPublishing = true
          }
        } else if (monorepoPKGjson.dependencies[key]) {
          acc.push(` - '${key}@${monorepoPKGjson.dependencies[key]}' was added`)
          needPublishing = true
        } else {
          acc.push(` - '${key}@${localPKGjson.dependencies[key]}' was removed`)
          // this doesn't really need publishing, so we will skip it
        }
        return acc
      }, [])
      .join(`\n`)

    if (!isPublishing && depChangeLog.length > 0) {
      console.log(`Dependencies of '${packageName}' changed:\n${depChangeLog}`)
      if (isInitialScan) {
        console.log(
          `Will ${!needPublishing ? `not ` : ``}publish to local npm registry.`
        )
      } else {
        console.warn(
          `Installation of dependencies after initial scan is not implemented`
        )
      }
      return {
        didDepsChanged: needPublishing,
        packageNotInstalled,
      }
    }
  }
  return {
    didDepsChanged: false,
    packageNotInstalled,
  }
}

function getPackageVersion(packageName) {
  const projectPackageJson = JSON.parse(
    fs.readFileSync(`./package.json`, `utf-8`)
  )
  const { dependencies = {}, devDependencies = {} } = projectPackageJson
  const version = dependencies[packageName] || devDependencies[packageName]
  return version || `latest`
}
@@ -0,0 +1,30 @@
/**
 * Recursively get the set of packages that depend on a given package.
 * The set also includes the passed package itself.
 */
const getDependantPackages = ({
  packageName,
  depTree,
  packagesToPublish = new Set(),
}) => {
  if (packagesToPublish.has(packageName)) {
    // bail early if the package was already handled
    return packagesToPublish
  }

  packagesToPublish.add(packageName)
  const dependants = depTree[packageName]
  if (dependants) {
    dependants.forEach((dependant) =>
      getDependantPackages({
        packageName: dependant,
        depTree,
        packagesToPublish,
      })
    )
  }

  return packagesToPublish
}

exports.getDependantPackages = getDependantPackages
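A usage sketch of the traversal above, restated inline so it is self-contained (package names are illustrative): with a depTree mapping each package to the set of packages that depend on it, asking for `medusa-telemetry` transitively pulls in everything that must be republished alongside it.

```javascript
// Restatement of getDependantPackages above for a standalone example.
const getDependantPackages = ({ packageName, depTree, packagesToPublish = new Set() }) => {
  if (packagesToPublish.has(packageName)) {
    return packagesToPublish
  }
  packagesToPublish.add(packageName)
  const dependants = depTree[packageName]
  if (dependants) {
    dependants.forEach((dependant) =>
      getDependantPackages({ packageName: dependant, depTree, packagesToPublish })
    )
  }
  return packagesToPublish
}

// Hypothetical tree: medusa-cli depends on medusa-telemetry,
// and medusa depends on medusa-cli.
const toPublish = getDependantPackages({
  packageName: "medusa-telemetry",
  depTree: {
    "medusa-telemetry": new Set(["medusa-cli"]),
    "medusa-cli": new Set(["medusa"]),
  },
})
```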
@@ -0,0 +1,4 @@
const path = require(`path`)

exports.getMonorepoPackageJsonPath = ({ packageName, packageNameToPath }) =>
  path.join(packageNameToPath.get(packageName), `package.json`)
@@ -0,0 +1,29 @@
const execa = require(`execa`)

const defaultSpawnArgs = {
  cwd: process.cwd(),
  stdio: `inherit`,
}

exports.setDefaultSpawnStdio = (stdio) => {
  defaultSpawnArgs.stdio = stdio
}

exports.promisifiedSpawn = async ([cmd, args = [], spawnArgs = {}]) => {
  const spawnOptions = {
    ...defaultSpawnArgs,
    ...spawnArgs,
  }
  try {
    return await execa(cmd, args, spawnOptions)
  } catch (e) {
    if (spawnOptions.stdio === `ignore`) {
      console.log(
        `\nCommand "${cmd} ${args.join(
          ` `
        )}" failed.\nTo see details of the failed command, rerun "medusa-dev" without the "--quiet" or "-q" switch\n`
      )
    }
    throw e
  }
}
@@ -0,0 +1,88 @@
|
||||
const _ = require(`lodash`)
|
||||
const path = require(`path`)
|
||||
|
||||
/**
|
||||
* @typedef {Object} TraversePackagesDepsReturn
|
||||
* @property {Object} depTree Lookup table to check dependants for given package.
|
||||
* Used to determine which packages need to be published.
|
||||
* @example
|
||||
* ```
|
||||
* {
|
||||
* "medusa-cli": Set(["medusa"]),
|
||||
* "medusa-telemetry": Set(["medusa", "medusa-cli"]),
|
||||
* }
|
||||
* ```
|
||||
*/
|
||||
|
||||
/**
|
||||
* Compile final list of packages to watch
|
||||
* This will include packages explicitly defined packages and all their dependencies
|
||||
* Also creates dependency graph that is used later to determine which packages
|
||||
* would need to be published when their dependencies change
|
||||
* @param {Object} $0
|
||||
* @param {String} $0.root Path to root of medusa monorepo repository
|
||||
* @param {String[]} $0.packages Initial array of packages to watch
|
||||
* This can be extracted from project dependencies or explicitly set by `--packages` flag
|
||||
* @param {String[]} $0.monoRepoPackages Array of packages in medusa monorepo
|
||||
* @param {String[]} [$0.seenPackages] Array of packages that were already traversed.
|
||||
* This makes sure dependencies are extracted one time for each package and avoid any
|
||||
* infinite loops.
|
||||
* @param {DepTree} [$0.depTree] Used internally to recursively construct dependency graph.
|
||||
* @return {TraversePackagesDepsReturn}
|
||||
*/
|
||||
const traversePackagesDeps = ({
|
||||
packages,
|
||||
monoRepoPackages,
|
||||
seenPackages = [...packages],
|
||||
depTree = {},
|
||||
packageNameToPath,
|
||||
}) => {
|
||||
packages.forEach((p) => {
|
||||
let pkgJson
|
||||
try {
|
||||
const packageRoot = packageNameToPath.get(p)
|
||||
if (packageRoot) {
|
||||
pkgJson = require(path.join(packageRoot, `package.json`))
|
||||
} else {
|
||||
console.error(`"${p}" package doesn't exist in monorepo.`)
|
||||
// remove from seenPackages
|
||||
seenPackages = seenPackages.filter((seenPkg) => seenPkg !== p)
|
||||
return
|
||||
}
|
||||
} catch (e) {
|
||||
console.error(`"${p}" package doesn't exist in monorepo.`, e)
|
||||
// remove from seenPackages
|
||||
seenPackages = seenPackages.filter((seenPkg) => seenPkg !== p)
|
||||
return
|
||||
}
|
||||
|
||||
const fromMonoRepo = _.intersection(
|
||||
Object.keys({ ...pkgJson.dependencies }),
|
||||
monoRepoPackages
|
||||
)
|
||||
|
||||
fromMonoRepo.forEach((pkgName) => {
|
||||
depTree[pkgName] = (depTree[pkgName] || new Set()).add(p)
|
||||
})
|
||||
|
||||
// only traverse not yet seen packages to avoid infinite loops
|
||||
const newPackages = _.difference(fromMonoRepo, seenPackages)
|
||||
|
||||
if (newPackages.length) {
|
||||
newPackages.forEach((depFromMonorepo) => {
|
||||
seenPackages.push(depFromMonorepo)
|
||||
})
|
||||
|
||||
traversePackagesDeps({
|
||||
packages: fromMonoRepo,
|
||||
monoRepoPackages,
|
||||
seenPackages,
|
||||
depTree,
|
||||
packageNameToPath,
|
||||
})
|
||||
}
|
||||
})
|
||||
return { seenPackages, depTree }
|
||||
}
|
||||
|
||||
exports.traversePackagesDeps = traversePackagesDeps
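The core of the traversal is a graph inversion: given each package's dependency list, the dep tree maps a dependency to the set of packages that depend on it. A minimal standalone sketch of that inversion (the package names and dependency lists below are illustrative assumptions, not real monorepo data):

```javascript
// Minimal sketch of the dependency-graph inversion performed by
// traversePackagesDeps. Names and dependency lists are illustrative.
const deps = {
  medusa: ["medusa-cli", "medusa-telemetry"],
  "medusa-cli": ["medusa-telemetry"],
}

const depTree = {}
for (const [pkg, pkgDeps] of Object.entries(deps)) {
  for (const dep of pkgDeps) {
    // record that `pkg` depends on `dep`, i.e. `dep` has `pkg` as a dependant
    depTree[dep] = (depTree[dep] || new Set()).add(pkg)
  }
}

// packages to republish when medusa-telemetry changes
console.log([...depTree["medusa-telemetry"]])
```

Looking up a package in `depTree` then answers "who needs to be republished when this package changes", which is exactly how the watcher later uses the returned graph.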
@@ -0,0 +1,4 @@
exports.getVersionInfo = () => {
  const { version: devCliVersion } = require(`../../package.json`);
  return `Medusa Dev CLI version: ${devCliVersion}`;
};
@@ -0,0 +1,372 @@
const chokidar = require(`chokidar`)
const _ = require(`lodash`)
const del = require(`del`)
const fs = require(`fs-extra`)
const path = require(`path`)
const findWorkspaceRoot = require(`find-yarn-workspace-root`)

const { publishPackagesLocallyAndInstall } = require(`./local-npm-registry`)
const { checkDepsChanges } = require(`./utils/check-deps-changes`)
const { getDependantPackages } = require(`./utils/get-dependant-packages`)
const {
  setDefaultSpawnStdio,
  promisifiedSpawn,
} = require(`./utils/promisified-spawn`)
const { traversePackagesDeps } = require(`./utils/traverse-package-deps`)

let numCopied = 0

const quit = () => {
  console.log(`Copied ${numCopied} files`)
  process.exit()
}

const MAX_COPY_RETRIES = 3

/*
 * non-existent packages break on('ready')
 * See: https://github.com/paulmillr/chokidar/issues/449
 */
async function watch(
  root,
  packages,
  {
    scanOnce,
    quiet,
    forceInstall,
    monoRepoPackages,
    localPackages,
    packageNameToPath,
    externalRegistry,
  }
) {
  setDefaultSpawnStdio(quiet ? `ignore` : `inherit`)
  // determine if in a yarn workspace - if so, force using verdaccio,
  // as the current logic of copying files will not work correctly.
  const yarnWorkspaceRoot = findWorkspaceRoot()
  if (yarnWorkspaceRoot && process.env.NODE_ENV !== `test`) {
    console.log(`Yarn workspace found.`)
    forceInstall = true
  }

  let afterPackageInstallation = false
  let queuedCopies = []

  const realCopyPath = (arg) => {
    const { oldPath, newPath, quiet, resolve, reject, retry = 0 } = arg
    fs.copy(oldPath, newPath, (err) => {
      if (err) {
        if (retry >= MAX_COPY_RETRIES) {
          console.error(err)
          reject(err)
          return
        } else {
          setTimeout(
            () => realCopyPath({ ...arg, retry: retry + 1 }),
            500 * Math.pow(2, retry)
          )
          return
        }
      }

      // When the medusa binary is copied over, it is not set up with the
      // executable permissions that it is given when installed via yarn.
      // This fixes the issue where, after running medusa-dev, running
      // `yarn medusa develop` fails with a permission issue.
      // @fixes https://github.com/medusajs/medusa/issues/18809
      // Binary files we target:
      // - medusa/bin/medusa.js
      // - medusa/cli.js
      // - medusa-cli/cli.js
      if (/(bin\/medusa.js|medusa(-cli)?\/cli.js)$/.test(newPath)) {
        fs.chmodSync(newPath, `0755`)
      }

      numCopied += 1
      if (!quiet) {
        console.log(`Copied ${oldPath} to ${newPath}`)
      }
      resolve()
    })
  }

  const copyPath = (oldPath, newPath, quiet, packageName) =>
    new Promise((resolve, reject) => {
      const argObj = { oldPath, newPath, quiet, packageName, resolve, reject }
      if (afterPackageInstallation) {
        realCopyPath(argObj)
      } else {
        queuedCopies.push(argObj)
      }
    })

  const runQueuedCopies = () => {
    afterPackageInstallation = true
    queuedCopies.forEach((argObj) => realCopyPath(argObj))
    queuedCopies = []
  }

  const clearJSFilesFromNodeModules = async () => {
    const packagesToClear = queuedCopies.reduce((acc, { packageName }) => {
      if (packageName) {
        acc.add(packageName)
      }
      return acc
    }, new Set())

    await Promise.all(
      [...packagesToClear].map(
        async (packageToClear) =>
          await del([
            `node_modules/${packageToClear}/**/*.{js,js.map}`,
            `!node_modules/${packageToClear}/node_modules/**/*.{js,js.map}`,
            `!node_modules/${packageToClear}/src/**/*.{js,js.map}`,
          ])
      )
    )
  }
  // check package deps and, if they depend on other packages from the
  // monorepo, add them to the packages list
  const { seenPackages, depTree } = traversePackagesDeps({
    root,
    packages: _.uniq(localPackages),
    monoRepoPackages,
    packageNameToPath,
  })

  const allPackagesToWatch = packages
    ? _.intersection(packages, seenPackages)
    : seenPackages

  const ignoredPackageJSON = new Map()
  const ignorePackageJSONChanges = (packageName, contentArray) => {
    ignoredPackageJSON.set(packageName, contentArray)

    return () => {
      ignoredPackageJSON.delete(packageName)
    }
  }

  if (forceInstall) {
    try {
      if (allPackagesToWatch.length > 0) {
        await publishPackagesLocallyAndInstall({
          packagesToPublish: allPackagesToWatch,
          packageNameToPath,
          localPackages,
          ignorePackageJSONChanges,
          yarnWorkspaceRoot,
          externalRegistry,
          root,
        })
      } else {
        // run `yarn`
        const yarnInstallCmd = [`yarn`]

        console.log(`Installing packages from public NPM registry`)
        await promisifiedSpawn(yarnInstallCmd)
        console.log(`Installation complete`)
      }
    } catch (e) {
      console.log(e)
    }

    process.exit()
  }

  if (allPackagesToWatch.length === 0) {
    console.error(`There are no packages to watch.`)
    return
  }

  const allPackagesIgnoringThemesToWatch = allPackagesToWatch.filter(
    (pkgName) => !pkgName.startsWith(`medusa-theme`)
  )

  const ignored = [
    /[/\\]node_modules[/\\]/i,
    /\.git/i,
    /\.DS_Store/,
    /[/\\]__tests__[/\\]/i,
    /[/\\]__mocks__[/\\]/i,
    /\.npmrc/i,
  ].concat(
    allPackagesIgnoringThemesToWatch.map(
      (p) => new RegExp(`${p}[\\/\\\\]src[\\/\\\\]`, `i`)
    )
  )
  const watchers = _.uniq(
    allPackagesToWatch
      .map((p) => path.join(packageNameToPath.get(p)))
      .filter((p) => fs.existsSync(p))
  )

  let allCopies = []
  const packagesToPublish = new Set()
  let isInitialScan = true
  let isPublishing = false

  const waitFor = new Set()
  let anyPackageNotInstalled = false

  const watchEvents = [`change`, `add`]
  const packagePathMatchingEntries = Array.from(packageNameToPath.entries())
  chokidar
    .watch(watchers, {
      ignored: [(filePath) => _.some(ignored, (reg) => reg.test(filePath))],
    })
    .on(`all`, async (event, filePath) => {
      if (!watchEvents.includes(event)) {
        return
      }

      // match against paths
      let packageName

      for (const [_packageName, packagePath] of packagePathMatchingEntries) {
        const relativeToThisPackage = path.relative(packagePath, filePath)
        if (!relativeToThisPackage.startsWith(`..`)) {
          packageName = _packageName
          break
        }
      }

      if (!packageName) {
        return
      }

      const prefix = packageNameToPath.get(packageName)

      // Copy over the local version.
      // Don't copy over the medusa bin file, as that breaks the NPM symlink.
      if (_.includes(filePath, `dist/medusa-cli.js`)) {
        return
      }

      const relativePackageFile = path.relative(prefix, filePath)

      const newPath = path.join(
        `./node_modules/${packageName}`,
        relativePackageFile
      )

      if (relativePackageFile === `package.json`) {
        // package.json files will change during publish to adjust the version
        // of the package (and its dependencies), so ignore changes during
        // this process
        if (isPublishing) {
          return
        }

        // Compare dependencies with the local version

        const didDepsChangedPromise = checkDepsChanges({
          newPath,
          packageName,
          monoRepoPackages,
          packageNameToPath,
          isInitialScan,
          ignoredPackageJSON,
        })

        if (isInitialScan) {
          // normally checkDepsChanges would be sync, but it can also make an
          // async GET request to unpkg if the local package is not installed,
          // so keep track of it to make sure all of this work finishes
          // before installing

          waitFor.add(didDepsChangedPromise)
        }

        const { didDepsChanged, packageNotInstalled } =
          await didDepsChangedPromise

        if (packageNotInstalled) {
          anyPackageNotInstalled = true
        }

        if (didDepsChanged) {
          if (isInitialScan) {
            waitFor.delete(didDepsChangedPromise)
            // handle dependency changes only in the initial scan - handling
            // this in watch mode correctly is certainly doable, but for the
            // sake of shipping, this limits some time-consuming edge cases.

            // A dependency changed - now we need to figure out the packages
            // that actually need to be published. If the package with changed
            // dependencies is a dependency of another medusa package - like,
            // for example, `medusa-plugin-page-creator` - we need to publish
            // both `medusa-plugin-page-creator` and `medusa` and install
            // `medusa` in the example site project.
            getDependantPackages({
              packageName,
              depTree,
              packages,
            }).forEach((packageToPublish) => {
              // scheduling publish - we will publish when `ready` is emitted,
              // as we can do a single publish then
              packagesToPublish.add(packageToPublish)
            })
          }
        }

        // don't ever copy package.json, as this will mess up any future
        // dependency change checks
        return
      }

      const localCopies = [copyPath(filePath, newPath, quiet, packageName)]

      // If this is from "cache-dir" also copy it into the site's .cache
      if (_.includes(filePath, `cache-dir`)) {
        const newCachePath = path.join(
          `.cache/`,
          path.relative(path.join(prefix, `cache-dir`), filePath)
        )
        localCopies.push(copyPath(filePath, newCachePath, quiet))
      }

      allCopies = allCopies.concat(localCopies)
    })
    .on(`ready`, async () => {
      // wait for all async work to be done before publishing / installing
      await Promise.all(Array.from(waitFor))

      if (isInitialScan) {
        isInitialScan = false
        if (packagesToPublish.size > 0) {
          isPublishing = true
          await publishPackagesLocallyAndInstall({
            packagesToPublish: Array.from(packagesToPublish),
            packageNameToPath,
            localPackages,
            ignorePackageJSONChanges,
            externalRegistry,
            root,
          })
          packagesToPublish.clear()
          isPublishing = false
        } else if (anyPackageNotInstalled) {
          // run `yarn`
          const yarnInstallCmd = [`yarn`]

          console.log(`Installing packages from public NPM registry`)
          await promisifiedSpawn(yarnInstallCmd)
          console.log(`Installation complete`)
        }

        await clearJSFilesFromNodeModules()
        runQueuedCopies()
      }

      // all files watched; quit once all files are copied if necessary
      Promise.all(allCopies).then(() => {
        if (scanOnce) {
          quit()
        }
      })
    })
}

module.exports = watch
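The copy retry in `realCopyPath` uses exponential backoff: each attempt waits `500 * 2^retry` milliseconds before retrying, up to `MAX_COPY_RETRIES`. A minimal sketch that just computes the schedule (delays are computed rather than slept, for brevity):

```javascript
// Sketch of the retry backoff schedule used by realCopyPath above:
// retry 0 waits 500ms, retry 1 waits 1s, retry 2 waits 2s.
const MAX_COPY_RETRIES = 3

const backoffDelays = Array.from(
  { length: MAX_COPY_RETRIES },
  (_, retry) => 500 * Math.pow(2, retry)
)

console.log(backoffDelays) // [ 500, 1000, 2000 ]
```

Doubling the delay between attempts gives transient filesystem errors (e.g. a file still being written by the compiler) time to clear without stalling the watcher for long on the first failure.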
@@ -0,0 +1,29 @@
{
  "compilerOptions": {
    "lib": ["es5", "es6", "es2019"],
    "target": "es6",
    "outDir": "./dist",
    "esModuleInterop": true,
    "declaration": true,
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "sourceMap": true,
    "noImplicitReturns": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "noImplicitThis": true,
    "allowJs": true,
    "skipLibCheck": true,
    "downlevelIteration": true // to use ES5 specific tooling
  },
  "include": ["src"],
  "exclude": [
    "dist",
    "./src/**/__tests__",
    "./src/**/__mocks__",
    "./src/**/__fixtures__",
    "node_modules"
  ]
}
@@ -0,0 +1,5 @@
{
  "extends": "./tsconfig.json",
  "include": ["src"],
  "exclude": ["node_modules"]
}
@@ -0,0 +1,6 @@
/dist
node_modules
.DS_store
.env*
.env
*.sql
File diff suppressed because one or more lines are too long
@@ -0,0 +1,219 @@
# medusa-oas-cli - 0.1.0 - experimental

A command-line tool for all OpenAPI Specification (OAS) related tooling.

## Install

`yarn add --dev @medusajs/medusa-oas-cli`

Installing in the global namespace is not yet supported.
~~`npm install -g @medusajs/medusa-oas-cli`~~

## Configuration / First time setup

N/A

## How to use

```bash
yarn medusa-oas <command>
```

---

### Command - `oas`

This command will scan the `@medusajs/medusa` package in order to extract JSDoc OAS into a JSON file.

The command will output one of three files, `admin.oas.json`, `store.oas.json` or `combined.oas.json`, in the
directory where the command was run.

Invalid OAS will throw an error and prevent the files from being output.
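The JSDoc OAS being extracted lives in comment blocks above route handlers, in the `@oas [method] path` style that swagger-inline scans for. A minimal sketch of the annotation shape (the route, fields, and handler below are illustrative assumptions, not real Medusa endpoints):

```javascript
// Illustrative JSDoc OAS annotation: swagger-inline collects the YAML that
// follows the @oas tag. Route, operationId, and handler are hypothetical.
/**
 * @oas [get] /store/products
 * operationId: "GetProducts"
 * summary: "List Products"
 * responses:
 *   200:
 *     description: OK
 */
function listProducts(req, res) {
  res.json({ products: [] })
}

// minimal invocation with a stub response object
let payload
listProducts({}, { json: (body) => (payload = body) })
console.log(payload) // { products: [] }
```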

#### `--type <string>`

Specify which API OAS to create. Accepts `admin`, `store`, `combined`.

The `combined` option will merge both the admin and the store APIs into a single OAS file.

```bash
yarn medusa-oas oas --type admin
```

#### `--out-dir <path>`

Specify the directory in which the files should be output. It accepts a relative or absolute path.
If the directory doesn't exist, it will be created. Defaults to `./`.

```bash
yarn medusa-oas oas --out-dir ./oas
```

#### `--paths <paths...>`

Allows passing additional directory paths to crawl for JSDoc OAS to include in the generated OAS.
It accepts multiple entries.

```bash
yarn medusa-oas oas --paths ~/medusa-server/src
```

#### `--base <path>`

Allows overwriting the content of the API's base.yaml OAS that is fed to swagger-inline.
Paths, tags, and components will be merged together. Other OAS properties will be overwritten.

```bash
yarn medusa-oas oas --base ~/medusa-server/oas/custom.oas.base.yaml
```

#### `--dry-run`

Will package the OAS but will not output files. Useful for validating OAS.

```bash
yarn medusa-oas oas --dry-run
```

#### `--force`

Ignore OAS errors and attempt to output the generated OAS files.

```bash
yarn medusa-oas oas --force
```

---

### Command - `client`

Will generate API client files from a given OAS file.

#### `--src-file <path>`

Specify the path to the OAS JSON file.

```bash
yarn medusa-oas client --src-file ./store.oas.json
```

#### `--name <name>`

Namespace for the generated client. Usually `admin` or `store`.

```bash
yarn medusa-oas client --name admin
```

#### `--out-dir <path>`

Specify the directory in which the files should be output. Accepts relative and absolute paths.
If the directory doesn't exist, it will be created. Defaults to `./`.

```bash
yarn medusa-oas client --out-dir ./client
```

#### `--type <type>`

Client component types to generate. Accepts `all`, `types`, `client`, `hooks`.
Defaults to `all`.

```bash
yarn medusa-oas client --type types
```

#### `--types-packages <name>`

Replace relative import statements with the types package name. Mandatory when using `--type client` or `--type hooks`.

```bash
yarn medusa-oas client --types-packages @medusajs/client-types
```

#### `--client-packages <name>`

Replace relative import statements with the client package name. Mandatory when using `--type hooks`.

```bash
yarn medusa-oas client --client-packages @medusajs/medusa-js
```

#### `--clean`

Delete the destination directory's content before generating the client.

```bash
yarn medusa-oas client --clean
```

---

### Command - `docs`

Will sanitize OAS for use with Redocly's API documentation viewer.

#### `--src-file <path>`

Specify the path to the OAS JSON file.

```bash
yarn medusa-oas docs --src-file ./store.oas.json
```

#### `--out-dir <path>`

Specify the directory in which the files should be output. Accepts relative and absolute paths.
If the directory doesn't exist, it will be created. Defaults to `./`.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --out-dir ./docs
```

#### `--config <path>`

Specify the path to a Redocly config file.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --config ./redocly-config.yaml
```

#### `--dry-run`

Will sanitize the OAS but will not output files. Useful for troubleshooting circular reference issues.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --dry-run
```

#### `--clean`

Delete the destination directory's content before generating docs.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --clean
```

#### `--split`

Creates a multi-file structure output. Uses `redocly split` internally.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --split
```

#### `--preview`

Generate a preview of the API documentation in a browser. Does not output files. Uses `redocly preview-docs` internally.

```bash
yarn medusa-oas docs --src-file ../../../www/apps/api-reference/specs/store.oas.json --preview
```

#### `--html`

Generate a zero-dependency static HTML file. Uses `redocly build-docs` internally.

```bash
yarn medusa-oas docs --src-file ./store.oas.json --html
```
@@ -0,0 +1,14 @@
module.exports = {
  transform: {
    "^.+\\.[jt]s?$": [
      "ts-jest",
      {
        tsconfig: "tsconfig.json",
        isolatedModules: false,
      },
    ],
  },
  testEnvironment: `node`,
  moduleFileExtensions: [`js`, `ts`],
  testTimeout: 100000,
}
@@ -0,0 +1,5 @@
openapi: 3.0.0
info:
  version: 1.0.0
  title: Medusa API
paths: { }
@@ -0,0 +1,53 @@
{
  "name": "@medusajs/medusa-oas-cli",
  "version": "0.3.2",
  "description": "OAS CLI",
  "main": "dist/index.js",
  "bin": {
    "medusa-oas": "./dist/index.js"
  },
  "files": [
    "dist",
    "oas"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/medusajs/medusa",
    "directory": "packages/oas/medusa-oas-cli"
  },
  "publishConfig": {
    "access": "public"
  },
  "author": "Medusa",
  "license": "MIT",
  "devDependencies": {
    "@types/js-yaml": "^4.0.9",
    "jest": "^29.1.0",
    "js-yaml": "^4.1.0",
    "ts-jest": "^29.1.0",
    "ts-node": "^10.9.1",
    "typescript": "4.9.5",
    "uuid": "^9.0.0"
  },
  "scripts": {
    "prepublishOnly": "cross-env NODE_ENV=production tsc --build",
    "build": "tsc --build",
    "medusa-oas": "ts-node src/index.ts"
  },
  "dependencies": {
    "@medusajs/medusa": "^1.20.4",
    "@medusajs/openapi-typescript-codegen": "^0.2.1",
    "@medusajs/types": "^1.11.15",
    "@medusajs/utils": "^1.11.8",
    "@readme/json-schema-ref-parser": "^1.2.0",
    "@readme/openapi-parser": "^2.4.0",
    "@redocly/cli": "^1.7.0",
    "@types/lodash": "^4.14.191",
    "commander": "^10.0.0",
    "execa": "1.0.0",
    "lodash": "^4.17.21",
    "openapi3-ts": "^3.1.2",
    "swagger-inline": "^6.1.0",
    "yargs": "^17.7.2"
  }
}
@@ -0,0 +1,87 @@
module.exports = CircularPatch

/**
 * Since ref() is triggered upon reaching a $ref OAS line (leaf),
 * we want to invert the `schemas` instructions in order to optimize iterators.
 */
function preparePatches(schemas) {
  const patches = []
  const patchesObj = {}
  for (const schemaToPatch in schemas) {
    for (const schemaName of schemas[schemaToPatch]) {
      if (!patchesObj[schemaName]) {
        patchesObj[schemaName] = {
          schemaName,
          schemaPointer: `#/components/schemas/${schemaName}`,
          schemaToPatchPointers: [],
        }
      }
      const schemaToPatchPointer = `#/components/schemas/${schemaToPatch}`
      if (
        !patchesObj[schemaName].schemaToPatchPointers.includes(
          schemaToPatchPointer
        )
      ) {
        patchesObj[schemaName].schemaToPatchPointers.push(schemaToPatchPointer)
      }
    }
  }
  for (const key in patchesObj) {
    patches.push(patchesObj[key])
  }
  return patches
}

function applyPatch(node, schemaName) {
  delete node["$ref"]
  node.type = "object"
  if (!node.description && schemaName) {
    node.description = `${schemaName} object.`
  }
}

function CircularPatch({ schemas = {}, verbose = false }) {
  const logs = []
  const patches = preparePatches(schemas)
  const refPathPrefix = "#/components/schemas/"
  const refPathPrefixLength = refPathPrefix.length
  const refPathPrefixRegex = /#\/components\/schemas\/\w+/
  return {
    ref(ref, ctx, resolved) {
      if (
        ctx.type.name.toLowerCase() !== "schema" ||
        !resolved?.location?.pointer
      ) {
        return
      }
      for (const patch of patches) {
        if (resolved.location.pointer !== patch.schemaPointer) {
          continue
        }
        const ctxSchemaPointer =
          ctx.location.pointer.match(refPathPrefixRegex)[0]
        if (!patch.schemaToPatchPointers.includes(ctxSchemaPointer)) {
          continue
        }
        // pass the schema name so the patched node gets a description
        applyPatch(ref, patch.schemaName)
        if (verbose) {
          logs.push(
            `${ctxSchemaPointer.substring(refPathPrefixLength)} patch $ref to ${
              patch.schemaName
            }`
          )
        }
      }
    },
    Root: {
      leave() {
        if (verbose) {
          logs.sort()
          for (const log of logs) {
            console.debug(log)
          }
        }
      },
    },
  }
}
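The inversion performed by preparePatches can be sketched standalone: the `schemas` config maps a schema to the schemas whose `$ref`s to it should be patched, and the patch list flips that around so the `ref()` visitor can look up a patch by the resolved `$ref` target. A self-contained sketch with a small sample config:

```javascript
// Standalone sketch of the preparePatches inversion above,
// using a small sample `schemas` config.
function preparePatches(schemas) {
  const patchesObj = {}
  for (const schemaToPatch in schemas) {
    for (const schemaName of schemas[schemaToPatch]) {
      patchesObj[schemaName] ??= {
        schemaName,
        schemaPointer: `#/components/schemas/${schemaName}`,
        schemaToPatchPointers: [],
      }
      const pointer = `#/components/schemas/${schemaToPatch}`
      if (!patchesObj[schemaName].schemaToPatchPointers.includes(pointer)) {
        patchesObj[schemaName].schemaToPatchPointers.push(pointer)
      }
    }
  }
  return Object.values(patchesObj)
}

// e.g. patch the Customer $refs that appear inside Address and Cart
const patches = preparePatches({ Address: ["Customer"], Cart: ["Customer"] })
console.log(patches[0].schemaToPatchPointers)
// [ '#/components/schemas/Address', '#/components/schemas/Cart' ]
```

The result is one entry per referenced schema (here a single `Customer` entry), which is why the decorator can iterate `patches` once per `$ref` instead of re-scanning the whole config.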
@@ -0,0 +1,14 @@
const CircularPatch = require("./decorators/circular-patch")

const id = "medusa"

const decorators = {
  oas3: {
    "circular-patch": CircularPatch,
  },
}

module.exports = {
  id,
  decorators,
}
@@ -0,0 +1,164 @@
plugins:
  - "./plugins/medusa/index.js"

# Allows replacing a $ref with `type: object` in order to avoid infinite loops
# when Redocly attempts to render circular references.
decorators:
  medusa/circular-patch:
    schemas:
      Address:
        - Customer
      Cart:
        - Customer
        - Order
        - Payment
        - PaymentSession
      ClaimImage:
        - ClaimItem
      ClaimItem:
        - ClaimOrder
      ClaimOrder:
        - Fulfillment
        - Order
        - Return
      Country:
        - Region
      Customer:
        - Order
      CustomerGroup:
        - Customer
        - PriceList
      Discount:
        - Discount
      DiscountRule:
        - DiscountCondition
      DraftOrder:
        - Cart
        - Order
      Fulfillment:
        - ClaimOrder
        - Order
        - Swap
      FulfillmentItem:
        - Fulfillment
      GiftCard:
        - Order
      GiftCardTransaction:
        - GiftCard
        - Order
      LineItem:
        - Cart
        - ClaimOrder
        - Order
        - OrderEdit
        - Swap
      LineItemAdjustment:
        - LineItem
      LineItemTaxLine:
        - LineItem
      MoneyAmount:
        - PriceList
        - ProductVariant
        - Region
      Notification:
        - Notification
      Order:
        - Cart
        - ClaimOrder
        - Customer
        - DraftOrder
        - Fulfillment
        - OrderEdit
        - Payment
        - Refund
        - Return
        - Swap
      OrderEdit:
        - Order
      OrderItemChange:
        - OrderEdit
      Payment:
        - Cart
        - Order
        - Swap
      ProductCategory:
        - ProductCategory
        - Product
      ProductCollection:
        - Product
      ProductOption:
        - Product
      ProductOptionValue:
        - ProductOption
        - ProductVariant
      ProductVariant:
        - Product
      ProductVariantInventoryItem:
        - ProductVariant
      Refund:
        - Order
        - Payment
      Return:
        - ClaimOrder
        - Order
        - Swap
      ReturnItem:
        - Return
      ReturnReason:
        - ReturnReason
      SalesChannelLocation:
        - SalesChannel
      ShippingMethod:
        - Cart
        - ClaimOrder
        - Order
        - Payment
        - Return
        - Swap
      ShippingMethodTaxLine:
        - ShippingMethod
      ShippingOption:
        - Region
      ShippingOptionRequirement:
        - ShippingOption
      ShippingProfile:
        - Product
        - ShippingOption
      Swap:
        - Cart
        - Fulfillment
        - Order
        - Payment
        - Return
      TaxRate:
        - Region
      TrackingLink:
        - Fulfillment
      SalesChannel:
        - Order
        - Cart
        - PublishableApiKey
      CustomerGroupResponse:
        - CustomerResponse
      ProductCategoryResponse:
        - ProductCategoryResponse

# Similar config to /www/docs/docusaurus.config.js > redocusaurus
# Allows emulating rendering of the public API documentation when using `yarn redocly preview-docs openapi.yaml`
theme:
  openapi:
    theme:
      colors:
        primary:
          dark: "#242526"
    sidebar:
      width: "250px"
    disableSearch: true
    expandResponses: "200,204"
    generatedPayloadSamplesMaxDepth: 4
    hideDownloadButton: true
    hideRequestPayloadSample: true
    nativeScrollbars: true
    requiredPropsFirst: true
    showObjectSchemaExamples: true
    sortTagsAlphabetically: true
@@ -0,0 +1,376 @@
import { exists, getTmpDirectory } from "../utils/fs-utils"
import { writeJson } from "../utils/json-utils"
import { OpenAPIObject } from "openapi3-ts"
import path from "path"
import { v4 as uid } from "uuid"
import { readYaml, writeYaml } from "../utils/yaml-utils"
import { mkdir, readdir } from "fs/promises"
import {
  formatHintRecommendation,
  getCircularPatchRecommendation,
  getCircularReferences,
} from "../utils/circular-patch-utils"
import execa from "execa"

const basePath = path.resolve(__dirname, `../../../`)

export const runCLI = async (command: string, options: string[] = []) => {
  const params = ["run", "medusa-oas", command, ...options]
  try {
    const { all: logs } = await execa("yarn", params, {
      cwd: basePath,
      all: true,
    })
  } catch (err) {
    throw new Error(err.message + err.all)
  }
}

export const getBaseOpenApi = (): OpenAPIObject => {
  return {
    openapi: "3.0.0",
    info: {
      title: "Test",
      version: "1.0.0",
    },
    paths: {},
    components: {},
  }
}

describe("command docs", () => {
  let tmpDir: string
  let openApi: OpenAPIObject

  beforeAll(async () => {
    tmpDir = await getTmpDirectory()
  })

  beforeEach(async () => {
    openApi = getBaseOpenApi()
  })

  describe("basic usage", () => {
    let srcFile: string

    beforeAll(async () => {
      openApi = getBaseOpenApi()
      openApi.components = {
        schemas: {
          Order: {
            type: "object",
          },
        },
      }
      const outDir = path.resolve(tmpDir, uid())
      await mkdir(outDir, { recursive: true })
      srcFile = path.resolve(outDir, "store.oas.json")
      await writeJson(srcFile, openApi)
    })

    it("should generate docs", async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("docs", ["--src-file", srcFile, "--out-dir", outDir])
      const generatedFilePath = path.resolve(outDir, "openapi.yaml")
      const oas = (await readYaml(generatedFilePath)) as OpenAPIObject
      const files = await readdir(outDir)
      expect(oas.components?.schemas?.Order).toBeDefined()
      expect(files.length).toBe(1)
    })

    it("should clean output directory", async () => {
      const outDir = path.resolve(tmpDir, uid())
      await mkdir(outDir, { recursive: true })
      await writeJson(path.resolve(outDir, "test.json"), { foo: "bar" })
      await runCLI("docs", [
        "--src-file",
        srcFile,
        "--out-dir",
        outDir,
        "--clean",
      ])
      const files = await readdir(outDir)
      expect(files.includes("test.json")).toBeFalsy()
      expect(files.length).toBe(1)
    })

    it("should not output any files", async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("docs", [
        "--src-file",
        srcFile,
        "--out-dir",
        outDir,
        "--dry-run",
      ])
      const outDirExists = await exists(outDir)
      expect(outDirExists).toBeFalsy()
    })

    it("should split output into multiple files and directories", async () => {
      const outDir = path.resolve(tmpDir, uid())
      /**
       * For split to output files, we need to not be in test mode.
       * See https://github.com/Redocly/redocly-cli/blob/v1.0.0-beta.125/packages/cli/src/utils.ts#L206-L215
       */
      const previousEnv = process.env.NODE_ENV
      process.env.NODE_ENV = "development"
      await runCLI("docs", [
        "--src-file",
        srcFile,
        "--out-dir",
        outDir,
        "--split",
        "--clean",
      ])
      process.env.NODE_ENV = previousEnv
      const oasFileExists = await exists(path.resolve(outDir, "openapi.yaml"))
      const outFileExists = await exists(
        path.resolve(outDir, "components/schemas/Order.yaml")
      )
      expect(oasFileExists).toBeTruthy()
      expect(outFileExists).toBeTruthy()
    })

    it("should generate static HTML docs", async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("docs", [
        "--src-file",
        srcFile,
        "--out-dir",
        outDir,
        "--html",
      ])
      const generatedFilePath = path.resolve(outDir, "index.html")
      const htmlFileExists = await exists(generatedFilePath)
      expect(htmlFileExists).toBeTruthy()
    })
  })

  describe("--config", () => {
|
||||
let srcFile: string
|
||||
let configFile: string
|
||||
let configYamlFile: string
|
||||
|
||||
beforeAll(async () => {
|
||||
openApi = getBaseOpenApi()
|
||||
openApi.components = {
|
||||
schemas: {
|
||||
Customer: {
|
||||
type: "object",
|
||||
properties: {
|
||||
address: {
|
||||
$ref: "#/components/schemas/Address",
|
||||
},
|
||||
},
|
||||
},
|
||||
TestOrder: {
|
||||
type: "object",
|
||||
properties: {
|
||||
address: {
|
||||
$ref: "#/components/schemas/Address",
|
||||
},
|
||||
},
|
||||
},
|
||||
Address: {
|
||||
type: "object",
|
||||
properties: {
|
||||
/**
|
||||
* We expect this circular reference to already be handled by
|
||||
* our CLI's default redocly-config.yaml
|
||||
*/
|
||||
customer: {
|
||||
$ref: "#/components/schemas/Customer",
|
||||
},
|
||||
/**
|
||||
* We expect this circular reference to not be handled.
|
||||
*/
|
||||
test_order: {
|
||||
$ref: "#/components/schemas/TestOrder",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
const config = {
|
||||
decorators: {
|
||||
"medusa/circular-patch": {
|
||||
schemas: {
|
||||
Address: ["TestOrder"],
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await mkdir(outDir, { recursive: true })
|
||||
srcFile = path.resolve(outDir, "store.oas.json")
|
||||
await writeJson(srcFile, openApi)
|
||||
configFile = path.resolve(outDir, "redocly-config.json")
|
||||
await writeJson(configFile, config)
|
||||
configYamlFile = path.resolve(outDir, "redocly-config.yaml")
|
||||
await writeYaml(configYamlFile, config)
|
||||
})
|
||||
|
||||
it("should fail with unhandled circular reference", async () => {
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await expect(
|
||||
runCLI("docs", [
|
||||
"--src-file",
|
||||
srcFile,
|
||||
"--out-dir",
|
||||
outDir,
|
||||
"--dry-run",
|
||||
])
|
||||
).rejects.toThrow("Unhandled circular references")
|
||||
})
|
||||
|
||||
it("should succeed with patched circular reference", async () => {
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await expect(
|
||||
runCLI("docs", [
|
||||
"--src-file",
|
||||
srcFile,
|
||||
"--out-dir",
|
||||
outDir,
|
||||
"--config",
|
||||
configFile,
|
||||
"--dry-run",
|
||||
])
|
||||
).resolves.not.toThrow()
|
||||
})
|
||||
|
||||
it("should succeed when config is of type yaml", async () => {
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await expect(
|
||||
runCLI("docs", [
|
||||
"--src-file",
|
||||
srcFile,
|
||||
"--out-dir",
|
||||
outDir,
|
||||
"--config",
|
||||
configYamlFile,
|
||||
"--dry-run",
|
||||
])
|
||||
).resolves.not.toThrow()
|
||||
})
|
||||
|
||||
it("should fail when config is not a file", async () => {
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await expect(
|
||||
runCLI("docs", [
|
||||
"--src-file",
|
||||
srcFile,
|
||||
"--out-dir",
|
||||
outDir,
|
||||
"--dry-run",
|
||||
"--config",
|
||||
outDir,
|
||||
])
|
||||
).rejects.toThrow("--config must be a file")
|
||||
})
|
||||
|
||||
it("should fail when config is not of supported file type", async () => {
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await mkdir(outDir, { recursive: true })
|
||||
const tmpFile = path.resolve(outDir, "tmp.txt")
|
||||
await writeJson(tmpFile, { foo: "bar" })
|
||||
await expect(
|
||||
runCLI("docs", [
|
||||
"--src-file",
|
||||
srcFile,
|
||||
"--out-dir",
|
||||
outDir,
|
||||
"--dry-run",
|
||||
"--config",
|
||||
tmpFile,
|
||||
])
|
||||
).rejects.toThrow("--config file must be of type .json or .yaml")
|
||||
})
|
||||
})
|
||||
|
||||
describe("circular references", () => {
|
||||
let srcFile: string
|
||||
|
||||
beforeAll(async () => {
|
||||
openApi = getBaseOpenApi()
|
||||
openApi.components = {
|
||||
schemas: {
|
||||
Customer: {
|
||||
type: "object",
|
||||
properties: {
|
||||
address: {
|
||||
$ref: "#/components/schemas/Address",
|
||||
},
|
||||
},
|
||||
},
|
||||
TestOrder: {
|
||||
type: "object",
|
||||
properties: {
|
||||
address: {
|
||||
$ref: "#/components/schemas/Address",
|
||||
},
|
||||
},
|
||||
},
|
||||
Address: {
|
||||
type: "object",
|
||||
properties: {
|
||||
customer: {
|
||||
$ref: "#/components/schemas/Customer",
|
||||
},
|
||||
test_order: {
|
||||
$ref: "#/components/schemas/TestOrder",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
const outDir = path.resolve(tmpDir, uid())
|
||||
await mkdir(outDir, { recursive: true })
|
||||
srcFile = path.resolve(outDir, "store.oas.json")
|
||||
await writeJson(srcFile, openApi)
|
||||
})
|
||||
|
||||
it("should find circular references", async () => {
|
||||
const { circularRefs, oas } = await getCircularReferences(srcFile)
|
||||
expect(circularRefs.length).toBe(2)
|
||||
expect(circularRefs).toEqual(
|
||||
expect.arrayContaining([
|
||||
"#/components/schemas/Address/properties/customer",
|
||||
"#/components/schemas/TestOrder/properties/address",
|
||||
])
|
||||
)
|
||||
})
|
||||
|
||||
it("should recommend which schemas to patch to resolve circular references", async () => {
|
||||
/**
|
||||
* The recommendation is heavily influenced but the dereference operation
|
||||
* from @readme/json-schema-ref-parser. It's not an exact science and the
|
||||
* results may vary between versions.
|
||||
*/
|
||||
const { circularRefs, oas } = await getCircularReferences(srcFile)
|
||||
const recommendation = getCircularPatchRecommendation(circularRefs, oas)
|
||||
expect(recommendation).toEqual(
|
||||
expect.objectContaining({
|
||||
Address: expect.arrayContaining(["Customer"]),
|
||||
TestOrder: expect.arrayContaining(["Address"]),
|
||||
})
|
||||
)
|
||||
})
|
||||
|
||||
it("should format hint", async () => {
|
||||
const { circularRefs, oas } = await getCircularReferences(srcFile)
|
||||
const recommendation = getCircularPatchRecommendation(circularRefs, oas)
|
||||
const hint = formatHintRecommendation(recommendation)
|
||||
expect(hint).toEqual(
|
||||
expect.stringContaining(`decorators:
|
||||
medusa/circular-patch:
|
||||
schemas:
|
||||
Address:
|
||||
- Customer
|
||||
TestOrder:
|
||||
- Address
|
||||
`)
|
||||
)
|
||||
})
|
||||
})
|
||||
})
|
||||
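The circular-reference detection exercised by the tests above can be illustrated with a small standalone sketch. This is a hypothetical illustration, not the CLI's implementation: the real tool dereferences with `@readme/json-schema-ref-parser`, while this sketch simply follows `$ref` targets between component schemas and records any schema that appears on its own reference path.

```python
# Minimal sketch of circular-$ref detection over OpenAPI component schemas.
# Hypothetical illustration only; the real CLI relies on
# @readme/json-schema-ref-parser for dereferencing.

def find_circular_refs(schemas):
    """Return the names of schemas that participate in a $ref cycle."""

    def refs_of(schema):
        # Collect direct $ref targets from a schema's properties.
        out = []
        for prop in schema.get("properties", {}).values():
            ref = prop.get("$ref", "")
            if ref.startswith("#/components/schemas/"):
                out.append(ref.rsplit("/", 1)[-1])
        return out

    circular = set()

    def visit(name, stack):
        if name in stack:
            # Every schema from the first occurrence onward is on the cycle.
            circular.update(stack[stack.index(name):])
            return
        for target in refs_of(schemas.get(name, {})):
            visit(target, stack + [name])

    for name in schemas:
        visit(name, [])
    return circular


schemas = {
    "Customer": {"properties": {"address": {"$ref": "#/components/schemas/Address"}}},
    "Address": {"properties": {"customer": {"$ref": "#/components/schemas/Customer"}}},
    "Order": {"properties": {}},
}
print(sorted(find_circular_refs(schemas)))  # ['Address', 'Customer']
```

This mirrors the shape of the `Customer`/`Address` fixture used in the tests: the two schemas reference each other and are flagged, while the standalone `Order` schema is not.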
@@ -0,0 +1,669 @@
import fs from "fs/promises"
import { OpenAPIObject, SchemaObject } from "openapi3-ts"
import { OperationObject } from "openapi3-ts/src/model/OpenApi"
import path from "path"
import { v4 as uid } from "uuid"
import { getTmpDirectory } from "../utils/fs-utils"
import { readYaml } from "../utils/yaml-utils"
import { readJson } from "../utils/json-utils"
import execa from "execa"

/**
 * OAS output directory
 *
 * @privateRemarks
 * This should be the only directory OAS is loaded from for Medusa V2.
 * For now, we only use it if the --v2 flag is passed to the CLI tool.
 */
const oasOutputPath = path.resolve(
  __dirname, "..", "..", "..", "..", "docs-util", "oas-output"
)
const basePath = path.resolve(__dirname, `../../`)

export const runCLI = async (command: string, options: string[] = []) => {
  const params = ["run", "medusa-oas", command, ...options]
  try {
    const { all: logs } = await execa("yarn", params, {
      cwd: basePath,
      all: true,
    })
  } catch (err) {
    throw new Error(err.message + err.all)
  }
}

const listOperations = (oas: OpenAPIObject): OperationObject[] => {
  const operations: OperationObject[] = []
  for (const url in oas.paths) {
    if (oas.paths.hasOwnProperty(url)) {
      const path = oas.paths[url]
      for (const method in path) {
        if (path.hasOwnProperty(method)) {
          switch (method) {
            case "get":
            case "put":
            case "post":
            case "delete":
            case "options":
            case "head":
            case "patch":
              operations.push(path[method])
              break
          }
        }
      }
    }
  }
  return operations
}

describe("command oas", () => {
  let tmpDir: string

  beforeAll(async () => {
    tmpDir = await getTmpDirectory()
  })

  describe("--type admin", () => {
    let oas: OpenAPIObject

    /**
     * In a CI context, beforeAll might exceed the configured jest timeout.
     * Until we upgrade our jest version, the timeout error will be swallowed
     * and the test will fail in unexpected ways.
     */
    beforeAll(async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", ["--type", "admin", "--out-dir", outDir, "--local"])
      const generatedFilePath = path.resolve(outDir, "admin.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("generates oas with admin routes only", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/admin/products")).toBeTruthy()
      expect(routes.includes("/store/products")).toBeFalsy()
    })

    it("generates oas using admin.oas.base.yaml", async () => {
      const yamlFilePath = path.resolve(
        oasOutputPath,
        "base",
        "admin.oas.base.yaml"
      )
      const oasBase = (await readYaml(yamlFilePath)) as OpenAPIObject
      expect(oas.info.title).toEqual(oasBase.info.title)
    })
  })

  describe("--type store", () => {
    let oas: OpenAPIObject

    beforeAll(async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", ["--type", "store", "--out-dir", outDir, "--local"])
      const generatedFilePath = path.resolve(outDir, "store.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("generates oas with store routes only", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/admin/products")).toBeFalsy()
      expect(routes.includes("/store/products")).toBeTruthy()
    })

    it("generates oas using store.oas.base.yaml", async () => {
      const yamlFilePath = path.resolve(
        oasOutputPath,
        "base",
        "store.oas.base.yaml"
      )
      const oasBase = (await readYaml(yamlFilePath)) as OpenAPIObject
      expect(oas.info.title).toEqual(oasBase.info.title)
    })
  })

  describe("--type combined", () => {
    let oas: OpenAPIObject

    beforeAll(async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", ["--type", "combined", "--out-dir", outDir, "--local"])
      const generatedFilePath = path.resolve(outDir, "combined.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("generates oas with admin and store routes", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/admin/products")).toBeTruthy()
      expect(routes.includes("/store/products")).toBeTruthy()
    })

    it("generates oas using default.oas.base.yaml", async () => {
      const yamlFilePath = path.resolve(
        basePath,
        "oas",
        "default.oas.base.yaml"
      )
      const oasBase = (await readYaml(yamlFilePath)) as OpenAPIObject
      expect(oas.info.title).toEqual(oasBase.info.title)
    })

    it("prefixes tags with api type", async () => {
      const found = (oas.tags ?? []).filter((tag) => {
        return !(tag.name.startsWith("Admin") || tag.name.startsWith("Store"))
      })
      expect(found).toEqual([])
    })

    it("prefixes route's tags with api type", async () => {
      const tags: string[] = listOperations(oas)
        .map((operation) => {
          return operation.tags ?? []
        })
        .flat()
      const found = tags.filter((tag) => {
        return !(tag.startsWith("Admin") || tag.startsWith("Store"))
      })
      expect(found).toEqual([])
    })

    it("prefixes route's operationId with api type", async () => {
      const operationIds: string[] = listOperations(oas)
        .map((operation) => operation.operationId)
        .filter((operationId): operationId is string => !!operationId)
      const found = operationIds.filter((tag) => {
        return !(tag.startsWith("Admin") || tag.startsWith("Store"))
      })
      expect(found).toEqual([])
    })

    it("combines components.schemas from admin and store", async () => {
      const schemas = Object.keys(oas.components?.schemas ?? {})
      expect(schemas.includes("AdminProductsListRes")).toBeTruthy()
      expect(schemas.includes("StoreProductsListRes")).toBeTruthy()
    })
  })

  /**
   * To optimize test suite time, we only test --paths with the store api.
   */
  describe("--paths", () => {
    let oas: OpenAPIObject

    beforeAll(async () => {
      const fileContent = `
/** @oas [get] /foobar/tests
 * operationId: GetFoobarTests
 */
/** @oas [get] /store/regions
 * operationId: OverwrittenOperation
 */
/**
 * @schema FoobarTestSchema
 * type: object
 * properties:
 *   foo:
 *     type: string
 */
/**
 * @schema StoreRegionsListRes
 * type: object
 * properties:
 *   foo:
 *     type: string
 */
`
      const additionalPath = path.resolve(tmpDir, uid())
      const filePath = path.resolve(additionalPath, "foobar.ts")
      await fs.mkdir(additionalPath, { recursive: true })
      await fs.writeFile(filePath, fileContent, "utf8")

      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", [
        "--type",
        "store",
        "--out-dir",
        outDir,
        "--paths",
        additionalPath,
        "--local"
      ])
      const generatedFilePath = path.resolve(outDir, "store.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("should add new path to existing paths", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/store/products")).toBeTruthy()
      expect(routes.includes("/foobar/tests")).toBeTruthy()
    })

    it("should overwrite existing path", async () => {
      expect(oas.paths["/store/regions"]["get"].operationId).toBe(
        "OverwrittenOperation"
      )
    })

    it("should add new schema to existing schemas", async () => {
      const schemas = Object.keys(oas.components?.schemas ?? {})
      expect(schemas.includes("StoreProductsListRes")).toBeTruthy()
      expect(schemas.includes("FoobarTestSchema")).toBeTruthy()
    })

    it("should overwrite existing schema", async () => {
      const schema = oas.components?.schemas?.StoreRegionsListRes as
        | SchemaObject
        | undefined
      expect(schema?.properties?.foo).toBeDefined()
    })
  })

  /**
   * To optimize test suite time, we only test --base with the store api.
   */
  describe("--base", () => {
    let oas: OpenAPIObject

    beforeAll(async () => {
      const fileContent = `
openapi: 3.1.0
info:
  version: 1.0.1
  title: Custom API
servers:
  - url: https://foobar.com
security:
  - api_key: []
externalDocs:
  url: https://docs.com
webhooks:
  "foo-hook":
    get:
      responses:
        "200":
          description: OK
tags:
  - name: Products
    description: Overwritten tag
  - name: FoobarTag
    description: Foobar tag description
paths:
  "/foobar/tests":
    get:
      operationId: GetFoobarTests
      responses:
        "200":
          description: OK
  "/store/regions":
    get:
      operationId: OverwrittenOperation
      responses:
        "200":
          description: OK
components:
  schemas:
    FoobarTestSchema:
      type: object
      properties:
        foo:
          type: string
    StoreRegionsListRes:
      type: object
      properties:
        foo:
          type: string
  callbacks:
    fooCallback:
      get:
        description: foo callback
  examples:
    fooExample:
      description: foo example
  headers:
    fooHeader:
      description: foo header
  links:
    fooLink:
      description: foo link
      operationRef: GetFoobarTests
  parameters:
    fooParameter:
      description: foo parameter
      name: foobar
      in: path
      required: true
      schema:
        type: string
  requestBodies:
    fooRequestBody:
      description: foo requestBody
      content:
        "application/octet-stream": { }
  responses:
    fooResponse:
      description: foo response
  securitySchemes:
    fooSecurity:
      description: foo security
      type: apiKey
      name: foo-api-key
      in: header
`
      const targetDir = path.resolve(tmpDir, uid())
      const filePath = path.resolve(targetDir, "custom.oas.base.yaml")
      await fs.mkdir(targetDir, { recursive: true })
      await fs.writeFile(filePath, fileContent, "utf8")

      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", [
        "--type",
        "store",
        "--out-dir",
        outDir,
        "--base",
        filePath,
        "--local"
      ])
      const generatedFilePath = path.resolve(outDir, "store.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("should add new path to existing paths", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/store/products")).toBeTruthy()
      expect(routes.includes("/foobar/tests")).toBeTruthy()
    })

    it("should overwrite existing path", async () => {
      expect(oas.paths["/store/regions"]["get"].operationId).toBe(
        "OverwrittenOperation"
      )
    })

    it("should add new schema to existing schemas", async () => {
      const schemas = Object.keys(oas.components?.schemas ?? {})
      expect(schemas.includes("StoreProductsListRes")).toBeTruthy()
      expect(schemas.includes("FoobarTestSchema")).toBeTruthy()
    })

    it("should overwrite existing schema", async () => {
      const schema = oas.components?.schemas?.StoreRegionsListRes as
        | SchemaObject
        | undefined
      expect(schema?.properties?.foo).toBeDefined()
    })

    it("should replace base properties", async () => {
      expect(oas.openapi).toBe("3.1.0")
      expect(oas.info).toEqual({ version: "1.0.1", title: "Custom API" })
      expect(oas.servers).toEqual([{ url: "https://foobar.com" }])
      expect(oas.security).toEqual([{ api_key: [] }])
      expect(oas.externalDocs).toEqual({ url: "https://docs.com" })
      expect(oas.webhooks).toEqual({
        "foo-hook": { get: { responses: { "200": { description: "OK" } } } },
      })
    })

    it("should add new tag", async () => {
      expect(oas.tags).toEqual(
        expect.arrayContaining([
          expect.objectContaining({
            name: "FoobarTag",
          }),
        ])
      )
    })

    it("should overwrite existing tag", async () => {
      expect(oas.tags).toEqual(
        expect.arrayContaining([
          expect.objectContaining({
            name: "Products",
            description: "Overwritten tag",
          }),
        ])
      )
    })

    it("should add new components", async () => {
      const components = oas.components ?? {}
      expect(
        Object.keys(components.callbacks ?? {}).includes("fooCallback")
      ).toBeTruthy()
      expect(
        Object.keys(components.examples ?? {}).includes("fooExample")
      ).toBeTruthy()
      expect(
        Object.keys(components.headers ?? {}).includes("fooHeader")
      ).toBeTruthy()
      expect(
        Object.keys(components.links ?? {}).includes("fooLink")
      ).toBeTruthy()
      expect(
        Object.keys(components.parameters ?? {}).includes("fooParameter")
      ).toBeTruthy()
      expect(
        Object.keys(components.requestBodies ?? {}).includes("fooRequestBody")
      ).toBeTruthy()
      expect(
        Object.keys(components.responses ?? {}).includes("fooResponse")
      ).toBeTruthy()
      expect(
        Object.keys(components.securitySchemes ?? {}).includes("fooSecurity")
      ).toBeTruthy()
    })
  })

  describe("public OAS", () => {
    let oas: OpenAPIObject

    /**
     * In a CI context, beforeAll might exceed the configured jest timeout.
     * Until we upgrade our jest version, the timeout error will be swallowed
     * and the test will fail in unexpected ways.
     */
    beforeAll(async () => {
      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", ["--type", "admin", "--out-dir", outDir])
      const generatedFilePath = path.resolve(outDir, "admin.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("generates oas with admin routes only", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/admin/products")).toBeTruthy()
      expect(routes.includes("/store/products")).toBeFalsy()
    })
  })

  describe("public OAS with base", () => {
    let oas: OpenAPIObject

    beforeAll(async () => {
      const fileContent = `
openapi: 3.1.0
info:
  version: 1.0.1
  title: Custom API
servers:
  - url: https://foobar.com
security:
  - api_key: []
externalDocs:
  url: https://docs.com
webhooks:
  "foo-hook":
    get:
      responses:
        "200":
          description: OK
tags:
  - name: Products
    description: Overwritten tag
  - name: FoobarTag
    description: Foobar tag description
paths:
  "/foobar/tests":
    get:
      operationId: GetFoobarTests
      responses:
        "200":
          description: OK
  "/store/regions":
    get:
      operationId: OverwrittenOperation
      responses:
        "200":
          description: OK
components:
  schemas:
    FoobarTestSchema:
      type: object
      properties:
        foo:
          type: string
    StoreRegionsListRes:
      type: object
      properties:
        foo:
          type: string
  callbacks:
    fooCallback:
      get:
        description: foo callback
  examples:
    fooExample:
      description: foo example
  headers:
    fooHeader:
      description: foo header
  links:
    fooLink:
      description: foo link
      operationRef: GetFoobarTests
  parameters:
    fooParameter:
      description: foo parameter
      name: foobar
      in: path
      required: true
      schema:
        type: string
  requestBodies:
    fooRequestBody:
      description: foo requestBody
      content:
        "application/octet-stream": { }
  responses:
    fooResponse:
      description: foo response
  securitySchemes:
    fooSecurity:
      description: foo security
      type: apiKey
      name: foo-api-key
      in: header
`
      const targetDir = path.resolve(tmpDir, uid())
      const filePath = path.resolve(targetDir, "custom.oas.base.yaml")
      await fs.mkdir(targetDir, { recursive: true })
      await fs.writeFile(filePath, fileContent, "utf8")

      const outDir = path.resolve(tmpDir, uid())
      await runCLI("oas", [
        "--type",
        "store",
        "--out-dir",
        outDir,
        "--base",
        filePath,
      ])
      const generatedFilePath = path.resolve(outDir, "store.oas.json")
      oas = (await readJson(generatedFilePath)) as OpenAPIObject
    })

    it("should add new path to existing paths", async () => {
      const routes = Object.keys(oas.paths)
      expect(routes.includes("/store/products")).toBeTruthy()
      expect(routes.includes("/foobar/tests")).toBeTruthy()
    })

    it("should overwrite existing path", async () => {
      expect(oas.paths["/store/regions"]["get"].operationId).toBe(
        "OverwrittenOperation"
      )
    })

    it("should add new schema to existing schemas", async () => {
      const schemas = Object.keys(oas.components?.schemas ?? {})
      expect(schemas.includes("StoreProductsListRes")).toBeTruthy()
      expect(schemas.includes("FoobarTestSchema")).toBeTruthy()
    })

    it("should overwrite existing schema", async () => {
      const schema = oas.components?.schemas?.StoreRegionsListRes as
        | SchemaObject
        | undefined
      expect(schema?.properties?.foo).toBeDefined()
    })

    it("should replace base properties", async () => {
      expect(oas.openapi).toBe("3.1.0")
      expect(oas.info).toEqual({ version: "1.0.1", title: "Custom API" })
      expect(oas.servers).toEqual([{ url: "https://foobar.com" }])
      expect(oas.security).toEqual([{ api_key: [] }])
      expect(oas.externalDocs).toEqual({ url: "https://docs.com" })
      expect(oas.webhooks).toEqual({
        "foo-hook": { get: { responses: { "200": { description: "OK" } } } },
      })
    })

    it("should add new tag", async () => {
      expect(oas.tags).toEqual(
        expect.arrayContaining([
          expect.objectContaining({
            name: "FoobarTag",
          }),
        ])
      )
    })

    it("should overwrite existing tag", async () => {
      expect(oas.tags).toEqual(
        expect.arrayContaining([
          expect.objectContaining({
            name: "Products",
            description: "Overwritten tag",
          }),
        ])
      )
    })

    it("should add new components", async () => {
      const components = oas.components ?? {}
      expect(
        Object.keys(components.callbacks ?? {}).includes("fooCallback")
      ).toBeTruthy()
      expect(
        Object.keys(components.examples ?? {}).includes("fooExample")
      ).toBeTruthy()
      expect(
        Object.keys(components.headers ?? {}).includes("fooHeader")
      ).toBeTruthy()
      expect(
        Object.keys(components.links ?? {}).includes("fooLink")
      ).toBeTruthy()
      expect(
        Object.keys(components.parameters ?? {}).includes("fooParameter")
      ).toBeTruthy()
      expect(
        Object.keys(components.requestBodies ?? {}).includes("fooRequestBody")
      ).toBeTruthy()
      expect(
        Object.keys(components.responses ?? {}).includes("fooResponse")
      ).toBeTruthy()
      expect(
        Object.keys(components.securitySchemes ?? {}).includes("fooSecurity")
      ).toBeTruthy()
    })
  })
})
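The `--base` merge behavior that the suite above asserts (top-level properties replaced by the base document, while `paths` and component `schemas` merge key-by-key with base entries winning) can be sketched as a small standalone illustration. The `merge_base` helper below is hypothetical, not the CLI's actual implementation:

```python
# Sketch of merging a custom base OAS into a generated one.
# Hypothetical helper; shapes mirror the assertions in the test suite.

def merge_base(generated, base):
    # Base replaces top-level properties (openapi, info, servers, ...).
    merged = {**generated, **base}
    # Paths and schemas merge key-by-key; base entries overwrite.
    merged["paths"] = {**generated.get("paths", {}), **base.get("paths", {})}
    merged["components"] = {
        "schemas": {
            **generated.get("components", {}).get("schemas", {}),
            **base.get("components", {}).get("schemas", {}),
        }
    }
    return merged


generated = {
    "openapi": "3.0.0",
    "info": {"title": "Medusa Store API"},
    "paths": {
        "/store/products": {"get": {}},
        "/store/regions": {"get": {"operationId": "GetRegions"}},
    },
    "components": {"schemas": {"StoreProductsListRes": {}}},
}
base = {
    "openapi": "3.1.0",
    "info": {"title": "Custom API"},
    "paths": {
        "/foobar/tests": {"get": {}},
        "/store/regions": {"get": {"operationId": "OverwrittenOperation"}},
    },
    "components": {"schemas": {"FoobarTestSchema": {}}},
}
oas = merge_base(generated, base)
print(oas["openapi"])  # 3.1.0
```

Note how `/store/products` survives from the generated document while `/store/regions` is overwritten by the base, matching the "should overwrite existing path" expectations.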
@@ -0,0 +1,176 @@
import path from "path"
import {
  generate,
  HttpClient,
  Indent,
  PackageNames,
} from "@medusajs/openapi-typescript-codegen"
import { upperFirst } from "lodash"
import fs, { mkdir, readFile } from "fs/promises"
import { OpenAPIObject } from "openapi3-ts"
import { Command, Option, OptionValues } from "commander"

/**
 * CLI Command declaration
 */
export const commandName = "client"
export const commandDescription = "Generate API clients from OAS."
export const commandOptions: Option[] = [
  new Option(
    "-t, --type <type>",
    "Namespace for the generated client. Usually `admin` or `store`."
  ).makeOptionMandatory(),
  new Option(
    "-s, --src-file <srcFile>",
    "Path to source OAS JSON file."
  ).makeOptionMandatory(),
  new Option(
    "-o, --out-dir <outDir>",
    "Output directory for generated client files."
  ).default(path.resolve(process.cwd(), "client")),
  new Option(
    "-c, --component <component>",
    "Client component types to generate."
  )
    .choices(["all", "types", "client", "hooks"])
    .default("all"),
  new Option(
    "--types-package <name>",
    "Replace relative import statements by types package name."
  ),
  new Option(
    "--client-package <name>",
    "Replace relative import statements by client package name."
  ),
  new Option(
    "--clean",
    "Delete destination directory content before generating client."
  ),
]

export function getCommand() {
  const command = new Command(commandName)
  command.description(commandDescription)
  for (const opt of commandOptions) {
    command.addOption(opt)
  }
  command.action(async (options) => await execute(options))
  command.showHelpAfterError(true)
  return command
}

/**
 * Main
 */
export async function execute(cliParams: OptionValues) {
  /**
   * Process CLI options
   */
  if (
    ["client", "hooks"].includes(cliParams.component) &&
    !cliParams.typesPackage
  ) {
    throw new Error(
      `--types-package must be declared when using --component=${cliParams.component}`
    )
  }
  if (cliParams.component === "hooks" && !cliParams.clientPackage) {
    throw new Error(
      `--client-package must be declared when using --component=${cliParams.component}`
    )
  }

  const shouldClean = !!cliParams.clean
  const srcFile = path.resolve(cliParams.srcFile)
  const outDir = path.resolve(cliParams.outDir)
  const apiName = cliParams.type
  const packageNames: PackageNames = {
    models: cliParams.typesPackage,
    client: cliParams.clientPackage,
  }
  const exportComponent = cliParams.component

  /**
   * Command execution
   */
  console.log(`🟣 Generating client - ${apiName} - ${exportComponent}`)

  if (shouldClean) {
    console.log(`🟠 Cleaning output directory`)
    await fs.rm(outDir, { recursive: true, force: true })
  }
  await mkdir(outDir, { recursive: true })

  const oas = await getOASFromFile(srcFile)
  await generateClientSDK(oas, outDir, apiName, exportComponent, packageNames)

  console.log(
    `⚫️ Client generated - ${apiName} - ${exportComponent} - ${outDir}`
  )
}

/**
 * Methods
 */
const getOASFromFile = async (jsonFile: string): Promise<OpenAPIObject> => {
  const jsonString = await readFile(jsonFile, "utf8")
  return JSON.parse(jsonString)
}

const generateClientSDK = async (
  oas: OpenAPIObject,
  targetDir: string,
  apiName: string,
  exportComponent: "all" | "types" | "client" | "hooks",
  packageNames: PackageNames = {}
) => {
  const exports = {
    exportCore: false,
    exportServices: false,
    exportModels: false,
    exportHooks: false,
  }

  switch (exportComponent) {
    case "types":
      exports.exportModels = true
      break
    case "client":
      exports.exportCore = true
      exports.exportServices = true
      break
    case "hooks":
      exports.exportHooks = true
      break
    default:
      exports.exportCore = true
      exports.exportServices = true
      exports.exportModels = true
      exports.exportHooks = true
  }

  await generate({
    input: oas,
    output: targetDir,
    httpClient: HttpClient.AXIOS,
    useOptions: true,
    useUnionTypes: true,
    exportCore: exports.exportCore,
    exportServices: exports.exportServices,
    exportModels: exports.exportModels,
    exportHooks: exports.exportHooks,
    exportSchemas: false,
||||
indent: Indent.SPACE_2,
|
||||
postfixServices: "Service",
|
||||
postfixModels: "",
|
||||
clientName: `Medusa${upperFirst(apiName)}`,
|
||||
request: undefined,
|
||||
packageNames,
|
||||
})
|
||||
}
|
||||
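The `generateClientSDK` switch above maps the `--component` choice onto four boolean export flags for the generator. A dependency-free sketch of that mapping — `getExportFlags` is an illustrative name, not an export of the real package:

```typescript
// Sketch of the --component → export-flags mapping used by generateClientSDK.
// getExportFlags is a hypothetical helper name for illustration only.
type ExportComponent = "all" | "types" | "client" | "hooks"

interface ExportFlags {
  exportCore: boolean
  exportServices: boolean
  exportModels: boolean
  exportHooks: boolean
}

function getExportFlags(component: ExportComponent): ExportFlags {
  const flags: ExportFlags = {
    exportCore: false,
    exportServices: false,
    exportModels: false,
    exportHooks: false,
  }
  switch (component) {
    case "types":
      flags.exportModels = true
      break
    case "client":
      flags.exportCore = true
      flags.exportServices = true
      break
    case "hooks":
      flags.exportHooks = true
      break
    default:
      // "all" turns every flag on
      flags.exportCore = true
      flags.exportServices = true
      flags.exportModels = true
      flags.exportHooks = true
  }
  return flags
}
```

Note that `types` alone is enough for a types-only package, while `client` needs both the core runtime and the services, which is why the CLI also validates that `--types-package` is set for `client`/`hooks` builds.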
@@ -0,0 +1,285 @@
import {
  PreviewDocsOptions,
  previewDocs,
} from "@redocly/cli/lib/commands/preview-docs"
import { commandWrapper } from "@redocly/cli/lib/wrapper"
import { Command, Option, OptionValues } from "commander"
import execa from "execa"
import fs, { mkdir } from "fs/promises"
import { isArray, mergeWith } from "lodash"
import * as path from "path"
import {
  formatHintRecommendation,
  getCircularPatchRecommendation,
  getCircularReferences,
} from "./utils/circular-patch-utils"
import { getTmpDirectory, isFile } from "./utils/fs-utils"
import { readJson } from "./utils/json-utils"
import {
  jsonFileToYamlFile,
  readYaml,
  writeYaml,
  writeYamlFromJson,
} from "./utils/yaml-utils"
import yargs from "yargs"

/**
 * Constants
 */
const basePath = path.resolve(__dirname, "../")

const medusaPluginRelativePath = "./plugins/medusa/index.js"
const medusaPluginAbsolutePath = path.resolve(
  basePath,
  "redocly/plugins/medusa/index.js"
)
const configFileDefault = path.resolve(basePath, "redocly/redocly-config.yaml")

/**
 * CLI Command declaration
 */
export const commandName = "docs"
export const commandDescription =
  "Sanitize OAS for use with Redocly's API documentation viewer."

export const commandOptions: Option[] = [
  new Option(
    "-s, --src-file <srcFile>",
    "Path to source OAS JSON file."
  ).makeOptionMandatory(),
  new Option(
    "-o, --out-dir <outDir>",
    "Destination directory to output the sanitized OAS files."
  ).default(process.cwd()),
  new Option(
    "--config <config>",
    "Configuration file to merge with default configuration before passing to Redocly's CLI."
  ),
  new Option("-D, --dry-run", "Do not output files."),
  new Option(
    "--clean",
    "Delete destination directory content before generating documentation."
  ),
  new Option("--split", "Creates a multi-file structure output."),
  new Option(
    "--preview",
    "Open a preview of the documentation. Does not output files."
  ),
  new Option(
    "--html",
    "Generate a static HTML using Redocly's build-docs command."
  ),
  new Option(
    "--main-file-name <mainFileName>",
    "The name of the main YAML file."
  ).default("openapi.yaml"),
]

export function getCommand(): Command {
  const command = new Command(commandName)
  command.description(commandDescription)
  for (const opt of commandOptions) {
    command.addOption(opt)
  }
  command.action(async (options) => await execute(options))
  command.showHelpAfterError(true)
  return command
}

/**
 * Main
 */
export async function execute(cliParams: OptionValues): Promise<void> {
  /**
   * Process CLI options
   */
  const shouldClean = !!cliParams.clean
  const shouldSplit = !!cliParams.split
  const shouldPreview = !!cliParams.preview
  const shouldBuildHTML = !!cliParams.html
  const dryRun = !!cliParams.dryRun
  const srcFile = path.resolve(cliParams.srcFile)
  const outDir = path.resolve(cliParams.outDir)

  const configFileCustom = cliParams.config
    ? path.resolve(cliParams.config)
    : undefined
  if (configFileCustom) {
    if (!(await isFile(configFileCustom))) {
      throw new Error(`--config must be a file - ${configFileCustom}`)
    }
    if (![".json", ".yaml"].includes(path.extname(configFileCustom))) {
      throw new Error(
        `--config file must be of type .json or .yaml - ${configFileCustom}`
      )
    }
  }

  /**
   * Command execution
   */
  console.log(`🟣 Generating API documentation`)

  const tmpDir = await getTmpDirectory()
  const configTmpFile = path.resolve(tmpDir, "redocly-config.yaml")
  /** matches naming convention from `redocly split` */
  const finalOASFile = cliParams.mainFileName

  await createTmpConfig(configFileDefault, configTmpFile)
  if (configFileCustom) {
    console.log(
      `🔵 Merging configuration file - ${configFileCustom} > ${configTmpFile}`
    )
    await mergeConfig(configTmpFile, configFileCustom, configTmpFile)
  }

  if (!dryRun) {
    if (shouldClean) {
      console.log(`🟠 Cleaning output directory`)
      await fs.rm(outDir, { recursive: true, force: true })
    }
    await mkdir(outDir, { recursive: true })
  }

  const srcFileSanitized = path.resolve(tmpDir, "tmp.oas.yaml")
  await sanitizeOAS(srcFile, srcFileSanitized, configTmpFile)
  await circularReferenceCheck(srcFileSanitized)

  if (dryRun) {
    console.log(`⚫️ Dry run - no files generated`)
    return
  }
  if (shouldPreview) {
    await preview(srcFileSanitized, configTmpFile)
    return
  }
  if (shouldSplit) {
    await generateReference(srcFileSanitized, outDir)
  } else {
    await writeYaml(
      path.join(outDir, finalOASFile),
      await fs.readFile(srcFileSanitized, "utf8")
    )
  }
  if (shouldBuildHTML) {
    const outHTMLFile = path.resolve(outDir, "index.html")
    await buildHTML(finalOASFile, outHTMLFile, configTmpFile)
  }
  console.log(`⚫️ API documentation generated - ${outDir}`)
}

/**
 * Methods
 */
type RedoclyConfig = {
  apis?: Record<string, unknown>
  decorators?: Record<string, unknown>
  extends?: string[]
  organization?: string
  plugins?: string[]
  preprocessors?: Record<string, unknown>
  region?: string
  resolve?: Record<string, unknown>
  rules?: Record<string, unknown>
  theme?: Record<string, unknown>
}

const mergeConfig = async (
  configFileDefault: string,
  configFileCustom: string,
  configFileOut: string
): Promise<void> => {
  const configDefault = await readYaml(configFileDefault)
  const configCustom =
    path.extname(configFileCustom) === ".yaml"
      ? await readYaml(configFileCustom)
      : await readJson(configFileCustom)

  const config = mergeWith(configDefault, configCustom, (objValue, srcValue) =>
    isArray(objValue) ? objValue.concat(srcValue) : undefined
  ) as RedoclyConfig

  await writeYamlFromJson(configFileOut, config)
}

const createTmpConfig = async (
  configFileDefault: string,
  configFileOut: string
): Promise<void> => {
  const config = (await readYaml(configFileDefault)) as RedoclyConfig
  config.plugins = (config.plugins ?? []).filter(
    (plugin) => plugin !== medusaPluginRelativePath
  )
  config.plugins.push(medusaPluginAbsolutePath)

  await writeYamlFromJson(configFileOut, config)
}

const sanitizeOAS = async (
  srcFile: string,
  outFile: string,
  configFile: string
): Promise<void> => {
  const { all: logs } = await execa(
    "yarn",
    [
      "redocly",
      "bundle",
      srcFile,
      `--output=${outFile}`,
      `--config=${configFile}`,
    ],
    { cwd: basePath, all: true }
  )
  console.log(logs)
}

const circularReferenceCheck = async (srcFile: string): Promise<void> => {
  const { circularRefs, oas } = await getCircularReferences(srcFile)
  if (circularRefs.length) {
    console.log(circularRefs)
    let errorMessage = `🔴 Unhandled circular references - Please manually patch using --config ./redocly-config.yaml`
    const recommendation = getCircularPatchRecommendation(circularRefs, oas)
    if (Object.keys(recommendation).length) {
      const hint = formatHintRecommendation(recommendation)
      errorMessage += `
Within redocly-config.yaml, try adding the following:
###
${hint}
###
`
    }
    throw new Error(errorMessage)
  }
  console.log(`🟢 All circular references are handled`)
}

const generateReference = async (
  srcFile: string,
  outDir: string
): Promise<void> => {
  const { all: logs } = await execa(
    "yarn",
    ["redocly", "split", srcFile, `--outDir=${outDir}`],
    { cwd: basePath, all: true }
  )
  console.log(logs)
}

const preview = async (oasFile: string, configFile: string): Promise<void> => {
  await commandWrapper(previewDocs)({
    port: 8080,
    host: "127.0.0.1",
    api: oasFile,
    config: configFile,
  } as yargs.Arguments<PreviewDocsOptions>)
}

const buildHTML = async (
  srcFile: string,
  outFile: string,
  configFile: string
): Promise<void> => {
  const { all: logs } = await execa(
    "yarn",
    [
      "redocly",
      "build-docs",
      srcFile,
      `--output=${outFile}`,
      `--config=${configFile}`,
      `--cdn=true`,
    ],
    { cwd: basePath, all: true }
  )
  console.log(logs)
}
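`mergeConfig` relies on lodash's `mergeWith` with a customizer that concatenates arrays and otherwise falls back to the default deep merge. A dependency-free sketch of those merge semantics — `deepMergeConcat` is our name for the sketch, not part of the package:

```typescript
// Sketch of the merge semantics mergeConfig gets from lodash's mergeWith with
// an array-concatenating customizer: arrays concatenate, nested objects merge
// recursively, and scalar values from the custom config win.
// deepMergeConcat is a hypothetical helper name for illustration only.
type Dict = Record<string, unknown>

function deepMergeConcat(base: Dict, custom: Dict): Dict {
  const out: Dict = { ...base }
  for (const [key, value] of Object.entries(custom)) {
    const existing = out[key]
    if (Array.isArray(existing) && Array.isArray(value)) {
      // e.g. the default `plugins` list gains the custom config's entries
      out[key] = existing.concat(value)
    } else if (
      existing !== null &&
      value !== null &&
      typeof existing === "object" &&
      typeof value === "object" &&
      !Array.isArray(existing) &&
      !Array.isArray(value)
    ) {
      out[key] = deepMergeConcat(existing as Dict, value as Dict)
    } else {
      // scalars: the custom config replaces the default
      out[key] = value
    }
  }
  return out
}
```

This is why a `--config` file can add extra Redocly rules or plugins without having to repeat the defaults shipped in `redocly-config.yaml`.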
@@ -0,0 +1,243 @@
import OpenAPIParser from "@readme/openapi-parser"
import { Command, Option, OptionValues } from "commander"
import { lstat, mkdir, writeFile } from "fs/promises"
import { OpenAPIObject } from "openapi3-ts"
import * as path from "path"
import swaggerInline from "swagger-inline"
import { combineOAS } from "./utils/combine-oas"
import {
  mergeBaseIntoOAS,
  mergePathsAndSchemasIntoOAS,
} from "./utils/merge-oas"
import { isFile } from "./utils/fs-utils"

/**
 * Constants
 */
// Medusa core package directory
const medusaPackagePath = path.dirname(
  require.resolve("@medusajs/medusa/package.json")
)
// Types package directory
const medusaTypesPath = path.dirname(
  require.resolve("@medusajs/types/package.json")
)
// Utils package directory
const medusaUtilsPath = path.dirname(
  require.resolve("@medusajs/utils/package.json")
)
const basePath = path.resolve(__dirname, "../")

/**
 * CLI Command declaration
 */
export const commandName = "oas"
export const commandDescription =
  "Compile full OAS from swagger-inline compliant JSDoc."

export const commandOptions: Option[] = [
  new Option("-t, --type <type>", "API type to compile.")
    .choices(["admin", "store", "combined"])
    .makeOptionMandatory(),
  new Option(
    "-o, --out-dir <outDir>",
    "Destination directory to output generated OAS files."
  ).default(process.cwd()),
  new Option("-D, --dry-run", "Do not output files."),
  new Option(
    "-p, --paths <paths...>",
    "Additional paths to crawl for OAS JSDoc."
  ),
  new Option(
    "-b, --base <base>",
    "Custom base OAS file to use for swagger-inline."
  ),
  new Option("-F, --force", "Ignore OAS validation and output OAS files."),
  new Option(
    "--v2",
    "Generate OAS files for V2 endpoints. This loads OAS from the docs-util/oas-output/operations directory."
  ),
  new Option(
    "--local",
    "Generate OAS from local files rather than the public OAS. This is useful for generating references in the Medusa monorepo."
  ),
]

export function getCommand() {
  const command = new Command(commandName)
  command.description(commandDescription)
  for (const opt of commandOptions) {
    command.addOption(opt)
  }
  command.action(async (options) => await execute(options))
  command.showHelpAfterError(true)
  return command
}

/**
 * Main
 */
export async function execute(cliParams: OptionValues) {
  /**
   * Process CLI options
   */
  const dryRun = !!cliParams.dryRun
  const force = !!cliParams.force
  const v2 = !!cliParams.v2
  const local = !!cliParams.local

  const apiType: ApiType = cliParams.type

  const outDir = path.resolve(cliParams.outDir)

  const additionalPaths = (cliParams.paths ?? []).map((additionalPath) =>
    path.resolve(additionalPath)
  )
  for (const additionalPath of additionalPaths) {
    if (!(await isDirectory(additionalPath))) {
      throw new Error(`--paths must be a directory - ${additionalPath}`)
    }
  }

  const baseFile = cliParams.base ? path.resolve(cliParams.base) : undefined
  if (baseFile) {
    if (!(await isFile(baseFile))) {
      throw new Error(`--base must be a file - ${baseFile}`)
    }
  }

  /**
   * Command execution
   */
  if (!dryRun) {
    await mkdir(outDir, { recursive: true })
  }

  let oas: OpenAPIObject
  console.log(`🟣 Generating OAS - ${apiType}`)

  if (apiType === "combined") {
    const adminOAS = !local
      ? await getPublicOas("admin", v2)
      : await getOASFromCodebase("admin", v2)
    const storeOAS = !local
      ? await getPublicOas("store", v2)
      : await getOASFromCodebase("store", v2)
    oas = await combineOAS(adminOAS, storeOAS)
  } else {
    oas = !local
      ? await getPublicOas(apiType, v2)
      : await getOASFromCodebase(apiType, v2)
  }

  if (additionalPaths.length || baseFile) {
    const customOAS = await getOASFromPaths(additionalPaths, baseFile)
    if (baseFile) {
      mergeBaseIntoOAS(oas, customOAS)
    }
    if (additionalPaths.length) {
      mergePathsAndSchemasIntoOAS(oas, customOAS)
    }
  }

  await validateOAS(oas, apiType, force)
  if (dryRun) {
    console.log(`⚫️ Dry run - no files generated`)
    return
  }
  await exportOASToJSON(oas, apiType, outDir)
}

/**
 * Methods
 */
async function getOASFromCodebase(
  apiType: ApiType,
  v2?: boolean
): Promise<OpenAPIObject> {
  /**
   * OAS output directory
   *
   * @privateRemark
   * This should be the only directory OAS is loaded from for Medusa V2.
   * For now, we only use it if the --v2 flag is passed to the CLI tool.
   */
  const oasOutputPath = path.resolve(
    __dirname, "..", "..", "..", "..", "docs-util", "oas-output"
  )
  const gen = await swaggerInline(
    v2
      ? [
          path.resolve(oasOutputPath, "operations", apiType),
          path.resolve(oasOutputPath, "schemas"),
          // We currently load error schemas from here. If we change
          // that in the future, we should change the path.
          path.resolve(medusaPackagePath, "dist", "api/middlewares"),
        ]
      : [
          path.resolve(medusaTypesPath, "dist"),
          path.resolve(medusaUtilsPath, "dist"),
          path.resolve(medusaPackagePath, "dist", "models"),
          path.resolve(medusaPackagePath, "dist", "types"),
          path.resolve(medusaPackagePath, "dist", "api/middlewares"),
          path.resolve(medusaPackagePath, "dist", `api/routes/${apiType}`),
        ],
    {
      base: path.resolve(
        oasOutputPath,
        v2 ? "base-v2" : "base",
        `${apiType}.oas.base.yaml`
      ),
      format: ".json",
    }
  )
  return (await OpenAPIParser.parse(JSON.parse(gen))) as OpenAPIObject
}

async function getPublicOas(apiType: ApiType, v2?: boolean) {
  const url = `https://docs.medusajs.com/api/download/${apiType}?version=${
    v2 ? "2" : "1"
  }`
  return (await OpenAPIParser.parse(url)) as OpenAPIObject
}

async function getOASFromPaths(
  additionalPaths: string[] = [],
  customBaseFile?: string
): Promise<OpenAPIObject> {
  console.log(`🔵 Gathering custom OAS`)
  const gen = await swaggerInline(additionalPaths, {
    base:
      customBaseFile ?? path.resolve(basePath, "oas", "default.oas.base.yaml"),
    format: ".json",
    logger: (log) => {
      console.log(log)
    },
  })
  return (await OpenAPIParser.parse(JSON.parse(gen))) as OpenAPIObject
}

async function validateOAS(
  oas: OpenAPIObject,
  apiType: ApiType,
  force = false
): Promise<void> {
  try {
    await OpenAPIParser.validate(JSON.parse(JSON.stringify(oas)))
    console.log(`🟢 Valid OAS - ${apiType}`)
  } catch (err) {
    console.error(`🔴 Invalid OAS - ${apiType}`, err)
    if (!force) {
      process.exit(1)
    }
  }
}

async function exportOASToJSON(
  oas: OpenAPIObject,
  apiType: ApiType,
  targetDir: string
): Promise<void> {
  const json = JSON.stringify(oas, null, 2)
  const filePath = path.resolve(targetDir, `${apiType}.oas.json`)
  await writeFile(filePath, json)
  console.log(`⚫️ Exported OAS - ${apiType} - ${filePath}`)
}

async function isDirectory(dirPath: string): Promise<boolean> {
  try {
    return (await lstat(path.resolve(dirPath))).isDirectory()
  } catch (err) {
    console.log(err)
    return false
  }
}
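`getPublicOas` above derives its download URL from the API type and the `--v2` flag. Pulled out as a pure helper it is easy to check — `publicOasUrl` is an illustrative name, not an export of the CLI:

```typescript
// Sketch of the URL construction performed by getPublicOas.
// publicOasUrl is a hypothetical helper name for illustration only.
type ApiType = "store" | "admin" | "combined"

function publicOasUrl(apiType: ApiType, v2?: boolean): string {
  return `https://docs.medusajs.com/api/download/${apiType}?version=${
    v2 ? "2" : "1"
  }`
}
```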
@@ -0,0 +1,46 @@
#! /usr/bin/env node

import { Command } from "commander"
import { getCommand as oasGetCommand } from "./command-oas"
import { getCommand as clientGetCommand } from "./command-client"
import { getCommand as docsGetCommand } from "./command-docs"

const run = async () => {
  const program = getBaseCommand()

  /**
   * Alias to command-oas.ts
   */
  program.addCommand(oasGetCommand())

  /**
   * Alias to command-client.ts
   */
  program.addCommand(clientGetCommand())

  /**
   * Alias to command-docs.ts
   */
  program.addCommand(docsGetCommand())

  /**
   * Run CLI
   */
  await program.parseAsync()
}

export function getBaseCommand() {
  const command = new Command()
  command.name("medusa-oas")
  command.action(async () => {
    console.log("No command provided.")
    command.outputHelp({ error: true })
  })
  command.showHelpAfterError(true)
  command.helpOption(false)
  return command
}

void (async () => {
  await run()
})()
@@ -0,0 +1 @@
type ApiType = "store" | "admin" | "combined"
@@ -0,0 +1,85 @@
import { OpenAPIObject, SchemaObject } from "openapi3-ts"
import OpenAPIParser from "@readme/openapi-parser"
import { $Refs } from "@readme/json-schema-ref-parser"
import { jsonObjectToYamlString } from "./yaml-utils"

export const getCircularReferences = async (
  srcFile: string
): Promise<{ circularRefs: string[]; oas: OpenAPIObject }> => {
  const parser = new OpenAPIParser()
  const oas = (await parser.validate(srcFile, {
    dereference: {
      circular: "ignore",
    },
  })) as OpenAPIObject
  if (parser.$refs.circular) {
    const $refs = parser.$refs as $Refs
    let circularRefs = $refs.circularRefs.map(
      (ref) => ref.match(/#\/components\/schemas\/.*/)![0]
    )
    circularRefs = [...new Set(circularRefs)]
    circularRefs.sort()
    return { circularRefs, oas }
  }
  return { circularRefs: [], oas }
}

export const getCircularPatchRecommendation = (
  circularRefs: string[],
  oas: OpenAPIObject
): Record<string, string[]> => {
  type CircularReferenceMatch = {
    schema: string
    property: string
    isArray: boolean
    referencedSchema: string
  }
  const matches: CircularReferenceMatch[] = circularRefs
    .map((ref) => {
      const match =
        ref.match(
          /(?:.*)(?:#\/components\/schemas\/)(.*)(?:\/properties\/?)(.*)/
        ) ?? []
      // Fall back to empty strings so refs without a /properties/ segment are
      // dropped by the filter below instead of throwing.
      const schema = match[1] ?? ""
      let property = match[2] ?? ""
      let isArray = false
      if (property.endsWith("/items")) {
        property = property.replace("/items", "")
        isArray = true
      }
      return { schema, property, isArray }
    })
    .filter((match) => match.property !== "")
    .map((match) => {
      const baseSchema = oas.components!.schemas![match.schema] as SchemaObject
      const propertySpec = match.isArray
        ? (baseSchema.properties![match.property] as SchemaObject).items!
        : baseSchema.properties![match.property]
      const referencedSchema = propertySpec["$ref"].match(
        /(?:#\/components\/schemas\/)(.*)/
      )![1]
      return {
        schema: match.schema,
        property: match.property,
        isArray: match.isArray,
        referencedSchema,
      }
    })

  const schemas: Record<string, string[]> = {}
  for (const match of matches) {
    if (!schemas.hasOwnProperty(match.schema)) {
      schemas[match.schema] = []
    }
    schemas[match.schema].push(match.referencedSchema)
  }
  return schemas
}

export const formatHintRecommendation = (
  recommendation: Record<string, string[]>
) => {
  return jsonObjectToYamlString({
    decorators: { "medusa/circular-patch": { schemas: recommendation } },
  })
}
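The first `.map` in `getCircularPatchRecommendation` decomposes one circular `$ref` path into a schema name, a property name, and an array flag. A pure sketch of that decomposition, using the same regex — `parseCircularRef` is our name for the sketch:

```typescript
// Sketch of how getCircularPatchRecommendation decomposes one circular $ref
// path. parseCircularRef is a hypothetical helper name for illustration only.
function parseCircularRef(ref: string): {
  schema: string
  property: string
  isArray: boolean
} | null {
  const match = ref.match(
    /(?:.*)(?:#\/components\/schemas\/)(.*)(?:\/properties\/?)(.*)/
  )
  if (!match) {
    return null
  }
  const schema = match[1]
  let property = match[2]
  let isArray = false
  // A trailing "/items" means the cycle goes through an array item schema
  if (property.endsWith("/items")) {
    property = property.replace("/items", "")
    isArray = true
  }
  return { schema, property, isArray }
}
```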
@@ -0,0 +1,118 @@
import { OpenAPIObject } from "openapi3-ts"
import { upperFirst } from "lodash"

export async function combineOAS(
  adminOAS: OpenAPIObject,
  storeOAS: OpenAPIObject
): Promise<OpenAPIObject> {
  prepareOASForCombine(adminOAS, "admin")
  prepareOASForCombine(storeOAS, "store")

  const combinedOAS: OpenAPIObject = {
    openapi: "3.0.0",
    info: { title: "Medusa API", version: "1.0.0" },
    servers: [],
    paths: {},
    tags: [],
    components: {
      callbacks: {},
      examples: {},
      headers: {},
      links: {},
      parameters: {},
      requestBodies: {},
      responses: {},
      schemas: {},
      securitySchemes: {},
    },
  }

  for (const oas of [adminOAS, storeOAS]) {
    /**
     * Combine paths
     */
    Object.assign(combinedOAS.paths, oas.paths)
    /**
     * Combine tags
     */
    if (oas.tags) {
      combinedOAS.tags = [...combinedOAS.tags!, ...oas.tags]
    }
    /**
     * Combine components
     */
    if (oas.components) {
      for (const componentGroup of [
        "callbacks",
        "examples",
        "headers",
        "links",
        "parameters",
        "requestBodies",
        "responses",
        "schemas",
        "securitySchemes",
      ]) {
        if (Object.keys(oas.components).includes(componentGroup)) {
          Object.assign(
            combinedOAS.components![componentGroup],
            oas.components[componentGroup]
          )
        }
      }
    }
  }

  return combinedOAS
}

function prepareOASForCombine(
  oas: OpenAPIObject,
  apiType: ApiType
): OpenAPIObject {
  console.log(
    `🔵 Prefixing ${apiType} tags and operationId with ${upperFirst(apiType)}`
  )
  for (const pathKey in oas.paths) {
    for (const operationKey in oas.paths[pathKey]) {
      /**
       * Prefix tags declared on routes
       * e.g.: Admin Customer, Store Customer
       */
      if (oas.paths[pathKey][operationKey].tags) {
        oas.paths[pathKey][operationKey].tags = oas.paths[pathKey][
          operationKey
        ].tags.map((tag) => getPrefixedTagName(tag, apiType))
      }
      /**
       * Prefix operationId
       * e.g.: AdminGetCustomers, StoreGetCustomers
       */
      if (oas.paths[pathKey][operationKey].operationId) {
        oas.paths[pathKey][operationKey].operationId = getPrefixedOperationId(
          oas.paths[pathKey][operationKey].operationId,
          apiType
        )
      }
    }
  }

  /**
   * Prefix tags globally
   * e.g.: Admin Customer, Store Customer
   */
  if (oas.tags) {
    for (const tag of oas.tags) {
      tag.name = getPrefixedTagName(tag.name, apiType)
    }
  }
  return oas
}

function getPrefixedTagName(tagName: string, apiType: ApiType): string {
  return `${upperFirst(apiType)} ${tagName}`
}

function getPrefixedOperationId(operationId: string, apiType: ApiType): string {
  return `${upperFirst(apiType)}${operationId}`
}
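Before the admin and store specs are merged, `prepareOASForCombine` namespaces every tag and `operationId` so the two APIs cannot collide. The prefixing itself is just the two one-liners above; restated without the lodash dependency (`upperFirst` is re-implemented inline for the sketch):

```typescript
// Sketch of the tag/operationId prefixing applied before combining the admin
// and store OAS. upperFirst is re-implemented inline so the sketch has no
// lodash dependency.
type ApiType = "store" | "admin" | "combined"

const upperFirst = (value: string): string =>
  value.charAt(0).toUpperCase() + value.slice(1)

const getPrefixedTagName = (tagName: string, apiType: ApiType): string =>
  `${upperFirst(apiType)} ${tagName}`

const getPrefixedOperationId = (operationId: string, apiType: ApiType): string =>
  `${upperFirst(apiType)}${operationId}`
```

So a `Customer` tag becomes `Admin Customer` or `Store Customer`, and `GetCustomers` becomes `AdminGetCustomers` or `StoreGetCustomers`, matching the examples in the comments above.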
@@ -0,0 +1,30 @@
import { access, lstat, mkdtemp } from "fs/promises"
import path, { sep } from "path"
import { tmpdir } from "os"

export async function isFile(filePath: string): Promise<boolean> {
  try {
    return (await lstat(path.resolve(filePath))).isFile()
  } catch (err) {
    console.log(err)
    return false
  }
}

export async function exists(filePath: string): Promise<boolean> {
  try {
    await access(path.resolve(filePath))
    return true
  } catch (err) {
    return false
  }
}

export const getTmpDirectory = async () => {
  /**
   * RUNNER_TEMP: GitHub Actions' path to a temporary directory on the runner.
   */
  const tmpDir = process.env["RUNNER_TEMP"] ?? tmpdir()
  return await mkdtemp(`${tmpDir}${sep}`)
}
@@ -0,0 +1,14 @@
import fs from "fs/promises"

export const readJson = async (filePath: string): Promise<unknown> => {
  const jsonString = await fs.readFile(filePath, "utf8")
  return JSON.parse(jsonString)
}

export const writeJson = async (
  filePath: string,
  jsonObject: unknown
): Promise<void> => {
  const jsonString = JSON.stringify(jsonObject)
  await fs.writeFile(filePath, jsonString, "utf8")
}
@@ -0,0 +1,93 @@
import { OpenAPIObject, TagObject } from "openapi3-ts"

export function mergeBaseIntoOAS(
  targetOAS: OpenAPIObject,
  sourceOAS: OpenAPIObject
): void {
  /**
   * replace strategy for OpenAPIObject properties
   */
  targetOAS.openapi = sourceOAS.openapi ?? targetOAS.openapi
  targetOAS.info = sourceOAS.info ?? targetOAS.info
  targetOAS.servers = sourceOAS.servers ?? targetOAS.servers
  targetOAS.security = sourceOAS.security ?? targetOAS.security
  targetOAS.externalDocs = sourceOAS.externalDocs ?? targetOAS.externalDocs
  targetOAS.webhooks = sourceOAS.webhooks ?? targetOAS.webhooks
  /**
   * merge + concat strategy for tags
   */
  const targetTags = targetOAS.tags ?? []
  const sourceTags = sourceOAS.tags ?? []
  const combinedTags: TagObject[] = []
  const sourceIndexes: number[] = []
  for (const targetTag of targetTags) {
    // Prefer the same-named source tag when one exists; otherwise keep the
    // target tag. Matched source tags are tracked so they are not re-appended.
    const sourceTagIndex = sourceTags.findIndex(
      (sourceTag) => sourceTag.name === targetTag.name
    )
    if (sourceTagIndex >= 0) {
      combinedTags.push(sourceTags[sourceTagIndex])
      sourceIndexes.push(sourceTagIndex)
    } else {
      combinedTags.push(targetTag)
    }
  }
  for (const [sourceTagIndex, sourceTag] of sourceTags.entries()) {
    if (!sourceIndexes.includes(sourceTagIndex)) {
      combinedTags.push(sourceTag)
    }
  }
  targetOAS.tags = combinedTags
  /**
   * merge strategy for paths
   */
  targetOAS.paths = Object.assign(targetOAS.paths ?? {}, sourceOAS.paths ?? {})
  /**
   * merge strategy for components
   */
  if (!sourceOAS.components) {
    return
  }
  if (!targetOAS.components) {
    targetOAS.components = {}
  }
  for (const componentGroup of [
    "callbacks",
    "examples",
    "headers",
    "links",
    "parameters",
    "requestBodies",
    "responses",
    "schemas",
    "securitySchemes",
  ]) {
    if (Object.keys(sourceOAS.components).includes(componentGroup)) {
      targetOAS.components[componentGroup] = Object.assign(
        targetOAS.components[componentGroup] ?? {},
        sourceOAS.components[componentGroup]
      )
    }
  }
}

export function mergePathsAndSchemasIntoOAS(
  targetOAS: OpenAPIObject,
  sourceOAS: OpenAPIObject
): void {
  /**
   * merge paths
   */
  Object.assign(targetOAS.paths, sourceOAS.paths)

  /**
   * merge components.schemas
   */
  if (sourceOAS.components?.schemas) {
    if (!targetOAS.components) {
      targetOAS.components = {}
    }
    if (!targetOAS.components.schemas) {
      targetOAS.components.schemas = {}
    }
    Object.assign(targetOAS.components.schemas, sourceOAS.components.schemas)
  }
}
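The tag-merging step in `mergeBaseIntoOAS` keeps the target's tag order, substitutes a same-named source tag where one exists, and appends the remaining source tags at the end. A pure sketch of that strategy — `mergeTags` is our name for the sketch, not an export of the package:

```typescript
// Pure sketch of the merge + concat strategy for tags in mergeBaseIntoOAS:
// keep the target order, prefer the source tag when names collide, then
// append source tags that matched nothing. mergeTags is a hypothetical name.
interface Tag {
  name: string
  description?: string
}

function mergeTags(targetTags: Tag[], sourceTags: Tag[]): Tag[] {
  const usedSourceIndexes: number[] = []
  const combined = targetTags.map((targetTag) => {
    const idx = sourceTags.findIndex((s) => s.name === targetTag.name)
    if (idx >= 0) {
      usedSourceIndexes.push(idx)
      return sourceTags[idx]
    }
    return targetTag
  })
  sourceTags.forEach((sourceTag, idx) => {
    if (!usedSourceIndexes.includes(idx)) {
      combined.push(sourceTag)
    }
  })
  return combined
}
```

This lets a custom base file override the description of an existing tag while still contributing brand-new tags of its own.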
@@ -0,0 +1,27 @@
import fs from "fs/promises"
import * as yaml from "js-yaml"

export const readYaml = async (filePath: string): Promise<unknown> => {
  const yamlString = await fs.readFile(filePath, "utf8")
  return yaml.load(yamlString)
}

export const writeYaml = async (
  filePath: string,
  yamlContent: string
): Promise<void> => {
  await fs.writeFile(filePath, yamlContent, "utf8")
}

export const writeYamlFromJson = async (
  filePath: string,
  jsonObject: unknown
): Promise<void> => {
  const yamlString = yaml.dump(jsonObject)
  await fs.writeFile(filePath, yamlString, "utf8")
}

export const jsonObjectToYamlString = (jsonObject: unknown): string => {
  return yaml.dump(jsonObject)
}

export const jsonFileToYamlFile = async (
  inputJsonFile: string,
  outputYamlFile: string
): Promise<void> => {
  const jsonString = await fs.readFile(inputJsonFile, "utf8")
  const jsonObject = JSON.parse(jsonString)
  const yamlString = yaml.dump(jsonObject)
  await fs.writeFile(outputYamlFile, yamlString, "utf8")
}
@@ -0,0 +1,36 @@
{
  "compilerOptions": {
    "lib": [
      "es2019"
    ],
    "target": "es2019",
    "outDir": "./dist",
    "esModuleInterop": true,
    "declaration": true,
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "sourceMap": true,
    "noImplicitReturns": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "noImplicitThis": true,
    "allowJs": true,
    "skipLibCheck": true,
    "downlevelIteration": true // to use ES5 specific tooling
  },
  "include": [
    "src"
  ],
  "exclude": [
    "dist",
    "src/**/__tests__",
    "src/**/__mocks__",
    "src/**/__fixtures__",
    "node_modules"
  ],
  "ts-node": {
    "transpileOnly": true
  }
}
@@ -0,0 +1,5 @@
{
  "extends": "./tsconfig.json",
  "include": ["src"],
  "exclude": ["node_modules"]
}
@@ -0,0 +1,6 @@
/dist
node_modules
.DS_store
.env*
.env
*.sql
@@ -0,0 +1,279 @@
# @medusajs/oas-github-ci

## 1.0.27

### Patch Changes

- [#6812](https://github.com/medusajs/medusa/pull/6812) [`e005987adf`](https://github.com/medusajs/medusa/commit/e005987adf2a2dd8c2748e7abc360cc93e7c05ad) Thanks [@shahednasser](https://github.com/shahednasser)! - fix(medusa-oas-cli): fix tool not working in Medusa backends

- Updated dependencies [[`e005987adf`](https://github.com/medusajs/medusa/commit/e005987adf2a2dd8c2748e7abc360cc93e7c05ad)]:
  - @medusajs/medusa-oas-cli@0.3.2

## 1.0.26

### Patch Changes

- [#6338](https://github.com/medusajs/medusa/pull/6338) [`374a3f4dab`](https://github.com/medusajs/medusa/commit/374a3f4dab7bf893ba3c579fa7f9c346fa30ad64) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(@medusajs/medusa-oas-cli): added v2 flag
  feat(@medusajs/oas-github-ci): added v2 flag

- [#6710](https://github.com/medusajs/medusa/pull/6710) [`bb87db8342`](https://github.com/medusajs/medusa/commit/bb87db83424e2625e273fd5cf06cece37245b6da) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(oas-github-ci): output specs into `specs-v2` directory when the `--v2` option is passed

- Updated dependencies [[`374a3f4dab`](https://github.com/medusajs/medusa/commit/374a3f4dab7bf893ba3c579fa7f9c346fa30ad64)]:
  - @medusajs/medusa-oas-cli@0.3.1

## 1.0.25

### Patch Changes

- Updated dependencies [[`e84847be3`](https://github.com/medusajs/medusa/commit/e84847be36c88c5bd8e49e1e8aca734ce941a7c2)]:
  - @medusajs/medusa-oas-cli@0.3.0

## 1.0.24

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.24

## 1.0.23

### Patch Changes

- [#5453](https://github.com/medusajs/medusa/pull/5453) [`c1b97050a`](https://github.com/medusajs/medusa/commit/c1b97050ab801c6b5702311f777ebd7f98080a97) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(medusa-oas-cli): Added `--main-file-name` option to specify output file name
  feat(oas-github-ci): Added `--with-full-file` option to output both split and full files

- Updated dependencies [[`c1b97050a`](https://github.com/medusajs/medusa/commit/c1b97050ab801c6b5702311f777ebd7f98080a97)]:
  - @medusajs/medusa-oas-cli@0.2.23

## 1.0.22

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.22

## 1.0.21

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.21

## 1.0.20

### Patch Changes

- [#5174](https://github.com/medusajs/medusa/pull/5174) [`fa7c94b4c`](https://github.com/medusajs/medusa/commit/fa7c94b4ccf864d7a6c7ed4bba26b7086ed99b2f) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(@medusajs/oas-github-ci): changed output path to match new docs workspace

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.20

## 1.0.19

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.19

## 1.0.18

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.18

## 1.0.17

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.17

## 1.0.16

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.16

## 1.0.15

### Patch Changes

- [#4770](https://github.com/medusajs/medusa/pull/4770) [`914d773d3`](https://github.com/medusajs/medusa/commit/914d773d3a82c86685b66c0c33ab1d7a9bad7dd4) Thanks [@shahednasser](https://github.com/shahednasser)! - feat(@medusajs/oas-github-ci): Improvements for custom API reference

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.15

## 1.0.14

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.14

## 1.0.13

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.13

## 1.0.12

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.12

## 1.0.11

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.11

## 1.0.10

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.10

## 1.0.9

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.9

## 1.0.8

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.8

## 1.0.7

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.7

## 1.0.6

### Patch Changes

- Updated dependencies [[`cfcd2d54f`](https://github.com/medusajs/medusa/commit/cfcd2d54fd281fd98de881fc6dfbcc6b1b47c855)]:
  - @medusajs/medusa-oas-cli@0.2.6

## 1.0.5

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.5

## 1.0.4

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.4

## 1.0.3

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.3

## 1.0.2

### Patch Changes

- Updated dependencies [[`1456056e8`](https://github.com/medusajs/medusa/commit/1456056e8f9d48cc1cb64c1836a6de5d5b263b05)]:
  - @medusajs/medusa-oas-cli@0.2.2

## 1.0.2

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.1

## 1.0.1

### Patch Changes

- [#3410](https://github.com/medusajs/medusa/pull/3410) [`8ed67d2d7`](https://github.com/medusajs/medusa/commit/8ed67d2d7d36da2d0cb3541e42bba1770aefc60a) Thanks [@olivermrbl](https://github.com/olivermrbl)! - fix(admin,oas-github-cli): Change public config to private to avoid publishing attempts + include `setup/` in npm release for `admin-ui`

- Updated dependencies [[`813706190`](https://github.com/medusajs/medusa/commit/8137061908d20314314b95d049b966a624e7123d), [`377b9ce8c`](https://github.com/medusajs/medusa/commit/377b9ce8c27cb2c4d10d154c005c7553c173c0e8), [`1c40346e9`](https://github.com/medusajs/medusa/commit/1c40346e9e56718de4a6e2d5c2d52abd388343e3), [`53c4a43ca`](https://github.com/medusajs/medusa/commit/53c4a43ca2b2c04920dc6f38e07348e7dfc23195), [`966aea65c`](https://github.com/medusajs/medusa/commit/966aea65c221403bf316ae7665cc8f73bccd9c38)]:
  - @medusajs/medusa-oas-cli@0.2.0

## 1.0.1-rc.8

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.8

## 1.0.1-rc.7

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.7

## 1.0.1-rc.6

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.6

## 1.0.1-rc.5

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.5

## 1.0.1-rc.4

### Patch Changes

- Updated dependencies [[`377b9ce8c`](https://github.com/medusajs/medusa/commit/377b9ce8c27cb2c4d10d154c005c7553c173c0e8)]:
  - @medusajs/medusa-oas-cli@0.2.0-rc.4

## 1.0.1-rc.3

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.3

## 1.0.1-rc.2

### Patch Changes

- Updated dependencies []:
  - @medusajs/medusa-oas-cli@0.2.0-rc.2

## 1.0.1-rc.1

### Patch Changes

- Updated dependencies [[`53c4a43ca`](https://github.com/medusajs/medusa/commit/53c4a43ca2b2c04920dc6f38e07348e7dfc23195)]:
  - @medusajs/medusa-oas-cli@0.2.0-rc.1

## 1.0.1-rc.0

### Patch Changes

- [#3410](https://github.com/medusajs/medusa/pull/3410) [`8ed67d2d7`](https://github.com/medusajs/medusa/commit/8ed67d2d7d36da2d0cb3541e42bba1770aefc60a) Thanks [@olivermrbl](https://github.com/olivermrbl)! - fix(admin,oas-github-cli): Change public config to private to avoid publishing attempts + include `setup/` in npm release for `admin-ui`

- Updated dependencies [[`813706190`](https://github.com/medusajs/medusa/commit/8137061908d20314314b95d049b966a624e7123d), [`1c40346e9`](https://github.com/medusajs/medusa/commit/1c40346e9e56718de4a6e2d5c2d52abd388343e3), [`966aea65c`](https://github.com/medusajs/medusa/commit/966aea65c221403bf316ae7665cc8f73bccd9c38)]:
  - @medusajs/medusa-oas-cli@0.2.0-rc.0

## 1.0.0
@@ -0,0 +1,3 @@
# oas-github-ci - 1.0.0

Script used in CI for validating OAS documentation.
@@ -0,0 +1,27 @@
{
  "name": "@medusajs/oas-github-ci",
  "version": "1.0.27",
  "description": "OAS Github CI",
  "main": "scripts/build-openapi.js",
  "files": [
    "scripts"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/medusajs/medusa",
    "directory": "packages/oas/oas-github-ci"
  },
  "private": true,
  "author": "Medusa",
  "license": "MIT",
  "scripts": {
    "ci": "node scripts/build-openapi.js",
    "preview:admin": "yarn medusa-oas docs --src-file ../../../docs/api/admin/openapi.yaml --preview",
    "preview:store": "yarn medusa-oas docs --src-file ../../../docs/api/store/openapi.yaml --preview",
    "test": "jest --passWithNoTests"
  },
  "dependencies": {
    "@medusajs/medusa-oas-cli": "0.3.2",
    "execa": "^5.1.1"
  }
}
@@ -0,0 +1,87 @@
#!/usr/bin/env node

const fs = require("fs/promises")
const os = require("os")
const path = require("path")
const execa = require("execa")

const isDryRun = process.argv.indexOf("--dry-run") !== -1
const withFullFile = process.argv.indexOf("--with-full-file") !== -1
const v2 = process.argv.indexOf("--v2") !== -1
const basePath = path.resolve(__dirname, `../`)
const repoRootPath = path.resolve(basePath, `../../../`)
const docsApiPath = v2
  ? path.resolve(repoRootPath, "www/apps/api-reference/specs-v2")
  : path.resolve(repoRootPath, "www/apps/api-reference/specs")

const run = async () => {
  const oasOutDir = isDryRun ? await getTmpDirectory() : docsApiPath
  for (const apiType of ["store", "admin"]) {
    await generateOASSource(oasOutDir, apiType)
    const oasSrcFile = path.resolve(oasOutDir, `${apiType}.oas.json`)
    const docsOutDir = path.resolve(oasOutDir, apiType)
    await generateDocs(oasSrcFile, docsOutDir, isDryRun)
  }
}

const generateOASSource = async (outDir, apiType) => {
  const commandParams = [
    "oas",
    `--type=${apiType}`,
    `--out-dir=${outDir}`,
    "--local",
  ]
  if (v2) {
    commandParams.push(`--v2`)
  }
  const { all: logs } = await execa("medusa-oas", commandParams, {
    cwd: basePath,
    all: true,
  })
  console.log(logs)
}

const generateDocs = async (srcFile, outDir, isDryRun) => {
  let params = [
    "docs",
    `--src-file=${srcFile}`,
    `--out-dir=${outDir}`,
    `--clean`,
    `--split`,
  ]
  if (isDryRun) {
    params.push("--dry-run")
  }
  await runMedusaOasCommand(params)
  if (withFullFile && !isDryRun) {
    console.log("Generating full file...")
    params = [
      "docs",
      `--src-file=${srcFile}`,
      `--out-dir=${outDir}`,
      `--main-file-name=openapi.full.yaml`,
    ]
    await runMedusaOasCommand(params)
    console.log("Finished generating full file.")
  }
}

const runMedusaOasCommand = async (params) => {
  const { all: logs } = await execa("medusa-oas", params, {
    cwd: basePath,
    all: true,
  })
  console.log(logs)
}

const getTmpDirectory = async () => {
  /**
   * RUNNER_TEMP: GitHub action, the path to a temporary directory on the runner.
   */
  const tmpDir = process.env["RUNNER_TEMP"] ?? os.tmpdir()
  // mkdtemp appends random characters to the given prefix, so append a path
  // separator to create the directory inside tmpDir rather than beside it.
  return await fs.mkdtemp(path.join(tmpDir, path.sep))
}

void (async () => {
  try {
    await run()
  } catch (err) {
    console.log(err)
    process.exit(1)
  }
})()
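The script detects presence flags by scanning `process.argv` with `indexOf`. A small sketch of that pattern, factored into a function for illustration (`parseFlags` and the sample argv array are hypothetical, not part of the script):

```javascript
// Presence-flag detection in the style of build-openapi.js:
// a flag is "on" if its exact token appears anywhere in argv.
const parseFlags = (argv) => ({
  isDryRun: argv.indexOf("--dry-run") !== -1,
  withFullFile: argv.indexOf("--with-full-file") !== -1,
  v2: argv.indexOf("--v2") !== -1,
})

// Simulated invocation: node build-openapi.js --dry-run --v2
const flags = parseFlags(["node", "build-openapi.js", "--dry-run", "--v2"])
console.log(flags)
```

Note that this matches exact tokens only; `--dry-run=true` or abbreviated flags would not be recognized, which is fine for a CI-invoked script with a fixed set of options.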
@@ -0,0 +1,7 @@
/dist
node_modules
.DS_store
.env*
.env
*.sql
/bin
@@ -0,0 +1 @@
*.hbs
@@ -0,0 +1,33 @@
# @medusajs/openapi-typescript-codegen

## 0.2.1

### Patch Changes

- [#3675](https://github.com/medusajs/medusa/pull/3675) [`0b3c6fde3`](https://github.com/medusajs/medusa/commit/0b3c6fde30c9bac052a8e1aa641ef925b6937c0e) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen:test): coverage x-expanded-relation + x-codegen.queryParams

## 0.2.0

### Minor Changes

- [#3477](https://github.com/medusajs/medusa/pull/3477) [`826d4bedf`](https://github.com/medusajs/medusa/commit/826d4bedfe1b6459163711d5173eb8eadfdea26e) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen,types): SetRelation on expanded types

- [#3442](https://github.com/medusajs/medusa/pull/3442) [`7b57695e0`](https://github.com/medusajs/medusa/commit/7b57695e00433e1d54f8cdc912ef7e5f28fc1071) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen): x-expanded-relations

### Patch Changes

- [#3272](https://github.com/medusajs/medusa/pull/3272) [`1c40346e9`](https://github.com/medusajs/medusa/commit/1c40346e9e56718de4a6e2d5c2d52abd388343e3) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen): openapi-typescript-codegen fork

## 0.2.0-rc.0

### Minor Changes

- [#3477](https://github.com/medusajs/medusa/pull/3477) [`826d4bedf`](https://github.com/medusajs/medusa/commit/826d4bedfe1b6459163711d5173eb8eadfdea26e) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen,types): SetRelation on expanded types

- [#3442](https://github.com/medusajs/medusa/pull/3442) [`7b57695e0`](https://github.com/medusajs/medusa/commit/7b57695e00433e1d54f8cdc912ef7e5f28fc1071) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen): x-expanded-relations

### Patch Changes

- [#3272](https://github.com/medusajs/medusa/pull/3272) [`1c40346e9`](https://github.com/medusajs/medusa/commit/1c40346e9e56718de4a6e2d5c2d52abd388343e3) Thanks [@patrick-medusajs](https://github.com/patrick-medusajs)! - feat(codegen): openapi-typescript-codegen fork

## 0.1.0
@@ -0,0 +1,39 @@
# OpenAPI TypeScript Codegen - 0.1.0 - experimental

Node.js library that generates TypeScript clients based on the OpenAPI specification.

## About this fork

This package is a significantly customized fork of
the amazing npm [openapi-typescript-codegen](https://github.com/ferdikoomen/openapi-typescript-codegen) package 💜.

### Brief reasoning

We wanted a level of customization that was not achievable through the source package's interface.
We started with a conventional fork, but the development workflow was hindering our ability to iterate quickly.
We decided to fold the package into our monorepo to hopefully accelerate development and innovation.

### Noteworthy differences

* Added the ability to generate React hooks in the style of `medusa-react`.
* Added access to the raw OAS on parsed elements from the parser.
* Added access to `operationId` from path.
* Added access to `x-codegen` from path.
* Added renaming of service methods when `x-codegen.method` is declared.
* Added bundling of query params into a typed object when `x-codegen.queryParams` is declared.
* Added parameterization of the import path for the client and models.
* Added generated `index.ts` files in directories to simplify imports.
* Updated npm dependencies' versions to match existing versions found in the monorepo.
* Updated the coding style to match the convention of the monorepo.
* Removed support for Angular client generation.
* Removed support for OAS v2 parsing.
* Removed documentation.
* Removed tests (temporarily).

## Install

`yarn add --dev @medusajs/openapi-typescript-codegen`

## How to use

See `@medusajs/medusa-oas-cli` for usage.