Feat(): faster entity serializer (#13564)

**What**
Further improves serialization of complex entities to reduce a performance bottleneck.

**Notes**
It might be worth integrating these improvements into the MikroORM package at some point 📦


**Load test**
The test uses autocannon and wraps the serializer calls behind HTTP endpoints.
Each product has 2 variants, 3 options, and 3 values per option.

autocannon is configured with 10 connections, a 20-second duration, and a pipelining of 1. This is repeated for each serializer and each dataset size.
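For reference, each serializer endpoint is exercised roughly as in the sketch below, which mirrors the test code in this PR; the helper name is illustrative and not part of the change.

```ts
import autocannon, { Result } from "autocannon"

// Minimal sketch: hit one serializer endpoint with 10 connections for
// 20 seconds at a pipelining of 1, matching the configuration described above.
// `runLoadTest` is an illustrative helper name, not part of this PR.
function runLoadTest(port: number, path: string): Promise<Result> {
  return new Promise<Result>((resolve, reject) => {
    autocannon(
      {
        url: `http://localhost:${port}${path}`,
        connections: 10,
        duration: 20,
        pipelining: 1,
      },
      (err, result) => (err ? reject(err) : resolve(result!))
    )
  })
}
```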

    🚀 Load Testing Serializers with Autocannon
    ================================================================================

    ====================================================================================================
    🎯 TESTING 10 PRODUCTS
    ====================================================================================================
    🖥️  Server started on port 57840
    📊 Testing with 10 products per request

    🔥 Load testing: MikroOrm
    --------------------------------------------------
       Requests/sec: 33.85
       Avg Latency: 319.30ms
       P90 Latency: 327.00ms
       Throughput: 31.36 MB/s
       Errors: 0

    🔥 Load testing: Current
    --------------------------------------------------
       Requests/sec: 821.15
       Avg Latency: 11.67ms
       P90 Latency: 12.00ms
       Throughput: 0.18 MB/s
       Errors: 0

    🔥 Load testing: Optimized
    --------------------------------------------------
       Requests/sec: 1286.75
       Avg Latency: 7.25ms
       P90 Latency: 7.00ms
       Throughput: 37.31 MB/s
       Errors: 0

    📈 Load Testing Performance Comparison for 10 products:
    --------------------------------------------------------------------------------------------------------------------------------------------
    Serializer      Requests/sec    Avg Latency (ms)   P90 Latency (ms)   Throughput (MB/s)  Errors     RPS Improvement
    --------------------------------------------------------------------------------------------------------------------------------------------
    MikroOrm        33.85           319.30             327.00             31.36              0          baseline
    Current         821.15          11.67              12.00              0.18               0          24.3x
    Optimized       1286.75         7.25               7.00               37.31              0          38.0x

    🎯 Key Insights for 10 products:
       • Optimized serializer handles 1.6x more requests/sec than Current
       • Optimized serializer handles 38.0x more requests/sec than MikroOrm
       • 37.9% lower latency compared to Current  serializer

    🔴 Server stopped for 10 products test

    ====================================================================================================
    🎯 TESTING 100 PRODUCTS
    ====================================================================================================
    🖥️  Server started on port 57878
    📊 Testing with 100 products per request

    🔥 Load testing: MikroOrm
    --------------------------------------------------
       Requests/sec: 3.69
       Avg Latency: 3241.29ms
       P90 Latency: 4972.00ms
       Throughput: 35.04 MB/s
       Errors: 0

    🔥 Load testing: Current
    --------------------------------------------------
       Requests/sec: 87.45
       Avg Latency: 117.20ms
       P90 Latency: 116.00ms
       Throughput: 0.02 MB/s
       Errors: 0

    🔥 Load testing: Optimized
    --------------------------------------------------
       Requests/sec: 143.56
       Avg Latency: 70.62ms
       P90 Latency: 72.00ms
       Throughput: 42.22 MB/s
       Errors: 0

    📈 Load Testing Performance Comparison for 100 products:
    --------------------------------------------------------------------------------------------------------------------------------------------
    Serializer      Requests/sec    Avg Latency (ms)   P90 Latency (ms)   Throughput (MB/s)  Errors     RPS Improvement
    --------------------------------------------------------------------------------------------------------------------------------------------
    MikroOrm        3.69            3241.29            4972.00            35.04              0          baseline
    Current         87.45           117.20             116.00             0.02               0          23.7x
    Optimized       143.56          70.62              72.00              42.22              0          38.9x

    🎯 Key Insights for 100 products:
       • Optimized serializer handles 1.6x more requests/sec than Current
       • Optimized serializer handles 38.9x more requests/sec than MikroOrm
       • 39.7% lower latency compared to Current  serializer

    🔴 Server stopped for 100 products test

    ====================================================================================================
    🎯 TESTING 1,000 PRODUCTS
    ====================================================================================================
    🖥️  Server started on port 57930
    📊 Testing with 1000 products per request

    🔥 Load testing: MikroOrm
    --------------------------------------------------
       Requests/sec: 0.00
       Avg Latency: 0.00ms
       P90 Latency: 0.00ms
       Throughput: 0.00 MB/s
       Errors: 10

    🔥 Load testing: Current
    --------------------------------------------------
       Requests/sec: 0.00
       Avg Latency: 0.00ms
       P90 Latency: 0.00ms
       Throughput: 0.00 MB/s
       Errors: 20

    🔥 Load testing: Optimized
    --------------------------------------------------
       Requests/sec: 13.79
       Avg Latency: 792.94ms
       P90 Latency: 755.00ms
       Throughput: 41.47 MB/s
       Errors: 0

    📈 Load Testing Performance Comparison for 1000 products:
    --------------------------------------------------------------------------------------------------------------------------------------------
    Serializer      Requests/sec    Avg Latency (ms)   P90 Latency (ms)   Throughput (MB/s)  Errors     RPS Improvement
    --------------------------------------------------------------------------------------------------------------------------------------------
    MikroOrm        0.00            0.00               0.00               0.00               10         NaNx
    Current         0.00            0.00               0.00               0.00               20         NaNx
    Optimized       13.79           792.94             755.00             41.47              0          Infinityx

    🎯 Key Insights for 1000 products:
       • Optimized serializer handles Infinityx more requests/sec than Current
       • Optimized serializer handles Infinityx more requests/sec than MikroOrm
       • -Infinity% lower latency compared to Current  serializer

    🔴 Server stopped for 1000 products test


    ======================================================================================================================================================
    📊 COMPREHENSIVE AUTOCANNON LOAD TESTING ANALYSIS
    ======================================================================================================================================================

    🚀 Autocannon Load Testing Scaling Analysis:
    --------------------------------------------------------------------------------------------------------------------------------------------
    Size         RPS (M/O/Op)         Avg Latency (M/O/Op)   P90 Latency (M/O/Op)   Throughput MB/s (M/O/Op)  Speedup vs M (O/Op) Speedup vs O (Op)
    --------------------------------------------------------------------------------------------------------------------------------------------
    10           33.9/821.1/1286.8    319.3/11.7/7.3         327.0/12.0/7.0         31.4/0.2/37.3             24.3x/38.0x        1.6x
    100          3.7/87.5/143.6       3241.3/117.2/70.6      4972.0/116.0/72.0      35.0/0.0/42.2             23.7x/38.9x        1.6x
    1,000        0.0/0.0/13.8         0.0/0.0/792.9          0.0/0.0/755.0          0.0/0.0/41.5              NaNx/Infinityx     Infinityx

    🎯 Overall Load Testing Performance Summary:

       📈 10 products:
          • +56.7% more requests/sec vs Current  (821.1 → 1286.8)
          • +3701.3% more requests/sec vs MikroOrm (33.9 → 1286.8)

       📈 100 products:
          • +64.2% more requests/sec vs Current  (87.5 → 143.6)
          • +3790.5% more requests/sec vs MikroOrm (3.7 → 143.6)

       📈 1000 products:
          • +Infinity% more requests/sec vs Current  (0.0 → 13.8)
          • +Infinity% more requests/sec vs MikroOrm (0.0 → 13.8)
Authored by Adrien de Peretti on 2025-09-22 19:02:10 +02:00, committed by GitHub
parent 12a96a7c70 · commit e39c472a84
7 changed files with 1731 additions and 457 deletions


@@ -9,7 +9,7 @@ import {
createAdminUser,
} from "../../../helpers/create-admin-user"
jest.setTimeout(30000)
jest.setTimeout(60000)
medusaIntegrationTestRunner({
testSuite: ({ dbConnection, getContainer, api }) => {


@@ -29,7 +29,9 @@
"@medusajs/types": "2.10.3",
"@swc/core": "^1.7.28",
"@swc/jest": "^0.2.36",
"@types/autocannon": "^7.12.5",
"@types/express": "^4.17.21",
"autocannon": "^7.15.0",
"expect-type": "^0.20.0",
"express": "^4.21.0",
"jest": "^29.7.0",


@@ -1,4 +1,6 @@
import { MikroORM } from "@medusajs/deps/mikro-orm/core"
import express from "express"
import autocannon, { Result } from "autocannon"
import { MikroORM, EntitySerializer } from "@medusajs/deps/mikro-orm/core"
import { defineConfig } from "@medusajs/deps/mikro-orm/postgresql"
import {
Entity1WithUnDecoratedProp,
@@ -8,8 +10,11 @@ import {
ProductOptionValue,
ProductVariant,
} from "../__fixtures__/utils"
import { mikroOrmSerializer as mikroOrmSerializerOld } from "../mikro-orm-serializer-old"
import { mikroOrmSerializer } from "../mikro-orm-serializer"
jest.setTimeout(60000)
describe("mikroOrmSerializer", () => {
beforeEach(async () => {
await MikroORM.init(
@@ -44,7 +49,7 @@ describe("mikroOrmSerializer", () => {
})
entity1.entity2.add(entity2)
const serialized = await mikroOrmSerializer(entity1, {
const serialized = mikroOrmSerializer(entity1, {
preventCircularRef: false,
})
@@ -81,7 +86,7 @@ describe("mikroOrmSerializer", () => {
})
entity1.entity2.add(entity2)
const serialized = await mikroOrmSerializer([entity1, entity1], {
const serialized = mikroOrmSerializer([entity1, entity1], {
preventCircularRef: false,
})
@@ -120,7 +125,7 @@ describe("mikroOrmSerializer", () => {
})
entity1.entity2.add(entity2)
const serialized = await mikroOrmSerializer(entity1)
const serialized = mikroOrmSerializer(entity1)
expect(serialized).toEqual({
id: "1",
@@ -158,7 +163,7 @@ describe("mikroOrmSerializer", () => {
product.options.add(productOptions)
product.variants.add(productVariant)
const serialized = await mikroOrmSerializer(product)
const serialized = mikroOrmSerializer(product)
expect(serialized).toEqual({
id: "1",
@@ -201,4 +206,372 @@ describe("mikroOrmSerializer", () => {
name: "Product 1",
})
})
it.skip("should benchmark serializers with autocannon load testing", async () => {
const logs: string[] = []
logs.push("🚀 Load Testing Serializers with Autocannon")
logs.push("=".repeat(80))
// Generate test dataset
function generateLoadTestProducts(count: number): Product[] {
const products: Product[] = []
for (let i = 0; i < count; i++) {
const product = new Product()
product.id = `product-${i}`
product.name = `Product ${i}`
// Generate 3 options per product
for (let optionIndex = 0; optionIndex < 3; optionIndex++) {
const option = new ProductOption()
option.id = `option-${product.id}-${optionIndex}`
option.name = `Option ${optionIndex} for Product ${product.id}`
option.product = product
// Generate 3 values per option
for (let valueIndex = 0; valueIndex < 3; valueIndex++) {
const value = new ProductOptionValue()
value.id = `option-value-${option.id}-${valueIndex}`
value.name = `Option Value ${valueIndex} for Option ${option.id}`
value.option_id = option.id
value.option = option
option.values.add(value)
}
product.options.add(option)
}
// Generate 2 variants per product
for (let variantIndex = 0; variantIndex < 2; variantIndex++) {
const variant = new ProductVariant()
variant.id = `variant-${product.id}-${variantIndex}`
variant.name = `Variant ${variantIndex} for Product ${product.id}`
variant.product_id = product.id
variant.product = product
// Assign option values to variants
const optionArray = product.options.getItems()
for (let j = 0; j < 2 && j < optionArray.length; j++) {
const option = optionArray[j]
const optionValues = option.values.getItems()
if (optionValues.length > 0) {
const value = optionValues[0]
variant.options.add(value)
value.variants.add(variant)
}
}
product.variants.add(variant)
}
products.push(product)
}
return products
}
// Test different dataset sizes
const testSizes = [10, 100, 1000]
const allResults: Array<{
size: number
results: Array<{
name: string
requestsPerSecond: number
latency: number
latencyP90: number
throughput: number
errors: number
}>
}> = []
for (const size of testSizes) {
logs.push(`\n${"=".repeat(100)}`)
logs.push(`🎯 TESTING ${size.toLocaleString()} PRODUCTS`)
logs.push(`${"=".repeat(100)}`)
// Create test dataset for this size
const testProducts = generateLoadTestProducts(size)
// Create Express server with serializer endpoints
const app = express()
app.use(express.json())
app.get("/mikro-orm", (_req, res) => {
const result = testProducts.map((product) =>
EntitySerializer.serialize(product, {
populate: ["*"],
forceObject: true,
skipNull: undefined,
ignoreSerializers: undefined,
exclude: undefined,
})
)
res.json(result)
})
app.get("/original", (_req, res) => {
const result = mikroOrmSerializerOld(testProducts)
res.json(result)
})
app.get("/optimized", (_req, res) => {
const result = mikroOrmSerializer(testProducts)
res.json(result)
})
// Start server
const server = app.listen(0) // Use port 0 for automatic port assignment
const port = (server.address() as any)?.port
if (!port) {
throw new Error("Failed to start server")
}
logs.push(`🖥️ Server started on port ${port}`)
logs.push(`📊 Testing with ${testProducts.length} products per request`)
try {
// Autocannon test configurations
const testConfigs = [
{ name: "MikroOrm", path: "/mikro-orm" },
{ name: "Original", path: "/original" },
{ name: "Optimized", path: "/optimized" },
]
const sizeResults: Array<{
name: string
requestsPerSecond: number
latency: number
latencyP90: number
throughput: number
errors: number
}> = []
for (const config of testConfigs) {
logs.push(`\n🔥 Load testing: ${config.name}`)
logs.push("-".repeat(50))
const result = await new Promise<Result>((resolve, reject) => {
autocannon(
{
url: `http://localhost:${port}${config.path}`,
connections: 10,
duration: 20, // 20 seconds
pipelining: 1,
},
(err, result) => {
if (err) {
reject(err)
} else {
resolve(result!)
}
}
)
})
const requestsPerSecond = result.requests.average
const latency = result.latency.average
const latencyP90 = result.latency.p90
const throughput = result.throughput.average
const errors = result.errors
logs.push(` Requests/sec: ${requestsPerSecond.toFixed(2)}`)
logs.push(` Avg Latency: ${latency.toFixed(2)}ms`)
logs.push(` P90 Latency: ${latencyP90.toFixed(2)}ms`)
logs.push(
` Throughput: ${(throughput / 1024 / 1024).toFixed(2)} MB/s`
)
logs.push(` Errors: ${errors}`)
sizeResults.push({
name: config.name,
requestsPerSecond,
latency,
latencyP90,
throughput,
errors,
})
}
// Generate comparison table for this size
logs.push(
`\n📈 Load Testing Performance Comparison for ${size} products:`
)
logs.push("-".repeat(140))
logs.push(
`${"Serializer".padEnd(15)} ${"Requests/sec".padEnd(
15
)} ${"Avg Latency (ms)".padEnd(18)} ${"P90 Latency (ms)".padEnd(
18
)} ${"Throughput (MB/s)".padEnd(18)} ${"Errors".padEnd(
10
)} ${"RPS Improvement"}`
)
logs.push("-".repeat(140))
const baselineRps = sizeResults[0].requestsPerSecond // MikroOrm as baseline
sizeResults.forEach((result) => {
const rpsImprovement = result.requestsPerSecond / baselineRps
const improvementText =
rpsImprovement === 1 ? "baseline" : `${rpsImprovement.toFixed(1)}x`
logs.push(
`${result.name.padEnd(15)} ${result.requestsPerSecond
.toFixed(2)
.padEnd(15)} ${result.latency
.toFixed(2)
.padEnd(18)} ${result.latencyP90.toFixed(2).padEnd(18)} ${(
result.throughput /
1024 /
1024
)
.toFixed(2)
.padEnd(18)} ${result.errors
.toString()
.padEnd(10)} ${improvementText}`
)
})
logs.push(`\n🎯 Key Insights for ${size} products:`)
const optimizedResult = sizeResults.find((r) => r.name === "Optimized")
const originalResult = sizeResults.find((r) => r.name === "Original")
const mikroOrmResult = sizeResults.find((r) => r.name === "MikroOrm")
if (optimizedResult && originalResult && mikroOrmResult) {
const rpsImprovementVsOriginal =
optimizedResult.requestsPerSecond / originalResult.requestsPerSecond
const rpsImprovementVsMikroOrm =
optimizedResult.requestsPerSecond / mikroOrmResult.requestsPerSecond
const latencyImprovementVsOriginal =
((originalResult.latency - optimizedResult.latency) /
originalResult.latency) *
100
logs.push(
` • Optimized serializer handles ${rpsImprovementVsOriginal.toFixed(
1
)}x more requests/sec than Original`
)
logs.push(
` • Optimized serializer handles ${rpsImprovementVsMikroOrm.toFixed(
1
)}x more requests/sec than MikroOrm`
)
logs.push(
` • ${latencyImprovementVsOriginal.toFixed(
1
)}% lower latency compared to Original serializer`
)
}
allResults.push({ size, results: sizeResults })
} finally {
// Clean up server
server.close()
logs.push(`\n🔴 Server stopped for ${size} products test`)
}
}
// Generate comprehensive comparison across all sizes
logs.push(`\n\n${"=".repeat(150)}`)
logs.push("📊 COMPREHENSIVE AUTOCANNON LOAD TESTING ANALYSIS")
logs.push(`${"=".repeat(150)}`)
logs.push("\n🚀 Autocannon Load Testing Scaling Analysis:")
logs.push("-".repeat(140))
logs.push(
`${"Size".padEnd(12)} ${"RPS (M/O/Op)".padEnd(
20
)} ${"Avg Latency (M/O/Op)".padEnd(22)} ${"P90 Latency (M/O/Op)".padEnd(
22
)} ${"Throughput MB/s (M/O/Op)".padEnd(
25
)} ${"Speedup vs M (O/Op)".padEnd(18)} ${"Speedup vs O (Op)"}`
)
logs.push("-".repeat(140))
allResults.forEach(({ size, results }) => {
const mikroOrm = results.find((r) => r.name === "MikroOrm")
const original = results.find((r) => r.name === "Original")
const optimized = results.find((r) => r.name === "Optimized")
if (original && optimized && mikroOrm) {
const speedupOptimizedVsMikroOrm =
optimized.requestsPerSecond / mikroOrm.requestsPerSecond
const speedupOptimizedVsOriginal =
optimized.requestsPerSecond / original.requestsPerSecond
const speedupOriginalVsMikroOrm =
original.requestsPerSecond / mikroOrm.requestsPerSecond
const rpsValues = `${mikroOrm.requestsPerSecond.toFixed(
1
)}/${original.requestsPerSecond.toFixed(
1
)}/${optimized.requestsPerSecond.toFixed(1)}`
const avgLatencyValues = `${mikroOrm.latency.toFixed(
1
)}/${original.latency.toFixed(1)}/${optimized.latency.toFixed(1)}`
const p90LatencyValues = `${mikroOrm.latencyP90.toFixed(
1
)}/${original.latencyP90.toFixed(1)}/${optimized.latencyP90.toFixed(1)}`
const throughputValues = `${(mikroOrm.throughput / 1024 / 1024).toFixed(
1
)}/${(original.throughput / 1024 / 1024).toFixed(1)}/${(
optimized.throughput /
1024 /
1024
).toFixed(1)}`
const speedupVsMikroOrm = `${speedupOriginalVsMikroOrm.toFixed(
1
)}x/${speedupOptimizedVsMikroOrm.toFixed(1)}x`
const speedupVsOriginal = `${speedupOptimizedVsOriginal.toFixed(1)}x`
logs.push(
`${size.toLocaleString().padEnd(12)} ${rpsValues.padEnd(
20
)} ${avgLatencyValues.padEnd(22)} ${p90LatencyValues.padEnd(
22
)} ${throughputValues.padEnd(25)} ${speedupVsMikroOrm.padEnd(
18
)} ${speedupVsOriginal}`
)
}
})
logs.push(`\n🎯 Overall Load Testing Performance Summary:`)
allResults.forEach(({ size, results }) => {
const optimized = results.find((r) => r.name === "Optimized")
const original = results.find((r) => r.name === "Original")
const mikroOrm = results.find((r) => r.name === "MikroOrm")
if (optimized && original && mikroOrm) {
const rpsGainVsOriginal =
((optimized.requestsPerSecond - original.requestsPerSecond) /
original.requestsPerSecond) *
100
const rpsGainVsMikroOrm =
((optimized.requestsPerSecond - mikroOrm.requestsPerSecond) /
mikroOrm.requestsPerSecond) *
100
logs.push(`\n 📈 ${size} products:`)
logs.push(
` • +${rpsGainVsOriginal.toFixed(
1
)}% more requests/sec vs Original (${original.requestsPerSecond.toFixed(
1
)} → ${optimized.requestsPerSecond.toFixed(1)})`
)
logs.push(
` • +${rpsGainVsMikroOrm.toFixed(
1
)}% more requests/sec vs MikroOrm (${mikroOrm.requestsPerSecond.toFixed(
1
)} → ${optimized.requestsPerSecond.toFixed(1)})`
)
}
})
console.log(logs.join("\n"))
}, 1200000)
})


@@ -109,14 +109,14 @@ class BigNumberNumeric extends Type<string | number, string> {
}
override convertToJSValue(value: string): number | string {
if ((this.mode ?? this.prop?.runtimeType) === "number") {
return +value
}
if (isObject(value)) {
return value // Special case for BigNumberRawValue because the setter will manage the dispatch automatically at a later stage
}
if ((this.mode ?? this.prop?.runtimeType) === "number") {
return +value
}
return String(value)
}


@@ -0,0 +1,684 @@
/**
* An optimized MikroORM entity serializer that builds a serialization pipeline
* designed to leverage V8's JIT compilation and inline caching mechanisms.
*/
import {
Collection,
EntityDTO,
EntityMetadata,
helper,
IPrimaryKey,
Loaded,
Platform,
Reference,
ReferenceKind,
SerializationContext,
Utils,
} from "@mikro-orm/core"
const STATIC_OPTIONS_SHAPE: {
populate: string[] | boolean | undefined
exclude: string[] | undefined
preventCircularRef: boolean | undefined
skipNull: boolean | undefined
ignoreSerializers: boolean | undefined
forceObject: boolean | undefined
} = {
populate: ["*"],
exclude: undefined,
preventCircularRef: true,
skipNull: undefined,
ignoreSerializers: undefined,
forceObject: true,
}
const EMPTY_ARRAY: string[] = []
const WILDCARD = "*"
const DOT = "."
const UNDERSCORE = "_"
// JIT-friendly function with predictable patterns
function isVisible<T extends object>(
meta: EntityMetadata<T>,
propName: string,
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
} = STATIC_OPTIONS_SHAPE
): boolean {
// Fast path for boolean populate
const populate = options.populate
if (populate === true) {
return true
}
if (Array.isArray(populate)) {
// Check exclusions first (early exit)
const exclude = options.exclude
if (exclude && exclude.length > 0) {
const excludeLen = exclude.length
for (let i = 0; i < excludeLen; i++) {
if (exclude[i] === propName) {
return false
}
}
}
// Hoist computations outside loop
const propNameLen = propName.length
const propPrefix = propName + DOT
const propPrefixLen = propPrefix.length
const populateLen = populate.length
// Simple loop that JIT can optimize well
for (let i = 0; i < populateLen; i++) {
const item = populate[i]
if (item === propName || item === WILDCARD) {
return true
}
if (
item.length > propNameLen &&
item.substring(0, propPrefixLen) === propPrefix
) {
return true
}
}
return false
}
// Inline property check for non-array case
const prop = meta.properties[propName]
const visible = (prop && !prop.hidden) || prop === undefined
const prefixed = prop && !prop.primary && propName.charAt(0) === UNDERSCORE
return visible && !prefixed
}
// Clean, JIT-friendly function
function isPopulated<T extends object>(
entity: T,
propName: string,
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
} = STATIC_OPTIONS_SHAPE
): boolean {
const populate = options.populate
// Fast path for boolean
if (typeof populate === "boolean") {
return populate
}
if (!Array.isArray(populate)) {
return false
}
// Hoist computations for JIT optimization
const propNameLen = propName.length
const propPrefix = propName + DOT
const propPrefixLen = propPrefix.length
const populateLen = populate.length
// Simple predictable loop
for (let i = 0; i < populateLen; i++) {
const item = populate[i]
if (item === propName || item === WILDCARD) {
return true
}
if (
item.length > propNameLen &&
item.substring(0, propPrefixLen) === propPrefix
) {
return true
}
}
return false
}
/**
* Custom property filtering for the serialization which takes into account circular references to not return them.
* @param propName
* @param meta
* @param options
* @param parents
*/
// @ts-ignore
function filterEntityPropToSerialize({
propName,
meta,
options,
parents,
}: {
propName: string
meta: EntityMetadata
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
}
parents?: string[]
}): boolean {
const parentsArray = parents || EMPTY_ARRAY
const isVisibleRes = isVisible(meta, propName, options)
const prop = meta.properties[propName]
if (
prop &&
options.preventCircularRef &&
isVisibleRes &&
prop.kind !== ReferenceKind.SCALAR
) {
if (!!prop.mapToPk) {
return true
}
const parentsLen = parentsArray.length
for (let i = 0; i < parentsLen; i++) {
if (parentsArray[i] === prop.type) {
return false
}
}
return true
}
return isVisibleRes
}
export class EntitySerializer {
// Thread-safe per-instance cache to avoid concurrency issues
private static readonly PROPERTY_CACHE_SIZE = 2000
static serialize<T extends object, P extends string = never>(
entity: T,
options: Partial<typeof STATIC_OPTIONS_SHAPE> = STATIC_OPTIONS_SHAPE,
parents: string[] = EMPTY_ARRAY
): EntityDTO<Loaded<T, P>> {
// Avoid Array.from and Set allocation for hot path
const parents_ = parents.length > 0 ? Array.from(new Set(parents)) : []
const wrapped = helper(entity)
const meta = wrapped.__meta
let contextCreated = false
if (!wrapped.__serializationContext.root) {
const root = new SerializationContext<T>({} as any)
SerializationContext.propagate(
root,
entity,
(meta, prop) => meta.properties[prop]?.kind !== ReferenceKind.SCALAR
)
contextCreated = true
}
const root = wrapped.__serializationContext
.root! as SerializationContext<any> & {
visitedSerialized?: Map<string, any>
}
const ret = {} as EntityDTO<Loaded<T, P>>
// Use Set for deduplication but keep it simple
const keys = new Set<string>()
const primaryKeys = meta.primaryKeys
const primaryKeysLen = primaryKeys.length
for (let i = 0; i < primaryKeysLen; i++) {
keys.add(primaryKeys[i])
}
const entityKeys = Object.keys(entity)
const entityKeysLen = entityKeys.length
for (let i = 0; i < entityKeysLen; i++) {
keys.add(entityKeys[i])
}
const visited = root.visited.has(entity)
if (!visited) {
root.visited.add(entity)
}
const keysArray = Array.from(keys)
const keysLen = keysArray.length
// Hoist invariant calculations
const className = meta.className
const platform = wrapped.__platform
const skipNull = options.skipNull
const metaProperties = meta.properties
const preventCircularRef = options.preventCircularRef
// Clean property processing loop
for (let i = 0; i < keysLen; i++) {
const prop = keysArray[i]
// Simple filtering logic
const isVisibleRes = isVisible(meta, prop, options)
const propMeta = metaProperties[prop]
let shouldSerialize = isVisibleRes
if (
propMeta &&
preventCircularRef &&
isVisibleRes &&
propMeta.kind !== ReferenceKind.SCALAR
) {
if (!!propMeta.mapToPk) {
shouldSerialize = true
} else {
const parentsLen = parents_.length
for (let j = 0; j < parentsLen; j++) {
if (parents_[j] === propMeta.type) {
shouldSerialize = false
break
}
}
}
}
if (!shouldSerialize) {
continue
}
const cycle = root.visit(className, prop)
if (cycle && visited) continue
const val = this.processProperty<T>(
prop as keyof T & string,
entity,
options,
parents_
)
if (!cycle) {
root.leave(className, prop)
}
if (skipNull && Utils.isPlainObject(val)) {
Utils.dropUndefinedProperties(val, null)
}
if (typeof val !== "undefined" && !(val === null && skipNull)) {
ret[this.propertyName(meta, prop as keyof T & string, platform)] =
val as T[keyof T & string]
}
}
if (contextCreated) {
root.close()
}
if (!wrapped.isInitialized()) {
return ret
}
// Clean getter processing
const metaProps = meta.props
const metaPropsLen = metaProps.length
for (let i = 0; i < metaPropsLen; i++) {
const prop = metaProps[i]
const propName = prop.name
// Clear, readable conditions
if (
prop.getter &&
prop.getterName === undefined &&
typeof entity[propName] !== "undefined" &&
isVisible(meta, propName, options)
) {
ret[this.propertyName(meta, propName, platform)] = this.processProperty(
propName,
entity,
options,
parents_
)
} else if (
prop.getterName &&
(entity[prop.getterName] as unknown) instanceof Function &&
isVisible(meta, propName, options)
) {
ret[this.propertyName(meta, propName, platform)] = this.processProperty(
prop.getterName as keyof T & string,
entity,
options,
parents_
)
}
}
return ret
}
// Thread-safe property name resolution with WeakMap for per-entity caching
private static propertyNameCache = new WeakMap<
EntityMetadata<any>,
Map<string, string>
>()
private static propertyName<T>(
meta: EntityMetadata<T>,
prop: string,
platform?: Platform
): string {
// Use WeakMap per metadata to avoid global cache conflicts
let entityCache = this.propertyNameCache.get(meta)
if (!entityCache) {
entityCache = new Map<string, string>()
this.propertyNameCache.set(meta, entityCache)
}
const cacheKey = `${prop}:${platform?.constructor.name || "no-platform"}`
const cached = entityCache.get(cacheKey)
if (cached !== undefined) {
return cached
}
// Inline property resolution for hot path
let result: string
const property = meta.properties[prop]
/* istanbul ignore next */
if (property?.serializedName) {
result = property.serializedName as string
} else if (property?.primary && platform) {
result = platform.getSerializedPrimaryKeyField(prop) as string
} else {
result = prop
}
// Prevent cache from growing too large
if (entityCache.size >= this.PROPERTY_CACHE_SIZE) {
entityCache.clear() // Much faster than selective deletion
}
entityCache.set(cacheKey, result)
return result
}
private static processProperty<T extends object>(
prop: string,
entity: T,
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
},
parents: string[] = EMPTY_ARRAY
): T[keyof T] | undefined {
// Avoid array allocation when not needed
const parents_ =
parents.length > 0
? [...parents, entity.constructor.name]
: [entity.constructor.name]
// Handle dotted properties efficiently
const parts = prop.split(DOT)
prop = parts[0] as string & keyof T
const wrapped = helper(entity)
const property = wrapped.__meta.properties[prop]
const serializer = property?.serializer
const propValue = entity[prop]
// Fast path for function properties
if ((propValue as unknown) instanceof Function) {
const returnValue = (propValue as unknown as () => T[keyof T & string])()
if (!options.ignoreSerializers && serializer) {
return serializer(returnValue)
}
return returnValue
}
/* istanbul ignore next */
if (!options.ignoreSerializers && serializer) {
return serializer(propValue)
}
// Type checks in optimal order
if (Utils.isCollection(propValue)) {
return this.processCollection(
prop as keyof T & string,
entity,
options,
parents_
)
}
if (Utils.isEntity(propValue, true)) {
return this.processEntity(
prop as keyof T & string,
entity,
wrapped.__platform,
options,
parents_
)
}
/* istanbul ignore next */
if (property?.reference === ReferenceKind.EMBEDDED) {
if (Array.isArray(propValue)) {
return (propValue as object[]).map((item) =>
helper(item).toJSON()
) as T[keyof T]
}
if (Utils.isObject(propValue)) {
return helper(propValue).toJSON() as T[keyof T]
}
}
const customType = property?.customType
if (customType) {
return customType.toJSON(propValue, wrapped.__platform)
}
return wrapped.__platform.normalizePrimaryKey(
propValue as unknown as IPrimaryKey
) as unknown as T[keyof T]
}
private static extractChildOptions<T extends object>(
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
},
prop: keyof T & string
): Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
} {
const propPrefix = prop + DOT
const propPrefixLen = propPrefix.length
// Inline function to avoid call overhead
const extractChildElements = (items: string[]) => {
const result: string[] = []
const itemsLen = items.length
// Traditional for loop for better performance
for (let i = 0; i < itemsLen; i++) {
const field = items[i]
if (
field.length > propPrefixLen &&
field.substring(0, propPrefixLen) === propPrefix
) {
result.push(field.substring(propPrefixLen))
}
}
return result
}
const populate = options.populate
const exclude = options.exclude
// Avoid object spread when possible
const result = {
populate:
Array.isArray(populate) && !populate.includes(WILDCARD)
? extractChildElements(populate as unknown as string[])
: populate,
exclude:
Array.isArray(exclude) && !exclude.includes(WILDCARD)
? extractChildElements(exclude)
: exclude,
preventCircularRef: options.preventCircularRef,
skipNull: options.skipNull,
ignoreSerializers: options.ignoreSerializers,
forceObject: options.forceObject,
} as Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
}
return result
}
private static processEntity<T extends object>(
prop: keyof T & string,
entity: T,
platform: Platform,
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
},
parents: string[] = EMPTY_ARRAY
): T[keyof T] | undefined {
const parents_ =
parents.length > 0
? [...parents, entity.constructor.name]
: [entity.constructor.name]
const child = Reference.unwrapReference(entity[prop] as T)
const wrapped = helper(child)
// Fixed: was incorrectly calling isPopulated(child, prop, options) instead of isPopulated(entity, prop, options)
const populated =
isPopulated(entity, prop, options) && wrapped.isInitialized()
const expand = populated || options.forceObject || !wrapped.__managed
if (expand) {
return this.serialize(
child,
this.extractChildOptions(options, prop),
parents_
) as T[keyof T]
}
return platform.normalizePrimaryKey(
wrapped.getPrimaryKey() as IPrimaryKey
) as T[keyof T]
}
private static processCollection<T extends object>(
prop: keyof T & string,
entity: T,
options: Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef?: boolean
populate?: string[] | boolean
},
parents: string[] = EMPTY_ARRAY
): T[keyof T] | undefined {
const parents_ =
parents.length > 0
? [...parents, entity.constructor.name]
: [entity.constructor.name]
const col = entity[prop] as unknown as Collection<T>
if (!col.isInitialized()) {
return undefined
}
const items = col.getItems(false)
const itemsLen = items.length
const result = new Array(itemsLen)
const childOptions = this.extractChildOptions(options, prop)
// Check if the collection property itself should be populated
// Fixed: was incorrectly calling isPopulated(item, prop, options) instead of isPopulated(entity, prop, options)
const shouldPopulateCollection = isPopulated(entity, prop, options)
for (let i = 0; i < itemsLen; i++) {
const item = items[i]
if (shouldPopulateCollection) {
result[i] = this.serialize(item, childOptions, parents_)
} else {
result[i] = helper(item).getPrimaryKey()
}
}
return result as unknown as T[keyof T]
}
}
export const mikroOrmSerializer = <TOutput extends object>(
data: any,
options?: Partial<
Parameters<typeof EntitySerializer.serialize>[1] & {
preventCircularRef: boolean | undefined
populate: string[] | boolean | undefined
}
>
): Promise<TOutput> => {
return new Promise<TOutput>((resolve) => {
// Efficient options handling
if (!options) {
options = STATIC_OPTIONS_SHAPE
} else {
// Check if we can use static shape
let useStatic = true
const optionKeys = Object.keys(options)
for (let i = 0; i < optionKeys.length; i++) {
const key = optionKeys[i] as keyof typeof options
if (
options[key] !==
STATIC_OPTIONS_SHAPE[key as keyof typeof STATIC_OPTIONS_SHAPE]
) {
useStatic = false
break
}
}
if (useStatic) {
options = STATIC_OPTIONS_SHAPE
} else {
options = { ...STATIC_OPTIONS_SHAPE, ...options }
}
}
const data_ = (Array.isArray(data) ? data : [data]).filter(Boolean)
const forSerialization: object[] = []
const notForSerialization: object[] = []
// Simple classification loop
const dataLen = data_.length
for (let i = 0; i < dataLen; i++) {
const object = data_[i]
if (object.__meta) {
forSerialization.push(object)
} else {
notForSerialization.push(object)
}
}
// Pre-allocate result array
const forSerializationLen = forSerialization.length
const result: any = new Array(forSerializationLen)
for (let i = 0; i < forSerializationLen; i++) {
result[i] = EntitySerializer.serialize(forSerialization[i], options)
}
// Simple result construction
let finalResult: any
if (notForSerialization.length > 0) {
finalResult = result.concat(notForSerialization)
} else {
finalResult = result
}
resolve(Array.isArray(data) ? finalResult : finalResult[0])
})
}

File diff suppressed because it is too large.

yarn.lock

@@ -112,6 +112,13 @@ __metadata:
languageName: node
linkType: hard
"@assemblyscript/loader@npm:^0.19.21":
version: 0.19.23
resolution: "@assemblyscript/loader@npm:0.19.23"
checksum: ddd1d86f1453bd7f112471ffbb4fe2b8145342dc10d2062a1aab05265bc1ffdea497d7d61789dbe58c8751b5fdd23aba15a28301c0e30df0434f57ec021844fb
languageName: node
linkType: hard
"@atomico/rollup-plugin-sizes@npm:^1.1.4":
version: 1.1.4
resolution: "@atomico/rollup-plugin-sizes@npm:1.1.4"
@@ -3715,6 +3722,13 @@ __metadata:
languageName: node
linkType: hard
"@colors/colors@npm:1.5.0":
version: 1.5.0
resolution: "@colors/colors@npm:1.5.0"
checksum: eb42729851adca56d19a08e48d5a1e95efd2a32c55ae0323de8119052be0510d4b7a1611f2abcbf28c044a6c11e6b7d38f99fccdad7429300c37a8ea5fb95b44
languageName: node
linkType: hard
"@colors/colors@npm:1.6.0, @colors/colors@npm:^1.6.0":
version: 1.6.0
resolution: "@colors/colors@npm:1.6.0"
@@ -7444,8 +7458,10 @@ __metadata:
"@medusajs/types": 2.10.3
"@swc/core": ^1.7.28
"@swc/jest": ^0.2.36
"@types/autocannon": ^7.12.5
"@types/express": ^4.17.21
"@types/pluralize": ^0.0.33
autocannon: ^7.15.0
bignumber.js: ^9.1.2
dotenv: ^16.4.5
dotenv-expand: ^11.0.6
@@ -15213,6 +15229,15 @@ __metadata:
languageName: node
linkType: hard
"@types/autocannon@npm:^7.12.5":
version: 7.12.7
resolution: "@types/autocannon@npm:7.12.7"
dependencies:
"@types/node": "*"
checksum: 2f24b4614a50f74b79d3dbc684910cf32656dd521128778c0af5994687ecb241549366c5223695017c8f37f9053c7d5191f82e7e96f2c424fb0feaa6f19e0e28
languageName: node
linkType: hard
"@types/babel__core@npm:^7.0.0, @types/babel__core@npm:^7.1.14, @types/babel__core@npm:^7.18.0, @types/babel__core@npm:^7.20.5":
version: 7.20.5
resolution: "@types/babel__core@npm:7.20.5"
@@ -17191,6 +17216,39 @@ __metadata:
languageName: node
linkType: hard
"autocannon@npm:^7.15.0":
version: 7.15.0
resolution: "autocannon@npm:7.15.0"
dependencies:
chalk: ^4.1.0
char-spinner: ^1.0.1
cli-table3: ^0.6.0
color-support: ^1.1.1
cross-argv: ^2.0.0
form-data: ^4.0.0
has-async-hooks: ^1.0.0
hdr-histogram-js: ^3.0.0
hdr-histogram-percentiles-obj: ^3.0.0
http-parser-js: ^0.5.2
hyperid: ^3.0.0
lodash.chunk: ^4.2.0
lodash.clonedeep: ^4.5.0
lodash.flatten: ^4.4.0
manage-path: ^2.0.0
on-net-listen: ^1.1.1
pretty-bytes: ^5.4.1
progress: ^2.0.3
reinterval: ^1.1.0
retimer: ^3.0.0
semver: ^7.3.2
subarg: ^1.0.0
timestring: ^6.0.0
bin:
autocannon: autocannon.js
checksum: 5f7e9468df028c355e60c977e0f7f3978ec7b4b4590213adfc24c686c15101de6f0d67f7f7a4c8cc9b642d42ad7f2b53d88c61a9420b910c1e1dd3b220340be3
languageName: node
linkType: hard
"autoprefixer@npm:^10.1.0, autoprefixer@npm:^10.4.16, autoprefixer@npm:^10.4.17, autoprefixer@npm:^10.4.19":
version: 10.4.19
resolution: "autoprefixer@npm:10.4.19"
@@ -17507,7 +17565,7 @@ __metadata:
languageName: node
linkType: hard
"base64-js@npm:^1.3.1":
"base64-js@npm:^1.2.0, base64-js@npm:^1.3.1":
version: 1.5.1
resolution: "base64-js@npm:1.5.1"
checksum: f23823513b63173a001030fae4f2dabe283b99a9d324ade3ad3d148e218134676f1ee8568c877cd79ec1c53158dcf2d2ba527a97c606618928ba99dd930102bf
@@ -17851,7 +17909,7 @@ __metadata:
languageName: node
linkType: hard
"buffer@npm:^5.5.0, buffer@npm:^5.6.0":
"buffer@npm:^5.2.1, buffer@npm:^5.5.0, buffer@npm:^5.6.0":
version: 5.7.1
resolution: "buffer@npm:5.7.1"
dependencies:
@@ -18289,6 +18347,13 @@ __metadata:
languageName: node
linkType: hard
"char-spinner@npm:^1.0.1":
version: 1.0.1
resolution: "char-spinner@npm:1.0.1"
checksum: 59589cfb84d08605f511bc2e18aaa8a92a77b94c78413583838c390eb3c0da4cf1af1901a6f21bf7ac6b6f12521895814e8d5bbf0a7e53e202f830deb80b1a2e
languageName: node
linkType: hard
"chardet@npm:^0.7.0":
version: 0.7.0
resolution: "chardet@npm:0.7.0"
@@ -18453,6 +18518,19 @@ __metadata:
languageName: node
linkType: hard
"cli-table3@npm:^0.6.0":
version: 0.6.5
resolution: "cli-table3@npm:0.6.5"
dependencies:
"@colors/colors": 1.5.0
string-width: ^4.2.0
dependenciesMeta:
"@colors/colors":
optional: true
checksum: d7cc9ed12212ae68241cc7a3133c52b844113b17856e11f4f81308acc3febcea7cc9fd298e70933e294dd642866b29fd5d113c2c098948701d0c35f09455de78
languageName: node
linkType: hard
"cli-truncate@npm:2.1.0, cli-truncate@npm:^2.1.0":
version: 2.1.0
resolution: "cli-truncate@npm:2.1.0"
@@ -18663,6 +18741,15 @@ __metadata:
languageName: node
linkType: hard
"color-support@npm:^1.1.1":
version: 1.1.3
resolution: "color-support@npm:1.1.3"
bin:
color-support: bin.js
checksum: 8ffeaa270a784dc382f62d9be0a98581db43e11eee301af14734a6d089bd456478b1a8b3e7db7ca7dc5b18a75f828f775c44074020b51c05fc00e6d0992b1cc6
languageName: node
linkType: hard
"color@npm:^3.1.3":
version: 3.2.1
resolution: "color@npm:3.2.1"
@@ -19247,6 +19334,13 @@ __metadata:
languageName: node
linkType: hard
"cross-argv@npm:^2.0.0":
version: 2.0.0
resolution: "cross-argv@npm:2.0.0"
checksum: 6ac40b066fcca358f5249f276527370343bec0efbf752627f8aec67bc5da09bc4e23981384318252f99c6952026aa8ee220a02defdfb498f9c5c10b868eda4ff
languageName: node
linkType: hard
"cross-env@npm:^5.2.1":
version: 5.2.1
resolution: "cross-env@npm:5.2.1"
@@ -23356,6 +23450,13 @@ __metadata:
languageName: node
linkType: hard
"has-async-hooks@npm:^1.0.0":
version: 1.0.0
resolution: "has-async-hooks@npm:1.0.0"
checksum: 5846ca4b494a209727c2db0fd0627604bb16fbbcf4d02f331ea18cc6162939d4d67dc7962b570c7e9cfe13cdc550023310480f8dba2c0c5826722225800600bb
languageName: node
linkType: hard
"has-bigints@npm:^1.0.1, has-bigints@npm:^1.0.2":
version: 1.0.2
resolution: "has-bigints@npm:1.0.2"
@@ -23452,6 +23553,24 @@ __metadata:
languageName: node
linkType: hard
"hdr-histogram-js@npm:^3.0.0":
version: 3.0.1
resolution: "hdr-histogram-js@npm:3.0.1"
dependencies:
"@assemblyscript/loader": ^0.19.21
base64-js: ^1.2.0
pako: ^1.0.3
checksum: 1ee035f3ef5f3d43ea7916957da52a0f6e3fe899a745723086c769293d0e6ea2aaa0dffe8ee9fa4a2ccba8d28c76a404fc99681edb47ee76eef0627b1a779d19
languageName: node
linkType: hard
"hdr-histogram-percentiles-obj@npm:^3.0.0":
version: 3.0.0
resolution: "hdr-histogram-percentiles-obj@npm:3.0.0"
checksum: 7b1c90b01bdee22da3b6e1f95fdbe186ceb6875a53340c3615f6e9899ee5d4bc6624a0fe67a15b0e59ccddca8d3b835911214285a77fa22a04cf64c9e3d64d39
languageName: node
linkType: hard
"header-case@npm:^2.0.4":
version: 2.0.4
resolution: "header-case@npm:2.0.4"
@@ -23592,6 +23711,13 @@ __metadata:
languageName: node
linkType: hard
"http-parser-js@npm:^0.5.2":
version: 0.5.10
resolution: "http-parser-js@npm:0.5.10"
checksum: 8bbcf1832a8d70b2bd515270112116333add88738a2cc05bfb94ba6bde3be4b33efee5611584113818d2bcf654fdc335b652503be5a6b4c0b95e46f214187d93
languageName: node
linkType: hard
"http-proxy-agent@npm:^5.0.0":
version: 5.0.0
resolution: "http-proxy-agent@npm:5.0.0"
@@ -23706,6 +23832,17 @@ __metadata:
languageName: node
linkType: hard
"hyperid@npm:^3.0.0":
version: 3.3.0
resolution: "hyperid@npm:3.3.0"
dependencies:
buffer: ^5.2.1
uuid: ^8.3.2
uuid-parse: ^1.1.0
checksum: 709dafd2a25e21086a2d881adb3436a16eafd28692ebd967829ca1d5f10adf001925feb23b6aadfee464ea313d851e7ba7a5452869aaf6cde15e9149793a09e0
languageName: node
linkType: hard
"hyperlinker@npm:^1.0.0":
version: 1.0.0
resolution: "hyperlinker@npm:1.0.0"
@@ -26210,6 +26347,20 @@ __metadata:
languageName: node
linkType: hard
"lodash.chunk@npm:^4.2.0":
version: 4.2.0
resolution: "lodash.chunk@npm:4.2.0"
checksum: f9f99969561ad2f62af1f9a96c5bd0af776f000292b0d8db3126c28eb3b32e210d7c31b49c18d0d7901869bd769057046dc134b60cfa0c2c4ce017823a26bb23
languageName: node
linkType: hard
"lodash.clonedeep@npm:^4.5.0":
version: 4.5.0
resolution: "lodash.clonedeep@npm:4.5.0"
checksum: 2caf0e4808f319d761d2939ee0642fa6867a4bbf2cfce43276698828380756b99d4c4fa226d881655e6ac298dd453fe12a5ec8ba49861777759494c534936985
languageName: node
linkType: hard
"lodash.debounce@npm:^4.0.8":
version: 4.0.8
resolution: "lodash.debounce@npm:4.0.8"
@@ -26224,6 +26375,13 @@ __metadata:
languageName: node
linkType: hard
"lodash.flatten@npm:^4.4.0":
version: 4.4.0
resolution: "lodash.flatten@npm:4.4.0"
checksum: 97e8f0d6b61fe4723c02ad0c6e67e51784c4a2c48f56ef283483e556ad01594cf9cec9c773e177bbbdbdb5d19e99b09d2487cb6b6e5dc405c2693e93b125bd3a
languageName: node
linkType: hard
"lodash.get@npm:^4.4.2":
version: 4.4.2
resolution: "lodash.get@npm:4.4.2"
@@ -26640,6 +26798,13 @@ __metadata:
languageName: node
linkType: hard
"manage-path@npm:^2.0.0":
version: 2.0.0
resolution: "manage-path@npm:2.0.0"
checksum: 3e99cd7f1c3c35b893da6304ee75e2150de59ab1162588593ba8a9f3f5f019a7b83fa059b27aea61bb017cf82aa996e17afd80072660d1f45100a6a82e732071
languageName: node
linkType: hard
"map-cache@npm:^0.2.0":
version: 0.2.2
resolution: "map-cache@npm:0.2.2"
@@ -27110,7 +27275,7 @@ __metadata:
languageName: node
linkType: hard
"minimist@npm:^1.2.5, minimist@npm:^1.2.6, minimist@npm:^1.2.8":
"minimist@npm:^1.1.0, minimist@npm:^1.2.5, minimist@npm:^1.2.6, minimist@npm:^1.2.8":
version: 1.2.8
resolution: "minimist@npm:1.2.8"
checksum: 19d3fcdca050087b84c2029841a093691a91259a47def2f18222f41e7645a0b7c44ef4b40e88a1e58a40c84d2ef0ee6047c55594d298146d0eb3f6b737c20ce6
@@ -28195,6 +28360,13 @@ __metadata:
languageName: node
linkType: hard
"on-net-listen@npm:^1.1.1":
version: 1.1.2
resolution: "on-net-listen@npm:1.1.2"
checksum: eaa036215355130db3f261590e15a6d1d9b12a05e3b95ae2103ea6ba34524a6bf8c2a0be2117b6d4b32c7c2ff5b51121a7454e8d2b98f99f1fa9cc1bc785f59d
languageName: node
linkType: hard
"once@npm:^1.3.0, once@npm:^1.3.1, once@npm:^1.4.0":
version: 1.4.0
resolution: "once@npm:1.4.0"
@@ -28499,6 +28671,13 @@ __metadata:
languageName: node
linkType: hard
"pako@npm:^1.0.3":
version: 1.0.11
resolution: "pako@npm:1.0.11"
checksum: 86dd99d8b34c3930345b8bbeb5e1cd8a05f608eeb40967b293f72fe469d0e9c88b783a8777e4cc7dc7c91ce54c5e93d88ff4b4f060e6ff18408fd21030d9ffbe
languageName: node
linkType: hard
"param-case@npm:^3.0.4":
version: 3.0.4
resolution: "param-case@npm:3.0.4"
@@ -29879,7 +30058,7 @@ __metadata:
languageName: node
linkType: hard
"progress@npm:^2.0.0":
"progress@npm:^2.0.0, progress@npm:^2.0.3":
version: 2.0.3
resolution: "progress@npm:2.0.3"
checksum: 1697e07cb1068055dbe9fe858d242368ff5d2073639e652b75a7eb1f2a1a8d4afd404d719de23c7b48481a6aa0040686310e2dac2f53d776daa2176d3f96369c
@@ -31057,6 +31236,13 @@ __metadata:
languageName: node
linkType: hard
"reinterval@npm:^1.1.0":
version: 1.1.0
resolution: "reinterval@npm:1.1.0"
checksum: 83ffcd92363acd57feaecfd98819eeeb618a4ebb6db092ee60aafdb592195447648227bf36891c10ccb3959c1fbd0c4fa2cd7cd74460015c664385248c4e0c72
languageName: node
linkType: hard
"relay-runtime@npm:12.0.0":
version: 12.0.0
resolution: "relay-runtime@npm:12.0.0"
@@ -31371,6 +31557,13 @@ __metadata:
languageName: node
linkType: hard
"retimer@npm:^3.0.0":
version: 3.0.0
resolution: "retimer@npm:3.0.0"
checksum: dc3e07997c77c2b9113072bc57bcad9898a388a5cdf59a07a9fb3974f1dd8958e371c97ac93eb312b184d63d907faa5430bb5ff859a8b2d285f4db1082beb96a
languageName: node
linkType: hard
"retry@npm:^0.12.0":
version: 0.12.0
resolution: "retry@npm:0.12.0"
@@ -33355,6 +33548,15 @@ __metadata:
languageName: node
linkType: hard
"subarg@npm:^1.0.0":
version: 1.0.0
resolution: "subarg@npm:1.0.0"
dependencies:
minimist: ^1.1.0
checksum: 8ecdfa682e50b98272b283f1094ae2f82e5c84b258fd3ac6e47a69149059bd786ef6586305243a5b60746ce23e3e738de7ed8277c76f3363fa351bbfe9c71f37
languageName: node
linkType: hard
"sucrase@npm:^3.20.3, sucrase@npm:^3.32.0, sucrase@npm:^3.35.0":
version: 3.35.0
resolution: "sucrase@npm:3.35.0"
@@ -33726,6 +33928,13 @@ __metadata:
languageName: node
linkType: hard
"timestring@npm:^6.0.0":
version: 6.0.0
resolution: "timestring@npm:6.0.0"
checksum: 753688c0f6ebdec5af6a77a01ad85b5fd868d04d2d1e7510bb23b2554236ea400f008b0b358a54f148c2aa0cf517494e786f20b61d8a7477d7fa4438f33bf84e
languageName: node
linkType: hard
"tiny-glob@npm:^0.2.8":
version: 0.2.9
resolution: "tiny-glob@npm:0.2.9"
@@ -35225,6 +35434,13 @@ __metadata:
languageName: node
linkType: hard
"uuid-parse@npm:^1.1.0":
version: 1.1.0
resolution: "uuid-parse@npm:1.1.0"
checksum: 513d5b0407b8929c54d452e8e286a1f6f00d40a2ebf6607fc7297b9caa40eee35230dcc4ce3f178f84d89704e8147e24d1a33d680a8afb7e12a5700714db7f51
languageName: node
linkType: hard
"uuid@npm:^3.3.2":
version: 3.4.0
resolution: "uuid@npm:3.4.0"