Feat(): faster entity serializer (#13564)
**What**

Improve serialization of complex entities further to reduce a known bottleneck.

**Notes**

It might be good at some point to upstream these improvements into the MikroORM package 📦

**Load test**

The test uses autocannon and wraps the serializer calls behind HTTP endpoints. Each product has 2 variants, 3 options, and 3 option values. autocannon is configured for 10 connections over 20 seconds with a pipelining of 1. This is repeated for each serializer configuration and each dataset size.

🚀 Load Testing Serializers with Autocannon

🎯 TESTING 10 PRODUCTS (server started on port 57840, 10 products per request)

| Serializer | Requests/sec | Avg Latency (ms) | P90 Latency (ms) | Throughput (MB/s) | Errors | RPS Improvement |
| --- | --- | --- | --- | --- | --- | --- |
| MikroOrm | 33.85 | 319.30 | 327.00 | 31.36 | 0 | baseline |
| Current | 821.15 | 11.67 | 12.00 | 0.18 | 0 | 24.3x |
| Optimized | 1286.75 | 7.25 | 7.00 | 37.31 | 0 | 38.0x |

Key insights for 10 products:
- The optimized serializer handles 1.6x more requests/sec than Current.
- The optimized serializer handles 38.0x more requests/sec than MikroOrm.
- 37.9% lower latency compared to the Current serializer.

🎯 TESTING 100 PRODUCTS (server started on port 57878, 100 products per request)

| Serializer | Requests/sec | Avg Latency (ms) | P90 Latency (ms) | Throughput (MB/s) | Errors | RPS Improvement |
| --- | --- | --- | --- | --- | --- | --- |
| MikroOrm | 3.69 | 3241.29 | 4972.00 | 35.04 | 0 | baseline |
| Current | 87.45 | 117.20 | 116.00 | 0.02 | 0 | 23.7x |
| Optimized | 143.56 | 70.62 | 72.00 | 42.22 | 0 | 38.9x |

Key insights for 100 products:
- The optimized serializer handles 1.6x more requests/sec than Current.
- The optimized serializer handles 38.9x more requests/sec than MikroOrm.
- 39.7% lower latency compared to the Current serializer.

🎯 TESTING 1,000 PRODUCTS (server started on port 57930, 1000 products per request)

| Serializer | Requests/sec | Avg Latency (ms) | P90 Latency (ms) | Throughput (MB/s) | Errors | RPS Improvement |
| --- | --- | --- | --- | --- | --- | --- |
| MikroOrm | 0.00 | 0.00 | 0.00 | 0.00 | 10 | n/a |
| Current | 0.00 | 0.00 | 0.00 | 0.00 | 20 | n/a |
| Optimized | 13.79 | 792.94 | 755.00 | 41.47 | 0 | n/a |

At 1,000 products, MikroOrm and Current failed to complete a single request within the test window (every attempt errored), so the improvement ratios are undefined (the raw report printed NaN/Infinity); only the optimized serializer survived this load.

📊 COMPREHENSIVE AUTOCANNON LOAD TESTING ANALYSIS

Scaling analysis (M = MikroOrm, O = Current, Op = Optimized):

| Size | RPS (M/O/Op) | Avg Latency ms (M/O/Op) | P90 Latency ms (M/O/Op) | Throughput MB/s (M/O/Op) | Speedup vs M (O/Op) | Speedup vs O (Op) |
| --- | --- | --- | --- | --- | --- | --- |
| 10 | 33.9/821.1/1286.8 | 319.3/11.7/7.3 | 327.0/12.0/7.0 | 31.4/0.2/37.3 | 24.3x/38.0x | 1.6x |
| 100 | 3.7/87.5/143.6 | 3241.3/117.2/70.6 | 4972.0/116.0/72.0 | 35.0/0.0/42.2 | 23.7x/38.9x | 1.6x |
| 1,000 | 0.0/0.0/13.8 | 0.0/0.0/792.9 | 0.0/0.0/755.0 | 0.0/0.0/41.5 | n/a | n/a |

Overall load testing performance summary:
- 10 products: +56.7% more requests/sec vs Current (821.1 → 1286.8); +3701.3% vs MikroOrm (33.9 → 1286.8).
- 100 products: +64.2% more requests/sec vs Current (87.5 → 143.6); +3790.5% vs MikroOrm (3.7 → 143.6).
- 1000 products: only the optimized serializer completed any requests (0.0 → 13.8 against both baselines), so percentage gains are undefined.
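For reference, the "RPS Improvement" column above is simply each serializer's requests/sec divided by the MikroOrm baseline for the same dataset size; a quick standalone sketch of that arithmetic (not part of the diff):

```ts
// RPS improvement = serializer RPS / baseline RPS (MikroOrm, same dataset size)
const baselineRps = 33.85 // MikroOrm at 10 products
const improvement = (rps: number) => `${(rps / baselineRps).toFixed(1)}x`

improvement(821.15) // "24.3x" (Current)
improvement(1286.75) // "38.0x" (Optimized)
```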
Commit e39c472a84 (parent 12a96a7c70), committed via GitHub.
@@ -29,7 +29,9 @@
    "@medusajs/types": "2.10.3",
    "@swc/core": "^1.7.28",
    "@swc/jest": "^0.2.36",
    "@types/autocannon": "^7.12.5",
    "@types/express": "^4.17.21",
    "autocannon": "^7.15.0",
    "expect-type": "^0.20.0",
    "express": "^4.21.0",
    "jest": "^29.7.0",
@@ -1,4 +1,6 @@
-import { MikroORM } from "@medusajs/deps/mikro-orm/core"
+import express from "express"
+import autocannon, { Result } from "autocannon"
+import { MikroORM, EntitySerializer } from "@medusajs/deps/mikro-orm/core"
 import { defineConfig } from "@medusajs/deps/mikro-orm/postgresql"
 import {
   Entity1WithUnDecoratedProp,
@@ -8,8 +10,11 @@ import {
   ProductOptionValue,
   ProductVariant,
 } from "../__fixtures__/utils"
+import { mikroOrmSerializer as mikroOrmSerializerOld } from "../mikro-orm-serializer-old"
 import { mikroOrmSerializer } from "../mikro-orm-serializer"
 
+jest.setTimeout(60000)
+
 describe("mikroOrmSerializer", () => {
   beforeEach(async () => {
     await MikroORM.init(
@@ -44,7 +49,7 @@ describe("mikroOrmSerializer", () => {
     })
     entity1.entity2.add(entity2)
 
-    const serialized = await mikroOrmSerializer(entity1, {
+    const serialized = mikroOrmSerializer(entity1, {
       preventCircularRef: false,
     })
 
@@ -81,7 +86,7 @@ describe("mikroOrmSerializer", () => {
     })
     entity1.entity2.add(entity2)
 
-    const serialized = await mikroOrmSerializer([entity1, entity1], {
+    const serialized = mikroOrmSerializer([entity1, entity1], {
       preventCircularRef: false,
     })
 
@@ -120,7 +125,7 @@ describe("mikroOrmSerializer", () => {
     })
     entity1.entity2.add(entity2)
 
-    const serialized = await mikroOrmSerializer(entity1)
+    const serialized = mikroOrmSerializer(entity1)
 
     expect(serialized).toEqual({
       id: "1",
@@ -158,7 +163,7 @@ describe("mikroOrmSerializer", () => {
     product.options.add(productOptions)
     product.variants.add(productVariant)
 
-    const serialized = await mikroOrmSerializer(product)
+    const serialized = mikroOrmSerializer(product)
 
     expect(serialized).toEqual({
       id: "1",
@@ -201,4 +206,372 @@ describe("mikroOrmSerializer", () => {
      name: "Product 1",
    })
  })

  it.skip("should benchmark serializers with autocannon load testing", async () => {
    const logs: string[] = []
    logs.push("🚀 Load Testing Serializers with Autocannon")
    logs.push("=".repeat(80))

    // Generate test dataset
    function generateLoadTestProducts(count: number): Product[] {
      const products: Product[] = []

      for (let i = 0; i < count; i++) {
        const product = new Product()
        product.id = `product-${i}`
        product.name = `Product ${i}`

        // Generate 3 options per product
        for (let optionIndex = 0; optionIndex < 3; optionIndex++) {
          const option = new ProductOption()
          option.id = `option-${product.id}-${optionIndex}`
          option.name = `Option ${optionIndex} for Product ${product.id}`
          option.product = product

          // Generate 3 values per option
          for (let valueIndex = 0; valueIndex < 3; valueIndex++) {
            const value = new ProductOptionValue()
            value.id = `option-value-${option.id}-${valueIndex}`
            value.name = `Option Value ${valueIndex} for Option ${option.id}`
            value.option_id = option.id
            value.option = option
            option.values.add(value)
          }

          product.options.add(option)
        }

        // Generate 2 variants per product
        for (let variantIndex = 0; variantIndex < 2; variantIndex++) {
          const variant = new ProductVariant()
          variant.id = `variant-${product.id}-${variantIndex}`
          variant.name = `Variant ${variantIndex} for Product ${product.id}`
          variant.product_id = product.id
          variant.product = product

          // Assign option values to variants
          const optionArray = product.options.getItems()
          for (let j = 0; j < 2 && j < optionArray.length; j++) {
            const option = optionArray[j]
            const optionValues = option.values.getItems()
            if (optionValues.length > 0) {
              const value = optionValues[0]
              variant.options.add(value)
              value.variants.add(variant)
            }
          }

          product.variants.add(variant)
        }

        products.push(product)
      }

      return products
    }

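To put the dataset sizes in perspective: per the generator above, each product graph holds 1 product + 3 options + 9 option values + 2 variants = 15 linked entities, so a 1,000-product request serializes roughly 15,000 entities, which plausibly explains why only that size produced errors for the two slower serializers. A standalone sketch of the count (not part of the diff):

```ts
// Entity count per generated product graph (mirrors generateLoadTestProducts)
const entitiesPerProduct = 1 + 3 + 3 * 3 + 2 // product + options + values + variants = 15
const entitiesPerRequest = (products: number) => products * entitiesPerProduct

entitiesPerRequest(1000) // 15000
```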
    // Test different dataset sizes
    const testSizes = [10, 100, 1000]
    const allResults: Array<{
      size: number
      results: Array<{
        name: string
        requestsPerSecond: number
        latency: number
        latencyP90: number
        throughput: number
        errors: number
      }>
    }> = []

    for (const size of testSizes) {
      logs.push(`\n${"=".repeat(100)}`)
      logs.push(`🎯 TESTING ${size.toLocaleString()} PRODUCTS`)
      logs.push(`${"=".repeat(100)}`)

      // Create test dataset for this size
      const testProducts = generateLoadTestProducts(size)

      // Create Express server with serializer endpoints
      const app = express()
      app.use(express.json())

      app.get("/mikro-orm", (_req, res) => {
        const result = testProducts.map((product) =>
          EntitySerializer.serialize(product, {
            populate: ["*"],
            forceObject: true,
            skipNull: undefined,
            ignoreSerializers: undefined,
            exclude: undefined,
          })
        )
        res.json(result)
      })

      app.get("/original", (_req, res) => {
        const result = mikroOrmSerializerOld(testProducts)
        res.json(result)
      })

      app.get("/optimized", (_req, res) => {
        const result = mikroOrmSerializer(testProducts)
        res.json(result)
      })

      // Start server
      const server = app.listen(0) // Use port 0 for automatic port assignment
      const port = (server.address() as any)?.port

      if (!port) {
        throw new Error("Failed to start server")
      }

      logs.push(`🖥️ Server started on port ${port}`)
      logs.push(`📊 Testing with ${testProducts.length} products per request`)

      try {
        // Autocannon test configurations
        const testConfigs = [
          { name: "MikroOrm", path: "/mikro-orm" },
          { name: "Original", path: "/original" },
          { name: "Optimized", path: "/optimized" },
        ]

        const sizeResults: Array<{
          name: string
          requestsPerSecond: number
          latency: number
          latencyP90: number
          throughput: number
          errors: number
        }> = []

        for (const config of testConfigs) {
          logs.push(`\n🔥 Load testing: ${config.name}`)
          logs.push("-".repeat(50))

          const result = await new Promise<Result>((resolve, reject) => {
            autocannon(
              {
                url: `http://localhost:${port}${config.path}`,
                connections: 10,
                duration: 20, // 20 seconds
                pipelining: 1,
              },
              (err, result) => {
                if (err) {
                  reject(err)
                } else {
                  resolve(result!)
                }
              }
            )
          })

          const requestsPerSecond = result.requests.average
          const latency = result.latency.average
          const latencyP90 = result.latency.p90
          const throughput = result.throughput.average
          const errors = result.errors

          logs.push(`  Requests/sec: ${requestsPerSecond.toFixed(2)}`)
          logs.push(`  Avg Latency: ${latency.toFixed(2)}ms`)
          logs.push(`  P90 Latency: ${latencyP90.toFixed(2)}ms`)
          logs.push(
            `  Throughput: ${(throughput / 1024 / 1024).toFixed(2)} MB/s`
          )
          logs.push(`  Errors: ${errors}`)

          sizeResults.push({
            name: config.name,
            requestsPerSecond,
            latency,
            latencyP90,
            throughput,
            errors,
          })
        }

        // Generate comparison table for this size
        logs.push(
          `\n📈 Load Testing Performance Comparison for ${size} products:`
        )
        logs.push("-".repeat(140))
        logs.push(
          `${"Serializer".padEnd(15)} ${"Requests/sec".padEnd(
            15
          )} ${"Avg Latency (ms)".padEnd(18)} ${"P90 Latency (ms)".padEnd(
            18
          )} ${"Throughput (MB/s)".padEnd(18)} ${"Errors".padEnd(
            10
          )} ${"RPS Improvement"}`
        )
        logs.push("-".repeat(140))

        const baselineRps = sizeResults[0].requestsPerSecond // MikroOrm as baseline

        sizeResults.forEach((result) => {
          const rpsImprovement = result.requestsPerSecond / baselineRps
          const improvementText =
            rpsImprovement === 1 ? "baseline" : `${rpsImprovement.toFixed(1)}x`

          logs.push(
            `${result.name.padEnd(15)} ${result.requestsPerSecond
              .toFixed(2)
              .padEnd(15)} ${result.latency
              .toFixed(2)
              .padEnd(18)} ${result.latencyP90.toFixed(2).padEnd(18)} ${(
              result.throughput /
              1024 /
              1024
            )
              .toFixed(2)
              .padEnd(18)} ${result.errors
              .toString()
              .padEnd(10)} ${improvementText}`
          )
        })

        logs.push(`\n🎯 Key Insights for ${size} products:`)
        const optimizedResult = sizeResults.find((r) => r.name === "Optimized")
        const originalResult = sizeResults.find((r) => r.name === "Original")
        const mikroOrmResult = sizeResults.find((r) => r.name === "MikroOrm")

        if (optimizedResult && originalResult && mikroOrmResult) {
          const rpsImprovementVsOriginal =
            optimizedResult.requestsPerSecond / originalResult.requestsPerSecond
          const rpsImprovementVsMikroOrm =
            optimizedResult.requestsPerSecond / mikroOrmResult.requestsPerSecond
          const latencyImprovementVsOriginal =
            ((originalResult.latency - optimizedResult.latency) /
              originalResult.latency) *
            100

          logs.push(
            `  • Optimized serializer handles ${rpsImprovementVsOriginal.toFixed(
              1
            )}x more requests/sec than Original`
          )
          logs.push(
            `  • Optimized serializer handles ${rpsImprovementVsMikroOrm.toFixed(
              1
            )}x more requests/sec than MikroOrm`
          )
          logs.push(
            `  • ${latencyImprovementVsOriginal.toFixed(
              1
            )}% lower latency compared to Original serializer`
          )
        }

        allResults.push({ size, results: sizeResults })
      } finally {
        // Clean up server
        server.close()
        logs.push(`\n🔴 Server stopped for ${size} products test`)
      }
    }

    // Generate comprehensive comparison across all sizes
    logs.push(`\n\n${"=".repeat(150)}`)
    logs.push("📊 COMPREHENSIVE AUTOCANNON LOAD TESTING ANALYSIS")
    logs.push(`${"=".repeat(150)}`)

    logs.push("\n🚀 Autocannon Load Testing Scaling Analysis:")
    logs.push("-".repeat(140))
    logs.push(
      `${"Size".padEnd(12)} ${"RPS (M/O/Op)".padEnd(
        20
      )} ${"Avg Latency (M/O/Op)".padEnd(22)} ${"P90 Latency (M/O/Op)".padEnd(
        22
      )} ${"Throughput MB/s (M/O/Op)".padEnd(
        25
      )} ${"Speedup vs M (O/Op)".padEnd(18)} ${"Speedup vs O (Op)"}`
    )
    logs.push("-".repeat(140))

    allResults.forEach(({ size, results }) => {
      const mikroOrm = results.find((r) => r.name === "MikroOrm")
      const original = results.find((r) => r.name === "Original")
      const optimized = results.find((r) => r.name === "Optimized")
      if (original && optimized && mikroOrm) {
        const speedupOptimizedVsMikroOrm =
          optimized.requestsPerSecond / mikroOrm.requestsPerSecond
        const speedupOptimizedVsOriginal =
          optimized.requestsPerSecond / original.requestsPerSecond
        const speedupOriginalVsMikroOrm =
          original.requestsPerSecond / mikroOrm.requestsPerSecond

        const rpsValues = `${mikroOrm.requestsPerSecond.toFixed(
          1
        )}/${original.requestsPerSecond.toFixed(
          1
        )}/${optimized.requestsPerSecond.toFixed(1)}`
        const avgLatencyValues = `${mikroOrm.latency.toFixed(
          1
        )}/${original.latency.toFixed(1)}/${optimized.latency.toFixed(1)}`
        const p90LatencyValues = `${mikroOrm.latencyP90.toFixed(
          1
        )}/${original.latencyP90.toFixed(1)}/${optimized.latencyP90.toFixed(1)}`
        const throughputValues = `${(mikroOrm.throughput / 1024 / 1024).toFixed(
          1
        )}/${(original.throughput / 1024 / 1024).toFixed(1)}/${(
          optimized.throughput /
          1024 /
          1024
        ).toFixed(1)}`
        const speedupVsMikroOrm = `${speedupOriginalVsMikroOrm.toFixed(
          1
        )}x/${speedupOptimizedVsMikroOrm.toFixed(1)}x`
        const speedupVsOriginal = `${speedupOptimizedVsOriginal.toFixed(1)}x`

        logs.push(
          `${size.toLocaleString().padEnd(12)} ${rpsValues.padEnd(
            20
          )} ${avgLatencyValues.padEnd(22)} ${p90LatencyValues.padEnd(
            22
          )} ${throughputValues.padEnd(25)} ${speedupVsMikroOrm.padEnd(
            18
          )} ${speedupVsOriginal}`
        )
      }
    })

    logs.push(`\n🎯 Overall Load Testing Performance Summary:`)
    allResults.forEach(({ size, results }) => {
      const optimized = results.find((r) => r.name === "Optimized")
      const original = results.find((r) => r.name === "Original")
      const mikroOrm = results.find((r) => r.name === "MikroOrm")

      if (optimized && original && mikroOrm) {
        const rpsGainVsOriginal =
          ((optimized.requestsPerSecond - original.requestsPerSecond) /
            original.requestsPerSecond) *
          100
        const rpsGainVsMikroOrm =
          ((optimized.requestsPerSecond - mikroOrm.requestsPerSecond) /
            mikroOrm.requestsPerSecond) *
          100

        logs.push(`\n  📈 ${size} products:`)
        logs.push(
          `  • +${rpsGainVsOriginal.toFixed(
            1
          )}% more requests/sec vs Original (${original.requestsPerSecond.toFixed(
            1
          )} → ${optimized.requestsPerSecond.toFixed(1)})`
        )
        logs.push(
          `  • +${rpsGainVsMikroOrm.toFixed(
            1
          )}% more requests/sec vs MikroOrm (${mikroOrm.requestsPerSecond.toFixed(
            1
          )} → ${optimized.requestsPerSecond.toFixed(1)})`
        )
      }
    })

    console.log(logs.join("\n"))
  }, 1200000)
})
@@ -109,14 +109,14 @@ class BigNumberNumeric extends Type<string | number, string> {
   }
 
   override convertToJSValue(value: string): number | string {
+    if ((this.mode ?? this.prop?.runtimeType) === "number") {
+      return +value
+    }
+
     if (isObject(value)) {
       return value // Special case for BigNumberRawValue because the setter will manage the dispatch automatically at a later stage
     }
 
-    if ((this.mode ?? this.prop?.runtimeType) === "number") {
-      return +value
-    }
-
     return String(value)
   }
 
@@ -0,0 +1,684 @@
/**
 * An optimized MikroORM serializer: a serialization pipeline tuned to
 * leverage V8's JIT compilation and inline caching mechanisms.
 */

import {
  Collection,
  EntityDTO,
  EntityMetadata,
  helper,
  IPrimaryKey,
  Loaded,
  Platform,
  Reference,
  ReferenceKind,
  SerializationContext,
  Utils,
} from "@mikro-orm/core"

const STATIC_OPTIONS_SHAPE: {
  populate: string[] | boolean | undefined
  exclude: string[] | undefined
  preventCircularRef: boolean | undefined
  skipNull: boolean | undefined
  ignoreSerializers: boolean | undefined
  forceObject: boolean | undefined
} = {
  populate: ["*"],
  exclude: undefined,
  preventCircularRef: true,
  skipNull: undefined,
  ignoreSerializers: undefined,
  forceObject: true,
}

const EMPTY_ARRAY: string[] = []

const WILDCARD = "*"
const DOT = "."
const UNDERSCORE = "_"

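Context for STATIC_OPTIONS_SHAPE above (an illustration, not part of the diff): funneling every call through one fully-populated options object keeps that object on a single V8 hidden class, so property reads like `options.populate` stay monomorphic and inline-cache friendly. A minimal sketch of the normalization idea:

```ts
// Normalizing partial user options onto the full default shape means every
// downstream access site always sees the same keys in the same order
// (one hidden class), instead of a different shape per caller.
type Shape = { populate: string[] | boolean; skipNull?: boolean }
const DEFAULTS: Shape = { populate: ["*"], skipNull: undefined }

function normalize(user?: Partial<Shape>): Shape {
  return user ? { ...DEFAULTS, ...user } : DEFAULTS
}

normalize() // reuses the same object: zero allocation on the default path
normalize({ skipNull: true }) // still the DEFAULTS key set and key order
```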
// JIT-friendly function with predictable patterns
function isVisible<T extends object>(
  meta: EntityMetadata<T>,
  propName: string,
  options: Parameters<typeof EntitySerializer.serialize>[1] & {
    preventCircularRef?: boolean
    populate?: string[] | boolean
  } = STATIC_OPTIONS_SHAPE
): boolean {
  // Fast path for boolean populate
  const populate = options.populate
  if (populate === true) {
    return true
  }

  if (Array.isArray(populate)) {
    // Check exclusions first (early exit)
    const exclude = options.exclude
    if (exclude && exclude.length > 0) {
      const excludeLen = exclude.length
      for (let i = 0; i < excludeLen; i++) {
        if (exclude[i] === propName) {
          return false
        }
      }
    }

    // Hoist computations outside the loop
    const propNameLen = propName.length
    const propPrefix = propName + DOT
    const propPrefixLen = propPrefix.length
    const populateLen = populate.length

    // Simple loop that the JIT can optimize well
    for (let i = 0; i < populateLen; i++) {
      const item = populate[i]
      if (item === propName || item === WILDCARD) {
        return true
      }
      if (
        item.length > propNameLen &&
        item.substring(0, propPrefixLen) === propPrefix
      ) {
        return true
      }
    }
    return false
  }

  // Inline property check for the non-array case
  const prop = meta.properties[propName]
  const visible = (prop && !prop.hidden) || prop === undefined
  const prefixed = prop && !prop.primary && propName.charAt(0) === UNDERSCORE
  return visible && !prefixed
}

// Clean, JIT-friendly function
function isPopulated<T extends object>(
  entity: T,
  propName: string,
  options: Parameters<typeof EntitySerializer.serialize>[1] & {
    preventCircularRef?: boolean
    populate?: string[] | boolean
  } = STATIC_OPTIONS_SHAPE
): boolean {
  const populate = options.populate

  // Fast path for boolean
  if (typeof populate === "boolean") {
    return populate
  }

  if (!Array.isArray(populate)) {
    return false
  }

  // Hoist computations for JIT optimization
  const propNameLen = propName.length
  const propPrefix = propName + DOT
  const propPrefixLen = propPrefix.length
  const populateLen = populate.length

  // Simple predictable loop
  for (let i = 0; i < populateLen; i++) {
    const item = populate[i]
    if (item === propName || item === WILDCARD) {
      return true
    }
    if (
      item.length > propNameLen &&
      item.substring(0, propPrefixLen) === propPrefix
    ) {
      return true
    }
  }

  return false
}

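The populate matching used by both isVisible and isPopulated follows a simple prefix rule: a property counts as populated if the array contains the property itself, the "*" wildcard, or any dotted child path under it. A standalone sketch of the rule (the `isMatch` helper is hypothetical, not part of the diff):

```ts
// Equivalent one-liner for the loops above: "variants" is populated when
// populate contains "variants", "*", or a child path like "variants.options".
const isMatch = (populate: string[], prop: string) =>
  populate.some((p) => p === prop || p === "*" || p.startsWith(prop + "."))

isMatch(["variants.options"], "variants") // true  (child path present)
isMatch(["variants.options"], "options") // false (no top-level entry)
```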
/**
 * Custom property filtering for serialization that accounts for circular
 * references so they are not returned.
 * @param propName
 * @param meta
 * @param options
 * @param parents
 */
// @ts-ignore
function filterEntityPropToSerialize({
  propName,
  meta,
  options,
  parents,
}: {
  propName: string
  meta: EntityMetadata
  options: Parameters<typeof EntitySerializer.serialize>[1] & {
    preventCircularRef?: boolean
    populate?: string[] | boolean
  }
  parents?: string[]
}): boolean {
  const parentsArray = parents || EMPTY_ARRAY

  const isVisibleRes = isVisible(meta, propName, options)
  const prop = meta.properties[propName]

  if (
    prop &&
    options.preventCircularRef &&
    isVisibleRes &&
    prop.kind !== ReferenceKind.SCALAR
  ) {
    if (!!prop.mapToPk) {
      return true
    }

    const parentsLen = parentsArray.length
    for (let i = 0; i < parentsLen; i++) {
      if (parentsArray[i] === prop.type) {
        return false
      }
    }
    return true
  }

  return isVisibleRes
}

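How the parents list plays out in practice (an illustrative trace, not part of the diff): the serializer records each visited entity type on the way down, and preventCircularRef drops any relation whose target type already appears in that chain, so back-references collapse instead of recursing.

```ts
// Trace with the fixture entities from the test file, preventCircularRef: true:
//
//   serialize(product)         parents = ["Product"]
//     -> product.variants      parents = ["Product", "ProductVariant"]
//        -> variant.product    type "Product" already in parents: dropped
//
// Relations marked mapToPk are exempt: they serialize to a primary key, not an object.
```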
export class EntitySerializer {
  // Cap for the per-metadata property-name cache below, to avoid unbounded growth
  private static readonly PROPERTY_CACHE_SIZE = 2000

  static serialize<T extends object, P extends string = never>(
    entity: T,
    options: Partial<typeof STATIC_OPTIONS_SHAPE> = STATIC_OPTIONS_SHAPE,
    parents: string[] = EMPTY_ARRAY
  ): EntityDTO<Loaded<T, P>> {
    // Dedupe parents only when non-empty; the common empty case skips the Set allocation
    const parents_ = parents.length > 0 ? Array.from(new Set(parents)) : []

    const wrapped = helper(entity)
    const meta = wrapped.__meta
    let contextCreated = false

    if (!wrapped.__serializationContext.root) {
      const root = new SerializationContext<T>({} as any)
      SerializationContext.propagate(
        root,
        entity,
        (meta, prop) => meta.properties[prop]?.kind !== ReferenceKind.SCALAR
      )
      contextCreated = true
    }

    const root = wrapped.__serializationContext
      .root! as SerializationContext<any> & {
      visitedSerialized?: Map<string, any>
    }

    const ret = {} as EntityDTO<Loaded<T, P>>

    // Use a Set for deduplication but keep it simple
    const keys = new Set<string>()

    const primaryKeys = meta.primaryKeys
    const primaryKeysLen = primaryKeys.length
    for (let i = 0; i < primaryKeysLen; i++) {
      keys.add(primaryKeys[i])
    }

    const entityKeys = Object.keys(entity)
    const entityKeysLen = entityKeys.length
    for (let i = 0; i < entityKeysLen; i++) {
      keys.add(entityKeys[i])
    }

    const visited = root.visited.has(entity)
    if (!visited) {
      root.visited.add(entity)
    }

    const keysArray = Array.from(keys)
    const keysLen = keysArray.length

    // Hoist invariant calculations
    const className = meta.className
    const platform = wrapped.__platform
    const skipNull = options.skipNull
    const metaProperties = meta.properties
    const preventCircularRef = options.preventCircularRef

    // Property processing loop
    for (let i = 0; i < keysLen; i++) {
      const prop = keysArray[i]

      // Simple filtering logic
      const isVisibleRes = isVisible(meta, prop, options)
      const propMeta = metaProperties[prop]

      let shouldSerialize = isVisibleRes
      if (
        propMeta &&
        preventCircularRef &&
        isVisibleRes &&
        propMeta.kind !== ReferenceKind.SCALAR
      ) {
        if (!!propMeta.mapToPk) {
          shouldSerialize = true
        } else {
          const parentsLen = parents_.length
          for (let j = 0; j < parentsLen; j++) {
            if (parents_[j] === propMeta.type) {
              shouldSerialize = false
              break
            }
          }
        }
      }

      if (!shouldSerialize) {
        continue
      }

      const cycle = root.visit(className, prop)
      if (cycle && visited) continue

      const val = this.processProperty<T>(
        prop as keyof T & string,
        entity,
        options,
        parents_
      )

      if (!cycle) {
        root.leave(className, prop)
      }

      if (skipNull && Utils.isPlainObject(val)) {
        Utils.dropUndefinedProperties(val, null)
      }

      if (typeof val !== "undefined" && !(val === null && skipNull)) {
        ret[this.propertyName(meta, prop as keyof T & string, platform)] =
          val as T[keyof T & string]
      }
    }

    if (contextCreated) {
      root.close()
    }

    if (!wrapped.isInitialized()) {
      return ret
    }

    // Getter processing
    const metaProps = meta.props
    const metaPropsLen = metaProps.length

    for (let i = 0; i < metaPropsLen; i++) {
      const prop = metaProps[i]
      const propName = prop.name

      // Clear, readable conditions
      if (
        prop.getter &&
        prop.getterName === undefined &&
        typeof entity[propName] !== "undefined" &&
        isVisible(meta, propName, options)
      ) {
        ret[this.propertyName(meta, propName, platform)] = this.processProperty(
          propName,
          entity,
          options,
          parents_
        )
      } else if (
        prop.getterName &&
        (entity[prop.getterName] as unknown) instanceof Function &&
        isVisible(meta, propName, options)
      ) {
        ret[this.propertyName(meta, propName, platform)] = this.processProperty(
          prop.getterName as keyof T & string,
          entity,
          options,
          parents_
        )
      }
    }

    return ret
  }

  // Property name resolution cached per metadata object via a WeakMap
  private static propertyNameCache = new WeakMap<
    EntityMetadata<any>,
    Map<string, string>
  >()

  private static propertyName<T>(
    meta: EntityMetadata<T>,
    prop: string,
    platform?: Platform
  ): string {
    // A WeakMap keyed per metadata avoids global cache conflicts
    let entityCache = this.propertyNameCache.get(meta)
    if (!entityCache) {
      entityCache = new Map<string, string>()
      this.propertyNameCache.set(meta, entityCache)
    }

    const cacheKey = `${prop}:${platform?.constructor.name || "no-platform"}`

    const cached = entityCache.get(cacheKey)
    if (cached !== undefined) {
      return cached
    }

    // Inline property resolution for the hot path
    let result: string
    const property = meta.properties[prop]

    /* istanbul ignore next */
    if (property?.serializedName) {
      result = property.serializedName as string
    } else if (property?.primary && platform) {
      result = platform.getSerializedPrimaryKeyField(prop) as string
    } else {
      result = prop
    }

    // Prevent the cache from growing too large
    if (entityCache.size >= this.PROPERTY_CACHE_SIZE) {
      entityCache.clear() // Much faster than selective eviction
    }

    entityCache.set(cacheKey, result)
    return result
  }

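The caching pattern above, in isolation (a standalone sketch, not the committed code): a WeakMap keyed on the metadata object gives each EntityMetadata its own Map, so cached names disappear when the metadata is garbage-collected and different schemas can never collide on a key.

```ts
// Generic WeakMap-of-Map memoizer in the same shape as propertyNameCache.
const cache = new WeakMap<object, Map<string, string>>()

function memoized(owner: object, key: string, compute: () => string): string {
  let perOwner = cache.get(owner)
  if (!perOwner) {
    perOwner = new Map()
    cache.set(owner, perOwner)
  }
  let value = perOwner.get(key)
  if (value === undefined) {
    value = compute()
    perOwner.set(key, value)
  }
  return value
}
```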
  private static processProperty<T extends object>(
    prop: string,
    entity: T,
    options: Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef?: boolean
      populate?: string[] | boolean
    },
    parents: string[] = EMPTY_ARRAY
  ): T[keyof T] | undefined {
    // Build the parents chain for the child call (single-element literal when at the root)
    const parents_ =
      parents.length > 0
        ? [...parents, entity.constructor.name]
        : [entity.constructor.name]

    // Handle dotted properties efficiently
    const parts = prop.split(DOT)
    prop = parts[0] as string & keyof T

    const wrapped = helper(entity)
    const property = wrapped.__meta.properties[prop]
    const serializer = property?.serializer
    const propValue = entity[prop]

    // Fast path for function properties
    if ((propValue as unknown) instanceof Function) {
      const returnValue = (propValue as unknown as () => T[keyof T & string])()
      if (!options.ignoreSerializers && serializer) {
        return serializer(returnValue)
      }
      return returnValue
    }

    /* istanbul ignore next */
    if (!options.ignoreSerializers && serializer) {
      return serializer(propValue)
    }

    // Type checks in optimal order
    if (Utils.isCollection(propValue)) {
      return this.processCollection(
        prop as keyof T & string,
        entity,
        options,
        parents_
      )
    }

    if (Utils.isEntity(propValue, true)) {
      return this.processEntity(
        prop as keyof T & string,
        entity,
        wrapped.__platform,
        options,
        parents_
      )
    }

    /* istanbul ignore next */
    if (property?.reference === ReferenceKind.EMBEDDED) {
      if (Array.isArray(propValue)) {
        return (propValue as object[]).map((item) =>
          helper(item).toJSON()
        ) as T[keyof T]
      }

      if (Utils.isObject(propValue)) {
        return helper(propValue).toJSON() as T[keyof T]
      }
    }

    const customType = property?.customType
    if (customType) {
      return customType.toJSON(propValue, wrapped.__platform)
    }

    return wrapped.__platform.normalizePrimaryKey(
      propValue as unknown as IPrimaryKey
    ) as unknown as T[keyof T]
  }

  private static extractChildOptions<T extends object>(
    options: Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef?: boolean
      populate?: string[] | boolean
    },
    prop: keyof T & string
  ): Parameters<typeof EntitySerializer.serialize>[1] & {
    preventCircularRef?: boolean
    populate?: string[] | boolean
  } {
    const propPrefix = prop + DOT
    const propPrefixLen = propPrefix.length

    // Inline function to avoid call overhead
    const extractChildElements = (items: string[]) => {
      const result: string[] = []
      const itemsLen = items.length

      // Traditional for loop for better performance
      for (let i = 0; i < itemsLen; i++) {
        const field = items[i]
        if (
          field.length > propPrefixLen &&
          field.substring(0, propPrefixLen) === propPrefix
        ) {
          result.push(field.substring(propPrefixLen))
        }
      }
      return result
    }

    const populate = options.populate
    const exclude = options.exclude

    // Avoid object spread when possible
    const result = {
      populate:
        Array.isArray(populate) && !populate.includes(WILDCARD)
          ? extractChildElements(populate as unknown as string[])
          : populate,
      exclude:
        Array.isArray(exclude) && !exclude.includes(WILDCARD)
          ? extractChildElements(exclude)
          : exclude,
      preventCircularRef: options.preventCircularRef,
      skipNull: options.skipNull,
      ignoreSerializers: options.ignoreSerializers,
      forceObject: options.forceObject,
    } as Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef?: boolean
      populate?: string[] | boolean
    }

    return result
  }

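What extractChildOptions does to populate/exclude paths, concretely (a standalone re-implementation for illustration, not part of the diff): it strips the current property's prefix so the child serializer receives paths scoped to itself, while a wildcard array is passed through untouched.

```ts
// Prefix stripping for prop = "variants", mirroring extractChildElements:
const toChildPaths = (items: string[], prop: string) =>
  items
    .filter((f) => f.startsWith(prop + "."))
    .map((f) => f.slice(prop.length + 1))

toChildPaths(["variants.options", "variants.options.values", "name"], "variants")
// => ["options", "options.values"]  ("name" carries no prefix and is dropped)
// A populate array containing "*" skips the extraction and passes through unchanged.
```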
  private static processEntity<T extends object>(
    prop: keyof T & string,
    entity: T,
    platform: Platform,
    options: Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef?: boolean
      populate?: string[] | boolean
    },
    parents: string[] = EMPTY_ARRAY
  ): T[keyof T] | undefined {
    const parents_ =
      parents.length > 0
        ? [...parents, entity.constructor.name]
        : [entity.constructor.name]

    const child = Reference.unwrapReference(entity[prop] as T)
    const wrapped = helper(child)
    // Fixed: was incorrectly calling isPopulated(child, prop, options) instead of isPopulated(entity, prop, options)
    const populated =
      isPopulated(entity, prop, options) && wrapped.isInitialized()
    const expand = populated || options.forceObject || !wrapped.__managed

    if (expand) {
      return this.serialize(
        child,
        this.extractChildOptions(options, prop),
        parents_
      ) as T[keyof T]
    }

    return platform.normalizePrimaryKey(
      wrapped.getPrimaryKey() as IPrimaryKey
    ) as T[keyof T]
  }

  private static processCollection<T extends object>(
    prop: keyof T & string,
    entity: T,
    options: Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef?: boolean
      populate?: string[] | boolean
    },
    parents: string[] = EMPTY_ARRAY
  ): T[keyof T] | undefined {
    const parents_ =
      parents.length > 0
        ? [...parents, entity.constructor.name]
        : [entity.constructor.name]
    const col = entity[prop] as unknown as Collection<T>

    if (!col.isInitialized()) {
      return undefined
    }

    const items = col.getItems(false)
    const itemsLen = items.length
    const result = new Array(itemsLen)

    const childOptions = this.extractChildOptions(options, prop)

    // Check whether the collection property itself should be populated
    // Fixed: was incorrectly calling isPopulated(item, prop, options) instead of isPopulated(entity, prop, options)
    const shouldPopulateCollection = isPopulated(entity, prop, options)

    for (let i = 0; i < itemsLen; i++) {
      const item = items[i]
      if (shouldPopulateCollection) {
        result[i] = this.serialize(item, childOptions, parents_)
      } else {
        result[i] = helper(item).getPrimaryKey()
      }
    }

    return result as unknown as T[keyof T]
  }
}

export const mikroOrmSerializer = <TOutput extends object>(
  data: any,
  options?: Partial<
    Parameters<typeof EntitySerializer.serialize>[1] & {
      preventCircularRef: boolean | undefined
      populate: string[] | boolean | undefined
    }
  >
): Promise<TOutput> => {
  return new Promise<TOutput>((resolve) => {
    // Efficient options handling
    if (!options) {
      options = STATIC_OPTIONS_SHAPE
    } else {
      // Check whether the static shape can be reused as-is
      let useStatic = true
      const optionKeys = Object.keys(options)
      for (let i = 0; i < optionKeys.length; i++) {
        const key = optionKeys[i] as keyof typeof options
        if (
          options[key] !==
          STATIC_OPTIONS_SHAPE[key as keyof typeof STATIC_OPTIONS_SHAPE]
        ) {
          useStatic = false
          break
        }
      }

      if (useStatic) {
        options = STATIC_OPTIONS_SHAPE
      } else {
        options = { ...STATIC_OPTIONS_SHAPE, ...options }
      }
    }

    const data_ = (Array.isArray(data) ? data : [data]).filter(Boolean)

    const forSerialization: object[] = []
    const notForSerialization: object[] = []

    // Simple classification loop: only managed entities (with __meta) are serialized
    const dataLen = data_.length
    for (let i = 0; i < dataLen; i++) {
      const object = data_[i]
      if (object.__meta) {
        forSerialization.push(object)
      } else {
        notForSerialization.push(object)
      }
    }

    // Pre-allocate the result array
    const forSerializationLen = forSerialization.length
    const result: any = new Array(forSerializationLen)

    for (let i = 0; i < forSerializationLen; i++) {
      result[i] = EntitySerializer.serialize(forSerialization[i], options)
    }

    // Simple result construction
    let finalResult: any
    if (notForSerialization.length > 0) {
      finalResult = result.concat(notForSerialization)
    } else {
      finalResult = result
    }

    resolve(Array.isArray(data) ? finalResult : finalResult[0])
  })
}
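Typical call shape for the exported wrapper, assuming (per the fixtures above) that `product` is a managed, fully loaded MikroORM entity; this is a usage sketch matching the declared Promise return type, not code from the commit:

```ts
// Defaults come from STATIC_OPTIONS_SHAPE: populate ["*"],
// preventCircularRef: true, forceObject: true.
const dto = await mikroOrmSerializer(product)

// Arrays keep their shape, and options merge over the defaults; passing
// values equal to the defaults reuses the static options object as a fast path.
const dtos = await mikroOrmSerializer([product, product], {
  preventCircularRef: false,
})
```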
(Diff for one additional file in this commit was suppressed because it is too large.)