chore(): Emit events in batch and index process event ids in batch (#12097)
**What**

First iteration to prevent events from overwhelming the system.

- Group emitted event ids into a single message where possible, instead of creating one message per id. This massively reduces the number of events to process in high-volume cases such as imports.
- Update the index engine to process event data in batches of 100.
- Update event handling in the index engine so it can also upsert in batches.
- Fix the index engine build config for intermediate listener inference.
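The two batching ideas above can be sketched as follows. This is a minimal, illustrative sketch only: `RawMessage`, `buildBatchedMessage`, and `chunk` are hypothetical names, not the actual API touched by this commit.

```typescript
// Hypothetical shape of a raw event message; `data.id` may now be a
// single id or an array of ids (see the diff below for the real change).
type RawMessage = {
  eventName: string
  data: { id: string | string[] }
}

// Emitter side: one message carrying all ids instead of one message per id.
function buildBatchedMessage(eventName: string, ids: string[]): RawMessage {
  return {
    eventName,
    // A single-element batch keeps the old per-id shape for consumers.
    data: { id: ids.length === 1 ? ids[0] : ids },
  }
}

// Index engine side: split incoming ids into fixed-size batches
// (this PR uses a batch size of 100) before upserting each batch.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}
```

An import of 10,000 products would then yield one message of 10,000 ids, processed as 100 upserts of 100 ids each, rather than 10,000 individual messages.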
Parent: b05807bfc1
Commit: 74381addc3
```diff
@@ -60,15 +60,15 @@ export function moduleEventBuilderFactory({
   })
 }

-  data.forEach((dataItem) => {
-    messages.push({
-      source,
-      action,
-      context: sharedContext,
-      data: { id: dataItem.id },
-      eventName: eventName!,
-      object,
-    })
-  })
+  messages.push({
+    source,
+    action,
+    context: sharedContext,
+    data: {
+      id: data.length === 1 ? data[0].id : data.map((item) => item.id),
+    },
+    eventName: eventName!,
+    object,
+  })

   aggregator.saveRawMessageData(messages)
```
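Because `data.id` is a scalar when a single record is emitted and an array otherwise, a downstream handler has to accept both shapes. A hedged sketch of the normalization a consumer might do (illustrative only, not part of this commit):

```typescript
// Normalize the batched payload back to an array of ids so handler
// logic can treat single-record and multi-record events uniformly.
function toIdArray(id: string | string[]): string[] {
  return Array.isArray(id) ? id : [id]
}
```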