Hey guys,
I have a pipeline with 2 or 3 $facet stages, each containing $match, $group, $unwind, and $addFields stages, so altogether easily around 20 or 30 stages.
Alternatively, I could split this one large pipeline into, say, 3-5 smaller pipelines
( Pipeline A, Pipeline B, …, Pipeline E ) that share a common output collection.
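To make the first option concrete, here is a rough sketch of the single large pipeline, written as pymongo-style Python dicts. The field names, filters, and grouping keys are all placeholders, not my real schema:

```python
# Rough sketch of the one large pipeline. All stage contents
# ($match filters, $group keys, field names) are placeholders.
facet_a = [
    {"$match": {"type": "a"}},
    {"$unwind": "$items"},
    {"$group": {"_id": "$items.key", "total": {"$sum": "$items.value"}}},
    {"$addFields": {"facet": "a"}},
]
facet_b = [
    {"$match": {"type": "b"}},
    {"$unwind": "$items"},
    {"$group": {"_id": "$items.key", "total": {"$sum": "$items.value"}}},
    {"$addFields": {"facet": "b"}},
]

big_pipeline = [
    {"$match": {"status": "active"}},          # shared pre-filter
    {"$facet": {"a": facet_a, "b": facet_b}},  # 2 facets, 4 stages each
]

# Counting every nested stage: 2 top-level + 4 + 4 = 10 in this toy
# version; with 2-3 facets of 7-10 stages each, 20-30 total is easy
# to reach.
total_stages = len(big_pipeline) + len(facet_a) + len(facet_b)
print(total_stages)  # → 10
```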
My questions:
- What is a best-practice maximum number of stages for a pipeline, and what other considerations should I keep in mind?
- If I run all 5 smaller pipelines separately on a schedule ( in Atlas Triggers ), can they add their outputs to the same output collection without issues, using $out?
I mean Pipeline A adds key-value pairs A1, A2, …, An to that collection,
Pipeline B adds key-value pairs B1, B2, …, Bm,
so will the output collection end up containing all of the key-value pairs A1, A2, …, B1, B2, …, Ey, Ez?
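To make the second option concrete, here is a sketch of what I mean by the smaller pipelines, again with placeholder stage contents. The final write stage is the part I'm unsure about: as far as I can tell $out replaces the whole target collection on each run (so the last pipeline to finish would win), whereas $merge upserts per document, which sounds closer to what I want:

```python
# Sketch of the split option: each smaller pipeline ends by writing its
# own key-value documents into the shared output collection. Stage
# contents and collection names are placeholders.
pipeline_a = [
    {"$match": {"source": "a"}},                       # placeholder filter
    {"$project": {"_id": "$key", "value": "$value"}},  # A1..An docs
    {"$merge": {"into": "combined_output", "whenMatched": "replace"}},
]
pipeline_b = [
    {"$match": {"source": "b"}},
    {"$project": {"_id": "$key", "value": "$value"}},  # B1..Bm docs
    {"$merge": {"into": "combined_output", "whenMatched": "replace"}},
]

# Both pipelines target the same output collection:
targets = {p[-1]["$merge"]["into"] for p in (pipeline_a, pipeline_b)}
print(targets)  # → {'combined_output'}
```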
Any arguments for or against these options, or any other useful option, would be appreciated.
Thank you!