Aggregate()
Aggregate.prototype.addFields()
Aggregate.prototype.allowDiskUse()
Aggregate.prototype.append()
Aggregate.prototype.catch()
Aggregate.prototype.collation()
Aggregate.prototype.count()
Aggregate.prototype.cursor()
Aggregate.prototype.densify()
Aggregate.prototype.exec()
Aggregate.prototype.explain()
Aggregate.prototype.facet()
Aggregate.prototype.fill()
Aggregate.prototype.finally()
Aggregate.prototype.graphLookup()
Aggregate.prototype.group()
Aggregate.prototype.hint()
Aggregate.prototype.limit()
Aggregate.prototype.lookup()
Aggregate.prototype.match()
Aggregate.prototype.model()
Aggregate.prototype.near()
Aggregate.prototype.option()
Aggregate.prototype.options
Aggregate.prototype.pipeline()
Aggregate.prototype.project()
Aggregate.prototype.read()
Aggregate.prototype.readConcern()
Aggregate.prototype.redact()
Aggregate.prototype.replaceRoot()
Aggregate.prototype.sample()
Aggregate.prototype.search()
Aggregate.prototype.session()
Aggregate.prototype.skip()
Aggregate.prototype.sort()
Aggregate.prototype.sortByCount()
Aggregate.prototype.then()
Aggregate.prototype.unionWith()
Aggregate.prototype.unwind()
Aggregate.prototype[Symbol.asyncIterator]()
Aggregate()
[pipeline]
«Array» aggregation pipeline as an array of objects
[modelOrConn]
«Model|Connection» the model or connection to use with this aggregate.
Aggregate constructor used for building aggregation pipelines. Do not instantiate this class directly, use Model.aggregate() instead.
const aggregate = Model.aggregate([
  { $project: { a: 1, b: 1 } },
  { $skip: 5 }
]);

Model.
  aggregate([{ $match: { age: { $gte: 21 } } }]).
  unwind('tags').
  exec();
The documents returned are plain JavaScript objects, not Mongoose documents (since any shape of document can be returned).
Mongoose does not cast pipeline stages. The below will not work unless _id is a string in the database:
new Aggregate([{ $match: { _id: '00000000000000000000000a' } }]);

// Do this instead to cast to an ObjectId
new Aggregate([{ $match: { _id: new mongoose.Types.ObjectId('00000000000000000000000a') } }]);
Aggregate.prototype.addFields()
arg
«Object» field specification
Appends a new $addFields operator to this aggregate pipeline. Requires MongoDB v3.4+ to work.
// adding new fields based on existing fields
aggregate.addFields({
  newField: '$b.nested',
  plusTen: { $add: ['$val', 10] },
  sub: {
    name: '$a'
  }
});

// etc
aggregate.addFields({ salary_k: { $divide: [ "$salary", 1000 ] } });
Aggregate.prototype.allowDiskUse()
value
«Boolean» Should tell server it can use hard drive to store data during aggregation.
Sets the allowDiskUse option for the aggregation query.
await Model.aggregate([{ $match: { foo: 'bar' } }]).allowDiskUse(true);
Aggregate.prototype.append()
...ops
«Object|Array[Object]» operator(s) to append. Can either be a spread of objects or a single parameter of an object array.
Appends new operators to this aggregate pipeline.
aggregate.append({ $project: { field: 1 } }, { $limit: 2 });

// or pass an array
const pipeline = [{ $match: { daw: 'Logic Audio X' } }];
aggregate.append(pipeline);
Aggregate.prototype.catch()
[reject]
«Function»
Executes the aggregation, returning a Promise which will be resolved with either the doc(s) or rejected with the error. Like .then(), but only takes a rejection handler. Compatible with await.
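For illustration, a minimal sketch (the pipeline and error handling are only examples, assuming an existing Model):

Model.aggregate([{ $match: { age: { $gte: 21 } } }])
  .catch(err => {
    // runs only if the aggregation rejects
    console.error(err);
  });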
Aggregate.prototype.collation()
collation
«Object» options
Adds a collation.
const res = await Model.aggregate(pipeline).collation({ locale: 'en_US', strength: 1 });
Aggregate.prototype.count()
fieldName
«String» The name of the output field which has the count as its value. It must be a non-empty string, must not start with $ and must not contain the . character.
Appends a new $count operator to this aggregate pipeline.
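For example, a brief sketch (the field name 'total' is illustrative):

const res = await Model.aggregate()
  .match({ age: { $gte: 21 } })
  .count('total'); // appends { $count: 'total' }
// res looks like [{ total: 42 }]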
Aggregate.prototype.cursor()
options
«Object»
[options.batchSize]
«Number» set the cursor batch size
[options.useMongooseAggCursor]
«Boolean» use experimental mongoose-specific aggregation cursor (for eachAsync() and other query cursor semantics)
Sets the cursor option and executes this aggregation, returning an aggregation cursor. Cursors are useful if you want to process the results of the aggregation one-at-a-time because the aggregation result is too big to fit into memory.
const cursor = Model.aggregate(..).cursor({ batchSize: 1000 });
cursor.eachAsync(function(doc, i) {
  // use doc
});
Aggregate.prototype.densify()
arg
«Object» $densify operator contents
Appends a new $densify operator to this aggregate pipeline.
aggregate.densify({
  field: 'timestamp',
  range: {
    step: 1,
    unit: 'hour',
    bounds: [new Date('2021-05-18T00:00:00.000Z'), new Date('2021-05-18T08:00:00.000Z')]
  }
});
Aggregate.prototype.exec()
Executes the aggregate pipeline on the currently bound Model.
const result = await aggregate.exec();
Aggregate.prototype.explain()
[verbosity]
«String»
Executes the aggregation with the explain option, resolving with the explain output from the MongoDB server rather than the aggregation results.
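A possible usage sketch ('executionStats' is one of MongoDB's standard explain verbosity levels):

const output = await Model.aggregate([{ $match: { age: { $gte: 21 } } }])
  .explain('executionStats');
// `output` contains the server's explain plan, not the matched documents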
Aggregate.prototype.facet()
facet
«Object» options
Combines multiple aggregation pipelines.
const res = await Model.aggregate().facet({
  books: [{ groupBy: '$author' }],
  price: [{
    $bucketAuto: {
      groupBy: '$price',
      buckets: 2
    }
  }]
});

// Output: { books: [...], price: [{...}, {...}] }
Aggregate.prototype.fill()
arg
«Object» $fill operator contents
Appends a new $fill operator to this aggregate pipeline.
aggregate.fill({
  output: {
    bootsSold: { value: 0 },
    sandalsSold: { value: 0 },
    sneakersSold: { value: 0 }
  }
});
Aggregate.prototype.finally()
[onFinally]
«Function»
Executes the aggregation, returning a Promise with .finally(onFinally) chained. More about Promise finally() in JavaScript.
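As an illustration (stopSpinner is a hypothetical cleanup helper that should run whether the aggregation succeeds or fails):

Model.aggregate([{ $match: { age: { $gte: 21 } } }])
  .finally(() => stopSpinner()); // hypothetical cleanup, always invoked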
Aggregate.prototype.graphLookup()
options
«Object» to $graphLookup as described in the above link
Appends new custom $graphLookup operator(s) to this aggregate pipeline, performing a recursive search on a collection.
Note that graphLookup can only consume at most 100MB of memory, and does not allow disk use even if { allowDiskUse: true } is specified.
// Suppose we have a collection of courses, where a document might look like
// `{ _id: 0, name: 'Calculus', prerequisite: 'Trigonometry'}` and
// `{ _id: 0, name: 'Trigonometry', prerequisite: 'Algebra' }`
aggregate.graphLookup({
  from: 'courses',
  startWith: '$prerequisite',
  connectFromField: 'prerequisite',
  connectToField: 'name',
  as: 'prerequisites',
  maxDepth: 3
});
// this will recursively search the 'courses' collection up to 3 prerequisites
Aggregate.prototype.group()
arg
«Object» $group operator contents
Appends a new custom $group operator to this aggregate pipeline.
aggregate.group({ _id: "$department" });
Aggregate.prototype.hint()
value
«Object|String» a hint object or the index name
Sets the hint option for the aggregation query.
Model.aggregate(..).hint({ qty: 1, category: 1 }).exec();
Aggregate.prototype.limit()
num
«Number» maximum number of records to pass to the next stage
Appends a new $limit operator to this aggregate pipeline.
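For example, a minimal sketch:

aggregate.limit(10); // appends { $limit: 10 }, passing at most 10 documents to the next stage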
Aggregate.prototype.lookup()
options
«Object» to $lookup as described in the above link
Appends a new custom $lookup operator to this aggregate pipeline.
aggregate.lookup({ from: 'users', localField: 'userId', foreignField: '_id', as: 'users' });
Aggregate.prototype.match()
arg
«Object» $match operator contents
Appends a new custom $match operator to this aggregate pipeline.
aggregate.match({ department: { $in: [ "sales", "engineering" ] } });
Aggregate.prototype.model()
[model]
«Model» Set the model associated with this aggregate. If not provided, returns the already stored model.
Get/set the model that this aggregation will execute on.
const aggregate = MyModel.aggregate([{ $match: { answer: 42 } }]);
aggregate.model() === MyModel; // true

// Change the model. There's rarely any reason to do this.
aggregate.model(SomeOtherModel);
aggregate.model() === SomeOtherModel; // true
Aggregate.prototype.near()
arg
«Object»
arg.near
«Object|Array<Number>» GeoJSON point or coordinates array
Appends a new $geoNear operator to this aggregate pipeline.
MUST be used as the first operator in the pipeline.
aggregate.near({
  near: { type: 'Point', coordinates: [40.724, -73.997] },
  distanceField: "dist.calculated", // required
  maxDistance: 0.008,
  query: { type: "public" },
  includeLocs: "dist.location",
  spherical: true
});
Aggregate.prototype.option()
options
«Object» keys to merge into current options
[options.maxTimeMS]
«Number» limits the time this aggregation will run, see MongoDB docs on maxTimeMS
[options.allowDiskUse]
«Boolean» if true, the MongoDB server will use the hard drive to store data during this aggregation
[options.collation]
«Object» see Aggregate.prototype.collation()
[options.session]
«ClientSession» see Aggregate.prototype.session()
Lets you set arbitrary options, for middleware or plugins.
const agg = Model.aggregate(..).option({ allowDiskUse: true }); // Set the `allowDiskUse` option
agg.options; // `{ allowDiskUse: true }`
Aggregate.prototype.options
Contains options passed down to the aggregate command. Supported options are:
allowDiskUse
bypassDocumentValidation
collation
comment
cursor
explain
fieldsAsRaw
hint
let
maxTimeMS
raw
readConcern
readPreference
session
writeConcern
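For instance, a brief sketch of how these options can be set via Aggregate.prototype.option() and then inspected (the values shown are illustrative):

const agg = Model.aggregate([{ $match: {} }]);
agg.option({ maxTimeMS: 1000, allowDiskUse: true });
agg.options; // { maxTimeMS: 1000, allowDiskUse: true }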
Aggregate.prototype.pipeline()
Returns the current pipeline
MyModel.aggregate().match({ test: 1 }).pipeline(); // [{ $match: { test: 1 } }]
Aggregate.prototype.project()
arg
«Object|String» field specification
Appends a new $project operator to this aggregate pipeline.
Mongoose query selection syntax is also supported.
// include a, include b, exclude _id
aggregate.project("a b -_id");

// or you may use object notation, useful when
// you have keys already prefixed with a "-"
aggregate.project({ a: 1, b: 1, _id: 0 });

// reshaping documents
aggregate.project({
  newField: '$b.nested',
  plusTen: { $add: ['$val', 10] },
  sub: {
    name: '$a'
  }
});

// etc
aggregate.project({ salary_k: { $divide: [ "$salary", 1000 ] } });
Aggregate.prototype.read()
pref
«String|ReadPreference» one of the listed preference options or their aliases
[tags]
«Array» optional tags for this query.
Sets the readPreference option for the aggregation query.
await Model.aggregate(pipeline).read('primaryPreferred');
Aggregate.prototype.readConcern()
level
«String» one of the listed read concern levels or their aliases
Sets the readConcern level for the aggregation query.
await Model.aggregate(pipeline).readConcern('majority');
Aggregate.prototype.redact()
expression
«Object» redact options or conditional expression
[thenExpr]
«String|Object» true case for the condition
[elseExpr]
«String|Object» false case for the condition
Appends a new $redact operator to this aggregate pipeline.
If 3 arguments are supplied, Mongoose will wrap them in an if-then-else $cond operator. If thenExpr or elseExpr is a string, make sure it starts with $$, like $$DESCEND, $$PRUNE or $$KEEP.
await Model.aggregate(pipeline).redact({
  $cond: {
    if: { $eq: [ '$level', 5 ] },
    then: '$$PRUNE',
    else: '$$DESCEND'
  }
});

// $redact often comes with $cond operator, you can also use the following syntax provided by mongoose
await Model.aggregate(pipeline).redact({ $eq: [ '$level', 5 ] }, '$$PRUNE', '$$DESCEND');
Aggregate.prototype.replaceRoot()
newRoot
«String|Object» the field or document which will become the new root document
Appends a new $replaceRoot operator to this aggregate pipeline.
Note that the $replaceRoot operator requires field strings to start with '$'. If you are passing in a string, Mongoose will prepend '$' if the specified field doesn't start with '$'. If you are passing in an object, the strings in your expression will not be altered.
aggregate.replaceRoot("user");
aggregate.replaceRoot({ x: { $concat: ['$this', '$that'] } });
Aggregate.prototype.sample()
size
«Number» number of random documents to pick
Appends a new custom $sample operator to this aggregate pipeline.
aggregate.sample(3); // Add a pipeline that picks 3 random documents
Aggregate.prototype.search()
$search
«Object» options
Helper for Atlas Text Search's $search stage.
const res = await Model.aggregate().
  search({
    text: {
      query: 'baseball',
      path: 'plot'
    }
  });

// Output: [{ plot: '...', title: '...' }]
Aggregate.prototype.session()
session
«ClientSession»
Sets the session for this aggregation. Useful for transactions.
const session = await Model.startSession();
await Model.aggregate(..).session(session);
Aggregate.prototype.skip()
num
«Number» number of records to skip before the next stage
Appends a new $skip operator to this aggregate pipeline.
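For example, a minimal sketch:

aggregate.skip(20); // appends { $skip: 20 }, skipping the first 20 documents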
Aggregate.prototype.sort()
arg
«Object|String»
Appends a new $sort operator to this aggregate pipeline.
If an object is passed, values allowed are asc, desc, ascending, descending, 1, and -1.
If a string is passed, it must be a space delimited list of path names. The sort order of each path is ascending unless the path name is prefixed with -, which will be treated as descending.
// these are equivalent
aggregate.sort({ field: 'asc', test: -1 });
aggregate.sort('field -test');
Aggregate.prototype.sortByCount()
arg
«Object|String»
Appends a new $sortByCount operator to this aggregate pipeline. Accepts either a string field name or a pipeline object.
Note that the $sortByCount operator requires the new root to start with '$'. Mongoose will prepend '$' if the specified field name doesn't start with '$'.
aggregate.sortByCount('users');
aggregate.sortByCount({ $mergeObjects: [ "$employee", "$business" ] });
Aggregate.prototype.then()
[resolve]
«Function» successCallback
[reject]
«Function» errorCallback
Provides a Promise-like then function, which will call .exec without a callback. Compatible with await.
Model.aggregate(..).then(successCallback, errorCallback);
Aggregate.prototype.unionWith()
options
«Object» to $unionWith query as described in the above link
Appends a new $unionWith operator to this aggregate pipeline.
aggregate.unionWith({ coll: 'users', pipeline: [ { $match: { _id: 1 } } ] });
Aggregate.prototype.unwind()
fields
«String|Object|Array[String]|Array[Object]» the field(s) to unwind, either as field names or as objects with options. If passing a string, prefixing the field name with '$' is optional. If passing an object, path must start with '$'.
Appends new custom $unwind operator(s) to this aggregate pipeline.
Note that the $unwind operator requires the path name to start with '$'. Mongoose will prepend '$' if the specified field doesn't start with '$'.
aggregate.unwind("tags");
aggregate.unwind("a", "b", "c");
aggregate.unwind({ path: '$tags', preserveNullAndEmptyArrays: true });
Aggregate.prototype[Symbol.asyncIterator]()
Returns an asyncIterator for use with for/await/of loops. You do not need to call this function explicitly; the JavaScript runtime will call it for you.
const agg = Model.aggregate([{ $match: { age: { $gte: 25 } } }]);
for await (const doc of agg) {
  console.log(doc.name);
}
Node.js 10.x supports async iterators natively without any flags. You can enable async iterators in Node.js 8.x using the --harmony_async_iteration flag.
Note: This function is not set if Symbol.asyncIterator is undefined. If Symbol.asyncIterator is undefined, that means your Node.js version does not support async iterators.