With the `MongoCollection::aggregate` method, pipeline stages appear in an array.
Documents pass through the stages in sequence.
~~~json
db.collection.aggregate( [ { <stage> }, ... ] )
~~~
The `MongoPipeline` fluent interface can be used to build a pipeline:
~~~
var pipeline = (new MongoPipeline).
	match((new MongoMatch).eq("game", "nit")).
	group((new MongoGroup("$game._id")).sum("nitcoins", "$game.nitcoins")).
	sort((new MongoMatch).eq("nitcoins", -1)).
	limit(10)
~~~
The pipeline can then be used in an aggregation query:
~~~nitish
collection.aggregate(pipeline)
~~~
For more information, read about MongoDB pipeline operators in the official MongoDB documentation: https://docs.mongodb.com/manual/reference/operator/aggregation/
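Since each fluent call appends one `{ $<stage>: <value> }` object (this is what `add_stage` does), the example pipeline above corresponds to a JSON array of roughly this shape — a sketch only, since the exact `$match` and `$sort` bodies depend on how `MongoMatch` serializes and are left as placeholders:

~~~json
[
  { "$match": <query> },
  { "$group": { "_id": "$game._id", "nitcoins": { "$sum": "$game.nitcoins" } } },
  { "$sort": <sort> },
  { "$limit": 10 }
]
~~~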
# Mongo pipelines are arrays of aggregation stages
#
# With the `MongoCollection::aggregate` method, pipeline stages appear in an array.
# Documents pass through the stages in sequence.
#
# ~~~json
# db.collection.aggregate( [ { <stage> }, ... ] )
# ~~~
#
# The MongoPipeline fluent interface can be used to build a pipeline:
# ~~~
# var pipeline = (new MongoPipeline).
#	match((new MongoMatch).eq("game", "nit")).
#	group((new MongoGroup("$game._id")).sum("nitcoins", "$game.nitcoins")).
#	sort((new MongoMatch).eq("nitcoins", -1)).
#	limit(10)
# ~~~
#
# The pipeline can then be used in an aggregation query:
# ~~~nitish
# collection.aggregate(pipeline)
# ~~~
#
# For more information, read about MongoDB pipeline operators in the official
# MongoDB documentation: https://docs.mongodb.com/manual/reference/operator/aggregation/
class MongoPipeline
super JsonArray
# Add a stage to the pipeline
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/#stage-operators
#
# Each stage is registered as:
# ~~~json
# { $<stage>: <json> }
# ~~~
fun add_stage(stage: String, json: Serializable): MongoPipeline do
	var obj = new JsonObject
	obj["${stage}"] = json
	add obj
	return self
end
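
# For example, given the registration scheme above, `add_stage("limit", 10)`
# appends the object `{ "$limit": 10 }` to the pipeline array.
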
# Apply projection
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/project/#pipe._S_project
#
# Passes along the documents with only the specified fields to the next stage
# in the pipeline.
#
# ~~~json
# { $project: { <specifications> } }
# ~~~
#
# The specified fields can be existing fields from the input documents or
# newly computed fields.
fun project(projection: JsonObject): MongoPipeline do return add_stage("project", projection)
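
# For example (hypothetical `name` and `nitcoins` fields), a sketch that keeps
# only two fields in the output documents:
#
# ~~~nitish
# var spec = new JsonObject
# spec["name"] = 1
# spec["nitcoins"] = 1
# pipeline.project(spec)
# ~~~
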
# Apply match
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/match/
#
# Filters the documents to pass only the documents that match the specified
# condition(s) to the next pipeline stage.
#
# ~~~json
# { $match: { <query> } }
# ~~~
fun match(query: MongoMatch): MongoPipeline do return add_stage("match", query)
# Apply sort
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/sort/
#
# Sorts all input documents and returns them to the pipeline in sorted order.
#
# ~~~json
# { $sort: { <projection> } }
# ~~~
fun sort(projection: JsonObject): MongoPipeline do return add_stage("sort", projection)
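
# For example (hypothetical `nitcoins` field), a sketch sorting in descending
# order:
#
# ~~~nitish
# var spec = new JsonObject
# spec["nitcoins"] = -1
# pipeline.sort(spec)
# ~~~
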
# Apply skip
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/skip/
#
# Skips over the specified number of documents that pass into the stage and
# passes the remaining documents to the next stage in the pipeline.
#
# ~~~json
# { $skip: <number> }
# ~~~
#
# If `number == null` then no skip stage is generated
fun skip(number: nullable Int): MongoPipeline do
	if number == null then return self
	return add_stage("skip", number)
end
# Apply limit
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/limit/
#
# Limits the number of documents passed to the next stage in the pipeline.
#
# ~~~json
# { $limit: <number> }
# ~~~
#
# If `number == null` then no limit stage is generated
fun limit(number: nullable Int): MongoPipeline do
	if number == null then return self
	return add_stage("limit", number)
end
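
# For example, `skip` and `limit` can be combined into a paging sketch
# (hypothetical `page` and `per_page` values; either stage is simply omitted
# when its argument is `null`):
#
# ~~~nitish
# var per_page = 10
# var page = 2
# var paged = (new MongoPipeline).
#	skip((page - 1) * per_page).
#	limit(per_page)
# ~~~
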
# Apply group
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/group/
#
# Groups documents by some specified expression and outputs to the next stage
# a document for each distinct grouping.
#
# The output documents contain an `_id` field which contains the distinct
# group by key.
#
# The output documents can also contain computed fields that hold the values
# of some accumulator expression grouped by the `$group`'s `_id` field.
# `$group` does not order its output documents.
#
# ~~~json
# { $group: { <group> } }
# ~~~
fun group(group: MongoGroup): MongoPipeline do return add_stage("group", group)
# Apply unwind
#
# https://docs.mongodb.com/manual/reference/operator/aggregation/unwind/
#
# Deconstructs an array field from the input documents to output a document
# for each element.
# Each output document is the input document with the value of the array
# field replaced by the element.
#
# ~~~json
# { $unwind: <field path> }
# ~~~
fun unwind(path: String): MongoPipeline do return add_stage("unwind", path)
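
# For example (hypothetical `sizes` field), `unwind("$sizes")` appends
# `{ "$unwind": "$sizes" }`: an input document `{ _id: 1, sizes: ["S", "M"] }`
# produces one output document per array element.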
end
lib/mongodb/queries.nit:350,1--496,3