#!/usr/bin/python3
# encoding=utf-8
import db
from pymongo import MongoClient
user = db.get_db_user()
recharge = db.get_db_recharge()
consume = db.get_db_consume()
client = MongoClient("localhost", 27017)
mongo_db = client.test  # renamed from "db" to avoid shadowing the imported db module
col_new = mongo_db.new
lookup1 = {"$lookup": {
        "from": "recharge",
        "localField": "_id",
        "foreignField": "uid",
        "as": "recharge"
}}
lookup2 = {"$lookup": {
        "from": "consume",
        "localField": "_id",
        "foreignField": "uid",
        "as": "consume"
}}
# the two $lookup stages join recharge and consume onto each user via uid
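For reference, each $lookup stage behaves like a left outer join whose matches land in an array field. A plain-Python sketch of that behavior (the `lookup` helper and the sample documents are illustrative, not part of the script):

```python
def lookup(local_docs, foreign_docs, local_field, foreign_field, as_field):
    """Mimic {$lookup: {...}}: a left outer join that stores all
    matching foreign documents as a list under as_field."""
    out = []
    for doc in local_docs:
        matches = [f for f in foreign_docs
                   if f.get(foreign_field) == doc.get(local_field)]
        merged = dict(doc)
        merged[as_field] = matches  # always a list, possibly empty
        out.append(merged)
    return out

# hypothetical sample data in the shape of user / recharge
users = [{"_id": 1, "nickname": "a"}]
recharges = [{"uid": 1, "amount": 50}, {"uid": 2, "amount": 30}]
print(lookup(users, recharges, "_id", "uid", "recharge"))
```

A user with no matching recharge documents still comes through, just with an empty `recharge` list, which matters for the $arrayElemAt step below.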
replaceRoot1 = {"$replaceRoot": {"newRoot": {"$mergeObjects": [{"$arrayElemAt": ["$recharge", 0]}, "$$ROOT"]}}}
replaceRoot2 = {"$replaceRoot": {"newRoot": {"$mergeObjects": [{"$arrayElemAt": ["$consume", 0]}, "$$ROOT"]}}}
# $replaceRoot promotes the merged document to the top level
# $mergeObjects merges the first joined document with the original document
# $arrayElemAt takes element 0 of the joined array
# $$ROOT is the current document; listed last, its fields win on conflicts
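Put together, each $replaceRoot/$mergeObjects stage above lifts the first joined document into the root, with the original document's fields taking precedence on key conflicts. A plain-Python sketch of that merge (`merge_first_joined` and the sample document are made-up for illustration):

```python
def merge_first_joined(doc, joined_key):
    """Mimic {$replaceRoot: {newRoot: {$mergeObjects:
    [{$arrayElemAt: ["$joined_key", 0]}, "$$ROOT"]}}}."""
    # $arrayElemAt on an empty array yields "missing", which
    # $mergeObjects ignores; an empty dict models that here.
    joined = doc.get(joined_key) or [{}]
    merged = dict(joined[0])  # fields from the first joined document
    merged.update(doc)        # $$ROOT is last, so its fields win
    return merged

# hypothetical document shape after the recharge lookup
doc = {"_id": 1, "nickname": "a", "recharge": [{"uid": 1, "amount": 50}]}
print(merge_first_joined(doc, "recharge"))
```

Note that only the first recharge/consume document is promoted; the rest stay inside the array fields.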
project = {"$project": {
        "_id": 1,
        "nickname": 1,
        "phone": 1,
        "regDate": 1,
        "lastLogin": 1,
        "balance": 1,
        "totalRecharge": 1,
        "jcode": 1,
        "consume.amount": 1,
        "consume.consumeDate": 1,
        "recharge.real": 1,
        "recharge.amount": 1,
        "recharge.from": 1,
        "recharge.rechargeDate": 1,
        "recharge.tradeNo": 1
}}
# $project selects the output fields
pipeline = [lookup1, lookup2, replaceRoot1, replaceRoot2, project]
result = user.aggregate(pipeline)
num = 0
for i in result:
    num = num + 1
    # col_new.insert_one(i)
    print(num)
    print(i)

Printing the elements of result and inserting them into the new collection now stops at 101 elements, and I don't know why. There is no problem with the number of documents in the user collection, even allowing for the few that the pipeline conditions might filter out. The key point is that there was no such problem yesterday, and the code was also backed up to GitHub. Today a field was added to project, and now even dropping the new collection and re-running yesterday's code does not work.
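For what it's worth, 101 is MongoDB's default first-batch size for aggregation cursors, so stopping at exactly 101 documents suggests the first getMore after the initial batch is failing, rather than the pipeline itself. A hedged sketch of wrapping the loop so a mid-iteration error reports how far the cursor got (`drain` is a made-up helper; a plain generator stands in for the cursor):

```python
def drain(cursor, sink):
    """Iterate a cursor, feeding each document to sink and counting them,
    and report how far iteration got if it raises mid-way."""
    num = 0
    try:
        for doc in cursor:
            sink(doc)
            num += 1
    except Exception as exc:  # in the real script, catch pymongo.errors.PyMongoError
        print("cursor failed after", num, "documents:", repr(exc))
        raise
    return num

# a plain generator stands in for the aggregation cursor
docs = ({"_id": i} for i in range(5))
out = []
print(drain(docs, out.append))  # prints 5
```

In the real script, passing a different batch size, e.g. `user.aggregate(pipeline, batchSize=10)`, should move the stopping point if the getMore is what fails.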