MongoDB offers a handy tool, mongoexport, that allows us to export a MongoDB collection to a JSON file:

mongoexport --collection=posts --db=blog --out=c:\progs\myPosts.json

Running the above command exports the posts collection from the blog database to the myPosts.json file.

So far so good, but there is a little caveat: mongoexport does not let us split the output across multiple files, so if we are dealing with a rather large collection we may end up with an uncomfortably big file.

In those cases we can turn to an alternative export solution, for example printjson in the mongo shell:

mongo --quiet blog --eval 'printjson(db.posts.find().skip(0).limit(200).toArray())' > c:\progs\myPosts1.json

This command takes the first 200 documents in the collection and prints them to the myPosts1.json file. Using the find function on the collection, and playing with the skip and limit functions, we can dump the collection content in an orderly way to different files, each one limited to N documents.
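
A small aside: without an explicit sort, the order in which MongoDB returns documents is not guaranteed to stay the same between queries, so consecutive skip/limit batches could overlap or miss documents. A variation on the command above that sorts on the unique _id field keeps the batches stable (the sort call is an addition on top of the original command, not part of it):

mongo --quiet blog --eval 'printjson(db.posts.find({}).sort({_id: 1}).skip(0).limit(200).toArray())' > c:\progs\myPosts1.json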

If, as in the previous example, we exported the first 200 entries to myPosts1.json, to export the next 200 we would run:

mongo --quiet blog --eval 'printjson(db.posts.find({}).skip(200).limit(200).toArray())' > c:\progs\myPosts2.json

We can use a script to automate the process and export the full collection, split into as many files as we wish!
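
Here is a minimal sketch of such a script, assuming a bash-compatible shell and the legacy mongo shell on the PATH; the chunk size of 200 and the c:\progs output folder simply mirror the commands above, while the sort on _id is an extra assumption to keep the batches stable:

#!/bin/bash
# Export the posts collection of the blog database in chunks of 200 documents,
# one numbered JSON file per chunk.
CHUNK=200
# Total number of documents in the collection (legacy shell helper).
TOTAL=$(mongo --quiet blog --eval 'db.posts.count()')
FILE=1
for ((SKIP=0; SKIP<TOTAL; SKIP+=CHUNK)); do
  # Each iteration dumps one batch of CHUNK documents to its own file.
  mongo --quiet blog --eval "printjson(db.posts.find({}).sort({_id: 1}).skip($SKIP).limit($CHUNK).toArray())" > "c:\progs\myPosts$FILE.json"
  FILE=$((FILE + 1))
done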