Handling complex JSON documents is not easy with traditional SQL databases like Postgres and MySQL. My team just finished building a large site with AngularJS, and using MongoDB (or any document store) gave us an easy way to save denormalized JSON documents. Suppose you have a JSON object that contains an array of other objects, each of which holds arrays of further data, so the document is four levels deep. With MongoDB (or any other document store) you simply put the document in the store as-is, but with MySQL or Postgres you would break it up and store it in normal form, which means at least four database tables, maybe more. That is a lot more work and, crucially, it is much harder to automate: the code cannot automatically know which deeply nested array of objects maps to which database table. The requirements for automatic normalization go beyond the features of a typical ORM.
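To make that concrete, here's a minimal sketch of the document-store side, using the official MongoDB Node.js driver; the `shop` database, `orders` collection, and the document's field names are all hypothetical, just to illustrate a structure four levels deep:

```typescript
import { MongoClient } from "mongodb";

// A document nested four levels deep: order -> items -> options -> values.
const order = {
  customer: "c-42",
  items: [
    {
      sku: "widget-1",
      options: [
        { name: "color", values: ["red", "blue"] },
        { name: "size", values: ["S", "M", "L"] },
      ],
    },
  ],
};

async function main() {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  // One call stores the whole tree; no mapping layer required.
  await client.db("shop").collection("orders").insertOne(order);
  // The relational equivalent in normal form would split this into at
  // least four tables (orders, items, options, option_values), each
  // holding a foreign key back to its parent row.
  await client.close();
}

main().catch(console.error);
```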
There has been an explosion of systems lately that automate the synchronization of frontend and backend. As far as I know, they all rely on document stores such as MongoDB.
Mongo may be the right call for your team, but as an aside, the use case you're describing is perfectly doable in PostgreSQL using either text columns or json columns; moreover, table denormalization is just as much an option for relational users as for document store users.
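For instance, here's a sketch of the same denormalized document stored in a single PostgreSQL column, using the `pg` package from Node.js; the `orders` table and connection string are hypothetical, and I'm using the `jsonb` type, though a plain `json` or `text` column works for storage too:

```typescript
import { Client } from "pg";

async function main() {
  const client = new Client({ connectionString: "postgres://localhost/shop" });
  await client.connect();

  // One table, one json column: no normalization required.
  await client.query(
    "CREATE TABLE IF NOT EXISTS orders (id serial PRIMARY KEY, doc jsonb)"
  );

  // Store the whole nested document, exactly as a document store would.
  const order = {
    customer: "c-42",
    items: [{ sku: "widget-1", options: [{ name: "color", values: ["red"] }] }],
  };
  await client.query("INSERT INTO orders (doc) VALUES ($1)", [
    JSON.stringify(order),
  ]);

  // And you can still reach into the nested structure with the
  // built-in json operators.
  const res = await client.query(
    "SELECT doc -> 'items' -> 0 ->> 'sku' AS sku FROM orders"
  );
  console.log(res.rows); // [ { sku: 'widget-1' } ]

  await client.end();
}

main().catch(console.error);
```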