Batch saving to database

Hey there,

I would like to save 2 (or more) objects to the database, if one of them fails then neither of them should be actually saved.

i.e.: A set of write operations on one or more documents, either all of the operations succeed, or none of them are applied.

This is useful for unexpected failures. Does moralis have this feature?

Thanks in advance!

It is possible. If you are saving multiple values to the same row, call the .save() function once, after the final loop iteration. That way, if the function stops due to an error partway through the loop, nothing is written to the database.


If you need to undo a save afterwards, you can use the .destroy() function to remove the object from the database.
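Putting those two suggestions together, one workaround is a manual "all or nothing" save: save the objects one by one, and if any save throws, .destroy() the ones that already went through. This is only a sketch, not a Moralis/Parse API: saveAllOrRollback is a name I made up, and it assumes each object exposes the async save() and destroy() methods mentioned above. It is also not truly atomic (a crash between the failure and the rollback still leaves partial data), which is why real transactions are the better answer once available.

```javascript
// Sketch of a manual rollback helper (hypothetical name, not a Moralis API).
// `objects` is an array of Moralis/Parse-style objects with async
// save() and destroy() methods.
async function saveAllOrRollback(objects) {
  const saved = [];
  try {
    for (const obj of objects) {
      await obj.save();
      saved.push(obj); // remember what we may need to undo
    }
    return true; // every save succeeded
  } catch (err) {
    // One save failed: destroy everything saved so far.
    for (const obj of saved) {
      await obj.destroy();
    }
    return false; // nothing (durably) saved
  }
}
```

Usage would be `const ok = await saveAllOrRollback([objA, objB]);` — if `ok` is false, the earlier saves were rolled back.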

It looks like it is not yet implemented in Parse Server:

1 Like

Thanks @cryptokid. I looked for something that mentioned batched transactions in the Parse docs but didn't find it; I forgot to look on their GitHub!
Thanks for the help as always!

1 Like

So it is not impossible then? What would be a good workaround for this issue?
I also need to batch save objects into the Moralis database.

const axiosFunc = async (apiUrl, xAPIKey, editionSize, imageDataArray, arrayOfArrays, i) => {
  await rateLimiter.schedule(() =>
    axios
      .post(apiUrl, arrayOfArrays, {
        maxContentLength: 100000000000,
        maxBodyLength: 1000000000000,
        headers: {
          "X-API-Key": xAPIKey,
          "content-type": "application/json",
          accept: "application/json",
        },
      })
      .then((res) => {
        console.log("IMAGE FILE PATHS:", res.data);
      })
      .then(
        (imageMD) => {
          console.log("saved image metadata!");
        },
        (error) => {
          console.log("failed to save image metadata!");
        }
      )
  );
};

The code above is the one I am using to test this loop. You can’t see it now because I deleted the loop and just tried to hardcode it because I suspected there was a problem with the loop. But now that I read your message, it makes sense to me that this batch feature is not implemented yet.

You can maybe try bulk queries.
Note that they work only in cloud code.
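For illustration, a cloud function along these lines could batch the writes server-side. This is an untested sketch: the function name "saveBatch", the class name "BatchDemo", and the field names are all made up, and Parse.Object.saveAll() batching the writes into one request does not by itself guarantee atomicity — that depends on your Parse Server / MongoDB setup (transactions require a replica set).

```javascript
// Hypothetical cloud-code sketch (names invented for illustration).
Parse.Cloud.define("saveBatch", async (request) => {
  // Build one Parse object per item sent by the client.
  const objects = request.params.items.map((item) => {
    const obj = new Parse.Object("BatchDemo");
    obj.set("payload", item);
    return obj;
  });
  // saveAll() sends the writes as a batch; whether they apply
  // atomically depends on the server's transaction support.
  await Parse.Object.saveAll(objects);
  return objects.length;
});
```

The client would then call this one cloud function instead of saving each object separately.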