How do I load a large amount of data into the Moralis DB?

I have a JSON file with a large amount of data.

I want to make a script in Node.js that loads it all into the Moralis database:

const Moralis = require("moralis/node");
const { readFileSync } = require("fs");
const { forEach } = require("p-iteration");

const serverUrl = "test";
const appId = "tesr";
const masterKey = "";

const SaveData = async () => {
    await Moralis.start({ serverUrl, appId, masterKey });

    const Monster = Moralis.Object.extend("Monster");

    const file = readFileSync("db/monser_data.json", "utf-8");
    const orig_data = JSON.parse(file);

    // One save() call per row
    await forEach(orig_data, async (item) => {
        const trx = new Monster();
        trx.set("user", item.user);
        trx.set("value", item.value);
        await trx.save(null, { useMasterKey: true });
    });
};

SaveData();

But it hangs for a very large amount of data. How do I do this properly?

I thought about using

Moralis.bulkWrite("Food", foodsToInsert);

but this can only be done in a Cloud Function.

Maybe I should create my own Cloud Function and pass data to it in batches from the script? A sketch of that idea is below.
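Here is a minimal sketch of that approach. It assumes the row format shown for Moralis.bulkWrite in the Moralis Cloud Code docs (each row wrapped in an update key); the function name insertMonsters and the batch size are my own placeholders, not anything from the Moralis API.

// Cloud code, defined in the Moralis dashboard.
Moralis.Cloud.define("insertMonsters", async (request) => {
    // Wrap each incoming item in the { update: { ... } } row shape
    // that bulkWrite expects (assumption based on the docs example).
    const rows = request.params.batch.map((item) => ({
        update: { user: item.user, value: item.value },
    }));
    return Moralis.bulkWrite("Monster", rows);
});

And on the client side, slice the file into fixed-size batches instead of saving one object at a time:

const BATCH_SIZE = 500; // arbitrary; tune to your payload limits

const SaveDataInBatches = async () => {
    await Moralis.start({ serverUrl, appId, masterKey });
    const orig_data = JSON.parse(readFileSync("db/monser_data.json", "utf-8"));

    // Each Cloud.run call carries one batch, so the script holds at
    // most BATCH_SIZE pending writes instead of the whole file.
    for (let i = 0; i < orig_data.length; i += BATCH_SIZE) {
        const batch = orig_data.slice(i, i + BATCH_SIZE);
        await Moralis.Cloud.run("insertMonsters", { batch });
    }
};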

You can connect directly to MongoDB from Node.js and add the data that way; make sure to use a string for object_id.

https://docs.moralis.io/moralis-dapp/database/direct_access
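A minimal sketch of the direct-connection approach with the official mongodb driver. The host/port placeholder and the "parse" database name are assumptions (check the docs page above and your Moralis dashboard), and the newObjectId helper is hypothetical; the point is that _id must be a plain string, not a Mongo ObjectId.

const { MongoClient } = require("mongodb");
const { readFileSync } = require("fs");
const crypto = require("crypto");

// Hypothetical helper: Parse-style servers store string ids,
// not Mongo ObjectIds.
const newObjectId = () => crypto.randomBytes(8).toString("hex");

const insertDirectly = async () => {
    // Host and port come from your Moralis dashboard (assumption).
    const client = new MongoClient("mongodb://<MONGO_HOST>:<MONGO_PORT>");
    await client.connect();
    try {
        const collection = client.db("parse").collection("Monster");
        const orig_data = JSON.parse(readFileSync("db/monser_data.json", "utf-8"));

        const docs = orig_data.map((item) => ({
            _id: newObjectId(), // must be a string
            user: item.user,
            value: item.value,
            // Parse keeps its own timestamp columns; this assumes your
            // server expects them -- drop if not.
            _created_at: new Date(),
            _updated_at: new Date(),
        }));

        // One insertMany round trip instead of one save() per row.
        await collection.insertMany(docs);
    } finally {
        await client.close();
    }
};

insertDirectly();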