Duplicate entries when using pagination

I tested with cursor now and it seems to work fine:

import requests
import time

API_KEY = "API_KEY"  # replace with your key
ids = {}             # token_id -> 1, used as a set to detect duplicates

def get_nft_owners(offset, cursor):
    print("offset", offset)
    url = ('https://deep-index.moralis.io/api/v2/nft/'
           '0x50f5474724e0ee42d9a4e711ccfb275809fd6d4a'
           '?chain=eth&format=decimal')
    if cursor:
        url += "&cursor=%s" % cursor

    print("api_url", url)
    headers = {
        "Content-Type": "application/json",
        "X-API-Key": API_KEY,
    }
    response = requests.get(url, headers=headers)
    data = response.json()
    try:
        print("nr results", len(data['result']))
        for x in data['result']:
            ids[int(x['token_id'])] = 1
    except (KeyError, TypeError):  # error payload instead of a result page
        print(repr(data))
        print("exiting")
        raise SystemExit

    print(data['page'], data['total'])
    return data['cursor']


cursor = None
for j in range(211):
    print("nr unique token_ids at offset", j * 500, "=>", len(ids))
    cursor = get_nft_owners(j * 500, cursor)
    print()
    time.sleep(1.1)  # stay under the rate limit


print("nr unique token_ids", len(ids))

There still seems to be an issue with pagination (or with data queries in general). I wrote a script that calls the "{address}/nft/transfers" endpoint, stores the results and the cursor, and repeats until my local DB has the same "total" as the Moralis endpoint reports. This works most of the time, but when I call the endpoint hourly to look for new entries, they don't show up in the results.
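For what it's worth, the sync loop described above can be sketched like this. `fetch_page` stands in for the actual HTTP call, and keying each transfer by `transaction_hash` plus `log_index` is my assumption about what uniquely identifies one; adjust to whatever your results actually contain:

```python
def sync_transfers(fetch_page, store, cursor=None):
    """Pull pages until the API returns an empty cursor, merging each
    transfer into `store` keyed by (transaction_hash, log_index).

    `fetch_page(cursor)` must return a dict with "result" (a list of
    transfer dicts) and "cursor" keys, matching the API responses above.
    """
    while True:
        page = fetch_page(cursor)
        for tx in page["result"]:
            key = (tx["transaction_hash"], tx.get("log_index"))
            store[key] = tx
        cursor = page.get("cursor") or None
        if cursor is None:  # empty string / missing cursor => last page
            return store
```

The size of the returned store can then be compared against the endpoint's "total", and the last non-empty cursor saved for the hourly cron.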

When I initially ran the script, it showed a final total of 333050 entries. I stored the cursor and set up the hourly cron. Within a day the remote total was higher than my local total, but the API results were empty using the last given cursor. See below; you can test this yourself using the cursor and endpoint below:

https://deep-index.moralis.io/api-docs/#/account/getNFTTransfers

The last cursor received:
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJvcmRlciI6IkRFU0MiLCJvZmZzZXQiOjMzMzUwMCwibGltaXQiOjUwMCwidG9rZW5fYWRkcmVzcyI6IjB4NTBmNTQ3NDcyNGUwZWU0MmQ5YTRlNzExY2NmYjI3NTgwOWZkNmQ0YSIsInBhZ2UiOjY2Nywid2hlcmUiOnt9LCJrZXkiOiI5MDQ4OTM2LjEzNS4xMzUuMCIsImlhdCI6MTY1MTE2NDQxOX0.oJ4qlZ4vejZPF9gfuT58Z1vjH1jy_jfGhCjm_apn0fg

The Contract Address:
0x50f5474724e0ee42d9a4e711ccfb275809fd6d4a

array:6 [
  "total" => 334981
  "page" => 667
  "page_size" => 500
  "cursor" => ""
  "result" => []
  "block_exists" => true
]

You can see that it says page 667. We know that the max result set is 500, so 667 x 500 = 333500, but the total shown above is 334981, which means the stale cursor can never reach the newer entries.
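The arithmetic can be spelled out; the numbers below are the ones from the response above:

```python
import math

total = 334981    # remote total reported by the API
page_size = 500   # max results per page
last_page = 667   # page encoded in the last cursor

fetched = last_page * page_size              # 333500 rows the cursor has covered
pages_needed = math.ceil(total / page_size)  # 670 pages needed for the new total
missing = total - fetched                    # 1481 entries the stale cursor never reaches
```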

Something appears to be wrong (still) with the moralis api cursor system.

You don't have to store the cursor; you will get a new cursor every time you start the query again without one.

So when I get to the end and there are no more responses, I can call the endpoint again without a cursor, and it will reply with any new results?

Cursor is used to get the next page of results. Omitting it will start your query from the beginning or the first page.

It won't give you only the new results compared to your previous queries; you are just running the same query again, which may or may not return different results or data.

Right, that's what I thought. I'm trying to get the newest entries: I pull all the entries, an hour goes by, new entries are added on the Moralis side, and I want to fetch just those new entries.

You could call it again and compare the results, but this is obviously quite heavy. I think that’s your only option at this stage. Do they need to specifically be new entries or could you just replace your old data?

What chain are you using for that getNFTTransfers example you mentioned (to look at the total/page_size issue)?

Yes, that is how you should use the cursor, but it will not reply with only the new results; it will iterate over all the results again.
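A minimal sketch of that compare-and-diff approach, assuming (as above) that a transfer is identified by its transaction hash and log index; `old_keys` holds the identifiers saved from the previous full pull:

```python
def find_new_entries(old_keys, fresh_results):
    """Given the set of (transaction_hash, log_index) keys from a previous
    full pull, return only the transfers not seen before."""
    return [tx for tx in fresh_results
            if (tx["transaction_hash"], tx.get("log_index")) not in old_keys]
```

This only works after re-pulling the entire result set, which is exactly the scaling concern raised below.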

I see. This isn't going to scale if every client needs to pull the entire data set just to get the new entries. A delta mechanism could be implemented that would still reduce load on the Moralis servers. I created a diagram, please take a look. I'm open to discussing it with your systems architect(s). Please pass this along.

Can you post this on roadmap.moralis.io?

Yeah, I posted it there.
Honestly, I don't know how people will derive much use out of an API system that can't return just the new entries. I'm sure you aren't pulling the entire blockchain into your database each time you sync; can you imagine the processing that would take? Yet that's what you're asking your clients to do.
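One possible delta mechanism worth checking: the Moralis v2 transfer endpoints document block-range filters such as `from_block`, so a client that records the highest block it has already synced could request only newer transfers instead of re-pulling everything. This is only a sketch; `build_url` is a hypothetical helper, and the parameter support should be verified against the current API docs:

```python
def build_url(address, chain="eth", from_block=None, cursor=None):
    """Build a transfers URL; `from_block` (if supported by the endpoint)
    restricts results to blocks at or after that height, so only the delta
    since the last sync is returned."""
    url = (f"https://deep-index.moralis.io/api/v2/{address}/nft/transfers"
           f"?chain={chain}&format=decimal")
    if from_block is not None:
        url += f"&from_block={from_block}"
    if cursor:
        url += f"&cursor={cursor}"
    return url
```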
