I’ve got the API working well and connected to our ERP system. Once the initial data is loaded, the nightly batches I’m setting up should upload very quickly.
To establish the 2019 data, I need to load 42k customers (ecommerce and brick-and-mortar business); that is the largest set of records I will be uploading in the initial setup.
In testing, I uploaded 15k records in 1.5 hours, and the upload slowed significantly as it continued. This isn’t a major problem since it’s a one-time load and I can just leave it running until it’s done, but I’m wondering whether I’m hitting throttling limits, or whether I need to pause after a certain number of requests the way the Amazon API does (X requests per minute, allowing Y requests per hour, i.e. a leaky bucket algorithm)?
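In case it helps frame the question, this is roughly the kind of client-side pacing I have in mind. It's only a sketch: the 500-requests-per-minute figure is a placeholder I made up, not a documented quota, and the real upload call would replace the demo loop.

```python
import time

class LeakyBucket:
    """Client-side pacer: allow at most `rate` requests per `per` seconds,
    spacing them evenly instead of sending them in bursts."""

    def __init__(self, rate, per):
        self.interval = per / rate          # minimum seconds between requests
        self.next_allowed = time.monotonic()

    def wait(self):
        """Block until the next request is allowed to go out."""
        now = time.monotonic()
        if now < self.next_allowed:
            time.sleep(self.next_allowed - now)
        self.next_allowed = max(now, self.next_allowed) + self.interval

# Hypothetical limit of 500 requests/minute; a real value would have to
# come from the API's documentation or support.
bucket = LeakyBucket(rate=500, per=60.0)

# Demo at a faster rate so it finishes quickly (10 req/sec, 5 requests):
demo = LeakyBucket(rate=10, per=1.0)
start = time.monotonic()
for _ in range(5):
    demo.wait()                 # in real use: demo.wait(); then the API call
elapsed = time.monotonic() - start
```

With this in front of each request, the upload rate stays flat instead of degrading as the batch runs.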
I checked the response headers and didn’t see any quota information, but maybe I’m missing something?
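For reference, these are the things I've been looking for: an HTTP 429 status and the conventional rate-limit headers. The header names below are common conventions (and the values in the example are simulated), not something I've confirmed this API actually sends.

```python
# Header names many APIs use to advertise quota state; this particular
# API may use different names, or none at all.
RATE_LIMIT_HEADERS = ("Retry-After", "X-RateLimit-Limit",
                      "X-RateLimit-Remaining", "X-RateLimit-Reset")

def rate_limit_info(status_code, headers):
    """Return (throttled, hints): `throttled` is True on HTTP 429
    ("Too Many Requests"), and `hints` maps any recognized
    rate-limit header to its value."""
    hints = {h: headers[h] for h in RATE_LIMIT_HEADERS if h in headers}
    return status_code == 429, hints

# Simulated throttled response, just to show the shape of the result:
throttled, hints = rate_limit_info(429, {"Retry-After": "30"})
```

If the responses never carry any of these headers and never return a 429, the slowdown may be server-side processing rather than throttling.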
I suspect another option would be to use the free desktop version as an intermediary and batch-create the records there to get 2019 set up, then import the company into the cloud edition and use the API just for the nightly batches. I'm just wondering whether the API route is viable so I can avoid that manual step?
Thanks in advance for your input.