Rate Limits & Optimizing Queries

Hello -

I am currently trying to write a script that iterates through the line items of each purchase order returned. I went about this by querying each purchase order and then paging through its line items based on the "hasNextPage" field, with a limit of "first: 10". My goal is to pull in all line-item information for all historical purchase orders.

The issue I am running into is that when I run the script to iterate through purchase orders and then line items, I get a 400 error at different points in the iteration. Sometimes it fails on, say, Purchase Order 123 at page 12, and other times on Purchase Order 123 at page 15. I am not changing anything in the script between runs other than updating the "id" and "after" filters in my query. Also, when I pull out the iteration step and run the same query in Altair, it returns results with no issues.

Is there a rate limit imposed that is based on the number of calls made during a period of time? Let’s say only x per minute?

Please let me know when you have a chance, as I am really banging my head against this one :sweat_smile:.

Current Query Used:

query {
  purchase_order(id: "[purchase order id]", analyze: true) {
    complexity
    data {
      line_items(first: 10, after: "[iterated end cursor]") {
        pageInfo {
          hasNextPage
          hasPreviousPage
          startCursor
          endCursor
        }
        edges {
          node {
            id
            po_id
            account_id
            vendor_id
            sku
            vendor_sku
            product_id
            variant_id
            product_name
            quantity
            quantity_received
            quantity_rejected
            price
            fulfillment_status
            updated_at
            created_at
          }
          cursor
        }
      }
    }
  }
}
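To be concrete, my paging loop is roughly the following (a simplified Python sketch; fetch_page here is a stand-in for whatever function sends the query above with the current cursor and returns the parsed line_items connection):

```python
# Simplified paging loop. fetch_page(cursor) is a hypothetical helper that
# runs the query above with `after: cursor` and returns the parsed
# line_items connection: {"pageInfo": {...}, "edges": [...]}.

def iterate_line_items(fetch_page):
    """Yield every line-item node, following endCursor until hasNextPage is false."""
    cursor = None
    while True:
        connection = fetch_page(cursor)
        for edge in connection["edges"]:
            yield edge["node"]
        page_info = connection["pageInfo"]
        if not page_info["hasNextPage"]:
            break
        cursor = page_info["endCursor"]  # fed back in as the `after` argument
```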

Hello -

Responding to this thread again in hopes of getting a response. I am not sure why I am being limited when I have not gone over my credit allowance.

Thanks.

Hey @rubix3! Could you please share the exact error and, if possible, the request_id? You should always request that field as part of the response (at the same level as the complexity field). With it we can inspect the original payload and track down any issues.

Hi rubix3,

Yup, there’s a limit of 5000 credits per hour. The complexity item in your query shows the number of credits that the query is using. If the base complexity for retrieving a single record is 1, each record retrieved costs 1 credit, plus one additional credit per query call. That means that if you request the first 10 records, the total cost for that query run is 11. If you know that many records exist, requesting up to 100 at a time is more cost-effective. The caveat is that if you request 100 records but only 5 exist, it will still cost you 101 credits instead of 6. It goes by what’s requested, not what’s returned, unfortunately.

Also, as the name suggests, the more complex you make the query, the more it costs to pull a single record, so pulling 100 records with a complexity of 4 will cost 401 credits.
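In code form, the cost works out like this (just a quick sketch of the formula above, nothing official):

```python
def query_cost(requested, per_record_complexity=1):
    """Credits charged for one query call.

    Cost is based on what is requested, not what is returned:
    requested records x complexity per record, plus 1 for the call itself.
    """
    return requested * per_record_complexity + 1

print(query_cost(10))      # first: 10 at complexity 1 -> 11 credits
print(query_cost(100, 4))  # first: 100 at complexity 4 -> 401 credits
```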

I hope that helps!

Jeremy


Hello @seba !

I will recreate the error tomorrow morning and send your way.

Thanks,

-rubix3

Hello @jeremyw !

Yes, I have seen this limit of 5000 credits per hour. My issue arises when I run the same query multiple times in a short period of time. Each time an error occurs, I still have over 3000 credits remaining in the hour.

I am essentially looping through a set of results with a limit of the first 10 results each time, to avoid over-allocating credits in cases where there are far fewer than the standard 100 records available.
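One thing I may try in the meantime is wrapping each page request in a retry with exponential backoff, so a transient error doesn't kill the whole run. Something like this sketch (here RuntimeError is just a stand-in for however your client surfaces the HTTP error):

```python
import time

def with_retries(call, attempts=5, base_delay=1.0):
    """Retry a flaky API call with exponential backoff (hypothetical helper).

    `call` is a zero-argument function that performs one page request.
    """
    for attempt in range(attempts):
        try:
            return call()
        except RuntimeError:  # stand-in for an HTTP 400/429 from the API
            if attempt == attempts - 1:
                raise  # out of attempts; re-raise the last error
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```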

I will recreate the error and obtain the request_id above.

Thanks!

-rubix3


@rubix3 I’ve seen a similar issue recently where I have plenty of credits remaining for what I’m trying to do, but then all of a sudden I get back an error in the middle of a run. I’ve been trying to catch one of those errors myself, but now that I’m watching for it, I haven’t seen any.

@jeremyw thanks for your patience.

It seems as though I am in the same boat. I have not been able to recreate the error since it occurred last week. I will keep a watch over it and see if the error pops up again as I work through the problem.

Thanks,
-rubix3
