simple query:
{
  "query": "{ items(limit: 50000) { name state board { id } group { title } column_values { title text } } }"
}
The error I get is shown below.
I don’t see this when I query with lower limits like 1000.
Is there a maximum cap up to which I can query?
<html>
monday.com is having some technical issues
.......and so on
Hi @mitchell.hudson, is there a way to get the total count of all items? That way we could break the work into lots of small queries. I've run queries of 200,000 items with a complexity of 6 million, and I also only recently started getting the "technical issues" response.
Not necessarily for the entire account. Say I create a query with defined limits: the pagination is great, but it doesn't help me if I don't know how many pages to iterate through. I could call sequentially and check whether an empty list comes back, but that takes just as long as making one large query, since each call has to wait for the previous one to complete. I was hoping for some way to get the total item count, so that from the page size I provide I can derive the page count: for a total of 1000 items / page size (200) = 5 pages.
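The arithmetic above is just a ceiling division; a minimal sketch (the function name `page_count` is my own, not part of any API):

```python
import math

def page_count(total_items: int, page_size: int) -> int:
    """Number of pages needed to fetch total_items at page_size per request."""
    return math.ceil(total_items / page_size)

print(page_count(1000, 200))  # 5 pages, as in the example above
print(page_count(1001, 200))  # 6 pages: a partial page still costs a request
```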
That is rather unfortunate. Hopefully that gets added at some point as waiting for each call may actually take longer than just submitting a huge query.
Dipro from the monday.com team here – sorry for the late response.
Why not make N concurrent requests at a time? You can check the Nth response to see if there are more items to return. This means you can keep each request smaller (lower the chance of hitting a timeout or 500 error) and still get a lot of data at once.
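A minimal sketch of that batched-concurrency idea, assuming a hypothetical `fetch_page` in place of the real monday.com API call (which would be an HTTP POST of a paginated GraphQL query); the fake in-memory dataset just makes the sketch self-contained:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for the real API call; serves a fake dataset
# of 1050 items so the sketch runs without a network.
FAKE_ITEMS = [{"name": f"item {i}"} for i in range(1, 1051)]

def fetch_page(page: int, limit: int = 200):
    start = (page - 1) * limit
    return FAKE_ITEMS[start:start + limit]

def fetch_all(limit: int = 200, batch: int = 4):
    """Fetch `batch` pages concurrently; stop when the last page of a
    batch comes back short, meaning there are no more items."""
    items, page = [], 1
    with ThreadPoolExecutor(max_workers=batch) as pool:
        while True:
            pages = list(pool.map(lambda p: fetch_page(p, limit),
                                  range(page, page + batch)))
            for chunk in pages:
                items.extend(chunk)
            # A short final page means the data set is exhausted.
            if len(pages[-1]) < limit:
                return items
            page += batch
```

The key trade-off is that each request stays small (less likely to time out), while the batch still pulls several pages' worth of data per round trip of latency.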
Here are my responses to the two questions you have:
Queries with big limits
At the moment we don’t have a hard limit on how many items you can request at a time. However, we do have a 60 second timeout for all requests, so if you’re returning a ton of data you might see a timeout.
And of course, if you’re asking for a really big data set an error might be thrown on the infrastructure side and cause a 500 error.
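One way a client can cope with those timeouts and 500s is to fall back to smaller limits when a big request fails. A sketch under assumptions: `ServerError` and `flaky` are hypothetical stand-ins for a real request raising on a 500/timeout, not anything monday.com provides:

```python
class ServerError(Exception):
    """Stands in for a 500 response or a 60-second timeout."""

def fetch_with_fallback(request_fn, limit: int = 50000, min_limit: int = 500):
    """Try a large request first; on failure, halve the limit and retry
    until a request succeeds or the limit drops below a floor."""
    while limit >= min_limit:
        try:
            return request_fn(limit)
        except ServerError:
            limit //= 2  # smaller requests are less likely to time out
    raise RuntimeError("even the smallest request failed")

def flaky(limit):
    """Fake endpoint: fails on big limits, succeeds on small ones."""
    if limit > 1000:
        raise ServerError("monday.com is having some technical issues")
    return [f"item {i}" for i in range(limit)]
```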
Paging through responses
We don’t currently expose the number of pages when you’re paginating, but I’ve passed that as feedback.
Thank you, a pagination count would be great. I'll give your solution a go, thanks! As for the OP's question: the "technical issues" response returns HTML instead of an error code. Hopefully that can be fixed as well.