Issues with limit using Python API query

Hello Community!

I am having a lot of issues exporting information from monday. I have a general Python script where I dynamically create queries (editing the limit and filters, and identifying whether pagination is needed). Most of the time I need to pull all the information from boards, but as the boards have grown, the requests have been failing more often. The only fix so far has been to lower the limit from 500 to a smaller value: 400 for some boards, 300 for others, and for some I have had to go below 50 (which takes an eternity).

Given all this context, is there a way to make better use of the requests library in Python so the calls don't crash? I am wondering whether, between the API and the request, there is a way to allow more time to respond so I don't have to go below the 500 limit; that would greatly reduce run time, as lower limits (especially on subitem queries) can take hours.

Thank you!

I can share the query or the code I am using, but the code doesn't do anything out of the ordinary; the issue seems to come from the requests themselves.

It sounds like the issue is related to how monday.com's API handles large requests, especially as your boards grow. Lowering the limit helps avoid crashes, but it also slows the process down significantly. Here are a few things you can try to improve performance without dropping the limit too much:

  1. Increase Timeout and Retry Failed Requests
  • Use the timeout parameter in the requests library to give the API more time to respond.
  • Implement a retry mechanism with exponential backoff to handle temporary failures. Example:
import requests
import time

def make_request(query, retries=3, backoff=2):
    """POST a GraphQL query to monday.com, retrying on transient failures."""
    url = "https://api.monday.com/v2"
    headers = {"Authorization": "your_api_key", "Content-Type": "application/json"}
    for attempt in range(retries):
        try:
            # timeout=30 gives the API up to 30 seconds to respond
            response = requests.post(url, json={"query": query}, headers=headers, timeout=30)
            response.raise_for_status()
            return response.json()
        except requests.exceptions.RequestException:
            if attempt < retries - 1:
                time.sleep(backoff ** attempt)  # exponential backoff: 1s, 2s, 4s, ...
            else:
                raise
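A quick smoke test once your token is in place (the query here just grabs one board's id and name):

result = make_request("query { boards (limit: 1) { id name } }")
print(result)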
  2. Use Cursor-Based Pagination Instead of Fixed Limits
  • If you aren’t already using cursor pagination, switch to it so you can pull data in chunks dynamically rather than setting a hard limit.
  • Example query (request the cursor field back so you can ask for the next page; a full loop is sketched after it):
query {
  boards (ids: 1234567890) {
    items_page (limit: 500, cursor: "your_cursor_value") {
      cursor    # null when there are no more pages
      items {
        id
        name
      }
    }
  }
}
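If it helps, here’s a minimal sketch of the full loop in Python, reusing the make_request helper from point 1. It assumes the items_page / next_items_page pagination pattern (worth double-checking against your API version): the first page comes from the board’s items_page, and every following page is fetched with next_items_page plus the cursor from the previous response.

def fetch_all_items(board_id, page_size=500):
    """Pull every item from a board by following the pagination cursor."""
    items, cursor = [], None
    while True:
        if cursor is None:
            # First page comes from the board's items_page
            query = f"""query {{
              boards (ids: {board_id}) {{
                items_page (limit: {page_size}) {{
                  cursor
                  items {{ id name }}
                }}
              }}
            }}"""
            data = make_request(query)
            page = data["data"]["boards"][0]["items_page"]
        else:
            # Subsequent pages use next_items_page with the previous cursor
            query = f"""query {{
              next_items_page (limit: {page_size}, cursor: "{cursor}") {{
                cursor
                items {{ id name }}
              }}
            }}"""
            data = make_request(query)
            page = data["data"]["next_items_page"]
        items.extend(page["items"])
        cursor = page["cursor"]
        if not cursor:  # a null cursor means there are no more pages
            break
    return items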
  3. Optimize Queries to Fetch Only Necessary Data
  • Reduce the fields you are querying, especially for subitems. Requesting large nested data sets can cause failures.
  • Try breaking large requests into multiple smaller ones and only fetching the required fields, as in the sketch below.
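For instance, instead of one giant nested query, you could reuse fetch_all_items from above to get only ids and names, then pull subitems in small batches; the batch size of 25 and the exact shape of the subitems field here are just illustrative:

# Step 1: light query, ids and names only (uses the pagination loop above)
items = fetch_all_items(1234567890)

# Step 2: fetch subitems in small batches instead of one deeply nested request
for start in range(0, len(items), 25):
    batch = items[start:start + 25]
    id_list = ", ".join(str(item["id"]) for item in batch)
    subitem_query = f"query {{ items (ids: [{id_list}]) {{ subitems {{ id name }} }} }}"
    result = make_request(subitem_query)
    # ...store result["data"]["items"] wherever you collect your export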
  4. Check API Rate Limits
  • If you’re making too many requests in a short time, you might be hitting monday.com’s rate limits.
  • Use logging to track when requests fail and see if rate limits might be the cause, e.g. as sketched below.
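As a rough sketch, you could extend make_request to log failures and back off when the server signals throttling. Treating HTTP 429 and the Retry-After header as the rate-limit signal is an assumption on my side; monday.com may also report limits as errors inside the GraphQL response body, so check what your failing responses actually contain:

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("monday_export")

def make_request_logged(query, retries=3, backoff=2):
    url = "https://api.monday.com/v2"
    headers = {"Authorization": "your_api_key", "Content-Type": "application/json"}
    for attempt in range(retries):
        response = requests.post(url, json={"query": query}, headers=headers, timeout=30)
        if response.status_code == 429:
            # Assumed rate-limit signal; fall back to exponential backoff if no Retry-After
            wait = int(response.headers.get("Retry-After", backoff ** attempt))
            log.warning("Rate limited on attempt %d, sleeping %ss", attempt + 1, wait)
            time.sleep(wait)
            continue
        if not response.ok:
            log.error("Request failed: %s %s", response.status_code, response.text[:200])
        response.raise_for_status()
        return response.json()
    raise RuntimeError("Gave up after repeated rate-limit responses")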
  5. Parallelize Requests
  • If you’re fetching multiple boards, try using multithreading or asynchronous requests to process them in parallel instead of sequentially; see the sketch below.
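A minimal thread-pool sketch, assuming the fetch_all_items helper from point 2 and a few hypothetical board ids; keep max_workers small, or the parallel calls will just trip the rate limits from point 4:

from concurrent.futures import ThreadPoolExecutor, as_completed

board_ids = [111, 222, 333]  # hypothetical board ids

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(fetch_all_items, board_id): board_id for board_id in board_ids}
    for future in as_completed(futures):
        board_id = futures[future]
        print(f"Board {board_id}: {len(future.result())} items")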

Curious to see if any of that works. Hope this helps!
Trevor

Hello Trevor, this is actually more than I was expecting. We will do some trials and let you know how it went.

Thank you!
