Getting Board relation column value as count

Hi,

I have a board with a "relations" column whose value is an array of item ids.
I want to set the value of another column, "relation_count", to the number of ids in the "relations" value.

relations is changed -> set relation_count

I wonder if and how I can set up an automation of that nature.

What type is the relations column? Board relation?

{
  items (ids: [123124123]) {
    column_values(ids: "connect_boards") {
      ... on BoardRelationValue {
        linked_item_ids
      }
    }
  }
}

const relationCount = res.data.items[0].column_values[0].linked_item_ids.length;

That should give you the number of connected items.
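If you're calling the API from a script, here's a minimal Node sketch tying the query and the count together. The endpoint is the real monday.com v2 URL, but the `MONDAY_TOKEN` environment variable, the item id, and the column id `connect_boards` are assumptions you'd adapt to your board:

```javascript
// Count linked items from a monday.com API response shaped like the
// query above: items -> column_values -> BoardRelationValue.linked_item_ids
function countLinkedItems(res) {
  const ids = res.data.items[0].column_values[0].linked_item_ids;
  return Array.isArray(ids) ? ids.length : 0;
}

// Fetching via the monday.com v2 API (Node 18+ has global fetch).
// MONDAY_TOKEN and the column id are placeholders for your own values.
async function fetchRelationCount(itemId) {
  const query = `{
    items (ids: [${itemId}]) {
      column_values(ids: "connect_boards") {
        ... on BoardRelationValue {
          linked_item_ids
        }
      }
    }
  }`;
  const resp = await fetch("https://api.monday.com/v2", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: process.env.MONDAY_TOKEN,
    },
    body: JSON.stringify({ query }),
  });
  return countLinkedItems(await resp.json());
}
```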

Or were you thinking of something like a formula or automation on the board itself? Sorry, this being in the Apps & Developers section inclines answers toward programming solutions.

Hi @codyfrisch, thank you for the prompt response.

Via GraphQL there are indeed many ways to obtain the count of said array of linked items.

I was thinking more along the lines of an automation:

When the value of the board relation column changes, set another column's value to the length of the linked items array.

Meanwhile I made an on-item-update webhook that uses an external service to do so.
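For anyone following along, a webhook-backed version might look like the sketch below. The challenge echo is real monday.com webhook behavior, and `change_simple_column_value` is a real mutation, but the event field names (`linkedPulseIds`, `boardId`, `pulseId`) and the `relation_count` column id are assumptions to verify against your own payloads:

```javascript
// Build a change_simple_column_value mutation that writes the count
// into a (hypothetical) "relation_count" text/number column.
function buildCountMutation(boardId, itemId, count) {
  return `mutation {
    change_simple_column_value(
      board_id: ${boardId},
      item_id: ${itemId},
      column_id: "relation_count",
      value: "${count}"
    ) { id }
  }`;
}

// Lambda-style webhook handler sketch. Payload field names are
// assumptions -- inspect a real event before relying on them.
const handler = async (event) => {
  const body = JSON.parse(event.body);
  // monday sends a challenge on webhook registration and expects it echoed
  if (body.challenge) return { statusCode: 200, body: JSON.stringify(body) };

  const ev = body.event;
  const linked = ev?.value?.linkedPulseIds ?? []; // assumed field name
  const mutation = buildCountMutation(ev.boardId, ev.pulseId, linked.length);
  // ...POST `mutation` to https://api.monday.com/v2 with your token...
  return { statusCode: 200, body: "{}" };
};
```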

Yeah, there is no way to get the number directly except maybe a formula column, or an existing third-party app (but you have the skills, why pay!). But no direct automation exists (yet).

Hehe I am up to my elbows in lambdas atm :stuck_out_tongue:

Can you please elaborate on formula columns? Sounds interesting.
I would love any monday-native solutions where possible.

Formula columns are a column type available only in Pro and Enterprise accounts.

You can create a formula in their settings and it will calculate a value from other columns, much like Excel syntax (but not the same). The board relation column has an operator for the count of connected items.

{Relation Column#Count} as the formula will return the count of connected items! So exactly what you need…

HOWEVER! Formula columns are also quite limited. The values are calculated client side by the user's browser, and they are display only. You can't access them from the API, you can't trigger automations off them changing, and you can't use their values in emails, notifications, or updates. You can only see the value in the formula column on the board.

But if that is all you need, and you have paid for pro/enterprise, then you can use them. Just add a column to the board and choose “more columns” and search “formula” to add one.

Amazing feature.
If it had API access at least, it would save me a lambda :stuck_out_tongue:

Tyvm!

Are you using API Gateway, or just function URLs (direct or through CloudFront)? Now I'm just being nosey.

Hehe all good.

I believe in knowledge sharing!

I use function URLs.

This is what one of the bigger lambdas looks like.
The tests are mostly OK; one is just a hacky integration test I need to better define.

While on the subject,
I came across some rumour one can define webhook behaviour.

Say, for a bulk move of items to a group, I want to ensure the webhook triggers for each item sequentially rather than for all of them at once.

Do you know anything about this?

I do not know anything about the behavior you discussed. First I’ve heard of such a thing - though it might be nice.

I’m using AWS CDK to build and deploy, since it lets me also define queues, databases, etc. through code.

Sure feel free to DM anytime!

I think a queue would be the best choice for bulk updates indeed.

By the way, what you're thinking of for ordered operations is likely the workflow builder, which is in beta, but today it doesn't support external integrations.

SQS FIFO queues work pretty well I’ve found, since you can return one to the queue with a visibility timeout set to when the API complexity resets. That way your lambda that is processing the queue can exit and just retrigger once the visibility timeout expires (and API complexity limits reset.)
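The timeout arithmetic for that trick can be sketched in a few lines. The helper below is illustrative (not the SDK API itself); the commented-out part shows how it would plug into the real AWS SDK v3 `ChangeMessageVisibilityCommand`:

```javascript
// Given the epoch time (seconds) when the API complexity window resets,
// compute an SQS visibility timeout so the message reappears just after
// the reset. Clamped to SQS's allowed range of 0..43200 seconds.
function visibilityTimeoutUntil(resetEpochSeconds, nowEpochSeconds) {
  const delta = Math.ceil(resetEpochSeconds - nowEpochSeconds);
  return Math.min(Math.max(delta + 1, 0), 43200);
}

// With the AWS SDK v3 you would then return the message to the queue:
// import { SQSClient, ChangeMessageVisibilityCommand } from "@aws-sdk/client-sqs";
// await sqs.send(new ChangeMessageVisibilityCommand({
//   QueueUrl: queueUrl,
//   ReceiptHandle: msg.ReceiptHandle,
//   VisibilityTimeout: visibilityTimeoutUntil(reset, Date.now() / 1000),
// }));
```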

To be honest, I will probably only decide on that if the bulk operations start to malfunction… at this time they do not seem to fail.

Any queue where things run in sequence rather than concurrently will do well.
I'm used to Kafka, where such setups are very simple to define.

If you use CDK, you can define a queue for something, and trigger a lambda with it in a few lines. Very easy - once you get CDK. Then you just have to grant another lambda (the generator) the ability to send to the queue, and in that lambda you can just publish to it.
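For a concrete picture of "a few lines", here is a CDK sketch (JavaScript, aws-cdk-lib v2). The construct names, runtime, and asset paths are placeholders, but the wiring is the standard pattern: a FIFO queue drives a consumer Lambda via an event source, and a separate producer Lambda is granted send rights:

```javascript
// CDK v2 stack sketch: FIFO queue -> consumer Lambda, plus a producer
// Lambda allowed to publish to the queue. Names/paths are placeholders.
const cdk = require("aws-cdk-lib");
const sqs = require("aws-cdk-lib/aws-sqs");
const lambda = require("aws-cdk-lib/aws-lambda");
const { SqsEventSource } = require("aws-cdk-lib/aws-lambda-event-sources");

class MondayStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    // FIFO gives per-message-group ordering; CDK names it *.fifo for you
    const queue = new sqs.Queue(this, "UpdatesQueue", {
      fifo: true,
      contentBasedDeduplication: true,
    });

    const consumer = new lambda.Function(this, "Consumer", {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: "consumer.handler",
      code: lambda.Code.fromAsset("lambda"),
    });
    // batchSize: 1 keeps processing strictly one message at a time
    consumer.addEventSource(new SqsEventSource(queue, { batchSize: 1 }));

    const producer = new lambda.Function(this, "Producer", {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: "producer.handler",
      code: lambda.Code.fromAsset("lambda"),
    });
    queue.grantSendMessages(producer);
  }
}
```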

How are you handling your API complexity limits with Lambda today? One of the problems with complexity limits and bulk operations is that you consume the complexity very fast, and then your lambda has to pause execution for a minute. The catch is that the webhook expects a response within 1 minute or it terminates the request, which should abort the lambda. That often means the complexity resets only after the request has aborted, and if you don't have a queue, you lose the request.

When you have single requests that each do a single operation, a failure isn't a big deal: it will trigger a retry if you're using the apps framework (it will not with webhook integrations). You can also check complexity before starting execution, and if there isn't going to be enough to complete the request, return a 429 and it will retry in one minute (again, only with the apps framework, not webhooks).
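The check-before-execute idea reduces to a small decision function. The function name and the estimated-cost parameter are illustrative; the remaining budget itself can be read from the API by adding a `complexity` block to any query:

```javascript
// Decide whether to run a request now or signal a retry, given the
// complexity left in the current window and the estimated cost of the
// request. The 429 path relies on the apps-framework retry behavior.
function planRequest(remainingComplexity, estimatedCost) {
  if (remainingComplexity >= estimatedCost) {
    return { proceed: true };
  }
  // Not enough budget: return 429 so the platform retries in a minute.
  return { proceed: false, statusCode: 429 };
}

// The remaining budget can be fetched alongside any query, e.g.:
//   query { complexity { after reset_in_x_seconds } ... }
```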

Yes, currently looking into SQS to work around the concurrent-by-default nature of AWS Lambdas.

This is indeed an issue that I am now facing.
As a first phase I am optimizing my queries overall to reduce complexity.
I am also considering getting a few keys instead of just the one.

The complexity limit is not per key; it is per app+account and is shared across all users of the same account. One key or forty, you'll have the same limit. If you are using user API keys, those share one quota for the whole account, across all users.

I see,
Thanks for the clarification!

I’ll factor that in.
