I have found a big issue with the “Updated Opportunity in Copper” Zapier link, and the problem seems to be on Copper’s side rather than Zapier’s. I suspect this applies to all the “Updated” zap hooks too.
When editing an opportunity record, each time a field is edited this counts as an “update”. This means editing even 2-3 fields can trigger half a dozen “update” events, and going through updating a whole record could mean 10-20 separate “updates” all within a few seconds of each other. I believe this is true for all content types.
That doesn’t sound like an issue until you get to Zapier and realise that every single one of these tiny changes is triggering a Zapier hook and burning through another bunch of tasks. I estimated that one Zap to update a Google sheet when an opportunity is updated could burn through our entire Zapier task allocation in just a few days from pretty light usage because Copper is triggering zap tasks for absolutely everything.
So is there a way to change this so that “update” triggers are only sent every 5 minutes? That way we can complete all the changes to all the fields and then Zapier will get the whole lot in one go?
(Zapier can’t do anything about this because the problem is being caused by Copper sending so many update actions, not their processing of them, and setting up filters would still burn through our task allocation. Adding a tag also won’t help because that would stop the records syncing after the first time, whereas we want to be able to update records on an ongoing basis.)
The webhook is triggered for each update in Copper. You can’t change that. What you can do is optimize the process on the Zapier side, based on what your downstream processing actually needs.
If you need your updates only to be processed once per hour you can build 2 zaps:
1: Trigger update from Copper → Store the selected ID (or all fields you need) in Airtable or gSheet.
2: Schedule once per hour → Filter step to check whether there are updates to process (zero further steps if there is nothing) → Do whatever is needed for each unique Copper ID → Mark the rows as processed
Now you have reduced the overhead from running a full Zap on every field update to one cheap storage step per update, plus a single scheduled run per hour.
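The two-zap pattern above can be sketched in a few lines. This is just an illustration with Python stand-ins for the Zapier steps: `store_event` is what zap 1 does on every Copper webhook, and `process_queue` is the hourly zap that deduplicates by Copper ID and syncs each record once.

```python
def store_event(queue, event):
    """Zap 1: append the raw Copper webhook payload to the queue (Airtable/Sheet row)."""
    queue.append(event)

def process_queue(queue, sync_record):
    """Zap 2 (scheduled): sync each unique Copper ID once; the latest event wins."""
    latest = {}
    for event in queue:
        latest[event["id"]] = event   # later events overwrite earlier ones
    for event in latest.values():
        sync_record(event)            # e.g. update the Google Sheet row
    queue.clear()                     # "mark as processed"
    return len(latest)
```

Here a storm of per-field webhooks costs only one cheap storage step each, and the expensive sync runs at most once per record per scheduled window.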
Please let me know if this approach makes sense to you.
If the cost for Zapier is an issue for you, you can also consider moving to Make.com.
This is by design; it's how API webhooks work across most CRMs.
What is it exactly that you are looking to achieve? Passing what data specifically through to Google Sheets?
2 ways to handle this:
1) What we'd usually recommend (you mentioned this wouldn't work in your scenario, but I'm adding it here for others to see): if it's more of a one-time sync, then at the end of the automation, add a unique tag back to the entity within Copper (e.g. "XYZ software - data synced"), and then filter out the automation if the record already contains the tag.
2) There are also more complex but more flexible ways to achieve this. If the important data you want to sync really is just a specific field or two, you can store the prior value of that field (you can literally do this as a hidden field within Copper if you don't have too many fields or too much going on), and then only continue if the prior value doesn't match the current value, since a mismatch means the record is out of sync. Then, at the end of the automation, update that hidden field to the new value. As easy as that 🚀
There are also ways to store last-synced times within Google Sheets and only continue if more than x minutes have passed, but even with that, something like a bulk update is going to trigger a lot of tasks. My recommendation #2 should use 0 tasks unless it's actually updating.
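A minimal sketch of option #2's logic, assuming placeholder field names (`close_date` and the hidden `last_synced_close_date` are hypothetical; substitute whatever field you actually care about):

```python
def should_sync(record, field="close_date", shadow="last_synced_close_date"):
    """Early filter step: continue only if the watched field differs from its shadow copy."""
    return record.get(field) != record.get(shadow)

def finish_sync(record, field="close_date", shadow="last_synced_close_date"):
    """Last step of the automation: write the new value back to the hidden shadow field."""
    record[shadow] = record[field]
```

Each field edit still fires a webhook, but the automation bails out at the filter unless the watched value has genuinely changed, and after a successful sync the shadow field catches up so the next identical update is skipped.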
Anyway, all I'm saying is there's a whole myriad of ways to fix this, there's a lot involved with automation/integration design (that's why there's a whole industry with people like us who specialize in exactly the scenario you're describing above 😁).
There's absolutely a way to accomplish what you're describing; it's just more of a custom build. I don't blame you if you can't get it working, though. It has taken me probably 6+ years of working exclusively with APIs to build something custom like this (specifically, building to cut down on how often a sync is triggered); it's a complex feature.
You can also use a different trigger (like on stage change) and just sync the data when it hits various stages in the process, versus every single update of any field.
Hope this helps!
Thanks Alex. What we’re trying to achieve is to update details about e.g. events or training courses in a pipeline, sync those into our calendar (so the attendees have easy access to the information), and sync them to a Google sheet (where we track invoicing and end-of-year reporting). The one-time sync won’t work for us.
Yes, we could trigger an update when a particular field changes, or perhaps when a “sync” checkbox is checked, but as you say that will still use two Zapier tasks to reach the Zapier filter stage. We could use the stage change as the trigger, but these updates are not usually related to the stage changing, so it’s not ideal. Both of these feel like hacks rather than a stable solution.
Could you give me a bit more on how this hidden field works? I’m not sure I understand what you’re proposing. Also there’s a possibility of using workflows, so could you confirm whether they are subject to the same issue or if there is a delay on them? For example if I set up a workflow to update a field when the record is updated, and then link Zapier to that field, would that generate fewer zap calls?
You say this is by design and how most CRMs work, but I’m not sure I agree. Most CRMs let you edit a record and then save it, so an update is only triggered when the record is saved. Copper saves on each field submission in the browser, which means updates are triggered every time a field is changed: one logical update generates multiple update hooks. Wouldn’t it be a lot simpler to queue those “update” flags and only emit one “updated” event per record every 1-2 minutes? That way the webhooks wouldn’t be flooded with dozens of tiny updates as each field is edited; it would just send one because the record has been updated. I can’t see any utility in sending dozens of update notifications to the webhook when all the updates are seconds apart on a single record.
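For what it's worth, the queueing behaviour I'm imagining could be sketched like this. This is purely hypothetical server-side debouncing, not anything Copper actually exposes: each field edit just records the affected ID, and one combined webhook per record fires once the window elapses.

```python
import time

class UpdateDebouncer:
    """Hypothetical sketch: coalesce per-field edits into one webhook per record per window."""

    def __init__(self, send_webhook, window=60.0, clock=time.monotonic):
        self.send_webhook = send_webhook  # called once per record per flush
        self.window = window              # seconds to wait before emitting
        self.clock = clock
        self.pending = {}                 # record_id -> time of first edit

    def record_updated(self, record_id):
        """Called on every field edit; only the first edit starts the timer."""
        self.pending.setdefault(record_id, self.clock())

    def flush(self):
        """Send one combined 'updated' event for each record whose window has expired."""
        now = self.clock()
        due = [rid for rid, t in self.pending.items() if now - t >= self.window]
        for rid in due:
            del self.pending[rid]
            self.send_webhook(rid)
        return due
```

With this, ten field edits to one record within a minute would produce a single webhook call instead of ten.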
And thanks Jaco too, good shout about Make.com (we’ve used IFTTT too in the past), perhaps we can route around this by using a system that doesn’t mind so many webhook calls. But it still seems like a huge burden on the system to be generating that many webhook calls, for Copper as well as Zapier/Make.