Feature request: Limit fields returned from API calls
Posted Jan 03, 2022
Currently, when you request ticket data, the API returns every field for each ticket. So if I ask for data about 100 tickets, I could get back 100,000 lines of JSON because the response includes all the custom_fields (which can be numerous per ticket). That is a huge amount of data to transfer (i.e. useless network traffic) for 100 tickets when all I want is each ticket's status and priority.
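In the meantime, the only option is to trim the payload client-side after it arrives. A minimal sketch of that workaround (the `tickets` sample below is illustrative; `id`, `status`, `priority`, and `custom_fields` are standard ticket fields):

```python
def project_tickets(tickets, fields=("id", "status", "priority")):
    """Keep only the requested fields from each ticket dict,
    dropping everything else (including custom_fields)."""
    return [{f: t.get(f) for f in fields} for t in tickets]

# Illustrative payload: each ticket carries a large custom_fields array.
tickets = [
    {"id": 1, "status": "open", "priority": "high",
     "custom_fields": [{"id": 900, "value": "x"}] * 50},
    {"id": 2, "status": "solved", "priority": "low",
     "custom_fields": [{"id": 901, "value": "y"}] * 50},
]

slim = project_tickets(tickets)
# slim -> [{'id': 1, 'status': 'open', 'priority': 'high'},
#          {'id': 2, 'status': 'solved', 'priority': 'low'}]
```

This cuts memory and downstream processing, but of course the full payload still crosses the network, which is exactly what this request is about.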
I was advised by customer support that such a feature does not exist - could we make this happen?
1
7
7 comments
Eric Nelson
I'd be interested in hearing more about your use case. Is the network traffic the main concern?
Have a wonderful day!
Eric Nelson
Senior Manager, Developer Advocacy
-1
Jonathan Farache
Yes Eric Nelson, I was thinking of implementing a process that runs every 5 minutes and validates that the tickets in my DB from the last hour are synced with the tickets in ZD. That could easily be hundreds of tickets every 5 minutes,
but with this amount of network traffic it seems unfeasible.
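The validation step of that job could look something like this sketch, which compares the locally stored status/priority against what Zendesk returned and reports the ids that drifted (data shapes here are assumptions, not an actual Zendesk client):

```python
def find_out_of_sync(db_rows, zd_tickets, keys=("status", "priority")):
    """Return ids of tickets whose tracked fields differ between
    the local DB and the tickets fetched from Zendesk."""
    zd_by_id = {t["id"]: t for t in zd_tickets}
    stale = []
    for row in db_rows:
        remote = zd_by_id.get(row["id"])
        # Missing remotely, or any tracked field differs -> needs re-sync.
        if remote is None or any(row.get(k) != remote.get(k) for k in keys):
            stale.append(row["id"])
    return stale

# Illustrative data: ticket 2 has a stale status, ticket 3 wasn't returned.
db = [{"id": 1, "status": "open", "priority": "high"},
      {"id": 2, "status": "open", "priority": "low"},
      {"id": 3, "status": "solved", "priority": "low"}]
zd = [{"id": 1, "status": "open", "priority": "high"},
      {"id": 2, "status": "solved", "priority": "low"}]

stale = find_out_of_sync(db, zd)
# stale -> [2, 3]
```

The comparison itself is cheap; the cost Jonathan is describing is fetching the full `zd_tickets` payloads to feed it.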
0
Eric Nelson
Hey Jonathan Farache

It sounds like all you really care about is making sure that you have parity between your db record and Zendesk. Instead of running this as a job you could instead have it be a triggered event from Zendesk to a service to sync with your db via a webhook.
For example, you create a webhook that sends a POST request with the relevant information to your service only when a ticket's status or priority has been updated (though the conditions can be whatever you'd like), and your service then updates your database. This would trim the network traffic significantly.
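The receiving side of that webhook can be very small. A sketch, assuming you configure the webhook's JSON body to carry the ticket id, status, and priority (the payload shape and table schema below are assumptions for illustration; SQLite stands in for your real DB):

```python
import sqlite3

def handle_ticket_event(conn, payload):
    """Apply a ticket update pushed by the webhook to the local DB.

    Assumed payload shape: {"id": ..., "status": ..., "priority": ...},
    which you would define in the webhook's body template.
    """
    conn.execute(
        "UPDATE tickets SET status = ?, priority = ? WHERE id = ?",
        (payload["status"], payload["priority"], payload["id"]),
    )
    conn.commit()

# Demo with an in-memory database standing in for the production one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, status TEXT, priority TEXT)")
conn.execute("INSERT INTO tickets VALUES (42, 'open', 'low')")

handle_ticket_event(conn, {"id": 42, "status": "solved", "priority": "high"})
```

Only the changed tickets generate traffic, instead of polling full ticket payloads on a schedule.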
Let me know if this solution works for you. Have a wonderful day!
Eric Nelson
Senior Manager, Developer Advocacy
-1
Jonathan Farache
Thanks for the reply Eric Nelson
I should have clarified: we already have a webhook set up. It sends data to an AWS SQS queue, and from there our app grabs each message as it comes in. The problem is that sometimes AWS goes down, and we lose data until it's back up. In those instances, I need to re-sync my data.
You might say "well, why don't you connect the webhook directly to the app?" That's not possible for security reasons, and besides, the app itself might go down and we'd be in the same situation - hence the need to occasionally re-sync.
0
Eric Nelson
Thanks for the additional info! Glad to hear that you're already following this pattern, and I understand the need for a backup option that meets your needs. I'll sync up with product to pass along this feedback. Have you thought about paginating the request? Return only X records at a time, where X gives a response size you're comfortable with, then loop through the pages and handle the data as necessary. The total size of the job will be similar, but handling ten 100 KB requests is much easier than one 1 MB request.
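That pagination loop can be sketched generically. Here `fetch_page` is a hypothetical stand-in for the HTTP call to the tickets endpoint (e.g. passing `page` and `per_page` query parameters); a stub fetcher is used so the loop logic is testable without the network:

```python
def fetch_all_pages(fetch_page, per_page=100):
    """Yield records page by page until the API runs out of results.

    `fetch_page(page, per_page)` is a placeholder for your real HTTP
    call; it should return a (possibly empty) list of records.
    """
    page = 1
    while True:
        batch = fetch_page(page, per_page)
        if not batch:
            break
        yield from batch
        if len(batch) < per_page:  # short page means we're on the last one
            break
        page += 1

# Stub fetcher standing in for the real API: 250 fake tickets.
data = [{"id": i} for i in range(250)]

def fake_fetch(page, per_page):
    start = (page - 1) * per_page
    return data[start:start + per_page]

tickets = list(fetch_all_pages(fake_fetch))
# tickets contains all 250 records, fetched 100 at a time.
```

Each individual response stays small, though as noted the total bytes transferred are roughly the same, which is why field selection would still help.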
-1
Sudarshan Kadam
Eric Nelson Was this request for a way to specify the names of fields to be returned ever taken up? I am currently building a reporting dashboard, and my ETL processes would love it if the APIs returned only the few fields we need instead of everything at once. It would lighten the network traffic for sure, and would help simplify the data ingestion process.
0
Andrew
+1. When I use the search API, I get back hundreds of custom fields that are unrelated to my search query.
It would be nice to specify the field set to be returned.
Webhooks are OK, but we use downstream software that charges per Business Process Flow, so a webhook is too costly.
Also, sometimes we want to push data out and due to vendor restrictions we lack the ability to run filtering code on the receiving end.
0