
Dumping Ticket Export to a CSV

Adam
posted this on June 18, 2012 09:54

1. Introduction

Objective: Download the data from the ticket export endpoint to a CSV file. Many customers want to analyze their Zendesk data in a secondary program (other than GoodData), such as Excel.
Notes: While we develop and test our code we'll use the sample endpoint, which returns the same sample data every time. The sample endpoint is more forgiving and was built for exactly this purpose.

2. Setup

Let's get started with curlWrap.
  1. You can get the curlWrap function here:
  2. Put this in a new script file called csv.php
  3. We need to create a new file, of course. The code below stores our filename in a variable and creates a file handle.
    $myFile = "tickets.csv";
    $fh = fopen($myFile, 'w');
  4. We now have a file handle stored in the variable $fh. Whenever we do anything with our new CSV file we will refer to it through that variable.
  5. The ticket export endpoint requires a date/time stamp in a specific format: UTC epoch time. Without getting overly technical, this is a timestamp in seconds since January 1, 1970 UTC (the Unix epoch). To build it we'll use two functions.
    This sets the timezone to UTC:
    date_default_timezone_set('UTC');
    This tells our program how far back we want to go.
    $time = strtotime("-8 hours");
    You might be wondering what that means. The ticket export endpoint gives us all tickets that were touched between two points in time. The catch is that the ending point is always now (i.e. when we run the command), and the shortest increment we can go back is 5 minutes. In this case we are going to pull a list of tickets from the last 8 hours (a small runnable sketch of this timestamp setup follows the list).
  6. Now we pull our data from the endpoint:
    $decoded = curlWrap("/exports/tickets/sample.json?start_time=1332034771", null, "GET");
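To make the epoch timestamp concrete, here is a small stand-alone sketch of that setup (the non-sample endpoint path in the comment is an assumption; while developing we stick with the sample endpoint above):

    <?php
    // Stand-alone sketch of the timestamp setup from step 5
    date_default_timezone_set('UTC');         // work in UTC
    $time = strtotime("-8 hours");            // epoch seconds, 8 hours ago

    echo $time . "\n";                        // a large integer: seconds since Jan 1, 1970 UTC
    echo date("Y-m-d H:i:s", $time) . "\n";   // the same moment, human-readable, for sanity checking

    // $time is what you would append as start_time on the real endpoint,
    // e.g. /exports/tickets.json?start_time=$time (path is an assumption;
    // the article uses the sample endpoint with a fixed start_time while developing)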

3. Building the CSV

Let's build our CSV.
  1. Now we need to create the header of our CSV. The header contains all the column headings we want to use. To create it we'll copy the first row of data to a variable called $first, convert it to an array (it started out as an object; if this doesn't make sense, don't worry!), and finally pull out its keys, which will become our column headings.
    $first = $decoded->results[0];
    $first = (array) $first;
    $keys = array_keys($first);
  2. Now we have our column headings in the array called $keys. We need to write them to our CSV. We use the function fputcsv:
    fputcsv($fh, $keys, ",", "\"");
  3. As you can see, fputcsv takes 4 arguments: the file handle ($fh), the array of data to write (in this case, the keys), the delimiter (i.e. a comma), and the character used to enclose each field (you should stick with the quote mark, "). For the last parameter you'll need to "escape" the quote: simply put a backslash before the quote mark so PHP knows the quote character itself is being referenced.
  4. Finally, we iterate through each result we got from Zendesk and write it to the file we are working on. We do this like so:
    foreach($decoded->results as $result){
         $result = (array) $result;
         fputcsv($fh, $result, ",", "\"");
    }
  5. As you can see above, we need to convert each result (like we did before) from an object to an array. We do this the same way as before, by putting (array) in front of the variable. This is known as casting.
  6. The last thing we do is "close" the file we are working on. We do this with the command below (the complete script is sketched out after this list):
    fclose($fh);
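Putting the setup and CSV-building steps together, the whole csv.php might look something like the sketch below. It assumes the curlWrap() function lives in a file named curlWrap.php next to the script, and it keeps the sample endpoint with the fixed start_time used above:

    <?php
    // csv.php -- a sketch of the full script, combining the steps above
    require_once 'curlWrap.php';   // assumption: curlWrap() lives in this file

    // Create the CSV file and keep a handle to it
    $myFile = "tickets.csv";
    $fh = fopen($myFile, 'w');

    // Work in UTC and compute an epoch timestamp 8 hours back
    date_default_timezone_set('UTC');
    $time = strtotime("-8 hours");

    // Sample endpoint with the article's fixed start_time; swap in $time
    // when you move to the real export endpoint
    $decoded = curlWrap("/exports/tickets/sample.json?start_time=1332034771", null, "GET");

    // Header row: cast the first result to an array and use its keys
    $first = (array) $decoded->results[0];
    fputcsv($fh, array_keys($first), ",", "\"");

    // One CSV row per result, casting each object to an array first
    foreach ($decoded->results as $result) {
        fputcsv($fh, (array) $result, ",", "\"");
    }

    // Close the file when we're done writing
    fclose($fh);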

4. The Final Product

When you run the script you will get a CSV file containing the ticket data we were interested in. I have added a sample of the kind of file the process produces to the GitHub repository; it is called tickets.csv.
 

Comments

Lam Lnu

I just downloaded the file from GitHub, put it in my htdocs folder, and ran it. The return value $decoded is not an object and I got these errors:

Notice: Trying to get property of non-object in C:\xampp\htdocs\dev\csv.php on line 55

Notice: Trying to get property of non-object in C:\xampp\htdocs\dev\csv.php on line 59

Warning: Invalid argument supplied for foreach() in C:\xampp\htdocs\dev\csv.php on line 59

 

Could you help me with this error?

Thank you.

August 15, 2012 18:07
Adam
Zendesk

@Lam, did you replace the define() variables correctly? In addition, since you are using WAMP, did you enable cURL correctly as well?

August 15, 2012 21:29
Benoit
easyneo

Thanks a lot Adam, it is working fine! :)

October 10, 2012 01:34
Morgan A
Virgin Gaming

Hi Adam,

When running your script on a Windows machine using a Cygwin/WAMP server I get the same errors as Lam, even though I have cURL installed and enabled and JSON installed and enabled. When I run the same code on a Unix/Linux machine I get no errors, but I see that I'm not getting all of my tickets.

If I set the date back 2 weeks it gets a couple of days' worth of tickets (about 1000 lines); if I set it to a month I get the same number of lines back (1000 lines).

Is there any way of changing this to get all my tickets, as requested by $time = strtotime("-2 weeks");?

Also, is there any workaround for line 60, $decoded->results[0];, and line 64, foreach($decoded->results as $result)?

 

Thank you.
     

April 11, 2013 13:01
Adam
Zendesk

Hi Morgan,

 

Yeah, with regard to the first issue, I just wouldn't use Windows if you can help it, even as a development environment. As for the second issue, it's because I never built this to do paging. We page out 1000 records at a time. If you want to get the next 1000 records, you have to get the end time from the previous run and re-run the script 5 minutes later with that end time as the new start time.

 

I kind of built this to just do 1000 records at a time. It was also just written as a proof of concept. 

 

What's the problem with those lines? I'm not getting it. 

April 11, 2013 13:32
Morgan A
Virgin Gaming

Is it possible to do more than 1000 records at a time without having to wait 5 minutes? We deal with thousands of tickets daily and we use our own spreadsheet to view the data.

Or is there any way of getting the script to do paging?

Thank you.

April 12, 2013 05:13
Brandon K.
Zendesk

Hello Morgan,

The rate limit should not be 1000 tickets every 5 minutes; it should be 1000 every minute. Additionally, we only return the tickets that have changed within that time frame. The documentation states this at developer.zendesk.com. The endpoint is designed so people can poll the next_page endpoint every minute to get another 1000 tickets per minute without having to calculate the next start value. Some of our largest customers use this endpoint extensively without issue, so hopefully knowing how it is supposed to work will help you set up your workflow.
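(For anyone wiring this up, here is a rough sketch of that polling pattern. It assumes the curlWrap() helper from the article, that the non-sample path mirrors the sample one, and that each response carries a results array and a next_page URL:)

    <?php
    // Rough sketch: page through the export, roughly one request per minute
    require_once 'curlWrap.php';   // assumption: curlWrap() lives in this file

    date_default_timezone_set('UTC');
    $endpoint = "/exports/tickets.json?start_time=" . strtotime("-7 days");

    while (true) {
        $decoded = curlWrap($endpoint, null, "GET");

        if (empty($decoded->results)) {
            break;   // caught up; next_page/end_time may be absent on an empty response
        }

        foreach ($decoded->results as $result) {
            // ... write the row out, as in the article's foreach loop ...
        }

        // next_page comes back as a full URL; curlWrap() in the article takes a
        // path, so reduce it to path + query before reusing it (an assumption)
        $endpoint = parse_url($decoded->next_page, PHP_URL_PATH)
                  . "?" . parse_url($decoded->next_page, PHP_URL_QUERY);

        sleep(60);   // stay around one request per minute
    }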

May 07, 2013 13:15
Morgan A
Virgin Gaming

Thank you Brandon.

I got it working, but the only issue now is that it's pulling tickets by the time they were updated. Is it possible to dump the tickets by the created date field?

So if I'm dumping 7 days' worth of tickets it will get all tickets created from day 1 to 7.

Thank you once again for the help.

M.A

May 21, 2013 12:42
Brandon K.
Zendesk

Hello Morgan,

I checked with one of our engineers and unfortunately it is not possible to pull tickets by their created date. I'm very sorry.

May 24, 2013 15:13
Evelyn Yoder
activevideo

In Chrome, this page is formatted badly. See attached.

July 02, 2013 11:46
Laura D.
Zendesk

Hi Evelyn, 

Looks like something extra got pasted in the article - I fixed it, thanks for the heads up :)

July 02, 2013 12:09
Armin Vazehgoo

Could you provide an example of code that would replicate the repeated ticket export based on the end_time and next_page?

I am trying to dump all the tickets into a CSV, which is eventually imported into a MySQL DB; therefore, I would like to be able to retrieve all the tickets every time I run the export through the API.

July 09, 2013 13:36
Adam
Zendesk

Hi Armin,

 

Unfortunately this was written to be a starting point for most projects. I understand that it lacks a lot in terms of making it truly useful. The truth is, I'd rather provide a starting point and let others tailor the script to their needs/implementation than build something really specific and field various requests for modification this way and that.

 

That said, building something like this that pushes to a MySQL DB shouldn't be too complicated if you think of it like a service. Have a text file that gets "touched" each time the service kicks off. Use the timestamp on that file to determine when the system last ran (swept). Each time you run, read the start time from the "touched" file and use that as your start time, pulling until you reach a time that is less than or equal to NOW - 5 minutes.

 

Understood that the above is not a polished piece of code by any stretch, but hopefully it's a start.
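(A rough sketch of that touched-file approach, assuming the curlWrap() helper from the article, a hypothetical marker filename, the non-sample endpoint path, and an end_time field on the response that can be fed back in as the next start_time:)

    <?php
    // Rough sketch of the "touched file" sweep described above
    require_once 'curlWrap.php';   // assumption: curlWrap() lives in this file

    date_default_timezone_set('UTC');
    $marker = 'last_sweep.txt';    // hypothetical marker file

    // First run: no marker yet, so fall back to a recent window
    $start = file_exists($marker) ? filemtime($marker) : strtotime('-8 hours');

    // Keep sweeping until we are within 5 minutes of "now"
    while ($start <= time() - 300) {
        $decoded = curlWrap("/exports/tickets.json?start_time=" . $start, null, "GET");

        if (empty($decoded->results)) {
            break;
        }

        // ... write each result to the CSV or the MySQL table here ...

        $start = $decoded->end_time;   // the next sweep starts where this one ended
        sleep(60);                     // stay under the per-minute limit
    }

    // Record where we stopped so the next run picks up from here
    touch($marker, $start);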

July 09, 2013 13:52
Armin Vazehgoo

Thanks for the swift response Adam!

I understand this code is just a basic framework. I have already written the script that takes the data and puts it in the database; I just needed to figure out a way to make sure all the tickets were in the database and updated every time we ran the job.

Your method of using the text file sounds good. Will test it out!

July 09, 2013 13:57
Armin Vazehgoo

I read in a comment by Steven on this entry, https://support.zendesk.com/entries/230191-GET-Tickets-using-API, that it's possible to do an incremental ticket export. However, when I run this script, I do not see an end_time or next_page being returned. Is that no longer supported, or is it a different call?

July 16, 2013 12:19
Laura D.
Zendesk

Hi Armin, 

That should happen if, and only if, there are no tickets returned in the response. Are you seeing tickets in the response but no end_time or next_page?

July 19, 2013 16:35