[Before and after screenshots of the Google Sheet]
We'll be using the Twitter API to extract tweets from any username's profile. The main use case is monitoring your own profile's engagement as part of your daily reports. More precisely, we'll use the GET statuses/user_timeline endpoint (read more here).

To get started, make a copy of the above Google Sheet and open your script editor by going to "Tools" -> "Script editor".
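All of the snippets in the steps below are meant to live inside a single Apps Script function, so you can run (and later trigger) them as one unit. Here's a minimal skeleton, assuming you name that function getTweets (the name is my own choice, not part of the original sheet):

//Skeleton to hold the snippets from Steps 1-5
function getTweets() {
  //Step 1: read the API key, secret and username from the sheet
  //Step 2: generate a bearer token
  //Step 3: request the tweets
  //Step 4: collect the data into arrays
  //Step 5: write the data back to the sheet
}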

Step 1 - Import API key & API secret key

To generate your API keys, you either need to create a new Twitter app or use an existing one. Within the app's "Keys and tokens" tab, you can find the API key and API secret key; copy them into your sheet. The following code then reads those values from the sheet.

//Read the API key, API secret key and username from the sheet
var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
var ApiKey = sheet.getRange("B1").getValue();
var ApiSecret = sheet.getRange("B2").getValue();
var Username = sheet.getRange("B3").getValue();

Step 2 - Generate a bearer token

Twitter requires you to authenticate yourself in order to ensure data security. There are a few methods of authentication. In this tutorial we'll use the OAuth 2.0 Bearer Token, but you could opt for other methods. Read more about Twitter authentication methods here.

//Exchange the API key & secret (HTTP Basic auth) for an OAuth 2.0 bearer token
var ApiUrlAuthorize = "https://api.twitter.com/oauth2/token";
var token = Utilities.base64Encode(ApiKey + ":" + ApiSecret);
var options = {
  headers: {
    Authorization: "Basic " + token,
    "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8"
  },
  method: "post",
  payload: "grant_type=client_credentials"
};
var responseApiUrlAuthorize = UrlFetchApp.fetch(ApiUrlAuthorize, options);
var resultToken = JSON.parse(responseApiUrlAuthorize.getContentText());

Step 3 - Request tweets

With the bearer token in hand, we can now make a request to the Twitter statuses API. We'll parse the response as JSON so that we can work with the data later on.

//Request the user's timeline, passing the bearer token in the Authorization header
var ApiUrlTweetIds = "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=" + Username;
var parsedToken = resultToken.access_token;
var options = {
  headers: {
    Authorization: "Bearer " + parsedToken
  },
  method: "get"
};
var responseApiUrlTweetIds = UrlFetchApp.fetch(ApiUrlTweetIds, options);
var response = responseApiUrlTweetIds.getContentText();
var result = JSON.parse(response);
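If you want to sanity-check the response before moving on, you can log the first tweet. This quick check is my own addition, not part of the original script:

//Optional: confirm the API returned tweets before processing them
if (result.length > 0) {
  Logger.log(result[0].text + " | retweets: " + result[0].retweet_count);
}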

Step 4 - Add data into arrays

Now that we have all the data, we just need to group it into arrays so that we can insert it into the correct rows and columns. To do this we loop through the JSON data and push the desired status values (text, retweet_count and favorite_count) into the associated arrays.

//Define empty arrays to push data in
var text = [];
var retweet = [];
var favourite = [];
var retweetCount = 0;
var favouriteCount = 0;
var dateToday = Utilities.formatDate(new Date(), "GMT", "yyyy-MM-dd");

//Loop through results
for (var i = 0; i < result.length; i++) {

  //Push data into arrays
  text.push([result[i].text]);
  retweet.push([result[i].retweet_count]);
  favourite.push([result[i].favorite_count]);

  //Keep running totals for the daily summary
  retweetCount = retweetCount + result[i].retweet_count;
  favouriteCount = favouriteCount + result[i].favorite_count;
}

Step 5 - Insert data into the sheet

Now that all the data is sorted nicely into the correct arrays, all that's left is to specify the sheet ranges we want to insert it into.

//Insert new data (tweets)
sheet.getRange("A6:A"+(text.length + 5)).setValues(text); 
sheet.getRange("B6:B"+(retweet.length + 5)).setValues(retweet); 
sheet.getRange("C6:C"+(favourite.length + 5)).setValues(favourite); 
  
//Count existing values in column E
var rowNum = sheet.getRange("E1:E").getValues();
rowNum = rowNum.filter(String).length;

//Offset so the daily totals start at row 6
rowNum = rowNum + 5;

//Insert daily data (total numbers) 
sheet.getRange(rowNum,5,1,1).setValue(dateToday);
sheet.getRange(rowNum,6,1,1).setValue(retweetCount);
sheet.getRange(rowNum,7,1,1).setValue(favouriteCount);

Step 6 - Set up a daily trigger

If you'd like to refresh this data every day, you can create a project trigger by clicking the stopwatch icon inside the script editor. If you set it to run daily, the script will automatically append new data to columns E:G without overwriting the existing rows. Over time this builds up a handy little database for reporting, for instance by connecting the sheet to your Data Studio setup.
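If you'd rather set this up in code than through the UI, you can also create a time-driven trigger with ScriptApp. A minimal sketch, assuming the snippets above are wrapped in a function called getTweets (that name is my assumption):

//Run this once from the script editor to schedule getTweets daily
function createDailyTrigger() {
  ScriptApp.newTrigger("getTweets")
    .timeBased()
    .everyDays(1)
    .atHour(6) //roughly 6am in the script's time zone
    .create();
}

Note that running createDailyTrigger more than once creates duplicate triggers, so run it a single time (or manage your triggers from the UI afterwards).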


Let me know if this was useful & let's connect!