Download Twitter Data using JSON in R

Here we consider the task of downloading Twitter data using the R package RJSONIO.


Before we can download Twitter data, we’ll need to prove to Twitter that we are in fact authorized to do so. I refer the interested reader to the post Twitter OAuth FAQ for instructions on how to set up an application with dev.twitter.com. Once we’ve set up an application with Twitter, we can write some R code to communicate with Twitter about our application and get the data we want. Code from the post Authorize a Twitter Data request in R, specifically the keyValues() function, will be used in this post to handle our authentication needs when requesting data from Twitter.

## Install R packages
install.packages('bitops')
install.packages('digest')
install.packages('RCurl')
install.packages('ROAuth')
install.packages('RJSONIO')
install.packages('plyr')


## Load R packages
library('bitops')
library('digest')
library('RCurl')
library('ROAuth')
library('RJSONIO')
library('plyr')


## Set numeric print precision (so large values such as tweet IDs display in full)
options(digits=22)


## OAuth application values
oauth <- data.frame(consumerKey='YoUrCoNsUmErKeY',
                    consumerSecret='YoUrCoNsUmErSeCrEt',
                    accessToken='YoUrAcCeSsToKeN',
                    accessTokenSecret='YoUrAcCeSsToKeNsEcReT',
                    stringsAsFactors=FALSE) # Keep the credentials as character strings

keyValues <- function(httpmethod, baseurl, par1a, par1b){
  # Generate a random alphanumeric nonce
  string <- paste(sample(c(letters, 0:9), size=32, replace=TRUE), collapse='')
  string2 <- base64(string, encode=TRUE, mode='character') # Convert the string to base64
  nonce <- gsub('[^a-zA-Z0-9]', '', string2, perl=TRUE)    # Strip non-alphanumeric characters

  # Get the current GMT system time in seconds since the epoch
  timestamp <- as.character(floor(as.numeric(as.POSIXct(Sys.time(), tz='GMT'))))

  # Percent encode the request parameters
  par2a <- gsub(',', '%2C', par1a, perl=TRUE)
  par2b <- gsub(',', '%2C', par1b, perl=TRUE)

  # Assemble the parameter string with the key/value pairs ordered
  # alphabetically by key, then percent encode the whole string
  ps <- paste(par2a,
              'oauth_consumer_key=', oauth$consumerKey,
              '&oauth_nonce=', nonce[1],
              '&oauth_signature_method=HMAC-SHA1',
              '&oauth_timestamp=', timestamp,
              '&oauth_token=', oauth$accessToken,
              '&oauth_version=1.0',
              par2b, sep='')
  ps2 <- gsub('%', '%25', ps, perl=TRUE)
  ps3 <- gsub('&', '%26', ps2, perl=TRUE)
  ps4 <- gsub('=', '%3D', ps3, perl=TRUE)

  # Percent encode the base URL
  url2 <- gsub(':', '%3A', baseurl, perl=TRUE)
  url3 <- gsub('/', '%2F', url2, perl=TRUE)

  # Create the signature base string
  signBaseString <- paste(httpmethod, '&', url3, '&', ps4, sep='')

  # Create the signing key
  signKey <- paste(oauth$consumerSecret, '&', oauth$accessTokenSecret, sep='')

  # Compute the oauth_signature and percent encode it
  osign <- hmac(key=signKey, object=signBaseString, algo='sha1', serialize=FALSE, raw=TRUE)
  osign641 <- base64(osign, encode=TRUE, mode='character')
  osign642 <- gsub('/', '%2F', osign641, perl=TRUE)
  osign643 <- gsub('=', '%3D', osign642, perl=TRUE)
  osign644 <- gsub('[+]', '%2B', osign643, perl=TRUE)

  return(data.frame(hm=httpmethod, bu=baseurl, p=paste(par1a, par1b, sep=''),
                    nonce=nonce[1], timestamp=timestamp, osign=osign644[1],
                    stringsAsFactors=FALSE))
}

Next, we need to decide what kind of Twitter data we want to download. The Twitter REST API v1.1 Resources page provides a useful outline of the data we can get from Twitter; just read what is written under each Description section. As an example, let’s download some user tweets. To do this, we consult the specific resource on the REST API v1.1 page that corresponds to the action we want, here GET statuses/user_timeline. The resource page lists and describes the options available when requesting tweets from a specific user, so it is worth checking out.

Here we download the 100 most recent tweets (and retweets) posted by the user ‘Reuters’.

## Download user tweets
# The API returns at most 200 tweets per request
# Specify user name
user <- 'Reuters'
 
kv <- keyValues(httpmethod='GET',
                baseurl='https://api.twitter.com/1.1/statuses/user_timeline.json',
                par1a='count=100&include_rts=1&',
                par1b=paste('&screen_name=',user,sep=''))
 
theData1 <- fromJSON(getURL(paste(kv$bu,'?',
  'oauth_consumer_key=',oauth$consumerKey,
  '&oauth_nonce=',kv$nonce,
  '&oauth_signature=',kv$osign,
  '&oauth_signature_method=HMAC-SHA1',
  '&oauth_timestamp=',kv$timestamp,
  '&oauth_token=',oauth$accessToken,
  '&oauth_version=1.0','&',kv$p,sep='')))
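
If something is wrong with the request (expired credentials, a malformed signature, hitting a rate limit), Twitter sends back an errors element in place of tweets. A quick check along the following lines, a minimal sketch rather than part of the original recipe, can save some head scratching:

## Check whether Twitter returned an error instead of tweets
if('errors' %in% names(theData1)){
  print(theData1$errors) # Inspect Twitter's error message and code
}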

At this point you should have the 100 most recent tweets made by the user ‘Reuters’, along with the values of several variables Twitter records for each tweet. These are stored in a list, so you are free to use R’s list operations to explore what you have.

For instance, let’s see the tweets.

theData2 <- unlist(theData1) # Flatten the nested list into a named vector
names(theData2) # Inspect the available field names
tweets <- theData2[names(theData2)=='text'] # Keep only the tweet text
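
Since we loaded the plyr package earlier, we can also flatten the tweets into a data frame, one row per tweet. The following is a minimal sketch, assuming each element of theData1 carries the standard created_at, id_str, and text fields:

## Flatten tweets into a data frame, one row per tweet
tweetsDF <- ldply(theData1, function(x){
  data.frame(created=x$created_at, id=x$id_str, text=x$text, stringsAsFactors=FALSE)
})
head(tweetsDF)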

6 thoughts on “Download Twitter Data using JSON in R”

    • Good point. Resource URLs used to request data from Twitter should always reflect the current API version, so changing the code from ‘1’ to ‘1.1’ was the right way to go.

      As to the authentication error, I think I found the problem. The key/value pairs need to be ordered by the first letter of each key when generating the signature base string. I made a comment in the keyValues function about this but, as it turns out, failed to actually do it.

      Thanks for catching these issues! I updated the code to reflect these points. Let me know if it works.

      • Hi, I am able to use the code for the specific query, that is, fetching 100 tweets of the user ‘Reuters’.
        If I need a different query using the different operators provided in the Twitter search API, will I be able to reuse this code with some simple modifications? If so, can you provide details?
        I am a newbie to R!

      • Glad to hear the code worked for you! You are correct, different queries as outlined in the Twitter search API at times use different operators. Simple modifications to the code should work with many GET and POST resources; which resources are you most interested in using? For example, a search query might look something like the sketch below.
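
        This is only a sketch along the lines of the code above; ‘rstats’ stands in for whatever (percent-encoded) search term you want, and the parameters passed to keyValues must stay in alphabetical order by key.

        ## Search for recent tweets matching a query
        query <- 'rstats' # Stand-in search term
        kv <- keyValues(httpmethod='GET',
                        baseurl='https://api.twitter.com/1.1/search/tweets.json',
                        par1a='count=100&',
                        par1b=paste('&q=',query,sep=''))
        searchData <- fromJSON(getURL(paste(kv$bu,'?',
          'oauth_consumer_key=',oauth$consumerKey,
          '&oauth_nonce=',kv$nonce,
          '&oauth_signature=',kv$osign,
          '&oauth_signature_method=HMAC-SHA1',
          '&oauth_timestamp=',kv$timestamp,
          '&oauth_token=',oauth$accessToken,
          '&oauth_version=1.0','&',kv$p,sep='')))
        # The tweets themselves arrive under searchData$statuses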
