Experiment
Back in 2013, I was spending an unhealthy amount of time on Twitter, and at that time I was really into caricatures. Instead of checking every site I knew for new caricatures, I thought: wouldn't it be great if there were a Twitter account that tweeted new caricatures every day? There actually were such accounts. The problem was that they weren't relevant enough and shared the same caricatures over and over. Most of them were only active for a couple of months, because after that the account owners usually got bored and stopped.
So to have an account whose owner doesn't get bored quickly, I thought the task should be delegated to a bot.
Building
Because it was a weekend project, I didn't want it to cost an arm and a leg. I knew that if I wanted the bot to be a long-running show, it should require as little time and as few resources from me as possible.
I was already familiar with Google Apps Script and had used it several times. So I figured that if I could build the bot with Google Apps Script (GAS), I wouldn't be paying for servers, and as a bonus I would get to play with JavaScript on the backend.
Obviously, it has been a long time since I started building it, and I cannot recall all the details of the process. But I remember that GAS had built-in support for OAuth back then, so building a bot was relatively easy. There was also another GAS service called ScriptDB, which let scripts have a database and store data in it. For example, I used ScriptDB to keep track of the caricatures, to check whether the bot had already tweeted a picture.
But both of those services were deprecated, and I had to replace those parts along the way to keep the bot functioning. When ScriptDB was deprecated, I started using Parse.com. Do you remember good old Parse.com? The one Facebook bought and killed after a while.
After the fate of Parse.com, I gave up on third-party databases, and since then I've been using Google Sheets as the bot's database.
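For reference, here is a minimal sketch of what the Sheets-as-database approach can look like. The sheet name and column layout are my assumptions, not the bot's actual schema:

// A minimal sketch of using a Google Sheet as a "posted before?" database.
// The sheet name and columns are hypothetical, not the bot's real schema.
var SHEET_NAME = 'posted';

function isAlreadyPosted(pictureUrl) {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName(SHEET_NAME);
  var urls = sheet.getRange(1, 1, sheet.getLastRow() || 1, 1).getValues();
  return urls.some(function(row) { return row[0] === pictureUrl; });
}

function markAsPosted(pictureUrl) {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName(SHEET_NAME);
  sheet.appendRow([pictureUrl, new Date()]); // URL plus timestamp for future reference
}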
And nowadays, you have to use libraries to authenticate with external services like Twitter; the bot uses the OAuth1 library for Apps Script (apps-script-oauth1), shown in the first snippet below.
Since its launch, the bot has gone through a couple of iterations; right now it's on its 3rd version.
After the latest update, it also has a little web dashboard where you can post manually, check the times of scheduled events, or check the connection status of the social channels.
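I haven't published the dashboard code, but serving a small web page from the same script is straightforward with HtmlService. Here's a rough sketch under my own assumptions (the 'dashboard' HTML file and the status helper are hypothetical):

// A sketch of serving a dashboard page from the Apps Script project.
// Assumes a 'dashboard.html' file exists in the project; the status helper is hypothetical.
function doGet() {
  return HtmlService.createHtmlOutputFromFile('dashboard')
      .setTitle('Bot dashboard');
}

// Could be called from the dashboard page via google.script.run to show channel status.
function getConnectionStatus() {
  return {
    twitter: getTwitterService().hasAccess(),            // is the OAuth1 service authorized?
    scheduledTriggers: ScriptApp.getProjectTriggers().length // how many triggers are set up
  };
}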
This is how it works:
You can set up “triggers” on a Google Apps Script file. For example, I've set up a trigger that runs every hour and executes the GAS function that checks the websites to see whether any new caricatures have been posted. When it finds a picture, it first runs it through the database (Google Sheets) to check whether it's new. If so, it posts the picture to the social channels and writes it back to the database for future reference; a rough sketch of this flow follows below.
As you can see from the picture above, after figuring out how to integrate with Twitter, I also connected the bot to Facebook and Tumblr. Tweeting or posting text was easy, but posting images with GAS took some time to figure out. So here's the code snippet to save you time:
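Here is a sketch of that hourly flow. The ScriptApp trigger API is real, but fetchLatestCaricatureUrls() and postToChannels() are placeholders for the bot's actual scraping and posting code, and the Sheets helpers are the ones sketched earlier:

// One-time setup: run the check function every hour.
function createHourlyTrigger() {
  ScriptApp.newTrigger('checkForNewCaricatures')
      .timeBased()
      .everyHours(1)
      .create();
}

// Executed by the trigger: find new pictures, post them, record them.
function checkForNewCaricatures() {
  var candidates = fetchLatestCaricatureUrls(); // hypothetical: scrape the source sites
  candidates.forEach(function(url) {
    if (!isAlreadyPosted(url)) { // look it up in the Sheets "database"
      postToChannels(url);       // hypothetical: tweet / post to Facebook, Tumblr
      markAsPosted(url);         // write back for future reference
    }
  });
}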
function getTwitterService() {
  // Check https://github.com/gsuitedevs/apps-script-oauth1#usage
  // for the docs
  return OAuth1.createService('twitter')
      // Set the endpoint URLs.
      .setAccessTokenUrl('https://api.twitter.com/oauth/access_token')
      .setRequestTokenUrl('https://api.twitter.com/oauth/request_token')
      .setAuthorizationUrl('https://api.twitter.com/oauth/authorize')
      // Set the consumer key and secret.
      .setConsumerKey(TWITTER.CONSUMER_KEY)
      .setConsumerSecret(TWITTER.CONSUMER_SECRET)
      // Set the name of the callback function in the script referenced
      // above that should be invoked to complete the OAuth flow.
      .setCallbackFunction('authCallback')
      // Set the property store where authorized tokens should be persisted.
      .setPropertyStore(PropertiesService.getUserProperties());
}
function tweetPicFromURL(url) {
  try {
    var boundary = "cuthere";
    var picture = UrlFetchApp.fetch(url).getBlob().setContentTypeFromExtension();
    var status = "Test status";

    // Build the multipart/form-data body by hand: the status text part,
    // then the image part, then the closing boundary.
    var requestBody = Utilities.newBlob(
        "--" + boundary + "\r\n" +
        "Content-Disposition: form-data; name=\"status\"\r\n\r\n" + status + "\r\n" +
        "--" + boundary + "\r\n" +
        "Content-Disposition: form-data; name=\"media[]\"; filename=\"" + picture.getName() + "\"\r\n" +
        "Content-Type: " + picture.getContentType() + "\r\n\r\n").getBytes();
    requestBody = requestBody.concat(picture.getBytes());
    requestBody = requestBody.concat(Utilities.newBlob("\r\n--" + boundary + "--\r\n").getBytes());

    var options = {
      method: "post",
      contentType: "multipart/form-data; boundary=" + boundary,
      payload: requestBody
    };

    // Twitter API v1.1 endpoint for posting a status with media.
    var api = 'https://api.twitter.com/1.1/statuses/update_with_media.json';
    var twitterService = getTwitterService();
    var response = twitterService.fetch(api, options);
    Logger.log(response);
    return response;
  } catch (e) {
    Logger.log(e);
  }
}
Github Gist of the code snippet: https://gist.github.com/msadig/30dccada8104adc823336eaadce1cc6c
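The snippet references an authCallback function that completes the OAuth flow. Following the usage pattern from the apps-script-oauth1 docs linked in the snippet, it can look roughly like this (the messages shown to the user are placeholders):

// Invoked by the OAuth1 library after the user authorizes the app on Twitter.
function authCallback(request) {
  var service = getTwitterService();
  var isAuthorized = service.handleCallback(request);
  if (isAuthorized) {
    return HtmlService.createHtmlOutput('Success! You can close this tab.');
  }
  return HtmlService.createHtmlOutput('Denied. You can close this tab.');
}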
Conclusion
Honestly, up until this year’s (2018) Google I/O, I didn’t realize why this project appealed to me so much. But after watching one of the Google I/O ‘18 presentations, I realized that I like this architecture and project because it’s serverless. And the fact that I was using serverless before it was even a “thing” makes me love this project even more now.
So after 5 years and 11.3K caricatures posted, I’d like to think of this experiment as a successful one.
Here are the links to the working bots: