My final Reading and Writing Electronic Text project is, among other things, a Twitter bot that formulates and tweets generalizations, aphorisms, platitudes, and, occasionally, vaguely (and not so vaguely) offensive stereotyping statements. It is a “parody” of how our brains tend to overgeneralize everything all the time: having a bad day, for example, can magically turn into life being nothing but bad days in one’s overdramatic head.



The approach was to use a lexicon to randomly generate tweets that follow one of a few phrase templates. The body of text used is the wonderful Fifteen Thousand Useful Phrases by Grenville Kleiser, freely available from Project Gutenberg. The text is a simple listing of phrases, grouped by type, that are intended to be learned to enrich one’s conversation regardless of the context. (A cleaned-up copy of the book that I used in my Python script below can be found here.)



I wrote a Python script that parses all of the book’s contents and extracts noun phrases, verb phrases, adjective phrases, and other phrase types into a dictionary. The dictionary has phrase types (e.g. NP, VP, and ADJP) as keys, and a set of text segments as the value for each key.
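The dictionary-building step can be sketched as follows. The `classify` callable here is a stand-in for whatever tagger the original script used to assign a phrase type to each line; the toy heuristic below is purely illustrative.

```python
from collections import defaultdict

def build_phrase_dict(lines, classify):
    """Group text segments under phrase-type keys (NP, VP, ADJP, ...).

    `classify` is a placeholder for the real phrase-type tagger --
    any callable that maps a phrase to a type label works here.
    """
    phrases = defaultdict(set)
    for line in lines:
        line = line.strip()
        if not line:
            continue
        phrases[classify(line)].add(line)
    return dict(phrases)

# Toy classifier standing in for a real tagger (assumption, not the
# original project's logic):
def toy_classify(phrase):
    first = phrase.split()[0].lower()
    if first in ("a", "an", "the"):
        return "NP"
    if first.endswith("ly"):
        return "ADJP"  # e.g. "wholly commendable"
    return "VP"

sample = ["a babel of tongues", "wholly commendable", "accept a portion"]
d = build_phrase_dict(sample, toy_classify)
```

The resulting structure (`{"NP": {...}, "VP": {...}, ...}`) is what the later scripts consume as their main data variable.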

This dictionary was then used as the main data variable of another Python script that uses the text segments to generate a number of phrases on every run. The generated phrases follow one of several predefined templates, including Every [singular subject] is [always/never] [adverb] [adjective] and All/Most [plural subject] are [adverb] [adjective] | [adjective] [noun]. Each phrase type is generated by a function that takes the relevant text segments (NP, VP, etc.), conditions them appropriately using convenience functions (e.g. pluralizing a noun or turning a verb into a gerund), and then encloses them in text segments that are hardcoded in the program (e.g. collective pronouns/subjects such as We, Everything, and All of us). Some text segments, such as the trailing Every time, Everywhere, and All the time, are added with a certain probability.
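One such template function might look like the sketch below. The `pluralize` helper and the 0.3 probability for trailing segments are illustrative assumptions, not the project’s exact values.

```python
import random

def pluralize(noun):
    """Naive pluralizer -- a stand-in for the project's convenience functions."""
    if noun.endswith(("s", "x", "ch", "sh")):
        return noun + "es"
    if noun.endswith("y") and noun[-2] not in "aeiou":
        return noun[:-1] + "ies"
    return noun + "s"

def every_phrase(subject, adverb, adjective):
    """Fill the template: Every [subject] is [always/never] [adverb] [adjective]."""
    tweet = "Every {} is {} {} {}".format(
        subject, random.choice(["always", "never"]), adverb, adjective)
    # Trailing segments are appended only some of the time (probability assumed):
    if random.random() < 0.3:
        tweet += ", " + random.choice(["every time", "everywhere", "all the time"])
    return tweet + "."

print(every_phrase("conversation", "vaguely", "offensive"))
# e.g. "Every conversation is never vaguely offensive, every time."
```

In the real script the subject, adverb, and adjective arguments would be drawn from the NP and ADJP sets in the dictionary rather than passed in by hand.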


The final Python script implements the Twitter bot functionality. I made a few choices about how the bot should work, the first being that it should generate phrases in sets about single concepts. The functional design described above allows for this, as the same NP, VP, and ADJP arguments can be reused across the different phrase-generating functions when constructing a number of consecutive tweets.

The second choice I made is that the bot should tweet in a more variable, natural rhythm. To do so, I opted to run the bot from my computer as a perpetually running Python script (rather than a cron job) that, after each tweet, waits for a randomly allocated interval of between 15 minutes and 3 hours before posting the next one. The program also, of course, stops running when I switch off my computer (which I don’t do often).
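The posting loop amounts to something like this. The `post` function is injected so the loop can be shown without real Twitter credentials; in the actual bot it would wrap whatever Twitter-API call the script uses.

```python
import random
import time

def run_bot(tweets, post, min_wait=15 * 60, max_wait=3 * 60 * 60, sleep=time.sleep):
    """Post each tweet, then wait a random interval before the next one.

    `post` sends a single tweet (in the real bot, a Twitter-API call);
    `sleep` defaults to time.sleep but is a parameter so the loop can
    be exercised without actually waiting.
    """
    for tweet in tweets:
        post(tweet)
        # Random interval between 15 minutes and 3 hours, in seconds:
        sleep(random.randint(min_wait, max_wait))
```

Because the waits are drawn uniformly from 900 to 10,800 seconds, the rhythm of the account looks far less mechanical than a fixed cron schedule would.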

The design choices I made resulted, I felt, in text that can often sound like a surreal, pseudo-philosophical conversation about a single thing. As a result, along with the Twitter bot, I opted to present the text as a comic strip of sorts, borrowing from the aesthetics of surrealist web comics.


One of my personal projects for the summer is to automate the composition of the images and text, to ultimately post a “generative web comic” of sorts on Tumblr or Twitter.

Below are the three Python scripts I wrote for this project.

1. Parsing the text source, generating the dictionary of text segments

2. Generating the phrases to be tweeted (the dictionary variable had to be removed, as it is too large).

3. Posting the phrases to Twitter in a random interval between 15 minutes and 3 hours.



  • December 9, 2014


    Did you make the illustrations yourself?
    I saw this project at the code poetry slam and it was hilarious 🙂

    • December 9, 2014


      The components are found clip art from different online places. I edited those, and composited scenes out of them.
