I was once asked the following interview question and told to answer it programmatically and analytically.

There are 100 light bulbs lined up in a row in a long room. Each bulb has its own switch and is currently switched off. The room has an entry door and an exit door. There are 100 monkeys lined up outside the entry door. Each bulb is numbered consecutively from 1 to 100. So is each monkey.

Monkey No. 1 enters the room, switches on every bulb, and exits. Monkey No. 2 enters and flips the switch on every second bulb (turning off bulbs 2, 4, 6…). Monkey No. 3 enters and flips the switch on every third bulb (changing the state of bulbs 3, 6, 9…). This continues until all 100 monkeys have passed through the room.

What is the final state of bulb No. 64? And how many of the light bulbs are illuminated after the 100th monkey has passed through the room?

I thought this was a great interview question, in that it not only tests a person's problem-solving skills but also gives insight into their choice of hammer. My initial response was a simple Python script: nested loops, bulbs under monkeys. In addition to that straightforward programmatic answer, I coded up a quick d3.js response, a fun way to emphasize my interest in data visualization.
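A minimal sketch of that nested-loop approach (the variable names are mine, not from the original script). Note that bulb n is toggled once for each divisor of n, so it ends up lit exactly when n has an odd number of divisors, i.e. when n is a perfect square:

```python
# 100 bulbs, all initially off; index 0 is unused so bulbs[k] is bulb No. k.
bulbs = [False] * 101

# Monkey k flips the switch on every k-th bulb.
for monkey in range(1, 101):
    for bulb in range(monkey, 101, monkey):
        bulbs[bulb] = not bulbs[bulb]

lit = [i for i in range(1, 101) if bulbs[i]]
print(bulbs[64])   # True: bulb No. 64 ends up on
print(len(lit))    # 10 bulbs remain lit
print(lit)         # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100] -- the perfect squares
```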

This is a quick-and-dirty Python script to assemble a deck from the collection of images I gathered in the previous post. It works well and uses the Python Imaging Library (PIL). Were I to repeat this, I would strip all whitespace and non-alpha characters from my file names, then convert them to lower case. It's not a huge deal, but you'll have to add some parsing logic if you don't.

Obviously I need to structure my file names a bit better, but a solid little script overall.
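That filename cleanup could look something like this — a hypothetical helper, not part of the original script, following the "strip whitespace and non-alpha characters, then lower-case" rule above while keeping the file extension intact:

```python
import os
import re


def normalize_filename(name: str) -> str:
    """Strip whitespace and non-alpha characters from the stem,
    keep the extension, and lower-case the result."""
    stem, ext = os.path.splitext(name)
    stem = re.sub(r"[^A-Za-z]", "", stem)  # drop spaces, digits, punctuation
    return stem.lower() + ext.lower()


print(normalize_filename("Black Lotus (Alpha).jpg"))
# -> blacklotusalpha.jpg
```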

I recently decided I wanted an image copy of every Magic: The Gathering card, and wrote this script in Python to grab them from the awesome magiccards.info. I used Beautiful Soup and Mechanize to automate the process.

Beautiful Soup is a fantastic Python library for parsing HTML and is great for screen-scraping projects. I have used it extensively for scraping websites: weather data, insurance information, and, in the following example, Magic cards.

When combined with Mechanize, a Python library that acts as a programmatic browser, automation becomes a breeze.
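The parsing half of that workflow can be sketched as follows. In the real script, Mechanize's `Browser` fetched each page; here a static, made-up HTML snippet stands in for a fetched card page (the actual markup on magiccards.info differs), so the example only illustrates the Beautiful Soup step:

```python
from bs4 import BeautifulSoup

# Hypothetical stand-in for a fetched card page; real markup differs.
html = """
<html><body>
  <img src="/scans/en/al/232.jpg" alt="Black Lotus">
  <img src="/scans/en/al/55.jpg" alt="Lightning Bolt">
</body></html>
"""

soup = BeautifulSoup(html, "html.parser")

# Collect the card-scan image paths from the page.
card_images = [img["src"] for img in soup.find_all("img")
               if img["src"].endswith(".jpg")]
print(card_images)
# -> ['/scans/en/al/232.jpg', '/scans/en/al/55.jpg']
```

From there, each collected path would be joined to the site's base URL and downloaded.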