AIIA ICT Industry Night

Categories Design, RMIT, Technology

Last night I went to the AIIA Industry Night at RMIT Storey Hall to help present a Computer Science project I was involved with on the product development side. Coming from an Industrial Design background, I was impressed with how directly the School of Computer Science & IT engages with industry players. I met a lot of great people and saw a diverse range of interesting and insightful projects. Hopefully in the future I’ll come across some of them in the wild.


Make a custom button pad for Raspberry Pi [Pt II]

Categories Interaction, RMIT

In my previous post I was using a testing cable with an alligator clip on one end and a probe on the other to short the circuit board that I took out of a cheap USB keyboard. The small distance between each contact point made using the alligator clip awkward and prone to bridging nearby contacts. I decided to use a section of jumper ribbon cable salvaged from an old computer to form a sort of breakout board that I could use as a more robust, exploratory interface.

Firstly, I stripped two sets of 13 wires, one set for each bank of contacts on the USB circuit.

From there I started soldering each wire to the circuit board. Initially, creating a successful solder joint was difficult and I found that a light sanding of the contact points (to remove some surface coating and presumably add texture) made soldering much easier.

After all the contact points (in actuality, only those that were necessary) were soldered to the ribbon cable, I used jumper cables to break the pins out onto a small breadboard, allowing me to hook multiple jumpers up between points. I used this setup to create a document outlining which connections produced a usable keystroke within the context of my project (in this case letters, numbers and symbols – no function or conditional keys). In total, I needed at least 25 different keystrokes for my 5 x 5 button matrix, plus 8 additional keys to account for function keys.
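That document is essentially a lookup table from contact pairs to characters. In Python it could look something like the sketch below – the pin numbers and characters are placeholders for illustration, not my actual combinations:

#hypothetical lookup of (bank A pin, bank B pin) -> keystroke produced when
#those two contacts are bridged; the pairs and characters are placeholders,
#not the actual combinations from my keyboard chip
KEYSTROKE_MAP = {
    (1, 3): 'a',
    (1, 4): 's',
    (2, 7): '1',
    (5, 9): '.',
    #... and so on, until at least 25 usable keystrokes are covered
}

def keystroke_for(pin_a, pin_b):
    #returns the character a contact pair produces, or None if untested
    return KEYSTROKE_MAP.get((pin_a, pin_b))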

In a later post I’ll show the process of creating the actual 25-key button matrix, soldering it all up and testing it to make sure that every button on the custom pad triggers a unique keystroke. It’s been a while since I’ve done a great deal of soldering, and my connections on the USB circuit – though functional – aren’t very clean-looking, so I’m worried about soldering up 25+ buttons. Ultimately all the ugly electronics will be hidden inside a 3D printed housing anyway, so no one will ever need to know.

 to be continued

Make a custom button pad for Raspberry Pi [Pt I]

Categories Interaction, RMIT

I needed a button matrix to use as an interface for a new digital instrument that I’m making as part of my undergraduate major project. Something like the monome would probably work quite well, but I cannot justify being lazy and paying for one when I feel like I could probably hack some existing things together and end up with a custom result.

In earlier posts I talked about establishing a serial connection between the Raspberry Pi and an Arduino, because initially I intended to read the button array with the Arduino and then send small data packets over USB telling the Raspberry Pi which buttons were pressed.

Previous design exploration: 16-digit keypad connected to an Arduino, connected via USB serial to a Raspberry Pi, using a 3.5″ TFT LCD screen as the display.
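For context, the Pi end of that approach boiled down to listening on the serial port for small messages from the Arduino. A rough sketch of the idea using pyserial, where the device name, baud rate and packet format are assumptions for illustration rather than my actual setup:

import serial #pyserial

#assumed device name and baud rate; an Arduino usually shows up as
#/dev/ttyACM0 or /dev/ttyUSB0 on the Raspberry Pi
arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

while True:
    #assumed packet format: one line per press, e.g. "2,3" for row 2, column 3
    packet = arduino.readline().decode('ascii', 'ignore').strip()
    if packet:
        row, col = packet.split(',')
        print('button pressed at row %s, column %s' % (row, col))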

For a few reasons, particularly the latency issues, this wasn’t a viable solution. I am vaguely aware that the Arduino could be cut out of the process and the button array read by the Raspberry Pi directly, but I’m also vaguely aware that trying to figure out how to do that might set me back a month.

The solution was to buy a cheap USB keyboard, gut it, rip out the controller chip and hook it up to a custom button array made up of a 5 x 5 grid of normally-open (N/O) buttons.

I like this solution in that it doesn’t really matter which contacts I solder the buttons to, so long as each button triggers a different keystroke. I can then assign each button its specific keystroke and map each keystroke to its corresponding position in my Processing/Python sketch.
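On the software side that mapping can be as simple as a dictionary from keystroke to grid coordinate. A minimal sketch of the idea in Python, with placeholder characters since the real assignments depend on which contact pairs each button ends up soldered to:

#hypothetical mapping of keystrokes to (row, col) positions in the 5 x 5 grid;
#the characters are placeholders, not my final key assignments
KEY_GRID = {
    'q': (0, 0), 'w': (0, 1), 'e': (0, 2), 'r': (0, 3), 't': (0, 4),
    'a': (1, 0), 's': (1, 1), 'd': (1, 2), 'f': (1, 3), 'g': (1, 4),
    #... rows 2 to 4 follow the same pattern
}

def grid_position(key):
    #returns the (row, col) a keystroke maps to, or None if it isn't assigned
    return KEY_GRID.get(key)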

Ultimately, the keyboard is controlled by a small chip that communicates via USB. The chip has two banks of 13 contact points, and to get any of the ~101 keystrokes available you have to close the circuit between two specific contact points (even just the 13 x 13 cross-bank pairings give 169 possibilities, more than enough to cover them).

So I started blindly shorting the board with alligator clips and a probe, with Notepad open on my laptop to capture the keystrokes.

Ended up with some glitch poetry

Now I just need to perform the time-consuming task of soldering up an array of N/O buttons to the keyboard chip, based on the combinations that result in 25 useful keystrokes (letters, numbers, punctuation – no function or navigation keys). Then I need to house it all in a 3D printed case.

continue to Part II

Displaying webcam video on Raspberry Pi using pygame

Categories Open-source, Programming, RMIT, Visuals

Another small exploratory project related to my undergraduate Major Project had me trying to stream video onto a small 3.5″ TFT LCD display using a webcam connected to a Raspberry Pi mini computer.


Firstly, not all USB webcams work with the Raspberry Pi, so after some random trial and error trying to find a working webcam I discovered this list of Raspberry Pi verified peripherals and bought myself a Logitech C100 for $20 on eBay.

Secondly, USB webcams seem to be horribly slow on the Raspberry Pi, pulling dodgy framerates of what might be <10fps at their full resolution (in my case 640×480 pixels). Originally I was using the custom imgproc library (available here), but found that there was either a lack of simple documentation for extended functionality like scaling, or that the library simply wasn’t built to perform those sorts of tasks. Eventually I settled on the pygame library (which you can get here). The documentation for pygame is thorough and easy to navigate, making troubleshooting and extension or exploration of the library very easy.

In order to combat the low framerate, and because resolution is not a priority (for the purposes of the concept I will be testing), I decided to pull the webcam feed in at a much lower resolution (32×24 pixels) and then scale it up to full screen (640×480 pixels). I’m not sure whether the webcam can actually capture at 32×24 pixels – I’ll have to look into that, and perhaps into some pixel-dropping techniques.

I wrote the code, copied it over, connected the webcam to the Raspberry Pi and ran the following Python script:

import sys
import pygame
import pygame.camera

pygame.init()
pygame.camera.init()

#create fullscreen display 640x480
screen = pygame.display.set_mode((640,480),0)

#find, open and start low-res camera
cam_list = pygame.camera.list_cameras()
webcam = pygame.camera.Camera(cam_list[0],(32,24))
webcam.start()

while True:
    #grab image, scale and blit to screen
    imagen = webcam.get_image()
    imagen = pygame.transform.scale(imagen,(640,480))
    screen.blit(imagen,(0,0))

    #draw all updates to display
    pygame.display.update()

    # check for quit events
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            webcam.stop()
            pygame.quit()
            sys.exit()

After a brief pause a pygame window pops up and the webcam feed is shown on the screen, scaled to fullscreen. The framerate is better than the full resolution webcam feed, but is still very slow by modern standards.
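To put a number on “very slow”, pygame’s Clock can report the measured framerate. Below is the same loop with a Clock added to print the FPS each frame – a variation for measurement only, not part of the original script (quit handling is left out for brevity, so Ctrl+C to stop):

import pygame
import pygame.camera

pygame.init()
pygame.camera.init()

#same low-res capture and 640x480 display as above
screen = pygame.display.set_mode((640,480),0)
cam_list = pygame.camera.list_cameras()
webcam = pygame.camera.Camera(cam_list[0],(32,24))
webcam.start()

#clock used only to measure the achieved framerate
clock = pygame.time.Clock()

while True:
    clock.tick() #record the time since the previous frame

    imagen = webcam.get_image()
    imagen = pygame.transform.scale(imagen,(640,480))
    screen.blit(imagen,(0,0))
    pygame.display.update()

    #rolling average over roughly the last ten frames
    print(round(clock.get_fps(),1))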