Monday, March 24, 2014

Controlling NXT Robot From the Keyboard with leJOS NXJ

Bad GUI for robot control
I worked up a proof of concept in leJOS NXJ, which I'll be teaching my high school robotics students soon, for controlling a tethered NXT robot directly through key presses. At first I had worked out a GUI with buttons, following some Head First Java lessons, but I realized that moving a mouse around and clicking buttons isn't an optimal way to control a robot. Arrow keys make much more sense. Capturing arrow key presses was hard to figure out. The Scanner class won't do, because you type in a line and your String is captured only when enter is pressed. For instant capture of every key press a KeyListener is needed, or at least that's the simplest way I could find.
So the solution I worked out is a text field with a KeyListener. Each arrow key press event calls a send-over-USB class method, which sends a number over a data stream to the robot, and the robot responds to the number with motor commands. In this proof of concept all I'm getting the robot to do is display response text on the LCD, but it's a short step from there to driving it around.
Better GUI for robot control
First, the KeyListener GUI. For this I'm indebted to an Oracle example on KeyListeners; I stripped out everything I didn't need in order to get down to the bare elements. Here is the KeyEventDemo.java class, in plain text.
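The heart of it looks something like this; a trimmed sketch rather than the full class, and the USBSend class it calls is described next. The command numbers 1-4 are arbitrary choices for this sketch:

// A trimmed sketch of the KeyEventDemo idea: a text field whose
// KeyListener turns arrow keys into command numbers for USBSend.
import java.awt.event.KeyEvent;
import java.awt.event.KeyListener;
import javax.swing.JFrame;
import javax.swing.JTextField;

public class KeyEventDemo implements KeyListener {
    static USBSend sender = new USBSend(); // connects to the NXT once

    public static void main(String[] args) {
        JFrame frame = new JFrame("NXT Control");
        JTextField field = new JTextField(20);
        field.addKeyListener(new KeyEventDemo());
        frame.add(field);
        frame.pack();
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }

    public void keyPressed(KeyEvent e) {
        switch (e.getKeyCode()) {
            case KeyEvent.VK_UP:    sender.send(1); break; // forward
            case KeyEvent.VK_DOWN:  sender.send(2); break; // backward
            case KeyEvent.VK_LEFT:  sender.send(3); break; // turn left
            case KeyEvent.VK_RIGHT: sender.send(4); break; // turn right
        }
    }

    public void keyReleased(KeyEvent e) {}
    public void keyTyped(KeyEvent e) {}
}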
Second, the Sender class. There is an example in the leJOS NXJ download that sets up a data stream between PC and NXT over USB: it opens a connection, sends an int 100 times to a Receiver class, receives it back from the NXT, and closes everything down. I needed the data stream to stay open for as long as the user wants to send new commands. The way the USB send examples were originally written, every key press established a whole new connection and opened read and write data streams, which made each key press take 7-10 seconds to reach the NXT. So I moved the connection into the USBSend constructor so it would be established as soon as a USBSend object was created; then only the data streams were opened and closed on each send. Coordinating when the streams and the connection get opened and closed was tricky to figure out. The other thing to work out was how to close the IO streams and USB connection gracefully at the user's request, which worked out nicely in the USBReceive class.
Here's the adapted USBSend.java class.
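In outline it looks like this; a simplified sketch, assuming the lejos.pc.comm classes from the leJOS PC API:

// Sketch of the adapted USBSend: the USB connection is made once in
// the constructor, instead of once per key press.
import java.io.DataOutputStream;
import java.io.IOException;
import lejos.pc.comm.NXTConnector;

public class USBSend {
    private NXTConnector conn = new NXTConnector();

    public USBSend() {
        // establish the USB connection as soon as the object is created
        if (!conn.connectTo("usb://")) {
            System.err.println("No NXT found over USB");
            System.exit(1);
        }
    }

    public void send(int command) {
        try {
            // wrap the connection's output stream for each send; the
            // underlying connection stays open between key presses
            DataOutputStream out = new DataOutputStream(conn.getOutputStream());
            out.writeInt(command);
            out.flush();
        } catch (IOException e) {
            System.err.println("Send failed: " + e);
        }
    }

    public void close() {
        // shut the connection down gracefully when the user is done
        try {
            conn.close();
        } catch (Exception e) {
            System.err.println("Close failed: " + e);
        }
    }
}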
And the USBReceive.java class.
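A sketch of the receiving side, which runs on the NXT (the quit code, 9, is an arbitrary choice for this sketch):

// Sketch of USBReceive: wait for a USB connection, then read command
// ints and show a response on the LCD until a quit code arrives.
import java.io.DataInputStream;
import java.io.IOException;
import lejos.nxt.LCD;
import lejos.nxt.comm.NXTConnection;
import lejos.nxt.comm.USB;

public class USBReceive {
    public static void main(String[] args) throws IOException {
        LCD.drawString("waiting...", 0, 0);
        NXTConnection conn = USB.waitForConnection();
        DataInputStream in = conn.openDataInputStream();

        while (true) {
            int command = in.readInt();
            if (command == 9) break; // quit code: close down gracefully
            LCD.clear();
            LCD.drawString("command: " + command, 0, 0);
            // a short step from here: map each command to motor calls
        }
        in.close();
        conn.close();
    }
}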
A few things that are helpful to know with leJOS:
  • The KeyEventDemo and USBSend classes have to be set up as a PC project, while the USBReceive class is a leJOS project.
  • On a Mac running Snow Leopard or later, the PC classes have to be run in 32-bit mode. In Eclipse, open Run Configurations and add -d32 to the VM arguments.

Thursday, March 20, 2014

Radio Shack DIY 3 X 3 LED Cube Project

This project is really fun! Lots of people have posted their experience putting it together, so I won't do that. But after recovering from a few soldering problems, what I find challenging, or just tedious, is figuring out exactly what hex codes to use to create arrays for your own programs. I did figure out lighting up individual LEDs 1-9 on each level to make these two patterns, shown one after the other here:

So to make it easier for others here are the codes for individual LEDs:

// 1-9
// * 0x001 = 1
// * 0x002 = 2
// * 0x004 = 3
// * 0x008 = 4
// * 0x010 = 5
// * 0x020 = 6
// * 0x040 = 7
// * 0x080 = 8
// * 0x100 = 9

And here's the array that makes the LEDs circle around:
//        Spiral 2, set page to 32
                                 {0x001,0x000,0x000,0x1ff},
                                 {0x002,0x000,0x000,0x1ff},
                                 {0x004,0x000,0x000,0x1ff},
                                 {0x020,0x000,0x000,0x1ff},
                                 {0x100,0x000,0x000,0x1ff},
                                 {0x080,0x000,0x000,0x1ff},
                                 {0x040,0x000,0x000,0x1ff},
                                 {0x008,0x000,0x000,0x1ff},

                                 {0x000,0x001,0x000,0x1ff},
                                 {0x000,0x002,0x000,0x1ff},
                                 {0x000,0x004,0x000,0x1ff},
                                 {0x000,0x020,0x000,0x1ff},
                                 {0x000,0x100,0x000,0x1ff},
                                 {0x000,0x080,0x000,0x1ff},
                                 {0x000,0x040,0x000,0x1ff},
                                 {0x000,0x008,0x000,0x1ff},

                                 {0x000,0x000,0x001,0x1ff},
                                 {0x000,0x000,0x002,0x1ff},
                                 {0x000,0x000,0x004,0x1ff},
                                 {0x000,0x000,0x020,0x1ff},
                                 {0x000,0x000,0x100,0x1ff},
                                 {0x000,0x000,0x080,0x1ff},
                                 {0x000,0x000,0x040,0x1ff},
                                 {0x000,0x000,0x008,0x1ff},

                                 {0x000,0x001,0x000,0x1ff},
                                 {0x000,0x002,0x000,0x1ff},
                                 {0x000,0x004,0x000,0x1ff},
                                 {0x000,0x020,0x000,0x1ff},
                                 {0x000,0x100,0x000,0x1ff},
                                 {0x000,0x080,0x000,0x1ff},
                                 {0x000,0x040,0x000,0x1ff},
                                 {0x000,0x008,0x000,0x1ff},

Now how about the hex codes for combinations of LEDs? Brad here explained the method for calculating them, but I didn't get it until I read it over a few times. Once you have the base 10 (DEC) value for each of LEDs 1-9 (see Brad's table), just add the values for the LEDs you want lit, then convert the total to HEX, like here. For example, if I want 4 and 6 to light: 8 + 32 = 40, which converted to HEX is 28. So 0x028 will do it for that element.
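In the table format used above, lighting 4 and 6 together on the first level only looks like this (and by the same math, 0x1ff = 1 + 2 + 4 + ... + 0x100 = 511 lights all nine at once):

                                 {0x028,0x000,0x000,0x1ff},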
Update: here is a modified sketch for running different tables more easily. Each pattern is its own table, so call the pattern() function with a table passed as an argument, along with how many lines it uses.

Monday, March 17, 2014

Hands On Tech: Making Ceramic Tiles

Inspired by the brilliant work of Gary Donohue and Josh Burker, I have been working on a project with some of our art teachers that has exciting possibilities. It started with Gary describing his workflow that allows elementary students' line drawings to become 3D printed pieces. Then Josh took that in another direction with 4th graders using Turtle Art to make patterns that can be printed as presses for making clay tiles.
One of our art teachers said this is the idea that fills in the missing link for him with 3D printing--the link between hand-made artwork and digital manufacturing. Other teachers responded similarly, feeling little affinity for a machine that prints digitally designed objects until they could see how hand craft can be a vital part of its use.
I'm working with a high school art teacher and her ceramics class. For 9th and 10th graders I thought a text-based coding environment would be better for producing tile patterns than the drag-and-drop of Turtle Art. I developed this Processing app that lets students who haven't had much exposure to code dip their toes in it and see how it can be a creative tool. All you have to do to try out different shapes and their spacing is modify arguments in the shape object...
/** first argument  = side length, or diameter (if circle), of each shape
 *  second argument = height of each row
 *  third argument  = horizontal spacing of each shape */
circle = new Circle(120, 90, 70);
square = new Square(110, 70, 100);
hexagon = new Hexagon(60, 90, 70);
octagon = new Octagon(60, 90, 105);
triangles = new Triangles(180, 135, 90);

...and uncomment the shape/s you want. You can stagger each row by passing true as the second argument, and you can change the line thickness by assigning a different number to thickness. In the code below, only the square would be drawn when the program runs.
thickness = 2; //thickness of line
//true means offset each even row, false means don't
// circle.makeRows(thickness, true);
// triangles.makeRows(thickness, false);
square.makeRows(thickness, true);
// hexagon.makeRows(thickness, true);
// octagon.makeRows(thickness, true);




Conceptually the program is very similar to the one Josh presents above for Turtle Art. Each shape object is made up of a shape method, called by a row method that draws the shape across the window (plus a second, offset row if you set offsetEvenRows to true), called in turn by a makeRows method that repeats the rows from top to bottom. The amount of horizontal and vertical overlap is determined by the numbers you give the shape, as is the side length or diameter set by the first number.
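For the curious, the skeleton looks something like this, trimmed down to a single shape class with simplified names:

// Skeleton of the tiles sketch, reduced to one shape class
Square square;
int thickness = 2;

void setup() {
  size(800, 600);
  background(255);
  noFill();
  square = new Square(110, 70, 100);
  square.makeRows(thickness, true);
}

class Square {
  float side, rowHeight, spacing;

  Square(float side, float rowHeight, float spacing) {
    this.side = side;
    this.rowHeight = rowHeight;
    this.spacing = spacing;
  }

  // draw one square; overlap happens when spacing < side
  void drawShape(float x, float y) {
    rect(x, y, side, side);
  }

  // draw a row of squares across the window
  void row(float y, float xShift) {
    for (float x = -side; x < width + side; x += spacing) {
      drawShape(x + xShift, y);
    }
  }

  // repeat rows top to bottom, shifting every other row if asked to
  void makeRows(int thickness, boolean offsetEvenRows) {
    strokeWeight(thickness);
    int r = 0;
    for (float y = -rowHeight; y < height + rowHeight; y += rowHeight) {
      float xShift = (offsetEvenRows && r % 2 == 0) ? spacing / 2 : 0;
      row(y, xShift);
      r++;
    }
  }
}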
Here is the zipped program. Here is the code all in one file, just for a quick look. To use it, make sure Processing is installed on your computer, and open the file tiles.pde. After configuring the shapes the way you want, run the program (click the play icon) and see what you get. You will probably want to tweak it before you get your final pattern.
When you click the Processing window it puts a snapshot in the sketch folder. The next step is to crop the image down to just what you want for the tile, so open it in Preview, or some image editor on a PC, and crop to the desired area. Since you probably want the tile to repeat horizontally, and possibly vertically as well, this can be tricky, especially if you are aiming for a square tile.

Save the cropped image and open it in Illustrator. Now you will trace it to prepare it for conversion to a vector graphic. Click Object > Live Trace > Tracing Options. Check Preview and spend some time trying the different Presets. When you have one you like, click the Trace button. At first I tried Detailed Illustration on this one, but when I completed the next step, importing to Tinkercad, I saw that the preset did something very different from what I expected, so I retraced it with the Lettering preset and that worked much better.

Click Save as, and choose SVG.
Now in Tinkercad use the section in the upper right to import the SVG file into a new design. It will come in bigger than you want it, so scale it down, holding shift while you drag a corner to keep the aspect ratio.
The height will have decreased as well so raise it back up to about 6mm. Then add a 2mm layer to tie it together. 
Now I haven't gone through the process of pressing clay tiles with these yet, but at first glance the art teacher likes the total height of 6mm, with 4mm for just the ridges. I'll post an update when we have some tiles made.






Friday, March 07, 2014

A Machine Asking To Be Remade

I've been having a great time teaching students about 3D design and printing, and helping them print things they need. From robot scoops to ears, it's always an exciting surprise to discover where the printer can be a useful tool for learning. One of the more interesting ways it serves is in printing parts for itself. I've found two of these opportunities so far. Early on I could see that the filament tube on the Replicator 2 came out of its clips on the back easily whenever the extruder had to move towards the back of the build plate. I found a filament tube upgrade on Thingiverse that has worked perfectly.

Then recently a student's dad brought in a few spools of filament to provide more colors for printing. One of the spools has a much smaller hole than the standard Makerbot spools. So I found a spindle, again on Thingiverse, modified it a bit in Tinkercad, and now the spool fits perfectly.

Monday, February 24, 2014

Raspberry Pi Makerbot Cam

We've been using our new Makerbot 3D printer at school quite a bit since we got it in December. I've been the one learning how to use it and training some students on it, so it's taken quite a lot of my time watching the results of printing different projects under different conditions. Not every print comes out successfully, and sometimes you need to cancel a print because you can see that something is going wrong. And sometimes a spool of filament is running low and you're not sure there will be enough to complete a print. Rather than having to check on it every so often, I wanted to set up a streaming Makerbot cam with my Raspberry Pi and the new Pi Cam I got for it, so I can oversee prints wherever I am.
The Pi Cam produces very nice quality still and video images and there are some very good tutorials on making it do lots of different things.
I like to do things a little at a time, so in this post I'm outlining the steps toward the eventual goal of streaming video and some of the hurdles on the way to getting there.

Getting the camera working

Getting the camera plugged in and snapping pics and video was easy and fun. The Raspberry Pi folks have put the information together here: Raspberry Pi Camera setup. In the process I wanted to start using the Raspi in "headless" mode to make it easier to get the photos off; my email client doesn't really work on it, and plugging in a USB drive to copy files takes time.

Remote access to images

From a Mac I use Terminal to log right in to the Raspi. Get the Raspi's IP by typing ifconfig on the Pi (connected to a monitor with peripherals), then use that to log in with SSH:

ssh pi@ipAddress
Now I'm at the pi prompt and can type the camera commands to take pictures and video. But the files are saved on the Pi. To get them off, log out of the SSH session; I used SCP to copy them to my Mac:

scp pi@ipAddress:image.jpg Desktop/image.jpg

puts the image right on my desktop.

Streaming video to a website

Miguel Grinberg has a great method of setting up a constant stream of images to the web from a Raspi Cam here. The idea is not to stream video, but to take stills in rapid succession, each new one replacing the previous, and stream those with MJPG-Streamer. So far it's pretty choppy for me, but I haven't played with the framerate yet. I can probably tweak it to get a smoother stream. Here is a sample:
There is a point in Grinberg's tutorial where he provides an update on installing MJPG-Streamer. Definitely follow it; the information is crucial to getting it working. I also found a couple of Linux command line tools useful. To stop the raspistill process before it's finished, first log out of the SSH session (the trailing & allows you to log out without killing the process), log back in, and type:
$ ps aux | grep raspistill
That shows the raspistill process ID. Then you can kill it with:
$ sudo kill 2314 
(or whatever its process ID is)

Running the streamer from button input

So I don't want to have to log in to the Pi and run this line of code every time I want the Pi to stream Makerbot footage:
$ raspistill --nopreview -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &

What I need is a button I can press on the Pi that starts it taking pictures for an hour or two. So I've ordered this setup from Adafruit--a GPIO ribbon and breakout board--to which I'll wire a button. MJPG Streamer is always running, so all it needs is new pictures dumped into the streaming folder. This took some time to figure out. The wiring is shown below; it includes a momentary switch and an LED that stays on as long as the camera is taking stills. The LED on the camera itself would have worked were it not for the camera enclosure I printed, which covers it up. I know, I could have drilled a hole or put one in the design, but a little more wiring is fun. Some tips on this setup follow:
Note: the LED is on because the switch has just been pressed.
To figure this out, I mostly used these two resources: buttons and switches and blinking LEDs. Initially the switch did nothing because I made the mistake of having the GPIO ribbon reversed. Heed this advice and make sure the single different-colored wire on the ribbon--in my case, white--is oriented towards the outside-facing end of the Pi, not towards the middle. And on the topic of GPIO pin numbering, I am using BCM mode, a confusing topic for beginners like me and well explained here (down the page at A Word About GPIO Pin Numberings). The concept of pull-up and pull-down resistors is still a little fuzzy to me, despite having dealt with it before with Arduinos, but I am pretty sure I have a pull-up resistor here because there is no resistor between the button and ground. I need to attend a pullup/pulldown workshop, clearly.
So for the programming I used Python, which is typical on the Raspberry Pi. I made one Python program that 
  • configures the GPIO pins, 
  • turns on the LED when the button is pressed, 
  • runs a shell script that creates the /tmp/stream directory (for some reason it keeps disappearing) to store the images, 
  • starts the raspistill command, 
  • starts MJPG Streamer (if it's not already going), 
  • then turns off the LED when those finish. 
Note the trailing & after the MJPG Streamer command. That allows raspistill to start even though the MJPG Streamer process is still running. Omitting the & at the end of the raspistill command, though, forces the LED to stay on until raspistill has finished. The original program in the buttons and switches tutorial had prev_input = 0, but I had to change it to prev_input = 1 or the program would run once without the button being pressed, I think because my switch is pulled high when open (not pressed), so ((not prev_input) and input) was initially true when prev_input was initialized to 0.
Here is startcam.py:
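In outline it does something like this (the BCM pin numbers and the script path here are just placeholders):

# Sketch of startcam.py, reconstructed from the description above
import subprocess
import time
import RPi.GPIO as GPIO

BUTTON = 23  # momentary switch (BCM numbering; see note above)
LED = 24     # stays lit while raspistill is running

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON, GPIO.IN)
GPIO.setup(LED, GPIO.OUT)

prev_input = 1  # switch reads high when not pressed (see note above)
try:
    while True:
        input = GPIO.input(BUTTON)
        if (not prev_input) and input:
            GPIO.output(LED, True)
            # blocks until raspistill finishes, since the script's
            # raspistill line has no trailing &
            subprocess.call(["sh", "/home/pi/startMBCam.txt"])
            GPIO.output(LED, False)
        prev_input = input
        time.sleep(0.05)  # slight pause to debounce the switch
finally:
    GPIO.cleanup()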
And here is the shell script startMBCam.txt:
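Also in outline (the MJPG-Streamer arguments follow Grinberg's tutorial; adjust the paths to your install):

#!/bin/sh
# Sketch of startMBCam: recreate the image folder, make sure the
# streamer is running, then start the timelapse stills
mkdir -p /tmp/stream
if ! pgrep mjpg_streamer > /dev/null; then
    # trailing & lets the script move on to raspistill
    mjpg_streamer -i "input_file.so -f /tmp/stream -n pic.jpg" -o "output_http.so -w /usr/local/www" &
fi
# no trailing & here, so startcam.py's LED stays on until this finishes
raspistill --nopreview -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0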
I tried to get startcam.py to run at startup, as shown in the buttons and switches tutorial, but MJPG Streamer would not run. I could tell the camera was snapping images, but nothing was accessible at the streaming IP.

Creating a box for the camera

For this part I found a nice Raspberry Pi Camera case design on Thingiverse and used Tinkercad to modify it a little. I moved the ball connection to the bottom of the camera case front and flattened the ball socket.

The next and final step will be mounting the whole apparatus near our Makerbot with power and ethernet attached, but that will have to wait until after break.


Wednesday, January 01, 2014

Tri-color LED Project: Mineral Light Show

My son has a piece of hambergite--I'm pretty sure that's what it is--and I've been wanting to give it special lighting for a while. I finally got a tri-color LED and wired up some switches so the colors can be turned on in combination. With this setup you can make seven colors: red, green, and blue, of course; their complements, yellow, cyan, and magenta; and, with all three on, white. Hambergite is translucent, with cleavage running straight along its length, so light travels through it well. Here are some pictures of the results.










Tuesday, December 31, 2013

Gear Project Fail

Inspired by these mechanical movements animations, I thought it would be fun to figure out how to make some gears out of one of my favorite building materials, cardboard. It turns out making working gears is harder than I thought. I figured one gear 12" in diameter with 16 teeth and one 6" in diameter with 8 teeth would work well, and I guess they could if the teeth were designed correctly. My gears stick because the teeth catch on each other; properly designed teeth follow an involute curve so they roll across each other instead of catching. Further research is needed, but I would like to find something easy enough to cut out of cardboard yet able to move easily. There is a lot of information on spur gears, which is what these turn out to be.





Sunday, December 29, 2013

How Do You Measure Programmers' Productivity

I love this quote from Len Shustek's article for the Computer History Museum on MacPaint and QuickDraw:
How do you measure programmer productivity?
When the Lisa team was pushing to finalize their software in 1982, project managers started requiring programmers to submit weekly forms reporting on the number of lines of code they had written. Bill Atkinson thought that was silly. For the week in which he had rewritten QuickDraw's region calculation routines to be six times faster and 2000 lines shorter, he put "-2000" on the form. After a few more weeks the managers stopped asking him to fill out the form, and he gladly complied.
I was going to say it reminded me of how Adobe had started charging developers per centimeter of ActionScript code, but then remembered that was an April Fools' joke. I guess one that made an impression.

Thursday, November 28, 2013

Student Machinima in OpenSim With Greenscreen

I have a student, an 11th grader, working on one of the most complex tech projects I've seen. Her English teacher gave an open-ended assignment for students to make something that depicts a scene from Bram Stoker's Dracula. She decided to make a machinima that would require up to four avatars, all controlled and filmed by herself, in one week.
I think Rand Spiro's Cognitive Flexibility Theory describes this kind of technology use for solving creative problems well. Of course there are optimal setups for creating machinima, but here I had to help a student realize a very complex project with minimal training and portable equipment, since she had to complete it over the Thanksgiving weekend.
So here's what I came up with. I will find out on Monday how successful it was:

  1. We have a school OpenSim virtual world, but port forwarding is not currently set up in the firewall, so it's not an option for working at home. And she has a Mac, so she couldn't use SoaS. So we set up her own sim, installing MySQL, Mono, and the Diva distro. After a few hiccups she was up and running.
  2. We needed the Diva account functionality to set up the character and camera-person accounts. I had chosen Imprudence as the viewer, but for some reason the Diva splash page wasn't showing up, so I found Singularity, which turns out to be quite awesome.
  3. Next was loaning her a PC she could use for the filming. I put Fraps and Singularity on it and taught her how to change her MyWorld.ini and RegionConfig.ini files to reflect her LAN IP so the PC could log into her sim. I taught her how to use Fraps, which is dead simple.
  4. For multiple avatars in the same scene, all directed by her, she would need a green floor and background, which she would then have to edit together in a video editor. She made some nice avatar costumes and developed gestures from the stock OpenSim animations.
  5. She would have preferred to edit the footage in iMovie on her Mac, as would I, but having the files on the PC in AVI format complicated things, as they would have to be converted to MOV for iMovie. It was too much for me to explain and add to the workflow, so I opted to have her use Movie Maker on the PC. That decision could prove to be the project's undoing, as greenscreen is very hard to work with in WMM. We'll see. You have to use WMM 6.0 and install the RehanFX shaders, and syncing the overlaid clips is almost impossible. It seems pretty much set up just to make cheesy music videos with ridiculous backgrounds. But it would have to work.
  6. UPDATE: I am happy to say she managed to get the AVI files copied to her Mac, and used either Perian or Evom to convert them. iMovie makes clip editing much easier, even after applying the greenscreen.
  7. So the greenscreen workflow consists of syncing two avatar clips and applying the greenscreen so they both appear over green. Export that, and reimport it. Add a third avatar clip and sync those. Apply the greenscreen filter again and export that. Repeat for the fourth avatar clip. Finally, reimport that and greenscreen it with the chosen background image, and if possible figure out how to work in multiple background images for scene changes, which I'm not even sure is possible.
I hope all this works, we'll see.
UPDATE: She completed the video and it came out amazingly well! She did end up figuring out how to convert the files from AVI to MOV and move them to iMovie. That should teach me not to assume something will be too hard for someone.

Monday, November 18, 2013

Programming TETRIX Servos With leJOS NXJ

The last time I taught my high school robotics class, I used RobotC to program TETRIX servos. The RobotC API provides the functions servoValue, servo, and servoChangeRate. From the documentation we learned that the only way to be sure not to push your servo against a physical barrier and damage it is to avoid setting it to a position it can't reach. The easy programming also allowed us to avoid learning about how servos really work. leJOS NXJ has tools for dealing with servos that afford a much better learning experience, in my opinion. The leJOS API provides setRange(), setAngle(), setPulseWidth(), getAngle(), and getPulseWidth(). At the very least you will need to call setRange and setAngle, because setAngle depends on a range of movement having been set with setRange. With leJOS it behooves you to set a servo to a safe range of movement before moving it around, and to do that you have to understand something about how pulse width modulation makes servos run. Two articles do an excellent job explaining how servos work, one from Jameco Electronics and one from Science Buddies. But there is still an information gap when it comes to using the setRange and setAngle methods. The documentation provides the following:

public void setRange(int microsecLOW, int microsecHIGH, int travelRange): "Set the allowable pulse width operating range of this servo in microseconds and the total travel range. Default for pulse width at instantiation is 750 & 2250 microseconds. Default for travel is 200 degrees." The parameters are defined as follows:
microsecLOW - The low end of the servos response/operating range in microseconds
microsecHIGH - The high end of the servos response/operating range in microseconds
travelRange - The total mechanical travel range of the servo in degrees
To better understand what these values mean I created some diagrams that make clear the function of each parameter.
The minimum and maximum pulse widths allowed are 750 and 2250 microseconds, but if you use these extremes you are in danger of hitting the robot.
If the servo horn is attached such that the servo's physical stops are tilted, the arm can hit the robot even within a safe min and max pulse width range.
The third argument to setRange sets the number of programmable positions between the min and max limits.
Setting the travelRange to 10, for example, will greatly reduce the precision the servo is capable of.
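Putting it together, here is a minimal sketch, assuming the lejos.nxt.addon.tetrix classes with the servo controller on sensor port 1 and the servo on channel 1. The 1000-2000 microsecond range is just an example; find your servo's safe limits by experiment before trusting numbers like these:

// Sketch: restrict a TETRIX servo to a safe range, then move it
import lejos.nxt.Button;
import lejos.nxt.SensorPort;
import lejos.nxt.addon.tetrix.TetrixControllerFactory;
import lejos.nxt.addon.tetrix.TetrixServo;
import lejos.nxt.addon.tetrix.TetrixServoController;

public class ServoRangeDemo {
    public static void main(String[] args) throws InterruptedException {
        // get the servo on channel 1 of a controller on sensor port 1
        TetrixControllerFactory factory = new TetrixControllerFactory(SensorPort.S1);
        TetrixServoController sc = factory.newServoController();
        TetrixServo servo = sc.getServo(TetrixServoController.SERVO_1);

        // safe sub-range found by experiment: 1000-2000 microseconds
        // over 160 degrees of travel, so setAngle maps 0..160 degrees
        // onto 1000..2000 microseconds, about 6.25 microseconds/degree
        servo.setRange(1000, 2000, 160);

        servo.setAngle(0);    // one end of the safe range
        Thread.sleep(2000);
        servo.setAngle(80);   // midpoint, 1500 microseconds
        Thread.sleep(2000);
        servo.setAngle(160);  // other end of the safe range

        Button.waitForAnyPress();
    }
}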