Next Level Robotics
To an outsider looking in at amateur robotics, it often appears that the field hasn't evolved much in the past few years. Certainly, there have been evolutionary gains. Sensors are a little smaller and smarter, and motors and controllers are a little more powerful and sophisticated. Furthermore, there have been a few advances in microcontrollers, such as the development of the Parallax Propeller, and more powerful field-programmable gate arrays (FPGAs).
Despite incremental advances in the components we use to construct robots, the fundamental capabilities of carpet roamers, crawlers, and arms haven’t changed much. The leading edge of low-cost robotics is often represented by toys carried by the major retail outlets. So what’s it going to take to get amateur robotics to the next level? That is, to a level that not only matches the capabilities illustrated by commercial and academic robotics, but that at least hints at the capabilities we ascribe to robots depicted in Star Wars and Transformers?
First, a reality check. Developing a semi-autonomous Martian rover or a robotic prosthetic arm for a soldier injured in Iraq takes significant financial resources and teams of engineers, scientists, and machinists. So what can you do, given the current economic environment, to move your robot designs to the next level?
The most fertile area in robotics yet to be fully exploited — and one within reach of every roboticist — is software development. For example, in the area of robot vision, there is a need to better recognize, track, and differentiate objects, to read facial expressions and gestures, and, in general, to make robots more socially adaptable. If your interest is outdoor navigation, then there is a world of software options to explore, from GPS-based localization to navigation with light and RF beacons. Techniques for maneuvering through mazes and for avoiding ledges and low-traction areas have yet to be perfected.
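To give a flavor of how approachable this kind of software work can be, here is a minimal sketch of one vision task mentioned above: locating an object in an image by finding the centroid of its bright pixels. The frame below is a synthetic grayscale image built by hand; a real robot would capture frames from a camera. The names (`find_centroid`, `THRESHOLD`) and the threshold value are illustrative choices, not part of any particular library.

```python
# Locate a bright object in a grayscale frame by computing the
# centroid of all pixels at or above a brightness threshold.
# The frame is a list of rows of pixel values (0-255).

THRESHOLD = 128  # illustrative cutoff: pixels this bright count as "object"

def find_centroid(frame):
    """Return the (row, col) centroid of above-threshold pixels,
    or None if no pixel reaches the threshold."""
    count = row_sum = col_sum = 0
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value >= THRESHOLD:
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

# A 5x5 frame with a bright 2x2 blob in the lower right corner.
frame = [
    [0, 0, 0,   0,   0],
    [0, 0, 0,   0,   0],
    [0, 0, 0,   0,   0],
    [0, 0, 0, 200, 210],
    [0, 0, 0, 220, 230],
]
print(find_centroid(frame))  # prints (3.5, 3.5)
```

Running this across successive camera frames and steering toward the centroid is the essence of simple object tracking; real systems add color segmentation, noise filtering, and faster array math, but the core idea fits in a page of code.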
Connected to a PC, your robotic arm or vehicle with appropriate sensors can become just as sophisticated as any rover developed by NASA. Of course, you can work on challenges completely within a computer using simulations. And that’s an efficient, low-cost method. However, at some point you have to validate your work on a real robot. One thing I’ve learned over several years of building robots is that unless you’re working on a specific hardware specification, you’ll make more progress in shorter time if you leave the design of the hardware platform to someone else and focus on the overall functionality.
For example, why spend months designing and building an arm when you can buy a kit from Lynxmotion (www.lynxmotion.com) or CrustCrawler (www.crustcrawler.com)? Even if you have to modify an off-the-shelf arm, you'll likely still save time and money. I've used various versions of the CrustCrawler arm — including their latest Smart robotic arm — as the basis for many projects that rely on the processing power of a PC. Both CrustCrawler and Lynxmotion offer PC-based software to control their arms, and third-party software is available as well.
Similarly, you needn't start your software designs from scratch or with a huge budget. The entry-level versions of the various Microsoft .NET compilers and Microsoft Robotics Studio can be freely downloaded. If you're not a Microsoft fan, there are dozens of software options, from MATLAB and Simulink (www.mathworks.com) to open-source compilers. If possible, leverage what's been done before and move to the next level more quickly and easily. Just be sure to return the favor: post your software to the web, and consider sharing your experience with SERVO readers.
I don’t want to discourage mechanical engineers and engineers-in-training from tackling new hardware designs. If you have a machine shop at your disposal and the skill to use those tools, then don’t hesitate. Everyone has different goals and ideas of what they want to get out of robotics. However, if getting to the next level quickly on a limited budget is your focus, then you should at least consider focusing on the brains — as opposed to the brawn — of your robots. SV