Overview. Tomorrow's software will not be chained to today's computing devices. Rather, future software will run on computing platforms embedded in the physical world. The keyboard, video, and mouse will be replaced with a rich array of low-power sensors and a host of actuators. The days of Ethernet and radios that draw hundreds of milliwatts will give way to radios that draw a few microwatts on average. Disk drives will be replaced with NAND, NOR, FRAM, and MRAM. Mains power and Li+ batteries will be replaced with solar, vibration, wind, and other forms of harvested energy, while energy storage will occur in new types of batteries and supercapacitors. Instead of being client computers, these devices will be tiny servers hosting data about their environment or accepting commands to alter it.
See the Epic Homepage for some recent embedded projects. Other projects include: Intel Common Sense Badge (in development), Extreme Scale Mote (XSM), Benchmark, Cyberstick Joystick, ImageArray Digital Advertising System, Robots, Quanto, Trio, Radar, Tag, Microsoft .NET Mote, NetComm, and others.
dotnetcpu Mote. This proof-of-concept sensor network platform is based on the Microsoft SPOT Stamp (dotnetcpu) and the Chipcon CC2420 radio. The dotnetcpu runs the TinyCLR, a small-footprint virtual machine targeted at embedded platforms. One of the great benefits of using the TinyCLR is that Visual Studio can target this virtual machine -- which means programming in C# is possible (much easier than C). This platform supports the IEEE 802.15.4 PHY and elements of the MAC. Our C# library implementation includes a simple, UdpClient-like application programming interface and sample applications for interoperability with IEEE 802.15.4 motes that run TinyOS, including the Telos and MicaZ. I worked on this project, among other things, during a summer internship at Microsoft Research.
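The "UdpClient-like" shape of the interface can be sketched as follows. This is a minimal in-memory mock, not the actual C# library: the class name Radio802154 and the methods send_to/receive_from are hypothetical, and a shared dictionary of queues stands in for the radio channel; the point is only to illustrate the connectionless, datagram-style semantics the text describes.

```python
import queue

class Radio802154:
    """Hypothetical sketch of a UdpClient-style datagram API over 802.15.4.

    A shared dict of per-address queues stands in for the radio "ether",
    so the datagram semantics (connectionless send/receive, per-node
    address) can be exercised without hardware.
    """
    channel = {}  # address -> inbound queue, shared by all nodes

    def __init__(self, address):
        self.address = address
        Radio802154.channel[address] = queue.Queue()

    def send_to(self, dest, payload):
        # Fire-and-forget, like UdpClient.Send: no connection, no ack.
        Radio802154.channel[dest].put((self.address, payload))

    def receive_from(self):
        # Blocks until a datagram arrives; returns (source, payload),
        # mirroring UdpClient.Receive's remote-endpoint out-parameter.
        return Radio802154.channel[self.address].get()

# Usage: two "motes" exchange one datagram.
telos = Radio802154(0x0001)
micaz = Radio802154(0x0002)
telos.send_to(0x0002, b"hello")
src, data = micaz.receive_from()
```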
Digital Advertising. Do you remember that scene in Minority Report where Tom Cruise's character walks into a mall and an advertisement kiosk recognizes him and says something like, "you seem stressed, perhaps you'd like a cup of cappuccino?" Well, this project isn't quite like that but it is a big step toward digital advertising at the point of purchase. The basic notion is that static advertisements are expensive and time-consuming to distribute and, well, static. This technology demonstrates the feasibility of centralized content creation and management followed by digital distribution to potentially thousands of locations (each of which is equipped with a sign).
Each sign is composed of one or more (21 in this case) self-contained units with an integrated display, computer, and communications. Key research questions included devising a leader election scheme for a partially synchronous network that avoided electing multiple leaders and attempted to maintain the current leader, devising an electro-mechanical "plug-and-play" system that a layperson could operate, finding the right mix of pre-rendering vs. run-time rendering of images and movies, and - believe it or not - cooling (yes, there are two air conditioners). I led the software engineering efforts of this project, had a strong hand in the development of the system and network architecture, and contributed to the specification of the hardware. This project was completed in 2003.
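The leader-election requirement above can be illustrated with a toy, round-based sketch; this is not the deployed algorithm, just an illustration of the two stated properties. The assumption is that each round every unit hears heartbeats from the live units: keeping the incumbent whenever it is still heard avoids needless leadership churn, and a deterministic tie-break (highest surviving ID) means all units converge on the same new leader rather than electing several.

```python
def elect(alive, current_leader):
    """One election round over a settled network.

    alive: set of unit IDs whose heartbeats were heard this round.
    Keeps the incumbent if it is still alive ("sticky" leadership);
    otherwise every unit deterministically picks the highest surviving
    ID, so no two units can settle on different leaders.
    """
    if current_leader in alive:
        return current_leader   # maintain the current leader
    return max(alive)           # deterministic fallback: no dual leaders

# Usage: leader 3 is kept while alive, replaced by 5 when it fails.
leader = elect({1, 2, 3, 4, 5}, current_leader=3)   # stays 3
leader = elect({1, 2, 4, 5}, current_leader=leader) # 3 died -> 5
```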
Cyberstick Pro. The Comdex '97 "Best of Show" award-winning CyberStick Pro was the first virtual-reality joystick to use MEMS-based silicon accelerometers (rather than potentiometers) to measure joystick position; it was also the first non-automotive commercial application of Analog Devices' ADXL family of MEMS accelerometers (the ADXL202). Unlike traditional joysticks, this unit could be held and maneuvered in midair. The joystick operated in both analog and digital modes, and its sensitivity could be adjusted electronically.
Six-Legged Autonomous Mobile Robot. This six-legged mobile robot was a research platform I built to investigate reactive robotic architectures. The work won the Third Place Grand Award at the Intel International Science and Engineering Fair. The robot had over 2,000 components, 21 motors, and 10 processors and was built during my senior year in high school. The robot is very similar in design to Attila and Hannibal, designed and built by Colin Angle and Rod Brooks at the MIT AI Lab.
Wheeled Actively-Articulated Vehicle. This six-wheeled mobile robot research platform was built to understand the mobility, coordination, and control issues in actively controlled vehicles. Our testbed was a six-wheeled machine called the Wheeled Actively Articulated Vehicle (WAAV). In contrast with traditional rovers like the NASA/Jet Propulsion Lab Rocker Bogie, the WAAV had twelve actively controlled degrees-of-freedom -- six wheels and six active joints (a pair of roll, pitch, and yaw joints). Our research demonstrated that by actively controlling the normal force between each wheel and the underlying surface, we could significantly improve traction and enhance mobility in unstructured terrain. I designed and implemented several electrical and control subsystems.
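The traction argument can be made concrete with a small worked example (the numbers below are illustrative, not from the WAAV experiments): under Coulomb friction, the tractive force at each wheel is capped at mu*N, so actively shifting normal force toward wheels on high-friction ground raises the vehicle's total traction budget even though the total normal force carried is unchanged.

```python
def total_traction(mu, normal):
    """Sum of per-wheel traction limits F_i <= mu_i * N_i (Coulomb friction)."""
    return sum(m * n for m, n in zip(mu, normal))

# Six wheels: three on rock (mu = 0.8), three on loose sand (mu = 0.3).
mu = [0.8, 0.8, 0.8, 0.3, 0.3, 0.3]

passive = [100.0] * 6                               # equal load on every wheel
active = [150.0, 150.0, 150.0, 50.0, 50.0, 50.0]    # same 600 N total, shifted to rock

# Active redistribution: 405 N of available traction vs. 330 N passive.
t_passive = total_traction(mu, passive)
t_active = total_traction(mu, active)
```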
Dark Horse. This robot was built for the American Association for Artificial Intelligence/International Joint Conferences on Artificial Intelligence held in Seattle in 1994. The entry won the LEGO Mobile Robot Competition that year after competing against some very strong teams from around the country. The robot had a differential-drive system and a freely rotating caster. It had horizontally-moving arms that were used to extend its reach to push widgets around. The gear train is visible in the lower right-hand corner just above the wheel. The robot used shaft encoders for measuring wheel rotations, optical sensors for following lines painted on the contest arena, bump sensors for detecting collisions, motor current sensors for detecting stalls, and a four-way servo-driven rotating turret to angulate against a pair of beacons on the course.
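The two-beacon angulation works out to a small geometry problem. The sketch below is a minimal illustration, not the robot's actual routine; it assumes the bearings are expressed in the world frame (the robot's heading being known, e.g. from the encoders), in which case the robot sits at the intersection of the two bearing rays.

```python
import math

def angulate(beacon1, bearing1, beacon2, bearing2):
    """Recover position (x, y) from world-frame bearings to two known beacons.

    Each bearing is the angle of the ray from the robot toward that beacon,
    so the robot lies on both rays: solve P + r1*d1 = beacon1 and
    P + r2*d2 = beacon2 for the ray lengths r1, r2, then back out P.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Subtracting the two ray equations: r1*d1 - r2*d2 = beacon1 - beacon2.
    ex = beacon1[0] - beacon2[0]
    ey = beacon1[1] - beacon2[1]
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]   # rays must not be parallel
    r1 = (ex * (-d2[1]) - (-d2[0]) * ey) / det
    return (beacon1[0] - r1 * d1[0], beacon1[1] - r1 * d1[1])

# One beacon due north and one due east of the true position (1, 1):
x, y = angulate((1.0, 5.0), math.pi / 2, (5.0, 1.0), 0.0)
```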
The robot used a hierarchical behavior-based control program to navigate the course. The robot first attempted a dead-reckoning strategy using only encoders in the control loop, which was fast but not robust, until it detected it was off course (e.g., off the painted path, high motor current draw, or a triggered bump sensor). Depending on the severity of the problem, the robot would respond differently. A minor error like drifting outside the lines would trigger only the line-following behavior, and the cutover itself appeared seamless. A more severe error like a collision or stall caused the robot to stop moving and begin a recovery procedure, during which it would attempt to angulate its position against a pair of beacons on the course and determine its location using an internal course map. Daniel Marcu, Feng Zhao, and I designed and built this robot. This project was completed in August 1994.
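The layered arbitration described above can be sketched as a priority ordering over behaviors; this is a schematic illustration, not the original control program, and the sensor names are hypothetical.

```python
def arbitrate(bumped, stalled, off_line):
    """Select the active behavior by error severity, most severe first.

    Mirrors the hierarchy in the text: recovery preempts line-following,
    which preempts the fast but fragile dead-reckoning default.
    """
    if bumped or stalled:
        return "recover"       # severe: stop and re-localize off the beacons
    if off_line:
        return "follow_line"   # minor: rejoin the painted path
    return "dead_reckon"       # nominal: fast open-loop navigation

# Usage: severity escalates the response.
arbitrate(bumped=False, stalled=False, off_line=False)  # dead reckoning
arbitrate(bumped=False, stalled=False, off_line=True)   # line following
arbitrate(bumped=True, stalled=False, off_line=True)    # full recovery
```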
Speed Controller and H-bridge Driver. Popular among the members of the Silicon Valley Homebrew Robotics Club, these circuits include a PIC-based speed controller and an H-bridge driver that I designed with Chuck McManis, whose distinguished Silicon Valley career includes being one of the first half-dozen or so people on the Java team at Sun. The circuit drives high-current DC motors (20 to 40 amps, depending on the heat sinks) from servo signals. Chuck maintains a web page that includes a theory of operation, schematics, gerbers, BOMs, and assembly instructions for the speed controller and the H-bridge.
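The servo-signal-to-motor mapping at the heart of such a controller can be sketched as follows. The numbers are the conventional RC servo timings (1000-2000 microsecond pulses, 1500 neutral) and the deadband width is an illustrative assumption; the actual controller's mapping, limits, and deadband may differ.

```python
def servo_to_hbridge(pulse_us, deadband_us=20):
    """Map an RC servo pulse width to a signed H-bridge duty cycle.

    Convention assumed here: 1000 us = full reverse, 1500 us = neutral,
    2000 us = full forward. Returns a value in [-1.0, 1.0]; the sign
    selects the H-bridge direction, the magnitude the PWM duty cycle.
    """
    offset = min(max(pulse_us, 1000), 2000) - 1500  # clamp, center on neutral
    if abs(offset) <= deadband_us:
        return 0.0               # deadband around neutral: motor off
    return offset / 500.0        # linear ramp to full duty at the endpoints

# Usage: neutral, full forward, full reverse, half forward.
servo_to_hbridge(1500)  # 0.0
servo_to_hbridge(2000)  # 1.0
servo_to_hbridge(1000)  # -1.0
servo_to_hbridge(1750)  # 0.5
```

A deadband around neutral is the usual design choice here: it keeps jitter in the incoming pulse width from dithering the motor around zero.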
Prabal K. Dutta
Computer Science Division
Department of Electrical Engineering and Computer Sciences
The University of California, Berkeley
Soda Hall #1776, Berkeley, California 94720