Team Michigan's MAGIC fleet

Team Michigan have big plans for their little robots.

With the real-world lessons learned in MAGIC, they hope to further develop the systems and platforms for other applications.

They have also made many of their tools open source for other developers.

Katherine Ziesing | Canberra

The Team Michigan robots were specially designed for MAGIC and are manufactured at the APRIL robotics laboratory at the University of Michigan.

The robots are principally made of Baltic birch plywood, laser-cut from a SolidWorks design.

This method makes it possible for the team to construct a relatively large fleet of robots, with approximately 14 platforms planned to be in play for the challenge.

“Mechanical and software reliability are big issues when you have this many robots,” Dr Edwin Olson of University of Michigan said.

“Suppose a robot only breaks down once every other week (say 80 hours of operation).

“Well, with 16 robots, that means you have two robots breaking every day!

“We've been taking a close look at what goes wrong on our robots and trying to address those problems one at a time.

“Over time, this has made a huge difference in our robots' reliability.”

The team is a relatively small mix of academic and industry partners, but its members are working together well to produce the tools needed to guide such a large robot fleet.

“Soar Tech is a small technology company in Ann Arbor,” Olson said.

“Their area of expertise is software for high-level reasoning.

“As part of the team, they are helping to build user interfaces that alert the human operator to attention-worthy events.

“In addition, we are financially supported by Intel and Texas Instruments.

“Building robots is expensive!”

The robot is driven using four DC gearhead motors, each with independent control and quadrature phase feedback.

The drivetrain itself uses parts designed for a lawnmower, the ultimate off-the-shelf solution.

This drivetrain, while lacking a suspension (forgivable given a maximum speed of around 1 m/s), is quite rugged.
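
Quadrature feedback works by reading two encoder channels that are 90 degrees out of phase; the order in which they change reveals direction as well as distance travelled. The following minimal decoder is an illustrative Java sketch, not Team Michigan's firmware (which runs on the motor controllers themselves):

```java
// Minimal quadrature decoder sketch (illustrative only).
// Channels A and B are 90 degrees out of phase; which one changes
// first tells us the direction of rotation.
public class QuadratureDecoder {
    private int lastState = 0; // 2-bit state: (A << 1) | B
    private long ticks = 0;    // signed tick count

    // Transition table indexed by (lastState << 2) | newState;
    // entries are -1, 0 or +1 depending on direction (0 for no
    // change or an invalid two-bit jump).
    private static final int[] DELTA = {
         0, +1, -1,  0,
        -1,  0,  0, +1,
        +1,  0,  0, -1,
         0, -1, +1,  0
    };

    /** Call on every sample of the two encoder channels. */
    public long update(boolean a, boolean b) {
        int state = ((a ? 1 : 0) << 1) | (b ? 1 : 0);
        ticks += DELTA[(lastState << 2) | state];
        lastState = state;
        return ticks;
    }
}
```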

Team Michigan has also made extensive use of a Dimension uPrint rapid-prototyping printer to fabricate sensor mounts, cases, and other small parts.

This printer creates models in ABS plastic, which is strong enough for most non-drivetrain purposes.

The robot is powered by a 720 Wh battery with a nominal voltage of 24 V, sufficient to run all of the robot's systems (including the computer) for about four hours, just over the 3.5 hours of the MAGIC course.
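
That endurance implies an average draw of roughly 180 W (720 Wh over four hours) across the drive, sensing and computing subsystems.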

The battery is roughly the size of a loaf of bread.

A bank of switching DC/DC converters generates the additional voltages required by other subsystems.

The robots are controlled by two uOrc robotics controllers, which provide motor control and data acquisition.

These controllers were developed previously by the University of Michigan lab for an earlier robotics competition.

Each controller handles two motors.

These controllers are based around the Stellaris LM3S8962 Cortex-M3, a 32-bit ARM core running at 50 MHz.

The controllers interface to the laptop via an Ethernet connection.
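
The article does not detail the wire protocol, but a link of this kind can be as simple as a datagram carrying one duty cycle per motor. The Java sketch below assumes a made-up packet layout, controller address and port purely for illustration:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Hypothetical motor command over Ethernet; the packet layout, IP
// address and port are assumptions, not the real uOrc protocol.
public class MotorCommandSender {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            ByteBuffer buf = ByteBuffer.allocate(8);
            buf.putFloat(0.25f); // duty cycle, motor 0
            buf.putFloat(0.25f); // duty cycle, motor 1 (two motors per controller)

            InetAddress controller = InetAddress.getByName("192.168.1.50"); // assumed address
            socket.send(new DatagramPacket(buf.array(), buf.capacity(),
                    controller, 2378)); // assumed port
        }
    }
}
```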

“We wanted to push into a new regime of multi-robot systems,” Olson told ADM.

“There's a lot of neat things that happen when you go with large fleets: you can cover area faster, you can tolerate losses/breakdowns, you can afford to deploy robots as communication relays, etc.

“The challenge is getting them to agree on what the world looks like and to coordinate with each other - and that's one of my research interests.”

Sensors

The primary sensor is a combined camera/LIDAR (light detection and ranging) rig.

The camera is a Point Grey Firefly MV USB camera with a 2.8 mm fixed-focal-length lens, giving about a 90-degree field of view.

The camera is mounted on a pair of AX-12 servos, which allow it to pan and tilt around its focal point, making panorama generation straightforward.

Integrated into the same sensor mount is a Hokuyo UTM-30LX LIDAR range finder.

It is mounted on a third AX-12 servo, allowing the robot to produce 3D point clouds.

Because the two sensors are well-calibrated with respect to each other, Team Michigan can obtain colour data for laser points, or ranges for camera pixels.
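
With that calibration in hand, colouring a laser point reduces to transforming it into the camera frame and projecting it through a pinhole lens model. A minimal Java sketch (the intrinsic values are assumptions for a 640x480 imager, not the team's calibration):

```java
// Projects a 3D point, already expressed in the camera frame with z
// pointing forward, onto the image plane. Intrinsics are illustrative.
public class LaserColouring {
    static final double FX = 320, FY = 320; // assumed focal lengths in pixels
    static final double CX = 320, CY = 240; // assumed principal point

    /** Returns the pixel {u, v}, or null if the point is not visible. */
    static int[] project(double x, double y, double z) {
        if (z <= 0) return null; // behind the camera
        int u = (int) Math.round(FX * x / z + CX);
        int v = (int) Math.round(FY * y / z + CY);
        if (u < 0 || u >= 640 || v < 0 || v >= 480) return null; // off-image
        return new int[] { u, v };
    }
}
```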

The camera/LIDAR mount was fabricated using the uPrint rapid prototyper.

The GPS unit is a consumer-grade Garmin GPS-18x.

The team's simultaneous localisation and mapping (SLAM) algorithms provide the navigational information needed for operation, eliminating the need for a more complicated differential or real-time kinematic (RTK) GPS system.

“This non-reliance on GPS gives our system an advantage in dense urban areas and in combat areas where GPS could be jammed,” Olson said.

“We developed an inertial measurement unit (IMU) which has four gyro axes (we sense yaw twice in order to reduce noise), three accelerometer axes, three magnetometer axes, and a barometric altimeter with a resolution of about 10 inches.”

This IMU is the size of a business card and is a bus-powered USB device.
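
A quick simulation shows why sensing yaw twice pays off: averaging two independent gyros with equal noise cuts the noise standard deviation by a factor of √2. The Java sketch below is illustrative, with assumed noise figures:

```java
import java.util.Random;

// Demonstrates the sqrt(2) noise reduction from averaging two
// independent yaw gyros (noise figures are assumed for illustration).
public class DualYawGyro {
    public static void main(String[] args) {
        Random rng = new Random(42);
        double trueRate = 0.1; // rad/s, assumed true yaw rate
        double sigma = 0.02;   // rad/s, assumed per-gyro noise
        double errSingle = 0, errFused = 0;
        int n = 100_000;
        for (int i = 0; i < n; i++) {
            double a = trueRate + sigma * rng.nextGaussian();
            double b = trueRate + sigma * rng.nextGaussian();
            double fused = 0.5 * (a + b); // simple average of both gyros
            errSingle += (a - trueRate) * (a - trueRate);
            errFused += (fused - trueRate) * (fused - trueRate);
        }
        System.out.printf("single gyro RMS: %.4f, fused RMS: %.4f%n",
                Math.sqrt(errSingle / n), Math.sqrt(errFused / n));
    }
}
```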

Processing all of the sensor data requires a lot of computational power, and that's where a Lenovo ThinkPad T410 laptop with a dual-core Intel Core i5 running at 2.4 GHz comes in.

The Team Michigan software runs on the open-source Ubuntu 10.04 operating system and is primarily written in Java.

“Some of our software is already open source; contest-specific code will be released after the competition,” Olson said.

“Our robots communicate with the base station using two separate radio systems: a 900 MHz XTend system (which is long range but only about 10 kB/s), and an 802.11g mesh network built from OM1P nodes,” Olson explained.

“While the raw bandwidth of the 802.11g radios is very high in comparison to the XTend radios, the need to relay messages quickly eats away at this bandwidth.”
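
This is typical of single-radio mesh networks: because every hop shares the same channel, each relay can roughly halve the throughput available to the original sender.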

Task allocation is handled centrally at the ground control station, either by a reward-based planner or manually by a human operator.

These tasks are fairly high-level, such as ‘travel to (x,y)’.

Such a command can send a robot well beyond its sensor horizon.

While travelling, the robots autonomously identify obstacles, plan paths, and detect objects of interest.

Exceptions are reported back to ground control, which can result in either a new task assignment or human intervention.
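
In software, a task at this level of abstraction can be a very small message. The sketch below is hypothetical, not Team Michigan's actual tasking interface:

```java
// Hypothetical high-level task message; names and fields are
// illustrative, not the team's protocol.
public class DriveToTask {
    public final int robotId;   // which robot the task is assigned to
    public final double x, y;   // goal position in the global frame, metres
    public final long taskId;   // lets exceptions refer back to the task

    public DriveToTask(int robotId, double x, double y, long taskId) {
        this.robotId = robotId;
        this.x = x;
        this.y = y;
        this.taskId = taskId;
    }

    /** Robots report back only exceptions, e.g. an impassable obstacle. */
    public interface ExceptionListener {
        void onException(long taskId, String reason);
    }
}
```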

“Getting the robots to agree on a common coordinate system is one of the central challenges of the competition,” Olson said.

“In our system, each robot maintains a map in its own private coordinate system.

“This map is constructed using a combination of odometry, IMU, and 3D laser scan matching.

“Robots share information about their coordinate systems in two ways.

“The first way is by observing another robot: the 2D barcodes on each robot allow robots to recognize each other and thus register each other's coordinate systems.

“This system is based on the AprilTag visual fiducial system, which our lab has made open source.”

Robots can also register coordinate frames by matching 3D scans from other robots.

This method can be much more accurate, but requires significantly more radio bandwidth and produces outliers that must be rejected.
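
Either way, the product of a registration is a rigid transform between two robots' private frames. A sketch of the 2D geometry, with poses written as (x, y, theta) triples (illustrative only, not Team Michigan's code):

```java
// Registers frame B to frame A given each robot's pose in its own
// frame and A's observation of B (e.g. from an AprilTag sighting).
public class FrameRegistration {
    /** Composes rigid transforms: applies b first, then a. */
    static double[] compose(double[] a, double[] b) {
        double c = Math.cos(a[2]), s = Math.sin(a[2]);
        return new double[] {
            a[0] + c * b[0] - s * b[1],
            a[1] + s * b[0] + c * b[1],
            a[2] + b[2]
        };
    }

    /** Inverts a rigid transform (x, y, theta). */
    static double[] invert(double[] t) {
        double c = Math.cos(t[2]), s = Math.sin(t[2]);
        return new double[] { -c * t[0] - s * t[1], s * t[0] - c * t[1], -t[2] };
    }

    /** Returns the transform mapping frame-B coordinates into frame A. */
    static double[] frameBtoA(double[] poseAinA, double[] obsBfromA, double[] poseBinB) {
        double[] poseBinA = compose(poseAinA, obsBfromA); // B's pose seen in frame A
        return compose(poseBinA, invert(poseBinB));
    }
}
```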

Team Michigan plans to expand upon this element of its solution after MAGIC has been completed.
