Real-Time LiDAR Plane Detection

ilab
Detecting planes in 3D LiDAR Point Clouds

Many sensors used in robotic perception produce 3D point clouds. Stitching such point clouds together into a consistent 3D representation of the world is a computationally expensive process that often can't be done in real time. Our algorithm exploits the geometry of one such sensor (the Velodyne LiDAR) to detect planar structures in the environment, and uses these structures to efficiently reconstruct the path of the robot.
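
For a taste of what plane detection involves, here is a minimal sketch of one common planarity test (not necessarily the method from our paper): fit a plane to a neighborhood of points via the eigendecomposition of their covariance, and call the patch planar when the smallest eigenvalue is small relative to the others. It assumes the Eigen library, and the threshold is illustrative.

    #include <Eigen/Dense>
    #include <vector>

    struct PlaneFit {
      Eigen::Vector3d normal;    // unit normal of the fitted plane
      Eigen::Vector3d centroid;  // a point on the plane
      bool isPlanar;             // true if a plane explains the patch well
    };

    PlaneFit fitPlane(const std::vector<Eigen::Vector3d>& pts) {
      // Centroid of the neighborhood
      Eigen::Vector3d c = Eigen::Vector3d::Zero();
      for (const auto& p : pts) c += p;
      c /= static_cast<double>(pts.size());

      // Scatter (covariance, up to scale) of the neighborhood
      Eigen::Matrix3d cov = Eigen::Matrix3d::Zero();
      for (const auto& p : pts) cov += (p - c) * (p - c).transpose();

      // Eigenvalues come out in ascending order; the eigenvector of the
      // smallest eigenvalue is the plane normal.
      Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
      double curvature = es.eigenvalues()(0) / es.eigenvalues().sum();
      return { es.eigenvectors().col(0), c, curvature < 0.01 };
    }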

Check it out here: Paper PDF

Cereal

ilab
An open-source C++11 serialization library.

cereal takes arbitrary data types and reversibly turns them into different representations, such as compact binary encodings, XML, or JSON. cereal was designed to be fast, lightweight, and easy to extend - it has no external dependencies and can be easily bundled with other code or used standalone.
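
Here is a minimal example of the API. The struct and field names are mine, but the archive class, the CEREAL_NVP macro, and the serialize method follow cereal's documented usage.

    #include <cereal/archives/json.hpp>
    #include <cereal/types/vector.hpp>
    #include <iostream>
    #include <vector>

    struct Pose {
      double x, y, theta;

      // cereal finds this single method and uses it for both saving and
      // loading, which is what makes the transformation reversible.
      template <class Archive>
      void serialize(Archive& ar) {
        ar(CEREAL_NVP(x), CEREAL_NVP(y), CEREAL_NVP(theta));
      }
    };

    int main() {
      std::vector<Pose> path{{0, 0, 0}, {1.5, 2.0, 0.3}};
      cereal::JSONOutputArchive archive(std::cout);
      archive(CEREAL_NVP(path));  // swap in BinaryOutputArchive or
                                  // XMLOutputArchive for other formats
    }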

cereal was a collaboration between Shane Grant and me, and was built to replace the Boost.Serialization library in NRT.

Check it out here: GitHub Documentation Page

Neuromorphic Robotics Toolkit

ilab
A modular programming framework for vision-based robotics.

The Neuromorphic Robotics Toolkit, developed with Dr. Laurent Itti, is a C++ framework for writing distributed, modular applications. The main focus of the project is fine-grained modularity for computer vision applications, which requires a very high-performance messaging system.

The main features of the toolkit are:

  • Static module interface declarations, which lead to more predictable (and understandable!) modules and help the compiler catch many mistakes for you (see the sketch after this list).
  • Modules are loaded into the same address space by default, allowing extremely efficient communication. In most cases the only communication overhead is a pointer copy.
  • Seamless integration into a multi-processing mode with optional message serialization over Ethernet. This lets you split your application across multiple address spaces for debugging, or across multiple machines for distributed computing.
  • A full-featured, user-friendly Image class and supporting processing library.
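
The sketch below is illustrative only - NRT's actual class names and macros differ - but it shows the first two ideas together: a module's message types are part of its type, so the compiler checks connections, and in-process delivery hands over nothing more than a pointer.

    #include <memory>

    struct ImageMessage   { /* pixel data ... */ };
    struct EdgeMapMessage { /* edge data ... */ };

    // A module statically declares what it consumes and what it produces.
    template <class In, class Out>
    class Module {
    public:
      virtual ~Module() = default;
      // In-process delivery: only a shared_ptr changes hands.
      virtual std::shared_ptr<Out> onMessage(std::shared_ptr<In> msg) = 0;
    };

    // Connecting this module to anything that doesn't produce
    // ImageMessage is a compile-time error, not a runtime surprise.
    class EdgeDetector : public Module<ImageMessage, EdgeMapMessage> {
    public:
      std::shared_ptr<EdgeMapMessage>
      onMessage(std::shared_ptr<ImageMessage> img) override {
        auto edges = std::make_shared<EdgeMapMessage>();
        // ... run edge detection on *img ...
        return edges;
      }
    };
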
Check it out here: nrtkit.org

Neovision2

ilab
A DARPA funded project to develop an object recognition system based on neuroscience models.

The task of the Neovision2 project is to develop a state-of-the-art object recognition system based on current neuroscience research, and to prove that such a biologically plausible system is superior to systems without biological inspiration.

My main contributions to the project were:

  • Fine-tuned the NRT API in support of the other researchers, since the system was built on NRT.
  • Researched fast motion-based segmentation algorithms.
  • Implemented a full keyframe-based annotation GUI in Qt to generate ground truth for object recognition training.
  • Implemented Bayesian filters for tracking (a minimal sketch follows this list).
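
As a sketch of the tracking side, here is a minimal constant-velocity Kalman filter of the kind commonly used for 2D target tracking. It assumes the Eigen library, and the noise constants are illustrative rather than the project's.

    #include <Eigen/Dense>

    class Kalman2D {
      Eigen::Vector4d x;              // state: [px, py, vx, vy]
      Eigen::Matrix4d P;              // state covariance
      Eigen::Matrix4d F;              // constant-velocity transition
      Eigen::Matrix<double, 2, 4> H;  // we observe position only

    public:
      explicit Kalman2D(double dt) {
        x.setZero();
        P = Eigen::Matrix4d::Identity() * 100.0;  // start very uncertain
        F = Eigen::Matrix4d::Identity();
        F(0, 2) = dt;
        F(1, 3) = dt;
        H.setZero();
        H(0, 0) = 1;
        H(1, 1) = 1;
      }

      void predict() {
        Eigen::Matrix4d Q = Eigen::Matrix4d::Identity() * 0.1;  // process noise
        x = F * x;
        P = F * P * F.transpose() + Q;
      }

      void update(const Eigen::Vector2d& z) {  // z = measured position
        Eigen::Matrix2d R = Eigen::Matrix2d::Identity();  // measurement noise
        Eigen::Matrix2d S = H * P * H.transpose() + R;
        Eigen::Matrix<double, 4, 2> K = P * H.transpose() * S.inverse();
        x += K * (z - H * x);
        P = (Eigen::Matrix4d::Identity() - K * H) * P;
      }
    };
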
Check it out here: darpa.mil

Cognitive Technology Threat Warning System

ilab
A DARPA funded project to develop a surveillance system using models of the primate visual system.

The CT2WS project was initiated by DARPA to develop a soldier-portable visual threat detection platform, with the further goal of implementing the system using biologically plausible models of primate visual attention.

I acted as the primary developer in iLab, implementing, tuning, and interfacing a model of visual attention developed by Dr. Itti and Dr. Baldi. Our lab worked in close conjunction with Hughes Research Labs to produce a working system that fused the automatic detections of our attention model with the detected neural signatures of a human operator.
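
The core quantity in the Itti-Baldi model is Bayesian "surprise": the KL divergence between a unit's belief before and after new data arrives. The published model uses Poisson/Gamma units; the sketch below substitutes a Gaussian belief so the update fits in a few lines, and the decay constant is illustrative.

    #include <cmath>

    struct Gaussian { double mu, var; };

    // KL( N(post) || N(prior) ): how far, in nats, the new data moved
    // the belief away from where it was.
    double klGaussian(const Gaussian& post, const Gaussian& prior) {
      double d = post.mu - prior.mu;
      return 0.5 * (std::log(prior.var / post.var)
                    + (post.var + d * d) / prior.var - 1.0);
    }

    // One surprise update for a single visual unit: blend the new sample
    // into the belief, then report how surprising the change was.
    double surpriseUpdate(Gaussian& belief, double sample, double decay = 0.7) {
      Gaussian prior = belief;  // belief.var must start positive
      double d = sample - belief.mu;
      belief.mu = decay * belief.mu + (1 - decay) * sample;
      belief.var = decay * belief.var + (1 - decay) * d * d;
      return klGaussian(belief, prior);
    }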

Check it out here: darpa.mil

Beobot 2.0

ilab
A 16-core vision-based robotics research platform.

The Beobot 2.0 is a rolling mini cluster used in iLab for visual navigation research. Its mechanical base is an electric wheelchair, which holds a custom CPU rack, a water-cooling system, and a host of cameras and other sensors.

The main work I did on this project was as follows:

  • Designed various schematics for the XTX carrier boards and power boards.
  • Wrote the low-level communication library that allows the various components of the robot to cooperate (a sketch of the general approach follows this list).
  • Wrote the initial microcontroller/CPU communication and control code.
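
The sketch below is not the Beobot's actual protocol, but it shows the typical shape of such a microcontroller/CPU link: every message is framed with a start byte, a length, and a checksum, so the receiver can resynchronize after a dropped byte.

    #include <cstdint>
    #include <vector>

    constexpr uint8_t kStartByte = 0x7E;

    // Frame a payload (up to 255 bytes) as:
    //   [start][length][payload bytes ...][xor checksum]
    std::vector<uint8_t> framePacket(const std::vector<uint8_t>& payload) {
      std::vector<uint8_t> frame{kStartByte,
                                 static_cast<uint8_t>(payload.size())};
      uint8_t checksum = 0;
      for (uint8_t b : payload) {
        frame.push_back(b);
        checksum ^= b;  // simple XOR checksum
      }
      frame.push_back(checksum);
      return frame;
    }
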
Check it out here: Beobot Wiki Page

Variance Ridge Detector

ilab
A highly optimized edge detector written in C with SSE intrinsics.

I advised Sagar Pandya on his semester-long directed research project to implement a fast, SSE-optimized edge detector in C. The code has an easy-to-use plain C interface, compiles to a .mex file for interfacing with MATLAB, and has an interface to NRT.
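
To give a flavor of the SSE style (this is illustrative, not the project's code): the snippet below accumulates the sum and sum-of-squares of a float buffer four lanes at a time, from which the variance follows as sumSq/n - (sum/n)^2.

    #include <xmmintrin.h>  // SSE intrinsics

    void sumAndSumSq(const float* data, int n, float* sum, float* sumSq) {
      __m128 acc = _mm_setzero_ps(), accSq = _mm_setzero_ps();
      int i = 0;
      for (; i + 4 <= n; i += 4) {
        __m128 v = _mm_loadu_ps(data + i);            // load 4 floats
        acc   = _mm_add_ps(acc, v);                   // 4 partial sums
        accSq = _mm_add_ps(accSq, _mm_mul_ps(v, v));  // 4 partial sum-squares
      }
      float s[4], sq[4];
      _mm_storeu_ps(s, acc);
      _mm_storeu_ps(sq, accSq);
      *sum   = s[0] + s[1] + s[2] + s[3];
      *sumSq = sq[0] + sq[1] + sq[2] + sq[3];
      for (; i < n; ++i) {                            // scalar remainder
        *sum   += data[i];
        *sumSq += data[i] * data[i];
      }
    }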

Check it out here: GitHub Repository

NEMO

ilab
A tiny robotics board based on Gumstix and Arduino.

My first job as a graduate student at USC was TAing CS445 - Introduction to Robotics. At the time, the class was based around a 2 MHz microcontroller robotics board called the Handy Board.

After my first semester of teaching, it became obvious that the Handy Board was just no longer cutting the mustard when it came to teaching modern robotics. After much research, I determined that the best solution was to design my own custom board that I could use for the class. The result was a great success, and it has since allowed us to upgrade the CS445 lab to teach topics such as image processing and probabilistic robotics.

Here are some quick specs on the board:

  • Onboard 600 MHz Gumstix Overo with built-in WiFi for high-level processing. This allows students to simply SSH into their robots during labs to upload new code and monitor their status.
  • Onboard ATmega1280 for low-level processing. I preprogrammed this chip with all of the functionality that the students would need, and provided them with a simple API that runs on the Gumstix (a short example follows this list). For example, a student could simply call:
    • Nemo::setMotor(0, 100); to set motor 0 to 100% power.
    • float distance = Nemo::getSonar(6); to get the distance from a sonar connected to pin 6.
    • Image<PixRGB<byte> > img = Nemo::grabImage(); to grab an image from an attached USB webcam.
  • Dual-input, auto-balancing power supply with a custom switching regulator providing up to 10A of power.
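
Putting those calls together, a minimal student lab program might look like the sketch below. The header name, the second motor, and the sonar units are my assumptions; the API calls themselves are the ones listed above.

    #include <nemo/Nemo.H>  // hypothetical header name

    int main() {
      while (true) {
        float distance = Nemo::getSonar(6);  // forward-facing sonar on pin 6
        if (distance < 30.0f) {              // obstacle close: spin in place
          Nemo::setMotor(0, 100);
          Nemo::setMotor(1, -100);           // motor 1 assumed to be the right side
        } else {                             // path clear: drive forward
          Nemo::setMotor(0, 100);
          Nemo::setMotor(1, 100);
        }
      }
    }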

USC Competition Robotics Teams

personal
I am the technical advisor for USC's two competition robotics teams.

USC hosts two primarily undergraduate robotics teams that compete in yearly competitions:

  • The Underwater Robotics Team, which enters their autonomous underwater vehicle (AUV) in the AUVSI RoboSub Competition. This competition requires their AUV to autonomously traverse an underwater obstacle course and to perform tasks such as bumping into buoys, dropping markers into bins, and shooting "torpedoes" through targets.
  • The Aerial Robotics Team, which enters their unmanned aerial vehicle (UAV) in the AUVSI International Aerial Robotics Competition. This competition requires their UAV to autonomously enter an office building environment, and then navigate through the building to locate and remove a USB flash drive from one of the offices.

I advise both teams on all aspects of their designs, including software, electrical and mechanical.

Check it out here: uscrs.sagarpandya.com/

Randterm

personal
A simple Python serial terminal.

Randterm is a simple Python serial terminal inspired by the awesome (but Windows-only) Realterm. It was built mainly to help me debug serial protocols for microcontrollers, and so it includes the ability to send and display data in ASCII, decimal, hex, or binary.

Check it out here: GitHub Repository

Simple SLAM

personal
A bare-bones implementation of FastSLAM 1.0 written for MATLAB/Octave.

SimpleSLAM simulates a robot cruising around a 2D environment, occasionally receiving range and bearing measurements to landmarks. The robot's task is to simultaneously build a map of the environment and localize itself within that map. It was written to teach (myself and others) the fundamentals of simultaneous localization and mapping.
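
For a taste of the algorithm, here is a compact sketch of the particle-filter backbone of FastSLAM 1.0. The per-particle landmark EKFs are omitted for brevity (landmark positions are taken as known here), and all noise constants are illustrative.

    #include <cmath>
    #include <random>
    #include <vector>

    struct Pose { double x, y, theta; };
    struct Particle { Pose p; double weight; };

    std::mt19937 rng{42};
    const double kTwoPi = 6.283185307179586;

    // Sample from the motion model: drive at velocity v and turn-rate w
    // for dt seconds, with additive Gaussian noise.
    Pose sampleMotion(const Pose& p, double v, double w, double dt) {
      std::normal_distribution<double> nv(0, 0.05), nw(0, 0.01);
      double vel = v + nv(rng), yaw = w + nw(rng);
      return { p.x + vel * dt * std::cos(p.theta),
               p.y + vel * dt * std::sin(p.theta),
               p.theta + yaw * dt };
    }

    // Weight a particle by the likelihood of a range/bearing measurement
    // (r, b) to a landmark at (lx, ly).
    double likelihood(const Pose& p, double r, double b, double lx, double ly) {
      double dx = lx - p.x, dy = ly - p.y;
      double dr = r - std::hypot(dx, dy);
      double db = std::remainder(b - (std::atan2(dy, dx) - p.theta), kTwoPi);
      const double sr = 0.1, sb = 0.05;  // measurement std-devs
      return std::exp(-0.5 * (dr * dr / (sr * sr) + db * db / (sb * sb)));
    }

    // Resample: draw a fresh particle set proportional to the weights.
    std::vector<Particle> resample(const std::vector<Particle>& in) {
      std::vector<double> w;
      for (const auto& q : in) w.push_back(q.weight);
      std::discrete_distribution<int> pick(w.begin(), w.end());
      std::vector<Particle> out;
      for (std::size_t i = 0; i < in.size(); ++i)
        out.push_back({ in[pick(rng)].p, 1.0 });
      return out;
    }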

Check it out here: GitHub Repository

randolphvoorhies.com

personal
My personal website. You're looking at it!

I had a ton of fun writing this site because I used it as an excuse to learn as much as possible about modern web programming. Here's how the site works:

  • All pages are written using the Jinja2 templating engine, and all templates are rendered offline into pure HTML pages by a custom Python script prior to publishing.
  • The site scaffolding is based on Bootstrap, which makes layout really easy.
  • Styles are handled using Less, and are based on Bootstrap's source.
  • A custom Watchr script monitors the Less and Jinja template folders for changes, and automatically rebuilds whatever is necessary.
Check it out here: GitHub Repository