Bio-inspired active vision for mobile systems

Insects that rely on vision move through the world with impressive speed and robustness. Given their small brains and the limited time available to process visual information, they navigate without explicitly recognizing each object. Instead, they rely on a suite of built-in motion-detection mechanisms to extract the relevant motion of objects in the world. This approach is extremely fast and energy efficient. To further reduce the computational load, many insects strategically steer their eyes to simplify visual measurements. This form of "active vision" offers several advantages and has not been fully exploited by the bio-inspired engineering community. In this project, we integrate two motion-detection pathways found in insect vision (wide-field motion detection and moving-target detection) and explore how, combined with appropriate steering of the visual sensor, they can synergistically extract information about both static and moving obstacles.
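As a concrete illustration of the wide-field motion pathway, the sketch below implements a minimal Hassenstein-Reichardt correlator in Python, the classic model of insect wide-field motion detection. It is only an illustrative example, not the algorithm developed in this project; the 1-D photoreceptor array, the first-order low-pass filter standing in for the delay line, and the parameter `tau` are assumptions chosen for clarity.

```python
import numpy as np

def reichardt_correlator(frames, tau=0.9):
    """Estimate net horizontal wide-field motion from a sequence of 1-D
    luminance profiles using a Hassenstein-Reichardt correlator.

    frames : (T, N) array -- T time steps, N photoreceptor samples
    tau    : coefficient of the first-order low-pass that stands in for
             the delay line of each correlator half (illustrative value)
    """
    T, N = frames.shape
    delayed = np.zeros(N)       # low-pass-filtered (delayed) receptor signals
    responses = np.empty(T)
    for t in range(T):
        signal = frames[t]
        # Correlate each receptor's delayed signal with its neighbour's
        # undelayed signal in both directions, then subtract (opponency).
        rightward = delayed[:-1] * signal[1:]
        leftward = delayed[1:] * signal[:-1]
        responses[t] = np.sum(rightward - leftward)
        # Update the delay line (first-order low-pass filter).
        delayed = tau * delayed + (1.0 - tau) * signal
    return responses            # positive values = net rightward image motion


if __name__ == "__main__":
    # A sinusoidal grating drifting to the right gives a positive mean
    # response; reversing the drift flips the sign.
    x = np.arange(64)
    frames = np.array([np.sin(2 * np.pi * (x - 0.5 * t) / 16.0)
                       for t in range(200)])
    print(reichardt_correlator(frames).mean())   # > 0 for rightward drift
```

The opponent subtraction of the two mirror-image correlator halves is what gives the detector its direction selectivity, which is the property the wide-field pathway exploits to sense self-motion relative to static surroundings.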

Robot platforms

We will test our machine vision algorithms on two mobile robot platforms: micro racing drones and a classic two-wheel autonomous robot. Both systems are lightweight, low-cost, and ubiquitous. We insist on maintaining the same visual input quality and latency that a human operator would experience: an efficient bio-inspired algorithm should be able to perform close to a human operator.
