
Overview

Autonodyne is a Boston-based software company. We got our start in aviation but have since branched out across the air, sea, and land domains, using our software to autonomously control one, a few, or many dissimilar vehicles.

 

A long-term autonomy model may eventually eliminate all human involvement. For the foreseeable future, however, Autonodyne believes a supervisory human role will remain essential. By continuing to develop new autonomous behaviors for uncrewed vehicles, we are fostering the age of true autonomy, providing “additive autonomy” and sophisticated reasoning “at the edge” to enable Unmanned Vehicle (UV) products and services.

We subscribe to the school of thought described in ‘Our Robots, Ourselves’, in which human and machine work together, trading control and shifting levels of automation to suit the situation at hand. At certain times and places the vehicle is highly autonomous; at others, more human involvement is needed.

Software for One or Many Unmanned Vehicles (UVs)

Our portfolio of work includes control capabilities for a broad spectrum of vehicles, and we have developed autonomous behaviors that permit operating large groups or swarms of vehicles in multiple missions and maneuvers.

Sometimes operators want to pilot different types of vehicles simultaneously during a mission or need the vehicles to be able to interact as a team. Allowing dissimilar vehicles to work together is a cornerstone of our autonomous control technologies.
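One way to picture dissimilar vehicles sharing a single control layer is a common interface that each domain implements. The sketch below is purely illustrative (the class and method names are our own invention, not Autonodyne's actual API): one operator command fans out to a mixed team of air, sea, and land vehicles.

```python
from abc import ABC, abstractmethod


class Vehicle(ABC):
    """Hypothetical common interface for dissimilar unmanned vehicles."""

    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def goto(self, lat: float, lon: float) -> str:
        """Command the vehicle toward a waypoint; returns a status string."""


class UAV(Vehicle):
    def goto(self, lat, lon):
        return f"{self.name} (air) flying to ({lat}, {lon})"


class USV(Vehicle):
    def goto(self, lat, lon):
        return f"{self.name} (sea) sailing to ({lat}, {lon})"


class UGV(Vehicle):
    def goto(self, lat, lon):
        return f"{self.name} (land) driving to ({lat}, {lon})"


def command_team(team: list[Vehicle], lat: float, lon: float) -> list[str]:
    """One operator command fans out to every teammate, regardless of domain."""
    return [v.goto(lat, lon) for v in team]


team = [UAV("quad-1"), USV("skiff-1"), UGV("rover-1")]
for status in command_team(team, 42.35, -71.05):
    print(status)
```

Because each vehicle hides its domain-specific details behind the same interface, the controller never needs to know whether it is talking to an aircraft, a boat, or a ground robot.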


Sea UVs (USV)


Air UVs (UAV)


Land UVs (UGV)

Image: dissimilar vehicles from multiple domains operating together

We began with air vehicles (air domain) but have branched out to the land domain (unmanned ground vehicles, or UGVs) and the sea domain, both on the surface (unmanned surface vehicles, or USVs) and underwater (unmanned underwater vehicles, or UUVs).

This image shows a single operator on a dock only a few hundred meters away from our Boston office simultaneously controlling a UGV driving on the dock, a tethered UUV operating below the surface, and a UAV taking off and landing on a homemade USV "aircraft carrier".

Common teamings involve UAVs and USVs cueing objects of interest so that other UV teammates can perform reactive behaviors toward them or apply effects against them.

Human-System Interface

The human-system interface describes how humans and machines work together: a combination of software, user interface, and autonomy that creates a human-robot team more capable together than either is alone.

It includes planning, conducting, and analyzing a mission (i.e., mission planning, mission execution, and mission debrief). We tailor the interface to the scenario at hand: we have used voice control for UxS since 2019, began experimenting with augmented and virtual reality in 2018, and support gesture control and all forms of tactile controllers, giving you flexibility and options.

If the human-machine team is performing a complex operation, or if large numbers of robots are involved, an advanced and powerful interface is required. What human can think for 20, 100, or 1,000 robots?

This is where we come in - we combine the right user interface with advanced autonomy software. One without the other is an incomplete solution in our opinion.

Vehicle Control & Management

As we progress from one human controlling a single vehicle (“human-in-the-loop”), to a human monitoring multiple missions performed by multiple vehicles (“human-on/over-the-loop”), to perhaps eventually no humans involved (“what’s-a-loop?”), Autonodyne strives to create modern interfaces designed with the user and the mission in mind.
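A toy sketch of that progression (the level names and the `approve` hook below are our own illustrative inventions, not any real control-station API): the same control loop behaves differently depending on where the human sits relative to the loop.

```python
from enum import Enum, auto


class SupervisionLevel(Enum):
    IN_THE_LOOP = auto()      # human approves every action before it runs
    ON_THE_LOOP = auto()      # human is notified and may veto
    OUT_OF_THE_LOOP = auto()  # vehicle acts fully autonomously


def execute(action: str, level: SupervisionLevel, approve=lambda a: True) -> str:
    """Run one action under a given supervision level.

    `approve` stands in for a human decision; it is a hypothetical hook
    used only for this sketch.
    """
    if level is SupervisionLevel.IN_THE_LOOP:
        # Nothing happens without explicit operator consent.
        return action if approve(action) else "held for operator"
    if level is SupervisionLevel.ON_THE_LOOP:
        # Act immediately, but surface the action for possible veto.
        print(f"notify operator: {action}")
        return action
    return action  # OUT_OF_THE_LOOP: no human involvement


print(execute("land at waypoint 3", SupervisionLevel.IN_THE_LOOP))
```

The point of the sketch is that shifting the supervision level changes the interface's job, not the vehicle's: in-the-loop demands an approval dialog, on-the-loop demands good situational awareness, and out-of-the-loop demands trust in the autonomy itself.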

 

We are working not to build engineering interfaces for engineers but to find a design that blends simplicity and power. We aim to give the human operator/supervisor/monitor the right level of situational awareness and the ability to effect changes when needed. From 100% manual input (e.g., joystick and throttle control) and optional voice control, to approving a suggested course of action, to simply conveying “commander’s intent,” the Autonodyne control stations are that powerful form of functional artwork.

A CBX and ruggedized tablet running Autonodyne software
Many Behaviors Increase Capabilities

Autonodyne’s library of software behaviors permits your vehicle or team of vehicles to perform a variety of mission-specific maneuvers. 

Commanding a vehicle with a set of behaviors allows the humans in the human-machine team to make better-informed decisions, expand their reach and access, and increase safety and productivity, while letting the vehicles focus on what they do best.
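One way to picture a behavior library is a registry that maps behavior names to executable routines, with a vehicle commanded by a list of names. This is a generic sketch; the behavior names and registry below are invented for illustration and are not Autonodyne's actual inventory.

```python
# Hypothetical behavior registry: maps a behavior name to a function that
# takes a vehicle identifier and returns a status string.
BEHAVIORS = {
    "orbit": lambda v: f"{v}: orbiting point of interest",
    "follow": lambda v: f"{v}: following teammate",
    "search": lambda v: f"{v}: searching assigned area",
}


def command(vehicle: str, behaviors: list[str]) -> list[str]:
    """Queue a set of named behaviors on a vehicle, skipping unknown names."""
    return [BEHAVIORS[b](vehicle) for b in behaviors if b in BEHAVIORS]


for status in command("uav-7", ["search", "orbit"]):
    print(status)
```

A registry like this is what makes a behavior library extensible: adding a new mission-specific maneuver means registering one new entry, not changing the command path.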

 

For a complete inventory of our behaviors see Autonomy Behaviors.
