
Six cool new autonomous technologies

Added to IoTplaybook or last updated on: 05/06/2022

“Autonomous” may sometimes seem like just another overhyped tech buzzword—but in reality it is the future. People alive today will experience in their lifetimes a world in which autonomous systems are commonplace.

At its core, an autonomous system is any technology that can make its own decisions. A self-driving car deciding how to navigate a road is an autonomous system, as are the human-like robots that are a staple of science fiction.

In most contemporary applications, autonomous systems make decisions with low stakes. A Roomba, for example, will determine the most efficient path to vacuum a home. But increasingly, autonomous systems are able to make decisions that carry more weight. It is this capability that will change the world.

Here we present six examples of burgeoning autonomous technologies that are at the cutting edge today.

EVAA prevents plane crashes with its moral compass

The Trolley Problem is an infamous philosophical exercise in ethical reasoning. A trolley is careening down tracks towards a group of five people, who will surely die. But there is a switch in the tracks, and you happen to be standing next to the control lever. If you pull the switch, the trolley will move to another track where it will only kill one person. Do you pull the switch?

This is a question that self-driving cars will have to answer—a question that becomes more interesting if you, the car’s owner, are the person the car must sacrifice. And that question will apply to more than self-driving cars.

Expandable Variable Autonomy Architecture (EVAA) is autonomous technology developed by NASA’s Resilient Autonomy team and the Federal Aviation Administration for use in aircraft. Its purpose is to avoid aviation accidents by giving aircraft autonomous capabilities, either assisting human pilots or acting on its own. According to Mark Skoog, principal investigator for autonomy research at NASA's Armstrong Flight Research Center, EVAA has already saved 11 lives.

At first glance, EVAA looks like the many other autonomous flight systems that exist today. But EVAA’s “moral compass” is unique because it can make a decision when it encounters something like The Trolley Problem. Under normal conditions, the moral compass doesn’t need to make any desperate choices. But in an emergency scenario, the moral compass helps EVAA decide what action to take.

The moral compass acts according to a prioritized list of objectives. If Air Force One possessed EVAA, for example, the president’s life would be the highest priority. In a fighter jet, on the other hand, the pilot might receive a lower priority than innocent civilians on the ground during a worst-case scenario. EVAA is an aircraft-specific technology, but we will see similar systems in self-driving cars.
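The prioritized-objective idea can be sketched in a few lines of Python. This is purely illustrative: the priority categories, the option format and the selection rule below are assumptions for the sake of example, not EVAA's actual logic, which is not public in this article.

```python
# Illustrative priority-ordered decision making, loosely inspired by the
# "moral compass" description above. Lower number = more strongly protected.
PRIORITY = {"people_on_ground": 0, "pilot": 1, "aircraft": 2, "property": 3}

def choose_action(options):
    """Pick the emergency option whose worst consequence falls on the
    least-protected asset, i.e. minimize the most serious harm."""
    def worst_harm(option):
        # The most-protected asset this option endangers.
        return min((PRIORITY[h] for h in option["harms"]),
                   default=len(PRIORITY))
    # Prefer the option whose worst harm falls on the least-protected asset.
    return max(options, key=worst_harm)

options = [
    {"name": "continue_course", "harms": ["people_on_ground", "aircraft"]},
    {"name": "ditch_in_field", "harms": ["aircraft"]},
]
print(choose_action(options)["name"])  # → ditch_in_field
```

Sacrificing the aircraft wins here because the alternative endangers people on the ground, who sit higher in the assumed priority list.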

Autonomous tractors increase farm productivity

The world population has more than tripled in the last century, yet people are now better fed than ever. That is thanks to agricultural advancements that allow for higher-yielding crops, improved food storage and better distribution of resources. But if world population growth continues to skyrocket, we’ll need new advancements. Agriculture is one industry in which autonomy offers tremendous room for improvement.

The idyllic concept of small, family-run farms is now antiquated. Most of the world’s food comes from large industrial farms. While many may debate the ethics of the modern agricultural industry, its efficiency in delivering more food at a lower cost is undeniable.

Autonomous farming can increase that efficiency even further. This starts with self-operating tractors and other machinery. John Deere started working on autonomous farming equipment years ago and unveiled a new autonomous tractor at CES this year that may push this idea into the mainstream. Instead of requiring a human operator who needs breaks and goes home after their shift, tractors like this can run all day—and night—without direct human control.

We should see these autonomous tractors in everyday use long before our roads become congested with self-driving cars. Unlike those cars, autonomous tractors don’t need to contend with other drivers, hard-to-identify road signs or hazardous conditions.

Self-driving tractor trailers fight driver fatigue

Machines excel at repetitive and monotonous tasks, and there are few jobs more monotonous than long-haul trucking. Truckers can cover more than 600 miles a day and their work is integral to our modern economy. But those miles are difficult and accidents happen when truckers become fatigued, which is why the U.S. government regulates how many hours truckers can drive in a day.

Autonomous systems improve safety and reduce fatigue-related accidents. Much like the active driver safety features present in many modern cars, these systems can take over in emergency situations to prevent collisions. Lane-keeping assistance and emergency braking alone will save many lives every year.

As in the farming industry, autonomous systems will enable around-the-clock trucking. Current regulations cap a trucker’s driving time at 11 hours in a 24-hour period. That means that a truck with a single driver will remain stationary for the other 13 hours of the day. A fully autonomous truck could operate 24 hours a day—more than doubling efficiency.
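The "more than doubling" claim is simple arithmetic on the hours-of-service figures above:

```python
# Utilization comparison using the hours-of-service figures cited above.
HUMAN_DRIVING_HOURS = 11   # regulatory cap per 24-hour period
AUTONOMOUS_HOURS = 24      # a fully autonomous truck could run nonstop

gain = AUTONOMOUS_HOURS / HUMAN_DRIVING_HOURS
print(f"Utilization gain: {gain:.2f}x")  # about 2.18x, i.e. more than double
```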

Most of the same challenges facing self-driving cars apply here, too. For the foreseeable future, trucks will require human operators. But semi-autonomous driver assistance features will still improve safety and efficiency in the coming years.

Autonomous submarine explores Antarctica’s ‘doomsday’ glacier

While industry is always important, so are scientific and environmental pursuits. Global warming is a dire issue that affects us all, and Antarctica’s crumbling Thwaites glacier—the so-called “doomsday” glacier—is a looming threat.

The Thwaites glacier is already losing 50 billion tons of ice a year and accounts for about 4% of global sea level rise. Preliminary research suggests that the situation could soon become much worse. If current predictions hold, an ice shelf extending from the Thwaites glacier could crumble in the next five years. That could ultimately result in a global sea level rise of up to 10 feet.

To gather more data, several autonomous submersible robots are launching to explore the ocean around and below the Thwaites glacier. Included in the fleet is Boaty McBoatface, which has a silly name but an important job. It can operate autonomously undersea for hundreds of miles and can dive to depths of more than 3 miles below the surface.

It will use a host of sensors to measure the sea water’s temperature, salinity, oxygenation and much more. This data will help scientists better understand the conditions around the Thwaites glacier so that they can develop more accurate predictions about the threat it poses.

Robotic spacecraft traverse the solar system

In the previous examples, the autonomous system is replacing a human operator. But space exploration is unique. We haven’t set foot on any body in our solar system other than the moon because space is inhospitable and space travel takes a long time. By using autonomous robots and spacecraft as intermediaries, we can explore our solar system and beyond.

Every rover sent to Mars was at least semi-autonomous, which was necessary because of the distance between Earth and Mars. Depending on their current positions, it can take anywhere from five to 20 minutes to send a signal between Earth and Mars. That time doubles in most situations because we must receive telemetry and then send back commands. Thanks to autonomy, Mars rovers can continue working and only need to respond to new objectives sent from Earth.

The communication challenges increase the further into space we travel. Voyager 1 and Voyager 2 are robotic probes launched in 1977. Both are still operational and are past the boundary of the heliosphere, meaning they are now in interstellar space. Voyager 1 is now more than 150 AU from Earth, so it takes almost a day to receive data sent by the probe—far too long to make direct control practical.
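Those delays follow directly from the speed of light. A quick check of the one-way light times, using the approximate distances from the text (Mars's distance from Earth varies with the planets' orbital positions):

```python
# One-way signal travel time at the speed of light.
AU_KM = 149_597_870.7    # kilometers per astronomical unit
C_KM_S = 299_792.458     # speed of light in km/s

def one_way_hours(distance_au):
    return distance_au * AU_KM / C_KM_S / 3600

# Mars at roughly 2.5 AU, near its farthest from Earth:
print(f"Mars (far): {one_way_hours(2.5) * 60:.0f} minutes")  # about 21 minutes
# Voyager 1 at roughly 150 AU:
print(f"Voyager 1: {one_way_hours(150):.1f} hours")          # about 20.8 hours
```

The 150 AU figure works out to nearly 21 hours each way, which is why direct control of Voyager 1 is impractical.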

Both Voyager 1 and Voyager 2 photographed Titan, one of Saturn’s moons, as they passed nearby. Huygens was an atmospheric probe that touched down on the surface of Titan in 2005. And it is on Titan where autonomous robots may collect very important data on prebiotic life and could even find living extraterrestrial organisms.

The Dragonfly spacecraft will launch in 2027 and will carry a drone designed to fly in Titan’s atmosphere. It will operate autonomously and is capable of landing at specific exploration sites to analyze samples and gather data. Transmission time between Titan and Earth is 70-90 minutes each way, making autonomous operation a necessity.

Space agencies, universities, and even private labs have proposed several other Titan missions over the years and some of them are still under consideration. Many of these focus on robotic submersibles, similar to Boaty McBoatface, that can explore Titan’s oceans. All of them rely on autonomous technology.

The autonomous home of the future

In 1957, Disneyland unveiled the Monsanto House of the Future in Tomorrowland. Monsanto, the Massachusetts Institute of Technology and Disney built the house as an exhibit to give visitors a peek into the future of home living. While a primary focus was to showcase plastics, a growing industry that Monsanto was experimenting with at the time, the house also contained cutting-edge appliances such as microwave ovens.

Microsoft, Hewlett-Packard and Disney followed up the Monsanto House of the Future with Innoventions in 1998. It featured much more technology, including devices and appliances that would fit under the Internet of Things (IoT) and smart home industry umbrellas today.

Many of the futuristic concepts seen in the Innoventions exhibit are commonplace today. Smart home assistant devices, like Amazon Alexa, are popular and can control IoT devices. With a simple voice command, Alexa can adjust your thermostat, activate your lawn sprinklers or preheat your oven. This trend will continue and expand thanks to autonomous capability.

Instead of responding to explicit commands, the autonomous home of the future will anticipate the needs of its occupants and act accordingly. A Nest thermostat can already do that for an HVAC system today by learning its user’s preferences and habits, but soon your entire home will follow suit.

When you pull into your driveway, your front porch light will turn on to welcome you. As you walk to the front door, the lock will disengage. Before you step foot inside, your interior lights will illuminate to your desired level and your favorite music will play through smart speakers. When you go to bed, your security system will arm itself and your curtains will close for privacy.

The next morning, your bedroom lights will brighten when it is time to wake up. When you get out of the shower, your coffee will be ready. After you leave, your robotic vacuum will get to work and your refrigerator will place an order for groceries. All of that will happen without your direct control.

This might sound like something out of “The Jetsons.” It is all made possible today by a combination of existing IoT technology and modern machine learning. The latter recognizes patterns, such as the time you wake up each morning, and learns what you want to happen as a result.
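As a toy illustration of that pattern learning (not any vendor's actual algorithm), a home controller could average recent wake-up times from its event log and schedule the lights accordingly:

```python
from statistics import mean

# Hypothetical log: minutes past midnight when the user got up recently.
wake_times = [6 * 60 + 55, 7 * 60 + 5, 7 * 60, 6 * 60 + 58, 7 * 60 + 2]

# Learn the routine by averaging, then schedule tomorrow's wake-up lighting.
predicted = round(mean(wake_times))
hours, minutes = divmod(predicted, 60)
print(f"Brighten bedroom lights around {hours:02d}:{minutes:02d}")  # → 07:00
```

Real systems use far richer models, but the principle is the same: observe repeated behavior, then act on the inferred routine without an explicit command.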

We will see autonomous homes with capabilities like these as soon as the general public becomes comfortable with the idea. As with the microwave oven in the Monsanto House of the Future, it may take some time before we see widespread adoption. But it is inevitable, because the convenience autonomous systems offer is overwhelming.

Cameron Coward, senior technology writer at Avnet

 

Avnet

This content is provided by our content partner Avnet, a global technology solutions provider with end-to-end ecosystem capabilities. Visit them online for more great content like this.

This article was originally published at Avnet. It was added to IoTplaybook or last modified on 05/06/2022.