This robot ‘chef’ can follow video instructions to make a very simple salad https://www.popsci.com/technology/robot-salad-chef-maker/ Mon, 05 Jun 2023 14:30:00 +0000
It may not make it on 'Top Chef,' but the robot's learning abilities are still impressive.
The robot even created its own salad recipe after learning from examples. University of Cambridge

It may not win a restaurant any Michelin stars, but a research team’s new robotic ‘chef’ is still demonstrating some impressive leaps forward for culinary tech. As detailed in the journal IEEE Access, a group of engineers at the University of Cambridge’s Bio-Inspired Robotics Laboratory recently cooked up a robot capable of assembling a slate of salads after watching human demonstration videos. From there, the robot chef was even able to create its own, original salad based on its previous learning.

“We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can—by identifying the ingredients and how they go together in the dish,” the paper’s first author Greg Sochacki, a Cambridge PhD candidate in information engineering, said in a statement.

[Related: 5 recipe apps to help organize your meals.]

What makes the team’s AI salad maker even more impressive is that the robot utilized a publicly available, off-the-shelf neural network already programmed to visually identify fruits and vegetables such as oranges, bananas, apples, broccoli, and carrots. The neural network also examined each video frame to identify the various objects, features, and movements depicted—for instance, the ingredients used, knives, and the human trainer’s face, hands, and arms. Afterwards, the videos and recipes were converted into vectors that the robot could then mathematically analyze.

Of the 16 videos observed, the robot correctly identified the recipe depicted 93 percent of the time, while recognizing only 83 percent of the human chef’s movements. Its observational abilities were so detailed, in fact, that the robot could tell when a recipe demonstration featured a double portion of an ingredient or when a human made a mistake, and recognize that these were variations on a learned recipe rather than an entirely new salad. According to the paper’s abstract, “A new recipe is added only if the current observation is substantially different than all recipes in the cookbook, which is decided by computing the similarity between the vectorizations of these two.”
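
The abstract's description amounts to a nearest-neighbor check against a cookbook of recipe vectors. Below is a minimal Python sketch of that logic; the cosine-similarity measure and the threshold value are assumptions for illustration, since the excerpt above doesn't specify either.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.9  # hypothetical cutoff; the paper's value isn't quoted here


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two recipe vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def maybe_add_recipe(observation: np.ndarray, cookbook: list) -> bool:
    """Add the observed recipe only if it is substantially different
    from every recipe already in the cookbook."""
    if any(cosine_similarity(observation, known) >= SIMILARITY_THRESHOLD
           for known in cookbook):
        return False  # a variation on a known recipe, e.g. a double portion
    cookbook.append(observation)
    return True


cookbook = []
print(maybe_add_recipe(np.array([2.0, 2.0, 0.0]), cookbook))  # True: first recipe
print(maybe_add_recipe(np.array([3.0, 3.0, 0.0]), cookbook))  # False: same ratios, scaled up
```

The second call mirrors the apples-and-carrots example below: scaling all ingredient quantities leaves the vector's direction, and therefore its cosine similarity, unchanged.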

Sochacki went on to explain that, while the recipes aren’t complex (think an un-tossed vegetable medley minus any dressings or flourishes), the robot was still “really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots.”

[Related: What robots can and can’t do for a restaurant.]

That said, there are still some clear limitations to the robotic chef’s chops—mainly, it needs clear, steady video footage of a dish being made with unimpeded views of human movements and their ingredients. Still, researchers are confident video platforms like YouTube could be utilized to train such robots on countless new recipes, even if they are unlikely to learn any creations from the site’s most popular influencers, whose clips traditionally feature fast editing and visual effects. Time to throw on some old reruns of Julia Child’s The French Chef and get to chopping.

A robot inspired by centipedes has no trouble finding its footing https://www.popsci.com/technology/centipede-robot-japan/ Thu, 01 Jun 2023 16:00:00 +0000 https://www.popsci.com/?p=545090
Researchers at Osaka University designed a 'myriapod' bot that uses less energy and computational power than other walking machines.
Centipedes' undulating movements can sometimes improve robot mobility. Deposit Photos

Last month, engineers at Georgia Institute of Technology unveiled a creepy, crawly centipede-inspired robot sporting a plethora of tiny legs. The multitude of extra limbs wasn’t simply meant to pay homage to the arthropods, but rather to improve the robot’s maneuverability across difficult terrains while simultaneously reducing the need for complicated sensor systems. Not to be outdone, a separate team of researchers in Japan just showed off their own biomimetic “myriapod” robot, which leverages natural environmental instabilities to move in curved motions, thus reducing its computational and energy requirements.

[Related: To build a better crawly robot, add legs—lots of legs.]

As detailed in an article published in Soft Robotics, a team at Osaka University’s Mechanical Science and Bioengineering department recently created a 53-inch-long robot composed of six segments, each sporting two legs and flexible joints. In a statement released earlier this week, study co-author Shinya Aoi explained that the team was inspired by certain “extremely agile” insects able to utilize their own dynamic instability to quickly change movement and direction. To mimic its natural counterparts, the robot includes tiny motors that control an adjustable screw to increase or decrease each segment’s flexibility while in motion. Adjusting that flexibility leads to what’s known as a “pitchfork bifurcation”: the robot’s straight-ahead, forward-moving gait becomes unstable.

But instead of tipping over or stopping, the robot can employ that bifurcation to begin moving in curved patterns to the left or right, depending on the circumstances. Taking advantage of this momentum allowed the team to control their robot extremely efficiently, and with much less computational complexity than other walking bots.
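
For readers curious about the term, a pitchfork bifurcation has a textbook mathematical normal form, dx/dt = mu*x - x^3: below a critical parameter value the straight-ahead state (x = 0) is the only stable one, and above it two mirror-image states appear, loosely analogous to the robot's left- and right-curving gaits. The Python sketch below illustrates that normal form only; it is not the Osaka team's actual model.

```python
from math import sqrt

def fixed_points(mu: float) -> list:
    """Equilibria of the pitchfork normal form dx/dt = mu*x - x**3.

    x = 0 stands in for straight-ahead walking; the +/- sqrt(mu)
    branches stand in for the left- and right-curving gaits that
    appear once the control parameter mu crosses zero.
    """
    if mu <= 0:
        return [0.0]                   # straight gait is the only (stable) state
    return [0.0, sqrt(mu), -sqrt(mu)]  # straight gait now unstable; curves are stable

for mu in (-0.5, 0.0, 0.5):
    print(mu, fixed_points(mu))
```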

As impressive as many bipedal robots now are, their two legs can often prove extremely fragile and susceptible to failure. What’s more, losing control of one of those limbs can easily render the machine inoperable. Increasing the number of limbs, à la a centipede robot, creates system redundancies that also expand the terrains it can handle. “We can foresee applications in a wide variety of scenarios, such as search and rescue, working in hazardous environments or exploration on other planets,” explained Mau Adachi, one of the paper’s other co-authors.

[Related: NASA hopes its snake robot can search for alien life on Saturn’s moon Enceladus.]

Such serpentine robots are attracting the attention of numerous researchers across the world. Last month, NASA announced the latest advancements on its Exobiology Extant Life Surveyor (EELS), a snake-bot intended to potentially one day search Saturn’s icy moon Enceladus for signs of extraterrestrial life. Although EELS utilizes a slithering movement via “rotating propulsion units,” it’s not hard to envision it doing so alongside a “myriapod” partner—an image that’s as cute as it is exciting.

The Dallas airport is testing out EV charging bots that roll around like suitcases https://www.popsci.com/technology/ziggy-ev-charging-robot-dallas-airport/ Wed, 31 May 2023 22:00:00 +0000 https://www.popsci.com/?p=544933
Mobile EV charging stations may soon juice up travelers' parked cars while they're flying high.
ZiGGY will show off its skills this summer at Dallas-Fort Worth International Airport. EV Safe Charge/YouTube

One of the world’s busiest airports will soon showcase an innovative, undeniably cute way to speed up travelers’ entrances and exits. As first announced earlier this month, Dallas Fort Worth International Airport (DFW) is partnering with EV Safe Charge to demonstrate how the company’s mobile electric vehicle charging station, ZiGGY, could be deployed in public spaces to economically and conveniently power up consumers’ parked cars.

[Related: Electric cars are better for the environment, no matter the power source.]

Electric vehicles are an integral component of the societal shift toward clean, renewable energy. Unfortunately, battery shortages stemming from supply chain issues, alongside the need for ever more charging stations, are hampering wider adoption of green transportation. ZiGGY obviously isn’t a catch-all fix, but it’s still a novel tool that both its makers and DFW hope to highlight over the summer as part of the airport’s series of EV charging solution demos.

“We know that electric vehicles will be a big part of the future of transportation,” Paul Puopolo, DFW’s Executive VP of Innovation, said in a statement, adding their air hub is “leaning into emerging technology now so that we are prepared to meet the needs of the airport community well into the future.”

ZiGGY itself resembles a large vending machine on wheels, which makes a certain amount of sense given that it dispenses electric fuel on demand. Using geofencing technology, app-based controls, and onboard cameras, ZiGGY can be deployed directly to the location of a parked EV, where the user then connects the charging bot to their ride. To court additional revenue streams, each ZiGGY also features large video screens capable of displaying advertisements. Don’t worry about getting stuck behind one in use, either—its dimensions and mobility ensure each station can park itself behind an EV without the need for additional space.
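
For a sense of how such a system might route requests, here is a deliberately simplified Python sketch of the app-to-bot dispatch flow. The data fields and assignment logic are hypothetical, not EV Safe Charge's actual software.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChargeRequest:
    spot_id: str      # parking spot reported through the app
    connector: str    # e.g. "CCS" or "J1772"

def dispatch(request: ChargeRequest, idle_bots: list) -> Optional[str]:
    """Assign an idle mobile charger to a parked EV.

    A real system would route by geofenced position, battery level, and
    queue priority; this only illustrates the request/assign shape.
    """
    if not idle_bots:
        return None  # every bot is busy; the request would be queued
    bot = idle_bots.pop(0)
    print(f"Bot {bot} rolling to spot {request.spot_id} ({request.connector})")
    return bot

dispatch(ChargeRequest(spot_id="A7", connector="CCS"), idle_bots=["ziggy-1"])
```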

Speaking with Ars Technica on Tuesday, EV Safe Charge’s founder and CEO Caradoc Ehrenhalt explained that the idea is to deploy ZiGGY fleets to commercial hubs around the world, such as additional airports, hotels, and shopping centers. “What we’re hearing from people… is the common thread of the infrastructure being very challenging or not possible to put in or not cost effective or takes too much time. And so there really is the need for a mobile charging solution,” said Ehrenhalt.

[Related: Why you barely see electric vehicles at car dealerships.]

Of course, such an autonomous vehicle could find itself prone to defacement and vandalism, but Ehrenhalt apparently opts to look on the sunnier side of things. “Ziggy is fairly heavy because of the battery,” he told Ars Technica. “It has cameras all around and sensors, including GPS, and so there potentially could be [vandalism], but I’m always hoping for the best of humanity.”

Google engineers used real dogs to develop an agility course for robots https://www.popsci.com/technology/google-barkour-robot-dog-agility/ Tue, 30 May 2023 23:00:00 +0000 https://www.popsci.com/?p=544460
Researchers hope the 'Barkour' challenge can become an industry benchmark.
A robot dog 'Barkour' course may provide a new industry standard for four-legged machines. Deposit Photos

It feels like nearly every week or so, someone’s quadrupedal robot gains yet another impressive (occasionally terrifying) ability or trick. But as cool as a Boston Dynamics Spot bot’s new capability may be, it’s hard to reliably compare newly developed talents to others when there still aren’t any industry standard metrics. 

Knowing this, a team of research scientists at Google is aiming to streamline evaluations through a new system that’s as ingenious as it is obvious: robot obstacle courses akin to dog agility competitions. It’s time to stretch those robotic limbs and ready the next generation of four-legged machines for Barkour.

[Related: This robot dog learned a new trick—balancing like a cat.]

“[W]hile researchers have enabled robots to hike or jump over some obstacles, there is still no generally accepted benchmark that comprehensively measures robot agility or mobility,” the team explained in a blog post published last week. “In contrast, benchmarks are driving forces behind the development of machine learning, such as ImageNet for computer vision, and OpenAI Gym for reinforcement learning (RL).” As such, “Barkour: Benchmarking Animal-level Agility with Quadruped Robots” aims to rectify that missing piece of research.

Actual dogs can complete the Barkour course in about 10 seconds, but robots need about double that. CREDIT: Google Research

In simple terms, the Barkour agility course is nearly identical to many dog courses, albeit much more compact at 5-by-5 meters to allow for easy setup in labs. The current standard version includes four unique obstacles—a line of poles to weave between, an A-frame structure to climb up and down, a 0.5m broad jump, and finally, a step up onto an end table.

To make sure the Barkour setup was fair to robots mimicking dogs, the team first offered up the space to actual canines—in this case, a small group of “dooglers,” aka Google employees’ own four-legged friends. According to the team, small dogs managed to complete the course in around 10 seconds, while robots usually take about double that time.

[Related: Dogs can understand more complex words than we thought.]

Each obstacle is scored between 0 and 1, based on target times set for small dogs in novice agility competitions (which move at around 1.7 m/s). In all, each quadrupedal robot must complete every obstacle, but is given penalties for failing, skipping stations, or maneuvering too slowly through the course.
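
As a rough illustration of how a time-based, penalty-adjusted score bounded between 0 and 1 could be computed, here is a Python sketch. The penalty weights and exact formula are invented for illustration and are not Google's published metric.

```python
def barkour_score(course_time_s: float, target_time_s: float,
                  skips: int, failures: int) -> float:
    """Illustrative agility score clipped to [0, 1].

    target_time_s would come from the novice-dog pace of ~1.7 m/s over
    the course length; the penalty weights here are made up.
    """
    SKIP_PENALTY, FAIL_PENALTY = 0.2, 0.1
    time_score = min(1.0, target_time_s / max(course_time_s, 1e-9))
    return max(0.0, time_score - SKIP_PENALTY * skips - FAIL_PENALTY * failures)

# A robot finishing in ~20 s a course small dogs run in ~10 s, with one failure:
print(barkour_score(course_time_s=20.0, target_time_s=10.0, skips=0, failures=1))  # 0.4
```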

“We believe that developing a benchmark for legged robotics is an important first step in quantifying progress toward animal-level agility,” explained the team, adding that, moving forward, the Barkour system potentially offers industry researchers an “easily customizable” benchmark.

Watch the US Navy launch an ocean glider from a helicopter https://www.popsci.com/technology/navy-deploys-slocum-glider-from-helicopter/ Tue, 30 May 2023 19:02:21 +0000 https://www.popsci.com/?p=544473
The Slocum glider is a type of robot designed to gather information about the sea's conditions.
The test took place in March. Bobby Dixon / US Navy

On March 15, the US Navy launched a torpedo-shaped robot into the Persian Gulf from the back of a helicopter. The robot was a Slocum glider, an uncrewed sensing tool that can collect data on ocean conditions below the surface. Dropping it from a helicopter was a proof of concept, a test towards expanding the array of vehicles that can put the robots into the water. As the US Navy seeks to know more about the waterways it patrols, distributing data collection tools can provide a more complete image of the ocean without straining the existing pool of sailors.

The US Navy helicopter, part of Helicopter Mine Countermeasures Squadron (HM) 15, delivered the glider by flying low and slow over the sea surface. The glider, held between railings facing seaward, slid forward, diving but not tumbling into the water. The setup enabled smooth entry into the water, keeping the robot from falling aft over teakettle.

“We are excited to be a part of another series of firsts! In this instance, the first launch from a helicopter and the first-ever successful glider deployment from an aircraft,” Thomas Altshuler, a senior VP at Teledyne, said in a release. While the test took place in March, it was only recently announced by both the Navy and Teledyne, makers of the Slocum glider. “Teledyne Marine​ takes pride in our continued innovation and support of the U.S. Navy as it expands the operational envelope of underwater gliders.”

[Embedded video shows the glider’s entry into the water.]

A second video, which appears to have been recorded on the phone camera of one of the sailors standing next to the rail, offers a different angle on the descent. The mechanics of the rail mount are clearer, from the horseshoe-shaped brace holding the glider in place to the mechanism of release. When the glider hits the water, it makes a splash, big for a moment and then imperceptible in the wake of the rotor wash on the ocean surface.

For this operation, Teledyne says the glider was outfitted with “Littoral Battlespace Sensing – Glider (LBS-G) mine countermeasures (MCM) sensors.” In plain language, that means sensors designed to work near the shore, and to collect information about the conditions of the sea where the Navy is operating. This data is used by both the Navy for informing day-to-day operation and by the Naval Oceanographic Office, for understanding ocean conditions and informing both present and future operations.

[Related: What it’s like to rescue someone at sea from a Coast Guard helicopter]

In addition to HM 15, the test was coordinated with the aforementioned Naval Oceanographic Office, which regularly uses glider robots to collect and share oceanographic data. The Slocum glider is electrically powered, with range and endurance dependent upon battery type. At a minimum, that means the glider can travel 217 miles over 15 days, powerlessly gliding at an average speed of a little over 1 mph. (Optional thruster power doubles the speed to 2 mph.) With the most extensive power, Teledyne boasts that the gliders can range over 8,000 miles under water, stay in operation for 18 months, and work from shallows of 13 feet to depths of 3,280 feet.

“Naval Meteorology and Oceanography Command directs and oversees more than 2,500 globally-distributed military and civilian personnel who collect, process, and exploit environmental information to assist Fleet and Joint Commanders in all warfare areas to make better decisions faster than the adversary,” notes the Navy description of the test.

Communicating that data from an underwater robot to the rest of the Navy is done through radio signals, satellite uplink, and acoustic communication, among other methods. These methods allow the glider to transmit data and receive commands from remote human operators. 

“The invention of gliders addressed a long-standing problem in physical oceanography: how do you measure changes in the ocean over long periods of time?” reads an Office of Navy Research history of the program. The Slocum gliders themselves date back to a concept floated in 1989, when speculative fiction imagined hundreds of autonomous floats surveying the ocean by 2021. The first prototype glider was developed in 1991 and had sea trials in 1998; today, according to that report, the Naval Oceanographic Office alone operates more than 150 gliders.

This information is useful generally, as it builds a comprehensive picture of the vast seas on which fleets operate. It is also specifically useful, as listening for acoustics underwater can help detect other ships and submarines. Undersea mines, hidden from the surface, can be found through sensing the sea, and revealing their location protects Navy ships, sailors, and commercial ocean traffic, too.

Releasing the gliders from helicopters expands how and where these exploratory machines can start operations, hastening deployment for the undersea watchers. When oceans are battlefields, knowing the condition of the waters first can make all the difference.

A robot gardener outperformed human horticulturalists in one vital area https://www.popsci.com/technology/alphagarden-ai-robot-farming/ Tue, 30 May 2023 16:00:00 +0000 https://www.popsci.com/?p=544349
UC Berkeley researchers claim their robotic farmer passes the green thumb Turing Test.
AlphaGarden used as much as 44 percent less water than its human counterparts. Deposit Photos

Even after all that quarantine hobby honing, gardening can still be an uphill battle for those lacking a green thumb—but a little help from robotic friends apparently goes a long way. Recently, UC Berkeley unveiled AlphaGarden, a high-tech, AI-assisted plant ecosystem reportedly capable of cultivating a polycultural garden at least as well as its human counterparts. And in one particular, consequential metric, AlphaGarden actually excelled.

As detailed by IEEE Spectrum over the weekend, UC Berkeley’s gardening plot combined a commercial robotic gantry farming setup with AlphaGardenSim, an AI program developed in-house that draws on a high-resolution camera alongside soil moisture sensors. Additionally, the developers included automated drip irrigation, pruning, and even seed planting. AlphaGarden (unfortunately) doesn’t feature a fleet of cute, tiny farm bots scuttling around its produce; instead, the system resembles a small crane installation capable of moving above and tending to the garden bed.

[Related: How to keep your houseplants from dying this summer.]

As an added challenge, AlphaGarden was a polyculture creation, meaning it contained a variety of crops like turnips, arugula, lettuce, cilantro, kale, and other plants. Polyculture gardens reflect nature much more accurately, and benefit from better soil health, pest resilience, and fewer fertilization requirements. At the same time, they are often much more labor-intensive given the myriad plant needs, growth rates, and other such issues when compared to a monoculture yield.

To test out AlphaGarden’s capabilities compared with humans, researchers simply built two plots and planted the same seeds in both of them. Over the next 60 days, AlphaGarden was largely left to its own literal and figurative devices, while professional horticulturalists did the same. Afterwards, UC Berkeley repeated the same growth cycle, but this time allowed AlphaGarden to give its slower-growing plants an earlier start.

According to researchers, the results from the two cycles “suggest that the automated AlphaGarden performs comparably to professional horticulturalists in terms of coverage and diversity.” While that might not be too surprising given all the recent, impressive AI advancements, there was one aspect in which AlphaGarden unequivocally outperformed its human farmer controls—over the two test periods, the robotic system reduced water consumption by as much as a whopping 44 percent. As IEEE Spectrum explained, that translates to several hundred liters less over the two-month period.

[Related: Quick and dirty tips to make sure your plants love the soil they’re in.]

Although researchers claim “AlphaGarden has thus passed the Turing Test for gardening,” referencing the much-debated marker for robotic intelligence and sentience, there are a few caveats here. For one, these commercial gantry systems remain cost prohibitive for most people (the cheapest one looks to be about $3,000), and more research is needed to further optimize its artificial light sources and water usage. There’s also the question of scalability and customization, as different gardens have different shapes, sizes, and needs.

Still, in an era of increasingly dire water worries, it’s nice to see developers creating novel ways to reduce water consumption for one of the planet’s thirstiest industries.

Cozy knit sweaters could help robots ‘feel’ contact https://www.popsci.com/technology/robot-sweaters-yarn/ Thu, 25 May 2023 20:00:00 +0000 https://www.popsci.com/?p=543752
The snuggly garb is used to teach robots how to sense possible collisions in advance.
The sensitive 'yarn' encases robots to direct them based on human touch and guidance. Carnegie Mellon

Robots can certainly sense cold temperatures, but feeling cold is a whole other ordeal. And yet the world is now blessed with robot sweaters.

To be fair, the new, adorable garb recently designed by an engineering team at Carnegie Mellon University’s Robotics Institute isn’t intended to keep machines warm. As detailed in a research paper scheduled to be presented at the 2023 IEEE International Conference on Robotics and Automation, the group utilized the properties of a knitted sweater to create a fabric capable of sensing pressure and contact. The cutting-edge textile can now help indicate direction, orientation, and even grip strength via physical touch.

[Related: A new material creates clean electricity from the air around it.]

Like its yarn inspiration, the new “RobotSweater” fabric can be knitted into whatever three-dimensional shape is needed, and thus fitted over robots’ uneven shapes and surfaces. The material itself features two layers of conductive, metallic fibers, with a lace-like pattern sandwiched between them to keep the layers apart. When pressure is applied, the layers meet, generating a closed circuit that is subsequently detected by sensors.
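
If the conductive stripes on the two layers run perpendicular to each other (a common design for knitted touch matrices, and an assumption here), locating a touch reduces to scanning for closed circuits at row/column crossings. A minimal Python sketch of that scan, with the electrical measurement stubbed out as a function, is below.

```python
def scan_contacts(rows, cols, is_closed):
    """Scan a knitted row/column touch grid for closed circuits.

    is_closed(r, c) stands in for the electrical measurement: True when
    pressure pushes the two conductive layers together at (r, c) through
    the lace-like spacer, closing the circuit.
    """
    return [(r, c) for r in rows for c in cols if is_closed(r, c)]

# Toy example: pressure applied where stripe row 2 crosses stripe column 5
touches = scan_contacts(range(8), range(8), lambda r, c: (r, c) == (2, 5))
print(touches)  # [(2, 5)] -> where the robot is being touched
```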

In order to ensure the metallic yarn didn’t degrade or break with usage, the team wrapped the wires around snap fasteners at the end of each stripe in the fabric. “You need a way of attaching these things together that is strong, so it can deal with stretching, but isn’t going to destroy the yarn,” James McCann, an assistant professor in Carnegie Mellon’s School of Computer Science (SCS), explained in a statement.

To demonstrate their creation, researchers dressed up a companion robot in their RobotSweater, then pushed it to direct its head and body movement. On a robotic arm, the fabric could respond to guided human pushes, while grabbing the arm itself opened and closed a gripping mechanism.

[Related: Dirty diapers could be recycled into cheap, sturdy concrete.]

Swaddling robots in smart sweaters isn’t just fashionable—it could prove extremely valuable in industrial settings to improve human worker safety. According to the team, most current safety barriers are rigid and shield-like; encasing machines in flexible, touch-sensing fabrics, however, could make them able to “detect any possible collision,” said Changliu Liu, an assistant professor of robotics in the SCS. Moving forward, the team hopes to integrate touchscreen-style inputs like swiping and pinching motions to direct robots. Even if that takes a while to realize, at least the machines will look stylish and cozy.

Wendy’s wants underground robots to deliver food to your car https://www.popsci.com/technology/wendys-underground-delivery-robot/ Thu, 18 May 2023 16:30:00 +0000 https://www.popsci.com/?p=541984
The concept is similar to a pneumatic tube system.
Wendy's wants to automate its drive-thru. Batu Gezer / Unsplash

Wendy’s announced this week that it is going to try using underground autonomous robots to speed up how customers collect online orders. The burger joint plans to pilot the system designed by “hyperlogistics” company Pipedream, and aims to be able to send food from the kitchen to designated parking spots.

Wendy’s seems to be on a quest to become the most technologically advanced fast food restaurant in the country. Last week, it announced that it had partnered with Google to develop its own AI system (called Wendy’s FreshAI) that could take orders at a drive-thru. This week, it’s going full futuristic. (Pipedream’s current marketing line is “Someday we’ll use teleportation, until then we’ll use Pipedream.”)

According to a PR email sent to PopSci, digital orders now make up 11 percent of Wendy’s total sales and are growing; that’s on top of the 75 to 80 percent of orders placed at a drive-thru.

The proposed autonomous system aims “to make digital order pick-up fast, reliable and invisible.” When customers or delivery drivers are collecting an online order, they pull into a dedicated parking spot with an “Instant Pickup portal,” where there will be a drive-thru style speaker and kiosk to confirm the order with the kitchen. In a matter of seconds, the food is then sent out by robots moving through an underground series of pipes using “Pipedream’s temperature-controlled delivery technology.” The customer can then grab their order from the kiosk without ever leaving their car. Apparently, the “first-of-its-kind delivery system” is designed so that drinks “are delivered without a spill and fries are always Hot & Crispy.”
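
In software terms, the pickup flow described above reads like a small state machine. The Python sketch below captures that sequence; the state names and transitions are an illustrative guess, not Pipedream's actual control code.

```python
from enum import Enum, auto

class OrderState(Enum):
    PLACED = auto()      # digital order placed ahead of arrival
    CONFIRMED = auto()   # order confirmed at the Instant Pickup kiosk
    IN_TRANSIT = auto()  # food traveling through the underground pipe
    DELIVERED = auto()   # kiosk hands the bag to the customer

TRANSITIONS = {
    OrderState.PLACED: OrderState.CONFIRMED,
    OrderState.CONFIRMED: OrderState.IN_TRANSIT,
    OrderState.IN_TRANSIT: OrderState.DELIVERED,
}

state = OrderState.PLACED
while state is not OrderState.DELIVERED:
    state = TRANSITIONS[state]
    print(state.name)  # CONFIRMED, IN_TRANSIT, DELIVERED
```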

[Related: What robots can and can’t do for a restaurant]

Wendy’s is far from the first company to try and use robots to streamline customer orders, though most go further than the parking lot. Starship operates a delivery service on 28 university campuses while Uber Eats is still trialing sidewalk delivery robots in Miami, Florida; Fairfax, Virginia; and Los Angeles, California. Whether these knee-height six-wheeled electric autonomous vehicles can graduate from school and make it into the real world remains to be seen.

The other big semi-autonomous delivery bets are aerial drones. Wing, a subsidiary of Google-parent Alphabet, unveiled a device called the Auto-Loader earlier this year. It also calls for a dedicated parking spot and aims to make it quicker and easier for staff at partner stores to attach deliveries to one of the company’s drones. 

What sets Wendy’s and Pipedream’s solution apart is that it all happens in a space the restaurant controls. Starship, Uber Eats, and Wing are all trying to bring robots out into the wider world, where they can get attacked by students, take out power lines, and otherwise have to deal with humans, street furniture, and the chaos of existence. Provided Wendy’s abides by building ordinances and any necessary health and safety laws, cost is the only thing stopping it from adding tube-dwelling robots to every restaurant the company controls. Really, the option Wendy’s is trialing has more in common with a pneumatic tube system—hopefully it will be a bit more practical.

This helpful robot uses a camera to find items for people with dementia https://www.popsci.com/technology/memory-robot-dementia/ Mon, 15 May 2023 17:00:00 +0000 https://www.popsci.com/?p=541200
Researchers designed a new object-detection algorithm allowing robots to remember the locations of items they 'see.'
A new 'artificial memory' can log and locate missing items for users. University of Waterloo

Researchers at Canada’s University of Waterloo have unveiled a new program for personal assistance robots that uses episodic memory and object-detection algorithms to help locate lost items. Although designed specifically to aid patients suffering from cognitive issues, the team believes the advancements could eventually find their way onto people’s smartphones or tablets.

Dementia affects approximately 1 in 10 Americans over the age of 65, while another 22 percent of the same population contends with mild cognitive impairments. Symptoms vary between individuals, but forgetfulness is a common issue that can disrupt one’s day and increase stress levels for both those suffering from these conditions, as well as their caregivers.

Knowing this, a four-person team at the University of Waterloo created an algorithm they then uploaded to a commercial Fetch mobile manipulator robot, endowing the machine with a memory log of individual objects detected via its onboard video camera. Once enabled, the Fetch robot logged the time and date whenever it spotted an object in its view area. Researchers also designed a graphical user interface (GUI) that lets individuals pick and label which detected objects they want to track. Searching for a label via keyboard entry then brings up Fetch’s “highly accurate” location log, according to a statement released on Monday.
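
Conceptually, the memory log is a mapping from object label to the most recent timestamped sighting. Here is a minimal Python sketch of that idea; the class and method names are hypothetical, not the Waterloo team's code.

```python
from datetime import datetime

class ObjectMemory:
    """Episodic log of where and when tracked objects were last seen."""

    def __init__(self, tracked_labels):
        self.tracked = set(tracked_labels)  # labels the user selects in the GUI
        self.log = {}                       # label -> (timestamp, location)

    def observe(self, label, location):
        """Record each detection coming off the onboard camera."""
        if label in self.tracked:
            self.log[label] = (datetime.now(), location)

    def find(self, label):
        """Keyword lookup, as with the researchers' keyboard search."""
        return self.log.get(label)

memory = ObjectMemory({"keys", "marker"})
memory.observe("marker", (1.2, 0.4))  # x, y in the robot's map frame
print(memory.find("marker"))
```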

[Related: The latest recommendations for preventing dementia are good advice for everyone.]

“The long-term impact of this is really exciting,” said Ali Ayub, a postdoctoral fellow in electrical and computer engineering and study co-author. “A user can be involved not just with a companion robot but a personalized companion robot that can give them more independence.”

Caregiving robotics is a rapidly expanding field that is showing promise in a number of areas. Recently, researchers at the Munich Institute of Robotics and Machine Intelligence announced Garmi, a personal assistant designed to help elderly users for telemedicine appointments, and potentially even physical tasks like opening bottles and serving meals.

Although Ayub and colleagues have only tested the vision-based algorithm among themselves, the team hopes to soon conduct further trials—first with people without disabilities, then with people dealing with dementia and other cognitive issues. While Ayub’s team conceded that disabled individuals could find the GUI and robot intimidating, they believe the system could still prove extremely beneficial for their caregivers and family members.

This lawn-mowing robot can save part of your yard for pollinators https://www.popsci.com/technology/husqvarna-rewilding-mower-mode/ Mon, 15 May 2023 14:30:00 +0000 https://www.popsci.com/?p=541155
Husqvarna has introduced a new autopilot mode for its mowers that omits a portion of owners' yards to promote sustainability.
Husqvarna's Rewilding Mode saves one-tenth of the yard for natural growth. Deposit Photos

This month marks the fifth anniversary of “No Mow May,” an annual environmental project dedicated to promoting sustainable, eco-friendly lawns via a 31-day landscaping moratorium. In doing so, the brief respite gives bees and other pollinators a chance to do what they do best: contribute to a vibrant, healthy, and biodiverse ecosystem. To keep the No Mow May momentum going, Swedish tech company Husqvarna has announced a new, simple feature for its line of robotic lawnmowers: a “rewilding” mode that ensures 10 percent of owners’ lawns remain untouched for pollinators and other local wildlife.

While meticulously manicured lawns are part of the traditional suburban American mindset, they come at steep ecological costs such as biodiversity loss and massive amounts of water waste. The Natural Resource Defense Council, for instance, estimates that grass lawns consume almost 3 trillion gallons of water each year alongside 200 million gallons of gas for traditional mowers, as well as another 70 million pounds of harmful pesticides. In contrast, rewilding is a straightforward, self-explanatory concept long pushed by environmentalists and sustainability experts that encourages a return to regionally native flora for all-around healthier ecosystems.

[Related: Build a garden that’ll have pollinators buzzin’.]

While convincing everyone to adopt rewilding practices may seem like a near-term impossibility, companies like Husqvarna are hoping to set the literal and figurative lawnmower rolling with its new autopilot feature. According to Husqvarna’s announcement, if Europeans set aside just a tenth of their lawns, the cumulative area would amount to four times the size of the continent’s largest nature preserve.

Enabling the Rewilding Mode takes only a few taps within the product line’s Automower Connect app, and the zones can be customized in overall shape, size, and placement. Once established, the robotic mower’s onboard GPS keeps those areas of an owner’s lawn off-limits, reserved for bees, butterflies, and whatever else wants to set up shop.
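
Under the hood, honoring a reserved zone is a geofencing test: before mowing a point, check whether it falls inside any user-drawn polygon. The standard ray-casting point-in-polygon check, sketched in Python below, illustrates the idea; Husqvarna's actual implementation is not public, so treat this as an assumption-laden toy.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending right from (x, y)
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def should_mow(position, rewilding_zones):
    """Skip any point that falls inside a reserved zone."""
    return not any(point_in_polygon(*position, zone) for zone in rewilding_zones)

zone = [(0, 0), (4, 0), (4, 3), (0, 3)]  # one hypothetical no-mow patch
print(should_mow((2, 1), [zone]))        # False -> leave it for the bees
```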

Of course, turning on Rewilding Mode means owning a Husqvarna robotic mower that supports the setting—and at a minimum of around $700 for such a tool, they might be out of many lawn care enthusiasts’ budgets. Even so, that doesn’t mean you should abandon giving rewilding a try in your own yard. It’s easy to get started on the project, and as its name suggests, it doesn’t take much maintenance once it’s thriving. If nothing else, there are still two weeks left in No Mow May, so maybe consider postponing your weekend outdoor chore for a few more days.

A fleet of humanoid, open-source robots could change robotics research https://www.popsci.com/technology/nsf-quori-robot-research/ Tue, 09 May 2023 16:00:00 +0000 https://www.popsci.com/?p=539990
Not all robots are created equal—and the National Science Foundation wants to help level the playing field to speed up research.
Over two dozen Quori robots are heading to research teams across the country. Shivani Jinger/OSU

Immense strides in human-robot interaction have been made over the past few years, but the robots involved tend to be quite different from one another. The lack of an affordable, generalized, modular robotic platform hampers many researchers’ progress, along with their ability to share and compare findings.

The National Science Foundation, an independent US government-funded agency supporting research and education, wants to accelerate advancements in robotics, and is offering a $5 million fleet of standardized humanoid robots to speed things along. On Monday, the NSF announced plans to distribute another 50 of its Quori bots to various research projects, with assistance from Oregon State University, University of Pennsylvania’s GRASP Laboratory, and the robotics software company, Semio.

[Related: Meet Garmi, a robot nurse and companion for Germany’s elderly population.]

First designed with support from the NSF’s Computer and Information Science and Engineering (CISE) Community Research Infrastructure, Quori robots feature an omnidirectional, wheeled base, expressive video screen face, two gesturing arms, and a bowing spine. Quori is made to function both in labs and “in the wild,” according to its official description.

A previous pilot program built and tested 10 Quori robots that were subsequently awarded to research teams, including one from Carnegie Mellon University, which used its model to focus on social behavior and communication methods between humans and robots.

The new multimillion-dollar expansion will see many more of these standardized humanoid bots made available to applicants. All of Quori’s hardware designs are available as open-source, meaning anyone can access them to potentially build their own versions.

“A big hurdle in robotics research has been the lack of a common robot to work with,” Bill Smart, a professor of mechanical, industrial, and manufacturing engineering in OSU’s College of Engineering and project co-lead, explained in a statement. “It’s tough to compare results and replicate and build on each other’s work when everyone is using a different type of robot. Robots come in many shapes and sizes, with different types of sensors and varying capabilities.”

[Related: Robot trash cans have survived a New York City field test.]

Alongside OSU project co-lead Naomi Fitter, Smart’s team will primarily set up and maintain a resource network for the Quori fleet, as well as beta test the robots. The project aims to soon connect both researchers and students through online collaborations, events, and various other opportunities in hopes of “building a community of roboticists that can learn from one another and advance the pace of research.”

According to Smart, pairing newcomers with experienced researchers can help quickly bring them up to speed in their field, while also increasing diversity and access in a discipline disproportionately composed of white male researchers.

NASA hopes its snake robot can search for alien life on Saturn’s moon Enceladus https://www.popsci.com/technology/eels-robot-saturn-enceladus-moon/ Mon, 08 May 2023 19:00:00 +0000 https://www.popsci.com/?p=539793
EELS could one day wriggle its way into Enceladus' hidden oceans in search of extraterrestrial life.
The 200-pound robot is designed to maneuver both across ice and underwater. NASA/JPL-CalTech

At least 83 moons orbit Saturn, and experts believe its most reflective one could harbor life underneath its icy surface. To find out, NASA scientists hope to send a massive serpentine robot to scour Enceladus, both atop its frozen ground—and maybe even within a hidden ocean underneath.

As CBS News highlighted on Monday, researchers and engineers are nearing completion of their Exobiology Extant Life Surveyor (EELS) prototype. The 16-foot-long, 200-pound snakelike bot is capable of traversing both ground and watery environments via “first-of-a-kind rotating propulsion units,” according to NASA’s Jet Propulsion Laboratory. These repeating units could act as tracks, gripping mechanisms, and underwater propellers, depending on the surrounding environment’s need. The “head” of EELS also includes 3D mapping technology alongside real-time video recording and transmission capabilities to document its extraplanetary adventure.

[Related: Saturn’s rings have been slowly heating up its atmosphere.]

In theory, EELS would traverse the surface of Enceladus towards one of the moon’s many “plume vents,” which it could then enter to use as a passageway towards its oceanic source. Over 100 of these vents were discovered at Enceladus’ southern pole by the Cassini space probe during its tenure around Saturn. Scientists have since determined the fissures emitted water vapor into space that contained amino acids, which are considered pivotal in the creation of lifeforms.

EELS goes ice-skating. CREDIT: NASA/JPL-CalTech.

To assess its maneuverability, NASA researchers have already taken EELS out for test drives in environments such as an ice skating rink in Pasadena, CA, and even an excursion to Athabasca Glacier in Canada’s Jasper National Park. Should all go as planned, the team hopes to present a finalized concept by fall 2024. But be prepared to wait a while to see it in action on Enceladus—EELS’ journey to the mysterious moon would reportedly take roughly 12 years. Even if it never makes it there, however, the robotic prototype could prove extremely useful closer to Earth, and even on it. According to the Jet Propulsion Lab, EELS could show promise exploring the polar caps of Mars, or even ice sheet crevasses here on Earth.

[Related: Saturn has a slushy core and rings that wiggle.]

Enceladus’ fascinating environment was first unveiled thanks to NASA’s historic Cassini space probe. Launched in 1997, the satellite began transmitting data and images of the planet and its moons back to Earth after a seven-year voyage. After 13 years of service, the decommissioned Cassini descended toward Saturn, where it was vaporized by the upper atmosphere’s high pressure and temperature. Although NASA could have left Cassini to drift once its fuel ran out, the agency opted for the controlled demolition due to the slim possibility of the probe crashing into Enceladus or Titan, which might have disrupted the potential life ecosystems scientists hope to one day discover.

To build a better crawly robot, add legs—lots of legs https://www.popsci.com/technology/centipede-robot-georgia-tech/ Mon, 08 May 2023 11:00:00 +0000 https://www.popsci.com/?p=539360
Researchers hope that more limbs will allow them to have fewer sensors.
The centipede robot from Georgia Tech is a rough terrain crawler. Georgia Institute of Technology

When traveling on rough and unpredictable roads, the more legs the better—at least for robots. Balancing on two legs is somewhat hard; on four legs, it’s slightly easier. But what if you had many, many legs, like a centipede? Researchers at Georgia Institute of Technology have found that giving a robot multiple, connected legs allows the machine to easily clamber over landscapes with cracks, hills, and uneven surfaces, without the need for the extensive sensor systems that would otherwise help it navigate its environment. Their results are published in a study this week in the journal Science.

The team has previously done work modeling the motion of these creepy critters. In this new study, they created a framework for operating this centipede-like robot that was influenced by mathematician Claude Shannon’s communication theory, which posits that to transmit a signal reliably between two points despite noise, it’s better to break the message up into discrete, repeating units.

“We were inspired by this theory, and we tried to see if redundancy could be helpful in matter transportation,” Baxi Chong, a physics postdoctoral researcher, said in a news release. Their creation is a robot with joined segments, like a model train, with two legs sticking out from each segment to allow it to “walk.” The notion is that after the robot is told to go to a certain destination, its legs make contact with the surface along the way and relay information about the terrain to the other segments, which then adjust their motion and position accordingly. The team put it through a series of real-world and computer trials to see how it walked, how fast it could go, and how it performed on grass, blocks, and other rough surfaces.

[Related: How a dumpy, short-legged bird could change water bottle designs]

“One value of our framework lies in its codification of the benefits of redundancy, which lead to locomotor robustness over environmental contact errors without requiring sensing,” the researchers wrote in the paper. “This contrasts with the prevailing paradigm of contact-error prevention in the conventional sensor-based closed-loop controls that take advantage of visual, tactile, or joint-torque information from the environment to change the robot dynamics.”
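
One way to see why redundancy buys robustness is basic binomial arithmetic, under the deliberately crude assumption that each leg independently slips with some probability: the chance that enough legs keep good contact rises quickly with leg count. The Python sketch below is this toy model, not the paper's actual framework.

```python
from math import comb

def prob_enough_contacts(n_legs: int, p_miss: float, k_needed: int) -> float:
    """P(at least k_needed of n_legs make good contact), legs independent.

    A toy binomial model of the redundancy argument, not the paper's math.
    """
    p_hit = 1.0 - p_miss
    return sum(comb(n_legs, k) * p_hit**k * p_miss**(n_legs - k)
               for k in range(k_needed, n_legs + 1))

for n in (6, 12, 14):  # leg counts the team tested
    print(n, round(prob_enough_contacts(n, p_miss=0.3, k_needed=4), 4))
```

With a 30 percent per-leg slip rate, six legs keep at least four good contacts about 74 percent of the time, while twelve legs do so more than 99.9 percent of the time, which is the intuition behind trading sensors for limbs.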

They repeated the experiment with robots that had different numbers of legs (six, 12, and 14). In future work, the researchers say they want to home in on the optimal number of legs for the centipede bot, so that it can move smoothly in the most cost-effective way possible.

“With an advanced bipedal robot, many sensors are typically required to control it in real time,” Chong said. “But in applications such as search and rescue, exploring Mars, or even micro robots, there is a need to drive a robot with limited sensing.” 

Stunt or sinister: The Kremlin drone incident, unpacked https://www.popsci.com/technology/kremlin-drone-incident-analysis/ Sat, 06 May 2023 11:00:00 +0000 https://www.popsci.com/?p=539413
There is a long history of drones being used in eye-catching and even dangerous ways.

Early in the morning of May 3, local Moscow time, a pair of explosions occurred above the Kremlin. Videos of the incident appeared to show two small drones detonating—ultramodern tech lit up against the venerable citadel. The incident was exclusively the domain of Russian social media for half a day, before Russian President Vladimir Putin declared it a failed assassination attempt.

What actually happened in the night sky above the Russian capital is being pieced together both in public and in secret. Open-source analysts, examining publicly available information, have constructed a picture of the event and the videos’ release, forming a good starting point.

Writing at Radio Liberty, a US-government-funded Russian-language outlet, reporters Sergei Dobrynin and Mark Krutov point out that a video showing smoke above the Kremlin was published around 3:30 am local time on a Moscow Telegram channel. Twelve hours later, Putin released a statement on the attack, and then, write Dobrynin and Krutov, “several other videos of the night attack appeared, according to which Radio Liberty established that two drones actually exploded in the area of the dome of the Senate Palace with an interval of about 16 minutes, arriving from opposite directions. The first caused a small fire on the roof of the building, the second exploded in the air.”

That the drones exploded outside a symbolic target, without reaching a practical one, could be by design, or it could owe to the nature of Kremlin air defense, which may have shot the drones down at the last moment before they became more threatening. 

Other investigations into the origin, nature, and means of the drone incident are likely being carried out behind the closed doors and covert channels of intelligence services. Without being privy to those conversations, and aware that information released by governments is only a selective portion of what is collected, it’s possible to instead answer a different set of questions: could drones do this? And why would someone use a drone for an attack like this?

To answer both, it is important to understand gimmick drones.

What’s a gimmick drone?

Drones, especially the models able to carry a small payload and fly long enough to travel a practical distance, can be useful tools for a variety of real functions. Those can include real-estate photography, crop surveying, creating videos, and even carrying small explosives in war. But drones can also carry less-useful payloads, and be used as a way to advertise something other than the drone itself, like coffee delivery, beer vending, or returning shirts from a dry cleaner. For a certain part of the 2010s, attaching a product to a drone video was a good way to get the media to write about it. 

What stands out about gimmick drones is not that they were doing something only a drone could do, but instead that the people behind the stunt were using a drone as a publicity technique for something else. In 2018, a commercial drone was allegedly used in an assassination attempt against Venezuelan president Nicolás Maduro, in which drones flew at Maduro and then exploded in the sky, away from people and without reports of injury. 

As I noted at the time about gimmick drones, “In every case, the drone is the entry point to a sales pitch about something else, a prelude to an ad for sunblock or holiday specials at a casual restaurant. The drone was always part of the theater, a robotic pitchman, an unmanned MC. What mattered was the spectacle, the hook, to get people to listen to whatever was said afterwards.”

Drones are a hard weapon to use for precision assassination. Compared to firearms, poisoning, explosives in cars or buildings, or a host of other attacks, drones represent a clumsy and difficult method. Wind can blow the drones off course, they can be intercepted before they get close, and the flight time of a commercial drone laden with explosives is in minutes, not hours.

What a drone can do, though, is explode in a high-profile manner.

Why fly explosive-laden drones at the Kremlin?

Without knowing the exact type of drone or the motives of the drone operator (or operators), it is hard to say exactly why one was flown at, and blown up above, one of Russia’s most iconic edifices of state power. Russia’s government initially blamed Ukraine, before moving on to attribute the attack to the United States. The United States denied involvement in the attack, and US Secretary of State Antony Blinken said to take any Russian claims with “a very large shaker of salt.”

Asked about the news, Ukraine’s President Zelensky said the country fights Russia on its own territory, not through direct attacks on Putin or Moscow. The war has seen successful attacks on Putin-aligned figures and war proponents in Russia, as well as on family members of Putin allies, though attribution for these attacks remains at least somewhat contested, with the United States attributing at least one of them to Ukrainian efforts.

Some war commentators in the US have floated the possibility that the attack was staged by Russia against Russia, as a way to rally support for the government’s invasion. However, such a staging would still portray Russian air defenses and security services as inept enough to miss two explosive-laden drones flying over the capital, and it would be an unusual way to argue that the country is powerful and strong.

Ultimately, the drone attackers may not have conducted this operation to achieve any direct kill or material victory, but as a proof of concept, showing that such attacks are possible. It would also show that claims of the inviolability of Russian airspace are, at least for small enough flying machines and covert enough operatives, a myth.

In that sense, the May 3 drone incident has a lot in common with the May 1987 flight of Mathias Rust, a West German amateur pilot who safely flew a private plane into Moscow and landed it in Red Square, right near the Kremlin. Rust’s flight ended without bloodshed or explosions, and took place in peacetime, but it demonstrated the hollowness of the fortress state whose skies he flew through.

The post Stunt or sinister: The Kremlin drone incident, unpacked appeared first on Popular Science.

Researchers built a ‘SoftZoo’ to virtually test animal-inspired robots https://www.popsci.com/technology/softzoo-animal-robots/ Fri, 05 May 2023 17:00:00 +0000 https://www.popsci.com/?p=539279
Young panda eating branch while sitting in tree.
Yes, there's a pandabot option. Deposit Photos

The open-source testing ground could help engineers envision future soft robotic designs.

The post Researchers built a ‘SoftZoo’ to virtually test animal-inspired robots appeared first on Popular Science.

There are so many animal-inspired soft robots out there at this point that you could easily pack an entire zoo with them. Although an adorable idea, such a menagerie is unlikely to find its way into the real world anytime soon—that said, a virtual zoo filled with digital soft robot prototypes will soon become available to researchers hoping to design and optimize their own creations.

A team at MIT recently unveiled SoftZoo, an open framework platform that simulates a variety of 3D model animals performing specific tasks in multiple environmental settings. “Our framework can help users find the best configuration for a robot’s shape, allowing them to design soft robotics algorithms that can do many different things,” MIT PhD student and project lead researcher Tsun-Hsuan Wang said in a statement. “In essence, it helps us understand the best strategies for robots to interact with their environments.”

While MIT notes similar platforms already exist, SoftZoo reportedly goes further by simulating design and control algorithms atop virtual biomes like snow, water, deserts, or wetlands. For instance, instead of a program only offering animal models like seals and caterpillars moving in certain directions, SoftZoo can place these designs in numerous settings via what’s known as a “differentiable multiphysics engine.”
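
To make “differentiable multiphysics” concrete, here is a minimal Python sketch of gradient-based co-design, in which a robot’s “body” (a stiffness parameter) and “brain” (an actuation amplitude) are optimized together. Everything here, from the toy swimmer physics to the parameter names, is an illustrative assumption rather than SoftZoo’s actual code.

```python
# A minimal sketch of what "differentiable" buys you: if the simulator exposes
# gradients of task reward with respect to both body and controller parameters,
# the two can be co-optimized by plain gradient ascent. The toy physics below
# is a stand-in for illustration, not SoftZoo's actual engine or API.

def simulate(stiffness, amplitude, drag=0.3, steps=100):
    """Toy model: distance covered by a soft swimmer."""
    velocity, distance = 0.0, 0.0
    for _ in range(steps):
        # Too stiff or too floppy both hurt thrust in this toy model.
        thrust = amplitude * stiffness * (1.0 - stiffness)
        velocity += thrust - drag * velocity
        distance += velocity
    return distance

def grad(f, x, eps=1e-5):
    """Finite-difference stand-in; a differentiable engine gives this analytically."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

stiffness, amplitude, lr = 0.9, 0.1, 1e-4  # "body" and "brain" parameters
for _ in range(200):
    stiffness += lr * grad(lambda s: simulate(s, amplitude), stiffness)
    amplitude += lr * grad(lambda a: simulate(stiffness, a), amplitude)

print(f"stiffness={stiffness:.2f} amplitude={amplitude:.2f} "
      f"distance={simulate(stiffness, amplitude):.1f}")
```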

[Related: Watch this robotic dog use one of its ‘paws’ to open doors.]

Soft robots have quickly shown themselves to be extremely promising in navigating natural, real-world environments. Everyday clutter, unlike controlled laboratory settings, can prove extremely challenging for traditional robots. Soft variants’ malleability and adaptability, however, make them well suited for difficult situations such as volatile search-and-rescue scenarios like collapsed buildings and swift-moving waters. The MIT team’s open-source SoftZoo program allows designers to simultaneously optimize a design’s body and brain instead of relying on multiple expensive, complicated systems.

SoftZoo animal robot model examples
SoftZoo soft robot models. Credit: MIT/CSAIL

“This computational approach to co-designing the soft robot bodies and their brains (that is, their controllers) opens the door to rapidly creating customized machines that are designed for a specific task,” added Daniela Rus, paper co-author and director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Andrew and Erna Viterbi Professor in the MIT Department of Electrical Engineering and Computer Science (EECS).

Of course, it’s one thing to simulate a soft robot, and another thing entirely to actualize it in the real world. “The muscle models, spatially varying stiffness, and sensorization in SoftZoo cannot be straightforwardly realized with current fabrication techniques, so we are working on these challenges,” explained Wang. Still, offering an open source program like SoftZoo allows researchers to experiment and test out their robot ideas in an extremely accessible way. From there, they can move on to making their best and most promising designs a reality.

The post Researchers built a ‘SoftZoo’ to virtually test animal-inspired robots appeared first on Popular Science.

Robot plants could be used to grow infrastructure in space from scratch https://www.popsci.com/science/plant-inspired-robots-colonize-mars/ Thu, 04 May 2023 01:00:00 +0000 https://www.popsci.com/?p=538662
A variable-stiffness tendril-like soft robot (polyethylene terephthalate (PET) tube) based on reversible osmotic actuation. An osmosis-driven system that controls its turgidity and performs sophisticated tasks.
A variable-stiffness tendril-like soft robot (polyethylene terephthalate (PET) tube) based on reversible osmotic actuation. An osmosis-driven system that controls its turgidity and performs sophisticated tasks. IIT-Istituto Italiano di Tecnologia

Barbara Mazzolai’s roboplants could analyze and enrich soil, search for water and other chemicals, and more.

The post Robot plants could be used to grow infrastructure in space from scratch appeared first on Popular Science.

This article was originally featured on MIT Press. It is excerpted from Dario Floreano and Nicola Nosengo’s book “Tales From a Robotic World.”

In the early 2010s, a new trend in robotics began to emerge. Engineers started creating robotic versions of salamanders, dragonflies, octopuses, geckos, and clams — an ecosystem of biomimicry so diverse the Economist portrayed it as “Zoobotics.” And yet Italian biologist-turned-engineer Barbara Mazzolai raised eyebrows when she proposed looking beyond animals and building a robot inspired by a totally different biological kingdom: plants. As fluid as the definition of the word robot can be, most people would agree that a robot is a machine that moves. But movement is not what plants are famous for, and so a robotic plant might at first sound, well, boring.

But plants, it turns out, are not static and boring at all; you just have to look for action in the right place and at the right timescale. When looking at the lush vegetation of a tropical forest or marveling at the colors of an English garden, it’s easy to forget that you are actually looking at only half of the plants in front of you. The best-looking parts, maybe, but not necessarily the smartest ones. What we normally see are the reproductive and digestive systems of a plant: the flowers and fruits that spread pollen and seeds and the leaves that extract energy from sunlight. But the nervous system, so to speak, that explores the environment and makes decisions is in fact underground, in the roots.

Roots may be ugly and condemned to live in darkness, but they firmly anchor the plant and constantly collect information from the soil to decide in which direction to grow to find nutrients, avoid salty soil, and prevent interference with the roots of other plants. They may not be the fastest diggers, but they’re the most efficient ones, and they can pierce the ground using only a fraction of the energy that worms, moles, or manufactured drills require. Plant roots are, in other words, a fantastic system for underground exploration — which is what inspired Mazzolai to create a robotic version of them.

Mazzolai’s intellectual path is a case study in interdisciplinarity. Born and raised in Tuscany, in the Pisa area that is one of Italy’s robotic hot spots, she was fascinated early on by the study of all things living, graduating in biology from the University of Pisa and focusing on marine biology. She then became interested in monitoring the health of ecosystems, an interest that led her to get her doctorate in microengineering and eventually to be offered by Paolo Dario, a biorobotics pioneer at Pisa’s Scuola Superiore Sant’Anna, the possibility of opening a new research line on robotic technologies for environmental sensing.

It was there, in Paolo Dario’s group, that the first seeds of her plant-inspired robots were planted. Mazzolai got in touch with a group at the European Space Agency (ESA) in charge of exploring innovative technologies that looked interesting but were still far away from applications, she recalls. While brainstorming with them, she realized space engineers were struggling with a problem that plants brilliantly solved several hundred million years ago.

“In real plants, roots have two functions,” says Mazzolai. “They explore the soil in search of water and nutrients, but even more important, they anchor the plant, which would otherwise collapse and die.” Anchoring happens to be an unsolved problem when designing systems that have to sample and study distant planets or asteroids. In most cases, from the moon to Mars and distant comets and asteroids, the force of gravity is weak. Unlike on Earth, the weight of the spacecraft or rover is not always enough to keep it firmly on the ground, and the only available option is to endow the spacecraft with harpoons, extruding nails, and drills. But these systems become unreliable over time if the soil creeps, provided they work in the first place. They didn’t work for Philae, for example, the robotic lander that arrived at the 67P/Churyumov–Gerasimenko comet in 2014 after a 10-year trip only to fail to anchor at the end of its descent, bouncing away from the ground and collecting just a portion of the planned measurements.

In a brief feasibility study carried out between 2007 and 2008 for ESA, Mazzolai and her team let their imagination run free and described an anchoring system for spacecrafts inspired by plant roots. The research group also included Stefano Mancuso, a Florence-based botanist who would later gain fame for his idea that plants display “intelligent” behavior, although of a completely different sort from that of animals. Mazzolai and her team described an ideal system that would reproduce, and transfer to other planets, the ability of Earth plants to dig through the soil and anchor to it.

In the ESA study, Mazzolai imagined a spacecraft descending on a planet with a really hard landing: The impact would dig a small hole in the planetary surface, inserting a “seed” just deep enough in the soil, not too different from what happens to real seeds. From there, a robotic root would start to grow by pumping water into a series of modular small chambers that would expand and apply pressure on the soil. Even in the best-case scenario, such a system could only dig through loose and fine dust or soil. The root would have to be able to sense the underground environment and turn away from hard bedrock. Mazzolai suggested Mars as the most suitable place in the solar system to experiment with such a system — better than the moon or asteroids because of the Red Planet’s low gravity and thin atmosphere (surface gravity is about one-third of Earth’s, and atmospheric pressure less than one-hundredth). Together with a mostly sandy soil, these conditions would make digging easier because the forces that keep soil particles together and compact them are weaker than on Earth.

At the time, ESA did not push forward with the idea of a plant-like planetary explorer. “It was too futuristic,” Mazzolai admits. “It required technology that was not yet there, and in fact still isn’t.” But she thought that others beyond the space sector would find the idea intriguing. After transitioning to the Italian Institute of Technology, in 2012, Mazzolai convinced the European Commission to fund a three-year study that would result in a plant-inspired robot, code-named Plantoid. “It was uncharted territory,” says Mazzolai. “It meant creating a robot without a predefined shape that could grow and move through soil — a robot made of independent units that would self-organize and make decisions collectively. It forced us to rethink everything, from materials to sensing and control of robots.”

The project had two big challenges: on the hardware side, how to create a growing robot, and on the software side, how to enable roots to collect and share information and use it to make collective decisions. Mazzolai and her team tackled hardware first and designed the robot’s roots as flexible, articulated, cylindrical structures with an actuation mechanism that can move their tip in different directions. Instead of the elongation mechanism devised for that initial ESA study, Mazzolai ended up designing an actual growth mechanism, essentially a miniature 3D printer that can continuously add material behind the root’s tip, thus pushing it into the soil.

It works like this. A plastic wire is wrapped around a reel stored in the robot’s central stem and is pulled toward the tip by an electric motor. Inside the tip, another motor forces the wire into a hole heated by a resistor, then pushes it out, heated and sticky, behind the tip, “the only part of the root that always remains itself,” Mazzolai explains. The tip, mounted on a ball bearing, rotates and tilts independent of the rest of the structure, and the filament is forced by metallic plates to coil around it, like the winding of a guitar string. At any given time, the new plastic layer pushes the older layer away from the tip and sticks to it. As it cools down, the plastic becomes solid and creates a rigid tubular structure that stays in place even when further depositions push it above the metallic plates. Imagine winding a rope around a stick and the rope becomes rigid a few seconds after you’ve wound it. You could then push the stick a bit further, wind more rope around it, and build a longer and longer tube with the same short stick as a temporary support. The tip is the only moving part of the robot; the rest of the root only extends downward, gently but relentlessly pushing the tip against the soil.
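
That winding geometry also sets how fast the root can grow: every bit of filament fed to the tip becomes wall material for the new tube. A back-of-the-envelope Python sketch of the volume bookkeeping, using assumed dimensions rather than the Plantoid’s real specifications:

```python
# Conservation of material: filament volume fed in == tube-wall volume built.
# All dimensions and the feed rate are illustrative assumptions, not
# measurements of the actual Plantoid robot.
import math

filament_diameter = 1.75e-3         # m, a common 3D-printing filament size
tube_diameter = 25e-3               # m, outer diameter of the grown root (assumed)
wall_thickness = filament_diameter  # wall is one filament layer thick (assumed)
feed_rate = 10e-3                   # m/s of filament pulled toward the tip (assumed)

filament_area = math.pi * (filament_diameter / 2) ** 2
# Cross-section of the annular tube wall the filament is wound into:
outer_r = tube_diameter / 2
wall_area = math.pi * (outer_r**2 - (outer_r - wall_thickness) ** 2)

elongation_rate = feed_rate * filament_area / wall_area
print(f"root elongates ~{elongation_rate * 1e3:.2f} mm/s "
      f"(~{elongation_rate * 3600 * 100:.0f} cm per hour)")
```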

The upper trunk and branches of the plantoid robot are populated by soft, folding leaves that gently move toward light and humidity. Plantoid leaves cannot yet transform light into energy, but Michael Graetzel, a chemistry professor at EPFL in Lausanne, Switzerland, and one of the world’s most cited scientists, has developed transparent, foldable films filled with synthetic chlorophyll that convert light into electricity and store it; one day, such films could be formed into artificial leaves powering plantoid robots. “The fact that the root only applies pressure to the soil from the tip is what makes it fundamentally different from traditional drills, which are very destructive. Roots, on the contrary, look for existing soil fractures to grow into, and only if they find none, they apply just enough pressure to create a fracture themselves,” Mazzolai explains.

The plantoid project has attracted a lot of attention in the robotics community because of the intriguing challenges that it combines — growth, shape shifting, collective intelligence — and because of possible new applications. Environmental monitoring is the most obvious one: The robotic roots could measure changing concentrations of chemicals in the soil, especially toxic ones, or they could prospect for water in arid soils, as well as for oil and gas — even though, by the time this technology is mature, we’d better have lost our dependence on them as energy sources on planet Earth. They could also inspire new medical devices, such as safer endoscopes that move in the body without damaging tissue. But space applications remain on Mazzolai’s radar.

Meanwhile, Mazzolai has started another plant-inspired project, called Growbot. This time the focus is on what happens above the ground, and the inspiration comes from climbing plants. “The invasiveness of climbing plants shows how successful they are from an evolutionary point of view,” she notes. “Instead of building a solid trunk, they use the extra energy for growing and moving faster than other plants. They are very efficient at using clues from the environment to find a place to anchor. They use light, chemical signals, tactile perception. They can sense if their anchoring in the soil is strong enough to support the part of the plant that is above the ground.” Here the idea is to build another growing robot, similar to the plantoid roots, that can overcome void spaces and attach to existing structures. “Whereas plantoids must face friction, grow-bots work against gravity,” she notes. This new project may one day result in robot explorers that can work in dark environments with a lot of empty space, such as caves or wells.

But for all her robots, Mazzolai is still keeping an eye on the visionary idea that started it all: planting robots on other planets and letting them grow. “It was too early when we first proposed it; we barely knew how to study the problem. Now I hope to start working with space agencies again.” Plant-inspired robots, she says, could not only sample the soil but also release chemicals to make it more fertile — whether on Earth or a terraformed Mars. And in addition to anchoring, she envisions a future where roboplants could be used to grow entire infrastructure from scratch. “As they grow, the roots of plantoids and the branches of a growbot would build a hollow structure that can be filled with cables or liquids,” she explains. This ability to autonomously grow the infrastructure for a functioning site would make a difference when colonizing hostile environments such as Mars, where a forest of plant-inspired robots could analyze the soil and search for water and other chemicals, creating a stable structure complete with water pipes, electrical wiring, and communication cables: the kind of structure astronauts would like to find after a year-long trip to Mars.


Dario Floreano is Director of the Laboratory of Intelligent Systems at the Swiss Federal Institute of Technology Lausanne (EPFL). He is the co-author, with Nicola Nosengo, of “Tales From a Robotic World: How Intelligent Machines Will Shape Our Future,” from which this article is excerpted.

Nicola Nosengo is a science writer and science communicator at EPFL. His work has appeared in Nature, the Economist, Wired, and other publications. He is the Chief Editor of Nature Italy.

The post Robot plants could be used to grow infrastructure in space from scratch appeared first on Popular Science.

Seals provided inspiration for a new waddling robot https://www.popsci.com/technology/seal-soft-robot/ Mon, 01 May 2023 16:00:00 +0000 https://www.popsci.com/?p=537958
Two seals laying on shore near water.
Pinnipeds are getting robotic cousins. Deposit Photos

Fin-footed mammals, aka pinnipeds, provided the template for a new soft robot.

The post Seals provided inspiration for a new waddling robot appeared first on Popular Science.

It might be difficult to see at first, but if you squint just right, you can tell the latest animal-inspired robot owes its ungainly waddle to seals. Researchers at Chicago’s DePaul University looked at the movements of the aquatic mammal and its relatives for their new robot prototype—and while it may look a bit silly, the advances could one day help in extremely dire situations.

In their paper’s abstract, the team writes that they aimed to build a robot featuring “improved degrees of freedom, gait trajectory diversity, limb dexterity, and payload capabilities.” To do this, they studied the movements of pinnipeds—the technical term for fin-footed mammals such as seals, walruses, and sea lions—as an alternative to existing quadrupedal and soft-limbed robots. Their final result is a simplified, three-limbed device that propels itself via undulating motions and is supported by a rigid “backbone” like those of its mammalian inspirations.

As also detailed last week by TechXplore, the robot’s soft limbs are each roughly 9.5 inches long by 1.5 inches wide and sheathed in a protective outer casing. Each arm is driven by pneumatic actuators filled with liquid to obtain varying degrees of stiffness. Changing the limbs’ rigidity controls the robot’s directional abilities, something researchers say is generally missing from similar crawling machines.
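
To picture how stiffness could steer, consider a toy asymmetric-thrust model: if a softer limb produces proportionally less thrust, the robot yaws toward its softer side. This is a hypothetical sketch for intuition, not the DePaul team’s controller:

```python
# Hypothetical differential-stiffness steering: softer limb -> less thrust ->
# the robot turns toward that side. Gains and values are invented.

def limb_thrust(stiffness, base_thrust=1.0):
    """Assume thrust scales linearly with limb stiffness in [0, 1]."""
    return base_thrust * stiffness

def heading_change(left_stiffness, right_stiffness, gain_deg=30.0):
    """Degrees of yaw per gait cycle; positive turns toward the right side."""
    return gain_deg * (limb_thrust(left_stiffness) - limb_thrust(right_stiffness))

print(heading_change(0.8, 0.8))  # 0.0 -> equal stiffness, straight line
print(heading_change(0.8, 0.5))  # 9.0 -> softer right limb, turn right
```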

[Related: Robot jellyfish swarms could soon help clean the oceans of plastic.]

Interestingly, the team realized that their pinniped product actually moves faster when walking “backwards.” While in reverse, the robot waddled at a solid 6.5 inches per second, compared to just 4.5 inches per second during forward motion. “Pinnipeds use peristaltic body movement to propel forward since the bulk of the body weight is distributed towards the back,” explains the team in its research paper. “But, the proposed soft robot design has a symmetric weight distribution and thus it is difficult to maintain stability while propelling forward. As a consequence, the robot shows limited frontal movements. Conversely, when propelling backward, the torque imbalance is countered by the body.”

But despite the reversal and slightly ungainly stride, the DePaul University team believes soft robots such as their seal-inspired creation could one day come in handy for dangerous tasks, including nuclear site inspections, search and rescue efforts, and even future planetary explorations. It might be one small step for robots, but it may prove one giant waddle for pinniped propulsion tech.

The post Seals provided inspiration for a new waddling robot appeared first on Popular Science.

This agile robotic hand can handle objects just by touch https://www.popsci.com/technology/robot-hand-sixth-sense/ Fri, 28 Apr 2023 18:15:00 +0000 https://www.popsci.com/?p=537548
A robotic hand manipulates a reflective disco ball in dim lighting.
The hand can spin objects like this disco ball without the need of 'eyes'. Columbia University ROAM Lab

Researchers designed a robot that doesn't need visual data to get a handle on objects.

The post This agile robotic hand can handle objects just by touch appeared first on Popular Science.

The human hand is amazingly complex—so much so that most modern robots and artificial intelligence systems have a difficult time understanding how it truly works. Although machines are now pretty decent at grasping and replacing objects, actual manipulation of their targets (i.e. assembly, reorienting, and packaging) remains largely elusive. Recently, however, researchers created an impressively dexterous robot after realizing it needed fewer, not more, sensory inputs.

A team at Columbia Engineering has just unveiled a five-digit robotic “hand” that relies solely on its advanced sense of touch, alongside motor learning algorithms, to handle difficult objects—no visual data required. Because of this, the new proof-of-concept is completely immune to common optical issues like dim lighting, occlusion, and even complete darkness.

[Related: Watch a robot hand only use its ‘skin’ to feel and grab objects.]

The new robot’s digits are each equipped with highly sensitive touch sensors, and the hand’s 15 joints actuate independently. Irregularly shaped objects such as a miniature disco ball were then placed into the hand for the robot to rotate and maneuver without dropping them. Alongside “submillimeter” tactile data, the robot relied on what’s known as “proprioception.” Often referred to as the “sixth sense,” proprioception includes abilities like physical positionality, force, and self-movement. These data points were then fed into a deep reinforcement learning program, which was able to simulate roughly one year of practice time in only a few hours via “modern physics simulators and highly parallel processors,” according to a statement from Columbia Engineering.
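
For a sense of what a vision-free policy actually “sees,” here is a sketch of an observation structure built only from touch and proprioception. The field names and sizes are hypothetical, not the Columbia team’s implementation:

```python
# Hypothetical observation for a touch-and-proprioception-only policy:
# no camera anywhere in the loop.
from dataclasses import dataclass
from typing import List

@dataclass
class HandObservation:
    contact_pressure: List[float]  # tactile reading per fingertip pad
    joint_angles: List[float]      # radians, one per actuated joint
    joint_torques: List[float]     # proprioceptive effort feedback

    def to_vector(self) -> List[float]:
        """Flat feature vector a learned policy network would consume."""
        return self.contact_pressure + self.joint_angles + self.joint_torques

obs = HandObservation(
    contact_pressure=[0.0, 0.42, 0.37, 0.0, 0.51],
    joint_angles=[0.1] * 15,
    joint_torques=[0.02] * 15,
)
# A trained policy maps the feature vector to 15 joint position targets.
print(len(obs.to_vector()), "input features -> 15 joint commands")
```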

In the announcement, Matei Ciocarlie, an associate professor in the departments of mechanical engineering and computer science, explained that “the directional goal for the field remains assistive robotics in the home, the ultimate proving ground for real dexterity.” While Ciocarlie’s team showed how this was possible without any visual data, they plan to eventually incorporate that information into their systems. “Once we also add visual feedback into the mix along with touch, we hope to be able to achieve even more dexterity, and one day start approaching the replication of the human hand,” Ciocarlie added.

[Related: AI is trying to get a better handle on hands.]

Ultimately, the team hopes to combine this dexterity and understanding with more abstract, semantic intelligence. According to Columbia Engineering researchers, their new robotic hand represents the embodied half of that pairing, while recent advances in large language modeling through OpenAI’s GPT-4 and Google Bard could one day supply the semantic half.

The post This agile robotic hand can handle objects just by touch appeared first on Popular Science.

Tony Stark would love this new experimental materials lab https://www.popsci.com/technology/a-lab-materials-discovery/ Fri, 28 Apr 2023 14:21:08 +0000 https://www.popsci.com/?p=537487
Berkeley Lab researcher Yan Zeng looks over the starting point at A-Lab.
Berkeley Lab researcher Yan Zeng looks over the starting point at A-Lab. (Credit: Marilyn Sargent/Berkeley Lab), © 2023 The Regents of the University of California, Lawrence Berkeley National Laboratory

It’s operated by robotic arms and AI, and it runs around the clock.

The post Tony Stark would love this new experimental materials lab appeared first on Popular Science.

Lawrence Berkeley National Laboratory recently announced the completion of its ‘A-Lab,’ where the ‘A’ stands for artificial intelligence, automated, and accelerated. The $2 million lab comes complete with three robotic arms, eight furnaces, and lab equipment all controlled by AI software, and it works around the clock.

If it seems like a real-life replica of Marvel character Tony Stark’s lab, well, it’s not far off. It’s an entirely autonomous lab that can create and test up to 200 samples of new materials a day, accelerating materials science discoveries at an unprecedented rate and easing researchers’ workload.

Researchers at the A-lab are currently working on materials for improved batteries and energy storage devices, hoping to meet urgent needs for sustainable energy use. The lab could spur innovation in many other industries as well.

“Materials development, which is so important for society, is just too slow,” says Gerd Ceder, the principal investigator for A-Lab.

Materials science is a field that identifies, develops, and tests materials and their application for everything from aerospace to clean energy to medicine.

Materials scientists typically use computers to predict novel, not-seen-in-nature materials that are stable enough to be used. Though a computer can generate theoretical inorganic compounds, identifying which novel compounds to make, figuring out how to synthesize them, and then evaluating their performance is a time-consuming process to do manually.

[Related: This tiny AI-powered robot is learning to explore the ocean on its own]

Additionally, computational tools have made designing materials virtually so much easier, which means that there is a surplus of novel materials that still need to be tested, creating a bottleneck effect.

“Sometimes you’re lucky and in two weeks of trying, you’ve made it, and sometimes six months in the lab and you’re nowhere,” Ceder says. “So developing chemical synthesis routes to actually make that compound that you would like to get so much can be extremely time consuming.”

A-Lab works with The Materials Project, a database of hundreds of thousands of predicted materials, run by founding director Kristin Persson. It provides free access to thousands of computationally predicted novel materials, together with information on the compounds’ structures and some of their chemical properties, that researchers can use.

“In order to actually design new materials, we can’t just predict them in the computer,” Persson says. “We have to show that this is real.”

Experienced researchers can only vet a handful of samples in a working day. A-Lab would in theory be able to produce hundreds of samples a day, quickly and more accurately. With the help of A-Lab, researchers can allocate more of their time to big-picture projects instead of grunt work.

Yan Zeng, a staff scientist leading A-Lab, compares the lab’s process to cooking a new dish: the lab is handed a new dish, which in this case is the target compound, and must find a recipe for it. Once researchers identify a novel compound with the required qualities, they send it to the lab. The AI system creates new recipes with various combinations of over 200 ingredients: precursor powders like metal oxides containing iron, copper, manganese, and nickel.

The robot arms mix the slurry of powders together with a solvent, and then bake the new sample in furnaces to stimulate a chemical reaction that may or may not yield the intended compound. Following trial and error, the AI system can then learn and tweak the recipe until it creates a successful compound. 
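
The loop Zeng describes (propose a recipe, synthesize it, score the result, try again) can be sketched in a few lines of Python. The proposal and scoring logic below are toy stand-ins, not Berkeley Lab’s software:

```python
# Toy closed-loop synthesis campaign in the spirit of A-Lab. The "furnace"
# and scoring function are stand-ins; precursor names are real oxides but
# the pretend optimum is invented for illustration.
import random

PRECURSORS = ["Fe2O3", "CuO", "MnO2", "NiO"]

def propose_recipe(history):
    """Perturb the best recipe seen so far, or start from a random mix."""
    if history:
        best = max(history, key=lambda r: r["score"])["mix"]
        return {p: max(0.0, best[p] + random.uniform(-0.1, 0.1)) for p in PRECURSORS}
    return {p: random.random() for p in PRECURSORS}

def synthesize_and_score(mix):
    """Stand-in for robot arms, furnace, and analysis of the baked sample."""
    target = {"Fe2O3": 0.5, "CuO": 0.3, "MnO2": 0.2, "NiO": 0.0}
    return 1.0 - sum(abs(mix[p] - target[p]) for p in PRECURSORS) / len(PRECURSORS)

history = []
for _ in range(50):
    mix = propose_recipe(history)
    # Failures get recorded too; they steer later proposals away from dead ends.
    history.append({"mix": mix, "score": synthesize_and_score(mix)})

best = max(history, key=lambda r: r["score"])
print(f"best score {best['score']:.2f} after {len(history)} automated attempts")
```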

[Related: A simple guide to the expansive world of artificial intelligence]

AI software controls the movement of the three robotic arms that work with lab equipment to weigh and mix different combinations of starting ingredients. And the lab itself is also autonomous: it can make new decisions about what to do following failures, independently working through new synthesis recipes faster than a human can.

“I had not expected that it would do so well on the synthesis of novel compounds,” Ceder says. “And that was kind of the maiden voyage.” 

The speedup over human scientists is not only due to the AI-controlled robots, but also to software that can draw knowledge from around 100,000 synthesis recipes across five million research papers.

Like a human scientist, A-Lab also records details from every experiment, even documenting the failures.

Researchers do not publish data from failed experiments for many reasons, including limited time and funding, lack of public interest, and the perception that failure is less informative than success. However, failed experiments do have a valuable place in research. They rule out false hypotheses and unsuccessful approaches. With easy access to data from hundreds of failed samples created each day, they can better understand what works, and what does not.

The post Tony Stark would love this new experimental materials lab appeared first on Popular Science.

The Marines are getting supersized drones for battlefield resupply https://www.popsci.com/technology/marines-large-resupply-drones/ Thu, 27 Apr 2023 20:40:51 +0000 https://www.popsci.com/?p=537422
A TRV-150 seen on April 20, 2023.
A TRV-150 seen on April 20, 2023. Raymond Valdez / US Army

The big flying machines are designed to carry about 150 pounds and can fly at about 67 miles per hour.

The post The Marines are getting supersized drones for battlefield resupply appeared first on Popular Science.

On April 11, the Department of Defense announced that it was allocating just over $8 million for 21 new delivery drones. These flying machines, officially called the TRV-150C Tactical Resupply Unmanned Aircraft Systems, are made by Survice Engineering in partnership with Malloy Aeronautics.

The TRV-150C is a four-limbed drone that looks like a quadcopter on stilts. Its tall landing legs allow it to take off with a load of up to 150 pounds of cargo slung underneath. The drone’s four limbs each mount two rotors, making the vehicle more of an octocopter than a quadcopter. 

The TRV drone family also represents the successful evolution of a long-running drone development program, one that a decade ago promised hoverbikes for humans and today is instead delivering uncrewed delivery drones.

The contract award is through the Navy and Marine Corps Small Tactical Unmanned Aircraft Systems program office, which is focused on ensuring the people doing the actual fighting on the edge of combat or action get the exact robotic assistance they need. For Marines, this idea has been put into practice and not just theorized, with an exercise involving drone resupply taking place at Quantico, Virginia, at the end of March.

The Tactical Resupply Unmanned Aircraft System (TRUAS), as the TRV-150C is referred to in use, “is designed to provide rapid and assured, highly automated aerial distribution to small units operating in contested environments; thereby enabling flexible and rapid emergency resupply, routine distribution, and a constant push and pull of material in order to ensure a constant state of supply availability,” said Master Sergeant Chris Genualdi in a release about the event. Genualdi already works in the field of airborne and air delivery, so the delivery drone became an additional tool to meet familiar problems.

Malloy Aeronautics boasts that the drone has a range of over 43 miles; in the Marines’ summary from Quantico, the drone is given a range of 9 miles for resupply missions. Both numbers can be roughly accurate: Survice puts the unencumbered range of the TRV-150 at 45 miles, while carrying 150 pounds of cargo cuts that range to 8 miles.

With a speed of about 67 mph and a flight process that is largely automated, the TRV-150C is a tool that can get meaningful quantities of vital supplies where they are needed, when they are needed. Malloy also boasts that drones in the TRV-150 family have batteries that can be easily swapped, allowing for greater operational tempo as the drones themselves do not have to wait for a recharge before being sent on their next mission.

These delivery drones use “waypoint navigation for mission planning, which uses programmed coordinates to direct the aircraft’s flight pattern,” the Marines said in a release, with Genualdi noting “that the simplicity of operating the TRUAS is such that a Marine with no experience with unmanned aircraft systems can be trained to operate and conduct field level maintenance on it in just five training days.”
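
Waypoint navigation of this sort reduces to arithmetic a planner can check before launch. The sketch below reuses the loaded-range and cruise-speed figures quoted earlier; the waypoints and the check itself are invented for illustration and are not the TRV-150C’s mission software:

```python
# Pre-flight sanity check for a waypoint mission, using the publicly quoted
# figures (~8-mile loaded range, ~67 mph cruise). Waypoints are hypothetical
# grid coordinates in miles.
import math

LOADED_RANGE_MILES = 8.0
CRUISE_MPH = 67.0

waypoints = [(0.0, 0.0), (1.5, 2.0), (3.0, 2.5), (0.0, 0.0)]  # out and back

total_miles = sum(
    math.dist(waypoints[i], waypoints[i + 1]) for i in range(len(waypoints) - 1)
)
minutes = total_miles / CRUISE_MPH * 60

print(f"mission length {total_miles:.1f} mi, ~{minutes:.1f} min at cruise")
print("within loaded range" if total_miles <= LOADED_RANGE_MILES else "too far while loaded")
```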

Reducing the complexity of the drone to essentially a flying cart that can autonomously deliver gear where needed is huge. The kinds of supplies needed in battle are all straightforward—vital tools like more bullets, more meals, or even more blood and medical equipment—so attempts at life-saving can be made even if it’s unsafe for the soldiers to move towards friendly lines for more elaborate care.

Getting the drone down to just a functional delivery vehicle comes after years of work. In 2014, Malloy debuted a video of a reduced-scale hoverbike designed for a human to ride on, using four rotors and a rectangular body. En route to becoming the basis for the delivery drone seen today, the hoverbike was explored by the US Army as a novel way to fly scouts around. The scout concept ultimately evolved into a resupply tool, which the Army tested in January 2017.

In 2020, the US Navy held a competition for a range of delivery drones at the Yuma Proving Grounds in Arizona. The entry by Malloy and Survice came in first place, and cemented the TRV series as the drones to watch for battlefield delivery. In 2021, British forces used TRV drones in an exercise, with the drones tasked with delivering blood to the wounded. 

“This award represents a success story in the transition of technology from U.S. research laboratories into the hands of our warfighters,” said Mark Butkiewicz, a vice president at SURVICE Engineering, in a release. “We started with an established and proven product from Malloy Aeronautics and integrated the necessary tech to provide additional tactical functionality for the US warfighter. We then worked with research labs to conduct field experiments with warfighters to refine the use of autonomous unmanned multirotor drones to augment logistical operations at the forward most edge of the battlefield.”

The 21 drones from the initial contract will, alongside the drones already used for training, give the Marines a better start in learning to rely on robots for resupply missions in combat. Genualdi expects the Marines to create a dedicated specialty to support the use of drones, with commanders dispatching members to learn how to work alongside them.

The drones could also see life as exploration and rescue tools, flying through small gaps in trees, buildings, and rubble in order to get people the aid they need. In both peacetime and wartime uses, the drone’s merit is its ability to get cargo where it is needed without putting additional humans at risk of catching a bullet.

The post The Marines are getting supersized drones for battlefield resupply appeared first on Popular Science.

Robot jellyfish swarms could soon help clean the oceans of plastic https://www.popsci.com/technology/jellyfish-robot-ocean-pollution/ Wed, 26 Apr 2023 17:00:00 +0000 https://www.popsci.com/?p=536873
The Jellyfish-Bot is small, energy efficient, and virtually noiseless.
The Jellyfish-Bot is small, energy efficient, and virtually noiseless. MPI-IS

By simulating jellyfish movement with artificial muscles, the robots can safely kick up ocean trash for recycling.

The post Robot jellyfish swarms could soon help clean the oceans of plastic appeared first on Popular Science.

The oceans are inundated with plastic. Despite the numerous flashy proposed solutions, there unfortunately still isn’t any surefire way to clean it all up. One of the most buzzed about ideas—underwater vacuuming—has recently come up against intense scrutiny for its potential collateral damage to marine ecosystems and wildlife. Meanwhile, even the more delicate alternatives often hinge upon large, cumbersome surface skimmers. To tackle some of these issues, scientists at Germany’s Max Planck Institute for Intelligent Systems (MPI-IS) have created a robotic trash collector inspired by some of the oceans’ oldest and most resilient residents—jellyfish.

Recently detailed in the research journal Science Advances, the team’s “Jellyfish-Bot” already shows promise in helping clean up the copious amounts of human-generated trash littering the planet’s aquatic environments. But unlike many other underwater cleaners, the prototype is incredibly small, energy-efficient, and nearly noiseless. Additionally, the hand-sized device doesn’t need to physically touch its cleanup targets. Instead, the robot takes a cue from jellyfishes’ graceful movements via six limbs employing artificial muscles called hydraulically amplified self-healing electrostatic actuators, or HASELs.

As New Atlas explains, HASELs are essentially electrode-covered sacs filled with oils. When the electrodes receive a small amount of power—in this case, about 100 mW—they become positively charged, then safely discharge into the negatively charged water around them. Alternating this current forces the oil in the sacs to move back and forth, making the actuators flap in a way that generates momentum to push trash particles upward. From there, humans or other gathering tools can scoop up the detritus.
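
In control terms, that charge-and-discharge rhythm is just a drive waveform applied at a chosen flapping frequency. A minimal sketch follows; the voltage, frequency, and square-wave shape are assumptions for illustration, not the MPI-IS prototype’s measured drive parameters:

```python
# Hypothetical HASEL drive signal: charge the electrodes for half a period,
# then let the charge bleed off into the surrounding water. Oil is squeezed
# while charged and flows back while discharging, producing the flap.

def drive_voltage(t, frequency_hz=2.0, peak_volts=1000.0):
    """Square charge/discharge cycle at the chosen flapping frequency."""
    phase = (t * frequency_hz) % 1.0
    return peak_volts if phase < 0.5 else 0.0

# Sample one second of the 2 Hz cycle every 50 ms.
for i in range(20):
    t = i * 0.05
    state = "charging (limb flexed)" if drive_voltage(t) else "discharging (limb relaxed)"
    print(f"t={t:.2f}s {state}")
```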

“When a jellyfish swims upwards, it can trap objects along its path as it creates currents around its body,” study author and postdoc in the MPI-IS Physical Intelligence Department Tianlu Wang explained in a statement. “In this way, it can also collect nutrients.”

Wang went on to describe how their robot similarly circulates water around it. “This function is useful in collecting objects such as waste particles,” Wang adds. “It can then transport the litter to the surface, where it can later be recycled.”

[Related: Ocean plastic ‘vacuums’ are sucking up marine life along with trash.]

Apart from generating currents, the Jellyfish-Bots’ actuators can also be divided up into separate responsibilities. In the team’s demonstrations, the prototypes could use all six of their limbs for propulsion, or rely on two of them as claws to lightly grasp targets like an N95 face mask.

The biggest drawback at the moment is simply the fact that a controlled Jellyfish-Bot still requires a wired connection for power, thus hampering its scope. Although researchers have been able to incorporate battery and wireless communications modules into the robots, the untethered versions cannot currently be directed in a desired path. Still, it’s easy to envision future iterations of the Jellyfish-Bot clearing this relatively small hurdle. If that is accomplished, then fleets of the cute cleanup machines may soon be deployed as a safe, efficient, and environmentally harmless way to help tackle one of the environment’s most pressing threats.

The post Robot jellyfish swarms could soon help clean the oceans of plastic appeared first on Popular Science.

Arctic researchers built a ‘Fish Disco’ to study ocean life in darkness https://www.popsci.com/technology/fish-disco-arctic-ocean/ Mon, 24 Apr 2023 11:00:00 +0000 https://www.popsci.com/?p=536004
northern lights over the Arctic ocean
Northern lights over the Arctic ocean. Oliver Bergeron / Unsplash

It's one of the many tools they use to measure artificial light’s impact on the Arctic ocean's sunless world.

The post Arctic researchers built a ‘Fish Disco’ to study ocean life in darkness appeared first on Popular Science.

During the winter, the Arctic doesn’t see a sunrise for months on end. Although completely immersed in darkness, life in the ocean goes on. Diurnal animals like humans would be disoriented by the lack of daylight, having been accustomed to regular cycles of day and night. 

But to scientists’ surprise, it seems that even the photosynthetic plankton—microorganisms that normally derive their energy from sunlight—have found a way through the endless night. These marine critters power the region’s ecosystem, through the winter and into the spring bloom. Even without the sun, daily patterns of animals migrating from the surface to the depths and back again (called the diel vertical migration) remain unchanged.

However, scientists are concerned that artificial light could have a dramatic impact on this uniquely adapted ecosystem. The Arctic is warming fast, and the ice is getting thinner—that means there are more ships, cruises, and coastal developments coming in, all of which can add light pollution to the underwater world. We know that artificial light is harmful to terrestrial animals and birds in flight. But its impact on ocean organisms is still poorly understood.

A research team called Deep Impact is trying to close this knowledge gap, as reported in Nature earlier this month. Doing the work, though, is no easy feat. Mainly, there’s a bit of creativity involved in conducting experiments in the darkness—researchers need to understand what’s going on without changing the behaviors of the organisms. Any illumination, even from the research ship itself, can skew their observations. This means that the team has to make good use of a range of tools that allow them to “see” where the animals are and how they’re behaving, even without light. 

One such invention is a specially designed circular steel frame called a rosette, which contains a suite of optical and acoustic instruments. It is lowered into the water to survey how marine life is moving under the ship. During data collection, the ship will make one pass across an area of water without any light, followed by another pass with the deck lights on. 

[Related: Boaty McBoatface has been a very busy scientific explorer]

There is a range of different rosettes, made up of varying instrument compositions. One rosette, called Frankenstein, can measure light’s effect on where zooplankton and fish move in the water column. Another, called Fish Disco, “emits sequences of multicolored flashes to measure how they affect the behavior of zooplankton,” according to Nature.

And of course, robots that can operate autonomously can come in handy for occasions like these. Similar robotic systems have already been deployed on other aquatic missions like exploring the ‘Doomsday glacier,’ scouring for environmental DNA, and listening for whales. In the absence of cameras, they can use acoustic-based tech, like echosounders (a sonar system), to detect objects in the water.

In fact, without the element of sight, sound becomes a key tool for perceiving without seeing. It’s how most critters in the ocean communicate with one another. And making sense of that sound becomes an important problem to solve. To that end, a few scientists on the team are trying to see if machine learning can be used to identify what’s in the water through the patterns of the sound frequencies organisms reflect. So far, an algorithm being tested has been able to discern two species of cod.
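
One simple way to turn reflected frequencies into species labels is a nearest-centroid classifier over each target’s frequency response. The sketch below runs on synthetic spectra; the species profiles are invented, and this is not the Deep Impact team’s actual algorithm:

```python
# Toy acoustic species classifier: compare an echo's strength at several
# echosounder frequencies against per-species average profiles.
import random

FREQS_KHZ = [38, 70, 120, 200]  # common scientific echosounder frequencies

def centroid(samples):
    return [sum(col) / len(col) for col in zip(*samples)]

def classify(spectrum, centroids):
    """Pick the species whose average profile is closest (squared distance)."""
    return min(centroids, key=lambda name: sum(
        (s - c) ** 2 for s, c in zip(spectrum, centroids[name])))

random.seed(0)
train = {  # synthetic relative echo strengths, one value per FREQS_KHZ entry
    "cod species A": [[0.9 + random.gauss(0, 0.05), 0.7, 0.5, 0.3] for _ in range(20)],
    "cod species B": [[0.6 + random.gauss(0, 0.05), 0.7, 0.8, 0.6] for _ in range(20)],
}
centroids = {name: centroid(samples) for name, samples in train.items()}

print(classify([0.88, 0.70, 0.52, 0.31], centroids))  # -> cod species A
```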

The post Arctic researchers built a ‘Fish Disco’ to study ocean life in darkness appeared first on Popular Science.

Robot trash cans have survived a New York City field test https://www.popsci.com/technology/new-york-robot-trash-can/ Sat, 22 Apr 2023 11:00:00 +0000 https://www.popsci.com/?p=535976
A treat for a very good bot.
A treat for a very good bot. Cornell University

In a recent study, people in New York interacted with robotic trash cans on wheels. Here’s how it went.

The post Robot trash cans have survived a New York City field test appeared first on Popular Science.

Throwing out trash can be an icky, and sometimes even confusing, experience. To better understand how humans interact with robots, Cornell University researchers recently created and released two trash and recycling bots to do some dirty work in a Manhattan plaza. And for most of the people who interacted with the adorable barrel bots, the robots’ helpful interceptions of waste were welcomed.

The study involved two robots, one blue and one gray, mounted on recycled hoverboard parts and equipped with 360-degree cameras. The bots received all sorts of reactions, from onlookers expressing their appreciation to people treating them like a playful dog awaiting a treat. Some even felt compelled to “feed” the robots, according to a Cornell press release.

The scientists behind the creation recently presented their study, called “Trash Barrel Robots in the City,” in the video program at the ACM/IEEE International Conference on Human-Robot Interaction. This isn’t the trashbots’ first outing in the real world: a similar robot was deployed at Stanford a few years ago and was met by bystanders who quickly began to dote on it. According to The Verge in 2016, people became so smitten with the bot that “when it falls over they race to pick it up, even asking if it’s OK.”

[Related: Meet Garmi, a robot nurse and companion for Germany’s elderly population.]

Team leader Wendy Ju, an associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech and the Technion, originally planned to turn chairs and tables in New York City into bots, but the trash can inevitably won out. “When we shared with them the trash barrel videos that we had done at Stanford, all discussions of the chairs and tables were suddenly off the table,” Ju said in a statement. “It’s New York! Trash is a huge problem!”

Of course, you can’t win over everybody, even if you’re a cute trash can. Some folks found the robots creepy, raised concerns about surveillance, gave them the middle finger, or even knocked them over. Now, the team hopes to send the trash cans out to explore the rest of New York City, hopefully to be met with adoration and not animosity.

“Everyone is sure that their neighborhood behaves very differently,” Ju said. “So, the next thing that we’re hoping to do is a five boroughs trash barrel robot study.”

The post Robot trash cans have survived a New York City field test appeared first on Popular Science.

Giving drones inflatable suits could help them survive crash landings https://www.popsci.com/technology/bird-inspired-collision-drone/ Fri, 21 Apr 2023 17:00:00 +0000 https://www.popsci.com/?p=535966
Perfectly perched.
Perfectly perched. Arizona State University

Birds once again inspire robots to nimbly navigate the skies and obstacles.

The post Giving drones inflatable suits could help them survive crash landings appeared first on Popular Science.

When entering into disaster scenarios, robots still have a major downside—their inability to recover when they inevitably crash into things. Scientists, however, have taken a page out of biology’s playbook, as they often do, to create a drone that can bounce back when met with various obstacles. 

Think of a bird landing on a tree branch—in order to do so, it likely has to collide with a few smaller branches or leaves in the process of touching down. But its joints and soft tissues cushion these bumps along the way, and its feet are built precisely to lock in place without straining a muscle. When a drone opts for a similar route, taking on a bunch of collisions on the way to its destination, the result is a little more dramatic. “They don’t recover; they crash,” Wenlong Zhang, an associate professor and robotics expert at Arizona State University, said in a release.

“We see drones used to assess damage from high in the sky, but they can’t really navigate through collapsed buildings,” Zhang added. “Their rigid frames compromise resilience to collision, so bumping into posts, beams, pipes or cables in a wrecked structure is often catastrophic.” 

Zhang is an author of a recent paper published in Soft Robotics wherein a team of scientists designed and tested a quadrotor drone with an inflatable frame, apparently the first of its kind. The inflatable frame acts almost like a blow-up suit, protecting the drone from any harsh consequences of banging into a wall or another obstacle. It also provides the kind of soft tissue absorption necessary for perching—the team’s next task.

[Related: Watch this bird-like robot make a graceful landing on its perch.]

After studying how birds land and grip onto branches with their taloned feet, the team developed a fabric-based bistable grasper for the inflatable drone. The grasper has two unpowered “resting states,” meaning it can remain open or closed without using energy, and it reacts to the impact of landing by closing its little feet and gripping hard onto a nearby object.

“It can perch on pretty much anything. Also, the bistable material means it doesn’t need an actuator to provide power to hold its perch. It just closes and stays like that without consuming any energy,” Zhang said in the release. “Then when needed, the gripper can be pneumatically retracted and the drone can just take off.”
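
The grasper’s logic amounts to a two-state machine in which the landing impact itself supplies the switching energy, and only reopening draws power. A minimal sketch, with an assumed force threshold that is not from the ASU paper:

```python
# Bistable gripper sketch: rests open or closed with zero power draw; a hard
# enough landing snaps it shut, and only the pneumatic release costs energy.

class BistableGrasper:
    IMPACT_THRESHOLD_N = 2.0  # assumed snap-through force, not a measured value

    def __init__(self):
        self.closed = False  # resting state: open

    def on_impact(self, force_newtons):
        """Mechanical snap-through: the impact itself flips the state."""
        if not self.closed and force_newtons >= self.IMPACT_THRESHOLD_N:
            self.closed = True

    def pneumatic_release(self):
        """The only powered action: a pressure pulse re-opens the grasper."""
        self.closed = False

g = BistableGrasper()
g.on_impact(3.1)       # perching bump: snaps shut, then holds with no power
print("perched:", g.closed)
g.pneumatic_release()  # ready to take off again
print("perched:", g.closed)
```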

A more resilient type of drone is crucial for search-and-rescue scenarios when the path forward may be filled with debris, but the authors could also see this kind of creation being useful in monitoring forest fires or even exploring other planets.

The post Giving drones inflatable suits could help them survive crash landings appeared first on Popular Science.

The Terranaut is a new mine-hunting bot designed for beaches https://www.popsci.com/technology/terranaut-robot-mine-clearing/ Fri, 21 Apr 2023 14:25:55 +0000 https://www.popsci.com/?p=535906
Marines during an exercise in Hawaii on April 10, 2023. The Terranaut robot, not pictured, is designed to cope with explosives in these kinds of environments.
Marines during an exercise in Hawaii on April 10, 2023. The Terranaut robot, not pictured, is designed to cope with explosives in these kinds of environments. Clayton Baker / US Marines

The autonomous robot is intended for the dangerous work of dealing with explosives in areas where Marines would typically tread.

The post The Terranaut is a new mine-hunting bot designed for beaches appeared first on Popular Science.

On April 19, Nauticus Robotics announced that its work on the Terranaut, an amphibious machine designed to defeat explosive mines for the Defense Innovation Unit, had cleared its initial phase and was progressing to further development. The machine builds on Nauticus’ previous work with aquatic uncrewed vehicles. It fits into a holistic picture of untethered, autonomous underwater operations, where tools developed for commercial underwater work inform machines specifically built to tackle the special military needs below the ocean’s surface.

Nauticus teased the Terranaut announcement on social media with a picture of tread lines on a beach leading into the ocean.

DIU, or the Defense Innovation Unit, is an organization within the larger Department of Defense designed to pull innovations from the commercial tech world into military use. Rather than reinventing the wheel, it is built to look at wagon wheels it could simply buy for its chariots.

“DIU gets intrigued when you have some commercial-facing technologies that they think they could orient towards a defense mission,” Nauticus CEO Nicolaus Radford tells Popular Science. “A lot of people focus on our big orange robots. But what’s between our robots’ ears is more important.” 

“So DIU is like, all right, you guys have made some commercial progress,” Radford adds. “You’ve got a commercial platform both in software and hardware. Maybe we can modify it a little bit towards some of these other missions that we’re interested in.”

In its announcement, Nauticus emphasized that Terranaut is being developed as an autonomous mine countermeasure robot that can work on beaches and in surf zones. These are exactly the kinds of areas where Marines train and plan to fight, especially in Pacific island warfare. Terranaut, as promised, will both swim and crawl, driven by an autonomous control system that can receive human direction through acoustic communication.

The Terranaut can navigate on treads and with powerful thrusters, and there are plans for manipulator arms that can emerge from the body to tackle tasks like disassembling an underwater mine.

The Terranaut robot. Nauticus Robotics

“It’s able to fly through the water column and then also change its buoyancy in a way that it can get appreciable traction,” says Radford. “Let’s say you’re driving on the sub-sea bed and you encounter a rock. Well, you don’t know how long the rock is, it could take you a while to get around it, right?” The solution in that case would be to go above it. 

Much of the work that informed the creation and design of Terranaut comes from Nauticus’ work on Aquanaut, a 14.5-foot-long submersible robot that can operate at depths of almost 10,000 feet and, in its regular versions, at distances of up to 75 miles. Powered by an electric motor and carrying over 67 kilowatt-hours of battery capacity, the Aquanaut moves at a baseline speed of 3 knots, or almost 3.5 mph, underwater, and it can run on battery power for over four days continuously. But what most distinguishes the Aquanaut is its retractable manipulator arms, which fold into its body when not needed, and its ability to operate without direct communications control through an umbilical wire, unlike many other undersea robots.

The Aquanaut can perceive its environment thanks to sonar, stereo optical sensors, native 3D point-cloud imagery, and other sensors. This data can be collected at a higher resolution than is transmittable while deep undersea, with the Aquanaut able to surface or dock to transmit higher volumes and densities of data faster.

Like the Aquanaut, the Terranaut does not have an umbilical connecting it to a boat.

Typically, boats have umbilicals connecting them to robots “because you have to have an operator with joysticks looking at HD monitors, being able to drive this thing,” says Radford. “What we said is ‘let’s throw all that out.’ We can create a hybrid machine that doesn’t need an umbilical that can swim really far, but as it turns out, people just don’t want to take pictures. They want to pick something up, drop something off, cut something, plug something in, and we developed a whole new class of subsea machines that allows you to do manipulation underwater without the necessity of an umbilical.”

Removing the umbilical frees up the design for what kind of ships can launch and manage underwater robotics. It also comes with a whole new set of problems, like how to ensure that the robot is performing the tasks asked of it by a human operator, now that the operator is not driving but directing the machine. Communication through water is hard, as radio signals and light signals are both limited in range and efficacy, especially below the ocean’s surface.

Solving these twin problems means turning to onboard autonomy and acoustic controls.

“We have data rates akin to dial-up networking in 1987,” says Radford. “You’re not gonna be streaming HD video underwater with a Netflix server, but there are ways in which you can send representative information in the 3D environment around you back to an operator, and then the operator flies the autopilot of the robot around.”
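Some rough arithmetic shows what rates like that mean in practice. The 10-kilobits-per-second figure below is our assumption, in the ballpark of commercial acoustic modems, not a published Terranaut specification:

```python
# Back-of-the-envelope acoustic transfer times.
# ASSUMPTION: ~10 kbps link, a ballpark figure for commercial acoustic modems;
# this is not a published Terranaut specification.
LINK_RATE_BPS = 10_000  # bits per second

def transfer_time_seconds(payload_bytes: int) -> float:
    """Time to move a payload across the link, ignoring protocol overhead."""
    return payload_bytes * 8 / LINK_RATE_BPS

print(f"1 MB photo:       {transfer_time_seconds(1_000_000) / 60:.0f} minutes")  # ~13 minutes
print(f"2 KB status ping: {transfer_time_seconds(2_000):.1f} seconds")           # 1.6 seconds
```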

That means, in essence, that the robot itself is largely responsible for managing the specifics of its ballast and direction, and following commands transmitted acoustically through the water. In return it sends information back, allowing a human to select actions and behaviors already loaded onto the robot.
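In software terms, that command-and-telemetry pattern can be as simple as a lookup table of onboard behaviors keyed by short message IDs. The sketch below is hypothetical; Nauticus hasn’t published its protocol, and the behavior names are ours:

```python
# Hypothetical dispatch loop for acoustically commanded behaviors.
# The robot stores full behaviors on board; the operator only sends a short ID.
BEHAVIORS = {
    0x01: "hold_station",
    0x02: "survey_grid",
    0x03: "approach_target",
    0x04: "deploy_manipulators",
}

def handle_acoustic_message(message: bytes) -> str:
    """Decode a one-byte command ID into the name of a behavior to execute."""
    command_id = message[0]
    # Unknown or corrupted commands fail safe: the robot just holds position.
    return BEHAVIORS.get(command_id, "hold_station")

assert handle_acoustic_message(b"\x02") == "survey_grid"
```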

Like the Aquanaut before it, the Terranaut will come preloaded with the behaviors needed to navigate its environment and perform the tasks assigned to it. Once the Terranaut rolls through surfy shallows, onto beaches, and into visual range, it will apply those tools (adaptive autonomy and remote human guidance) to taking apart deadly obstacles like underwater explosives.

“I think this is the beginning of a very vibrant portfolio of aquatic drones that I hope captures the public’s imagination on what’s possible underwater. I think it’s just as fascinating as space, if not more so, because it’s so much more near to us,” said Radford. “You know, five percent of the ocean seabed has been explored on any level. We live on an ocean planet stupidly called Earth.”

The post The Terranaut is a new mine-hunting bot designed for beaches appeared first on Popular Science.

A new robotic seed can wriggle into soil to harvest climate data https://www.popsci.com/technology/seed-robot-soil/ Thu, 20 Apr 2023 20:00:00 +0000 https://www.popsci.com/?p=535681
When tested in a soil sample, the robot was able to shimmy about, adapt its shape to cracks, and burrow into holes in the ground much like the natural seed. Unsplash

The nature-inspired device could help improve our soddy communication with sod.

Soil is one of the most crucial, if underrated, elements of daily life—it’s essential for growing the food and resources we rely on, combats drought, protects against flooding, and can sequester carbon dioxide for years to come. But the dirt beneath our feet is constantly under threat from the rising temperatures and biodiversity loss that come with climate change. And however simple we may think soil is, it’s pretty hard to know from the surface what’s really going on deep in the ground.

Scientists in Italy, however, think they may have a robotic solution—a seed-inspired robot. Researchers at the Bioinspired Soft Robotics (BSR) Lab, part of the Istituto Italiano di Tecnologia (IIT-Italian Institute of Technology) in Genoa, have developed the first 4D-printed seed-inspired soft robot, which they say can act as a sensor for monitoring pollutants, CO2 levels, temperature, and humidity in soil. They published their findings earlier this year in Advanced Science. The research is part of the EU-funded I-Seed project, which aims to make robots that can detect environmental changes in air and soil.

What they’ve got here is an artificial seed inspired by the structure of a South African geranium, Pelargonium appendiculatum. The seeds of the tuberous, hairy-leafed plant have the ability to change shape in response to how humid their environment is. When the time comes for the seeds to leave the plant, they detach and can move independently to “penetrate” soil fractures, according to the study. This almost looks like a crawling and burrowing action, driven by the seed’s helical shape changing in response to the environment. In a way, the curly seeds can find a home for themselves simply by expanding and shrinking with the water content of the air.

[Related: This heat-seeking robot looks and moves like a vine.]

The team at IIT-BSR mimicked these seeds by combining 3D printing and electrospinning, using materials that likewise absorb moisture and expand when exposed to humidity. Using fused deposition modeling, the authors printed a substrate layer of polycaprolactone, a biodegradable thermoplastic polyester, activated with oxygen plasma to increase its water-attracting abilities. Next, they added electrospun hygroscopic fibers made of a polyethylene oxide shell and a cellulose nanocrystal core.

When tested in a soil sample, the robot was able to shimmy about, adapt its shape to cracks, and burrow into holes in the ground much like the natural seed. Not to mention, it was capable of lifting about 100 times its own weight. First author Luca Cecchini, a PhD student at IIT, said in a statement that the biodegradable and energy-autonomous robots could be used as “wireless, battery-free tools for surface soil exploration and monitoring.”

The first I-Seed created at IIT is inspired by the seed structure of a South African geranium, the Pelargonium appendiculatum. Credit: IIT-Istituto Italiano di Tecnologia

“With this latest research,” Barbara Mazzolai, associate director for robotics of the IIT and coordinator of the I-Seed Project, said in the statement, “we have further proved that it is possible to create innovative solutions that not only have the objective of monitoring the well-being of our planet, but that do so without altering it.”

The post A new robotic seed can wriggle into soil to harvest climate data appeared first on Popular Science.

This robot dog learned a new trick—balancing like a cat https://www.popsci.com/technology/robot-dog-balance-beam/ Wed, 19 Apr 2023 14:00:00 +0000 https://www.popsci.com/?p=535177
Just a step at a time. Carnegie Mellon University

Without a tail or a bendy spine, no less.

We’ve seen how a quadruped robot dog can dribble a ball, climb walls, run on sand, and open doors with its “paws.” The latest test isn’t that of motion, necessarily, but of balance. This time, researchers at Carnegie Mellon University’s Robotics Institute have found a way to make an off-the-shelf quadruped robot agile and stable enough to walk across a balance beam.

Even for humans, the balance beam is quite a feat to conquer—something that leaves even gymnasts nervous. “It’s the great equalizer,” Michigan women’s gymnastics coach Beverly Plocki told the Chicago Tribune in 2016. “No other event requires the same mental focus. You stumble on the floor, it’s a minor deduction. The beam is the event of perfection. No room for error.”

[Related: A new tail accessory propels this robot dog across streams.]

But robot dogs’ legs aren’t exactly coordinated. If three feet can touch the ground, the robot is generally fine, but reduce that to one or two feet and you’re in trouble. “With current control methods, a quadruped robot’s body and legs are decoupled and don’t speak to one another to coordinate their movements,” Zachary Manchester, an assistant professor in the Robotics Institute and head of the Robotic Exploration Lab, said in a statement. “So how can we improve their balance?”

How CMU’s scientists managed to get a robot to daintily scale a narrow beam—the first time this has been done, the researchers claim—is by leveraging hardware often used on spacecraft: a reaction wheel actuator. This system helps the robot balance wherever its feet are, which is pretty helpful in the absence of something like a tail or a flexible spine, the features that help actual four-legged animals catch their balance.

[Related: This bumblebee-inspired bot can bounce back after injuring a wing.]

“You basically have a big flywheel with a motor attached,” said Manchester. “If you spin the heavy flywheel one way, it makes the satellite spin the other way. Now take that and put it on the body of a quadruped robot.”
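The physics behind Manchester’s flywheel analogy is conservation of angular momentum: with no external torque acting on the system, any spin given to the wheel must be balanced by an opposite spin of the body it’s mounted on.

```latex
% Conservation of angular momentum for a body with an internal reaction wheel:
% torquing the wheel one way rotates the body the other way.
I_{\text{wheel}}\,\Delta\omega_{\text{wheel}} = -\,I_{\text{body}}\,\Delta\omega_{\text{body}}
```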

The team mounted two reaction wheel actuators on the pitch and roll axes of a commercial Unitree A1 robot, making it so the little bot could balance itself no matter where its feet were. Then they ran two dexterity tests—the first dropping the robot upside down from about half a meter in the air. Like a cat, it was able to flip itself over and land on its feet.

Second came the balance beam test, this time making the robot walk along a six-centimeter-wide balance beam, which the bot did with ballerina-like gracefulness. This could come in handy in the future, not just for entertainment value, but for maneuvering through tricky scenarios in search-and-rescue work, which is often a goal for development across all sorts of robots. The team will show off their latest endeavor at the 2023 International Conference on Robotics and Automation this summer in London.

The post This robot dog learned a new trick—balancing like a cat appeared first on Popular Science.

Meet xenobots, tiny machines made out of living parts https://www.popsci.com/technology/xenobots/ Mon, 17 Apr 2023 11:00:00 +0000 https://www.popsci.com/?p=534352
A xenobot, or a living robot, in culture, under a microscope.
Xenobots can work together to gather particulate matter into a pile. Douglas Blackiston and Sam Kriegman

The starting ingredient for these bio-robots: frog cells.

You may or may not have heard of xenobots, a kind of Frankenfrog creation that involves researchers turning frog embryo cells into tiny bio-machines that can move around, push or carry objects, and work together. These ephemeral beings were first made by a team of scientists from Tufts University and the University of Vermont in 2020. 

The goal behind building these “bots” was to understand how cells communicate with one another. Here’s a breakdown of the hard facts behind how xenobots actually work, and what they are currently used for. 

What are xenobots?

A “living robot” can sound like a scary sci-fi term, but they are not anything like the sentient androids you may have seen on screen.

“At the most basic level, this is a platform or way to build with cells and tissues, the way we can build robots out of mechanical components,” says Douglas Blackiston, a senior scientist at Tufts University. “You can almost think of it as Legos, where you can combine different Legos together, and with the same set of blocks you can make a bunch of different things.” 

Xenobots are tiny. Here they are against a dollar bill for size. Douglas Blackiston and Sam Kriegman

But why would someone want to build robots out of living components instead of traditional materials, like metal and plastic? One advantage is that a bio-robot of this sort is biodegradable. In environmental applications, that means if the robot breaks, it won’t contaminate its surroundings with garbage like metal, batteries, or plastic. Researchers can also program xenobots to fall apart naturally at the end of their lives.

How do you make a xenobot?

The building blocks for xenobots come from the eggs laid by the female African clawed frog, which goes by the scientific name Xenopus laevis.

Just like a traditional robot, a xenobot needs other essential components: a power source, a motor or actuator for movement, and sensors. But with xenobots, all of these components are biological.

A xenobot’s energy comes from the yolk that’s a part of all amphibian eggs, which can power these machines for about two weeks with no added food. To get them to move, scientists can add biological “motors” like muscle or cardiac tissue. They can arrange the motors in different configurations to get the xenobots to move in certain directions or at certain speeds.

“We use cardiac tissue because cardiac cells pulse at a regular rate, and that gives you sort of an inchworm type of movement if you build with it,” says Blackiston. “The other types of movement we get are from cilia. These are small hair-like structures that beat on the outside of different types of tissues. And this is a type of movement that dominates the microscopic world. If you take some pond water and look, most of what you see will move around with cilia.” 

Swimming xenobots with cilia covering their surface. Douglas Blackiston and Sam Kriegman

Scientists can also add components like optogenetic muscle tissues or chemical receptors to allow these biobots to respond to light or other stimuli in their environment. Depending on how the xenobots are programmed, they can autonomously navigate through their surroundings, or researchers can add stimuli to “drive” them around.

“There’s also a number of photosynthetic algae that have light sensors that directly hook onto the motors, and that allows them to swim towards sunlight,” says Blackiston. “There’s been a lot of work on the genetic level to modify these to respond to different types of chemicals or different types of light sources and then to tie them to specific motors.”

[Related: Inside the lab that’s growing mushroom computers]

Even in their primitive form, xenobots can still convey some type of memory, or relay information back to the researchers about where they went and what they did. “You can pretty easily hook activation of these different sensors into fluorescent molecules that either turn on or change color when they’re activated,” Blackiston explains. For example, when the bots swim through a blue light, they might change color from green to red permanently. As they move through mazes with blue lights in certain parts of it, they will glow different colors depending on the choices they’ve made in the maze. The researcher can walk away while the maze-solving is in progress, and still be in the know about how the xenobot navigated through it.  

They can also, for example, release a compound that changes the color of the water if they sense something.  

These sensors make the xenobot easy to manage. In theory, scientists can make a system in which the xenobots are drawn to a certain wavelength of light. They could then shine this at an area in the water to collect all of the bots. And the ones that slip through can still harmlessly break down at the end of their life. 

A xenobot simulator

Blackiston, along with collaborators at Northwestern and the University of Vermont, is using an AI simulator they built to design different types of xenobots. “It looks sort of like Minecraft, and you can simulate cells in a physics environment and they will behave like cells in the real world,” he says. “The red ones are muscle cells, blue ones are skin cells, and green ones are other cells. You can give the computer a goal, like: ‘use 5,000 cells and build me a xenobot that will walk in a straight line or pick something up,’ and it will try hundreds of millions of combinations on a supercomputer and return to you blueprints that it thinks will be extremely performant.”
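The loop Blackiston describes is, at heart, an evolutionary search: generate blueprints, score them in simulation, keep the best, mutate, repeat. The toy sketch below captures only that shape; the real system scores candidates with cell-level physics on a supercomputer, and our stand-in fitness function is purely illustrative:

```python
import random

# Toy evolutionary search over xenobot "blueprints" (lists of cell types).
# Illustrative only: the real pipeline evaluates candidates in a physics
# simulator, not with a made-up fitness function like this one.
CELL_TYPES = ["muscle", "skin", "other"]

def random_blueprint(n_cells: int = 25) -> list[str]:
    return [random.choice(CELL_TYPES) for _ in range(n_cells)]

def fitness(blueprint: list[str]) -> int:
    # Stand-in score: pretend muscle placed toward the rear walks faster.
    return sum(i for i, cell in enumerate(blueprint) if cell == "muscle")

def mutate(blueprint: list[str], rate: float = 0.1) -> list[str]:
    return [random.choice(CELL_TYPES) if random.random() < rate else cell
            for cell in blueprint]

population = [random_blueprint() for _ in range(100)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)  # best blueprints first
    survivors = population[:20]
    offspring = [mutate(random.choice(survivors)) for _ in range(80)]
    population = survivors + offspring

population.sort(key=fitness, reverse=True)
print("best blueprint found:", population[0])
```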

Most of the xenobots he’s created have come from blueprints that have been produced by this AI. He says this speeds up a process that would have taken him thousands of years otherwise. And it’s fairly accurate as well, although there is a bit of back and forth between playing with the simulator and modeling the real-world biology. 

Xenobots of different shapes crafted using computer-simulated blueprints. Douglas Blackiston and Sam Kriegman

The xenobots that Blackiston and his colleagues use are not genetically modified. “When we see the xenobots doing kinematic self-replication and making copies of themselves, we didn’t program that in. We didn’t have to design a circuit that tells the cells how to do kinematic self replication,” says Michael Levin, a professor of biology at Tufts. “We triggered something where they learned to do this, and we’re taking advantage of the native problem-solving capacity of cells by giving it the right stimuli.” 

What can xenobots help us do?

Xenobots are not just a blob of cells congealing together—they work like an ecosystem and can be used as tools to explore new spaces, in some cases literally, like searching for cadmium contamination in water. 

“We’re jamming together cells in configurations that aren’t natural. Sometimes it works, sometimes the cells don’t cooperate,” says Blackiston. “We’ve learned about a lot of interesting disease models.”

For example, with one model of xenobot, they’ve been able to examine how cilia in lung cells may work to push particles out of the airway or spread mucus correctly, and see that if the cilia don’t work as intended, defects can arise in the system.

The deeper application is using these biobots to understand collective intelligence, says Levin. That could be a groundbreaking discovery for the space of regenerative medicine. 

“For example, cells are not hardwired to do these specific things. They can adapt to changes and form different configurations,” he adds. “Once we figure out how cells decide together what structures they’re going to form, we can take advantage of those computations and build new organs, regenerate after injury, reprogram tumors—all of that comes from using these biobots as a way to understand how collective decision-making works.”

The post Meet xenobots, tiny machines made out of living parts appeared first on Popular Science.

Cyborg cockroaches could one day scurry to your rescue https://www.popsci.com/technology/cockroach-cyborg/ Thu, 13 Apr 2023 20:00:00 +0000 https://www.popsci.com/?p=533937
Madagascar hissing cockroach balanced on human finger against green backdrop
Imagine this, but with a tiny computer strapped to its back. Deposit Photos

Here's how hacking bug brains could one day help save lives.

Imagine yourself trapped in a building’s rubble following an earthquake. It’s a terrifying prospect, especially if time is of the essence for search and rescue operations. Now imagine one of your rescuers turns out to be a cyborg cockroach.

Regardless of how you feel about insects, a team of scientists at Osaka University in Japan apparently believes these resilient little bugs can come in handy in times of disaster. According to the researchers’ paper recently published in the journal Cyborg and Bionic Systems, society is closer than it’s ever been to deploying cybernetically augmented bugs to aid in real-world scenarios such as natural disasters and extreme environment explorations. And everyone owes it all to their legion of semi-controllable cyborg Madagascar hissing cockroaches.

[Related: Spider robots could soon be swarming Japan’s aging sewer systems.]

Insects are increasingly inspiring robotic advancements, but biomimicry still often proves immensely complex. As macabre as it may seem, researchers have found that augmenting six-legged creatures, instead of mechanically replicating them, can offer simpler, more cost-effective alternatives. In this most recent example, scientists implanted tiny, stimulating electrodes into the cockroaches’ brains and peripheral nervous systems, which were subsequently connected to a machine learning program. The system was then trained to recognize the insects’ locomotive states—if a cockroach paused at an obstacle or hunkered down in a dark, cold environment (as cockroaches are evolutionarily prone to do), the electrodes directed it to continue moving along an alternative route. To prevent excess fatigue, the researchers fine-tuned the stimulating currents to keep them as minimal as possible.
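Stripped of the neuroscience, the control policy the paper describes reduces to a small decision loop: classify the insect’s state, and stimulate only when it stalls. A hypothetical sketch follows; the state labels, threshold, and current value are ours, not the study’s:

```python
# Hypothetical control loop for a cyborg cockroach: nudge only when stalled.
STOP_LIMIT_SECONDS = 5.0  # how long the insect may pause before a nudge
NUDGE_CURRENT_UA = 50     # keep stimulation minimal to limit fatigue

def control_step(state: str, stopped_for_seconds: float) -> int:
    """Return stimulation current in microamps; 0 means leave the insect alone."""
    if state == "walking":
        return 0  # preserve the insect's natural, agile locomotion
    if stopped_for_seconds > STOP_LIMIT_SECONDS:
        return NUDGE_CURRENT_UA  # stalled too long: prompt it to keep searching
    return 0

assert control_step("walking", 99.0) == 0
assert control_step("stopped", 6.0) == NUDGE_CURRENT_UA
```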

Cyborg cockroaches could help save lives. Credit: Osaka University

Importantly, the setup didn’t reduce the insects to zombie cockroaches, but instead simply influenced their movement decisions. “We don’t have to control the cyborg like controlling a robot. They can have some extent of autonomy, which is the basis of their agile locomotion,” Keisuke Morishima, a roboticist and one of the study’s authors, said in a statement. “For example, in a rescue scenario, we only need to stimulate the cockroach to turn its direction when it’s walking the wrong way or move when it stops unexpectedly.”

[Related: This bumblebee-inspired bot can bounce back after injuring a wing.]

While the scientists can’t yet control their cockroaches’ exact directions this way, their paper concludes the setup “successfully increased [their] average search rate and traveled distance up to 68 and 70 percent, respectively, while the stop time was reduced by 78 percent.” Going forward, they hope to improve these accuracy rates, as well as develop means to intentionally direct their enhanced cockroaches. Once that’s achieved, you can start worrying about the zombie cyborg cockroach invasion.

The post Cyborg cockroaches could one day scurry to your rescue appeared first on Popular Science.

Watch a robot hand only use its ‘skin’ to feel and grab objects https://www.popsci.com/technology/soft-robot-hand/ Wed, 12 Apr 2023 20:00:00 +0000 https://www.popsci.com/?p=533712
Soft robotic hand picking up plastic ball from table
It's harder than it looks. University of Cambridge

It turns out robot hands don't need to articulate their fingers to grasp objects.

Robots can have trouble grasping the concept of “grasping.” It’s so bad that even a toddler’s motor skills are usually far more developed than those of some of the most advanced bots. For example, as instinctually easy as it is for a human to pick up an egg, robots usually struggle to compute the intricacies of force and manipulation while also not expending too much energy. To solve this issue, researchers at the University of Cambridge recently found a novel solution: streamlining what a robot hand needs to do.

As detailed in a paper published in Advanced Intelligent Systems, the team has developed a low-cost robotic hand capable of passively grasping and holding various objects via sensors embedded in its “skin.” What’s more, no finger articulation is needed to accomplish its tasks, thus drastically simplifying its design, programming, and energy needs.

“We want to simplify the hand as much as possible,” Fumiya Iida, a professor in the university’s Bio-Inspired Robotics Laboratory and one of the paper’s co-authors, said in a statement. “We can get lots of good information and a high degree of control without any actuators, so that when we do add them, we’ll get more complex behavior in a more efficient package.”

To pull it off, researchers first implanted tactile sensors within a soft, 3D-printed, anthropomorphic hand that only moved via its wrist. The team then performed over 1,200 tests to study its grasping and holding abilities. Many of these tests focused on picking up small, 3D-printed plastic balls by mimicking pre-determined movements demonstrated by humans. After the plastic balls, the hand graduated to attempting to pick up bubble wrap, a computer mouse, and even a peach. According to their results, the hand successfully managed 11 of the 14 additional test objects.

[Related: Human brains have to work overtime to beat robots at Ping-Pong.]

According to first author Kieran Gilday, the team’s robot appendage learns over time that certain combinations of wrist motion and sensor data lead to success or failure, and adjusts as needed. “The hand is very simple, but it can pick up a lot of objects with the same strategy,” they said in the statement.

While by no means perfect, the simplified robotic hand could prove useful in a variety of environments and industries, such as manufacturing. Moving forward, researchers hope to potentially expand the robot hand’s capabilities through combining it with computer vision and teaching it “to exploit its environment” to utilize a wider array of objects.

The post Watch a robot hand only use its ‘skin’ to feel and grab objects appeared first on Popular Science.

Human brains have to work overtime to beat robots at Ping-Pong https://www.popsci.com/technology/human-robot-brain-neuroscience-ping-pong/ Tue, 11 Apr 2023 15:00:00 +0000 https://www.popsci.com/?p=533206
Ping Pong player wearing electrode cap playing against robot ball server
Playing against robots can result in brain 'desynchronization.' University of Florida / Frazier Springfield

Playing sports with bots can be a real workout for the human brain.

Dealing with humanoid robots in their current iterations often seems a bit uncanny—the technology feels vaguely like us, but comes up short in a way that boggles the mind. While this may not be as major a problem in manufacturing roles, in industries like elderly care, personality—even if robotic—could go a long way. But it’s more than just how they look—movement is key, too. As robots increasingly become integrated facets of everyday modern life, both their makers and outside researchers want to pinpoint why people respond to them the way they do, and how they can improve.

As researchers recently showed, one way to analyze these interactions is through a few rounds of table tennis. As detailed in a new paper published in the journal eNeuro, a team at the University of Florida recorded human brain activity during dozens of hours of matches against both human and robot opponents. They then compared the games to see if the two kinds of rivals produced different neurological readings.

[Related: Meet Garmi, a robot nurse and companion for Germany’s elderly population.]

In a statement, Daniel Ferris, a professor of biomedical engineering at the University of Florida and advisor on the project run by Amanda Studnicki, a graduate student at UF, said, “Robots are getting more ubiquitous… Humans interacting with robots is going to be different than when they interact with other humans. Our long term goal is to try to understand how the brain reacts to these differences.”

To do this, Studnicki and Ferris assembled a cap lined with over 100 electrodes attached to a backpack-sized device, then asked human trial participants to don the futuristic hat while playing Ping-Pong. As it turns out, the human brain shows clear signs of working harder if paired against a robot opponent.

While serving against a fellow human, players’ neurons worked in unison to interpret subtle body cues, timing, and speed. When squared up against a ball-serving machine, however, the neurons weren’t as aligned, a situation known within neuroscience as “desynchronization.”

[Related: Do we trust robots enough to put them in charge?]

“In a lot of cases, that desynchronization is an indication that the brain is doing a lot of calculations as opposed to sitting and idling,” explained Ferris in a statement on Monday.

The team theorizes that because human brains work so differently against a robotic opponent, nothing beats a fellow member of the species when it comes to sports training. That said, Studnicki isn’t so sure that will always be the case. “I still see a lot of value in practicing with a machine,” they said in Monday’s announcement. “But I think machines are going to evolve in the next 10 or 20 years, and we could see more naturalistic behaviors for players to practice against.”

Those naturalistic behaviors could come about through continued robotic improvements alongside similar brain activity monitoring. The closer to synchronization, the more seamless and less uncanny people’s experiences with robots could become. The ball, after all, is firmly in robot makers’ court.

The post Human brains have to work overtime to beat robots at Ping-Pong appeared first on Popular Science.

Meta just released a tool that helps computers ‘see’ objects in images https://www.popsci.com/technology/meta-segment-anything-ai-tool/ Thu, 06 Apr 2023 22:00:00 +0000 https://www.popsci.com/?p=532186
figure with mixed reality headset
Segmentation is a key feature of machine vision. Liam Charmer / Unsplash

You can test out the model in your browser right now.

In a blog post this week, Meta AI announced the release of a new AI tool that can identify which pixels in an image belong to which object. The Segment Anything Model (SAM) performs a task called “segmentation” that’s foundational to computer vision, the process that computers and robots employ to “see” and comprehend the world around them. Along with its new AI model, Meta is also making its training dataset available to outside researchers.

In his 1994 book, The Language Instinct, Steven Pinker wrote that “the main lesson of 35 years of AI research is that the hard problems are easy and the easy problems are hard.” Known as Moravec’s paradox, the observation still holds true 30-odd years later. Large language models like GPT-4 are capable of producing text that reads like something a human wrote in seconds, while robots struggle to pick up oddly shaped blocks—a task so seemingly basic that children do it for fun before they turn one.

Segmentation falls into this looks-easy-but-is-technically-hard category. You can look at your desk and instantly tell what’s a computer, what’s a smartphone, what’s a pile of paper, and what’s a scrunched-up tissue. But to computers processing a 2D image (and even videos are just series of 2D images), everything is just a bunch of pixels with varying values. Where does the table top stop and the tissue start?
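A segmentation mask answers that question pixel by pixel. In code, it’s nothing more than a boolean array the same shape as the image; here’s a minimal NumPy illustration (a real model, of course, does far better than this crude brightness threshold):

```python
import numpy as np

# An image is an array of pixel values; a segmentation mask is a boolean array
# of the same shape marking which pixels belong to one object.
image = np.array([
    [ 12,  14, 200, 210],
    [ 11,  13, 205, 220],
    [ 10,  12,  15,  14],
])

mask = image > 100  # toy "segmentation": call the bright pixels the object
print(mask.sum(), "of", image.size, "pixels belong to the object")  # 4 of 12
```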

Meta’s new SAM AI is an attempt to solve this issue in a generalized way, rather than using a model designed specifically to identify one thing, like faces or guns. According to the researchers, “SAM has learned a general notion of what objects are, and it can generate masks for any object in any image or any video, even including objects and image types that it had not encountered during training.” In other words, instead of only being able to recognize the objects it’s been taught to see, it can guess at what the different objects are. SAM doesn’t need to be shown hundreds of different scrunched-up tissues to tell one apart from your desk; its general sense of things is enough.

[Related: One of Facebook’s first moves as Meta: Teaching robots to touch and feel]

You can try SAM in your browser right now with your own images. SAM can generate a mask for any object you select by clicking on it with your mouse cursor or drawing a box around it. It can also just create a mask for every object it detects in the image. According to the researchers, SAM is also able to take text prompts—such as: select “cats”—but that feature hasn’t been released to the public yet. It did a pretty good job of segmenting the images we tested out here at PopSci.
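Meta also open-sourced the model itself. The snippet below follows the usage shown in the segment-anything project’s README at release; the image path, click coordinates, and checkpoint file are placeholders you’d swap for your own:

```python
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Per the segment-anything README: load a checkpoint, embed an image once,
# then prompt with clicks. Paths and coordinates here are placeholders.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("desk.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

masks, scores, _ = predictor.predict(
    point_coords=np.array([[480, 320]]),  # one foreground click on the object
    point_labels=np.array([1]),           # 1 = foreground, 0 = background
    multimask_output=True,                # return several candidate masks
)
print(masks.shape, scores)  # masks: (3, H, W) boolean arrays, one per candidate
```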

A visualization of how the Segment Anything tool works. Meta AI

While it’s easy to find lots of images and videos online, high-quality segmentation data is a lot more niche. To get SAM to this point, Meta had to develop a new training database: the Segment Anything 1-Billion mask dataset (SA-1B). It contains around 11 million licensed images and over 1.1 billion segmentation masks “of high quality and diversity, and in some cases even comparable in quality to masks from the previous much smaller, fully manually annotated datasets.” In order to “democratize segmentation,” Meta is releasing it to other researchers. 
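Those two figures work out to an unusually dense labeling job:

```latex
\frac{1.1 \times 10^{9}\ \text{masks}}{1.1 \times 10^{7}\ \text{images}} \approx 100\ \text{masks per image}
```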

Some industry applications for the new AI tool. Meta AI

Meta has big plans for its segmentation program. Reliable, general computer vision is still an unsolved problem in artificial intelligence and robotics—but it has a lot of potential. Meta suggests that SAM could one day identify everyday items seen through augmented reality (AR) glasses. Another project from the company, called Ego4D, also plans to tackle a similar problem through a different lens. Both could one day lead to tools that let users follow a step-by-step recipe, or leave virtual notes for a partner on the dog bowl.

More plausibly, SAM would also have a lot of potential uses in industry and research. Meta proposes using it to help farmers count cows or biologists track cells under a microscope—the possibilities are endless.

The post Meta just released a tool that helps computers ‘see’ objects in images appeared first on Popular Science.

Foldable robots with intricate transistors can squeeze into extreme situations https://www.popsci.com/technology/origami-mechanobots-transistors/ Tue, 04 Apr 2023 18:00:00 +0000 https://www.popsci.com/?p=525374
Venus flytrap inspired origami robot on black background
The robot's material itself operates as a stand-in for semiconductors. UCLA Samueli

Researchers at UCLA have designed semiconductor alternatives that function as foldable material.

A team of UCLA researchers has designed a new way to integrate traditionally inflexible semiconductor and sensory components into their devices’ structural materials. Enter a new dimension to origami-based robotics.

While origami has long inspired robotic design, it usually comes with some caveats—notably the placement and size of bits like computer chips. As “foldable” as a device may be, these rigid parts have generally meant semiconductors must be installed after a robot’s shape is finalized. However, the multidisciplinary team managed to integrate flexible, conductive materials into extremely thin sheets of polyester film in order to create entirely new networks of transistors. According to UCLA’s description, the sheets could then be programmed with computer functions to emulate semiconductors’ usual roles within a robot. The team recently detailed these findings in a paper published in Nature Communications.

[Related: A tiny, foldable solar panel is going to outer space.]

To test out their advancements, researchers built three versions of their Origami MechanoBots, or OrigaMechs: a bug bot that reverses course whenever its antennae detect an impediment, a two-wheeled robot capable of traveling along prearranged geometric pathways, and even a Venus flytrap-inspired mechanism that closes its jaws when detecting pressure from its “prey.”

According to the team’s paper, the foldable semiconductor-like materials’ utility could go above and beyond their lightweight flexibility. In the future, similar robots could also operate within extreme environments unsuitable for traditional semiconductors, including situations involving strong magnetic or radiative fields, high electrostatic discharges, and particularly intense radio frequencies.

Credit: UCLA Samueli

“These types of dangerous or unpredictable scenarios, such as during a natural or manmade disaster, could be where origami robots proved to be especially useful,” said Ankur Mehta, the study’s principal investigator and director of UCLA’s Laboratory for Embedded Machines and Ubiquitous Robots.

[Related: This fabric doubles as 1,200 solar panels.]

Because of their thin design, the new robotic material could also prove useful in future missions to space, where cargo capacity and size restraints are incredibly pivotal factors to consider. There’s even talk of using them within future toys and educational games. According to Mehta, the sky is truly the limit for the polyester OrigaMechs:

“While it’s a very long way away, there could be environments on other planets where explorer robots that are impervious to those scenarios would be very desirable,” he said. 

The post Foldable robots with intricate transistors can squeeze into extreme situations appeared first on Popular Science.

MIT’s soccer-playing robot dog is no Messi, but could one day help save lives https://www.popsci.com/technology/dribblebot-mit-soccer/ Mon, 03 Apr 2023 18:00:00 +0000 https://www.popsci.com/?p=524968
DribbleBot four-legged robot standing behind soccer ball
DribbleBot can kick around a soccer ball on a variety of terrains. MIT CSAIL

DribbleBot is as cute as it is talented, but its advances could one day help save lives.

The history of robots strutting their stuff on the soccer field stretches as far back as 1992, when Japanese researchers first envisioned the sport as a solid benchmark testing environment for engineering advancements. Since then, the RoboCup has become the annual nexus of fancy mechanical footwork, and MIT just unveiled its newest potential competitor. Unlike many of its two-legged and wheeled counterparts, however, this bot is quadrupedal, a design that makes it particularly unique, as well as better suited to handle a variety of real-world terrains.

It’s time for DribbleBot’s kickoff.

Publicly unveiled today by researchers in MIT’s Improbable Artificial Intelligence Lab, part of the school’s Computer Science and Artificial Intelligence Laboratory (CSAIL), DribbleBot showcases extremely impressive strides in articulation and real-time environmental analysis. Using a combination of onboard computing and sensing, the team’s four-legged athlete can reportedly handle gravel, grass, sand, snow, and pavement, as well as pick itself up if it falls.

Kicking around a soccer ball presents an interesting additional array of complications for a robot. The way a ball interacts with the terrain beneath it via friction and drag, for example, is different from how a robot’s legs may interact with the same environment. A robot therefore needs to simultaneously account for both its own responses and the object it’s attempting to kick.

[Related: Watch this robotic dog use one of its ‘paws’ to open doors.]

The newest robotic versatility comes from a combination of machine learning, onboard sensors, actuators, cameras, and computing power. But before taking to the soccer stadium, DribbleBot needed extensive practice time—in this case, within computer simulations. Researchers built a program mimicking the dog bot’s design alongside real-world physics parameters. Once given the green light, 4,000 versions of the robot were simulated simultaneously to collect and learn from data. According to researchers, a few actual days’ worth of training adds up to hundreds of simulated days.

Building better DribbleBots isn’t simply for fun and games; the advancements are meant to help out in some of life’s most serious situations. “If you look around today, most robots are wheeled. But imagine that there’s a disaster scenario, flooding, or an earthquake, and we want robots to aid humans in the search-and-rescue process,” Pulkit Agrawal, a CSAIL principal investigator and director of Improbable AI Lab, said in a statement, adding, “Our goal in developing algorithms for legged robots is to provide autonomy in challenging and complex terrains that are currently beyond the reach of robotic systems.”

The post MIT’s soccer-playing robot dog is no Messi, but could one day help save lives appeared first on Popular Science.

Colombia is deploying a new solar-powered electric boat https://www.popsci.com/technology/colombia-electric-patrol-boat-drone/ Fri, 31 Mar 2023 14:13:04 +0000 https://www.popsci.com/?p=524519
Colombia is not the only country experimenting with electric uncrewed boats. Above, an Ocean Aero Triton drone (left) and a Saildrone Explorer USV. These two vessels were taking part in an exercise involving the United Arab Emirates Navy and the US Navy in February, 2023. Jay Faylo / US Navy

The 29-foot-long vessel is uncrewed, and could carry out intelligence, surveillance, and reconnaissance missions for the Colombian Navy.

Earlier this month, a new kind of electric boat was demonstrated in Colombia. The uncrewed COTEnergy Boat debuted at the Colombiamar 2023 business and industrial exhibition, held from March 8 to 10 in Cartagena. It is likely a useful tool for navies, and was on display as a potential product for other nations to adopt. 

While much of the attention on uncrewed sea vehicles has understandably focused on the ocean-ranging craft built for massive nations like the United States and China, the introduction of small drone ships for regional powers and routine patrol work shows just how far this technology has come, and how widespread it is likely to be in the future.

“The Colombian Navy (ARC) intends to deploy the new electric unmanned surface vehicle (USV) CotEnergy Boat in April,” Janes reports, citing Admiral Francisco Cubides. 

The boat is made from aluminum and has a compact, light body. Just 28.5 feet long and under 8 feet wide, the boat is powered by a 50 hp electric motor; its power is sustained in part by solar panels mounted on top of the deck. Those solar panels can provide up to 1.1 kilowatts at peak power, which is enough to sustain the boat’s autonomous operation for just shy of an hour.
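Some quick unit math puts those figures in perspective, assuming the motor’s full 50-horsepower rated draw (which slow patrol work presumably doesn’t demand):

```latex
% 1 hp = 745.7 W
50\ \text{hp} \times 745.7\ \tfrac{\text{W}}{\text{hp}} \approx 37.3\ \text{kW},
\qquad
\frac{1.1\ \text{kW (solar, peak)}}{37.3\ \text{kW}} \approx 3\%
```

In other words, the panels supply only a few percent of peak propulsion power, which suggests their role is topping up the battery rather than directly driving the boat at full speed.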

The vessel was made by Atomo Tech and Colombia’s state-owned naval enterprise company, COTECMAR. The company says the boat’s lightweight form allows it to take on different payloads, making it suitable for “intelligence and reconnaissance missions, port surveillance and control missions, support in communications link missions, among others.”

Putting sensors on small, autonomous and electric vessels is a recurring theme in navies that employ drone boats. Even a part of the ocean that seems small, like a harbor, represents a big job to watch. By putting sensors and communications links onto an uncrewed vessel, a navy can effectively extend the range of what can be seen by human operators. 

In January, the US Navy used Saildrones for this kind of work in the Persian Gulf. Equipped with cameras and processing power, the Saildrones identified and tracked ships in an exercise as they spotted them, making that information available to human operators on crewed vessels and ultimately useful to naval commanders. 

Another reason to turn to uncrewed vessels for this work is that they are easier to run on fully electric power, as opposed to diesel or gasoline. COTECMAR’s video description notes that the COTEnergy Boat is being “incorporated into the offer of sustainable technological solutions that we are designing for the energy transition.” Making patrol craft solar-powered and electric makes the vessels sustainable from the start.

While developed as a military tool, the COTEnergy Boat could also have a role in scientific and research expeditions. It could serve as a communications link between other ships, or between ships and other uncrewed vessels, ensuring reliable operation and data collection. Adding sensors designed to look under the water’s surface could aid oceanic mapping and observation. As a platform for sensors, the COTEnergy Boat is limited by what its adaptable frame can carry and power, although its load capacity is 880 pounds.

Not much more is known about the COTEnergy Boat at this point. But what is compelling about the vessel is how it fits into similar plans of other navies. Fielding small useful autonomous scouts or patrol craft, if successful, could become a routine part of naval and coastal operations.

With these new kinds of boats come new challenges. Because uncrewed ships lack humans, they can be easier targets for other navies or possibly maritime criminal groups, like pirates. The same kind of Saildrones used by the US Navy to scout the Persian Gulf have also been detained, if briefly, by the Iranian Navy. With such detentions comes the risk that data on the ship is compromised and its data collection tools are figured out, making it easier for hostile forces to fool or evade the sensors in the future.

Still, the benefits of having a flexible, solar-powered robot ship outweigh such risks. Inspection of ports is routine until it isn’t, and with a robotic vessel there to scout first, humans can wait to act until they are needed, staying safely removed behind their remote robotic companions.

Watch a little video of the COTEnergy Boat below:

The post Colombia is deploying a new solar-powered electric boat appeared first on Popular Science.

Watch this robotic dog use one of its ‘paws’ to open doors https://www.popsci.com/technology/quadrupedal-robot-walls/ Thu, 30 Mar 2023 20:00:00 +0000 https://www.popsci.com/?p=524360
It's surprisingly difficult to get robots to use their legs for walking and object interaction. Carnegie Mellon/UC Berkeley

Oh, great. They can let themselves inside buildings now.

Even with their many advances, quadrupedal robots’ legs are most often still just made for walking. Using an individual front paw for non-locomotion tasks like pushing buttons or moving objects usually falls outside the machines’ reach, but a team of researchers appears to be designing them to finally bridge that gap.

Roboticists from Carnegie Mellon University and UC Berkeley have demonstrated the ability to program a quadrupedal robot—in this case, a Unitree Go1 utilizing an Intel RealSense camera—to use its front limbs not only to walk, but also to help climb walls and interact with simple objects, as needed. The progress, detailed in a paper to be presented next month at the International Conference on Robotics and Automation (ICRA 2023), potentially marks a major step forward for what quadrupedal robots can handle. There are some pretty impressive video demonstrations, as well. Check out the handy machine in action below:

To pull off these abilities, researchers broke down their robots’ desired tasks into two broad skill sets—locomotion (movement like walking or climbing walls) and manipulation (using one leg to interact with external objects while balancing on the other three limbs). As IEEE Spectrum explains, the separation is important: often, these tasks prove to be in opposition to one another, leaving robots stuck in computational quandaries. After training the robot to handle both skill sets within simulations, the team combined it all into a “robust long-term plan” by learning a behavior tree from “one clean expert demonstration,” according to the research paper.
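A behavior tree, for the unfamiliar, is a standard robotics construct: internal nodes sequence or select among children, and leaves are individual skills. Below is a toy rendering of the door task; the node layout is our guess at an illustration, not the structure the paper actually learned:

```python
# Toy behavior tree: a sequence node runs children in order and stops at the
# first failure; each leaf wraps one learned skill. Illustrative only.
def sequence(*children):
    def run() -> bool:
        return all(child() for child in children)
    return run

def skill(name: str):
    def run() -> bool:
        print(f"executing skill: {name}")
        return True  # placeholder: a real skill reports success or failure
    return run

open_door = sequence(
    skill("walk_to_button"),     # locomotion
    skill("press_button"),       # manipulation, balancing on three legs
    skill("walk_through_door"),  # locomotion
)
open_door()
```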

[Related: A new tail accessory propels this robot dog across streams]

Developing cost-effective robots capable of tackling both movement and interaction with their surroundings is a key hurdle in deploying machines that can easily maneuver through everyday environments. In the research team’s videos, for example, the quadrupedal robot is able to walk up to a door, then press the nearby wheelchair access button to open it. Obviously, it’s much easier to rely on a single robot to manage both requirements, as opposed to using two robots, or altering human-specific environments to suit machines.

Combine these advancements with existing quadrupedal robots’ abilities to traverse diverse terrains such as sand and grass, toss in the trick of scaling walls and ceilings, and you’ve got a pretty handy four-legged friend.

The post Watch this robotic dog use one of its ‘paws’ to open doors appeared first on Popular Science.

The Navy’s version of a Roomba inspects billion-dollar ships for damage https://www.popsci.com/technology/gecko-robotics-machine-inspects-navy-ships/ Wed, 29 Mar 2023 19:00:00 +0000 https://www.popsci.com/?p=523955
The critter is on the hull. Gecko Robotics

The machine from Gecko Robotics cruises along on magnetic wheels, gathering data about the hull as it goes.

On March 27, Gecko Robotics announced its hull-inspecting robots will be used to assess a US Navy destroyer and an amphibious assault ship, expanding work already done to inspect Navy ships. These robots map surfaces as they climb them, creating useful, data-rich models that help crews and maintainers find flaws and fix them. As the Navy looks to sustain and expand the role of its fleet while minimizing the number of new sailors needed, enlisting the aid of robot climbers can guide present and future repairs, and help ensure more ships are seaworthy more of the time.

Getting ships to sea means making sure they’re seaworthy, and that’s as important to naval operations as ensuring the crew is fed and the supplies are stocked. Maintenance can be time-intensive, and the Navy already has a backlog of work that needs to be done on its more than 280 ships. Part of getting that maintenance right, and ensuring the effort is spent where it needs to be, is identifying the specific parts of a ship worn down by time at sea.

Enter a robotic critter called Gecko.

“The Navy found that using Gecko achieved incredible time savings and improvement in data quantity and quality. Before Gecko, the Navy’s inspection process produced 6,000 data points. Gecko provides significantly more coverage by collecting over 3.3 million data points for the hull and over 463,000 data points for the outboard side of the starboard rudder,” Ed Bryner, director of engineering at Gecko Robotics, tells Popular Science via email.

Those data points are collected by a hull-climbing robot. Gecko makes several varieties of the Toka robot, and the Navy inspections use the Toka 4. This machine can crawl over 30 feet a minute, recording details of the hull as it goes. 

“It is a versatile, multi-function robot designed initially to help hundreds of commercial customers in the power, manufacturing and oil and gas industries. It utilizes advanced sensors, cameras, and ultrasonics to detect potential defects and damages in flight decks, hulls and rudders,” says Bryner.

To climb the walls, the Toka uses wheels with permanent neodymium rare-earth magnets that grip the carbon steel of the ship’s hull. The sensors detect how thick the walls are and whether there is pitting or other degradation, and then plot a map of all that damage. This is done with computers onboard the robot as it works, with the data also processed in the cloud through a service offered in Gecko’s Cantilever Platform.

“The millions of data points collected by the Toka 4 are used to build a high-fidelity digital twin to detect damage, automatically build repair plans, forecast service life and ensure structural integrity,” says Bryner.

A digital twin is a model and map built from the scanned information. Working on that model, maintainers can see where the ship may have deteriorated—perhaps a violent storm or a gritty patch of ocean pockmarked the hull in real but hard-to-see ways. This model can guide repairs at port, and it can also serve as a reference tool for maintainers when the ship returns from a deployment. Having a record of previous stress can guide repairs and work, and over time build a portrait of what kinds of degradation happen where.
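
As a rough illustration of what such a model makes possible, the toy Python sketch below flags grid cells where simulated hull-thickness readings dip below a repair threshold. The grid size, nominal thickness, and threshold are invented numbers, not Gecko’s actual analytics.

# Toy "digital twin" analysis: flag thin spots in a hull-thickness map.
# All figures are made up for illustration.
import random

NOMINAL_MM = 19.0    # assumed as-built plate thickness
REPAIR_BELOW = 0.8   # flag anything under 80 percent of nominal

random.seed(0)
# Simulated thickness readings (millimeters) on a 5 x 8 grid.
readings = [[NOMINAL_MM - random.uniform(0.0, 5.0) for _ in range(8)]
            for _ in range(5)]

flagged = [(r, c, round(t, 1))
           for r, row in enumerate(readings)
           for c, t in enumerate(row)
           if t < NOMINAL_MM * REPAIR_BELOW]

print(f"{len(flagged)} of 40 grid cells need attention:")
for r, c, t in flagged:
    print(f"  cell ({r},{c}): {t} mm")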

“Gecko’s Cantilever Platform allows customers to pinpoint & optimize precise areas of damage in need of remediation (rather than replacing large swaths of a flight deck, for example), track their physical assets over time to identify trends and patterns, prioritize and build repair plans, deploy repair budgets efficiently, and make detailed maintenance plans for the service life of the asset,” says Bryner.

The robot is a tool for guiding repairs, operated by one or two people while it inspects and maps. This map then guides maintenance to where it is most needed, and in turn shapes maintenance that comes after. It’s a way of modernizing the slow but important work of keeping ships ship-shape. 

So far, reports Breaking Defense, Gecko’s system has scanned six ships, with two more announced this week. Deck maintenance is a dull duty, but it’s vital that it be done, and done well. In moments of action, everyone on a ship needs to know they can trust the vessel they are standing on to work as intended. Finding and fixing hidden flaws, or bolstering weaker areas before going back out to sea, ensures that the routine parts of ship operation can operate as expected. 

Watch a video of the robot below: 

The best robot mops of 2023 https://www.popsci.com/story/reviews/best-robot-mops/ Mon, 15 Mar 2021 18:59:00 +0000 https://stg.popsci.com/uncategorized/best-robot-mops/
The best robot mops will clean your home and eliminate hassle.

First they came for your carpets; now the robots have their sights set on your tile and wood floors.

Best overall: eufy by Anker RoboVac G30 Hybrid Robot Vacuum

This robot vacuum and mop uses laser navigation to create accurate maps of your home.

Best for hardwood floors: BISSELL SpinWave Hard Floor Expert Wet and Dry Robot Vacuum

This robot automatically avoids carpets and rugs while mopping.

Best budget: iRobot 240 Braava Robot Mop

The Braava jet pad type determines the cleaning mode—which the robot automatically detects.

Say goodbye to soggy mops with the help of the latest smart home appliance: a robot mop that—you guessed it—sops up spills and scrubs schmutz off of your flooring. If a mopping robot sounds right up your alley, you might be excited to know that you can get a combined robot vacuum and mop (although, for reasons that will become clear, you might not want to). This guide will give you the insider info you need to help you find one of the best robot mops for your home.

How we picked the best robot mops

There’s something about the task of mopping that just seems really onerous. Maybe it’s because there are no shortcuts, and all your hard work can be undone instantly by one muddy footprint or the tiniest amount of food spillage. So, little wonder that the concept of the best robotic vacuum with mop is an attractive one. However, before you get carried away, it’s worth noting that these robots are still in their relative infancy and come with certain limitations. Even the best mopping robot isn’t going to do what a person-powered mop can. We still think robot mops are worth a go if you have kids or pets and need a device to do constant little cleanups to keep the chaos at bay.

Think about how you’d clean a floor the old-fashioned way. In that case, you’d probably vacuum it first, then use a combination of water and detergent over the surface, and once you’ve done that, you’d get a clean bucket of water and rinse away the dirty water and soap.

That’s not how mopping robots work. Most are more akin to a Swiffer-style sweeper mop: they either spray the floor with water to loosen any grime (many specifically advise against using any form of detergent, although some have proprietary formulations) and then drag a cloth over the floor to remove schmutz, or, in the case of “dry mop” robots, simply pass a damp cloth or pad over the surface. That said, there are also scrubby spin mops and robot vac and mop combos to consider. We vetted dozens of the best robotic mops of all types before arriving at our recommendations.

The best robot mops: Reviews & Recommendations

You already know about robot vacuum cleaners—you might even have one running around in your carpeted living room right now. Maximize the cleaning of all surfaces in your home by upgrading to a robot mop. Here’s a rundown of our favorites for a variety of needs.

Best overall: eufy by Anker, RoboVac G30 Hybrid Robot Vacuum

Specs

  • Dimensions: 12.8 inches L x 12.8 inches W x 2.85 inches H
  • Power: Corded electric and lithium-ion battery
  • Battery life: 110 minutes 

Pros

  • Mop and vacuum in one
  • Works on range of surfaces
  • Good for pet owners

Cons

  • Less suction power than other models

This robot vacuums with 2,000Pa of suction power and swaps in an interchangeable mopping module with adjustable settings for different surfaces. It uses laser navigation to create accurate maps of your home, and it returns to its base station to recharge before heading back to the spot where it stopped with pinpoint accuracy.

Best vacuum combo: Ecovacs Deebot T8 AIVI Robot Vacuum Cleaner

Specs

  • Dimensions: 13.7 inches L x 13.7 inches W x 3.6 inches H
  • Power: Lithium-ion battery
  • Battery life: 110 minutes

Pros

  • Uses AI for object recognition and laser mapping to plan
  • Mops and sweeps at the same time
  • Large water tank allows it to mop a 2,000-square-foot space

Cons

  • Expensive
  • Users have encountered problems with the mapping system and its ability to distinguish between surfaces

Robot mop machines differ; while some can switch seamlessly from vacuuming to mopping, others require you to swap in a mopping module and fill a tank with water before they can mop. Sophisticated mapping software allows you to designate no-go and no-mop zones with this robot vacuum and mop, while the 240ml water tank allows for more than 2,000 square feet of mopping. Object recognition technology means it won’t get tripped up by socks and cables. The built-in camera can even serve as a remote security device with on-demand live video.

Best smart: iRobot Braava Jet M6 (6110) Ultimate Robot Mop

Specs

  • Dimensions: 10 inches x 10.6 inches x 3.5 inches
  • Power: Lithium-ion battery
  • Battery life: 150 minutes

Pros

  • Long battery life
  • Maps your home
  • Responds to spills and messes in the moment
  • Recharges itself when the battery gets low

Cons

  • Uses disposable pads 
  • Water distribution power may be an issue

Smart charge-and-resume technology allows this mop to pick up cleaning where it left off, and it accurately maps your home and integrates with home assistants, so you can literally tell it to “mop in front of the kitchen table” after a spill. It works with water or a proprietary cleaning solution, and with single-use or washable pads.

Best for hardwood floors: BISSELL SpinWave Hard Floor Expert Wet and Dry Robot Vacuum

Specs

  • Dimensions: 14 inches L x 13 inches W x 4 inches H
  • Power: Lithium-ion
  • Battery life: 100 minutes

Pros

  • Uses cleaning solution specially designed for wood floors
  • Donates money to support homeless pets with every purchase
  • Cleans with reusable pad

Cons

  • May not do well in homes with atypical layouts
  • Sometimes bumps into furniture

This superior robot vacuum doubles as a mop, with machine-washable mopping pads that work in conjunction with a cleaning solution to scrub floors clean. A soft-surface avoidance sensor means it effectively avoids carpets and rugs while in mopping mode.

Best self-emptying: Ecovacs Deebot Ozmo N8 Pro+

Specs

  • Dimensions: 13.9 inches L x 13.9 inches W x 3.69 inches H
  • Power: Lithium-ion
  • Battery life: 110 minutes

Pros

  • Strong suction power
  • Superior LIDAR mapping
  • Automatically empties

Cons

  • Cumbersome setup with app
  • Short battery life

The Ecovacs Deebot makes using a robot mop even easier. The device comes with powerful 2600Pa suction that’s especially useful on carpets. It’s designed with 3-D technology and LIDAR that enable the device to map your home with greater precision and avoid furniture and other obstructions. And when you’re done, the Deebot empties itself.

Best budget: iRobot 240 Braava Robot Mop

Specs

  • Dimensions: 7 inches L x 6.7 inches W x 3.3 inches H
  • Power: Lithium-ion
  • Battery life: 160 minutes

Pros

  • Two hours of battery life
  • Designed for hard-to-reach places
  • Works in kitchens and bathrooms
  • Relatively inexpensive

Cons

  • Doesn’t have mapping capability
  • Not as powerful as other models

This mop works with disposable pads to dry sweep, damp mop, or wet mop hardwood, tile, and stone. Its clever design means it cleans easily around furniture, under cupboards, and right up to the edges of walls. A relatively small tank and battery mean this model best suits compact rooms.

Things to consider when picking the best robot mop

Combo vs. solo device

At first glance, it makes perfect sense: one machine that can do it all. But if you’re buying a combination product, it’s worth checking a few things first. One of the complaints about the earliest vacuum and mop combo models was that, although they were meant to be able to differentiate between hard floors and carpet, you ended up with a soggy rug when they didn’t. That said, advances in technology now mean that these are far smarter devices that, with a little bit of prep, can map out the whole of your home (see below). Once mapped, you can limit certain areas as “no-go areas”—where you don’t want the machine going at all—or as “no-mop” areas where it’s fine for it to vacuum but not for it to mop.
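
Under the hood, a no-go or no-mop zone is just a region the robot’s planner filters against. Here is a minimal, hypothetical Python sketch of that check; the coordinates and rectangles are invented for illustration.

# Hypothetical zone check: rectangles are ((x1, y1), (x2, y2)) corners.
NO_GO = [((0, 0), (2, 2))]    # e.g., around the pet bowls
NO_MOP = [((5, 0), (9, 3))]   # e.g., the living-room rug

def inside(point, rect):
    (x1, y1), (x2, y2) = rect
    x, y = point
    return x1 <= x <= x2 and y1 <= y <= y2

def allowed(point, mopping):
    if any(inside(point, r) for r in NO_GO):
        return False              # never enter
    if mopping and any(inside(point, r) for r in NO_MOP):
        return False              # vacuum only; keep the rug dry
    return True

path = [(1, 1), (4, 2), (6, 1), (8, 5)]
print([p for p in path if allowed(p, mopping=True)])  # [(4, 2), (8, 5)]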

Power

There’s a general feeling that, like a watched pot that never boils, a watched robot doesn’t do its best work when it’s being observed. That’s why battery life on its own shouldn’t really be your main concern. In an ideal world, you’ll just set and forget—essentially, schedule the robot to clean when you’re out of the house, and then come back to tidier floors without worrying about exactly how that happened.

However, that’s only possible if your robot mop has a feature that makes sure it returns to the base station to charge itself when its juice is running low and then has the smarts to get itself back to where it left off so that it can finish the job. Otherwise, you end up with a device that runs out of power and leaves the job half done. So don’t look at battery life exclusively. It’s also tricky to know exactly what battery life is until you get the device rolling, so you’re better off looking for gadgets that offer this sort of return, recharge, reboot approach. And unfortunately, there’s no standardized name for this sort of tech—some brands call it “pinpoint return,” others “smart charging,” so you really need to read the small print.
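
Stripped to its essentials, that return-recharge-resume behavior is a simple loop. The following toy Python sketch shows the idea; the thresholds and the per-tile battery cost are made up, not any manufacturer’s firmware logic.

# Toy "return, recharge, resume" loop with invented numbers.
LOW_BATTERY = 15   # percent at which the robot heads home
FULL = 100

def clean(tiles):
    battery = FULL
    remaining = list(tiles)
    while remaining:
        if battery <= LOW_BATTERY:
            print("low battery: returning to dock to recharge")
            battery = FULL
            print(f"resuming at tile {remaining[0]}")
        tile = remaining.pop(0)
        battery -= 7   # pretend energy cost per tile
        print(f"mopped tile {tile} (battery {battery}%)")

clean(range(1, 21))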

Mapping

There are various ways in which you can “tell” a robot vacuum or robot mop where it should and shouldn’t go in your home. The most simplistic requires you to use tape, known as barrier tape, boundary tape, or just magnetic tape. You literally tape around areas that you don’t want the mop to go into, such as where your pet’s food bowls are, or around the perimeter of your Persian rug. Other solutions include virtual barriers like beacons that emit infrared, which communicates to your robot that the area is a no-go zone.

However, these relatively rudimentary and unsightly approaches have been superseded by far better technology. Using a variety of sensors, the robot mop moves around your home and creates an accurate picture of the layout of your house. The most sophisticated devices can generate and store multiple maps (ideal for houses with several floors) viewable on a smartphone app. And with the help of this app, you can not only track how your robot is doing in real-time, but also create zones that should be cleaned more frequently, as well as virtually rope off areas. So while good mapping technology does add to the cost of a robot mop, when it works well, it’s a worthwhile investment.

Spin mop or spray mop?

While some robot mops work in a “dry mop” fashion, where you attach a damp cloth or pad that the robot wipes across the floor, “wet mops” have an on-board reservoir of water (or occasionally cleaning fluid) which they distribute across the floor—some will gradually feed water into a pad, while others have a spray function so that stains and sticky patches get blitzed away and then swiped over with a cloth.

Some other mops have a spin function which means that rather than having static pads, they have spinning pads. Spin mops are tougher on sticky splodges and dirty marks than water alone. For another option, check out our guide to the best steam mops.

Price

As is often the case with new technology, you can get a robot mop on a budget, but it’s probably not going to have the same bells and whistles as a top-of-the-range, top-of-the-budget one. So don’t expect sophisticated mapping technologies, control via an app, or intuitive features that will prevent the robot from drenching rugs or tipping downstairs.

But budget robot mops do exist. They tend to be a lot smaller, which means shorter battery life (and these definitely won’t have a return, recharge, reboot function), so these mop robots are better for focusing exclusively on self-contained kitchens or bathrooms rather than larger open-plan spaces. They’re also likely to have even more limited cleaning power, so maybe think of them as a maintenance mop rather than something that will do a thorough clean of ingrained dirt. But if you’re tight for time, live in a relatively small space, and hate mopping, they’re definitely worth considering.

FAQs

Q: Are robot mops worth it?

Whether a robot mop is worth it for you depends on how much you value your time and what your expectations are. If you’ve got a filthy, mud-caked floor with ingrained dirt that hasn’t been cleaned in years, expecting a robot mop to get it sparkling clean might be a bit much. But if you like the idea of a quick daily maintenance mop and are never going to do it yourself, a robot mop will do it for you.

Q: Can you use a robot mop on laminate floors?

Yes, you can use a robot mop on laminate floors. Obviously, double-check both the small print of your device and the small print of any cleaning fluid that you’re using before you use a robot mop on any flooring. But most are designed not to scratch hard floors and not to leave a lot of water on the floor and, as such, most hard floorings—from tile, stone, and concrete to vinyl, hardwood, and laminate—can be maintained with a robot mop.

Q: How do I choose a robot mop?

As with any purchase, choosing a robot mop comes down to your priorities. Budget has to be the first consideration, but you also want to think about whether you want a combined vacuum and mop—and if so, whether your priority is vacuuming or mopping and whether you want one that can seamlessly do both without needing you to switch in a mopping module. Hopefully, some of the information here has helped you to identify what’s out there and how a robot mop could work in your home.

A final word on shopping for the best robot mop

Choosing the best robot mop for you might well come down to choosing the best robot vacuum and mop or the best spray mop, but at least now you should have an idea of the tech that is out there and what it can do. These are still very much household appliances in their infancy. Over the next few years, we can expect to see huge advances—especially now that brands such as Dyson have started to get into the robot vacuum game. This invariably means more models coming to the market, improved efficiency, more features, and falling prices, so watch this space.

Why trust us

Popular Science started writing about technology more than 150 years ago. There was no such thing as “gadget writing” when we published our first issue in 1872, but if there had been, our mission to demystify the world of innovation for everyday readers means we would have been all over it. Here in the present, PopSci is fully committed to helping readers navigate the increasingly intimidating array of devices on the market right now.

Our writers and editors have combined decades of experience covering and reviewing consumer electronics. We each have our own obsessive specialties—from high-end audio to video games to cameras and beyond—but when we’re reviewing devices outside of our immediate wheelhouses, we do our best to seek out trustworthy voices and opinions to help guide people to the very best recommendations. We know we don’t know everything, but we’re excited to live through the analysis paralysis that internet shopping can spur so readers don’t have to.

Watch this Navy drone take off and land on its tail like a rocket https://www.popsci.com/technology/tail-sitter-drone-aerovel-flexrotor/ Tue, 21 Mar 2023 22:00:00 +0000 https://www.popsci.com/?p=521729
Drones like these are called tail-sitters, and they have distinct advantages.

An Aerovel Flexrotor drone takes off from the guided-missile destroyer USS Paul Hamilton in the Arabian Gulf on March 8, 2023. Elliot Schaudt / US Navy

On March 8, in the ocean between Iran and the Arabian Peninsula, the US Navy tested out a new drone. Called the Aerovel Flexrotor, it rests on a splayed tail, and boasts a powerful rotor just below the neck of its bulbous front-facing camera pod. The tail-sitting drone needs very little deck space for takeoff or landing, and once in the sky, it pivots and flies like a typical fixed-wing plane. It joins a growing arsenal of tools that are especially useful in the confined launch zones of smaller ship decks or unimproved runways.

The March flights took place as part of the International Maritime Exercise 2023, billed as a multinational undertaking involving 7,000 people from across 50 nations. Activities in the exercise include working on following orders together, maritime patrol, countering naval mines, testing the integration of drones and artificial intelligence, and work related to global health. It is a hodgepodge of missions, capturing the multitude of tasks that navies can be called upon to perform.

This deployment is at least the second time the Flexrotor has been brought to the Persian Gulf by the US Navy. In December 2022, a Coast Guard ship operating as part of a Naval task force in the region launched a Flexrotor. This flight was part of an event called Digital Horizon, aimed at integrating drones and AI into Navy operations, and it included 10 systems not yet used in the region.

“The Flexrotor can support intelligence, surveillance and reconnaissance (ISR) missions day and night using a daylight or infrared camera to provide a real-time video feed,” read a 2022 release from US Central Command. The release continued: “In addition to providing ISR capability, UAVs like the Flexrotor enable Task Force 59 to enhance a resilient communications network used by unmanned systems to relay video footage, pictures and other data to command centers ashore and at sea.”

Putting drones on ships is hardly new. The ScanEagle, a scout drone used by the US Navy since 2005, can be launched from a rail and recovered by net or skyhook. What sets the Flexrotor apart is not that it is a drone on a ship, but the fact that it requires a minimum of infrastructure to be usable. This is because the drone is a tail-sitter.

What is a tail-sitter?

There are two basic ways to move a heavier-than-air vehicle from the ground to the sky: generate lift from spinning rotors, or generate lift from forward thrust and fixed wings. Helicopters have many advantages, needing only landing pads instead of runways, and they can easily hover in flight. But helicopters’ aerodynamics limit their cruising and maximum speeds, even as advances continue to be made.

Fixed wings, in turn, need to build speed and lift off on runways, or find another way to get into the sky. For drones like the ScanEagle, this is done with a launch rail, though other methods have been explored.

Between helicopters and fixed-wing craft sit tiltrotors and jump-jets, where the direction of thrust (from either rotors/propellers or ducted jets) changes while the plane stays level in flight, allowing vertical landings and short takeoffs. This is part of what DARPA is exploring through the SPRINT program.

Tail-sitters, instead, involve the entire plane pivoting in flight. In effect, they look almost like a rocket upon launch, narrow bodies pointed to pierce the sky, before leveling out in flight and letting the efficiency of lift from fixed wings extend flight time and range. (Remember the space shuttle? It was positioned like a tail-sitter when it blasted off, but landed like an airplane, albeit without engines.) Early tail-sitters suffered because they had to accommodate a human pilot through all those transitions. Modern tail-sitter drones, like the Flexrotor or Australia’s STRIX, instead have human operators guiding the craft remotely from a control station. Another example is Bell’s APT 70.

The advantage to a tail-sitting drone is that it only needs a clearing or open deck space as large as its widest dimension. In the case of the Flexrotor, that means a rotor diameter of 7.2 feet, with at least one part of the launching surface wide enough for the drone’s nearly 10-foot wingspan. By contrast, the Seahawk helicopters used by the US Navy have a rotor diameter of over 53 feet. Ships that can already accommodate helicopters can likely easily add tail-sitter drones, and ships that couldn’t possibly fit a full-sized crewed helicopter might be able to take on and operate a drone scout.
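
Using the figures quoted above, a quick back-of-the-envelope comparison shows why tail-sitters are so deck-friendly. The sketch below simply treats the widest dimension as the required clearing, which is an illustration rather than a naval-architecture rule.

# Rough deck-footprint comparison using the dimensions quoted above.
def min_clearing_ft(rotor_diameter_ft, wingspan_ft=0.0):
    # The widest dimension governs the clear deck space required.
    return max(rotor_diameter_ft, wingspan_ft)

flexrotor = min_clearing_ft(rotor_diameter_ft=7.2, wingspan_ft=10.0)
seahawk = min_clearing_ft(rotor_diameter_ft=53.0)

print(f"Flexrotor clearing: ~{flexrotor} ft")
print(f"Seahawk clearing: ~{seahawk} ft")
print(f"ratio: ~{seahawk / flexrotor:.1f}x more deck space for the helicopter")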

In use, the Flexrotor boasts a cruising speed of 53 mph, a top speed of 87 mph, and potentially more than 30 hours of continuous operation. After takeoff, the Flexrotor pivots to fixed-wing flight, and the splayed tail retracts into a normal tail shape, allowing the craft to operate like a regular fixed-wing plane in the sky. Long endurance drones like these allow crews to pilot them in shifts, reducing pilot fatigue without having to land the drone to switch operators. Aerovel claims that Flexrotors have a range of over 1,265 miles at cruising speeds. In the air, the drone can serve as a scout with daylight and infrared cameras, and it can also work as a communications relay node, especially valuable if fleets are dispersed and other communications are limited.

As the Navy looks to expand what it can see and respond to, adding scouts that can be stowed away and then launched from cleared deck space expands the perception of ships. By improving scouting on the ocean, the drones make the vastness of the sea a little more knowable.

Watch a video below:

Meet Garmi, a robot nurse and companion for Germany’s elderly population https://www.popsci.com/technology/garmi-germany-elderly-robot/ Mon, 20 Mar 2023 16:00:00 +0000 https://www.popsci.com/?p=521071
As the world's population ages up, researchers believe robot assistants will be integral to society.

Garmi could one day help doctors remotely assist elderly patients. CHRISTOF STACHE/AFP via Getty Images

A group of researchers at the Munich Institute of Robotics and Machine Intelligence recently unveiled their new robo-helper, Garmi—named after Garmisch-Partenkirchen, a ski resort town home to the school’s unit specializing in geriatronics, a relatively new field developing cutting-edge tech for elderly care. The office location isn’t an accident, either: Garmisch-Partenkirchen boasts one of Germany’s highest proportional senior populations.

[Related: Do we trust robots enough to put them in charge?]

Somewhat resembling Honda’s Asimo on wheels, Garmi is currently in early prototype testing, but it could one day offer a wide array of assistance for older patients in hospitals and nursing facilities, and eventually in their own homes. Abdeldjallil Naceri, the lab’s lead scientist, likened deploying Garmi to installing ATMs around a town. “We can imagine that one day, based on the same model, people can come to get their medical examination in a kind of technology hub,” he told Agence France-Presse.

From there, doctors could remotely evaluate patients’ diagnostics, which could be particularly helpful for those living in secluded locations. In a lab demonstration, for example, researchers guided Garmi to a patient stand-in using joystick controls. Once properly positioned, the robot aide pressed a stethoscope to the subject’s chest, which then provided health data on the operator’s computer screen. Outside of medical facilities, Garmi could hypothetically be deployed in residences to offer personalized help, like opening bottles and serving meals, as well as facilitating video calls with family or in the case of emergencies.

[Related: The next version of ChatGPT is live—here’s what’s new.]

Rolling out such help is not without its challenges, as Garmi is far from the first attempt at developing effective robotic aids for elderly populations. Although Germany has one of the world’s fastest-aging populations, Japan has long attempted to solve its own eldercare issues with robot help, to mixed results at best. Despite these hurdles, however, researchers such as Naceri believe Garmi and similar robots are absolutely necessary. “We must get there, the statistics are clear that it is urgent… must be able to integrate this kind of technology in our society,” Naceri said.

Correction 3/21/23: An earlier version of this post stated Asimo was made by Hyundai. It is made by Honda.

This bumblebee-inspired bot can bounce back after injuring a wing https://www.popsci.com/technology/bumblebee-flying-robot-wing-repair/ Thu, 16 Mar 2023 14:00:00 +0000 https://www.popsci.com/?p=520098
Bumblebees can hurt their wings and still fly. Researchers want their own aerial robots to do the same.

Bumblebee wings withstand a lot of damage, and researchers want to mimic that in robots. MIT

Given their habit of bouncing off their surroundings with surprising regularity, bumblebees certainly live up to their name. But despite their collision records, the small insects’ wings can withstand a comparatively hefty amount of damage and still function well enough to keep them on their pollination routes. This natural wing strength often outperforms most flying robots’ arrays, which can be grounded by the smallest issues. It’s a resilience that recently inspired researchers to delve into just what makes bumblebees so hardy, and how engineers can mimic that when repairing their own artificial wings.

In an upcoming issue of the journal Science Robotics, a team at MIT details the new ways they improved tiny aerial robots’ actuators, aka artificial muscles, to handle a sizable amount of damage and continue flying. In this instance, the test robots were roughly the size of a microcassette tape while weighing slightly more than an average paper clip. Each robot has two wings powered by ultrathin layers of dielectric elastomer actuators (DEAs) placed between two electrodes and rolled into a tube shape. As electricity is applied, the electrodes constrict the elastomers, which causes the wings to flap.

[Related: MIT engineers have created tiny robot lightning bugs.]

DEAs have been around for years, but miniscule imperfections in them can cause sparks that damage the device. Around 15 years ago, however, researchers realized that DEA failures from a single minor injury could be avoided via what’s known as “self-clearing,” in which a high enough voltage applied to the DEA disconnects an electrode from the problem area while keeping the rest of its structure intact.

For large wounds, such as a tear in the wing that lets too much air pass through, researchers developed a laser cauterization method that inflicts minor damage around the injury’s perimeter. They were then able to use self-clearing to burn away the damaged electrode and isolate the issue. To assess efficacy, engineers even integrated electroluminescent particles into each actuator: if light shines from an area, they know that portion of the actuator works, while darkened portions are out of commission.
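
Conceptually, self-clearing trades a small patch of the actuator for the survival of the rest. The hypothetical Python sketch below models an actuator as a grid of electrode cells and isolates a faulty cell plus a safety margin; the grid size, margin, and percentages are invented for illustration, not the MIT team’s numbers.

# Toy model of self-clearing: isolate a fault plus a small margin,
# then report how much of the actuator remains active.
actuator = [[1] * 10 for _ in range(4)]   # 1 = live electrode cell

def self_clear(grid, fault_r, fault_c, margin=1):
    """Disconnect the faulty cell and a surrounding margin of cells."""
    for r in range(max(0, fault_r - margin), min(len(grid), fault_r + margin + 1)):
        for c in range(max(0, fault_c - margin), min(len(grid[0]), fault_c + margin + 1)):
            grid[r][c] = 0                # 0 = burned away / isolated

self_clear(actuator, fault_r=2, fault_c=5)
live = sum(map(sum, actuator))
print(f"{live}/40 cells still active ({live / 40:.0%} of capacity retained)")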

[Related: This tiny robot grips like a gecko and scoots like an inchworm.]

The team’s repair innovations showed great promise during stress tests. Self-clearing allowed the aerial robots to maintain performance, position, and altitude, while laser surgery on DEAs recovered roughly 87 percent of the robot’s normal abilities. “We’re very excited about this. But the insects are still superior to us, in the sense that they can lose up to 40 percent of their wing and still fly,” Kevin Chen, assistant professor of electrical engineering and computer science (EECS), as well as the paper’s senior author, said in a statement. “We still have some catch-up work to do.”

But even without catch-up, the new repair techniques could come in handy when using flying robots for search-and-rescue missions in difficult environments like dense forests or collapsed buildings.

The corn leaf angle measuring robot is more useful than you think https://www.popsci.com/technology/corn-crop-robot/ Tue, 07 Mar 2023 21:00:00 +0000 https://www.popsci.com/?p=517873
The new tool could help farmers produce better and larger crop yields

Corn leaf angle optimization is key to better crop yields. NC State University

A robot for measuring the angles of corn stalk leaves may sound like a ridiculously niche invention, but it’s a device with potentially major benefits for farmers. As detailed in a paper recently published in the Journal of Field Robotics, researchers from Iowa State University and North Carolina State University have designed an autonomous wheeled device narrow enough to move between corn rows spaced a standard 30 inches apart. As the robot traverses a field, its four tiers of dual cameras take an array of photos to allow a stereoscopic view for 3D plant modeling via a separate software program. 

[Related: John Deere finally agrees to let farmers fix their own equipment, but there’s a catch.]

When it comes to corn, the curves and angles of the leaves are important. Relative to the stalk itself, the crop’s leaves ideally angle upwards at the top before bending more horizontally as they progress lower, allowing optimal sunlight harvesting for photosynthesis. Unfortunately, measuring this attribute—important to optimizing future crop generations—is a painstakingly slow and rudimentary chore for farmers, who often resort to hand measurements with basic protractors.

Enter AngleNet, the name given to the two-part robot-and-software system.

In a press statement on Tuesday, Lirong Xiang, the paper’s first author, as well as assistant professor of biological and agricultural engineering at North Carolina State University, explained that, “For plant breeders, it’s important to know not only what the leaf angle is, but how far those leaves are above the ground. This gives them the information they need to assess the leaf angle distribution for each row of plants. This, in turn, can help them identify genetic lines that have desirable traits—or undesirable traits.”

[Related: Jailbreaking has sprouted for John Deere tractors.]

Researchers also found that AngleNet measured corn stalk leaves’ angles within 5 degrees of those measured by hand, or “well within the accepted margin of error for purposes of plant breeding,” Xiang said.
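
The underlying geometry is straightforward: once the stereo cameras yield 3D points along a leaf, the leaf angle can be taken as the angle between a leaf-direction vector and the stalk’s roughly vertical axis. The Python sketch below shows that calculation with invented coordinates; it is not AngleNet’s actual pipeline.

# Leaf angle as the angle between a leaf vector and a vertical stalk axis.
import math

def leaf_angle_deg(leaf_base, leaf_tip):
    leaf = tuple(t - b for t, b in zip(leaf_tip, leaf_base))
    dot = leaf[2]   # dot product with the unit z-axis (0, 0, 1)
    norm = math.sqrt(sum(c * c for c in leaf))
    return math.degrees(math.acos(dot / norm))

# Upper leaf: mostly upright. Lower leaf: closer to horizontal.
print(round(leaf_angle_deg((0, 0, 150), (5, 0, 170)), 1))   # ~14.0 degrees
print(round(leaf_angle_deg((0, 0, 60), (20, 0, 66)), 1))    # ~73.3 degrees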

It may not seem like it at first thought, but the agricultural industry is often home to extremely advanced automation technologies—albeit not without their own controversies and concerns. Moving forward, however, researchers hope to further optimize AngleNet’s algorithms for even more precise measurements, as well as work alongside other crop scientists to utilize the technology. By deploying the system in the real world, the team also hopes to speed plant breeding research to eventually improve farmers’ future crop yields.

Correction 3/8/23: An earlier version of this story incorrectly stated that the University of Iowa participated in this research. We regret the error.

Leaping robots take physics lessons from grasshoppers https://www.popsci.com/technology/grasshopper-jumping-robot/ Fri, 03 Mar 2023 16:00:00 +0000 https://www.popsci.com/?p=516979
Insects like grasshoppers could help build the next generation of jumping robots.

Leaping robots could soon traverse malleable environments like grass and sand. Deposit Photos

To give grasshoppers some credit—leaping across yards and between branches takes a lot more expertise than it might appear. There are incredibly tiny factors to consider, such as the resistance in launchpad material (Are the blades of grass bouncy? Is the plant twig brittle?), as well as desired distance, speed, and landing.

Most jumping robots can’t compete with the insect, as their leaps are limited to starting atop extremely rigid surfaces. But a new bouncing bot developed by researchers in Carnegie Mellon’s College of Engineering is soaring over those hurdles, and showing immense promise for how autonomous devices could operate in the future.

[Related: Watch these tiny bugs catapult urine with their butts.]

A team of scientists led by professor of mechanical engineering Sarah Bergbreiter recently optimized the latch mechanisms a robot uses to propel itself upward. Previously, these latches were primarily thought of as simple “on/off” switches that release stored energy. However, Bergbreiter and her team used mathematical modeling to show that latches are capable of both steering energy output and controlling the transfer of energy between the jumper and the launch surface.

Video: Carnegie Mellon University

To test their work, the team positioned a small leaping robot atop a tree branch and recorded the precise energy transfers in its jumps’ first moments. Watching the branch recoil before the robot jumped, they could tell the device recovered at least a bit of the energy first transferred to the branch right before liftoff.

“We found that the latch can not only mediate energy output but can also mediate energy transfer between the jumper and the environment that it is jumping from,” said Bergbreiter.

Researchers also noticed an “unconventional” energy recovery in other instances which employed a different latch variety. In those situations, the branch actually provided a little push for the bot after it leaped off its surface, thus returning some of its momentum to boost it higher.
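
A cartoon energy budget helps make the finding concrete: some stored spring energy initially goes into deflecting the branch, and the latch’s behavior governs how much of that loss the recoiling branch returns before liftoff. Every number in this Python sketch is invented, not the team’s measurements.

# Invented energy budget for a latch-mediated jump off a compliant branch.
def takeoff_energy(stored_j, branch_fraction, recovered_fraction):
    """Energy carried by the jumper at liftoff (joules).

    branch_fraction: share of stored energy first lost to branch deflection
    recovered_fraction: share of that loss returned by branch recoil,
                        which the latch mediates
    """
    lost = stored_j * branch_fraction
    return stored_j - lost + lost * recovered_fraction

poor = takeoff_energy(1.0, branch_fraction=0.4, recovered_fraction=0.1)
tuned = takeoff_energy(1.0, branch_fraction=0.4, recovered_fraction=0.8)
print(f"poorly mediated latch: {poor:.2f} J at liftoff")
print(f"well-mediated latch: {tuned:.2f} J at liftoff")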

[Related: This tiny robot grips like a gecko and scoots like an inchworm.]

Now that researchers better understand the interactions at play in the opening moments of leaping, they can now begin working on ways to integrate this into future robotic designs. Likewise, biologists can gain a better insight into how insects maneuver through variable terrains, such as grass or sand.

“It has been nearly impossible to design controlled insect-sized robots because they are launched in just milliseconds,” explained Bergbreiter. “Now, we have more control over whether our robots are jumping up one foot or three… It’s really fascinating that the latch—something that we already need in our robots—can be used to control outputs that we couldn’t have controlled before.”

This tiny robot grips like a gecko and scoots like an inchworm https://www.popsci.com/technology/soft-robot-gecko-inchworm/ Mon, 27 Feb 2023 15:30:00 +0000 https://www.popsci.com/?p=515501
Several unique animals influenced the design of a new soft robot that requires no external power source.

GeiwBot gets its mobility from the movements of insects and reptiles. University of Waterloo

Robots partially inspired by reptiles’ impressive grip are already scaling walls and traipsing across ceilings, but so far they remain large and relatively clunky. If similar mobility could be brought to the burgeoning world of soft robotics, however, an entire world of possibilities could open for everything from the medical industry to search-and-rescue operations.

Knowing this, researchers at Canada’s University of Waterloo recently set out to do just that. As showcased in a recent issue of Cell Reports Physical Science, the team’s newly developed robot, which is four centimeters long, three millimeters wide, and one millimeter thick, draws inspiration from geckos’ grip and the movement of inchworms.

[Related: The newest robot dog can scale walls and ceilings.]

The physics behind geckos’ famous grip is what’s known as the van der Waals force, which occurs when molecules and electrons interact and generate electromagnetic attractions. In the reptiles’ case, microscopic toe hairs rub against wall surfaces, generating the van der Waals force and allowing the animals to cling to whatever they walk across. An inchworm’s contractions, meanwhile, help propel it forward as it moves. By combining both inspirations, the researchers’ new creation uses ultraviolet light and magnetic fields to scoot along floors, up walls, and even across ceilings.

Although its maneuverability is impressive, the robot’s real strength arguably lies in being completely wireless. According to the researchers, GeiwBot (a portmanteau of its gecko and worm influences) is the first of its kind that does not require a hookup to an external power source. This means the sneaky little bot could one day be used in fields such as remote-controlled surgery. Other uses include accessing hard-to-reach locations, such as during search-and-rescue emergency operations.

[Related: This heat-seeking robot looks and moves like a vine.]

Despite its natural-world inspirations, the actual materials used to construct GeiwBot can’t be found anywhere outside a lab. The device is built from a strip of light-responsive polymer that arcs and straightens akin to an inchworm, while liquid crystal elastomers and synthetic magnet patches on either end mimic geckos’ gravity-defying grip.

The team eventually hopes to create and hone a climbing soft robot that ditches magnets entirely for UV light-driven movement. From there, there is also the possibility of augmenting GeiwBot’s design to use near-infrared radiation in lieu of UV light to make the device more biocompatible. 

In a statement, Boxin Zhao, the University of Waterloo’s Endowed Chair in Nanotechnology and paper co-author, explains that, “Even though there are still limitations to overcome, this development represents a significant milestone for utilizing biomimicry and smart materials for soft robots,” adding that, “Nature is a great source of inspiration and nanotechnology is an exciting way to apply its lessons.”

Why DARPA put AI at the controls of a fighter jet https://www.popsci.com/technology/darpa-ai-fighter-jet-test/ Sat, 18 Feb 2023 12:00:00 +0000 https://www.popsci.com/?p=513331
In December tests, different artificial intelligence algorithms flew an F-16-like fighter jet. Can AI be a good combat aviator?

The VISTA aircraft in August, 2022. Kyle Brasier / US Air Force

In December, a special fighter jet made multiple flights out of Edwards Air Force Base in California. The orange, white, and blue aircraft, which is based on an F-16, seats two people. A fighter jet taking to the skies with a human or two on board is not remarkable, but what is indeed remarkable about those December flights is that for periods of time, artificial intelligence flew the jet. 

As the exploits of generative AI like ChatGPT grip the public consciousness, artificial intelligence has also quietly slipped into the military cockpit—at least in these December tests.  

The excursions were part of a DARPA program called ACE, which stands for Air Combat Evolution. The AI algorithms came from different sources, including a company called Shield AI as well as the Johns Hopkins Applied Physics Laboratory. Broadly speaking, the tests represent the Pentagon exploring just how effective AI can be at carrying out tasks in planes typically done by people, such as dogfighting. 

“In total, ACE algorithms were flown on several flights with each sortie lasting approximately an hour and a half,” Lt. Col. Ryan Hefron, the DARPA program manager for ACE, notes to PopSci via email. “In addition to each performer team controlling the aircraft during dogfighting scenarios, portions of each sortie were dedicated to system checkout.”

The flights didn’t come out of nowhere. In August of 2020, DARPA put artificial intelligence algorithms through their paces in an event called the AlphaDogfight Trials. That competition didn’t involve any actual aircraft flying through the skies, but it did conclude with an AI agent defeating a human flying a digital F-16. The late 2022 flights show that software agents that can make decisions and dogfight have been given a chance to actually fly a real fighter jet. “This is the first time that AI has controlled a fighter jet performing within visual range (WVR) maneuvering,” Hefron notes.
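
At a high level, such an agent sits in a control loop: each tick it maps an observation of the engagement to flight commands, with a human safety pilot able to take authority back at any moment. The Python sketch below is an invented illustration of that interface, not DARPA’s, Shield AI’s, or APL’s software.

# Invented sketch of an AI agent in a flight-control loop with override.
import random

def ai_policy(observation):
    # Placeholder decision rule standing in for a trained dogfighting agent.
    throttle = 0.8 if observation["range_m"] > 1000 else 0.6
    return {"roll": round(random.uniform(-1, 1), 2), "pitch": 0.1,
            "throttle": throttle}

def control_loop(ticks=4, human_override_at=3):
    for t in range(ticks):
        if t == human_override_at:
            print(f"t={t}: safety pilot takes control")
            break
        cmd = ai_policy({"range_m": 1500 - 300 * t})
        print(f"t={t}: AI commands {cmd}")

control_loop()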

[Related: I flew in an F-16 with the Air Force and oh boy did it go poorly]

So how did it go? “We didn’t run into any major issues but did encounter some differences compared to simulation-based results, which is to be expected when transitioning from virtual to live,” Hefron said in a DARPA press release

Andrew Metrick, a fellow in the defense program at the Center for New American Security, says that he is “often quite skeptical of the applications of AI in the military domain,” with that skepticism focused on just how much practical use these systems will have. But in this case—an artificial intelligence algorithm in the cockpit—he says he’s more of a believer. “This is one of those areas where I think there’s actually a lot of promise for AI systems,” he says. 

The December flights represent “a pretty big step,” he adds. “Getting these things integrated into a piece of flying hardware is non-trivial. It’s one thing to do it in a synthetic environment—it’s another thing to do it on real hardware.” 

Not all of the flights were part of the DARPA program. All told, the Department of Defense says that a dozen sorties took place, with some of them run by DARPA and others run by a program out of the Air Force Research Laboratory (AFRL). The DOD notes that the DARPA tests were focused more on close aerial combat, while the other tests from AFRL involved situations in which the AI was competing against “a simulated adversary” in a “beyond-vision-range” scenario. In other words, the two programs were exploring how the AI did in different types of aerial contests or situations. 

Breaking Defense reported earlier this year that the flights kicked off December 9. The jet flown by the AI is based on an F-16D, and is called VISTA; it has space for two people. “The front seat pilot conducted the test points,” Hefron explains via email, “while the backseater acted as a safety pilot who maintained broader situational awareness to ensure the safety of the aircraft and crew.”

One of the algorithms that flew the jet came from a company called Shield AI. In the AlphaDogfight trials of 2020, the leading AI agent was made by Heron Systems, which Shield AI acquired in 2021. Shield’s CEO, Ryan Tseng, is bullish on the promise of AI to outshine humans in the cockpit. “I do not believe that there’s an air combat mission where AI pilots should not be decisively better than their human counterparts, for much of the mission profile,” he says. That said, he notes that “I believe the best teams will be a combination of AI and people.”

One such future for teaming between a person and AI could involve AI-powered fighter-jet-like drones such as the Ghost Bat working with a crewed aircraft like an F-35, for example. 

It’s still early days for the technology. Metrick, of the Center for New American Security, wonders how the AI agent would be able to handle a situation in which the jet does not respond as expected, like if the aircraft stalls or experiences some other type of glitch. “Can the AI recover from that?” he wonders. A human may be able to handle “an edge case” like that more easily than software.

A torpedo-like robot named Icefin is giving us the full tour of the ‘Doomsday’ glacier https://www.popsci.com/technology/icefin-robot-thwaites-glacier/ Fri, 17 Feb 2023 15:00:00 +0000 https://www.popsci.com/?p=513275
It may look like a long, narrow tube, but this robot is useful for a range of scientific tasks.

Icefin under the sea ice. Rob Robbins, USAP Diver

Thwaites, a notoriously unstable glacier in western Antarctica, is cracking and disintegrating, spelling bad news for sea level rise across the globe. Efforts are afoot to understand the geometry and chemistry of Thwaites, which is about the size of Florida, in order to gauge the impact that warming waters and climate change may have on it. 

An 11-foot tube-like underwater robot called Icefin is offering us a detailed look deep under the ice at how the vulnerable ice shelf in Antarctica is melting. By way of two papers published this week in the journal Nature, Icefin has been providing pertinent details regarding the conditions beneath the freezing waters. 

The torpedo-like Icefin was first developed at Georgia Tech, and the first prototype of the robot dates back to 2014. But it has since found a new home at Cornell University. This robot is capable of characterizing below-ice environments using the suite of sensors that it carries. It comes equipped with HD cameras, laser ranging systems, sonar, doppler current profilers, single beam altimeters (to measure distance), and instruments for measuring salinity, temperature, dissolved oxygen, pH, and organic matter. Its range is impressive: It can go down to depths of 3,280 feet and squeeze through narrow cavities in the ice shelf. 

Since Icefin is modular, it can be broken down, customized, and reassembled according to the needs of the mission. Researchers can remotely control Icefin’s trajectory, or let it set off on its own.  
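
Icefin’s modularity implies a payload-configuration step before each mission: pick sensor modules, then check the stack against the vehicle’s constraints. Here is a hypothetical Python sketch of that idea; the module names follow the sensors listed above, but the API and power figures are invented.

# Hypothetical mission-payload builder with an invented power budget.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    power_w: float

CATALOG = {
    "hd_camera": Module("HD camera", 15.0),
    "sonar": Module("sonar", 30.0),
    "dvl": Module("doppler current profiler", 20.0),
    "ctd": Module("salinity/temperature probe", 5.0),
    "altimeter": Module("single-beam altimeter", 4.0),
}

def build_payload(names, power_budget_w=60.0):
    payload = [CATALOG[n] for n in names]
    draw = sum(m.power_w for m in payload)
    if draw > power_budget_w:
        raise ValueError(f"payload draws {draw} W, over budget")
    return payload

# A melt-rate survey might trade the camera for the current profiler.
for m in build_payload(["sonar", "dvl", "ctd", "altimeter"]):
    print(f"mounted: {m.name} ({m.power_w} W)")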

Icefin isn’t alone in these cold waters. Its journey is part of the International Thwaites Glacier Collaboration (ITGC), which includes other radars, sensors, and vehicles like Boaty McBoatface

[Related: The ‘Doomsday’ glacier is fracturing and changing. AI can help us understand how.]

In 2020, through a nearly 2,000-foot-deep borehole drilled in the ice, Icefin ventured out across the ocean to the critical point where the Thwaites Glacier joins the Amundsen Sea and the ice starts to float. Data gathered by Icefin, and analyzed by human researchers, showed that the glacier had retreated up the ocean floor, thinning at the base, and melting outwards quickly. Additionally, the shapes of certain crevasses in the ice are helping funnel in warm ocean currents, making sections of the glacier melt faster than previously expected. 

These new insights, as foreboding as they are, may improve older models that have been used to predict the changes in Thwaites, and in the rates of possible sea level rise if it collapses. 

“Icefin is collecting data as close to the ice as possible in locations no other tool can currently reach,” Peter Washam, a research scientist from Cornell University who led analysis of Icefin data used to calculate melt rates, said in a press release. “It’s showing us that this system is very complex and requires a rethinking of how the ocean is melting the ice, especially in a location like Thwaites.”

Outside of Thwaites, you can find Icefin monitoring the ecosystems within ice-oceans around Antarctica’s McMurdo research station, or helping astrobiologists understand how life came to be in ocean worlds and their biospheres. 

Learn more about Icefin below: 

Watch never-before-seen footage of the Titanic shipwreck from the 1980s https://www.popsci.com/science/new-titanic-footage/ Thu, 16 Feb 2023 15:05:00 +0000 https://www.popsci.com/?p=512884
Previously unreleased footage shows off the watershed tech used at the site's earliest expeditions.

The bow of the RMS Titanic, almost 12,500 feet under the ocean. WHOI Archives /©Woods Hole Oceanographic Institution

Taken at the site of the RMS Titanic in 1986, nine months after the wreck was first discovered, more than 80 minutes of previously unreleased footage of the second expedition have been made public for the first time. The footage highlights the difficult work the team did to bring images of the iconic ship back to the surface and advance the field of underwater exploration: withstanding extremely high pressures 2.5 miles down, keeping the submersible’s tethers from getting stuck, and avoiding disturbing hallowed ground, all at a time when the technology was still limited.

Rare, uncut, and un-narrated footage of the wreck captured in July 1986 from cameras on the human-occupied submersible Alvin and the newly built, remotely operated Jason Junior. CREDIT: WHOI Archives /©Woods Hole Oceanographic Institution.

Andrew Bowen, an engineer and director of the National Deep Submergence Facility at Woods Hole Oceanographic Institution (WHOI), was on the team. For him, the ocean often provides scientists with a bit of a reality check. “The ocean is constantly throwing reminders that despite the advances in technology, we still have a tremendous amount to learn about it and about the environment,” Bowen tells PopSci.

The premiere of the footage coincides with the theatrical re-release of Titanic (1997) in celebration of the 25th anniversary of director James Cameron’s film. (Cameron himself has viewed the wreckage at the bottom of the ocean at least 33 times.)

[Related: Did An Optical Illusion Doom the Titanic?]

The new footage shows the three-person research submersible Alvin approaching the Titanic wreckage, exploring the bow, and eventually parking on its deck. A then-brand-new remotely operated vehicle called Jason Jr. also took a closer look into a chief officer’s cabin, one of the ship’s promenade windows, and the familiar railings along the bow. 

HOV Alvin, with ROV Jason Jr. attached, descends to the ocean bottom.
HOV Alvin, with ROV Jason Jr. attached, descends to the ocean bottom. CREDIT: WHOI Archives /©Woods Hole Oceanographic Institution.

“To be able to actually inspect things on the seafloor in greater detail, that’s where the Jason Jr. vehicle came into play. It was a key part of the 1986 expedition,” says Bowen. “My first project at WHOI really was designing the Jason Jr. vehicle and that was, from my point of view, a great way to demonstrate the technology.”

WHOI senior scientist Dana Yoerger said that the crew did not collect anything from the wreckage, both because Ballard thought it best to preserve the site and because the materials would teach us little about the wreck itself. According to Yoerger’s statement streamed on YouTube, such artifacts have “associative value,” not “historical value.”

The ocean liner hit an iceberg and sank on April 15, 1912 during its maiden voyage from Southampton, England, to New York City. While the search for the wreckage began almost immediately after the Titanic sank, technological limitations hampered recovery efforts. By 1985, WHOI had developed new imaging technology, including Argo–a camera sled that was towed from the research vessel Knorr and captured the first photographs of the wreckage almost 12,500 feet underwater. 

Robert Ballard holds a camera aboard the submersible vehicle Alvin when it dove on the Titanic wreckage.
WHOI’s Dr. Robert Ballard led the 1986 return to the wreck and was one of the passengers aboard Alvin when it dove on the wreckage. (Ballard was also a part of the 1985 discovery). CREDIT: WHOI Archives /©Woods Hole Oceanographic Institution.

The team, led by WHOI and Robert Ballard in partnership with the Institut français de recherche pour l’exploitation de la mer (IFREMER), discovered the Titanic shipwreck in the North Atlantic Ocean, roughly 400 miles off the coast of Newfoundland, Canada, on September 1, 1985. The team would go on to take 11 trips to the wreckage in 1986. 

In an interview with the Associated Press when this new footage was released on February 15, Ballard said, “The first thing I saw coming out of the gloom at 30 feet was this wall, this giant wall of riveted steel that rose over 100 and some feet above us. I never looked down at the Titanic. I looked up at the Titanic. Nothing was small.”

While there weren’t any human remains at the site, the crew saw shoes, resting like tombstones, that marked where some of the roughly 1,500 people who died during the sinking came to rest–a sight Ballard called “haunting.”

[Related: These new robots will plunge into the ocean’s most alien depths.]

“More than a century after the loss of Titanic, the human stories embodied in the great ship continue to resonate,” said explorer, filmmaker, and ocean advocate James Cameron, in a press release. “Like many, I was transfixed when Alvin and Jason Jr. ventured down to and inside the wreck. By releasing this footage, WHOI is helping tell an important part of a story that spans generations and circles the globe.”

[Related: Archive Gallery: Our Obsession with the Titanic.]

Since 1986, the technology for exploring the ocean and its hallowed ground has vastly improved, including new autonomous underwater vehicles like REMUS and Orpheus that can explore the water the way a drone explores the air. Submersibles are now equipped with LED lights, instead of the quartz lamps on Alvin and Jason Jr. that could only illuminate about 50 feet ahead. 

However, humans won’t be out of a job any time soon.

“One constant element that remains really compelling and scientifically important is human presence in the deep ocean,” says Bowen. “Humans still have an incredible skill at assimilating to an unknown environment in a way that a machine just doesn’t. Having humans involved, to gain context about the environment is a very valuable part of exploring the ocean that endures today.”

The post Watch never-before-seen footage of the Titanic shipwreck from the 1980s appeared first on Popular Science.


]]>
Researchers are stuffing drones into taxidermy birds to make them seem more ‘natural’ https://www.popsci.com/technology/taxidermy-bird-drone-robot/ Wed, 15 Feb 2023 21:00:00 +0000 https://www.popsci.com/?p=512596
Hand holding up drone disguised within taxidermy bird body
Researchers hope birds won't notice the difference. Mostafa Hassanalian

It's a bird, it's a plane, it's... sort of both, actually.

The post Researchers are stuffing drones into taxidermy birds to make them seem more ‘natural’ appeared first on Popular Science.

]]>
Hand holding up drone disguised within taxidermy bird body
Researchers hope birds won't notice the difference. Mostafa Hassanalian

Why spend all that time building and fine-tuning robots that mimic birds when you can just…stuff robots in dead birds’ bodies? It’s hardly that simple, but a recent project courtesy of Mostafa Hassanalian and colleagues at New Mexico Tech put the peculiar idea to the test.

The team, who presented their work in late January at the American Institute of Aeronautics and Astronautics’ SciTech Forum, designed new systems reliant on taxidermy bird parts and artificial wing setups to mirror their (formerly living) avian inspirations. As New Scientist also highlighted on Tuesday, Hassanalian’s group technically built two dead bird bots: one fusing artificial body parts with an actual pheasant’s head and feathers, and another combining a mechanical body with real pigeon wings.

[Related: Watch this bird-like robot make a graceful landing on its perch.]

The techno-taxidermy models, perhaps unsurprisingly, lag considerably behind their living counterparts in maneuverability, speed, and grace. Still, the feathery drones can glide, hover in place, and soar higher on hot thermal currents—just don’t expect them to do anything elegantly yet, judging from video supplied to PopSci by Hassanalian.

The uncanniness of robot birds flying around may not be much of an issue for the new designs’ potential uses, anyway. The research team’s paper notes that future models could hypothetically be used as “spy drones for military use,” although Hassanalian makes it clear in an email that this is far from its foremost goal of “developing a nature-friendly drone concept for wildlife monitoring.” Traditional drones are often disruptive to ecosystems due to issues such as sound and unfamiliarity, so developing quieter, natural-looking alternatives could help wildlife monitoring and research.

[Related: Reverse-engineered hummingbird wings could inspire new drone designs.]

Hassanalian also notes there are potential biological discoveries to be found in mimicking bird movement–for example, figuring out how actual birds conserve energy while flying in V-formations, or how feather colors and patterns may affect heat absorption and airflow.

Of course, any plans will require a bit more delving into the ethics and research guidelines for using deceased birds in future tinkerings. And before you ask—don’t worry. Hassanalian’s team worked with a nearby taxidermy artist to source the drones’ natural components. No real birds were physically harmed in the making of the drones. But it remains to be seen if any living animals will suffer psychologically from potentially seeing their cyborg cousins snapping spy photos of them one day.

The post Researchers are stuffing drones into taxidermy birds to make them seem more ‘natural’ appeared first on Popular Science.


]]>
Cuttlefish have amazing eyes, so robot-makers are copying them https://www.popsci.com/technology/cuttlefish-eye-imaging-system/ Wed, 15 Feb 2023 20:00:00 +0000 https://www.popsci.com/?p=512718
a cuttlefish in darkness
Cuttlefish are clever critters with cool eyes. Will Turner / Unsplash

Cameras inspired by cuttlefish eyes could help robots and cars see better.

The post Cuttlefish have amazing eyes, so robot-makers are copying them appeared first on Popular Science.

]]>
a cuttlefish in darkness
Cuttlefish are clever critters with cool eyes. Will Turner / Unsplash

Cuttlefish are smart, crafty critters that have long fascinated scientists. They’re masters of disguise, creative problem solvers, and they wear their feelings on their skin. On top of all that, they have cool-looking eyes and incredible sight. With w-shaped pupils, a curved retina, and a special arrangement of cells that respond to light, they have stellar 3D vision, great perception of contrast, and an acute sensitivity to polarized light. This vision system allows these creatures to hunt in underwater environments where lighting is often uneven or less than optimal. So an international team of roboticists wanting to create machines that can see and navigate in these same conditions has looked to nature for inspiration on artificial vision. 

In a new study published this week in Science Robotics, the team created an artificial vision design that was inspired by cuttlefish eyes. It could help the robots, self-driving vehicles, and drones of the future see the world better. 

“Aquatic and amphibious animals have evolved to have eyes optimized for their habitats, and these have inspired various artificial vision systems,” the researchers wrote in the paper. For example, imaging systems have been modeled after the panoramic view of fish eyes, the wide-spectrum vision of mantis shrimp, and the 360-degree field of view of fiddler crab eyes. 

[Related: A tuna robot reveals the art of gliding gracefully through water]

Because the cuttlefish has photoreceptors (nerve cells that take light and turn it into electrical signals) that are packed together in a belt-like region and stacked in a certain configuration, it’s good at recognizing approaching objects. This feature also allows them to filter out polarized light reflecting from the objects of interest in order to obtain a high visual contrast. 

Meanwhile, the imaging system the team made mimics the unique structural and functional features of the cuttlefish eye. It contains a w-shaped pupil attached to the outside of a ball-shaped lens, with an aperture sandwiched in the middle. The pupil shape is intended to reduce distracting lights outside the field of vision and balance brightness levels. The device also contains a flexible polarizer on its surface, and a cylindrical silicon photodiode array that converts photons into electrical currents. These kinds of image sensors usually pair one photodiode to one pixel. 
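
To get a feel for how polarization can boost contrast, here is a minimal numpy sketch (our illustration, not the team’s code) that recovers the degree of linear polarization from four captures taken through polarizers at different angles; strongly polarized pixels typically mark glare that can be suppressed to make the object of interest stand out.

```python
import numpy as np

# Four toy captures through polarizers at 0, 45, 90, and 135 degrees.
# A real polarization camera delivers these as per-pixel channels.
rng = np.random.default_rng(0)
i0, i45, i90, i135 = rng.uniform(0.2, 1.0, (4, 64, 64))

# Stokes parameters for linear polarization.
s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
s1 = i0 - i90                       # horizontal vs. vertical component
s2 = i45 - i135                     # diagonal component

# Degree of linear polarization: values near 1 flag strongly polarized
# light (often glare), which can be filtered out to boost contrast.
dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
print("mean DoLP:", round(float(dolp.mean()), 3))
```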

“By integrating these optical and electronic components, we developed an artificial vision system that can balance the uneven light distribution while achieving high contrast and acuity,” the researchers wrote. 

In a small series of imaging tests, the cuttlefish-inspired camera was able to pick up the details on a photo better than a regular camera, and it was able to fairly accurately translate the outlines of complex objects like a fish even when the light on it was harsh or shone at an angle. 

The team notes that this approach is promising for reducing blind spots that most modern cameras on cars and bots have trouble with, though they acknowledge that some of the materials used in their prototype may be difficult to fabricate on an industrial level. Plus, they note that “there is still room for further improvements in tracking objects out of sight by introducing mechanical movement systems such as biological eye movements.”

The post Cuttlefish have amazing eyes, so robot-makers are copying them appeared first on Popular Science.


]]>
This robot can create finger paintings based on human inputs https://www.popsci.com/technology/frida-ai-paint-robot/ Sat, 11 Feb 2023 12:00:00 +0000 https://www.popsci.com/?p=511313
Robot painted portrait of Frida Kahlo
FRIDA's portrait of its namesake artist. Carnegie Mellon University

Carnegie Mellon University's FRIDA turns ideas into colorful finger-painted portraits.

The post This robot can create finger paintings based on human inputs appeared first on Popular Science.

]]>
Robot painted portrait of Frida Kahlo
FRIDA's portrait of its namesake artist. Carnegie Mellon University

A research team at Carnegie Mellon University has developed a new project that embraces artistic collaboration’s spontaneity and joy by merging the strengths of humans, artificial intelligence, and robotics. FRIDA—the Framework and Robotics Initiative for Developing Arts—ostensibly works like the generative art-bot DALL-E by developing an image based on a series of human prompts. But FRIDA takes it a step further by actually painting its idea on a physical canvas.

As described in a paper to be presented in May at the IEEE International Conference on Robotics and Automation, the team first installed a paintbrush onto an off-the-shelf robotic arm, then programmed its accompanying AI to reinterpret human input, photographs, and even music. The final results arguably resemble somewhat rudimentary finger paintings.

Unlike other similar designs, FRIDA analyzes its inherently imprecise brushwork in real time, and adjusts accordingly. Its perceived mistakes are incorporated into the project as they come, offering a new level of spontaneity. “It will work with its failures and it will alter its goals,” Peter Schaldenbrand, a Ph.D. student and one of FRIDA’s creators, said in the demonstration video provided by Carnegie Mellon.
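
As a rough illustration of that paint-observe-replan loop (a toy stand-in, not FRIDA’s actual planner), here the “canvas” is a grayscale array and every imprecise dab simply feeds back into the next error map rather than being undone:

```python
import numpy as np

rng = np.random.default_rng(1)
goal = rng.uniform(0, 1, (32, 32))   # stand-in target image
canvas = np.zeros_like(goal)         # blank canvas

for _ in range(500):
    error = goal - canvas
    # Plan: paint where the canvas is most wrong...
    y, x = np.unravel_index(np.abs(error).argmax(), error.shape)
    # ...but execution is imprecise, like a physical brushstroke.
    dab = error[y, x] * rng.uniform(0.5, 1.0) + rng.normal(0, 0.02)
    canvas[y, x] += dab
    # No undo step: whatever actually landed on the canvas, mistakes
    # included, shapes the next iteration's error map.

print("remaining error:", round(float(np.abs(goal - canvas).mean()), 4))
```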

[Related: Netflix used AI-generated images in anime short. Artists are not having it.]

Its creators emphasize that the robot is a tool for human creativity rather than a replacement for it. According to the team’s research paper, FRIDA “is a robotics initiative to promote human creativity, rather than replacing it, by providing intuitive ways for humans to express their ideas using natural language or sample images.”

Going forward, researchers hope to continue honing FRIDA’s abilities, along with expanding its repertoire to potentially one day include sculpting, an advancement that could show great promise in a range of production industries. 

The post This robot can create finger paintings based on human inputs appeared first on Popular Science.


]]>
This robot eel could be the start of a new breed of ‘voxel’ robots https://www.popsci.com/technology/mit-voxel-eel-robot/ Tue, 07 Feb 2023 21:00:00 +0000 https://www.popsci.com/?p=510500
Underwater snake robot from MIT
Meet the son of Robotuna. MIT

MIT scientists created what could be the Legos of affordable, adaptable robotics.

The post This robot eel could be the start of a new breed of ‘voxel’ robots appeared first on Popular Science.

]]>
Underwater snake robot from MIT
Meet the son of Robotuna. MIT

Way back in 1994, engineers at MIT unveiled Robotuna. Inspired by the 160-million-year-old species, the aptly named, four-foot-long submersible robot required over 2,800 components, including 40 ribs, tendons, a vertebrae-like backbone, and even a Lycra covering to mimic the fish’s skin. Now, nearly three decades later, yet another MIT research team (including one Robotuna veteran) has unveiled its new underwater successor to the breakthrough fishbot—a modular creation composed of simplified, repeating structures instead of individualized pieces, one that can resemble everything from an eel to a hydrofoil wing.

Their findings, published recently in the journal Soft Robotics, showcase MIT’s new advances in developing deformable, dynamically changing underwater robotic structures. This ability is key for submersible robots, since it allows them to move through water much more efficiently, as countless varieties of fish do in rivers, lakes, and the open ocean.

[Related: This amphibious robot can fly like a bird and swim like a fish.]

The team’s new design relies on lattice-like pieces called voxels, which are stiff in structure yet low-density, and highly scalable. The voxels are made to be load-bearing in one direction yet soft in others through a combination of various materials and proportions, including cast plastic pieces. The entire design was then encased in a rib-like support material, and all of that was covered in waterproof neoprene.

To demonstrate these advances, the team created a meter-long, eel-like robot composed of four structures, each made of five voxels. Actuator wires attached to the voxels at each end allow the robot to undulate, propelling the snakebot through water. Unlike the two-year construction time of its Robotuna ancestor, however, the new robot took only two days to build.
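
Conceptually, the undulation is a traveling wave passed down the body segment by segment. A minimal sketch of phase-offset actuator commands, with the frequency and wavelength values assumed purely for illustration:

```python
import numpy as np

N_SEGMENTS = 4          # actuated modules along the body (per the MIT design)
FREQ_HZ = 1.0           # undulation frequency (assumed value)
WAVES_PER_BODY = 1.0    # one full wave along the body (assumed value)

def actuator_commands(t):
    """Phase-offset sine commands, one per segment: a wave traveling
    head to tail, the standard recipe for eel-like swimming."""
    k = np.arange(N_SEGMENTS)
    phase = 2 * np.pi * k * WAVES_PER_BODY / N_SEGMENTS
    return np.sin(2 * np.pi * FREQ_HZ * t - phase)

for t in np.linspace(0, 1, 5):
    print(f"t={t:.2f}s ->", np.round(actuator_commands(t), 2))
```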

[Related: Bat-like echolocation could help these robots find lost people.]

“There have been many snake-like robots before, but they’re generally made of bespoke components, as opposed to these simple building blocks that are scalable,” Neil Gershenfeld, an MIT professor and research team member, said in a news release.

Aside from scalability, the voxels allow for numerous other design possibilities, including a winglike hydrofoil also built by the team. Resembling a sail, the second construction shows promise for integration onto shipping vessel hulls, where it could manage drag-inducing eddies to improve energy efficiency. There’s also talk of a “whale-like submersible craft” capable of creating its own propulsion. Given the voxels’ drastically shorter build times, however, that prototype could be here before we know it.

The post This robot eel could be the start of a new breed of ‘voxel’ robots appeared first on Popular Science.


]]>
This amphibious robot can fly like a bird and swim like a fish https://www.popsci.com/technology/drone-air-water-quadcopter/ Mon, 06 Feb 2023 20:00:00 +0000 https://www.popsci.com/?p=510182
Quadcopter drone propelling itself underwater in swimming pool
Mirs-X works as well in the water as it does in the air. New Scientist/YouTube

A new drone is just as comfortable soaring through the air as it is taking a swim.

The post This amphibious robot can fly like a bird and swim like a fish appeared first on Popular Science.

]]>
Quadcopter drone propelling itself underwater in swimming pool
Mirs-X works as well in the water as it does in the air. New Scientist/YouTube

One of the most striking aspects of the military’s much-analyzed UAP footage is some of the objects’ apparent ability to travel between air and water in the blink of an eye. Something capable of such a feat may certainly appear like some seriously extraterrestrial technology to the untrained eye, but a research team at the Chinese University of Hong Kong recently showed that, at least on a small scale, it’s not impossible to do.

As highlighted by New Scientist and soon to be detailed at the upcoming IEEE International Conference on Robotics and Automation, Ben Chen and their team’s small “Mirs-X” quadcopter prototype can hover for about six minutes in the air, or dive as deep as three meters for a whopping 40 minutes. To accomplish maneuvering in both mediums, researchers equipped each of the drone’s four motors with a dual-speed gearbox. The motors and propellers are situated on rotating mounts capable of tilting and changing direction independently of one another, thus allowing for underwater propulsion.

[Related: Bat-like echolocation could help these robots find lost people.]

Precise propeller speed is also a vital factor in Mirs-X’s success. Because air is far less dense than water, the drone’s propellers must spin incredibly fast to generate enough lift to rise and hover. Those same mechanisms can then slow down immensely once underwater to provide the appropriate thrust.
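
A quick back-of-envelope calculation shows why the slowdown is so dramatic: by standard propeller momentum theory, thrust scales with fluid density times the square of rev rate, so producing the same thrust in water (roughly 800 times denser than air) lets the rotors spin about 30 times slower. This is our arithmetic, not the team’s, and it ignores buoyancy and propeller differences.

```python
# Propeller thrust scales as T ~ C_T * rho * n^2 * D^4 (momentum theory),
# so matching a given thrust means rev rate n scales with 1/sqrt(rho).
RHO_AIR = 1.225      # kg/m^3
RHO_WATER = 1000.0   # kg/m^3

ratio = (RHO_AIR / RHO_WATER) ** 0.5
print(f"underwater rev rate ~ {ratio:.3f}x the airborne rate "
      f"(about {1/ratio:.0f}x slower) for the same thrust")
```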

Although the Mirs-X prototype is pretty small—measuring just under 15 inches across and weighing barely 3.5 pounds—Chen’s team hopes to scale the drone up to as large as 6 feet across in future experiments. They also hope to add abilities like grasping and carrying objects recovered underwater, although they cautioned to New Scientist that the further waterproofing required could hamper its effectiveness.

If the hurdles could be cleared, however, such a drone could one day prove immensely useful for situations such as search and rescue operations requiring both aerial and submerged reconnaissance, or for inspecting engineering and industrial areas… perhaps a team-up with that new echolocation bot could prove interesting.

The post This amphibious robot can fly like a bird and swim like a fish appeared first on Popular Science.


]]>
Bat-like echolocation could help these robots find lost people https://www.popsci.com/technology/robot-echolocation-bat/ Fri, 03 Feb 2023 16:30:00 +0000 https://www.popsci.com/?p=509670
Bat hanging upside down from branch
The robots are almost as cute as this lil' guy. Deposit Photos

The cheap, simple addition could help robots navigate and map hard-to-reach areas.

The post Bat-like echolocation could help these robots find lost people appeared first on Popular Science.

]]>
Bat hanging upside down from branch
The robots are almost as cute as this lil' guy. Deposit Photos

Echolocation is an immense benefit for bats—and certain superheroes. Typically, the sense works via the brain interpreting sound waves bouncing off nearby surroundings to estimate information such as size and distance. Users of echolocation usually generate the sounds themselves via high-pitched clicks and pings, as is the case with bats, dolphins, and whales.

Researchers working at Switzerland’s Ecole Polytechnique Fédérale de Lausanne (EPFL) recently extended the sensory ability to the field of robotics, with some very promising results. As first detailed in the journal Robotics and Automation Letters, and subsequently highlighted on Thursday by New Scientist, Frederike Dümbgen—now at the University of Toronto—and her team fitted both wheeled and flying robots with a simple, cheap microphone and speaker array.

[Related: When wind turbines kill bats and birds, these scientists want the carcasses.]

The cost-effective system essentially works like bats’ sensory organs, first emitting short pings across a range of frequencies, then using the robot’s onboard microphone to record the sounds after they bounce off nearby walls. An algorithm designed by Dümbgen’s team then analyzes how the emitted sound waves interfere with their own echoes, and reconstructs the rough dimensions of the room.
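
The core trick is time-of-flight ranging, which fits in a few lines of code. Here is a simplified sketch (not the EPFL team’s algorithm, which also exploits interference across frequencies): emit a frequency-swept ping, record it along with its echo, and read the wall distance off a correlation peak.

```python
import numpy as np

FS = 48_000                    # microphone sample rate in Hz (assumed)
C = 343.0                      # speed of sound in air, m/s
t = np.arange(0, 0.01, 1/FS)   # a 10 ms ping

# A short frequency-swept ping ("short pings across a range of frequencies").
ping = np.sin(2*np.pi*np.linspace(2_000, 8_000, t.size)*t)

# Simulated recording: the robot hears its own ping, then a fainter echo
# from a wall 0.5 m away (round-trip delay = 2*d/C).
delay = int(2*0.5/C*FS)
rec = np.zeros(t.size + delay)
rec[:t.size] += ping
rec[delay:delay+t.size] += 0.3*ping

# Matched filter: cross-correlate with the ping, ignore lags closer than
# 0.2 m, and convert the strongest remaining peak back to distance.
corr = np.correlate(rec, ping, mode="valid")
min_lag = int(2*0.2/C*FS)
lag = np.argmax(corr[min_lag:]) + min_lag
print(f"estimated wall distance: {lag/FS*C/2:.2f} m")
```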

According to their results, a stationary robot about the size of a tennis ball could map to within two centimeters’ accuracy when placed half a meter away from a wall, while the flying drone could manage within eight centimeters. This is a relatively far cry from the accuracy of advanced camera and laser options, but a solid alternative given its comparatively light weight and cost.

[Related: How humans can echolocate like bats.]

Yet even this basic robotic echolocation system could soon show immense promise in difficult-to-map or completely foreign environments, as well as search-and-rescue operations, as New Scientist offers. Its ability to attach to off-the-shelf robot models—in this case, a Crazyflie and an e-puck—also makes it incredibly versatile for additional designs. For future iterations, Dümbgen’s team hopes to hone the system’s accuracy, as well as potentially phase out the audible pinging setup to instead measure inherent sound such as a flying robot’s own propellers.

Correction 2/7/23: A previous version of this article misattributed the research as taking place at the University of Toronto. Dümbgen’s research was conducted at Switzerland’s Ecole Polytechnique Fédérale de Lausanne.

The post Bat-like echolocation could help these robots find lost people appeared first on Popular Science.


]]>
Russia’s robot tanklet is being pitched as an anti-armor tool https://www.popsci.com/technology/russia-marker-robot-ukraine/ Wed, 01 Feb 2023 23:06:57 +0000 https://www.popsci.com/?p=509251
Leopard 2A6 tanks seen in Germany in June, 2018, during a training event.
Leopard 2A6 tanks seen in Germany in June, 2018, during a training event. US Army / Rolyn Kropf

The experimental bot in question is called the Marker UGV.

The post Russia’s robot tanklet is being pitched as an anti-armor tool appeared first on Popular Science.

]]>
Leopard 2A6 tanks seen in Germany in June, 2018, during a training event.
Leopard 2A6 tanks seen in Germany in June, 2018, during a training event. US Army / Rolyn Kropf

On January 26, Russian politician Dmitry Rogozin claimed in an interview that the country’s robotic Marker Uncrewed Ground Vehicles will be deployed in Ukraine as a tool to counter tanks. The Marker is a long-in-development and high-tech concept, designed to explore how robots could work together with humans on the battlefield. As Russia’s invasion of Ukraine continues, and as Ukraine prepares to receive armored vehicles, including tanks, from other countries, Marker appears to have been moved from conceptual promise to being touted as a wonder weapon. 

The Marker UGV dates back to at least 2019, when it was promoted as a symbol of the modern technological prowess of the Russian military. While Russia had already developed armed drones, its ground robots typically took the form of mine-clearing machines like the Uran-6. With treads and a turret, the Marker featured in glossy promotional videos with a rock beat, its machine gun swiveling as if following the commands of a remote human spotter.

Marker was developed by Russia’s Advanced Research Foundation, which is a rough analog to DARPA in the US. Early work on Marker made it a tool for exploring concepts in robots, remote control, and autonomy, with the assumption that later, other companies would develop new tools and weapons based on the research done with Marker.

As recently as January 2022, Russian state-owned media described Marker as being used to patrol a spaceport and work alongside quadcopter drones. Marker was one of several robots promoted as major technological advances, all against the backdrop of Russia mobilizing tanks and soldiers for the invasion of Ukraine that came February 24. In the eleven months since the invasion, Russia’s major advances have been halted, and on multiple fronts turned back. Now, with news that Ukraine stands ready to receive armored transports and tanks, Marker is back to being a darling of Russian media.

Meeting its Marker  

On January 15, Rogozin claimed to the news service TASS that Marker robots would be tested in Ukraine soon. While Rogozin currently has no official capacity in the Russian government, he has previously held multiple high-level positions within it. In July 2022, he was dismissed as the head of Roscosmos, Russia’s space agency, and has since rebranded himself as a leader of a volunteer group called the “Tsar’s Wolves,” whose aim is improving the technology of Russian forces. Testing Marker in Ukraine would mark a debut for the device, and a task it was never quite designed for.

“This would be a first combat deployment for the Marker UGV, and yes, it wasn’t really tested in combat conditions before,” says Samuel Bendett, an analyst at the Center for Naval Analysis and adjunct senior fellow at the Center for New American Security. “It was tested in a rather controlled environment, even when it had to navigate autonomously through a forested environment in late 2021. There is of course a possibility of a classified series of tests that could have taken place, but as far as all info about this UGV, there was no real combat stress test.”

Deploying an untested robot into combat, should it happen, reads as more of a stunt than a battle-changing tool. In earlier tests and demonstrations, what set Marker apart was its ability to carry machine guns and anti-tank weapons, then use them at the discretion of protected or hidden soldiers. Powerful cameras and sensors could make it a useful spotter and shooter, though the role necessarily entails exposing the robot to return fire, risking the integrity of the machine. At a production level, that is a loss that a military can absorb. But with just a handful of test platforms, it is a big gamble for a flashy demonstration.

“Marker has limited autonomy capability for movement and target selection, although testing that in a complex battlefield space is probably different than trying to recreate such a test in pre-2022 trials. This is the crux of the problem in using such UGVs – real combat presents many unpredictable situations that cannot be all tested out beforehand, so it’s also likely that Markers will be remote-controlled to avoid losses,” says Bendett. “And there is also a significant PR element here.”

The possible fronts where Marker could be deployed in Ukraine are many, from old trenches in the Donbas region that Russia has contested since 2014, to fierce fighting around the Ukrainian city of Bakhmut in the east, or even along Russian-held front lines northeast of Crimea. Regardless of where it is deployed, it is unlikely to be effective against heavy armor.

Rogozin highlighted that Marker exists in two forms. The sensor-and-drone equipped scout is designed as a useful spotter. Rogozin pitched the armed version, complete with anti-tank missiles, as an answer to Abrams and Leopard tanks. Says Bendett: “The recon version seems more plausible [for use] than a straight up contest against two of the most powerful tanks in the world.”

The post Russia’s robot tanklet is being pitched as an anti-armor tool appeared first on Popular Science.


]]>
This heat-seeking robot looks and moves like a vine https://www.popsci.com/technology/vine-robot-heat/ Wed, 01 Feb 2023 21:00:00 +0000 https://www.popsci.com/?p=509154
Infrared heat comparison photo of vine robot moving towards heat source
Researchers hope the robot could one day aid in fire suppression. Charles Xiao et al.

The tubular robot could one day help extinguish the hard-to-find remnants of wildfires.

The post This heat-seeking robot looks and moves like a vine appeared first on Popular Science.

]]>
Infrared heat comparison photo of vine robot moving towards heat source
Researchers hope the robot could one day aid in fire suppression. Charles Xiao et al.

Robots are increasingly drawing from a menagerie of animal inspirations, but one of the newer designs takes its novel inspiration from vegetation. As spotted on Tuesday by New Scientist, researchers at the University of California, Santa Barbara have created a robot that mimics vines and roots in their ability to locally detect and move toward sources of moisture. In this case, however, Charles Xiao and his cohort honed their creation to grow toward heat.

The two-meter-long, tendril-like bot is composed of a pair of thin Mylar bags filled with a refrigerant fluid called Novec 7000, separated by an insulating, low-density polyethylene “spine.” Each Mylar sleeve is divided into 4.5-centimeter segments, which expand wider while shrinking in overall length once their internal refrigerant begins evaporating past its comparatively low boiling point of 93°F.

[Related: This Korean robodog proves running on sand isn’t just for ‘Baywatch’.]

When a segment on the warmed side widens and shortens, its complementary portion on the other side of the robot’s spine lengthens in response, arcing the overall device toward the heat source. According to the team’s research, the robot can already navigate around simple obstacles, as well as bend backward toward heat even if pointed in the opposing direction. Another feature of the team’s root-inspired robot is its eversion capability, the process by which soft robots extend or unfurl from their own interiors—New Scientist aptly compares the movement to an inside-out dress shirt sleeve being pushed out by an arm.
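
The steering geometry can be captured in a toy model: treat the two sides as concentric arcs sharing a center, so the length difference between them, divided by their spacing, gives the bend angle. Everything below except the 4.5-centimeter segment length is an assumption of ours, not a number from the paper.

```python
import math

SEGMENT_LEN = 0.045   # m, the 4.5-centimeter Mylar segments
SIDE_SPACING = 0.02   # m, distance between the two bags (assumed)

def bend_angle(n_segments, hot_shrink=0.10, cold_grow=0.10):
    """Arc angle (radians) when the heated side's segments shorten and
    the cool side's lengthen by the given fractions (assumed values)."""
    l_hot = n_segments * SEGMENT_LEN * (1 - hot_shrink)
    l_cold = n_segments * SEGMENT_LEN * (1 + cold_grow)
    # Two concentric arcs: angle = (outer length - inner length) / spacing
    return (l_cold - l_hot) / SIDE_SPACING

print(f"5 heated segments bend ~{math.degrees(bend_angle(5)):.0f} degrees")
```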

Ironically, the vine bot’s attraction to warmth could be utilized to quell the heat source itself. Xiao and his team envision future iterations combined with hoses to funnel water or inert gases into dangerous situations such as wildfires, to help put out flames at minimal risk to humans. The device’s cheapness doesn’t hurt, either—the team estimates deployment could cost as little as $1 for every three meters of the uncoiling robot. Next up for the design is making it speedier, as well as customizing the internal liquid to change the temperature at which the robot begins to react.

The post This heat-seeking robot looks and moves like a vine appeared first on Popular Science.


]]>
This Korean robodog proves running on sand isn’t just for ‘Baywatch’ https://www.popsci.com/technology/dog-robot-multiterrain/ Fri, 27 Jan 2023 22:00:00 +0000 https://www.popsci.com/?p=508217
Four legged robot running alongside man on sandy beach
See RaiBo run. Run, RaiBo, run. KAIST Robotics & Artificial Intelligence Lab

Quadrupedal robots are usually confined to a single terrain. RaiBo is changing that.

The post This Korean robodog proves running on sand isn’t just for ‘Baywatch’ appeared first on Popular Science.

]]>
Four legged robot running alongside man on sandy beach
See RaiBo run. Run, RaiBo, run. KAIST Robotics & Artificial Intelligence Lab

Legged robots have become definitively more agile in recent years, but often still remain limited by tricky terrain. One bot optimized for flat, hard surfaces might not fare as well as another designed more specifically for dynamic areas like muddy fields or sandy beaches—or ceilings, for that matter. A new quadrupedal robot control technology developed by researchers at the Korea Advanced Institute of Science and Technology (KAIST) appears to be breaking down those literal and physical barriers, thanks to help from AI reinforcement learning.

In reinforcement learning, a huge variety of simulations are generated to approximate physical trials, shortening the training time needed for an AI to optimize itself toward its intended goals. A team led by Professor Jemin Hwangbo of KAIST’s Department of Mechanical Engineering created a new artificial neural network capable of making real-time terrain assessments without any prior information, then feeding that knowledge back to their four-legged robot, RaiBo.

In this case, however, the KAIST researchers also defined a new contact model, based on how the robot’s physical pressure interacts with the ground reactions of various mediums, to simulate deforming terrain such as sand. All of this information was then fed into RaiBo’s AI to produce some truly impressive results.
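
The essential pattern, stripped down to a toy example, is domain randomization: draw new ground properties every simulated episode, so only a controller that works across all of them survives training. The snippet below substitutes a stand-in “episode” function and a simple perturb-and-keep search for the real physics simulator and deep reinforcement learning pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def rollout(gains, terrain_stiffness):
    """Stand-in for a simulated episode: reward controller gains that
    suit the sampled terrain. Purely illustrative; the real work runs
    full rigid-body simulation with a learned contact model."""
    target = np.array([1.0, 2.0]) * terrain_stiffness
    return -np.sum((gains - target) ** 2)

# Domain randomization: every episode draws new ground properties, so
# the surviving gains must cope with sand, grass, and track alike.
gains = np.zeros(2)
for episode in range(3000):
    stiffness = rng.uniform(0.2, 1.0)            # randomized contact model
    candidate = gains + rng.normal(0, 0.05, 2)   # perturb-and-keep search
    if rollout(candidate, stiffness) > rollout(gains, stiffness):
        gains = candidate

print("learned gains:", np.round(gains, 2))
```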

[Related: The newest robot dog can scale walls and ceilings.]

According to the team’s study, published earlier this month in the journal Science Robotics, their dog-bot demonstrated its ability to jog across a beach at roughly 3 meters per second with its feet fully submerged in sand. RaiBo could also run across grassy fields and a running track without any additional programming or control algorithm tweaking.

“It has been shown that providing a learning-based controller with a close contact experience with real deforming ground is essential for application to deforming terrain,” Suyoung Choi, the paper’s first author, said in a statement.

Because the new proposed controller can be used without any prior information on a terrain, it can easily be applied to future AI walking research, such as how to get a robot to gracefully move atop an air mattress, something RaiBo also reportedly could accomplish. 

The post This Korean robodog proves running on sand isn’t just for ‘Baywatch’ appeared first on Popular Science.


]]>
A new artificial skin could be more sensitive than the real thing https://www.popsci.com/technology/artificial-skin-iontronic/ Fri, 27 Jan 2023 15:00:00 +0000 https://www.popsci.com/?p=508099
two hands
Could artificial skin be the next frontier in electronics?. Shoeib Abolhassani / Unsplash

It can detect direct pressure as well as objects that are hovering close by.

The post A new artificial skin could be more sensitive than the real thing appeared first on Popular Science.

]]>
two hands
Could artificial skin be the next frontier in electronics?. Shoeib Abolhassani / Unsplash

Human skin is the body’s largest organ. It also provides one of our most important senses: touch. Touch enables people to interact with and perceive objects in the external world. In building robots and virtual environments, though, touch has not been the easiest feature to translate compared to, say, vision. Many labs are nonetheless trying to make touch happen, and various versions of artificial skin show promise in making electronics (like the ones powering prosthetics) smarter and more sensitive.

A study out this week in the journal small presents a new type of artificial skin created by a team at Nanyang Technological University in Singapore that can not only sense direct pressure being applied on it, but also when objects are getting close to it. 

[Related: One of Facebook’s first moves as Meta: Teaching robots to touch and feel]

Various artificial skin mockups have already been able to pick up on factors like temperature, humidity, surface details, and force, and turn those into digital signals. In this case, the artificial skin is “iontronic,” which means that it integrates ions and electrodes to enable sensing. 

Specifically, it’s made up of a porous, spongy layer soaked with salty liquid sandwiched between two fabric electrode layers embedded with nickel. These raw components are low-cost and easily scalable, which the researchers claim makes this type of technology suitable for mass production. The result is a material that is bendy, soft, and conductive. The internal chemistry of the structure is such that pressure applied to the material induces a change in capacitance, producing an electric signal. 

“We created artificial skin with sensing capabilities superior to human skin. Unlike human skin that senses most information from touching actions, this artificial skin also obtains rich cognitive information encoded in touchless or approaching operations,” corresponding author Yifan Wang, an assistant professor at Nanyang Technological University in Singapore, said in a press release. “The work could lead to next-generation robotic perception technologies superior to existing tactile sensors.”

The design of the device also creates a “fringing electric field” around the edge of the skin. This electric field can sense when objects get close, and can even discern the material an object is made of–for example, distinguishing between plastic, metal, and human skin in a small proof-of-concept demo. 
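
In practice, a readout circuit watches each skin “pixel” for capacitance changes on two very different scales: a large rise when pressure squeezes the spongy dielectric, and a faint shift when a nearby object disturbs the fringing field. The toy classifier below, with entirely invented thresholds, makes that distinction concrete; the real device extracts richer features, including cues about material.

```python
BASELINE_PF = 100.0   # resting capacitance in picofarads (assumed)

def classify(reading_pf):
    """Crude touch-vs-proximity logic with made-up thresholds."""
    delta = reading_pf - BASELINE_PF
    if delta > 5.0:        # strong rise: the dielectric is compressed
        return "touch"
    if abs(delta) > 0.3:   # faint fringing-field disturbance
        return "hover / nearby object"
    return "idle"

for c in (100.1, 101.2, 112.0):
    print(f"{c:.1f} pF -> {classify(c)}")
```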

As for use cases, the artificial skin can be put onto robot fingers or on a control interface for an electronic game that uses the touch of the finger to move the characters. In their experiment, users played the game Pac-Man, and navigated through electronic maps by interacting with a panel of the artificial skin. 

The post A new artificial skin could be more sensitive than the real thing appeared first on Popular Science.


]]>
A squishy new robot uses syringes and physics to mosey along https://www.popsci.com/technology/soft-robot-syringe-pump/ Tue, 24 Jan 2023 23:00:00 +0000 https://www.popsci.com/?p=507611
cornell soft robot
Fluids help this robot move. Cornell University

A new invention from engineers at Cornell moves by pumping fluids.

The post A squishy new robot uses syringes and physics to mosey along appeared first on Popular Science.

]]>
cornell soft robot
Fluids help this robot move. Cornell University

When we think of robots, we typically think of clunky gears, mechanical parts, and jerky movements. But a new generation of robots has sought to break that mold. 

Since Czech playwright Karel Čapek first coined the term “robot” in 1920, these machines have evolved into many forms and sizes. Robots can now be hard, soft, large, microscopic, disembodied, or human-like, with joints controlled by a range of unconventional motors like magnetic fields, air, or light.

A new six-legged soft robot from a team of engineers at Cornell University has put its own spin on motion, using fluid-powered motors to achieve complex movements. The result: a free-standing, bug-like contraption carrying a backpack with a battery-powered Arbotix-M controller and two syringe pumps on top. The syringes pump fluid in and out of the robot’s limbs as it ambles along a surface at a rate of 0.05 body lengths per second. The design of the robot was described in detail in a paper published last week in the journal Advanced Intelligent Systems. 

Robots photo
Cornell University

The robot was born out of Cornell’s Collective Embodied Intelligence Lab, which is exploring ways that robots can think and collect information about the environment with other parts of their body outside of a central “brain,” kind of like an octopus. In doing this, the robot would rely on its version of reflexes, instead of on heavy computation, to calculate what to do next. 

[Related: This magnetic robot arm was inspired by octopus tentacles]

To build the robot, the team created six hollowed-out silicone legs. Inside the legs are fluid-filled bellows (picture the inside of an accordion) and interconnecting tubes arranged into a closed system. The tubes alter the viscosity of the fluid flowing in the system, contorting the shape of the legs; the geometry of the bellows structure allows fluid from the syringe to move in and out in specific ways that adjust the position and pressure inside each leg, making them extend stiffly or deflate into their resting state. Coordinating different, alternating combinations of pressure and position creates a cycled program that makes the legs, and the robot, move.  
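
A cycled program of that sort can be sketched as a simple alternating-tripod schedule, a classic six-legged walking pattern. The phase names and pairings below are our illustration, not the Cornell team’s actual control sequence:

```python
import itertools

# Alternating tripods: legs 0, 2, 4 move together, then legs 1, 3, 5.
TRIPOD_A, TRIPOD_B = (0, 2, 4), (1, 3, 5)
CYCLE = [
    ("A lifts and swings forward", TRIPOD_A, "deflate"),
    ("A plants and stiffens",      TRIPOD_A, "pressurize"),
    ("B lifts and swings forward", TRIPOD_B, "deflate"),
    ("B plants and stiffens",      TRIPOD_B, "pressurize"),
]

# Repeating the cycle advances the body one step per half-cycle.
for step, (phase, legs, syringe) in zip(range(8), itertools.cycle(CYCLE)):
    print(f"step {step}: {phase:27s} legs {legs} -> {syringe}")
```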

According to a press release, Yoav Matia, a postdoctoral researcher at Cornell and an author on the study, “developed a full descriptive model that could predict the actuator’s possible motions and anticipate how different input pressures, geometries, and tube and bellow configurations achieve them–all with a single fluid input.”

Because of the flexibility of these rubber joints, the robot is also able to switch its gait, or walking style, depending on the landscape or nature of the obstacles it’s traversing. The researchers say that the technology behind these fluid-based motors and nimble limbs can be applied to a range of other applications, such as 3D-printed machines and robot arms.

The post A squishy new robot uses syringes and physics to mosey along appeared first on Popular Science.


]]>
Acrobatic beetle bots could inspire the latest ‘leap’ in agriculture https://www.popsci.com/technology/click-beetle-robot-actuator/ Mon, 23 Jan 2023 20:00:00 +0000 https://www.popsci.com/?p=507090
Graphic of click beetle and robotic actuators
New actuators in robots can mimic click beetles' leaping muscles. Michael Vincent/University of Illinois

A swarm of bug robots could one day soon bounce between farm crops to examine plant health.

The post Acrobatic beetle bots could inspire the latest ‘leap’ in agriculture appeared first on Popular Science.

]]>
Graphic of click beetle and robotic actuators
New actuators in robots can mimic click beetles' leaping muscles. Michael Vincent/University of Illinois

Animal-inspired robots are all the rage now, with recent creations drawing abilities from birds, snakes, octopuses, and even insects. The buggy creature kingdom just offered its newest inspiration, one that could deliver huge benefits at a very small scale.

A group of mechanical engineering researchers across multiple universities have spent the last decade delving into click beetles’ evolution, anatomy, and movements. In recent years, the team focused on how a muscle within the tiny insect’s thorax enables it to not only travel many times its body length, but also right itself if turned over on its back. The propulsion, known as snap buckling, is seen as a natural feature that could be adapted into the field of robotics.

[Related: Watch this bird-like robot make a graceful landing on its perch.]

As detailed in a new paper published in Proceedings of the National Academy of Sciences, a team led by Sameh Tawfick designed a series of coiled actuators that mimic click beetles’ anatomy. When pulled, the beam-shaped device buckles and stores elastic energy like the insect’s thorax muscle. Once the actuator is released, the resultant amplified boost propels the tiny robot upward, allowing it to maneuver over obstacles at roughly the same speed as the real bug. The movement, known as dynamic buckling cascading, could be used by future robots to traverse and examine the innards of large systems like jet turbines using small, on-board cameras.
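
The appeal of snap buckling is the energy math: elastic energy stored slowly in the buckled beam is released all at once, and ignoring drag and losses, E = mgh sets the launch height. A quick worked example with invented but plausible numbers:

```python
# Ideal launch height from stored elastic energy, via E = m * g * h.
MASS_KG = 0.01    # a 10-gram robot (assumed)
ENERGY_J = 0.02   # elastic energy released at snap-through (assumed)
G = 9.81          # m/s^2

height_m = ENERGY_J / (MASS_KG * G)
print(f"ideal launch height: {height_m*100:.0f} cm, "
      f"or {height_m/0.05:.0f} body lengths at an assumed 5 cm body")
```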

Tawfick explained in a statement that the team experimented with four robotic actuator variations to determine which were the most economical and effective based on biological data and mathematical modeling. In the end, two designs successfully propelled the robots without any need for manual intervention.

[Related: This robot gets its super smelling power from locust antennae.]

“Moving forward, we do not have a set approach on the exact design of the next generation of these robots, but this study plants a seed in the evolution of this technology,” said Tawfick, explaining that the entire trial-and-error process is similar to biological evolution.

Scientists also believe that future, insect-sized robots using the dynamic buckling cascade actuators could be deployed in agricultural settings like large farms. Technology such as drones and rovers often monitors fields, but these minuscule devices could open up entirely new, more delicate methods of observation and recording.

The post Acrobatic beetle bots could inspire the latest ‘leap’ in agriculture appeared first on Popular Science.


]]>
Boston Dynamics’s bipedal robots can throw heavy objects now https://www.popsci.com/technology/boston-dynamics-atlas-robot-hand-demo/ Thu, 19 Jan 2023 19:30:00 +0000 https://www.popsci.com/?p=506507
Boston Dynamics Atlas bipedal robot tossing toolbag up to human on construction scaffolding in warehouse
It's impressive, to say the least. YouTube

Atlas is one step closer to being a real world Transformer.

The post Boston Dynamics’s bipedal robots can throw heavy objects now appeared first on Popular Science.

]]>
Boston Dynamics Atlas bipedal robot tossing toolbag up to human on construction scaffolding in warehouse
It's impressive, to say the least. YouTube

Boston Dynamics’ Atlas prototype puts pretty much every other bipedal robot out there (but especially this one) to shame. Although not currently available for purchase, the company’s two-legged research and development platform has consistently impressed the internet for years with ever-improving feats of mobility, coordination, and even TikTok-esque choreography. The company’s latest showcase, however, might be its most jaw-dropping demo yet—until Boston Dynamics releases its next video clip, of course.

The brief sequence required a bit of staging, but Atlas’ new “hands” quickly showcase why the upgrade is a major advancement for the robot. As TechCrunch notes, both its claw-like appendages consist of one fixed and one moving finger designed for “heavy lifting tasks,” something quite on display during the minute-long demonstration. During that time, Atlas manages to pick up a 2×8 wood beam and place it as a makeshift bridge between two blocks, grab a toolbag, ascend stairs, and traverse gaps. It then tosses its toolkit up to a human above it before concluding with an “inverted 540-degree, multi-axis flip,” which Boston Dynamics explains “adds asymmetry to the robot’s movement, making it a much more difficult skill than previously performed parkour.”

[Related: Tesla’s Optimus humanoid robot can shuffle across stage, ‘raise the roof’.]

Perhaps anticipating skepticism of the authenticity of Atlas’ new moves, Boston Dynamics also released a far more in-depth, behind-the-scenes look at all the work that went into designing and pulling off its newest abilities.

Part of Atlas’ computational strength lies in its camera system, which relies on both a visual camera and a depth camera that measures photons’ time of flight to estimate distances. The robot also utilizes something called model predictive control, which developers liken to the human body’s ability to anticipate what it needs to do for impending tasks—such as the heart speeding up slightly ahead of standing up from a seated position.
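
Model predictive control is easier to see on a toy system than on a humanoid. The sketch below, our illustration and nothing like Atlas’ real controller, steers a one-dimensional cart to a target by repeatedly optimizing a short predicted trajectory and applying only the first planned push, the receding-horizon idea at the heart of MPC.

```python
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.1, 10   # 0.1 s steps, 1 s lookahead

def predict(x, v, controls):
    """Roll a double integrator (position, velocity) forward in time."""
    positions = []
    for a in controls:
        v = v + a * DT
        x = x + v * DT
        positions.append(x)
    return np.array(positions)

def mpc_step(x, v, target):
    """Optimize the whole control sequence over the horizon, but apply
    only its first element before re-planning (receding horizon)."""
    def cost(u):
        traj = predict(x, v, u)
        return np.sum((traj - target) ** 2) + 0.01 * np.sum(u ** 2)
    return minimize(cost, np.zeros(HORIZON)).x[0]

x, v, target = 0.0, 0.0, 1.0
for _ in range(30):               # 3 seconds of closed-loop control
    a = mpc_step(x, v, target)
    v += a * DT
    x += v * DT
print(f"position after 3 s: {x:.3f} (target {target})")
```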

[Related: Boston Dynamics starts a legal dog fight with competitor Ghost.]

“The robot is doing exactly the same thing,” explains one Boston Dynamics developer. “It’s thinking, ‘How hard do I need to push with my right foot so that, one second from now, I’m not falling over?’”

Fair enough, Atlas. Just give everyone a heads up before you start being able to lift and toss people around like that toolbag.

The post Boston Dynamics’s bipedal robots can throw heavy objects now appeared first on Popular Science.


]]>
It’s not a UFO—this drone is scooping animal DNA from the tops of trees https://www.popsci.com/technology/e-dna-drone-tree-top/ Wed, 18 Jan 2023 22:22:15 +0000 https://www.popsci.com/?p=506207
drone on branch
An eDNA sampling drone perched on a branch. ETH Zurich

This flying robot can help ecologists understand life in forest canopies.

The post It’s not a UFO—this drone is scooping animal DNA from the tops of trees appeared first on Popular Science.

]]>
drone on branch
An eDNA sampling drone perched on a branch. ETH Zurich

If an animal passes through the forest and no one sees it, does it leave a mark? A century ago, there would have been no way to pick up whatever clues were left behind. But with advancements in DNA technology, particularly environmental DNA (eDNA) detecting instruments, scientists can glean what wildlife visited an area based on genetic material in poop, as well as microscopic skin and hair cells that critters shed and leave behind. For ecologists seeking to measure an ecosystem’s biodiversity as non-invasively as possible, eDNA can be a treasure trove of insight. It can capture the presence of multiple species in just one sample. 

But collecting eDNA is no easy task. Forests are large open spaces that aren’t often easily accessible (canopies, for example, are hard to reach), and eDNA could be lurking anywhere. One way to break up this big problem is to focus on a particular surface in the forest to sample eDNA from, and use a small robot to go where humans can’t. That’s the chief strategy of a team of researchers from ETH Zurich, the Swiss Federal Institute for Forest, Snow and Landscape Research WSL, and company SPYGEN. A paper on their approach was published this week in the journal Science Robotics

In aquatic environments, eDNA-gathering robots sip and swim to do their jobs. But to reach the treetops, not only do researchers have to employ flying drones (which are tougher to orient and harder to protect), these drones also need to be able to perch on a variety of surfaces. 

[Related: These seawater-sipping robots use drifting genes to make ocean guest logs]

The design the Swiss team came up with looks much like a levitating basket, or maybe a miniature flying saucer. They named this 2.6-pound contraption eDrone. It has a cage-like structure made up of four arcs that extend out below the ring mainframe, which measures around 17 inches in diameter. The ring and cage-like body protect it and its four propellers from obstacles, kind of like the ring around a bumper car. 

To maneuver, the eDrone uses a camera and what the paper calls a “haptic-based landing strategy,” perceiving the position and magnitude of forces applied to its body in order to map out the appropriate course of action. To help it grip, there are also features like non-slip material and carbon cantilevers on the bottom of each arc. 
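
A haptic landing rule can be surprisingly simple in outline: where the contact force acts, and how hard, determines the next move. The toy decision function below, with thresholds and a sensing interface we invented for illustration, shows the flavor rather than the paper’s actual controller.

```python
def landing_action(force_newtons, contact_offset_m):
    """force_newtons: net contact force magnitude on the cage;
    contact_offset_m: contact point's lateral distance from the
    drone's center axis. All thresholds are assumptions."""
    if force_newtons < 0.5:
        return "keep descending"          # barely touching the branch yet
    if abs(contact_offset_m) > 0.08:      # outside the stable footprint
        return "nudge toward branch center"
    return "cut thrust and grip"          # firm, centered contact

for f, d in [(0.2, 0.0), (1.5, 0.12), (2.0, 0.03)]:
    print(f"F={f} N, offset={d} m -> {landing_action(f, d)}")
```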

Once it firmly touches down, the drone uses a sticky material on each arc to peel off an eDNA sample from the tree branch and stow it away for later analysis. In a small proof-of-concept run, the eDrone was able to successfully obtain eDNA samples from seven trees across three different families. That variety matters because different tree species have their own branch morphologies (some are cylindrical and others have more irregular branches jutting out), and different trees host different animals and insects. 

“The physical interaction strategy is derived from a numerical model and experimentally validated with landings on mock and real branches,” the researchers wrote in the paper.  “During the outdoor landings, eDNA was successfully collected from the bark of seven different trees, enabling the identification of 21 taxa, including insects, mammals, and birds.”

Although the robot did its intended job well in these small trials, the researchers noted that more extensive studies are needed into how its performance may be affected by tree species beyond the ones they tested, or by changing environmental conditions like wind or overcast skies. Moreover, they propose that eDNA gathering by robot can be an additional way to sample eDNA in forests alongside other methods, like analyzing eDNA from pooled rainwater. 

“By allowing these robots to dwell in the environment, this biomonitoring paradigm would provide information on global biodiversity and potentially automate our ability to measure, understand, and predict how the biosphere responds to human activity and environmental changes,” the team wrote. 

Watch the drone in action below: 

The post It’s not a UFO—this drone is scooping animal DNA from the tops of trees appeared first on Popular Science.


]]>
This robot gets its super smelling power from locust antennae https://www.popsci.com/technology/smell-robot-desert-locust/ Wed, 18 Jan 2023 15:00:00 +0000 https://www.popsci.com/?p=506070
Scientist holding syringe next to wheeled robot with biological sensor

The new system is 10,000 times more sensitive than existing odor detecting programs.

The post This robot gets its super smelling power from locust antennae appeared first on Popular Science.

]]>
Scientist holding syringe next to wheeled robot with biological sensor

Although human snouts aren’t quite as weak as they’ve been made out to be, they still pale in comparison to a lot of our world’s fellow inhabitants. After all, you don’t see specially trained (human) police officers sniffing baggage at the airport. Even something as tiny as the mosquito, for instance, can detect a 0.01 percent difference in its surrounding environment’s CO2 levels. That said, you’ll never see a mosquito construct a robot to help pick up our species’ olfactory slack, which is exactly what one research team at Tel Aviv University recently accomplished.

The group’s findings, published in the journal Biosensors and Bioelectronics, showcase how the team connected a biological sensor—in this case, a desert locust’s antenna—to an electronic array before using a machine learning algorithm to hone the computer’s scent detection abilities. The result was a new system that is 10,000 times more sensitive than the existing, commonly used electronic devices currently available. This is largely thanks to the locust’s powerful sense of smell.

[Related: This surgical smart knife can detect endometrial cancer cells in seconds.]

Generally speaking, sensory organs such as animals’ eyes and noses use internal receptors to identify external stimuli, which they then translate into electrical signals that can be processed by their brains. Scientists measured the electrical activity induced within the desert locust’s antennae by various odors, then fed those readings into a machine learning program that created a “library of smells,” according to one researcher. The archive initially included eight separate entries, including marzipan, geranium, and lemon, but reportedly went on to differentiate between varieties of Scotch whisky—probably a pretty nice bonus for the desert locust.
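
The Tel Aviv group hasn't published its pipeline in this level of detail, but the underlying pattern is standard: turn each antenna recording into a feature vector, then train an off-the-shelf classifier to map vectors to odor labels. A minimal sketch of that idea, in which the features, the synthetic data, and the choice of a random forest are all illustrative stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
SMELLS = ["marzipan", "geranium", "lemon"]

def fake_recording(label_idx):
    """Stand-in features summarizing an antenna's voltage trace for one odor."""
    base = np.array([0.2, 1.0, 0.5]) * (label_idx + 1)   # mean, peak, decay
    return base + rng.normal(scale=0.1, size=3)

X = np.array([fake_recording(i % 3) for i in range(300)])
y = np.array([SMELLS[i % 3] for i in range(300)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```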

The ability to take such delicate readings could soon offer major improvements in the detection of everything from illicit substances, to explosives, to even certain kinds of diseases and cancers. The researchers also stressed that the new biosensor capabilities aren’t limited to smell—with additional work and testing, the same idea could be applied to touch or even certain animals’ abilities to sense impending natural disasters such as earthquakes. The team also explained that they hope to soon develop the means for their robot to navigate on its own, homing in on an odor’s source before identifying it.

The post This robot gets its super smelling power from locust antennae appeared first on Popular Science.

The US Navy used solar-powered Saildrones to scout in the Persian Gulf https://www.popsci.com/technology/us-navy-saildrones-scouts-destroyer/ Thu, 12 Jan 2023 21:00:50 +0000 https://www.popsci.com/?p=505193
A Saildrone and the  destroyer USS Delbert D. Black on January 8.
A Saildrone and the destroyer USS Delbert D. Black on January 8. US Navy / Jeremy R. Boan

They just need wind and sun to get things done.

From January 6 through 9, in the Persian Gulf, the US Navy conducted an exercise in which two Saildrone robotic boats communicated with the USS Delbert D. Black, a destroyer. The exercise used robots, AI, and a crewed ship to scout the environment around them, a practical peacetime use of the robots that could inform how these tools are used in war.

“During the exercise, unmanned and artificial intelligence systems operated in conjunction with Delbert D. Black and CTF [Coalition Task Force] Sentinel’s command center ashore in Bahrain. The systems were able to help locate and identify objects in nearby waters and relay visual depictions to watchstanders,” the US Navy said in a release. 

This isn’t the first time the Navy has used Saildrones in these waters. In August 2022, a ship from the US Coast Guard and a ship from the Royal Bahrain Naval Force worked alongside a Saildrone, integrating the robot’s sensors into the mission. And in September 2022, while Navy Saildrones were operating in the Persian Gulf, Iran’s Navy temporarily seized and held the robots before returning them to the US Navy, a return facilitated by the USS Delbert D. Black. 

Robots at sea can see

So what kind of information or images might the robots capture that’s so valuable to the Navy? A pair of images released by the service branch offer some detail. In one, Lieutenant Richard Rodriguez, aboard the Delbert D. Black, watches images sent from the sea-going drone to a monitor. The Saildrone’s information is viewed through a Mission Portal dashboard displayed in Chrome. The robot’s camera tracks the horizon at an angle, and against it are three marker rectangles, showing possible ship sightings.

As the Navy’s caption describes it, the visuals were transmitted from a Saildrone to a room on the destroyer where a crew member could watch it. In this way, the drones help the crew keep watch.

Another image captures the information as displayed inside the group’s Manama, Bahrain headquarters. At the center of this display is a map, where the layout of the observed gulf is plotted and abstracted. Solid shapes indicate vessels, lines track the Saildrones’ path through time, and plotted polygons denote other phenomena, perhaps rules of egress or avoidance.

A shot from the headquarters in Manama, Bahrain.
A shot from the headquarters in Manama, Bahrain. US Navy / Jacob Vernier

The Malaysian-flagged cargo vessel MSC Makalu III is selected in the shot. The Makalu III was tracked for 23.6 nautical miles over two hours and 38 minutes by two Saildrones. Two images below the name of the Makalu III on the dashboard, presumably from the Saildrone’s cameras, show the distant position of the ship against the watery horizon, and a zoomed-in view that clearly shows the dark mass of a far-away vessel on the surface.
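
Those figures also allow a quick sanity check on the freighter's pace. The distance and duration come from the Navy's dashboard; the arithmetic below is ours:

```python
distance_nm = 23.6        # nautical miles covered while tracked
duration_h = 2 + 38 / 60  # 2 hours, 38 minutes
print(f"average speed: {distance_nm / duration_h:.1f} knots")  # ~9.0 knots
```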

Again, the Saildrone was being used as an observer, a robot on watch duty.

In some sense, this information isn’t exactly novel. The Makalu III is trackable publicly. What is more remarkable is that the Saildrones are able to not just spot vessels, but follow them. The Persian Gulf is a high-traffic waterway, and while many navigational technologies make it easier to track and follow ships as they transit to and from the gulf, the ability to put new sensors into the water enhances what can be known.

The screen displayed in the Manama headquarters shows not just Saildrone activity at the moment, but over time. One of the driving goals behind the Navy’s adoption of uncrewed ships is to enhance how much ocean traffic it can observe over time, and in this case, with two wind-driven robots, the ability of a ship to passively observe its surroundings appears greatly enhanced.

Watching, waiting

Saildrones are small boats, just 23 feet long and rising 16 feet above the surface. With a sail to catch the wind and solar panels to power its electronic systems, and charge its batteries, a Saildrone exists as a tool for passively monitoring the sea. 

These vessels have been used by scientific organizations for civilian purposes. NASA and NOAA, respectively, used Saildrones to fix gaps in satellite maps and monitor fish populations. While the Navy’s recent exercise with Saildrones was brief, the solar power and long endurance of the robots make them ideal for longer-term monitoring, as they sip power from the sun.

The Pentagon formally divides the places combat can take place into domains, and while “sea” is smaller than the vastness of “space,” it is far more peopled. The Navy is tasked simultaneously with ensuring the free flow of law-abiding traffic across the oceans, and with being ready to fight any force that threatens open navigation of the oceans. Knowing where and when to fight, or at least move ships into a show of force, can be aided by keeping an eye on ocean traffic.

Saildrones are a way to make the ocean more known, through the watchful and unblinking eyes of wind-propelled and solar-powered robots.

The post The US Navy used solar-powered Saildrones to scout in the Persian Gulf appeared first on Popular Science.

Meet Golfi, the robot that plays putt-putt https://www.popsci.com/technology/robot-golf-neural-network-machine-learning/ Tue, 03 Jan 2023 21:00:00 +0000 https://www.popsci.com/?p=502766
Robot putting golf ball across indoor field into hole
But can Golfi sink a putt through one of those windmill obstacles? YouTube

The tiny bot can scoot on a green and hit a golf ball with impressive accuracy.

The first robot to sink an impressive hole-in-one pulled off its fairway feat back in 2016. But the newest automated golfer looks like it’s coming for the short game.

First presented at the IEEE International Conference on Robotic Computing last month and subsequently highlighted by New Scientist on Tuesday, “Golfi” is the modest-sized creation from a research team at Germany’s Paderborn University capable of autonomously locating a ball on a green, traveling to it, and successfully sinking a putt around 60 percent of the time.

To pull off its relatively accurate par, Golfi utilizes an overhead 3D camera to scan an indoor, two-square-meter artificial putting green to find its desired golf ball target. It can then scoot over to the ball and use a neural network algorithm to quickly analyze approximately 3,000 potential golf swings from random points while accounting for physics variables like mass, speed, and ground friction. From there, its arm offers a modest putt that sinks the ball roughly 6 or 7 times out of 10. Although not quite as good as standard human players, it’s still a sizable feat for the machine.
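
Golfi's planner follows a familiar pattern in robotics: sample many candidate actions, score each with a learned model, and execute the best one. Here is a minimal sketch of that loop, with a made-up scoring function standing in for the robot's trained neural network:

```python
import random

def putt_success_probability(speed, angle, friction=0.3):
    """Stand-in for Golfi's learned model of ball mass, speed, and friction."""
    ideal_speed = 1.8 + 2.5 * friction    # invented physics, for illustration
    return max(0.0, 1.0 - abs(speed - ideal_speed) - 2.0 * abs(angle))

def plan_putt(n_candidates=3000):
    """Score random candidate swings and keep the most promising one."""
    best = None
    for _ in range(n_candidates):
        speed = random.uniform(0.5, 4.0)    # putter head speed, m/s
        angle = random.uniform(-0.2, 0.2)   # radians off the hole line
        score = putt_success_probability(speed, angle)
        if best is None or score > best[0]:
            best = (score, speed, angle)
    return best

score, speed, angle = plan_putt()
print(f"best of 3,000 candidates: {speed:.2f} m/s at {angle:+.3f} rad")
```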

[Related: Reverse-engineered hummingbird wings could inspire new drone designs.]

However, Golfi isn’t going to show up at minigolf parks anytime soon. The robot’s creators at Paderborn University designed their prototype to solely work in a small indoor area while connected to a wired power source. Golfi’s necessary overhead 3D camera mount also ensures it won’t make an outdoor tee time, either. That’s because, despite its name, Golfi isn’t actually designed to revolutionize the golf game. Instead, the little robot was built to showcase the benefits of combining physics-based models with machine learning programs.

It’s interesting to see Golfi’s talent in comparison to other recent robotic advancements, which have often drawn from inspirations within the animal kingdom—from hummingbirds, to spiders, to dogs that just so happen to also climb up walls and across ceilings.

The post Meet Golfi, the robot that plays putt-putt appeared first on Popular Science.

Australia’s stealthy military drone sub will be called Ghost Shark https://www.popsci.com/technology/australia-ghost-shark-underwater-robot/ Fri, 30 Dec 2022 15:00:00 +0000 https://www.popsci.com/?p=501767
The system previously known as XL-AUV is now called Ghost Shark.
The system previously known as XL-AUV is now called Ghost Shark. Australia DOD / Dan Gosse Images

The undersea robot has a fittingly fierce name.

On December 12, Australia announced the name of its latest robotic submarine: the Ghost Shark. This vessel, which is being developed by both Anduril and Australia’s Navy and Defence Science and Technology Group, is designed as a large, underwater, autonomous machine, guided by artificial intelligence. The Ghost Shark will be a stealth robot, built for future wars at sea.

In picking the name, the Royal Australian Navy chose a moniker that both conveys stealth and pays tribute to the wildlife of the continent, or in this case, just off the coast of the continent.

“Ghost Shark’s name comes about from actually an indigenous shark that’s found on our southern waters, indeed it’s found in deeper waters, so it’s quite stealthy, which is a good corollary to the stealthy extra large autonomous vehicle. It also keeps that linkage to the Ghost Bat, the MQ-28 program for the Air Force, which is also another quite stealthy autonomous system,” said Commodore Darron Kavanagh of the Royal Australian Navy. (Ghost sharks, the animals, are often consumed as part of fish and chips.)

The Ghost Bat drone fighter, or MQ-28 he referenced, is another recent initiative by Australia to augment crewed forces with robotic allies. While a jet is bound by the finite number of hours it can stay airborne, a robotic submarine, freed of crew, can endure under the sea for a long time.

“They have the capacity to remain at sea undetected for very long periods, carry various military payloads and cover very long distances,” Rear Admiral Peter Quinn said in a release. “The vessels will provide militaries with a persistent option for the delivery of underwater effects in high-risk environments, complementing our existing crewed ships and submarines, as well as other future uncrewed surface vessels.”

Pause for effect

“Effects” is a broad term that refers to all the ways a vehicle, tool, or weapon can make battle easier for one side and harder for its enemies. “Kinetic effects,” for example, are the missiles, torpedoes, and bullets that immediately come to mind when people think of war. But effects can include other tools, like electromagnetic jamming, or a smoke grenade detonating and creating a dense cloud to hide the movement of soldiers.

Underwater, those effects could include direct attack, like with torpedoes, or sending misleading sonar signals, fooling enemy ships and submarines into targeting a robot instead of a more powerful crewed vessel.

In May, Anduril announced it was working on Extra Large Autonomous Undersea Vehicles (XL-AUVs) for the Royal Australian Navy, the program now known as Ghost Shark.

“It is modular, customizable and can be optimized with a variety of payloads for a wide range of military and non-military missions such as advanced intelligence, infrastructure inspection, surveillance, reconnaissance and targeting,” read the announcement.

In this instance, its job could include seeing enemy vessels and movements, as well as identifying targets for weapons fired from other vehicles. One of the most consistent promises from autonomous systems is that, by using sensors and fast onboard processing, these machines will be able to discover, discern, and track enemies faster than human operators of the sensor systems. If the role of the Ghost Shark is limited, at least initially, to targeting and not firing, it lets the robot submarines bypass the difficult questions and implications of a machine making a lethal decision on its own.

At the press conference this month, Quinn told the press that adversaries will have to assume that a Ghost Shark is not only watching their movements, but “is capable of deploying a wide range of effects — including lethal ones,” reports Breaking Defense. If the Ghost Shark is to be an armed robot, it will raise difficult questions about human control of lethal autonomous machines, especially given the added difficulty of real-time communication under water.

Uncrewed underwater

The Ghost Shark is just one of a growing array of large underwater drones in development by a host of nations. In the chart below, XL-AUV refers to the Ghost Shark by its original name.

Before the Ghost Shark can reach the extra-large size it’s intended to have, Anduril is developing the concept on an existing robot submarine it already makes, the smaller Dive-LD. At the naming announcement, a Dive-LD with “Ghost Shark” on the side was on display, highlighting how the program will flow from one into the other.

The Dive-LD is smaller than the XL-AUV (or Ghost Shark) will be, with its 5.8-meter length between 4 and 24 meters shorter than the final design. It is still a useful starting point for developing software and techniques and for testing payloads, all with the intent of scaling the robot up to the size needed for long-lasting and deep operations.

The company boasts that these submarines can operate for up to 10 days, with room to expand that endurance, and can reach depths of up to 6,000 meters below the surface.

Watch a video about the Ghost Shark, from the Australian Department of Defence, below:

https://www.youtube.com/watch?v=eSXwWvyrrPY

The post Australia’s stealthy military drone sub will be called Ghost Shark appeared first on Popular Science.

The best robot vacuums of 2023 https://www.popsci.com/story/reviews/best-robot-vacuum/ Mon, 01 Nov 2021 11:00:00 +0000 https://www.popsci.com/uncategorized/best-robot-vaccum/

Here’s what to look for when you’re shopping, plus a few of our favorite models.

Best smart: eufy BoostIQ RoboVac 30C MAX

Connect to the eufyHome app and Alexa or Google Assistant for streamlined cleaning: control schedules, receive notifications, and locate your vacuum.

Best vacuum and mop combo: Roborock S7 MaxV Ultra Robot Vacuum and Mop

This two-in-one pick uses artificial intelligence and 3D scanning to map out your home and provides strong suction and sonic scrubbing power.

Best self-emptying: iRobot Roomba s9+ Robot Vacuum

With powerful suction and a self-emptying function, you can go up to 60 days without emptying the canister. It’s never been easier to maintain a clean home without lifting a finger.

Nothing beats hands-free cleaning—and it truly doesn’t get any better than a robot vacuum. With the push of a button, the best robot vacuums can tackle the largest room in your house without wasting any time. They’re equipped with special features like a quick connection to handheld devices or the ability to remember the overall layout of each room in your home. Stop spending hours panic-vacuuming before guests come over or doing chores on your weekends. Enjoy more free time while these devices take care of the dirty work. All you need to operate a robot vacuum is an open outlet for its charging port and you’re ready to roll. Below are our favorite options and the things you will want to consider in your search for the best robot vacuum cleaner.

How we selected the best robot vacuums

We compared a range of over 50 robot vacuum models for price, brand, added features, mapping technology, reviews, and battery life. No two people’s cleaning needs are the same, which is why we provided a variety of options—from mop-only options from Samsung that can cut through stubborn grime to self-emptying picks for those who don’t want to lift a pinky. Many of the brands we selected have made a name for themselves in tech and vacuums, so we could be sure you’re choosing a robo-vac that will be both reliable and worth the investment.

Best robot vacuums: Reviews & Recommendations

Best smart: eufy BoostIQ RoboVac 30C MAX

Why it made the cut: With a large-capacity dust bin and powerful motor, the eufy is a great pick for just about any home.

Specs:

  • Surfaces: Hard floor, carpet
  • Bin size: 0.6 L
  • Run time: Maximum 100 minutes

Pros:

  • Strong suction power
  • Voice-control equipped
  • Boundary strips for personalized control

Cons:

  • Does not map floor plan

The brand eufy is a branch of Anker Innovations founded by Steven Yang in 2011, following his work with Google. The goal of eufy is to create products that make the “smart home simplified” with a focus on accessibility and convenience. The company’s devices are designed to easily connect with one another, creating cohesion and coherence in the home, from wireless security systems to robot vacuums and light bulbs. And the RoboVac from eufy is as “smart” as it gets when it comes to robot vacuums. It connects to Alexa and Google Assistant, as well as the specifically designed eufyHome app where you can set cleaning schedules, direct the clean with remote control, receive notifications, and locate your robot. You can easily program boundary strips that the RoboVac will identify using 10 built-in sensors as it uses the bounce method to clean.

Best vacuum and mop combo: Roborock S7 MaxV Ultra Robot Vacuum and Mop

Why it made the cut: This pick auto-detects the difference between carpet and hard floors to give you a complete and hassle-free clean.

Specs:

  • Surfaces: Hard floor, carpet, tile
  • Bin size: 300 mL water tank
  • Run time: Three hours

Pros:

  • AI detects obstacles
  • Long battery life
  • Self-cleaning

Cons:

  • Expensive

Our favorite robot vacuum-and-mop hybrid is the Roborock S7 MaxV Ultra Robot Vacuum and Mop. This state-of-the-art device is designed with LiDAR navigation, reactive artificial intelligence, and structured-light 3D scanning to help it map out your home and steer clear of obstacles like shoes and toys. The Robovac also provides a powerful suction of 5,100 Pa. The mopping function, meanwhile, incorporates sonic vibration technology, which allows it to scrub up to 3,000 times a minute. That’s a lot of scrubbing time, since it stays charged for up to three hours. Just tell the Robovac what you want with Alexa or Google Assistant. Oh, and it’s self-cleaning as well.

Best self-emptying: iRobot Roomba s9+ Robot Vacuum

Why it made the cut: The power of a robot vacuum, with a self-emptying feature to eliminate every step of this household task.

Specs:

  • Surfaces: Carpet
  • Bin size: Reported 60 days of dirt
  • Run time: Maximum 120 minutes

Pros:

  • Self-emptying design
  • Three-stage cleaning for a more thorough vacuum
  • Smart mapping

Cons:

  • Some software issues in-app

The Roomba S9+ is iRobot’s most powerful vacuum to date and, boy, does it pack a punch. The iRobot company was founded in 1990 by three MIT roboticists—Colin Angle, Helen Greiner, and Rodney Brooks—with the vision of making practical robots a reality. Their first robot vacuum was released in 2002 and they have been consistently adding to and improving this design ever since. This vacuum self-evacuates after each clean at its docking station, which is equipped with a dirt disposal container that can hold up to 60 days of dust and debris. That means you can vacuum every day for almost two months without being bothered by multiple trips to the trash can.

Best with mapping technology: Neato Robotics Botvac D8 Connected

Why it made the cut: Easily designate which areas you want to be cleaned with the virtual No-Go lines and high-tech features on this Neato Robotics pick.

Specs:

  • Surfaces: Hard floor
  • Bin size: 0.7 L
  • Run time: 100 minutes

Pros:

  • Gets into hard-to-reach areas
  • HEPA filter
  • Automatic learning

Cons:

  • Louder than other options

The Botvac D8 from Neato Robotics is a great go-to vacuum that can map and store the memory of up to three floors in your home for a methodical, planned clean, as well as zone-clean specific messes or spills when you tell it to. You can easily draw no-go lines on your phone’s touchscreen using the Neato app that the vacuum will automatically learn and follow. It comes equipped with a HEPA filter to capture dust mites and allergens, battery life of up to 100 minutes, a large 0.7-liter dustbin, and a flat-edge design for quick and easy corner cleaning. Additionally, the brush on the D8 is 70 percent larger than those on other leading brands, so this vacuum is especially great at picking up pet hair.

Best for marathon cleaning sessions: Ecovacs Deebot Ozmo T5

Why it made the cut: This mop-meets-vacuum has a long battery life and high-tech features to make your clean as seamless as possible.

Specs:

  • Surfaces: Hard floor, carpet
  • Bin size: 430 mL
  • Run time: Over three hours

Pros:

  • Long battery life
  • Mopping included
  • Laser-mapping technology for a complete clean

Cons:

  • Mop could use more water

Ecovacs was established as a company in 1998 with the official Ecovacs Robotics brand created in 2006. They specialize in spatially aware, mobile robots that clean your home, and the Deebot Ozmo is no exception. The Deebot Ozmo T5 from Ecovacs can run for over three hours, cleaning up to 3,200 square feet in a single session. Along with the impressive battery life, this vacuum is equipped with Smart Navi 3.0 laser-mapping technology to keep track of your home and prevent any missed areas, a high-efficiency filter, and three levels of suction power. It connects to your smartphone for a customized clean, and, did we mention? It’s also a mop. Yep, this vacuum can also simultaneously mop your floors, recognizing and avoiding carpeted areas as it cleans.

Best mop-only robot: Samsung Jetbot Robotic Mop

Why it made the cut: When it comes to automated mopping, this Samsung pick is designed with squeaky-clean floors in mind.

Specs:

  • Surfaces: Tile, vinyl, laminate, and hardwood
  • Run time: 100 minutes

Pros:

  • Multiple cleaning pads
  • Eight cleaning modes
  • Dual pads remove grime

Cons:

  • No mapping

Whether you’re cleaning your bathroom floors, hardwood in the living room, or laminate in the kitchen, the dual spinning pads on the Samsung Jetbot (you can choose machine-washable Microfiber or Mother Yarn) scrub away grime and dirt without the effort of mopping. The eight cleaning modes (selectable via remote) include hand mode, focus mode, and random mode, among others, allowing you to personalize your clean depending on the room and mess level. A 100-minute battery allows for enough time for the double water tanks to offer edge-to-edge coverage.

What to consider when shopping for the best robot vacuums

There are five major things you should take into consideration when purchasing a robot vacuum. The best robot vacuums have a long-lasting battery and a large bin capacity so they can work away in your home without needing to be dumped out or recharged before the job is over. You might want to find one that can easily connect to your smartphone for customized or remote control. And if you’re really looking to elevate your floors, consider a robot vacuum with a mopping function to make your surfaces shine. Finally, look for other advanced features like mapping capabilities or smart-timers. We know that’s a lot of information to keep in mind while you shop, so we’ve created a thorough guide to help you better understand these features, as well as some product suggestions to get you started.

How much cleaning time do you want?

A robot vacuum is only as good as its battery life. Fortunately, many robot vacuums have batteries that last at least one hour. If you have a larger living space you might want to look for something that can last between 90 to 120 minutes to make sure the robot can get to every nook and cranny before needing to recharge. Keep in mind, some vacuums have different power settings, like high intensity or turbo that might drain its battery more quickly. Think about how you want to use your vacuum, what your regular time frames for cleaning will look like, and whether or not you need more surface coverage or suction power.

Most robot vacuums will either alert you when the battery is low or they will dock themselves at their charger. They may also do this automatically after every clean, which means you’ll never have to bother with locating a charging cable or deal with the consequences of forgetting to plug it in. A truly smart robot vacuum will take care of itself after taking care of your floors.

Do you want to control the robot vacuum with your phone?

The best robot vacuums pair with your smartphone so you can create customized settings and control your clean remotely. When we say these things can get fancy, we mean fancy. A device compatible robot vacuum might be able to pair with Alexa or Google Assistant, follow invisible boundary lines you create to keep it away from loose rugs or lots of cables, generate statistics based on a recent clean, tell you how much battery life is left, and virtually map your living space. Being able to control a robot vacuum from your phone means going to brunch with friends, running to the grocery store, picking up your kids from school, and coming home to a clean house. Some models even allow you to set a predetermined schedule for cleaning so you won’t even have to pull out your phone to get it going. Keep in mind, it might be a good idea to be home for your robot’s first clean so you can identify any tough spots or issues your little machine might face.

Before purchasing, make sure you check each vacuum’s compatibility, especially if you are using an Android device or you are looking to connect to a specific virtual assistant. Many of the vacuums are going to work great with any smart device, but we would hate for you to get ready to connect only to end up disappointed.

Do you want it to take out the trash for you?

Not all robot vacuums can collect the same amount of debris and detritus before needing to be emptied out. Think about how frequently you’re hoping to vacuum your home and how much dust, dirt, and pet dander might accumulate in the meantime. If you have a smaller living area, keep things relatively tidy, dust and sweep often, or vacuum regularly, you might be able to survive on a smaller bin. However, if you know you need something more heavy-duty, don’t skimp on bin storage. The average dustbin size is 600 milliliters; some can go up to 700 or 750. These dustbins are easy to remove and don’t require extra work, such as bag or filter replacement. If you have a cat or dog (or a very hairy human) running around the house, consider a vacuum that specifically boasts its ability to pick up hair and dander.

One of the best features a robot vacuum can have is a self-evacuating bin. Instead of emptying a bin after every one or two cleaning sessions, your vacuum will automatically deposit all of its collected dust bunnies, forgotten LEGO pieces, food crumbs, and other artifacts to a larger bin at its docking station. Many of these stations come with allergen filters and other sensors to keep its contents completely sealed. It will let you know when it needs to be emptied so you don’t have to worry about spillage or clogging. Now that’s some seriously futuristic cleaning.

Do you want a mop, too?

We are pleased to inform you that the best robot vacuums can also mop, so not only will you have all the dirt and debris sucked away but you’ll also have sparkling clean floors free of stains and spills. These vacuum-mop hybrids have two compartments: one for collecting the bits and pieces that are suctioned up and another to hold water that will go over hardwood or tile flooring. These hybrids typically come with a sensor that informs the robot where carpeted areas can be found, which the vacuum will avoid when it’s time to mop. That’s one more chore your smart vacuum can take care of and one more episode of TV you get to watch instead!

If the vacuum you are looking at doesn’t have its own mopping function, or maybe a hybrid isn’t in your price range, look for models that are able to pair with a separate robot mop. Many brands create individual vacuums and mops that communicate with one another via smartphone or internal programming to schedule cleanings one right after the other. They can often be stored next to one another and have similar special features and battery life—so you can count on this dynamic duo to get the job done.

Does it know your home?

We touched on special features a little bit when we outlined smartphone compatibility, but we want to dive in further and really explain the kinds of advanced features you might want to prioritize when considering which robot vacuum is right for you. The first thing to look for is a vacuum with obstacle identification features, so your vacuum can identify small barriers like power strips, cables, pet toys, or shoes. There’s nothing worse than coming home and finding your vacuum trapped in an endless battle between your internet router and your kid’s favorite stuffed animal, right?

You can also look for specific mapping capabilities that determine whether or not your robot cleans randomly or methodically. A random robot using a “bounce” cleaning method might have object identification sensors—but it won’t necessarily keep track of where in your house it has been, and will go over areas more than once for a thorough clean. A methodical vacuum has sensors that track where it’s been and what areas of the house it’s covered. This is often faster, but not always the most thorough. However, these methodical cleaners collect data over time to retain a virtual map of your home for a more efficient clean. Just make sure you keep the lights on during a vacuuming session, because these sensors need to quite literally “see” in order to collect information and avoid bumping into things. Once this data has been collected, you might also be able to set up boundaries or no-clean zones from your phone. This tells the robot where to avoid, like a play area or delicate carpet.
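
The practical difference between the two strategies is easy to see in miniature. The toy simulation below, which is not any manufacturer's firmware, pits a random "bounce" cleaner against a methodical row-by-row sweep on a small grid, with a couple of user-drawn no-go cells:

```python
import random

GRID = 8                   # toy room: 8x8 cells
NO_GO = {(3, 3), (3, 4)}   # user-drawn keep-out cells, e.g., a pet bowl area
CLEANABLE = GRID * GRID - len(NO_GO)

def bounce_clean(steps=200):
    """Wander randomly, 'bouncing' off walls and no-go boundaries."""
    x = y = 0
    visited = {(x, y)}
    for _ in range(steps):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = x + dx, y + dy
        if 0 <= nx < GRID and 0 <= ny < GRID and (nx, ny) not in NO_GO:
            x, y = nx, ny
            visited.add((x, y))
    return visited

def methodical_clean():
    """Sweep row by row, alternating direction, skipping no-go cells."""
    return {(x, y) for y in range(GRID)
            for x in (range(GRID) if y % 2 == 0 else reversed(range(GRID)))
            if (x, y) not in NO_GO}

print(f"bounce: {len(bounce_clean())} of {CLEANABLE} cells")
print(f"methodical: {len(methodical_clean())} of {CLEANABLE} cells")
```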

You can also look for a vacuum with a camera so you can see where it is or simply check in on your home. There are almost endless advanced features you can choose to prioritize depending on your needs.

FAQs

Q: Do cheap robot vacuums work?

Affordable robot vacuums can still achieve the clean of your dreams but might sacrifice some added features, like self-emptying, smart-home connectivity, or mopping capabilities. That said, even a cheap robot vacuum will still drastically cut down the time you spend on chores—in our book, that’s a win.

Q: Is it worth getting a robot vacuum?

You can spend less time cleaning when you have a robot vacuum in your arsenal. While models are still evolving with better technology, those with families, pets, or simply limited spare time can benefit from investing in a robot vacuum. Regular vacuums—like this one from Dyson—can be quite pricey as well, so why not spend a bit more and relegate the chore to hands-free software?

Q: Can robot vacuums go from hardwood to carpet?

In short, it depends. While some models can auto-detect the transition from carpet to hardwood floors, others will need you to map out different zones. These maps can help your robot vacuum determine what modes it needs to be on for each area to ensure an overall deep clean.

The final word on shopping for the best robot vacuums

An amazing, hands-free clean should now be well within reach with a robot vacuum. There are so many options out there and we hope you now know what to look for when venturing out to get your new robotized housekeeper. Keep in mind that the best robot vacuums are worth investing in for an efficient, smart, and clean home with the push of a button.

The post The best robot vacuums of 2023 appeared first on Popular Science.

Monitoring volcanoes that could explode? A drone is on it. https://www.popsci.com/technology/drone-volcano-eruption/ Fri, 23 Dec 2022 00:00:00 +0000 https://www.popsci.com/?p=501461
volcano erupting
Volcano eruptions can be scary if you don't know they're coming. Izabela Kraus / Unsplash

By keeping track of the ratio of certain gases, it can predict when a volcano is likely to erupt.

Volcano eruptions are dramatic, messy events. And worse, they’re often unpredictable. Despite humanity’s best efforts to understand them, volcanoes continue to be a big threat—one that most people are not adequately prepared for. Historic ice cores tell us that the biggest explosions yet are still to come. Over the years, scientists have devised software, computer simulations, and even special instruments to monitor and predict when these sleeping beasts may wake. Researchers from Johannes Gutenberg University Mainz have come up with another technique: drones. 

In a study published in late October in Scientific Reports, a team of scientists showed that small drones can be used to characterize the chemistry of volcanic plumes. Properties like the ratio of carbon dioxide to sulfur dioxide can give clues on what reactions are happening under the surface, and whether an eruption is coming soon. This ratio sometimes changes quickly before a volcano blows. 
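
The monitoring logic itself is simple to sketch: compute the CO2-to-SO2 ratio for each set of plume measurements, then flag a rapid shift between flights. The readings and alert threshold below are invented for illustration; real alarm levels are volcano-specific:

```python
def co2_so2_ratio(co2_ppm, so2_ppm):
    """Ratio of carbon dioxide to sulfur dioxide in a plume sample."""
    return co2_ppm / so2_ppm if so2_ppm > 0 else float("inf")

# Invented plume readings from successive flights: (CO2 ppm, SO2 ppm).
flights = [(420, 12.0), (455, 11.5), (600, 8.0)]
ratios = [co2_so2_ratio(c, s) for c, s in flights]

for i in range(1, len(ratios)):
    change = ratios[i] / ratios[i - 1]
    if change > 1.5:   # illustrative alert threshold, not a published value
        print(f"flight {i}: ratio jumped {change:.1f}x, possible unrest")
```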

Research drone flying in tests on Vulcano, Italy. Hoffmann group

Big, heavy-duty drones can often be a hassle to transport in and around the terrain surrounding volcanoes. And having humans trek out in special gear and suits is not an easy or quick journey either. Using a drone that could fit into a backpack could simplify the process. The drone used in the experiment was a 2-pound commercial model called DJI Mavic 3. 

Of course, the flying gizmo had to undergo a few modifications before it was volcano-ready. It’s decked out in a sensor system coordinated by a microcontroller with 4 MB of memory, which bridges communications among an electrochemical sulfur dioxide sensor, a light-based carbon dioxide sensor, other instruments measuring temperature, humidity, and pressure, and a GPS module.

The drone boasts a relatively high-frequency sampling rate of 0.5 Hz, and its battery allows it to run for 1.5 hours. The team tested the system on the island of Vulcano, Italy, in April 2022, flying it into a fumarole field, where volcanic gases and vapors are emitted from openings in the ground. During its test flight it was able to quickly and accurately measure the gaseous emissions from the field in order to monitor volcanic activity.

[Related: Whale-monitoring robots are oceanic eavesdroppers with a mission]

Drones are being used more and more as the eyes in the skies above hazardous locations; they have proved to be a practical solution for monitoring the real-time developments of phenomena and disasters like fog, wildfires, and hurricanes. They’ve also been used to monitor day-to-day happenings in the natural world, like shark activity and changes to water sources.

Off-the-shelf drones have become an extremely valuable tool for scientists hoping to collect data in hard-to-access places, like the polar regions and remote locations where wildlife congregate. But the certification process to use drones through the FAA is sometimes a hurdle to these tools becoming more commonplace among researchers.

The group from Johannes Gutenberg University Mainz isn’t the first to use drones to study volcanoes. For example, an international team of researchers used the flying tools to study the structural integrity of volcanic domes, and NASA has used small airplane-like drones to capture visible-light and thermal images of volcanic areas from above. 

Hopefully, if the science done via drone becomes refined and robust enough, it could help researchers actually predict eruptions well before they happen.

The post Monitoring volcanoes that could explode? A drone is on it. appeared first on Popular Science.

The newest robot dog can scale walls and ceilings https://www.popsci.com/technology/robot-dog-climb-walls-ceilings/ Mon, 19 Dec 2022 17:00:00 +0000 https://www.popsci.com/?p=500147
Side-by-side images of four-legged robot dog scaling a wall
MARVEL certainly lives up to the name. KAIST

Magnetic feet make climbing easy for these 18-pound robots.

It’s easy to have a love-hate relationship with the ever-widening industry of robot dogs. On the one hand, many of them are almost as cute and agile as the four-legged friends they emulate; on the other hand, some of them have guns on their heads.

That being said, it’s hard to decide how to take the latest feat for quadrupedal robots: they’re able to climb walls and ceilings now. The new achievement comes courtesy of researchers at the Korea Advanced Institute of Science and Technology, who published a paper last week in Science Robotics detailing their new creation, MARVEL. Short for “Magnetically Adhesive Robot for Versatile and Expeditious Locomotion,” MARVEL lives up to both the name and its acronym by relying on four electromagnetic legs tipped with new smart materials called magnetorheological elastomers (MREs). These MREs resemble the consistency and elasticity of rubber, but contain components such as carbonyl iron powder that make them capable of conducting electromagnetic force.

[Related: This agile robot dog uses a video camera in place of senses.]

As also pointed out over the weekend by Futurism, MARVEL already excels at climbing up flat metal walls and ceilings at respective speeds of 1.6 and 2 feet per second. The four-legged robot is also capable of handling curved metal surfaces, such as storage tanks coated in 0.3-millimeter-thick paint underneath both rust and dust. Researchers also noted in the study that MARVEL can tackle a variety of obstacles, including 10-centimeter-wide gaps and 5-centimeter-high barriers, all while easily transitioning between floor, wall, and ceiling settings.
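
The core constraint in this style of climbing is easy to state: a foot's magnet may only switch off while the other three feet are holding on. Below is a highly simplified gait loop written around that one rule; the real MARVEL controller, which also manages balance and surface transitions, is far more involved:

```python
import itertools

FEET = ["front-left", "front-right", "rear-left", "rear-right"]
attached = {foot: True for foot in FEET}   # all magnets energized at rest

def swing(foot):
    """Detach one foot, move it forward, reattach. Never more than one off."""
    holding = [f for f in FEET if f != foot and attached[f]]
    assert len(holding) == 3, "need three feet attached before releasing one"
    attached[foot] = False   # de-energize the electromagnet on the MRE footpad
    # ... leg swings to its next foothold here ...
    attached[foot] = True    # re-energize: the foot grips the surface again

for foot in itertools.islice(itertools.cycle(FEET), 8):  # two full gait cycles
    swing(foot)
    print(f"stepped {foot}; all feet attached: {all(attached.values())}")
```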

All that versatility could soon make them ideal tools for inspection scenarios within industrial environments, as well as on large ships, bridges, and tall buildings that could prove dangerous for humans. At just 18 pounds and only 13 inches long, MARVEL is about the same size as a small dog and is incredibly portable and easy to handle.

[Related: Ghost Robotics now makes a lethal robot dog.]

Animals are increasingly inspiring researchers in designing robots that can maneuver within settings that prove difficult or dangerous for humans. Recently, a team of engineers even designed a two-winged, birdlike robot capable of landing atop a perch using a single grasping claw mechanism. Like MARVEL, the bird-bot’s abilities could lend themselves to accessing remote or dangerous areas that are otherwise hard for their human creators to reach. Both designs are far from ready to make their commercial debut, but it certainly seems an era of everyday interactions alongside animal-like robots is in our future.

The post The newest robot dog can scale walls and ceilings appeared first on Popular Science.

Watch this bird-like robot make a graceful landing on its perch https://www.popsci.com/technology/winged-bird-robot-perch-landing/ Fri, 16 Dec 2022 14:00:00 +0000 https://www.popsci.com/?p=499353
Winged bird robot landing on wooden perch during test flight
The robot only needs one claw to successfully stick its landing. Raphael Zufferey

A perching robot could one day be used to monitor even the most shy, hidden animals.

It’s one thing to get a robot to fly like a bird, but it’s another thing entirely to get it to perch like one. There are a lot of factors to consider—including speed, timing, impact force, distance estimation, and balance, just to name a few. But judging from these recent photos and videos courtesy of Raphael Zufferey and colleagues at the Swiss Federal Institute of Technology in Lausanne, a new bar has been set for ornithopters, aka bird-bots.

In an interview with New Scientist, Zufferey explains that pulling off the stunning feat requires a few departures from actual avian behavior. Although the spring-loaded claw grasps a 6-centimeter-diameter branch much like its zoological inspirations, the final approach differs from its real-world counterparts. Generally, birds hover above their intended perch for a few moments before touching down. Zufferey’s invention, however, simply slows down as it nears its final destination using optical camera assessments, allowing the spring-loaded talons to trigger within just 25 milliseconds, according to the team’s paper published in Nature Communications.
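
In control terms, the approach boils down to two rules: scale flight speed down with remaining distance to the perch, and fire the claw the instant contact is sensed. A rough sketch of that logic, with illustrative numbers rather than values from the paper:

```python
def approach_speed(distance_m, cruise=4.0, creep=0.6):
    """Slow from cruise speed to a creep as the perch nears (invented gains)."""
    return max(creep, min(cruise, 2.0 * distance_m))

def control_step(distance_m, contact_sensed):
    if contact_sensed:
        return 0.0, "TRIGGER CLAW"   # spring releases within ~25 milliseconds
    return approach_speed(distance_m), "approach"

for d, contact in [(2.0, False), (0.8, False), (0.3, False), (0.0, True)]:
    speed, action = control_step(d, contact)
    print(f"{d:4.1f} m -> {speed:.1f} m/s, {action}")
```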

[Related: Flying snakes could inspire a new generation of airborne robots.]

The new ornithopter isn’t quite ready for outdoor use. It currently only operates while “dependent on accurate localization data from a motion capture system,” according to the team’s research paper, and isn’t optimized yet for unpredictable environments. Once those problems are solved, however, researchers think the robot could offer novel alternatives for gathering samples in hard-to-reach locations, or even monitoring noise-sensitive animals in the wild for research purposes. Assuming said animals don’t mind a nosy mechanical bird hovering around them, that is.

In any case, animals are providing inspiration for all manner of advancements in robotics: from six-legged spider rovers set to soon roam Japanese sewer systems to waterborne robots that can now mimic manta rays for faster, lighter, and more energy efficient designs. Birds aren’t the only flying creatures robots can imitate, either. Flying snakes—yes, you read correctly—capable of flattening and undulating their bodies to propel through the air have inspired creative new movements for future bot designs.

The post Watch this bird-like robot make a graceful landing on its perch appeared first on Popular Science.

Flying snakes could inspire a new generation of airborne robots https://www.popsci.com/technology/flying-snakes-could-inspire-a-new-generation-of-airborne-robots/ Tue, 13 Dec 2022 21:00:00 +0000 https://www.popsci.com/?p=498341
Green flying snake resting on tree branch looking at camera
Flying snakes can soar over 25 meters to evade predators. Deposit Photos

Researchers used computational analysis to study the paradise tree snake's gliding techniques for potential robot guidance.

The paradise tree snake belongs to a unique family of serpents capable of “flying” through the air, likely much to the dismay of anyone suffering from ophidiophobia. After launching itself from a tree branch in southeastern Asia, Chrysopelea paradisi flattens its body to become more aerodynamic, then undulates in a specific pattern to soar as far as 25 meters in a single flight.

It’s an impressive feat, but not necessarily the first animal behavior you’d expect scientists to mimic when designing new robots. A team of researchers from the University of Virginia and Virginia Tech, however, recently saw potential in the snake’s talent, and used computational analysis to break down the intricacies of its movements for potential future advances in robotic mobility.

[Related: How engineers taught a manta ray-inspired robot the butterfly stroke.]

In a paper published in Physics of Fluids, the group first gathered data from high-speed video recordings of flying tree snakes, which they then fed into a computational program designed to analyze the minutiae of their movements. As it turns out, the snakes’ undulations utilize the same airflow mechanisms as an everyday frisbee—like the flying disks, the snakes generate a high-pressure flow underneath their bellies, with a low-pressure region providing a suction force above their backs. This differential allows the snakes to earn their “flying” moniker.

“The snake’s horizontal undulation creates a series of major vortex structures, including leading edge vortices, LEV, and trailing edge vortices, TEV,” Haibo Dong, a researcher at the University of Virginia and paper co-author, said in a statement. “The formation and development of the LEV on the dorsal, or back, surface of the snake body plays an important role in producing lift.”

[Related: Spider robots could soon be swarming Japan’s aging sewer systems.]

One surprising find was that the trick to soaring distance resides in the frequency of the flying snakes’ side-to-side movements while in the air. Generally, serpents like the paradise tree snake writhe between one and two times per second as they travel between tree branches or to the ground, typically to evade predators. While they hypothetically could do so faster, the increased movement ostensibly would throw them off balance, resulting in decreased aerodynamics.
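
The undulation itself is a traveling wave moving down the body, and a few lines of code make the kinematics concrete. This is an idealized waveform with assumed amplitude and wavelength, not the team's measured snake data:

```python
import math

FREQ_HZ = 1.5       # within the observed one to two undulations per second
WAVELENGTH = 0.6    # lateral wave's length as a fraction of body (assumed)

def lateral_offset(s, t, amplitude=0.1):
    """Sideways displacement (m) at body position s in [0, 1] at time t (s)."""
    return amplitude * math.sin(2 * math.pi * (s / WAVELENGTH - FREQ_HZ * t))

# Snapshot of the body's shape a quarter second into a glide.
for s in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"s={s:.2f}: offset {lateral_offset(s, t=0.25):+.3f} m")
```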

Unfortunately, the researchers aren’t testing any robotic flying snake prototypes at the moment, but the recent analysis provides future designers a wealth of information to make such creations a very real possibility. When they finally arrive, you can add them to the menagerie of animal-inspired bots, alongside dogs, spiders, and octopuses.

The post Flying snakes could inspire a new generation of airborne robots appeared first on Popular Science.

This fossil-sorting robot can identify millions-year-old critters for climate researchers https://www.popsci.com/technology/forabot-sort-foram-fossil/ Tue, 13 Dec 2022 20:00:00 +0000 https://www.popsci.com/?p=498405
Foraminiferas are tiny marine organisms with intricate shells.
Foraminiferas are tiny marine organisms with intricate shells. Josef Reischig / Wikimedia Czech Republic

Forabot’s job is to image, ID, and categorize the tiny shells left behind by marine organisms called foraminiferas.

Tiny marine fossils called foraminifera, or forams, have been instrumental in guiding scientists studying global climate through the ages. The oldest record of their existence, evident through the millimeter-wide shells they leave behind when they die, dates back more than 500 million years. In their heyday, these single-celled protists thrived across many marine environments—so much so that a lot of seafloor sediments are composed of their remains.

The shells, which are varied and intricate, can provide valuable insights into the state of the ocean, along with its chemistry and temperature, during the time that the forams were alive. But so far, the process of identifying, cataloging, and sorting through these microscopic organisms has been a tedious chore for research labs around the world. 

Now, there is hope that the menial job may get outsourced to a more mechanical workforce in the future. A team of engineers from North Carolina State University and the University of Colorado Boulder has built a robot specifically designed to isolate, image, and classify individual forams by species. It’s called the Forabot, and it is constructed from off-the-shelf robotics components and custom artificial intelligence software (now open-source). In a small proof-of-concept study published this week in the journal Geochemistry, Geophysics, Geosystems, the technology had an ID accuracy of 79 percent.

“Due to the small size and great abundance of planktic foraminifera, hundreds or possibly thousands can often be picked from a single cubic centimeter of ocean floor mud,” the authors wrote in their paper. “Researchers utilize relative abundances of foram species in a sample, as well as determine the stable isotope and trace element compositions of their fossilized remains to learn about their paleoenvironment.”

[Related: Your gaming skills could help teach an AI to identify jellyfish and whales]

Before any formal analyses can happen, however, the foraminifera have to be sorted. That’s where Forabot could come in. After scientists wash and sieve samples filled with sand-like shells, they place the materials into a container called the isolation tower. From there, single forams are transferred to another container called the imaging tower, where an automated camera captures a series of shots of the specimen that are then fed to the AI software for identification. Once the specimen gets classified by the computer, it is then shuttled to a sorting station, where it is dispensed into a corresponding well based on species. In its current form, Forabot can distinguish six different species of foram, and can process 27 forams per hour (quick math by the researchers indicates that it can go through around 600 fossils a day).
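
The throughput claim survives a quick arithmetic check (27 forams per hour over 24 hours is 648, consistent with "around 600" a day), and the station-to-station flow reads naturally as a loop. A schematic sketch, with placeholder functions standing in for the hardware steps; the species labels and well numbers are illustrative:

```python
SPECIES_WELLS = {"G. ruber": 0, "G. sacculifer": 1, "unknown": 5}

def isolate():           # placeholder: isolation tower singulates one shell
    return "specimen"

def image(specimen):     # placeholder: imaging tower snaps the photo series
    return ["frame1.png", "frame2.png"]

def classify(frames):    # placeholder for the neural network's species call
    return "G. ruber"

def dispense(well):      # placeholder: sorter drops the shell into its well
    print(f"dispensed into well {well}")

def process_sample(n_forams):
    for _ in range(n_forams):
        label = classify(image(isolate()))
        dispense(SPECIES_WELLS.get(label, SPECIES_WELLS["unknown"]))

process_sample(3)
print(f"daily throughput at 27/hour: ~{27 * 24} forams")
```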

For the classification software, the team modified a neural network called VGG-16 that had been pretrained on more than 34,000 planktonic foram images that were collected worldwide as part of the Endless Forams project. “This is a proof-of-concept prototype, so we’ll be expanding the number of foram species it is able to identify,” Edgar Lobaton, an associate professor at NC State University and an author on the paper, said in a press release. “And we’re optimistic we’ll also be able to improve the number of forams it can process per hour.”
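
Swapping the head of a pretrained VGG-16 for a small task-specific classifier is a standard transfer-learning recipe, and a condensed PyTorch sketch of that general move is below. Note the team's network was pretrained on the Endless Forams images, whereas this stand-in starts from ImageNet weights:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_SPECIES = 6   # species Forabot currently distinguishes

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
for param in model.features.parameters():
    param.requires_grad = False   # keep the pretrained visual features frozen

# Swap the 1,000-way ImageNet head for a six-way foram classifier.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_SPECIES)

optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_SPECIES, (8,))
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"dummy batch loss: {loss.item():.3f}")
```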

Watch Forabot at work below:

The post This fossil-sorting robot can identify millions-year-old critters for climate researchers appeared first on Popular Science.

How a robotic arm could help the US Army lift artillery shells https://www.popsci.com/technology/us-army-robot-artillery-loader/ Mon, 12 Dec 2022 12:00:00 +0000 https://www.popsci.com/?p=497049
A US Army M109 Paladin howitzer seen in Latvia in July, 2022.
A US Army M109 Paladin howitzer seen in Latvia in July, 2022. US Army / Eliezer Meléndez

Artillery ammunition is heavy, but robots are strong.

To fire artillery faster, the US Army is turning to robotic arms. On December 1, Army Futures Command awarded a $1 million contract to Sarcos Technology and Robotics Corporation to test a robot system that can handle and move artillery rounds. 

Every artillery piece is, in essence, a tube that combines the artillery shell with an explosive propellant, hurling a projectile and pain far away to someone else. The rate of artillery fire is determined by how quickly the crew can aim, load, and reload the gun. For artillery on the ground, that’s a matter of drill and skill, training the humans to lift and load, and clear and seal guns as fast as possible without dropping an artillery round that can weigh over 90 pounds. 

As such, the Army hopes that robotics can help with this process. “The Sarcos robotic ammunition handling solution leverages a dexterous robotic arm that was designed to be integrated into the U.S. Army’s fleet of Self-Propelled Howitzer Systems,” the company said in a release.

A self-propelled artillery system is a long-range gun mounted on a vehicle, usually tracked and at least somewhat armored, that looks from a distance like a tank with a very large gun. The Army’s self-propelled howitzer is the venerable M109 Paladin, whose earliest models entered service in 1963. The Paladin has been upgraded at least 15 times in its long service, with new production models adapting to better technology and changing needs in combat.

Operating a Paladin at present takes a crew of six. The driver directs the vehicle, the gunner aims the weapon, three ammunition handlers load and ready the weapon, and a commander oversees the whole operation. Fitting three people in the back of the Paladin to lift and load ammunition means specifically finding recruits who can fit within the vehicle’s confines. Those people must also endure the stress of repeatedly lifting and loading rounds at the pace of battle.

[Related: The US’s latest assist to Ukraine: Rocket launchers with a 43-mile range]

For the Extended Range Cannon Artillery, the Army’s latest iteration of the Paladin-derived design, the Army is hoping to double the range of its artillery, while keeping pace with the complex tasks of firing and calibrating shots. Depending on ammunition, a Paladin today can hit targets at a range of 11 to 15 miles away. The Extended Range version, which has been thoroughly redesigned since the 1963 models, will have a range of 40 miles. 

“The Extended Range Cannon Artillery system is used extensively in the U.S. Army for long range precision firing, but the downside to this system is the weight of the ammunition needing to be hand-loaded by Soldiers in the field,” Reeg Allen, vice president of business development, Sarcos, said in a release.

An automated system, using robot arms to fetch and ready artillery rounds, would function somewhat like a killer version of a vending machine arm. The human gunner could select the type of ammunition from internal stores, and then the robotic loader finds it, grabs it, and places it on a lift. 

If that sounds futuristic, such a system is actually already in use: it is how the automated loader of the Panzerhaubitze 2000 self-propelled howitzer works. That gun is in service with several nations, including Germany and Ukraine. The automated system requires one fewer human artillery crew member in the vehicle. The PzH 2000 also has an automated loader for outside the vehicle, allowing soldiers to carry ammunition from trucks or nearby storage and restock the vehicle in the field, without having to crawl into the confined space of the artillery crew compartment.

[Related: What to know about the Caesars, the gigantic truck-mounted artillery units France sent Ukraine]

Testing the new automated system means ensuring not just that it can lift and load artillery, but that it can also handle the rigors of war. Any useful hardware must be able to absorb the shock and vibration of driving, as well as withstand the environmental extremes in which it operates, from intense heat to sharp cold, along with erosion from sand, dust, and humidity.

Should the robot arm perform as expected in testing, it will eliminate a job that is all repetitive strain. The robot, lifting and loading ammunition, becomes an autonomous machine automating the dull and menial task of readying rounds to fire.

Improved speed and reduced crewing of artillery are always broadly good objectives, and the ongoing war in Ukraine has emphasized the continuing role of artillery on modern battlefields. Self-propelled artillery offers a way for armies to shoot and scoot, unleashing salvos and then relocating before retaliation. Unlike high-end rocket and missile systems like HIMARS, self-propelled artillery can deliver that barrage using much lower cost artillery shells.

Watch an automated loader for a PzH 2000 in action below: 

Sea urchin sperm is surprisingly useful to robotics experts https://www.popsci.com/environment/sea-urchin-sperm-robotics/ Sat, 10 Dec 2022 20:00:00 +0000 https://www.popsci.com/?p=497231
Purple sea urchins underwater releasing eggs and sperms during the mating process
Sea urchins have finetuned the way their sperm find egg cells in the vastness of the ocean. Deposit Photos

Engineers have been building machines with 'extremum seeking' algorithms for decades. Sea urchins perfected the method naturally.

Sperm have a unique sense of direction. The reproductive cells of many species are tuned to seek out eggs, no matter how far or how difficult the journey. Take sea urchin sex. First, males and females will puff out clouds of sperm and egg cells into the ocean. To find and fertilize an egg in the open waters, the sperm of these spiny seabed critters follow a chemical bread crumb trail. And engineers are tapping this clever attraction method for smarter, destination-seeking robots of their own. 

A study published December 9 in the journal Physical Review E details the similarities between the trajectory of sea urchin sperm and computer systems that use a type of real-time search approach called extremum seeking. Engineers from the University of California, Irvine and University of Michigan made a mathematical model of the sperm’s pathway to better understand its behavior. According to the authors, assessing the sea urchin’s biological nature could help design miniature robots that follow cues from sources in the same way.

[Related: Sterile mice have been modified to make rat sperm]

Since the 1920s, engineers have used extremum seeking as an adaptive control technique to program technologies that help steer or direct systems for maximum function. It’s been used to control and optimize fuel flow in flight-propulsion systems, combustion in engines and gas furnaces, and anti-lock braking systems in cars. At its most basic, a system’s extremum seeking algorithm tracks a signal beacon emitted by a source, says Mahmoud Abdelgalil, who studies dynamics and control at UC Irvine and was the lead author of the paper.

When you think of robotic designs, sea urchin sex isn’t quite what comes to mind. But Abdelgalil says their reproductive cells are a useful and well-studied biological model. To find an egg, sea urchin sperm use chemotaxis, where the cells move in response to a chemical stimulus. Sea urchin eggs specifically secrete a compound called a sperm-activating peptide, which interacts with the sperm’s flagellum, controlling how it beats. This curves and bends the sperm’s direction on a path toward the egg. 

“Sperm don’t have a GPS,” Abdelgalil says. “They don’t know ahead of time where the egg is. So they measure the local concentration [of the peptide] at the current position, then they use that information and move in the direction of increasing concentration levels—which we like to call the direction of the concentration gradient.”

It’s the same for an extremum seeking robot: It doesn’t have coordinates or other information about the target’s location—all it knows is that it can measure and follow the dynamic signal from the current position. Abdelgalil got the idea to look at sea urchin sperm when he saw a previously published paper detailing their behavior under a microscope. The trajectory of the sperm looked nearly identical to a proposed model of an extremum-seeking unicycle robot, a simple machine that can only control its orientation and move in a forward direction. 
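The strategy is simple enough to simulate in a few lines of Python. The toy “unicycle” below moves at a constant forward speed and modulates its turning rate based on whether the sensed signal is rising or falling along its path; the signal field, gains, and update rule are illustrative assumptions, not the model from the paper.

```python
# Toy extremum-seeking "unicycle": constant forward speed, steering rate
# modulated by how the sensed signal changes along the path. The circling
# motion drifts toward the source without ever knowing its coordinates.
import math

def signal(x, y):
    # Hypothetical chemical field peaking at the origin (the "egg")
    return 1.0 / (1.0 + x * x + y * y)

x, y, heading = 4.0, -3.0, 0.0
v, base_turn, gain, dt = 1.0, 1.5, 40.0, 0.01
prev = signal(x, y)

for _ in range(20000):
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    now = signal(x, y)
    # Turn less while the signal improves, more while it worsens, which
    # biases the circular path up the concentration gradient.
    heading += (base_turn - gain * (now - prev) / dt) * dt
    prev = now

print(f"final distance from source: {math.hypot(x, y):.2f}")
```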

“As soon as I saw the two pictures, I realized that this is more or less the same,” he says. So, in the new study, Abdelgalil and his colleagues illustrated how key components of the sea urchin sperm’s navigation strategy resemble hallmark features of extremum seeking. 

This extremely effective searching strategy, which evolved over time in nature, could be useful in fine-tuning future system designs and technologies. Extremum seeking algorithms with minimal sensors could help steer miniature robots, like those being tested for targeted drug delivery. Research groups have already explored drug delivery microrobot designs that utilize external signals, Abdelgalil says. For instance, Abdelgalil mentions that researchers at ETH Zurich in Switzerland developed a tiny starfish larva-inspired robot that is guided by sound waves and might one day be useful in delivering drugs directly to specific diseased cells in the body. “I hope my work will eventually be applied in studying or designing microrobots that employ extremum seeking to autonomously navigate environments and find the exact locations of infected cells that need drugs,” he says.

[Related: What this jellyfish-like sea creature can teach us about underwater vehicles of the future]

Abdelgalil also notes that other organisms seem to have some form of extremum seeking, including bacteria searching for food or algae moving in the direction of light. “We can learn from the behavior of these microorganisms to design our robots that behave in a well-defined way when there is no one commanding them,” he says. “This can enhance the autonomy of our more traditionally operated robots.” 

Watch this little drummer bot stay on beat https://www.popsci.com/technology/xiaomi-humanoid-robot-drum/ Sat, 10 Dec 2022 12:00:00 +0000 https://www.popsci.com/?p=496976
xiaomi's cyberone robot drumming
IEEE Spectrum / YouTube

Humanoid robots can be hard to train. Here's how Xiaomi taught CyberOne to play a drum set.

Humanoid robots have long been a passion project for tech giants and experimental engineers. And these days, it seems like everyone wants one. But for machines, social skills are hard to learn, and more often than not, these robots have a hard time fitting in with the human world. 

Xiaomi, a consumer electronics company based in China, teased that it was making such a machine in August. According to the company’s press release, the 5-foot, 8-inch bot is called CyberOne; it is probably not intended to be all that useful in itself, IEEE Spectrum reported, but is rather “a way of exploring possibilities with technology that may have useful applications elsewhere.”

As for the robot’s specs, the company said that CyberOne comes with a “depth vision module” as well as an AI interaction algorithm. It can additionally support up to 21 degrees of freedom in motion and has a real-time response speed that “allows it to fully simulate human movements.” 

Xiaomi has just unveiled a new clip of CyberOne slowly but aptly playing a multi-instrument drum set. It’s able to precisely coordinate a series of complex movements, including hitting the drumsticks together and striking the cymbals, a foot pedal, and a set of four drums to make a range of sounds. And it’s certainly more elegant and evolved than other, scrappier (and sometimes disembodied) robot bands and orchestras of the past.

[Related: How Spotify trained an AI to transcribe music]

So how does CyberOne know what to do? Xiaomi made a diagram showing how sound files become movements for CyberOne. First, drum position and strike speed commands are fine-tuned online. Then, these beats are fed to CyberOne via a MIDI file, which tells the computer what instrument was played, what notes were played on the instrument, how loud and how long they were played for, and with which effects, if any. The robot then uses an offline motion library to generate the moves for its performance, being careful to hit the correct instrument and in time. 
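To get a rough feel for the MIDI half of that pipeline, the sketch below uses the Python mido library to pull timed drum hits out of a file and look up a motion primitive for each one. The motion names and the file name are hypothetical stand-ins for Xiaomi’s offline motion library; only the General MIDI percussion note numbers are standard.

```python
# Sketch of the MIDI-to-motion idea: read timed drum hits from a file
# and look up a motion primitive for each one. The motion names are
# hypothetical; the note numbers follow the General MIDI percussion map.
import mido  # pip install mido

MOTION_LIBRARY = {
    36: "kick_pedal_press",   # bass drum
    38: "snare_strike",       # acoustic snare
    42: "closed_hihat_tap",   # closed hi-hat
    49: "crash_cymbal_hit",   # crash cymbal 1
}

def plan_performance(path):
    """Turn note_on events into a timed list of motion commands."""
    plan, clock = [], 0.0
    for msg in mido.MidiFile(path):  # iteration yields delta times in seconds
        clock += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            plan.append((clock, MOTION_LIBRARY.get(msg.note, "ignore"),
                         msg.velocity))
    return plan

for t, motion, velocity in plan_performance("drum_track.mid"):
    print(f"{t:7.3f}s  {motion}  (strike speed from velocity {velocity})")
```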

Executing instructions precisely and in a controlled, coordinated manner is a difficult exercise even for humans. Humanoid robots are different from regular bots because they’re meant to emulate natural movements, but they can often be impractical in a real-world setting. They need specialized training to do the simplest functions (like not falling over). A humanoid robot capable of homing in on a skill like playing drums could be useful for a variety of complex tasks that might involve manipulating or interacting with objects in its environment.

“We are working on the second generation of CyberOne, and hope to further improve its locomotion and manipulation ability,” Zeyu Ren, a senior hardware engineer at the Xiaomi Robotics Lab, told IEEE Spectrum. “On the hardware level, we plan to add more degrees of freedom, integrate self-developed dexterous hands, and add more sensors. On the software level, more robust control algorithms for locomotion and vision will be developed.”

Watch CyberOne groove below:

Spider robots could soon be swarming Japan’s aging sewer systems https://www.popsci.com/technology/spider-robot-japan-sewer/ Thu, 08 Dec 2022 18:00:00 +0000 https://www.popsci.com/?p=496367
Three Tmsuk spider robots meant for sewer system inspections and repairs
The stuff of nightmares. Tmsuk/YouTube

Faced with an ongoing labor shortage, Japan could turn to robots to handle utility maintenance.

What’s worse—getting trapped in a dank, decrepit sewer system, or finding yourself face-to-face with an army of robotic spiders? The correct answer is getting trapped in a dank, decrepit sewer system, where you then find yourself face-to-face with an army of robotic spiders.

[Related: This spooky robot uses inflatable tentacles to grab delicate items.]

The latter half of this scenario happens if Japan’s robotics manufacturer Tmsuk has its say. As a new video report courtesy of South China Morning Post detailed earlier this week, the company recently unveiled its line of SPD1 prototypes—small robots powered by Raspberry Pi CPUs that creep along upon eight legs modeled after its arachnid inspirations. The little spider-bots also have 360-degree vision thanks to an array of very spidey-like camera eyes.

In Tmsuk’s video below, the tiny machines are in action. The company certainly seems to be leaning into the spookiness in the promotional material.

SPD1 comes as Japan continues to reckon with a labor shortage affecting over half of the country’s industries, including public utility maintenance. With some projections estimating 6.4 million job vacancies by decade’s end, businesses like Tmsuk are offering creative, if arguably off-putting, alternatives to hard-to-fill positions such as those involving sewer repairs.

“The lifespan (of sewer pipes) is 50 years, and there are many sewer pipes reaching the end of that lifespan,” Tmsuk CEO Yuji Kawakubo explained in the SCMP video interview. “There is an overwhelming shortage of manpower to inspect such pipes, and the number of sewer pipes that have not been inspected is increasing.”

[Related: Meet the world’s speediest laundry-folding robot.]

Kawakubo recounted that early iterations of the SPD1 relied on wheels for movement. However, sewer systems’ rocky, unstable terrain quickly proved too difficult. Replacing the wheel system with eight legs allowed the remote-controlled devices a much greater range of mobility and reach during testing. Tmsuk hopes the SPD1 can hit the market sometime soon after April 2024, with future editions able to handle small repair jobs on top of their current surveillance and examination capabilities.

If a swarm of SPD1 bots crawling underneath your home isn’t spooky enough, it’s worth noting that this isn’t the only spider robot in development. Last year, a UK government-funded company appropriately named Pipebots introduced its own designs for sewer-repairing robotic arachnids. Like the SPD1, Pipebots hopes its products can begin traipsing through the muck and mire sometime in 2024.

Magnetic microrobots could zap the bacteria out of your cold glass of milk https://www.popsci.com/technology/magnetic-microrobots-dairy/ Thu, 08 Dec 2022 00:00:00 +0000 https://www.popsci.com/?p=496186
milk products
Aleksey Melkomukov / Unsplash

These “MagRobots” can specifically target toxins in dairy that survive pasteurization.

A perfect mix of chemistry and engineering has produced microscopic robots that function like specialized immune cells—capable of pursuing pathogenic culprits with a specific mugshot. 

The pathogen in question is Staphylococcus aureus (S. aureus), which can impact dairy cows’ milk production. These bacteria also make toxins that cause food poisoning and gastrointestinal illnesses in humans (that includes the usual trifecta of diarrhea, abdominal cramps, and nausea). 

Removing the toxins from dairy products is not easy to do. The toxins tend to be stable and can’t be eradicated by common hygienic practices in food production, like pasteurization and heat sterilization. However, an international group of researchers led by a team from the University of Chemistry and Technology Prague may have come up with another way to get rid of these pesky pathogens: with a tiny army of magnetic microrobots. Plus, each “MagRobot” is equipped with an antibody that specifically targets a protein on the S. aureus bacteria, like a lock-and-key mechanism. 

In a small proof-of-concept study published in the journal Small, the team detailed how these MagRobots could bind and isolate S. aureus from milk without affecting other microbes that may naturally occur.

Bacteria-chasing nanobots have been making waves lately in medicine, clearing wounds and dental plaque. And if these tiny devices can work in real, scaled up trials, as opposed to just in the lab, they promise to cut down on the use of antibiotics. 

In the past, microscopic robots have been propelled by light, chemicals, and even ultrasound. But these MagRobots are driven by a special magnetic field. The team considered this form of control the best option since the robots wouldn’t produce any toxic byproducts and can be remotely reconfigured and reprogrammed.

To make the MagRobots, paramagnetic microparticles are covered with a chemical compound that allows them to be coated with antibodies matching proteins on the cell wall of S. aureus. This allows a MagRobot to find, bind, and retrieve the bacteria. A transversal rotating magnetic field at different frequencies is used to coordinate the bots; at higher frequencies, the MagRobots move faster. The researchers preset the trajectory of the microrobots so that they would “walk” back and forth through a control solution and a container of milk, covering a grid of three rows and two columns, and then retrieved them with a permanent magnet.
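A minimal sketch of that preset serpentine “walk,” assuming a hypothetical `set_rotating_field` interface to the field coils, might look like the following in Python; the frequencies, headings, and timings are invented for illustration.

```python
# Sketch of the preset serpentine "walk": steer the rotating field so the
# swarm sweeps a 3-row-by-2-column grid, then stop for magnetic retrieval.
# set_rotating_field is a hypothetical stand-in for the coil-driver hardware.
import time

def set_rotating_field(frequency_hz, heading_deg):
    """Placeholder coil command: higher frequency means a faster walk."""
    print(f"field: {frequency_hz:4.1f} Hz, heading {heading_deg:3d} deg")

ROWS, COLS = 3, 2
STEP_SECONDS = 2.0  # assumed time to cross one grid cell
FREQ_HZ = 10.0      # assumed actuation frequency

for row in range(ROWS):
    heading = 0 if row % 2 == 0 else 180  # alternate direction each row
    for _ in range(COLS - 1):
        set_rotating_field(FREQ_HZ, heading)
        time.sleep(STEP_SECONDS)
    if row < ROWS - 1:
        set_rotating_field(FREQ_HZ, 90)   # step down to the next row
        time.sleep(STEP_SECONDS)

set_rotating_field(0.0, 0)  # field off; a permanent magnet retrieves the bots
```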

During the experiment, MagRobots measuring 2.8 micrometers across were able to remove around 60 percent of the S. aureus cells in one hour. When the MagRobots were placed in milk containing both S. aureus and another bacterium, E. coli, they were able to avoid the E. coli and go solely after the S. aureus.

“These results indicate that our system can successfully remove S. aureus remaining after the milk has been pasteurized,” the researchers wrote. “Moreover, this fuel-free removal system based on magnetic robots is specific to S. aureus bacteria and does not affect the natural milk microbiota or add toxic compounds resulting from fuel catalysis.”

Additionally, they propose that this method can be applied to a variety of other pathogens simply by modifying the surfaces of these microrobots. 

Hair clips inspired the design for these land- and sea-based robots https://www.popsci.com/technology/hair-clip-robot/ Wed, 07 Dec 2022 15:00:00 +0000 https://www.popsci.com/?p=495848
From beauty product to bot. Zechen Xiong/Youtube

The fashionable build ensures the robot's body simultaneously acts as its own motor system.

Although many modern robots often take inspiration from the natural world around us, one of the newest examples derives its movement from a small, ubiquitous styling option: the common hair clip. Designed by scientists at Columbia University and subsequently highlighted by New Atlas, the research team recently took inspiration from the beauty product’s simple, alternating states to develop their “fast, untethered soft-robotic crawler with elastic instability.”

[Related: This spooky robot uses inflatable tentacles to grab delicate items.]

Or, in layperson’s terms, the Barrette Bot… or, rather, two Barrette Bots:

https://www.youtube.com/watch?v=2vxqgBPo9S8

Using what team leads Zechen Xiong, Yufeng Su, and Hod Lipson dubbed their Hair Clip Mechanism (HCM), the group developed a pair of small, soft robots that utilize strips of prestressed plastic attached to basic electric servos. When activated, the strips alternate between convex and concave shapes, thus enabling movement while simultaneously amplifying the force behind it.
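
In control terms, the drive can be as simple as toggling a servo back and forth past the strip’s snap-through threshold at the desired flapping rate. The Python sketch below illustrates that idea; the snap angle, frequency, and servo interface are assumptions rather than details from the paper.

```python
# Sketch of the hair-clip drive: a servo nudges the prestressed strip past
# its snap-through angle so it flips between its two stable shapes.
# The snap angle, frequency, and servo interface are illustrative.
import time

SNAP_ANGLE = 25.0  # assumed deflection (degrees) that triggers snap-through
FLAP_HZ = 5.0      # assumed flapping frequency

def servo_write(angle_deg):
    """Placeholder for a real servo command (e.g., a PWM duty cycle)."""
    print(f"servo -> {angle_deg:+.0f} deg")

direction = 1
for _ in range(10):  # ten snaps, i.e., five full flapping cycles
    servo_write(direction * SNAP_ANGLE)  # strip snaps convex <-> concave
    direction = -direction
    time.sleep(0.5 / FLAP_HZ)
```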

To test out their new design, the group developed an aquatic robot that relies on the HCM as a fishtail, alongside another machine that pairs two HCMs to approximate a quadrupedal organism. The results were solidly speedy robots relative to their sizes: about 435mm (roughly two body lengths) per second for the fishbot, and 313mm (1.6 body lengths) per second for the land-bound creation. Although the team claims their small robots are speedier than other groups’ comparable designs, North Carolina State University recently concocted a hairclip-like swimming robot of its own capable of paddling almost 3.75 body lengths per second.

[Related: How engineers taught a manta ray-inspired robot the butterfly stroke.]

Still, what’s particularly interesting about the new HCM is that it allows the robots’ frames to simultaneously act as their propulsion systems. In their demonstration video, the team likens it to a car whose engine also functions as its frame, or a human skeleton that doubles as the body’s musculature. Cutting out the need for an entirely separate motor mechanism could allow for cheaper, simpler, and lighter robots in the near future, the team says.

Soft-bodied robotics are gaining popularity for their creative, lightweight, economical designs that are as agile as they are delicate and effective. Recently, Colorado State University showed off a soft robotic gripper so sensitive that it can pick up individual droplets of liquid without compromising its surface tension. Then there was the tentacled robot inspired by octopuses, which looks much spookier than the barrette bot.

Brace yourself for smarter robots that don’t fall over (as easily) https://www.popsci.com/technology/robots-falling/ Fri, 02 Dec 2022 20:30:00 +0000 https://www.popsci.com/?p=494233
Humanoid robot bracing itself against a wall in balance experiment with human researcher standing behind it in lab setting
Easy does it there, pal. YouTube

Researchers tipped over their robot over 882,000 times to teach its neural network how to keep from tumbling.

A lot of mobile robots do a pretty decent job of maintaining their balance while on the move, but like humans, they’re still prone to lose their footing from time to time. Although that bodes well for outrunning them during the impending robopocalypse, until then, it mostly means more potential for expensive repairs and time-consuming maintenance. As first unveiled earlier this year and highlighted by Engadget on Wednesday, at least a few more robots could be saved from taking a tumble in the near future thanks to advancements from researchers at France’s University of Lorraine.

[Related: The Boston Dynamics robots are surprisingly good dancers.]

Through a lot of trial and error—reportedly over 882,000 training simulations, to be more exact—developers designed a new “Damage Reflex” system for their humanoid robot test subject. When activated, the robot’s neural network quickly identifies the best spot on a nearby wall to support itself if its stability gets compromised. Well, perhaps not so much “if” as “when,” judging from the demonstration video below.

As Engadget explains, the testing procedure sounds pretty simple, if a bit macabre: To showcase the Damage Reflex system in action, the robot has one of its legs “broken” to ensure it tips over towards a nearby test wall. In roughly three out of four instances, the machine’s arm was able to find a solid spot to plant itself against in order to prevent falling to its doom. That’s pretty good when one takes into account all the physics variations in location, balance, weight, and distribution that go into determining how to prevent an accident in real time.
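Conceptually, the reflex boils down to scoring candidate wall-contact points with a learned model and reaching for the most promising one. The schematic below renders that selection step in Python; the heuristic scorer is a stand-in for the trained neural network, and every number is invented.

```python
# Schematic of the reflex's decision step: score candidate wall-contact
# points and reach for the most promising one. The heuristic scorer below
# stands in for the neural network trained on ~882,000 simulated falls.
import random

def success_probability(robot_state, contact_point):
    """Placeholder score in [0, 1]; invented heuristic, not the real net."""
    distance, height = contact_point
    reach_penalty = abs(distance - robot_state["arm_reach"])
    height_penalty = abs(height - robot_state["shoulder_height"])
    return max(0.0, 1.0 - 0.5 * reach_penalty - 0.3 * height_penalty)

robot_state = {"arm_reach": 0.6, "shoulder_height": 1.2}  # meters, invented

# Sample candidate (distance, height) contact points on the nearby wall
candidates = [(random.uniform(0.3, 1.0), random.uniform(0.5, 1.6))
              for _ in range(100)]

best = max(candidates, key=lambda c: success_probability(robot_state, c))
print(f"bracing hand at wall point {best} "
      f"(score {success_probability(robot_state, best):.2f})")
```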

[Related: Boston Dynamics gave its dog-like robot a charging dock and an arm on its head.]

There are quite a few caveats to the Damage Reflex system’s early iteration: Firstly, it only stops a robot from falling over; it can’t help it recover or right itself. It has also only been tested on a stationary robot, meaning the system currently isn’t capable of addressing accidents that may occur while walking or mid-stride. That said, researchers intend to further develop their system so that it’s also capable of handling on-the-go machines, as well as utilizing nearby objects like chairs or tables to its advantage.

Companies like Tesla and Boston Dynamics are keen to push bipedal robots into everyday life, a goal that’s really only realistic as long as their products are relatively affordable to both purchase and maintain. Systems like Damage Reflex, while still in their infancy, could soon go a long way to both protect robots and extend their lifespans.

Armed police robots will be a threat to public safety. Here’s why. https://www.popsci.com/technology/armed-police-robots-san-francisco/ Fri, 02 Dec 2022 15:00:00 +0000 https://www.popsci.com/?p=493962
A robot used for explosive ordnance disposal is seen in Qatar in 2017. US Air Force / Amy M. Lovgren

A recent vote in San Francisco allows police robots to use lethal force, such as with explosives.

On November 29, San Francisco’s government voted 8 to 3 to authorize the use of lethal weapons by police robots. The vote and authorization, which caught national attention, speak directly to the real fears and perils regarding the use of robotics and remote-control tools domestically. The vote took place in the context of a 2021 California law mandating that police get approval from local governing authorities over what equipment they use and how they do so.

As the San Francisco Chronicle reported, city Supervisor Aaron Peskin told his colleagues: “There could be an extraordinary circumstance where, in a virtually unimaginable emergency, they might want to deploy lethal force to render, in some horrific situation, somebody from being able to cause further harm,” offering a rationale for why police may want to use a robot to kill.

Police robots are not new, though the acquisition of military-grade robots was bolstered by a program that offered local police departments surplus military goods. Bomb squad robots, used heavily in Iraq and Afghanistan to relocate and safely dispose of roadside bombs, or Improvised Explosive Devices, were offered to police following the drawdowns of US forces from those countries in the 2010s. 

Many of the tools that ultimately end up in police hands first see their debut in military contexts, especially in counter-insurgency or irregular warfare. Rubber bullets, a now-ubiquitous less-lethal police weapon, have their origin in the wooden bullets of British Hong Kong and the rubber bullets of British forces in Northern Ireland. MRAPs, the massive, heavily armored vehicles hastily produced to protect soldiers from bombs in Iraq and Afghanistan, have also seen a second post-war life in police forces.

Bomb squad robots are remarkable, in part, because they are a tool for which the military and police applications are the same. A robot with a gripper and a camera, remotely controlled over a long tether, can inspect a suspicious package, sparing a human life in the event of detonation. Police and military bomb squads even train on the robots together, sharing techniques for particularly tricky cases.

San Francisco’s government voted to allow police, with explicit authorization from “one of two high-ranking SFPD leaders,” to make lethal use of an armed robot, reports the San Francisco Chronicle. The Chronicle also notes that “the department said it has no plans to outfit robots with a gun,” instead leaving the killing to explosives mounted on robots.

Past precedent

There is relevant history here: In the early hours of July 8, 2016, police in Dallas attached an explosive to a Remotec Andros Mark V-A1 and used it to kill an armed suspect. The night of July 7, the suspected shooter had fired on seven police officers, killing five. Dallas police surrounded the suspect and exchanged gunfire during a five-hour standoff in a parking garage. The Dallas Police Department had operated this particular Remotec Andros bomb squad robot since 2008.

On that night in July, the police attached a bomb to the robot’s manipulator arm. Operated by remote control, the robot’s bomb killed the suspect, while the lifeless robot made it through the encounter with only a damaged manipulator arm. The robot gripper arms are designed to transport and relocate found explosives to a place where they can be safely detonated, sometimes with charges placed by the robot.

While Dallas was a groundbreaking use of remote-control explosives, it fit into a larger pattern of police using human-set explosives, most infamously the 1985 MOVE bombing by Philadelphia police, when a helicopter delivered two bombs onto a rowhouse, burning it down along with 65 other houses.

Flash bang grenades are a less-lethal weapon used by police and militaries, creating a bright light and loud sound as a way to incapacitate a person before police officers enter a building. These weapons, which are still explosive, can cause injury on contact with skin, and have set fires, including one that burned a home and killed a teenager in Albuquerque, New Mexico in July 2022.

The authorization to arm robots adds one more category of lethal tools to an institution already permitted to do violence on behalf of the state. 

Remote possibilities

Bomb squad robots, which come in a range of models and can cost well into the six figures, are a specialized piece of equipment. They are often tethered, with communications and controls running down a large wire to humans, ensuring that the robot can be operated despite interference in wireless signals. One of the ways these robots are used is to facilitate negotiations, with a microphone and speaker allowing police to safely talk to a cornered suspect. In 2015, California Highway Patrol used a bomb squad robot to deliver pizza to a knife-armed man standing over a highway overpass, convincing the man to come down.

The possibility that these robots could instead be used to kill, as one was in 2016, makes it harder for the robots to be used for non-violent resolution of crises with armed people. In the Supervisors’ hearing, references were made to both the 2017 Mandalay Bay shooting in Las Vegas and the 2022 school shooting in Uvalde, though each is a problem at best tangentially related to armed robots. In Las Vegas, the shooter was immediately encountered by an armed guard, and when police arrived they were able to breach rooms with explosives they carried. In Uvalde, the use of explosives delivered by robot would only have endangered children, who were already waiting for the excruciatingly and fatally long police response to the shooter.

By allowing police to turn a specialized robot into a weapon, San Francisco is solving for a problem that does not meaningfully exist, and is making a genuinely non-lethal tool into a threat. It also sets a precedent for the arming of other machines, like inexpensive quadcopter drones, increasing the distance between police and suspects without leading to arrests or defused situations. 

This robot’s delicate touch scoops up liquid droplets without causing a splash https://www.popsci.com/technology/robot-liquid-droplet/ Fri, 25 Nov 2022 17:00:00 +0000 https://www.popsci.com/?p=490901
Hazardous liquids are no problem for these tiny robotic scoopers. Colorado State University

Don't be fooled by the new robotic gripper's sensitivity—it's designed to handle the most hazardous materials out there.

Robots are becoming more agile and less clunky by the day, but few look more precise than one recently designed by a group of researchers at Colorado State University (CSU). As explained in a paper published in the Royal Society of Chemistry journal Materials Horizons, a collaboration between two labs previously working on separate applied technologies has produced a small robotic gripper that’s so delicate it can handle individual droplets of liquid.

But don’t be fooled by the tenderness—that careful touch is meant to handle some extremely hazardous materials.

[Related: These robots can build almost anything—including clones of themselves.]

As recently announced by CSU, by combining “soft robotics” with “superomniphobic coatings,” scientists have developed a flexible and lightweight machine whose finger-sized gripper is composed of only a few dollars’ worth of nylon fiber and adhesive tape. Despite the robot’s small size and low cost, the embedded artificial muscle is 100 times stronger than its human counterpart.

That manipulator is then treated with the aforementioned superomniphobic material, a substance that pretty much does what the name implies, i.e. it resists getting wet from nearly every kind of liquid, even when tilting or moving. As a result, the soft robot can interact with liquid droplets without disturbing their surface tension, thus allowing it to “grasp, transport, and release individual droplets as if they were flexible solids,” per the university. What’s more, the materials and economical design allows each gripper-bot to be completely disposable, thus reducing the chances of human contact even more.

[Related: This agile robot dog uses a video camera in place of senses.]

As one researcher explains in CSU’s write-up, they previously had difficulty attracting attention for their research, a problem that evaporated almost completely after the COVID-19 pandemic’s onset. As such, the new device could soon open up an entirely new way to safely handle toxic, hazardous, or otherwise dangerous liquids without any direct human interference.

“We envision that our biofluid manipulators will not only reduce manual operations and minimize exposure to infectious agents, but also pave the way for developing inexpensive, simple and portable robotic systems, which can allow point-of-care operations, particularly in developing nations,” explains the team’s abstract.

These robots can build almost anything—including clones of themselves https://www.popsci.com/technology/robot-assembler-swarm/ Wed, 23 Nov 2022 16:30:00 +0000 https://www.popsci.com/?p=490594
MIT swarm robots constructing object using voxel building blocks
Prepare to familiarize yourself with 'voxels.' MIT News

The breakthrough robot swarms function as both the builders and final products.

It’s robots all the way down: Researchers at MIT’s Center for Bits and Atoms (CBA) are developing a mind-bending new variety of fully autonomous machines capable of working together to assemble almost any conceivable structure or product, including bigger iterations of themselves as their projects scale larger. The new findings—recently published in Nature Communications Engineering—synthesize years’ worth of similar research into a new system wherein both the robots and their assigned projects are composed of the same subunit materials called voxels, the volumetric equivalents of a two-dimensional pixel.

“[W]hile earlier voxels were purely mechanical structural pieces, the team has now developed complex voxels that each can carry both power and data from one unit to the next,” explains MIT News‘s recent writeup. “This could enable the building of structures that can not only bear loads, but also carry out work such as lifting, moving and manipulating materials—including the voxels themselves.”

[Related: Robots are coming to hotels, but how long will they stay?]

Credit: Amira Abdel-Rahman/MIT Center for Bits and Atoms

The innovations also rely on the robots’ capability to determine when they need to pause to build bigger versions of themselves to continue the job as the project’s size increases. At a certain point, the distances these tiny machines must travel renders them inefficient. By training them to recognize when this happens and when to construct necessary, larger iterations, the entire system can scale upwards indefinitely.
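That trade-off can be captured with a toy cost model: keep building with the current robot until round-trip travel time dominates, then spend time assembling a robot twice the size that covers distance proportionally faster. The Python sketch below is purely illustrative, with invented time and cost figures.

```python
# Toy version of the scale-up decision: keep building with the current
# robot until round-trip travel dominates, then pause to assemble a robot
# twice the size that moves proportionally faster. All figures invented.
def assembly_time(remaining_voxels, robot_size, place_time=1.0):
    """Rough model: each voxel takes a trip whose length grows with the
    structure, covered at a speed proportional to robot size."""
    speed = float(robot_size)
    return sum(place_time + i / speed for i in range(remaining_voxels))

def plan(structure_size, build_cost=500.0):
    robot_size, total, remaining = 1, 0.0, structure_size
    while remaining > 0:
        stay = assembly_time(remaining, robot_size)
        grow = build_cost + assembly_time(remaining, robot_size * 2)
        if grow < stay:
            total += build_cost
            robot_size *= 2          # pause and build a bigger robot
        else:
            total += stay
            remaining = 0            # finish with the current robot
    return robot_size, total

print(plan(500))  # e.g., grows a few times before finishing the build
```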

[Related: This agile robot dog uses a video camera in place of senses.]

The end result is a creation that is simultaneously the intended structure and the robot constructing it. “The robots themselves consist of a string of several voxels joined end-to-end,” explains MIT. “These can grab another voxel using attachment points on one end, then move inchworm-like to the desired position, where the voxel can be attached to the growing structure and released there.” While researchers note that a fully autonomous system of voxel-bots is still “years away,” recent strides showcase the jaw-dropping potential of iterative robotics and their potential ramifications within seemingly countless industries.

Other potential uses include building structures to aid in protection against sea level rise and coastal erosion, as well as 3D printed houses and space habitat construction. CBA’s director and paper co-author, Neil Gershenfeld, offers jumbo jet construction as an example: “[W]hen you make a jumbo jet, you need jumbo jets to carry the parts of the jumbo jet to make it. The final assembly of the airplane is the only assembly.”

This agile robot dog uses a video camera in place of senses https://www.popsci.com/technology/robot-dog-camera/ Tue, 22 Nov 2022 21:00:00 +0000 https://www.popsci.com/?p=490016
Four-legged robot walking across pile of boards and building materials during mobility test
Quadrupedal robots are all the rage right now. YouTube

A new spin on quadrupedal robotics relies only on prior virtual training and an onboard camera.

Another week, another uncanny advancement for the burgeoning four-legged robot “dog” industry: MIT Technology Review just previewed a quadrupedal creation, similar to Boston Dynamics’ Spot, capable of taking strolls over rocky terrain, climbing up stairs, and even hopping over small gaps while relying on only an onboard video camera and some AI reinforcement learning. Other robots may move similarly, or better, across environments, but the majority of them require internal mapping software. The new strutting dog-bot, however, represents a design breakthrough not only in the real-time assessment of its surroundings, but in the variety of settings it can successfully navigate.

[Related: Boston Dynamics starts a legal dog fight with competitor Ghost.]

Check out a video of the little machine in action—it’s pretty darn cute.

“Animals are capable of precise and agile locomotion using vision. Replicating this ability has been a long-standing goal in robotics,” the Carnegie Mellon research team explains in their YouTube video‘s description, before explaining that the robot’s systems were first trained through trial and error in simulation environments of areas like staircases and stepping stones. Once the team completed that phase, their four-legged walker utilized its onboard video camera to process what was in front of it while referring back to its previous reinforcement training to adapt as needed.
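In deployment, that amounts to a tight perception-to-action loop: grab a camera frame, run it through the policy trained in simulation, and send joint targets to the motors. The Python sketch below shows the shape of such a loop; the checkpoint name, input resolution, and hardware hooks are all hypothetical, not the Carnegie Mellon team’s code.

```python
# Schematic deployment loop for a vision-conditioned walking policy:
# grab a camera frame, run the policy trained in simulation, and send
# joint targets. The model file, input size, and hardware hooks are
# hypothetical stand-ins.
import cv2           # pip install opencv-python
import numpy as np

def load_policy(path):
    """Placeholder for loading the RL policy trained in simulation."""
    def policy(image, proprioception):
        # A real policy maps pixels + joint state to 12 joint targets
        return np.zeros(12)
    return policy

policy = load_policy("walker_policy.pt")   # hypothetical checkpoint
camera = cv2.VideoCapture(0)               # onboard camera
proprio = np.zeros(24)                     # stub joint angles/velocities

for _ in range(300):                       # ~10 s at 30 Hz
    ok, frame = camera.read()
    if not ok:
        break
    small = cv2.resize(frame, (96, 96))    # assumed policy input size
    targets = policy(small, proprio)
    # send_joint_targets(targets)          # hypothetical motor interface

camera.release()
```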

[Related: Ghost Robotics now makes a lethal robot dog.]

According to Technology Review, the lack of pre-trial mapping is a major leap forward (so to speak) for mobile robots, as a better, more consistent real-time analysis of surroundings could one day greatly widen their accessibility and deployment. Although the team’s quadruped still has some trouble with slippery surfaces and visually noisy environments like tall grass, the progress remains incredibly impressive ahead of the project’s big debut at next month’s Conference on Robot Learning.

Meta’s new AI can use deceit to conquer a board game world https://www.popsci.com/technology/meta-ai-bot-diplomacy/ Tue, 22 Nov 2022 20:00:00 +0000 https://www.popsci.com/?p=490109
map of europe and a globe
Aslı Yılmaz / Unsplash

It can play Diplomacy better than many humans. Here's how it works.

Computers are getting pretty good at a growing roster of arcade and board games, including chess, Go, Pong, and Pac-Man. Machines might even change how video games get developed in the not-so-distant future. Now, after building an AI bot that outbluffs humans at poker, scientists at Meta AI have created a program capable of even more complex gameplay: one that can strategize, understand other players’ intentions, and communicate or negotiate plans with them through chat messages.  

This bot is named CICERO, and it can play the game Diplomacy better than many human players. CICERO more than doubled the average score of its human opponents and placed in the top 10 percent of players across 40 games in an online league.

The program has been a work in progress for the past three years between engineers at Meta and researchers from Columbia, MIT, Stanford, Carnegie Mellon University, UC Berkeley, and Harvard. A description of how CICERO came together was published in a paper today in Science. The team is open sourcing the code and the model, and they will be making the data used in the project accessible to other researchers.

Diplomacy is originally a board game set in a stylized version of Europe. Players assume the role of different countries, and their objective is to gain control of territories by making strategic agreements and plans of action. 

“What sets Diplomacy apart is that it involves cooperation, it involves trust, and most importantly, it involves natural language communication and negotiation with other players,” says Noam Brown, a research scientist at Meta AI and an author on the paper. 

Although a special version of the game without the chat function has been used to test AI over the years, the progress with language models from 2019 onwards made the team realize that it might be possible to teach an AI how to play Diplomacy in full. 

But because Diplomacy had this unique requirement for collaboration, “a lot of the techniques that have been used for prior games just don’t apply anymore,” Brown explains. 

Previously, the team had run an experiment with the non-language version of the game, where players were specifically informed that in each game there would be one bot and six humans. “What we found is that the players would actively try to figure out who the bot was, and then eliminate that player,” says Brown. “Fortunately, our bot was able to pass as a human in that setting; they actually had a lot of trouble figuring out who the bot was, so the bot actually got first place in the league.” 

But with the full game of Diplomacy, the team knew that the bot wasn’t ready to pass the Turing test if natural language interrogations were involved. So during the experiment, players were not told that they were playing with a bot—a detail that was only revealed after the game ended. 

Making CICERO

To construct the Diplomacy-playing AI, the team built two separate data processing engines that feed into one another: one engine for dialogue (inspired by models like GPT-3, BlenderBot 3, LaMDA, and OPT-175B), and another for strategic reasoning (inspired by previous work like AlphaGo and Pluribus). Combined, the dialogue model, which was trained on a large corpus of text data from the internet and 50,000 human games from webDiplomacy.net, can communicate and convey intents that are in line with its planned course of action.

Credit: Meta AI

This works in the reverse direction as well. When other players communicate to the bot, the dialogue engine can translate that into plans and actions in the game, and use that to inform the strategy engine about next steps. CICERO’s grand plans are formulated by a strategic reasoning engine that estimates the best next move based on the state of the board, the content of the most recent conversations, moves that were made historically by players in a similar situation, and the bot’s goals. 

[Related: MIT scientists taught robots how to sabotage each other]

“Language models are really good these days, but they definitely have their shortcomings. The more strategy that we can offload from the language model, the better we can do,” Brown says. “For that reason, we have this dialogue model that conditions on the plans, but the dialogue model is not responsible for the plans.” So, the part of the program that does the talking is not the same as the part that does the planning.

The planning algorithm the bot uses is called piKL. It will make an initial prediction of what everyone is likely to do and what everyone thinks the bot will do, and refine this prediction by weighing the values of different moves. “When doing this iterative process, it’s trying to weigh what people have done historically given the dataset that we have,” says Brown. “It’s also trying to balance that with the understanding that players have certain objectives in this game, they’re trying to maximize their score and they’re going to not do very serious mistakes as they would minor mistakes. We’ve actually observed that this models humans much better than just doing the initial prediction based on human data.”
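Published descriptions of piKL frame the balance roughly as weighting each action by its estimated value while staying anchored to a human-imitation policy. The Python sketch below renders that one-step trade-off with made-up numbers; the full algorithm iterates this with re-estimated values, which is omitted here.

```python
# Schematic piKL-style policy: favor high-value moves while staying close
# to a human-imitation "anchor" policy. Positions and numbers are made up.
import math

def pikl_policy(anchor, values, lam):
    """pi(a) proportional to anchor(a) * exp(value(a) / lam).
    Large lam: play like the human model. Small lam: play greedily."""
    weights = {a: anchor[a] * math.exp(values[a] / lam) for a in anchor}
    total = sum(weights.values())
    return {a: w / total for a, w in weights.items()}

# Hypothetical position: humans usually hold, but supporting scores higher
anchor = {"hold": 0.6, "support_ally": 0.3, "attack": 0.1}
values = {"hold": 0.0, "support_ally": 0.5, "attack": 0.2}

for lam in (10.0, 1.0, 0.1):
    probs = pikl_policy(anchor, values, lam)
    print(lam, {a: round(p, 3) for a, p in probs.items()})
```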

Credit: Meta AI

“Deception exists on a spectrum” 

Consider the concept of deception, which is an interesting aspect of Diplomacy. In the game, before each turn, players will spend 5 to 15 minutes talking to each other and negotiating plans. But since this all happens in private, people can double deal. They can make promises to one person, and tell another that they’ll do something else.

But just because people can be sneaky doesn’t mean that’s the best way to go about the contest. “A lot of people when they start playing the game of Diplomacy they view it as a game about deception. But actually if you talk to experienced Diplomacy players, they think with a very different approach to the game, and they say it’s a game about trust,” Brown says. “It’s being able to establish trust with other players in an environment that encourages you to not trust anybody. Diplomacy is not a game where you can be successful on your own. You really need to have allies.” 

Early versions of the bot were more outright deceptive, but that actually caused them to perform quite poorly. Researchers then added filters to make the bot lie less, leading to much better performances. But of course, CICERO is not always fully honest about all of its intentions. And importantly, it understands that other players may also be deceptive. “Deception exists on a spectrum, and we’re filtering out the most extreme forms of deception, because that’s not helpful,” Brown says. “But there are situations where the bot will strategically leave out information.”

For example, if it’s planning to attack somebody, it will omit the parts of its attack plan in its communications. If it’s working with an ally, it might only communicate the need-to-know details, because exposing too much of its goals might leave it open to being backstabbed. 

“We’re accounting for the fact that players do not act like machines, they could behave irrationally, they could behave suboptimally. If you want to have AI acting in the real world, that’s necessary to have them understand that humans are going to behave in a human-like way, not in a robot-like way,” Brown says. “Having an agent that is able to see things from other perspectives and understand their point of view is a pretty important skillset going forward in human-AI interactions.” 

Brown notes that the techniques that underlie the bot are “quite general,” and he can imagine other engineers building on this research in a way that leads to more useful personal assistants and chatbots.

Workplace automation could affect income inequality even more than we thought https://www.popsci.com/technology/automation-workers-robots/ Mon, 21 Nov 2022 16:30:00 +0000 https://www.popsci.com/?p=489320
Blue robotic arm in factory working on assembly line with plywood materials
"So-so automation" benefits companies far more than workers. Deposit Photos

A new study from MIT and Boston University argues workplace automation is even more problematic than previous research shows.

As major companies continue to announce increasingly automated labor chains, many human employees are understandably worried their own jobs aren’t safe from impending robotic rollouts. Unfortunately, new research into automation’s history and long-term effects demonstrates that the potential negative effects could be worse than predicted.

A new study published in the journal Econometrica presents an unprecedented dive into the effects of robotic labor over the past four decades, revealing that the rise of the newly dubbed “so-so automation” exacerbates wage gaps between white and blue collar workers more than almost any other factor. “So-so automation” refers to industry robotics that save corporations large sums of money and eradicate lower-skilled human jobs in exchange for relatively minor productivity gains and consumer convenience, according to the paper.

[Related: Amazon’s latest warehouse robot is here.]

Taking into account a multitude of datasets and economic census breakdowns between 1980 to 2016, co-authors Daron Acemoglu of MIT and Boston University’s Pascual Restrepo estimate that automation “has reduced the wages of men without a high school degree by 8.8 percent and women without a high school degree by 2.3 percent, adjusted for inflation,” per an announcement this morning from MIT.

As the news release explains, although inflation-adjusted incomes for those with college and postgraduate degrees have steadily risen since 1980, the overall earnings for men without high school degrees have decreased by 15 percent. Multiple factors like diminished labor unions, market concentration, and other tech advancements all contribute to these issues, but Acemoglu and Restrepo’s evidence points to the rise of “so-so automation” systems to account for “50 to 70 percent of the changes or variation between group inequality.”

These “so-so automations” are more or less everywhere these days. MIT offers grocery store self-checkout kiosks as a prime example—while shoppers may enjoy slightly shorter checkout lines, they rarely bag as efficiently or well as trained human employees. What’s more, the labor cost isn’t eradicated by automated systems, but instead cleverly passed along to the shoppers themselves, who now bag their own goods without being paid for it. The grocery store chain saves vast amounts of money by not paying additional employees and consumers might enjoy a (sometimes) speedier experience, but the “so-so automation” doesn’t justify its existence when examined within the larger picture.

[Related: Amazon’s purchase of iRobot comes under FTC scrutiny.]

While some figures envision a vague future in which all menial jobs are replaced by automation, many analysts are far more concerned with the already disproportionate effects of industry robots felt by higher and lower educated workers, where the former demographic benefits far more than the latter. According to MIT, Acemoglu and Restrepo’s research presents “a more stark outlook in which automation reduces earnings power for workers and potentially reduces the extent to which policy solutions—more bargaining power for workers, less market concentration—could mitigate the detrimental effects of automation upon wages.”

Major companies like Amazon continue to push forward with warehouse automation while extolling their potential employee benefits, but as Acemoglu and Restrepo argue in their new study, human laborers deserve and require more rigorous protections in the face of workforce robotics’ long-term effects.

The post Workplace automation could affect income inequality even more than we thought appeared first on Popular Science.

How engineers taught a manta ray-inspired robot the butterfly stroke https://www.popsci.com/technology/butterfly-bot-ncstate/ Fri, 18 Nov 2022 19:00:00 +0000 https://www.popsci.com/?p=488751
manta ray-inspired swimming robot
Yin Lab@NCSU / YouTube

The engineers behind it claim this design allows the robot to be lighter, faster, and more energy efficient.

Making a robot that can swim well can be surprisingly difficult. Part of this is due to the fact that the physics of how organisms move in the water is often complicated, and hard to replicate. But that hasn’t stopped researchers from studying how ocean animals move so they can create better aquatic robots. 

A notable addition to this field comes from engineers at North Carolina State University, who came up with a manta ray-like robot that can do the butterfly stroke. A detailed description of their design is published this week in the journal Science Advances.

“To date, swimming soft robots have not been able to swim faster than one body length per second, but marine animals—such as manta rays—are able to swim much faster, and much more efficiently,” Jie Yin, an associate professor at NC State University, and an author on the paper, said in a press release. “We wanted to draw on the biomechanics of these animals to see if we could develop faster, more energy-efficient soft robots.” 

[Related: A tuna robot reveals the art of gliding gracefully through water]

As a result, the team put together two versions of a silicone “butterfly bot”: one that can reach average speeds of 3.74 body lengths per second, and another that can turn sharply to the left or right. Both are about the size of a human palm.

Unlike similar biology-inspired robot concepts in the past that use motors to directly operate the wings, the NC State team’s robot flaps with a “bistable” wing that snaps into two distinct positions like a hair clip, or a pop-up jumping toy. To alter the position of the curved, rotating wings, researchers used a tether to pump air into upper and lower chambers of the robot body. When the chambers inflate and deflate, the body bends up and down, making the wings snap back and forth. 

As the robots were tested in the aquarium, researchers saw that inflating the top chamber caused the soft body to bend upward, inducing a downstroke motion. Deflating that and inflating the bottom pneumatic chamber caused the body to bend downward, inducing an upstroke with the wings. During the upstroke-to-downstroke transition, the robot body is pushed deep into the water and then propelled forward. This design allows the soft robot to be lighter and more energy efficient. 
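
The alternating inflation cycle is simple enough to express as an open-loop control routine. Below is a minimal Python sketch of that idea; the valve interface (set_chamber) and the timing constants are invented stand-ins, not the NC State team’s actual control code.

```python
import time

# Hypothetical interface to the tethered air pump and valves; a
# stand-in, not the NC State team's actual hardware API.
def set_chamber(chamber: str, inflated: bool) -> None:
    """Open or close the valve feeding one pneumatic chamber."""
    state = "inflate" if inflated else "deflate"
    print(f"{state} {chamber} chamber")

def flap_cycle(period_s: float = 0.5) -> None:
    """One full wingbeat: downstroke, then upstroke.

    Inflating the top chamber bends the body upward, snapping the
    bistable wings through a downstroke; inflating the bottom
    chamber reverses the bend and produces the upstroke.
    """
    set_chamber("top", True)      # body bends up -> downstroke
    set_chamber("bottom", False)
    time.sleep(period_s / 2)

    set_chamber("top", False)     # body bends down -> upstroke
    set_chamber("bottom", True)
    time.sleep(period_s / 2)

if __name__ == "__main__":
    for _ in range(5):            # five wingbeats
        flap_cycle()
```

Because the snap-through wings store and release energy on their own, a routine this simple needs no sensors or feedback, which is part of what keeps the robot light.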

[Related: This amphibious drone hitchhikes like a suckerfish]

The first version of butterfly bot was built for speed. It holds a single drive unit that controls both of its wings. The second version was built for maneuverability. It has two connected drive units that let each wing be controlled independently. Flapping only one wing allows it to turn more easily. 

“This work is an exciting proof of concept, but it has limitations,” Yin said. “Most obviously, the current prototypes are tethered by slender tubing, which is what we use to pump air into the central bodies. We’re currently working to develop an untethered, autonomous version.”

Watch butterfly bot in action below:

The post How engineers taught a manta ray-inspired robot the butterfly stroke appeared first on Popular Science.

Boston Dynamics starts a legal dog fight with competitor Ghost https://www.popsci.com/technology/boston-dynamics-ghost-robotics-dog-accusation/ Wed, 16 Nov 2022 22:30:00 +0000 https://www.popsci.com/?p=487950
Onlookers film and photograph Boston Dynamics' Spot robot at a showcase
Should there be only one top dog? Deposit Photos

This comes shortly after Ghost added a sniper rifle to one of their models.

Boston Dynamics and Ghost Robotics have both offered similarly unsettling four-legged automatons to consumers for a few years now—although only one of those companies recently strapped a sniper rifle to their product. Turning your quadrupedal robot into a weapon apparently isn’t enough of a differentiation to prevent a legal battle, however, as TechCrunch and others report that Boston Dynamics has filed a lawsuit against Ghost Robotics alleging multiple patent infringements.

According to legal paperwork submitted on November 11, Boston Dynamics is accusing their competitor of blatantly copying seven patents for “core technology” related to Spot, the plaintiff company’s four-legged, dog-like robot. “Boston Dynamics’ early success with the Spot robot did not go unnoticed by competitors in the robotics industry, including Ghost Robotics,” argues a portion of the filing, specifically calling out Ghost Robotics’ Vision 60 and Spirit 40 quadruped products. Boston Dynamics also recounts that it sent Ghost a request to review its patents over the summer, followed by multiple cease and desist letters that went unanswered.

[Related: Boston Dynamics gave its dog-like robot a charging dock and an arm on its head.]

As TechCrunch also notes, although Boston Dynamics has previously sold products to law enforcement groups such as the New York Police Department (a partnership that ended last year), it still opposes weaponizing robots. Last month, the company even added its name to an open letter speaking out against the practice, alongside companies including Clearpath Robotics and ANYbotics. “We believe that adding weapons to robots that are remotely or autonomously operated, widely available to the public, and capable of navigating to previously inaccessible locations where people live and work, raises new risks of harm and serious ethical issues,” reads a portion of the letter. “Weaponized applications of these newly-capable robots will also harm public trust in the technology in ways that damage the tremendous benefits they will bring to society.”

Compare this to Ghost Robotics, whose CEO once vowed, “We’re not going to dictate to our government customers how they use the robots” shortly after debuting one of its aforementioned four-legged robots with SWORD Defense Systems’ Special Purpose Unmanned Rifle (SPUR) mounted to it at a trade show.

Ghost Robotics had not yet responded to the allegations at the time of writing.

The post Boston Dynamics starts a legal dog fight with competitor Ghost appeared first on Popular Science.

New robot moves Amazon towards increased warehouse automation https://www.popsci.com/technology/amazon-sparrow-robot/ Mon, 14 Nov 2022 21:00:00 +0000 https://www.popsci.com/?p=486760
Amazon Sparrow robot sorting products into boxes at warehouse facility
Amazon's Sparrow robot could potentially rollout as early as next year. Amazon

The company says the new Sparrow robot is meant to reduce workplace injuries.

Last Thursday, Amazon announced the introduction of Sparrow, its first robotic system designed to identify, select, and handle millions of individual warehouse inventory items. At the same time, Sparrow will supposedly minimize employees’ repetitive tasks and improve worker safety.

Utilizing a combination of AI, computer vision, and a suction-cup “hand,” Sparrow is reportedly able to handle roughly 65 percent of all pre-packaged products available on Amazon’s website, according to the company’s own description. “Working with our employees, Sparrow will take on repetitive tasks, enabling our employees to focus their time and energy on other things, while also advancing safety,” reads Amazon’s official press release, which also describes the new system as “a major technological advancement to support our employees.”

[Related: Four workers die in Amazon warehouses across 22 days.]

As Business Insider reports, however, some workers are worried about their employer’s true motives behind Sparrow’s impending rollout. “[It] will take my job,” one warehouse worker told the outlet, who chose to remain anonymous for fear of company retaliation.

Amazon first introduced robots into its workforce in 2012, and has since deployed 520,000 robotic drive units globally, capable of a variety of warehouse tasks. Sparrow will join the company’s previously announced Robin and Cardinal systems, both of which are meant to streamline and speed up warehouse tasks while supposedly freeing human laborers of mundane, repetitive, and often potentially dangerous responsibilities.

The self-described “Earth’s Best Employer” has a well-documented history of controversy. For example, its on-the-job injury and fatality rates far surpass industry averages, and workforce turnover is so high that the company may “run out of prospective workers” in the US by 2024, according to one report. Although the company argues Sparrow’s imminent rollout will reduce the likelihood of warehouse workers hurting themselves while trying to maintain Amazon’s productivity quotas, some workers believe this could only exacerbate the existing issues. “They want you to compete with the robots. They want all the employees to compete with them. But who can win against a robot?” Mohamed Mire Mohamed, a former Amazon employee and current labor organizer, told Business Insider.

[Related: Amazon’s new warehouse employee training exec used to manage private prisons.]

While Amazon claims its expensive interest in robotics will ultimately be a net positive for employees, critics argue the reality will be far more automation at the expense of actual human positions within the company. “Instead of providing high-quality jobs and addressing the safety crisis it has created in its warehouses and on our roads, Amazon is investing in ways to maximize profits to the detriment of working people,” a spokesperson for Athena Coalition, a grassroots coalition focused on Amazon, said in a statement provided to PopSci. “If Amazon were concerned with keeping workers safe, they would stop union-busting, pay livable wages, and end the invasive surveillance and punitive management practices that are the real cause of the corporation’s safety crisis.”

Earlier this morning, The New York Times also revealed that Amazon is planning on cutting an estimated 10,000 jobs from its overall workforce, primarily within its “devices organization… as well as at its retail division and in human resources.” With over 1.5 million employees across the globe, the layoffs would represent less than 1 percent of the company’s total.

The post New robot moves Amazon towards increased warehouse automation appeared first on Popular Science.

Scientists developed a microrobotic finger that can tickle a bug https://www.popsci.com/technology/microrobotic-finger-insect/ Mon, 14 Nov 2022 16:00:00 +0000 https://www.popsci.com/?p=486620
Pillbug curled into ball
Don't be shy, little guy! Deposit Photos

The advancement allows humans to physically interact with some of the most delicate animals.

It’s relatively easy to observe and study the world’s insects, but it’s another thing entirely to safely interact with them physically. Take a pillbug, for example—you can watch them live their tiny pillbug lives all day long, but any attempt to handle them at best only annoys the little creatures… and at worst, literally and figuratively squashes their future plans.

[Related: The monarch butterfly is scientifically endangered. So why isn’t it legally protected yet?]

The days of clumsy interactions with the itty-bitty world may be drawing to a close, however: Researchers at Japan’s Ritsumeikan University recently detailed advancements in micro-robotics that allow for unprecedented physical interactions with extremely small subjects. As described in a paper published last month via Scientific Reports, developers have created “microfingers” that use artificial muscle actuators and tactile sensors to provide a “haptic teleoperation robot system,” which they then tested on the aforementioned pillbugs. Apparently, the results were extremely successful, although judging from the illustration provided, it sure looks like Ritsumeikan University invented a very ingenious way to finally achieve a truly adorable goal:

That’s right. We can tickle bugs now.

Insects photo

As researchers explained in their paper, while microsensors have previously been used to measure forces exerted by walking and flying insects, most other studies focused on measuring insect behavior. Now, however, with the new robotic glove, “a human user can directly control the microfingers,” says study lead Professor Satoshi Konishi, adding, “This kind of system allows for a safe interaction with insects and other microscopic objects.”
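
At the heart of any haptic teleoperation system is a pair of scaling maps: the operator’s motion is scaled down to drive the tiny finger, and the tiny sensed forces are scaled up for the operator’s glove. The Python sketch below illustrates that general idea; the interfaces and gain values are assumptions for illustration, not the Ritsumeikan team’s actual system.

```python
# Generic sketch of bilateral scaling in haptic teleoperation.
# The gains below are illustrative assumptions, not values from
# the Ritsumeikan University paper.

MOTION_SCALE = 0.01   # 10 mm of operator travel -> 0.1 mm at the microfinger
FORCE_GAIN = 500.0    # 10 mN sensed -> 5 N displayed to the operator

def teleop_step(operator_finger_mm: float, sensed_force_mN: float) -> tuple:
    """One control tick: map operator motion down, sensed force up."""
    microfinger_target_mm = operator_finger_mm * MOTION_SCALE
    haptic_display_N = (sensed_force_mN / 1000.0) * FORCE_GAIN
    return microfinger_target_mm, haptic_display_N

# Example: operator moves 10 mm; a pillbug leg pushes back with 10 mN.
target, feedback = teleop_step(operator_finger_mm=10.0, sensed_force_mN=10.0)
print(f"microfinger target: {target} mm, haptic feedback: {feedback} N")
```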

[Related: Scientists made the highest-ever resolution microscope.]

To test their new device, researchers fixed a pillbug in place using a suction tool, then used their microfinger setup to apply an extremely small amount of force on the bug to measure its legs’ reaction—10 mN (millinewtons), to be exact. While the device currently serves as a proof-of-concept, researchers are confident the advancement can pave the way for more accurate and safe interactions with the microworld around us. The paper’s authors also noted the potential to combine their technology with augmented reality systems in the future. Hopefully, this implies they’ll be able to see the bugs laughing when they’re being tickled.

The post Scientists developed a microrobotic finger that can tickle a bug appeared first on Popular Science.

One of nature’s tiniest acrobats inspired a leaping robot https://www.popsci.com/technology/one-of-natures-tiniest-acrobats-inspired-a-leaping-robot/ Tue, 08 Nov 2022 20:30:18 +0000 https://www.popsci.com/?p=485177
Springtail insect under the microscope
The robot is cuter. Deposit Photos

A minuscule insect-like animal called the springtail lives atop water. Researchers just made its robotic sibling.

The springtail is a tiny, fascinating semiaquatic invertebrate capable of escaping predators by impressively leaping ten times its height, performing a midair U-turn, and finally landing atop the water’s surface. Although there are thousands of known springtail species in nature, the animal remains relatively obscure despite its astounding capabilities. Thanks to a closer examination, however, researchers at the Georgia Institute of Technology and South Korea’s Ajou University have not only gained a better understanding of the creature’s acrobatic skills, but recently mimicked those movements in their own penny-sized robotic imitator. The implications could one day improve the movement of robots much larger than a grain-sized springtail. The authors recently published their findings in the Proceedings of the National Academy of Sciences.

[Related: Watch a snake wearing robot trousers strut like a lizard.]

Per a recent report from The New York Times, biologists and keen-eyed observers previously believed springtails’ evasive maneuvers were largely random and uncontrolled. The key to a springtail’s gymnastics is a tiny organ called a furcula, which slaps the water underneath it to launch the animal into the air. In less than 20 milliseconds following liftoff (a world record for speed, by the way), springtails manage to orient themselves so as to land on their hydrophilic collophores—tubelike appendages capable of holding water and sticking to surfaces, thus allowing the springtails to sit comfortably atop ponds and lakes.

[Related: Watch this penny-sized soft robot paddle with hydrogel fins.]

Using a combination of machine training and observations, researchers were then able to construct a tiny, relatively simple robot that mimics springtails’ movements, down to their ability to accurately land around 75 percent of the time. Actual springtails, by comparison, stick 85 percent of their landings.

While extremely small, the robotic springtails’ results could help developments in the fields of engineering, robotics, and hydrodynamics, according to Kathryn Dickson, a program director at the National Science Foundation, which partially funded the research, via a news release. Researchers also hope that further fine-tuning and study will allow them to gain insights into the evolutionary origins of flight in various organisms, as well as implement their advancements on other tiny robots used in water and airborne studies.

The post One of nature’s tiniest acrobats inspired a leaping robot appeared first on Popular Science.

Google is testing a new robot that can program itself https://www.popsci.com/technology/google-ai-robot-code-as-policies/ Sat, 05 Nov 2022 11:00:00 +0000 https://www.popsci.com/?p=484325
Code as Policies robot demo at Google AI event
Charlotte Hu

Human operators can type in instructions like "pick up the yellow block" and the robot will do the rest.

Writing working code can be a challenge. Even relatively easy languages like HTML require the coder to understand the specific syntax and available tools. Writing code to control robots is even more involved and often has multiple steps: There’s code to detect objects, code to trigger the actuators that move the robot’s limbs, code to specify when the task is complete, and so on. Something as simple as programming a robot to pick up a yellow block instead of a red one is impossible if you don’t know the coding language the robot runs on. 

But Google’s robotics researchers are exploring a way to fix that. They’ve developed a robot that can write its own programming code based on natural language instructions. Instead of having to dive into a robot’s configuration files to change block_target_color from #FF0000 to #FFFF00, you could just type “pick up the yellow block” and the robot would do the rest. 

Code as Policies (or CaP for short) is a coding-specific language model developed from Google’s Pathways Language Model (PaLM) to interpret natural language instructions and turn them into code it can run. Google’s researchers trained the model by giving it examples of instructions (formatted as code comments written by the developers to explain what the code does for anyone reviewing it) and the corresponding code. From that, it was able to take new instructions and “autonomously generate new code that re-composes API calls, synthesizes new functions, and expresses feedback loops to assemble new behaviors at runtime,” Google engineers explained in a blog post published this week. In other words, given a comment-like prompt, it could come up with some probable robot code. Read the preprint of their work here.

To get CaP to write new code for specific tasks, the team provided it with “hints,” like what APIs or tools were available to it, and a few instructions-to-code paired examples. From that, it was able to write new code for new instructions. It does this using “hierarchical code generation” which prompts it to “recursively define new functions, accumulate their own libraries over time, and self-architect a dynamic codebase.” This means that given one set of instructions once, it can develop some code that it can then repurpose for similar instructions later on.
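
To make the pattern concrete, here is a minimal sketch of the few-shot, comments-to-code prompting idea the researchers describe. Everything in it is a hypothetical stand-in: the llm_complete function and the tiny robot API (get_objects, pick_and_place, move_relative) are placeholders for illustration, not Google’s actual CaP code or PaLM interface.

```python
# A minimal sketch of comments-to-code prompting, in the spirit of
# Code as Policies. The LLM call and robot API are placeholders.

PROMPT_EXAMPLES = '''
# pick up the red block and put it on the blue block.
objs = get_objects()
pick_and_place(objs["red block"], objs["blue block"])

# move a little to the left.
move_relative(dx=-0.1, dy=0.0)  # "a little" mapped to 0.1 m
'''

def llm_complete(prompt: str) -> str:
    """Placeholder for a call to a code-generating language model."""
    # A real system would send `prompt` to a model such as PaLM and
    # return its completion; here we return canned code for the demo.
    return 'objs = get_objects()\npick_and_place(objs["yellow block"], objs["bin"])'

def instruction_to_code(instruction: str) -> str:
    """Format the new instruction as a comment, then ask for the code."""
    prompt = PROMPT_EXAMPLES + f"\n# {instruction}.\n"
    return llm_complete(prompt)

if __name__ == "__main__":
    code = instruction_to_code("put the yellow block in the bin")
    print(code)  # a real system would run this against the robot API
```

The few-shot examples double as the “hints” described above: they show the model which API calls exist and how vague phrases map onto concrete numbers.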

[Related: Google’s AI has a long way to go before writing the next great novel]

CaP can also use the arithmetic operations and logic of specific languages. For example, a model trained on Python can use the appropriate if/else and for/while loops when needed, and use third-party libraries for additional functionality. It can also turn ambiguous descriptions like “faster” and “to the left” into the precise numerical values necessary to perform the task. And because CaP is built on top of a regular language model, it has a few features unrelated to code—like understanding emojis and non-English languages. 

For now, CaP is still very much limited in what it can do. It relies on the language model it is based on to provide context to its instructions. If they don’t make sense or use parameters it doesn’t support, it can’t write code. Similarly, it apparently can only manage a handful of parameters in a single prompt; more complex sequences of actions that require dozens of parameters just aren’t possible. There are also safety concerns: Programming a robot to write its own code is a bit like Skynet. If it thinks the best way to achieve a task is to spin around really fast with its arm extended and there is a human nearby, somebody could get hurt. 

Still, it’s incredibly exciting research. With robots, one of the hardest tasks is generalizing their trained behaviors. Programming a robot to play ping-pong doesn’t make it capable of playing other games like baseball or tennis. Although CaP is still miles away from such broad real-world applications, it does allow a robot to perform a wide range of complex tasks without task-specific training. That’s a big step in the direction of one day being able to teach a robot that can play one game how to play another—without having to break everything down to new human-written code.

The post Google is testing a new robot that can program itself appeared first on Popular Science.

AI can teach robot dogs tricks for cheap https://www.popsci.com/technology/robot-dog-ai-training/ Thu, 03 Nov 2022 14:00:00 +0000 https://www.popsci.com/?p=483680
Quadrupedal robot with arm attachment walking across grass
It's AI training systems all the way down. New Scientist

Multiple rungs of AI training systems may lower the time-consuming, pricey hurdles for future robots.

Two things we know to be facts regarding mobile robots like Boston Dynamics’ Spot—they are definitely here to stay, and, at least for the time being, they are extremely pricey to produce. According to a new report courtesy of New Scientist, however, that cost could soon drop precipitously thanks to a training regimen overseen by multiple AI “coaches.” Judging from the video clip, we might as well start rolling out the red carpet—they certainly will have no trouble walking a straight line down one, anyway.

Thanks to a team of researchers at Carnegie Mellon University in Pennsylvania, a four-legged, doglike robot featuring an arm attachment recently managed to master complex maneuvers and multiple tasks, overseen by a combination of human trainers and AI systems. Using reinforcement learning methods in both real-world and simulated environments, the developers first trained an AI to control the robot’s four legs independently from its arm extension, then synced the two systems into one walking, grabbing machine.

[Related: The Boston Dynamics robots are surprisingly good dancers.]

Additionally, the team used a simulation-taught AI to “coach” a second, similar AI, which would then attempt to mimic its tutor. This second AI was applied to the four-legged robot for physical training while relying only on information gleaned through onboard cameras and sensors.
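
That teacher-student handoff resembles what roboticists often call policy distillation: the student network never sees the simulator’s privileged state, only the onboard observations, and is trained to reproduce the teacher’s actions. Here is a minimal PyTorch sketch of the general idea; the network sizes and data source are illustrative assumptions, not the Carnegie Mellon team’s actual setup.

```python
import torch
import torch.nn as nn

# Illustrative dimensions; assumptions, not the CMU team's setup.
PRIV_DIM, ONBOARD_DIM, ACT_DIM = 64, 32, 12

# Teacher: trained in simulation with privileged state (assumed pretrained).
teacher = nn.Sequential(nn.Linear(PRIV_DIM, 128), nn.ReLU(), nn.Linear(128, ACT_DIM))
# Student: sees only onboard camera/sensor features.
student = nn.Sequential(nn.Linear(ONBOARD_DIM, 128), nn.ReLU(), nn.Linear(128, ACT_DIM))

opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(1000):
    # Stand-ins for logged rollouts: paired privileged and onboard
    # observations of the same moments in simulation.
    priv_obs = torch.randn(256, PRIV_DIM)
    onboard_obs = torch.randn(256, ONBOARD_DIM)

    with torch.no_grad():
        target_actions = teacher(priv_obs)  # what the tutor would do

    # Supervised loss: push the student to imitate the teacher.
    loss = nn.functional.mse_loss(student(onboard_obs), target_actions)
    opt.zero_grad()
    loss.backward()
    opt.step()
```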

[Related: Boston Dynamics gave its dog-like robot a charging dock.]

The results were both impressive and uncanny, as the above video showcases. Among other feats, “the AI learned on its own to bend and stretch certain legs to help maximize the reach of the robot’s arm as it tossed cups into a garbage bin or wiped a whiteboard with an eraser,” per New Scientist. For now, the miniature prototype doesn’t appear strong enough to toss anything bigger than small dinnerware into the garbage bin, so there’s no need to fear being picked up and chucked around by the AI-trained creature.

Usually, this kind of synchronization and programming costs a lot of time and money, but the researchers’ breakthroughs allowed them to produce their own quadruped robot for the relatively low price of $6,300—roughly one-tenth the cost of a Spot from Boston Dynamics. The research team hopes its novel workarounds will make both future four- and two-legged assistant robots much more affordable for general consumers. Regardless of cost, robots from some of the most notable tech companies, like Tesla’s recent Optimus prototype, could likely learn a thing or two from this little guy and its AI coaches.

The post AI can teach robot dogs tricks for cheap appeared first on Popular Science.

Watch this penny-sized soft robot paddle with hydrogel fins https://www.popsci.com/technology/hydrogel-aquabot/ Sat, 29 Oct 2022 11:00:00 +0000 https://www.popsci.com/?p=481993
hydrogel swimming robot
Science Robotics

Researchers believe that this robot design can be one day repurposed for applications like biomedical devices.

Engineers have long been interested in making water-faring robots. We’ve seen robots fashioned after tuna, suckerfish, octopuses, and more. Now a new type of aquabot is swimming onto the scene. Researchers from Korea University and Ajou University designed insect-sized robots that can wade through the water employing hydrogel fins and paddles. They describe the process behind making and operating these robots in a new paper out this week in Science Robotics.

Hydrogels are 3D structures made from crosslinked molecules. They can be made from synthetic or natural materials and tend to swell in water. Some hydrogels can even change shape in response to external stimuli like variations in pH, temperature, ionic strength, solvent type, electric and magnetic fields, light, and more. 

The medical industry has been exploring how to use hydrogels for applications such as wound dressing. But robot engineers have also been interested in using hydrogel to make soft robots—just check out this nifty drug-delivering jellyfish-like aquabot from 2008. Of course, the design of such robots is always being reimagined and optimized.

[Related: A tuna robot reveals the art of gliding gracefully through water]

The new, free-floating bots from the Korea University and Ajou University team have porous hydrogel paddles that are coated with a webbing of nanoparticles, or “wrinkled nanomembrane electrodes.” Onboard electronics were shrunk down to the size of a penny and embedded in the body of the robot. Inside the body is an actuator, or motor, that can apply different voltages of electric potential across the hydrogel. To move the robot, researchers can alter an external electric field to induce electroosmosis-driven hydraulic pumping—where charged surfaces change the water flow around them.

“In particular, our soft aquabots based on WNE actuators could be constructed without any high-voltage converter and conventional transmission system, which have considerably limited the miniaturization of soft robots,” the researchers wrote. “However, to be a fully autonomous robotic system, sensing components should be further integrated for the recognition of position and orientation of the robot. We believe that our approach could provide a basis for developing lightweight, high-performance soft actuators and robots at small scale that require a variety of motions under electric stimuli.”

Watch the tiny aquabot in motion below:

The post Watch this penny-sized soft robot paddle with hydrogel fins appeared first on Popular Science.

This spooky robot uses inflatable tentacles to grab delicate items https://www.popsci.com/technology/harvard-tentacle-robot/ Tue, 25 Oct 2022 15:30:00 +0000 https://www.popsci.com/?p=480653
Harvard robotic arm with tentacle filaments gripping succulent plant
Look at all those little tentacles. Harvard Microrobotics Lab/Harvard SEAS

Sometimes the best grabbing tool is a bunch of worm-like noodle fingers.

Harvard researchers, presumably to get into the Halloween spirit of things, have unveiled a horrifying new robotic arm with tentacle fingers capable of grasping extremely delicate objects like houseplants and glassware. The engineers at the university’s John A. Paulson School of Engineering and Applied Sciences (SEAS) used hollow rubber tubing to craft a robot hand featuring what looks like flesh-colored spaghetti that can coil around, or in, its subject before winding back up into a wormy, curly tangle. Watch Harvard’s video below to see the bizarrely delicate creation in action.

Robot arms often attempt to mimic human hands, which are evolutionarily primed for grasping objects thanks to our fingers and opposable thumbs. That said, our mammalian mitts are pretty complicated biological tools, and as such, are difficult to fully mirror with even some of the most advanced robotics. Instead of optimizing neural networks, sensors, feedback loops, and artificial intelligence systems, however, Harvard’s project leads relied on some relatively basic concepts like inflatable tubing. “The gripper relies on simple inflation to wrap around objects and doesn’t require sensing, planning, or feedback control,” reads Harvard’s recent writeup.

[Related: Meet the world’s speediest laundry-folding robot.]

Researchers came up with the idea from studying other animals—in this case, the anatomy of creatures like octopuses and jellyfish. Octopus arms don’t inflate, but instead rely on musculature and immensely powerful suction cups with “piston-like” grip abilities, while jellyfish tentacles rely on neurotoxic venom to stun and ensnare prey. Combine the muscular flexibility of an octopus with the tangling abilities of a jellyfish, and you start approaching Tentacle-Bot.

“Taking inspiration from nature, [researchers] designed a new type of soft, robotic gripper that uses a collection of thin tentacles to entangle and ensnare objects, similar to how jellyfish collect stunned prey,” the announcement continues, which isn’t disturbing in the least.

While each individual noodle is relatively weak, a group of them acting together can securely grasp items without damaging them in the process. The developer team hopes their invention could soon have real-world applications in industrial settings like agricultural production and distribution of even the most delicate tomatoes and bananas. One day, the wiggly robot could even work in the medical field, handling delicate tissues.

The post This spooky robot uses inflatable tentacles to grab delicate items appeared first on Popular Science.

Meet the world’s speediest laundry-folding robot https://www.popsci.com/technology/laundry-folding-robot/ Fri, 21 Oct 2022 15:30:00 +0000 https://www.popsci.com/?p=479945
Two armed laundry folding robot folding a t-shirt
Everyone loves laundry-bot, the bot that folds laundry! YouTube

The 'SpeedFolding' robot is still not as fast as a human.

Researchers at UC Berkeley’s AUTOLAB recently unveiled a new robot whose neural network allows it to fold 30-40 randomly disheveled garments per hour, an astounding new speed record for automated bots. While the machine still (unfortunately) trails human capabilities by a considerable amount, its cuteness coupled with its amusingly to-the-point name makes up for any remaining laundry lag time. Everyone, say hello to SpeedFolding.

[Related: Amazon buys Roomba maker iRobot for $1.7 billion.]

According to UC Berkeley designers and a subsequent writeup earlier this week via Ars Technica, SpeedFolding utilizes a neural network called BiManual Manipulation Network, a pair of industrial robot arms, and an overhead camera system to analyze each wrinkly, unfolded garment. From there, it can arrange the fabric into shape in under two minutes on average, with a 93 percent success rate.

SpeedFolding learned how to properly and quickly fold through studying 4,300 human and machine-assisted examples. Coupled with the extra arm (previous, similar robots usually employed only one limb or none at all), SpeedFolding essentially blows all past iterations out of the water, given that the previous record clocked in at just 4-6 folds per hour.

[Related: Amazon’s purchase of iRobot comes under FTC scrutiny.]

The AUTOLAB team employed an ABB YuMi industrial robot featuring gripper fingertips “extended by small 3D printed teeth to improve grasping.” As Ars Technica notes, a similar product runs for about $58,000, so don’t expect to see SpeedFolding in households anytime soon. In the meantime, however, homeowners are welcome to automate their vacuuming with something like a Roomba, whose maker, iRobot, was recently acquired by Amazon to the tune of $1.7 billion.

The post Meet the world’s speediest laundry-folding robot appeared first on Popular Science.

Google is training robots to interact with humans through ping pong https://www.popsci.com/technology/google-robot-ping-pong/ Wed, 19 Oct 2022 23:00:00 +0000 https://www.popsci.com/?p=479445
ping pong paddle and ball
Lennart Schneider / Unsplash

Here's how a machine learns to rally.

Yesterday, Google Research unveiled two new projects it’s been working on with a table tennis-playing robot. The Robotics team at Google taught a robot arm to play 300+ shot rallies with other people and return serves with the precision of “amateur humans.” While this might not sound that impressive given how bad some people are at table tennis, the same techniques could be used to train robots to perform other “dynamic, high acceleration tasks” that require close human-robot interaction. 

Table tennis is an interesting task for robots to learn because of two complementary properties: It requires both fast and precise movements in a structured game that occurs in a fixed and predictable environment. The learning algorithm the robot relies on to make decisions has to work hard to get good, but the confines of a table tennis table limit how much of the world it has to contend with. It does help that playing table tennis is a task that requires two parties: the robot can play with another robot (or simulation) or an actual human to train. All this makes it a great setup for exploring human-robot interaction and reinforcement learning techniques (where the robot learns from doing).

Google engineers designed two separate projects using the same robot: Iterative-Sim2Real, which will be presented at CoRL later this year, and GoalsEye, which will be presented at IROS next week. Iterative-Sim2Real is the program that trained the robot to play 300-shot cooperative rallies with humans, while GoalsEye allows it to return serves to a specific target point on the table with amateur human-like precision.

Iterative-Sim2Real is an attempt to overcome the “chicken and egg problem” of teaching machines to mimic human behaviors. The research team explains that if you don’t have a good robot policy (a set of rules for the robot) to begin with, then you can’t collect high-quality data on how people will interact with it. But, without a human behavior model to start with, you can’t come up with the robot policy in the first place. One alternative solution is to exclusively train robots in the real world. However, this process is “often slow, cost-prohibitive, and poses safety-related challenges, which are further exacerbated when people are involved.” In other words, it takes a long time and people can get hurt by robot arms swinging table tennis bats around.

Iterative-Sim2Real sidesteps this problem by using a very simple model of human behavior as a starting point and then training the robot both with a simulation and a human in the real world. After each iteration, both the human behavior model and the robot policy are refined. Using five human subjects, the robot trained with Iterative-Sim2Real outperformed an alternative approach called sim-to-real plus fine-tuning. It had significantly fewer rallies that ended in less than five shots and its average rally length was 9 percent longer. 
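
That alternating structure is easier to see written out as a loop. The Python sketch below captures the back-and-forth in skeleton form; all of the function bodies are hypothetical stubs standing in for Google’s actual training pipeline, not its real code.

```python
# Skeleton of an iterative sim-to-real loop in the spirit of
# Iterative-Sim2Real. All functions are hypothetical stubs.

def fit_human_model(rally_logs: list) -> dict:
    """Fit an (initially very crude) model of human play from real rallies."""
    return {"n_rallies": len(rally_logs)}  # placeholder

def train_policy_in_sim(human_model: dict) -> dict:
    """Train the robot policy against the current human model in simulation."""
    return {"trained_against": human_model}  # placeholder

def collect_real_rallies(policy: dict, n: int = 20) -> list:
    """Deploy the policy with real human partners and log the rallies."""
    return [f"rally-{i}" for i in range(n)]  # placeholder

rally_logs: list = []                        # no real data at the start
human_model = fit_human_model(rally_logs)    # very simple initial model

for iteration in range(5):
    policy = train_policy_in_sim(human_model)
    rally_logs += collect_real_rallies(policy)
    human_model = fit_human_model(rally_logs)  # both refine each round
```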

GoalsEye, on the other hand, set out to tackle a different set of training problems, and taught the robot to return the ball to an arbitrary location such as “the back left corner” or “just over the net on the right side.” Imitation learning—where a robot develops a play strategy derived from human performance data—is hard to conduct in high-speed settings. There are so many variables affecting how a human hits a ping pong ball that tracking everything necessary for a robot to learn is practically impossible. Reinforcement learning is typically good for these situations but can be slow and sample-inefficient—especially at the start. (In other words, it takes a lot of repetitions to develop a fairly limited play strategy.)

GoalsEye attempts to overcome both sets of issues using an initial “small, weakly-structured, non-targeted data set” that enables the robot to learn the basics of what happens when it hits a ping pong ball and then allowing it to self-practice to teach it to hit the ball precisely to specific points. After being trained on the initial 2,480 demonstrations, the robot was able to return a ball to within 30 centimeters (~1 foot) only 9 percent of the time. But after self-practicing for ~13,500 shots, it was accurate 43 percent of the time. 
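
One common recipe for this kind of goal-conditioned self-practice is to bootstrap from the demonstrations and then relabel each practice shot with the landing point it actually achieved, so every attempt becomes a valid training example for some goal. The sketch below illustrates that general idea; it is a generic self-practice loop with assumed helper functions, not Google’s GoalsEye code.

```python
import random

# Generic goal-conditioned self-practice loop, in the spirit of
# GoalsEye. All helper functions are assumed placeholders.

def sample_goal() -> tuple:
    """Pick a target landing point on the opponent's half of the table."""
    return (random.uniform(0.0, 1.37), random.uniform(0.0, 0.76))  # meters

def execute_swing(policy: dict, goal: tuple) -> tuple:
    """Run the policy aiming at `goal`; return where the ball landed."""
    noise = (random.gauss(0, 0.2), random.gauss(0, 0.2))
    return (goal[0] + noise[0], goal[1] + noise[1])  # placeholder physics

def update_policy(policy: dict, goal: tuple) -> dict:
    """Supervised update: treat the achieved landing point as the label."""
    policy["n_examples"] = policy.get("n_examples", 0) + 1
    return policy

policy = {"n_examples": 2480}   # bootstrapped from the initial demos

for shot in range(13_500):      # the self-practice phase
    goal = sample_goal()
    landed_at = execute_swing(policy, goal)
    # Hindsight relabeling: the swing is a correct example of hitting
    # wherever the ball actually landed, even if it missed `goal`.
    policy = update_policy(policy, goal=landed_at)
```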

While teaching robots to play games might seem trivial, the research team contends that solving these kinds of training problems with table tennis has potential real-world applications. Iterative-Sim2Real allows robots to learn from interacting with humans while GoalsEye shows how robots can learn from unstructured data and self-practice in a “precise, dynamic setting.” Worst-case scenario: If Google’s big goals don’t pan out, at least they could build a robot table tennis coach.

The post Google is training robots to interact with humans through ping pong appeared first on Popular Science.

ART, the turtle robot, gets by swimmingly in water and on land https://www.popsci.com/technology/amphibious-turtle-robot/ Wed, 12 Oct 2022 20:00:00 +0000 https://www.popsci.com/?p=477186
ART turtle robot on the grass
This little robot can take on adventures on solid ground and underwater. Courtesy of the lab of Rebecca Kramer-Bottiglio

The best of both worlds.

As a society, we can’t seem to get enough of robots that remind us of animals. We want to swim with animatronic dolphins, make them more social like our pets, and even use robotics to give legs to animals that haven’t had them in millions of years. Not to mention, we’ve learned a lot about how to make robots better from looking at our animal friends like octopuses, suckerfish, and insects. But scientists at Yale have turned to another beloved creature to master the art of robotic amphibiosity.

In a new study out today in Nature, researchers found a way to create a semi-aquatic robot nicknamed the Amphibious Robotic Turtle, or ART. The adorable robot, which looks a little bit like a long-legged crab, has morphing limbs that can change shape, stiffness, and behavior, using thermoset polymers and embedded heaters to warm up and reshape. Each of these limbs is operated by a shoulder joint with three motors. They take the shape of a cylindrical stumpy leg for walking (like a terrestrial tortoise on land) and can then turn into an aquatic flipper for swimming (like a sea turtle in the ocean).

[Related: The march of the penguins has a new star: an autonomous robot.]

“Terrestrial and aquatic turtles share similar bodies, with four limbs and a shell, but have distinctive limb shapes and gaits adapted for their specific environment,” Rebecca Kramer-Bottiglio, a professor of mechanical engineering at Yale University and the principal investigator of the study, said in a press release. “Sea turtles have elongated flippers for swimming, whereas land turtles and tortoises have rounded legs for load bearing while walking.”

The ART taking a little swim. Courtesy of the lab of Rebecca Kramer-Bottiglio.

The act of going back and forth between flippers and legs is what the authors call “adaptive morphogenesis.” For robots that may need to cross both land and sea doing things like monitoring shorelines, ocean farming, and measuring waves and currents, being able to “evolve on demand,” as Karl Ziemelis, chief physical sciences editor for Nature, called it in the press release, could come in handy. And while the robot isn’t fast (after all, it is a turtle robot), it is three times more efficient than a bipedal robot developed by the Massachusetts Institute of Technology, and performed similarly to four-legged robots like Ecole Polytechnique Federale De Lausanne’s Cheetah Cub and Tokyo Institute of Technology’s Titan V-III, according to the study.

[Related: A tiny crabby robot gets its scuttling orders from lasers.]

Theoretically, the ART doesn’t have to just stick to resembling a tortoise or sea turtle. “Although we focused on transitions between two discrete states, flipper and leg, intermediate shapes or even radically different shapes are certainly possible,” Robert Baines, a Ph.D. candidate at Yale and author of the study, tells New Scientist. For this little robot, it could be land and water today, infinite possibilities in the future. 

The post ART, the turtle robot, gets by swimmingly in water and on land appeared first on Popular Science.

Your upcoming Walgreens prescription may be refilled by a machine https://www.popsci.com/technology/walgreens-automated-prescription-refills/ Tue, 04 Oct 2022 12:00:00 +0000 https://www.popsci.com/?p=474563
Walgreens Retail Store Exterior at night
Walgreens aims to save $1 billion thanks to robot refillers. Deposit Photos

Faced with pharmacist shortages and increased demand, the drugstore giant looks to automation for relief.

Ongoing pandemic crises and a shortage of trained pharmacists have Walgreens doubling down on automation. As The Wall Street Journal reported over the weekend, the drug store giant is heavily investing in a network of prescription refill centers reliant on automated pharmabots to help with recent years’ skyrocketing medication orders. In doing so, the company hopes to cut pharmacists’ workloads “by at least 25 percent,” allowing them to focus more on services like vaccinations, patient advising, and clinical needs. The business also estimates it can save $1 billion annually by relying more on robot sorting.

[Related: At US hospitals, a drug mix-up is just a few keystrokes away]

According to Walgreens, there are currently eight automation centers providing medications to roughly 1,800 stores, although the company aims to boost that number to around 24 fulfillment locations by 2025. Each center is described as being around the size of a single city block—one of the largest reportedly fills 35,000 medications per day for 500 stores in Arkansas, Texas, and Louisiana. Eventually, Walgreens purportedly hopes to fill between 40 and 50 percent of all medications at these robot-centric distribution sites.

[Related: Is it safe to take expired medication?]

Job automation is on the rise throughout the labor industry, although strategies like this point more toward a reorganization of on-the-clock responsibilities than the sunsetting of actual positions. The WSJ notes that anywhere from dozens to over a hundred human operators still work within Walgreens’ automated refilling locales, and pharmacists remain in extremely high demand—the company has even gone so far as to offer up to $75,000 in signing bonuses for new pharmacist hires. For the time being, time-sensitive and controlled-substance prescriptions will also still require humans to fill them at store locations.

The decision to use automation as a means to free up pharmacists for more human-to-human caregiving roles also plays into Walgreens’ ongoing efforts to slowly rebrand itself as a modern healthcare provider, as opposed to simply a drugstore. In 2021, the company purchased a majority stake in the primary care network VillageMD, with an eye toward offering more services along these lines.

The post Your upcoming Walgreens prescription may be refilled by a machine appeared first on Popular Science.

Tesla’s Optimus humanoid robot can shuffle across stage, ‘raise the roof’ https://www.popsci.com/technology/teslas-ai-day-optimus/ Mon, 03 Oct 2022 15:00:00 +0000 https://www.popsci.com/?p=474360
Tesla Optimus robot prototype dancing on stage
Whether Tesla will "raise the roof" on robotics is yet to be determined. Tesla/YouTube

If you missed Elon Musk's (delayed) Friday night debut of Optimus, we've got you covered.

After weeks of speculation, Tesla’s AI Day kicked off late Friday evening with CEO Elon Musk at the helm to showcase the company’s strides in machine learning, self-driving vehicles, and robotics. Musk devoted extensive time to developments in Full Self-Driving vehicle technology and further progress on his team’s supercomputer system, Dojo. But the night’s focus was Optimus, Tesla’s worker robot meant to aid in “ending poverty.”

After last year’s mockup concept model—as demonstrated by a human dressed in a robot costume meant to illustrate Tesla’s vision—audiences finally got a real glimpse at the robot promised to usher in “fundamental change in civilization as we know it.”

[Related: What we know about Tesla’s supercomputer.]

Behold, Tesla’s “Bumble-C” Optimus working test mule prototype. You can also watch the robot in action here.

Bumble-C waving to Friday evening’s audience. Tesla

It’s important to note that Bumble-C is more a proof-of-concept than anything else, and certainly far from the finished product Tesla hopes to achieve. In any case, Bumble-C slowly ambled across stage (reportedly for the first time without a safety tether system), waved at audience members, and briefly “raised the roof,” but stopped short of any in-person capability demonstrations. Instead, Tesla showed pre-taped footage of the working prototype handling packages in an office setting, watering plants, and doing basic manual work on a factory floor. “The robot can actually do a lot more than we just showed you,” Musk claimed on stage at one point. “We just didn’t want it to fall on its face.”

The latest Optimus model has a sleeker, more Tesla-aligned design, but can currently only move its arms and hands without assistance. Musk claimed this version was only a few weeks away from actually becoming mobile.

[Related: Tesla updated latest glitch with its windows.]

While Musk’s timetable for a market-ready Optimus has been somewhat vague—he previously estimated production could begin next year—during the Q&A portion he conceded that consumers probably won’t see the product for at least 3-5 years. When, or if, it does hit markets, Tesla’s CEO said he thinks the product will cost under $20,000, although he’s offered similarly bullish price ranges in the past. For example, the company’s Model 3 EV was long touted as costing consumers around $35,000 before its release; drivers can snag one now for around $47,000.

Optimus joins a very crowded field of working humanlike robots, many of which can be seen doing backflips, walking up and down stairs, and performing far more complex dance routines. Musk made sure to highlight that Tesla’s project differentiates itself because, unlike a company such as Boston Dynamics, it is focused on delivering machines at high volume to general consumers.

Musk has long sounded off on his fears regarding artificial intelligence, cautioning against various robopocalypse scenarios if the technology is developed recklessly. His approach to Optimus, he argued, is meant to avoid “pav[ing] the road to hell with good intentions.”

The post Tesla’s Optimus humanoid robot can shuffle across stage, ‘raise the roof’ appeared first on Popular Science.

The Navy’s robot pilots could one day outnumber its human ones https://www.popsci.com/technology/navy-carriers-robot-planes/ Sat, 01 Oct 2022 15:59:00 +0000 https://www.popsci.com/?p=474232
The MQ-25 aircraft on the aircraft carrier USS George H.W. Bush in December, 2021.
The MQ-25 aircraft on the aircraft carrier USS George H.W. Bush in December, 2021. US Navy / Hillary Becke

The plan is for at least 60 percent of the flying machines that take off and land from carriers to be uncrewed, like the MQ-25 Stingray.

When it comes to equipping the aircraft carriers of the 21st century, the US Navy wants a mix of aircraft that is at least 60-percent uncrewed. This goal was “outlined by multiple officials during updates at the annual Tailhook Association symposium in September,” reports Aviation Week, referring to the conference held by a fraternal order of Naval Aviators, the pilots who presently and previously performed the kind of job that the Navy intends to shift mostly to robots.

The Navy has made no secret of its intentions to move towards more uncrewed aircraft flying on and off of carriers. In March 2021, Vice Adm. James Kilby told the House Armed Services committee that “we think we could get upwards of 40 percent of the aircraft in an air wing that are unmanned and then transition beyond that.”

Shifting from 40 to 60 percent is a substantial leap, though it’s of a piece with the overarching strategy for how the Navy intends to incorporate and expand the use of uncrewed vehicles in the coming decades. In the 2022 Navigation Plan, the Navy’s longer-term procurement strategy document, the Navy said that by the 2040s it is planning to field “Aircraft for anti-submarine and anti-surface warfare, to include helicopters and maritime patrol and reconnaissance aircraft, all augmented by unmanned aviation systems” with a capacity goal of “approximately 900.”

For the Navy, much of its uncrewed aviation planning hinges on the continued success of the MQ-25 Stingray tanker drone. The Stingray’s mission is to take off from a carrier deck and travel part of the way with fighters like the F/A-18. Then, the Stingray is supposed to top off the fuel tanks of the jets while they’re already airborne, extending the functional range of those fighters. This is a mission at present performed by specially equipped F/A-18s, but switching the refueling to a specialized uncrewed aircraft would free up the crewed fighter for other missions.

In June 2021, a Stingray successfully transferred fuel from an external storage tank to a fighter in flight for the first time, and testing of the aircraft continues, with the Navy expecting the drones to enter service in 2026. While not as flashy as the combat missions Navy drones may someday fly, the tanker missions require mastering the ability to take off from and land on carrier decks, as well as the ability for an uncrewed vehicle to coordinate with human pilots in close contact while airborne. If the airframe and its autonomous systems can accomplish that, then adapting the form to other missions, like scouting or attack, can come in the future. 

Adding uncrewed aircraft can potentially increase the raw numbers of flying machines fielded, as autonomous systems are not limited by the availability or capacity of human pilots. The uncrewed aircraft can also be designed from the start without a need to accommodate human pilots, letting designers build airframes without having to include space for not just cockpits but the pilot safety systems, like ejection seats, oxygen, and redundant engines. 

By saving the labor of piloting through a shift toward autonomy, and saving space on an aircraft carrier through denser uncrewed designs, robotic wingmates could allow ships to put more flying machines into the sky, without needing a similar expansion in pilot numbers or carrier decks.

[Related: The US Navy floats its wishlist: 350 ships and 150 uncrewed vessels]

The Navy’s intention has parallels across the Department of Defense. In September, DARPA announced ANCILLARY, a program looking for a versatile drone that could fly from rugged environments and ship decks, without any need for additional infrastructure. GAMBIT, a program by defense contractor General Atomics, is pitched to the Air Force as a way to develop four different drone models from one single core design, allowing cost savings and versatility with shared parts.

Beyond those speculative programs, the Air Force has worked to develop semi-autonomous drones that can receive orders from and fly in formation with human-piloted planes. This Loyal Wingman program is aimed at expanding the number of aircraft, and in turn sensors and weapons, that can be flown in formation, again without expanding the number of pilots needed. It also allows the Air Force to develop a rotating cast of uncrewed aircraft around existing crewed fighters, with hoped-for shorter production timelines and rapid deployment of new capabilities once they’re developed.

[Related: A guide to the Gambit family of military drones and their unique jobs]

The Navy’s ultimate vision, hinted at by a fleet that is 40 percent uncrewed and demanded by one that is 60 percent, is that the new robotic planes perform well enough to justify their place in carrier storage while remaining expendable enough to take the brunt of the risk in any conflict, sparing human pilots exposure to enemy anti-aircraft weaponry. A shot-down pilot is a tragedy. A shot-down drone is just lost equipment and the ensuing paperwork.

The post The Navy’s robot pilots could one day outnumber its human ones appeared first on Popular Science.

This robot broke a Guinness World Record for the 100-meter dash https://www.popsci.com/technology/robot-new-guinness-world-record-100m-dash/ Fri, 30 Sep 2022 17:30:00 +0000 https://www.popsci.com/?p=474114
Cassie bipedal robot running on relay track
Why are you in such a rush there, Cassie? Oregon State University

Oregon State University's two-legged 'Cassie' clocked in at 24.73 seconds, making it the fastest bipedal robot on record.

It may still trail the fastest humans on the planet, but Cassie proved it’s no slouch, either. Earlier this week, Oregon State University’s College of Engineering and its spinout company, Agility Robotics, released a video shot earlier this year showcasing their bipedal robot as it set a new Guinness World Record for the fastest 100-meter dash by a bipedal robot: 24.73 seconds.

[Related: Boston Dynamics gave its dog-like robot a charging dock and an arm on its head.]

Last year, Cassie pulled off another achievement when it power-walked a 5K in just over 53 minutes, demonstrating the team’s advances in the reliability and durability of mobile robots. “This may be the first bipedal robot to learn to run, but it won’t be the last,” OSU robotics professor Jonathan Hurst said in a press statement. “I think progress is going to accelerate from here.”

[Video: Cassie’s record-breaking 100-meter run, via Oregon State University]

Using a computing technique known as parallelization, in which a program simultaneously runs multiple calculations and simulations, the team fit the equivalent of an entire year’s worth of machine learning into a single week. “Cassie can perform a spectrum of different gaits but as we specialized it for speed we began to wonder, which gaits are most efficient at each speed?” Devin Crowley, a graduate student and project collaborator, said in a statement. Despite the prototype’s decidedly ostrich-legged appearance, the learning “led to Cassie’s first optimized running gait and resulted in behavior that was strikingly similar to human biomechanics.”
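
To make the idea concrete, here is a minimal Python sketch of that kind of parallelization, with the caveat that this is a hypothetical illustration, not the OSU or Agility Robotics code: many independent simulated rollouts are farmed out to worker processes at once, so evaluating dozens of candidate gaits takes roughly the wall-clock time of one. The simulate_gait function and its mock reward score are stand-ins invented for this example.

import multiprocessing as mp
import random

def simulate_gait(seed: int) -> float:
    # Stand-in for one physics rollout: score a candidate gait.
    # A real system would run a full simulation and return a reward
    # reflecting speed and stability; here we just average random draws.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(10_000)) / 10_000

if __name__ == "__main__":
    candidates = range(64)   # 64 candidate gaits, evaluated in parallel
    with mp.Pool() as pool:  # one worker per CPU core by default
        rewards = pool.map(simulate_gait, candidates)
    best = max(range(len(rewards)), key=rewards.__getitem__)
    print(f"best candidate: {best}, mock reward: {rewards[best]:.4f}")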

At 9.58 seconds, the human record is still less than half Cassie’s time (thanks, Usain Bolt), and as The Verge notes, Cassie is far from the world’s fastest robot: Boston Dynamics’ quadrupedal WildCat can max out at 19 mph, while MIT’s four-legged Mini Cheetah reaches around 9 mph, though there’s no indication either could keep running as long as Cassie. Still, it’s an impressive new milestone in robotic mobility.
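
For context, a quick back-of-the-envelope conversion (ours, not the article’s) puts Cassie’s average pace over the record run at roughly 4 meters per second, or about 9 mph, right around Mini Cheetah’s top speed:

# Cassie's average speed over the 100-meter record run.
distance_m = 100
time_s = 24.73
meters_per_second = distance_m / time_s               # ~4.04 m/s
miles_per_hour = meters_per_second * 3600 / 1609.344  # ~9.05 mph
print(f"{meters_per_second:.2f} m/s = {miles_per_hour:.2f} mph")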

The post This robot broke a Guinness World Record for the 100-meter dash appeared first on Popular Science.
