
Mammoth meatballs, deceptive robots, and plant stress

Jan 18, 2024

By Emily Kwong, Regina G. Barber, Margaret Cirino, Liz Metzger, Rebecca Ramirez

According to a recent study in the journal Cell, plants that are distressed by factors like dehydration and cuts emit specific airborne sounds at an increased frequency. Tuvik Beker/Tel Aviv University


After reading the science headlines this week, we have A LOT of questions. What's really in those mammoth meatballs and are they edible? Are all of our poorly maintained indoor plants crying out at frequencies too low for us to hear? And if the robots of our future lie to us — is there room for redemption? Luckily, it's the job of the Short Wave team to decipher the science behind the headlines. This week, the brain power is spread across co-host Emily Kwong, Scientist in Residence Regina G. Barber and producer Margaret Cirino. Hang out with us as we dish on some of the coolest recent science stories in this installment of our regular newsy hangouts!

Last week, the Australian cultivated meat company Vow unveiled the mammoth meatball as part of a big marketing play. The meatball is made of lab-grown sheep meat and features a mammoth myoglobin gene. Myoglobin is a heme protein found in the muscle tissue of vertebrates; it helps give red meat its characteristic taste and color. Gaps in the gene were filled in using genetic data from the African elephant. So far, the jury's out on the flavor of the meatball: because woolly mammoth proteins haven't been consumed in thousands of years, the scientists did not have anyone taste it.

Scientists have recently uncovered the ultrasonic sounds of distressed plants drifting through the air. Researchers positioned two microphones by tomato and tobacco plants that were either cut, dehydrated, or left well-maintained as controls. The researchers then used machine learning to identify the type of distress each plant was experiencing — with a roughly 70% accuracy rate. The sounds are out of the range of hearing for most humans, though within range for small, hyperlocal critters like mice and moths.

Researchers at the Georgia Institute of Technology and The Ohio State University are calling robots' bluff — examining what happens when robots intentionally lie to humans. As more AI systems enable human-robot interaction (HRI), the field of robot deception is in need of more research. In their paper, "Lying About Lying: Examining Trust Repair Strategies After Robot Deception in a High-Stakes HRI Scenario," the researchers use a creative approach to measure how humans navigate robot deception and how trust can be repaired after a robot issues a text-based apology.

Listen to Short Wave on Spotify, Apple Podcasts and Google Podcasts.

We love hearing what you're reading and what science is catching your eye. Reach the show by emailing [email protected].

This episode was produced by Liz Metzger. It was edited by Rebecca Ramirez. Anil Oza checked the facts. The audio engineer was Margaret Luthar.