Capturing the sense of touch could upgrade prosthetics and our digital lives

On most mornings, Jeremy D. Brown eats an avocado. But first, he gives it a little squeeze. A ripe avocado will yield to that pressure, but not too much. Brown also gauges the fruit’s weight in his hand and feels the waxy skin, with its bumps and ridges.

“I can’t imagine not having the sense of touch to be able to do something as simple as judging the ripeness of that avocado,” says Brown, a mechanical engineer who studies haptic feedback — how information is gained or transmitted through touch — at Johns Hopkins University.

Many of us have thought about touch more than usual during the COVID-19 pandemic. Hugs and high fives rarely happen outside of the immediate household these days. A surge in online shopping has meant fewer chances to touch things before buying. And many people have skipped travel, such as visits to the beach where they might sift sand through their fingers. A lot goes into each of those actions.

“Anytime we touch anything, our perceptual experience is the product of the activity of thousands of nerve fibers and millions of neurons in the brain,” says neuroscientist Sliman Bensmaia of the University of Chicago. The body’s natural sense of touch is remarkably complex. Nerve receptors detect cues about pressure, shape, motion, texture, temperature and more. Those cues cause patterns of neural activity, which the central nervous system interprets so we can tell if something is smooth or rough, wet or dry, moving or still.

Scientists at the University of Chicago attached strips of different materials to a rotating drum to measure vibrations produced in the skin as a variety of textures move across a person’s fingertips.
Matt Wood/Univ. of Chicago

Neuroscience is at the heart of research on touch. Yet mechanical engineers like Brown and others, along with experts in math and materials science, are studying touch with an eye toward translating the science into helpful applications. Researchers hope their work will lead to new and improved technologies that mimic tactile sensations.

As scientists and engineers learn more about how our nervous system responds to touch stimuli, they’re also studying how our skin interacts with different materials. And they’ll need ways for people to send and receive simulated touch sensations. All these efforts present challenges, but progress is happening. In the near term, people who have lost limbs might recover some sense of touch through their artificial limbs. Longer term, haptics research might add touch to online shopping, enable new forms of remote medicine and expand the world of virtual reality.

“Anytime you’re interacting with an object, your skin deforms,” or squishes a bit.

Sliman Bensmaia

Good vibrations

Virtual reality programs already give users a sense of what it’s like to wander through the International Space Station or trek around a natural gas well. For touch to be part of such experiences, researchers will need to reproduce the signals that trigger haptic sensations.

Our bodies are covered in nerve endings that respond to touch, and our hands are really loaded up, especially our fingertips. Some receptors tell where parts of us are in relation to the rest of the body. Others sense pain and temperature. One goal for haptics researchers is to mimic sensations resulting from force and movement, such as pressure, sliding or rubbing.

“Anytime you’re interacting with an object, your skin deforms,” or squishes a bit, Bensmaia explains. Press on the raised dots of a braille letter, and the dots will poke your skin. A soapy glass slipping through your fingers produces a shearing force — and possibly a crash. Rub fabric between your fingers, and the action produces vibrations.

Four main categories of touch receptors respond to those and other mechanical stimuli. There’s some overlap among the types. And a single contact with an object can affect multiple types of receptors, Bensmaia notes.

One type of receptor, the Pacinian corpuscle, sits deep in the skin. These corpuscles are especially good at detecting the vibrations created when we interact with different textures. When stimulated, they produce sequences of signals that travel to the brain over a period of time. Our brains interpret the signals as a particular texture. Bensmaia compares it to the way we hear a series of notes and recognize a tune.

“Corduroy will produce one set of vibrations. Organza will produce another set,” Bensmaia says. Each texture produces “a different set of vibrations in your skin that we can measure.” Such measurements are a first step toward trying to reproduce the feel of different textures.

Additionally, any stimulus meant to mimic a texture sensation must be strong enough to trigger responses in the nervous system’s touch receptors. That’s where work by researchers at the University of Birmingham in England comes in. The vibrations from contact with various textures create different kinds of wave energy. Rolling-type waves called Rayleigh waves go deep enough to reach the Pacinian receptors, the team reported last October in Science Advances. Much larger versions of the same types of waves cause much of the damage from earthquakes.

Not all touches are forceful enough to trigger a response from the Pacinian receptors. To gain more insight into which interactions will stimulate those receptors, the team looked at studies that have collected data on touches to the limbs, head or neck of dogs, dolphins, rhinos, elephants and other mammals. A pattern emerged. The group calls it a “universal scaling law” of touch for mammals.

For the most part, a touch at the surface will trigger a response in a Pacinian receptor deep in the skin if the ratio between the length of the resulting Rayleigh waves and the depth of the receptor is at least 5-to-2. At or above that ratio, a person and most other mammals will feel the sensation, says mathematician James Andrews, lead author of the study.

Also, the amount of skin displacement needed to produce wavelengths long enough to trigger the Pacinian receptors is about the same across most mammal species, the group found. Different species need more or less force to cause that displacement, however, which may depend on skin composition or other factors. Rodents did not fit the 5-to-2 ratio, perhaps because their paws and limbs are so small compared with the wavelengths created when they touch things, Andrews notes.
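The scaling rule can be sketched in a few lines of code. This is only an illustration of the ratio the study describes, not the researchers' actual model; the wave speed, frequency and receptor depth below are assumed example values.

```python
def rayleigh_wavelength(wave_speed_m_s, frequency_hz):
    """Wavelength of a surface (Rayleigh) wave: speed divided by frequency."""
    return wave_speed_m_s / frequency_hz

def triggers_pacinian(wavelength_m, receptor_depth_m, ratio=5 / 2):
    """The study's scaling rule: a surface touch reaches a Pacinian
    receptor when wavelength / receptor depth is at least 5-to-2."""
    return wavelength_m / receptor_depth_m >= ratio

# Assumed example: a 250-hertz vibration traveling about 5 m/s through
# skin, with a receptor 2 millimeters below the surface.
wavelength = rayleigh_wavelength(5.0, 250.0)  # 0.02 m
print(triggers_pacinian(wavelength, 0.002))   # a ratio of 10 clears the threshold
```

A weaker touch producing, say, a 4-millimeter wavelength over the same receptor would fall below the 5-to-2 threshold and go unfelt.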

Beyond that, the work sheds light on “what types of information you’d need to realistically capture the haptic experience — the touch experience — and send that digitally anywhere,” Andrews says. People could then feel sensations with a device or perhaps with ultrasonic waves. Someday the research might help provide a wide range of virtual reality experiences, including virtual hugs.

Online tactile shopping

Mechanical engineer Cynthia Hipwell of Texas A&M University in College Station moved into a new house before the pandemic. She looked at some couches online but couldn’t bring herself to buy one from a website. “I didn’t want to choose couch fabric without feeling it,” Hipwell says.

“Ideally, in the long run, if you’re shopping on Amazon, you could feel fabric,” she says. Web pages’ computer codes would make certain areas on a screen mimic different textures, perhaps with shifts in electrical charge, vibration signals, ultrasound or other methods. Touching the screen would clue you in to whether a sweater is soft or scratchy, or if a couch’s fabric feels bumpy or smooth. Before that can happen, researchers need to understand conditions that affect our perception of how a computer screen feels.

Surface features at the nanometer scale (billionths of a meter) can affect how we perceive the texture of a piece of glass, Hipwell says. Likewise, we may not consciously feel any wetness as humidity in the air mixes with our skin’s oil and sweat. But tiny changes in that moisture can alter the friction our fingers encounter as they move on a screen, she says. And that friction can influence how we perceive the screen’s texture.

Shifts in electric charge also can change the attraction between a finger and a touch screen. That attraction is called electroadhesion, and it affects our tactile experience as we touch a screen. Hipwell’s group recently developed a computer model that accounts for the effects of electroadhesion, moisture and the deformation of skin pressing against glass. The team reported on the work in March 2020 in IEEE Transactions on Haptics.

Hipwell hopes the model can help product designers develop haptic touch screens that go beyond online shopping. A car’s computerized dashboard might have sections that change texture for each menu, she suggests. A driver could change temperature or radio settings by touch while keeping eyes on the road.

“Ideally, in the long run, if you’re shopping on Amazon, you could feel fabric.”

Cynthia Hipwell

Wireless touch patches

Telemedicine visits rose dramatically during the early days of the COVID-19 pandemic. But video doesn’t let doctors feel for swollen glands or press an abdomen to check for lumps. Remote medicine with a sense of touch might help during pandemics like this one — and long after for people in remote areas with few doctors.

People in those places might eventually have remote sensing equipment in their own homes or at a pharmacy or workplace. If that becomes feasible, a robot, glove or other equipment with sensors could touch parts of a patient’s body. The information would be relayed to a device somewhere else. A doctor at that other location could then experience the sensations of touching the patient.

Researchers are already working on materials that can translate digital information about touch into sensations people — in this case, doctors — can feel. The same materials could communicate information for virtual reality applications. One possibility is a skin patch developed by physical chemist John Rogers of Northwestern University in Evanston, Ill., and others.

One layer of the flexible patch sticks to a person’s skin. Other layers include a stretchable circuit board and tiny actuators that create vibrations as current flows around them. Wireless signals tell the actuators to turn on or off. Energy to run the patch also comes in wirelessly. The team described the patch in Nature in 2019.

Retired U.S. Army Sgt. Garrett Anderson shakes hands with researcher Aadeel Akhtar, CEO of Psyonic, a prosthesis developer. A wireless skin patch on Anderson’s upper arm gives him sensory feedback when grasping an object. Northwestern Univ.
Inside the patch are circular actuators that vibrate in response to signals. The prototype device might provide the sensation of touch pressure for artificial limbs, virtual reality and telemedicine.

Since then, Rogers’ group has reduced the patch’s thickness and weight. The patch now also provides more detailed information to a wearer. “We have scaled the systems into a modular form to allow custom sizes [and] shapes in a kind of plug-and-play scheme,” Rogers notes. So far, up to six separate patches can work at the same time on different parts of the body.

The group also wants to make its technology work with electronics that many consumers have, such as smartphones. Toward that end, Rogers and colleagues have developed a pressure-sensitive touch screen interface for sending information to the device. The interface lets someone deliver haptic sensations by moving their fingers across a smartphone or other touch screen. A person wearing the patch then feels stroking, tapping or other touch sensations.

Pressure points

Additionally, Rogers’ team has developed a way to use the patch system to pick up signals from pressure on a prosthetic arm’s fingertips. Those signals can then be relayed to a patch worn by the person with the artificial limb. Other researchers also are testing ways to add tactile feedback to prostheses. European researchers reported in 2019 that adding feedback for pressure and motion helped people with an artificial leg walk with more confidence (SN: 10/12/19, p. 8). The device reduced phantom limb pain as well.

Brown, the mechanical engineer at Johns Hopkins, hopes to help people control the force of their artificial limbs. Nondisabled people adjust their hands’ force instinctively, he notes. He often takes his young daughter’s hand when they’re in a parking lot. If she starts to pull away, he gently squeezes. But he might easily hurt her if he couldn’t sense the stiffness of her flesh and bones.

Two types of prosthetic limbs can let people who lost an arm do certain movements again. Hands on “body-controlled” limbs open or close when the user moves other muscle groups. The movement works a cable on a harness that connects to the hand. Force on those other muscles tells the person if the hand is open or closed. Myoelectric prosthetic limbs, in contrast, are directly controlled by the muscles on the residual limb. Those muscle-controlled electronic limbs generally don’t give any feedback about touch. Compared with the body-controlled options, however, they allow a greater range of motion and can offer other advantages.

In one study, Brown’s group tested two ways to add feedback about the force that a muscle-controlled electronic limb exerts on an object. One method used an exoskeleton that applied force around a person’s elbow. The other used a device strapped near the wrist that vibrates: The stiffer the object, the stronger the vibrations. Volunteers without limb loss tried each setup to judge the stiffness of blocks.

In a study of two different haptic feedback methods, one system applied force near the elbow. N. Thomas et al/J. NeuroEng. Rehab. 2019
The other system tested in the study provided vibrations near the wrist. N. Thomas et al/J. NeuroEng. Rehab. 2019

Both methods worked better than no feedback. And compared with each other, the two types of feedback “worked equally well,” Brown says. “We think that is because, in the end, what the human user is doing is creating a map.” Basically, people match up how much force corresponds to the intensity of each type of feedback. The work suggests ways to improve muscle-controlled electronic limbs, Brown and colleagues reported in 2019 in the Journal of NeuroEngineering and Rehabilitation.

Still, people’s brains may not be able to match up all types of feedback for touch sensations. Bensmaia’s group at the University of Chicago has worked with colleagues in Sweden who built tactile sensors into bionic hands: Signals from a sensor on the thumb went to an electrode implanted around the ulnar nerve on people’s arms. Three people who had lost a hand tested the bionic hands and felt a touch when the thumb was prodded, but the touch felt as if it came from somewhere else on the hand.

Doctors can choose which nerve an electrode will stimulate. But they don’t know in advance which bundle of fibers it will affect within the nerve, Bensmaia explains. And different bundles receive and supply sensations to different parts of the hand. Even after the people had used the prosthesis for more than a year, the mismatch didn’t improve. The brain didn’t adapt to correct the sensation. The team shared its findings last December in Cell Reports.

Despite that, in previous studies, those same people using the bionic hands had better precision and more control over their force when grasping objects, compared with those using versions without direct stimulation of the nerve. People getting the direct nerve stimulation also reported feeling as if the hand was more a part of them.

As with the bionic hands, advances in haptic technology probably won’t start out working perfectly. Indeed, virtual hugs and other simulated touch experiences may never be as good as the real thing. Yet haptics may help us get a feel for the future, with new ways to explore our world and stay in touch with those we love.

Fast radio bursts could help solve the mystery of the universe’s expansion

Astronomers have been arguing about the rate of the universe’s expansion for nearly a century. A new independent method to measure that rate could help cast the deciding vote.

For the first time, astronomers calculated the Hubble constant — the rate at which the universe is expanding — from observations of cosmic flashes called fast radio bursts, or FRBs. While the results are preliminary and the uncertainties are large, the technique could mature into a powerful tool for nailing down the elusive Hubble constant, researchers report April 12 at arXiv.org.

Ultimately, if the uncertainties in the new method can be reduced, it could help settle a longstanding debate in which our understanding of the universe’s physics hangs in the balance (SN: 7/30/19).

“I see great promises in this measurement in the future, especially with the growing number of detected repeated FRBs,” says Stanford University astronomer Simon Birrer, who was not involved with the new work.

Astronomers typically measure the Hubble constant in two ways. One uses the cosmic microwave background, the light released shortly after the Big Bang, in the distant universe. The other uses supernovas and other stars in the nearby universe. These approaches currently disagree by a few percent. The new value from FRBs comes in at an expansion rate of about 62.3 kilometers per second for every megaparsec (about 3.3 million light-years). While lower than the other methods, it’s tentatively closer to the value from the cosmic microwave background, or CMB.
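For scale, the Hubble constant ties a galaxy’s distance to how fast it recedes, via Hubble’s law, v = H0 × d. Here is a minimal sketch using the FRB-derived value; the 100-megaparsec distance is an arbitrary example, not a figure from the study.

```python
H0_FRB = 62.3  # km/s per megaparsec, the FRB-based estimate

def recession_speed_km_s(distance_mpc, h0=H0_FRB):
    """Hubble's law: recession speed = H0 * distance."""
    return h0 * distance_mpc

# A galaxy 100 megaparsecs away (roughly 330 million light-years)
# recedes at about 6,230 km/s under this estimate.
print(recession_speed_km_s(100))
```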

“Our data agrees a little bit more with the CMB side of things compared to the supernova side, but the error bar is really big, so you can’t really say anything,” says Steffen Hagstotz, an astronomer at Stockholm University. Nonetheless, he says, “I think fast radio bursts have the potential to be as accurate as the other methods.”

No one knows exactly what causes FRBs, though eruptions from highly magnetic neutron stars are one possible explanation (SN: 6/4/20). During the few milliseconds when FRBs blast out radio waves, their extreme brightness makes them visible across large cosmic distances, giving astronomers a way to probe the space between galaxies (SN: 5/27/20).

As an FRB signal travels through the dust and gas separating galaxies, it becomes dispersed in a predictable way: Lower frequencies arrive slightly later than higher ones. The farther away the FRB, the more dispersed the signal. Comparing this delay with distance estimates to nine known FRBs, Hagstotz and colleagues measured the Hubble constant.
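That frequency-dependent delay follows a well-known dispersion law: The lag grows with the electron column density along the line of sight (the dispersion measure, or DM) and with the inverse square of the observing frequency. A rough sketch, using the standard dispersion constant of about 4.15 ms·GHz² per pc cm⁻³; the DM and frequencies below are made-up example values, not data from the study.

```python
K_DM_MS = 4.15  # dispersion constant: ms * GHz^2 / (pc cm^-3)

def dispersion_delay_ms(dm_pc_cm3, freq_low_ghz, freq_high_ghz):
    """Extra arrival delay of the lower frequency relative to the
    higher one: dt = K * DM * (1/f_lo**2 - 1/f_hi**2)."""
    return K_DM_MS * dm_pc_cm3 * (freq_low_ghz**-2 - freq_high_ghz**-2)

# Example: a burst with DM = 500 pc/cm^3, observed between 0.4 and
# 0.8 GHz, arrives almost 10 seconds later at the low-frequency edge.
print(dispersion_delay_ms(500, 0.4, 0.8))
```

Because the intergalactic contribution to the DM grows with distance, comparing many such delays against independent distance estimates is what lets the team back out the Hubble constant.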

The largest error in the new method comes from not knowing precisely how the FRB signal disperses as it exits its home galaxy before entering intergalactic space, where the gas and dust content is better understood. With a few hundred FRBs, the team estimates that it could reduce the uncertainties and match the accuracy of other methods such as supernovas.

“It’s a first measurement, so not too surprising that the current results are not as constraining as other more matured probes,” says Birrer.

New FRB data might be coming soon. Many new radio observatories are coming online, and larger surveys, such as ones proposed for the Square Kilometre Array, could discover tens to thousands of FRBs every night. Hagstotz expects there will be sufficient FRBs with distance estimates in the next year or two to accurately determine the Hubble constant. Such FRB data could also help astronomers understand what’s causing the bright outbursts.

“I am very excited about the new possibilities that we will have soon,” Hagstotz says. “It’s really just beginning.”

Cannabis Use Disorder Rate Rose among Pregnant Women between 2001 and 2012


Babies born to mothers diagnosed with cannabis use disorder are more likely to experience negative health outcomes, such as preterm birth and low birth weight, than babies born to mothers without a cannabis use disorder diagnosis, report UC San Diego researchers.

Videocalling needed more than a pandemic to finally take off. Will it last?

Eileen Donovan, an 89-year-old mother of seven living in a Boston suburb, loved watching her daughter teach class on Zoom during the coronavirus pandemic. She never imagined Zoom would be how her family eventually attended her funeral.

Donovan died of Parkinson’s disease on June 30, 2020, leaving behind her children, 10 grandchildren and six great-grandchildren. She always wanted a raucous Irish wake. But only five of her children plus some local family could be there in person, and no extended family or friends, due to coronavirus concerns. This was not the way they had expected to mourn.

For online attendees, the ceremony didn’t end with hugs or handshakes. It ended with a click on a red “leave meeting” button, a label that suits business meetings but little else.

It’s the same button that Eileen Donovan-Kranz, Donovan’s daughter, clicks when she finishes an English lecture for her class of undergraduate students at Boston College. And it’s the same way she and I ended our conversation on an unseasonably warm November day: Donovan-Kranz sitting in front of a window in her dining room in Ayer, Mass., and me in my bedroom in Manhattan.

“I’m not going to hold the phone during my mother’s burial,” she remembers thinking. Just a little over a year ago, it would have seemed absurd to have to ask someone to hold up a smartphone so that others could “attend” such a personal event. Donovan-Kranz asked her daughter’s fiancé to do it.

The COVID-19 pandemic has profoundly changed the way people interact with each other and with technology. Screens were for reminiscing over cherished memories, like watching VHS tapes or, more recently, YouTube videos of weddings and birthdays that have already happened. But now, we’re not just watching memories. We’re creating them on screens in real time.

As social distancing measures forced everyone to stay indoors and interact online, multibillion-dollar industries had to rapidly adjust to create experiences in a 2-D world. And although this concept of living our lives online — from mundane work calls to memorable weddings or concerts — seems novel, both scientists and science fiction writers have seen this reality coming for decades.

In David Foster Wallace’s 1996 novel Infinite Jest, videotelephony enjoys a brief but frenzied popularity in a future America. Entire industries emerge to address people’s self-consciousness on camera. But eventually, the industry collapses when people realize they prefer the familiar voice-only telephone.

Despite multiple efforts by inventors and entrepreneurs to convince us that videoconferencing had arrived, that reality didn’t play out. Time after time, people rejected it for the humble telephone or for other innovations like texting. But in 2020, live video meetings finally found their moment.

It took more than just a pandemic to get us here, some researchers say. Technological advances over the decades together with the ubiquity of the technology got everyone on board. But it wasn’t easy.

Initial attempts

On June 30, 1970 — exactly half a century before Donovan’s death — AT&T launched what it called the nation’s first commercial videoconferencing service in Pittsburgh with a call from Peter Flaherty, the city’s mayor, to John Harper, chairman and CEO of Alcoa Corporation, one of the world’s largest producers of aluminum. Alcoa had already been using the Alcoa Picturephone Remote Information System for retrieving information from a database using buttons on a telephone. The data would be presented on the videophone display. This was before desktop computers were ubiquitous.

This was not AT&T’s first videophone, however. In 1927, U.S. Secretary of Commerce Herbert Hoover had demonstrated a prototype developed by the company. But by 1972, AT&T had a mere 32 units in service in Pittsburgh. The only other city offering commercial service, Chicago, hit its peak sales in 1973, with 453 units. AT&T discontinued the service in the late 1970s, concluding that the videophone was “a concept looking for a market.”

AT&T President Walter Sherman Gifford (third from right) makes a videocall at Bell Telephone Laboratories in New York City on April 7, 1927. The call went to U.S. Secretary of Commerce Herbert Hoover in Washington, D.C., via 300 miles of long-distance wire. Federal Communications Commission/PhotoQuest/Getty Images

About a decade after AT&T’s first attempt at commercialization, a band called the Buggles released the single “Video Killed the Radio Star,” the first music video to air on MTV. The song reminded people of the technological change that occurred in the 1950s and ’60s, when U.S. households transitioned away from radio as televisions became more accessible to the masses.

The way television achieved market dominance kept videophone developers bullish about their technology’s future. In 1993, optimistic AT&T researchers predicted “the 1990s will be the video communication decade.” Video would change from something we passively consumed to something we interacted with in real time. That was the hope.

When AT&T launched its VideoPhone 2500 in 1992, prices started at a hefty $1,500 (about $2,800 in today’s dollars) — later dropping to $1,000. The phone had compressed color and a slow frame rate of 10 frames per second (Zoom calls today are 30 frames per second), so images were choppy.

Though the company tried to enchant potential customers with visions of the future, people weren’t buying it. Fewer than 20,000 units sold in the five months after the launch. Rejection again.

Building capacity

Last June, to commemorate the 50th anniversary of AT&T’s first videophone launch, William Peduto, Pittsburgh’s mayor, and Michael G. Morris, Alcoa’s chairman at the time, spoke over videophone, just as their predecessors had done.

Several scholars, including Andrew Meade McGee, a historian of technology and society at Carnegie Mellon University in Pittsburgh, joined for an online panel to discuss the rocky history of the videophone and its 2020 success. McGee told me a few months later that two things are crucial for a product’s actual adoption: “capacity and circumstance.” Capacity is all about the technology that makes a product easy to use and affordable. For videophones, it’s taken a while to get there.

When the Picturephone, which was launched by AT&T and Bell Telephone Laboratories, premiered at the 1964 World’s Fair in New York City, a three-minute call cost $16 to $27 (that’s about $135 to $230 in 2021). It was available only in booths in New York City, Chicago and Washington, D.C. (SN: 8/1/64, p. 73). Using the product required planning, effort and money — for low reward. The connection required multiple phone lines and the picture appeared on a black-and-white screen about the size of today’s iPhone screens.

Lady Bird Johnson, who was then first lady of the United States, is visible on the screen of a prototype AT&T videophone in 1964. Everett Collection Historical/Alamy Stock Photo

These challenges made the Picturephone a tough sell. In 2004, in Technological Forecasting and Social Change, marketing researchers Steve Schnaars and Cliff Wymbs of Baruch College at the City University of New York theorized about why videophones hadn’t taken off in the preceding decades. Along with capacity and circumstance, they argued, critical mass is key.

For a technology to become popular, the researchers wrote, everybody needs the money and motivation to adopt it. And potential users need to know that others also have the device — that’s the critical mass. But when everyone uses this logic, no one ends up buying the new product. Social networking platforms and dating apps face the same hurdle when they launch, which is why the apps create incentive programs to hook those all-important initial users.

Internet access

Even in the early 2000s, when Skype made a splash with its Voice over Internet Protocol, or VoIP, enabling internet-based calls that left landlines free, people weren’t as connected to the internet as they are today. In 2000, only 3 percent of U.S. adults had high-speed internet, and 34 percent had a dial-up connection, according to the Pew Research Center.

By 2019, the story had changed: Seventy-three percent of all U.S. adults had high-speed internet at home, with 63 percent coverage in rural areas. Globally, the number of internet users also grew, from about 397 million in 2000 to about 2 billion in 2010 and 3.9 billion in 2019.

But even after capacity was established, we weren’t glued to our videophones as we are today, or as inventors predicted years ago. Although Skype claimed to have 300 million users in 2019, it was a service people typically used only on occasion, for international calls or other events that took advance planning.

One long-time barrier that the Baruch College researchers cite from an informal survey is the aversion to always being “on.” Some people would have paid extra to not be on camera in their home, the same way people would pay extra to have their phone numbers left out of telephone books.

“Once people experienced [the 1970s] videophone, there was this realization that maybe you don’t always want to be on a physical call with someone else,” McGee says. Videocalling developers had predicted these challenges early on. In 1969, Julius Molnar, vice president at Bell Telephone Labs, wrote that people will be “much concerned with how they will appear on the screen of the called party.”

A scene from the 1960s cartoon The Jetsons illustrates this concern: George Jetson answers a videophone call. When he tells his wife Jane that her friend Gloria is on the phone, Jane responds, “Gloria! Oh dear, I can’t let her see me looking like this.” Jane grabs her “morning mask” — for the perfect hair and face — before taking the call.

That aversion to face time is one of the factors that kept people away from videocalling.

It took the pandemic, a change in circumstance, to force our hand. “What’s remarkable,” McGee says, “is the way in which large sectors of U.S. society have all of a sudden been thrust into being able to use videocalls on a daily basis.”

Circumstance shift

Starting in March 2020, mandatory stay-at-home orders around the world forced us to carry on an abridged form of our pre-pandemic lives, but from a distance. And one company beat the competition and rose to the top within a matter of months.

Soon after lockdown, Zoom became a verb. It was the go-to choice for all types of events. The perfect storm of capacity and circumstance led to the critical mass needed to create the Zoom boom.

Before Zoom, a handful of companies had been trying to fill the space that AT&T’s videophone could not. Skype became the sixth most downloaded mobile app of the decade from 2010 to 2019. FaceTime, WhatsApp, Instagram, Facebook Messenger and Google’s videochatting applications were and still are among the most popular platforms for videocalls.

Then 2020 happened.

Zoom beat its well-established competitors to quickly become a household name globally. It gained critical mass over other platforms by being easy to use.

“The fact that it’s been modeled around this virtual room that you come into and out of really simplifies the connection process,” says Carman Neustaedter of the School of Communication, Art and Technology at Simon Fraser University in Burnaby, Canada, where his team has researched being present on videocalls for work, home and health.

Zoom reflects our actions in real life — where we all walk into a room and everyone is just there. Casual users don’t need to have an account or connect ahead of time with those they want to talk to.

Beyond design, there were likely some market factors at play as well. Zoom connected early with universities, claiming by 2016 to be at 88 percent of “the top U.S. universities.” And just as K–12 schools worldwide started closing last March, Zoom offered free unlimited meeting minutes.

In December 2019, Zoom statistics put its maximum number of daily meeting participants (both paid and free) at about 10 million. By March 2020, that number had risen to 200 million, and the following month it was up to 300 million. The way Zoom counts those users is a point of contention.

But these numbers still provide some insight: If the product wasn’t easy and helpful, we wouldn’t have kept using it. That’s not to say that Zoom is the perfect platform, Neustaedter says. It has some obvious shortcomings.

“It’s almost too rigid,” he says.

It doesn’t allow for natural conversation; participants have to take turns talking, toggling the mute button to let others take a turn. Even with the ability to send private and direct messages to anyone in the room, the natural way we form groups and make small talk in real life is lost with Zoom.

It’s also not the best for parties — it’s awkward to attend a birthday party online when only one out of 30 friends can talk at a time. That’s why some people have been enticed to switch to other videocalling platforms to host larger online events, like graduations.

For example, Remo, founded in 2018, uses visual virtual rooms. Everyone gets an avatar and can choose a table after seeing who else is there, to talk in smaller groups. Instead of Zoom breakout sessions, where you’re assigned a room and can’t enter another one on your own, a platform like Remo lets you virtually see all the rooms, pick one, exit it and move to another, all without the help of a host.

The rigidity also results in Zoom fatigue, that feeling of burnout associated with overusing virtual platforms to communicate. Videocalling doesn’t allow us to use direct eye contact or easily pick up nonverbal cues from body language — things we do during in-person conversations.

The psychological rewards of videocalling — the chance to be social — don’t always outweigh the costs.

Jeremy Bailenson, director of the Virtual Human Interaction Lab at Stanford University, laid out four features that lead to Zoom fatigue in the Feb. 23 Technology, Mind and Behavior. Along with cognitive load and reduced mobility, he blames the long stretches of closeup eye gazing and the “all-day mirror.” When you constantly see yourself on camera interacting with others, self-consciousness and exhaustion set in.

Bailenson has since changed his relationship with Zoom: He now hides the box that lets him view himself, and he shrinks the size of the Zoom screen to make gazing faces less imposing. Bailenson expects minor changes to the platform will help reduce the psychological heaviness we feel.

Other challenges with Zoom have revolved around security. In April 2020, the term “Zoombombing” arose as teleconferencing calls on the platform were hijacked by uninvited people. Companies that could afford to switch quickly moved away from Zoom and paid for services elsewhere. For everyone else who stayed on the platform, Zoom added close to 100 new privacy, safety and security features by July 2020. These changes included the addition of end-to-end encryption for all users and meeting passcodes.

Anybody’s guess

In Metropolis, the 1927 sci-fi silent film, a master of an industrial city in the dystopian future uses four separate dials on a videophone to put a call through. Thankfully, placing a videocall is much easier than it was predicted to be. But how much will we use this far-from-perfect technology once the pandemic is over?

In the book Productivity and the Pandemic, released in January, behavioral economist Stuart Mills discusses why consumers might keep using videocalling. This pandemic may establish habits and preferences that will not disappear once the crisis is over, Mills, of the London School of Economics, and coauthors write. When people are forced to experiment with new behaviors, as we have been with videocalling during this pandemic, the result can be permanent behavioral changes. Collaboration through videocalling may remain popular even after shutdowns lift now that we know how it works.

Events that require real-life interactions, such as funerals and some conferences, may not change much from what we were used to pre-pandemic.

For other industries, videocalling may change certain processes. For example, Reverend Annie Lawrence of New York City predicts permanent changes for parts of the wedding industry. People like the ease of getting a marriage license online, and she’s been surprisingly in demand doing video weddings since the pandemic started. Before, getting booked for officiating a wedding would require notice months in advance. “Now, I’ve been getting calls on Friday to ask if I can officiate a wedding on Saturday,” she says.

Other sectors of society may realize that videocalling isn’t for them, and will leave just a few processes to be done online. Jamie Dimon, CEO of JPMorgan Chase, for example, stated in a March 1 interview with Bloomberg Markets and Finance that he thinks a large portion of his staff will permanently work in the office when that becomes possible again. Culture is hard to build on Zoom, relationships are hard to strengthen and spontaneous collaboration is difficult, he said. And there’s research that backs this.

But none of these changes or reversions to our previous normal are a sure bet. We may find, just like in Wallace’s satirical storyline, that videocalls are just too much stress, and the world will revert back to phone calls and face-to-face time. We may realize that even when the technology gets better, the lifting of shutdowns and return to in-person life may mean fewer people are available for videocalls.

It’s hard to say which scenario is the most likely to play out in the long run. We’ve been terribly wrong about these things before.

How the laws of physics constrain the size of alien raindrops

Whether they’re made of methane on Saturn’s moon Titan or iron on the exoplanet WASP 76b, alien raindrops behave similarly across the Milky Way. They are always close to the same size, regardless of the liquid they’re made of or the atmosphere they fall in, according to the first generalized physical model of alien rain.

“You can get raindrops out of lots of things,” says planetary scientist Kaitlyn Loftus of Harvard University, who published new equations for what happens to a falling raindrop after it has left a cloud in the April JGR Planets. Previous studies have looked at rain in specific cases, like the water cycle on Earth or methane rain on Saturn’s moon Titan (SN: 3/12/15). But this is the first study to consider rain made from any liquid.

“They are proposing something that can be applied to any planet,” says astronomer Tristan Guillot of the Observatory of the Côte d’Azur in Nice, France. “That’s really cool, because this is something that’s needed, really, to understand what’s going on” in the atmospheres of other worlds.

Comprehending how clouds and precipitation form is important for grasping another world’s climate. Cloud cover can either heat or cool a planet’s surface, and raindrops help transport chemical elements and energy around the atmosphere.

Clouds are complicated (SN: 3/5/21). Despite lots of data on earthly clouds, scientists don’t really understand how they grow and evolve.

Raindrops, though, are governed by a few simple physical laws. Falling droplets of liquid tend to take on the same characteristic teardrop shape, regardless of the properties of the liquid. The rate at which that droplet evaporates is set by its surface area.

“This is basically fluid mechanics and thermodynamics, which we understand very well,” Loftus says.

She and Harvard planetary scientist Robin Wordsworth considered rain in a variety of different forms, including water on early Earth, modern Mars and a gaseous exoplanet called K2 18b that may host clouds of water vapor (SN: 9/11/19). The pair also considered Titan’s methane rain, ammonia “mushballs” on Jupiter and iron rain on the ultrahot gas giant exoplanet WASP 76b (SN: 3/11/20). “All these different condensables behave similarly, [because] they’re governed by similar equations,” she says.

The team found that worlds with higher gravity tend to produce smaller raindrops. Still, all the raindrops studied fall within a fairly narrow size range, from about a tenth of a millimeter to a few millimeters in radius. Much bigger than that, and raindrops break apart as they fall, Loftus and Wordsworth found. Much smaller, and they’ll evaporate before hitting the ground (for planets that have a solid surface), keeping their moisture in the atmosphere.
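Loftus and Wordsworth’s full model is more involved, but the gravity dependence they describe can be illustrated with the classic capillary length, sqrt(σ / ρg), the rough scale above which surface tension can no longer hold a falling drop together. This is a minimal sketch, not the paper’s actual calculation, and the property values below are approximate illustrative figures:

```python
import math

def capillary_length_mm(sigma, rho, g):
    """Return the capillary length sqrt(sigma / (rho * g)) in millimeters.

    sigma: surface tension of the liquid (N/m)
    rho:   liquid density (kg/m^3)
    g:     gravitational acceleration (m/s^2)
    """
    return 1000 * math.sqrt(sigma / (rho * g))

# Water on Earth: sigma ~ 0.072 N/m, rho ~ 1000 kg/m^3, g = 9.81 m/s^2
earth_water = capillary_length_mm(0.072, 1000, 9.81)

# Liquid methane on Titan: sigma ~ 0.017 N/m, rho ~ 450 kg/m^3, g = 1.35 m/s^2
titan_methane = capillary_length_mm(0.017, 450, 1.35)

print(f"Water on Earth:   ~{earth_water:.1f} mm")
print(f"Methane on Titan: ~{titan_methane:.1f} mm")
```

Both estimates land in the millimeter range the researchers report, and since g appears in the denominator, higher-gravity worlds give a smaller maximum drop size, matching the trend described above.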

Eventually the researchers would like to extend the study to solid precipitation like snowflakes and hail, although the math there will be more complicated. “That adage that every snowflake is unique is true,” Loftus says.

The work is a first step toward understanding precipitation in general, says astronomer Björn Benneke of the University of Montreal, who discovered water vapor in the atmosphere of K2 18b but was not involved in the new study. “That’s what we are all striving for,” he says. “To develop a kind of global understanding of how atmospheres and planets work, and not just be completely Earth-centric.”

Readers react to million-year-old mammoths, parasitic plants and more

Big mouths to feed

The oldest animal DNA yet recovered comes from a mammoth that lived over 1 million years ago, Erin Garcia de Jesús reported in “Million-year-old mammoth DNA found” (SN: 3/13/21, p. 6).

Considering that a modern elephant can consume hundreds of kilograms of plant food per day, what could the mammoth have eaten in the frozen tundra, reader Peter Nissenson asked.

Researchers don’t know for sure what the ancient animal may have eaten. But like other mammoths, its menu may have included flowers and grasses (SN: 3/22/14, p. 13). Recent studies suggest that dwindling food sources contributed to the demise of most mammoths about 10,000 years ago, Garcia de Jesús says. “So as the flowers disappeared, so too, perhaps, did the mammoths.”

A wild escape

Escaped genes from modified cotton crops are disrupting wild cotton’s interactions with insects, causing irreversible ecological effects, Emiliano Rodríguez Mega reported in “Modified genes may harm wild cotton” (SN: 3/13/21, p. 10).

Reader Marc Sapir wondered why the effects are irreversible if wild cotton plants with the escaped genes struggled more to survive than those without.

Once the genes escape, we can no longer control how they behave or where they go, Rodríguez Mega says. That doesn’t necessarily mean the genes will devastate wild cotton populations. But there currently isn’t a way for scientists to rid those populations of the genes.

The genes could disappear if they are selected against by natural forces, Rodríguez Mega says, but we don’t know that that will happen. More studies are needed to understand the long-term effects the escaped genes could have in the wild.

What makes a plant?

A parasitic plant, Sapria himalayana, has lost genes for stems, roots and photosynthetic tissue to live within its hosts, Jake Buehler reported in “A parasitic plant is missing many genes” (SN: 3/13/21, p. 13).

Given all that S. himalayana has lost, “is it even still a plant?” reader Jeff Fisher asked. Could it be something entirely new?

Biologists group organisms based on their shared evolutionary history, so S. himalayana is still considered a plant, Buehler says. But taxonomists and other researchers have long debated at what point an organism deviates enough from its ancestral line to earn a unique taxonomic rank. There currently isn’t a standard that scientists use to determine these bounds, Buehler says.

Researchers found that the parasite has taken a lot of DNA from its hosts, though much of it doesn’t encode any genes, Buehler reported. Reader D.C. Randle questioned why the plant would bother pilfering this genetic material.

Some of the stolen DNA does encode genes. And those genes may be useful, Buehler says. Some are involved in the parasite’s defense and stress responses, and another is instrumental in making pyrimidine, a crucial building block of nucleic acids like DNA.

Flying high

Microfliers powered by sunlight could fly in conditions like those high in Earth’s atmosphere, Emily Conover reported in “Tiny aircraft that fly by light could soar beyond airplanes’ reach” (SN: 3/13/21, p. 5).

Reader William S. Darter wondered how scalable these microfliers might be.

“The lift forces produced in this case are quite small, so the aircraft and its instruments have to be extremely light,” Conover says. That makes the aircraft a challenge to scale up. “One idea would be to create arrays of microfliers connected by thin carbon fibers. That would allow the microfliers to carry more massive payloads, though still only in the range of grams.”

Correction

“Two new books search for the meaning of life” (SN: 3/27/21, p. 28) incorrectly stated that a person’s metabolism increases to about 0.5 times its resting rate after eating. Metabolism increases to about 1.5 times the resting rate.

Only 3 percent of Earth’s land hasn’t been marred by humans

The Serengeti looks largely like it did hundreds of years ago.

Lions, hyenas and other top predators still stalk herds of wildebeests over a million strong, preventing them from eating too much vegetation. This diversity of trees and grasses supports scores of other species, from vivid green-orange Fischer’s lovebirds to dung beetles. In turn, such species carry seeds or pollen across the plains, enabling plant reproduction. Humans are there too, but in relatively low densities. Overall, it’s a prime example of what biologists call an ecologically intact ecosystem: a bustling tangle of complex relationships that together sustain a rich diversity of life, undiminished by us.

Such places are vanishingly rare.

The vast majority of land on Earth — a staggering 97 percent — no longer qualifies as ecologically intact, according to a sweeping survey of Earth’s ecosystems. Over the last 500 years, too many species have been lost, or their numbers reduced, researchers report April 15 in Frontiers in Forests and Global Change.

Of the few fully intact ecosystems, only about 11 percent fall within existing protected areas, the researchers found. Much of this pristine habitat exists in northern latitudes, in Canada’s boreal forests or Greenland’s tundra, which aren’t bursting with biodiversity. But chunks of the species-rich rainforests of the Amazon, Congo and Indonesia also remain intact.

“These are the best of the best, the last places on Earth that haven’t lost a single species that we know of,” says Oscar Venter, a conservation scientist at the University of Northern British Columbia in Prince George who wasn’t involved in the study. Identifying such places is crucial, he says, especially for regions under threat of development that require protection, like the Amazon rainforest.

Conservation scientists have long tried to map how much of the planet remains undegraded by human activity. Previous estimates using satellite imagery or raw demographic data found anywhere from 20 to 40 percent of the globe was free from obvious human incursions, such as roads, light pollution or the gaping scars of deforestation. But an intact forest canopy can hide an emptied-out ecosystem below.

“Hunting, the impacts of invasive species, climate change — these can harm ecosystems, but they can’t be easily sensed via satellite,” says conservation biologist Andrew Plumptre of the University of Cambridge. A Serengeti with fewer lions or hyenas — or none at all — may look intact from space, but it’s missing key species that help the whole ecosystem run.

What exactly constitutes a fully intact and functioning ecosystem is fuzzy and debated by ecologists, but Plumptre and his colleagues started by looking for habitats that retained their full retinue of species, at their natural abundance as of A.D. 1500. That’s the baseline the International Union for the Conservation of Nature uses to assess species extinctions, even though humans have been altering ecosystems by wiping out big mammals for thousands of years (SN: 8/26/15).

Large swaths of land are necessary to support wide-ranging species. So the researchers initially considered only areas larger than 10,000 square kilometers, roughly the size of Puerto Rico. The team combined existing datasets on habitat intactness with three different assessments of where species have been lost, encompassing about 7,500 animal species. While 28.4 percent of land areas larger than 10,000 square kilometers is relatively free from human disturbance, only 2.9 percent holds all the species it did 500 years ago. Shrinking the minimum size of the area included to 1,000 square kilometers bumps the percentage up, but barely, to 3.4.

Simply retaining species isn’t enough for ecological intactness, since diminished numbers of key players could throw the system out of whack. The researchers tallied up the population densities of just over a dozen large mammals whose collective ranges span much of the globe, including gorillas, bears and lions. This is a narrow look, Plumptre concedes, but large mammals play important ecological roles. They also have the best historical data and are often the first to be affected by human incursion. Factoring in declines in large mammals only slightly decreased the percentage of ecologically intact land, down to 2.8 percent.

Overall the tally of ecologically intact land “was much lower than we were expecting,” says Plumptre. “Going in, I’d guessed that it would be 8 to 10 percent. It just shows how huge an impact we’ve had.”

Both Venter and Jedediah Brodie, a conservation ecologist at the University of Montana in Missoula, question whether the authors were too strict in their definition of ecological intactness.

“Many ecosystems around the world have lost one or two species but are still vibrant, diverse communities,” Brodie says. A decline in a few species may not spell disaster for the whole ecosystem, since other species may swoop in to fill those roles.

Still, the study is a valuable first look that shows us “where the world looks like it did 500 years ago and gives us something to aim for,” Plumptre says. It also identifies areas ripe for restoration. While only 3 percent of land is currently ecologically intact, the introduction of up to five lost species could restore 20 percent of land to its former glory, the researchers calculate. 

Species reintroductions have worked well in places like Yellowstone National Park, where the restoration of wolves has put the ecosystem back into balance (SN: 7/21/20). Such schemes may not work everywhere. But as the global community discusses how to protect nature over the next decade (SN: 4/22/20), Plumptre hopes this study will prompt policy makers to “not just protect the land that’s there, but also think about restoring it to what it could be.”