Macaque facial gestures are more than just a reflex, study finds


Ars Technica
21:25Z

Recent advances in brain-computer interfaces have made it possible to more accurately extract speech from neural signals in humans, but language is just one of the tools we use to communicate. “When my young nephew asks for ice cream before dinner and I say ‘no,’ the meaning is entirely dictated by whether the word is punctuated with a smirk or a stern frown,” says Geena Ianni, a neuroscientist at the University of Pennsylvania. That’s why in the future, she thinks, neural prostheses meant for patients with a stroke or paralysis will decode facial gestures from brain signals in the same way they decode speech.

To lay a foundation for these future facial gesture decoders, Ianni and her colleagues designed an experiment to find out how the neural circuitry responsible for making faces really works.
