
Freaky new A.I. scans your brain, then generates faces you’ll find attractive


Imagine if some not-too-distant future version of Tinder were able to crawl inside your brain and extract the features you find most attractive in a potential mate, then scan the romance-seeking search space for whichever partner possessed the greatest number of those physical attributes.


We’re not just talking qualities like height and hair color, either, but a far more complex equation based on a dataset of everyone you’ve ever found attractive before. In the same way that the Spotify recommendation system learns the songs you enjoy and then suggests others that conform to a similar profile — based on features like danceability, energy, tempo, loudness, and speechiness — this hypothetical algorithm would do the same for matters of the heart. Or, at least, the loins. Call it physical attractiveness matchmaking by way of A.I.

To be clear, Tinder isn’t — as far as I’m aware — working on anything remotely like this. But researchers from the University of Helsinki and Copenhagen University are. And while that description might smack somewhat of a dystopian shallowness pitched midway between Black Mirror and Love Island, in reality their brain-reading research is pretty darn fascinating.

Searching the face space

In their recent experiment, the researchers used a generative adversarial neural network, trained on a large database of 200,000 celebrity images, to dream up a series of hundreds of fake faces. These were faces with some of the hallmarks of certain celebrities — a strong jawline here, a piercing set of azure eyes there — but which were not instantly recognizable as the celebrities in question.

The images were then gathered into a slideshow and shown to 30 participants, who were kitted out with electroencephalography (EEG) caps that read their brain activity via the electrical signals on their scalps. Each participant was asked to concentrate on whether they thought the face on the screen was good-looking or not. Each face was shown for a short period of time before the next image appeared. Participants didn’t have to mark anything down on paper, press a button, or swipe right to indicate their approval. Simply focusing on what they found attractive was enough.

The Cognitive Computing Group

“We showed a large selection of these faces to participants, and asked them to selectively concentrate on faces they found attractive,” Michiel Spapé, a postdoctoral researcher at the University of Helsinki, told Digital Trends. “By capturing the brain waves by EEG that occurred just after seeing a face, we estimated whether a face was seen as attractive or not. This information was then used to drive a search within the neural network model — a 512-dimensional ‘face-space’ — and triangulate a point that would match an individual participant’s point of attractivity.”
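One way to picture the “triangulation” Spapé describes is as a weighted average in the generator’s 512-dimensional latent space: faces the EEG flags as attractive pull the estimate toward themselves. The following NumPy sketch is purely illustrative — the weighting scheme, score range, and function names are assumptions, not the researchers’ actual method:

```python
import numpy as np

def estimate_preference_vector(latents, attractiveness_scores):
    """Estimate a personal 'point of attraction' in GAN latent space.

    latents: (n_faces, 512) array, one latent vector per face shown.
    attractiveness_scores: (n_faces,) EEG-derived scores in [0, 1]
        (hypothetical; the real pipeline derives these from brain waves).
    Returns a single 512-d vector: the score-weighted mean of the
    latents, which a generator could decode into a personalized face.
    """
    weights = attractiveness_scores / attractiveness_scores.sum()
    return weights @ latents  # weighted average over all faces

# Toy usage: 100 faces in a 512-dimensional latent space
rng = np.random.default_rng(0)
latents = rng.standard_normal((100, 512))
scores = rng.uniform(0, 1, size=100)
pref = estimate_preference_vector(latents, scores)
print(pref.shape)  # (512,)
```

Faces that scored higher contribute more to the final point, so the result drifts toward the regions of face-space a given participant responded to most strongly.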

Finding the hidden data patterns that revealed preferences for certain features was achieved by using machine learning to probe the electrical brain activity each face provoked. Broadly speaking, the more of a certain kind of brain activity was spotted (more on that in a second), the greater the level of attraction. Participants didn’t have to single out particular features as attractive. To return to the Spotify analogy: just as we might unconsciously gravitate toward songs with a particular time signature, by measuring brain activity across large numbers of images and then letting an algorithm figure out what the attractive ones have in common, the A.I. can single out parts of the face we might not even realize we’re drawn to. Machine learning is, in this context, like a detective whose job is to connect the dots.
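At its core, the “detective” step is a classifier: given a feature vector summarizing the brain response to one face, label it attractive or not. The real study used EEG-specific machinery, so the toy below — a nearest-centroid classifier on synthetic feature vectors — is only a minimal stand-in to show the shape of the problem:

```python
import numpy as np

class NearestCentroid:
    """Toy stand-in for the EEG classifier: label each face's brain
    response as 'attractive' (1) or 'not' (0) by its distance to the
    per-class mean feature vectors learned from labeled trials."""

    def fit(self, X, y):
        # One centroid per class: the mean feature vector of its trials
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
        return self

    def predict(self, X):
        # Distance from every sample to both centroids; pick the nearer
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return d.argmin(axis=1)

# Synthetic "EEG feature" data: class 1 responses are shifted upward
rng = np.random.default_rng(2)
X0 = rng.normal(0.0, 1.0, size=(200, 16))   # not attractive
X1 = rng.normal(1.5, 1.0, size=(200, 16))   # attractive
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == y).mean()
print(acc > 0.9)  # True: well-separated toy classes are easy
```

Real single-trial EEG is far noisier than this toy data, which is why the study aggregates evidence across many images rather than trusting any one response.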

Swipe right brain

“It is not necessarily ‘increased brain activity,’ but rather that certain images resynchronize neural activity,” Spapé clarified. “That is, the living brain is always active. EEG is quite unlike [functional magnetic resonance imaging] in that we are not very sure where activity comes from, but only when it comes from something. Only because many neurons fire at the same time, in the same direction, are [we] able to pick up their [electrical] signature. So synchronization and desynchronization is what we pick up rather than ‘activity’ as such.”

He stressed that what the team has not done is to find a way to look at random EEG brain data and tell, immediately, if a person is looking at someone they find attractive. “Attraction is a very complex subject,” he said. Elsewhere, he noted that “we cannot do thought control.”


So how exactly have the researchers managed to carry out this experiment if they cannot guarantee that what they are measuring is attraction? The answer is that, in this scenario at least, they are measuring attraction. What the researchers see in this experimental setup is that, roughly 300 milliseconds after a participant sees an attractive image, their brain lights up with a particular electrical signal called a P300 wave. A P300 wave doesn’t always signify attraction, but rather the recognition of a relevant stimulus. What that stimulus is depends on what the person has been asked to look for. In other scenarios, where a person is asked to focus on different features, it might indicate something entirely different. (Case in point: the P300 response is used as a measure in lie detectors — and not necessarily to tell whether a person is telling the truth about their attraction to a particular person.)
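In practice, a P300-style analysis slices the EEG into epochs starting at each image onset and looks for a positive voltage deflection around 300 milliseconds later. Here is a minimal single-channel sketch on synthetic data; the 256 Hz sampling rate and the 250–450 ms window are illustrative assumptions, not the study’s parameters:

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed for this sketch)

def p300_score(epoch, fs=FS):
    """Mean amplitude in the 250-450 ms post-stimulus window.

    epoch: 1-D array of EEG samples starting at stimulus onset.
    A larger score suggests a stronger P300-like positive deflection,
    i.e. the stimulus was 'relevant' to what the viewer attended to.
    """
    start, stop = int(0.250 * fs), int(0.450 * fs)
    return float(epoch[start:stop].mean())

# Synthetic demo: flat noise vs. noise plus a bump peaking at 300 ms
rng = np.random.default_rng(1)
t = np.arange(int(0.8 * FS)) / FS                      # 800 ms epoch
noise = rng.normal(0, 0.5, size=t.size)
bump = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
print(p300_score(noise) < p300_score(noise + bump))    # True
```

Real pipelines average many epochs per condition before scoring, since single-trial EEG buries a deflection like this in noise.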

NeuroTinder and beyond

In this study, the researchers then used this attraction data to have the generative adversarial network generate new, customized faces combining the most brain-sparking traits — a Frankenstein-style assembly of the facial features each participant’s brain data indicated they found personally attractive.

“While there may be some facial features that seem to be generally preferred across participants, as some generated faces in our experiments look similar to each other, the model really captures personal features,” Tuukka Ruotsalo, an associate professor at the University of Helsinki, told Digital Trends. “There are differences in all generated images. In the most trivial aspect, participants with different gender preferences get faces matching that preference.”

Generating attractive people who have never existed is certainly a headline-grabbing use of this technology. However, it could have other, more meaningful applications, too. The interaction between a generative artificial neural network and human brain responses could also be used to test out human responses to different phenomena present in data.

“This could help us to understand the kind of features and their combinations that respond to cognitive functions, such as biases, stereotypes, but also preferences and individual differences,” said Ruotsalo.

A paper describing the work was recently published in the journal IEEE Transactions on Affective Computing.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…