News

Re-Victimization from Police-Auctioned Cell Phones

Countless smartphones seized in arrests and searches by police forces across the United States are being auctioned online without first having the data on them erased, a practice that can lead to crime victims being re-victimized, a new study found. In response, the largest online marketplace for items seized in U.S. law enforcement investigations says it now ensures that all phones sold through its platform will be data-wiped prior to auction.

Researchers at the University of Maryland last year purchased 228 smartphones sold “as-is” from PropertyRoom.com, which bills itself as the largest auction house for police departments in the United States. Of the phones they won at auction (at an average of $18 per phone), the researchers found 49 had no PIN or passcode; they were able to guess an additional 11 PINs by using the top 40 most popular PINs or swipe patterns.
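As a rough illustration of that guessing step, here is a minimal Python sketch that checks a sample of device PINs against a short excerpt of a published most-common-PINs list. The device names, PINs, and truncated list are hypothetical stand-ins, not the researchers' actual code or data.

```python
# Hypothetical sketch: how many devices would a "most common PINs" list unlock?
# The guess list below is a short excerpt of widely published common-PIN lists;
# the researchers used the top 40 most popular PINs and swipe patterns.
TOP_PINS = ["1234", "0000", "1111", "1212", "7777", "1004", "2000", "4444"]

def guessable(device_pins: dict[str, str], guess_list: list[str]) -> list[str]:
    """Return the devices whose PIN appears in the guess list."""
    return [dev for dev, pin in device_pins.items() if pin in guess_list]

if __name__ == "__main__":
    sample = {"phone-01": "1234", "phone-02": "8361", "phone-03": "0000"}
    hits = guessable(sample, TOP_PINS)
    print(f"{len(hits)} of {len(sample)} sample devices use a common PIN: {hits}")
```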

Phones may end up in police custody for any number of reasons — such as when their owner is arrested for identity theft — and in those cases the phone itself was used as a tool to commit the crime.

“We initially expected that police would never auction these phones, as they would enable the buyer to recommit the same crimes as the previous owner,” the researchers explained in a paper released this month. “Unfortunately, that expectation has proven false in practice.”

The researchers said that while they could have employed more aggressive technical measures to recover more of the PINs on the phones they bought, they concluded from their sample that a great many of the devices they won at auction had probably not been data-wiped and were protected only by a PIN.

Beyond what you would expect to find on unwiped secondhand phones — every text message, picture, email, browser history, location history, and so on — the researchers found that the 61 phones they were able to access also contained significant amounts of data pertaining to crime, including victims’ data.

Some readers may be wondering at this point, “Why should we care about what happens to a criminal’s phone?” First off, it’s not entirely clear how these phones ended up for sale on PropertyRoom.

“Some folks are like, ‘Yeah, whatever, these are criminal phones,’ but are they?” said Dave Levin, an assistant professor of computer science at the University of Maryland.

“We started looking at state laws around what they’re supposed to do with lost or stolen property, and we found that most of it ends up going the same route as civil asset forfeiture,” Levin continued. “Meaning, if they can’t find out who owns something, it eventually becomes the property of the state and gets shipped out to these resellers.”

Also, the researchers found that many of the phones clearly had personal information on them regarding previous or intended targets of crime: A dozen of the phones had photographs of government-issued IDs. Three of those were on phones that apparently belonged to sex workers; their phones contained communications with clients.

[Image: An overview of the phone functionality and data accessibility for phones purchased by the researchers.]

One phone had full credit files for eight different people on it. On another device they found a screenshot including 11 stolen credit cards that were apparently purchased from an online carding shop. On yet another, the former owner had apparently been active in a Telegram group chat that sold tutorials on how to run identity theft scams.

The most interesting phone from the batches they bought at auction was one with a sticky note attached that included the device’s PIN and the notation “Gry Keyed,” no doubt a reference to the Graykey software that is often used by law enforcement agencies to brute-force a mobile device PIN.

“That one had the PIN on the back,” Levin said. “The message chain on that phone had 24 Experian and TransUnion credit histories.”

The University of Maryland team said they took care in their research not to further the victimization of people whose information was on the devices they purchased from PropertyRoom.com. That involved ensuring that none of the devices could connect to the Internet when powered on, and scanning all images on the devices against known hashes for child sexual abuse material.
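Conceptually, that screening step amounts to hashing each image and checking the digest against a database of known hashes. Here is a minimal sketch of the idea using plain SHA-256 file digests; real screening systems rely on curated databases and perceptual hashing (e.g., PhotoDNA), and the file layout and hash-list format below are assumptions for illustration.

```python
# Minimal sketch of hash-based image screening. The hash-list format
# (one lowercase hex digest per line) is an assumption for illustration.
import hashlib
from pathlib import Path

def load_known_hashes(path: str) -> set[str]:
    """Load one lowercase hex digest per line."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_known_images(image_dir: str, known: set[str]) -> list[Path]:
    """Return files under image_dir whose digest appears in the known set."""
    return [p for p in Path(image_dir).rglob("*")
            if p.is_file() and sha256_of(p) in known]
```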

It is common to find phones and other electronics for sale on auction platforms like eBay that have not been wiped of sensitive data, but in those cases eBay doesn’t possess the items being sold. In contrast, platforms like PropertyRoom obtain devices and resell them at auction directly.

PropertyRoom did not respond to multiple requests for comment. But the researchers said sometime in the past few months PropertyRoom began posting a notice stating that all mobile devices would be wiped of their data before being sold at auction.

“We informed them of our research in October 2022, and they responded that they would review our findings internally,” Levin said. “They stopped selling them for a while, but then it slowly came back, and then we made sure we won every auction. And all of the ones we got from that were indeed wiped, except there were four devices that had external SD [storage] cards in them that weren’t wiped.”

A copy of the University of Maryland study is here (PDF).

—————

Micro-Star International Signing Key Stolen

Micro-Star International—aka MSI—had its UEFI signing key stolen last month.

This raises the possibility that whoever holds the leaked key could push out updates that would infect a computer’s most nether regions without triggering a warning. To make matters worse, said Alex Matrosov, founder of the firmware security firm Binarly, MSI doesn’t have an automated patching process the way Dell, HP, and many larger hardware makers do. Consequently, MSI doesn’t provide the same kind of key revocation capabilities.
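To see why a leaked private key is so dangerous, consider how signed updates are checked: the device trusts the vendor's public key and accepts any image whose signature verifies, so anyone holding the private key can mint updates that pass. The minimal sketch below demonstrates the principle with Ed25519 via Python's cryptography package; MSI's actual UEFI/Boot Guard signing uses different algorithms and formats.

```python
# Minimal sketch of signed-update verification, assuming Ed25519 for brevity
# (real UEFI/Boot Guard signing differs in algorithm and format).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()        # the secret that leaked
device_trusted_pubkey = vendor_key.public_key()  # baked into the device

firmware = b"...firmware image bytes..."
signature = vendor_key.sign(firmware)  # anyone with the key can produce this

try:
    # The device only checks the math; it cannot tell a vendor build
    # from an attacker's build signed with the same stolen key.
    device_trusted_pubkey.verify(signature, firmware)
    print("update accepted")
except InvalidSignature:
    print("update rejected")
```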

Delivering a signed payload still isn’t trivial, though: “Gaining the kind of control required to compromise a software build system is generally a non-trivial event that requires a great deal of skill and possibly some luck.” But with the key in the wild, it just got a whole lot easier.

—————

Artificial Imposters—Cybercriminals Turn to AI Voice Cloning for a New Breed of Scam

Three seconds of audio is all it takes.  

Cybercriminals have taken up newly forged artificial intelligence (AI) voice cloning tools and created a new breed of scam. With a small sample of audio, they can clone the voice of nearly anyone and send bogus messages by voicemail or voice messaging texts. 

The aim, most often, is to trick people out of hundreds, if not thousands, of dollars. 

The rise of AI voice cloning attacks  

Our recent global study found that out of 7,000 people surveyed, one in four said that they had experienced an AI voice cloning scam or knew someone who had. Further, our research team at McAfee Labs discovered just how easily cybercriminals can pull off these scams. 

With a small sample of a person’s voice and a script cooked up by a cybercriminal, these voice clone messages sound convincing: 70% of people in our worldwide survey said they weren’t confident they could tell the difference between a cloned voice and the real thing. 

Cybercriminals create the kind of messages you might expect. Ones full of urgency and distress. They will use the cloning tool to impersonate a victim’s friend or family member with a voice message that says they’ve been in a car accident, or maybe that they’ve been robbed or injured. Either way, the bogus message often says they need money right away. 

In all, the approach has proven quite effective so far. One in ten people surveyed in our study said they had received a message from an AI voice clone, and 77% of those victims said they lost money as a result.  

The cost of AI voice cloning attacks  

Of the people who reported losing money, 36% said they lost between $500 and $3,000, while 7% got taken for sums anywhere between $5,000 and $15,000. 

Of course, a clone needs an original. Cybercriminals have no difficulty sourcing original voice files to create their clones. Our study found that 53% of adults said they share their voice data online or in recorded notes at least once a week, and 49% do so up to ten times a week. All this activity generates voice recordings that could be subject to hacking, theft, or sharing, whether accidental or malicious.  

Consider that people post videos of themselves on YouTube, share reels on social media, and perhaps even participate in podcasts. Even by accessing relatively public sources, cybercriminals can stockpile their arsenals with powerful source material. 

Nearly half (45%) of our survey respondents said they would reply to a voicemail or voice message purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (40%), mother (24%), or child (20%).  

Further, they reported they’d likely respond to one of these messages if the message sender said: 

  • They’ve been in a car accident (48%). 
  • They’ve been robbed (47%). 
  • They’ve lost their phone or wallet (43%). 
  • They needed help while traveling abroad (41%). 

These messages are the latest examples of targeted “spear phishing” attacks, which target specific people with information that seems just credible enough to act on. Cybercriminals will often source this information from public social media profiles and other places online where people post about themselves, their families, their travels, and so on—and then attempt to cash in.  

Payment methods vary, yet cybercriminals often ask for forms that are difficult to trace or recover, such as gift cards, wire transfers, reloadable debit cards, and even cryptocurrency. As always, requests for these kinds of payments raise a major red flag. It could very well be a scam. 

AI voice cloning tools—freely available to cybercriminals 

In conjunction with this survey, researchers at McAfee Labs spent two weeks investigating the accessibility, ease of use, and efficacy of AI voice cloning tools. They readily found more than a dozen freely available on the internet. 

These tools required only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce a clone with an 85% voice match to the original (based on the benchmarking and assessment of McAfee security researchers). With further effort, the accuracy rises: by training the data models, McAfee researchers achieved a 95% voice match based on just a small number of audio files.   
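McAfee has not published how its "voice match" percentage is computed, but one common way to score speaker similarity is cosine similarity between speaker-embedding vectors. The sketch below illustrates that idea with synthetic placeholder vectors; the embedding model, vector size, and noise level are assumptions, not McAfee's methodology.

```python
# Hypothetical "voice match" score as cosine similarity between embeddings.
import numpy as np

def voice_match(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two speaker-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder "embeddings": a real pipeline would extract these from audio
# with a speaker-verification model, not generate them randomly.
rng = np.random.default_rng(0)
original = rng.normal(size=256)                      # the real voice
clone = original + rng.normal(scale=0.2, size=256)   # a close imitation

print(f"voice match: {voice_match(original, clone):.0%}")
```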

McAfee’s researchers also discovered that they could easily replicate accents from around the world, whether from the US, UK, India, or Australia. More distinctive voices were harder to copy, however, such as those of people who speak with an unusual pace, rhythm, or style. (Think of actor Christopher Walken.) Such voices require more effort to clone accurately, and the people who have them are less likely to be cloned convincingly, at least given where the AI technology stands today, and putting comedic impersonations aside.  

The research team stated that this is yet one more way AI has lowered the barrier to entry for cybercriminals. Whether they use it to create malware, write deceptive messages for romance scams, or now mount spear phishing attacks with voice cloning technology, it has never been easier to commit sophisticated-looking, and sounding, cybercrime. 

Likewise, the study found that the rise of deepfakes and other disinformation created with AI tools has made people more skeptical of what they see online: 32% of adults said their trust in social media is lower than it has ever been. 

Protect yourself from AI voice clone attacks 

  1. Set a verbal codeword with kids, family members, or trusted close friends. Make sure it’s one only you and those closest to you know. (Banks and alarm companies often set up accounts with a codeword in the same way to ensure that you’re really you when you speak with them.) Make sure everyone knows and uses it in messages when they ask for help. (For the software-minded, a minimal sketch of this kind of shared-secret check follows this list.) 
  2. Always question the source. In addition to voice cloning tools, cybercriminals have other tools that can spoof phone numbers so that they look legitimate. Even if it’s a voicemail or text from a number you recognize, stop, pause, and think. Does that really sound like the person you think it is? Hang up and call the person directly or try to verify the information before responding.  
  3. Think before you click and share. Who is in your social media network? How well do you really know and trust them? The wider your connections, the more risk you may be opening yourself up to when sharing content about yourself. Be thoughtful about the friends and connections you have online and set your profiles to “friends and families” only so your content isn’t available to the greater public. 
  4. Protect your identity. Identity monitoring services can notify you if your personal information makes its way to the dark web and provide guidance for protective measures. This can help shut down other ways that a scammer can attempt to pose as you. 
  5. Clear your name from data broker sites. How’d that scammer get your phone number anyway? It’s possible they pulled that information off a data broker site. Data brokers buy, collect, and sell detailed personal information, which they compile from several public and private sources, such as local, state, and federal records, in addition to third parties. Our Personal Data Cleanup service scans some of the riskiest data broker sites and shows you which ones are selling your personal info. 
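The codeword in tip 1 is, at bottom, a shared secret. Below is a minimal Python sketch of the equivalent check in code; the codeword, helper names, and storage scheme are illustrative assumptions, and a constant-time comparison (hmac.compare_digest) avoids leaking information through timing.

```python
# Hypothetical software analogue of a family "codeword" check. The codeword,
# function names, and storage scheme are illustrative assumptions only.
import hashlib
import hmac

def codeword_digest(word: str) -> bytes:
    """Store a digest rather than the codeword itself."""
    return hashlib.sha256(word.strip().lower().encode()).digest()

STORED = codeword_digest("blue pelican")  # agreed in person, never in messages

def verify_codeword(candidate: str) -> bool:
    # compare_digest runs in constant time, avoiding timing side channels.
    return hmac.compare_digest(codeword_digest(candidate), STORED)

print(verify_codeword("Blue Pelican"))  # True (normalized before hashing)
print(verify_codeword("red walrus"))    # False
```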

Get the full story 

A lot can come from a three-second audio clip. 

With the advent of AI-driven voice cloning tools, cybercriminals have created a new form of scam. With arguably stunning accuracy, these tools let cybercriminals clone the voice of nearly anyone. All they need is a short audio clip to kick off the cloning process. 

Yet as with all scams, you have ways to protect yourself. A sharp sense of what seems right and wrong, along with a few straightforward security steps, can help keep you and your loved ones from falling for these AI voice clone scams. 

For a closer look at the survey data, along with a nation-by-nation breakdown, download a copy of our report here. 

Survey methodology 

The survey was conducted between January 27 and February 1, 2023 by the market research company MSI-ACI, with people aged 18 years and older invited to complete an online questionnaire. In total, 7,000 people completed the survey across nine countries: the United States, United Kingdom, France, Germany, Australia, India, Japan, Brazil, and Mexico. 


—————