RatMilad: A new spyware to watch out for.


    Posted by Rohit Yadav On 06-Oct-2022 07:30 AM

    2814 Views

New Android Malware Used in the Middle East


A new strain of Android malware, 'RatMilad', is being used in the Middle East to spy on victims and steal data from their smartphones. RatMilad is spyware: a class of malware designed to covertly monitor victims through their devices. It can record both video and audio, allowing an attacker to eavesdrop on private conversations and monitor people remotely.

Additionally, RatMilad allows attackers to change application permissions on the victim's device. The spyware spreads through fake VPN and phone-number-spoofing apps such as Text Me and NumRent. Because these apps are distributed via social media links, a wide range of users could be exposed to RatMilad. Once the fake app is installed, the RatMilad malware can begin stealing data and spying on the victim. The campaign has been attributed to an Iranian hacker group known as AppMilad.

RatMilad poses a serious threat, according to Android security company

[Image: Zimperium's tweet announcing the discovery of RatMilad]

The RatMilad malware strain was first discovered by Zimperium, a mobile security company. In a tweet posted on October 5, 2022, the company announced that its research team had found RatMilad active in the Middle East.

A Zimperium blog post states that once the RatMilad spyware is activated, hackers can "sideload a set of fake tools to enable critical permissions on the device." In the same blog post, Zimperium noted that it did not find any RatMilad-infected apps in the official Google Play store. Instead, download links are shared through channels such as Telegram.

RatMilad spyware allows attackers to obtain various types of information from a victim's device, because it can function as a Remote Access Trojan (RAT). In the aforementioned blog post, Zimperium states that RatMilad can access contact lists, call history, SMS lists, device information, and file lists. It can also read the victim's SIM card information and the device's GPS location.
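The data RatMilad harvests maps directly onto Android's runtime permissions, so one defensive check is to review which sensitive permissions an installed app has actually been granted. Below is a minimal illustrative sketch (not from Zimperium's report) that parses the `granted=true` lines found in `adb shell dumpsys package <pkg>` output and flags permissions matching RatMilad-style capabilities; the permission list and the sample text are assumptions for demonstration.

```python
import re

# Permissions a RatMilad-style RAT abuses, per the capabilities listed above
# (contacts, call logs, SMS, GPS location, audio/video recording).
SENSITIVE = {
    "android.permission.READ_CONTACTS",
    "android.permission.READ_CALL_LOG",
    "android.permission.READ_SMS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.CAMERA",
}

def flag_sensitive(dumpsys_text):
    """Return sensitive permissions marked granted=true in
    `adb shell dumpsys package <pkg>`-style output."""
    granted = set(re.findall(r"(android\.permission\.\w+): granted=true",
                             dumpsys_text))
    return granted & SENSITIVE

# Hypothetical sample of dumpsys output for a suspicious app.
sample = """\
    android.permission.INTERNET: granted=true
    android.permission.READ_CONTACTS: granted=true
    android.permission.RECORD_AUDIO: granted=true
    android.permission.VIBRATE: granted=false
"""

print(sorted(flag_sensitive(sample)))
```

A benign permission like INTERNET is ignored even when granted, while denied permissions (`granted=false`) are never flagged; on a real device you would pipe the output of `adb shell dumpsys package` for each suspicious package into this check.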

RatMilad poses a serious threat to Android users

RatMilad is a dangerous program given the range of malicious functions it can perform. At the time of writing it had only been observed in the Middle East, but it may spread to other regions in the coming months.


Deep Fake Defense 101: Learn How to Stop the Spread of Deceptive Videos

In recent times, video calls have become a popular way for people to connect with loved ones or colleagues across the globe. However, it's important to be cautious during video calls, as advances in technology, particularly artificial intelligence (AI), have made it easier for fraudsters to deceive unsuspecting individuals.

A case in point is an incident in northern China where a man fell victim to an AI-driven video call scam involving deepfake technology. Using AI-powered face-swapping, the scammer posed as the victim's close friend during the call and convinced him to transfer a significant amount of money. The victim realized he had been duped only when his real friend expressed no knowledge of the call. Fortunately, local police were able to recover most of the stolen money and are actively working to trace the remaining amount. To combat the growing threat of AI-driven scams, China has implemented new rules to provide legal protection for victims and has been tightening its scrutiny of such technology.

Now the question arises: how can one spot a fake video call? Here are some signs to watch out for:

- Video quality: Fake videos often have poor quality. Look for watermarks or other indications that the video is sourced online.
- Contact information: Verify that the caller is in your contact list and that the displayed name matches the contact information you have for that person.
- Video sizing: Pay attention to any distortion in the video proportions, as fake video calls may require resizing to fit the webcam window.

Given the rise of AI-based scams in India, it is crucial for individuals to remain vigilant and exercise caution during video and voice calls. One study found that India has the highest number of victims, with 83% of Indians surveyed falling prey to fraudulent activities. Deepfake technology has become a growing concern in recent years. It involves the use of AI to create realistic yet fake videos or images by collecting and mimicking visual and audio data of the target individual, often obtained from publicly available sources like social media. In summary, staying alert and knowing the signs of a fake video call can help protect individuals from falling victim to AI-driven scams.
