GhostGPT: The AI Chatbot Empowering Cybercriminals with Malware and Phishing Schemes

Cybersecurity researchers have uncovered a new malicious AI chatbot named GhostGPT, designed to facilitate a range of illegal activities, including phishing and malware creation. Developed to assist cybercriminals, this tool has been observed for sale on Telegram since late 2024, according to researchers at Abnormal Security.

GhostGPT is believed to operate by leveraging a jailbroken version of ChatGPT or an open-source large language model (LLM) through a custom wrapper, enabling unrestricted and potentially harmful outputs.

GhostGPT: The Latest in a Line of Malicious AI Tools

GhostGPT follows in the footsteps of WormGPT, an AI chatbot released in 2023 and tailored for business email compromise (BEC) schemes. Other similar tools, such as WolfGPT and EscapeGPT, have since surfaced, further demonstrating the growing interest of cybercriminals in exploiting generative AI technology.

Researchers noted that GhostGPT has already garnered significant attention, with thousands of views on online forums. This indicates a rising demand among threat actors for AI-driven tools that simplify complex cybercriminal operations.

Simplifying Cybercrime

GhostGPT stands out for its accessibility and ease of use. Unlike previous tools that required jailbreaking ChatGPT or setting up an open-source LLM, GhostGPT is available as a Telegram bot. Users can purchase access via the messaging platform, bypassing the technical challenges associated with configuring similar tools.

“Users can pay a fee, gain immediate access, and focus directly on executing their attacks,” Abnormal Security researchers stated.

The creators of GhostGPT also claim that the tool ensures anonymity by not recording user activity, further appealing to cybercriminals seeking to conceal their operations.

Capabilities of GhostGPT

GhostGPT is marketed as a versatile tool for a variety of malicious activities, including:

  • Writing convincing phishing and BEC emails.
  • Coding and developing malware.
  • Crafting exploits for cyberattacks.

Promotional materials for GhostGPT highlight its fast response times and efficiency, enabling users to produce harmful content and gather information more effectively.

To evaluate the tool’s effectiveness, researchers asked GhostGPT to generate a phishing email impersonating DocuSign. The chatbot promptly produced a highly convincing template, underscoring its potential as a powerful weapon for cybercriminals.

The Growing Threat of AI in Cybercrime

The emergence of GhostGPT and similar tools signals a troubling trend in the cybercrime landscape. By lowering the barrier to entry for less-skilled threat actors, these AI-driven platforms could lead to a surge in sophisticated cyberattacks.

As the misuse of AI technology continues to evolve, cybersecurity experts emphasize the urgent need for robust defenses and proactive measures to counter these emerging threats.

Follow The420.in on Telegram, Facebook, Twitter, LinkedIn, Instagram and YouTube.
