The National Information Technology Development Agency (NITDA) has issued a security advisory, warning users about newly discovered vulnerabilities in OpenAI’s ChatGPT models.
NITDA issued the advisory in a post on Sunday on its
official X account.
The agency said seven flaws were identified in “GPT-4o and
GPT-5 models” that could enable attackers to manipulate the system through
indirect prompt injection.
“By embedding hidden instructions in web pages, comments, or
crafted URLs, attackers can cause ChatGPT to execute unintended commands simply
through normal browsing, summarisation, or search actions,” the agency said.
“Some flaws also enable attackers to bypass safety filters
using trusted domains, exploit markdown rendering bugs to hide malicious
content and even poison ChatGPT memory so that injected instructions persist
across future interactions.
“While OpenAI has fixed parts of the issue, LLMs still
struggle to reliably separate genuine user intent from malicious data.”
The agency warned that the weaknesses pose significant
risks, including unauthorised system actions, data exposure, distorted outputs,
and behavioural manipulation.
These threats, according to the data body, could be
triggered even when users do not click on anything, particularly when ChatGPT
processes search results or online content containing hidden commands.
NITDA urged organisations to “limit or disable
browsing/summarisation of untrusted sites within enterprise environments”.
“Only enable ChatGPT capabilities like browsing or memory
when operationally necessary,” the agency said.
NITDA also recommended that users “regularly update and
patch the GPT-4o and GPT-5 models to ensure that any known vulnerabilities are
addressed”.
The agency directed users to contact its computer emergency
readiness and response team (CERRT) for further enquiries.
Advertise on NigerianEye.com to reach thousands of our daily users

No comments
Post a Comment
Kindly drop a comment below.
(Comments are moderated. Clean comments will be approved immediately)
Advert Enquires - Reach out to us at NigerianEye@gmail.com