Trending in Nigeria

Loading...

2/3/2026



NITDA Warns Nigerians About Critical Vulnerabilities in GPT-4 and GPT-5 AI Models


The agency’s warning underscores the growing need for robust cybersecurity measures as AI systems become increasingly integrated into professional and everyday digital activities.

GPT Risk in Nigeria. Photo: Unsplash/Emiliano Vittoriosi

GPT Risk in Nigeria. Photo: Unsplash/Emiliano Vittoriosi

The National Information Technology Development Agency (NITDA), in its statement identified seven critical weaknesses that could allow attackers to manipulate AI outputs and potentially access users’ sensitive data.

It has issued a cautionary advisory to Nigerian cyber users and professionals relying on AI models, highlighting serious security vulnerabilities in OpenAI’s GPT-4 and GPT-5 systems.

According to the agency, hidden malicious instructions can be embedded in everyday web content, ranging from social media comments to shortened links. This can cause AI systems to unknowingly execute harmful commands during routine tasks such as summarizing text or browsing online.

Other exploits detailed by NITDA include bypassing safety filters, concealing dangerous content via markdown bugs, and memory poisoning. This is a tactic that can gradually alter an AI model’s behavior over time, potentially resulting in data leaks or unauthorized actions.

While OpenAI has acknowledged and patched some of these issues, large language models continue to struggle with detecting cleverly disguised malicious instructions, leaving users exposed to potential risks.

In light of these findings, NITDA strongly advises AI users to exercise caution, always verify AI-generated outputs, and remain vigilant against suspicious online content.

The agency’s warning underscores the growing need for robust cybersecurity measures as AI systems become increasingly integrated into professional and everyday digital activities.

Next

Announcements / Notice!!!

Tue Mar 17 2026


Loading...

X

Unlock News Faster ...Sign Up

Recommended