By CBT CPAN
The developments surrounding the upcoming 2026 Unified Tertiary Matriculation Examination (UTME), conducted by the Joint Admissions and Matriculation Board, suggest a troubling possibility: a repeat of the widespread irregularities that plagued the 2025 exercise.
In 2025, nearly a quarter of candidates were forced to retake the examination due to technical failures, questionable results, and inconsistencies between questions and answers. Rather than demonstrating that lessons have been learned, the Board's recent actions indicate that similar systemic issues may be unfolding again.
1. Questionable Preparedness and Last-Minute Technical Decisions
One of the most alarming concerns is the late release of critical examination software.
• The browser designated for the 2026 UTME was reportedly released on April 12, 2026—just days before the exam scheduled for April 16.
• This release came after the national mock examination, which is traditionally intended to test system readiness.
This raises fundamental questions:
• What was the purpose of the mock exam if it did not utilize the actual examination software?
• How could CBT centres adequately test systems they had not yet received?
• Was the mock exercise truly a readiness assessment, or merely procedural?
Such timing undermines confidence in the board’s technical preparedness and suggests a reactive rather than proactive approach.
2. Administrative Actions and Lack of Technical Oversight
On April 11, 2026, JAMB reportedly convened a meeting with over 70 Computer-Based Test (CBT) centres, lasting until late at night. Following this meeting:
• Several centres were suddenly delisted
• Decisions were allegedly influenced by state coordinators with limited technical expertise
This introduces serious concerns:
• How can individuals without adequate technical knowledge fairly evaluate CBT centres?
• Were these decisions based on objective criteria or arbitrary judgment?
Such actions risk compromising transparency and fairness, while also destabilizing the infrastructure required for a national examination.
3. Financial Disputes and Their Operational Impact
Another critical issue is the non-payment and selective payment of CBT centres.
• Over 140 centres are reportedly yet to be paid for registration services
• A controversial policy—“No View, No Payment”—has been enforced
This creates multiple contradictions:
• If registrations are invalid, why were candidates not asked to re-register?
• If valid, why are centres not compensated?
The implications are serious:
• Financial strain discourages participation by capable centres
• Operational readiness is weakened
• Trust between stakeholders is eroded
Given that JAMB had a full year to prepare after the 2025 issues, this lack of financial accountability suggests institutional inefficiency.
4. Inconsistent Policies and Allegations of Double Standards
There are also claims of unequal treatment of CBT centres:
• Some centres that expressed concerns were persuaded and reassured
• Others were threatened or outright delisted
For example:
• A centre in Lagos was reportedly permanently delisted simply for indicating uncertainty about participating
This inconsistency raises concerns about:
• Fairness in regulatory enforcement
• Possible favoritism or arbitrary decision-making
• Lack of a unified policy framework
Such double standards weaken institutional credibility and discourage stakeholder cooperation.
5. Rising Operational Costs and Institutional Silence
CBT centres are facing significant cost pressures, particularly:
• High diesel prices (critical for powering systems during exams)
• Increased operational expenses
Despite this:
• There appears to be no structured engagement between JAMB and CBT operators to address these challenges
This lack of dialogue increases the risk of:
• Power disruptions during exams
• Reduced participation from centres
• Compromised candidate experience
6. Accountability and the Need for Institutional Responsibility
A recurring concern is JAMB’s tendency to attribute failures to CBT centres.
However, given that:
• JAMB oversees the entire examination framework
• It controls policies, software, and coordination
it follows that ultimate responsibility lies with the Board itself.
If similar issues arise in 2026, accountability should not be deflected onto third parties but should rest squarely with the institution entrusted with conducting the examination.
Conclusion: A Preventable Repeat?
The warning signs are clear:
• Late technical deployment
• Questionable administrative decisions
• Financial disputes
• Policy inconsistencies
• Lack of stakeholder engagement
These are not isolated issues—they point to systemic weaknesses.
Unless urgently addressed, the 2026 UTME risks becoming not just a repetition of past failures, but evidence of a failure to learn from them.
The question now is not whether problems could occur—but whether they are being actively prevented.