Decoding the “100-Point Scale”: How Team 14 of Lac Hong University Was Evaluated in the Dong Nai Cybersecurity Exercise

 

In the Dong Nai Province Live-Fire Cybersecurity and Information Security Exercise 2025, teams were not assessed on subjective impressions or the sheer number of technical actions performed. Instead, performance was measured against a rigorous evaluation framework with a maximum score of 100 points. This approach reflects the true nature of modern cybersecurity, where quality, depth of analysis, and the value of remediation recommendations are just as important as the discovery of vulnerabilities.

For Attack Team No. 14 from Lac Hong University, the 100-point scale served not only as a benchmark of technical competence but also as a test of professional thinking within a provincial-level information security environment.

A Multidisciplinary Judging Panel and Comprehensive Evaluation

The judging panel for this year’s exercise brought together experts from the Dong Nai Provincial Police, the Dong Nai Department of Science and Technology, and representatives of major telecommunications and technology enterprises such as MobiFone. This multidisciplinary composition reflected a comprehensive view of cybersecurity, extending beyond technical considerations to include public-sector governance and real-world system operations.

From the judges’ perspective, a strong attack team was not one that created numerous “high-risk situations,” but one that could accurately identify key issues, correctly assess risk levels, and propose recommendations appropriate to the operational context of government information systems.

Scores Beyond Vulnerability Discovery

Within the overall 100-point scale, the identification and classification of security vulnerabilities accounted for only part of the evaluation. While vulnerability discovery remained essential, it was not the sole determinant of team performance. The judging panel placed greater value on identifying vulnerabilities with real-world impact, rather than listing purely technical issues with limited relevance to the overall security posture of the system.

This scoring approach required Team 14 to adopt a selective and strategic methodology, focusing on weaknesses that truly mattered to the official email system and the operational activities of government agencies.

Technical Complexity and Depth of Expertise

Another key evaluation criterion was the level of technical complexity and analytical depth demonstrated during the exercise. The judges did not prioritize superficial testing or basic trial-and-error techniques. Instead, they focused on the team’s ability to analyze system architecture, understand risk chains, and identify interconnections among vulnerabilities.

For the student members of Team 14, this criterion posed a significant challenge. It demanded not only technical knowledge, but also systems thinking and an understanding of how public-sector infrastructure operates. Meeting this requirement highlighted the effectiveness of the university’s training in preparing students to think like cybersecurity professionals.

Remediation Recommendations as a Measure of Responsibility

One of the most notable components of the scoring framework was the evaluation of remediation recommendations following vulnerability identification. The judging panel placed particular emphasis on whether the attack team could propose solutions aligned with the system’s real-world conditions, encompassing processes, personnel, and technology.

Recommendations were expected to go beyond merely stating problems. They had to reflect a sense of responsibility and aim to support the defensive team and system administrators in improving long-term security. This criterion most clearly distinguished responsible security testing from destructive or irresponsible attack behavior.

The 100-Point Scale as a Test of Professional Mindset

Viewed holistically, the “100-point scale” used in the Dong Nai cybersecurity exercise was not simply a ranking mechanism, but a comprehensive assessment of professional mindset in the field of information security. Attack teams were required to demonstrate a clear understanding of the exercise’s objectives, strict adherence to discipline, depth of expertise, and a strong sense of responsibility toward the systems under assessment.

For Team 14 of Lac Hong University, participation under such stringent evaluation criteria provided students with early exposure to professional standards while still in an academic environment. In this context, the final score reflected not only exercise performance but also the maturity of professional thinking among future cybersecurity practitioners.

From Exercise Scoring to Training Outcomes

The evaluation methodology applied in the live-fire exercise also set clear expectations for cybersecurity education and workforce development. Students must not only master technical skills, but also develop risk analysis capabilities, operational awareness, and the ability to propose feasible solutions. These competencies align closely with the needs of both the labor market and government authorities.

Through participation and assessment under the “100-point scale,” Lac Hong University reaffirmed its commitment to practice-oriented education grounded in real-world requirements and professional standards. The Dong Nai cybersecurity exercise thus served not only as a technical competition, but also as a benchmark for training quality and the capabilities of the next generation of information security professionals.

 

 

Faculty of Information Technology