Assessment System Manual

The purpose of this manual is to establish an assessment system for all training courses offered by the institution. It provides a structured framework of policies and procedures to ensure that trainee assessments are conducted consistently, fairly, and in full alignment with the STCW Convention, 1978, as amended, national regulations, and company-specific training requirements in the maritime sector. This manual fulfills the regulatory requirement for an adequate assessment system by clearly outlining how assessments will be designed, administered, and managed to verify that seafarers meet the prescribed competence standards under the STCW Convention. In essence, the rationale for this manual is to promote valid and reliable assessment of competencies, uphold the integrity of the certification process, and ensure that every assessment contributes to the production of competent and safety-conscious seafarers in accordance with STCW objectives and Maritime Industry Authority policies.

This manual’s objectives are to:

  • Document assessment policies and procedures that comply with all applicable MARINA-STCW regulations and international standards. By doing so, the manual helps the institution meet the requirements of the MARINA Memorandum Circulars, which mandate the establishment of an adequate assessment system. It also aligns with ISO guidelines on planning, implementing, and reviewing assessments of learning outcomes.
  • Provide a uniform approach to designing and conducting assessments across all STCW-mandated courses and other courses offered by the institution. This ensures consistency in how knowledge, skills, and competence are evaluated, regardless of course or instructor. Standardization promotes fairness and reliability in results, as the same criteria and processes are applied to all trainees.
  • Define measures to guarantee that assessments are conducted ethically and without bias. All candidates are assessed under equivalent conditions, scoring criteria are transparent, and anti-cheating safeguards are in place to protect the integrity of each exam. Adhering to these procedures supports the requirements of fairness and ethics.
  • Serve as a reference for instructors, assessors, and administrators on their responsibilities in the assessment process. The manual clarifies who is responsible for what – from preparing question banks to invigilation and grading – so that staff can carry out assessments confidently and in compliance with approved procedures.
  • Establish a system for reviewing and refining assessment tools and processes over time. By setting objectives for monitoring results and obtaining feedback, the manual ensures that the assessment system can be periodically evaluated and improved. Problems such as unclear questions or outdated content can thus be identified and corrected in subsequent revisions of the assessments.

This manual covers the assessment system for all STCW-mandated training courses delivered by the institution. It applies to the full range of mandatory courses listed by MARINA (as per the official list of STCW mandatory training courses), including any ancillary or refresher training that falls under STCW Convention requirements. The procedures herein govern both theoretical (written) assessments and practical assessments (including simulator-based assessments) conducted as part of course completion requirements.

The provisions of this manual apply to all personnel involved in training assessment processes, including but not limited to: course developers, instructors, designated assessors, proctors/invigilators, and training administrators. All MARINA-accredited instructors and assessors are expected to implement these policies in their respective roles. The manual is to be followed in both classroom-based (face-to-face) training and in technology-assisted learning environments. Where courses are delivered via distance learning or e-learning modes, the same assessment standards apply, with additional provisions to address remote assessment security in line with MARINA’s distance learning guidelines. This manual is therefore applicable to assessments conducted on-site at the training center, as well as LMS-based assessments used in blended or online course delivery.

It is the responsibility of the Training Manager (or equivalent academic head) to ensure that all assessments within scope are designed and conducted in accordance with this manual. Individual instructors and assessors are responsible for conducting assessments in accordance with the procedures, while the Quality Assurance (QA) department is responsible for monitoring the compliance and effectiveness of the assessment system. Trainees are also within the scope insofar as they must adhere to the examination rules and procedures defined herein.

For the purpose of this manual, the following key terms are defined operationally:

Term | Definition | Reference
Administration | Refers to the Maritime Industry Authority (MARINA). | MC No. SC – 2021-08; MC No. SC – 2022-05
Approved Training Course | Refers to a learner-centered system of instruction, approved by the Administration, designed to equip trainees with the necessary knowledge, understanding, and proficiency that lead to the acquisition of the required competences under the STCW Convention, 1978, as amended. | MC No. SC – 2021-08; MC No. SC – 2021-09
Asynchronous delivery (offline learning) | Refers to the conduct of classes through distance and e-learning that do not occur in the same place or at the same time. | MC No. SC – 2021-10
Assessment of Learning | Refers to the systematic collection, measurement, and examination of the trainee's performance with respect to the intended learning outcomes. | MC No. SC – 2021-09; MC No. SC – 2021
Assessment Tools | Refers to the following components: the context and conditions of assessment, the tasks to be administered to the trainees, an outline of the evidence to be gathered from the candidate, and the evidence criteria used to judge the quality of performance. | MC No. SC – 2021-09 (superseded by MC No. SC – 2022-05)
Assessment of Competence | Refers to the process of collecting evidence, through theoretical examination and practical assessment, of the knowledge, understanding, and proficiency gained from approved education and training, approved training ship experience, approved laboratory equipment training, or approved in-service experience, and making judgments on whether competency has been achieved and whether an individual can perform to the relevant standards in the table of competences of the STCW Code, as amended. | MC No. SC – 2021-08
Blended Learning (BL) | Refers to the combination of distance and/or e-learning and face-to-face modes of delivery of training of seafarers. | MC No. SC – 2021-10
Cloud-based simulation | Refers to a facility for training through a remote desktop solution which enables physical and operational realism through virtual reality in order to achieve the required competences in the appropriate provisions of the STCW Code. | MC No. SC – 2021-10
Course Package | Refers to the Course Plan plus the instructional materials and assessment tools. | MC No. SC – 2021-09
Course Plan | The systematic organization of course documents, designed and structured based on the IMO Model Course format, consisting of: Course Framework (Part A); Course Outline and Timetable (Part B); Course Syllabus (Part C); Instructor's Guide (Part D); and Course Assessment (Part E). | MC No. SC – 2021-09 (superseded by MC No. SC – 2022-05)
Distance Learning (DL) and e-Learning (EL) | Refer to the conduct of training where trainees receive instruction through online classes, video recordings, video conferencing, or any other audio/visual technology medium, enabling trainees to undergo training without having to be physically present in a classroom. | MC No. SC – 2021-10
Face-to-Face Learning | Refers to an instructor-led activity in a traditional setting for the conduct of the theoretical, practical, or laboratory part of maritime training courses using laboratory facilities and equipment, including simulators. | MC No. SC – 2021-10
Learning Management System (LMS) | Refers to the software application used by the MTI in the administration, documentation, tracking, reporting, and automation of delivery and assessment of training courses through distance and e-learning modes, and in the issuance of certificates. It also pertains to the use of a system that provides a number of critical services to make interaction between the instructor, assessor, and trainees more seamless. | MC No. SC – 2021-10
Instructional Materials (IMs) | Materials that complement and supplement instruction; also referred to as teaching aids used in the delivery of the course, such as audio-visual presentations or computer-generated slides, exercise sheets, workbooks, pictures, diagrams, and the like. | MC No. SC – 2021-09 (superseded by MC No. SC – 2022-05)
Institution | Refers to a Maritime Training Institution or an Assessment Center. | MC No. SC – 2021-08
Learning Resource Center | Refers to a facility within an MTI, staffed by a specialist, containing several information sources to facilitate learning for trainees and staff, with a focus on multimedia resources and information technology. | MC No. SC – 2021-08
Planned Maintenance System (PMS) | The documented process of periodic inspection, testing, and repair of equipment and facilities to ensure that, at any given time, they are up and running, preventing costly unplanned downtime from unexpected equipment failure. | MC No. SC – 2021-08
Quality Standards System | Refers to the documented policies, procedures, controls, and internal quality assurance system, relating but not limited to training, assessment of competence, and revalidation activities, designed to ensure the achievement of the defined objectives of the training course in accordance with the requirements of the STCW Convention. | MC No. SC – 2021-08; MC No. SC – 2021-09
Receiving technology | Refers to the hardware and software associated with and used by the trainee. | MC No. SC – 2021-10
Record of Review, Verification, and Validation | Refers to the working documents and evidence resulting from the review, verification, and validation process. The records are based on the acceptance, rejection, or qualification of data or information in an objective and consistent manner. | MC No. SC – 2021-09
Synchronous delivery (online learning) | Refers to the conduct of classes through distance and e-learning which occurs through a virtual platform while the instructors and trainees are physically separated but connected in real time through the internet or another medium. | MC No. SC – 2021-10
STCW Convention | Refers to the International Convention on Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended, and its associated Code. | MC No. SC – 2021-09
STCW Office | Refers to the office in MARINA specifically tasked to give full and complete effect to the requirements of the STCW Convention, 1978, as amended. | MC No. SC – 2021-08
Training Completion and Record of Assessment (TCROA) | Refers to the prescribed document in which the names of trainees who have completed the training course and the outcomes of their assessment are recorded, as certified by the qualified assessor and the Training Director of an accredited Maritime Training Institution. | MC No. SC – 2021-09

The following general provisions establish the foundational requirements and policies for implementing the assessment system:

Regulatory Reference | Date of Issue | Description
MC No. SC – 2022-05 | 11/14/2022 | Standards for Mandatory Training Courses under the STCW Convention, 1978, as Amended
MC No. SC – 2021-10 | 12/29/2021 | Revised Guidelines on Training and Assessment of Seafarers by Distance Learning and E-Learning in Accordance with the Provisions of Regulation I/6 of the STCW Convention, 1978, as Amended
MC No. SC – 2021-09 | 12/29/2021 | Policies, Rules and Regulations on the Approval of Training Courses under the STCW Convention, 1978, as Amended
MC No. SC – 2021-08 | 12/29/2021 | Policies, Rules, and Regulations on the Accreditation of Maritime Training Institutions and Assessment Centers
MC No. SC – 2021-02 | 03/18/2021 | Revised Rules on the Monitoring of Approved Training Courses (ATCs) Conducted by Maritime Training Institutions (MTIs), and Assessment of Seafarers’ Competence Carried Out by Accredited Assessment Centers (ACs)

Statutory Reference

Statutory Reference | Date of Issue | Description
ISO 9001:2015 | 2015 | Quality Management System – Requirements

The following general policies govern the implementation and maintenance of the assessment system manual itself, as well as overarching responsibilities and practices. These policies ensure that the manual is a living document effectively used by the organization:

6.1 Implementation of the Manual

This Assessment System Manual shall be implemented across all departments and personnel involved in training delivery and evaluation. The Training Manager is responsible for officially issuing the manual and ensuring that all instructors, assessors, and relevant staff have been oriented on its contents. A formal orientation or training session on the manual’s policies and procedures will be conducted whenever new staff are onboarded or when major updates to the manual occur.

All assessments for STCW courses must be conducted strictly in accordance with the procedures laid out in this manual – no ad hoc or informal assessment methods are permitted. Department heads and course supervisors shall monitor compliance by staff. Any deviation from the manual’s procedures (if necessary in exceptional cases) must be approved by the Quality Assurance (QA) Manager in advance and documented with justification. The principle is that this manual’s procedures are mandatory instructions to be followed to ensure standardization and regulatory compliance.

To support implementation, controlled copies of the manual (printed or digital) will be distributed (see distribution policy below), and easy access will be provided. The QA department may conduct spot checks or observe assessments to verify that the practice aligns with the manual. Where issues are found, corrective training or disciplinary measures will be taken to enforce adherence. Ultimately, the successful implementation of the manual is measured by smooth, uniform assessment operations and positive audit/inspection results from external bodies like MARINA or Accreditation Bodies.

6.2 Manual Review and Revision

The assessment manual is a controlled document that is subject to periodic review and revision to ensure it remains current with the latest requirements and best practices. At a minimum, the manual should be reviewed annually. In addition, specific triggers will prompt immediate review and possible revision, such as:

  • Regulatory Changes: Any new MARINA STCW circulars or advisories affecting training assessments, or amendments to the STCW Code or IMO model courses, will be reviewed as soon as issued. The manual will be amended accordingly to incorporate new rules. Recent circulars have been included in this edition, and future changes will be similarly captured.
  • Internal Process Changes: If the institution adopts new technologies or modifies its internal processes, the manual will be updated to reflect those changes. Likewise, improvements identified through internal audits or after-action reviews of course deliveries will be documented. For instance, if analysis shows that the time allowed for a particular exam is consistently insufficient, the procedure may be revised.
  • Feedback and Continuous Improvement: Feedback from instructors, assessors, or trainees regarding the assessment process is encouraged and will be considered during reviews. Any recurrent issues, such as particular exam questions causing confusion or difficulties in the appeals process, will be evaluated. The Design and Development Team (which may include curriculum developers and subject matter experts) may suggest changes to assessment design procedures based on validation studies or new pedagogical insights.

The QA Manager will coordinate the review process, assembling a review team that may include instructors, the Training Manager, and a curriculum specialist. Proposed revisions are documented and presented to top management for approval. Once approved, a new revision number is assigned, and the manual is reissued. All manual revisions are logged in a Document Control table at the front of the manual, noting the changes made and the effective date. Obsolete versions (physical copies) are retrieved and archived to avoid confusion. The QA Manager ensures that MARINA is informed, if required, of substantive changes (for example, MARINA might require resubmission of portions of the manual if significant changes occur in assessment policy).

Version control is strictly maintained; the current version and revision date are indicated on every page or the cover of the manual. During review, attention is also given to ensure alignment with the latest versions of standards for learning services, so that best practices continue to be integrated.

In summary, this policy guarantees that the manual is not static – it evolves through a formal review cycle to remain relevant, effective, and compliant with all applicable requirements.

6.3 Responsibilities

Clear roles and responsibilities are defined to implement and uphold the assessment system:

  • Training Manager: Has overall responsibility for the assessment system. Approves the assessment procedures and any changes to them. Ensures that all courses have appropriate assessment packages and that assessors are assigned. The Training Manager also signs off the TCROA reports to certify that assessments were done per approved plans, and oversees the result approval process before certificates are issued. They handle escalated issues, such as appeals that require managerial decision, and liaise with regulatory bodies (MARINA) on matters of assessments (e.g., reporting course completions or any incidents during evaluations).
  • Quality Assurance (QA) Manager: Ensures that the assessment process complies with the Quality Management System (QMS) and relevant regulatory standards. The QA Manager organizes internal audits of assessment activities, checks documentation (assessment packages, records), and verifies that assessors are qualified. They also coordinate the regular review of the manual as described above. In cases of non-conformity (e.g. an assessment not carried out per procedure), the QA Manager facilitates root cause analysis and corrective actions. The QA Manager may also deploy QA personnel to monitor examinations (e.g. doing unannounced classroom visits during exams as observers) to ensure that invigilation and conduct are proper.
  • Instructional Designers: An Instructional Designer is a qualified education and training professional, or a group of such individuals, responsible for the systematic design, development, and evaluation of instructional materials using instructional systems design (ISD) models and digital authoring tools. The Instructional Designer ensures that training programs meet the prescribed standards of competence as defined in the STCW Code and MARINA Circulars. This role includes defining learning outcomes, mapping content to performance criteria, selecting delivery modalities (e.g., e-learning, blended, classroom), and developing interactive content using authoring tools such as SCORM-compatible platforms. The Instructional Designer collaborates with subject matter experts, instructors, and assessors to ensure pedagogical soundness, technical accuracy, and regulatory compliance throughout the course lifecycle.
  • Instructors: Instructors deliver the training and also contribute to the assessment process, especially formative assessments. Their responsibilities include administering formative quizzes on the LMS, providing feedback to trainees on their performance, and ensuring that trainees have completed required formative assessments before moving on. They are also expected to ensure academic integrity during any on-going assessments (even formative ones) by advising trainees of rules and reporting any suspected dishonesty.
  • Assessors: The designated Assessors carry out the summative assessments. Their primary responsibilities include: preparing the examination room or simulator for the test, verifying the identity of each candidate, explaining exam instructions, and conducting the assessment strictly as per procedure. They ensure that only authorized materials are used and that no cheating occurs. After the exam, Assessors grade the answer sheets or evaluate the practical performance using the provided answer keys or scoring rubrics. They fill out the results on the TCROA and any internal scoring sheets. Assessors then recommend the results for approval and are available to participate in any result review or appeals process. Notably, all Assessors must be duly accredited by MARINA for the specific training program they assess, as required by regulation. The institution will not assign any person to be an Assessor unless they have the appropriate background (e.g. they themselves hold at least the level of certificate being trained, have completed a Train-the-Assessor or equivalent course, etc.) and have current MARINA approval.
  • IT Support Staff: Since many assessments rely on the LMS and possibly computer labs or simulators, IT personnel are responsible for ensuring the technical readiness and security of these systems. Before an online quiz or computer-based test, IT staff verify that the network, servers, and devices are functioning and that any exam security settings (like lockdown browser or access restrictions) are correctly configured. They remain on standby during online exams in case of technical issues. They also manage user accounts and permissions on the LMS, including creating exam accounts for trainees and a MARINA monitoring account as required. After assessments, IT may assist in retrieving data (e.g. quiz logs) if needed for review of incidents or analysis.
  • Trainees/Candidates: They must familiarize themselves with and adhere to the Assessment Guidelines provided (usually an excerpt or summary of key rules from this manual is given to trainees at course start). Trainees are responsible for completing all formative assessments by the deadlines, preparing adequately for summative assessments, and upholding academic integrity. They are required to present necessary identification and documentation (e.g. assessment permit or official receipt of fees, if applicable) during exams. If a trainee does not abide by the assessment rules (cheating, misconduct), consequences are enforced as per the institution’s policies (which could include failing the assessment or other disciplinary action). Trainees also have the right to appeal or file a complaint about assessment results or processes if they believe an injustice has occurred; the procedure for appeals is provided to them (and detailed later in this manual).

By clearly delineating these responsibilities, the manual ensures that everyone knows their role in the assessment system, which promotes accountability and smooth operation. A RACI (Responsible, Accountable, Consulted, Informed) chart may be included in an appendix to summarize these roles for each major process step (design, approval, conduct, etc.).

6.4 Manual Distribution

This manual is distributed as a controlled document. The QA department will maintain a distribution list to ensure all relevant personnel have access to the current version. Distribution is done in the following manner:

  • Digital Copies: The master approved version of the manual is stored in the institution’s document management system or shared network drive accessible to staff. All instructors, assessors, and academic managers are given read access to this digital copy. A PDF version may also be available for easy viewing. The digital document is protected from unauthorized editing. If printed, the cover page will indicate that it is an uncontrolled copy unless it is stamped by QA as controlled.
  • Printed Copies: A limited number of printed copies of the manual are produced for use in key locations, one in the Faculty/Instructor common area, one in the Training Manager’s office, one in the QA office, and one at the reception/registry (for reference by regulatory visitors or others). Each printed controlled copy is numbered and marked with the revision number and date. Holders of controlled hard copies will be issued new pages or completely new versions when the manual is revised, and are instructed to replace and destroy old pages. Any uncontrolled prints (e.g. a personal reference copy made by an instructor) should be checked against the current version to avoid outdated guidance.
  • Availability to Regulators: A copy of this manual (or relevant excerpts) will be made available to MARINA auditors/inspectors upon request. We ensure that, during MARINA inspections or accreditation visits, the assessors and instructors can demonstrate familiarity with these procedures. The manual may also be shared with other auditors, such as ISO evaluators, to maintain certification.
  • Trainee Access: While the complete manual is an internal document, the portions that concern examinee conduct (exam rules, grading system, appeal process) are communicated to trainees. This might be via a summarized “Assessment Policy for Trainees” document or in the course orientation briefing. In particular, trainees are informed of passing criteria, re-sit opportunities, and the no-cheating rules. By distributing this information, we ensure transparency and that trainees know what to expect. However, detailed internal procedures (like how we design exams) are not distributed externally to avoid compromising assessment security.
  • Confidentiality: Certain parts of the assessment system – such as specific test content, question banks, or answers – are confidential and are not broadly distributed. Access to the assessment instruments is restricted to the curriculum developers, assigned assessors, and necessary approvers. The manual references these instruments but does not contain the questions themselves. For example, the TOS and sample forms may be included in appendices, but the actual question bank is stored securely in the LMS question bank or locked files. This distribution policy ensures that sensitive information is only in the hands of those who need it, mitigating the risk of leaks.

All staff must ensure they reference the current version of the manual when performing their duties. The QA department will notify all users (e.g. via email or memo) whenever a new revision is released, summarizing the changes. Each recipient is responsible for updating their controlled copy or replacing it with the new version. During annual staff training, a short refresher on key manual points and any recent changes will be given.

In conclusion, through controlled distribution, we maintain consistency in understanding and applying the assessment policies and avoid the scenario of someone inadvertently following outdated procedures. This supports the integrity and uniformity of our assessment system across the entire institution.

This section provides an overview of the end-to-end process for managing assessments, from initial design through administration and up to the validation of results and continuous improvement. The assessment system process is structured to ensure a systematic approach covering planning, development, implementation, and review, consistent with the framework of ISO standards. The key stages in the assessment process are:

  1. Design and Development of Assessments – Determining how each learning outcome will be assessed and creating the assessment instruments (written exams, practical test scenarios, etc.) and associated documentation (TOS, answer keys, rubrics). This includes both formative assessments (embedded in the course) and summative assessments (final evaluations).
  2. Review, Verification, and Validation – A quality check of the designed assessment instruments before they are used. This involves reviewing the content for accuracy and alignment with course objectives, verifying that the difficulty level and coverage are appropriate (often via peer review or SME input), and validating the instruments through pilot-testing or statistical analysis to ensure they are fair and reliable. Approval by the Training Manager/QA of the assessment package occurs at this stage.
  3. Implementation (Conduct of Assessment) – The actual administration of assessments to trainees. This includes scheduling the exams, preparing the venue or online platform, invigilating the exam, grading the responses, and recording the results. Implementation covers both written examinations (which might be on paper or computer) and practical assessments (including simulator-based assessments), and also covers how formative assessments are deployed during training. The grading system (pass marks, scoring weightages, etc.) is applied in this stage to derive each trainee’s results.
  4. Monitoring and Evaluation – Oversight of the assessment conduct in real-time (monitoring) and post-exam review activities. Monitoring may include having QA personnel or supervisors observe exam sessions to ensure compliance. Post-assessment evaluation includes analyzing results for any patterns (e.g. a question that many failed might indicate an issue), getting feedback from assessors or even trainees, and checking that the assessors followed procedures. It also includes formal internal verification of results – e.g. second-marking a sample of papers or cross-checking practical assessment scores – as an added measure of quality control before finalizing results.
  5. Continual Improvement – Using the insights from the evaluation stage to make improvements to future assessments or to the system itself. This can involve updating questions, refining the scoring rubric, adjusting the time allowed, retraining assessors, or even revising certain training content if it was found that learning outcomes were not adequately assessed. Audits and management reviews formally ensure that at least annually, the assessment system is audited and any changes needed are implemented to enhance the system. This closes the loop, feeding back into the next cycle of design and development.

Throughout these stages, administrative requirements such as documentation, record-keeping, and reporting are handled (details on those in subsequent sections). For instance, during implementation and monitoring we ensure that attendance sheets, identity checks, and log records are kept; and after completion, we ensure results are reported to MARINA via the TCROA as required.

This holistic process ensures that assessments are not one-off events but part of a managed system. Each step is governed by procedures in this manual, which will be detailed in the following sections. By following this structured process, the institution can demonstrate that its assessment system is planned, systematic, and quality-assured end-to-end, which is expected by both ISO standards and MARINA. In the next sections, we delve deeper into specific procedures for key parts of this process: designing written and practical assessments, the grading system, conducting assessments, maintaining security, and validating results.

The design and development process of training programs and assessment instruments is a systematic and standards-driven activity that ensures alignment with the Maritime Industry Authority (MARINA) regulations, particularly those mandated under the STCW Convention, 1978, as amended.

At the core of the development phase is mapping each learning outcome to the appropriate assessment methods, ensuring that both knowledge and skills are evaluated in accordance with the STCW Code, Section A-I/6. The process involves:

  • Defining Assessment Strategies: Each learning outcome is mapped to a specific assessment type — whether written, practical, or oral — depending on its complexity and cognitive domain (knowledge, skills, or attitude).
  • Creating Assessment Instruments: Tools include written examinations, practical test scenarios, and oral questioning guides. These are constructed to ensure fairness, validity, reliability, and alignment to learning objectives and competency standards.
  • Development of Supporting Documentation:
    • Table of Specifications (TOS): Ensures comprehensive coverage of the course content and appropriate weight per topic area.
    • Answer Keys: For objective assessments to ensure consistency in marking.
    • Scoring Rubrics: For performance and practical tasks, ensuring transparency and uniform evaluation.
  • Formative Assessments: Embedded within each module or lesson, these are required at the end of each topic. They are accessible through the LMS and must be completed before progression.
  • Summative Assessments: Conducted onsite or online under controlled conditions, these final evaluations verify that the learner has achieved the required competency. Summative assessments include comprehensive written exams and practical demonstrations of skill.
  • Practical Assessments: These are conducted using real or simulated environments (e.g., simulators, practical sites) to verify competence in applying theoretical knowledge. Assessment is performed by MARINA-accredited assessors, supported by validated rubrics and observation checklists in line with the STCW Code and MARINA Circulars.

All assessment activities are documented and traceable, with results recorded in the Training Completion and Record of Assessment (TCROA), as required under MARINA Circular SC-2021-09.

This structured approach ensures that each element of the training program — from content delivery to performance evaluation — upholds the quality and integrity expected in maritime education and training.
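
To illustrate how a Table of Specifications drives item counts, the following minimal sketch (Python) distributes a hypothetical exam length across topics in proportion to their TOS weights. The topic names, Bloom's levels, weights, and total item count are illustrative placeholders, not values taken from any approved Course Plan.

```python
# Minimal sketch of a Table of Specifications (TOS) expressed as data, with
# hypothetical topics, Bloom's levels, and weights; actual values come from the
# approved Course Plan (Part C) and the applicable MARINA standards.
tos = [
    # (topic, bloom_level, weight_pct)
    ("Topic 1: Regulatory framework", "Remember/Understand", 20),
    ("Topic 2: Watchkeeping procedures", "Apply", 50),
    ("Topic 3: Emergency response", "Analyze/Evaluate", 30),
]

TOTAL_ITEMS = 40  # hypothetical exam length


def items_per_topic(tos, total_items):
    """Distribute exam items across topics in proportion to their TOS weight."""
    return {topic: round(total_items * weight / 100) for topic, _bloom, weight in tos}


if __name__ == "__main__":
    for topic, n in items_per_topic(tos, TOTAL_ITEMS).items():
        print(f"{topic}: {n} items")
```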

8.1 Design and Development of Formative Assessments

Step | Details of the Procedure | Resp | Input | Output
Analysis and Planning
Initiate syllabus review & MARINA alignment | Examine the Course Syllabus (Part C) to extract the ILOs the quiz must measure. | Instructor | Part C | ILO list aligned to MARINA standards
Map cognitive level via TOS | Use the Table of Specifications to assign a Bloom's level to each ILO/topic, guiding item type and count. | Instructor | ILO list; TOS | Completed TOS matrix
Design and Development
Load the standard template | Copy the institution's Standard Formative-Assessment Template (with built-in layouts and authoring guide) into the course folder. | Instructor, Instructional Designer | Standard template | Course-specific quiz shell
Develop questions | Populate the template with items directly mapped from the TOS, using approved question types for each Bloom's level and citing instructional-material pages for accuracy. | Instructor, Instructional Designer | TOS; quiz shell; resources | Draft quiz
Check feedback & branching logic | Verify that each question's feedback text, branching, and retry paths follow template rules and pedagogical intent. | Instructor, Instructional Designer | Draft quiz | Logic-verified quiz
Set scoring, attempts & completion | Configure quiz properties: unlimited attempts, randomized order, summary-only feedback, linear navigation, completion = all questions answered. | Instructor, Instructional Designer | Logic-verified quiz | Configured quiz
Internal preview & verification | Run Quiz Preview to confirm layout, navigation, feedback, scoring, and accessibility; fix any anomalies. | Instructor, Instructional Designer | Configured quiz | QA-cleared draft
Review and Verification
Publish to LMS (sandbox) & route to Instructor (Reviewer) | Publish the draft SCORM package to the LMS sandbox/test area and notify the Instructor (Reviewer) via the built-in review link. Note: an Instructor (Reviewer) is a qualified individual, other than the course developer or primary instructor, designated to independently review the course content, instructional materials, and assessment tools. | IT Officer | QA-cleared draft | Sandbox quiz link sent to the reviewer
Instructor (Reviewer) review & comment | Test the quiz; add comments on accuracy, alignment, clarity, and technical behavior. | Instructor (Reviewer) | Sandbox link | Comment log
Instructional design revision | Revise the quiz based on the Instructor (Reviewer) feedback and update the SCORM build. | Instructional Designer | Instructor (Reviewer) comment log | Revised quiz v1.1
Pilot Testing and Validation
Pilot test question bank | Gather at least five (5) participants or trainees who have completed the relevant modules. Review flagged items; revise or discard items with a difficulty index of < 0.30 or > 0.80, or with a discrimination index of < 0.25, and log the changes (see the item-analysis sketch after this table). | Instructor, Instructional Designer, pilot participants | Pilot test(s); question bank | Pilot test results
Revise & log changes | Address the feedback of the reviewer(s) and pilot participants, update questions based on the pilot test results, and log the changes. | Instructor, Instructional Designer | Reviewer comment log | Final question set
Submit to the Training Manager | Submit the revised quiz to the Training Manager for a final pedagogical and compliance check. | Instructional Designer | Revised quiz v1.1 | TM approval request
Final publish & version control | Upon TM approval, publish the validated SCORM package to the production LMS, label the file, and record it in the Version-Control Log. | Instructional Designer, IT Officer, QA | TM-approved SCORM package | Live formative quiz (production) + updated version log
Upload & archive | Upload the SCORM package, source file, and review logs to the digital repository. | IT Officer | Final assets | Archived package
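
The pilot-test step above applies numeric screening thresholds to flag weak items. The following minimal sketch (Python) shows one way to apply those thresholds, assuming the difficulty and discrimination indices have already been computed or exported from the LMS item-statistics report; the item IDs and values are hypothetical.

```python
# Minimal sketch of the pilot-test screening rule: flag items with a difficulty
# index below 0.30 or above 0.80, or a discrimination index below 0.25, for
# revision or removal, per the procedure above.

def flag_items(item_stats, min_diff=0.30, max_diff=0.80, min_disc=0.25):
    """Return a list of (item_id, reasons) for items that need revision."""
    flagged = []
    for item_id, (difficulty, discrimination) in item_stats.items():
        reasons = []
        if difficulty < min_diff:
            reasons.append(f"too hard (difficulty {difficulty:.2f})")
        elif difficulty > max_diff:
            reasons.append(f"too easy (difficulty {difficulty:.2f})")
        if discrimination < min_disc:
            reasons.append(f"low discrimination ({discrimination:.2f})")
        if reasons:
            flagged.append((item_id, reasons))
    return flagged


# Hypothetical pilot data: item_id -> (difficulty index, discrimination index)
pilot_stats = {
    "MCQ-01": (0.85, 0.40),  # too easy
    "MCQ-02": (0.55, 0.10),  # poor discrimination
    "MCQ-03": (0.62, 0.35),  # acceptable
}

for item, reasons in flag_items(pilot_stats):
    print(item, "->", "; ".join(reasons))
```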

8.2 Design and Development of Summative Theoretical Assessments

Step | Details of the Procedure | Resp | Input | Output
DESIGN & DEVELOPMENT
Extract ILOs | Retrieve Part C of the approved syllabus and list all Intended Learning Outcomes (ILOs). | Assessor | Approved Course Package (Part C) | ILO list
Complete ToS | Confirm the Table of Specifications based on the MARINA standards (topics, thinking level, items per topic). | Assessor | ILO list | Approved ToS v1.0
Create question categories | In the Question Bank, add categories: Course Name – Topic. | Assessor, IT Officer | LMS | Category tree
Select MCQ format | Inside the chosen category, choose the question type "Multiple Choice". | Assessor, IT Officer | LMS | MCQ draft
Apply naming standard | Name each item: Course – Topic – ILO No. – Item No. | Assessor, IT Officer | Naming guide | Item ID
Draft MCQ | Write the stem plus four options using the MCQ design criteria (ILO alignment, stem rules, option rules). | Assessor | MCQ guideline | MCQ v1
Save & version | Save the question; verify Version = 1 in the Version History. | Assessor | LMS | Item version log
Build 3× pool | Repeat the drafting steps (Select MCQ format through Save & version) until each ToS cell has three times the required items (redundancy). | Assessor | ToS | MCQ pool
Notify reviewer | Notify the reviewer that the draft pool is ready for review. | Assessor, IT Officer | E-mail | Reviewer notice
REVIEW & VERIFICATION
Designate a review assessor | Designate an assessor other than the one who developed the question bank to review the MCQs. | Training Manager | Notice to assessor | Acceptance by assessor
Peer review | Evaluate each item against the MCQ criteria (ILO alignment, stem, options). Comment "Valid" or list findings in the LMS comment box. | Assessor reviewer | MCQ pool | Review log
Document findings | Record items needing correction in the comments log. | Assessor reviewer | Review log | Review log
REVISION
Revise items if required | Edit items per the comments; update stems/options; the version automatically increments to the next integer. | Assessor | Comments | Comments marked "Revised"
Version check | Ensure the Version History reflects the changes and rationale. | Assessor | LMS | Version history
Notify Training Manager | Advise the Training Manager that revisions are complete and the pilot may begin. | Assessor | Notice | TM acceptance
PILOT TESTING & VALIDATION
Create pilot quiz | Add a quiz (sandbox): time limit of 1 min/item, grade to pass 70%, 3 attempts, completion tracking requiring the pass grade. | IT Officer | LMS | Pilot quiz
Import items | Add the entire pool; enable question and answer shuffle. | IT Officer | MCQ pool | Pilot item set
Nominate trainees | Select a pilot group of at least five trainees and enroll them in the sandbox. | IT Officer | Trainee list | Pilot roster
Run pilot | Conduct the quiz; monitor and record. | Assessor | LMS | Pilot response data
Analyze stats | Generate item statistics: facility index, discrimination (r-bis), usage (see the item-statistics sketch after this table). | Assessor, IT Officer | Response data | Item stats report
Adjust items | Revise or drop items below the thresholds; update versions and comments. | IDD (Instructional Design Department) | Item stats | MCQ final
FINALIZATION & APPROVAL
Finalize bank | Lock the approved items; map the number of questions per topic per the ToS; enable shuffle. | Assessor, IT Officer | Final MCQ set | Live question bank
Submit for approval | Set the bank status to "For Approval"; send the link to the Training Manager. | Assessor, IT Officer | E-mail | Approval request
Approve bank | The Training Manager reviews, comments "Approved", and updates the Status field. | Training Manager | Live bank | Approved bank log
Archive & publish | Archive prior drafts; export a final bank backup; record the revision history. | QA Officer | Approved bank | Archive file; revision history
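
The "Analyze stats" step above relies on a facility index and a discrimination statistic. The sketch below computes the facility index and a point-biserial correlation (a common stand-in for the LMS's r-bis figure) from a hypothetical 0/1 response matrix; it is illustrative only, and production analysis should use the LMS's own item-statistics report.

```python
# Minimal sketch: facility (difficulty) index and point-biserial discrimination
# from a hypothetical pilot response matrix (rows = trainees, columns = items;
# 1 = correct, 0 = incorrect).
from statistics import mean, pstdev

responses = [
    [1, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

totals = [sum(row) for row in responses]   # total score per trainee
sd_total = pstdev(totals)

for item in range(len(responses[0])):
    scores = [row[item] for row in responses]
    p = mean(scores)  # facility index (proportion answering correctly)
    correct_totals = [t for t, s in zip(totals, scores) if s == 1]
    incorrect_totals = [t for t, s in zip(totals, scores) if s == 0]
    if correct_totals and incorrect_totals and sd_total > 0:
        r_pb = (mean(correct_totals) - mean(incorrect_totals)) / sd_total * (p * (1 - p)) ** 0.5
    else:
        r_pb = float("nan")  # undefined when every trainee answered the same way
    print(f"Item {item + 1}: facility={p:.2f}, point-biserial={r_pb:.2f}")
```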

8.3 Design Principles for MCQ Construction

Principle | Description
Clarity and Focus in the Stem
  • The stem should pose a clear and specific question or problem that the test-taker can understand without ambiguity.
  • The stem should be a question or an incomplete statement that can be answered or completed without looking at the alternatives. It should contain the central idea.
  • All necessary context and qualifying language should be in the stem, not repeated in each option. This makes the question more concise.
  • Avoid jargon or overly complex vocabulary unless it's specific to the knowledge being tested and the audience is expected to know it.
  • Generally, phrase stems positively. If negative wording (e.g., "Which of the following is NOT...") is unavoidable, emphasize the negative word.
Developing Effective Options
  • There should be only one answer that is clearly correct or demonstrably the best among the choices. Avoid options that are arguably correct or based on obscure opinions.
  • Distractors (incorrect options) should be incorrect but believable and appealing to test-takers who lack the knowledge or understanding being assessed. They often reflect common errors or misconceptions.
  • All options should be similar in length, grammatical structure, and style to the correct answer. This prevents guessing based on an option looking different.
  • Ensure that options do not overlap in meaning or content. If one option being true means another could also be true, the question is flawed.
  • "All of the above" / "None of the above": Can be problematic because if a test-taker identifies two correct options, they know it's the answer. If they identify one incorrect option, they can eliminate it. Doesn't reveal if the test-taker actually knows the correct answer, only that they can identify incorrect ones. If used, ensure the keyed answer is truly absent.
  • All options should be grammatically consistent with the stem. Reading the stem with each option should form a grammatically correct sentence.
Avoiding Clues and Bias
  • Avoid making the correct answer consistently longer or shorter than the distractors.
  • Ensure the correct answer appears in each possible position (A, B, C, D, etc.) roughly an equal number of times throughout the test. Avoid predictable patterns.
  • Terms like "always," "never," "only," "all" are often found in incorrect options because few things are universally true or false. Use them carefully and accurately if needed.
  • Avoid repeating keywords from the stem in the correct answer but not in the distractors, or vice-versa, as this can unintentionally guide test-takers.
  • Ensure questions are free from cultural, gender, or group bias that could disadvantage some test-takers or make the question incomprehensible to some.
Enhancing Cognitive Level
  • Design questions that require more than just recall. Use scenarios, case studies, or data interpretation to assess application, analysis, evaluation, or problem-solving.
  • Questions should probe whether students understand concepts, not just whether they've memorized definitions or isolated facts.
Formatting and Review
  • Provide clear instructions to the test-takers, such as choosing the "best" answer if some ambiguity is unavoidable in complex topics.
  • Use consistent formatting for all questions and options (e.g., capitalization, numbering/lettering of options). Lettering options (A, B, C) is generally preferred.
  • Check for typos, grammatical errors, and awkward phrasing that could confuse test-takers or invalidate a question.
  • Have colleagues review your MCQs. A fresh pair of eyes can often spot ambiguities, flaws, or potential clues that you might have missed.
  • After the test, analyze how each MCQ performed (e.g., item difficulty, item discrimination). This data can help identify flawed questions and improve future MCQ writing.
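
One of the guidelines above, balancing the position of the keyed answer across the test, can be checked mechanically. The minimal sketch below tallies keyed-answer positions for a hypothetical answer key; the flagging threshold is an arbitrary illustration, not an institutional rule.

```python
# Minimal sketch of an answer-position balance check: count how often the keyed
# (correct) answer falls on each option letter across a question set.
from collections import Counter

# Hypothetical answer key: item ID -> keyed option letter
answer_key = {
    "MCQ-01": "B", "MCQ-02": "D", "MCQ-03": "B", "MCQ-04": "A",
    "MCQ-05": "C", "MCQ-06": "B", "MCQ-07": "D", "MCQ-08": "B",
}

position_counts = Counter(answer_key.values())
expected = len(answer_key) / 4  # ideal share per position for 4-option items

for letter in "ABCD":
    count = position_counts.get(letter, 0)
    note = " <-- review balance" if abs(count - expected) > expected / 2 else ""
    print(f"Option {letter}: keyed {count} time(s){note}")
```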

8.4 Design and Development of Summative Practical Assessments

Step | Description | Resp | Input | Output
Map assessment task & criteria | The developer maps out each practical task and its observable criteria, aligned with the ILOs and competence tables. | Assessor | Course ILOs; competence tables | Task & criteria map
Apply standard template | Develop a practical assessment tool using the standard template. | Assessor | Task & criteria map | Draft assessment (template)
Fill in practical assessment tool | Fill in the required procedures, assessment task(s), performance criteria, performance standards, and rating scale. | Assessor | Draft assessment tool | Developed assessment tool
Submit for document review | Forward the developed assessment for verification of formatting, terminology, and standards compliance. | Course Developer | Document review | Standardized assessment tool
Assign assessor reviewer | Assign an SME to review the document for technical and pedagogical soundness. | Training Manager | Reviewer (SME) assignment | Designated reviewer (SME)
Review and verification | The reviewer (SME) reviews the document, annotates comments in the "Comments" section, and returns it to the developer. | SME | Assessment tool review and verification | Review and verification form
Revise & log changes | The developer addresses the comments, revises the document, and logs the changes. | Course Developer | Reviewer comment log | Revised assessment tool
Submit to Training Manager | Submit the revised assessment tool to the Training Manager for a preliminary check of alignment and resource feasibility. | Course Developer | Revised assessment | Training Manager's feedback
Select pilot test candidates | In coordination with the TM, identify a representative group of trainees and schedule the pilot. | Course Developer, Training Manager | Participant list | Pilot test plan
Conduct a pilot test | Facilitate pilot execution, observing time, clarity, criterion usability, and scoring consistency. | Assessor, Course Developer, Training Manager (TM), Quality Assurance (QA) | Pilot test plan | Pilot testing forms and validation tools
Review pilot test results and feedback | Analyze pilot outcomes; refine tasks, rubrics, and timing. | Course Developer | Pilot testing forms and validation tools | Practical assessment tool
Finalize & document changes | Integrate all refinements, update the version number, and prepare the final assessment package. | Course Developer | Practical assessment tool | Revised practical assessment tool
Submit for Training Manager final approval | Present the final assessment and supporting documents for formal approval. | Training Manager | Revised practical assessment tool & logs | Approved practical assessment tool
QA compile & archive | QA collates all versions, review logs, pilot data, and approval memos, and files them for a minimum of five years. | QA | Approved practical assessment tool & logs | Archived assessment package

The implementation phase refers to the actual conduct and administration of assessments, ensuring alignment with approved training standards and MARINA requirements. This includes scheduling assessments, preparing examination venues or LMS platforms, deploying invigilation procedures, and applying the established grading system.

9.1 Procedure — Conduct of Formative Assessments (End-of-Lesson LMS Quiz)

Step | Description | Resp | Input | Output
Issue topic-end instructions | During the course introduction, brief trainees that at the close of each lesson the formative assessment must be taken before proceeding. Emphasize unlimited attempts, the mastery score, and academic-honesty expectations. | Instructor | Instructor's instructions | Trainee understanding
Verify SCORM settings | Confirm the quiz for the topic is already visible in the course, set to unlimited attempts and "Highest Attempt" grading, with completion tracking set to Passed. No upload is required: the package is pre-loaded. (See the settings-check sketch after this table.) | IT Officer | Course page | SCORM settings
Learner completion (asynchronous) | Learners launch the SCORM quiz at their own pace and repeat attempts until the mastery score is reached; the LMS records every attempt automatically. | Trainee | LMS credentials | Attempt records
Immediate feedback & progress gate | The SCORM package displays item-level feedback; once the quiz status = Passed, the next topic automatically unlocks. | LMS (auto) | Attempt data | Topic unlock
Instructor monitoring | Review the SCORM activity report (attempt counts, scores). Identify any learner needing support or any topic showing persistently low first-attempt pass rates. | Instructor | SCORM report | Support outreach
Record retention | The LMS automatically stores settings logs, attempt records, and reports. | IT Officer | System logs | Archived data logs
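
The "Verify SCORM settings" step can be reduced to a checklist comparison. The minimal sketch below assumes the quiz configuration can be exported or read as a simple key-value structure; the field names are illustrative placeholders and do not correspond to any particular LMS API.

```python
# Minimal sketch of a formative-quiz settings check against the policy above.
EXPECTED = {
    "attempts_allowed": "unlimited",
    "grading_method": "highest_attempt",
    "completion_tracking": "passed",
    "visible_to_trainees": True,
}


def check_formative_settings(quiz_config):
    """Return a list of mismatches between the quiz configuration and policy."""
    issues = []
    for field, expected in EXPECTED.items():
        actual = quiz_config.get(field)
        if actual != expected:
            issues.append(f"{field}: expected {expected!r}, found {actual!r}")
    return issues


# Hypothetical exported configuration
quiz_config = {
    "attempts_allowed": "unlimited",
    "grading_method": "last_attempt",  # would be flagged
    "completion_tracking": "passed",
    "visible_to_trainees": True,
}

for issue in check_formative_settings(quiz_config):
    print("Fix before release:", issue)
```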

9.2 Procedure — Conduct of Summative Theoretical Assessment

Step | Description | Resp | Input | Output
Preparation of Assessment
Verify quiz load | Verify that the summative quiz is already loaded in the live training course, linked to the approved question bank and Table of Specifications (ToS). | Assessor | Live course; ToS | Quiz presence confirmed
Check quiz settings | Check quiz settings: single attempt (already set) and a time limit exactly as stated in the ToS; ensure question/option randomization is enabled; keep the quiz hidden until the start. | Assessor | Quiz settings page; ToS | Quiz configuration log
Confirm hardware readiness | Confirm the assessment computers and tablets are ready; perform a PC and network readiness check, including one spare workstation. | IT Officer | PCs and tablets | Readiness confirmation
Sign & archive config log | Sign and archive the Configuration Log; leave the quiz status "Hidden from students." | Assessor | Config log | Approved config log
Briefing of Assessees
Admit & verify | Admit candidates, verify identity against the seating plan, and remind them that they will log in with their usual LMS credentials once the assessment is shown. | Assessor | Seating plan | Attendance sheet
Standard briefing | Deliver the standard briefing covering the assessment objective, pass mark, cheating policy, re-sits, appeals rules, timing, grading, and hardware. | Assessor | Briefing slides | Signed briefing acknowledgement
Collect signatures | Collect signatures from the trainees. | Assessor | Attendance list | Completed attendance list
Assessment Proper
Start assessment | At the start time, manually un-hide the quiz and click "Open attempt for all." Announce the official start and the remaining-time checkpoints. | Assessor | Quiz availability toggle | Quiz start timestamp
Invigilate | Invigilate: patrol the aisles, monitor CCTV, and record any integrity or technical incidents. | Assessor | Invigilation | Completed invigilation
Ensure submission | Ensure each candidate sees "Assessment Submitted" before exiting the assessment site. | Assessor | Assessment | Completed assessment
Result of the Assessment
Auto-grade & review | Allow the LMS to auto-grade; manually review any flagged items. | Assessor | Quiz attempts | Confirmed grade book
Cross-check & save | Cross-check grades against attendance and save the results directly in the LMS (see the reconciliation sketch after this table). | Assessor | Gradebook | Verified results in system
Debriefing of Assessment
Announce results | Immediately after grading, announce total passes and fails to the group. Explain the 48-hour appeals window and the re-sit schedule. | Assessor | Grade summary | Debrief session log
Advise on re-sit | Advise unsuccessful assessees of the mandatory re-sit procedure. | Assessor | Re-sit policy | Re-sit notice
Complete debrief report | Complete a short debrief report noting issues and improvement actions; file it in the course records. | Assessor | Gradebook | Filed debrief report
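
The "Cross-check & save" step above is essentially a reconciliation between two lists. The following minimal sketch (Python) flags mismatches between a signed attendance list and an LMS gradebook export; the trainee IDs and scores are hypothetical.

```python
# Minimal sketch: reconcile the signed attendance list against the LMS gradebook
# before results are finalized and saved.
attendance = {"T-001", "T-002", "T-003", "T-004"}          # candidates who sat the exam
gradebook = {"T-001": 82.0, "T-002": 71.5, "T-004": 90.0}  # trainee ID -> score (%)

missing_grades = attendance - set(gradebook)      # attended but no recorded score
unexpected_grades = set(gradebook) - attendance   # score recorded without attendance

if missing_grades or unexpected_grades:
    print("Do not finalize results:")
    for trainee in sorted(missing_grades):
        print(f"  {trainee}: attended but no grade recorded")
    for trainee in sorted(unexpected_grades):
        print(f"  {trainee}: grade recorded but not on attendance sheet")
else:
    print("Gradebook matches attendance; results may be saved and verified.")
```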

9.3 Conduct of Practical Assessment

Step | Details of the Procedure | Resp | Input / Tool | Output
Retrieve assessment pack | Pull the Practical-Assessment Package vX.X (exercise script, assessor checklist, scoring rubric, risk assessment) from the Examination Master File. | Assessor | Master file | Printed package
Facility & equipment check | Verify simulator settings / equipment functionality; perform the safety inspection; log in the Lab/Sim Readiness Checklist. | Lab Tech / Safety Officer | Simulator console; PPE | Signed readiness checklist
Candidate schedule & notice | Post the roster with assessment slots, PPE requirements, and pass criteria (Competent / Not Yet Competent). | Course Admin | Timetable board | Notice posted
Identity & PPE check | Confirm photo ID and attendance; inspect PPE (or issue center PPE); brief on safety rules and assessment scoring. | Assessor | ID cards; PPE list | Attendance sheet
Safety & task briefing | Explain scenario objectives, time limits, the stop command, and fail-safe procedures; answer procedural questions only. | Safety Officer / Assessor | Exercise script | Briefing record
Launch scenario | Start the simulator/exercise timer; observe from the control station or a designated area. | Assessor | Simulator controls | Timer log
Observe & score | Mark each criterion on the Practical Checklist (C = Competent, NYC = Not Yet Competent); note time stamps and critical errors. | Assessor | Checklist form | Completed checklist
Safety oversight | Intervene only for an unsafe act; pause the scenario, apply corrective action, and record it in an Incident Report if required. | Safety Officer | Emergency stop | Incident report (if any)
Verbal debrief | On completion, give short factual feedback ("task complete", "missed valve isolation"); advise that the result will be posted after QA. | Assessor | Checklist | Debrief note
Result decision | If all critical criteria = C, mark Competent; otherwise, Not Yet Competent (see the decision-rule sketch after this table). | Assessor | Checklist | Score sheet
QA verification | The QA Officer samples ≥ 10% of checklists and video (if recorded) for scoring consistency and countersigns the sample. | QA Officer | Checklists; video | QA sign-off memo
TM approval | The Training Manager reviews the result summary and incidents and signs the Practical TCROA section within 24 hours. | TM | Result summary | Signed TCROA
Post results | Publish the Competent/NYC list on the LMS or notice board; outline the re-assessment slot (max 2 attempts). | Course Admin | TCROA data | Result bulletin
Handle appeals | Receive the appeal form within 5 days; the Appeals Panel reviews the video/checklist and issues a decision memo. | Appeals Panel | Appeal form; video | Appeals decision
Item analysis | Aggregate KPIs: pass rate, average time, frequent errors; log in the Practical-Stats Sheet. | QA | Checklists; timer log | Stats sheet
Improvement actions | The curriculum team reviews the stats, adjusts the scenario or training content, and versions the assessment pack up to vX.X+1. | Course Developer / Assessor | Stats sheet; feedback | Updated package
Archive documentation | File checklists, sign-offs, videos, incident and stats records in /Assessments/Practical///vX.X/; retain for ≥ 5 years. | Document Controller | All records | Archived folder
Equipment reset | Return the simulator/equipment to a neutral state; log maintenance needs. | Lab Tech | Reset checklist | Maintenance log
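
The "Result decision" rule above is binary: any critical criterion rated NYC fails the component. A minimal sketch of that rule, using hypothetical checklist criteria, is shown below.

```python
# Minimal sketch of the practical result decision: Competent only if every
# critical criterion on the checklist is rated "C". Criterion names are
# hypothetical examples, not items from an actual checklist.
checklist = {
    # criterion: (rating, is_critical) where rating is "C" or "NYC"
    "Dons PPE correctly":            ("C",   True),
    "Isolates valve before startup": ("NYC", True),
    "Completes task within time":    ("C",   False),
}


def practical_result(checklist):
    failed_critical = [name for name, (rating, critical) in checklist.items()
                       if critical and rating != "C"]
    return ("Not Yet Competent", failed_critical) if failed_critical else ("Competent", [])


result, failures = practical_result(checklist)
print(result)
for criterion in failures:
    print("  Critical criterion not met:", criterion)
```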

This section outlines the grading criteria for assessments, how final course grades or outcomes are determined, and the policies for re-examination (re-sits) and appeals of assessment results. Consistent and transparent grading practices are critical for fairness and for compliance with standards that require clear pass/fail criteria to be defined in the assessment procedures.

Assessment Component | Weight | Passing Requirement | Attempts Allowed | Additional Rules / Notes
Formative Assessments (topic-end quizzes, exercises) | 0% (non-bearing)¹ | Completion of every formative item | Unlimited attempts | Completion unlocks the next topic. The instructor reviews low-scoring items and schedules refreshers before the summative exams.
Summative Theoretical Assessment (written / computer-based) | 50% of the final grade | ≥ 75% overall score² | 1 regular sitting + up to 2 authorized re-sits | A trainee who still fails after the second re-sit (third failed sitting) must take a formal refresher course before reassessment (refer to the Re-sit Policy). Passing this component is mandatory before the trainee may attempt the Summative Practical Assessment. Item weights follow the Table of Specifications; results are recorded in the TCROA.
Summative Practical Assessment (performance tasks) | 50% of the final grade | Competent in all critical tasks³ | 1 regular sitting + task-specific re-assessment (if feasible) | Binary rating per task: Competent / Not Yet Competent. Any "Not Yet Competent" on a critical task results in a component Fail. May only be taken after passing the Summative Theoretical Assessment. Conducted with checklists and rubrics; results are logged in the TCROA.
Final Course Grade | 100% | Summative Theoretical ≥ 75% AND Summative Practical = Competent | — | Both summative components must be passed; otherwise, the overall grade is a FAIL. Formative completion is a prerequisite to sit the summative assessments. (A worked sketch of this rule follows the notes below.)

¹ Formative assessments carry no percentage weight but are compulsory gateways.
² The 75 % minimum pass mark is fixed.
³ “Critical tasks” correspond to mandatory MARINA/STCW competence elements and must all be passed in one session or during the allowed reassessment.
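
As a worked illustration of the table above, the sketch below applies the two pass hurdles (written score ≥ 75 % and a Competent practical rating) and then reports a 50/50 weighted figure. How a binary Competent rating is converted into its 50 % weight is not prescribed by this manual; treating a Competent rating as the full weight is an assumption made here purely for illustration, as are the function and variable names.

```python
# Minimal sketch of the final-outcome rule: the theoretical component must
# reach the fixed 75 % pass mark AND the practical component must be rated
# Competent; only then is a weighted figure reported.
PASS_MARK = 75.0  # fixed minimum pass mark for the theoretical component

def final_outcome(theory_pct: float, practical_competent: bool):
    """Return ('PASS'|'FAIL', weighted grade or None) per the grading table."""
    if theory_pct >= PASS_MARK and practical_competent:
        # Assumption for illustration: a Competent practical contributes its full 50 % weight.
        weighted = 0.5 * theory_pct + 0.5 * 100.0
        return "PASS", round(weighted, 1)
    return "FAIL", None

print(final_outcome(82.0, True))   # ('PASS', 91.0)
print(final_outcome(82.0, False))  # ('FAIL', None) - practical not yet competent
print(final_outcome(70.0, True))   # ('FAIL', None) - below the 75 % pass mark
```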

Policy Element | Standard Provision | Implementation / Notes
Eligibility for Re-sit | A trainee who fails any final summative component (written or practical) may re-take the failed component only. | A new but equivalent paper or a fresh practical scenario is used; the instructor debriefs weaknesses (with no question disclosure).
Number of Attempts | Maximum of three (3) sittings in one exam cycle: 1 initial + 2 authorized re-sits. | After the second re-sit (third failure), the cycle ends; no further immediate tests are conducted (see the illustrative sketch after this table).
Required Remedial Training | Mandatory refresher or course re-enrolment before a new exam cycle can start. | Ensures added learning before further attempts.
Timing / Scheduling | A same-day re-sit is allowed; re-sits must occur within 1 year of the initial failure. | A same-day re-take depends on the availability of personnel and facilities; a practical re-sit depends on facility & assessor availability.
Fees | Re-sits may carry an additional fee as per the fee schedule or approved internal policy. | The fee policy is disclosed in the Course Info Sheet.
Component-Specific Re-sit | Only the failed component is repeated; the passed component stands. | The TCROA marks the pass; the failed part is flagged “pending”.
Recording & TCROA Notation | All re-sits are logged; the TCROA flags 2nd- or 3rd-attempt passes (e.g., with an asterisk). | Supports audits & data analytics.
Exam Integrity & Security | Re-sits follow identical proctoring rules; new question sets preserve integrity. | May use the same or a different qualified assessor.
Special / Marginal Cases | No oral supplementation to push borderline scores; competence is proven via a formal re-sit. | Guarantees fairness.
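
The attempts rule above (one initial sitting plus up to two authorized re-sits per exam cycle, applied only to the failed component) can be expressed as a simple eligibility check. The sketch below is illustrative; the function name, counters, and messages are assumptions rather than prescribed records.

```python
# Hedged sketch of the re-sit eligibility rule: a maximum of three sittings
# per exam cycle for a failed component; after that, a refresher or
# re-enrolment is required before a new cycle may begin.
MAX_SITTINGS_PER_CYCLE = 3  # 1 initial sitting + 2 authorized re-sits

def resit_status(sittings_used: int, component_passed: bool) -> str:
    """Describe the next step for a single summative component."""
    if component_passed:
        return "Component passed; the result stands and no re-sit is needed."
    if sittings_used < MAX_SITTINGS_PER_CYCLE:
        remaining = MAX_SITTINGS_PER_CYCLE - sittings_used
        return f"Eligible to re-sit the failed component ({remaining} sitting(s) left in this cycle)."
    return "Exam cycle exhausted; mandatory refresher or re-enrolment before a new cycle."

print(resit_status(1, False))  # still eligible, 2 sittings left
print(resit_status(3, False))  # cycle exhausted, remedial training required
```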

The institution recognizes the right of a trainee to appeal an assessment result if they believe it was unjust or in error. An appeal is a formal request by a candidate to review and reconsider their assessment outcome.

Applies to all summative assessment components (theoretical and practical) conducted by the institution. Covers grievances about grading errors, out-of-scope questions, assessor/proctor conduct, or significant procedural irregularities.

The following constitute acceptable grounds for lodging a formal appeal. Appeals that do not fall within these categories may be dismissed unless exceptional circumstances are demonstrated.

Acceptable Ground | Example
Grading Error | A correct answer marked wrong; miscalculation of the total score.
Out-of-Scope Question | An item tests content not included in the course syllabus (Part C) or the official learning materials.
Bias / Improper Conduct | Evidence of assessor or proctor prejudice, intimidation, or favoritism.
Procedural Irregularity | A disturbance, equipment failure, or breach of exam rules that materially affected performance.
Administrative Error | Wrong candidate identity used, incorrect exam version issued, or a data-entry mistake.

Procedures for Handling Appeals

Step | Details of the Procedure | Resp | Input | Output
Informal Clarification | The trainee discusses the result informally with the Instructor within two (2) working days of the result’s publication. | Trainee, Instructor | Assessment result | Clarification provided
Lodge Formal Appeal | If unresolved, the trainee submits a written Appeal Letter to the Training Manager (TM) within five (5) working days of the result release, specifying the grounds and evidence. | Trainee | Appeal Letter | Appeal Log entry
Acknowledge & NDA | The TM acknowledges receipt within 1 working day, issues an Appeal Acknowledgement, and has the trainee sign an NDA covering the confidentiality of assessment materials. | Training Manager | Appeal Log entry | Appeal Acknowledgement; signed NDA
Constitute the Appeals Panel | The TM forms a 3-member Appeals Panel (the TM or a delegate as Chair, an independent SME/assessor, and the original assessor, unless there is a conflict of interest). | Training Manager | Appeal Log; availability roster | Panel Assignment Memo
Evidence Collection | The panel gathers answer sheets, scoring rubrics, LMS logs, video recordings, incident reports, the syllabus, and the Table of Specifications (ToS). | Appeals Panel | Assessment records | Evidence dossier
Review & Deliberation | The panel reviews the evidence against the appeal grounds and interviews the staff involved as needed. | Appeals Panel | Evidence dossier | Draft decision
Decision & Report | The panel finalizes the outcome (Upheld / Denied), recommends any remedial action, and completes the Appeals Review Report, signed by all members. | Panel Chair | Draft decision | Signed Appeals Review Report
Update Records | If the appeal is upheld, adjust the scores, TCROA, and gradebook; if denied, no change is made. Record the decision in the QA Appeals Log. | QA Officer, Training Manager | Appeals Review Report | Updated records
Communicate Outcome | The TM issues a formal Decision Letter to the trainee within five (5) working days of appeal receipt, including a brief rationale and next steps (re-sit eligibility, etc.). | Training Manager | Appeals Review Report | Decision Letter
Implement Remedial Action | If the panel orders a retest or voids the exam, schedule it accordingly; fees are waived if the fault lies with the institution. | Instructor, QA | Decision Letter | Scheduled retest / corrected grade
Archive Documentation | File the Appeal Letter, NDA, Evidence Dossier, Appeals Review Report, and Decision Letter in the secure QA archive for 5 years. | QA | Appeals documents | Archived appeals file
Invigilation of Assessments

Step | Description | Resp | Input | Output
Select & Assign Invigilators | The Training Manager may designate the course supervisor or another assessor to lead each summative assessment and assigns additional trained staff so that written assessments have at least one invigilator per 20–24 candidates and practical exams have adequate assessor coverage for the candidates (plus a safety aide if needed); a staffing arithmetic sketch follows this table. | Training Manager | Assessment schedule; faculty list | Confirmed invigilator roster
Brief Invigilators on Confidentiality & Impartiality | Before every session, invigilators declare any conflict of interest and are reminded that they must not assist candidates or perform unrelated tasks while on duty. | Training Manager, Assessor | Confidentiality & conflict-of-interest declarations | Cleared invigilators
Conduct Pre-Exam Room & Material Checks | Invigilators inspect desks, walls, equipment, and waste bins for hidden notes or devices, remove any unauthorized materials, and lay out spare pens, attendance sheets, and incident forms. | Invigilator | Assessment venue; checklist | Cleared, ready assessment room
Verify Candidate Identity & Admission | At the door, the assessor and invigilator(s) match each candidate’s photo ID and admission slip against the roster; unresolved cases are referred to the Training Manager before entry. | Invigilator | ID cards, admission slips, roster | Authenticated seating list
Announce Rules & Secure Personal Items | Invigilators read the standard rules aloud, instructing candidates to switch off and surrender their smart devices, place their bags in the designated area, and keep only allowed materials on their desks. | Invigilator | Rules script; storage bins | Candidates briefed; items secured
Log Candidate Attendance | Once candidates are seated, invigilators take attendance on the official sheet and note late arrivals; latecomers are admitted only within the first 15 minutes and receive no extra time. | Invigilator | Attendance sheet | Completed attendance record
Maintain Active Invigilation | Invigilators position themselves to view all candidates, rotate unpredictably, and discreetly observe answer sheets or screens for copying patterns while avoiding actions that could be construed as coaching. | Invigilator | Seating plan | Continuous supervision
Detect & Deter Prohibited Behaviors | Invigilators watch for whispering, wandering eyes, crib sheets, and devices; they may re-seat suspects, confiscate materials, or accompany any candidate who must briefly leave the room. | Invigilator | Surveillance observations | Real-time intervention
Intervene & Document Incidents | In cases of confirmed cheating (e.g., use of phones, pre-written answers), the invigilator terminates the exam, collects evidence, records the details in the Invigilation Incident Report, and discreetly removes the candidate. | Invigilator; Assessor | Incident form; seized items | Completed Incident Report; secured evidence
Collect Scripts & Rough Work | At “time’s up,” invigilators collect all answer scripts and rough paper, count them against the attendance sheet, and seal them in a tamper-evident envelope. | Invigilator | Answer scripts; envelope | Secured exam package
Ensure Post-Exam Confidentiality | Invigilators hand the sealed scripts to the Assessor, shred surplus rough paper, and refrain from discussing questions publicly; they countersign the transfer log to confirm the chain of custody. | Invigilator; Assessor | Transfer log | Protected exam content
Review Incidents & Impose Sanctions | The Training Manager reviews all Incident Reports, interviews the parties involved if needed, and applies misconduct penalties (fail, bar on immediate re-sit, or dismissal). MARINA is notified of grave cases, such as impersonation. | Training Manager | Incident Reports; policy matrix | Sanction decision; MARINA report (if required)
File Records | All attendance sheets, incident reports, and sanction decisions are archived in secure storage (both electronic and hard copy) for a minimum of 5 years, in accordance with the Records Retention Procedure. | Records Officer | Exam documents | Archived records
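
The staffing ratio in the first step of this table (at least one invigilator per 20–24 candidates) implies a simple ceiling calculation when planning sessions. The sketch below uses the conservative end of the range (20) as an assumption; planners may substitute any figure within the stated range.

```python
# Illustrative staffing arithmetic for written assessments; the divisor of 20
# is an assumption taken from the conservative end of the 20-24 range.
import math

def invigilators_needed(candidates: int, per_invigilator: int = 20) -> int:
    """Minimum invigilators so no one supervises more than `per_invigilator` candidates."""
    return max(1, math.ceil(candidates / per_invigilator))

print(invigilators_needed(18))  # 1
print(invigilators_needed(45))  # 3
```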

To safeguard the integrity of all assessments by preventing, detecting, and addressing cheating in a consistent, fair, and traceable manner.

Applies to every formative and summative assessment, whether onsite, distance-learning, or practical, including assessments administered through the Learning Management System.

Step | Description | Resp | Input | Output
Deploy Preventive Measures | Prior to any exam, the IT Officer enables randomization of question and answer order and geo-fences LMS access. The Training Manager ensures that the assessment versions are uploaded and that seating plans provide adequate spacing. | IT Officer; Training Manager | Exam bank; seating chart | Hardened assessment environment
Honor Statement | Each candidate affirms the honor statement before the assessment, committing not to give or receive unauthorized assistance. | Invigilator | Honor statement | Candidate commitment
Identity & Permit Verification | Invigilators check photo IDs and admission slips at the door; any mismatch results in temporary denial of entry until resolved with the Training Manager. | Invigilator | ID card; roster | Authenticated seating list
Personal-Item Control | Invigilators instruct candidates to power off and surrender their smart devices, store bags in the designated area, and keep only transparent pencil cases and approved materials on their desks. | Invigilator | Storage bins | Secured exam hall
Active Proctoring & Detection | Invigilators rotate positions, observing for whispering, answer-pattern mirroring, or frequent lap gazes (which may indicate phone use); they re-seat suspects and accompany any candidate who must leave the room. | Invigilator | Seating plan | Real-time deterrence
Confiscation & Evidence Collection | On discovery of unauthorized notes, devices, or impersonation, the invigilator discreetly confiscates the item; notes the time, the candidate’s name, and the question involved; and photographs digital evidence if feasible. | Invigilator | Seized items | Evidence packet
Immediate Sanction Decision | Possession of devices or pre-written answers triggers automatic exam termination; a first-time copying attempt earns an on-paper warning, but repetition leads to termination. | Assessor | Incident details | Candidate dismissed or warned
Incident Reporting | The invigilator completes form SEAV-ASM-C01, attaches the evidence, and submits it to the Assessor for countersignature; the candidate is then offered the opportunity to give a brief written statement. | Invigilator; Assessor | Report form | Signed Incident Report
Post-Exam Chain of Custody | Invigilators collect all rough sheets, seal the answer scripts, shred scrap paper, and log the transfer of materials to the Records Officer. | Invigilator; Records Officer | Scripts; log | Secured exam package
Management Review & Penalty | The Training Manager reviews the incident, consults the policy matrix, and decides on the penalty (fail, re-sit bar, remedial training, or expulsion). For grave cases, they prepare the MARINA notification within five (5) working days. | Training Manager | Incident Report | Sanction decision; MARINA letter
Records Retention & KPI Logging | The Records Officer archives all related documents (both hard copies and digital) for a minimum of five years and logs the incident in the KPI dashboard. | Records Officer | Incident dossier | Archived file; updated KPI
Continuous Improvement | The IT Officer analyses LMS logs for unusual IP patterns; the Chief Assessor runs answer-similarity checks; findings inform updates to invigilator training and preventive controls in the next quarterly review (an illustrative similarity check follows this table). | IT Officer; Chief Assessor | LMS logs; similarity report | Updated controls & training
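
The answer-similarity check mentioned in the final step is not tied to a particular method by this manual. One common approach, sketched below purely for illustration, is to flag candidate pairs who share an unusually high number of identical wrong answers on a multiple-choice paper; the threshold, candidate IDs, and data layout are all assumptions.

```python
# Minimal sketch of a post-exam answer-similarity screen: count items where
# two candidates chose the same incorrect option and flag pairs above a
# threshold for manual review. Threshold and data are illustrative only.
from itertools import combinations

def shared_wrong_answers(a, b, key):
    """Count items where both candidates chose the same incorrect option."""
    return sum(1 for x, y, k in zip(a, b, key) if x == y and x != k)

def flag_similar_pairs(responses, key, threshold=8):
    flags = []
    for (id1, r1), (id2, r2) in combinations(responses.items(), 2):
        n = shared_wrong_answers(r1, r2, key)
        if n >= threshold:
            flags.append((id1, id2, n))
    return flags

# Hypothetical 10-item exam: answer key plus three candidates' responses.
key = list("ABCDABCDAB")
responses = {
    "T-001": list("ABCDABCDAB"),  # all correct
    "T-002": list("BBCDACCDBB"),
    "T-003": list("BBCDACCDBB"),  # identical pattern to T-002, including wrong answers
}
print(flag_similar_pairs(responses, key, threshold=3))  # [('T-002', 'T-003', 3)]
```

Flagged pairs are only leads for the Chief Assessor's review, not evidence of misconduct in themselves.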

On-Site Geo-Fenced LMS Assessments

To ensure that all on-site LMS assessments are taken only from approved computer laboratories or classrooms, using IP-range locking and invigilator oversight to match in-person security.

Applies to every formative or summative exam delivered on-site through the Learning Management System.

Step | Description | Resp | Input | Output
Whitelist Venue IPs | IT enters the lab’s static IP range (e.g., 192.168.12.0/24) under “Quiz access → Require network address” (see the illustrative IP-range check after this table). | IT | IP list | Updated LMS config
Publish Geo-Locked Exam Window | The Training Manager schedules the quiz (e.g., 10:00–12:00 PHT) and tags it “On-Site Only.” | Training Manager | Timetable | Exam visible in the LMS calendar
Workstation Readiness Check | Thirty (30) minutes before the start, the Invigilator logs in on each PC to verify that the LMS shows “Location Verified.” | Invigilator | PCs | Cleared lab
Identity & Admission | At the door, the Invigilator matches the photo ID to the roster and seats the candidate at a verified PC. | Invigilator | IDs; roster | Attendance sheet
Active Monitoring | The Invigilator patrols the aisles while SEB (Safe Exam Browser) enforcement is in effect; any IP-hop alert is escalated to IT. | Invigilator | SEB dashboard | Secure exam run
Geo-Fail Handling | If the LMS wrongly blocks a PC, IT quickly re-checks the IP; if unresolved, the candidate swaps to a spare workstation. | IT; Invigilator | Error screen | Candidate seated
Close & Archive | After “time’s up,” the Invigilator seals the scripts, IT exports the IP logs, and the Records Officer stores both for ≥ 5 yrs. | Invigilator; IT; Records Officer | Scripts; log files | Archived dossier
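
Conceptually, the “Require network address” restriction in the first step admits a quiz attempt only when the client’s IP address falls inside the whitelisted venue range. The sketch below shows that membership test in Python using the standard ipaddress module; the subnet reuses the example from the table and is not the institution’s actual lab range.

```python
# Sketch of the venue IP-range check: a workstation is "Location Verified"
# only if its address lies within an approved subnet. The subnet below is the
# illustrative example from the table, not a real configuration value.
import ipaddress

VENUE_NETWORKS = [ipaddress.ip_network("192.168.12.0/24")]

def location_verified(client_ip: str) -> bool:
    """True if the client's IP lies within an approved venue subnet."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in VENUE_NETWORKS)

print(location_verified("192.168.12.37"))  # True  - inside the lab range
print(location_verified("10.0.0.5"))       # False - blocked; escalate to IT if unexpected
```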

Off-Site (Remote) Geo-Fenced LMS Assessments

Step | Details | Resp | Input | Output
Remote Site Approval | The candidate submits the address + ISP details ≥ 5 days before the exam; the TM issues an approval letter and whitelists the IP/GPS. | Training Manager | Request form | Approval letter
SEB Profile Distribution | IT emails the SEB config that locks the exam to the approved GPS radius (±250 m) and blocks VPN/proxy use. | IT | GPS coordinates | SEB config file
24-h Location Test | The candidate runs the SEB “pre-check”; the system uploads the GPS/IP data. A failure auto-generates a support ticket. | Candidate; IT | SEB | Pass/fail log
Identity & 360° Room Scan | At the exam start, the Remote Proctor confirms the photo ID and records a room sweep; a mismatch cancels the exam. | Proctor | Webcam feed | Location video log
Continuous Geo-Monitoring | The LMS & proctoring tool flag GPS drift > 250 m, face-off-camera events, or an IP change; the proctor pauses the exam to investigate (a distance-check sketch follows this table). | Proctor | Live dashboard | Incident alert
VPN/Proxy Auto-Block | The firewall rejects known proxy IPs; the candidate sees “VPN Detected—Disconnect.” | IT | Proxy DB | Connection blocked
Geo-Fail Emergency | If a block appears legitimate, the Proctor initiates a live call; the candidate shows the outside view and current weather to verify the location; if validated, IT issues a temporary unlock token and records form SEAV-GEO-F01. | Proctor; IT | Error screen | Unlocked session; completed form
Close & Evidence Archive | The Proctor ends the session and uploads the video; IT exports the GPS/IP logs; the Records Officer archives the video + logs for ≥ 5 yrs. | Proctor; IT; Records Officer | Recordings; logs | Archived dossier
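
The continuous geo-monitoring step flags GPS drift greater than 250 m from the approved site. For illustration only, the sketch below computes the great-circle distance with the haversine formula and compares it against that radius; the coordinates, names, and telemetry format are assumptions, since the manual does not define how the proctoring tool reports position.

```python
# Hedged sketch of the GPS-drift rule: raise an alert when the reported
# position is more than 250 m from the approved exam location.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula, mean Earth radius)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

APPROVED_SITE = (14.5995, 120.9842)  # hypothetical approved exam location
RADIUS_M = 250.0                     # per the ±250 m rule in the table

def gps_drift_alert(lat, lon):
    """True if the candidate has drifted outside the approved radius."""
    return distance_m(APPROVED_SITE[0], APPROVED_SITE[1], lat, lon) > RADIUS_M

print(gps_drift_alert(14.5996, 120.9843))  # False - within the approved radius
print(gps_drift_alert(14.6100, 120.9842))  # True  - roughly 1.2 km away, incident flagged
```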