The purpose of this manual is to establish an assessment system for all training courses offered by the institution. It provides a structured framework of policies and procedures to ensure that trainee assessments are conducted consistently, fairly, and in full alignment with the STCW Convention, 1978, as amended, national regulations, and company-specific training requirements in the maritime sector. The manual fulfills these requirements by clearly outlining how assessments are designed, administered, and managed to verify that seafarers meet the prescribed competence standards under the STCW Convention. In essence, the rationale for this manual is to promote valid and reliable assessment of competencies, uphold the integrity of the certification process, and ensure that every assessment contributes to the development of competent and safety-conscious seafarers in accordance with STCW objectives and Maritime Industry Authority (MARINA) policies.
This manual’s objectives are to:
This manual covers the assessment system for all STCW-mandated training courses delivered by the institution. It applies to the full range of mandatory courses listed by MARINA (as per the official list of STCW mandatory training courses), including any ancillary or refresher training that falls under STCW Convention requirements. The procedures herein govern both theoretical (written) assessments and practical assessments (including simulator-based assessments) conducted as part of course completion requirements.
The provisions of this manual apply to all personnel involved in training assessment processes, including but not limited to: course developers, instructors, designated assessors, proctors/invigilators, and training administrators. All MARINA-accredited instructors and assessors are expected to implement these policies in their respective roles. The manual is to be followed in both classroom-based (face-to-face) training and in technology-assisted learning environments. Where courses are delivered via distance learning or e-learning modes, the same assessment standards apply, with additional provisions to address remote assessment security in line with MARINA’s distance learning guidelines. This manual is therefore applicable to assessments conducted on-site at the training center, as well as LMS-based assessments used in blended or online course delivery.
It is the responsibility of the Training Manager (or equivalent academic head) to ensure that all assessments within scope are designed and conducted in accordance with this manual. Individual instructors and assessors are responsible for conducting assessments in accordance with the procedures, while the Quality Assurance (QA) department is responsible for monitoring the compliance and effectiveness of the assessment system. Trainees are also within the scope insofar as they must adhere to the examination rules and procedures defined herein.
For the purpose of this manual, the following key terms are defined operationally:
| Term | Definition | Reference |
|---|---|---|
| Administration | refers to the Maritime Industry Authority (MARINA) | MC No. SC – 2021-08, MC No. SC - 2022 - 05 |
| Approved Training Course | refers to a learner-centered system of instructions, approved by the Administration, and designed to equip the trainees with the necessary knowledge, understanding, and proficiency that would lead to the acquisition of the required competences under the STCW Convention, 1978, as amended. | MC No. SC – 2021-08, MC No. SC – 2021-09 |
| Asynchronous delivery | (offline learning) refers to the conduct of classes through distance and e-learning that do not occur in the same place or at the same time. | MC No. SC – 2021-10 |
| Assessment of Learning | refers to the systematic collection, measurement, and examination of the trainee's performance with respect to the intended learning outcomes. | MC No. SC – 2021-09, MC No. SC - 2021 |
| Assessment Tools | refers to the following components: the context and conditions of assessment, the tasks to be administered to the trainees, an outline of the evidence to be gathered from the candidate, and the evidence criteria used to judge the quality of performance. | MC No. SC – 2021-09, superseded by MC No. SC – 2022-05 |
| Assessment of Competence | refers to the process of collecting evidence through theoretical examination and practical assessment of the knowledge, understanding and proficiency, gained from the following: approved education and training; approved training ship experience; approved laboratory equipment training; or approved in-service experience, and making judgments on whether competency has been achieved and that an individual can perform the relevant standards in the table of competences of the STCW Code, as amended. | MC No. SC – 2021-08 |
| Blended Learning (BL) | refers to the combination of distance and/or e-learning and face-to-face modes of delivery of training of seafarers. | MC No. SC – 2021-10 |
| Cloud-based simulation | a facility for training through a remote desktop solution which enables physical and operational realism through virtual reality in order to achieve the required competences in the appropriate provisions of the STCW Code. | MC No. SC – 2021-10 |
| Course Package | refers to the Course Plan plus the Instructional materials and Assessment tools. | MC No. SC – 2021-09 |
| Course Plan | is the systematic organization of course documents designed and structured based on IMO Model Course format consisting of: Course Framework (Part A); Course Outline and Timetable (Part B); Course Syllabus (Part C); Instructor's Guide (Part D); and Course Assessment (Part E). | MC No. SC – 2021 – 09, Superseded by MC No. SC – 2022-05 |
| Distance Learning (DL) and e-Learning (EL) | refer to the conduct of training where trainees receive instruction through online classes, video recordings, video conferencing, or any other audio/visual technology medium. It enables trainees to undergo training without having to be physically present in a classroom. | MC No. SC – 2021-10 |
| Face-to-Face Learning | refers to an instructor-led activity in a traditional setting for the conduct of the theoretical, practical, or laboratory parts of maritime training courses using laboratory facilities and equipment, including simulators. | MC No. SC – 2021-10 |
| Learning Management System (LMS) | refers to the software application used by the MTI in the administration, documentation, tracking, reporting, automation of delivery and assessment of the training courses through distance and e-learning modes, and the issuance of certificates. It also pertains to the use of a system to provide a number of critical services that make the interaction between the instructor, assessor, and trainees more seamless. | MC No. SC – 2021-10 |
| Instructional Materials (IMs) | These are materials that complement and supplement instruction. These are also referred to as teaching aids used in the delivery of the course, such as audio-visual presentations or computer-generated slides, exercise sheets, workbooks, pictures, diagrams, and the like. | MC No. SC – 2021 – 09, Superseded by MC No. SC – 2022-05 |
| Institution | refers to a Maritime Training Institution or an Assessment Center | MC No. SC – 2021-08 |
| Learning Resource Center | refers to a facility within an MTI, staffed by a specialist, containing several information sources to facilitate learning for trainees and staff. It focuses on multimedia resources and information technology. | MC No. SC – 2021-08 |
| Planned Maintenance System (PMS) | is the documented process of periodic inspection, testing, and repair of equipment and facilities to ensure that, at any given time, they are up and running, preventing any costly unplanned downtime from unexpected equipment failure | MC No. SC – 2021-08 |
| Quality Standards System | refers to the documented policies, procedures, controls and internal quality assurance system, relating but not limited to training, assessment of competence and revalidation activities, designed to ensure the achievement of defined objectives of the training course in accordance with the requirements of the STCW Convention. | MC No. SC – 2021-08, MC No. SC – 2021-09 |
| Receiving technology | refers to the hardware and software associated with and used by the trainee. | MC No. SC – 2021-10 |
| Record of Review, Verification, and Validation | refers to the working documents and evidence resulting from the review, verification, and validation process. These records reflect the acceptance, rejection, or qualification of data or information in an objective and consistent manner. | MC No. SC – 2021-09 |
| Synchronous delivery | (online learning) refers to the conduct of classes through distance and e-learning which occurs through a virtual platform while the instructors and trainees are separated physically but connected real-time through the internet or other medium. | MC No. SC – 2021-10 |
| STCW Convention | refers to the International Convention on Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended, and its associated Code. | MC No. SC – 2021-09 |
| STCW Office | refers to the office in MARINA, specifically tasked to give full and complete effect to the requirements of the STCW Convention, 1978, as amended. | MC No. SC – 2021-08 |
| Training Completion and Record of Assessment (TCROA) | refers to the prescribed document where the name of trainees who have completed the training course and the outcome of their assessment are recorded as certified by the qualified assessor and Training Director of an accredited Maritime Training Institution. | MC No. SC – 2021-09 |
The following general provisions establish the foundational requirements and policies for implementing the assessment system:
| Regulatory Reference | Date of Issue | Description |
|---|---|---|
| MC No. SC – 2022-05 | 11/14/2022 | Standards for Mandatory Training Courses under the STCW Convention, 1978, as Amended |
| MC No. SC – 2021-10 | 12/29/2021 | Revised Guidelines on Training and Assessment of Seafarers by Distance Learning and E-Learning in Accordance with the Provisions of Regulation I/6 of the STCW Convention, 1978, as Amended |
| MC No. SC – 2021-09 | 12/29/2021 | Policies, Rules and Regulations on the Approval of Training Courses under the STCW Convention, 1978, as Amended |
| MC No. SC – 2021-08 | 12/29/2021 | Policies, Rules, and Regulations on the Accreditation of Maritime Training Institutions and Assessment Centers |
| MC No. SC – 2021-02 | 03/18/2021 | Revised Rules on the Monitoring of Approved Training Courses (ATCs) Conducted by the Maritime Training Institutions (MTIs), and Assessment of Seafarers’ Competence Carried Out by Accredited Assessment Centers (ACs) |
| Statutory Reference | Date of Issue | Description |
|---|---|---|
| ISO 9001:2015 | — | Quality Management System – Requirements |
The following general policies govern the implementation and maintenance of the assessment system manual itself, as well as overarching responsibilities and practices. These policies ensure that the manual is a living document effectively used by the organization:
This Assessment System Manual shall be implemented across all departments and personnel involved in training delivery and evaluation. The Training Manager is responsible for officially issuing the manual and ensuring that all instructors, assessors, and relevant staff have been oriented on its contents. A formal orientation or training session on the manual’s policies and procedures will be conducted whenever new staff are onboarded or when major updates to the manual occur.
All assessments for STCW courses must be conducted strictly in accordance with the procedures laid out in this manual – no ad hoc or informal assessment methods are permitted. Department heads and course supervisors shall monitor compliance by staff. Any deviation from the manual’s procedures (if necessary in exceptional cases) must be approved by the Quality Assurance (QA) Manager in advance and documented with justification. The principle is that this manual’s procedures are mandatory instructions to be followed to ensure standardization and regulatory compliance.
To support implementation, controlled copies of the manual (printed or digital) will be distributed (see distribution policy below), and easy access will be provided. The QA department may conduct spot checks or observe assessments to verify that practice aligns with the manual. Where issues are found, corrective training or disciplinary measures will be taken to enforce adherence. Ultimately, the successful implementation of the manual is measured by smooth, uniform assessment operations and positive audit/inspection results from external bodies such as MARINA or accreditation bodies.
The assessment manual is a controlled document that is subject to periodic review and revision to ensure it remains current with the latest requirements and best practices. At a minimum, the manual should be reviewed annually. In addition, specific triggers will prompt immediate review and possible revision, such as:
The QA Manager will coordinate the review process, assembling a review team that may include instructors, the Training Manager, and a curriculum specialist. Proposed revisions are documented and presented to top management for approval. Once approved, a new revision number is assigned, and the manual is reissued. All manual revisions are logged in a Document Control table at the front of the manual, noting the changes made and the effective date. Obsolete versions (physical copies) are retrieved and archived to avoid confusion. The QA Manager ensures that MARINA is informed, if required, of substantive changes (for example, MARINA might require resubmission of portions of the manual if significant changes occur in assessment policy).
Version control is strictly maintained; the current version and revision date are indicated on every page or the cover of the manual. During review, attention is also given to ensure alignment with the latest versions of standards for learning services, so that best practices continue to be integrated.
In summary, this policy guarantees that the manual is not static – it evolves through a formal review cycle to remain relevant, effective, and compliant with all applicable requirements.
Clear roles and responsibilities are defined to implement and uphold the assessment system:
By clearly delineating these responsibilities, the manual ensures that everyone knows their role in the assessment system, which promotes accountability and smooth operation. A RACI (Responsible, Accountable, Consulted, Informed) chart may be included in an appendix to summarize these roles for each major process step (design, approval, conduct, etc.).
This manual is distributed as a controlled document. The QA department will maintain a distribution list to ensure all relevant personnel have access to the current version. Distribution is done in the following manner:
All staff must ensure they reference the current version of the manual when performing their duties. The QA department will notify all users (e.g. via email or memo) whenever a new revision is released, summarizing the changes. Each recipient is responsible for updating their controlled copy or replacing it with the new version. During annual staff training, a short refresher on key manual points and any recent changes will be given.
In conclusion, through controlled distribution, we maintain consistency in understanding and applying the assessment policies and avoid the scenario of someone inadvertently following outdated procedures. This supports the integrity and uniformity of our assessment system across the entire institution.
This section provides an overview of the end-to-end process for managing assessments, from initial design through administration and up to the validation of results and continuous improvement. The assessment system process is structured to ensure a systematic approach covering planning, development, implementation, and review, consistent with the framework of applicable ISO standards. The key stages in the assessment process are:
Throughout these stages, administrative requirements such as documentation, record-keeping, and reporting are handled (details on those in subsequent sections). For instance, during implementation and monitoring we ensure that attendance sheets, identity checks, and log records are kept; and after completion, we ensure results are reported to MARINA via the TCROA as required.
This holistic process ensures that assessments are not one-off events but part of a managed system. Each step is governed by procedures in this manual, which will be detailed in the following sections. By following this structured process, the institution can demonstrate that its assessment system is planned, systematic, and quality-assured end-to-end, which is expected by both ISO standards and MARINA. In the next sections, we delve deeper into specific procedures for key parts of this process: designing written and practical assessments, the grading system, conducting assessments, maintaining security, and validating results.
The design and development process of training programs and assessment instruments is a systematic and standards-driven activity that ensures alignment with the Maritime Industry Authority (MARINA) regulations, particularly those mandated under the STCW Convention, 1978, as amended.
At the core of the development phase is mapping each learning outcome to the appropriate assessment methods, ensuring that both knowledge and skills are evaluated in accordance with the STCW Code, Section A-I/6. The process involves:
All assessment activities are documented and traceable, with results recorded in the Training Completion and Record of Assessment (TCROA), as required under MARINA Circular SC-2021-09.
This structured approach ensures that each element of the training program — from content delivery to performance evaluation — upholds the quality and integrity expected in maritime education and training.
| Step | Details of the Procedure | Resp | Input | Output |
|---|---|---|---|---|
| Analysis and Planning | ||||
| Initiate syllabus review & MARINA alignment | Examine the Course Syllabus (Part C) to extract the ILOs the quiz must measure. | Instructor | Part C | ILO list aligned to MARINA standards |
| Map cognitive level via TOS | Use the Table of Specifications to assign Bloom’s level to each ILO/topic, guiding item type and count. | Instructor | ILO list; TOS | Completed TOS matrix |
| Design and Development | ||||
| Load the standard template | Copy the institution’s Standard Formative-Assessment Template (with built-in layouts and authoring guide) into the course folder. | Instructor, Instructional Designer | Standard template | Course-specific quiz shell |
| Develop questions | Populate the template with items directly mapped from the TOS, using approved question types for each Bloom’s level and citing instructional-material pages for accuracy. | Instructor, Instructional Designer | TOS; Quiz shell; Resources | Draft quiz |
| Check feedback & branching logic | Verify that each question’s feedback text, branching, and retry paths follow template rules and pedagogical intent. | Instructor, Instructional Designer | Draft quiz | Logic-verified quiz |
| Set scoring, attempts & completion | Configure quiz properties: unlimited attempts, randomized order, summary-only feedback, linear navigation, completion = answer all questions. | Instructor, Instructional Designer | Logic-verified quiz | Configured quiz |
| Internal preview & verification | Run Quiz Preview to confirm layout, navigation, feedback, scoring, and accessibility; fix any anomalies. | Instructor, Instructional Designer | Configured quiz | QA-cleared draft |
| Review and Verification | ||||
| Publish to LMS (sandbox) & route to Instructor (Reviewer) | Publish the draft SCORM to the LMS sandbox/test area and notify the Instructor (Reviewer) for review via the built-in review link. Note: An Instructor Reviewer is a qualified individual other than the course developer or primary instructor, designated to independently review the course content, instructional materials, and assessment tools. | IT Officer | QA-cleared draft | Sandbox quiz link sent to the Instructor (Reviewer) |
| Instructor (Reviewer) review & comment | Test the quiz, add comments on accuracy, alignment, clarity, and technical behavior. | Instructor (Reviewer) | Sandbox link | Comment log |
| Instructional Design revision | Revise the quiz based on Instructor (Reviewer) feedback and update the SCORM build. | Instructional Designer | Instructor (Reviewer) comment log | Revised quiz v1.1 |
| Pilot Testing and Validation | ||||
| Pilot Test Question Bank | Gather at least five (5) participants or trainees who have completed the relevant modules. Review flagged items, and revise or discard items with a difficulty index of < 0.30 or > 0.80 or with a discrimination index of < 0.25; log all changes (see the illustrative item-analysis sketch after this table). | Instructor, Instructional Designer, Pilot Participants | Pilot Test(s); Question Bank | Pilot Test Results |
| Revise & Log Changes | The assessor and developer address the feedback of the reviewer(s) and pilot participants, update questions based on the pilot test results, and log the changes. | Instructor, Instructional Designer | Reviewer Comment Log | Final Question Set |
| Submit to the Training Manager | Submit the revised quiz to the Training Manager for the final pedagogical and compliance check. | Instructional Designer | Revised quiz v1.1 | TM approval request |
| Final publish & version control | Upon TM approval, the Course Developer publishes the validated SCORM to the production LMS, labels the file, and records it in the Version-Control Log. | Instructional Designer, IT Officer, QA | TM-approved SCORM | Live formative quiz (production) + updated version log |
| Upload & archive | Upload the SCORM package, source file, and review logs to the digital repository. | IT Officer | Final assets | Archived package |
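
The difficulty and discrimination thresholds used in the pilot-test step above can be cross-checked outside the LMS with a simple item-analysis routine. The sketch below is provided for illustration only; the data layout, function names, and example figures are assumptions for this example, not part of any LMS export or prescribed tool.

```python
# Illustrative item-analysis sketch (assumed data layout: one entry per trainee
# per item, with 1 = correct and 0 = incorrect).

def difficulty_index(item_scores):
    """Proportion of trainees answering the item correctly (facility index)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores, group_fraction=0.27):
    """Upper-group minus lower-group proportion correct, ranking trainees by total score."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i], reverse=True)
    n = max(1, round(len(ranked) * group_fraction))
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(item_scores[i] for i in upper) / n
    p_lower = sum(item_scores[i] for i in lower) / n
    return p_upper - p_lower

def flag_item(item_scores, total_scores):
    """Apply the pilot-test thresholds above (difficulty < 0.30 or > 0.80, discrimination < 0.25)."""
    p = difficulty_index(item_scores)
    d = discrimination_index(item_scores, total_scores)
    flagged = p < 0.30 or p > 0.80 or d < 0.25
    return {"difficulty": round(p, 2), "discrimination": round(d, 2), "revise_or_discard": flagged}

# Example with five pilot participants (the minimum group size in the procedure above):
item = [1, 1, 0, 1, 0]        # responses to one item
totals = [18, 15, 9, 14, 7]   # total quiz scores used for ranking
print(flag_item(item, totals))
```

In this layout, each item is scored 1 or 0 per pilot participant, and total quiz scores are used to rank participants into upper and lower groups for the discrimination calculation.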
| Step | Details of the Procedure | Resp | Input | Output |
|---|---|---|---|---|
| DESIGN & DEVELOPMENT | ||||
| Extract ILOs | Retrieve Part C of the approved syllabus and list all Intended Learning Outcomes (ILOs). | Assessor | Approved Course Package (Part C) | ILO List |
| Complete ToS | Confirm the Table of Specifications based on the MARINA Standards (topics, thinking level, items / topic). | Assessor | ILO List | Approved ToS v1.0 |
| Create Question Categories | In Question Bank, add categories: Course Name – Topic. | Assessor, IT Officer | LMS | Category Tree |
| Select MCQ Format | Inside chosen category, choose question type “Multiple Choice”. | Assessor, IT Officer | LMS | MCQ Draft |
| Name Standard | Name each item: Course – Topic – ILO No. – Item No. | Assessor, IT Officer | Naming Guide | Item ID |
| Draft MCQ | Write stem + 4 options using MCQ design criteria (ILO alignment, stem rules, option rules). | Assessor | MCQ Guideline | MCQ V1 |
| Save & Version | Save question; verify Version = 1 in Version History. | Assessor | LMS | Item Version Log |
| Build 3× Pool | Repeat the item-drafting steps above (from Select MCQ Format through Save & Version) until each ToS cell has 3× the required number of items (redundancy). | Assessor | ToS | MCQ Pool |
| Notify Reviewer | Notify the reviewer that the draft pool is ready for review. | Assessor, IT Officer | MCQ Pool | Reviewer Notice |
| REVIEW & VERIFICATION | ||||
| Designation of a Review Assessor | Designate an assessor, other than the one who developed the question bank, to review the MCQs. | Training Manager | Notice to Assessor | Acceptance of Assessor |
| Peer Review | Evaluate each item against MCQ criteria (ILO alignment, stem, options). Comment “Valid” or list findings in LMS comment box. | Assessor Reviewer | MCQ Pool | Review Log |
| Document Findings | Document identified items needing correction and input in the comments log. | Assessor Reviewer | Review Log | Review Log |
| REVISION | ||||
| Revise Items if Required | Edit items per the comments and update stems/options; the version number automatically increments on save. | Assessor | Comments | Comments marked “Revised” |
| Version Check | Ensure Version History reflects changes and rationale. | Assessor | LMS | Version History |
| Notify Training Manager | Advise Training Manager that revisions are complete and pilot may begin. | Assessor | Notice | TM Acceptance |
| PILOT TESTING & VALIDATION | ||||
| Create Pilot Quiz | Add a quiz (sandbox): time limit of 1 min/item, grade to pass 70 %, 3 attempts, and completion tracking requiring the passing grade. | IT Officer | LMS | Pilot Quiz |
| Import Items | Add entire pool; enable question & answer shuffle. | IT Officer | MCQ Pool | Pilot Item Set |
| Nominate Trainees | Select a pilot group of at least five (5) trainees and enroll them in the sandbox. | IT Officer | Trainee List | Pilot Roster |
| Run Pilot | Conduct quiz, monitor & record. | Assessor | LMS | Pilot Response Data |
| Analyze Stats | Generate item statistics: Facility Index, Discrimination (r-bis, point-biserial), and Usage (see the illustrative computation after this table). | Assessor, IT Officer | Response Data | Item Stats Report |
| Adjust Items | Revise / drop items below thresholds; update versions & comments. | IDD (Instructional Design Department) | Item Stats | MCQ Final |
| FINALIZATION & APPROVAL | ||||
| Finalize Bank | Lock approved items; map number of questions per topic per ToS; enable shuffle. | Assessor, IT Officer | Final MCQ Set | Live Question Bank |
| Submit for Approval | Set the bank status to “For Approval”; send the link to the Training Manager. | Assessor, IT Officer | Live Question Bank | Approval Request |
| Approve Bank | Training Manager reviews and comments “Approved”; updates Status field. | Training Manager | Live Bank | Approved Bank Log |
| Archive & Publish | Archive prior drafts; export final bank backup; record revision history. | QA Officer | Approved Bank | Archive File; Rev. History |
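
The Facility Index and the r-bis (point-biserial) discrimination statistic referenced in the pilot-analysis step above can likewise be reproduced outside the LMS for verification. The following sketch is a minimal illustration under the same assumed data layout as the formative example; it does not mirror any particular LMS statistics report.

```python
import statistics as stats

def facility_index(item_scores):
    """Facility (difficulty) index: proportion of trainees answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between item correctness (0/1) and total quiz score."""
    p = facility_index(item_scores)
    if p in (0, 1):
        return 0.0  # no variance on the item; discrimination is undefined
    correct = [t for s, t in zip(item_scores, total_scores) if s == 1]
    incorrect = [t for s, t in zip(item_scores, total_scores) if s == 0]
    sd = stats.pstdev(total_scores)
    if sd == 0:
        return 0.0
    return (stats.mean(correct) - stats.mean(incorrect)) / sd * (p * (1 - p)) ** 0.5

# Example: one item answered by a pilot group of five trainees.
item = [1, 0, 1, 1, 0]
totals = [17, 8, 15, 12, 9]
print(round(facility_index(item), 2), round(point_biserial(item, totals), 2))
```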
| Principle | Description |
|---|---|
| Clarity and Focus in the Stem | |
| Developing Effective Options | |
| Avoiding Clues and Bias | |
| Enhancing Cognitive Level | |
| Formatting and Review | |
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Map Assessment Task & Criteria | The developer maps out each practical task and its observable criteria, aligned with ILOs and competence tables. | Assessor | Course ILOs; Competence Tables | Task & Criteria Map |
| Apply Standard Template | Develop a practical assessment tool using the standard template | Assessor | Task & Criteria Map | Draft Assessment (Template) |
| Fill Out Practical Assessment Tool | Fill in the required procedures, assessment task(s), performance criteria, performance standards, and rating scale. | Assessor | Draft Assessment Tool | Developed Assessment Tool |
| Submit for Document Review | Forward the developed assessment tool for document review to verify formatting, terminology, and standards compliance. | Course Developer | Developed Assessment Tool | Standardized Assessment Tool |
| Assign Assessor Reviewer | Assign an SME to review the document for technical and pedagogical soundness. | Training Manager | Reviewer Assignment | Designated Reviewer (SME) |
| Review and Verification | The reviewer (SME) reviews the document, annotates comments in the "Comments" section, and returns it to the Developer. | SME | Standardized Assessment Tool | Review and Verification Form |
| Revise & Log Changes | The developer addresses the comment, revises the document, and logs changes. | Course Developer | Reviewer Comment Log | Revised Assessment Tool |
| Submit to Training Manager | Submit revised assessment tool to the Training Manager for preliminary check of alignment and resource feasibility. | Course Developer | Revised Assessment | Training Manager’s Feedback |
| Select Pilot Test Candidates | In coordination with TM, identify a representative group of trainees and schedule the pilot. | Course Developer, Training Manager | Participant List | Pilot Test Plan |
| Conduct a Pilot Test | Facilitate pilot execution, observing time, clarity, criterion usability, and scoring consistency. | Assessor, Course Developer, Training Manager (TM), Quality Assurance (QA) | Pilot Test Plan | Pilot Testing Forms and Validation Tools |
| Review Pilot Test Results and Feedback | Analyze pilot outcomes; refine tasks, rubrics, and timing. | Course Developer | Pilot Testing Forms and Validation Tools | Practical Assessment Tool |
| Finalize & Document Changes | Integrate all refinements, update version number, and prepare final assessment package. | Course Developer | Practical Assessment Tool | Revised Practical Assessment Tool |
| Submit for Training Manager Final Approval | Present final assessment and supporting documents for formal approval. | Training Manager | Revised Practical Assessment Tool & Logs | Approved Practical Assessment Tool |
| QA Compile & Archive | QA collates all versions, review logs, pilot data, approval memos, and files them for a minimum of five years. | QA | Approved Practical Assessment Tool & Logs | Archived Assessment Package |
The implementation phase refers to the actual conduct and administration of assessments, ensuring alignment with approved training standards and MARINA requirements. This includes scheduling assessments, preparing examination venues or LMS platforms, deploying invigilation procedures, and applying the established grading system.
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Issue Topic-End Instructions | During the course introduction, brief trainees that at the close of each lesson the Formative Assessment must be taken before proceeding to the next topic. Emphasize unlimited attempts, the mastery score, and academic-honesty expectations. | Instructor | Instructor's Instructions | Trainee Understanding |
| Verify SCORM Settings | Confirm the quiz for the topic is already visible in the course, set to Unlimited Attempts and “Highest Attempt” grading, with completion tracking set to Passed (no upload required: the package is pre-loaded). See the illustrative settings check after this table. | IT Officer | Course Page | SCORM Setting |
| Learner Completion (Asynchronous) | Learners launch the SCORM quiz at their own pace, repeat attempts until the mastery score is reached; LMS records every attempt automatically. | Trainee | LMS Credentials | Attempt Records |
| Immediate Feedback & Progress Gate | SCORM package displays item-level feedback; once the quiz status = Passed, the next topic automatically unlocks. | LMS (Auto) | Attempt Data | Topic Unlock |
| Instructor Monitoring | Review SCORM activity report (attempt counts, scores). Identify any learner needing support or any topic showing persistent low first-attempt pass rates. | Instructor | SCORM Report | Support Outreach |
| Record Retention | The LMS automatically stores settings logs, attempt records, and reports. | IT Officer | System Logs | Archived Data Logs |
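
As a simple illustration of the settings verification described above, the sketch below compares a quiz configuration against the formative-assessment policy (unlimited attempts, highest-attempt grading, completion on pass). The setting keys and values are hypothetical; actual LMS field names will differ.

```python
# Illustrative settings check for the topic-end SCORM quizzes described above.
# The setting keys and expected values are hypothetical, not actual LMS fields.
REQUIRED_SETTINGS = {
    "attempts": "unlimited",
    "grading": "highest attempt",
    "completion": "passed",
}

def settings_deviations(quiz_settings):
    """Return the policy settings that the quiz does not currently satisfy."""
    return [key for key, value in REQUIRED_SETTINGS.items()
            if quiz_settings.get(key) != value]

# Example: a quiz left on "last attempt" grading would be flagged before release.
quiz = {"attempts": "unlimited", "grading": "last attempt", "completion": "passed"}
print(settings_deviations(quiz))  # ['grading']
```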
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Preparation of Assessment | ||||
| Verify Quiz Load | Verify that the Summative Quiz is already loaded in the live training course, linked to the approved question bank and Table of Specifications (ToS). | Assessor | Live Course ToS | Quiz Presence Confirmed |
| Check Quiz Settings | Check quiz settings: single attempt (already set) and time-limit exactly as stated in the ToS; ensure question/option randomization is enabled; keep quiz hidden until start. | Assessor | Quiz Setting Page, ToS | Quiz Configuration Log |
| Confirm Hardware Readiness | Confirm the assessment computers and tablets are ready; perform PC and network readiness check, including one spare workstation. | IT Officer | PC and Tablets | Readiness |
| Sign & Archive Config Log | Sign and archive the Configuration Log; leave quiz status “Hidden from students.” | Assessor | Config Log | Approved Config Log |
| Briefing of Assessees |
| Admit & Verify | Admit candidates, verify identity against seating plan, remind them they will log in with their usual LMS credentials once the assessment is shown. | Assessor | Seating Plan | Attendance Sheet |
| Standard Briefing | Deliver the standard briefing covering: assessment objective, pass mark, cheating policy, re-sits, appeals rules, timing, grading, and hardware use. | Assessor | Briefing Slide | Signed Briefing Acknowledgement |
| Collect Signatures | Collect signatures from the trainees. | Assessor | Attendance List | Completed Attendance List |
| Assessment Proper | ||||
| Start Assessment | At start time, manually un-hide the quiz and click “Open attempt for all.” Announce the official start and remaining time checkpoints. | Assessor | Quiz Availability Toggle | Quiz Start Timestamp |
| Invigilate | Invigilate: patrol aisles, monitor CCTV; record any integrity or technical incidents. | Assessor | Invigilation | Completed Invigilation |
| Ensure Submission | Ensure each candidate sees “Assessment Submitted” before exiting the assessment site. | Assessor | Assessment | Completed Assessment |
| Result of the Assessment | ||||
| Auto-Grade & Review | Allow the LMS to auto-grade; manually review any flagged items. | Assessor | Quiz Attempts | Confirmed Grade Book |
| Cross-Check & Save | Cross-check grades against attendance and save results directly in the LMS. | Assessor | Gradebook | Verified Results in System |
| Debriefing of Assessment | ||||
| Announce Results | Immediately after grading, announce total passes and fails to the group. Explain the 48-hour appeals window and the re-sit schedule. | Assessor | Grade Summary | Debrief Session Log |
| Advise on Re-sit | Advise unsuccessful assessees of the mandatory re-sit procedure. | Assessor | Re-Sit Policy | Re-Sit Notice |
| Complete Debrief Report | Complete a short Debrief Report noting issues and improvement actions; file in course records. | Assessor | Gradebook | End of Debriefing |
| Step | Details of the Procedure | Resp | Input / Tool | Output |
|---|---|---|---|---|
| Retrieve assessment pack | Pull Practical-Assessment Package vX.X (exercise script, assessor checklist, scoring rubric, risk assessment) from Examination Master File. | Assessor | Master File | Printed package |
| Facility & equipment check | Verify simulator settings / equipment functionality; perform safety inspection; log in Lab / Sim Readiness Checklist. | Lab Tech / Safety Officer | Simulator console; PPE | Signed readiness checklist |
| Candidate schedule & notice | Post roster with assessment slots, PPE requirements, and pass criteria (Competent / Not Yet Competent). | Course Admin | Timetable board | Notice posted |
| Identity & PPE check | Confirm photo ID, attendance; inspect PPE (or issue center PPE); brief on safety rules & assessment scoring. | Assessor | ID cards; PPE list | Attendance sheet |
| Safety & task briefing | Explain scenario objectives, time limits, stop command, and fail-safe procedures; answer procedural questions only. | Safety Officer / Assessor | Exercise script | Briefing record |
| Launch scenario | Start simulator/exercise timer; observe from control station or designated area. | Assessor | Simulator controls | Timer log |
| Observe & score | Mark each criterion on Practical-Checklist (C = Competent, NYC = Not Yet Competent); note time stamps and critical errors. | Assessor | Checklist form | Completed checklist |
| Safety oversight | Intervene only for unsafe act; pause scenario, apply corrective action, record in Incident Report if required. | Safety Officer | Emergency stop | Incident report (if any) |
| Verbal debrief | On completion, give short factual feedback (“task complete”, “missed valve isolation”); advise result will be posted after QA. | Assessor | Checklist | Debrief note |
| Result decision | If all critical criteria = C, mark Competent; else NYC. | Assessor | Checklist | Score sheet |
| QA verification | QA Officer samples ≥ 10 % of checklists & video (if recorded) for scoring consistency; countersigns the sample. | QA Officer | Checklists; video | QA sign-off memo |
| TM approval | Training Manager reviews result summary and incidents; signs Practical TCROA section within 24 h. | TM | Result summary | Signed TCROA |
| Post results | Publish Competent/NYC list on LMS or notice board; outline re-assessment slot (max 2 attempts). | Course Admin | TCROA data | Result bulletin |
| Handle appeals | Receive the appeal form within 5 days; Appeals Panel reviews video/checklist; issue a decision memo. | Appeals Panel | Appeal form; video | Appeals decision |
| Item analysis | Aggregate KPI: pass rate, average time, frequent errors; log in Practical-Stats Sheet. | QA | Checklists; timer log | Stats sheet |
| Improvement actions | Curriculum team reviews stats; adjust scenario or training content; version-up assessment pack to vX.X+1. | Course Developer / Assessor | Stats sheet; feedback | Updated package |
| Archive documentation | File checklists, sign-offs, videos, incident & stats in /Assessments/Practical///vX.X/; retain ≥ 5 years. | Document Controller | All records | Archived folder |
| Equipment reset | Return simulator/equipment to neutral state; log maintenance needs. | Lab Tech | Reset checklist | Maintenance log |
This section outlines the grading criteria for assessments, how final course grades or outcomes are determined, and the policies for re-examination (re-sits) and appeals of assessment results. Consistent and transparent grading practices are critical for fairness and for compliance with standards that require clear pass/fail criteria to be defined in the assessment procedures.
| Assessment Component | Weight | Passing Requirement | Attempts Allowed | Additional Rules / Notes |
|---|---|---|---|---|
| Formative Assessments (topic-end quizzes, exercises) | 0 % (non-bearing)¹ | Completion of every formative item | Unlimited Attempts | Completion unlocks the next topic. The instructor reviews low-scoring items and schedules refreshers before the summative exams. |
| Summative Theoretical Assessment (written / computer-based) | 50 % of the final grade | ≥ 75 % overall score² | 1 regular sitting + up to 2 authorized re-sits | A trainee who still fails after the final authorized re-sit must complete a formal refresher course before reassessment (refer to the Re-sits Policy). Passing this component is mandatory before the trainee may attempt the Summative Practical Assessment. Item weights follow the Table of Specifications; results recorded in the TCROA. |
| Summative Practical Assessment (performance tasks) | 50 % of final grade | Competent in all critical tasks³ | 1 regular sitting + task-specific re-assessment (if feasible) | Binary rating per task: Competent / Not Yet Competent. Any “Not Yet Competent” on a critical task ⇒ component Fail. May only be taken after passing the Summative Theoretical Assessment. Conducted with checklists & rubric; results logged in TCROA. |
| Final Course Grade | 100 % | Summative Theoretical ≥ 75 % AND Summative Practical = Competent | — | Both summative components must be passed; otherwise, the overall grade is a FAIL. Formative completion is a prerequisite to sit the summative assessments. |
¹ Formative assessments carry no percentage weight but are compulsory gateways.
² The 75 % minimum pass mark is fixed.
³ “Critical tasks” correspond to mandatory MARINA/STCW competence elements and must all be passed in one session or during the allowed reassessment.
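
As a worked illustration of the grading rules above, the sketch below determines the final course result from a theoretical percentage and per-task practical ratings. The function name, data structure, and the treatment of the practical component as full credit once all critical tasks are rated Competent are assumptions made for this example, not a prescribed computation.

```python
# Illustrative determination of the final course result under the rules above.
# Assumed inputs: formative completion status, theoretical score (percent), and
# practical critical-task ratings ("C" = Competent, "NYC" = Not Yet Competent).
THEORY_WEIGHT = 0.50
PRACTICAL_WEIGHT = 0.50
THEORY_PASS_MARK = 75.0  # fixed minimum pass mark for the theoretical component

def final_course_result(formative_complete, theory_pct, practical_tasks):
    if not formative_complete:
        return "NOT ELIGIBLE: complete all formative assessments first"
    if theory_pct < THEORY_PASS_MARK:
        return "FAIL: theoretical component (re-sit required before the practical)"
    if not all(rating == "C" for rating in practical_tasks.values()):
        return "FAIL: practical component (re-assessment of Not Yet Competent tasks)"
    # Both components passed; a weighted figure may be recorded for reporting.
    # The practical component is binary, so it is treated here as full credit.
    final_pct = THEORY_WEIGHT * theory_pct + PRACTICAL_WEIGHT * 100.0
    return f"PASS: final grade {final_pct:.1f}%"

print(final_course_result(True, 82.0, {"launch survival craft": "C", "don PPE": "C"}))
print(final_course_result(True, 70.0, {"launch survival craft": "C", "don PPE": "C"}))
```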
| Policy Element | Standard Provision | Implementation / Notes |
|---|---|---|
| Eligibility for Re-sit | A trainee who fails any final summative component (written or practical) may re-take the failed component only. | New but equivalent paper or fresh practical scenario; instructor debriefs weaknesses (no question disclosure). |
| Number of Attempts | Maximum three (3) sittings in one exam cycle: 1 initial + 2 authorized re-sits. | After the second re-sit (third failure), the cycle ends; no further immediate tests are conducted. |
| Required Remedial Training | Mandatory refresher or course re-enrolment before a new exam cycle can start. | Ensures added learning before more attempts. |
| Timing / Scheduling | A same-day re-sit is allowed; re-sits must occur within 1 year of the initial failure. | A same-day re-take depends on the availability of personnel and facilities; practical re-sits depend on facility and assessor availability. |
| Fees | Re-sits may carry an additional fee as per the schedule or approved internal policy. | The fee policy is disclosed in the Course Info Sheet. |
| Component-Specific Re-sit | Only the failed component is repeated; the passed component stands. | TCROA marks the pass; the failed part is flagged “pending”. |
| Recording & TCROA Notation | All re-sits logged; TCROA flags 2nd or 3rd-attempt passes (e.g., asterisk). | Supports audits & data analytics. |
| Exam Integrity & Security | Re-sits follow identical proctoring rules; new question sets preserve integrity. | May use the same or a different qualified assessor. |
| Special / Marginal Cases | No oral supplementation to push borderline scores; competence proven via formal re-sit. | Guarantees fairness. |
The institution recognizes the right of a trainee to appeal an assessment result if they believe it was unjust or in error. An appeal is a formal request by a candidate to review and reconsider their assessment outcome.
Applies to all summative assessment components (theoretical and practical) conducted by the Maritime Training Institution. Covers grievances about grading errors, out-of-scope questions, assessor/proctor conduct, or significant procedural irregularities.
The following constitute acceptable grounds for lodging a formal appeal. Appeals that do not fall within these categories may be dismissed unless exceptional circumstances are demonstrated.
| Acceptable Ground | Example |
|---|---|
| Grading Error | The correct answer marked wrong; miscalculation of total score. |
| Out-of-Scope Question | The item tests content not included in the course syllabus (Part C) or the official learning materials. |
| Bias / Improper Conduct | Evidence of assessor or proctor prejudice, intimidation, or favoritism. |
| Procedural Irregularity | Disturbance, equipment failure, or breach of exam rules that materially affected performance. |
| Administrative Error | Wrong candidate identity used, incorrect exam version issued, or data-entry mistake. |
| Step | Details of the Procedure | Resp | Input | Output |
|---|---|---|---|---|
| Informal Clarification | The trainee discusses the result informally with the Instructor within two (2) working days of the result's publication. | Trainee, Instructor | Assessment Result | Clarification Provided |
| Lodge Formal Appeal | If unresolved, the trainee submits a written Appeal Letter to the Training Manager (TM) within five (5) working days of the result release, specifying the grounds and evidence. | Trainee | Appeal Letter | Appeal Log Entry |
| Acknowledge & NDA | TM acknowledges receipt within 1 day, issues an Appeal Acknowledgement, and has the trainee sign an NDA covering the confidentiality of assessment materials. | Training Manager | Appeal Log Entry | Appeal Acknowledgement: Signed NDA |
| Constitute the Appeals Panel | TM forms a 3-member Appeals Panel (TM or delegate as Chair, an independent SME/assessor, and the original assessor, unless there is a conflict). | Training Manager | Appeal Log; Availability Roster | Panel Assignment Memo |
| Evidence Collection | The panel gathers answer sheets, scoring rubrics, LMS logs, video recordings, incident reports, the syllabus, and the Table of Specifications (ToS). | Appeals Panel | Assessment Records | Evidence Dossier |
| Review & Deliberation | The panel reviews evidence related to the appeal grounds and interviews the staff involved as needed. | Appeals Panel | Evidence Dossier | Draft Decision |
| Decision & Report | The panel finalizes the outcome (Upheld/Denied) and recommends remedial action; completes the Appeals Review Report, signed by all members. | Panel Chair | Draft Decision | Signed Appeals Review Report |
| Update Records | If the appeal was upheld, adjust scores, TCROA, gradebook; if denied, no change—record decision in QA Appeals Log. | QA Officer, Training Manager | Appeals Review Report | Updated Records |
| Communicate Outcome | TM issues formal Decision Letter to trainee within five (5) working days of appeal receipt, including brief rationale and next steps (resit eligibility, etc.). | Training Manager | Appeals Review Report | Decision Letter |
| Implement Remedial Action | If the panel orders a retest or voids the exam, schedule accordingly; waive fees if the fault lies with the institution. | Instructor, QA | Decision Letter | Scheduled Retest / Corrected Grade |
| Archive Documentation | File Appeal Letter, NDA, Evidence Dossier, Appeals Review Report, Decision Letter in secure QA archive for 5 years. | QA | Appeals Documents | Archived Appeals File |
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Select & Assign Invigilators | The Training Manager may designate the course supervisor or another assessor to lead each summative assessment, and assigns additional trained staff so that written assessments have at least one invigilator per 20–24 candidates and practical assessments have dedicated invigilation coverage (plus a safety aide if needed). | Training Manager | Assessment schedule; faculty list | Confirmed invigilator roster |
| Brief Invigilators on Confidentiality & Impartiality | Before every session, invigilators declare any conflict of interest and are reminded that they must not assist candidates or perform unrelated tasks while on duty. | Training Manager, Assessor | Confidentiality & conflict-of-interest declarations | Cleared invigilators |
| Conduct Pre-Exam Room & Material Checks | Invigilators inspect desks, walls, equipment, and waste bins for hidden notes or devices, remove any unauthorized materials, and lay out spare pens, attendance sheets, and incident forms. | Invigilator | Assessment venue; checklist | Cleared, ready assessment room |
| Verify Candidate Identity & Admission | At the door, assessor and invigilator/s match each candidate’s photo ID and admission slip against the roster; unresolved cases are referred to the Training Manager before entry. | Invigilator | ID cards, admission slips, roster | Authenticated seating list |
| Announce Rules & Secure Personal Items | Invigilators read the standard rules aloud, instructing candidates to switch off and surrender their smart devices, place their bags in the designated area, and keep only allowed materials on their desks. | Invigilator | Rules script; storage bins | Candidates briefed; items secured |
| Log Candidate Attendance | Once seated, invigilators take attendance on the official sheet and note late arrivals; latecomers are admitted only within the first 15 minutes and receive no extra time. | Invigilator | Attendance sheet | Completed attendance record |
| Maintain Active Invigilation | Invigilators position themselves to view all candidates, rotate unpredictably, and discreetly observe answer sheets or screens for copying patterns while avoiding actions that could be construed as coaching. | Invigilator | Seating plan | Continuous supervision |
| Detect & Deter Prohibited Behaviors | Invigilators watch for whispering, wandering eyes, crib sheets, and devices; they may re-seat suspects, confiscate materials, or accompany any candidate who must briefly leave the room. | Invigilator | Surveillance observations | Real-time intervention |
| Intervene & Document Incidents | In cases of confirmed cheating (e.g., use of phones, pre-written answers), the invigilator terminates the exam, collects evidence, records details in the Invigilation Incident Report, and discreetly removes the candidate. | Invigilator; Assessor | Incident form; seized items | Completed Incident Report; secured evidence |
| Collect Scripts & Rough Work | At “time’s up,” invigilators collect all answer scripts and rough paper, count them against the attendance sheet, and seal them in the tamper-evident envelope. | Invigilator | Answer scripts; envelope | Secured exam package |
| Ensure Post-Exam Confidentiality | Invigilators hand-seal scripts to the Assessor, shred surplus rough paper, and refrain from discussing questions publicly; they countersign the transfer log to confirm chain of custody. | Invigilator; Assessor | Transfer log | Protected exam content |
| Review Incidents & Impose Sanctions | The Training Manager reviews all Incident Reports, interviews parties if needed, and applies misconduct penalties (fail, bar on immediate re-sit, or dismissal). MARINA is notified of grave cases, such as impersonation. | Training Manager | Incident Reports, policy matrix | Sanction decision; MARINA report (if required) |
| File Records | All attendance sheets, incident reports, and sanction decisions are archived in secure storage (both electronic and hard copy) for a minimum of 5 years, in accordance with the Records Retention Procedure. | Records Officer | Exam documents | Archived records |
To safeguard the integrity of all assessments by preventing, detecting, and addressing cheating in a consistent, fair, and traceable manner.
Applies to every formative and summative assessment—onsite, distance-learning, or practical—administered under the Learning Management System.
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Deploy Preventive Measures | Prior to any exam, the IT Officer enables the secure exam settings, randomizes question and answer order, and geofences LMS access. The Training Manager ensures that the correct assessment versions are uploaded and that seating plans provide adequate spacing. | IT Officer; Training Manager | Exam bank; seating chart | Hardened assessment environment |
| Honor Statement | Each candidate affirms the honor statement before the assessment, committing not to give or receive unauthorized assistance. | Invigilator | Honor statement | Commitment |
| Identity & Permit Verification | Invigilators check photo IDs and admission slips at the door; any mismatch results in temporary denial of entry until resolved with the Training Manager. | Invigilator | ID card, roster | Authenticated seating list |
| Personal-Item Control | Invigilators instruct candidates to power off and surrender smart devices, store bags in the designated area, and keep only transparent pencil cases and approved materials on their desks. | Invigilator | Storage bins | Secured exam hall |
| Active Proctoring & Detection | Invigilators rotate positions, observing for whispering, answer-pattern mirroring, or frequent lap gazes (which may indicate possible phone use); they re-seat suspects and accompany any candidate who must leave the room. | Invigilator | Seating plan | Real-time deterrence |
| Confiscation & Evidence Collection | On discovery of unauthorized notes, devices, or impersonation, the invigilator discreetly confiscates the item, notes the time, the candidate's name, and the question involved, and photographs digital evidence if feasible. | Invigilator | Seized items | Evidence packet |
| Immediate Sanction Decision | Possession of devices or pre-written answers triggers automatic exam termination; a first-time copying attempt earns an on-paper warning, but repetition leads to termination. | Assessor | Incident details | Candidate dismissed or warned |
| Incident Reporting | The invigilator completes form SEAV-ASM-C01, attaches the evidence, and submits it to the Assessor for countersignature; the candidate is then given the opportunity to provide a brief written statement. | Invigilator; Assessor | Report form | Signed Incident Report |
| Post-Exam Chain of Custody | Invigilators collect all rough sheets, seal answer scripts, shred scrap paper, and log transfer of materials to the Records Officer. | Invigilator; Records Officer | Scripts; log | Secured exam package |
| Management Review & Penalty | The Training Manager reviews the incident, consults the policy matrix, and decides on the penalty (fail, re-sit bar, remedial training, or expulsion). For grave cases, they prepare the MARINA notification within five (5) working days. | Training Manager | Incident Report | Sanction decision; MARINA letter |
| Records Retention & KPI Logging | The Records Officer archives all related documents (both hard copies and digital) for a minimum of five years and logs the incident in the KPI dashboard. | Records Officer | Incident dossier | Archived file; updated KPI |
| Continuous Improvement | IT Officer analyses LMS logs for unusual IP patterns; Chief Assessor runs answer-similarity checks; findings inform updates to invigilator training and preventive controls in the next quarterly review. | IT Officer; Chief Assessor | LMS logs; similarity report | Updated controls & training |
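
The answer-similarity check mentioned in the continuous-improvement step above can be approximated with a simple pairwise comparison of selected options. The sketch below is a hypothetical illustration only, not the institution's prescribed detection tool; any flagged pair would still require manual review and corroborating evidence before a misconduct finding.

```python
from itertools import combinations

# Assumed data: each candidate's selected options per MCQ item, keyed by candidate ID.
responses = {
    "T-001": "ABDCABADCB",
    "T-002": "ABDCABADCB",   # identical answer pattern to T-001
    "T-003": "CBDAABBDCA",
}

def agreement(a, b):
    """Proportion of items on which two candidates selected the same option."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Flag pairs whose agreement exceeds an illustrative threshold for manual review.
THRESHOLD = 0.90
flags = [
    (c1, c2, round(agreement(responses[c1], responses[c2]), 2))
    for c1, c2 in combinations(responses, 2)
    if agreement(responses[c1], responses[c2]) >= THRESHOLD
]
print(flags)  # e.g., [('T-001', 'T-002', 1.0)]
```

A more rigorous analysis would weight agreement on incorrect options more heavily than agreement on correct ones; this sketch only illustrates the basic flag-for-review step.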
To ensure that all on-site LMS assessments are taken only from approved computer laboratories or classrooms, using IP-range locking and invigilator oversight to match in-person security.
Applies to every formative or summative exam delivered through the Learning Management System.
| Step | Description | Resp | Input | Output |
|---|---|---|---|---|
| Whitelist Venue IPs | IT enters the lab’s static IP range (e.g., 192.168.12.0/24) in “Quiz access → Require network address” (see the illustrative check after this table). | IT | IP list | Updated LMS config |
| Publish Geo-Locked Exam Window | Training Manager schedules the quiz (e.g., 10:00–12:00 PHT) and tags it “On-Site Only.” | Training Manager | Timetable | The exam is visible in the LMS calendar |
| Workstation Readiness Check | Thirty (30) minutes before start, the Invigilator logs in on each PC to verify LMS recognizes “Location Verified.” | Invigilator | PCs | Cleared lab |
| Identity & Admission | At the door, the Invigilator matches the photo ID to the roster and seats the candidate at a verified PC. | Invigilator | IDs; roster | Attendance sheet |
| Active Monitoring | Invigilator patrols aisles; SEB enforcement is in effect. Any IP hop alert is escalated to IT. | Invigilator | SEB dashboard | Secure exam run |
| Geo-Fail Handling | If LMS wrongly blocks a PC, IT quickly re-checks IP; if unresolved, candidate swaps to spare workstation. | IT; Invigilator | Error screen | Candidate seated |
| Close & Archive | After “time’s up,” Invigilator seals scripts; IT exports IP logs; Records Officer stores both for ≥ 5 yrs. | Invigilator; IT; Records Officer | Scripts; log files | Archived dossier |
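
Where the LMS enforces the network-address restriction natively (as in the whitelisting step above), no custom code is required; the sketch below merely illustrates, using Python's standard library, how a client address is tested against an approved venue range such as 192.168.12.0/24. Any additional ranges shown are assumptions.

```python
import ipaddress

# Venue whitelist based on the example range cited above; other ranges are illustrative.
APPROVED_RANGES = [ipaddress.ip_network("192.168.12.0/24")]

def is_onsite(client_ip: str) -> bool:
    """Return True if the client address falls inside an approved venue range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in APPROVED_RANGES)

print(is_onsite("192.168.12.47"))   # True:  workstation inside the lab range
print(is_onsite("203.0.113.10"))    # False: outside address, exam access refused
```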
| Step | Details | Resp | Input | Output |
|---|---|---|---|---|
| Remote Site Approval | Candidate submits address + ISP details ≥ 5 days pre-exam; TM issues approval letter and whitelists IP/GPS. | Training Manager | Request form | Approval letter |
| SEB Profile Distribution | IT emails the SEB configuration that locks the exam to the approved GPS radius (±250 m) and blocks VPN/proxy use (see the illustrative distance check after this table). | IT | GPS coords | SEB-config file |
| 24-h Location Test | Candidate runs SEB “Pre-check”; system uploads GPS/IP. Failure auto-generates support ticket. | Candidate; IT | SEB | Pass/fail log |
| Identity & 360° Room Scan | At exam start, Remote Proctor confirms photo ID and records room sweep; mismatch cancels exam. | Proctor | Webcam feed | Location video log |
| Continuous Geo-Monitoring | LMS & proctor tool flag GPS drift > 250 m, face-off-camera, or IP change; proctor pauses exam to investigate. | Proctor | Live dashboard | Incident alert |
| VPN/Proxy Auto-Block | Firewall rejects known proxy IPs; candidate sees “VPN Detected—Disconnect.” | IT | Proxy DB | Connection blocked |
| Geo-Fail Emergency | If the block appears legitimate, the Proctor initiates a live call; the candidate shows the outside view and current conditions of their location for verification; if validated, IT issues a temporary unlock token and records the incident on form SEAV-GEO-F01. | Proctor; IT | Error screen | Unlocked session; form |
| Close & Evidence Archive | Proctor ends session, uploads video; IT exports GPS/IP logs; RO archives video+logs for ≥ 5 yrs. | Proctor; IT; Records Officer | Recordings; logs | Archived dossier |
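
The ±250 m radius and GPS-drift alerts above imply a straightforward great-circle distance test. The sketch below shows one way such a check could be expressed; the coordinates, function names, and threshold handling are illustrative assumptions rather than features of the proctoring tool.

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres
ALLOWED_DRIFT_M = 250     # approved radius around the registered exam location

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def gps_drift_alert(approved, current):
    """Return True when the live GPS fix drifts beyond the approved radius."""
    return haversine_m(*approved, *current) > ALLOWED_DRIFT_M

approved_site = (14.5995, 120.9842)   # hypothetical registered exam location
live_fix = (14.6021, 120.9850)        # hypothetical reading during the session
print(gps_drift_alert(approved_site, live_fix))  # True: drift exceeds 250 m
```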