MTS

  1. State the purpose of a testing program.
    Effectively assess the trainee’s achievements
  2. State the Roles and Responsibilities for a testing plan:
    NETC
    NETC N7
    COMMANDING OFFICER
    DIRECTOR OF TRAINING
    LEARNING STANDARDS OFFICER
    CCMM
    CURRICULUM DEVELOPER
    LEARNING SITE CO
    LEARNING SITE TESTING OFFICER
    COURSE SUP
    PARTICIPATING ACTIVITIES
    • NETC: provides policy and guidance
    • NETC N7: provides oversight of policy and guidance and monitors centers.
    • COMMANDING OFFICER: serves as CCA, manages sites, resolves differences, oversees test development.
    • DIRECTOR OF TRAINING: ensures testing is conducted, oversees development of testing plans.
    • LEARNING STANDARDS OFFICER: provides guidance to curriculum developers, monitors TQI and TIA. Approves KTAG and PTAG.
    • CCMM: approves test design, maintains the master test bank.
    • CURRICULUM DEVELOPER: designs and develops the testing plan, administration guides, and tests.
    • LEARNING SITE CO: approves the testing plan, designates the Testing Officer and Course Sup.
    • LEARNING SITE TESTING OFFICER: administers tests, oversees grading, secures tests, maintains the test bank.
    • COURSE SUP: ensures, monitors, and validates test administration, test security, and TIA.
    • PARTICIPATING ACTIVITIES: provide comments, feedback, and new test items; maintain tests and TIA.
  3. State the primary course source data for creating test items.
    • JDTA
    • OCCSTDS
    • CTTL/PPP Table
    • COI
  4. List course source data to be used when the primary course source data is not available or hasn’t been created.
    Data elements from a combination of OCCSTDS, CTTL, PPP, and COI.
  5. Define the following tests:

    -Formal
    -Informal
    • Formal: a test; used in computing the final GPA
    • Informal: a quiz or homework; not used in the final GPA
  6. Define the three proficiency levels contained within each:

    SKILL and KNOWLEDGE
    • SKILL
    • Level 1- Imitation
    • Level 2- Repetition
    • Level 3- Habit

    • KNOWLEDGE
    • Level 1- Knowledge/Comprehension
    • Level 2- Application/Analysis
    • Level 3- Synthesis/Evaluation
  7. List the five categories of performance and knowledge tests.
    • Pre-test- validates material
    • Progress- tests blocks of instruction
    • Comprehensive Test- within-course or final
    • Oral Test- a board assesses the trainee
    • Quiz- short test assessing achievement of recently presented material
  8. Discuss the process of piloting a test.
    Piloting assesses a test's reliability and validity before formal use.
  9. Describe the use of each test instrument as it relates to knowledge and performance tests:
    -Job Sheet
    -Problem Sheet
    -Assignment Sheet
    -Multiple Choice
    -True/False
    -Matching
    -Completion
    -Labeling
    -Essay
    -Case Study
    -Validation of Test Instruments
    • Job Sheet: directs step-by-step performance of a task
    • Problem Sheet: presents problems requiring analysis and decision making
    • Assignment Sheet: directs study or homework
    • Multiple Choice: the most versatile item type
    • True/False: provides only two answer choices
    • Matching: two lists of connected words, phrases, pictures, or symbols
    • Completion: supply the missing information from memory
    • Labeling: recall facts and label parts
    • Essay: a written response to a question
    • Case Study: analysis of a complex issue
    • Validation of Test Instruments: confirms test instruments measure what they were constructed to measure
  10. What are the two types of testing methods used in training?
    • Criterion-Referenced: measures whether a specific skill or knowledge standard is met
    • Norm-Referenced: estimates skill or knowledge relative to a group
  11. Discuss the test failure policies and associated grading criteria within your learning environment.
    • Test
    • Re-Train
    • Re-Test
    • Highest recordable re-test score: 80%
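The re-test grading rule above (highest recordable re-test score of 80%) can be sketched as a simple cap. This is an illustrative reading of the policy; the function name and the use of a float score are my assumptions, not NETC terminology.

```python
# Illustrative sketch of the re-test grading rule: the highest score
# recordable for a re-test is 80%. Function name is an assumption.

RETEST_CAP = 80.0

def recorded_retest_score(raw_score):
    """Cap a re-test score at the maximum recordable value (80%)."""
    return min(raw_score, RETEST_CAP)

print(recorded_retest_score(95.0))  # a raw 95 is recorded as 80.0
print(recorded_retest_score(72.0))  # scores below the cap are unchanged
```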
  12. Discuss how skill learning objective criticality is determined during performance test design.
    • *Rank order of objectives*
    • High=3, Moderate=2, Low=1
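The rank-ordering above (High=3, Moderate=2, Low=1) can be sketched as a weighted sort. The weight scale comes from the notes; the objective names are made up for illustration.

```python
# Sketch of rank-ordering skill objectives by criticality weight.
# Weights (High=3, Moderate=2, Low=1) are from the testing-plan notes;
# the example objectives below are hypothetical.

WEIGHTS = {"High": 3, "Moderate": 2, "Low": 1}

objectives = [
    ("Don firefighting ensemble", "High"),
    ("Log maintenance action", "Low"),
    ("Trace system lineup", "Moderate"),
]

# Sort objectives from most to least critical.
ranked = sorted(objectives, key=lambda o: WEIGHTS[o[1]], reverse=True)
for name, level in ranked:
    print(WEIGHTS[level], name)
```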
  13. Discuss how knowledge learning objective criticality is determined during knowledge test design.
    The knowledge required to perform the task provides the information for determining learning objective criticality.
  14. Identify the ten sections of a testing plan.
    • Course Data
    • Course Roles and Responsibilities
    • Course Waivers
    • Test Development
    • Test Administration
    • Course Test and Test Types
    • Grading Criteria
    • Remediation
    • Test and Test Item Analysis
    • Documentation
  15. State the purpose of test and test item analysis.
    To determine the statistical validity and difficulty of tests and test items.
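Test and test item analysis commonly computes an item difficulty index (proportion of trainees answering correctly) and a discrimination index (how well the item separates high scorers from low scorers). The formulas below are the standard ones; the function names, group sizes, and thresholds are illustrative assumptions, not NETC procedure.

```python
# Sketch of test item analysis: difficulty and discrimination indices.
# Standard formulas; names and example figures are illustrative.

def item_difficulty(responses):
    """Proportion of trainees answering the item correctly (p-value).

    responses: list of booleans, True = correct answer.
    """
    return sum(responses) / len(responses)

def item_discrimination(upper_correct, lower_correct, group_size):
    """Simple discrimination index: difference in correct answers
    between the upper and lower scoring groups, per group member.
    """
    return (upper_correct - lower_correct) / group_size

# Example: 20 of 25 trainees answered correctly -> difficulty 0.8
p = item_difficulty([True] * 20 + [False] * 5)

# Top group: 9 of 10 correct; bottom group: 4 of 10 -> D = 0.5
d = item_discrimination(9, 4, 10)
print(p, d)
```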
  16. Discuss the primary and secondary goals of a remediation program.
    • Primary goal: motivate and assist trainees
    • Secondary goal: remove barriers to learning
  17. Discuss the three methods of remediation available to instructors:
    • Targeted: reviews material during normal classroom time
    • Scalable: covers a major portion of the course
    • Iterative: an SME engages the trainee one-on-one
  18. Define the following sections of a remediation program:
    • Re-Test: may cover a portion of the test or the entire test.
    • Setback: decision based on the student's degree of difficulty.
    • Drop from Training: the student is clearly unsuitable for training.
    • Counseling: used at A and C schools for personal and performance problems.
    • ARB:
    • **Convened when other remediation has failed**
    • -Course average falls below the minimum
    • -Unable to achieve objectives after counseling
    • -Performance is below expected academic progress
    • -Fails to achieve the objectives after an academic setback
Author: aoanstokes
ID: 192009
Card Set: MTS
Description: Master Training Specialist
Updated