Instruction Design Evaluation
Formative evaluation
Goals Review
After the needs analysis and audience analysis were completed, a set of goals was written. To assess those goals, the company’s managers reviewed them to determine whether any needed to be changed or eliminated before the sessions were designed. The goals were also reviewed by experts in the field of Instructional Design and by colleagues working on similar designs.
The questions asked for this evaluation were the following:
To Managers
1. Do these goals follow the company’s philosophy for training employees?
2. Do these goals fulfill your expectations of what the learners will be able to do at the end of the sessions?
3. Would you add, change, or eliminate any of these goals after reviewing the needs analysis and audience analysis provided?
To the ID expert and colleagues
1. Are these goals well written?
2. Do you believe these goals are valid, reliable, and practical?
Expert Reviews:
ID experts and colleagues reviewed each step of the design during the first phase of the project. As mentioned earlier, they reviewed the goals; they also gave constructive feedback on the learning assessment and the content outline.
The questions asked to the ID experts and designer colleagues were the following:
1. Are the examples, practice exercises, and feedback realistic and accurate?
2. Is the instruction appropriate for the target learners?
3. Is the material well organized?
4. Are the assessment questions valid and reliable?
5. Do the assessment questions line up with the goals of the sessions?
By using these two evaluation processes, I was able to write more accurate, tangible, and achievable goals and objectives for the sessions. The ID experts and designer colleagues also suggested making the assessment more engaging by using videos and examples that connect the theory with the learners’ day-to-day activities at work.
Up to this point, the goals review and expert reviews have already been completed.
Field Trial/Pilot
Because this program will first be tried with 10 employees of CSCI, we will conduct a pilot, which can also be considered a field trial, before the program is used with other CSCI employees or with other companies interested in the sessions.
Role of the designer in the pilot: the designer will be strictly an observer of the sessions.
Role of the facilitator in the pilot: the facilitator will teach the sessions as they would any other lesson, following the content outline as closely as possible, and will provide as much feedback as possible at the end of the program.
Role of the learners in the pilot: the learners will participate in the sessions as if they were the real sessions and will provide as much feedback as possible at the end of the program.
After the pilot is delivered, the learners will receive the following questionnaire:
Questionnaire modified from Smith, 2012.
The facilitator will also receive a questionnaire, as shown below:
Questionnaire modified from Smith, 2012.
The feedback gathered from the learners and the facilitator will be used to modify the design. The results will then be presented to the managers, who will decide whether the program is ready to be offered to other employees at CSCI. Some modifications that might need to be addressed include the following:
I expect the learners to ask for:
· Examples that relate more closely to their roles in the company.
· Longer sessions, or shorter sessions offered more often during the week.
As for the facilitator, I expect to address the following area per their request:
· Longer sessions, or shorter sessions offered more often during the week.
Summative evaluation
This evaluation will be conducted during the actual instruction sessions. For the complete summative/program evaluation, please follow this link:
http://workingeffectivelyinateam.weebly.com/program-evaluation.html
Reference List
Smith, P. (2012). Instructional Design. Wiley Higher Ed. Kindle Edition.