Plans, Purposes, Missions, Goals, Objectives, & Outcomes
Program Goals and Objectives
Methods and Measures
Outcomes and Results
Plans, Purposes, Missions, Goals, Objectives, & Outcomes
The implementation of assessment strategies begins with an assessment plan. Assessment plans are linked to the purposes, mission, goals, and objectives of the University, as well as the specific College or Program being assessed. Assessment plans generally have five components:
- Identify objectives and outcomes
- Establish methodologies to assess the achievement of outcomes
- Gather and analyze the evidence by implementing the methodologies
- Share the results of the analysis
- Make evidence-based improvements as needed.
Source: NIU Assessment Plan (1994-2003)
Other important aspects of an assessment plan, as outlined by Central Michigan University (1994), are that the plan should:
- flow from the institution's mission
- lead to institutional improvements
- be implemented according to a timeline
- have faculty ownership/responsibility
- have institution-wide support
- use multiple methods and/or measures
- provide feedback to students and the institution
- lead to improvement
Source: Institutional Assessment Plan: Central Michigan University (1994)
The first step in writing an assessment plan is to identify the purpose of assessment. Each component within a curriculum exists for some purpose. By writing a purpose statement, we make that purpose explicit to others and begin to identify what is important to include in the assessment plan. Without a clear idea of what our target is, it will be difficult to determine how close to the mark our performance comes.
The purpose statement should describe what the academic program is, what it does, and for whom it does it. It should summarize any specific educational approach or philosophy and any important values. It should also clearly establish its relationship with both the department’s and College’s mission statements. A student or stakeholder reading the statement should be able to identify how the program contributes to the education and careers of students and how it supports the department’s and College’s missions.
A mission statement describes the goals regarding teaching and learning. It should state values and philosophies and be understandable to new students and persons outside the field / college. In order to develop a mission statement, existing mission statements from both the University and individual colleges should be examined. Ultimately, the mission statement should identify the overall goals regarding teaching and learning which are specific to the discipline but which also are consistent with the goals of the College and the University.
The following is NIU’s Mission and Scope Statement. This can be referenced when building a department mission statement.
Recognizing that students will need to learn throughout their lives, the university provides them with the opportunity to become more competent in analytical thought, informed judgment, and effective communication and to develop an appreciation for the life of the mind. In its instructional activities, the university conveys an understanding of the organization of knowledge and the means of inquiry. It aims to develop a respect for rationality, a tolerance for ambiguity, and an appreciation of diversity. It fosters the capacity to explore the unfamiliar, to use the intellect in the process of discovery and the synthesis of knowledge, and to become familiar with new technology and its implications. It strives to enhance the imagination, sensibility, and creative talents of each student. It believes that all students should attain a level of academic and professional competence sufficient for productive employment and citizenship and that many students should be able to undertake the advanced study required for leadership in their chosen professional fields and academic disciplines. Click here for more details on NIU's Mission Statement.
All NIU colleges have their own mission statement. Here are select examples. Referencing the college’s mission statement will be helpful when building department mission statements.
College of Business
The College of Business provides higher education and professional outreach in northern Illinois, participates in relevant activities at the national level, and is committed to raising students' awareness about the implications of international and global business operations. The college offers outstanding educational programs and academic services to students, faculty, business, and other stakeholders. The college endorses a balance between teaching and research while engaging in partnerships with the business community. The college focuses its efforts to continually improve the quality of instruction and scholarship to advance its disciplines and the success of all learners.
College of Education
The College of Education, in its leadership role in the state and region, prepares professional educators who are responsive to the needs of constituencies and audiences with whom they interact. State-of-the-art preparation addressing current societal needs is a symbol of the responsibility the college takes for its students. Among the societal needs to which the college is dedicated are issues associated with the interface of technology and the infusion of cognitive and affective dimensions related to a multi-cultural, pluralistic society. The college's intent is that its graduates will be responsible masters of change and adaptation and be responsive to the needs of an ever-changing society. The College of Education, in its focus on excellence, promotes a professional development school relationship with a local school district(s) which will be able to involve all of the college's disciplines as well as partnerships with appropriate agencies, business, and industry.
College of Liberal Arts & Sciences
The College of Liberal Arts and Sciences fosters the generation, dissemination, and preservation of knowledge as the foundation of a liberal education. The mission of the college is to provide high-quality education that contributes to the intellectual growth, self-discovery, and enhanced expertise of all members of the university community. The college makes available to the widest possible audience the rich cultural and scientific legacy represented by the disciplines that make up the liberal arts and sciences. Because bodies of knowledge do not exist in isolation, the college promotes interdisciplinary inquiry and is committed to the integration of teaching, scholarship, and service. The research and scholarship in the college permeate teaching and service, generating a wide range of opportunities for faculty and students to work together in transmitting, expanding, and applying knowledge. The college programs are designed to serve the university, its students, and the residents of the region, the country, and the world. These programs link basic and applied research and scholarly endeavors to the interests and needs of individuals and society.
Additional examples of well-written mission statements have been compiled by Western Carolina University, accessed from their Handbook for Program Assessment (pg 21).
Examples of Well-Defined Program Mission/Purpose Statements:
“The mission of the civil engineering program is to prepare students for professional engineering and management positions in all phases of civil engineering projects. The program will provide a broad educational background with a foundation in basic engineering and business principles. These basic skills will be complemented by advanced topics in engineering design, management, finance, computer applications, and real world civil engineering experiences throughout the Baccalaureate Degree program.”
(Department of Civil Engineering, Western Kentucky University)
“The MBA Program within the College of Business at James Madison University emphasizes excellence and continuous improvement in graduate learning by stressing knowledge, technical and interpersonal and experiential skills in the development of managerial decision making. The MBA Program seeks to serve part-time students who are full-time working professionals in the Shenandoah Valley and Piedmont Region of Virginia. The MBA Program also serves other professionals through contract programs utilizing on-site and distance learning methodologies.”
(MBA Program, James Madison University)
“Our mission is to foster an intellectual community, based on critical inquiry, that explores the human condition while enabling students to develop the capacity to “think like an economist,” thereby providing the skills necessary for meaningful work, citizenship, and leadership.”
(Department of Economics, Texas Christian University)
“The Department of Political Science offers strong major and minor programs sharing a focus on public concerns. We are committed to providing our students with the tools and competence to succeed in their lives, their graduate education and their careers by instilling academic rigor, information access and research skills, dedication to life-long learning and respect for diversity in cultures, nations and institutions of democracy.”
(Department of Political Science, James Madison University)
Program goals can be included either as part of the purpose statement or separately. The purpose of having goal statements is to clearly communicate the direction and aspirations of the program to faculty, students, and stakeholders. Commonly, programs specify their goals in relation to their major functions, such as instruction, student learning, research, and service. Program goal statements provide impetus for planning and continuous improvement efforts. Primary among program goals should be those pertaining to student learning.
Goals differ from objectives in that goals represent what the program is striving to achieve in the long term, and they tend to be written in broader and more inclusive language than objectives. They state what the program wants to accomplish or become over time. Goals also provide the basis for a set of specific (observable and measurable) student learning outcomes. For example, statements of learning goals might begin with:
- Gain an understanding of…
- Become aware of…
- Develop an appreciation for…
Performance objectives, on the other hand, flow from program goals. Objectives are brief, clear statements of learning outcomes of instruction that are subsets of each program goal. Objectives use more precise terms and should focus on the students rather than the curriculum. Thus, objectives such as "completing a course" or "writing a thesis" are unacceptable. It is essential to write objectives with action verbs and statements rather than vague terms such as 'know' and 'understand.'
Bennion suggests that specific performance objectives offer the best available means to measure the achievement of educational goals. A performance objective is distinguished from a goal by three characteristics: (1) specifying the student behavior that will be accepted as evidence that the objective is met, (2) describing the conditions under which the behavior is expected to occur, and (3) specifying the criteria of acceptable performance that the student must attain.
For example, given the goal: “The student will understand criminal law,” the following performance objectives might be stated:
A) Given a case study of crime, the student will identify whether it is a robbery, burglary, or theft.
B) Given a case study of a crime, the student will determine who is the principal in the first degree, the principal in the second degree, accessory before the fact, and accessory after the fact.
Access Bennion’s complete paper on writing performance objectives and relating performance objectives to program goals.
Additional word phrases for writing objectives can be found here. These were originally presented by Indiana Wesleyan University.
Concrete action terms turned into learning objectives could look like this:
- Specify the conditions under which certain actions are to be performed
- State the minimum criteria for successful completion of the program or course.
Excerpt adapted from:
ASU Guidelines For The Assessment Of Learning Outcomes (Pages 11-13)
The primary difference between goals and performance objectives is that goals are intended to provide general information and thus are not as measurable, while performance objectives indicate concrete measurable outcomes. Performance objectives are developed from departmental and course goals. Goals are the lens through which faculty can focus attention on specific requirements of a program. Objectives provide the means to focus on specific requirements of a department or course: they facilitate the selection of course content, teaching techniques or strategies, and assessment procedures.
Excerpt adapted from: ASU Guidelines For The Assessment Of Learning Outcomes (Pages 11-13)
To help solidify the understanding of program goals and performance objectives, here is a goal/objective pyramid presented by the University of Connecticut. The pyramid image is chosen to convey that increasing complexity and specificity are encountered as one moves downward. The pyramid structure also reinforces the notion that learning flows from the mission of the institution down to the units of instruction.
As stated earlier, goals should focus on the general aims or the broad purposes of the program and curriculum. Objectives should be brief, clear statements of learning outcomes and should flow from the goals. Here are some examples:
Goal 1: To educate students in the basic methods and philosophy used to conduct scientific research, particularly in the Earth Sciences. Graduates from this degree program should be able to:
- Objective 1.1. Use the scientific method to organize and conduct research.
- Objective 1.2. Demonstrate knowledge of the information resources available in the Earth Sciences such as scientific journals, geologic databases, and internet resources.
- Objective 1.3. Be able to collect original data using field techniques and archival material.
- Objective 1.4. Apply quantitative methods to solve problems, analyze data and formulate models.
- Objective 1.5. Develop the ability to work independently and collaboratively in teams to solve open-ended questions.
Goal 2: To help students develop effective oral and written communication skills. Graduates from this degree program should be able to:
- Objective 2.1 Effectively disseminate technical findings and conclusions by means of written reports in the format used in professional/technical writing.
- Objective 2.2 Organize and give professional oral presentations.
- Objective 2.3 Use maps, three-dimensional diagrams and other earth imagery to summarize findings and display them to a range of different audiences.
Excerpt adapted from: Guide to Outcomes Assessment of Student Learning: Why Outcomes Assessment? California State University, Fresno
A checklist for writing goals and objectives was created by Indiana Wesleyan University. This checklist can be a helpful resource.
- Is it Integrated with the Mission of the University? Does it fit with the Institutional Strategic Plan?
- Will it Benefit the University Community?
- Can it Be Measured?
- Is it Reasonable/achievable?
- Does it Relate to Students’ Needs?
- Is it Output Driven?
- Is it Documented?
- Is it Specific?
- Is it Simple and Easy to Understand?
- Is it Focused on the Purpose of the Program?
- Does it Improve Performance?
- Is it Cost Effective?
- Does it Use Current Technology?
- Is There a Time/date Established for Goal Completion?
- Is it Written with Assessment and Evaluation in Mind?
When referring to methods and measures, method specifically means the type of data that will be collected (e.g., survey, exam, portfolio, capstone project), and measure specifically means the actual instrument used to collect the data (e.g., NIU's alumni survey, Train and Assess IT for testing technology skills, LiveText portfolios, and thesis capstone projects). Both methods and measures are essential determinations in making assessment plans.
In preparing an assessment plan, the data collection process needs to be determined. This involves selecting the methods and measures to assess each objective (for objectives, see prior section). Some objectives may be more important than others, so selecting the most important objective(s) to measure is imperative. No single method or measure is perfect, and many experts believe that good assessment requires the use of multiple methods or measures.
There are two types of assessment methods: direct and indirect.
Direct methods of assessment involve students' displays of knowledge and skills. Thus, students' learning is evaluated through their performance (e.g., test results, written assignments, presentations, classroom assignments). Direct measures are often more time-consuming to collect and evaluate than indirect methods; however, direct methods can be more closely tied to the identified learning objectives.
Examples of direct assessment methods include:
- Behavioral Observations
- Capstone Projects
- Comprehensive examinations
- Course-embedded assignments (e.g., tests, papers, reviews, presentations)
- Dance or Music Productions
- Essay tests (blindly scored by faculty across department, division, or college)
- National accreditation exams
- Oral exams
- Performance appraisals
- Practicum / Internship evaluations
- Thesis or Dissertation Projects
- Tests and Examinations (Local/Faculty designed or Commercial/Standardized tests)
- Video and Audiotaped Performances
Indirect methods of assessment gather data that are not based directly on academic or work performance. Instead, indirect methods assess perceptions of learning. For example, students' self-reports about their progress in learning, the experiences to which they attribute their learning, and how they feel about what they know would exemplify indirect assessment. Indirect measures are less objective, but they are easier to collect and provide important information about the extent to which the program is meeting student or employer needs. Surveys are thus the most common indirect assessment method.
Examples of indirect assessment methods include:
- Alumni surveys
- Archival Records
- Continued scholarly success of graduates
- Curriculum and Syllabus Analysis
- Employer surveys
- Exit interviews
- External Reviewers
- Focus groups
- Licensure Statistics
- Placement records of graduates
- Student Satisfaction surveys
- Transcript Analysis
You can view all the assessment methods currently being used by NIU departments via our Methods Matrix.
For additional Methods resources, see the Examples of Best Practices in the Good Practice Section of the Manual. This section provides case examples of how specific methods have been effectively implemented.
Above information adapted from:
Assessment Techniques. California State University, Fresno
University of Wisconsin-Madison, Assessment Manual, Chapter 6: Assessment Instruments and Methods Available for Assessing Student Learning in the Major
Western Carolina University, accessed from their Handbook for Program Assessment (pgs 33-34).
- Formative assessments are on-going assessments, reviews, and observations in a classroom.
- Teachers use formative assessment to improve instructional methods and student feedback through teaching and learning processes (Wiliam & Black, 1996).
- One way of conceptualizing formative assessments is as "practice."
- Summative assessments are given to assess what students do and do not know about a particular learning topic.
- They measure the level of success or proficiency that has been obtained at the end of an instructional unit.
- Examples include final exams, state assessments, end-of-unit chapter tests, and benchmark assessments.
- Course-Embedded Assessment can have a variety of advantages, including purposeful reexamination of course objectives, sequencing, and content, as well as feedback.
- This allows instructors to redesign assignments and give clearer direction to students about what is expected.
Course-Embedded Techniques Applied to Formative and Summative Assessment
- Course-embedded assessments may be formative as well as summative.
- They can be used to evaluate the development of student skills and provide feedback (formative), and they can also evaluate the final student product (summative) (MSU.edu, 2012).
- For example, one could use the sentences that students wrote to sum up a lecture (formative) or a final graded essay (summative) as units of a course-embedded assessment.
For more information as well as References, please see "Course-embedded Assessments" in the glossary.
The Methods Matrix is an Excel document created by the Office of Assessment Services to track the methods of assessment used at NIU. The Methods Matrix is broken into direct and indirect assessment methods by degree program. View the Methods Matrix.
The instruments used to score student performance are often lumped under the broad title of 'rubrics.' In actuality, there are several types of scoring instruments: rubrics, performance lists, and checklists. Each technique has its advantages and disadvantages, yet the most important aspect of scoring student performances is to make sure you are using an instrument to standardize the scoring process (whether it is a rubric, performance list, or checklist). You may be wondering: is there a best instrument? Yes. True rubrics are ideal, and this section will help distinguish between rubrics, performance lists, and checklists. Examples of well-constructed scoring instruments will also be provided.
Below is Morningside College’s “A Rubric - by any other name?” that distinguishes ‘true’ rubrics from other scoring tools.
One of the problems endemic to assessment is that many of the learning outcomes we are trying to assess are complex. Whenever we are asked to make judgments about something, we have criteria that we use. Sometimes, however, we can't always clearly describe what those criteria are. As a result, there can be confusion when two or more individuals are judging the same thing. A good example of this is critical thinking. Even though it is a common learning outcome, it is also one that faculty find difficult to assess. One reason for this difficulty is that the measurement of complex things tends to be more subjective, as different individuals often have varying ideas about what is being measured and what is appropriate evidence that it has occurred. If, however, the individuals involved can come to a mutual agreement on what important elements should be evident in student products or performances, then more consistent assessment can occur, since the performance criteria do not vary among those involved. Performance criteria are the guidelines, rules, or principles by which the work will be judged. They describe what to look for in order to determine the quality of the work. For those whose work is being assessed, it makes performance expectations more public and less mysterious. This is where rubrics come in.
What is a rubric?
Rubrics are instruments that attempt to make subjective measurements as objective, clear, consistent, and as defensible as possible by explicitly defining the criteria on which performance or achievement should be judged. They are devices for organizing and interpreting data gathered from observations or learning artifacts (papers, products, etc.) of student learning. Rubrics are designed to allow for the differentiation between levels of achievement, or development, by communicating detailed information about what constitutes excellence.
What is the difference between a checklist, a performance list and a rubric?
It is sometimes unclear what the differences are between a checklist, a performance list and a rubric. A checklist is an instrument in which the required elements of a performance or product are listed and a score is assigned based on whether the element is present or not. They are useful devices for assessing simple performances or achievement in which the individual elements being assessed typically involve dichotomous types of judgments. For example, I may use a checklist that includes an element like the one listed below if I were assessing a student’s ability to apply a Gibney basket weave tape job for a lateral ankle sprain.
- Tape anchors applied to shin and foot: Yes (1 pt) No (0 pt)
Notice that this checklist element does not address the concept of quality of the work and does not easily inform the rater what to do with partial performances. In the example above, which word should be circled if the student only applied an anchor to the shin and not to the foot, or applied anchors that were too tight, too loose or wrinkled?
As we increase the complexity of the outcome being addressed, the next level of instrument could be considered to be a performance list. Like checklists, performance lists outline the elements to be addressed. Unlike checklists, performance lists include a quality dimension by incorporating some kind of scaled scoring system. For example, one of the items on my performance list for assessing a student’s writing sample might be:
Performance list item:
- Appropriate spelling and grammar is used 1 (poor) 2 (satisfactory) 3 (excellent)
Performance lists allow for more flexibility in scoring by varying the point values used in the scale (1 to 3, 1 to 5, etc.) and by allowing a weighting of the elements. However, the performance judgments still allow for a great deal of subjectivity, as the criteria by which scores are selected are not clear. Even though the example above includes single-word descriptors to clarify what the numbers represent, it is still not clear how the rater will distinguish between a 1 and a 2, or a 2 and a 3. As a result, the device itself adds an element of inconsistency to the measurement. This is not to say that performance lists should not be used; they are useful for assessing somewhat simple products or performances.
The difference between a performance list and a rubric is the degree to which the elements and performance levels are described. In order for the scoring of the performance product to be as objective, clear, consistent and defensible as possible, the performance criteria must clearly describe the essence of what is being assessed and what level of quality is associated with each score. For a simple example, a faculty member wants to include a peer assessment of group work as one measure of a group project. She leads the class in a discussion about what good group work should look like and they settle on eleven performance criteria. One criterion is participation in group problem solving.
Below are three examples showing the differences between a checklist, a performance list, and a rubric for assessing participation in group problem solving.
Checklist: Participates in group problem solving Yes / No
Performance list: Participates in group problem solving 1 (unsatisfactory) 2 (tolerable) 3 (satisfactory) 4 (outstanding)
Rubric: Participates in group problem solving:
4 (Outstanding): Actively looks for and suggests solutions to problems
3 (Satisfactory): Does not actively look for solutions; participates in the refining of solutions suggested by others
2 (Tolerable): Does not suggest solutions; does not refine solutions suggested by others; is willing to try out solutions suggested by others
1 (Unsatisfactory): Does not try to solve problems; does not help others solve problems; unwilling to try solutions suggested by others; does not provide any assistance
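The contrast between the three instruments can also be sketched in code. The following is a hypothetical illustration (not part of the handbook): each instrument is modeled as a minimal data structure, making visible how much guidance each one gives a rater. The criterion text for the rubric levels is taken from the example above.

```python
# Hypothetical sketch: the three scoring instruments as data structures.

# Checklist: a dichotomous judgment -- the element is present or it is not.
# No notion of quality, and no guidance for partial performances.
def score_checklist(element_present: bool) -> int:
    return 1 if element_present else 0

# Performance list: a scaled score, but each point carries only a
# single-word descriptor, so raters must interpret the scale themselves.
PERFORMANCE_SCALE = {
    1: "unsatisfactory",
    2: "tolerable",
    3: "satisfactory",
    4: "outstanding",
}

# Rubric: every score is tied to an explicit description of the quality
# of work it represents, which is what keeps ratings consistent across raters.
RUBRIC = {
    4: "Actively looks for and suggests solutions to problems",
    3: "Participates in the refining of solutions suggested by others",
    2: "Is willing to try out solutions suggested by others",
    1: "Does not try to solve problems or help others solve them",
}

def describe_rubric_score(score: int) -> str:
    """Return the explicit performance criterion tied to a rubric score."""
    return RUBRIC[score]
```

Read top to bottom, the structures add progressively more explicit criteria: the checklist encodes none, the performance list encodes a labeled scale, and the rubric encodes a full description for each level, which is the defining feature of a true rubric as described above.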
NIU’s Toolkit “What is a Rubric” article
Winona State University maintains an extensive list of rubrics for various disciplines, skills, and college-level assignments, including examples from many campuses. Note the link you can use to suggest additional examples.
Bowling Green State University has additional links to assessment rubric information.
Bridgewater State College provides several examples of rubrics used at their college.
State University of New York rubrics for campus-based general education assessment.
Rubrics in Student Affairs [.ppt] provides information and resources from the Director of Student Affairs Research and Assessment at North Carolina State University
California Polytechnic State University provides a resource for preparing to use assessment data. The following segment comes from their Learning Outcomes Assessment Planning Guide.
Using evidence gathered in assessment
Specify procedures for analyzing and interpreting the evidence gathered in assessment. It may be useful at this stage to form a small work group. Determine beforehand what form the raw data will be in for your work group to analyze. Pay particular attention to maintaining the anonymity of personal identifiers in the data. Prior to scoring assessments, determine any performance expectations.
If you are utilizing multiple assessment instruments, review the results for related parts that directly address your program goals and educational objectives. Is there a relationship between the findings? Are they consistent, inconsistent, or at opposite ends of the spectrum? Use the data to pinpoint the areas of your program that are achieving program goals, as well as the areas that warrant change for improvement.
Identify the means by which information that results from assessment can be used for decision-making, strategic planning, program evaluation and program improvement.
How, exactly, will your data be used to help with program planning and improvement? Will your program form a committee to review assessment findings, and make recommendations for change or improvement in a timely manner? Will your entire department convene to discuss assessment results and program changes? Who will make formal recommendations for curricular or other changes—the chair/head? The committee?
The final aspect of assessment is the use and dissemination of results. The University of Arkansas at Little Rock shares how they use their assessment results:
With this step, we come to the final piece of the assessment puzzle. The success of an assessment plan ultimately rests on whether or not it provides you with the information needed to make informed decisions about your course’s or program’s future. Assessment findings must be usable and used for plotting the future of your curriculum. Here are some things to keep in mind when writing this section of your progress report.
Assessment writers often talk about the importance of the assessment feedback loop. The underlying idea is that results provide feedback that leads to decisions, which in turn lead to new goals and objectives for a given curricular program. Assessment plans that do not incorporate a feedback loop are seen as failures, no matter how much data is gathered or how psychometrically meticulous that data may be.
For information to be used, it must be communicated. How was the information passed on to those who need it to inform their decision-making? Was it communicated in a way that was transparent to the user? Assessment results should never be used to lay blame on individuals, but rather should be communicated in ways that will lead to ever improving program quality.
What conclusions can be drawn from the data? What changes, if any, have been or will be made? This could include a revision of the objective or the assessment method to be used in the future. What improvements have been made based on assessment findings? How will the findings be used to make decisions about curriculum and instruction? Will your target for the coming assessment year change? How was feedback about the assessment results and program improvement communicated internally and externally? You should include how faculty, students and other stakeholders were involved in decision-making based on the results.
To download examples of NIU assessment results in report form, please see the following:
- Alumni Survey
- Annual Assessment Update Results (bottom of page)
- First Year Composition Report (right side of page)
- University Writing Project Report (right side of page)
This chapter provides some guidance on the things to consider as you analyze and interpret assessment data. It is also designed to walk you through the process of defining an assessment report in terms of audience and needs, formatting the data for effective presentation, and distributing and sharing the results of your work.
Chapter excerpt from Western Washington University's Tools and Techniques for Program Improvement: handbook for program review and assessment of Student Learning.
Quick handout with questions to answer when reporting assessment results and evaluating an assessment process.
Excerpt from Appendix 8-A, Western Washington University's Tools and Techniques for Program Improvement: handbook for program review and assessment of Student Learning. Entire handbook can be viewed here.