Summary of Mid-cycle Assessment Plan and Status Reports Review
The following summarizes programs’ use of data for continuous improvement in the mid-cycle assessment status report to the University Assessment Panel.
|Year||Program||Changes Made||Evidence Motivating Changes Made||Impact of Changes on Student Learning||UAP Feedback for Continuous Improvement|
|2021-2022||Computer Science, B.S.||There is strong potential to improve the current assessment plan by leveraging methods and processes that already exist. Revisiting and updating the tools to align more directly with stated SLOs will help the program identify specifically how students are performing on these outcomes: where gaps exist and which skills are strong in the B.S. program.|
|2021-2022||Computer Science, M.S.||It is encouraging to know that reports are shared in a place where all faculty can access them and that multiple individuals are involved in the discussions for recommended changes. There are references to improvement actions undertaken, but the link between those actions and assessment results is presently unclear.|
|2021-2022||Computer Science, Ph.D.||The program faculty and staff’s comments and questions during the UAP discussion were authentic, meaningful, and productive not just for their own program but for all academic programs. Please use the discussion to review the Ph.D. assessment plan: focused language for student learning outcomes (Bloom’s taxonomy); intentionality of SLO coverage by each course in the curriculum map; formative and summative assessment methods that provide evidence of student achievement on listed outcomes during the program as well as at the end; and use of assessment data for actions to address gaps and strengths. Please contact AAE staff as needed during your program’s internal discussion and assessment planning.|
|2021-2022||Electrical Engineering, M.S.||1. Considering increasing exposure to literature survey in the program (SLO2)
2. Considering increasing exposure to research early in the program (SLO3)
|1. Ability to apply advanced design processes to engineering problems was slightly lower than the others (SLO2).
2. Ability to conduct independent research was also lower than other areas (SLO 3).
|Once the assessment team meets to discuss the results, it would help to document the next steps it plans to take as a result of what it observed (i.e., the SLO assessment results). This documentation can be used to help keep track of the process as the program determines whether the changes it makes have the desired impact of improving student learning. Demonstrating how the program closed the loop by assessing its improvement actions in future reporting would be the ultimate use of the assessment process.|
|2021-2022||Environmental Studies, B.A./B.S.||1. As a result of several challenging attempts to report on progress toward the original SLOs with the assessment methods, updated SLOs and assessment measures were adopted in 2020.||1. The assessment measures were not directly tied to the SLOs used in 2012 and 2016.||The program plans to continue to share its results among faculty. The next step will be for the program to discuss as a team what types of actions are needed (or not needed) in response to the results, that is, how it will close the loop based on the assessment data it has collected.|
|2021-2022||Geology, M.S.||1. Additional training was instituted in GEOL 501 and other elective courses. More frequent meetings between students and their thesis committees also improved students’ abilities.
2. We are working on student learning outcomes and assessment methods that are both flexible for a more expansive department and also aspirational; some outcomes (e.g., Manipulation of Data) need to be revised.
|1. A specific group of students consistently scored ~30% lower than the other students in the program. To be clear, those students still met expectations, but their scores were statistically lower than those of students from other research groups.
2. These students are poorly evaluated under our current metrics or receive minimal feedback on their data collection and evaluation. This will become more common as GEOL merges with GAS and the ways in which we look at data become even more varied.
|This program seems to be moving toward closing the loop and is beginning to look at disaggregating data. Great points were brought up to make us all think about things like incentivizing assessment activities for faculty. It sounds like the department is doing the best with what it has and has also made the most sense of the data that have been collected. It’s impressive to see so many years/semesters of data in aggregation. A challenge is to devise ways to incorporate the qualitative feedback from faculty.|
|2021-2022||Geology, Ph.D.||1. The graduate committee and select faculty are working on new language and assessment tools. Additional class time will be spent on how different areas within the newly merged unit collect and evaluate data, and on how to properly discuss the way in which a student collected their data.
2. A new assessment form is being developed to better align with the expectations of employers for Ph.D. students
|1. Students are poorly evaluated under our current metrics or receive minimal feedback on their data collection and evaluation.
2. Surveys of employers have a low response rate, and many responses are difficult to codify in a meaningful way.
|This program seems to be moving toward closing the loop and is beginning to look at disaggregating data.|
|2021-2022||Nonprofit and NGO Studies, B.A./B.S.||1. Some terms are explained differently in different disciplines, so we will weave these concepts more intentionally through our required coursework. (SLO1)
2. Considering revising the SLO and exploring other ways to capture information about community engagement. (SLO2)
|1. Below-target scores on the definition of terms.
2. The NNGO 495 retest assesses retention of knowledge of terms but does not capture direct experience of community engagement.
|Despite having limited data which constrained analysis and conclusions, it was clear from the presentation that program improvement actions and assessment of these actions are being planned. A more systematic analysis of the data (especially when the data collected are more complete) will support better insights and more deliberate conclusions for ‘closing the loop’ actions.|
|2022-2023||Law, J.D.||As noted in the report, little data has been collected so far, so any inferences or related actions seem premature at this point; more data will allow for them. Right now, the focus is on getting the assessment system implemented and seeing how well it works.|
|2022-2023||Geography, M.S.||1. Required plans with advisors
2. New course (EAE 501)
3. Create/modify assessment tools/process (i.e. more rigorous assessment plan)
|1. Needed to reduce the time to thesis/degree completion because it exceeded the two-year funding window
2. Students need more support in the design of research projects (SLO3)
3. Lack of data due to unclear assessment tools and low survey response rates
|It’s great to know that the department has an assessment coordinator that prepares the collected results and feedback received each year for discussion by department faculty. It is clear that the program is collecting information, despite some gaps in the data explained in the report. It appears to be reviewing, analyzing, and making use of data they have for improvement. The program intends to revisit its assessment processes. Consider consulting with AAE for any questions about this feedback or additional assistance the program needs.|
|2022-2023||Geography, B.A./B.S.||1. Considering streamlining projects, especially with new data analytics, so that students can be better prepared in this outcome (SLO 3).
2. Began increasing the number of required writing assignments and providing more direct feedback on research projects. 300-level courses are integrating more writing and poster/presentation assignments, and 300- and 400-level courses give closer attention to developing students’ writing, graphic, and oral communication skills through formal guidelines (rubrics) shared with students (SLO 4).
|1. Lower scores in analyzing geospatial information (SLO 3)
2. Need to improve ability to convey information through maps and other products through oral and written communication skills (SLO 4)
|It’s great to know that the department has an assessment coordinator that prepares the collected results and feedback received each year for discussion by department faculty. It is heartening to read that AAE’s annual comments on assessment reports are reflected upon each year. Based on the summary provided, it seems that data for each SLO are thoughtfully reflected upon and often action is taken to address areas of concern. It is clear that there is a good culture of assessment, despite some gaps in the data explained in the report. The program intends to revisit its assessment processes. Consider consulting with AAE for any questions about this feedback or additional assistance the program needs.|
|2022-2023||Geography, Ph.D.||1. Discussing ways to incorporate literature searches and utilization into a wider range of core classes (SLO 1)
2. Creation of a new course (EAE 501) beginning in the fall semester of 2023 (SLO 1 & 2).
3. Creating new, rigorous assessment plan to collect additional data
|1. Students needed more support with critically evaluating and utilizing knowledge from the literature (SLO 1)
2. Additional practice needed with communicating ideas, information and scholarship orally and in written form (SLO 2)
3. Lack of data due to low employer and student survey response rates
|It’s great to know that the department has an assessment coordinator that prepares the collected results and feedback received each year for discussion by department faculty. It is clear that the program is collecting information, despite some gaps in the data explained in the report. The program appears to be reviewing, analyzing, and making use of the data they have for improvement. The program intends to revisit its assessment processes. Consider consulting with AAE for any questions about this feedback or additional assistance the program needs.|
|2022-2023||Meteorology, B.S.||1. Plan to modify the targets to reflect expectations given a student's time in the program. Also plan to add more intensive computer programming and data analysis courses. (SLO2, SLO3)
2. Creating new assessment plan focusing less on external feedback.
|1. Students performed lower earlier in the program (missing targets), but improved later in the program to meet the targets (SLOs 2 and 3).
2. Gap in data due to chair turnover and low survey completion rates.
|It’s great to know that the department has an assessment coordinator that prepares the collected results and feedback received each year for discussion by department faculty. It is heartening to read that AAE’s annual comments on assessment reports are reflected upon during these discussions. Based on the summary provided, it seems that data for each SLO are thoughtfully reflected upon and often action is taken to address areas of concern. It is clear that there is a good culture of assessment, despite some gaps in the data explained in the report. The program intends to revisit its assessment processes. Consider consulting with AAE for any questions about this feedback or additional assistance the program needs.|
|2022-2023||Geology and Environmental Geosciences, B.S.||1. Plan to incorporate more basic data analysis in classes early in our sequence to help the department bridge the gap that currently exists between outcomes in the core classes and later capstone courses (SLO5).
2. Revisiting assessment tools and revising/creating a new, rigorous assessment plan with built-in faculty incentives.
|1. Students needed additional support with the collection, application and manipulation of data (SLO5)
2. Data gaps (e.g. low employer and student survey response rates)
|It’s great to know that the department has an assessment coordinator that prepares the collected results and feedback received each year for discussion by department faculty. It is encouraging to see that AAE suggestions from annual reports are considered. It is clear that there is a good culture of assessment, despite some gaps in the data explained in the report. The program intends to revisit its assessment processes. Consider consulting with AAE for any questions about this feedback or additional assistance the program needs.|
|2022-2023||Special Education, M.S.Ed.||1. Faculty have suggested several changes to the SLOs and student and program level targets as well as methods.||1. Although still within the “meets standards” range, candidates have more difficulty with collecting and interpreting data and designing progress monitoring tools.||This section mostly discussed a number of changes and updates made due to changes with the standards over the past several years, such as changes to the curriculum, which led to changes in the SLOs and assessment methods. It would be helpful to make connections between conclusions drawn from assessment data and changes made, or to note that the data did not indicate a need for action.|
|2022-2023||Early Childhood Education, M.S.Ed.||1. Further consideration will be given to the ways that we teach students to collect and use data in early childhood settings (SLO 3).||1. Students needed additional support in the areas of collection, analysis, and use of data to design instruction and learning environments (SLO 3).||This section mostly discussed a number of changes and updates made due to changes with the standards over the past several years, such as changes to the curriculum, which led to changes in the SLOs and assessment methods. It would be helpful to make connections between conclusions drawn from assessment data and changes made, or to note that the data did not indicate a need for action.|
|2022-2023||Philosophy, B.A./B.S.||1. Modified some SLOs to equally distribute student and course representation in the data collected 2. All sections of PHIL 301 and PHIL 495 returned to a face-to-face modality (2021-2022). At the start of the fall 2022 semester, faculty teaching PHIL 495 will notify their students of their expectations for the capstone essay, meet with capstone students to discuss their writing, and provide students with feedback on drafts of their writing. 3. Modifying the course assessment and adding a tutor (SLO II.5).||1. Disproportionate data for some of the outcomes 2. Written communication declined when courses moved online (SLO 1) 3. Average student performance fell below departmental expectations in SLO II.5 (construct derivations to prove the validity of arguments).||2. It is too early to tell. Data were collected from the face-to-face sections of PHIL 495 in 2021-2022; nearly all the students who completed PHIL 495 in that year had taken the preparatory course, PHIL 301, during the previous year, when it was fully online. If student performance in analytical essay writing rebounds, it would be reasonable to conclude that the decline seen over the reporting period was a temporary aberration caused by the shift to online teaching and learning.||It is evident that the program is reviewing its results thoroughly and making appropriate changes to its assessment plan, curriculum, and overall assessment process in response to these results. Overall, the program does a good job of closing the loop.|
|2022-2023||Philosophy, M.A.||1. New course materials were created to ensure student success (PHIL 505)||1. Graduates fell just short of departmental expectations which requires students to identify the metalogical relationships among the concepts of validity, consistency, logical truth, and logical equivalence (SLO 1) (formal logic)||The program is clearly reviewing its results thoroughly and making appropriate changes to its assessment plan, curriculum and overall assessment process in response to these results. Improvements in assessment data and on-time graduation were noted in response to curricular, pedagogical, and learning outcome adjustments.|
|2022-2023||Chemistry, M.S.||1. TA effectiveness institute is now mandatory for our TAs. We also require our new MS students to attend CITL workshops – this addresses SLO 2b.
2. We have just created a new accelerated BS/MS program (4+1) that should allow some of our top undergraduates to get into the MS program and finish it quickly and should improve the average quality of students in our MS program and address SLO 1a.
|1. There is zero tolerance for violations such as not wearing safety goggles in the lab; unfortunately, violations of the rules were discovered 1-2 times every semester.
2. One area where our incoming students struggle is in SLO 1a. The level of preparation (background chemistry knowledge) of our MS students has gone down.
|It is not entirely clear how these conclusions follow the chain from learning outcomes, to data on meeting targets, to areas of strength and areas needing improvement, to plans to address the areas needing improvement. Clarifications throughout the assessment plan, along with tweaks to how the data are presented (and how they are collected, if necessary), would help optimize use of the data and/or make it clear how data and performance targets are being used to determine student strengths and weaknesses in specific knowledge and skill areas.|
|2022-2023||Chemistry, Ph.D.||1. Additional recruitment methods to ensure students are more prepared at entry. 2. Modify the assessment plan to include additional data collection for SLO 3e and 4b.||1. Incoming students struggled with background chemistry knowledge (SLO 3a).
2. Additional data collection methods are needed to fill gaps (e.g. SLO 3e, students will conduct research and 4b, students will mentor undergraduates).
|The program is to be commended for using assessment evidence to address needs in the area of communicating information from multiple sources. Like the M.S. program, the results associated with student evaluations of TAs indicate that the quality of TAs is an issue and more training is needed. Do the results suggest whether the lower-than-desired ratings are due to content knowledge, teaching practices, and/or other factors? It would be helpful to see a detailed report by item or question topic in order to answer this question. If it is due to knowledge and application of teaching practices, then the approach of responding with more teaching training seems appropriate. If it is “insufficient quality of our TAs,” that suggests to me an issue with content knowledge, which is noted when it comes to performance on the ACS. Are students performing more poorly in the same knowledge areas on other non-test measures prior to the dissertation? I would encourage the program to report formative data such as this (if it exists) to provide evidence for, or nuance to, this assertion and help inform what improvement actions might address the needs of the graduate students in the program.|
|2022-2023||Nutrition, Dietetics and Wellness, B.S.||1. The program will revert to the prior method of assessment used in NUTR 200B (SLO1, 2022-2023)
2. The assessment used in NUTR 309 has been moved to NUTR 201 (SLO2, Fall 2020); adding this to NUTR 309 would help students further develop their skillset in this area.
|1. Data from the most recent two semesters of assessment method one (NUTR 200B) were slightly below the target, but this seems to be the result of attempting a new format for the practical exam.
2. The data in NUTR 309 demonstrates that students struggle with the analysis of their nutrient content.
|It seems like some of the conclusions drawn were based on specific data that was not displayed in this report. Showing the specifics may be more helpful. It is stated that students in both pathways of the degree program are performing well. Data was not systematically analyzed comparing the two, so there is no direct evidence provided for this assertion. This section also provides statements about areas in which students need more support, which is a great observation for action planning to address this. However, it would be helpful to know where these conclusions come from (e.g., more granular data than was presented in this report, anecdotal reports from faculty). Consider (for the future) providing more detailed data in each knowledge or skill area assessed under each SLO so that there is clear evidence where these conclusions come from.|
|2022-2023||Nutrition and Dietetics, M.S.||1. More emphasis was placed on application of this information into real life scenarios (SLO1, SLO2; 2020)||1. Assessment method 1 for SLO 1 did not meet the target 2017-2019||1. Data from last 2 semesters (2021-2022) has shown success in this learning objective whereas the target was not met previous to that||It seems like some of the conclusions drawn were based on specific data that was not displayed in this report. Showing the specifics may be more helpful. It is not clear what data for SLO #3, suggested that “more practice opportunities will be needed in identifying the appropriate behavioral theory for a nutrition education program” or that “students struggled with setting and describing the specific goal and objectives for the programs they developed.” Data on these individual elements were not reported. If there are specific rubric criteria related to these items, it would be useful to disaggregate the data in order to show performance on each aspect rather than solely the overall score.|
|2022-2023||Higher Education and Student Affairs, M.S.Ed.||1. Identified assessment methods that better aligned with showcasing how students have met the learning outcomes of the program. (2022)
2. Need to develop formal ways to assess learning in assistantships and internships.
3. Need to develop systems to record data from multiple methods/courses each semester/year.
|1. In completing the mid-cycle assessment, we recognized that the original assessment methods we identified were insufficient in both number and type to fully capture how students in the program have met the learning outcomes.
2. Conversations about how to do this are ongoing.
3. Conversations about how to do this are ongoing.
|The program summarizes its findings and discusses next steps in order to improve its assessment processes. It seems to draw relatively pertinent conclusions about student learning based on the data it has gathered. The report indicates that the program has identified that it could do a better job of identifying and collecting data on core assessments throughout the program that are not incorporated in the existing assessment plan and has plans to do so.
It is great that the program would like to capture the co-curricular assessment that students are learning from various student affairs units. Progress in this area would also be helpful for the units as well.
|2022-2023||Higher Education, Ed.D.||The transition to sequencing of courses and implementing cohort programs has led to more consistent scaffolding of content (2016-2019)||This seems to have decreased time-to-completion for students enrolled in the cohort programs. This is significant as it is evidence that students are being better prepared for dissertation completion throughout the program.||The program summarizes its findings and discusses next steps in order to improve its assessment processes including the addition of shared rubrics across instructors and development of a method to better track data. It seems to draw relatively pertinent conclusions about student learning based on the data it has gathered.|
|2022-2023||Kinesiology, B.S.||Providing more hands-on experiences for students in exercise testing, design, and programming; opportunities to develop their interpersonal communication skills; experiences working with rehabilitation and special populations; and information regarding internships, jobs, and graduate school.||Feedback from students||This assessment experience helped us begin to evaluate the effect of the program changes that were recently made in our curriculum. Because several changes have been made, we are still in the process of determining the impact of these changes.||As the program continues to gather evidence on its students, it may be helpful for it to begin by revisiting the clarity of its learning outcomes so that it is clear and intentional about what and how students will demonstrate proficiency in learning by the end of the program. Once the learning outcomes are more specific, revisiting the curriculum map may result in the design of a more useful assessment process, including methods that more directly assess student learning progress.|
|2022-2023||Communicative Disorders, B.S.||1. Continue to maximize students’ self-reflection through active learning paradigms and built-in feedback loops.
2. Capping basic science class sizes to allow for maximal peer interaction and discussion, and adding more opportunities for interactive, hands-on learning experiences. (Fall 2022)
|1. Students scoring higher early in their academic program and lower when measured during a higher-level course may reflect a need for ongoing feedback to improve critical thinking throughout the program.
2. As a group, our students appear to struggle more with basic science content (e.g., neuroscience) than with liberal-arts-based content.
|1. Students demonstrated improved scores compared to previous semesters on the critical thinking VALUE rubric during the Fall 2021 and Spring 2022 semesters||It is clear that the program has thoroughly reviewed its assessment results and drawn appropriate conclusions. It seems they’ve looked at their processes along with student results and have determined what they need to do to help improve the process and address student learning (e.g. adding more formative/specific feedback for oral communication skills).|
|2022-2023||Communicative Disorders, M.A.||1. Clinical writing labs were included in students’ practicum course, students were provided with resources from the NIU writing center, clinical writing manuals, extra one-on-one practice with clinical faculty, and if needed, student-specific support plans geared toward writing. (Fall 2018)
2. Added simulated cases to increase practice with measurement, added an assessment course which includes coursework on how to apply measurements to clients in the clinic. (Spring 2020)
3. In 2017, the examination was changed to multiple choice and short answer vs. essay. This allowed us to assess students’ content knowledge (SLO 7) rather than their writing styles. The questions were edited again in 2019 and 2020 to ensure content questions are relevant to trends in the field of SLP. In 2017, we also included an automatic feedback loop for students who did not pass all four on the initial attempt. These students would meet with the faculty member(s) whose exam(s) they failed to discuss the concepts that require further attention.
|1. Data from 2016 and 2017, as well as feedback from faculty, indicated a need for more attention to effective professional written communication skills (SLO2)
2. A downward trend in outcomes for students’ abilities to obtain, interpret, and synthesize client/patient data to make appropriate recommendations regarding intervention (SLO3)
3. Based on Praxis data from the entering 2016 cohort
|1. Even with the shift to online instruction and clinical service delivery, students continue to demonstrate improved outcomes in writing following the inclusion of the above action.
2. We have seen a nice uptick in students’ averages since implementing these changes. This has worked well, as evidenced by a 100% pass rate.
|The report describes tracking cohort performance overall and on each SLO along with adjustments to curriculum, pedagogy, and implementation of assessment measures in response to data and feedback. These changes have generally resulted in improvement based on subsequent data.
The summary at the end was very nice as it revisited each SLO for an overall picture to discuss how results were used and what the next steps are for each learning outcome.
|2022-2023||Audiology, Au.D.||1. In 2017, began incorporating more opportunities (e.g., case studies).
2. Assessment was changed to include a specific breakdown of skills and to include both formative and summative assessment.
|1. In 2017, the lowest rubric scores related to the SLO on professionalism suggested room for improvement.
2. In 2016, data suggested there was room for improvement in collecting evidence for meeting the established criteria for the SLO on audiological assessment skills.
|1. In 2017, only 50% of students scored at least 4/5 on the rubric; in 2019 100% did so.
2. 2019 data showed 100% meeting the target (same as 2016); the detail necessary to ascertain the impact was not provided.
|Improvements to the rubrics have been made over time and improvements (or at times, lack thereof) in student performance have also been noted over time. Improvements made during the pandemic have been incorporated into the normal operations of the program to the benefit of the students. Plans are underway to revise the assessment plan in response to a more streamlined curriculum. Some advice would be to ensure SLO-rubric alignment or at least understand how the program will know students are meeting specific components of each SLO. The rubrics may be more appropriate for course-level assessment (but not impossible to use for program assessment if the alignment can be reasoned).|
|2022-2023||Health Sciences, B.S.||1. (2018-2019) (a) provide depth in discussion; (b) provide descriptive examples; (c) integrate their reflections on learning about professional behaviors necessary for health careers
2. (2019-2020) Consider making HSCI 350 eligible to meet the criteria for writing intensive course designation. HSCI 460 will place more emphasis on grading and student improvement for syntax and editing, and focus on syntax and grammatical errors in discussion posts.
|1. Lower than expected scores (60% met the criteria) in syntax and grammar in discussion posts (criteria for professionalism) in capstone course.
2. Targets were not met for 3 of 4 criteria on the capstone activity: (1) accurate knowledge of concepts or theories – assessment strategies and Cross Model application/analysis; (3) cross-cultural approaches to service delivery – recommendations and resources; and (4) use of data to analyze contributing factors, disparities, and influence of systems and culture – assessment/outcome analysis and strengths/challenges analysis
|1. Will continue to monitor, assess, and address concerns in the future||In addition to considering and planning to improve pedagogy, improvements include reviewing SLOs and course alignment to them as well as involving and communicating with all faculty and instructors, including adjuncts to ensure consistency and quality of course delivery and assessment of student learning. The program is using a lot of course-embedded data and some of the conclusions seem to be notes for course-level improvement. Once the SLO-method links are more clarified, the discussion can be moved to more of a program-level SLO relevant discussion. Maybe it would be helpful for the program to operationalize in written form what each SLO means (possibly add a few objectives) so it could help them with alignment to the methods.|