ACADEMIC PLANNING COUNCIL
Minutes of March 20, 2006
3 p.m., Holmes Student Center – HSC 505
Present: Cassidy, House, Legg, Marcellus, Munroe, Musial, Prawitz, Reynolds, Russo, Schoenbachler, Seaver, Self (for Levin), Thompson, Waas, Williams, Willis (for Bose)
Guests: Donna Askins, Research Associate, Office of the Provost; Carolinda Douglass, Acting Coordinator, Assessment Services
The meeting was called to order at 3:10 p.m. Legg introduced Richard Marcellus who is replacing Parvis Payvar for the remainder of his term.
Legg announced that the item on students’ progress on dissertations and enrollments/policies within graduate programs will be discussed at the April 17, 2006, APC meeting. The changes in the APC responsibilities were approved by the University Council and should appear in the Committees of the University when it is updated in fall 2006. Cassidy noted that the changes requested in the APC responsibilities resulted from last year’s discussion of the survey results on the extent to which the council members perceived it had fulfilled its responsibilities and the council’s recommendations for changes. By the end of this semester, council members will receive an electronic survey on the council’s responsibilities; please complete and return this survey.
Legg asked if there were any questions about the CHEA report on the Reauthorization of the Higher Education Act included in the materials sent with the agenda. Cassidy reported that some of the provisions in the proposed legislation related to the assessment of learning outcomes and hinted at standardized national testing for college students.
It was moved and seconded to approve the minutes of November 21, 2005, as distributed, and the motion passed unanimously.
Cassidy stated that NIU’s 2005 Performance Report was distributed to the APC in mid-September. The IBHE used the Performance Reports and various other reports to compile a 2005 Statewide Performance Report, which also incorporates IBHE initiatives and accomplishments. The statewide report is divided into several sections and is over 500 pages long; the report can be accessed on the IBHE web site. The Effective Practices section of the report is searchable by Illinois Commitment policy area, institution, and key word. The Statewide Performance Report is not searchable, but it does contain a great wealth of information. The reporting format has changed over the years, and it now includes information on the common institutional indicators and the mission specific indicators. We have just received the guidelines for the 2006 report and basically they have stayed the same. We will report on our own initiatives for policy area five (high quality) and policy area six (accountability and productivity). If you have any suggestions for items we might include in these two areas (i.e., cost efficiencies and good assessment outcomes), we would appreciate your suggestions. You can send this information to me or to Carolyn Cradduck. We wanted to give you an opportunity to ask any questions based on the report from last year.
Legg said that the next item on the agenda is the program review guidelines. Cassidy stated we need to report on outcomes including graduation rates and satisfaction, among others. In the main body of the report we focus on those items that cut across programs. The agenda committee has talked about some specific suggestions about the program review guidelines, and the APC has historically discussed this each spring. In the program review documents, many programs repeat their assessment plans rather than discuss the outcomes. We need to see the results of the program’s initiatives and findings and how this affects decision making. Schoenbachler noted that the assessment feedback loop is not closed. Cassidy added that we need to think about how to communicate more clearly that we want programs to report on their initiatives, findings, and decision making.
Legg added that we recently met with one program that does not have an assessment plan. The program spent their time calling other institutions to see if they were doing assessment.
Musial noted that every outcome is expected to use one indirect and one direct assessment method. In the program review we are discussing far more holistic findings. There seems to be a mismatch. It would be interesting to see if we could look at the reports and see if this could be made easier. Cassidy asked Musial if she had some suggestions about how this could be done. Musial responded that she thought we needed a committee to look at the assessment plans, updates, status reports, and program review requirements. A subcommittee could review all of this information. General education is another requirement that should be included. All of these pieces need to fit together more easily. Thompson added that she wondered if this was a task for the UAP. There are representatives from the APC that serve on the UAP. Cassidy noted that the Graduate Council and the Undergraduate Coordinating Council also have representatives that serve on the UAP. Thompson stated that we need people who will look at all the pieces. There are also groups at different levels of participation at this point.
Cassidy said that we want people to invest in the systematic review of their programs so they and we can defend their programs and report on positive outcomes as well as recommendations for future action. If you read the CHEA article on the Reauthorization of the Higher Education Act, you will note that we are looking at what amounts to “no coed left behind,” and we want individuals to be aware that we have mechanisms in place that show that we already do this. House added that it would help people understand the external obligations we face on an ongoing basis. These are expectations of both the Higher Learning Commission and the IBHE.
Munroe stated that when she teaches a group how to use an assessment tool, she tries to help them see where the pieces of these tools fit in with program review, accreditation, etc. Could we look at the guidelines and make it a little more obvious what fits where? Cassidy replied that we could try this, and we could provide some examples. In the past we have distributed a sample report. In one section of the review there is a question where information from alumni surveys should be included, and then there is another section that deals with evaluation and outcomes. Waas said that his sense in talking with the program this year was that they did have data and nice examples, but these data weren’t necessarily in the document. Providing exemplary reports is helpful, but more specific kinds of data used by various programs might be more helpful. Thompson noted that some of the things that are being said are types of information people might not think of in terms of assessment. Thompson asked if part of the problem was embedding it.
Cassidy added that another item that came up in the discussion with the Agenda Committee was that some of the effective practices that people were selecting perhaps were not the best ones to point out. As subcommittees were talking to program review representatives, other things came out that could have been used as effective practices. We have talked about having some of this information on the provost’s office web site.
Williams asked if there could be an assessment guidelines list from which programs could pick and choose what fits their program. Cassidy replied that she thinks most programs have selected assessment methods that are appropriate for the discipline; this is not a question of what a program should pick and report on. The issue is what we are asking for. What we get nine times out of ten is the process programs have in place, but many times there is no reporting about what students have achieved. Williams said that this is what she is asking for: a list of outcomes.
Waas asked if it was Cassidy’s sense that these were high-quality programs that were reviewed this year and by-and-large people are on board with assessment. Cassidy responded that we had a good discussion at the organizational meeting, and the reports this year were well done and thorough.
Legg turned to the budget priorities item on the agenda. When he arrived on campus, he found out that the way we could increase our budget was to have a budget priority list. The first year Legg was here, the university received about one-half million dollars to fund priorities that were included on the list. We have carried out the budget priorities exercise for the last couple of years, but the state has not been funding any new initiatives. Cassidy noted that this is the list that has been submitted the last couple of years, and these are still our priorities. This does not reflect every priority, and these are broad-based initiatives. We wanted to show you what these priorities were for the coming fiscal year. We will start working on our FY08 priorities in May and wanted to get your input on any additional priorities we should address. Legg added that in the first three to four initiatives, the priority is faculty. This was also the case when he arrived here. We have lost more faculty since that time, and faculty are still the priority. If you have any comments, please let us know. Douglass asked if we have received money for these initiatives. Cassidy replied no, but in the past we have received some funding for our priorities.
Munroe said that the database infrastructure that the library uses relies upon social security numbers and asked if the new student information system would take care of this issue. Seaver said that we have started a two-year process for the new student information system, and the target to go on-line is fall 2008. Most of the campus has already been weaned from using social security numbers. Munroe said that she would like to see this process move quickly so the library could move away from using social security numbers. The social security numbers will not go away; they will still be used for the federal systems.
Legg turned to the next agenda item on grade inflation, limited admission and retention, and GPA requirements. Last fall there were questions raised about grade inflation and limited admission and retention programs. House said that over the last 20 years the mean undergraduate GPA has changed by only .1. Grade inflation has not hit our undergraduate programs. You need to remember that during this 20-year period there were differences in students, faculty, course delivery, etc. Legg stated that this is a good assessment and asked if this was done by department and college. House replied that he did look at this data by college, and it was very similar to the table that was distributed. Waas asked if ACT scores, if plotted, would be similar. House responded yes, but ACT scores are not a good indicator. What comes into play is students’ interest in what they are selecting. Waas asked if overall the quality of our students has gone up. House replied that he would say that the quality of our students is the same. Douglass asked how, if in 2005 a professor gave a student a B, we would know that in the past the professor would have given this student a C+ for the same work. How do we know that professors haven’t in fact lowered their standards? House replied that we don’t know. House added there is a common perception of how transfer students do. If you look at the mean grades of transfer students, they do somewhat better than our native students. Approximately 70 percent of our native students have taken courses elsewhere, and this creates a blurry line. Only 15 percent of the degrees awarded are awarded to students who have only taken courses at NIU. Reynolds asked if there was comparative information from other schools. House replied no; the central data never come back to us. Reynolds added that it would be nice to know if a 2.8 at NIU is the same as a 3.2 at Kent.
House noted that this is a good point because when you use grades as an assessment, they don’t tell us how much a student has learned. Legg stated that this is why we need assessment. Williams added that with certain programs this is built in because they have licensing requirements.
Seaver distributed the policies on undergraduate limited admission and retention, which are in the Academic Policies and Procedures Manual. All programs must have their policies for limited admission or limited retention re-approved as part of the program review process. There are about 17 limited admission programs; there is no exact listing of limited retention programs, but one is being developed. The procedures are exactly the same. The Admissions Policies and Academic Standards Committee (APASC) approved this policy about two years ago. If a program is considering having a limited admission or retention program, it starts with the dean and the provost having a discussion, which is based around resources. The final decision is made by APASC and the UCC committees. As new programs come through, we also look at issues like what impact they will have on the university. If requirements are added at the upper-division level, it could be problematic for the students. We look at background information to make these decisions. We have to provide these students with advising. We also look at the effect of the policies on underrepresented groups within majors. We are seeing more programs come forward, particularly with retention policies. Thompson asked what the university GPA requirement is. Seaver replied that students have to have a 2.0 to graduate from the university. If they go lower than 2.0, students are placed on academic probation. Our greatest fear is that all programs will raise retention requirements to 2.5. Many programs are requiring Cs or better to go on to the next course. Williams added that in clinical laboratory sciences students are permitted to retake courses. Seaver explained that limited retention means that you have to meet particular standards to stay in the program. If you don’t meet these standards, then you are not permitted to take the next course. Some programs say that if you have two Ds, you are done.
We ask programs to think about the consequences of these kinds of decisions. Programs don’t go down this road lightly.
Legg asked if there were any other issues. The meeting adjourned at 4:10 p.m.
Carolyn A. Cradduck