Writing Program Prioritization Narratives: Tips from External Panelists

NIU’s Program Prioritization process has been informed both by internal discussion and external advice.  In mid-October, NIU hosted eight panelists from universities that have already undergone Program Prioritization.  Those panelists organized and administered the process at their institutions – but they also wrote and evaluated program narratives.  The following is some of their advice on writing narratives that read well and speak to the quality, effectiveness, efficiency and relevance of university programs.

Getting Started

  • Don’t procrastinate – it shows in your report.
  • Make a list of what you do and then ask what value each item brings.
  • There are two buckets:  The business of the university (teaching, research, service) and the university as a business (supporting, facilitating, adding value).
  • Don’t bury the lead:  Make it absolutely clear what you do.
  • Think of your program in relation to the whole university.
  • Just get started writing, delve deeply into a program, and then step back and look at what you’ve written.  Ignore word limits at first and focus on whether or not you’ve addressed the questions and criteria, and whether or not you’ve described your function thoroughly. 

Style

  • Wordy is not good.  Think of your reader – more isn’t necessarily better.
  • Bullet points are great.
  • Remember your reader at all times.  Interrogate yourself as if you were the reader:  Is this helping me answer questions about quality, effectiveness, relevance and efficiency?  We weren’t ranking writing skills.
  • Focus on impact more than activities.  Example:  ‘In ABC program, students do XYZ.’  That’s great – but what does that experience prepare them to do?  And what is the impact on students, the institution, and/or the region?
  • Add meaningful information – it doesn’t have to be data.
  • Excellent prose is not as important as clarity about what you do and tying things together for the reader.
  • Take a snapshot of your current status, then describe what future steps will add value for students and for your university.
  • Talk about best practices, and about internal and external demand.
  • Readers will see through fluff immediately.  Be honest.

The Process

  • The writing process should be collaborative.  Get feedback on all your responses.  It was obvious to us when one person had written a narrative versus a collaborative process.  The final product needs to be written in one voice, but the process to get there should involve multiple people.
  • We did presentations by putting draft narratives up on a screen and discussing them.
  • Review your colleagues’ reports. 
  • We compared notes and our institutions gave an average of six weeks for narrative writing.  Drawing it out is worse.  We got more confident as we went on.
  • If you find yourself using up your word limits at the criteria level, you can write “N/A” for questions that don’t apply.
  • Look at the questions you can answer.  Evaluators are looking at your submission as a whole.  Demonstrate the breadth and depth of your operation.
  • Don’t let perfection be the enemy of good.  Trust your colleagues.

Use of Data

  • Be able to cite the source of any data that you provide yourself.
  • Student-to-staff ratios are important over time.  Explain any changes.
  • Tell your readers what you think you should be measuring and why.
  • Qualitative data can be as effective as quantitative.  Describe the types of issues you deal with.  Also consider your university’s mission and how you feed into that.
  • Most units that had little data ended up in ranks 2, 3, or 4, meaning they needed improvement – including collecting data for the future.
  • Sometimes this process can surface data points of pride, like ‘We have the third-lowest level of student debt in the region’ or things like that.
  • Benchmarking data should be readily available.  Go to your IR office and find out what they report.  Anything we report to the state or the feds is information that other universities also have to report.  Also think about internal benchmarks, saying ‘Here’s what we want to be and here’s what it would take to get there.’
  • Also go to your professional organizations to find benchmarking data.
  • If you have lots of data but aren’t sure what to include, ask yourself what is most meaningful to the criteria.  What ties most closely to meaningful impact?
  • Don’t be afraid to reference key data points in your narrative.  In fact, if you upload a chart with data, be sure to explain its significance in your narrative.
  • We used IPEDS completion data and compared it to our competitors to see where we were losing market share.
  • If you haven’t been consistent about collecting data, just be upfront about that.  Tell the reader you are providing snapshots because you don’t have longer-term data.
  • Every university that goes through this process gets complaints about data.  At our place, we couldn’t get to the level of granularity that many wanted.  But in the end, what we discovered was that the “deficiencies” in data made no difference.  If you’re relying on data to make some point that you can’t make in another way, it’s not likely to help you. 
  • Remember that the Data Support Team can help you and identify good sources of information for you.
  • Data has its limitations, as relevance and quality are hard to measure.  The challenge is to coalesce the data into something meaningful.
  • Data matters less than information.
  • We used data to start conversation.  The data often triggered questions we hadn’t been asking.  Program expenditures and revenues were often quite a revelation to program heads.

Showing Progress

  • It’s really important to be able to show progress, however that might be defined in your unit. 
  • At our university, one of the highest-ranked programs was our mailroom.  They were able to document efficiencies like going to a new postage meter system that saves more than $300,000 a year.
  • Our registrar was able to show how implementing a new grade change process saved time and money.
  • When we were reviewing programs at our university, what we cared about was ‘Are you doing better over time?’  Benchmarking is great, but are you doing what you should be doing?  And do you have any measures of improvement?

Focus on Outcomes

  • If you’re in an advising area, for example, don’t just talk about how many people you’ve advised.  If you can, talk about the results of your various interventions.
  • Focus on outcomes and efficiencies.  Readers want to see that you’ve been constantly improving your product and processes.
  • Programs that ended up in the lowest quintile were seldom a surprise to anyone.  They tended to be those with the fewest outcomes to report – things we kept doing because we had let operations grow without thought to efficiency or duplication.

Recommending Change

  • If you suggest a new program, do a cost/benefit analysis.  We recommended hiring a new person to do ‘on-boarding’ for new, non-degree students.  We were able to show how it would improve service and save money.
  • What would it take to improve the effectiveness and efficiency of your program in order to make a greater impact?
  • Don’t be afraid of recommending change (elimination, merger, etc.) for programs that aren’t performing well.  We’re all victims of “indeterminate growth” or “creep,” in which we’ve continued to add new things without reducing old ones.  We never give things up.
  • Hold on to all the new ideas that come out of this process – you may not be able to do them immediately, but they could be of tremendous value down the road.
  • Some of the new ideas coming out of the reports were things we could act on immediately outside of the Program Prioritization process.  Many were system processes that could be improved.  For example, we found many inefficiencies in reports to the state and eventually sponsored legislation to change those requirements.