PSPA 504: PROGRAM EVALUATION: Fall, 2006

Fall 2006 (Version: 8/21/06)

Bruce Rocheleau, Ph.D.
Division of Public Administration
Northern Illinois University
DeKalb, Illinois 60115
(815) 753-6147 Electronic: BROCHELE@NIU.EDU (Note: Occasionally I have problems with this e-mail, so if you don't hear from me in a couple of days, you can try another e-mail address: tp0bar1@wpo.cso.niu.edu)
OFFICE HOURS:    and By Appointment

We will be using an online course system to facilitate discussion for this course.  Lecture notes and other course information will also be posted on the course website.  The address to log in to the course website is: http://webcourses.niu.edu/  You will need to know your ZID in order to get into the website.  The course website serves the following purposes: (1) to encourage feedback, including useful questions and answers between the instructor and students as well as among students themselves; and (2) to post papers and other information that is relevant to the class. The course website will also allow you to access the electronic reserve.  I will also post outlines of lectures on the course website; they should be located under the Documents section of the online (Blackboard) course site.  If you experience problems with the course website, please let the instructor know.  Please check the website each week prior to the class.  I will also use email extensively in the class.  If I receive a question from one student that is relevant to others, I may copy my response to that student to everyone in the class.  

COURSE DESCRIPTION

This course introduces students to the concepts of designing, conducting, and utilizing evaluations as part of the managerial process. As a result of this course, students should be able to design and implement an evaluation program in a public or not-for-profit organization. They should be aware of the principles of good evaluation design, such as how to eliminate as many threats to validity as possible. They should be able to critique evaluation studies and also understand how to maximize the use of evaluations.  They will also learn how to conduct a survey that maximizes its utility and validity while keeping costs manageable. They should understand methodological issues such as the debate between advocates of qualitative and quantitative approaches to evaluation. Finally, they should have a good understanding of the role that evaluation plays in organizations and the overall policy process.  Note that this course will place emphasis on relating program evaluation to methodologies such as performance measurement systems and recent assessment approaches such as the Balanced Scorecard.

REQUIRED TEXTS AND XEROX READINGS:

Emil J. Posavac and Raymond G. Carey (P&C).  Program Evaluation:  Methods and Case Studies.  7th edition.  Prentice-Hall. ISBN 978-0-13-227560-6

Miller, Thomas I. & Kobayashi, Michelle. Citizen Surveys: How to do them, how to use them, what they mean.  ICMA.  2nd edition. ISBN 0-87326-172-0

Bingham, Richard D. & Felbinger, Claire L. Evaluation in Practice.  2nd edition.  New York:  Seven Bridges Press. 2002. ISBN: 1-889119-57-1.

Recommended:  Ph.D. students are encouraged to purchase and read the following book:  William R. Shadish et al. Experimental and Quasi-Experimental Designs for Generalized Causal Inference.  Boston:  Houghton-Mifflin Company, 2002.  ISBN 0-395-61556-9.  (paperback). 

Division Microcomputer Lab & This Course:

We will have a specific assignment concerning SPSS for this course.  You may also need to use software to do statistical analysis for your final design.  Thus you are encouraged to use the Division Microcomputer Lab, where a lab assistant is trained to help you with these assignments. The schedule for the Division campus lab is as follows:

Mon.  2:30-5:30 pm
Tue.  6:30-9:30 pm
Wed.  11 am-5 pm
Thu.  6:30-9:30 pm
Sun.  1:30-6:30 pm

Of course, you may also use the lab at this Naperville site for the same purposes. 

Electronic Reserves:  The link to the Electronic Reserves is under EXTERNAL LINKS on the course Blackboard website. I will be using electronic reserves for as many of the readings as possible that are not part of the textbooks.  This will make it easier for you to access the publications.  However, I strongly recommend that you print out all of the articles early in the term and inform me if you experience problems.  Students often encounter certain problems.  Sometimes a file will not print, especially if it is very large; in that case, try another browser (Netscape seems to work better with larger files).  Sometimes the printer may get "stuck" because it does not have enough memory to print all of the pages at once; in that case, I recommend printing the document bit by bit, specifying which pages to print each time.  

Digital Dropbox:  I would like everyone to submit their written work to me via the digital drop box for this class.  There may be some parts of your papers that cannot be submitted via the drop box; in that case, you will provide me with hard copy.  I will generally return commented papers to your digital drop box, so be sure to check for these prior to the class. 

Online readings:  An increasing amount of material relevant to program evaluation is available online, and I will use some of it for this class.  As with the electronic reserves, I strongly recommend printing these readings out early in the term, and let me know if you experience any problems.

Additional Readings: In this course, you will be designing a program evaluation. You will have to become an expert in both a substantive field (e.g., recycling, traffic, building code, etc.) and a methodology (e.g., time series design, regression analysis, etc.) You will need to read additional books and articles that help you obtain the needed expertise. Check with the instructor for references.

In addition, each substantive field has its own evaluation literature. You are expected to THOROUGHLY research the evaluation literature of your area of choice. In your research design, you will be expected to describe in detail the literature review you conducted, including the following:

(1) Computerized searches of library databases such as First Article, ABI-INFORM, etc., which can be conducted via the Internet. Be sure to check for relevant BOOKS too!   You should also identify specialized databases that are particularly focused on your area of study. 

(2) Searches for evaluations of similar programs done by governmental organizations that may not be found in libraries (e.g., call up similar organizations and ask for copies of their evaluations). You should call relevant government organizations to identify these publications.  These evaluations are often the most useful of all and may furnish you with some standards for comparison.  

(3) Searches of relevant journals and periodicals in your area. Be certain to keep track of which journals and years were searched. Be sure to search government publications.  Virtually every area (fire, police, garbage, etc.) has journals and periodicals associated with it. Be certain to show that you have thoroughly searched these periodicals for evaluation studies.   You will often find IMPORTANT articles that don't turn up in computerized searches just by going through the contents of such journals for recent years, so it is important to go through relevant journals in person rather than relying only on computer searches.  

(4) Searches via search engines on the World Wide Web. Be aware that you also need to identify the websites of relevant organizations and not just rely on search engines.  However, be careful about the use of Web sources.  You will need to document the full URL, the date that something was downloaded, and the nature of the organization sponsoring the website.  For a research paper for this course, only reputable sources (e.g., those produced by well-known and respected research organizations and/or individuals) should be used.  Check with the instructor if you have questions.   Don't rely overly on Internet sources; this is one course where substantial library searches are expected. 

(5) You should also consult previous capstone papers which are on file at the Division of Public Administration. We have a list of these online at our website http://www.niu.edu/pub_ad/star.htm  Most of them can be checked out and xeroxed if you come to the Division of Public Administration office.

It is important that your study builds on previous work. However, you need to be aware of two issues: (1) Be careful to give full credit by footnotes, quotation marks, etc. to all material (including ideas, methods, etc.) that you use, or you may be accused of plagiarism; (2) You must provide a significant "VALUE ADDED" to previous work.

Although we encourage partial replication, to be an acceptable paper, you should use new and improved methods, new concepts or measures, or something that adds substantially to the body of literature.

Finally, you are expected to document in your preliminary research design and in your final research design that you have done a thorough job of searching for all of the above sources.  If you cannot find much on your topic, you need to document the databases and journals searched as well as the organizations that you have contacted. 

GRADING:
The course grade will be based on the following:
Flowchart (see homework for 9/20) 5%
Quiz on Threats to Validity 10% (see 10/4)
Participation: 20% (includes quizzes on readings)
Preliminary Research Design: 15%
Survey (see homework for 11/8) 5%
Final Exam: 20%
Paper (Design): 25%

Grading Philosophy:
A grade of "A" is given only to those who have demonstrated EXCEPTIONAL performance, far above the expected.

A grade of "B+" is given for Very Good performances, above the expected level.

A grade of "B" is given for a GOOD performance that meets expectations for the class.

Participation:  
    Participation includes attendance. Participation includes basic expectations such as listening carefully to the instructor and/or other persons who are speaking.  It is expected that you will miss class only for exceptional reasons. If there is an extraordinary reason why you can't make class, it is also expected that you will inform the instructor PRIOR to that class. Participation also includes taking part in classroom discussions.  Certain homework assignments will also be given and count towards the participation grade.  Some of these assignments will receive a specific grade and will be counted as noted above.  For ungraded homework assignments, depending on their nature, I usually assign a check for doing a satisfactory and good job, a minus if something is incomplete or not up to expectations, and a plus if the assignment goes well beyond expectations.  Normally I do not hand back homework, but I do keep track of the above "marks" in assessing an overall participation grade.  
     The instructor expects all students to be actively involved in discussing the readings. Although the instructor will make certain lecture-type presentations, most of the class will involve discussion of texts and other readings. For the class to be successful, all students must be actively involved in these discussions. The instructor keeps close track of those who contribute to discussions of readings. The instructor will assign a grade each week concerning each student's participation.  

Classroom decorum counts towards participation.  It is expected that everyone will give polite attention to the person speaking.  Laptops should be used for taking notes and class-related purposes. 

"NIU abides by Section 504 of the Rehabilitation Act of 1973 which mandates reasonable accommodations be provided for qualified students with disabilities. If you have a disability and may require some type of instructional and/or examination accommodation, please contact me early in the semester so that I can provide or facilitate in providing accommodations you may need. If you have not already done so, you will need to register with the Center for Access-Ability Resources (CAAR), the designated office on campus to provide services and administer exams with accommodations for students with disabilities. The CAAR office is located on the 4th floor of the University Health Services building (815-753-1303). I look forward to talking with you soon to learn how I may be helpful in enhancing your academic success in this course."

PAPER/RESEARCH DESIGN: The paper for this course will be an evaluation research design. In other words, you will design (but not have to implement) an evaluation of some program. Much more information will be given in class about the nature of the design. But, the design should include the following components:

1. Overview/Executive Summary. In 1 or 2 pages, provide a broad overview of the evaluation including its purpose, methods, and likely uses.

2. Introduction and Problem Statement. You should explain the issues, hypotheses, and questions that you want to study. Why did you select these hypotheses or questions to study?  Why are they important?  Provide a good, clear justification for why this study would be useful.

3. Program Description: Provide a description of the program including a flow-chart or whatever materials that will help us understand the nature of the program you are evaluating--how it works.

4. Literature Review. You should present a comprehensive literature review that is helpful to evaluating your program. How has this type of program been evaluated in the past? What methods and measures have been used? What have been the findings? What are the weaknesses and limitations of this previous work? Most importantly, how does YOUR STUDY AND ITS METHODS RELATE TO PREVIOUS WORK? How do you build on previous work? How is yours different and why is it different?

Be sure to include either here or in an appendix a detailed statement of what sources (see above) have been searched (including years for journals and computer searches). For computer searches, also include the keywords used to search.

5. Methodology. You should clearly state the design and methods that will be used. What type of design are you using? Why? What threats to validity are controlled for? How are they controlled? What threats are uncontrolled? Why? What are the sources of the data? How reliable and valid are the data? What statistics will be used to analyze the data?

6. Presentation of Results. Since this is only a design, you can discuss how you intend to present results. For example, present "dummy tables" showing what kinds of statistics you would use and what kind of analysis would be conducted. What graphical aids would be used to present the information? (It would be nice to use dummy data and demonstrate the graphs too.) (Note: if you already have data, of course, you should present them.)

7.  Conclusions and Implications. If you have some data, you could present conclusions and summarize major findings here. Otherwise, you can do some hypothetical discussion on what types of actions could be taken depending on the nature of the results. You may use "dummy tables" to illustrate how your data might turn out. How might this study be followed-up? What would be the next logical evaluation?

8. Footnotes, Bibliography and Appendices. Include any survey instrument or other information that was used to collect data. The bibliography should be in appropriate form such as the American Psychological Association format. 

The criteria used to grade the paper will include the following:
First, I will be reading your papers and asking the following questions:

(a) Does the introduction provide a clear idea of the problem, the purpose of the research, and its importance? 
(b) Are the hypotheses clearly stated?  Are they reasonable and supported by the literature reviewed?
(c) Is the literature review thorough, up-to-date, and relevant to this study?
(d) Is the research design stated clearly?  Is the rationale for the selection of the design presented, and does it make sense? Are threats to validity, both those controlled for and those not, discussed clearly?  
(e) Is there enough information about the instruments used?  Is there a discussion of the  reliability and validity of the instruments and measures? 
(f) Are appropriate statistical tests employed?  Are the data presented in a clear and full manner?  Are graphs or other devices used to enhance understanding? 
(g) Are the findings clearly stated?  Are the limitations and generalizability of the findings discussed in a clear manner?  
(h) Is the bibliography comprehensive and done using the APA format?  
(i) Are all instruments such as surveys fully presented?  
(j) Are the findings discussed in terms of their implications and practical significance?  

Note:  Although this research design is normally only a "plan," it should be feasible for you to complete without the assumption of large resources that normally would not be available to you.  The data should also be accessible to you even if you do not plan to implement the study.

1. Effort. There should be evidence of substantial work in the overall study, as evidenced by the literature reviewed and the methodology.

2. Methodology. How thoroughly does your paper address the methodological issues that we have discussed in the course?  Do you provide a clear explanation of how the program works?  Have you justified your hypotheses and/or research questions?  Have you justified what measures you plan to use?  Are they valid and reliable?  Have you done the best job possible to eliminate threats to validity given your resources? Have you given attention to dissemination, political, and ethical issues?  

3. Clarity and Style of presentation. How clear is the writing? Are devices used (e.g., graphics) as aids to the reader of the evaluation?

4. Use of Class Lecture and Reading Materials: To what extent do you make use of (and demonstrate an understanding of) the lectures and readings that are relevant to your paper?

5.  Timeliness:  There will be a penalty imposed if your paper is handed in later than the scheduled date.  

Preliminary Design (15%):  On November 15th, I want a written preliminary design in which you state your plan for the research design (please e-mail it to me directly). It should include a clear explanation of what your evaluation will study, its major hypotheses, the design to be employed, and the data to be collected. Discuss threats to validity--which are possible and which are plausible?   Also, an annotated bibliography of AT LEAST 15 very relevant and useful sources should be attached; these are in addition to texts and assigned readings.  Note that, in locating sources, if you do not find many that focus on the specific program you are evaluating, then you may use evaluations of programs that present similar challenges and/or require use of similar methods, as well as sources that explain the substantive significance of the program.  

You are encouraged to talk with the instructor about your design at the earliest possible time. However, in order to make an informed decision about the design, you will need to be familiar with various aspects of the course, especially the alternative designs available, and thus you may want to skim ahead in the readings. The Campbell reading provides a good overview of many of the potential research designs, as do the Posavac & Carey and Bingham & Felbinger chapters relevant to your design. 

More specifically these are the components that should be present in the PRELIMINARY DESIGN:

(1) What is (or are) the evaluation hypothesis (or hypotheses) or research question(s) that you propose to study?  State them in hypothesis form or research question form.  (E.g., the speed of response of the police to calls for help will be improved by the implementation of the E911 system.)   What is the significance of this research question? 

To illustrate, you may test for the achievement of intermediate outcomes (e.g., whether potential program clients heard or read about a recycling program through outreach activities conducted by the municipality, whether applications for assistance were processed within the desired time period, etc.).  You may also study the cost efficiency of the program before and after some change was made in it.  

(2) Your annotated bibliography should contain a paragraph which discusses how each publication could help you with your research design.

Each should make a clear contribution, such as the following: (1) The publication might be necessary to provide a good description of the nature and context of the program.  (2) It may provide an example of one model of evaluating the program, or have some instrument or evaluation measure that you could employ in your own evaluation.  (3) It may be an evaluation of a program that, though not exactly the same as yours, poses similar problems, and thus the evaluation gives you insight into how you should conduct your own.  (4) It may help to provide justification that you are focusing on a truly important set of issues.  (5) It may provide hypotheses that are relevant to your study.  (6) It may provide standards which will be useful in interpreting the effects of the program you will be studying.

(3) You should do some thinking as to what type of evaluation (e.g., process evaluation, evaluation of intermediate or long-range outcomes, cost-efficiency evaluation) you intend to do. 

Also, what design (e.g., single-interrupted time series, pretest-posttest comparison group design, etc.) do you plan to do?

What kind of measures might you use, and do you have access to these data?  Will you have to gather the data by survey, or do they already exist somewhere?  What measurement issues do you face?  What kinds of data are most appropriate to examine your proposed research questions/hypotheses? 

What would be the most challenging aspects of this study if you were actually to implement it? 

After you have handed in your design, I would like to meet in person (or possibly over the phone) with each of you concerning your design.  This may be done before, after, or during (if we have time available) class, or at my office, or possibly over the phone.  

PRESENTATION: Students will present their research design to the class during the last 3 classroom meetings. Students are expected to provide other students with a handout that will help them understand the project. After the 10 to 15 minute presentation, there will be a discussion of the design.

THE RESEARCH DESIGN IS DUE on December 6th.  THERE WILL BE A PENALTY FOR PAPERS THAT ARE LATE.

Final Exam: The Final Exam will take place at the last class (December 13th), after all of the presentations have been made. It will be an essay test, based on a pool of questions handed out in the previous class.

INCOMPLETES: It is the policy of the Division of Public Administration NOT  to give incompletes except for extraordinary reasons.

RELATIONSHIP OF DESIGN TO CAPSTONE PAPER:

As you may know, students often use PSPA 504 as their capstone paper course.  However, you are encouraged to develop a capstone paper only on a topic for which you have the necessary access and/or resources to conduct the study.  You should also have enthusiasm for the paper.

If you decide to use your 504 design for a capstone paper, then this should be indicated on the paper when it is handed in. In order for you to do this, you should have a good design (e.g., a respectable quasi-experimental design) and a good set of data. If the instructor does not believe that the design or data lend themselves to a capstone paper, then this will be indicated to you.  Please note that you may choose to implement the design under any professor you choose.  Indeed, if the paper is very much focused on an area in which another professor is expert, I am likely to encourage you to do so (e.g., Professors Thurmaier or Wood for budgeting, Professor Gabris for personnel issues, etc.).

You will need to sign up for the capstone paper completion course if you decide to implement your design as a capstone paper. You will have to obtain a form for this course from the MPA coordinator (or her secretary) and bring it to the instructor with whom you intend to implement the research design. 

Please note that the capstone papers must be completed ONE MONTH BEFORE COMPS. Since there cannot be a compromise in quality, be sure to get the paper in long before the deadline.

If you intend to implement your design, it is possible for you to begin collecting data during class. However, you should be forewarned that this could lead to problems because, in our review of the design, we may ask you to make changes in your approach that will lead to a duplication of effort.

STATISTICS AND COMPUTING: It is generally assumed that you have had statistics prior to taking this course. Also, it is assumed that you are familiar with a computer-based statistical package such as SPSS-PC. If you have not had statistics, we will have a session that introduces you to it, especially as it relates to evaluation.  However, you may want to take advantage of  short courses offered by the ACADEMIC COMPUTING CENTER (AT NIU).  The lab assistant at the NIU MPA Computing lab can also offer assistance in using SPSS. 

We will be using the lab at the Naperville site for a session in which we will review your skills with SPSS as applied to this evaluation course. 

SCHEDULE OF TOPICS: Below is a schedule of planned readings and topics. Students are responsible for keeping informed of modifications or additions to the schedule. Unannounced quizzes may be given at any time.

M.A./Ph.D. Students: Note that some of the readings below are marked as optional.  MPA students are encouraged to read these as well, but for them they are optional.  Ph.D. students, and M.A./MPA students interested in pursuing the Ph.D., are expected to read all of the optional readings. Likewise, if an optional reading is relevant to your research design (e.g., if you intend to do time series analysis), you should do the optional readings on that topic too.

* = Recommended for Ph.D. Students

TOPIC ASSIGNMENTS DATE

INTRODUCTION: 9/6, 9/13

What is scientific program evaluation? 

P&C, Ch. 1
Bingham & Felbinger, Ch. 1. 
Miller & Kobayashi.  Citizen Surveys: Chs. 1, 6

*Shadish et al. Ch. 1, "Experiments and Generalized Causal Inference," in William R. Shadish et al., Experimental and Quasi-Experimental Designs for Generalized Causal Inference.  Boston:  Houghton-Mifflin Company, 2002.

What is scientific program evaluation? What different types of evaluation exist? When is it appropriate to use them? Why are evaluations done? What are the stated reasons? What latent reasons exist? At this point, do you have any idea of how "performance measurement" and its many forms (e.g., the balanced scorecard) relate to program evaluation?

Homework:  What, if any, systematic program evaluation is being done by some public or nonprofit organization with which you are familiar?  Inquire into this question and be ready to discuss at the next class. 


PLANNING EVALUATIONS and Needs Assessment:  9/20

Review Definitions of Evaluation and Above Readings
P&C, Ch.2
P&C, Ch.6

Wholey.  Ch.2. "Assessing the Feasibility and Likely Usefulness of Evaluation."  Pp. 15-39 in Wholey et al., eds., Handbook of Practical Program Evaluation (San Francisco: Jossey-Bass Inc., 1994).  

Quintanilla. "Head Start reformers say that program has lost its way."  Chicago Tribune, 5/20/01.  Read this article and be prepared to discuss how you would plan an evaluation of this program if you were put in charge of such an effort.  What steps would you take?  Is it an evaluable program? What would a "logic model" or flow-chart look like for this program? 

What are the main steps that need to be taken to plan an evaluation? 
Why is it necessary to conduct an evaluability assessment prior to evaluating a program?
What is evaluability assessment? What methods are employed in evaluability assessment?   What steps do you take to implement an evaluability assessment?  What is the end product(s) of an evaluability assessment?
What are the key problems with identifying and measuring goals?
Can a program be viewed as theory? Explain. What is the difference between "theory failure" and "implementation failure"?
What is the difference between the rhetorical program and the actual program?

Homework due this week (5%): Construct a flowchart linking a program's resources & activities to the outcomes desired for some program with which you are familiar in a manner similar to that used in the Wholey chapter.  
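To illustrate (a hypothetical example only, not a required format): for a municipal recycling program, such a flowchart might run from resources (budget, collection trucks, outreach staff) to activities (distributing bins, weekly curbside pickup, public education) to outputs (households served, tons of material collected) to intermediate outcomes (higher household participation rates) to long-range outcomes (reduced landfill tonnage and disposal costs).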

Methodological Issues and Qualitative Approaches to Evaluation. 9/27
Review  homework (flowchart--evaluability assessment)
P&C, ch. 3
P&C, ch. 4
P&C, ch. 8
B&F, Ch.3: Measurement

Duncan, Greg J. & Gibson, Christina. 2000 (January-February)"Qualitative-Quantitative Interactions in the New Hope Experiment."   http://www.jcpr.org/newsletters/vol4_no1/index.html (note: You have to skim down to get to the Duncan & Gibson article--there are several articles included in this file). 

Dunbar, et al. "Psychological and Spiritual Growth of Women Living with HIV."  Social Work, 43, 1998, pp. 144-154.  

Note: there are several other articles in the same group as the Duncan article and please skim through these other examples of qualitative research.


*Shadish et al. Ch.3 "Construct Validity and External Validity." 

What are the data sources that can be used for evaluation? What is the relationship between information systems and evaluations?
What are the definitions (as related to program evaluation) of validity and reliability?  
How do you measure reliability?  How do you determine validity?  
What is the key difference between internal and external validity?  
What are the strengths and weaknesses of quantitative vs. qualitative approaches to evaluation? Which do you find  more useful?  Can they be integrated? How? 
What is triangulation (as related to evaluation)?

THREATS TO VALIDITY 10/4

Present and discuss homework assignment.
B&F.  Ch.2. 
P&C, Ch.9
Talbot.  "The Placebo Prescription."  New York Times, Jan. 9, 2000. 
Campbell, "Reforms as Experiments."  American Psychologist, 24(4), 1969, pp. 409-429.  Read thoroughly 174-176 and 198-200.
Langbein,  Ch.3 (pp. 31-39) "The Principles of Causal Inference" and Ch. 4 "The Practice of Causal Inference" (pp. 40-49) in Laura Langbein, Discovering Whether Programs Work (Santa Monica:  Goodyear Pub. Company, 1980).  

*Shadish et al. Ch.2 "Statistical Conclusion Validity and Internal Validity." 

What are the threats to internal validity? What are the threats to external validity?
What threat(s) are controlled for by the use of statistical tests?
How can these threats be manipulated to make a poor program look good or a good program bad?

Graded Quiz (10%)  Note: You need to memorize the threats to internal and external validity and be able to cite an example of each. You will be quizzed on these during this class.

ONE GROUP PRETEST-POSTTEST DESIGN.  10/4
P&C, Ch.9
B&F, Part IV Introduction & Ch.11
Lurigio et al. "HIV Education for Probation Officers:  An Implementation and Evaluation Program."  Crime and Delinquency, 37(1), 1991, 125-134.

One Group Posttest Only Design (Case Study). 10/4
Frecknall & Luks.  An Evaluation of Parental Assessment of the Big Brothers/Big Sisters Program in New York City.  Adolescence, 27(107): 715-718.

*Shadish et al. "Ch.4: Quasi-Experimental Designs that either lack a control group or lack pretest observations on the outcome."  pp. 103-134.

EXPERIMENTAL APPROACHES. 10/11

P&C, Ch.11
B&F, All of Part 2 including overview and Chs. 5-7.

Skim the publication on the Kansas City Preventive Patrol Experiment.  It is available at the Police Foundation website, http://www.policefoundation.org/, under electronic publications.  Read Part I (pp. 1-9) and skim through the other parts.

Skim through the following and be ready to discuss:
The Police Foundation. Police Response to Domestic Violence. (Available under "publications" and then "ideas in American policing" at http://www.policefoundation.org/docs/domesticresponse.html)
The Police Foundation. The Newark Foot Patrol Experiment. (Available under "publications" and then "ideas in American policing" at http://www.policefoundation.org/docs/newark.html)
Eck. Reducing Crime and Drug Dealing by Improving Place Management. http://www.ncjrs.org/txtfiles/fs000235.txt

Homework due this week:  Be ready to discuss the following:  Is there some public program or activity with which you are familiar  that feasibly could be evaluated by the experimental method?  Why or why not? 

QUASI-EXPERIMENTAL DESIGNS (OVERVIEW) 10/18

P&C, Ch.10
B&F.  Part 3 Introduction.

Campbell: review the whole article, and review each design as we discuss it.

What is the concept of a "quasi-experimental" design? In what key way(s), does it differ from an experimental design?

*Shadish et al.  "Quasi-Experimental Designs that use both control groups and pretests." 

PRETEST-POSTTEST COMPARISON GROUP DESIGN (also known as the Non-Equivalent Comparison Group Design) 10/18
P&C, Ch.10
B&F.  Chs. 8 & 10. 

Sherman.  The Kansas City Gun Experiment.  (downloaded from National Institute of Justice website).  Available from:  http://www.ncjrs.gov/pdffiles/kang.pdf

What threat(s) to validity does this design control for? What threats doesn't it control for? What are the crucial aspects of implementing this design?


TIME SERIES (Simple Interrupted Time Series and Multiple Interrupted Time Series or Comparison Time Series) 10/25

Preliminary Design Due

Review P&C, Ch.10, 196-201
B&F, Chs. 9 & 12
CAMPBELL, 176-188
Rock, "Impact of 1980-1983 Selected Traffic Enforcement Program on Traffic Accidents in Batavia, Illinois".  409-419.  Eval Review, 13(4) (1989), 409-419 (reserve)

*McCain & McCleary. "The Statistical Analysis of the Simple Interrupted Time Series Quasi-Experiment." Pp. 233-253 in Thomas Cook & Donald Campbell, eds.
*Lewis-Beck, Michael S.,  pp. 209-240 in  New Tools For Social Scientists. William D. Berry and Michael Lewis-Beck, eds. Beverly Hills: Sage Publications. 1986. 

What are the strengths of this design compared with the PPCG design?
What are the crucial aspects of planning these designs?

Homework: Enter data for time series design into SPSS and run regression analysis to analyze data. Bring output to NEXT class.
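To give a rough sense of what this involves (a minimal sketch only; the variable names and data below are hypothetical, not the assigned dataset), SPSS syntax for a simple interrupted time series regression typically includes a time counter, a 0/1 intervention dummy for the program change, and a post-intervention trend term:

* Hypothetical sketch: simple interrupted time series regression.
* month = time counter; accidents = outcome variable.
* intervention = 0 before the program change, 1 after.
* time_after = number of periods since the program change.
DATA LIST FREE / month accidents intervention time_after.
BEGIN DATA
1 42 0 0
2 39 0 0
3 45 0 0
4 41 0 0
5 30 1 1
6 28 1 2
7 27 1 3
8 25 1 4
END DATA.
REGRESSION
  /DEPENDENT accidents
  /METHOD=ENTER month intervention time_after.

In output of this kind, the coefficient on intervention estimates the immediate shift in the level of the series, and the coefficient on time_after estimates any change in trend after the program began.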


META-ANALYSIS, META-EVALUATION, AND EVALUATION 11/1

Discussion with Instructor about Preliminary Design will be done this week. 

B&F, Part VI introduction (Other Designs) and Ch. 16

Read B&F, Part VIII, and write up your assessment: which of the reports on the Milwaukee Experiment do you find more convincing? Why? Be ready to discuss. 

*Shadish et al. Ch.13: "Generalized Causal Inference: Methods for Multiple Studies."
 

How can we combine the results of many different evaluations? How do we overcome the "onceness" problem that most evaluations suffer from?  What is external validity and how does it differ from internal validity?   What is Type 2 Error?   How can the probability of Type 2 error be reduced? What are the major criticisms of meta-analysis?  What are the defenses to these criticisms made by researchers who use meta-analysis? 

Is it possible for a program effect to be statistically significant but programmatically insignificant?  If it is, what steps and/or measures would you take to determine if it is programmatically significant?


PATCHED DESIGNS 11/1 

B&F.  Ch. 15. 
What is a patched design?  Can you incorporate this approach into your design? 


NON-EXPERIMENTAL DESIGNS & MULTIVARIATE STATISTICS.  11/1
P&C, Ch.8. 

Police Foundation. Police Women on Patrol. (Available under "publications" and then "ideas in American policing" at http://www.policefoundation.org/docs/policewomen.html; uses a matching approach.)

*Shadish et al. Ch. 11 ("Generalized Causal Inference: A Grounded Theory") and Ch. 12 ("Purpose Sampling and Generalized Causal Inference").

Review: What are the differences between experimental, quasi-experimental, and non-experimental designs?

How can multivariate statistics like multiple regression be used to control for threats to validity?
 

Survey Research (and Related Techniques) & Evaluation  11/8

Miller & Kobayashi.  Citizen Surveys.  Chs. 2-5

Ray, Nina M. & Tabor, Sharon W.  Cybersurveys come of age.  Marketing Research 15(1), Spring 2003, p. 32-37. ISSN: 1040-8460. Retrieved 12/18/04 from WilsonSelectPlus. 

Homework (5%) (due this week): Construct an original survey containing at least 10 questions concerning something relevant or interesting to your job (or some interest) and discuss how you would go about implementing the survey so that it would obtain as accurate and valid results as possible given limited resources. Be sure to define your theoretical population of interest and sampling frame. Discuss how you would decide on sample size to use. Discuss what sampling method you would use. Discuss how you would implement it to obtain a high rate of return.
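As a rough planning guide (a standard textbook formula, offered only as an illustration and not as a course requirement): for a simple random sample used to estimate a proportion with 95% confidence and a margin of error of plus or minus 5 percentage points, the conventional formula n = z^2 * p(1-p) / e^2, with the conservative assumption p = 0.5, gives n ≈ (1.96^2 × 0.25) / (0.05^2) ≈ 384 completed surveys; the number of surveys mailed or attempted would then need to be inflated to allow for the expected response rate.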

Performance Monitoring, Balanced Scorecard,  & Evaluation  11/15

P&C, Ch. 7
Review P&C, Ch. 12
Police Foundation-Willis et al. Compstat in Practice:  An In Depth Analysis of 3 Cities.  (Available online at http://www.policefoundation.org/ under their "electronic publications" section.) 

Eagle, Kim.  "Translating Strategy into Results:  Public Sector Applications of the Balanced Scorecard: The Origin and Evolution of Charlotte's Corporate Scorecard."  Government Finance Review, 20(5), pp. 16 & 19-22. ISSN:  0883-7856.  Retrieved from Wilson Select Plus, 12/18/04. 

Optional Reading: For more detailed examples of the balanced scorecard, look through the following. 

City of Charlotte, North Carolina.  Strategic Planning Handbook.  (Skim for parts that involve discussion of balanced scorecard and performance measurement as applied to municipalities).  Accessed 12/16/03 from:  http://www.charmeck.org/NR/rdonlyres/e6mgght3nmd7wv4rzkgjy7bp4iam7ujlgwzvhm4wormfvnmeyc2mudvhgy7vlugjdnzmappvfkcsulvsacuz4ovlxod/Strategic+Planning+Handbook.pdf

(If above URL does not work, check out Mecklenburg County. IT 2003 Report & 2004 Plan and/or other publications related to balanced scorecard at  http://www.charmeck.org ). 

Be prepared to discuss the following:

How do performance monitoring and instruments such as the "balanced scorecard"  relate to program evaluation?  What are the differences and similarities?

Study the information system of your organization in relationship to the evaluation of one (or more) programs. What performance-monitoring information, if any, is available in the computerized databases of your organization's MIS, or through some other method, such as a survey that has not been incorporated into your MIS? Is it used for this purpose?  (Rather than doing this for an entire organization, you may do it for a single program.) 

Cost Benefit & Effectiveness Approaches to Evaluation 11/15

P&C, Ch.12
Rocheleau. Handout. 
B&F.  Part IV.  Read Chs. 13-14 as well as the introduction.  Be ready to discuss. 
 

Homework:  Has your organization conducted some cost effectiveness and/or cost benefit analysis?  If so, bring in an example of it.  If possible, post to newsgroup. 

Ethical  Issues and Possible Dysfunctional Consequences 11/15
P&C, Ch. 5
Ginsberg.  "Dysfunctional Side Effects of Quantitative Indicator Production."  Evaluation and Program Planning,  7 (1984), pp. 1-12 (reserve).
Chicago Tribune Editorial. "An 'A' for everybody!" July 14, 2006. http://www.chicagotribune.com/news/opinion/chi-0607140270jul14,1,2030288.story (under external links)

Download the forms and guidelines of NIU for Institutional Review of Research Involving Human Subjects.  Does your proposed study require these forms to be filled out?  Be prepared to discuss and defend.  If your proposal would require such approval, then fill out the appropriate forms.  The forms are available at the following campus website of the IRB (Institutional Review Board) of the Office of Research Compliance of the NIU graduate school:   http://www.grad.niu.edu/orc/irb_homepage.htm 

Please download and go through the form that needs to be filled out.  Bring both to class and be prepared to discuss.

For next class: I will ask each student to obtain at least one capstone/starred paper that is most relevant to their own paper, to post comments on the paper(s), and to be ready to discuss the paper (both substance and methodology) at the next class.  Note that the secretaries may require notice (24-48 hours) to obtain the paper, so this requires some planning. 

Utilization of Evaluation & Politics & Evaluation.  11/29

Discuss the capstone/starred paper(s) most relevant to your own paper.  Every student is to find at least one capstone paper that is relevant to their proposed study, write up a brief discussion of how it pertains to that study, and be ready to discuss it at class. 

P&C, Chs. 13, 14, & Appendix pp. 284-296
Miller & Kobayashi, Chs. 7-8
Rossi and Lyall, "The External Politics of the Negative Income Transfer Experiment." Pp. 286-295 in Francis G. Caro, ed., Readings in Evaluation Research, 2nd edition. New York: Russell Sage Foundation, 1977.
Note: Remind me to hand out a xeroxed copy of page 293, which is missing from the electronic reserve copy of this file.
Vogt.  "Now, many 'just say no to DARE' in schools."  Chicago Tribune.  January 26, 2003.
Karen Fisher. Bush Budget would abolish Spending for Voc Ed program. (Feb. 7, 2006).  Chronicle of Higher Education.  http://chronicle.com/daily/2006/02/2006020703n.htm  (under external links)

If your organization has conducted some evaluation, discuss in class how it went about utilizing the evaluation. 

11/29: Beginning of presentations of designs, if necessary.  This class may also be used for individual consultations with the instructor about research designs, and as a catch-up day if needed. 

12/6:    Research Design Due; Presentations of Designs.

12/13:    Final Exam