Degree Programs

Table of Contents

  1. Overview and Introduction to Assessment
    1. Defining assessment
    2. Purpose of handbook
    3. Assessment and UALR mission
    4. Assessment and our program missions
    5. Assessment and accountability
    6. Assessment and program review
    7. Organization of this handbook
    8. Additional Resources
  2. Assessment process and procedures at UALR
    1. Plans and progress reports
    2. Approval and review
    3. College-level assessment
    4. Institutional-level assessment
    5. Annual deadlines
    6. The assessment cycle
  3. Six steps to assessment
    1. Tips for success
  4. Assessment Plans: Steps 1 and 2: Goals and Objectives
    1. Definitions: goals and objectives
    2. Connection with program, college, and university mission
    3. Examples from plans
    4. Plan or progress report form instructions
  5. Assessment Plans: Step 3: Connecting objectives to curriculum
    1. Curriculum and assessment map
    2. Identifying points of assessment
    3. Plan or progress report form instructions
  6. Assessment Plans: Step 4: Tools, Methods and Design
    1. Assessment tool types
    2. Issues in selecting tools
    3. When to assess
    4. Samples and sampling methods
    5. Rubrics
    6. Resources needed
    7. Implementation or the assessment cycle
    8. Stakeholder Involvement
    9. Plan or progress report form instructions
  7. Assessment Progress Reports: Step 5: Reporting Findings
    1. Connecting findings to goals and objectives
    2. Interpreting for different audiences
    3. Progress report form instructions
  8. Assessment Progress Reports: Step 6: Using Results
    1. Connecting decisions to results
    2. Communicating results and inclusion in decision-making
    3. Progress report form instructions
  9. Reviewing Assessment Plans and Progress Reports
    1. Review versus approval
    2. Level of approval or review
    3. Review rubrics
    4. Instructions for review checklists
  10. Appendix A
  11. Appendix B
  12. Appendix C

Assessment Handbook: Degree Programs



I. Overview and Introduction to Assessment

Defining assessment:

“Assessment” is a word that has multiple meanings, both within the educational context and outside of it. First and foremost, assessment is about student learning. As teachers, we are constantly assessing individual students’ performances against explicit and implicit benchmarks. When we “assess student learning outcomes,” we are stepping back to get a bigger picture of learning than is represented by one student in one class section. We are trying to assess how our students as a whole are achieving across courses and throughout our curriculum. The goal is not to determine the success or failure of an individual student, or that of an individual faculty member, in a specific course. Rather, the question is: is our curriculum as a whole producing the types of graduates that we want to be known for? If so, where are the successes and how can we build on those strengths to make our programs even more successful? If not, where are the deficiencies and how can we remedy them? Assessment of student learning provides the tools and opportunity to help us achieve our educational goals.

Purpose of handbook:

Assessing student learning outcomes is a vital part of our teaching and learning mission; but how does this work? This handbook represents an effort to provide faculty with everything that they need to know to pursue a successful student learning assessment program. A “successful” assessment program provides faculty with the information that they need to continuously improve the teaching and learning experience for their students.

Each section of this handbook addresses a different part of the assessment process. It is not meant to be prescriptive, but rather to provide a pool of resources to help programs develop plans that best fit their needs and circumstances. But first, let’s take a moment to remind ourselves of why assessing student learning is important.

Assessment and the UALR mission:

The University of Arkansas at Little Rock’s mission states that, “… The University has a responsibility to provide excellence in instruction to ensure high-quality education for our students. This responsibility includes developing faculty teaching skills, awareness of the ways students learn, assessing student learning outcomes, and enhancement of resources to support effective instruction.” (Adopted by UALR Faculty Senate, 1988.) Assessing student learning is the key to evaluating our effectiveness in achieving this part of our stated mission. As such, assessing student learning is everyone’s business and responsibility.

Assessment and our program missions:

In addition, several colleges and programs have mission statements that include the importance of a quality education for our students. Here are some examples:

In the College of Education:
Counseling, Adult, and Rehabilitation Education states that it: “…provides a quality education to a heterogeneous student body at the undergraduate and graduate levels. The goal is to prepare students with the special knowledge and skills for professional work in education and human service fields, and to enhance students’ abilities to be successful in their chosen fields of study.”

Department of Teacher Education (College of Education) states that its mission is to: “… provide balanced teacher education programs that embody institutional and college goals, the Arkansas Department of Education teacher licensure requirements, guidelines of learned societies and professional associations, and contemporary educational philosophies and practices.”

In the College of Arts, Humanities, and Social Sciences:
Department of Sociology, Anthropology, and Gerontology states it strives to “…teach students to analyze and understand basic sociocultural processes, statuses, and roles; prepare students for careers and graduate study in the various fields and sub-fields; contribute to the liberal arts training and knowledge of all undergraduates; and provide intellectual and skills background for students considering careers in the professions and in business.”

Department of History states that it: “…serves the university mission by introducing students to the study of the human past. This study involves not only knowledge about historical events but also an understanding of the causes and processes involved in the growth and development of cultures over time, an awareness of the function of change and continuities in past societies, and an appreciation of and respect for the many varieties of human experience across cultures and over time. The craft of the historian includes the critical analysis of texts and arguments, the interpretation of evidence, research conducted in a variety of media, and clear and effective written and oral communication. These skills prepare students for vocational and professional opportunities in a variety of fields and also enable them to be life-long learners. In addition, a major in History helps them to become thoughtful and effective citizens of an increasingly interconnected world.”

In the College of Business:
The Department of Accounting program aims to: “…provide the foundation for a professional career in public, corporate, or governmental accounting. In addition, the curriculum meets the needs of students planning graduate study in accountancy.”

The primary mission of the Department of Marketing and Advertising “…is to prepare students for a professional career in marketing and/or advertising in the private and public sectors. The knowledge, analytical skills, and technical expertise required of marketing professionals are emphasized. The curriculum also provides a solid foundation for students planning graduate study in marketing and business.”

In the Donaghey College of Information Science and Systems Engineering:
The Department of Computer Science “…seeks to prepare students for careers as computer specialists in Systems Analysis and programming, to enter careers in computer or software design, and for advanced study in Computer Science.”

Department of Information Science (CyberCollege) includes in its mission statement that:

  • “It educates its students in the use of current information and communication technologies and in the design and development of information processing systems. It focuses on the transformation of data into useful information.”
  • “It offers degree and certification programs that prepare graduates to make more effective use of information and to improve information technologies.”
  • “It conducts research to advance the field of Information Science in a way that supports applications of information technologies.”

In the College of Science and Mathematics:
The Department of Mathematics and Statistics includes in its mission: “…The primary goal of the programs is to give students the mathematical knowledge and understanding necessary for successful careers in business, industry or government, teaching mathematics at the secondary school level, and pursuing graduate studies.”

The Department of Health Sciences’ mission is: “To provide an educational foundation preparing our students to develop into successful professional educators, or specialized health, fitness, and wellness programmers in campus, corporate, industrial, commercial, community, and clinical agencies. Cognitive course work, teaching methods, educational foundations, skill acquisition and practical teaching experiences are just part of the educational process. These experiences, in concert with a basic liberal arts curriculum, afford the individual the opportunity to become a certified secondary health and physical educator. An additional goal of Leisure Science and the Wellness Program is to make available a healthy lifestyle for UALR students, faculty, staff and the community through lifetime activities.”

In the College of Professional Studies:
The School of Mass Communication states that it aims to prepare: “…academically sound and technically proficient students. The School provides them with the theoretical and practical knowledge to become perceptive media consumers and effective media practitioners in broadcasting, media production and design, film, journalism, and public relations, as well as in areas created by new and emerging technologies. The School also actively participates in community outreach, and it instills in its students high degrees of social responsibility and career-related professionalism, coupled with a life-long desire to learn through post-graduate education and experience.”

The Department of Speech Communication works to: “… meet the needs of students with an interest in communicative behavior in business, education, industry, and other professions. Students are trained in interpersonal and public communication skills related to human relations and organizational communication.”

In the Division of Educational and Student Services:
“UALR is Arkansas’ metropolitan university. Embedded in its philosophy is providing access to a diverse student population. The Division of Educational and Student Services continues to provide quality support and academic services to UALR’s constituencies, and maintains multiple relationships with academic departments, students, and community groups. We continue to find creative ways to sustain and, in many instances, improve services.”

What is your program’s mission? What are the overarching goals that guide which degrees you choose to offer and what curriculum you design for those degrees? This is where to start in developing an assessment plan.

Assessment and accountability:

There are also external drivers behind the assessment process. As a state-assisted institution, we are accountable to the people of Arkansas through several state agencies and bodies. Thus assessment plays a major role in program review. We are also accountable to our peer higher education institutions through the process of accreditation—as a campus, and as colleges or programs within our disciplines. All of these external stakeholders are calling for data to demonstrate that students are achieving the institutions’ learning goals; and, if they are not, that we are making progress toward creating an educational experience that should support that achievement.

Assessment and program review:

Program goals and learning goals, program review and program assessment: how do they fit together? Every program has goals that guide decision-making and define success. These include but are not limited to learning goals. For example, it might be a program goal to increase the number of majors. A learning goal would state what you wanted those majors to learn.

Every program undergoes periodic review—either internally or through an external accrediting agency. During that review, programs must present their goals and demonstrate how well they are being met. Within that report, they must address how well they are meeting their learning goals. The degree program assessment plan and progress reports can form the basis for the student learning assessment piece of that program review. Bottom line: Assessment of student learning outcomes is an important part of program review—but it is still only one part.

Organization of this handbook:

In the following sections of this handbook, we hope to provide you with tools that will assist you in developing, implementing, and maintaining an assessment process that will help you reach your program goals. In order to do that, this handbook has drawn on the efforts of faculty and professional staff from across campus. And in order to keep doing that, your feedback and suggestions for improvement are vital. Please contact the Provost’s Office with any comments, suggestions, or corrections.

Additional resources:

You can find links to additional resources both on and off campus through the Assessment Central website.



II. Assessment process and procedures at UALR

Plans and progress reports:

All assessment efforts start with a plan which lays out what an individual program wants to accomplish. Plans include statements of goals, objectives, tools, methods, and how the results will be evaluated and connected with decision-making. Annual progress reports state what part of the plan was accomplished, the results obtained for that year, and how those results are/will be used to make decisions about the program. All degree programs—undergraduate and graduate, including certificates—must have an assessment plan in place. All degree programs are to submit an annual progress report to their college assessment committee.

Approval and review:

Plans are subject to approval and progress reports are subject to review. These processes fulfill several functions. First and foremost, they provide feedback to faculty to help them fine-tune their assessment program so that it will maximize their success in answering the questions about student learning that matter most to them. Second, annual reviews help to ensure that individual programs don’t “drift” away from keeping the assessment plan moving forward. Finally, annual reviews help to ensure that programs will be ready when the time for program review and/or accreditation rolls around. (The year that the accreditation agency comes is not the year to start collecting student learning data!)

Assessment plans for new degree programs are approved at the college level first and then at the university level by the Provost’s Assessment Advisory Group (PAAG). The college-level evaluation typically focuses on college or discipline-specific accountability or accreditation needs. The PAAG provides feedback to ensure that the plan will provide acceptable information at the institutional assessment level, meeting the accreditation standards set by the Higher Learning Commission of the North Central Association. Approval of the assessment plan by PAAG must accompany any request to the Undergraduate or Graduate Council for a new degree program. (More on the approval process in Chapter IX.)

Revisions of existing assessment plans and annual degree program progress reports are reviewed at the college level only. Feedback should center on the effectiveness of the assessment program itself, not the actual results obtained. Degree programs that identify weaknesses are just as “successful” in terms of assessment as programs that identify strengths. A “failed” assessment program is one in which the results are not being used for improvement and identified weaknesses are not being addressed. The college assessment committees provide an annual report to PAAG, on behalf of the Provost, that summarizes the review process within the college that year. That summary report and the annual progress reports are then posted to the college assessment webpage.

Finally, the question sometimes arises of whether an assessment plan also needs approval by our Institutional Review Board. Typically it does not, provided that the data gathered will be used for internal purposes to evaluate teaching effectiveness. The answer is less clear if part of your plan includes educational research that could be disseminated publicly. Whichever is the case, all information provided by participants in learning outcome assessment should be treated confidentially, with the identity of any given participant and his or her performance obscured and protected.

College level assessment:

Each college assessment committee produces an annual summary of its degree program progress report review process, including the identification of strong and weak plans and strategies to help those programs improve their assessment efforts. These summaries are posted on the college websites, along with the most recent version of the progress reports themselves. (Some colleges have chosen to include links to previous progress reports as well.) Individual reviews are for internal use only and should not be posted on a public-access website.

In addition to assessing learning outcomes in a manner that addresses the accreditation standards of the Higher Learning Commission of the North Central Association of Colleges and Schools, several colleges must also meet assessment standards set by more discipline-specific bodies. Examples of some discipline or college-level accreditations currently in force at UALR are the National Council for Accreditation of Teacher Education (NCATE), the Association to Advance Collegiate Schools of Business (AACSB), and the Accreditation Board for Engineering and Technology (ABET).

Institutional level assessment:

Each summer the Provost’s Assessment Advisory Group (PAAG) meets after all of the college-level degree program progress reports have been reviewed and summarized. PAAG then produces an annual report that assesses the current state of the assessment process at UALR. These annual reports are posted on the Assessment Central website, which is maintained by the Provost’s Office.

Every ten years UALR compiles information about assessment of student learning across campus in preparation for our campus-wide accreditation by the Higher Learning Commission of the North Central Association. All degree program plans and progress reports are included, as well as the annual college and PAAG reports.

Annual deadlines:

Plans are submitted for approval whenever a new degree program is proposed. Plans can be submitted at any time of the year.

Progress reports are submitted in the spring to the college assessment committees, which review the reports, provide written feedback to the programs, and then post the progress report summary on the college assessment website. Colleges also post summaries of the overall ratings of the programs, pointing out the strongest and weakest degree programs and what will be done to help the latter. Colleges also include in those summaries any other college-level assessment activities, concerns, issues, and/or achievements for the past calendar year.

The data presented in progress reports are from the preceding calendar year. This is to allow faculty sufficient time to not only collect but to analyze, interpret, and make decisions based on those findings.

The assessment cycle:

Plans should be written with an assessment cycle in mind. How many years do you anticipate it will take to address all of the goals and objectives in your plan? It may be feasible, and desirable, to assess every goal and objective every year. More complex plans may be more successful if they are implemented over a period of three to five years, with certain objectives targeted in each year of the cycle and all objectives addressed by its end. The final year of the cycle is spent evaluating all results obtained for all of the objectives over all of the years since the last plan was written.

The “progress report” for that concluding cycle year is a revised plan which includes an implementation plan for the next assessment cycle. You could think of your assessment plan as being like a grant for a research program. Grant applications are written and approved for a finite time period, with periodic progress reports and a final report, and the last year of the grant is often spent submitting the next multi-year proposal. Not all objectives are addressed in every year, and the results from past research are used to inform the next stage in the research program.



III. Six steps to assessment

The following chapters are organized around six basic steps in creating an assessment plan:

  1. Identifying student learning goals for the degree program.
  2. Stating learning outcomes or objectives.
  3. Connecting learning objectives to the curriculum.
  4. Identifying assessment tools, methods, and design, including implementation timeline and rubrics.
  5. Analyzing, interpreting, reporting and communicating findings.
  6. Using findings for degree program improvement and decision-making.

The first four steps are included in assessment plans. Progress reports focus on the last two. At the end of each chapter you will find instructions on how to complete the corresponding section of the degree program plan or progress report forms.

Tips for success:

#1. Make it meaningful. The purpose of assessment is to provide information for decision-making and curricular improvement. That means you must start by deciding which questions are the most important to ask and which information would be the most useful to have about your educational program.

#2. Keep it simple. Less may really be more when it comes to assessment. Faculty, staff, and students are all busy people. You are most likely to succeed if you limit your questions to the ones that really matter the most.

#3. Ask questions about factors over which you have control. It may be interesting to know that, for example, students who work more than 20 hours a week are less successful in your program; but is that anything over which you can exert control? Are there any changes or adjustments that you could really make that would address this? If not, then the information may be important but not useful for the purposes of program improvement.

#4. Create reasonable deadlines and assessment cycle timelines. It is important to create realistic deadlines for accomplishing your assessment plan, including all of the stated learning outcomes or objectives. You should think in terms of a three- to five-year cycle. If it won’t be reasonable to accomplish everything you want during that length of time, perhaps you should re-think your approach and focus on the most pressing concerns first. Less pressing questions can wait until the next cycle.

#5. Make it measurable. Assessment programs will be most effective when the objectives are written in a way that will make it transparent whether or not students have met your benchmark. The more guesswork or judgment is involved, the more room there will be for disagreement about what the results are, what they mean, and what should be done in response to them.

#6. Use a mix of multiple measures. Relying mainly on one or two achievement measures can be a risky strategy. Sometimes programs place all of their faith in a single test score or survey outcome. Multiple measures increase one’s chances of getting valid information that approaches a target from different angles.

#7. Use embedded, double-duty activities. Don’t add extra activities to either faculty or student workload simply for the sake of learning outcome assessments. You are more likely to get valid measures of student achievement if you use activities that already have meaning for students. Need a writing sample? Find one in a required class where students will be motivated to produce the best work of which they are capable.

#8. Remember that it is your program and therefore your plan. Yes, external stakeholders have provided a lot of the motivation for assessment; but in the end it is your program and your plan. You will have to live with the results, so it is important that everyone invested in the program is involved in implementing the plan and discussing the implications of the results. Plans are only useful to the extent that they are used, and they are only used to the extent that people whom they affect have a voice in them. On the other hand, remember that consensus does not mean unanimity. When people disagree about what should be in a plan, that can be an opportunity for discussion and clarification about the program mission, goals, or objectives. If the results are ambiguous, that can be an opportunity to reframe the objectives in a way that will be more measurable.

#9. Remember that the only “failure” in assessment is the failure to act. Successful assessment does not mean that the findings indicate that all is “rosy” and nothing needs changing. Successful assessment is when the findings are meaningful and point toward action. What matters is that assessment should lead to a process of continuous quality improvement.

#10. Consolidate your assessment efforts. We all have a need to do assessment for purposes of campus-wide accreditation and accountability. But several units also have college or program level accreditation requirements that include assessing student learning; so think in terms of how one plan can fulfill both audiences. (Please contact the Provost’s Office for assistance in aligning your discipline accreditation requirements with the campus-wide accreditation requirements.)



IV. Assessment Plans: Steps 1 and 2: Goals and Objectives

Before we can start collecting data, we need to decide: what are the most important questions that we want to ask? What is it that we are trying to accomplish with our students? How do we define success for our graduates? If you think of assessment as a research project, these are the hypotheses that we will be testing in the experimental design that follows. Without a clear model and set of hypotheses, a research program will break down into a series of disconnected studies. With a model and framework of hypotheses to guide our work, our efforts are more likely to pay off in a new body of knowledge. So we need to start by identifying the major assessment questions or hypotheses, i.e., the learning goals and objectives that we are going to test.

Definitions: goals and objectives:

Learning goals are the clearly articulated statements of the desired end results of the curriculum. Student learning goals are the general skill or knowledge categories you want your students to achieve. These are the standards for excellence for a given degree program, and they represent the program curriculum’s core competencies. For example:

    “When they complete our B.A. degree program, students should be able to:

  • Understand and apply major theories and trends to contemporary problems in the discipline.
  • Apply appropriate research methodologies.
  • Communicate professionally.
  • Adhere to discipline-specific ethics.”

Learning goals themselves should be founded on the overall program mission. Why should your students be able to communicate professionally if your program’s mission says nothing about that? If your program does not have a mission statement, then you may need to start there before you can articulate your goals. (For ideas, you could check out examples of some program mission statements listed in Chapter I.) You might also check out your professional association or similar programs across the country to see how they have articulated their program mission or goals.

Student learning goals form the foundation for any assessment plan, but articulating them can be the most difficult step in writing any plan. Assessment is intended to provide information to help us determine whether or not we are meeting our goals, but our goals are more often implicit than explicit. Everyone believes that we all “know” what we are trying to accomplish by having a particular requirement as part of the curriculum, but it is often surprising how much what one person “knows” differs from what another person “knows.” Goals should be the result of a consensus of all contributors to that curriculum—commonly referred to as the “stakeholders.” Stakeholders could include faculty, professional staff, students, alumni, employers, or segments of the larger central Arkansas community affected by the program’s learning goals. Identifying and stating the program’s goals can provide important guidance not only for assessment but in making many other decisions.

Finally, it is best to keep the list of goals short. From a practical point of view, every goal will need to be represented in your assessment plan so you don’t want to take on more than you can actually do. How many different goals does your program really have the resources to pursue? Goals should be comprehensive but also a realistic statement of what you actually intend to accomplish, not a wish list of everything that you could accomplish in an ideal situation.

Learning objectives or outcomes are the concrete, measurable manifestations of those goals. (In this document, we will use “objectives” and “outcomes” interchangeably. Some college or discipline-specific review bodies prefer one to the other. Please use whatever works best for your unit.) Objectives or outcomes identify the specific accomplishments to be achieved, the evidence that students have achieved your goals. These are the outcomes of the activities students encounter in their courses, not the activities themselves. Therefore it is important to:

  • Define objectives so you can recognize and measure them.
  • Identify specific and observable student behaviors, not teacher behaviors.
  • Describe outcomes, not processes.

Objectives or outcomes typically fall into one of three domains:

  1. What do we want our graduates to know? These cognitive outcomes could include:
    • Knowledge base
    • Intellectual understanding
    • Theories and models
    • Subject matter
    • Analysis and synthesis
    • Evaluation
  2. What do we want our graduates to be able to do? These behavioral outcomes could include:
    • Application to real problems
    • Communication skills
    • Technology skills
    • Problem-solving strategies
    • Information gathering skills
    • Critical thinking
  3. What do we want our graduates to value? These affective outcomes could include:
    • Professional ethics
    • Attitudes
    • Self-concept/identity

It is important to select a reasonable number of learning objectives for your assessment plan. Each goal should probably not be associated with more than four to six learning objectives or outcomes. Using the research program analogy again, how many different hypotheses do you really have the power to test at one time? Each objective should be stated in measurable terms. Assessment plans should, over the period of the assessment cycle, address every learning objective for every degree program goal.

Connecting with program, college, and university mission

University Mission > Program Mission > Program Goals > Program Learning Goals > Learning Objectives or Outcomes

The learning objectives are the concrete measurable manifestations of the program student learning goals. The program learning goals are a subset of the overall program goals. They focus on what you want your students to learn as a result of your degree program. Program goals can include other activities besides student learning that are essential to program success. For example, program goals could include: a) increasing community consultation; b) increasing enrollment; c) funding new construction; or d) increasing graduation rates. Program goals—including learning goals—should arise from and be connected to the program mission statement. Ultimately, the program mission should reflect the program’s role in accomplishing the university mission. This connection between mission, goals, and learning objectives or outcomes is very important and will be a major focus of accreditation reviews, especially the campus-wide HLC/NCA accreditation review.

Examples from plans

Degree program learning goals should be entered under Section 1 on the “Degree Program Assessment Plan” form (as explained in the “Degree Program Assessment Plan Form Instructions”). Then the learning objectives that correspond to each of the goals should be entered into Section 2. Please take care to align the objectives with their corresponding goals. (See Appendix A for the plan form and instructions.)

Plan or progress report form instructions:

For plans:

Goals should be entered under Section 1 on the “Assessment Plan” form. Then the learning objectives that correspond to each of the goals should be entered into Section 2. Please take care to align the objectives with their corresponding goals. (See Appendix A for the plan form and instructions.)

For progress reports:

Copy the goals and learning objectives statements from the plan to Sections 1 and 2 of the “Degree Assessment Progress Report” form. Only include those goals and objectives that are addressed in the current progress report. The purpose is to help the reader understand how this progress report fits into the overall plan; so the goals and objectives can be stated briefly. A link to the entire plan can be included for those readers who wish more information. (See Appendix B for the progress report form and instructions.)



V. Assessment Plans: Step 3: Connecting objectives to curriculum

Once you have settled on your learning goals and objectives, the next step is to consider how you will assess them. To do that you will need to identify at what points in the degree program’s curriculum you would get the best “bang for the buck” for your assessment efforts. At what point should you, for example, assess understanding of major theories in your field? An ability to apply professional ethics? In order to make these decisions, you need to have the “big picture” of when different program goals and objectives are addressed in your curriculum. And then you can identify the best points at which to assess them.

Curriculum and Assessment Map

A useful way to get the “big picture” for your curriculum is to construct a curriculum and assessment map. The idea of a map is really very simple. Columns represent the learning objectives for the degree program and rows represent the actual courses or other required activities. Each cell, then, is the intersection of the overall objectives with the specific courses.

Within each cell you will be asked to make a statement about the extent to which that course or activity addresses a specific objective. Is the “Emphasis” on that objective “None” or “Limited,” is there “Somewhat” more time spent on it, or is the emphasis in a given course intended to be “Extensive”? (The Curriculum Assessment Map is in Appendix A, attached to the Plan Form, along with instructions.)

This process can also help to identify redundancies or gaps in a curriculum. Here is an example for discussion that includes the four objectives for the first degree program goal. What does this map tell us about this curriculum?

Goal 1 (Emphasis)   Objective 1   Objective 2   Objective 3   Objective 4
Course A            Limited       None          None          None
Course B            None          Somewhat      Somewhat      None
Course C            None          Limited       Somewhat      None
Course D            Somewhat      Somewhat      Extensive     Somewhat

Are there objectives that aren’t getting the coverage that they should? Are there courses that either are not “pulling their weight” with respect to this goal or are being expected to cover too much? What would you say about the status of the curriculum as outlined in the table above?

This process of reviewing one’s curriculum with the learning goals and objectives in mind can be considered a form of assessment itself because it could result in changes being made to what is covered in which course and perhaps in what sequence. You might even decide to add or delete courses based on how they are fitting into the overarching degree program learning goals. When you are finished you will have a visual aid showing how each course contributes to the overall degree program learning goals and objectives. How else might such a visual aid be useful to your unit?
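
For larger curricula, it can also help to keep the map in a machine-readable form so that gaps and redundancies can be flagged automatically. Here is a minimal sketch in Python using the example map above; the thresholds (coverage in at most one course suggests a gap, coverage in three or more suggests possible redundancy) are illustrative assumptions, not campus standards.

    # Minimal sketch: scan the example curriculum and assessment map for
    # objectives that may be under- or over-covered. Thresholds are illustrative.
    curriculum_map = {
        "Course A": {"Objective 1": "Limited",  "Objective 2": "None",
                     "Objective 3": "None",      "Objective 4": "None"},
        "Course B": {"Objective 1": "None",      "Objective 2": "Somewhat",
                     "Objective 3": "Somewhat",  "Objective 4": "None"},
        "Course C": {"Objective 1": "None",      "Objective 2": "Limited",
                     "Objective 3": "Somewhat",  "Objective 4": "None"},
        "Course D": {"Objective 1": "Somewhat",  "Objective 2": "Somewhat",
                     "Objective 3": "Extensive", "Objective 4": "Somewhat"},
    }

    objectives = sorted({obj for row in curriculum_map.values() for obj in row})
    for obj in objectives:
        covered_in = [c for c, row in curriculum_map.items() if row[obj] != "None"]
        if len(covered_in) <= 1:
            print(f"{obj}: possible gap; covered only in {covered_in or 'no courses'}")
        elif len(covered_in) >= 3:
            print(f"{obj}: covered in {len(covered_in)} courses; check for redundancy")

Run against the example, this flags Objective 4 as a possible gap (it is covered only in Course D) and notes that Objectives 2 and 3 are each spread across three courses.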

Identifying points of assessment

The next step is to identify where in the curriculum it makes the most sense to assess each objective. Here is the above example but with possible points for assessment identified.

Goal 1 (Emphasis / Assessed)   Objective 1      Objective 2      Objective 3       Objective 4
Course A                       Limited / Yes    None / No        None / No         None / No
Course B                       None / No        Somewhat / No    Somewhat / Yes    None / No
Course C                       None / No        Limited / No     Somewhat / Yes    None / No
Course D                       Somewhat / Yes   Somewhat / Yes   Extensive / Yes   Somewhat / Yes

For example, do you want to assess Objective 1 in Course D or in Course A? Should Objective 3 be assessed in Course B, C, and D or in just one of them? What about Objectives 2 and 4? (For Section 4 you will decide exactly which tools you will use and enter those in the map instead.)

Completing such a map requires input and collaboration from all faculty involved in a given degree program. This discussion in itself is an important part of the assessment process: identifying just how our implicit assumptions about our curriculum are made explicit in what we are asking students to achieve at various points.

Plan or progress report form instructions:

For plans:

Each plan should have a curriculum and assessment map completed and attached as part of the response to Section 3 on the plan form (see Appendix A for map form and instructions). Degree programs may find it useful to start with a map for each required course to determine how the course learning goals and objectives fit within the degree program learning goals and objectives. However, individual course maps are not required for degree program assessment plans.

Each cell in which assessment data will be gathered should be noted. A more detailed description of each assessment data source or tool to be used should be included as part of Section 4 of the plan (more on that in the next chapter).

For progress reports:

Once the curriculum and assessment map is completed for the plan, it does not need to be attached to each progress report. However, you should copy and paste any information from Section 3 that is relevant to the results reported in a given progress report.



VI. Assessment Plans: Step 4: Tools, Methods and Design

At this point you should have a set of student learning goals (Step 1); measurable objectives based on those goals (Step 2); plus a “big picture” of how those objectives are addressed in your curriculum and at what points you would like to gather data about progress toward those objectives (Step 3).

Now it is time to decide on exactly what data you want to gather to help you answer your assessment questions or test your assessment hypotheses. What information will tell you whether or not a given objective has been met? How best can you obtain that information? Over what time frame? This is what will go into Section 4 of your plan.

Assessment tool types:

Direct methods involve the actual products of student work that can be evaluated in light of the learning outcomes. These may be the same materials for which students have been evaluated on an individual level (i.e., given a grade), but in this context one is looking across students at learning outcomes of groups. These methods could include:

  1. Course-embedded assignments (e.g., tests, papers, reviews, presentations)
  2. Standardized tests
  3. Locally developed tests
  4. Portfolio evaluation
  5. Thesis or dissertation
  6. Oral exams
  7. Focus groups

Indirect methods are not based directly on academic or work performance. These usually involve perceptions by either the student or someone in a position to observe the student’s work. These could include:

  1. Student satisfaction surveys
  2. Exit interviews
  3. Focus groups
  4. Alumni surveys
  5. Employer surveys
  6. External reviewer

Direct measures may be more time-consuming to collect and evaluate, but they can be more closely tied to the identified learning objectives. Indirect measures are less objective, but they are easier to collect and provide important information about the extent to which the degree program is meeting student or employer needs. Again, the best strategy will be to select a mix of measures depending upon the questions you have identified in your plan.

Issues in selecting tools:

There are numerous tools and strategies for collecting data about student learning. The most important thing to keep in mind is: How well do these measures match goals and objectives? Which tool or strategy will supply the information relevant to the questions or hypotheses that you are trying to test?

Concern is often expressed by faculty about student motivation to do their best on assessment activities. For this reason, experts in assessment recommend that you use course-embedded versus stand-alone measures. Students’ motivation to do their best will be highest on measures that are meaningful and have face validity for students. An activity which is part of a course, and perhaps part of a course grade, is the most likely to be meaningful for students. Stand-alone measures that are administered outside of a class can be valuable and may be the best way to get the information you need. However, you will need to take care to include some contingency that will encourage and reinforce students’ best efforts on such measures.

Validity and reliability also need to be considered when selecting assessment tools. A “valid” tool is one that measures what it purports to measure. What evidence is there that the test or activity actually elicits the knowledge, behavior, or attitude that you are trying to assess? A “reliable” measure is one that produces consistent results, such that if there is a change in a student’s performance you can have confidence that it is the result of a real change on the part of the student and not a result of the time, place, or person collecting the information.
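
When rubric or exam scores come from multiple raters, one simple, informal check on reliability is the exact-agreement rate between two raters who score the same artifacts. The sketch below is illustrative only: the scores and the 80% target are assumptions, and more formal statistics (such as Cohen’s kappa) exist for this purpose.

    # Minimal sketch: percent agreement between two raters scoring the same
    # eight artifacts on a 4-point rubric. All numbers are hypothetical.
    rater_a = [3, 2, 4, 1, 3, 2, 4, 4]
    rater_b = [3, 2, 3, 1, 3, 2, 4, 3]

    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    agreement = matches / len(rater_a)
    print(f"Exact agreement: {agreement:.0%}")  # 75% for these scores
    if agreement < 0.80:  # assumed target, not a campus standard
        print("Consider a norming session before scoring more artifacts.")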

While it is tempting to use one measure to assess any given objective, you are better off in the long run identifying multiple measures. What if the one measure you choose turns out not to be valid or reliable? What if it snows the day of the standardized exam and half of your sample does not show up? Relying on single measures can make you vulnerable.

When selecting tools, it is also wise to select a mix of qualitative and quantitative measures. While quantitative measures may seem more “objective” in that they result in numerical values that can be easily manipulated, qualitative measures may better capture the “holistic” qualities you want to examine in student performance.

Should you adopt or adapt measures or create your own? Certainly using measures that have already been tested and found reliable, valid, and useful can save you a lot of time and energy. However, they may not be an exact match to the questions you are trying to ask or the hypotheses you are trying to test. Again, selecting more than one measure to assess any single objective is the most likely way to end up with usable and useful information.

Finally, keep in mind that you are already assessing! You may already be gathering the data you need. Before you add new assignments or tools, check out your current data resources. These would include: institutional information, course evaluation feedback, surveys of alumni, the National Survey of Student Engagement, existing student records, and placement or license exam scores of graduates. What do these already tell you about student learning success?

Most programs are already doing assessment, although they may not think of what they are doing in those terms. What data are you already gathering about or from your students, graduates, alumni, employers or community partners? What data are other units across campus already gathering that you could use? (For example, did you know that University College contracts with ACT to do a survey of our alumni every other year? This data is currently available to individual colleges and can be broken down by major.)

When to assess

The curriculum and assessment map will help you identify during which classes it would make the most sense to gather assessment data. But are there other points in the curriculum at which it would also be important to your degree program to sample student performance? Such points could include entry into major, “rising junior” or some other landmark in the student’s progress through the curriculum, or perhaps shortly before or after graduation. Data collected at such points could help to assess any “value-added”, or pre- versus post-program, changes in students’ knowledge, skills, behaviors or values. Will collecting information at one point in time be sufficient to answer your assessment questions? If you are interested in progress over time, what are the critical points between which you hope to see change? If you want to track change over time, will you want to track individual students or will comparing different cohorts at different points in the program work?
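
If you choose to compare cohorts rather than track individuals, the computation can be as simple as comparing average scores on the same instrument at entry and near graduation. A minimal sketch with hypothetical scores:

    # Minimal sketch: "value-added" comparison between an entering cohort and
    # a graduating cohort on the same instrument. Scores are hypothetical.
    entering   = [61, 55, 70, 58, 66, 63, 59, 72]
    graduating = [78, 74, 85, 69, 80, 77, 83, 75]

    gain = sum(graduating) / len(graduating) - sum(entering) / len(entering)
    print(f"Average cohort gain: {gain:.1f} points")  # 14.6 for these numbers

Because the two groups are different students, a difference like this could reflect cohort composition as well as learning; tracking the same students over time avoids that ambiguity at the cost of more record-keeping.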

More and more programs are choosing to require a capstone course as part of their curriculum. Capstone courses come in the last semester of a student’s academic career and provide an opportunity to integrate information from other coursework and apply that information to a case study, research paper, or other analytic product. These activities can then be used to assess several different objectives. However, capstone courses do come last in the curriculum. If you have questions about what students are learning at other points along the way, relying heavily on a capstone course may come too late.

Samples and sampling methods:

Plans should also include a description of the groups from which you plan to gather data and when. These groups could include:

  1. Students at different critical points in the curriculum
  2. Alumni at different points in time following graduation
  3. Employers who have hired our graduates and potential employers of our graduates
  4. Other community groups affected by the degree program

You should also consider sampling different groups, either randomly or according to some set of predetermined criteria. How many people should be included in your data-gathering? While the ideal would be to include everyone who falls within an identified group, this might not be practical. Again, the most important thing to keep in mind is what information you want to retrieve in order to answer your most important questions. Less can be more if the group selected and the information collected are a close match to what you will need.
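
If you do sample, a seeded random draw makes the selection easy to document and to reproduce in your progress report. A minimal sketch, assuming a hypothetical roster of student IDs and an illustrative sample size:

    import random

    # Minimal sketch: a documented, reproducible random sample from a roster.
    roster = [f"STU{n:03d}" for n in range(1, 61)]  # hypothetical 60-student cohort

    random.seed(2024)  # fixed seed so the same draw can be rerun and verified
    sample = random.sample(roster, k=15)  # assess 15 of the 60 portfolios
    print(sorted(sample))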

Rubrics:

What is a rubric? This word has multiple meanings, the common feature being that it consists of some set of standards for evaluating an assessment artifact. A rubric includes a clear statement of the characteristics of the performance or artifact to be evaluated. (For example, what would an example of professional communication include?) It also includes clear descriptions of various levels of the criterion being evaluated. (What would be the characteristics of an inadequate, adequate, or excellent example of professional communication in your discipline?) A good rubric is connected to the criteria that define a quality performance. Rubrics also allow one to create a quantifiable measure to summarize and evaluate a body of qualitative data.

The best time to develop rubrics is when you are creating your plan. Rubrics can help make assessment more transparent to various stakeholders. They can also help raters achieve better consistency or reliability in how they evaluate different artifacts. If you wait until after you have an assessment artifact in hand (e.g., a sample of “professional communication”), you may find that the qualities you are looking for are missing—not because of poor teaching but because of uninformed teaching. Developing a rubric is an opportunity for faculty to discuss and reach consensus on exactly what they mean by a particular criterion, which makes them more likely to reach consensus on the approaches used to teach to that criterion. This is not the same as teaching to the test; rather, it is useful for instructors to know what is important to include when they develop their course syllabi and teaching plans. However, if rubrics will be developed at some later time, then that should be included in the implementation timeline.
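
However a rubric is laid out on paper, recording it in a structured form makes it easy to tally ratings across many students. A minimal sketch; the criteria and level descriptions are hypothetical stand-ins for whatever your faculty agree on:

    # Minimal sketch: a rubric as a structured object (criterion -> level ->
    # description) plus a simple scoring helper. Content is hypothetical.
    rubric = {
        "Organization": {
            1: "No discernible structure; ideas are hard to follow.",
            2: "Some structure, but transitions are weak or missing.",
            3: "Clear structure with logical flow and effective transitions.",
        },
        "Audience awareness": {
            1: "Tone and terminology inappropriate for the stated audience.",
            2: "Mostly appropriate, with occasional lapses.",
            3: "Consistently tailored to the audience and purpose.",
        },
    }

    def score_artifact(ratings):
        """Average the per-criterion ratings one rater gave a single artifact."""
        return sum(ratings.values()) / len(ratings)

    print(score_artifact({"Organization": 3, "Audience awareness": 2}))  # 2.5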

The “Degree Program Assessment Plan Review Checklist” and “Degree Program Assessment Progress Report Review Checklist” are rubrics to help reviewers evaluate those documents (see Appendix C). The individual items are aligned with the sections of the plan or progress report forms. The sub-items are aligned with the components that need to be present in a good plan or progress report. The overall rating rubric provides an opportunity for a global evaluation, taking into account performance on all of the critical features listed, in a format that can be easily summarized across degree programs.

Resources needed:

If you have not considered costs during your planning process, this would be a good time to do that. “Costs” include not only materials (e.g., standardized tests and their scoring) but human resources. Who will collect, organize, analyze, and report the data? Will you need to pay for outside services such as external reviewers? Will you want to provide some form of compensation for students, alumni, etc.? Does the department or program have the resources needed to conduct the assessment research outlined in the plan? If not, does the college have resources that could be allocated for this project? While accounting for resource needs is not a requirement for getting your assessment plan approved, failing to do so on the front end may make a critical difference in the actual success of your plan.

Implementation or the assessment cycle:

The final part of the assessment plan is implementation. Plans should be written with a specific “life cycle” or “assessment cycle” in mind. The length of the cycle will vary based on several factors. One will be the scope of the plan itself: how many years is it reasonable to expect it will take to address all of the goals, objectives, and questions outlined in the current plan? At the completion of that time period, the program will need an additional year to review all of the findings over the assessment life cycle, and then to revise the old plan or develop a new one for the coming cycle. Another factor could be the length of time between program and/or accreditation reviews. Assessment data are an important part of both types of review. It might be useful to complete an assessment cycle, including developing a new plan, the year before writing either a program or accreditation review report.

There is no “perfect cycle” that will fit all plans. Some degree program faculty groups might want to assess all objectives every year. Others might feel that three or even five years fits best. The important thing is to include a realistic implementation plan as you develop your overall assessment plan, with an explanation of why that cycle makes sense for your goals and objectives.
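
One way to make the implementation timeline concrete is to write out which objectives are targeted in each year of the cycle. A minimal sketch of a three-year cycle; the labels are placeholders for your own goals and objectives:

    # Minimal sketch: a three-year assessment cycle schedule with placeholder
    # labels. The final year is reserved for full review and replanning.
    cycle = {
        "Year 1": ["Goal 1 / Objectives 1-2"],
        "Year 2": ["Goal 1 / Objectives 3-4", "Goal 2 / Objectives 1-2"],
        "Year 3": ["Review all findings from the cycle", "Revise plan for next cycle"],
    }

    for year, tasks in cycle.items():
        print(f"{year}: {'; '.join(tasks)}")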

Stakeholder involvement:

Finally, you need to consider which important stakeholders should be included in the assessment process. These could include student groups, employers, alumni, community organizations, other academic units, or other constituents. In what ways could these groups be of assistance, not only in gathering data but also in interpreting results and in decision-making? Which groups would it make sense to keep informed about your assessment findings in order to fulfill your program goals and mission?

Plan or progress report form instructions:

For plans:

Section 4 of your plan asks for a complete description of the methods and design you will use to answer your assessment questions and test your important student learning outcome hypotheses (see plan form instructions in Appendix A for more details). You should add a brief statement to the curriculum map you completed for Section 3 indicating what type of assessment will be made and at what point in the degree program.

Here is an example building upon the map included in the previous section.

Goal 1 (Emphasis / Tool)   Objective 1           Objective 2               Objective 3           Objective 4
Course A                   Limited / Exams       None / Not assessed       None / Not assessed   None / Not assessed
Course B                   None / Not assessed   Somewhat / Not assessed   Somewhat / Exam       None / Not assessed
Course C                   None / Not assessed   Limited / Not assessed    Somewhat / Paper      None / Not assessed
Course D                   Somewhat / Exams      Somewhat / Project        Extensive / Project   Somewhat / Exam

This section of your plan should also include an implementation timeline for the plan. How many years are covered by this plan? Which objectives or questions will be addressed each year and how many years will it take to address all of the goals and objectives? The final year of the cycle will be an opportunity to revisit the plan and make changes based on all the information gathered over the years of the cycle. Finally, how will important stakeholders be informed about and involved in the process?

For progress reports:

Once the methods and design and assessment timeline or cycle have been completed for the plan, they do not need to be attached to each progress report. However, you should copy and paste the parts from Section 4 of your plan that are relevant to the results reported in a given progress report. Any significant changes from the plan should be included and explained in the progress report.



VII. Assessment Progress Reports: Step 5: Reporting Findings

What were your main findings? How did you analyze them? How do you interpret them? This may seem like the simplest part of any progress report, but its simplicity does not diminish its importance. This is your chance to describe what you found out about the aspect of your assessment plan that was put into motion during the past year.

Here are some things to consider when writing the results section of your progress report. You might also look at progress reports from other degree programs submitted for the past year by accessing them through the Assessment Central website.

Connecting findings to goals and objectives

What are the major results or findings regarding the goals and objectives addressed in the current report? These should be clearly aligned with the outcomes that they address.

Interpreting for different audiences

Your progress report may be read by several different audiences. It should be read by all of the faculty and staff involved in your degree program. It will also be read by your college assessment committee, by your Dean, and eventually by the Provost or a designated representative. Progress reports are posted on your college websites and thus are part of our public profile: evidence for our various stakeholders of our efforts to provide the best possible learning environment for our students. These reports will, at some point, be read by program review and/or accreditation bodies for whom assessment of student learning outcomes is one of the criteria for evaluating your degree programs.

Progress report form instructions

The first four parts of the progress report form can be copied from the relevant sections of your assessment plan. Only the goals, objectives, methods, etc., that are relevant to the research conducted during the progress report period need be included. (See Appendix B for the progress report form and instructions.)

Section 5 is your opportunity to report your assessment findings for the progress report year under consideration. Graphical representations of the data can be included, but it is also helpful to include a narrative that summarizes the highlights of your findings.



VIII. Assessment Progress Reports: Step 6: Using Results

With this step, we come to the final piece of the assessment puzzle. The success of an assessment plan ultimately rests on whether or not it provides your program with the information needed to make informed decisions about its future. Assessment findings must be usable and used for plotting the future course of your curriculum.

Here are some things to keep in mind when writing this section of your progress report. You might also look at progress reports from other degree programs submitted for the past year by accessing them through the Assessment Central website.

Connecting decisions to results

Assessment writers often talk about the importance of the “assessment feedback loop.” The underlying idea is that results provide feedback that leads to decisions, which in turn lead to new goals and objectives for a given degree program. Assessment plans that do not incorporate a feedback loop are considered failures, no matter how much data is gathered or how psychometrically meticulous that data may be.
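
To make the loop concrete, consider a hypothetical sequence: rubric scores reveal weak research design skills (the finding); the program adds a methods module to an early required course (the decision); and the next revision of the plan sets a higher target for the design objective and schedules it for reassessment (the new objective).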

Communicating results and inclusion in decision-making

For information to be used, it must be communicated. How was the information passed on to those who need it to inform their decision-making, including relevant stakeholders? Was it communicated in a way that was transparent to the user? Assessment results should never be used to lay blame on individuals; rather, they should be communicated in ways that lead to ever-improving degree program quality.

Progress report form instructions:

What conclusions can be drawn from the data? What changes, if any, have been or will be made? This could include a revision of the objective or the assessment method to be used in the future. What improvements have been made based on assessment findings? How will the findings be used to make decisions about curriculum and instruction? Will your target for the coming assessment year change?
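
A hypothetical response to these questions might read: “Because only 58% of the sampled papers met the target for Objective 3, the faculty voted to add a scaffolded writing assignment to Course B; the target will be held constant until the revised assignment has been assessed for a full cycle.”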

How was feedback about the assessment results and degree program improvement communicated internally and externally? You should include how faculty, students and other stakeholders were involved in decision-making based on the results.



IX. Reviewing Assessment Plans and Progress Reports

UALR has an annual review process in place for progress reports and a multi-level approval process for assessment plans. Why? One reason is to provide assistance, feedback, and guidance to degree programs. Your best resources for developing a successful assessment plan, and for getting the most out of your progress reports, are your fellow faculty members in your college. By reviewing each other’s efforts, we can share our successes and move forward as a campus. Another reason is to be sure that no program is left behind when it comes time for external reviews, such as our discipline-specific and campus-wide accreditation visits. It is easy for assessment to slide to the “back burner” while we deal with immediate crises and needs. However, assessment is not something that can be done at the last minute. Annual review and monitoring help keep the whole campus engaged and on target.

Review versus approval:

Plans are submitted for approval and progress reports are submitted for review. The purpose of the approval process is to ensure that the assessment plan contains all of the required components and has the best possible chance of success. Are the student learning goals and objectives clearly defined? Are they measurable? Are the plan and timeline realistic? Are all of the goals and objectives included in the implementation plan? The purpose of the review process for progress reports is to ensure that the plan is unfolding as proposed and that the results are being used appropriately for degree program or course improvement.

Level of approval or review:

The first formal level of approval for plans is at the college level. Each college has an assessment committee that evaluates new or major revisions of plans as submitted. The second level of approval for plans is at the university level. The Provost’s Assessment Advisory Group (PAAG) evaluates new plans or plans for new degree programs following approval at the college level. One representative from each college sits on PAAG, usually the chair of the college assessment committee. That individual presents the plan, provides feedback to the degree program proposing the plan, and in general functions as a liaison between the program and/or college and the university committees. Approval of the assessment plan by PAAG must accompany any request to the Undergraduate or Graduate Council for a new degree program.

The college committee conducts the annual review of the degree program progress reports. It also reviews revised plans for existing degree programs. There is no second level of review at the university level. The college assessment committees provide an annual report to PAAG, on behalf of the Provost, that summarizes the review process within the college that year. That summary report and the annual progress reports are then posted to the college assessment webpage; the reviews of the individual progress reports are not.

Review rubrics:

The review checklists found in Appendix C are designed for evaluating degree program plans and progress reports. Each section lines up with one of the required sections on the plan or progress report forms.

Instructions for review checklists

For plans:

There is a section on the review checklist that corresponds to each section of the assessment plan. The reviewer should check all items that apply to the plan under review. Each section also has a place for comments relevant to that section of the plan. At the end of the checklist is a place where the reviewer can give an overall evaluation of the plan.

For progress reports:

There is a section on the review checklist that corresponds to each section of the progress report. The reviewer should check all items that apply to the report under review. Each section also has a place for comments relevant to that section of the report. At the end of the checklist is a place where the reviewer can give an overall evaluation of the report. The overall rating should also go into the space provided in the box at the top of the cover sheet.

College assessment committees may either turn over all of the individual reviews to the author(s) of specific plans or progress reports, or compile a summary that represents a consensus of the evaluations received. In either case, the purpose should be to provide constructive internal feedback. Actual reviews of plans or progress reports are for internal use only and should not be posted on a public website.