Compliance Certification, Section 3 – Comprehensive Standards: Educational Programs, to Include Student Learning Outcomes

Judgment of Compliance

The College certifies compliance.


Identifying expected outcomes, assessing the extent to which these outcomes are achieved, and providing evidence of improvement based on analysis of results are primary functions of the College's four-year cycle of program review. These functions also occur through the College's review and assessment of its strategic plan, Cleveland's Continuous Improvement Plan for Student Success (CCIPSS), as well as through the work of the Enrollment, Retention, and Success (ERS) Committee. The processes for identifying expected outcomes, documenting performance, assessing expected outcomes, and providing evidence of improvement based on analysis of results are described below for program reviews, the CCIPSS strategic plan, and the Enrollment Management Plan.

Identifying Expected Outcomes Process
Program Review:

Cleveland Community College has engaged in program reviews for many years. Over time, the format for the reviews has evolved into a comprehensive, formalized process which includes educational programs, academic and student support services, and other non-instructional areas of the College.

Since 2010, the College has adopted the following revisions to the program review process, which remain in place:

  • Included non-instructional functions
  • Implemented a One-Year Follow-Up Report to close the loop
  • Adopted a new program review template with an emphasis on Program Outcomes, Student Learning Outcomes, and Administrative Outcomes
  • Moved the begin/end cycle dates for the review process from the fall of the year to the spring to ensure better alignment of planning and budgeting
  • Implemented the new begin/end cycle dates for the 2014–2015 year, which involved a one-time 18-month transitional review period

The Academic Program Review template adopted by the College requires participants to identify Administrative Outcomes, Program Outcomes, and Student Learning Outcomes. The template is designed to lead the participants through an analysis of the outcomes and to formulate the decisions and strategies necessary for improvement. Evidence illustrating use of results is shown in the following:

  • The decisions and/or strategies for change
  • The actions taken as a result of internal/external feedback
  • The Resource Needs Checklist, which marries the review process with budgetary considerations/prioritization

The Academic Program Review template includes the following five parts:

Part I is the Program Profile, which includes the College mission and program goals as they relate to the mission. Faculty credentials, accomplishments, and professional development activities are also included in this section, as well as data relating to students. These data include a breakdown of students by type, the specific programs that require the courses being reviewed, numbers served, demographic information, and any trends the reviewer(s) identified. Additional data are supplied to faculty members upon request.

Part II describes program content, for whom the program is intended, and any criteria for admission to the program (if applicable). Curriculum or coursework provided in Part II incorporates descriptions of courses for the general education core, stand-alone programs, and degrees, certificates, or diplomas. Any external accreditation, innovations (including new programs or courses), statewide or national efforts, and diversity applied to the curriculum are included in Part II. Also included are testing and remedial coursework, distance education offerings and the use of technology, and funding for curricular changes or offerings. Finally, a Learning Resources Assessment is included to ensure that resources for students are adequate, relevant, and current and that they are reviewed on a periodic basis.

Part III pertains to the measurement of outcomes. This includes the identification of outcomes including Administrative Outcomes, Program Outcomes, and Student Learning Outcomes. The measurement results for these outcomes are also included in Part III. Decisions regarding any needs for change are guided by the results.

Part IV discusses needs for change by examining strengths and weaknesses identified by external sources. This section also includes recommendations made by program staff to improve the program and strategies for change based on follow-up, thus closing the loop. A one-year follow-up to the appropriate Vice President reports the progress made since the last review and is also included.

Finally, Part V looks ahead to future issues and resources needed to meet future program demands based on findings (i.e. advisory meetings, professional development workshops, national trends, and enrollment trends) within the program area. Equipment, space, and faculty needs for future growth and/or program continuation are addressed, as well as general plans for the future.

The Administrative Program Review Template requires participants who provide support services to perform a SWOT analysis by identifying strengths, weaknesses, opportunities, and threats. Data specific to the units are provided upon request. The Library, Student Services, Finance and Administrative Services, and Continuing Education utilize this template for their program reviews, as it is more applicable to their functions.

The maturity of the College's program reviews is as varied as its program offerings. Some programs have seasoned faculty who have been through the process numerous times, which is reflected in the level of understanding and sophistication inherent in their reviews. Other programs, where faculty are newer to or less experienced in the review process, have slightly less refined reviews in which outcomes may not be as fully developed. Both the process and the faculty are ever-evolving, so a degree of variation across reviews will most likely remain constant.

Administrative areas use a program review template better suited for their areas. This review includes a SWOT analysis and data gathered through various reports and/or surveys. An example of an administrative program review is provided from the most recent review of the Library.

Documentation of Performance

For instructional programs, five years of program-specific data are provided to program review participants, giving participants the opportunity to identify trends and changes which have occurred over time. Data provided include age, gender, ethnicity, program enrollment, grade distributions, and grades by delivery method. Participants can request additional data, which are provided whenever possible. Beyond program-specific data, institutional-level data for the same variables are also provided, allowing participants to compare their programs to the institution overall.

For non-instructional areas, data from biennial surveys are provided, as well as data these units may gather themselves (usage reports, evaluations of events such as Career Day and New Student Orientation, etc.). The survey data include Community College Survey of Student Engagement (CCSSE) data in odd-numbered years (2015, 2013, 2011) and data from an internal College Survey (2016, 2014) in even-numbered years. Variables in the internal College Survey include the following:

  • Awareness of individual support services
  • Satisfaction with individual support services
  • Campus safety
  • Satisfaction with individual library resources, services, and features
  • The degree to which the learning environment is enhanced by the College's Active Learning Classrooms (ALCs)
  • The level of effectiveness provided by technologies in the ALCs
  • Level of student engagement in their courses
  • Overall satisfaction with the level of instruction delivered

These data provide non-instructional areas with information they can use to implement changes for improvement. Additional surveys are administered if more information is deemed necessary.

Assessing Expected Outcomes Process

The Program Review template requires outcome results to be reported and assessed against the success criteria (targets) established for each expected Student Learning Outcome. In instructional programs, a variety of assessment methods are used, chosen by faculty participants; examples include rubrics, embedded test or exam questions, and demonstration check sheets, among others.

In non-instructional areas, outcome results are also required to be reported and assessed. Assessments are most often administered via surveys using SurveyMonkey or by using data gathered through the in-house College Survey, which is administered in even-numbered years.

Evidence of Improvement Based on Analysis of Results

The following representative examples (a minimum of four from each division; click on a program to view program review samples) demonstrate how the College has systematically used the assessment of performance outcomes to improve institutional quality and the learning environment:


Resource Needs Identified via Program Review

Arts and Sciences:

  • Chemical Inventory System w/ Liquid Waste Disposal – Room 5307 with an exhaust fan was proposed, along with using a spill kit in lieu of floor drains. Since multiple programs identified the same need, a consensus was reached which allowed the need to be met by utilizing one space for all chemical storage.
  • Chemical Storage Room – Same as above (approved in BIO proposal).
  • Creation of Vented Storage Room – Same as above (approved in BIO proposal).
  • Storage area to share with Biology ($0 if shared with BIO) – Same as above (approved in BIO proposal).
  • Vented workroom ($0 if shared with BIO)
  • Exhaust Fan for Physics Room – Same as above (approved in BIO proposal).

Advanced Manufacturing & Public Services:

  • 12" Compound Mitre Saw – Already purchased.
  • 10" Table Saw – Replaces the old saw; the new saw has safety features that offer more protection to students.
  • We don't advertise individual programs (will do a feature in Career Focus, though).
  • New AC DC Lab Equipment/Manuals – Will be used across programs (Electronics Engineering, Electrical Systems, and Industrial Systems); includes computer-based troubleshooting equipment.
  • Building Automation Equipment – To upgrade labs with new equipment and for the new AAS degree program. Software is separate from the equipment, and Bruce asked for an additional $50,000 to purchase the automation equipment proposed, as he has $10,000 from a grant.
  • Building Automation Software – To use with the Brady System for fall 2017.
  • Small Chiller System – To use in BAT classes for the AAS; contingent upon AAS program approval.
  • Small Boiler – Approved with the chiller.
  • Building Automation Lab Space – Not at this time.
  • AWS Weld Bend Tester – Used to test welds.
  • Code Books – Already purchased for less than $1,000.
  • LED Lights for Welding Booths – Would allow reducing to one light per booth instead of two.
  • Band Saw – Would replace a 25-year-old saw.
  • Iron Worker – Replaces aging equipment that has a 20-year life.

Business & Allied Health:

Nurse Aide I:

  • Printed Curriculum Materials from DHSR to Replace Textbooks ($30 each, to be paid for by students) – Saves students money and has worked well with CCP students; the Bookstore will sell them to students.
  • Cabinets/Drawers with Countertops for Storage – Students identified the need to be comparable to clinical experiences. Developing room 5121 as both a classroom and a lab would allow the College to offer another section (10 more students) with a day, a night, and a second day section. The Bureau of Labor Statistics projects a 17% gain in the field through 2024. John will research space utilization and bring this back to the November meeting.

Nurse Aide II:

  • Medline Electric Hospital Bed
  • Medline Hospital Bed Mattress
  • Medline Headwall Surface Mount
  • Medline Patient Care Simulator Mannequin
  • McKesson Surface Wall Protector
  • McKesson Overbed Table

Practical Nursing:

  • Basic Nursing Supplies
  • Plum XL Pump, Qty 4 ($566 x 4 = $2,264) – Could get by with one pump, so the proposal changed. Would be the same system used at the hospital.
  • Phlebotomy Simulation Arm – Would replace an older arm.
  • Blood Drawing Chair – Would allow a more realistic experience.
  • ARST Successful Study Strategies – Software to help students prepare for the registry exam; this is a site license.
  • Radiography Simulators (Powered by Ziltron), Annual Educator Access per Student ($39.95/student = $960 annually) – Also helps prepare for the registry exam; will be tried with the seniors this year before opening it up to all students in the program. Has to be purchased for individuals, not as a site license. This should be a student fee; it will be presented to the BOT at the January 2017 meeting.

Identifying Expected Outcomes Process

Strategic Plan:

The College’s Continuous Improvement Plan for Student Success (CCIPSS) strategic planning document illustrates the objectives (outcomes) for each of the five units of the College. Objectives are reviewed and reported upon annually by the unit vice presidents. The CCIPSS is a three-year plan and objectives may carry forward to subsequent iterations of the plan if deemed appropriate or if efforts for improvement related to an objective are still ongoing. Objectives that are met or exceeded each year may be replaced to keep the plan relevant. When deficiencies are identified during annual reviews, steps are taken to rectify them, illustrating how results are used towards continuous improvement.

Documentation of Performance

Evidence documenting performance related to the College’s strategic plan is included in the Annual Summary of CCIPSS Objectives Reports (2013/14, 2014/15, 2015/16).

Assessing Expected Outcomes Process

Assessing expected outcomes related to the CCIPSS strategic plan occurs annually and results are reported to Planning Council by the unit vice presidents. Assessments vary broadly, as do the unit-level objectives, but provide results from which decisions are made regarding whether an objective should be revised or replaced altogether. Refining the strategic plan is work towards continuous improvement, as a plan made up of meaningful, measurable, actionable objectives provides the College direction and establishes targets towards which each unit can work.

Evidence of Improvement Based on Analysis of Results

Over the course of the 2014-2017 three-year plan, a need to strengthen some of the unit objectives became apparent. As a result, the next iteration of the plan will include objectives that are more substantive and designed to better inform decision-making, rather than objectives that are merely attainable. For example, one of the Academic Programs objectives is to ensure faculty participate in a minimum of 10 hours of professional development each year. While this objective provides a target and was chosen with good intent, it does not measure the impact of that professional development, nor does it ensure that the professional development obtained is relevant to each employee's current role. The need to refine and strengthen the strategic plan shows a commitment to improve how the College holds itself accountable, which is work towards continuous improvement. Drafts of the 2018-2021 unit-level plans will be vetted, approved, and consolidated into one document during the 2018 fall term.

Identifying Expected Outcomes Process

Enrollment, Retention, & Success (ERS) Committee:

The function of the ERS Committee is to plan, evaluate, and implement procedures and actions, including marketing, to improve enrollment, retention, and success for all instructional areas of the College and to review and recommend actions designed to meet the Performance Measures of the North Carolina Community College System. Established in October 2014, the ERS Committee has identified numerous outcomes which are included in the Enrollment Management Plan.

Documentation of Performance

The Enrollment Management Plan documents targeted goals (outcomes) and baselines established by averaging data from the previous three terms or years; outcomes are reported to the Committee to monitor performance. Because the Committee was newly established in 2014, considerable time was required to identify objectives, gather enough data to determine baselines, and then assess performance.
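The baseline arithmetic described above (averaging data from the previous three terms or years) can be sketched in a few lines. This is an illustration only; the term labels and headcounts below are hypothetical and are not drawn from the Enrollment Management Plan:

```python
def baseline(values):
    """Return the baseline: the average of the three most recent data points."""
    recent = values[-3:]  # the previous three terms' (or years') values
    return sum(recent) / len(recent)

# Hypothetical fall-term headcounts for three prior years (invented figures)
headcounts = {"2012FA": 3100, "2013FA": 2950, "2014FA": 3050}

target_baseline = baseline(list(headcounts.values()))
print(round(target_baseline, 1))  # prints 3033.3
```

Actual term performance can then be compared against this baseline to judge whether an objective is being met.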

Assessing Expected Outcomes Process

Upon analyzing the data gathered, several task forces were formed out of the need to identify actions the College could take to improve in numerous areas: an Enrollment Task Force, a Retention Task Force, and a Success Task Force. The task forces conducted focus groups to gather suggestions from various constituents regarding improvement in these three areas of focus. Out of those focus groups, multiple initiatives emerged, including task forces established to look specifically at new student orientation, the effectiveness of prerequisites, online course success rates compared to other modes of delivery, ACA courses, advising, developmental education, and late registration. Several of these are discussed in further detail in the next section on use of evidence.

Evidence of Improvement Based on Analysis of Results

Using data gathered to formulate decisions for improvement, many changes have been implemented in the three years since the ERS Committee was formed. Examples of those include the following:

The adoption of Quality Matters (QM) and Master Course development and certification, which was recommended by the Online Task Force. Faculty develop online course content, which the Master Course developer uses to improve online courses, allowing the College to provide optimum equivalence across course sections. CIS-110 (Introduction to Computers) and MCO-110 (Introduction to Mission Critical Operations) went through Master Course development first in summer 2015 and were subsequently Quality Matters certified. ENG-111 (Writing and Inquiry) and MAT-171 (Precalculus Algebra) went through Master Course development and QM certification in the 2016 fall term. To determine which courses should follow for subsequent Master Course development, the ERS Committee analyzed data including the courses with the highest enrollment over the last three terms, classes with the highest number of faculty teaching them, courses taught as Career and College Promise courses, courses with the highest number of online sections, courses with the lowest success rates, and book costs. From these data, the Committee determined that MAT-172 (Precalculus Trigonometry) and either COM-231 (Public Speaking), PSY-150 (Introduction to Psychology), or BIO-111 (General Biology I) should be developed as master courses next. PSY-150 was chosen and is being implemented in fall 2017.

Beginning in the fall of 2016, a new Advising Center opened for students with 0-23 credit hours. These students are required to be advised in the Advising Center, where specialized Advisor/Success Coaches work with them individually to ensure they progress through their programs of study in a timely manner without accumulating excess credits. The development of the Advising Center stemmed from advising questions embedded in end-of-course evaluations; those data were analyzed by the Advising Task Force and, combined with articles and research studies on advising, provided evidence of the need for the Center.

Beginning in the fall of 2016, ACA-115 (Success and Study Skills) or ACA-122 (College Transfer Success) is required for all students whose program of study requires ENG-111 (Writing and Inquiry) and must be taken within the first fifteen credit hours of the student's program of study. The ACA Task Force utilized multiple studies of student success courses, along with institutional data, to develop its recommendation for requiring ACA courses.

Beginning in the summer of 2017, new student orientation is required for all new students and for returning students who have missed the most recent three semesters. An ADA-compliant online orientation is also available for distance students to ensure they receive a comparable orientation experience.


Cleveland Community College has demonstrated that it identifies expected outcomes for each of its functional units (educational, administrative, and educational support) through its four-year cycle of program review, through review and assessment of its strategic plan, and through the Enrollment, Retention, and Success Committee's work. Detailed examples presented in this section illustrate how the College systematically uses the results of assessments to improve the programs and services it provides.