Carteret Community College Title III Grant

August 18, 2010

Phase III Program Reviews 2009-2010

Filed under: Instructional Program Review — Donald Staub @ 11:29 am

We have just completed the third of four phases of the program review process.  If you would like to view previously completed reviews, click here for Phase II programs (2008-2009), and here for Phase I programs (2007-2008).  What follows are the Executive Summaries of each of the Phase III programs.  Scroll down to see the full narratives of each of these reviews.

Executive Summaries

Business Administration

Computer Information Technology

Interior Design

Math

Office Administration

Therapeutic Massage

Program Reviews

Associate in Science

Business Administration

Computer Information Technology

Interior Design

Math

Office Administration

Therapeutic Massage

June 11, 2009

Phase II Program Review debrief

Filed under: Instructional Program Review — Donald Staub @ 7:48 am

CLICK HERE to download a PDF of the following report.

Introduction
Eight instructional programs and one discipline completed the review process this past year. All have submitted their final reports to the Program Review Sub-Committee, which has, in turn, reviewed each of the final products (using a rubric).  A final report, based on those comments, is forthcoming.

Major findings/suggestions
A number of principal findings and suggestions emerged from the debriefing session.  Regarding data, it appears there may have been a bottleneck when it came to data requests; i.e., when everyone asks for data at once, it may be overwhelming for Jennifer.  One suggestion was to give each of the review teams a window during which they could work with Jennifer on their data needs.  Another was to begin requesting data in May or June, so that she has more time to generate a report and the team has more time to analyze it.

In terms of the process, one suggestion was to begin in the spring.  Another observation was that the manual was not as useful as it could be: the sample was not aligned with what was required, and section descriptions were sometimes vague.  There are plenty of samples now available in the repository, but the manual itself could use more elaboration, with relevant samples in each section.  The group also agreed that the CAC should make the final presentation; Curriculum Committee liaisons felt uncomfortable fielding questions for a program that was not theirs.  Finally, the group felt it would be useful to forgo the August orientation in favor of more one-on-one/small-group discussions with the leadership of each review team.  One outcome of those meetings would be the selection, with program input, of the full team composition.

The longest discussion centered on the review team's roles and responsibilities.  Perhaps the number one suggestion was to revise the way teams are chosen.  It was suggested (and echoed) that the program under review play a greater role in selecting review team members, based on what the program needs from the review.  There was general consensus that the curriculum committee liaisons had difficulty grasping the process until about the halfway point.  One suggestion was to have the PRASC serve as external readers, providing feedback throughout the year as sections are submitted.  Another was for a true external stakeholder to join the review team, providing a state-of-the-industry perspective.

Main points from debriefing

Data
• Waiting for data could be frustrating
• Not knowing what data to ask for could be frustrating; part of this was the nature of the research…sometimes it wasn’t until late that you realized what data you needed
• Provide review teams with a list of available reports, so that they know what’s available, without having to go fishing for it on the L drive
• Start process in summer – gives you time to start pulling together data so that you have information to go on in the fall
• Some data requests went unfulfilled, and team had to shift the way it responded to certain items
• Inconsistency in sources of data (what Susan provided sometimes did not agree with what Jennifer provided).
• Suggestion: Stagger start for data requests…a window of opportunity for each program

The Process
• There should be no exceptions for faculty and staff in fulfilling the responsibility of being on a team.  I was excused last year, and was in the dark this year.
• The model/sample in the manual is not aligned with what is requested
• What to provide up front for illustration: Not necessarily a template, but perhaps samples that fit broad categories (e.g. sciences, humanities, etc…) [of course, there are now 16 samples in the repository to select from].
• A template for formatting…not fill in the blanks, but one that has the basic formatting (e.g. fonts, headings, etc…) in place [ask Mary to help with this?]
• [working with Patrick to have some crossover between Phase IV pioneers and review team members…goal would be to have one or two reviews (including process) that are completely online]
• Final presentations: to encourage more dialogue, make sure the curriculum committee understands their role in this process; or hold presentations in a different venue
• CAC should be making the presentation – they can field questions a lot easier than the review chair

Roles & Responsibilities (the team)
• I was confused the whole time…CAC had been involved in a review the year before, “so she did all the writing and data collection and everything…I was just kind of doing a little editing…even though I was the chair of the committee, in name, I felt like I was just following.”
• It wasn’t until halfway through the process that our chair (i.e., the curriculum committee liaison) realized what their role was to be.
• [as the curriculum committee liaison] “I have a real hard problem coming up with questions for a program that I know nothing about”
• I don’t think those who were not from the program area had a clear understanding of their role.  I think it was confusing for them to be the chair of a committee that’s not their program…it was confusing to them as to what they should be doing…they wanted to help, but didn’t know how.  And, in frustration, the program people end up taking the whole thing and writing it themselves.
• I had an SER person who provided data.  Otherwise, I wrote the entire thing.
• The SER person and the library person probably knew what they were doing, but for the others, the contribution was, “Oh yeah, that sounds good.”
• A lot of times, they are not in a position to know what to write…how to approach it…what questions to ask.
• People are busy as it is…to ask them to come in from other disciplines was busy work for them, and not useful for those who had to sit down and write it anyway.
• Have a true “outside” reader…someone from the community and the same profession who can bring educated objectivity…they know the lingo, they understand the needs of the industry, they know what’s cutting edge
• Could the faculty from the program under review choose the team…the body of people who would be of greatest use to them?
• Have the program review sub-committee review the product – in part, and finally, in whole – throughout the year, using a rubric that is the same as, or similar to, the one the PRASC is already using.  They could see trends and make suggestions (clarifications, etc.) to strengthen products throughout the year, as opposed to at the end.
• The first meeting could be between Don (or the person arranging teams), the review chair, and the program chair…and the three of you brainstorm who you would need.  What approach does that program need to get the information that is needed?

Working toward a useful product
• The three would also work on shaping the review (within the broad parameters).
• A discussion could take place about the program…strengths, weaknesses, questions….which would provide an understanding of the process and the bigger picture (the outcome).
• Someone who could help, at least in this initial meeting, would be someone who has just gone through the process…someone who’s already been down the path.
• The [orientation] was not effective…the focus was on who was on what team…”what’s our title?”  etc….
• The best feedback was when we sat down with you [Don] and we had focused discussions
• The orientation, with the powerpoint, doesn’t make sense…there’s no context for us yet.
• Feedback, other than from the group, was not as consistent as it could be. I submitted everything, as prescribed, but I was not always sure it was what was needed.
• The manual should suggest the flexibility inherent in the process
• More leading questions and examples built into the manual…guide me a little bit more…particularly for the analysis sections…there needs to be more clarity about the difference between the sections preceding the Analysis and the analysis itself.

May 5, 2009

Phase II Program Reviews 2008-2009

Filed under: Instructional Program Review — Donald Staub @ 12:46 am
{Click on the photo to download the program review manual}

As the 2008-2009 academic year draws to a close, we are wrapping up the second phase and second year of the program review process.  We have seen a significant improvement in the quality of the reports produced, mainly because of greater attention to detail in facilitating the process.  It also helped that many more faculty and staff were going through this for the second time and had a better handle on the process. Was the process flawless this year?  Absolutely not.  A number of warts popped up this year, and I’ll be writing about these in a separate post (once we have our end-of-year debriefing at the end of May).  The purpose of this post is to let our readers know which programs went through the process this year, and to give readers access to the executive summaries and full reports of each program review. Immediately below are the executive summaries of the nine programs that completed the process this year.  Scroll down further to access the full program reviews for 2008-2009.

Executive Summaries

(click on the program name to download the Executive Summary)

Aquaculture

Associate in Fine Arts

College Prep

Culinary Technology

Horticulture

Practical Nursing

Respiratory Therapy

Social Sciences

Web Technologies

Full Program Reviews

Click on the name of the program to download the review.

Aquaculture

Associate in Fine Arts

College Prep

Culinary Technology

Horticulture

Practical Nursing

Respiratory Therapy

Social Sciences

Web Technologies

September 19, 2008

Phase I Instructional Program Reviews – 2007-2008

Filed under: Instructional Program Review — Donald Staub @ 8:28 am

In 2007-2008, seven instructional programs implemented the comprehensive program review process. The process is undertaken by a team of program faculty, an external faculty member (whose program will undergo review in the following year), one staff member from Student Enrollment Resources, one faculty member who serves as a liaison to the Curriculum Committee, and one representative from the library.

The program review process is implemented from August to May, with a final document submitted at the end of the academic year, during a presentation by program faculty in front of the curriculum committee. The process was guided by the 2007-2008 Program Review Manual. Click on this hotlink or the document image to download a copy.

In 2007-2008, the following instructional programs underwent the program review process. If you click on the specific program’s name, you can download the final review document:

Associate Degree in Nursing

Basic Law Enforcement Technology

Communications

Cosmetology

Criminal Justice

Paralegal

Radiography

You can also find a wealth of Program Review examples from Central Piedmont Community College’s institutional effectiveness website. These samples may not follow the exact same outline as those here at Carteret, but you will get a good sense of how the program review is presented:
Central Piedmont CC Program Review Samples

April 30, 2007

Program Review Cycle

Filed under: Instructional Program Review — Don Staub @ 4:02 pm

Here is the proposed cycle of instructional program reviews, broken into four phases (click on the table to see the full view).
Program review phases

Fran & Don’s Program Review Powerpoint

Filed under: Instructional Program Review — Don Staub @ 11:03 am

To view the powerpoint presentation that Fran & Don made on April 26, you can click here.

Instructional Program Review process document

Filed under: Instructional Program Review — Don Staub @ 11:03 am

To download a copy of the Program Review process document, click here.

April 26, 2007

Instructional Program Review

Filed under: Instructional Program Review — Don Staub @ 11:46 am

Watch this category as it grows. In just a short while, you’ll be able to find in this space the documents that will guide you through the Instructional Program Review process. We will be posting the Phases (indicating which programs will implement the review process in each of four cyclical phases); we will also be posting the Program Review guide, as well as the powerpoint that Don & Fran will be using for their IPR meeting today (April 26). Stay tuned!
