Carteret Community College Title III Grant

End of FA09 Report…how did we do?

At the end of the fall 2009 semester, we completed the first-semester pilot of our newly developed Early Alert system. While the focus was clearly on the system and the process, there was certainly an effort to respond to the needs of the students in the pilot cohort. You can learn about the development of the software and the implementation of the pilot project by clicking here. We also had a control group that shared many of the same characteristics as the pilot group (except that they were not pre-allied-health students).

We looked at how students in both groups performed throughout the semester (i.e., success rates and withdrawal rates). As you will see in the report below, compiled and written by Jennifer Ulz, our Institutional Researcher, there was no statistically significant difference between the withdrawal rates of the two groups, nor between the final GPAs of the students in the two groups.

This is not a primary concern, however. Again, our objectives in getting the project off the ground were:

  1. Getting the software implemented
  2. Getting faculty & staff comfortable with the software (and working out any bugs they found)
  3. Developing an effective and efficient intervention & communications mechanism

We achieved #1. What we didn’t achieve last semester regarding #3, we think we’ve ironed out in the interim. And for #2, we need the feedback of the faculty and staff who used the system: To what extent did they think it worked?

We will follow up this report with a more extensive version…drilling deeper into the end-of-semester student data, providing faculty feedback, highlighting changes made to the software, etc. In the meantime, take a look at Jennifer’s report below and give us your thoughts and feedback.

Click here to download a PDF of this report.

Early Alert Cohort – Fall 2009 first-time students with a GOT healthcare major who were enrolled in at least two of the following courses:

ACA 115, ACA 118, BIO 163, BIO 168, CHM 131, CIS 070, ENG 075, ENG 085, ENG 095, ENG 111, MAT 050, MAT 060, MAT 070, MAT 080, MAT 140, MED 121, OST 080, PSY 150

Forty (40) such students were identified at the beginning of the fall 2009 semester.

Comparison Group – Fall 2009 first-time degree-seeking students with a major other than GOT healthcare who were enrolled in at least two of the above courses.  Forty (40) of these students were chosen randomly, also at the beginning of the semester.

Retention – Fall to Spring

Of the 40 students in the early alert cohort, 28 (70.0%) were enrolled in a curriculum class at the beginning of the spring 2010 semester.  Of the 40 students in the comparison group, 31 (77.5%) returned in the spring semester.  This difference in retention is not statistically significant (Chi-Square test, p=.4459).
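For readers who want to verify the figure, the retention comparison can be reproduced as a 2×2 chi-square test of independence. This is a sketch assuming SciPy is available; the counts come straight from the paragraph above, and no continuity (Yates) correction is applied, which is what reproduces the reported p-value:

```python
# 2x2 chi-square test of independence on fall-to-spring retention.
# Rows: early alert cohort, comparison group.
# Columns: returned in spring, did not return.
from scipy.stats import chi2_contingency

table = [[28, 12],   # early alert: 28 of 40 returned
         [31, 9]]    # comparison: 31 of 40 returned

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.4f}, p = {p:.4f}")  # p = 0.4459, matching the report
```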

Grade Point Averages

The following table shows statistics for the fall 2009 grade point averages for the two groups.

              N    Mean    Standard Deviation
Early Alert   40   2.275   1.254
Comparison    40   2.402   1.404

The difference in the means of the two groups is not statistically significant (T-Test, p=.6707).
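As a sanity check, the reported p-value can be reproduced from the summary statistics alone. This is a sketch assuming SciPy is available; the pooled-variance (equal-variance) two-sample t-test is what matches the reported figure:

```python
# Two-sample t-test from summary statistics (pooled variance, df = 78).
# Means, standard deviations, and group sizes come from the table above.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=2.275, std1=1.254, nobs1=40,
                            mean2=2.402, std2=1.404, nobs2=40,
                            equal_var=True)
print(f"t = {t:.4f}, p = {p:.4f}")  # p = 0.6707, matching the report
```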

Course Success and Withdrawal

The numbers of course successes (grades of A, B, or C) and withdrawals (grades of WD or WD*) were counted for each of the two groups. The results are shown in the following table.

             Total Courses   Successes   Success Rate   Withdrawals   Withdrawal Rate
Early Alert  152             100         65.8%          23            15.1%
Comparison   168             113         67.3%          29            17.3%

Again, these two groups are not significantly different in the percentage of successes (Chi-Square test, p=.7804) or withdrawals (p=.6060).
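Both of these p-values can also be reproduced from the counts in the table (a sketch assuming SciPy; each outcome is framed as its own 2×2 table, again without a continuity correction):

```python
# Chi-square tests on course successes and withdrawals.
# Counts come from the table above; the second column of each 2x2
# table is the complement (courses that were not a success / not a withdrawal).
from scipy.stats import chi2_contingency

tables = {
    "success":    [[100, 152 - 100], [113, 168 - 113]],
    "withdrawal": [[23, 152 - 23], [29, 168 - 29]],
}

for name, tab in tables.items():
    chi2, p, _, _ = chi2_contingency(tab, correction=False)
    print(f"{name}: p = {p:.4f}")  # success: p = 0.7804, withdrawal: p = 0.6060
```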

Conclusions and Observations

  • None of the above assessments – retention, grade point average, success and withdrawal rates – show any significant differences between the early alert cohort and the comparison group.
  • These students should also be tracked through the spring semester and into next fall with these same assessments.
  • An assessment will also be done for the spring 2010 early alert cohort and comparison group.  This may help determine if changes to the process are improving the system.
  • Are the characteristics of the first-time GOT health science majors the same as the first-time students in other majors?  If not, this would affect the comparison between the early alert cohort and the comparison group.  A similar comparison of groups from the fall 2008 semester (before the early alert initiative) might be useful.


  1. Are we analyzing the data for significant difference?

    Comment by Louise Mathews — January 14, 2010 @ 3:22 pm | Reply

    • Louise… Are you suggesting an analysis different than the one Jennifer carried out (i.e. in all instances, no statistical significance)?

      Comment by don staub — January 15, 2010 @ 8:49 am | Reply

  2. If the two groups are not statistically significantly different, then it seems to me we must focus on all students being successful, and lowering dropout/failure statistics. Also, I am not convinced of the intervention success.

    Comment by Kathleen Lang — January 15, 2010 @ 6:52 pm | Reply

    • Thanks for the input, Kathy. Please keep in mind that ultimately, we will include all students in this system. Currently, the focus is on the system/process itself, and determining whether or not it is effective, even for small groups. Our plan is to scale it up over the summer. And, you’re right…the jury is still out on whether our interventions had any success last semester. As the data and statistical analysis show: No, they didn’t have an impact. But, again, it was our first go at it and we’ve learned a lot. We are hoping that this semester, our interventions will make a difference. Thanks again for your thoughts!

      Comment by don staub — January 19, 2010 @ 8:22 am | Reply

  3. I think the software is working pretty well and the process is fairly easy to use. Are we seeing similarities or trends in the reasons that the groups are withdrawing or are unsuccessful? What percentage of the reasons are things we can help students resolve or prevent and what percentage are beyond our resources to assist? Heather Hebert

    Comment by Heather Hebert — January 22, 2010 @ 2:42 pm | Reply

    • Thanks for posting, Heather. It’s a good question, but one we did not ask at the end of last semester. It would be worth looking into, however. Ironically, the numbers in both groups are about the same, and relatively small. And only two students from each of the groups dropped all classes. To me, the interesting question is: if only two dropped all courses, why did only 28 out of 40 (70%) return in the spring? This is not necessarily an early alert question, but it IS a retention question.

      Comment by don staub — January 22, 2010 @ 9:37 pm | Reply

  4. I agree the interesting retention question you asked is an important one. If the students completed the drop form rather than the instructor completing it, the form has a space for the students to record why they are dropping. I think it would be enlightening for us and may shed light on why the two dropped all and others did not, etc. I would also be curious to see if better advising about course load could have prevented those who dropped only one course.

    Comment by Heather Hebert — January 25, 2010 @ 1:43 pm | Reply
