Carteret Community College Title III Grant

EA Pilot – Final Report – SP10

Here’s a quick wrap-up of the EA Pilot from the Spring 2010 semester.

Just to refresh your memory, this semester’s pilot cohort (read how we selected the pilot cohort) consisted of 26 students enrolled in 91 sections.  The control group was composed of students who met the same criteria as the pilot group but were not GOT-D; 59 students at the college fell into that group, so rather than drawing a random sample, we included all 59.  Forty-one full- and part-time instructors monitored the students throughout the semester.  Early Alerts are submitted by faculty members using the software we’ve developed in-house (click here to see a demo).


  • 25 of 26 students in the pilot group (P) completed the semester.
  • 57 of 59 students in the control group (C) completed the semester.
  • Success rates – P: 80%; C: 61%
  • Credits completed/credits attempted – P: 202/245 (82.4%); C: 430/650 (66.2%)
  • Withdrawals – P: 15; C: 52
  • Semester GPA – P: 2.51; C: 2.27
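As a quick sanity check, the credit-completion percentages above follow directly from the raw counts in the list (a minimal calculation using only the numbers reported here):

```python
# Raw counts reported above: (credits completed, credits attempted) per group.
groups = {
    "P (pilot)":   (202, 245),
    "C (control)": (430, 650),
}

for label, (completed, attempted) in groups.items():
    rate = completed / attempted
    print(f"{label}: {completed}/{attempted} = {rate:.1%}")
# Prints:
#   P (pilot): 202/245 = 82.4%
#   C (control): 430/650 = 66.2%
```

The pilot group’s 82.4% matches the figure reported above; the control group works out to 66.2%, i.e. roughly 66%.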

From these data, it appears that the pilot group, by virtue of receiving support via the Early Alert system, was more successful than the control group.  Of course, looks can be deceiving.  In implementing the pilot for the second semester in a row, a number of issues emerged:

Participation by Faculty: How responsive were faculty?  By February 9 (the 5-week point in the semester), only 50% of the 41 participating faculty had submitted a report via the software or email.  Mass emails were sent on 1/7, 1/20, and 2/4, informing faculty of the need to report on the pilot students in their respective courses.  Faculty were asked to submit reports on their respective students regardless of student performance.

Reporting by Faculty: How timely were the reports/alerts submitted by faculty?  What degree of priority (i.e., High, Medium, or Low) was attributed to the alerts?  One student in the pilot group was enrolled in 5 courses and never attended any of them.  Here is how her 5 instructors treated her attendance (note the dates on which the alerts were submitted, and keep in mind that she never showed up for any of her courses):

  • Instructor 1: High priority (1/21)
  • Instructor 2: Never attended (1/29)
  • Instructor 3: No Intervention Required (2/16)
  • Instructor 4: Never attended (2/17)
  • Instructor 5: Nothing submitted

Communication with Students: How effective (and tenacious) are we at getting in touch with the students for whom alerts are submitted?  A not-uncommon “proposed intervention” for a student reads: “CAPS will call student.”  Unfortunately, the follow-up may read: “Tried to call – no phone contact information available.”  A dead end like this should trigger another, different approach to reaching the student.

Follow-Through: Likewise, students did not always end up receiving the services intended to keep them focused on completing their courses – or, at least (a secondary issue), the services they did receive were not always recorded in the EA software.

Therefore, as we look toward Fall 2010 and another semester of piloting the system, the two areas that must receive the greatest emphasis are: 1) initial reporting and contacting the student, and 2) ensuring the student receives appropriate services in a timely manner.

