Carteret Community College Title III Grant

August 10, 2011

Outcomes Assessment at CCC – 2010-2012

Filed under: Outcomes Assessment,Presentations — Donald Staub @ 4:36 pm


This is the presentation I made on Thursday, August 11, at the all-faculty meeting to kick off the new academic year. In the presentation, I covered a lot of ground rather quickly. The main points are:

  • Assessment Fellows: These three individuals (Sherry Faithful – A&S, Philip Morris – A&S, and Shana Olmstead – Business Technologies) have received stipends for the Fall semester and were given Title III funds to travel to the Institutional Effectiveness Institute in Baltimore in July. They will be serving as faculty leads on expanding, and deepening, the outcomes assessment process at Carteret.
  • ILLOs – 2010-2011. This is a quick and dirty review of what has been given three times already in various venues across campus. To learn more about this exhilarating topic, click here.
  • ILLOs – 2011-2012.  What are we doing this year?  We’ll be assessing Computer Literacy and Information Literacy.  The fall will be spent identifying program-level courses in which assessments will take place (the big question: will Computer Lit be assessed at the program-level, or across-the-board…stay tuned).  The spring will be dedicated to gathering assessments, scoring, if necessary, and identifying Use of Results.
  • Program Outcomes – 2011-2012. This will be a quick update on where we are for all programs and their analysis of data regarding Success, Persistence, and Retention data.  Currently, Jennifer (IE) is providing data to each of the programs regarding their data requests.  On October 10 (CCC in-service day), at the Assessment Workshop, Programs will identify Use of Results and Action Plans for their program outcomes.
  • CLLOs – 2011-2012.  We are moving toward full implementation of this assessment plan.  Last fall (2010), only those full online sections with a seated complement were asked to collect and analyze CLLO data.  In the spring (2011), it was any DL course (online or hybrid), plus any corresponding seated sections.  This fall, ALL courses must assess two CLLOs for each course taught.
  • PLLOs – 2011-2012.  Keep on, keepin’ on.  We are asking faculty to continue assessing and using data to make instructional improvements.

July 26, 2011

5 Years of Title III – Keough & Staub at N-L 2011

Five Years of Title III: Successes, Challenges, and Words from the Wise(er)


Presenters:  Patrick Keough & Donald Staub – Carteret Community College

Presented at the Noel-Levitz National Conference on Student Recruitment, Marketing, and Retention in Denver on July 26, 2011.

The purpose of our presentation is to give a brief overview of what we have accomplished – qualitatively and quantitatively – through the course of the grant (which ends in September of this year).  To achieve the overall goal of increasing retention at the College, the grant has three primary objectives: Advising, Distance Learning, and Outcomes Assessment.  This presentation will highlight the retention efforts we have made in each of these three areas, with particular attention to the successes and challenges of implementing each.

The crux of the presentation will be the Learnings section: If you are looking down the barrel of a five-year T3 grant, and you asked me what I thought was important, I’d give you this list.

Getting started

• If you’re planning on proposing, do your best to name names – qualifications, etc…it shows you can hit the ground running.  It may also demonstrate commitment.

You’ve probably heard this already, but it really strengthens your proposal when you can say that you have qualified people ready to go as soon as the grant starts.  If you do have those positions filled on paper, make sure you emphasize their qualifications and experience.  I can’t say to what degree this strengthened our application, but we already had a director of DL in place as we were writing it (DL is one of our three objectives), and we were able to highlight his qualifications.  Additionally, and perhaps more importantly, it demonstrated institutional commitment: we were not asking to fund yet another (high-dollar) position that would need to be sustained down the road…the funding was already in place.

• BTW…we’re pretty efficient around here, and it still took us about three months to get the three key positions filled: Director, Instructional Technologist, Staff Assistant.  For us, the staff assistant was critical.  I cannot overstate the importance of this position; do not make it an afterthought.  Key skills and qualities are: attention to detail, time management, ability to work under pressure and short deadlines, good with numbers, organized, courteous, and able to deal with multiple personalities – our staff assistant deals with everyone from the president to the VPs, to ALL faculty and staff.

Over the course of the grant, our assistant did a lot of great things, but the thing that she was best at was the financials…not budgeting, but record-keeping, reporting (on a monthly basis), purchasing, tracking orders, and nipping at my heels when certain things needed to be taken care of (e.g. Budget revisions).

• Become friends with the Business Office – ours does the draw downs, budget revisions, and purchasing.

I have a colleague, who shall remain nameless, who directed a T3 grant at another community college.  Between IT and the Business Office at her school, it would take 18 months from PO to actual receipt of any computers ordered through the grant.  At our school, turn-around generally takes about one month.

This is partly because our Business Office is extremely efficient.  The other reason is that we have a solid relationship with our Business Office, and, by extension, the Administrative Services division.  It should go without saying, but we all know that words and deeds do not always align: Don’t butt heads with the people who control the flow of your resources.

You don’t need to shower them with gifts (although chocolate can go a long way!), but my advice is this: don’t wait until there is an eleventh-hour emergency order to speak with them.  In short: follow their guidelines and policies (even the unwritten ones); cultivate relationships – stop by and say hello when you’re in the neighborhood…it doesn’t always have to be work related; look for other ways to work together (in my case, I work with the staff on developing and assessing outcomes, and I have found myself on committees with some of the key players, which lets me develop and strengthen relationships); and don’t forget, they are people too who like to be appreciated – compliment the staff, praise them, and thank them vociferously – in person, in public, via email, or in a meeting.  There’s no need to be an apple polisher, but they are trying just as hard as you to do their job effectively.

• Become friends with HR – we have written well over 100 contracts in our five years of providing PD.  If we didn’t have a good working relationship with them, the last five years would have been miserable.

Likewise, HR is valuable to you.  They will help you hire key staff, and they will ensure that those paid via contract are processed in a timely manner.  The same philosophy holds true here as with the Business Office: These are folks who are trying their best to perform effectively.  Appreciation and praise (and sometimes, simple acquiescence) go a long way; conversely, conflict and antagonism will set you back…way back.  It’s like the old adage about the pedestrian crosswalk: you may have the right of way versus the oncoming car, but in the end, who really wins? (hint: the larger, more powerful force).

• Become friends with IR/IE – they have the data, which unlocks the door to successful APRs and efficient monitoring of progress toward your objectives…and helps any other time you need numbers.

• Evaluator – a good evaluator is worth his or her weight in gold.  And how do you know if you have a good evaluator?  For me, the most important indicators are: How well does s/he know EDGAR? (if they don’t know what EDGAR is, run in the opposite direction), and how good are they at reviewing your books?  The only real way to determine this is to find an evaluator who has deep T3 experience – as an evaluator and/or as a director (ours has been both!).  Our evaluator comes twice a year to check on our progress, to look under the hood of compliance, and to pull out the dipstick of our financial record-keeping.

Our evaluator and I made a presentation at the 2011 IDUES (T3 & T5 directors’) conference in D.C. [For full disclosure, I was not able to attend the conference/presentation…I was in Russia on a Fulbright].  Our presentation centered on the qualities of an effective evaluator and an effective relationship, between you and your evaluator.


In essence, the sooner you get your evaluator on board, the better.  In fact, in an ideal situation, the evaluator should be a key member of the grant writing team.  In addition, as I mentioned above, find someone who has a nose for compliance and good record keeping.

Your evaluator should be able to easily follow the trail of purchasing. If it’s confusing or incomplete, you need an evaluator who will candidly tell you so.  I always say: I’d rather have an ornery evaluator looking over my books than a nice auditor.

• Get the word out.

Quantity does not always equal quality.  In fact, I’d be the first to admit that of our three objectives (advising, assessment, and DL), Advising has been the weak link…and it doesn’t take much looking around on our blogs to figure this out.  Our T3 blog is Assessment-heavy, and DL has its own blog dedicated strictly to issues of DL at the College.  Nevertheless, this has not stopped us from trumpeting our work.  And I firmly believe that the grant and the College have benefitted from this.

How do we benefit from sharing and transparency?  The grant has benefitted through an ongoing, public dialogue that provides us with feedback on our implementation strategies.  Every blog posting, every conference presentation,** every podcast and webinar is an opportunity for us to gain constructive input on our objectives. And, to boot, it increases positive exposure (we hope!) for the College.

Other ways of getting the word out include: Twitter (both through my own account and through our blog – new posts are promoted via an auto-tweet); getting involved in state-wide organizations (I’m on the executive committee of the nascent North Carolina Community College Learning Outcomes Assessment Group); and signing up for the T3 listserv (a great way to answer questions and keep a finger on the pulse of all things T3…I’ve been introduced to a number of other directors via the list).  There are other possibilities as well: I’ve poked around on LinkedIn, but have been unsuccessful in discovering a T3 group – although it’s been about a year since I last looked.  And of course, let’s not forget Facebook (full disclosure: I don’t do FB, so I’m not sure of its capabilities in this regard).

Finally, and I truly believe this: It is valuable to keep your program officer in the loop.  He gets a CC whenever I send out an email to our advisory committee.  He also receives occasional notifications of new blog posts – just to remind him that we’re still blogging.  I don’t expect him to read and respond, but he has applauded us in person and publicly for our transparency.  My caveat here would be to ensure that your grant is on a solid foundation before attracting too much attention.

** this is our fourth Noel-Levitz presentation…and our second invitation to speak at this annual meeting.  Over the course of the grant, I have made 14 presentations at the national level; 10 at the state level.

Short- to Mid-Term

• Sustainability (remember, this is a developmental grant)

Is it sustainable?  With any major investment of time or dollars, this was our guiding question.  It started with online tutoring and carried through to outcomes software, early alert, advising software, and most recently, Blackboard and Moodle.  We ran each of these investments through the same test: How much is this going to cost the College once the funding is gone? Can the College really afford this without T3?

More often than not (and increasingly so in years 3-5), the answer was No, the College cannot sustain this initiative beyond the life of the grant.  Therefore, the solution was generally: let’s do it ourselves.  This started with online tutoring, continued with outcomes, early alert, and advising software packages, and it has helped drive us from Blackboard to Moodle.  In each of these cases, we could have justified the start-up expense and utilized T3 funds to get us out of the box, but there would have been no way that we could have sustained the likes of SmartThinking or Weave Online.

We found ourselves developing solutions in-house that we could tweak as needed.  The result has been much greater ease of institutionalization.  For the instances where we needed software, we dug a little deeper into the capabilities of software we already had.  For instance, the web-based Datatel can serve as advising software.  It provides the functionality we need for advising, plus the learning curve for faculty and staff has been minimal because of their familiarity with Datatel.

• PD

For our grant, 20% (~$300,000) of our funding has been dedicated to Professional Development.  And, we can proudly say (I believe) that the lion’s share has been spent on giving faculty and staff the skills and expertise to more effectively perform their duties.  In other words, most of these dollars have gone toward sending faculty and staff to trainings, conferences, etc.  These folks, in turn, bring the knowledge or skill back to campus and share it with their colleagues.

What we have not done a lot of is bringing in the hired gun.  Our belief (one part cultural, one part philosophical, and one part fiscal decision-making) has been to develop and strengthen the expertise from within. So, let me break this down.

* One part cultural.

What we learned early on is that the campus culture did not respond favorably to external assistance.  They say you can’t be a prophet in your own land, but when we brought in a few hired prophets, the campus stoned them.  I’ll be honest: we brought in an excellent consultant from Noel-Levitz to conduct a preliminary analysis of our enrollment management and advising systems.  After he left, and we were entertaining the notion of more intensive work with N-L, the push-back was so intense that we had to scrap the idea.  We could not move people out of their defensive posture when it came to someone from the outside analyzing their work.  We tried a different consultant, with the same results.  Our advice: do a cultural map before bringing in external support.

* One part philosophical.

At the same time, what we realized was that you can be a prophet in your own land.  You may not be the best prophet, but if it helps us move forward, let’s do it.  Therefore, we really turned our focus to sending faculty and staff out, and letting them bring the message home.  This has turned out to be quite effective.  Again, it’s not perfect, but after five years, the number of staff, full-time, and part-time faculty who have gained expertise, and who have something to teach, has spread to all corners of the campus.

The implicit message here is that our community is rather tight knit.  There are more than a few that call us a family – in the true sense.  We all know each other, we get along the best we can, but to some degree, like any healthy family, we are a little dysfunctional, plus we’ve got a few eccentric aunts and uncles.

So, for five years, quite a few of us have traveled to a number of state and local conferences together.  At last year’s state DL conference, 12(!) faculty and staff participated.  Two weeks ago, eight faculty and staff traveled together to an Institutional Effectiveness institute.  And, most importantly, it’s not always the usual suspects.

One additional, yet critical piece of our approach is accountability.  If T3 is putting resources into attending PD opportunities, then the faculty or staff member must understand that this is a reciprocal agreement; that the expectation is that the individual will “give back” by sharing knowledge, providing training, etc… This may take the form of actually providing a workshop, a brown bag luncheon discussion, or a post to the blog.  On paper, this looks great.  In reality, it’s been a different story.  If you take this approach, I urge you to set up a formal system of accountability.  Otherwise, you’ll spend a lot of time chasing folks who will give you nothing in the end.

* One part fiscal decision-making.

This is where we come back to the sustainability issue.  By putting faculty and staff in a position where they have to be active participants (i.e. they learn and then they must share), it reinforces the knowledge and skill that they are gaining.  By spreading this wealth, and by reinforcing it as often as possible (by either attending multiple PD opportunities, or by presenting the material back on campus), we are solidifying its place on our campus.  This creates a sustainable, relatively inexpensive source of knowledge, experience, and energy.

In the end, T3 has been a catalyst for changing the attitude toward PD.  Whereas in the past PD was viewed as a receptive activity, the shift is on toward PD being a productive one.

• Be a change agent.

Another important lesson. You may be in an enviable position.  You need to learn how to think and act like a faculty member (if you are not one already).  But, we all know that “faculty member” is not a monolithic concept.  There is wide variation in how faculty, and staff, perceive professional development.  I have learned that the more I interact with faculty and staff, particularly in a one-on-one (or very small group) situation, the better I am able to work effectively with them.  This is not to say that I have them eating out of my hand (see my comment above about eccentric aunts and uncles), but by developing a rapport, I am able to converse more freely with a greater number.

Likewise, one valuable lesson I have learned about getting support for change is to communicate early and communicate often.  Our faculty (and I’m sure some of yours too) must be organized…for months in advance.  We are working on the 2013-2014 calendar now.  Textbooks for SP 2012 need to be ordered soon.  So, asking them to “fill out this form and get it back to me by Friday” may be met with compliance, but it will not be met with quality, nor will it be met with joy.  The further out you can communicate an upcoming task, workshop, etc., and the more clarity you provide (give examples of how you want it done), the more smiling faces will greet you when they see you coming.

But, it should not all be on you.  Administrative buy-in is key also.  Not as a hammer, but as a lever.  Administration is not just the president and relevant VPs…it includes deans and chairs as well.  Everyone has to buy in.  Conversely, the more disciples you create within the faculty, the more leverage you will have as well.  Patrick’s DL initiatives are always successful because they are mostly faculty-driven.


• Keep your eyes on the prize

As all of the above suggests, sometimes it’s easy to focus on the details and lose sight of the big picture.  Why are we doing this, anyway?  Keep in mind that this is a “Strengthening Institutions” grant.  In your proposal, you pointed out the major challenges at your school, and you said “we’re going to do this, and this, and this to make us better.”   Ultimately, better at retaining students.  So, alongside “can we sustain this?”, the other critical question when considering an investment of resources is: What will this do for retention?  Run the cost-benefit analysis; sometimes the answer is Yes, sometimes it’s No.

• Learn to be flexible

According to the rules, you must stick to your objectives.  If you want to change them, you need a very compelling reason to do so.  However, this is not to say that you cannot change your strategy (one of the beauties of T3 funding is the flexibility provided).

One of our objectives is Advising.  In our proposal, we were going to develop an advising/counseling center (CAPS), directed by a full-time faculty member on release, and staffed with some part-time counselors. The faculty member driving this concept decided to return to full-time teaching shortly after the grant started.  No other faculty member was willing and able to assume this role, and we had not requested funds to hire a full-time staff member to take over the reins.  So, a little juggling of full-time counselors later, we were left with a solution that really did not work, because the heart and soul and vision had left.  This objective floundered for two years.  Why, you ask?  Because we were consumed by our ten-year accreditation re-affirmation process, and the time and energy were not there to work out a new solution.

In the fourth year of the grant (after we had successfully gained re-affirmation), we turned our attention once again to advising.  It was a major piece of our QEP (SACS), which had a different vision, and a different visionary than the original incarnation of our advising/CAPS center.

The moral of this story is that we knew we had to maintain our objective – Strengthen Advising – but we also knew that we desperately needed to change how we were going to approach it.  This past year has been fast and furious in terms of re-designing, re-structuring, and re-tooling for a new-look advising.  We are sad that the T3 funds will soon be out of reach, but we have made terrific strides because we were flexible in the way we approached this objective.

June 27, 2011

General Education Assessment – 2011

Filed under: Outcomes Assessment — Donald Staub @ 8:47 am


General Education Outcomes Assessment

Draft Report:

Written Communication


June, 2011




The purpose of this report is twofold: 1) to provide an overview of the process undertaken to assess the Written Communication skills of the students at Carteret Community College; 2) to provide Results, Analysis, and Use of Results based on the data collected during this process.  Following a summary of the process that was implemented in order to carry out the assessment, aggregated (College-level) results will be presented, along with a Use of Results and Action Plan.  In the Administrative Report, Program-level results will be provided (i.e. these results will not be viewed by anyone but relevant administrators and program CACs).

The Process

In the 2009-2010 academic year, it was determined that the College would alter its approach to assessing General Education Outcomes (Institutional Level Learning Outcomes – ILLOs).  Prior to that point, the seven ILLOs had been assessed in specific, relevant courses (e.g. Written Communication assessed in ENG 111, Computer Literacy assessed in CIS 110).  The revised process would assess ILLOs at the program level.  That is, each program would be required to identify an assessment that would be administered in a relevant course (i.e. late-stage or capstone; not introductory).  Because of the non-linear curricula in the AA and AS degrees, assessments would be administered in the courses with the highest enrollments (i.e. AA: ENG 111; AS: BIO 110).

In Spring 2010, a timeline was identified for assessing the ILLOs.  Because of the complexity of shifting this assessment process to the program level, it was decided that for 2010-2011, only one General Education Outcome (Written Communication) would be assessed.  Here is the 2010-2017 timeline:


In Spring 2010, the General Education Sub-Committee devised an implementation plan for the Written Communication assessment in the 2010-2011 academic year.  The plan included:

• A communications plan for all programs to understand the process

• Development of a common rubric

• A training plan for all readers to effectively use the rubric

• A timeline for identification of assessments (by CACs)

• A timeline for collection and scoring of assessments

• Identification and training of scoring teams – for use of rubric

In April 2011, written assessments were collected from all CACs (except Practical Nursing, whose assessment will be conducted in the summer term).  The initial plan was to collect a random sampling of papers from each program. However, given the total number of papers collected (N=279), it was decided that all submitted papers would be read and scored.

Throughout May, final preparations were made:

• Identification of 10 two-person scoring teams. Each team had at least one member from the academic division represented by the papers (e.g. a Nursing faculty member was on the team that scored Allied Health papers);

• Scoring sets were grouped according to relative similarities between programs;

• Sets were organized so that readers had a relatively equitable number of pages (not papers) to read.
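Balancing page counts across scoring teams is essentially a small load-balancing problem. As an illustration only (the set names, page counts, and the `balance_sets` function are hypothetical, not the College's actual data or tooling), a simple greedy heuristic assigns the largest remaining set to whichever team currently has the fewest pages:

```python
import heapq

def balance_sets(sets_with_pages, n_teams):
    """Greedy longest-first assignment: take the biggest paper sets first,
    giving each to the currently lightest-loaded team."""
    heap = [(0, team) for team in range(n_teams)]  # (pages assigned so far, team id)
    heapq.heapify(heap)
    assignments = {team: [] for team in range(n_teams)}
    for name, pages in sorted(sets_with_pages, key=lambda s: -s[1]):
        load, team = heapq.heappop(heap)   # lightest-loaded team
        assignments[team].append(name)
        heapq.heappush(heap, (load + pages, team))
    return assignments

# Invented example data: (program set, total pages in the set)
teams = balance_sets(
    [("Radiography", 120), ("Cosmetology", 45), ("Paralegal", 90),
     ("BLET", 60), ("Boat Manufacturing", 30)],
    n_teams=2)
```

A greedy split like this is not guaranteed to be perfectly even, but with a handful of sets it keeps reader workloads close enough to equitable.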

May 25, 2011 was designated as Scoring Day.  Scoring teams gathered in one location to receive their sets of papers.  CACs explained their assignments to the relevant team, and clarified any inquiries. Readers were allowed to leave with their papers; 75% of the sets were completed by noon; 100% within 24 hours.

Scores were entered in a master spreadsheet (see example, below).  If the scores given by the two readers on any of the four sub-components (Context, Content, Genre, Mechanics) differed by more than one point, then a third reader scored the paper.  Out of the 279 papers, 50 required a third reader (some papers had a greater-than-one-point differential on more than one sub-component).
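The third-reader rule described above can be expressed compactly in code. This sketch is purely illustrative (the score values are invented, and the actual master spreadsheet had its own layout):

```python
SUBCOMPONENTS = ["Context", "Content", "Genre", "Mechanics"]

def needs_third_reader(reader1, reader2):
    """reader1/reader2: dicts mapping sub-component -> score (1-4).
    A paper goes to a third reader if the two readers differ by more
    than one point on ANY sub-component."""
    return any(abs(reader1[c] - reader2[c]) > 1 for c in SUBCOMPONENTS)

# Invented example: Mechanics differs by 2 points, so this paper
# would be sent to a third reader.
paper_a = needs_third_reader(
    {"Context": 4, "Content": 3, "Genre": 4, "Mechanics": 2},
    {"Context": 3, "Content": 3, "Genre": 4, "Mechanics": 4})
```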

Once scores had been aggregated, averaged, and disaggregated by program, College-level results were shared with the General Education Committee on June 28, 2011.  The following pages include data on College-level results as well as a preliminary Use of Results.
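The aggregate, average, and disaggregate-by-program step can be sketched as follows. The programs, sub-components, and scores below are made-up examples, not the College's actual results:

```python
from collections import defaultdict
from statistics import mean

def summarize(rows):
    """rows: (program, {sub-component: score}) tuples, one per paper.
    Returns College-wide averages and per-program averages."""
    by_program = defaultdict(list)
    for program, scores in rows:
        by_program[program].append(scores)

    def avg(score_dicts):
        keys = score_dicts[0].keys()
        return {k: round(mean(d[k] for d in score_dicts), 2) for k in keys}

    college = avg([scores for _, scores in rows])          # aggregated
    programs = {p: avg(ss) for p, ss in by_program.items()}  # disaggregated
    return college, programs

# Invented example data
college, programs = summarize([
    ("Paralegal", {"Content": 4, "Mechanics": 3}),
    ("BLET", {"Content": 3, "Mechanics": 2}),
])
```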

Preliminary Use of Results

Upon initial review of the data, here is a potential list of Uses of Results:

• Scoring:  Only one program per reader.  CACs become readers (of different, but closely-related program?)

• Evaluate program-level writing.  Which programs emphasize writing already?  To what degree?  What writing instruction do their students receive?

• Offer Writing Workshops.  Via Academic Support.  Free to students.  Once per semester.

• Offer Writing Workshops.  To instructors.  How to effectively emphasize, e.g. Mechanics.

• Re-Assess Papers.  Once a focal point for improvement has been determined (e.g. Mechanics), re-evaluate (a random selection of?) papers for refined definition of Mechanics, and a more specific list of weaknesses (pick the top-10?).  Devise workshops for students and faculty based on this analysis.

• A more concerted effort is needed to generate broader representation of DL courses (HY & IN).


Any feedback on this report is greatly appreciated.

Feel free to send it to:

Don Staub, Director Title III


The Common Rubric (for Gen Ed outcomes assessment)

Filed under: Outcomes Assessment,Rubrics — Donald Staub @ 7:55 am

In 2010-2011, we shifted our Gen Ed Assessment program from one that gathered data in a few select courses to a broader, program-level approach.  In the past, Gen Ed (a.k.a. ILLO – institutional level learning outcomes) data had been gathered via the relevant courses that had the greatest number of students passing through them.  For instance, we collected assessment data for Computer Literacy only in CIS 110.  Likewise, Written Communication in ENG 111, and so on.

Of course, this approach had flaws.  Quite often, these courses were taken by students who were early (not late) in their community college experience (which is counter-intuitive to the essence of ILLO assessment; i.e. measuring what students have learned in their time at the College).  In addition, it was not always a representative sample (e.g. not all programs required their students to take ACA 115 – testing ground for Personal Growth and Responsibility).

So, in 2010-2011, we shifted our approach to the program-level in order to address both of these shortcomings.  We began by determining that 10-11 would be a pilot year for this process, and asked programs to focus on assessing only Written Communication. One of the essential pieces of this process (or so we believe) was to develop a common rubric that would be used to score all written samples collected for this assessment.  Once the rubric was agreed upon, we spent the Fall semester (2010) conducting training sessions for readers in how to use it.  In the Spring (May, 2011), we had a scoring day and readers used the rubric to score papers in sets that they had been assigned.  Click here to view a more detailed look at the process.

Our Gen Ed Outcomes Assessment Committee decided that we would begin with the VALUE (valid assessment of learning in undergraduate education) rubrics that had been developed under AACU’s LEAP project.  The Gen Ed committee felt that the Written Communication rubric needed tweaking for our own purposes, so it resulted in this rendition that we have used locally:


Discussion: Did it work?  Did we improve on the VALUE written communication rubric?  Was it a flawless process?

Theoretically, the rubric should work for just about any written piece.  And, for the most part, it did.  Where it got a little tricky was in how we divvied up the readings.  We tried to ensure that readers were scoring similar sets of papers: this team read Radiography and Respiratory Therapy papers, while that team read Boat Manufacturing and Cosmetology papers.  However, this did not always work out swimmingly.  In some instances, readers were scoring papers from programs that require more academic, rigorous writing (e.g. Paralegal) alongside papers from a program such as Basic Law Enforcement Training (BLET).  A paper from Paralegal may warrant a 4 (out of 4) for Content, and so might a paper from BLET; however, in the reader’s mind, these are still two distinct displays of writing ability.  Our proposed solution, the next time we use a common rubric, is to have readers score papers from only one program.

June 7, 2011

T3 Final Months…

We are entering the final months of our Title III grant. Looking back over the past five years, I feel very good about what we as a college have accomplished in Distance Learning through the various instructional initiatives Title III enabled us to pursue. All in all, I believe T3 has benefited the college and all our distance learning endeavors. Pre-planning and fluid, “open/transparent” communication between the T3 Project Director and me, the Director of Distance Learning, was one of the keys to the success of our campus-wide distance learning professional development initiatives. There is no doubt in my mind that the faculty and staff at Carteret Community College are more proficient, both technically and in their online teaching methodology, because of the funding and support that came from Title III.

Once CCC received the grant the first thing our T3 Project Director Don Staub did was have me create a T3 Blog to post anything and everything that related to Title III at our college. This was an effort to make the entire process transparent. Anyone can go to the T3 Blog and our CCC Distance Learning Blog and learn about all the professional development and college enhancement activities Title III supported through grants and the purchase of instructional and distance learning technologies.

The following are some of the primary distance learning highlights (“successes”) supported by Title III:

  • Distance Learning Pioneer Program
  • Online Tutoring Service Pilot Program
  • Blackboard Boot Camp
  • NC3ADL Regional and State Wide Conferences
  • Online Retention
  • Peer Review “QAP” Online Course Evaluation Project
  • Distance Learning Campus Wide Forum
  • Assessing Distance Learning
  • Moodle “Train the Trainer” Project
  • Moodle Migration Initiative

As of now, more than 50 staff and faculty have participated in the DL Pioneer Program, which has positively impacted the online courses we offer here at CCC. Faculty have more tools and technology for doing an effective (and innovative) job of teaching their courses in the online environment, and they have received training, both on campus and off, in a variety of e-learning and instructional design “best practices”.

As we head into the final months of the grant, it is imperative that we expand on the strong foundation Title III has helped us build here at Carteret Community College. There’s no doubt that the culture at CCC has changed for the better in terms of embracing technology and the latest instructional design best practices for e-learning. Our online courses incorporate a variety of “rich media” in order to address the different learning styles of our students. This would not have happened so extensively across campus if it weren’t for the aggressive and ongoing training provided through T3 funding and support.

Our challenge is to put procedures in place so we can continue expanding upon the many successes T3 has helped us attain in Distance Learning over the past 5 years. The Distance Learning Advisory Committee is in the process of developing a policy and procedure manual for distance learning. Many of these policies and procedures have been inspired by what we learned through our T3-sponsored projects and initiatives.

Our goal is to continue to build on the momentum we’ve gained over the past 5 years thanks to the Title III grant. We have seen the culture at CCC change (evolve) over the past 5 years to embrace technology and online tools and applications in order to be on the “cutting edge” of online education. Title III has been a catalyst for this change in attitude and has added to the confidence of our staff and faculty in making a seamless transition from the traditional classroom environment to online instruction.

Video and Written Testimonials by Staff and Faculty.

Joseph Croom from CCC Student Services commented about a Title III sponsored conference: “This was an amazing conference; it was my first NC3ADL conference…something that made me really open my eyes. I learned so much about student services, and how to make sure that they are accessible by all. I was able to meet so many different people, from different jobs, at different colleges, who brought a variety of perspectives on DL and its place in the world today. I was able to get many new ideas that I plan to work on with my department, Student Services, and the college overall to make Carteret Community College a great place to work, teach, serve and learn.

I thoroughly enjoyed going to the sessions on the NCCCS Help Desk, the Online Student Services, Google Apps for Education, and the great Skype presentation on Moodle.

Though it scares me, it excites me…stepping out on the ledge…driving the engine of Student Services, getting them up the hill and down the other side. I came back with great suggestions, so much energy, and great comments…the Registrar’s Office decided today to add live chat to their website.

I am so excited about the ideas swirling in my mind…causing waves of DL energy!! I hope that I will be able to go next year and to the regional meeting in March….!!!!”

CCC Anatomy and Physiology Instructor Phillip Morris stated, “I was greatly impressed with the NC3ADL conference, from the level of organization to just how incredibly informative it was. Kudos to those involved in putting the whole thing together. I came away with knowledge and ideas that I hadn’t even imagined. The first thing I took with me is how many improvements I can make to my online courses. It doesn’t matter how good you think they already are; there is always room for improvement. I personally am now aware of how deficient my courses are in the area of accessibility for special-needs students. I will certainly work to correct that. Also, seeing what is now out there in the area of online science labs impressed me. From the late-night-lab presentation and demonstrations to the virtual microscope software available, all I kept saying was ‘wow’. Always something new, always changing. As a new faculty member, it also gave me a chance to really get to know some of my colleagues here at CCC. It was a group of great people and great educators, people who love what they do. It was a great experience.”

March 18, 2011

NC3ADL Sp11 – Assessing Distance Learning

Filed under: Uncategorized — Donald Staub @ 8:01 am

click on photo to download pdf of ppt

Here is a copy of Don’s presentation on Assessing Distance Learning.
You will see in the presentation that there are two areas of focus:
1. Assessing the DL Program broadly
2. Assessing Distance Learning courses

If you want to download any of the materials that were discussed in the presentation, Click Here.

February 23, 2011

NACADA Advising Assessment Institute

Filed under: Uncategorized — Donald Staub @ 7:21 am

I’m participating in the NACADA Advising Assessment Institute this week in Clearwater, FL. I’m a first-timer, so I’m both anxious and excited, and I'm looking forward to an intense couple of days of learning (with some sharing). If you’re interested in the conference home page, click here…or in the agenda, click here.

Just to get us started, the main questions that I come with are:
* How does assessing advising differ from assessing the other services we assess?
* What are effective, multiple-perspective, relatively resource-friendly (as opposed to resource-intensive) assessment tools/processes that will get us where we need to be?
* AND, here’s my soapbox: which of these are relevant to the needs of community colleges? [Is it just me, or does it seem that the more conferences I attend, the less I hear about community colleges? Is it that fewer of us are showing up due to travel restrictions, and thus less emphasis by the organizers?]

Stay tuned as I check in to report and comment on my learning outcomes(!).

December 1, 2010

Best Practices in Distance Learning

Filed under: Conferences,Faculty Professional Development,Presentations — Donald Staub @ 12:13 pm

Click on the image to download a pdf of the ppt


Laurie Freshwater, Division Director of our Allied Health Programs, is presenting at the USDLA’s 4th International Forum for Women in E-Learning in Albuquerque. The title of her presentation is: Best Practices in Distance Learning.

The abstract for her presentation is:

Attrition in DL courses continues to be 10-20% higher than that of face-to-face courses.  This session will provide attendees with resources and current/emerging technological tools available to increase retention in DL courses through the establishment of learning communities and the use of learner-centered instructional approaches.

November 15, 2010

NC3ADL: Assessing Distance Learning

Filed under: Conferences,Presentations — Donald Staub @ 12:38 am

We’re at the 2010 annual meeting of the NC3ADL (North Carolina Community College Association of Distance Learning).  And, by my estimation, this is the best one yet.  Just check out the program: high-quality presentations, very few repeat sessions, and, as always, great opportunities to network with colleagues from far and near(!).  Now that the conference is finished, I’d encourage you to jump over to Patrick’s DL blog and check out his reporting, as well as commentary from some of the CCC team members who attended.

I am presenting on the Assessment of Distance Learning (8:30 … yikes…Monday a.m.).  My presentation will have two facets: one is how we, as an institution, are assessing the quality of DL overall; the other is a look at how assessment in DL courses is taking place at the program and course levels.   I believe this presentation/discussion has application both to DL directors and to classroom instructors, both of whom have concerns (at different levels) about whether or not DL is passing muster.

Click on photo to download pdf of ppt

Relevant documents referred to in this presentation:

  • The Quality Assessment Plan (QAP) for evaluating and certifying distance learning courses before they go “live”
  • SACS Policy on Distance Learning
  • Distance Learning Program Evaluations

October 26, 2010

IUPUI 2010: Assessing Distance Learning

Filed under: Conferences,Presentations — Donald Staub @ 10:52 am


Click on image to download ppt

On October 26th, I’ll be presenting on the assessment of distance learning at the 2010 IUPUI Assessment Institute.  This is something that I’ve been working on (and presenting on) for over a year now (and I’m very grateful to the planning committee at this esteemed conference for allowing me to present “the latest”).

Because this is a work in progress, each time I present there are differences in the data and in the way that I’m interpreting the results (both qualitatively and quantitatively).  While I may be exploring this issue from a community college perspective, I believe that this presentation has relevance to all higher education institutions: how do we know that what we’re teaching is actually being learned?

This started out as a reaction of sorts to the question we were having to answer for our accrediting body (SACS): What are you doing to ensure quality control of your distance learning program?  That is still a major portion of this presentation.  But as I dig deeper into the issue, I am spending more time at the course and program level, trying to determine what is going on there.  And, if I may say so, it’s a darn good thing that SACS is pushing us to look more closely at this, because as I examine it at our (relatively small) school, I see wide variation in how instructors approach assessment in, and of, their DL courses.

If you would like to download a copy of the powerpoint, click on the image at the top of this posting.

Other relevant items discussed during the session, which you may be interested in looking at, can be found below.

Relevant documents referred to in this presentation:

  • The Quality Assessment Plan (QAP) for evaluating and certifying distance learning courses before they go “live”
  • SACS Policy on Distance Learning
  • Distance Learning Program Evaluations
