Carteret Community College Title III Grant

October 23, 2010

IUPUI Assessment Institute – 2010

Filed under: Conferences,Notes from the Project Director - Don Staub — Donald Staub @ 10:56 pm


On October 24th – 26th, I’ll be attending (and presenting at) the 2010 IUPUI Assessment Institute – “…the nation’s oldest and largest event focused exclusively on Outcomes Assessment in Higher Education.” I’ll be reporting here on my interactions with the institute – my comments, questions, and observations. I hope you’ll follow along.

The first thing I’ll be reporting on will be the pre-conference workshop: Implementing Student Electronic Portfolios for Assessment.

Followed by notes and commentary from these breakout sessions:

  • Panel Discussion (Linda Suskie – moderator): “Rethinking Assessment’s Purposes”
  • Breakout: Non-Cognitive Abilities and the First-Year Student: The Role of Assessment and Intervention (Gore)
  • Breakout: Online & Hybrid Courses: From the Students’ Perspective (Cook & Boldt)
  • Breakout: Assessing Online Courses and Online Evaluation Systems (Fatia & McGovern)
  • Breakout: Three Promising Alternatives for Authentic Assessment of SLOs (Banta et al.)
  • Breakout: Assessing General Education: Mission Possible (Rose Carr & Stoudermire)


October 12, 2010

Assessing DL: Are they learning what we’re teaching?

Filed under: Conferences,Presentations,T3 Presentations — Donald Staub @ 1:04 am

 

Click on image to download ppt

 

This was a presentation made at the 2010 conference of the North Carolina Association of Community College Instructional Administrators (NCACCIA) in Asheville.  Comments and feedback on this presentation are greatly appreciated!

Relevant documents referred to in this presentation:

  • The Quality Assessment Plan (QAP) for evaluating and certifying distance learning courses before they go “live”
  • SACS Policy on Distance Learning
  • Distance Learning Program Evaluations

September 27, 2010

Retaining Students in Online Education

Filed under: Conferences,Faculty Professional Development,Retention Issues — Donald Staub @ 6:46 am

Mary Walton (Division Director of Business Technologies), Patrick Keough (Director of Distance Learning), and I are in Atlanta from Monday to Wednesday of this week, attending Academic Impressions’ workshop “Retaining Students in Online Education.” We are here to spend the next couple of days learning about and planning “…methods to track students, document progress, and put specific practices in place to ensure success” (from the brochure).

We will be posting our learnings and impressions throughout the workshop.  As a quick overview, here’s what’s on the agenda:

  1. Rethinking Retention: “…with accountability and graduation rates becoming major issues, it is even more important to address retention in online education.”
  2. Identifying Needs: There are usually specific reasons why [online] students enroll; being able to identify such reasons and respond appropriately can make or break a program.
  3. Developing Dashboards for Data Management: “How can you monitor progress and performance within a student’s lifecycle at your institution?”
  4. Measuring Retention Success: “Identify the significant characteristics of your student population and clarify retention goals at each step in the process from application to the end of the first term.”
  5. Critical Support Services: “…Institutions are challenged to integrate a wide range of student services to promote academic achievement and retention”
  6. Early Engagement Through Online First-Year Experiences: “… methods to engage and connect online students from the first point of contact.”
  7. The Role of Faculty and Academic Advisors in Online Student Retention
  8. Delivering Support Services Online

Mark Parker and Kristen Betts led the first day’s sessions on:

  1. Rethinking Retention,
  2. Identifying Needs, and
  3. Dashboards (for data display).

My take-aways from these first three sessions:

  • First and foremost, when it comes to retention in DL, we may not be perfect, but we are doing a lot of good things. We are providing boatloads of professional development to our faculty (thanks, Title III), we are providing more and more services to our students in a cost-effective manner, we are assessing what we are doing, we are training our students to be better online learners, and we are coming to conferences such as this to gather information.
  • For me, one of the more interesting topics of discussion on the day was around “managing expectations of our students.”  The key point being – we can’t do it all, all the time.  Therefore, we need to ensure that our students understand what to expect…when they’ll get a response.  As one of our colleagues put it: “Is the service reliable…’Tell me what’s available and when it’s available.’”
  • One way to think about providing services to more students would be to collaborate with other colleges.  One suggestion was to form a consortium (as they have done in Mass.) to provide online tutoring.  Pooling of resources is a good thing.
  • How about this idea, described by a colleague here: at their school, students send emoticons to the help desk. An automated email is generated and sent to the student asking, “Do you want me to intervene?” This engages the student without requiring staff time and effort until it is truly necessary. (See the sketch after this list for what such an auto-reply might look like.)
  • I thought this was a good idea that one school has implemented: for all first-year courses that are taught online, phase in the use of technology. Don’t present all the bells and whistles from the outset; let students become comfortable with the technology gradually.
  • And, in the discussion on dashboards, the notion that they are not just for the leadership is obvious, but often overlooked. As Kristen Betts pointed out: “optimize your dashboards for your division directors and program chairs”…what she referred to as “micro dashboards.”
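For the technically curious, here is a minimal sketch of what such an emoticon-triggered auto-reply might look like. To be clear, this is purely illustrative: the trigger emoticons, addresses, message wording, and mail relay are my own assumptions, not that school’s actual help-desk system.

```python
# Hypothetical sketch of the emoticon-triggered auto-reply described above.
# The trigger set, addresses, wording, and SMTP relay are all assumptions.
import smtplib
from email.message import EmailMessage

DISTRESS_EMOTICONS = {":(", ":-(", ":'("}  # assumed trigger set

def maybe_auto_reply(student_email: str, message_body: str) -> None:
    """Send the automated 'Do you want me to intervene?' email when a
    help-desk message contains a distress emoticon; otherwise do nothing."""
    if not any(e in message_body for e in DISTRESS_EMOTICONS):
        return
    msg = EmailMessage()
    msg["From"] = "helpdesk@example.edu"  # placeholder address
    msg["To"] = student_email
    msg["Subject"] = "Checking in"
    msg.set_content(
        "Do you want me to intervene? Reply YES and a staff member "
        "will follow up with you."
    )
    with smtplib.SMTP("localhost") as server:  # assumed campus mail relay
        server.send_message(msg)
```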

4. Measuring Retention Success

This data-rich session was facilitated by Bill Bloemer, Research Associate at the Center for Online Learning Research and Services (COLRS) at the University of Illinois Springfield (UIS). [The director of COLRS, Ray Schroeder, has a blog about online learning.] Some of the interesting discussion points that came out of this session include:

  • Students at UIS are hoarding courses…then they drop to fit their needs. “Excessive churn” from hoarding at the beginning of the semester is wreaking havoc with gathering true enrollment data. There is also the issue of students not getting what they want because someone else has “gamed” the system and grabbed a section that another student may truly need.
  • Look at withdrawals by registration date. Is LIFO true – do the last students to register withdraw first? One school at the conference claims that their data says it isn’t; instead, they found that those late-registering students arrive focused and intent on completing a course.
  • Is there a connection between age and withdrawals? Data from UIS that Bill showed suggests that there is.
  • Is it possible, utilizing Academic Analytics (click for Bill’s recommended reading), to predict who will get an F/W in an online course? Bill led a lengthy discussion of a binary logistic regression model he had been using to look at students who had earned an F. He worked backwards from this population to identify a common set of prediction variables. What he found was that, at best, the model could correctly flag slightly over 12% of the students in a course who would go on to W/fail. (See the sketch after this list.) Some of the “best” indicators that got him to this level of success are:
  1. the individual courses (those that have traditionally high rates of W/F; focus on the outliers…track only the problem courses)
  2. the student’s prior GPA
  3. prior hours resulting in an F/W (“Failure breeds failure.”  If you fail once, chances are, you’ll do it again.)
  4. student’s major
  • From our Australian colleagues (UNE-Au), Rhonda and Greg: “We take the student perspective [vis-à-vis] course enrollment vs. student goal success. You may lose them in the short term, but let’s focus on keeping them for the long term.” The interventions and practices they have designed work to this end.
  • Another insightful question worth posing (and whose answer is well worth promoting in order to get the attention of administration): What is the cost of increasing retention by 1%?
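For those who want to see what such a model looks like in practice, here is a minimal sketch of a binary logistic regression built on those four indicators. This is not Bill’s actual model: the file name, the column names (course_id, major, prior_gpa, prior_fw_hours, got_fw), and the settings are hypothetical placeholders.

```python
# Hypothetical sketch of a binary logistic regression for flagging F/W risk,
# loosely following the four indicators listed above. The CSV file and column
# names are placeholders, not UIS's actual data or model.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# One row per student-course enrollment (placeholder file).
df = pd.read_csv("enrollments.csv")

# Indicators 1-4: course, prior GPA, prior F/W hours, and major.
X = pd.get_dummies(df[["course_id", "major"]]).join(
    df[["prior_gpa", "prior_fw_hours"]]
)
y = df["got_fw"]  # 1 = earned an F or withdrew, 0 = otherwise

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Recall on the positive class answers the question at hand: of the
# students who actually W/failed, what share did the model flag?
print(classification_report(y_test, model.predict(X_test)))
```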

5. Critical Support Services

  • Kathleen Polley, Director, Online RN-BS Program, University of Massachusetts Boston
  • The shift has been from a campus-centered to a consumer-centered model, where control is shared with the student.
  • Critical Services – what are the “stressors” for your population?  What’s their skill set?  How do you support them?  Use this to identify and develop your “critical services.”
  • One (successful) way that was suggested to increase Engagement: the weekly online chat – not required, but it’s used to talk about issues that are on the minds of the students in the program.  Kathleen pointed out that while online is supposed to equal asynchronous, giving equity to all students, she still has very high rates of participation in this synchronous chat.
  • Here are some pointed thoughts on expectations:
  1. Don’t tell students you will do things that you can’t
  2. You have to tell students what to expect from tutoring
  3. Every interaction is a “trust building” opportunity

Kathleen also talked about a successful Virtual Learning Community within Bb…let the students use it themselves as a place to meet and discuss. This has been a good way to build engagement among her students.

6. Early Engagement Through Online First-Year Experiences: “… methods to engage and connect online students from the first point of contact.”

  • Kristen Betts, Associate Clinical Professor, School of Education’s Higher Education Program, Drexel University
  • The average percentage of a student’s course load taken online is predicted to reach 60% by 2020
  • She also suggested that we straight-out ask our students (in the student survey): Are you thinking about transferring/leaving?
  • She also argued that orientation is a process, not an event. Their orientation is 75 minutes total…each presenter talks, then leaves…but orientation continues throughout the year via their FYE.

Their FYE is event-focused…Key events:

  • Tea/wine & cheese party (they do this with a virtual component)
  • Invited speakers
  • Alumni speakers (work with John Smith/Wanda; offer courses online to alumni)

7. The Role of Faculty and Staff in Online Student Retention

  • Kathleen Polley, Director, Online RN-BS Program, University of Massachusetts Boston
  • “An assessment of student engagement must encompass the policies and practices that institutions use to induce students to take part in these activities.”
  • Not every student needs to be socially connected.
  • Faculty engagement is key for student engagement. A key consideration for faculty is “satisfaction with transparency”: faculty need to know where senior management is going, and they need to be satisfied with policies.
  • Kathleen suggested that during Week 4 of a course, students provide a formative evaluation (e.g. What have you learned so far? What would you still like to learn?)
  • Does your school have an online readiness assessment? What does it measure? And how reliable is the assessment?
  • Key indicators for student engagement: how frequently students log in, and how often they read something before posting. (See the sketch after this list for one way to compute these.)
  • “How can we assess how often a student is engaging in the online material?”
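Here is a minimal sketch of how those two indicators might be computed from an LMS activity export. The file name, columns, and event labels are my assumptions, not Bb’s actual log format.

```python
# Hypothetical sketch of the two engagement indicators above, computed from
# an LMS activity export. Columns and event labels are assumptions.
import pandas as pd

# Expected columns: student_id, event ("login" | "read" | "post"), timestamp.
events = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])

# Span of the log in weeks (at least one, to avoid dividing by zero).
weeks = max((events["timestamp"].max() - events["timestamp"].min()).days / 7, 1)

# Count each event type per student.
counts = events.pivot_table(
    index="student_id", columns="event", values="timestamp",
    aggfunc="count", fill_value=0,
)

counts["logins_per_week"] = counts["login"] / weeks
# Reads per post: how much a student reads before contributing.
counts["reads_per_post"] = counts["read"] / counts["post"].clip(lower=1)

# Least-engaged students float to the top of the report.
print(counts[["logins_per_week", "reads_per_post"]]
      .sort_values("logins_per_week"))
```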

8. Delivering Support Services Online

  • Kathleen Polley, Director, Online RN-BS Program, University of Massachusetts Boston
  • Admissions: do we really need everything we are asking for?
  • Have technology scaffolding throughout the semester in online courses [should we create technology CLLOs for each online course?]
  • U of New England – Australia: Check their library website for learning skills training (online).
  • Students look at the way you deliver your services and equate that with the way that you deliver instruction (i.e. is it quality?)

9. Benchmarking

  • Bill Bloemer, Research Associate & Dean Emeritus of Arts & Sciences, University of Illinois Springfield
  • Data point: Terms since last registered.
  • Does your degree-audit system talk with your data warehouse?
  • SP-FA retention vs. FA-SP retention (see the sketch after this list)
  • What are the completion/graduation rates of those who are online-heavy in course loads?
  • “Term-earned hours” is a better predictor than “attempted hours.”
  • Course evaluation question: What is your expected grade?
  • On-ground courses that switched to online evaluations saw an increased overall return rate.
  • Bb has an anonymous evaluation feature
  • Use online evaluation results as a component of “evaluate instructional modalities” in program review
  • Are there online-specific questions on CCSSE?
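To make the FA-SP vs. SP-FA comparison concrete, here is a minimal sketch of the calculation from hypothetical term-enrollment records. The file, columns, and term labels are assumptions for illustration; a real version would also exclude students who graduated between the two terms.

```python
# Hypothetical sketch of FA-SP vs. SP-FA retention from term-enrollment
# records. File, columns, and term labels are assumptions.
import pandas as pd

# Expected columns: student_id, term (e.g. "2009FA", "2010SP", "2010FA").
enroll = pd.read_csv("term_enrollments.csv")

def retention(start_term: str, next_term: str) -> float:
    """Share of students enrolled in start_term who re-enrolled in next_term."""
    cohort = set(enroll.loc[enroll["term"] == start_term, "student_id"])
    returned = set(enroll.loc[enroll["term"] == next_term, "student_id"])
    return len(cohort & returned) / len(cohort)

print(f"FA-SP retention: {retention('2009FA', '2010SP'):.1%}")
print(f"SP-FA retention: {retention('2010SP', '2010FA'):.1%}")
```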

 

September 10, 2010

Assessing Distance Learning: Are they learning what we think they are?

Filed under: Conferences,Presentations — Donald Staub @ 8:19 am

Click on image to download ppt

This presentation is being made at the first Learning Outcomes Institute of The North Carolina Community College Learning Outcome Assessment Group. The NCCCLOAG is dedicated to promoting the meaningful assessment of student learning at North Carolina Community College institutions and to facilitating the exchange of best practices among colleagues. We don’t have a website yet, so no hotlinks…sorry (stay tuned!).

The goals of the organization include the following:

  • To develop a web presence;
  • To facilitate the development of a common “learning outcome assessment” vocabulary;
  • To seek out and support our NCCCS colleagues who are charged with the assessment of learning outcomes;
  • To support presentations at professional conferences;
  • To create and present a Learning Outcomes Institute; and
  • To develop a set of best practices in learning outcome assessment.

Relevant documents referred to in this presentation:

  • The Quality Assessment Plan (QAP) for evaluating and certifying distance learning courses before they go “live”
  • SACS Policy on Distance Learning:

Click on image to download document

  • Distance Learning Program Evaluations:

Click on image to download evaluation

August 18, 2010

Phase III Program Reviews 2009-2010

Filed under: Instructional Program Review — Donald Staub @ 11:29 am

We have just completed the third of four phases for the program review process.  If you would like to view previously completed reviews, click here for Phase II programs (2008-2009), and here for Phase I programs (2007-2008).  What follows are the Executive Summaries of each of the Phase III programs.  Scroll down to see the full narratives of each of these reviews.

Executive Summaries

Business Administration

Computer Information Technology

Interior Design

Math

Office Administration

Therapeutic Massage

Program Reviews

Associate in Science

Business Administration

Computer Information Technology

Interior Design

Math

Office Administration

Therapeutic Massage

June 20, 2010

Distance Learning Comprehensive Report Spanning 4 Years

Filed under: Distance Learning - P. Keough — Donald Staub @ 10:03 am

Click here to view a comprehensive overview of the strides the distance learning department has made over the past 4 years, thanks to the support and funding of Title III.

April 20, 2010

The iPad Pilot

Title III is going to run a pilot on the use of an iPad.

Here’s the plan. Title III will purchase an iPad. The device will be on loan to a pilot group of faculty and staff for two-week intervals. For those who participate, the expectation is that during the two-week period, that faculty or staff member will explore ways to provide better instruction or service. At the end of the full pilot period, the group will convene to discuss the pros and cons of the iPad and, if the overall consensus is positive, to consider a larger pilot program for staff and faculty.
Participants were required to submit the following to be considered for the pilot:

  • An indication of interest in participating in the project.
  • A brief (~200 word) description of what the faculty/staff member would explore (to improve instruction or service).
  • A promise to blog during the two-week trial period.
  • A promise to come together and discuss experiences at the end of the pilot period.

Once we get the project up and running (i.e. once we get the iPad), you’ll be able to follow participant reflections/feedback here.

April 11, 2010

Exploring the Assessment of Distance Learning – April 11, 2010

Click on image to download ppt

On April 11th, I made a presentation at Carteret CC on assessing Distance Learning. The framework for the presentation was the new set of guidelines that SACS has posted on its website for assessing DL.

Artifacts mentioned in the presentation:

In my presentation, I referred to a number of artifacts – but I only showed snapshots of them. Here are some of them, more easily accessed:

  • SACS-COC document: Distance Education and the Principles of Accreditation

  • CCC-Distance Learning blog – which includes information on all of the bootcamps for training faculty.
  • The Quality Assessment Plan (QAP) for evaluating and certifying distance learning courses before they go “live”
  • A similar instrument – for assessing online courses – developed at Cal State Chico.

  • Rubrics used for discussion boards in DL courses. Here are two examples: RCP 114 (Respiratory Therapy) and ART 111.

Atlantic Assessment Conference – 2010

Filed under: Uncategorized — Donald Staub @ 2:59 pm

[sorry…this posting is still under construction…taking me a lot longer to complete it than I thought]

You may know the Atlantic Assessment Conference in its earlier form: the NCSU Undergraduate Assessment Symposium. The program announces that some things are the same, and some are different. I’ve been here twice before, and believe that it’s a very good conference in terms of content (learnings) and networking. I’m looking forward to this year’s model, hoping to experience the same quality along with the new.

The first thing I did was attend the pre-conference workshop (Sunday 4/11 – 9:00-12:00): The Use of Rubrics for Assessment, Grading, and Encouraging Student Learning, presented by Mary Allen of Cal State-Bakersfield.  (see my notes below)

Next up was the lunchtime keynote by Randy Swing. He talked about the role of assessment officers as “choice architects” and “nudging” people toward positive change. (see my notes below)

The first afternoon concurrent session was given by Keston Fulcher and Chris Orem of James Madison University.  They talked about assessing their assessment plans using a rubric they had developed. (see my notes below)

At 5:00, I gave my presentation on assessing Distance Learning, so I skipped the second concurrent session to do some final tweaking.

On Monday (4/12/10), Mary Allen (the workshop presenter) gave a plenary session on the “Infrastructure for Sustainable Assessment.” She talked about many institutions paying lip service to the notion of continuous assessment cycles, but what she has found is that “assessment efforts wax and wane with the accreditation cycle.” Her thoughts on actually making assessment continuous are discussed below.

The first concurrent session of the day was Assessing Information Literacy. It was offered by Scott Heinerichs & Loretta Rieser‐Danner of West Chester University (PA). They explained their development of an Information Literacy rubric, and how they were applying it at their school (see my notes below).

The next breakout I attended was Making Student Learning Outcomes and Assessment a Part of the Culture, presented by Roger Werbylo & Shelly Dorsey of Pima Community College (Tucson, AZ). With 60,000 students, they have quite a few faculty on 5 campuses who need to learn about outcomes assessment – and do it. They talked about their approach to broad and deep professional development.

The final session of the day was a 1.5-hour discussion of the VALUE rubric project through the AAC&U. The presenter was Terrel Rhodes of AAC&U. The conversation focused on the process of developing the VALUE rubrics, and included a brief exercise where participants applied one of the rubrics to a writing sample.

So, here are my notes on each of these sessions…

Sunday April 11

Workshop – The Use of Rubrics for Assessment, Grading, and Encouraging Student Learning – Mary Allen / Cal State-Bakersfield (click here to read her profile)

This turned out to be a good, fast-paced session.  We talked about technical items (norm-referenced v. criterion-referenced rubrics…the latter is the way we should be going), point scales in rubrics (Mary advocates for the 4-point scale), ways of developing rubrics and ways to ensure greater inter-rater reliability.

Lunchtime Keynote

Randy Swing (of the Association for Institutional Research). Randy’s talk was entitled Assessment Officers as Agents of Change. He framed his talk around making sure that we are indeed providing value ($) to the students and the instructors. He mentioned the ASK (Assessment Skills and Knowledge) Standards developed by ACPA and endorsed by assessment experts on staff at the Association of American Colleges and Universities, the Higher Learning Commission of the North Central Association, and the Western Association of Schools and Colleges. [btw, there’s a three-day conference around the issue of assessment in student affairs this June in Charlotte, NC]

In the afternoon, the first concurrent session (2:30-3:30) I attended was: (CS1) Assessing the Assessment: Communicating Expectations for Academic Programs, with Keston Fulcher & Chris Orem.

Notes from the Rubrics Workshop

April 8, 2010

Pre-Ah Hill – CCC Staff Person of the Year!

Filed under: Uncategorized — Donald Staub @ 2:43 pm

image from the CCC website

Congratulations to CCC staff person of the year (2009-2010): Pre-Ah Hill!!

This only confirms what we’ve all known for some time: That Pre-Ah loves her job, and everyone loves the job that Pre-Ah does.

Pre-Ah is the Title III Instructional Technologist, working with faculty on any issue related to making their DL courses better learning environments for their students.

We are very lucky to have her.
