Carteret Community College Title III Grant

February 21, 2010

Texas A&M Assessment Conference – Feb 21-23, 2010

Filed under: Conferences — Donald Staub @ 5:58 pm


This is my second trip to TAMU (thanks to Title III), and I’m very happy to be here. The lineup looks great – check out the agenda and see for yourself. [NEW: click here to access handouts from workshops and concurrent sessions.] Follow this blog here as I report on the workshops, the plenaries, the concurrent sessions, and one of my favorite conference activities – the Communities of Practice Dinners (organized around a common theme, e.g. assessment at community colleges; groups of conference participants head out to a local restaurant for interesting discussion and networking).

This year, I was also lucky enough to get a proposal accepted for presentation. I’ll be presenting Monday from 4:00-5:00 on our assessment efforts at CCC. The presentation is titled: Assessing Distance Learning at Carteret Community College – Methodology and Meaning (click on the title for the PPT, narrative, etc.). The objective of the presentation is to discuss the ways in which CCC is assessing its DL program and the ways in which assessment is taking place in DL courses (two different things). First and foremost, we are doing this to assess ourselves and to gauge the impact that Title III resources are having on our campus.

And, of course, there is SACS.  The older policy on this topic (Distance and Correspondence Education – July 2009) has recently (Jan 2010) been replaced with a much more comprehensive document (under Commission Guidelines):  Distance Education and the Principles of Accreditation. I intend to use this document as a framework for discussing how we have approached assessment of DL at CCC.

Sunday Feb 21

Today is a workshop day. In the morning, I’m attending:
Assessing Academic Advising Through Student Learning Outcomes

  • It’s a process (we assess by whether students learn through the process)
  • NACADA assessment listserv
  • Assessing advising…not advisors.
  • Assessment of Advising: the process through which we gather evidence about the claims we are making…about the student learning that takes place through the advising process
  • [Add tab to Outcomes software…for advising?]
  • Three models of Advising – Centralized (e.g. an advising dept); Shared (w/in department & advising center); Decentralized (wholly w/in a department)
  • Professional Development stipends provided to advisors
  • When conducting PD (especially conference travel), get the process started as far in advance as possible: team-building, goal setting, assessment plan, etc.
  • Mission Statement: ensuring that advising is a collaborative process
  • Two types of learning outcomes: 1) process/delivery (advisor); 2) student learning
  • Definition of SLO: articulate what students are expected to demonstrate that they know, are able to do, and value/appreciate as a result of involvement in the academic advising experience.
  • Key questions to ask in assessing the SLO – 1) What is the SLO? 2) Where do they learn it? 3) By when should it be learned? 4) For whom, when, and how often will evidence be gathered? 5) Where/how will you gather evidence? 6) Level of expected performance…performance criteria [in our lingo: target/benchmark]

In the afternoon, I’m participating in the session:

RUBRICS: How to Create an Assessment Tool to Save Grading Time and Engage Students in Their Own Learning.

Dannelle Stevens, Professor, Department of Curriculum and Instruction, Portland State University

  • Overheard at beginning of session…two sources of rubrics: Rubistar and the rubrics repository at Winona State U.
  • Why use a rubric? It saves time and it clarifies expectations (don’t assume students will get them from the syllabus); you communicate your expectations more clearly.
  • A rubric is a full description of the assignment or task, laid out in a matrix.
  • 4 basic parts: 1) Task description (taken from the syllabus; if the course is online, students should attach the rubric at the end of the paper); 2) Dimensions/criteria; 3) Scale; 4) Descriptions of the dimensions (a minimal sketch of this structure follows this list)
  • The 4 stages of Rubric Construction: Reflecting, Listing, Grouping & Labeling, Application
  • Reflecting: brainstorming the descriptors (food)
  • Listing: dimensions/criteria (beer, bbq, corn on the cob)
  • Grouping & Labeling: putting descriptors in appropriate categories…One suggestion: start with a pile of papers, group them into good, mediocre, and bad, then build the rubric from those piles.
  • Application: on grid, fill in highest then lowest, then middle
  • Informed judgment lies between objectivity and subjectivity
  • We are helping students internalize these descriptors – so they can judge on their own
  • Just because it’s difficult to measure, it doesn’t mean you can’t develop a rubric for it
  • Students need to know the expectations
  • We also split into groups and developed rubrics based on the interests of the group. Our group worked on a rubric for Personal Growth & Responsibility. We didn’t get to the actual rubric…we made it through the first three steps of the process (ending with grouping & labeling), but ran out of time.
  • Another group looked at Portfolios (as a requirement for graduation) – Categories: 1) presentation, 2) content (artifacts, reflections), 3) quality of reflections/rationale (they had to link competencies to artifacts), 4) addressing program objectives/competencies/proficiencies (artifacts demonstrated proficiencies that teacher ed students needed to pay attention to for the state).
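
[Not from the workshop, but to make the four-part structure concrete for myself: a minimal sketch of a rubric as a data structure, with a simple scoring helper. The task, dimensions, scale levels, and point values below are hypothetical examples, not Stevens’ content.]

```python
# Minimal sketch of a rubric as a data structure (hypothetical example).
# The four parts: task description, dimensions/criteria, a scale, and
# a description of each dimension at each scale level.

rubric = {
    "task": "Write a 3-page reflective essay on your advising experience",
    "scale": ["Exemplary", "Competent", "Beginning"],  # highest to lowest
    "dimensions": {
        "Organization": {
            "Exemplary": "Clear thesis; every paragraph advances the argument",
            "Competent": "Recognizable structure with occasional digressions",
            "Beginning": "No discernible structure",
        },
        "Evidence": {
            "Exemplary": "Specific, well-chosen examples throughout",
            "Competent": "Some examples, unevenly developed",
            "Beginning": "Claims are left unsupported",
        },
    },
}

# Hypothetical point values per scale level, used to total a grader's ratings.
POINTS = {"Exemplary": 3, "Competent": 2, "Beginning": 1}

def score(ratings):
    """Total the per-dimension ratings, e.g. {"Organization": "Exemplary"}."""
    return sum(POINTS[level] for level in ratings.values())

print(score({"Organization": "Exemplary", "Evidence": "Competent"}))  # -> 5
```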

Monday Feb 22

First Plenary Session

Dr. Belle S. Wheelan

President, Commission on Colleges of the Southern Association of Colleges and Schools

This session will focus on the role of assessment in the accreditation process and the importance of using data to explain to the general public why higher education is significant. Additionally, examples of assessment strategies used by various institutions will be shared.

  • Objectives: role of assessment in accreditation; use of data
  • I want to see what makes you think your graduates look good
  • We’re behind the curve in replacing ourselves…the number of students coming in is decreasing, and the number of dropouts is increasing (40% in the South; 30% nationally)
  • The problem: finding a vocabulary for those who need to look at the data (a national issue)…What document can we put out there, across the country, that tells us what “accreditation” means?
  • Discussions about releasing of reports – they are all public information (how much transparency – and when)
  • Courses offered to HS students should be as rigorous as what they are going to see next year at the college
  • When assessing outcomes, should we include practical experiences (e.g. e-portfolios) as well as paper/pencil assessments?
  • Incorporation of liberal arts principles with applied skills – this should be happening
  • If you’re trying to find out how a service area is performing, give it to them as a writing assignment (in ENG114?)

SACS: Students Are Central to Success (B. Wheelan)

  • Assessment to make sure that students have the tools they need to succeed when they leave
  • [DL – you can see the look on the faces of the students; classroom management]
  • Assessment: provides a variety of data points to help faculty and administration do their job better; you help them raise the standard.  You provide them information for improvement – self, student, overall.
  • Use data to “analyze the true cost per student”
  • Students “who don’t believe in debt” … they won’t come unless there is funding to get them through (grants) or if they have a job that will help them pay as they go

10:45 AM – 11:45 AM   Concurrent Session I

The Agile Assessment Loop: From Data to Workshops to Student Learning

Leslie Rach, Amy Stevens, and Kathy Wood – Gallaudet University

Interesting… these folks did not show and did not cancel.

1:30 PM – 2:30 PM      Concurrent Session 2

Using Rubric-Guided Student-Generated Video Podcasts as a Performance Assessment Tool and an Active Learning Tool

  • Project started about 5 years ago…but no rubric-based assessment until this year.
  • SUNY Stony Brook (presenters) has a recording studio
  • 25-30 students in grad-level course
  • Instructor breaks lecture (50 min) into 10 five-minute chunks
  • Posts chunked podcasts; disseminates with RSS feed
  • in undergrad courses, instructor posts video podcasts (about 10% of material)
  • For student podcasts, instructor started by simply asking students to meet one criterion: teach me something new.
  • This didn’t work, so it evolved into a rubric-based/rubric-driven assignment
  • Issues with the traditional way of teaching & assessing oral communication skills: it takes time, and giving feedback is often difficult
  • The value of the student-generated video podcast – Active learning by engaging in literature search, content preparation and presentation…
  • If the students start to teach you, you learn more.
  • Students produce podcasts to demonstrate a principle
  • Podcast Production
  • Begins with brief tutorial…but usually they know how to do it already.
  • Recording: PPT presentation captured in video
  • Inclusion of video clips

Editing

  • QuickTime

Distribution

  • RSS feed; posted to the SUNY server

  • Peer review process (students score each other…pick the top 3; a scoring sketch follows this list)
  • Issue: how to fairly grade (there was a lack of support and guidelines for students)
  • A rubric: can be used as a learning tool for students, and an assessment tool for evaluating each other
  • Literature Review + Survey + Student podcast examples
  • Survey: focused on how students use podcasts for learning
  • Results: mainly for review, exam prep, or when they missed classes
  • Assumption: if you’re producing podcasts, students will stop coming to class…what happened was that attendance did not dip (n=18). 94% of the students reported that they viewed podcasts when available.
  • Optimal length: 10-15 minutes
  • 89% of students reported that they were not engaged in other activities when they were viewing.
  • 88% reported using a pc/mac to view; 12% portable device (ipod)
  • Preferences for viewing: 50% (streaming) 33% (direct download) 17% (subscription)
  • Evaluating podcasts: Creativity, Engagement, Narration and text, and Citations
  • need to email presenter to request rubric
  • Quality has 7 dimensions: Intro, content, organization, AV enhancement, text, narration, length
  • Train the students: share the rubric with students early in the semester to guide their podcast production and refinement
  • Collect student feedback and reflection journals
  • Compare new and old podcasts to see if rubric has had an effect on quality
  • Issue: Server Space
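
[To picture the peer-review step above: a minimal sketch of the score-and-rank logic, using the four evaluation dimensions named in the session (Creativity, Engagement, Narration and text, Citations). The reviewer names, ratings, and 1-5 scale are hypothetical; the presenters’ actual rubric has to be requested by email.]

```python
# Sketch of the peer-review step: every reviewer rates each podcast on the
# four dimensions named in the session; average the reviewers' totals and
# keep the top 3. All names and 1-5 ratings below are made up.
from statistics import mean

DIMENSIONS = ["creativity", "engagement", "narration_text", "citations"]

# peer_scores[podcast][reviewer] = {dimension: rating on a 1-5 scale}
peer_scores = {
    "podcast_A": {
        "reviewer_1": {"creativity": 5, "engagement": 4, "narration_text": 4, "citations": 3},
        "reviewer_2": {"creativity": 4, "engagement": 5, "narration_text": 4, "citations": 4},
    },
    "podcast_B": {
        "reviewer_1": {"creativity": 3, "engagement": 3, "narration_text": 4, "citations": 5},
        "reviewer_2": {"creativity": 4, "engagement": 3, "narration_text": 3, "citations": 4},
    },
}

def average_total(reviews):
    """Mean of each reviewer's total across all rubric dimensions."""
    return mean(sum(r[d] for d in DIMENSIONS) for r in reviews.values())

ranked = sorted(peer_scores, key=lambda p: average_total(peer_scores[p]), reverse=True)
print(ranked[:3])  # the class's "top-3"
```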

Tuesday, February 23, 2010

8:00 AM – 9:00 AM

Using Web-based Portfolios to Assess Student Learning Outcomes and to Support Program Assessment

Frederick Burrack, Kansas State University

Frederick is in the Music Education program at KSU...his examples are from student portfolios in that division.

Assessment is often a faculty member’s prescription of what signifies learning…what is often forgotten is application and relevance.

  • A tool for longitudinal learning
  • Used as vehicle for measuring outcomes
  • Teaching standards (outcomes) are tabs at top of page
  • Click on tab and Outcomes are at top of page; student “responds” in an essay underneath list of outcomes.
  • Students write the essay near the end of their program. The essay is scored with a rubric; through it, the student demonstrates his/her achievement of the outcomes.
  • Artifacts in portfolio: lesson plans, compositions, audio files, video files
  • Write this as if you are writing for a future employer
  • Students have to learn how to self-promote in a public forum (this is open..e.g. facebook).
  • Taken off the KSU server 1 year after the student graduates – the student takes it along at graduation
  • video clips are 30 seconds – approximately 5 MB

Advantages of the E-portfolio

  • Immediate feedback
  • Periodic scoring – throughout the semester…not in a flood of papers at the end of semester
  • Accessibility
  • Longitudinal documentation
  • Students develop the technology skills needed in the workplace
  • Need to teach the students how to utilize it as a tool further down the road (and to be conscious of the public nature of the content that they are putting on it).

Creating the e-portfolio

  • KompoZer (to create the portfolio…cross-platform)
  • FileZilla (for FTP; a scripted sketch of the same upload step follows this list)
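
[FileZilla is a point-and-click FTP client; just to illustrate what that upload step does, here is a sketch of the same transfer scripted with Python’s standard ftplib. The host, login, and file names are placeholders, not KSU’s.]

```python
# Illustration only: the upload step FileZilla does by hand, scripted with
# Python's standard-library ftplib. Host, login, and paths are placeholders.
from ftplib import FTP

def upload_page(host, user, password, local_path, remote_path):
    """Send one portfolio file to the web server over FTP."""
    with FTP(host) as ftp:  # connect, then close automatically on exit
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_path}", f)

# upload_page("ftp.example.edu", "student", "secret",
#             "index.html", "public_html/portfolio/index.html")
```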

Assessment

  • two questions they ask students in a survey:
  • Do the students recognize learning associated with the eportfolio?
  • Do the students look back at the portfolio after they’ve left?
  • 85% of respondents to their survey “recognized the eportfolio as reflective of the effort put into the program”
  • 54% felt that the eP reflected the learning that occurred in their undergrad education
  • 44% used the ePortfolio on their job search

10:45-11:45

Get SMART, Take ACTION: Institutionalizing Effective Program Review

Eric Daffron, Mississippi University for Women

Sandra Jordan, Georgia College and State University

The presenter started with a “conceptual framework for academic leadership,” which looked at a plan of action and leverage points: “try to predict who must be convinced, who will lead, who will follow, who will scuttle efforts.”

It’s important to “understand what you [admin] value (e.g. accreditation, sustainable processes, etc.) and what your constituents value.” Try to connect with what faculty value (i.e. rewards, independence…).

Solution to making an effective change: uncouple IE from outcomes assessment…then re-couple.

  • strategic directions and “characteristics of our graduates”
  • Begin with BIG educational characteristics (the term was purposely chosen because, unlike e.g. “goals” or “objectives,” it was not owned by any particular group)…Graduates of our baccalaureate programs will have these characteristics [a guiding question for the program review process?]
  • Set no more than 10 (reviewed in sets of 5)
  • Depts set learning goals for each program and then link them back to the BIG characteristics

Role of the Assessment Committee…serve as support (not watchdog)

  • provide peer review and feedback
  • provide PD
  • look for linkages
  • archive
  • act as “go to” people for each college
  • lead group for focused discussion
  • advisory group for deans & departments

Departments do the assessment.  Results go back to Assessment Committee for discussion and advisement (e.g. PD opps).

SMART Assessment

  • Sharpen learning goals
  • Map (a toy curriculum map follows this list)
  • Assess
  • Review the results
  • Transfer into action
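
[The “Map” step is presumably a curriculum map – courses crossed with learning goals. A toy sketch, using the common I/R/M coding (introduced/reinforced/mastered); the courses, goals, and coding convention are my illustration, not necessarily the presenters’.]

```python
# Toy curriculum map: courses crossed with program learning goals, coded
# I = introduced, R = reinforced, M = mastered. Courses, goals, and the
# I/R/M convention are illustrative, not the presenters' material.
curriculum_map = {
    "ENG 101": {"communication": "I", "critical_thinking": "I"},
    "ENG 201": {"communication": "R", "critical_thinking": "R"},
    "ENG 490": {"communication": "M", "critical_thinking": "M"},
}

# A quick check the map supports: is any goal never brought to mastery?
goals = {g for coverage in curriculum_map.values() for g in coverage}
gaps = [g for g in sorted(goals)
        if not any(cov.get(g) == "M" for cov in curriculum_map.values())]
print(gaps)  # -> [] here; a non-empty list would flag a curricular gap
```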

Building a culture of accountability

  • assessment day (first day before fall classes)
  • monthly electronic newsletter
  • couple PD with activities of the Office of Teaching & Learning
  • be explicit about expectations for deans & dept heads
  • Enhanced appreciation of faculty

Planning Committee – runs parallel to the Assessment Committee

  • Peer review & feedback
  • PD recommendations
  • Institutional reports and recommendations to appropriate groups
  • Archives
  • Note trends, redundancies, and potential collaboration

ACTION

  • Align (two-year objective with 6-year institution objective)
  • Create (a goal and a specific measurable action step that supports the objective)
  • Tie (the action step to a benchmark that indicates criteria of success)
  • Identify (means of measuring if the benchmark has been met)
  • Observe (actual results of actions taken)
  • Notify (planning committee about plans for using results)

The form used is color-coded, which makes it easier to describe and to train people on.
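
[To make the ACTION sequence concrete, each row of such a form could be modeled as a record with one field per step. A hypothetical sketch; the field names and example text are mine, not the presenters’ form.]

```python
# Hypothetical sketch of one row of an ACTION-style planning form; the
# field names mirror the acronym but are mine, not the presenters' form.
from dataclasses import dataclass

@dataclass
class ActionRow:
    align: str     # two-year objective tied to a 6-year institutional objective
    create: str    # goal plus a specific, measurable action step
    tie: str       # benchmark that indicates criteria of success
    identify: str  # means of measuring whether the benchmark was met
    observe: str   # actual results of the actions taken
    notify: str    # plans for using the results, sent to the planning committee

row = ActionRow(
    align="Raise gateway-course completion (supports the 6-year retention objective)",
    create="Pilot supplemental instruction in two gateway sections",
    tie="Pilot-section completion rate at least 5 points above baseline",
    identify="End-of-term completion rates, pilot vs. comparison sections",
    observe="(recorded after the term)",
    notify="Summary and next steps forwarded to the planning committee",
)
```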

Review and feedback need to be constant. It’s critical that everyone feels that assessment and reporting are actually valued and acted upon.

12:00 – 1:00

Redefining General Education at San Jacinto College: The Journey Toward Defining Intellectual Competencies and Program-Level Outcomes

Judy Harrison, San Jacinto College South

Karen Hattaway, San Jacinto College North

Catherine O’Brien, San Jacinto College District

The Gen Ed task force (for re-defining Gen Ed) noted that the core had not been re-visited in 10 years, so it was time.

[should include in program review process – value of Gen Ed to your program; does it provide value?]

What is Gen Ed?

  • sometimes called “liberal arts”
  • sometimes confused with “core curriculum” – the core is the group of courses that enables students to achieve a general education.

Held focus groups with faculty, students, and community members.

Focus groups resulted in the identification of 9 Gen Ed outcomes.


