Carteret Community College Title III Grant

October 23, 2010

IUPUI Assessment Institute – 2010

Filed under: Conferences, Notes from the Project Director - Don Staub — Donald Staub @ 10:56 pm

On October 24th – 26th, I’ll be attending (and presenting at) the 2010 IUPUI Assessment Institute – “…the nation’s oldest and largest event focused exclusively on Outcomes Assessment in Higher Education.”  I’ll be reporting here on my interactions with the institute – my comments, questions, and observations.  I hope you’ll follow along.

The first thing I’ll be reporting on will be the pre-conference workshop:  Implementing Student Electronic Portfolios for Assessment

Followed by notes and commentary from these breakout sessions:

  • Panel Discussion (Linda Suskie – moderator): “Rethinking Assessment’s Purposes”
  • Breakout: Non-Cognitive Abilities and the First-Year Student: The Role of Assessment and Intervention (Gore)
  • Breakout: Online & Hybrid Courses: From the Students’ Perspective (Cook & Boldt)
  • Breakout: Assessing Online Courses and Online Evaluation Systems (Fotia & McGovern)
  • Breakout: Three Promising Alternatives for Authentic Assessment of SLOs (Banta et al.)
  • Breakout: Assessing General Education: Mission Possible (Rose Carr & Stoudermire)

* * *

Implementing Student Electronic Portfolios for Assessment (a Google Site accompanies this workshop)

This workshop moved back and forth between the theoretical (why use e-portfolios, or EPs?) and the practical (how to implement them).

From the Why EPs discussion came three reasons for EPs: student/faculty collection, selection, and reflection on work completed over time.  They also framed the uses of EPs around three values: 1) teaching & learning, 2) self-representation and identity development, and 3) accountability & assessment.  One of the most important outcomes of EPs is that they not only provide a “rich, digital resume” but also help students see the larger design of the curriculum, rather than a random sequence of courses.

Why Use EPs (for students, faculty, and the institution):

For Students – “The focus [of EPs] is also on what students can do with their knowledge and skills and not simply on whether knowledge has been acquired,” (Huba, M. E. & Freed, J. E. (2000). Learner-centered assessment on college campuses: Shifting the focus from teaching to learning. Boston, MA: Allyn & Bacon.)

  • Track growth/development
  • Integrate and apply learning – “portfolios are fundamentally integrative, being composed of heterogeneous artifacts” brought together through reflective thinking.  The curriculum is fragmented (whether apparently or really so), and thus can seem chaotic to students; EPs provide a medium for integrated learning, giving students a sense of the bigger picture.
  • Develop capacities for reflection & metacognition – reflection can have powerful outcomes if students practice it regularly, and can deepen and extend learning by articulating connections between artifacts.  Purposes and forms of reflection vary by context and discipline – it could be, for example, the description of a problem-solving process.  Reflection can support student success.
  • Increase engagement – Higher levels of self-reported engagement when students work on EPs consistently.
  • Leadership development (student life)
  • In Capstones – to give students a sense of what they’ve accomplished…students are often surprised, in looking back later, at their development over time.
  • Employment seeking – one participant in the room, from Loyola, noted that in a survey of potential employers (conducted by Loyola), 70% indicated that they would like to see the portfolio before (not after) an interview. AAC&U conducted a similar survey.  “But skeptics will say that those are national surveys.  What about our local businesses?”
  • Association for Authentic, Experiential, and Evidence-Based Learning
  • Sources to consider:  Blogfolios (Penn State) faculty & students;  Google Site; Wikispaces

For Faculty

  • Enable assessment of a broader set of abilities and skills
  • Provide richer, more contextualized information to guide program/curriculum development and improvement
  • Blogfolios (Penn State) faculty

For Programs & Institutions

  • Support career and academic advising
  • Enable authentic and psychometrically rigorous assessment for accreditation…
  • Competencies for assessment & accreditation
  • Check out the IUPUI “online institutional portfolio”
  • Developmental portfolio for review/accreditation (change Program Review process to this?)

What to consider in implementing EPs:


  • Buy (proprietary; e.g. TaskStream)
  • Borrow
  • Build your own – open-source options include Open Source Portfolio (OSP), Mahara, and Elgg
    • Allows you more flexibility in design
    • Open-source software was good for individual students, but not as easy for program-level outcomes
  • PebblePad – targeted at helping students develop intellectually/personally; e.g. a tab for students to develop an action plan
  • Adapt: generic web-authoring tools or Web 2.0

Making Reflection Meaningful

  • Teach students how to reflect
    • Examples
    • Multiple drafts
    • Peer review
    • Rubrics
    • (e.g. students are asked to consider Gen Ed outcomes, and how they relate to their career, their learning experiences, and their citizenship/service)
    • Need to scaffold reflection; need good prompts


“The method of the evaluation is entirely up to the department.”  You can do them one at a time, or random-sampling. “You (as department/program/institution) need to decide what it is that you want to do with the portfolios.  There is no right or wrong.”

A couple of books that were referred to us…

There are also rubrics…

Important Caveat about e-portfolios and assessment:

“It’s authentic assessment and it takes time…we use EPs for a lot of things, but saving time is not one of them.”

* * *

Non-Cognitive Abilities and the First-Year Student: The Role of Assessment and Intervention

Paul Gore, University of Utah

Abstract: Considerable effort has been focused on identifying factors that contribute to learning, academic success, and persistence for college students. Research has clearly established the fact that academic outcomes are a function of both cognitive (academic) and noncognitive (attitudinal/motivational) abilities. This presentation will provide an overview of recent advances in the measurement of non-cognitive abilities and the use of data from these measures to develop and implement more efficient and effective student support services.

Paul started the session by asking participants to “build a successful student” (i.e., what are the qualities that make a successful student?).  What the (unscientific) list revealed was that educators perceive these qualities as tending more toward non-cognitive abilities (humor, confidence, time management…) rather than cognitive skills (math, science, etc.).  And thus he laid out his argument for the benefit of strengthening non-cognitive skills.  He added that non-cognitive variables (NCVs) are “additive” predictors, accounting for an additional 8-15% of variance in student success beyond the 15-20% that GPA alone can predict; the higher end of that range is found among first-generation students.  NCVs are potent predictors of school-based outcomes.  He also stressed that many non-cognitive variables are malleable and can be supported, developed, or remediated.  Furthermore, he pointed out that NCVs are strong predictors of first-to-second-year retention and the best predictor of second-semester GPA (better than ACT scores or first-semester GPA).  Therefore, identifying NCV strengths and weaknesses in students and aligning them with services that exist on campus can result in higher retention and success rates.
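The “additive predictor” idea can be sketched numerically: fit a regression with GPA alone, then with GPA plus an NCV, and compare the variance explained (R²).  The following is an illustrative simulation with made-up synthetic data and coefficients – not the presenter’s instrument or results – showing only the mechanics of the comparison.

```python
# Illustrative sketch: how a non-cognitive variable (NCV) can "additively"
# explain variance in a success measure beyond GPA alone, measured as the
# increase in R^2 between two ordinary-least-squares models.
import numpy as np

rng = np.random.default_rng(0)
n = 500

gpa = rng.normal(3.0, 0.5, n)   # cognitive predictor
ncv = rng.normal(0.0, 1.0, n)   # non-cognitive predictor (e.g., engagement)
# Synthetic "success" outcome influenced by both, plus noise
success = 0.45 * gpa + 0.35 * ncv + rng.normal(0.0, 0.6, n)

def r_squared(X, y):
    """R^2 of an OLS fit with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_gpa = r_squared(gpa.reshape(-1, 1), success)            # GPA-only model
r2_both = r_squared(np.column_stack([gpa, ncv]), success)  # GPA + NCV model

print(f"R^2 (GPA only):  {r2_gpa:.2f}")
print(f"R^2 (GPA + NCV): {r2_both:.2f}")
print(f"Additional variance explained by NCV: {r2_both - r2_gpa:.2%}")
```

The gap between the two R² values is the “additional variance” an NCV contributes on top of GPA; in real data the presenter reported that gap at roughly 8-15%.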

He used this background to introduce an assessment instrument [the Student Strengths Inventory – SSI] that helps predict student success potential by measuring student self-perceptions in 6 areas:

  • Academic engagement
  • Academic self-efficacy
  • Campus engagement
  • Social comfort
  • Educational commitment
  • Resiliency

Don’s Commentary: There was not a lot of discussion on results.  They were more or less implied.  However, it is something that a number of TRiO programs have used, and we, too, are exploring its use – right now, it’s only in TRiO that we’re looking at this, but we may expand it to the whole college.

* * *

Online & Hybrid Courses: From the Students’ Perspective

Greg Cook and Steve Boldt, University of Wisconsin-Whitewater

Abstract: Our student survey provides interesting insights about elements of online instruction that students do and don’t like on our campus. In this session, participants can compare their campus experiences with ours and discuss ways to improve engagement, learning, and survey and assessment techniques for online students.

Steve and Greg began their presentation with a discussion of the recently published (2009) US Dept of Education report: Evaluation of Evidence-Based Practices in Online Learning: A meta-analysis and review of online learning studies.  With the backdrop that one conclusion of the study is that students prefer hybrid/blended learning opportunities, they described survey research they had conducted on their own campus, exploring student perceptions of online (i.e. online & hybrid) courses vs. traditional/seated courses.  Their primary finding was the relative lack of participation in the survey itself (~16% of those asked to participate actually responded).  This led to a broader discussion (and group work) around the issue of increasing student participation.  Here is the list of possible solutions generated by the group:

  • Moodle – an anonymous link inside each course takes students out to SurveyMonkey; grades are withheld until the survey is completed (or released early once all students have completed it)
  • Require completion of the survey three-quarters of the way through the course
  • A drawing for an iPod touch
  • Build it into the last section/unit of the course
  • Peer pressure (if we get an 80% response rate, students get X extra points; if we get a 90% response rate, students get more points)
  • A follow-up from the Dean or Assistant Dean a month later, stating that we are responding to the results
  • An Incomplete grade until the survey is completed
  • A fee that is credited back once the survey is completed

Relevant side-discussion: In those courses/programs where there are personal relationships, there are increased return rates.  In the subsequent session (Assessing Online Courses and Online Evaluation Systems; see below), the presenters, Fotia & McGovern, shared a report by Anderson, Brown, & Spaeth (2006) concluding that poor response rates with online forms “mask a more pressing problem.  A diminished response rate to course evaluations has less to do with the method of distribution than it does with faculty and student engagement, particularly since engagement reflects the efficacy of evaluation.”

* * *

Assessing Online Courses and Online Evaluation Systems: Using Student Evaluations to Compare Online to Traditional Courses and Comparing Online and Paper Student Evaluations

Dennis Fotia and Heather McGovern

Abstract: Facilitators informed local debate, comparing online to other courses using student evaluations. Facilitators also compared responses between student evaluations completed online and on paper, as these processes involve multiple differences. Participants will examine student evaluations as assessment instruments that can influence processes and teaching at the institutional level.

One of the key points made by Dennis and Heather is that “good assessment guides professional development and instructional design.”  Therefore, a critical piece of any assessment program is the reliable and valid collection of student evaluations.  The issue is that it is particularly challenging to garner representative response rates from online students (see Cook & Boldt, above). To illustrate this point, they cite Anderson, Brown, & Spaeth (2006), who posit that concerns about response rates with online forms “mask a more pressing problem.  A diminished response rate to course evaluations has less to do with the method of distribution than it does with faculty and student engagement, particularly since engagement reflects the efficacy of evaluation.”

Fotia & McGovern, in their own research, raise the question of whether utilizing online evaluations for both online and face-to-face courses will increase response rates. Two years of collected data have led to the preliminary conclusion that “…we might see a locally strong response rate to online evaluations in face-to-face courses.  Therefore, we don’t need to hesitate to start moving towards online forms for that reason.”

* * *


1 Comment »

  1. […] can follow my commentary on the sections I am attending by clicking here (it will send you to the Carteret CC Title III blog). This entry was posted in […]

    Pingback by IUPUI Assessment Institute – 2010 | North Carolina Community College Learning Outcomes Assessment Group — October 23, 2010 @ 11:04 pm | Reply
