Wednesday, August 31, 2011

When the Tide Goes Out



In their recent article, “The Debt Crisis at American Colleges,” published in The Atlantic Online (8/23/11) and drawn from their book, Higher Education?, Andrew Hacker and Claudia Dreifus strip away the camouflage that hides the reasons for the dramatic increases in college tuition and costs over the last 20 years. They reveal an implicit arrangement in which students (and their parents) pay for the research agenda and myriad other hidden costs that have little or nothing to do with the actual teaching and learning they think they are enrolling to get. Furthermore, the authors calculate and add to the equation the massive loan debt that accumulates as a consequence, beggaring future generations and weakening the economy. I addressed this issue in passing in Harnessing America’s Wasted Talent, but Hacker and Dreifus really hit the ball out of the park.
I have heard it said that “it is only when the tide goes out that you see who isn’t wearing a bathing suit.” The current economic situation, with downward pressure on Pell grants, dropping state appropriations, sagging endowments for those lucky enough to have them, and massive unemployment, certainly qualifies as the tide going out. Here is what the “no bathing suit” picture looks like.
Institutions, which for years have raised tuition by roughly the same amount as increases in state and federal student aid programs, are caught short with artificially high costs and equally high tuition relative to their sectors. And learners, whose out-of-pocket costs rose only incrementally for years thanks to those same increases in financial aid and loan availability, are now squarely under the gun to meet rising tuition with higher personal debt. For the institutions, further tuition increases are the only tool left in the traditional toolbox.
As the article closes, Hacker and Dreifus offer several suggestions about “what you should do” for students and parents going forward. Rightly, they point to community college-to-state university articulation paths and to excellent, low-cost campus programs that have escaped wider notice because of our infatuation with what Jane Wellman calls the “medallion” universities and colleges. I would like to offer two additional suggestions that call on institutions to make changes in order to hold up their end of the bargain.
First, take a page out of Michael Crow’s playbook at Arizona State University. He is committed to dramatically increasing the student population without increasing the University’s footprint. How? By institutionalizing blended learning and requiring all undergraduates to take an increasing portion of their program online.
Second, do what Kaplan University has done. Give new students the first five weeks free to see whether they want to, and are able to, do the academic work required. Dismiss those who are failing or who choose to withdraw, pointing them toward other avenues, and bill only those who are engaging productively with the curriculum. This approach, the “Kaplan Commitment,” has cost the university a great deal of money. But it is the right thing to do, and it has led to a significant increase in academic quality.
I have heard others, usually from the public sector, say that they cannot afford to do this. Not accurate. In fact, it would cost them less than it costs Kaplan: whereas our tuition constitutes 100% of our revenue, their tuition is only part of theirs, supplemented by public appropriations. The traditional toolbox, and the assumptions within it, need changing. Thanks to Hacker and Dreifus for getting this argument aired in public.

Monday, August 22, 2011

Using IT to increase educational effectiveness 3: Personalized services



The KNEXT (Knowledge Extension) program is an online teacher-led or self-paced diagnostic program. Using a portfolio development approach with an independent third-party evaluation of credit claimed, KNEXT provides a comprehensive assessment of learning gleaned from prior life experience. Enrolled students are actively encouraged to make use of all prior learning – degrees and certificates earned, past college experience, professional seminars, and applicable work and life experience.
KNEXT is a form of pedagogy as well as a powerful diagnostic for the learner. Participants learn how to actively and consciously reflect on their experience, defining the learning that the experience has generated.
By supporting learners in seeing anew what they know and where and how they learned it, the KNEXT process personalizes degree planning and educational design for each learner, transforming aspiration and motivation as a consequence. As a result, learners who complete the course enjoy three significant educational benefits.
First, they save time and money on their way to earning the degree. Data from the recent CAEL report, “Fueling the Race,” indicate that students who received credit through a prior learning portfolio process persisted to graduation within seven years at more than twice the rate of those who did not. Our data for KNEXT participants indicate a similarly favorable ratio.
Second, they perform significantly better than other students in the courses that follow. As we track student performance inside Kaplan, KNEXT “graduates” consistently perform better in each course they take after completing their portfolios.
And third, they persist to completion at significantly higher rates. In both the CAEL report data and our internal data at KNEXT, students with portfolio assessment persist significantly longer than those without it. Data gathered from over 1,000 students who enrolled in the KNEXT assessment course indicate that more than 80% completed the course successfully, while more than 75% have either graduated from Kaplan or are still enrolled.
Historically, campus-based programs like KNEXT have been hamstrung by the available communication and information processing systems. There has been no way to scale the programs because of the volumes of accumulated information in each portfolio that could only be processed by hand. Put another way, logistics required that such programs be small and of a boutique nature.
Today, thanks to the tools available through technology and social networking, assessment of prior sponsored and non-sponsored learning can be done at scale with rigorous third-party quality assurance. Furthermore, the student, with her portfolio, can then “shop” her transcript at any college electronically, dramatically reducing the time involved in finding out which credits will transfer and which will not at each institution. This technologically enabled approach is a major shift toward the student in pedagogy (active reflection), access (knowing college standards), and responsibility (shopping for the best college fit).
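To make the “shopping” idea concrete, here is a minimal sketch of how an electronically assessed portfolio might be matched against one institution’s published transfer rules. Everything in it (the subjects, levels, and rules) is invented for illustration; it is not how any particular college, or KNEXT itself, actually evaluates transfer credit.

```python
# Illustrative only: matching assessed portfolio credits against one college's
# hypothetical transfer rules. All names, subjects, and rules are invented.

# Credits recommended by the third-party portfolio evaluation.
portfolio = [
    {"subject": "Management", "credits": 4, "level": 200},
    {"subject": "Technical Writing", "credits": 4, "level": 100},
    {"subject": "Project Management", "credits": 4, "level": 300},
]

# A hypothetical institution's rules: subjects it accepts for prior-learning
# credit and the minimum course level it will recognize.
transfer_rules = {
    "Management": {"min_level": 100},
    "Technical Writing": {"min_level": 100},
}

def evaluate_transfer(portfolio, rules):
    """Split portfolio entries into those the institution would and would not accept."""
    accepted, rejected = [], []
    for entry in portfolio:
        rule = rules.get(entry["subject"])
        if rule and entry["level"] >= rule["min_level"]:
            accepted.append(entry)
        else:
            rejected.append(entry)
    return accepted, rejected

accepted, rejected = evaluate_transfer(portfolio, transfer_rules)
print("Transferable credits:", sum(e["credits"] for e in accepted))
print("Needs review or will not transfer:", [e["subject"] for e in rejected])
```

Run against several institutions’ rule sets, a comparison like this could tell a student in minutes what now takes weeks of correspondence with registrars.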

Tuesday, August 16, 2011

Using IT to increase educational effectiveness 2: Curricular Matrixing


Our use of technology at Kaplan gives us the capacity to embed certain learning outcomes across the curriculum in a matrixed approach. So, in a single course experience, we can evaluate the student’s mastery of the substance of the course. We can also evaluate the development of other knowledge and skills, such as writing, teamwork, or critical thinking.
For example, General Education at Kaplan is taught through a core curriculum of six courses with other outcomes distributed throughout the undergraduate curriculum. The overall program goal is for the student to be literate and knowledgeable in the following nine areas.
• Arts and Humanities
• Communication
• Critical Thinking
• Ethics
• Mathematics
• Research and Information
• Science
• Social Science
• Technology

The distribution of outcomes allows us to “double up” the learning in our undergraduate courses, getting added value for the learner and increased effectiveness and efficiency for the institution. In this approach, the vast majority of courses contain a communication course outcome, key to our writing across the curriculum approach. All required courses also contain course outcomes in Critical Thinking, Ethics, or Research and Information, while elective courses contain evenly distributed course outcomes in Arts and Humanities, Mathematics, Science, or Social Science. Technological literacy is reinforced throughout a student’s program.
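As a rough illustration of what this matrixing looks like in practice, the sketch below represents each course as a set of embedded outcomes and checks the distribution rules just described. The course names, outcome assignments, and checking logic are hypothetical, not Kaplan’s actual curriculum management system.

```python
# Hypothetical sketch of a matrixed curriculum: each course carries its own
# discipline content plus embedded general education outcomes.

REQUIRED_SECONDARY = {"Critical Thinking", "Ethics", "Research and Information"}
ELECTIVE_SECONDARY = {"Arts and Humanities", "Mathematics", "Science", "Social Science"}

# (course, is_required) -> embedded general education outcomes; all invented.
curriculum = {
    ("PS101", True):  {"Communication", "Critical Thinking"},
    ("PS210", True):  {"Communication", "Ethics"},
    ("PS305", True):  {"Communication", "Research and Information"},
    ("EL120", False): {"Communication", "Mathematics"},
    ("EL230", False): {"Arts and Humanities"},
}

def check_distribution(curriculum):
    """Flag courses whose embedded outcomes break the distribution rules."""
    problems = []
    for (course, required), outcomes in curriculum.items():
        if required and "Communication" not in outcomes:
            problems.append(f"{course}: required course missing Communication outcome")
        pool = REQUIRED_SECONDARY if required else ELECTIVE_SECONDARY
        if not outcomes & pool:
            problems.append(f"{course}: no distributed outcome from {sorted(pool)}")
    return problems

print(check_distribution(curriculum) or "Distribution rules satisfied")
```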
As mentioned above, this method provides several other advantages including:
• Centrally managed curriculum, ensuring consistent distribution of learning objectives
• Consistent course outcomes across course sections and faculty
• Consistent faculty training on rubric use to ensure inter-rater reliability
• Universal learning objectives with common rubrics to evaluate student learning

To date, we have conducted two studies on this approach.

In a 2009 cohort study, we followed students through three courses taken in sequence, including only those who remained enrolled throughout. We reviewed the percentage of students achieving “practiced” or higher on the communication outcome: demonstrate college-level communication through the composition of original materials in standard American English. The share of students achieving “practiced” or better rose from 76% in the first course to 85% in the third, documenting steady improvement in core academic skills as students progress through their courses.
In the 2010 ethics and communications study, the sample included 2,581 BS in Psychology students taking courses at the 100, 200, and 400 levels. Each was assessed in ethics and communications. In ethics, the average scores on the 0-5 rubric scale were 2.72 (100 level), 3.54 (200 level), and 3.64 (400 level). In communications, the average scores improved from 3.20 (100 level), to 3.49 (200 level), to 3.54 (400 level). Our initial conclusion was that the general education program was producing documented student improvement in the core general education skills of ethics and communications.
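For readers who want to see the mechanics, here is a small sketch, with invented scores, of the kind of summary behind numbers like these: the mean rubric score and the share of students at “practiced” or better, grouped by outcome and course level. It is a simplification, not our actual reporting pipeline.

```python
# Invented rubric records: (outcome, course level, score on the 0-5 scale).
from collections import defaultdict
from statistics import mean

records = [
    ("Ethics", 100, 2.0), ("Ethics", 100, 3.5), ("Ethics", 200, 3.0),
    ("Ethics", 400, 4.0), ("Communications", 100, 3.0), ("Communications", 200, 3.5),
    ("Communications", 400, 4.0), ("Communications", 400, 3.0),
]

PRACTICED = 3  # a rubric score of 3 corresponds to "practiced"

by_group = defaultdict(list)
for outcome, level, score in records:
    by_group[(outcome, level)].append(score)

for (outcome, level), scores in sorted(by_group.items()):
    at_practiced = sum(s >= PRACTICED for s in scores) / len(scores)
    print(f"{outcome}, {level} level: mean {mean(scores):.2f}, "
          f"{at_practiced:.0%} at 'practiced' or better")
```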
As students progress through their programs of study, their progress toward these outcomes is monitored. Course-level assessments (CLAs) provide feedback to students, faculty, and administration about the specific knowledge and skills a student acquires during the course of his or her education. We use this feedback to improve the quality of our courses and to support our faculty in raising the proportion of students who achieve proficiency and mastery.
The ability to use technology to matrix learning outcomes within a single learning experience also has implications for reducing the time and cost to degree without reducing learning. If, for example, the outcomes embedded across the curriculum amounted to the equivalent of 45 quarter credits, we could consider increasing the credit award per course and decreasing the number of courses required for graduation. For a real-life example of this kind of “degree-shortening,” see Southern New Hampshire University’s three-year, 90-semester-credit-hour Business Baccalaureate.
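To illustrate with purely hypothetical numbers: a 180-quarter-credit baccalaureate delivered in 5-credit courses requires 36 courses. If the matrixed outcomes justified awarding 6 credits per course instead, the same 180 credits could be earned in 30 courses, roughly a sixth fewer, with no assessed learning removed from the program. These figures are illustrative only, not Kaplan’s actual credit structure.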
My thanks to colleagues Kara Van Dam, Jason Levin, and John Eads, who conducted this research.

Next Week’s blog: Using IT to increase educational effectiveness 3: Personalized services

Tuesday, August 9, 2011


Using IT to increase educational effectiveness 1: Course level assessments
Recently, I completed an article for the Sloan-C Journal of Asynchronous Learning Networks entitled “From Scarcity to Abundance: IT’s Role in Achieving Quality-Assured Mass Higher Ed.” This series of posts draws on that work.

In an increasingly mobile society, with burgeoning opportunities to learn throughout life, the accuracy, clarity, and transparency of learning assessment at the course and program level is essential. Academic outcomes that are consistently and rigorously assessed by the institution and validated by third parties will be seen to have integrity in the 21st century, earning the respect of learners and employers alike.
Course level assessment (CLA) is one of our answers to this emerging need at Kaplan. CLA measures student mastery of stated course-level learning outcomes in an objective way, and the results inform our continuous academic improvement process. It is criterion-referenced, not norm-referenced.
The scores obtained measure the student’s current mastery of the skills and knowledge described by the outcomes. CLAs support program-level outcomes while providing the framework for assessing specific learning objectives and activities within a course. Outcomes also share the following characteristics:
• Each describes one primary area of knowledge or skill.
• Each reflects specific behaviors underlying the knowledge or skills for which students should be able to demonstrate mastery by the end of the course.
• Each is written in a style that reflects the appropriate level of complexity of the underlying cognitive tasks required for given levels of mastery.
The learning outcomes are supported by rubrics at the course level. For each course, faculty members assess student success in all of the course’s outcomes using standardized rubrics. Rubrics are developed for each outcome based on specific criteria, identifying student progress towards mastery. Scores on outcomes are then analyzed to determine if students are gaining the desired mastery. We evaluate on a 0-5 scale with 0 = no progress, 3 = “practiced”, and 5 = “mastery”. The objective is that each learner will reach mastery of discipline course outcomes by the end of the course and mastery of the general education outcomes by the time the degree is completed.  
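A minimal sketch of what that looks like for a single learner in a single course follows; the outcome statements and scores are invented, and the status labels simply mirror the 0-5 scale described above.

```python
# Hypothetical rubric results for one student in one course, on the 0-5 scale
# where 0 = no progress, 3 = "practiced", and 5 = "mastery".

PRACTICED, MASTERY = 3, 5

student_scores = {
    "Apply ethical reasoning to a workplace scenario": 4,
    "Compose original material in standard American English": 3,
    "Use credible sources to support a conclusion": 5,
}

for outcome, score in student_scores.items():
    if score >= MASTERY:
        status = "mastery"
    elif score >= PRACTICED:
        status = "practiced"
    else:
        status = "still developing"
    print(f"{outcome}: {score}/5 ({status})")

# Has this learner reached mastery of every course outcome?
print("All outcomes at mastery:", all(s >= MASTERY for s in student_scores.values()))
```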
And, as students progress through their programs of study, their progress toward achieving these outcomes is measured. An initial analysis conducted in 2009 and 2010 showed that the share of students evaluated as “practiced” or better improved steadily, climbing from 77% to 87%.
Tracking student learning outcomes at the course level allows us to gauge both the effectiveness and the career relevance of our instruction and our curriculum – and to engage in a continuous improvement process. And the rubric structure allows us to look at the “profile of learning” within a section or across all sections of a particular course to identify anomalies and success rates as well as levels of learning.
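One hypothetical way to build that “profile of learning” is to compare each section’s mean rubric score on an outcome against the course-wide mean and flag sections that deviate beyond a chosen threshold; the data and threshold below are invented for illustration.

```python
# Invented rubric scores (0-5) for one outcome, grouped by course section.
from statistics import mean

section_scores = {
    "Section A": [3, 4, 3, 5, 4],
    "Section B": [2, 2, 3, 1, 2],
    "Section C": [4, 3, 4, 4, 3],
}

THRESHOLD = 0.75  # flag sections whose mean is this far from the course mean

course_mean = mean(s for scores in section_scores.values() for s in scores)
for section, scores in section_scores.items():
    deviation = mean(scores) - course_mean
    flag = "  <-- review" if abs(deviation) > THRESHOLD else ""
    print(f"{section}: mean {mean(scores):.2f} vs course mean {course_mean:.2f}{flag}")
```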
In conclusion, technology is the key differentiator in being able to gather and analyze this kind of data.
• First, an IT-based, consistent approach to course and outcome design creates an environment where comparability is possible.
• Second, technology allows us to scale this research to all of our learners, ultimately collecting information on hundreds of thousands of student-courses per year.
• Third, technology allows us to collect consistent information across every section, something that would be impossible in a traditional institution’s classrooms.
• And fourth, we have clear control over the means and structure of learning assessment, leading to a high degree of consistency in that process as well.
My thanks to my colleagues Jason Levin, Kara Van Dam, and John Eads, who contributed to this work.
Next week: Using IT to increase educational effectiveness 2: Curricular Matrixing