Alternate Ledes for CUNY Study on Raising Graduation Rates

Sun, 2015-03-01 14:23

By Phil Hill

Last week MDRC released a study of the City University of New York’s (CUNY) Accelerated Study in Associate Programs (ASAP), describing the results in nearly breathless terms.

Title page

  • ASAP was well implemented. The program provided students with a wide array of services over a three-year period, and effectively communicated requirements and other messages.
  • ASAP substantially improved students’ academic outcomes over three years, almost doubling graduation rates. ASAP increased enrollment in college and had especially large effects during the winter and summer intersessions. On average, program group students earned 48 credits in three years, 9 credits more than did control group students. By the end of the study period, 40 percent of the program group had received a degree, compared with 22 percent of the control group. At that point, 25 percent of the program group was enrolled in a four-year school, compared with 17 percent of the control group.
  • At the three-year point, the cost per degree was lower in ASAP than in the control condition. Because the program generated so many more graduates than the usual college services, the cost per degree was lower despite the substantial investment required to operate the program.

Accordingly, the media followed suit with breathless coverage[1]. Consider this from Inside Higher Ed’s article titled “Living Up to the Hype”:

Now that firm results are in, across several different institutions, CUNY is confident it has cracked the formula for getting students to the finish line.

“It doesn’t matter that you have a particularly talented director or a president who pays attention. The model works,” said John Mogulescu, the senior university dean for academic affairs and the dean of the CUNY School of Professional Studies. “For us it’s a breakthrough program.”

MDRC and CUNY also claim that “cracking the code” means that other schools can benefit, as described earlier in the article:

“We’re hoping to extend that work with CUNY to other colleges around the country,” said Michael J. Weiss, a senior associate with MDRC who coauthored the study.

Unfortunately . . .

If you read the report itself, the data doesn’t back up the bold claims in the executive summary and in the media. A more accurate summary might be:

For the declining number of young, living-with-parents community college students planning to attend full-time, CUNY has explored how to increase student success while avoiding any changes in the classroom. The study found that a package of interventions requiring full-time enrollment, increasing per-student expenditures by 63%, and providing aggressive advising as well as priority access to courses can increase enrollment by 22%, inclusive of term-to-term retention. At the 3-year mark these combined changes translate into an 82% increase in graduation rates, but it is unknown if any changes to the interventions would affect the results, and it is unknown what results would occur at the 4-year mark. Furthermore, it is unclear whether this program can scale due to priority course access and effects on the growing non-traditional student population. If a state sets performance-funding based on 3-year graduation rates and nothing else, this program could even reduce costs.

Luckily, the report is very well documented, so nothing is hidden. What are the problems that would lead to this alternate description?

  • This study covers only one segment of the population: first-time, low-income students who are willing to go full-time and who have one or two developmental course requirements (not zero, not three or more). That targets less than one-fourth of the CUNY 2-year student population, a group in which 73% live at home with parents and 77% are younger than 22. For the rest, including the growing working-adult population:

(p. 92): It is unclear, however, what the effects might be with a different target group, such as low-income parents. It is also unclear what outcomes an ASAP-type program that did not require full-time enrollment would yield.

  • The study required full-time enrollment (12 credits attempted per term) and only evaluated 3-year graduation rates, which almost explains the results by itself. Do the math (24 credits per year over 3 years, minus the 3 – 6 credits of developmental courses that don’t count for degree credit) and you see that going “full-time” and earning roughly 66 credits is likely the only way to graduate with a 60-credit associate’s degree in 3 years. As the report itself states:

(p. 85): It is likely that ASAP’s full-time enrollment requirement, coupled with multiple supports to facilitate that enrollment, were central to the program’s success.

  • The study created a special class of students with priority enrollment. One of the biggest challenges at public colleges is for students even to get access to the courses they need. The ASAP students were given priority enrollment, as the report itself states:

(p. 34): In addition, students were able to register for classes early in every semester they participated in the program. This feature allowed ASAP students to create convenient schedules and have a better chance of enrolling in all the classes they need. Early registration may be especially beneficial for students who need to enroll in classes that are often oversubscribed, such as popular general education requirements or developmental courses, and for students in their final semesters as they complete the last courses they need to graduate.

  • The study made no attempt to understand the many variables at play. There was a plethora of interventions – full-time enrollment requirement, priority enrollment, special seminars, reduced load on advisers, etc. Yet we have no idea which components led to which effects. From the report:

(p. 85): What drove the large effects found in the study and which of ASAP’s components were most important in improving students’ academic outcomes? MDRC’s evaluation was not designed to definitively answer that question. Ultimately, each component in ASAP had the potential to affect students’ experiences in college, and MDRC’s evaluation estimates the effect of ASAP’s full package of services on students’ academic outcomes.

  • The study made no changes at all to actual teaching and learning practices. It almost seems this was the point – to find out how we can change everything except teaching and learning to get students to enroll full-time. From the report:

(p. 34): ASAP did not make changes to pedagogy, curricula, or anything else that happened inside of the classroom.

What Do We Have Left?

In the end this was a study on pulling out all of the non-teaching stops to see if we can get students to enroll full-time. Target only students willing to go full-time, then constantly advise them to enroll full-time and stick with it, and remove as many financial barriers (funding the gap between cost and financial aid, free textbooks, gas cards, etc.) as is feasible. With all of this effort, the real result of the study is that they increased the number of credits attempted and credits earned by 22%.

We already know that full-time enrollment is the biggest variable for graduation rates in community colleges, especially if measured over 4 years or less. Look at the recent National Student Clearinghouse report at a national level (tables 11-13):

  • Community college 4-year completion rate for exclusively part-time students: 2.32%
  • Community college 4-year completion rate for mixed enrollment students (some terms FT, some PT): 14.25%
  • Community college 4-year completion rate for exclusively full-time students: 27.55%

And that data is for 4 years – 3 years would have been more dramatic simply due to the fact that it’s almost impossible to get 60 credits if you don’t take at least 12 credits per term over 3 years.
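
To make that arithmetic explicit, here is a quick worked version using the post’s own figures (the 9-credit part-time pace is just an illustrative assumption):

\[
\underbrace{12 \times 2 \times 3}_{\text{full-time credits attempted}} = 72, \qquad 72 - (3\ \text{to}\ 6)\ \text{developmental} \approx 66\ \text{to}\ 69 \ \geq\ 60\ \text{required}
\]
\[
\text{part-time at 9 credits/term:} \qquad 9 \times 2 \times 3 = 54 < 60
\]

In other words, a 3-year graduation window essentially bakes the full-time requirement into the outcome measure.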

What About Cost Analysis?

The study showed that CUNY spent approximately 63% more per student for the program compared to the control group. The bigger claim, however, is that cost per graduate is actually lower (163% of the cost with 182% of the graduates). But what about the students who don’t graduate or transfer? What about the students who graduate in 4 years instead of 3? Colleges spend money on all their students, and most community college students (60%) can only go part-time and will never be able to graduate in 3 years.
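
For reference, the arithmetic behind the lower cost-per-degree claim is simple. If C and G are the control group’s total cost and number of graduates, then:

\[
\frac{\text{cost per degree (ASAP)}}{\text{cost per degree (control)}} = \frac{1.63\,C / (1.82\,G)}{C / G} = \frac{1.63}{1.82} \approx 0.90
\]

That is roughly a 10% lower cost per 3-year graduate – which is exactly why the choice of a 3-year window, and how you treat the students who don’t graduate within it, matters so much.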

Even if you factor in performance-based funding, using a 3-year graduation basis is misleading. No state is considering funding only for 3-year successful graduation. If that were so, I have a much easier solution – refuse to admit any students seeking less than 12 credits per term. That will produce dramatic cost savings and dramatic increases in graduation rates . . . as long as you’re willing to completely ignore the traditional community college mission that includes:

serv[ing] all segments of society through an open-access admissions policy that offers equal and fair treatment to all students

Can It Scale?

Despite the claims that “the model works” and that CUNY has cracked the formula, does the report actually support this claim? Specifically, can this program scale?

First of all, the report only makes its claims for a small percentage of students that are predominantly young and live at home with their parents – we don’t know if it applies beyond the target group as the report itself calls out.

But even within this target group, I think there are big problems with scaling. One is the priority enrollment in all courses, including oversubscribed courses and those offered at convenient times. The control group was at a disadvantage, as were all non-target students (including the growing population of working adults and students going back to school). This priority enrollment approach is based on scarcity, and the very nature of scaling the program will reduce the benefits of the intervention.

I have Premier Silver status at United Airlines thanks to a few international trips. If this status gave me realistic priority access to first-class upgrades, then I would be more likely to fly United on a routine basis. As it is, however, I often show up at the gate and find myself #30 or worse in line for first-class upgrades when the cabin only has 5-10 first-class seats available. The priority status has lost most of its benefits as United has scaled such that more than a quarter of all passengers on many routes also have priority status.

CUNY plans to scale from 456 students in the ASAP study all the way up to 13,000 students in the next two years. Assuming even distribution over two years, this changes the group size from 1% of the entering freshman population to 19%. Won’t that make a dramatic difference in how easy it will be for ASAP students to get into the classes and convenient class times they seek? And doesn’t this program conflict with the goals of offering “equal and fair treatment to all students”?

Alternate Ledes for Media Coverage of Study

I realize my description above is too lengthy for media ledes, so here are some others that might be useful:

  • CUNY and MDRC prove that enrollment correlates with graduation time.
  • Requiring full-time enrollment and giving special access to courses leads to more full-time enrollment.
  • What would it cost to double an artificial metric without asking faculty to change any classroom activities? 63% more per student.

Don’t Get Me Wrong

I’m all for spending money and trying new approaches to help students succeed, including raising graduation rates. I’m also for increasing the focus on out-of-classroom support services to help students. I’m also glad that CUNY is investing in a program to benefit its own students.

However, the executive summary of this report and the resultant media coverage are misleading. We have not cracked the formula, CUNY is not ready to scale this program or export it to other colleges, and taking the executive summary claims at face value is risky at best. The community would be better served if CUNY:

  • Made some effort to separate the variables and their effects on enrollment and graduation rates;
  • Extended the study to also look at more realistic 4-year graduation rates in addition to 3-year rates;
  • Included an analysis of diminishing benefits from priority course access; and
  • Performed a cost analysis based on the actual or planned funding models for community colleges.
  1. And this article comes from a reporter for whom I have tremendous respect.

Unsubscribe

Sat, 2015-02-28 16:00

By Michael Feldstein

A little while back, e-Literate suddenly got hit by a spammer who was registering for email subscriptions to the site at a rate of dozens of new email addresses every hour. After trying a number of less extreme measures, I ended up removing the subscription widget from the site. Unfortunately, as a few of you have since pointed out to me, by removing the option to subscribe by email, I also inadvertently removed the option to unsubscribe. Once I realized there was a problem (and cleared some time to figure out what to do about it), I investigated a number of other email subscription plugins, hoping that I could find one that is more secure. After some significant research, I came to the conclusion that there is no alternative solution that I can trust more than the one we already have.

The good news is that I discovered the plugin we have been using has an option to disable the subscribe feature while leaving on the unsubscribe feature. I have done so. You can now find the unsubscribe capability back near the top of the right-hand sidebar. Please go ahead and unsubscribe yourself if that’s what you’re looking to do. If any of you need help unsubscribing, please don’t hesitate to reach out to me.

Sorry for the trouble. On a related note, I hope to reactivate the email subscription feature for new subscribers once I can find the right combination of spam plugins to block the spam registrations without getting in the way of actual humans trying to use the site.

Greg Mankiw Thinks Greg Mankiw’s Textbook Is Fairly Priced

Fri, 2015-02-27 16:37

By Michael Feldstein

This is kind of hilarious.

Greg Mankiw has written a blog post expressing his perplexity[1] with The New York Times’ position that textbooks are overpriced:

To me, this reaction seems strange. After all, the Times is a for-profit company in the business of providing information. If it really thought that some type of information (that is, textbooks) was vastly overpriced, wouldn’t the Times view this as a great business opportunity? Instead of merely editorializing, why not enter the market and offer a better product at a lower price? The Times knows how to hire writers, editors, printers, etc. There are no barriers to entry in the textbook market, and the Times starts with a pretty good brand name.

My guess is that the Times business managers would not view starting a new textbook publisher as an exceptionally profitable business opportunity, which if true only goes to undermine the premise of its editorial writers.

It’s worth noting that Mankiw received a $1.4 million advance for his economics textbook from his original publisher Harcourt Southwestern, which was later acquired by the company now known as Cengage Learning. That was in 1997. The book is now in its seventh edition, and Mankiw has five different versions of it published by Cengage (not counting the five versions of the previous edition, which is still on the market). That said, he is probably right that NYT would not view the textbook industry as a profitable business opportunity. But think about that. A newspaper finds the textbook industry unattractive economically. The textbook industry is imploding. Mankiw’s publisher just emerged from bankruptcy, and textbook sales are down and still dropping across the board.

One reason that textbook prices have not been responsive to market forces is that most faculty do not have strong incentives to search for less expensive textbooks and, to the contrary, have high switching costs. They have to both find an alternative that fits their curriculum and teaching approach—a non-trivial investment in itself—and then rejigger their course design to fit with the new book.

A second part of the problem is that the publishers really can’t afford to lower textbook prices at this point without speeding up their slow-motion train crash, because their unit sales keep dropping as students find more creative ways to avoid buying the book. Their way of dealing with falling sales is to raise the price on each book that they sell. It’s a vicious cycle—one that could potentially be broken by the market forces that Mankiw seems so sure are providing fair pricing, if only the people making the adoption decisions had motivations that were aligned with the people making the purchasing decisions. The high cost of switching for faculty, coupled with their relative personal immunity to price increases, translates into a barrier to entry for potential competitors looking to underbid the established players.

Which brings me to the third reason. There are plenty of faculty who would like to believe that they could make money writing a textbook someday and that doing so would generate enough income to make a difference in their lives. Not all, probably not even most, but enough to matter. As long as faculty can potentially get compensated for sales, there will be motivation for them to see high textbook prices that they don’t have to pay themselves as “fair” or, at least, tolerable. It’s a conflict of interest. And Greg Mankiw, as a guy who’s made the big score, has the biggest conflict of interest of all and the least motivation of anyone to admit that textbook prices are out of hand, and that the textbook “market” he wants to believe in probably doesn’t even properly qualify as a market, never mind an efficient one.

  1. Hat tip to Stephen Downes for the link.

Editorial Policy: Notes on recent reviews of CBE learning platforms

Fri, 2015-02-27 12:30

By Phil Hill

Oh let the sun beat down upon my face, stars to fill my dream
I am a traveler of both time and space, to be where I have been
To sit with elders of the gentle race, this world has seldom seen
They talk of days for which they sit and wait and all will be revealed

- R Plant, Kashmir

Over the past half year or so I’ve provided more in-depth product reviews of several learning platforms than is typical – Helix, FlatWorld, LoudCloud, Bridge. Given that e-Literate is not a review site and we don’t tend to analyze technology for technology’s sake, it’s worth asking why the change. There has been a lot of worthwhile discussion on several blogs recently about whether the LMS is obsolete or critical to the future of higher ed, and this discussion even raised the subject of how we got to the current situation in the first place.

An interesting development I’ve observed is that the learning environment of the future might already be emerging on its own, but not necessarily coming from the institution-wide LMS market. Canvas, for all its market-changing power, is almost a half decade old. The area of competency-based education (CBE), with its hundreds of pilot programs, appears to be generating a new generation of learning platforms that are designed around the learner (rather than the course) and around learning (or at least the proxy of competency frameworks). It seems useful to get a more direct look at these platforms to understand the future of the market and to understand that the next generation environment is not necessarily a concept yet to be designed.

At the same time, CBE is a very important development in higher ed, yet there are plenty of signs of people assuming that CBE means students working in isolation to learn regurgitated facts assessed by multiple-choice questions. Yes, that does happen in some cases and is a risk for the field, but CBE is far richer. Criticize CBE if you will, but do so based on what’s actually happening[1].

Both Michael and I have observed and even participated in efforts that seek to explore CBE and the learning environment of the future.

Perhaps because I’m prone to visual communication, the best way for me to work out my own thoughts on these subjects, as well as to share them more broadly through e-Literate, has been to do more in-depth product reviews with screenshots.

Bridge, from Instructure, is a different case. I frequently get into discussions about how Instructure might evolve as a company, especially given their potential IPO. The public markets will demand continued growth, so what will this change in terms of their support of Canvas as a higher education LMS? Will they get into adjacent markets? With the latest news of the company raising $40 million in what is likely the last pre-IPO VC funding round, as well as their introduction of Bridge to move into the corporate learning space, we now have a pretty solid basis for answering these questions. The keys are understanding that Bridge is a separate product and seeing how the company approaches its design while leaving Canvas unchanged.

With this in mind, it’s worth noting some editorial policy stuff at e-Literate:

  • We do not endorse products; in fact, we generally focus on the academic or administrative need first as well as how a product is selected and implemented.
  • We do not take solicitations to review products, even if a vendor’s competitors have been reviewed. The reviews mentioned above were more about understanding market changes and understanding CBE as a concept than about the products per se.
  • We might accept a vendor’s offer of a demo at our own discretion, either online or at a conference, but even then we do not promise to cover it in a blog post.

OK, the lead-in quote is a stretch, but it does tie in to one of the best videos I have seen in a while.

(Video embedded in the original post.)

  1. And you would do well to read Michael’s excellent post on CBE meant for faculty trying to understand the subject.

LoudCloud Systems and FASTRAK: A non walled-garden approach to CBE

Thu, 2015-02-26 13:44

By Phil Hill

As competency-based education (CBE) becomes more and more important to US higher education, it is worth exploring the learning platforms in use. While there are cases of institutions using their traditional LMS to support a CBE program, a new market is developing around learning platforms designed specifically for self-paced, fully online, competency-framework-based approaches.

Recently I saw a demo of the new CBE platform from LoudCloud Systems, a company whose traditional LMS I covered a few years ago. The company is somewhat confusing to me – I had expected a far larger market impact from them, based on their product design, than what has happened in reality. LoudCloud has recently entered the CBE market, not by adding features to their core LMS but by creating a new product called FASTRAK. Like Instructure with their creation of a new LMS for a different market (corporate learning), LoudCloud determined that CBE called for a new design and that the company could handle two platforms for two mostly distinct markets. In the case of both Bridge and FASTRAK, I believe the creation of a new learning platform took approximately one year (thanks a lot, Amazon). LoudCloud did leverage several of the traditional LMS tools such as rubrics, discussion forums, and their LoudBook interactive eReader.

As was the case for the description of the Helix CBE-based learning platform and the description of FlatWorld’s learning platform, my interest here is not merely to review one company’s products, but rather to illustrate aspects of the growing CBE movement using the demo.

LoudCloud’s premier CBE partner is the University of Florida’s Lastinger Center, a part of the College of Education that provides professional development for Florida’s 55,000 early learning teachers. They have or expect to have more than a dozen pilot programs for CBE in place during the first half of 2015.

Competency Framework

Part of the reason for developing a new platform is that FASTRAK appears to be designed around a fairly comprehensive competency framework embodied in LoudTrack – an authoring tool and competency repository. This framework allows a school to combine its own set of competencies with externally defined, job-based competencies such as those from O*NET Online.

Competency Structure

The idea is to (roughly in order):

  • Develop competencies;
  • Align to occupational competencies;
  • Define learning objectives;
  • Develop assessments; and
  • Then design academic programs.

LoudTrack Editing

One question within CBE design is what counts as mastery of a specific competency – passing some, most, or all of the sub-competencies? FASTRAK allows this decision to be set by program configuration.

Master Scale
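
As a rough illustration only – not FASTRAK’s actual data model or API, which the demo did not expose – a program-level mastery rule along these lines might look like the following sketch in Python:

from dataclasses import dataclass

# Hypothetical names for illustration; the real configuration options are not public.
@dataclass
class MasteryRule:
    threshold: float  # fraction of sub-competencies that must be passed
                      # (e.g., 0.5 for "some", 0.8 for "most", 1.0 for "all")

@dataclass
class Competency:
    name: str
    sub_competencies: list

def is_mastered(competency, passed, rule):
    """Return True if enough of the competency's sub-competencies were passed."""
    passed_count = sum(1 for sc in competency.sub_competencies if sc in passed)
    return passed_count >= rule.threshold * len(competency.sub_competencies)

# Example: a program configured to require "most" (80%) of sub-competencies.
rule = MasteryRule(threshold=0.8)
comp = Competency("Early literacy instruction",
                  ["phonemic awareness", "fluency", "vocabulary",
                   "comprehension", "assessment"])
print(is_mastered(comp, {"phonemic awareness", "fluency",
                         "vocabulary", "comprehension"}, rule))  # True: 4/5 = 0.8

The point is simply that the mastery threshold is program-level configuration rather than something baked into the platform.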

Many traditional academic programs have learning outcomes, but a key differentiator for a CBE program is having some form of this competency framework and up-front design.

A unique feature (at least, unique among the platforms I’ve seen so far) is FASTRAK’s ability to let faculty set competencies at an individual course level, provided in a safe area that stays outside of the overall competency repository unless reviewed and approved.

The program or school can also group together specific competencies to define sub-degree certificates.

Course Design Beyond Walled Garden

At the recent Instructional Technology Council (ITC) eLearning 2015 conference, I presented a view of the general ed tech market moving beyond the walled garden approach. As part of this move, however, I described how the walled garden will likely live on within the top-down designs of specific academic programs, such as many (if not most) of the CBE pilots underway.

Now it's clear what's the role @PhilOnEdTech gives to #LMS when he talks about a new "walled garden" age. #LTI +1 pic.twitter.com/DXgdjctHto

— Toni Soto (@ToniSoto_Vigo) February 22, 2015

What FASTRAK shows, however, is that CBE does not require a walled garden approach. Keep in mind the overall approach of starting with the competency framework, moving through assessments, and then designing the academic program. In this last area FASTRAK allows several approaches to bringing in pre-existing content and separate applications.

Add Resource Type

The system, along with the current version of LoudBooks, is LTI- as well as SCORM-compliant and uses this interoperability to give choices to faculty. Remember that FlatWorld prides itself on deeply integrating content, mostly its own, into the platform. While FlatWorld can bring in outside content like OER, it is the FlatWorld designers who have to do this work. LoudCloud, by contrast, puts this choice in the hands of faculty. Two very different approaches.

LTI Apps

FASTRAK does provide a fairly impressive set of reports on how students are doing against the competencies, which should help faculty and program designers spot where students are having problems or where the course designs need improving.

Competency Reporting

CBE-Light

An interesting note from the demo and conversation is that LoudCloud claims that half of their pilots are CBE-light, where schools want to try out competencies at the course level but not at the program level. This approach allows them to avoid the need for regulatory approval.

While I have already called out the basics of what CBE entails in this primer, I have also seen a lot of watering down or alteration of the CBE terminology. Steven Mintz from the University of Texas recently published an article at Inside Higher Ed that calls out what he terms CBE 2.0, where they are trying approaches that are not fully online or even self-paced. This will be a topic for a future post on what really qualifies as CBE and where people are just co-opting the terminology.

e-Literate TV Preview: Essex County College and changing role of faculty

Wed, 2015-02-25 17:58

By Phil Hill

As we get closer to the release of the new e-Literate TV series on personalized learning, Michael and I will be posting previews highlighting some of the more interesting segments from the series. When we first talked about the series with its sponsors, the Bill & Melinda Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent.

In this video preview (about 4:18 in duration), we hear from two faculty members who have first-hand experience in using a personalized learning approach as well as a traditional approach to remedial math. We also hear from students on what they are learning about learning. In our case studies so far, the real faculty issue is not that software is being designed to replace faculty, but rather that successful implementation of personalized learning necessarily changes the role of faculty. One of our goals with e-Literate TV is to allow faculty, staff and students to describe direct experiences in their own words. Take a look.

(Video embedded in the original post.)

Stay tuned for the full episodes to be released on the In The Telling platform[1]. You can follow me (@PhilOnEdTech), Michael (@mfeldstein67), or e-Literate TV (@eLiterateTV) to stay up to date. You can also follow the e-Literate TV YouTube channel. We will also announce the release here on e-Literate.

  1. ITT is our partner in developing this series, providing video production as well as the platform.

First View of Bridge: The new corporate LMS from Instructure

Tue, 2015-02-24 04:41

By Phil Hill

Last week I covered the announcement from Instructure that they had raised another $40 million in venture funding and were expanding into the corporate learning market. Today I was able to see a demo of their new corporate LMS, Bridge. While Instructure has very deliberately designed a separate product from Canvas, their education-focused LMS, you can see the same philosophy of market strategy and product design embedded in the new system. In a nutshell, Bridge is designed to be a simple, intuitive platform that moves control of learning design away from central HR or IT and closer to the end user.

While our primary focus at e-Literate is on higher ed and even some K-12 learning, the professional development and corporate training markets are becoming more important even in the higher ed context. At the least, this is important for those who are tracking Instructure and how the company’s plans might affect the future of education platforms.

The core message from Instructure regarding Bridge – just as with Canvas – is that it is focused on ease of use, whereas the entrenched competition has fallen prey to feature bloat based on edge cases. Setting aside this claim and Instructure’s track record with Canvas, what does this actually mean? I’m pretty sure every vendor out there claims ease of use, whether their designs are elegant or terrible[1].

Based on the demo, Bridge appears to define ease of use in three distinct areas – a streamlined, clutter-free interface for learners, simple tools for content creation by business units, and simple tools for managing learners and content.

Learner User Experience

Bridge has been designed over the past year, based on Instructure’s decision to avoid force-fitting Canvas into corporate learning markets. The core use cases of this new market are far simpler than education use cases, and the resultant product has fewer bells and whistles than Canvas. In Instructure’s view, the current market has such cumbersome products that learning platforms are mostly used just for compliance – take this course or you lose your job – and not at all for actual learning. The Bridge interface (shown below on both mobile and laptop screens) is simple.

Mobile same as laptop

Learner progress

While this is a clean interface, I don’t see it as being that big of a differentiator or rationale for a new product line.

Content Creation

The content creation tools, however, start to show Instructure’s distinctive approach. They have made their living on being able to say no – refusing to let user requests for additional features change their core design principle. The approach for Bridge is to assume that content creators should not need web design or instructional design experience, providing them with simple formatting and suggestion-based tools to make content creation easy. The formatting looks to be on the level of Google Docs, or basic WordPress, rather than Microsoft Word.

Content authoring tool

When creating new content, the Bridge LMS even puts up prompts for pre-formatted content types.

Content prompts

When creating quizzes, they have an interesting tool that adds natural language processing to facilitate simple questions that can be randomized. The author can write a simple sentence stating what they are trying to convey, such as “The Golden Gate Bridge is in San Francisco”. The tool selects each word and allows the author to add alternatives that can serve as distractors in a quiz, such as San Mateo or San Diego (it is not clear if you can group words to replace the full “San Francisco” rather than just “Francisco”). The randomized quiz questions can then be created automatically.

Quiz Creation
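
As a sketch of the general idea – my reconstruction from the demo, not Instructure’s actual implementation – generating a randomized item from a sentence plus author-supplied alternatives could be as simple as:

import random

def make_quiz_item(sentence, correct, distractors, rng=random):
    """Blank out the author-selected phrase and shuffle it in with the
    author-supplied alternatives to form a multiple-choice item."""
    choices = [correct] + list(distractors)
    rng.shuffle(choices)
    return {
        "stem": sentence.replace(correct, "_____"),
        "choices": choices,
        "answer": correct,
    }

# Using the sentence from the demo, and treating "San Francisco" as a grouped
# phrase (which is exactly the open question noted above).
item = make_quiz_item(
    "The Golden Gate Bridge is in San Francisco.",
    correct="San Francisco",
    distractors=["San Mateo", "San Diego"],
)
print(item["stem"])
print(item["choices"])

The interesting part in Bridge is not this mechanical step but the suggestion of plausible distractors, which is presumably where the natural language processing comes in.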

For content that is more complex, Instructure is taking the approach of saying ‘no’ – go build that content in a more complex authoring tool and bring it in via SCORM/AICC import.

Learner Administration Tools

Rather than relying on complex HR systems to manage employees, Bridge goes with a CSV import tool that reminds me of Tableau in that it pre-imports, shows the fields, and allows a drag-and-drop selection and re-ordering of fields for the final import[2].

CSV Learner Import
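
A minimal sketch of that kind of preview-then-map CSV import is below; the column names and learner fields are my own placeholders, not Bridge’s:

import csv
from io import StringIO

raw = StringIO("""Employee ID,Full Name,Work Email,Dept
1001,Ada Lovelace,ada@example.com,Engineering
1002,Grace Hopper,grace@example.com,Compliance
""")

reader = csv.DictReader(raw)
rows = list(reader)
print("Detected fields:", reader.fieldnames)  # shown to the admin before import

# The admin's drag-and-drop mapping, expressed as source column -> learner field.
mapping = {"Work Email": "email", "Full Name": "name", "Dept": "group"}

learners = [{field: row[col] for col, field in mapping.items()} for row in rows]
print(learners)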

The system can also create or modify groups based on rules.

Group creation tool

To pull this together, Bridge attempts to automate as much of the background process as is feasible. To take one example, when you hire a new employee or change the definition of groups, the system retroactively adds the revised list of learners or groups to assigned courses.
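
A hedged sketch of that rule-based, retroactive behavior (invented names again; this is just the logic as described in the demo):

# A group is defined by a rule over learner attributes rather than a fixed
# roster, so re-evaluating the rule picks up new hires automatically.
learners = [
    {"name": "Ada", "department": "Engineering", "location": "SLC"},
    {"name": "Grace", "department": "Compliance", "location": "Chicago"},
]

group_rules = {
    "All Compliance": lambda l: l["department"] == "Compliance",
    "Salt Lake office": lambda l: l["location"] == "SLC",
}

course_assignments = {"Annual ethics training": ["All Compliance", "Salt Lake office"]}

def enrollees(course):
    """Re-evaluate the group rules for a course's assigned groups."""
    groups = course_assignments[course]
    return [l["name"] for l in learners
            if any(group_rules[g](l) for g in groups)]

print(enrollees("Annual ethics training"))   # ['Ada', 'Grace']

# A new hire in Compliance is picked up on the next evaluation, with no
# administrator action required.
learners.append({"name": "Katherine", "department": "Compliance", "location": "DC"})
print(enrollees("Annual ethics training"))   # ['Ada', 'Grace', 'Katherine']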

For live training, you can see where Bridge takes the opposite approach to Canvas. In Canvas (as with most education LMSs), it is assumed that more time in the system means more time learning – the core job of learners. In Bridge, however, the assumption is that LMS time-on-task should be minimized. For compliance training in particular, you want the employee to spend as little time in training as is reasonable so they can get their real job done. Bridge focuses not on the live training itself but rather on the logistical tasks of setting up the course (scheduling, registering, taking attendance).

Live training tools

Prospects and Implications

Taken together, the big story here is that Instructure seeks to change the situation where learning management in corporations is cloistered within HR, IT, and instructional design units. As they related today, they want to democratize content creation and center learning in the business units where the subject matter experts reside.

Their future plans focus on engagement – getting feedback and dialogue from employees rather than just one-way content dissemination and compliance. If they are successful, this is where they will gain lasting differentiation in the market.

What does this mean from a market perspective? Although I do not have nearly as much experience with corporate training as I do with higher education, this LMS seems like a real system and a real market entry into corporate learning. The primary competitors in this space are not Blackboard, as TechCrunch and BuzzFeed implied, but rather Saba, SumTotal, SuccessFactors, Cornerstone, etc. Unlike education, this is a highly fragmented market. I suspect that this means that the growth prospects for Instructure will be slower than in education, but real nonetheless. Lambda Solutions shared the Bersin LMS study to give a view of the market.

LMS market

This move is clearly timed to help with Instructure’s planned IPO that could happen as soon as November 2015[3]. Investors can now see potential growth in an adjacent market to ed tech where they have already demonstrated growth.

I mentioned in my last post that the biggest risk I see is management focus and attention. I suspect with their strong fund-raising ($90 million to date) that the company has enough cash to hire staff for both product lines, but senior management will oversee both the Canvas and the Bridge product lines and markets.

  1. Although I would love to see the honest ad: “With a horrible, bloated user interface based on your 300-item RFP checklist!”
  2. I assume they can integrate with HR systems as well, but we did not discuss this aspect.
  3. Note this is based on my heuristic analysis and not from Instructure employees.

ITC #eLearning2015 Keynote Video and Material

Sat, 2015-02-21 17:20

By Phil Hill

This past week I had the opportunity to give the keynote at the Instructional Technology Council (ITC) eLearning 2015 conference in Las Vegas. ITC is a great group that provides leadership and professional development in online education – and increasingly in hybrid course models – to faculty and staff at community and junior colleges. To save time on individual sharing, I have included most of the material below.

Here is the MediaSite recording of the keynote:

And here are the slides in SlideShare:

And here is the YouTube channel for e-Literate TV. The Essex County College clip is a sneak preview of an upcoming e-Literate TV case study on personalized learning (more on that in the next post).

(Video embedded in the original post.)

Finally, here are the two clips from the WCET14 student panel:

Need for some level of standardization:

(Video embedded in the original post.)

Need for interaction:

(Video embedded in the original post.)

And last, but certainly not least, the infamous Underpants Gnome video:

(Video embedded in the original post.)

What TechCrunch Got Wrong (and Right) About Instructure Entering Corporate Learning Market

Thu, 2015-02-19 17:08

By Phil Hill

After yesterday’s “sources say” report from TechCrunch about Instructure – maker of the Canvas LMS – raising a new round of financing and entering the corporate LMS space, Instructure changed plans and made their official announcement today. The funding will both expand the Canvas team and establish the new corporate LMS team. I’m not a fan of media attempts to get a scoop based purely on rumors, and in this case TechCrunch got a few items wrong that are worth correcting.

  • Instructure raised $40 million in new financing (series E), not “between $50 to $70 million”. TechCrunch did hedge their bets with “low end of the range at over $40 million”.
  • The primary competition in the corporate LMS space is Saba, SumTotal, Skillsoft, Cornerstone – and not Blackboard.
  • The Canvas LMS was launched in 2010, not 2011. (OK, I’ll give them this one, as even Instructure seems to use the 2011 date).

TechCrunch did get the overall story of fund-raising and new corporate product right, but these details matter.

Instructure’s new product for the corporate learning market is called Bridge, with its web site here. This is an entirely new product, although it does share a similar product architecture with Canvas, the LMS designed for the education market (including being based on Ruby on Rails). Unlike Canvas, Bridge was designed mobile-first, with all mobile capabilities embedded in the product rather than in separate applications. In an interview, Josh Coates, CEO of Instructure, described the motivation for this new product.

We like the idea of building software that helps people get smarter. Post education there is a void, with bad corporate software.

The design goal of Bridge is to make the creation and consumption of learning content easy, although future directions for the company will emphasize employee engagement and two-way conversations within companies. According to Coates, this focus on engagement parallels their research for future emphasis in the education market.

Bridge

The Bridge product line will have a separate sales team and product team. From the press release:

Foundation partners include CLEARLINK, OpenTable and Oregon State University.

Oregon State University is an interesting customer of both products – they are adopting Canvas as part of their Unizin membership, and they are piloting Bridge as an internal HR system for training staff. This move will likely be adopted by other Canvas education customers.

Given the self-paced nature of both Competency-Based Education (CBE) and corporate learning systems, I asked if Bridge is targeted to get Instructure into the CBE land grab. Coates replied that they are researching whether and how to get into CBE, but they are first exploring if this can be done with Canvas. In other words, Bridge truly is aimed at the corporate learning market.

While Instructure has excelled at maintaining product focus and simplicity of user experience, this move outside of education raises the question of whether they can maintain company focus. The corporate market is very different from the education market – different product needs, a fragmented vendor market, different buying patterns. Many companies have tried to cross over between education and corporate learning, but most have failed. Blackboard, D2L and Moodle have made a footprint in the corporate space using one product for both markets. Instructure’s approach is different.

As for the fund-raising aspects, Instructure has made it very clear they are planning to go public with an IPO sometime soon, as reported by Buzzfeed today.

CEO Josh Coates told BuzzFeed today that the company had raised an additional $40 million in growth funding ahead of a looming IPO, confirming a rumor that was first reported by Tech Crunch yesterday. The company has now raised around $90 million.

Given their cash, a natural question is whether Instructure plans to use this to acquire other companies. Coates replied that they get increasingly frequent inbound requests (for Instructure to buy other companies) that they evaluate, but they are not actively pursuing M&A as a key corporate strategy.

I have requested a demo of the product for next week, and I’ll share the results on e-Literate as appropriate.

Update: Paragraph on organization corrected to point out separate product team. Also added sentence on funding to go to both Canvas and Bridge.

NGDLE: The quest to eat your cake and have it too

Tue, 2015-02-17 05:51

By Phil Hill

And I’m going old school and sticking to the previous saying.

Google Ngram Viewer

Today I’m participating in the EDUCAUSE meeting on Next Generation Digital Learning Environments, funded by the Bill & Melinda Gates Foundation[1]. From the invitation:

The purpose of the panel is to identify potential investment strategies that are likely to encourage and hasten the arrival of “next-generation digital learning environments,” online learning environments that take us beyond the LMS to fully support the needs of today’s students and instructors. [snip]

It is clear that to meet the needs of higher education and today’s learner, the NGDLE must support a much wider range of functionality than today’s LMS, including different instructional modes, alternative credit models, personalized learning, robust data and content exchange, real-time and iterative assessment, the social web, and contemporary software design and usability practices. The policy and cultural context at our colleges and universities must also adapt to a world in which all learning has a digital component.

As I’m making an ill-timed trip from sunny California to snow-ravaged DC for a reduced-attendance meeting, I should at least lay down some of my thoughts on the subject in writing[2].

There is potential for confusion of language here in implying that the NGDLE is an environment to replace today’s LMS. Are we talking about new, monolithic systems that replace today’s LMS but also have a range of functionality to support new needs, or are we talking about an environment that allows reasonably seamless integration and navigation between multiple systems? Put another way, investing in what?

To get at that question we should consider the current LMS market.

Current Market

Unlike five years ago, market dynamics are now leading to systems that better meet the needs of students. Primarily driven by the entrance of the Canvas LMS, the end of the Blackboard – Desire2Learn patent lawsuit, and new ed tech investment, today’s systems are lower in costs than previous systems and have much better usability. Canvas changed the standard of what an LMS can be for traditional courses – competitors that view it as just the shiny new object and not a material difference in usability have done so at their own peril. Blackboard is (probably / eventually / gosh I hope) releasing an entirely new user experience this year that seems to remove much of the multiple-click clunkiness of the past. Moodle has eliminated most of the scroll of death. Sakai 10 introduced a new user interface that is far better than what they had in the past.

It seems at every school I visit and every report I read, students are asking for consistency of usage and navigation along with more usable systems. This is, in fact, what the market is finally starting to deliver. It’s not a perfect market, but there are real changes occurring.

I have already written about the trend of the LMS, driven particularly by IMS standards, to move from a walled garden approach:

Walled garden diagram

to an open garden approach that allows the coordination of the base system with external tools.

Open garden diagram

Largely due to adoption of the Learning Tools Interoperability (LTI) specifications from IMS Global, it is far easier today to integrate different applications with an LMS. Perhaps more importantly, the ability to move the integration decision closer to end users (from central IT to departments and faculty) is getting closer and closer to reality. Michael has also written about the potential of the Caliper framework to be even more significant in expanding interoperability.

The LMS is not going away, but neither is it going to be the whole of the online learning experience anymore. It is one learning space among many now. What we need is a way to tie those spaces together into a coherent learning experience. Just because you have your Tuesday class session in the lecture hall and your Friday class session in the lab doesn’t mean that what happens in one is disjointed from what happens in the other. However diverse our learning spaces may be, we need a more unified learning experience. Caliper has the potential to provide that.
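
To make the LTI point above concrete, here is a rough sketch of what an LTI 1.1 launch looks like on the wire – an OAuth 1.0a-signed form POST from the LMS to the external tool. This uses the third-party oauthlib package; the URL, key, and secret are placeholders, and in practice the LMS handles all of this for you:

from urllib.parse import urlencode
from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

launch_url = "https://tool.example.com/lti/launch"   # placeholder tool endpoint

# Standard LTI 1.1 launch parameters describing the user, course, and placement.
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-week-3-quiz",
    "user_id": "student-42",
    "roles": "Learner",
    "context_id": "course-101",
    "launch_presentation_return_url": "https://lms.example.edu/return",
}

# The LMS signs the POST body with the shared consumer key/secret (HMAC-SHA1).
client = Client("consumer-key", client_secret="consumer-secret",
                signature_type=SIGNATURE_TYPE_BODY)
uri, headers, body = client.sign(
    launch_url,
    http_method="POST",
    body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

# The signed body (now including the oauth_* fields) is what the browser
# auto-posts to the tool, which verifies the signature and renders its content.
print(body)

That simplicity is a large part of why it has become so much easier to plug external applications into an LMS.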

At the same time there is a new wave of learning platforms designed specifically for this latter category. I have started to cover the CBE platforms recently, as Motivis, Helix, FlatWorld, LoudCloud Systems, and others have been introduced with radically different features and capabilities. At e-Literate TV we are learning more about adaptive and personalized systems such as ALEKS, Smart Sparrow, OLI, Cerego, and others that are designed around learning.

If you look at this new wave of learning environments, you’ll see that they are designed around the learner instead of the course and are focused on competencies or some other form of learning outcomes.

In a sense, the market is working. Better usability for traditional LMS, greater interoperability, and new learning platforms designed around the learner. There is a risk for NGDLE in that you don’t want to screw up the market when it’s finally moving in the right direction.

And Yet . . .

The primary benefit of today’s LMS remains administrative management of traditionally designed courses. In last year’s ECAR report on the LMS, faculty and students rated their LMS satisfaction highest for the basic administrative functions.

Faculty satisfaction LMS

Student satisfaction LMS

Opponents of the traditional LMS are right to call out how its design can stifle creativity and prevent real classroom engagement. Almost all capabilities of the LMS are available on the free Internet, typically in better-designed tools.

This situation leads to three challenges:

  • The community has discussed the need for direct teaching and learning support for years, yet most courses only use the LMS for rosters, grade book and document sharing (syllabus, readings, assignments). The market changed en masse to call their systems Learning Management Systems in the late 2000s, but the systems mostly remain Course Management Systems as previously named. Yes, some schools and faculty – innovators and early adopters – have found ways to get learning benefits out of the systems, but that is secondary to managing the course.
  • New educational delivery models such as competency-based education (CBE) and personalized learning require a learner-centric design that is not just a matter of adding some features on top of the core LMS. It is worth noting that the new learning platforms tend to be wholesale replacements for the LMS in specific programs rather than expansions of its capabilities.
  • The real gains in learner-specific functionality have arisen from applications that don’t attempt to be all things to all people. In today’s world it’s far easier to create a new web- and mobile-based application than ever before, and many organizations are taking this approach. Any attempt to push platforms into broader functionality creates the risk of pushing the market backwards into more feature bloat.

Back to the NGDLE

I won’t go into investment strategies for NGDLE, as that is the topic for the group discussions today. But I think it is worth calling out two seemingly incompatible needs that any strategy must support.

  • Given the very real improvements in the LMS market, we should not abandon the gains made by institutions and faculty that have taken ~15 years to achieve.
  • The market should not just evolve – new educational models require new ground-up designs, and we need far more emphasis on learning support and student engagement.

Is it possible to eat your cake and have it, too? In my opinion, our best chance is through encouragement of and support for interoperability frameworks that allow a course or learner hub / aggregator – providing consistent navigation and support for faculty not looking to innovate with technology – along with an ecosystem of true learning applications and environments. This is the move to learning platforms, not just as a marketing term but as true support for an integrated world of applications.

  1. Disclosure: Our upcoming e-Literate TV series has also received a grant from the Gates Foundation.
  2. Now that I’ve gone down for breakfast, the 2-inch snowfall would be somewhat embarrassing if not for the city being shut down.

What Does Unizin Mean for Digital Learning?

Mon, 2015-02-16 13:41

By Michael Feldstein

Speaking of underpants gnomes sales pitches, Phil and I spent a fair amount of time hearing about Unizin at the ELI conference. Much of that time was spent hearing friends that I know, trust, and respect talk about the project. At length, in some cases. On the one hand, it is remarkable that, after these long conversations, I am not much clearer on the purpose of Unizin than I was the week before. On the other hand, being reminded that some of my friends really believe in this thing helped me refill my reservoir of patience for the project, which had frankly run dry.

Alas, that reservoir was largely drained away again during a Unizin presentation with the same title as this blog post. I went there expecting the presenters to answer that question for the audience.

Alack.

The main presentation, given by Anastasia Morrone of IUPUI, was probably the most straightforward and least hype-filled presentation about Unizin that I have heard so far. It was also short. Just when I was warming to it and figuring we’d get to the real meat, her last slide came up:

Split into groups of 5-7 people and discuss the following:

How can faculty, teaching center consultants, and learning technologists contribute to best practices with the evolving Unizin services?

Wait. What?

That’s right. They wanted us to tell them what Unizin means for digital learning. That might have been a good question to ask before they committed to spend a million dollars each on the initiative.

I joined one of the groups, resolving to try as hard as I could to keep my tongue in check and be constructive (or, at least, silent) for as long as I could. The very first comment in my group—not by me, I swear—was, “Before I can contribute, can somebody please explain to me what Unizin is?” It didn’t get any better from there. At the end of the breakout session, our group’s official answer was essentially, “Yeah, we don’t have any suggestions to contribute, so we’re hoping the other groups come up with something.” None of them did, really. The closest they came were a couple of vague comments on inclusive governance. I understand from a participant in one of the other groups that they simply refused to even try to answer the question. It was brutal.

(Video embedded in the original post.)

Still, in the spirit of the good intentions behind their request for collaborative input, I will list here some possible ways in which Unizin could provide value, in descending order of credibility.

I’ll start with the moderately credible:

  • Provide a layer of support services on top of and around the LMS: This barely even gets mentioned by Unizin advocates, but it is the one that makes the most sense to me. Increasingly, in addition to your LMS, you have a bunch of connected tools and services. It might be something basic like help desk support for the LMS itself. It might be figuring out how an external application like Voicethread works best with your LMS. As the LMS evolves into the hub of a larger ecosystem, it is putting increasing strain on IT departments in everything from procurement to integration to ongoing support. Unizin could be a way of pooling resources across institutions to address those needs. If I were a CIO in a big university with lots of demands for LMS plug-in services, I would want this.
  • Provide a university-controlled environment for open courses: Back when Instructure announced Canvas Network, I commented that the company had cannily targeted the issue that MOOC providers seemed to be taking over the branding, not to mention substantial design and delivery decisions, from their university “partners.” Canvas Network is marketed as “open courses for the rest of us.” By adopting Canvas as their LMS, Unizin gets this for free. Again, if I were a CIO or Provost at a school that was either MOOCing or MOOC-curious, I would want this.
  • Provide buying power: What vendor would not want to sew up a sales deal with ten large universities or university systems (and counting) through one sales process? So far it is unclear how much Unizin has gained in reality through group negotiations, but it’s credible that they could be saving significant money through group contracting.
  • Provide a technology-assisted vehicle for sharing course materials and possibly even course cross-registrations: The institutions involved are large, and most or all probably have specialty strengths in some curricular area or other. I could see them wanting to trade, say, an Arabic degree program for a financial technology degree program. You don’t need a common learning technology infrastructure to make this work, but having one would make it easier.
  • Provide a home for a community researching topics like learning design and learning analytics: Again, you don’t need a common infrastructure for this, but it would help, as would having courses that are shared between institutions.

Would all of this amount to a significant contribution to digital learning, as the title of the ELI presentation seems to ask? Maybe! It depends on what happens in those last two bullet points. But the rollout of the program so far does not inspire confidence that the Unizin leadership knows how to facilitate the necessary kind of community-building. Quite the opposite, in fact. Furthermore, the software has only ancillary value in those areas, and yet it seems to be what Unizin leaders want to talk about 90%+ of the time.

Would these benefits justify a million-dollar price tag? That’s a different question. I’m skeptical, but a lot depends on specific inter-institutional intentions that are not public. A degree program has a monetary value to a university, and some universities can monetize the value better than others depending on which market they can access with significant degrees of penetration. Throw in the dollar savings on group contracting, and you can have a relatively hard number for the value of the coalition to a member. I know that a lot of university folk hate to think like that, but it seems to be the most credible way to add the value of these benefits up and get to a million dollars.

Let’s see if we can sweeten the pot by adding in the unclear or somewhat dubious but not entirely absurd benefits that some Unizin folk have claimed:

  • Unizin will enable universities to “own” the ecosystem: This claim is often immediately followed by the statement that their first step in building that ecosystem was to license Canvas. The Unizin folks seem to have at least some sense that it seems contradictory to claim you are owning the ecosystem by licensing a commercial product, so they immediately start talking about how Canvas is open source and Unizin could take it their own way if they wanted to. Yet this flies in the face of Unizin’s general stated direction of mostly licensing products and building connectors and such when they have to. Will all products they license be open source? Do they seriously commit to forking Canvas should particular circumstances arise? If not, what does “ownership” really mean? I buy it in relation to the MOOC providers, because there they are talking about owning brand and process. But beyond that, the message is pretty garbled. There could be something here, but I don’t know what it is yet.
  • Unizin could pressure vendors and standards groups to build better products: In the abstract, this sounds credible and similar to the buying power argument. The trouble is that it’s not clear either that pressure on these groups will solve our most significant problems or that Unizin will ask for the right things. I have argued that the biggest reason LMSs are…what they are is not vendor incompetence or recalcitrance but that faculty always ask for the same things. Would Unizin change this? Indiana University used what I would characterize as a relatively progressive evaluation framework when they chose Canvas, but there is no sign that they were using the framework to push their faculty to fundamentally rethink what they want to do with a virtual learning environment and therefore what it needs to be. I don’t doubt the intellectual capacity of the stakeholders in these institutions to ask the right questions. I doubt the will of the institutions themselves to push for better answers from their own constituents. As for the standards, as I have argued previously, the IMS is doing quite well at the moment. They could always move faster, and they could always use more university members who are willing to come to the table with concrete use cases and a commitment to put in the time necessary to work through a standards development process (including implementation). Unizin could do that, and it would be a good thing if they did. But it’s still pretty unclear to me how much their collective muscle would be useful to solve the hard problems.

Don’t get me wrong; I believe that both of the goals articulated above are laudable and potentially credible. But Unizin hasn’t really made the case yet.

Instead, at least some of the Unizin leaders have made claims that are either nonsensical (in that they don’t seem to actually mean anything in the real world) or absurd:

  • “We are building common gauge rails:” I love a good analogy, but it can only take you so far. What rides on those rails? And please don’t just say “content.” Are we talking about courses? Test banks? Individual test questions? Individual content pages? Each of these has very different reuse characteristics. Content isn’t just a set of widgets that can be loaded up in rail cars and used interchangeably wherever they are needed. If it were, then reuse would have been a solved problem ten years ago. What problem are you really trying to solve here, and why do you think that what you’re building will solve it (and is worth the price tag)?
  • “Unizin will make migrating to our next LMS easier because moving the content will be easy.” No. No, no, no, no, no, no, no. This is the perfect illustration of why the “common gauge rails” statement is meaningless. All major LMSs today can import IMS Common Cartridge format, and most can export in that format. You could modestly enhance this capability by building some automation that takes the export from one system and imports it into the other (a rough sketch of what that automation might look like follows this list). But that is not the hard part of migration. The hard part is that LMSs work differently, so you have to redesign your content to make best use of the design and features of the new platform. Furthermore, these differences are generally not ones that you want to stamp out—at least, not if you care about these platforms evolving and innovating. Content migration in education is inherently hard because context makes a huge difference. (And content reuse is exponentially harder for the same reason.) There are no widgets that can be neatly stacked in train cars. Your rails will not help here.
  • “Unizin will be like educational moneyball.” Again with the analogies. What does this mean? Give me an example of a concrete goal, and I will probably be able to evaluate the probability that you can achieve it, its value to students and the university, and therefore whether it is worth a million-dollar institutional investment. Unizin doesn’t give us that. Instead, it gives us statements like, “Nobody ever said that your data is too big.” Seriously? The case for Unizin comes down to “my data is bigger than yours”? Is this a well-considered institutional investment or a midlife crisis? The MOOC providers have gobs and gobs of data, but as HarvardX researcher Justin Reich has pointed out, “Big data sets do not, by virtue of their size, inherently possess answers to interesting questions….We have terabytes of data about what students clicked and very little understanding of what changed in their heads.” Tell us what kinds of research questions you intend to ask and how your investment will make it possible to answer them. Please. And also, don’t just wave your hands at PAR and steal some terms from their slides. I like PAR. It’s a Good Thing. But what new thing are you going to do with it that justifies a million bucks per institution?
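To make the point concrete, here is a minimal sketch of the kind of export/import automation referred to in the second bullet above. The hostnames, endpoint paths, and tokens are hypothetical placeholders, not any vendor’s actual API; the only point is that shuttling a Common Cartridge package between systems is a scripting exercise, while redesigning the course for the new platform is not.

```python
# A minimal sketch of the "easy part" of LMS migration: moving an IMS Common
# Cartridge export from one system into another. Endpoint paths, hostnames,
# and tokens below are hypothetical placeholders, not a real vendor API.
import requests

SOURCE_LMS = "https://old-lms.example.edu"  # hypothetical source system
TARGET_LMS = "https://new-lms.example.edu"  # hypothetical target system


def export_common_cartridge(course_id: str, token: str) -> bytes:
    """Ask the source LMS for a Common Cartridge (.imscc) export of a course."""
    resp = requests.get(
        f"{SOURCE_LMS}/api/courses/{course_id}/export",  # placeholder path
        params={"format": "common_cartridge"},
        headers={"Authorization": f"Bearer {token}"},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.content  # the .imscc package as raw bytes


def import_common_cartridge(course_id: str, package: bytes, token: str) -> None:
    """Upload the cartridge to the target LMS and start a content import."""
    resp = requests.post(
        f"{TARGET_LMS}/api/courses/{course_id}/imports",  # placeholder path
        files={"cartridge": ("course.imscc", package)},
        headers={"Authorization": f"Bearer {token}"},
        timeout=300,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    package = export_common_cartridge("geo-101", token="SOURCE_TOKEN")
    import_common_cartridge("geo-101-new", package, token="TARGET_TOKEN")
    # What no script can do: rethink how the course should actually be
    # taught in the new environment.
```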

I want to believe that my friends, whom I respect, believe in Unizin because they see a clear justification for it. I want to believe that these schools are going to collectively invest $10 million or more doing something that makes sense and will improve education. But I need more than what I’m getting to be convinced. It shouldn’t be left to the people outside the inner circle to construct the case for Unizin themselves. One of my friends inside the Unizin coalition said to me, “You know, a lot of big institutions are signing on. More and more.” I replied, “That means that either something very good is happening or something very bad is happening.” Given the utter disaster that was the ELI session, I’m afraid that I continue to lean in the direction of badness.

 

The post What Does Unizin Mean for Digital Learning? appeared first on e-Literate.

Wanted – A Theory of Change

Sun, 2015-02-15 14:25

By Michael Feldstein | More Posts (1014)

Phil and I went to the ELI conference this week. It was my first time attending, which is odd given that it is one of the best conferences that I’ve attended in quite a while. How did I not know this?

We went, in part, to do a session on our upcoming e-Literate TV series, which was filmed for use in the series. (Very meta.) Malcolm Brown and Veronica Diaz did a fantastic job of both facilitating and participating in the conversation. I can’t wait to see what we have on film. Phil and I also found that an unusually high percentage of sessions were ones that we actually wanted to go to and, once there, didn’t feel the urge to leave. But the most important aspect of any conference is who shows up, and ELI did not disappoint there either. The crowd was diverse, but with a high percentage of super-interesting people. On the one hand, I felt like this was the first time that there were significant numbers of people talking about learning analytics who actually made sense. John Whitmer from Blackboard (but formerly from CSU), Mike Sharkey from Blue Canary (but formerly from University of Phoenix), Rob Robinson from Civitas (but formerly from the University of Texas), Eric Frank of Acrobatiq (formerly of Flat World Knowledge)—these people (among others) were all speaking a common language, and it turns out that language was English. I feel like that conversation is finally beginning to come down to earth. At the same time, I got to meet Gardner Campbell for the first time and ran into Jim Groom. One of the reasons that I admire both of these guys is that they challenge me. They unsettle me. They get under my skin, in a good way (although it doesn’t always feel that way in the moment).

And so it is that I find myself reflecting disproportionately on the brief conversations that I had with both of them, and about the nature of change in education.

I talked to Jim for maybe a grand total of 10 minutes, but one of the topics that came up was my post on why we haven’t seen the LMS get dramatically better in the last decade and why I’m pessimistic that we’ll see dramatic changes in the next decade. Jim said,

Your post made me angry. I’m not saying it was wrong. It was right. But it made me angry.

Hearing this pleased me inordinately, but I didn’t really think about why it pleased me until I was on the plane ride home. The truth is that the post was intended to make Jim (and others) angry. First of all, I was angry when I wrote it. We should be frustrated at how hard and slow change has been. It’s not like anybody out there is arguing that the LMS is the best thing since sliced bread. Even the vendors know better than to be too boastful these days. (Most of them, anyway.) At best, conversations about the LMS tend to go like the joke about the old Jewish man complaining about a restaurant: “The food here is terrible! And the portions are so small!” After a decade of this, the joke gets pretty old. Somehow, what seemed like Jack Benny has started to feel more like Franz Kafka.

Second, it is an unattractive personal quirk of mine that I can’t resist poking at somebody who seems confident of a truth, no matter what that truth happens to be. Even if I agree with them. If you say to me, “Michael, you know, I have learned that I don’t really know anything,” I will almost inevitably reply, “Oh yeah? Are you sure about that?” The urge is irresistible. If you think I’m exaggerating, then ask Dave Cormier. He and I had exactly this fight once. This may make me unpopular at parties—I like to tell myself that’s the reason—but it turns out to be useful in thinking about educational reform because just about everybody shares some blame in why change is hard, and nobody likes to admit that they are complicit in a situation that they find repugnant. Faculty hate to admit that some of them reinforce the worst tendencies of LMS and textbook vendors alike by choosing products that make their teaching easier rather than better. Administrators hate to admit that some of them are easily seduced by vendor pitches, or that they reflexively do whatever their peer institutions do without a lot of thought or analysis. Vendors hate to admit that their organizations often do whatever they have to in order to close the sale, even if it’s bad for the students. And analysts and consultants…well…don’t get me started on those smug bastards. It would be a lot easier if there were one group, one cause that we could point to as the source of our troubles. But there isn’t. As a result, if we don’t acknowledge the many and complex causes of the problems we face, we risk having an underpants gnomes theory of change:

[Embedded video]

I don’t know what will work to bring real improvements to education, but here are a few things that won’t:

  • Just making better use of the LMS won’t transform education.
  • Just getting rid of the LMS won’t transform education.
  • Just bringing in the vendors won’t transform education.
  • Just getting rid of the vendors won’t transform education.
  • Just using big data won’t transform education.
  • Just busting the faculty unions won’t transform education.
  • Just listening to the faculty unions won’t transform education.

Critiques of some aspect of education or other are pervasive, but I almost always feel like I am listening to an underpants gnomes sales presentation, no matter who is pitching it, no matter what end of the political spectrum they are on. I understand what the speaker wants to do, and I also understand the end state to which the speaker aspires, but I almost never understand how the two are connected. We are sorely lacking a theory of change.

This brings me to my conversation with Gardner, which was also brief. He asked me whether I thought ELI was the community that could…. I put in an ellipsis there both because I don’t remember Gardner’s exact wording and because a certain amount of what he was getting at was implied. I took him to mean that he was looking for a super-progressive community that could drive real change (although it is entirely possible that I was and am projecting some hope that he didn’t intend). It took me a while to wrap my head around this encounter too. On the one hand, I am a huge believer in the power of communities as networks for identifying and propagating positive change. On the other hand, I have grown to be deeply skeptical of them as having lasting power in broad educational reform. Every time I have found a community that I got excited about, one of two things inevitably happened: either so many people piled into it that it lost its focus and sense of mission, or it became so sure of its own righteousness that the epistemic closure became suffocating. There may be some sour grapes in that assessment—as Groucho Marx said, I don’t want to belong to any club that would have me as a member—but it’s not entirely so. I think communities are essential. And redeeming. And soul-nourishing. But I think it’s a rare community indeed—particularly in transient, professional, largely online communities, where members aren’t forced to work out their differences because they have to live with each other—that really provides transformative change. Most professional communities feel like havens, when I think we need to feel a certain amount of discomfort for real change to happen. The two are not mutually exclusive in principle—it is important to feel like you are in a safe environment in order to be open to being challenged—but in practice, I don’t get the sense that most of the professional communities I have been in have regularly encouraged creative abrasion. At least, not for long, and not to the point where people get seriously unsettled.

Getting back to my reaction to Jim’s comment, I guess what pleased me so much is that I was proud to have provided a measure of hopefully productive and thought-provoking discomfort to somebody who has so often done me the same favor. This is a trait I admire in both Jim and Gardner. They won’t f**king leave me alone. Another thing that I admire about them is that they don’t just talk, and they don’t just play in their own little sandboxes. Both of them build experiments and invite others to play. If there is a way forward, that is it. We need to try things together and see how they work. We need to apply our theories and find out what breaks (and what works better than we could have possibly imagined). We need to see if what works for us will also work for others. Anyone who does that in education is a hero of mine.

So, yeah. Good conference.

 

The post Wanted – A Theory of Change appeared first on e-Literate.

e-Literate TV Case Study Preview: Middlebury College

Sun, 2015-02-15 10:48

By Michael Feldstein | More Posts (1013)

As we get closer to the release of the new e-Literate TV series on personalized learning, Phil and I will be posting previews highlighting some of the more interesting segments from the series. Both our preview posts and the series itself start with Middlebury College. When we first talked about the series with its sponsors, the Bill & Melinda Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent. And as part of our effort to establish a more objective frame, we started the series by going not to a school that was a Gates Foundation grantee but to the kind of place that Americans probably think of first when they think of a high-quality personalized education outside the context of technology marketing. We decided to go to an elite New England liberal arts college. We wanted to use that ideal as the context for talking about personalizing learning through technology. At the same time, we were curious to find out how technology is changing these schools and their notion of what a personal education is.

We picked Middlebury because it fit the profile and because we had a good connection through our colleagues at IN THE TELLING.[1] We really weren’t sure what we would find once we arrived on campus with the cameras. Some of what we found there was not surprising. In a school with a student/teacher ratio of 8.6 to 1, we found strong student/teacher relationships and empowered, creative students. Understandably, we heard concerns that introducing technology into this environment would depersonalize education. But we also heard great dialogues between students and teachers about what “personalized” really means to students who have grown up with the internet. And, somewhat unexpectedly, we saw some signs that the future of educational technology at places like Middlebury College may not be as different from what we’re seeing at public colleges and universities as you might think, as you’ll see in the interview excerpt below.

Jeff Howarth is an Assistant Professor of Geography at Middlebury. He teaches a very popular survey-level course in Geographic Information Systems (GIS). But it’s really a course about thinking about spaces. As Jeff pointed out to me, we typically provide little to no formal education on spatial reasoning in primary and secondary schooling. So the students walking into his class have a wide range of skills, based primarily on their natural ability to pick them up on their own. This broad heterogeneity is not so different from the wide spread of skills that we saw in the developmental math program at Essex County College in Newark, NJ. Furthermore, the difference between a novice and an expert within a knowledge domain is not just about how many competencies they have racked up. It’s also about how they acquire those competencies. Jeff did his own study of how students learn in his class, which confirmed broader educational research showing that novices in a domain tend to start with specific problems and generalize outward, while experts (like professors, but also like more advanced students) tend to start with general principles and apply them to the specific problem at hand. As Jeff pointed out to me, the very structure of the class schedule conspires against serving novice learners in the way that works best for them. Typically, students go to a lecture in which they are given general principles and then are sent to a lab to apply those principles. That order works for students who have enough domain experience to frame specific situations in terms of the general principles but not for the novices who are just beginning to learn what those general principles might even look like.

When Jeff thought about how to serve the needs of his students, the solution he came up with—partly still a proposal at this point—bears a striking resemblance to the basic design of commercial “personalized learning” courseware. I emphasize that he arrived at this conclusion through his own thought process rather than by imitating commercial offerings. Here’s an excerpt in which he describes deciding to flip his classroom before he had ever even heard of the term:

[Embedded video]

In the full ten-minute episode, we hear Jeff talk about his ideas for personalized courseware (although he never uses that term). And in the thirty-minute series, we have a great dialogue between students and faculty as well as some important context setting from the college leadership. The end result is that the Middlebury case study shows us that personalized learning software tools do not just have to be inferior substitutes for the real thing that are only for “other people’s children” while simultaneously reminding us of what a real personal education looks like and what we must be careful not to lose as we bring more technology into the classroom.

  1. Full disclosure: Since filming the case study, Middlebury has become a client of MindWires Consulting, the company that Phil and I run together.

The post e-Literate TV Case Study Preview: Middlebury College appeared first on e-Literate.

California Community College OEI Selects LMS Vendor

Thu, 2015-02-12 13:53

By Phil Hill | More Posts (289)

The Online Education Initiative (OEI) for California’s Community College System has just announced its vendor selection for a Common Course Management System (CCMS)[1]. For various reasons I cannot provide any commentary on this process, so I would prefer to simply direct people to the OEI blog site. Update: To answer some questions, the reason I cannot comment is that CCC is a MindWires client, and I facilitated the meetings. Based on this relationship we have a non-disclosure agreement with OEI.

Here is the full announcement.

The California Community Colleges (CCC) Online Education Initiative (OEI) announced its intent to award Instructure Inc. the contract to provide an online course management system and related services to community colleges statewide.

Support for Instructure’s Canvas system was nearly unanimous among the OEI’s Common Course Management System (CCMS) Committee members, with overwhelming support from student participants, officials said. Canvas is a course management platform that is currently being used by more than 1,000 colleges, universities and school districts across the country.

“Both the students and faculty members involved believed that students would be most successful using the Canvas system,” said OEI Statewide Program Director Steve Klein. “The student success element was a consistent focus throughout.”

The announcement includes some information on the process as well.

A 55-member selection committee participated in the RFP review that utilized an extensive scoring rubric. The decision-making process was guided by and included the active involvement of the CCMS Committee, which is composed of the CCMS Workgroup of the OEI Steering Committee, the members of OEI’s Management Team, and representatives from the eight Full Launch Pilot Colleges, which will be the first colleges to test and deploy the CCMS tool.

The recommendation culminated an extremely thorough decision-making process that included input from multiple sources statewide, and began with the OEI’s formation of a CCMS selection process in early 2014. The selection process was designed to ensure that a partner would be chosen to address the initiative’s vision for the future.

  1. Note that this is an Intent to Award, not yet a contract.

The post California Community College OEI Selects LMS Vendor appeared first on e-Literate.

A Sneak Preview of e-Literate TV at ELI

Tue, 2015-02-10 00:58

By Michael Feldstein | More Posts (1013)

Phil and I will be chatting with Malcolm Brown and Veronica Diaz about our upcoming e-Literate TV series on personalized learning in a featured session at ELI tomorrow. We’ll be previewing short segments of video case studies that we’ve done on an elite New England liberal arts college, an urban community college, and a large public university. Audience participation in the discussion is definitely encouraged. It will be tomorrow at 11:45 AM in California C for those of you who are here at the conference, and also webcast for those of you registered for the virtual conference.

We hope to see you there.

The post A Sneak Preview of e-Literate TV at ELI appeared first on e-Literate.

Flat World and CBE: Self-paced does not imply isolation

Mon, 2015-02-09 07:48

By Phil Hill | More Posts (287)

As competency-based education (CBE) becomes more and more important to US higher education, it is worth exploring the learning platforms in use. While there are cases of institutions using their traditional LMS to support a CBE program, there is a new market developing around learning platforms designed specifically for self-paced, fully online, competency-framework-based approaches.

Several weeks ago Flat World announced their latest round of funding, $5 million of debt financing, bringing their total to $40.7 million. The company started out by offering full e-textbooks (and was previously named FlatWorld Knowledge), developing 110 titles that included 25 of the 50 most-used lecture courses. The e-textbook market was not working out, however, and the company pivoted to competency-based education around the time that Chris Etesse became CEO two years ago. Now the company is developing a combined CBE learning platform with integrated course content – much of it repurposing the pre-existing e-textbook materials. Their first academic partner for CBE is Brandman University, a non-traditional part of the Chapman University system and currently a member of the CBEN network.

One central tenet of the Flat World approach grows out of their history and pivot – a tight integration of content and platform. As Etesse describes it, content is a first-class citizen in their system, whereas loosely coupled approaches that do not tie content and platform together can make it difficult to navigate and to collect learning analytics. For Brandman, approximately 70% of the content comes from the pre-existing FlatWorld texts, 25% comes from various OER sources, and about 5% has been custom-designed for Brandman.

In other words, this is very much a walled garden by design. While there is support for outside content, I believe this integration must be done by Flat World designers.

As was the case for the description of the Helix CBE-based learning platform, my interest here is not merely to review one company’s products, but rather to illustrate aspects of the growing CBE movement using the demo.

CBE programs by their very nature tend to be self-paced. One criticism or line of questions I’m seeing more often deals with the nature of self-paced learning itself. Are students just plugging through mindless e-text and multiple-choice assessments in isolation? What Flat World illustrates – as with other major CBE learning platforms – is that self-paced does not imply isolation, either from a student-teacher or from a student-student perspective. New approaches that are different than simple discussion forums are required, however.

FlatWorld shows several learning activities:

  • Reading text and viewing multimedia content adaptively presented based on a pretest and progress against competencies (a rough sketch of this idea follows the list);
  • Taking formative assessments primarily through multiple-choice quizzes;
  • Interacting with students and with faculty;
  • Working through project-based assignments;
  • Taking summative assessments through a proctored, webcam-streamed approach.
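As a rough illustration of what “adaptively presented based on a pretest and progress against competencies” can mean in practice, here is a minimal sketch. The data structures, competency names, and mastery threshold are illustrative assumptions, not Flat World’s actual implementation.

```python
# A minimal, hypothetical sketch of adaptive content presentation in a
# competency-based course: items tagged to competencies are surfaced only
# when the learner's estimated mastery of that competency is still below a
# threshold. Weakest competencies come first.
from dataclasses import dataclass

MASTERY_THRESHOLD = 0.8  # assumed cut-off for "already mastered"


@dataclass
class ContentItem:
    title: str
    competency: str  # the competency this reading/video addresses


def next_items(catalog: list[ContentItem], mastery: dict[str, float]) -> list[ContentItem]:
    """Return items for competencies the learner has not yet mastered, weakest first."""
    unmastered = [item for item in catalog
                  if mastery.get(item.competency, 0.0) < MASTERY_THRESHOLD]
    return sorted(unmastered, key=lambda item: mastery.get(item.competency, 0.0))


# Example: pretest scores seed the mastery estimates; formative quiz results
# would update them as the student works through the course.
catalog = [
    ContentItem("Reading: balance sheets", "acct.balance-sheets"),
    ContentItem("Video: cash flow basics", "acct.cash-flow"),
    ContentItem("Reading: revenue recognition", "acct.revenue"),
]
pretest_mastery = {"acct.balance-sheets": 0.9, "acct.cash-flow": 0.3, "acct.revenue": 0.55}
for item in next_items(catalog, pretest_mastery):
    print(item.title)
# Prints the cash-flow video, then the revenue reading; the balance-sheet
# reading is skipped because the pretest already shows mastery.
```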

The activities and assessments do not have to consist of students working in isolation on multiple-choice questions. For example, project-based work can be included, and assignments can require submission of written reports or responses to short-form prompts. As can be seen below, assessments can be based on submitted written work that faculty grade and use for feedback.

[Screenshot: FWK_Demo_Submit_Assessment]

On the communication side, the system tracks how active students are in communicating with faculty and even with other students (labeled ‘social’), as seen below.

[Screenshot: FWK_Demo_7_Day_Breakdown]

One challenge of a self-paced program such as a CBE approach is figuring out how to encourage students to interact with others. There is not a simple cohort to work with – the interaction instead will often be based on content: who else is working through the same content in roughly the same time period?

FlatWorld uses an approach that is very similar to Stack Overflow, where students can ask and answer questions over time, and the answers are voted up or down to allow the best answers to rise to the top. The question-and-answer board is moderated by faculty at Brandman. This not only allows students working on the same competencies at roughly the same time to interact, but it even allows interaction with students working on similar competencies separated in time.
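For readers who have not used Stack Overflow, here is a minimal sketch of the mechanic being described: answers accumulate votes and the highest-scoring ones rise to the top, and because threads hang off content and competencies rather than cohorts, a question asked months earlier can still help a later student. The class names and fields are illustrative assumptions, not the actual Flat World data model.

```python
# A minimal sketch of a vote-ranked Q&A thread tied to a competency rather
# than a cohort. Names and structure are illustrative assumptions only.
from dataclasses import dataclass, field


@dataclass
class Answer:
    author: str
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        return self.upvotes - self.downvotes


@dataclass
class Question:
    competency: str  # ties the thread to course content, not to a cohort
    title: str
    answers: list[Answer] = field(default_factory=list)

    def ranked_answers(self) -> list[Answer]:
        """Best answers first, as students (and faculty moderators) vote."""
        return sorted(self.answers, key=lambda a: a.score, reverse=True)


q = Question("gis.projections", "Why does my map look stretched near the poles?")
q.answers.append(Answer("student_a", "Check which projection you picked.", upvotes=5, downvotes=1))
q.answers.append(Answer("faculty_mod", "Mercator distorts area at high latitudes; try an equal-area projection.", upvotes=9))
print([a.author for a in q.ranked_answers()])  # -> ['faculty_mod', 'student_a']
```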

[Screenshot: FW_DiscussionBoards]

 

[Screenshot: FW_SocialInteraction]

There certainly is a tendency in many CBE programs to stick to multiple-choice assignments and quizzes and to avoid much social interaction. That method is a whole lot easier to design, and with several hundred new programs under development, I think the overall quality can be quite low in many programs, particularly those looking for a quick-win CBE introduction, essentially trying to jump on the bandwagon. You can see the tendency towards multiple-choice in the FlatWorld system as well.

But self-paced does not imply isolation, and the Flat World implementation of the Brandman University program shows how CBE can support project-based work, written assignments and assessments, and interaction between students and faculty as well as between multiple students.

The post Flat World and CBE: Self-paced does not imply isolation appeared first on e-Literate.

Instructure Releases 4th Security Audit, With a Crowd-sourcing Twist

Sat, 2015-02-07 12:17

By Phil Hill | More Posts (286)

In the fall of 2011 I made the following argument:

We need more transparency in the LMS market, and clients should have access to objective measurements of the security of a solution. To paraphrase Michael Feldstein’s suggestions from a 2009 post:

  • There is no guarantee that any LMS is more secure just because they say they are more secure
  • Customers should ask for, and LMS vendors should supply, detailed information on how the vendor or open source community has handled security issues in practice
  • LMS providers should make public a summary of vulnerabilities, including resolution time

I would add to this call for transparency that LMS vendors and open source communities should share information from their third-party security audits and tests.  All of the vendors that I talked to have some form of third-party penetration testing and security audits; however, how does this help the customer unless this information is transparent and available?  Of course this transparency should not include details that would advertise vulnerabilities to hackers, but there should be some manner to be open and transparent on what the audits are saying. [new emphasis added]

Inspired by fall events and this call for transparency, Instructure (maker of the Canvas LMS) decided to hold a public security audit using a white hat testing company, where A) the results of the testing would be shared publicly, and B) I would act as an independent observer to document the process. The results of this testing are described in two posts at e-Literate and in a post at Instructure.

Instructure has kept up the process, this year with a crowd-sourcing twist:

What was so special about this audit? For starters, we partnered with Bugcrowd to enlist the help of more than 60 top security researchers. To put that number in context, typical third-party security audits are performed by one or two researchers, who follow standard methodologies and use “tools of the trade.” Their results are predictable, consistent, and exactly what you’d want and expect from this type of service. This year, we wanted an audit that would produce “unexpected” results by testing our platform in unpredictable ways. And with dozens of the world’s top experts, plus Bugcrowd’s innovative and scrappy crowdsourcing approach, that’s exactly what we got.

So while last year’s audit found six issues, this year’s process unearthed a startling 59. (Yeah, you read that right. Fifty-nine.) Witness the power of crowdsourcing an open security audit.

The blog post goes on to state that all 59 issues have been fixed with no customer impacts.

I harp on this subject not just to congratulate Instructure on keeping up the process, but to maintain that the ed tech world would benefit from transparent, open security audits. Back in 2011 there were ed tech executives who disagreed with the approach of open audits.

There are risks, however, to this method of public security testing. Drazen Drazic, the managing director of Securus Global, indicated that in talking to people around the world through security-related social networks, no other companies have chosen to use an independent observer for this testing. This is not to argue that no one should do it, but clearly we are breaking new ground here and need to be cautious.

One downside of public security assessments is that the act of publicizing results can in fact increase the likelihood that vulnerabilities would be exploited by hackers. As one executive from a competitive LMS put it to me, we need to focus on security consistently and not as a once-a-year exercise. Any public exposure of vulnerabilities can increase the likelihood of hackers exploiting those vulnerabilities, so the trick is to not disclose specific pathways to exploitation. In our case, I described the category of vulnerability found, and I avoided disclosing any information on the critical and high-risk vulnerabilities until after they had been remediated. Still, this is a tricky area.

Two competitive LMS vendors have criticized these tests as a marketing ploy that could be dangerous. In their opinion, student and client data is best protected by keeping the testing process out of the public domain. I cannot speak for Instructure’s motivations regarding marketing, but I did want to share these criticisms.

We are now in the fourth year of Instructure providing transparent security audits, and I would note the following:

  • The act of publicizing the results has not in fact enabled hackers to exploit the security vulnerabilities identified.
  • While I am sure there is marketing value to this process, I would argue that the primary benefits have been enhanced security of the product and, more importantly, better information for the institutions evaluating or already using Canvas.

I repeat my call for more ed tech vendors to follow this type of process. I would love to cover similar stories.

The post Instructure Releases 4th Security Audit, With a Crowd-sourcing Twist appeared first on e-Literate.

Babson Study of Online Learning Released

Wed, 2015-02-04 23:52

By Phil Hill | More Posts (285)

Babson Survey Research Group (BSRG) just released its annual survey of online learning in US higher education (press release here). This year they have moved from use of survey methodology for the online enrollment section to use of IPEDS distance education data. Russ Poulin from WCET and I provided commentary on the two data sources as an appendix to the study.

The report highlights the significant drop in growth of online education in the US (which I covered previously in this e-Literate post). Some of the key findings:

  • Previous reports in this series noted the proportion of institutions that believe that online education is a critical component of their long-term strategy has shown small but steady increases for a decade, followed by a retreat in 2013.
  • After years of a consistently growing majority of chief academic officers rating the learning outcomes for online education “as good as or better” than those for face-to-face instruction, the pattern reversed itself last year.
  • This report series has used its own data to chronicle the continued increases in the number of students taking at least one online course. Online enrollments have increased at rates far in excess of those of overall higher education. The pattern, however, has been one of decreasing growth rates over time. This year marks the first use of IPEDS data to examine this trend.
  • While the number of students taking distance courses has grown by the millions over the past decade, it has not come without considerable concerns. Faculty acceptance has lagged, concerns about student retention linger, and leaders continue to worry that online courses require more faculty effort than face-to-face instruction.

BSRG looked at the low growth (which I characterized as ‘no discernible growth’ due to noise in the data) and broke down trends by sector.

[Chart: Growth by sector]

The report also found that more institutions are viewing online education as ‘critical to the long term strategy of my institution’.

[Chart: Strategic online]

 

There’s lots of good data and analysis available – read the whole report here.

I’ll write more about the critique of data sources that Russ and I provided in the next few days.

From the report: “We are especially pleased that Phil Hill and Russ Poulin have contributed their analysis of the transition issues of moving to IPEDS data. Their clear and insightful description will be of value for all who track distance education.”

I want to personally thank Jeff Seaman for the opportunity he and his team provided for us to provide this analysis.

The post Babson Study of Online Learning Released appeared first on e-Literate.

Is Standardized Testing a Pediatric Disease?

Sat, 2015-01-24 14:09

In my last post, I wrote about the tension between learning, with the emphasis on the needs and progress of individual human learners, and education, which is the system by which we try to guarantee learning to all but which we often subvert in our well-meaning but misguided attempts to measure whether we are delivering that learning. I spent a lot of time in that post exploring research by Gallup regarding the workplace performance of adults, various dimensions of personal wellbeing, and the links of both to each other and to college experiences. One of Gallup’s findings was that workers who are disengaged from their work are less healthy. They are more likely to get clinically depressed, more likely to get heart conditions, and more likely to die young. I then made a connection between disengaged adults and disengaged students. What I left implicit was that if being disengaged as an adult is bad for one’s health, it stands to reason that being disengaged as a child is also bad for one’s health. We could be literally making our children sick with schooling.

I am in the midst of reading Anya Kamenetz’s new book The Test. It has convinced me that I need to take some time making the connection explicit.

In that previous post, I wrote,

Also, people who love their jobs are more likely to both stay working longer and live longer. In a study George Gallup conducted in the 1950s,

…men who lived to see 95 did not retire until they were 80 years old on average. Even more remarkable, 93% of these men reported getting a great deal of satisfaction out of the work they did, and 86% reported having fun doing their job.

Conversely, in a 2008 study the company found a link between employee disengagement and depression:

We measured their engagement levels and asked them if they had ever been diagnosed with depression. We excluded those who reported that they had been diagnosed with depression from our analysis. When we contacted the remaining panel members in 2009, we again asked them if they had been diagnosed with depression in the last year. It turned out that 5% of our panel members (who had no diagnosis of depression in 2008) had been newly diagnosed with depression. Further, those who were actively disengaged in their careers in 2008 were nearly twice as likely to be diagnosed with depression over the next year. While there are many factors that contribute to depression, being disengaged at work appears to be a leading indicator of a subsequent clinical diagnosis of depression.

Which is obviously bad for employer and employee alike.

In some cases, Gallup went all in with physiological studies. For example, they “recruited 168 employees and studied their engagement, heart rate, stress levels, and various emotions throughout the day,” using heart rate monitors, saliva samples, and handheld devices that surveyed employees on their activities and feelings of the moment at various points in the day.

After reviewing all of these data, it was clear that when people who are engaged in their jobs show up for work, they are having an entirely different experience than those who are disengaged. [Emphasis in original.] For those who were engaged, happiness and interest throughout the day were significantly higher. Conversely, stress levels were substantially higher for those who were disengaged. Perhaps most strikingly, disengaged workers’ stress levels decreased and their happiness increased toward the end of the workday….[P]eople with low engagement…are simply waiting for the workday to end.

From here, the authors go on to talk about depression and heart attacks and all that bad stuff that happens to you when you hate that job. But there was one other striking passage at the beginning of this section:

Think back to when you were in school sitting through a class in which you had very little interest. Perhaps your eyes were fixed on the clock or you were staring blankly into space. You probably remember the anticipation of waiting for the bell to ring so you could get up from your desk and move on to whatever was next. More than two-thirds of workers around the world experience a similar feeling by the end of a typical workday.

I then went on to a point about preparing students to be engaged workers, but it’s worth pausing here and thinking for a moment. Schooling is the model, the archetype, for the workplace experience that literally causes people to lead shorter, sadder, sicker lives. Is that hyperbole? Is it a caricature of modern schooling? Actually, thanks to the current American obsession with standardized testing, the stereotype may understate the case.

In The Test, Kamenetz quotes the blog of a Chicago parent who had assisted her daughter’s class with computer-based testing. On the way home from the second day (?!) of testing, her daughter broke down in the car:

“I just can’t do this,” she sobbed. The ill-fitting headsets, the hard-to-hear instructions, the uncooperative mouse, the screen going to command modes, not being able to get clarification when she asked for it….It took just two days of standardized testing to doubt herself. “I’m just not smart, Mom. Not like everyone else. I’m just no good at kindergarten, just no good at all.”

I have read this paragraph a half dozen times now, and I still can’t get through it without tearing up.

Kamenetz then goes on to say that teachers and parents throughout the United States—especially the ones with elementary school-aged children—“report students throwing up, staying home with stomach aches, locking themselves in the bathroom, crying, having nightmares, and otherwise acting out on test days.”

A bit later in the book, she writes about a couple of Great Depression-era researchers named Harold Skeels and Harold Dye. They took a couple of one-year-old babies in an orphanage who had tested as “moderately to severely retarded” and moved them to a ward for mentally disabled young women, because the children were viewed as hopeless cases. Fourteen and sixteen months old, these girls were already discarded. But what happened next was anything but what the researchers expected. The girls were informally adopted by the residents and attendants of the ward. Kamenetz notes, “After just six months their IQ scores had improved to 77 and 87, and a few months after that their scores had climbed into the mid-90s, near average levels.”

The researchers were so taken aback that they repeated the experiment, bringing 13 “retarded” one- and two-year-old girls from orphanages to the adult women’s institution, where they were given foster mothers.

According to an article discussing the case, the toddlers at the adult women’s home had toys bought for them by the attendants and clothes made for them by the residents. Their “mothers” cheerfully competed over which ones could be made to walk and talk first.

Meanwhile, a control group of supposedly low-IQ girls stayed at the orphanage, presumably living under the conditions one imagines in the kind of orphanage that would let some of its children be condemned to live out their lives in a mental institution when they were just 14 months old. What were the results?

The children [who were transferred to the mental institution] remained on the ward for a mean of nineteen months. All but two of the eleven gained more than 15 IQ points during that time. Once they tested at average intelligence they were moved to regular foster homes. A year after the experiment ended, of the thirteen original children, none was still classified as “feeble-minded.” At the first follow-up two and a half years later, in 1943, the mean IQ of the experimental group was exactly average, 101.4. Meanwhile the control group left at the orphanage had shown “marked deterioration” and now had an average IQ of 66.1, down from 86 at the beginning of the study.

Staying in the orphanage was actually more harmful to the young girls than putting them in an adult mental institution. This was not a short-term difference, either. In the 1960s, the researchers followed up with the girls from the original study.

Of the thirteen girls who had been adopted, first informally by developmentally disabled women[1] in the institution and then by families in the outside world, all of them were self-supporting. Eleven of them were married. They had a mean of 11.68 years of education. They earned an average wage of $4,224, which was in the range of average annual earnings for men in Iowa, their home state—not bad for a group of women from an institutional background in the 1960s.

Of the twelve girls in the control group, only four of them had jobs, all of them working in the institutions where they lived. Only three had been married. On average they had less than four years of schooling. The cost savings to the state for rescuing the girls who went on to live healthy, productive lives was approximately $200 million in today’s dollars.

Anya’s primary point for telling this story is to review the history of evidence that standardized tests are poor predictors of human potential. But the story is also a compelling illustration of the long-term harm to health and wellbeing that we do to humans when we subject them to inhumane conditions (and, on a more hopeful note, how just a little bit of human love and understanding can be so transformative in a person’s life). Note that the Gallup research shows long-term health effects for work situations that are likely a lot less stressful than those of living in a Depression-era orphanage and almost certainly not worse than the kind of stress that Chicago kindergartener endured.

As I was pondering this story, I was reminded of FDA Commissioner David Kessler. (Bear with me on this.) Kessler successfully argued that nicotine addiction is a pediatric disease based on the long-term harm that it does to children. On that basis, he was able to establish that regulating tobacco falls under the purview of the FDA and was therefore able to put a collar on the powerful tobacco industry and regulate it for the first time. Given the severe and long-term stress that American children endure today due to a testing regime that takes up to 25% of students’ total schooling time, I wonder whether similarly compelling evidence could be gathered showing that forcing students to endure endless rounds of high-stakes standardized testing has effects analogous to long-term exposure to hazardous waste.

  1. Michael’s note: Given the rest of the story that Anya is telling here, it makes one wonder how many of those women were really developmentally disabled.

The post Is Standardized Testing a Pediatric Disease? appeared first on e-Literate.

About Inside Higher Ed Selling Majority Stake

Sun, 2015-01-18 01:20

Update 1/21: See link and blurb at bottom of post from new Editor’s Note at Inside Higher Ed.

Last week the Huffington Post ran an article by David Halperin breaking the news that the private equity firm Quad Partners had acquired a controlling interest in Inside Higher Ed.

Quad Partners, a New York private equity firm that is invested heavily in the for-profit college industry, and whose founder has aggressively opposed regulation of that troubled industry, has acquired a controlling stake in the respected trade publication Inside Higher Ed (IHE), which often reports on for-profit colleges and the policy disputes surrounding them. There has been no public announcement, but the Quad Partners website now lists Inside Higher Ed as one of its investments, among a range of education-related companies, including for-profit trade schools Beckfield College, Blue Cliff College, Dorsey Schools, Pacific College of Oriental Medicine, and Marinello Schools of Beauty.

Doug Lederman, one of IHE’s two top editors, confirmed to me that Quad purchased a majority interest in IHE in November.

Quad Partner James Tieng is now an IHE board member. Quad also owns the influential college admissions management company Noel-Levitz and other education technology companies that contract with colleges and universities — another sector that IHE covers.

The rest of the article then goes full conspiracy theory, building off the for-profit connection of both Quad Partners and its founder. Halperin seems to believe that mere indirect association with for-profits is evil and compromising in and of itself; he does not point to any actual changes or compromises in IHE coverage.

The bigger issue in my mind was described by Keith Button at Education Dive.

While the list of potential conflicts of interest in such a sale is long, the fact that the deal wasn’t announced and the potential news coverage issues weren’t publicly addressed up-front raises more questions.

This issue of disclosure was partially addressed in the original article:

“I would expect people to be watching us” in light of this purchase, says Lederman. “Our credibility is hugely important to us, and ultimately it will rise or fall on the nature and tenor of our coverage.” He says IHE will go on as before: “The proof will be in what we publish.” If there are significant references in IHE to specific Quad-owned companies, the publication will disclose the relationship.

In my mind, IHE made a serious mistake by not publicizing the acquisition back in November and issuing a blanket disclosure. I don’t fault them for selling the controlling stake in the company, especially given the lack of a paywall. But I do fault them for not realizing how the lack of disclosure created the opportunity for an advocate to publicly challenge them. It’s actually ironic to see a full-fledged advocate (Halperin writes extensively attacking the for-profit sector as part of his funding and openly calls himself an advocate) require 100% pure financial independence for IHE.

There are two types of disclosure that are relevant – a blanket disclosure announcing a key event such as the sale of the majority of company shares, proactively distributed and available; and article-specific disclosures if IHE articles reference companies tied to their owners. IHE seems to be relying on the latter, but their credibility will take a hit by not doing the former.

IHE was caught off guard by the Huffington Post article, and they seem to have quickly put up an Ownership Statement on the same day the article ran (Jan 14th).

Inside Higher Ed is an independent journalism organization. The journalistic independence is critical in ensuring the fairness and thoroughness of our higher education coverage.

Inside Higher Ed Inc. is owned by its three founders, other individual investors, and Quad Partners, a private equity firm that invests in the education space. Quad purchased a controlling share of Inside Higher Ed in November 2014 from a group of venture capital firms that invested in the company originally a decade earlier.

Owners of Inside Higher Ed stock who are not editors play no role in the editorial policies of the company.

The problem is the following:

  • This statement comes across as a reaction to Halperin – you got us – leading to the appearance that IHE had something to hide; and
  • IHE has done little to actually disclose this ownership, as the statement is only linked on the About Us page and Doug Lederman’s page (no articles or prominent placement of significant news event).

I read and research quite a bit of higher ed news and it took me a while to find this statement, despite the fact that I was specifically looking for information. With the current placement, very few people would have seen it.

This news is relevant, more for the Quad Partners ownership of Noel-Levitz than for their ownership of Marinello Schools of Beauty. Higher ed enrollment in the US has been declining the past 2 years, and this change is shaping up to be one of the biggest drivers of change initiatives for institutions and associated markets. There might be no other organization more influential on enrollment management than Noel-Levitz. In the past 12 months Inside Higher Ed has written eight articles where Noel-Levitz plays an important role, and this prominent Fortune article profiling the company states:

Noel-Levitz might be the most influential force in higher education pricing that you’ve never heard of, empowering what’s become a three-stage, market-distorting game for college administrators.

Readers should know about the ownership connection given the importance of enrollment management and college pricing, and readers should not have to find this if and only if they read an article with direct references.

Do I believe that Quad Partners has or will change IHE coverage, especially on enrollment management and pricing? No. In my experience, IHE’s leadership and the reporters I’ve dealt with have been very ethical and honest. Furthermore:

Lederman says that at the insistence of IHE, the purchase agreement includes a clause that precludes Quad Partners from any involvement in editorial operations. IHE was launched by Lederman and two co-founders in 2004, with a modest investment from three Washington DC-area venture funds, including the owners of the lead generation company Double Positive. Those three investors, who sold their shares to Quad in November, also had no role in editorial operations, says Lederman.

IHE does a great job covering important stories in higher ed, including a watchdog role of exposing problems that arise. We need them to be trusted, and they should quickly correct the mistake. My unsolicited advice:

  • Write an article disclosing the sale and linking to the Ownership Statement – don’t make this information hard to find;
  • Quote a portion of the purchase agreement clause in the article to clarify their statement of editorial independence; and
  • Create a separate page of editorial policies.

Update 1/19: In a separate Education Dive post from the weekend:

A top editor of Inside Higher Ed said Friday that, in hindsight, he wished there had been more transparency about the sale of the publication’s controlling interest to a private equity firm that has invested heavily in for-profit education.

“We were founded without any support, then we had one set of investors and we had never said anything about them,” Scott Jaschik, an Inside Higher Ed founder and editor, told Education Dive. “In hindsight, I wish we had, because clearly this is of interest to people.” [snip]

“I guess I would just say to anyone who has questions, read us and read our coverage and call me if you think we’re doing anything that we shouldn’t,” he said.

Excellent work by Education Dive, by the way. As for IHE, I still think they would benefit from a blanket disclosure.

Update 1/21: Inside Higher Ed has now posted a full blanket disclosure note. Good for them.

Some of you may have seen some recent blog posts and discussion on Twitter or elsewhere about Inside Higher Ed Inc.’s ownership status. We wanted you to have more information directly from us. [snip]

In November 2014, Quad Partners, a private equity firm that invests in numerous companies in the education space, including some small for-profit colleges, bought a controlling interest in our company by purchasing shares of Inside Higher Ed Inc.’s stock from our previous investors.

Quad intends to help Inside Higher Ed expand its staff, extend its reach, and improve its coverage and services. Its goal is to help Inside Higher Ed do what it does better. And yes, like all investors, it wants to make money.

Owners of Inside Higher Ed Inc. stock who are not editors play no role in the editorial policies of the company. Quad acknowledged explicitly in its agreement to invest in Inside Higher Ed Inc. that it would be precluded from any involvement in editorial operations.

The post About Inside Higher Ed Selling Majority Stake appeared first on e-Literate.