Michael Feldstein

What We Are Learning About Online Learning...Online

Blackboard’s Messaging Problems

Fri, 2015-07-31 15:07

By Michael Feldstein

There are a lot of things that are hard to evaluate from the outside when gauging how a company is doing under new management in the midst of a turnaround with big new products coming out.

For example, how good is Ultra, Blackboard’s new user experience? (At least, I think the user experience is what they mean by “Ultra.” Most of the time.) We can look at it from the outside and play around with it for a bit, but the best way to judge it is to talk to a lot of folks who have spent time living with it and delivering courses in it. There aren’t that many of those at the moment. Blackboard has offered to put us in touch with some of them, and we will let you know what we learn after we talk to them.

How likely is Blackboard to deliver the promised functionality on their Ultra to-do list to customers on schedule (or at all)? Since this is a big initiative and the company doesn’t have much of a track record with it, it’s hard to tell in advance of them actually releasing software. We’ll watch and report on it as it comes out.

How committed is Blackboard to self-hosted customers on the current platform? We have their word, and logical reasons to believe they mean it when they say they want to support those customers. But we have to talk to a bunch of customers to find out what they think of the support they are getting, and even then, we only know about Blackboard’s current execution, which is not the same as their future commitment.

So there are a lot of critical aspects of the company that are just hard and time-consuming to evaluate and will have to wait on more data.

But not everything is hard to evaluate. Communication, for example, is pretty easy to judge. Last year I mocked Jay Bhatt pretty soundly for his keynote. (Of course, we have hit D2L a lot harder for their communication issues because theirs have been a lot worse.) In some ways, it is so easy to critique communication that we have to be careful not to just take cheap shots. Everybody loves to mock vendors in general and LMS vendors in particular. We’re mainly interested in communications problems that genuinely threaten to hurt their relationship with their customers. Blackboard does have serious customer communication problems at the moment, and they do matter. I’m going to hit on a few of them.

Keynote Hits Sour Notes

Since I critiqued last year’s keynote, an update in that department is as good a place to start as any. It’s sort of emblematic of the problem. This year’s keynote was better than last year’s, but that doesn’t mean it was good. Of the two-hour presentation, only the last twenty minutes or so directly addressed the software. The rest was about values and process.

I get why the company is doing this. As I said in last year’s review, they are nothing if not earnest. So, for example, when Jay Bhatt says that we need to start a “revolution” in education and that Blackboard is inviting “you”—presumably the educators in the room—to join them, it doesn’t carry the sinister tone of the slick Sillycon Valley startup CEO talking about “disrupting” education (by which they generally mean replacing dumb, mean, unionized bad-people teachers with slick, nice, happy-faced software). Jay comes across as a dad and a former teacher who honestly cares about education and wants very much to do his part to improve it.

But his pitch is tone deaf. No matter how earnest you are, you can’t take center stage as the CEO of a software company that has a long and infamous reputation for disregarding customers and making education worse rather than better and then, giant face projected on the jumbotron and simulcast on the web, convince people that you are just a dad who wants to make education better. It doesn’t work. It’s not going to win over skeptical customers, never mind skeptical prospective customers. No matter how much you sincerely mean it. No matter how much it is said with the best of intentions.

You also can’t spend the first 90+ minutes of the keynote talking about process and then get around to admitting that your revolutionary software is a year late. Phil and I both give Jay and Blackboard tons of credit for being forthright about the delay in the keynote, and for generally showing a kind of honesty and openness that we don’t see very often from big ed tech vendors. Really, it’s rare, it’s important, and it deserves more credit than it will probably be given by a lot of people. But owning up to your delivery problems in the last 10 minutes of a two-hour keynote, most of which was not spent talking about the stuff that customers most immediately care about, will not have the intended effect on the audience. The reason Blackboard went through that first 90 minutes is that they really, really want to tell you, with all their hearts, that “Gee whiz, gang, we really do care and we really are trying super-hard to create something that will make students’ lives better.” But if the punchline, after 90+ minutes, is “…and…uh…we know we told you we’d have it done a year ago, but honestly, we mean it, we’re still working on it,” you will not win converts.

The one thing I did like very much, besides the honesty about missing their delivery dates, was the day-in-the-life walk-throughs of the software. They very compactly and effectively conveyed the quality of thought and concern for the student that the first 90 minutes of process talk did not. If you want to convince me that you really care about students, then don’t talk to me about how much you really care about the students. Show me what you have learned from them. Because talk is cheap. I won’t believe that you really care about students in a way that affects what you do in your business until you show me that you have developed a clear and actionable understanding of what students need and want and care about. That is what the walk-throughs accomplished (although they would have been even more effective with just a smidge less “golly gee” enthusiasm).

There’s one simple thing Blackboard could do that would vastly improve their keynotes and make a host of rhetorical sins more forgivable: They could bring back Ray Henderson’s annual report card. Every year, Ray would start the keynote by laying out last year’s publicly declared goals, providing evidence of progress (or not) toward those goals—quantitative evidence, whenever possible—and setting the goals for the new year. This set the right tone for the whole conference. “I made you some promises. Here’s how I did on those promises. Here’s what I’m going to do better this year. And here are some new promises.” As a customer, I will hear whatever else you have to say to me next much more charitably if you do that first.

For example, Phil and I have heard a number of customers express dissatisfaction with the length of time it takes to fix bugs. At a time when Blackboard is trying to convince self-hosted customers that they will not be abandoned, it is particularly important not to let this get out of hand, because every customer who has an unaddressed bug will be tempted to read it as evidence that the company is secretly abandoning 9.1 and just lying about it. But if Blackboard leadership got up on stage—as they used to—and said, “Here’s the number of new bugs we had in the past year, here’s the average length of time that P1s go unaddressed, here’s the trend line on it, here’s our explanation of why that trend line is what it is, and here’s our promise that we will give you an update on this next year, even if it looks bad for us,” then customers are going to be much more likely to give the company the benefit of the doubt. If you’ve addressed my concerns as a customer and said your “mea culpas” first, then I’m going to be more inclined to believe that anything else you want to tell me is truthful and meant for my benefit.

What Is Ultra and What Does It Mean For Me?

[Image: Ultra Man]

Another problem Blackboard has is that it is very hard to understand what they mean by “Ultra.” Sometimes they mean an architecture that enables a new user experience. Sometimes they mean the new user experience that may or may not require the architecture. And at no time do they fully clarify what it means for hosting.

Here’s a webinar from last December that provides a pretty representative picture of what Blackboard’s Ultra talk is like:

Most of the Ultra talk is about the user experience. So it makes sense to infer that Ultra is a new user experience which, for those with any significant experience with Blackboard or many of the other LMS providers, would suggest a new skin (or “lipstick on a pig,” as Phil recently put it). And yet, Ultra doesn’t run on the self-hosted version of Blackboard. Why is that? A cynical person would say (and cynical customers have said) that Blackboard is just trying to push people off of self-hosting. No, says Blackboard, not at all. Actually, the reason we can’t do self-hosted Ultra is because Ultra requires the new cloud architecture, which you can’t self-host.

Except for Ultra on mobile. You can experience Ultra on mobile today, even if you are running self-hosted 9.1.


OK, so if I want to run Ultra, I can’t run it self-hosted (except for mobile, which is fine). What if I’m on managed hosting? Here’s the slide from that webinar:


There you go. Clear as mud. What is “Premium SaaS”? Is it managed hosting? Is it private cloud? What does it mean for current managed hosting customers? What we have found is that there doesn’t seem to be complete shared understanding even among the Blackboard management team about what the answers to these questions are. Based on what Phil and I have been able to glean about the true state of affairs, here’s how I would explain the situation if I were a Blackboard executive:

  • Ultra is Blackboard’s new product philosophy and user interface. Rather than just sticking in a new tab or drop-down menu and a new bill from a new sales team every time we add new capabilities, we’re trying to design these capabilities into the core product experience in ways that fit with how customers would naturally use them. So rather than thinking about separate products living in separate places—like Collaborate, Community, Analytics, and Content, for example—you can think about synchronous collaboration, non-course groups, student progress tracking, and content sharing naturally when and where you need those capabilities in your daily academic life.

  • Blackboard Learn Cloud [Note: This is my made-up name, not Blackboard’s official product name] is the new architecture that makes Ultra possible for Learn. It also enables you to gain all of the benefits of being in the cloud, like being super-reliable and less expensive. But with regard to Ultra, we can’t create that nifty integrated experience without adding some new technical infrastructure. Learn Cloud enables us to do that. Update: Ultra is still a work in progress and may not be appropriate for all professors and all courses in its current form. Luckily, Learn Cloud also runs the traditional Learn user experience that is available on Learn Enterprise. So you can run Learn Cloud now without impacting your faculty and have them switch over to the Ultra experience—on the same platform—whenever they are ready for it and it is ready for them.

  • Blackboard Learn Enterprise [another Feldstein-invented name] is the classic architecture for Learn, currently on version 9.1. We think that a significant number of customers, both in the US and abroad, will continue to want to use the current architecture for a long time to come, in part because they want or need to self-host. We are committed to actively developing Learn Enterprise for as long as a significant number of customers want to use it. Our published road maps go out two years, but that doesn’t mean we only plan to develop it for another two years. It just means that it’s silly to create technology road maps that are longer than two years, given how much technology changes. Because Learn Enterprise shares a lot of code with Learn Cloud, we actually can afford to continue supporting both as long as customers are buying both in numbers. So we really do mean it when we say we plan to keep supporting Enterprise for the foreseeable future. We will also bring as much of the Ultra experience to Enterprise as the technology allows. That won’t be all or most, but it will be some. The product will continue moving forward and continue to benefit from our best thinking.

  • Self-hosted Learn Cloud isn’t going to happen any time soon, which means that self-hosted Ultra isn’t going to happen any time soon. It is possible that the technologies that we are using for Learn Cloud will mature enough in the future that we will be able to provide you with a self-hosted version that we feel confident we can support. (This is a good example of why it is silly to create technology road maps that are more than two years long. Who knows what the Wizards of the Cloud will accomplish in two years?) But don’t hold your breath. For now and the foreseeable future, if you self-host, you will use Learn Enterprise, and we will keep supporting and actively developing it for you.

  • Mobile is a special case because a lot of the functionality of the mobile app has lived in the cloud from Day 1 (unlike Learn Enterprise). So we can deliver the Ultra experience to your mobile apps even if you are running Learn Enterprise at home.

  • Managed hosting customers cannot run Ultra on Learn for the same reason that self-hosted customers cannot: They are currently using Learn Enterprise. They can continue to use Learn Enterprise on managed hosting for as long as they want, as long as they don’t need Ultra. We will, eventually, offer Learn Private Cloud [yet another Feldstein-invented name]. Just as it sounds, this will be a private, Blackboard-hosted instance of Learn Cloud. Managed hosting clients are welcome to switch to Learn Private Cloud when it becomes available, but it is not the same as managed hosting and may or may not meet your needs as well as other options. Please be sure to discuss it with your representative when it becomes available. In the meantime, we’ll provide you with detailed information about what would change if you moved from managed hosting of Learn Enterprise to Learn Cloud, along with detailed information about what the migration process would be like.

To be clear, I’m not 100% certain that what I’ve described above is factually correct, in part because Phil and I have heard slightly different versions of the story from different Blackboard executives. (I’m fairly sure it’s at least mostly right.) The main point is that, whatever the truth is, Blackboard needs to lay it out more clearly. Right now, they are missing easy wins because they are not communicating well.

Time will tell whether Ultra pays off. I’m actually pretty impressed with what I’ve seen so far. But no matter how good it turns out to be, Blackboard won’t start winning RFPs in real numbers until they start telling their story better.

The post Blackboard’s Messaging Problems appeared first on e-Literate.

Reuters: Blackboard up for sale, seeking up to $3 billion in auction

Tue, 2015-07-28 13:48

By Phil Hill

As I was writing a post about Blackboard’s key challenges, I got notice from Reuters (anonymous sources, so interpret accordingly) that the company is on the market, seeking up to $3 billion. From Reuters:

Blackboard Inc, a U.S. software company that provides learning tools for high school and university classrooms, is exploring a sale that it hopes could value it at as much as $3 billion, including debt, according to people familiar with the matter.

Blackboard’s majority owner, private equity firm Providence Equity Partners LLC, has hired Deutsche Bank AG and Bank of America Corp to run an auction for the company, the people said this week. [snip]

Providence took Blackboard private in 2011 for $1.64 billion and also assumed $130 million in net debt.

A pioneer in education management software, Blackboard has seen its growth slow in recent years as cheaper and faster software upstarts such as Instructure Inc have tried to encroach on its turf. Since its launch in 2011, Instructure has signed up 1,200 colleges and school districts, according to its website.

This news makes the messaging from BbWorld as well as their ability to execute on strategy, particularly delivering the new Ultra user experience across all product lines – including the core LMS – much more important. I’ll get to that subject in the next post.

This news should not be all that unexpected, as one common private equity strategy is to reorganize and clean up a company (reduce headcount, rationalize management structures, reorient the strategy) and then sell within 3–7 years. As we have covered here at e-Literate, Blackboard has gone through several rounds of layoffs, and many key employees have already left the company due to new management and restructuring plans. CEO Jay Bhatt has been consistent in his message about moving from a conglomeration of silo’d mini-companies based on past M&A to a unified company. We have also described the significant changes in strategy – both adopting open source solutions and planning to rework the entire user experience.

Also keep in mind that there is massive investment in ed tech lately, not only from venture capital but also from M&A.

Update 1: I should point out that the part of this news that is somewhat surprising is the potential sale while the Ultra strategy is incomplete. As Michael pointed out over the weekend:

Ultra is a year late: Let’s start with the obvious. The company showed off some cool demos at last year’s BbWorld, promising that the new experience would be Coming Soon to a Campus Near You. Since then, we haven’t really heard anything. So it wasn’t surprising to get confirmation that it is indeed behind schedule. What was more surprising was to see CEO Jay Bhatt state bluntly in the keynote that yes, Ultra is behind schedule because it was harder than they thought it would be. We don’t see that kind of no-spin honesty from ed tech vendors all that often.

Ultra isn’t finished yet: The product has been in use by a couple of dozen early adopter schools. (Phil and I haven’t spoken with any of the early adopters yet, but we intend to.) It will be available to all customers this summer. But Blackboard is calling it a “technical preview,” largely because there are large swathes of important functionality that have not yet been added to the Ultra experience–things like tests and groups. It’s probably fine to use it for simple (and fairly common) on-campus use cases, but there are still some open manholes here.

Update 2: I want to highlight (again) the nature of this news story. It’s from Reuters using multiple anonymous sources. While Reuters should be trustworthy, please note that the story has not yet been confirmed.

Update 3: In contact with Blackboard, I received the following statement (which does not answer any questions, but I am sharing nonetheless).

Blackboard, like many successful players in the technology industry, has become subject of sale rumors. Although we are transparent in our communications about the Blackboard business and direction when appropriate, it is our policy not to comment on rumors or speculation.

Blackboard is in an exciting industry that is generating substantial investor interest. Coming off a very successful BbWorld 2015 and a significant amount of positive customer and market momentum, potential investor interest in our company is not surprising.

We’ll update as we learn more, including if someone confirms the news outside of Reuters and their sources.


UC Davis: A look inside attempts to make large lecture classes active and personal

Mon, 2015-07-27 13:16

By Phil Hill

In my recent keynote for the Online Teaching Conference, the core argument was as follows:

While there will be (significant) unbundling around the edges, the bigger potential impact [of ed innovation] is how existing colleges and universities allow technology-enabled change to enter the mainstream of the academic mission.

Let’s look at one example. Back in December the New York Times published an article highlighting work done at the University of California at Davis to transform large lecture classes into active learning formats.

Hundreds of students fill the seats, but the lecture hall stays quiet enough for everyone to hear each cough and crumpling piece of paper. The instructor speaks from a podium for nearly the entire 80 minutes. Most students take notes. Some scan the Internet. A few doze.

In a nearby hall, an instructor, Catherine Uvarov, peppers students with questions and presses them to explain and expand on their answers. Every few minutes, she has them solve problems in small groups. Running up and down the aisles, she sticks a microphone in front of a startled face, looking for an answer. Students dare not nod off or show up without doing the reading.

Both are introductory chemistry classes at the University of California campus here in Davis, but they present a sharp contrast — the traditional and orderly but dull versus the experimental and engaging but noisy. Breaking from practices that many educators say have proved ineffectual, Dr. Uvarov’s class is part of an effort at a small but growing number of colleges to transform the way science is taught.

This article follows the same argument laid out in the Washington Post nearly three years earlier.

Science, math and engineering departments at many universities are abandoning or retooling the lecture as a style of teaching, worried that it’s driving students away. [snip]

Lecture classrooms are the big-box retailers of academia, paragons of efficiency. One professor can teach hundreds of students in a single room, trailed by a retinue of teaching assistants.

But higher-education leaders increasingly blame the format for high attrition in science and math classes. They say the lecture is a turn-off, higher education at its most passive, leading to frustration and bad grades in highly challenging disciplines.

What do these large lecture transformations look like? We got the chance in our recent e-Literate TV case study to get an inside look at the work done at UC Davis (episode 1, episode 2, episode 3), including first-person accounts from faculty members and students.

The organizing idea is to apply active learning principles such as the flipped classroom to large introductory science classes.

Phil Hill: It sounds to me like you have common learning design principles that are being implemented, but they get implemented in different ways. So, you have common things of making students accountable, having the classes much more interactive where students have to react and try to apply what they’re learning.

Chris Pagliarulo: Yeah, the main general principle here is we’re trying to get—if you want to learn something complex, which is what we try to do at an R1 university, that takes a lot of practice and feedback. Until recently, much of that was supposed to be going on at home with homework or whatnot, but it’s difficult to get feedback at home when the smart people aren’t there to help you—either your peers or your professor.

So, that’s the whole idea of the flipped classroom: come prepared with some basic understanding, and take that time when you’re all together to do the high-quality practice and get the feedback while we’re all together. Everything that we’re doing is focused on that sort of principle—getting that principle into the classroom.

Professor Mitch Singer then describes his background in the redesign.

Phil Hill: Several years ago, the iAMSTEM group started working with the biology and chemistry departments to apply some of these learning concepts in an iterative fashion.

Mitch Singer: My (hopefully) permanent assignment now, at least for the next five years, will be what we call “BIS 2A,” which is the first introductory course of biology here at UC Davis. It’s part of a series, and its primary goal is to teach fundamentals of cellular and molecular biology going from origins up to the formation of a cell. We teach all the fundamentals in this class: the stuff that’s used for future ones.

About three to four years ago, I got involved in this class to sort of help redesign it, come up with a stronger curriculum, and primarily bring in sort of hands-on, interactive learning techniques, and we’ve done a bunch of experiments and changed the course in a variety of ways. It’s still evolving over the last several years. The biggest thing that we did was add a discussion section, which is two hours long where we’ve done a lot of our piloting for this interactive, online, personalized learning (as the new way of saying things, I guess). This year (last quarter in the fall) was the first time we really tried to quote, flip part of the classroom.

That is, make the students take a little bit more responsibility for their own reading and learning; then the classic lecture is more about asking questions, trying to get them to put a and b together to come up with c. It’s sort of that process that we’d like to emphasize and get them to actually learn, and that’s what we want to test them on, not so much the facts. That’s the biggest challenge.

If you want to see the potential transformation of this academic core, it is crucial to look at the large lecture classes and how to make them more effective. The UC Davis case study highlights what is actually happening in the field, with input from real educators and students.


Blackboard Ultra and Other Product and Company Updates

Sat, 2015-07-25 08:58

By Michael Feldstein

Phil and I spent much of this past week at BbWorld trying to understand what is going on there. The fact that their next-generation Ultra user experience is a year behind is deservedly getting a lot of attention, so one of our goals going into the conference was to understand why this happened, where the development is now, and how confident we could be in the company’s development promises going forward. Blackboard, to their credit, gave us tons of access to their top executives and technical folks. Despite the impression that a casual observer might have, there is actually a ton going on at the company. I’m going to try to break down much of the major news at a high level in this post.

The News

Ultra is a year late: Let’s start with the obvious. The company showed off some cool demos at last year’s BbWorld, promising that the new experience would be Coming Soon to a Campus Near You. Since then, we haven’t really heard anything. So it wasn’t surprising to get confirmation that it is indeed behind schedule. What was more surprising was to see CEO Jay Bhatt state bluntly in the keynote that yes, Ultra is behind schedule because it was harder than they thought it would be. We don’t see that kind of no-spin honesty from ed tech vendors all that often.

Ultra isn’t finished yet: The product has been in use by a couple of dozen early adopter schools. (Phil and I haven’t spoken with any of the early adopters yet, but we intend to.) It will be available to all customers this summer. But Blackboard is calling it a “technical preview,” largely because there are large swathes of important functionality that have not yet been added to the Ultra experience–things like tests and groups. It’s probably fine to use it for simple (and fairly common) on-campus use cases, but there are still some open manholes here.


Ultra is only available in SaaS at the moment and will not be available for on-premise installations any time soon: This was a surprise both to us and to a number of Blackboard customers we spoke to. It’s available now for SaaS customers and will be available for managed hosting customers, but the company is making no promises about self-hosted. The main reason is that they have added some pretty bleeding-edge new components to the architecture that are hard to wrap up into an easily installable and maintainable bundle. The technical team believes this situation may change as the technologies they are using mature—to be clear, we’re talking about third-party technologies like server containers rather than homegrown Blackboard technologies—and that it may then become practical for schools to self-host Ultra if they still want to by that time. But don’t expect to see this happen in the next two years.

Ultra is much more than a usability makeover and much more ambitious than is commonly understood: There is a sense in the market that Ultra is Blackboard’s attempt to catch up with Instructure’s ease of use. While there is some truth to that, it would be a mistake to think of Ultra as just that. In fact, it is a very ambitious re-architecture that, for example, has the ability to capture a rich array of real-time learning analytics data. These substantial and ambitious under-the-hood changes, which Phil and I were briefed on extensively and which were also shared publicly at Blackboard’s Devcon, are the reason why Ultra is late and the reason why it can’t be locally installed at the moment. I’m not going to have room to go into the details here, but I may write more about it in a future post.

Blackboard “Classic” 9.x is continuing under active development: If you’re self-hosted, you will not be left behind. Blackboard claims that the 9.x code line will continue to be under active development for some time to come, and Phil and I found their claims to be fairly convincing. To begin with, Jay got burned at Autodesk when he tried to push customers onto a next-generation platform and they didn’t want to go. So he has a personal conviction that it’s a bad idea to try that again. But also, Blackboard gets close to a quarter of its revenue and most of its growth from international markets now, and for a variety of reasons, Ultra is not yet a good fit for those markets and probably won’t be any time soon. So self-hosted customers on Learn 9.x will likely get some love. This doesn’t mean development will be as fast as they would like; the company is pushing hard in a number of directions, and we get the definite sense that there is a strain on developer resources. But 9.x will not be abandoned or put into maintenance mode in the near future.


If you want to get a sense of what Ultra feels like, try out the Blackboard Student mobile app: The way Blackboard uses the term “Ultra” is confusing, because sometimes it means the user experience and sometimes it means the next-generation architecture for Learn. If you want to try Ultra the user experience, then play with the Student mobile app, which is in production today and which works with Learn 9.x as well as Learn Ultra. Personally, I think it represents some really solid thinking about designing for students.



Moodle may make a comeback: One of the reasons that Moodle adoption has suffered in the United States the past few years is that it has lacked an advocate with a loud voice. Moodlerooms used to be the biggest promoter of the platform, and when Blackboard acquired them, they went quiet in the US. But, as I already mentioned, the international market is hugely important for Blackboard now, and Moodle is the cornerstone of the company’s international strategy. They have been quietly investing in the platform, making significant code contributions and acquisitions. There are signs that Blackboard may unleash Moodlerooms to compete robustly in the US market again. This would entail taking the risk that Moodle, a cheaper and lower-margin product, would cannibalize their Learn business, so file this under “we’ll believe it when we see it,” but Apple has killed the taboo of self-cannibalization when the circumstances are right, and they seem like they may be right in this situation.

Collaborate Ultra is more mature than Learn Ultra but still not mature: This is another case where thinking about Ultra as a usability facelift would be hugely underestimating the ambition of what Blackboard is trying to do. The new version of Collaborate is built on a new standard called WebRTC, which enables webconferencing over naked HTML rather than through Flash or Java. This is extremely hard stuff that big companies like Google, Microsoft, and Apple are still in the process of working out right now. It is just this side of crazy for a company the size of Blackboard to try to release a collaboration product based heavily on this technology. (And the only reason it’s not on the other side of crazy is because Blackboard acquired a company that has one of the world’s foremost experts on WebRTC.) Phil and I have used Collaborate Ultra a little bit. It’s very cool but a little buggy. And, like Learn Ultra, it’s still missing some features. At the moment, the sweet spot for the app appears to be online office hours.



My Quick Take

I’m trying to restrain myself from writing a 10,000-word epic; there is just a ton to say here. I’ll give a high-level framework here and come back to some aspects in later posts. Bottom line: If you think that Ultra is all about playing catch-up with Instructure on usability, then the company’s late delivery, functionality gaps, and weird restrictions on where the product can and cannot be run look pretty terrible. But that’s probably not the right way to think about Ultra. The best analogy I can come up with is Apple’s Mac OS X. In both cases, we have a company that is trying to bring a large installed base of customers onto a substantially new architecture and new user experience without sending them running for the hills (or the competitors). This is a really hard challenge. Hardcore OS X early adopters will remember that 10.0 was essentially an unusable technology preview, 10.1 was usable but painful, 10.2 was starting to feel pretty good, and 10.3 was when we really began to see why the new world was going to be so much better than the old one. If I am right, Ultra will go through the same sort of evolution. I don’t know that these stages will each be a year long; I suspect that they may be shorter than that. But right now we are probably partway through the 10.0 era for Ultra. As I mentioned earlier in the post, Phil and I still need to talk to some Ultra customers to get a sense of real usage and, of course, since it will be generally available to SaaS customers for use in the fall semester, we’ll have more folks to talk to soon. We will be watching closely to see how big the gaps are and how quickly they are filled. For example, how long will it take Blackboard to get to the items labeled as “In Development” on their slides? Does that mean in a few months? More? And what about the “Research” column? 
Based on these slides and our conversations, I think the best case scenario is that we reach the 10.2 era—where the platform is reasonably feature-complete, usable, and feeling pretty good overall—by BbWorld 2016, with some 10.3-type new and strongly differentiating features starting to creep into the picture. Or they could fall flat and utterly fail to deliver. Or something in between. I’m pretty excited by the scope of the company’s ambition and am willing to cut them some slack, partly because they persuaded me that what they are trying to do is pretty big and partly because they persuaded me that they probably know what they are doing. But they have had their Mulligan. As the saying goes (when properly remembered), the proof of the pudding is in the eating. We’ll see what they deliver to customers in the next 6-12 months.

Watch this space.

The post Blackboard Ultra and Other Product and Company Updates appeared first on e-Literate.

Giving D2L Credit Where Credit Is Due

Thu, 2015-07-23 21:20

By Phil Hill

Michael and I have made several specific criticisms of D2L’s marketing claims lately culminating in this blog post about examples based on work at the University of Wisconsin-Milwaukee (UWM) and California State University at Long Beach (CSULB).

I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether public research or internal research – and essentially take credit for their technology “delivering results”.

This week brought welcome updates from D2L that go a long way towards addressing the issues we raised. As of Monday, I noticed that the ‘Why Brightspace? Results’ page now has links to supporting material for each claim, and the UWM claim has been reworded. Today, D2L released a blog post explaining these changes and admitting the mistakes. D2L even changed the web page to allow text selection for copy / paste. From the blog post:

Everyone wants more from education and training programs—so it’s critical that our customers are part of the process of measurement and constant improvement.

At Fusion, our customers came together to share new ideas and practices to push education forward. They like to hear about the amazing results, like U-Pace, which we post on our website. In our excitement to share the great results our customers are seeing through their programs, we didn’t always provide the details around the results. When we make mistakes, it’s our job to fix it—as we are doing now.

U-Pace is the specific program at UWM (course redesign from large lecture to self-paced / mastery approach), and D2L now links to a documented case study and quotes this case study in the blog post.

We have a Customer Success Program in place where approvals from our clients are acquired before we post anything about them. Stories are revisited every six months to make sure that they’re still valid and accurate. However, a recent customer success story was mistakenly posted on our website without their permission or knowledge. We will be doubling down on our efforts to help ensure that this doesn’t happen again, and we will work harder to provide citations for all the facts.

This “without their permission or knowledge” paragraph refers to a claim about CSULB.

Make no mistake, we’re extremely proud of what our clients are accomplishing. Our customers’ innovation, dedication, and just plain awesomeness is making a huge difference—and we’re proud to be a part of it. We will continue to measure and improve our offerings, listen to our community for suggestions, and when warranted, share their results. Here’s to them!

Kudos to D2L for these admissions and changes. Well done.

Notes and Caveats

While the overall change is very positive, I do have a few additional notes and caveats to consider.

  • The blog post today should have come from Renny Monaghan (Chief Marketing Officer) or John Baker (CEO). The blog post was written by Barry Dahl[1], and unless I misunderstand he is their lead for community engagement – building a user community that is mostly behind-login and not public-facing. The “mistakes” were made in official marketing and company communications. The leader of the department in charge of official messaging (Renny) or the company leader (John) should have taken ownership of what happened in the past and the corrections they are making.
  • In the blog post section describing the U-Pace program at the UWM, I would have included the description of moving from large lecture to self-paced / mastery approach. That change should not be embedded as one of “many factors that came together for UWM to achieve the results that they did, and that the increases in student success are not all attributed to their use of Brightspace.” That change to self-paced / mastery was the intervention, and all other factors are secondary. The case study describes the program quite well, but such an omission in the blog post is misleading.
  • The blog post only references UWM and CSULB examples, yet the ‘Why Brightspace? Results’ page added links to all claims. Changing them all was the right move.
  • Apparently, specific criticisms do not carry a CC-BY license.

These are welcome changes.

  1. For what it’s worth, Barry does great work for the company


Unizin Updates on Florida State System and Acquisition of Courseload

Wed, 2015-07-22 19:29

By Phil Hill

I’m not sure when e-Literate was awarded the exclusive rights for non-PR Unizin coverage, but there were two announcements this week to cover.

State University System of Florida Joins

The first announcement is an update and confirmation of my recent post about the new associate membership option. If a member institution (one of the 11 members paying $1.050 million) sponsors their statewide system, that system can join Unizin as “associate members” for $100 thousand per year but without a board seat and vote on product direction. This week the State University System of Florida (SUSFL) announced they are joining Unizin.

Building on its growing record of collaboration, the State University System of Florida, comprised of Florida’s 12 public universities, has joined Unizin, a group with a mission to have more control and influence over the digital learning ecosystem.

The decision helps secure Florida’s leadership in the realm of digital learning and gives access to tools under development, including a Digital Objects Repository and Learning Analytics. Florida is the first State University System to join the collaborative organization, which is a consortium of major research universities. The University of Florida is a founding member, alongside other top universities such as Pennsylvania State University, Ohio State University and the University of Michigan. The organization is a not-for-profit service operation and its membership is by invitation only.

It is not clear which of the 12 public universities beyond the University of Florida are actually planning to participate in Unizin. If you want details on the SUSFL plans and what associate membership means, go read the earlier post.

Courseload Acquisition And Content Relay

The second update is that Unizin acquired the IP, trademark, and remains of Courseload, a provider of an e-reader platform for digital textbooks. From the announcement:

Unizin announced the acquisition of the Courseload software today. Courseload includes an eText reader platform and collaborative learning tools for the delivery of digital learning materials including Open Educational Resources, faculty-authored course packs, and publisher content. The addition of Courseload is a vital component for connecting content to learners in Unizin’s digital learning ecosystem.

This move now determines the second component of Unizin, as the plan is for the acquired Courseload employees to modify and develop a portion of their software to become the basis for the Content Relay. Previously, Unizin had been planning to license or contract with another organization to provide the Content Relay.

This acquisition means that Unizin will now be in the software development business, not just integrating various products. This changes what had previously been the plan not to develop product, as Unizin co-founder and co-chairman of the board Brad Wheeler shared with me last year.

Unizin is not a Community Source effort in the way that I understand Community Source as we started applying the label 10+ years ago. Unizin is better understood, as you have reported, as a cloud-scale service operator somewhat like I2 [Internet2]. It does not plan to do lots of software development other than as needed for integrations. No biggie, just a nuanced observation from the end of the story.

When I asked Brad if this means that Unizin is ruling out product development, he replied:

Unizin is working on its roadmap for each area. If we do need to head down some development approach that is more than integration, we’ll give thought to the full range of options for best achieving that, but there is no plan to begin an open/community source effort at this time.

Courseload is based in Indianapolis, IN while Unizin is based in Austin, TX. This creates an interesting situation where a new organization will be managing a remote development team that likely outnumbers the pre-existing Unizin employees.

Common Origins

The Chronicle described the origins of Courseload in 2010.

Courseload, the e-book broker, started in 2000, when a co-founder, Mickey Levitan, a former Apple employee inspired by the company’s transformative role in the music industry, devised the idea and teamed up with a professor at Indiana University at Bloomington to try it. But the company failed to find enough takers, and it all but shut down after a brief run.

Then last year an official at Indiana, Bradley C. Wheeler, called Mr. Levitan and talked him into trying again.

Update (7/23): The following paragraph has been revised based on private communication from a source which pointed out that Crunchbase data is wrong in this case.

In 2012 the Innovate Indiana Fund, an organization that represents Indiana University’s push for economic development, joined other investment groups in helping to fund the new Courseload. The IIF investment was in the lower single digit % of the total raised. The tight relationship with IU was further described in the Innovate Indiana end-of-year 2012 report.

In 2000, Mickey Levitan and IU Professor Alan Dennis had an idea that was ahead of its time. Through Courseload, the start-up learning platform company they cofounded, the two endeavored to make college course materials accessible online.

A decade later, Indiana University became the first customer, implementing the Courseload platform across all its campuses. Now with 50 clients and 32 employees, Courseload is leading the online course text revolution—lowering costs for students and providing capabilities that can improve educational outcomes, while offering professors the discretion to use the platform on a course-by-course basis. [snip]

Levitan is grateful for the company’s broad-reaching partnership with IU. Early support from [VP of IT Brad] Wheeler was critical to the company’s success, Levitan says. “He’s a wonderful partner and an extraordinary leader—a visionary who is ready to go out and shape the world rather than be shaped by it.”

Levitan is also grateful for the company’s early and ongoing relationship with the IU Research and Technology Corporation (IURTC). Tony Armstrong, president and CEO of the IURTC, identified an early funding opportunity for Courseload through the Innovate Indiana Fund. Kenneth Green, manager of the Innovate Indiana Fund, sits on Courseload’s board of directors.

This Inside Higher Ed article from 2012 highlights the common origins of both Unizin and Courseload – in terms of founder, Internet2, and common justification. As a reminder, Unizin technically operates as part of Internet2.

In a session at the 2011 Educause conference in October, Bradley Wheeler, the chief information officer at Indiana University, issued a challenge to his colleagues. Unless universities assert their power as customers, the vendors that sell them products and services will continue squeezing those institutions for cash while dictating the terms under which they go digital.

That conversation revolved around expensive, institution-level investments such as learning-management platforms and enterprise resource planning software. Now Wheeler and his colleagues are looking to apply the same principles of “aggregated demand” to help students save money on electronic textbooks.

Internet2, a consortium of 221 colleges and universities, which last year brokered landmark deals with and Hewlett-Packard that gave its members discounts on cloud computing services, announced today that it had entered into a contract with McGraw-Hill, a major textbook publisher, aimed at creating similar discounts for students on digital course materials.

Moving Ahead

Unizin is now up to 11 full member institutions and 1 state-wide system associate member. Despite or because of the tangled paths of Unizin and Courseload, we finally have some clarity on the second component (the Content Relay) of the consortium’s services. It’s not what I would have guessed ahead of time, but I have to admit that there seems to be a willing list of schools ready to join.


Release of University of California at Davis Case Study on e-Literate TV

Sun, 2015-07-19 16:55

By Phil Hill

Today we are thrilled to release the fifth and final case study in our new e-Literate TV series on “personalized learning”. In this series, we examine how that term, which is heavily marketed but poorly defined, is implemented on the ground at a variety of colleges and universities. We plan to cap off this series with two analysis episodes looking at themes across the case studies.

We are adding three episodes from the University of California at Davis (UC Davis), a large research university with a strong emphasis in science, technology, engineering, and math (STEM) fields. The school has determined that the biggest opportunity to improve STEM education is to improve success rates in introductory science classes – the ones typically taught in large lecture format at universities of its size. Can you personalize this most impersonal of academic experiences? What opportunities and barriers do institutions face when they try to extend personalized learning approaches?

You can see all the case studies (either 2 or 3 episodes per case study) at the series link, and you can access individual episodes below.

UC Davis Case Study: Personalizing the Large Lecture Class

UC Davis Case Study: Intro to Biology and Intro to Chemistry Examples

UC Davis Case Study: Opportunities and Barriers to Extending Personalization

e-Literate TV, owned and run by MindWires Consulting, is funded in part by the Bill & Melinda Gates Foundation. When we first talked about the series with the Gates Foundation, they agreed to give us the editorial independence to report what we find, whether it is good, bad, or indifferent.

As with the previous series, we are working in collaboration with In the Telling, our partners providing the platform and video production. Their Telling Story platform allows people to choose their level of engagement, from just watching the video to accessing synchronized transcripts and accessing transmedia. We have added content directly to the timeline of each video, bringing up further references, like e-Literate blog posts or relevant scholarly articles, in context. With In The Telling’s help, we are crafting episodes that we hope will be appealing and informative to those faculty, presidents, provosts, and other important college and university stakeholders who are not ed tech junkies.

We welcome your feedback, either in comments or on Twitter using the hashtag #eLiterateTV. Enjoy!


Unizin Perspective: Personalized learning’s existence and distance education experience

Wed, 2015-07-15 18:46

By Phil Hill

By reading the Unizin pitch for the State University System of Florida shared yesterday, we can see quite a few claims about the (potential) benefits to be provided by the consortium. “Make sure that the universities were not cut out of [distance ed] process”; “Secure our foothold in the digital industry”; “Promote greater control and influence over the digital learning ecosystem”; Provide “access to the Canvas LMS at the Unizin price”; Provide “access to tools under development, including a Learning Object Repository and Learning Analytics”; Provide “potential for cooperative relationships to ‘share’ digital instruction within and across the consortium”.

I want to pick up on University of Florida provost Joe Glover’s further comment on Learning Analytics, however.

The third goal for Unizin is to acquire, create, or develop learning analytics. Some of the learning management systems have a rather primitive form of learning analytics. Unizin will build on what they have, and this will go from very mechanical types of learning analytics in terms of monitoring student progress and enabling intrusive advising and tutoring; all the way up to personalized learning, which is something that really does not exist yet but is one of the objectives of Unizin.

Personalized learning “really does not exist yet”? You can argue that personalized learning as a field is evolving and mostly in pilot programs, or that it is poorly defined and understood, or that there are not yet credible studies independently reviewing the efficacy of this family of approaches. But you cannot accurately say that personalized learning “really does not exist yet”. And is Unizin claiming that the consortium is key to making personalized learning a reality? This seemed to be one of the arguments in the pitch.

If A Tree Falls In A Different Sector . . .

There are multiple examples of personalized learning in practice, particularly at community colleges to deal with developmental math challenges. I have written about the massive investment in the emporium approach at Austin Community College’s ACCelerator Lab.

Rather than a pilot program, which I have argued plagues higher ed and prevents diffusion of innovations, Austin CC has committed to A) a big program up front (~700 students in the Fall 2014 inaugural semester and ~1,000 students in Spring 2015), yet B) they offer students the choice of traditional or emporium. To me, this offers the best of both worlds in allowing a big bet that doesn’t get caught in the “purgatory of pilots” while offering student choice.

We also shared through e-Literate TV an entire case study on Essex County College, showing their personalized learning approach.

In another e-Literate TV case study that does not focus on developmental math, we shared the personalized learning program at Empire State College, and they have been trying various personalized approaches for more than 40 years.

If A Tree Falls In A Non-Unizin Campus . . .

Personalized learning does exist, and Unizin schools could learn from the pioneers in this field. It would be wonderful if Unizin ends up helping to spread innovative teaching & learning practices within the research university community, but even there I would note that there are also some great examples in that group of schools (including at Arizona State University, UC Davis, and even at Unizin member Penn State). For that matter, the University of Florida would do well to travel two hours south and see the personalized learning programs in place at the University of Central Florida.

If this “consortium of large public universities that intends to secure its niche in the evolving digital ecosystem” means that the schools want to learn primarily among themselves, then Unizin will be falling prey to the same mistake that the large MOOC providers made – ignoring the rich history of innovation in the field and thinking they are creating something without precedent, leveraging their unique insight.

If A Tree Falls In A Distance Forest . . .

While Unizin has never claimed to be focused only on distance education, Glover does bring up the topic twice as the core of his argument.

That is a situation that we got ourselves in by not looking ahead to the future. We believe we are in a similar position with respect to distance learning at this point. [snip]

Every university in some sense runs a mom & pop operation in distance learning at this point, at least in comparison with large organizations like IBM and Pearson Learning that can bring hundreds of millions of dollars to the table. No university can afford to do that.

Let’s ignore the non sequitur about IBM for now. A few notes:

While there are larger non-profit online programs, it is not accurate to say that “every university in some sense runs a mom & pop operation”. It might be accurate based on the Unizin member institutions’ experience, however. And the University of Florida did recently sign a long-term contract with Pearson Embanet to create its UF Online program, largely based on the school’s inexperience (beyond some masters programs) with fully online education.

In the graph below, taken from the Fall 2013 IPEDS data, the Y axis is the ratio of students taking exclusively DE courses (fully online programs), and the X axis is the ratio of students taking some, but not all, DE courses (online courses within a f2f program).


We see that the University of Florida and Penn State have a fairly high percentage of students taking some online courses, that Penn State World Campus is fully online (I’m not sure whether World Campus is part of Unizin, but I included it to be safe), and that Oregon State seems to have some fully online presence. But in general, Unizin schools are not leaders in distance learning compared to other public 4-year universities. This is not a solid basis for thinking they have the answers on distance learning needs within the consortium.
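For readers who want to place their own institution on a graph like this, the two ratios are straightforward to compute from IPEDS-style enrollment counts. Here is a minimal sketch; the institution names and numbers are made up for illustration and are not actual IPEDS data:

```python
# Illustrative sketch: computing the two distance-education (DE) ratios
# discussed above from IPEDS-style enrollment counts.
# All institution names and counts below are hypothetical, NOT real IPEDS data.

def de_ratios(total, exclusively_de, some_de):
    """Return (fraction exclusively DE, fraction taking some-but-not-all DE)."""
    return exclusively_de / total, some_de / total

# (total enrollment, students in fully online programs,
#  students taking some but not all courses online)
institutions = {
    "Example State U": (40000, 2000, 12000),
    "Example World Campus": (12000, 12000, 0),
}

for name, (total, excl, some) in institutions.items():
    y, x = de_ratios(total, excl, some)  # y axis, x axis in the graph
    print(f"{name}: exclusively DE = {y:.1%}, some DE = {x:.1%}")
```

An institution like the hypothetical “Example State U” would plot low on the Y axis but well to the right on the X axis, the pattern described above for most Unizin member campuses.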

Look Outward, Not Inward

In my mind, Unizin is looking in the wrong direction. The money and focus thus far has been for the 10 (now 11) member institutions to look inward – form a club, talk amongst themselves, and figure out what should happen in a digital ecosystem. A different, more useful approach would be to look outward: get the club together and look beyond their own successes (e.g. Penn State World Campus), go visit schools that are at the forefront of digital education, invite them in to present and share, and learn from others.

What I’m suggesting is that Unizin should focus a lot more on participating in open communities and focus a lot less on forming an exclusive club. If the schools then set the consortium’s mission as leading instructional change within the member institutions, and forming a support community based on the similar profile of schools, then we might see real results.


Unizin Offering “Associate” Membership For Annual $100k Fee

Tue, 2015-07-14 16:33

By Phil Hill

Alert unnamed readers prompted me after the last post on the Unizin contract to pursue the rumored secondary method of joining for $100k. You know who you are – thanks.

While researching this question, I came across a presentation by the University of Florida provost to the State University System of Florida (SUSFL) seeking to get the system to join Unizin under these new terms. The meeting was March 19, 2015, and the video archive is here (first 15 minutes), and the slide deck is here. The key section (full transcript below):

Associate Membership FLSUS

Joe Glover: One of the things that Unizin has done – as I’ve said it consists of those 10 large research universities – is that the Unizin board decided that member institutions may nominate their system – in this case the state university system of Florida – for Associate Membership for an annual fee of $100,000 per system.

For $100,000 the entire state university system of Florida (SUSFL) could become an associate member of Unizin and enjoy all the benefits that Unizin brings forward, whether it’s reduced pricing of products that it’s licensing, or whether it’s products that Unizin actually produces. Associate Membership does not qualify for board representation, but as I mentioned you do enjoy the benefits of Unizin products and services.

This section reminded me of one item I should have highlighted in the contract. In appendix B:

The annual membership fees are waived for Founding Investors through June 30, 2017.

Does this mean that founding institutions that “invested” $1.050 million over three years will have to start paying annual fees of $100,000 starting in June 2017? That’s my assumption, but I’m checking to see what this clause means and will share at e-Literate.

Update (7/17): I talked to Amin Qazi today (CEO of Unizin) who let me know that the annual membership fee for institutional members (currently the 11 schools paying $1.050 million) has not been determined yet.

What is clear is that Unizin considers the board seat – therefore input on the future direction and operations of Unizin – to be worth $700,000.[1]

Full Transcript

The presentation is fascinating in its entirety, so I’m sharing it below. There are many points that should be analyzed, but I’ll save that for other posts and for other people to explore.

Joe Glover: I’d like to begin by explaining the problem that Unizin was created to try and avoid, and I’m going to do it by analogy with the publishing problem with scientific journals. About 30 years ago there was a plethora of publishing companies that would take the intellectual property being produced by universities in the form of journal articles, and they would print them and publish them. There was a lot of competition, prices were relatively low to do that.

Then in the ensuing 30 years there was tremendous consolidation in that industry to the point that there are only three or four major publishers of scientific articles. As a consequence they have a de facto monopoly, and they’re in the position of now taking what we produce, packaging it, and selling it back to the libraries of universities basically at whatever price they want to charge. This is a national problem. It is not a problem that is unique to Florida, and I think that every state in the nation is trying to figure out how to resolve this problem because we can’t afford to continue to pay exorbitant prices for journals.

That is a situation that we got ourselves in by not looking ahead to the future. We believe we are in a similar position with respect to distance learning at this point.

We have a plethora of universities and commercial firms, all trying to get into the digital space. Most of us believe that over the next 10 – 15 – 20 years there will be tremendous consolidation in this industry, and it is likely that there will emerge a relatively small number of players who control the digital space.

This consortium of universities wanted to make sure that the universities were not cut out of this process or this industry in much the same way that they had been cut out of scholarly publishing.

Every university in some sense runs a mom & pop operation in distance learning at this point, at least in comparison with large organizations like IBM and Pearson Learning that can bring hundreds of millions of dollars to the table. No university can afford to do that.

So a consortium of major research universities in the country, in an effort to look down the road and to avoid this problem, and to secure our foothold in the digital industry, formed a consortium called Unizin. I’m going to go briefly through this to tell you what this is, and then to lay before you an opportunity that the state university system can consider for membership in this consortium to enjoy the advantages that we expect it to bring.

Slide 1

This consortium is very new – it was launched in 2014. Its current membership is by invitation only. You cannot apply to become a member of this consortium, it is by invitation. As I mentioned, its objective is to promote greater control and influence over the digital learning ecosystem.

Its governance is fairly standard. It has a board of directors that is drawn from the founding members. It has a CEO. It has a staff and it’s acquiring more staff. As a legal entity it is a not-for-profit service operation which is hosted by Internet2.

Slide 2

Its current members include the universities that you see listed on this screen. These are 10 major universities in the nation – they’re all large research universities. There are other research universities that are considering joining. Unizin actually started out with four universities and quickly acquired the other six that are on this list.

Slide 3: Associate Membership FLSUS

The primary goals for Unizin, as defined by its board of directors, are the following. First, to acquire a learning management system that will serve as the foundation for what Unizin produces and performs. Second, to acquire or create a repository for digital learning objects. At the moment we are all producing all sorts of things – videos, short scientific clips, demonstrations, illustrations, lectures, notes – in all sorts of different formats: some retrievable, some not; some shareable, some not. None of it is indexed, and none of it is visible to me outside the University of Florida.

We believe there needs to be an indexed repository into which all of the members of Unizin can place the objects that they create to promote digital learning. And in principle there will be developed a notion of sharing of these objects. It could be free sharing, it could be licensing, it could be selling. That’s something to be discussed in the future.

The third goal for Unizin is to acquire, create, or develop learning analytics. Some of the learning management systems have a rather primitive form of learning analytics. Unizin will build on what they have, and this will go from very mechanical types of learning analytics in terms of monitoring student progress and enabling intrusive advising and tutoring; all the way up to personalized learning, which is something that really does not exist yet but is one of the objectives of Unizin.

Those are the three primary goals for Unizin. If you believe that those are three important elements of infrastructure then you are probably interested in Unizin.

I have alluded to the possibility of a club, or of sharing content. We could think about sharing content. We could think about sharing courses. We could think about sharing degree programs. That is not really Unizin’s objective at this point. I will tell you that the universities that form the board for Unizin are in conversation about that, and we expect that to be one of the things that Unizin enables us to do as we create this repository, as we develop learning analytics we expect to be able to begin to collaborate with these universities. There are a lot of interesting questions as you approach that frontier, and by no means have these been resolved, but we believe it is inevitable and important for universities to begin sharing what they do in the digital learning space, and so Unizin would form the foundation for that.

One of the things that Unizin has done – as I’ve said it consists of those 10 large research universities – is that the Unizin board decided that member institutions may nominate their system – in this case the state university system of Florida – for Associate Membership for an annual fee of $100,000 per system.

For $100,000 the entire state university system of Florida (SUSFL) could become an associate member of Unizin and enjoy all the benefits that Unizin brings forward, whether it’s reduced pricing on products that Unizin licenses or on products that Unizin actually produces. Associate Membership does not qualify for board representation, but as I mentioned you do enjoy the benefits of Unizin products and services.

Slide 4

The potential benefits to the state university system I believe are the following. Unizin has settled on Canvas as the learning management system which would underlie the Unizin projects of building a repository and learning analytics. If you did not use Canvas you would still enjoy the benefits of Unizin and their products, but the use of them would not be as seamless as if you were on Canvas. You would have to build a crosswalk from the Unizin products to whatever LMS you are using. If you happen to be using Canvas you would enjoy the benefits of the Unizin products in a seamless fashion.

Unizin has negotiated a discount with Canvas. The University of Florida had actually signed its contract with Canvas before Unizin even existed. As soon as Unizin was created and negotiated a contract with Canvas, we received a discount from the price that we had negotiated, because there were 10 large universities working on this, and there is some power in purchasing.

The second benefit, or second potential benefit which I think the system could enjoy is access to the tools which are under development as I’ve mentioned, including a digital repository and learning analytics.

Third, the system would enjoy membership in a consortium of large public universities that intends to secure its niche in the evolving digital ecosystem. As I have mentioned, we do see some potential risk as the industry consolidates, that we could be cut out of this industry if we don’t take the proper precautions.

Finally, as I’ve mentioned, there is the potential for cooperative relationships within the consortium to share digital instruction and to share digital objects and courses and degrees. That is really at the beginning conversation stage, that is not a goal of the Unizin organization itself but is a goal of the universities that underpin Unizin.

Q. I guess the real question is, tell me to what extent you can, how this will benefit each of the other universities who are not members at this time. And number two, could some of our other universities eventually become members?

A. Thank you for that question because I didn’t clarify one point that the question gives me the opportunity to clarify. Additional universities could be members of Unizin, and there are some universities in conversation with Unizin at this point. However, there is a larger charge for universities to become full board members of Unizin. University of Florida committed a million dollars over three years as part of the capitalization of Unizin. Every board member has done exactly the same. If a university in the system were interested in joining Unizin as a board member to help direct Unizin’s goals and operations, we could talk about that, but it would involve that level of investment.

At the lower level of investment, the $100,000 level which would be for the whole system – let’s say you join tomorrow – then an individual university would immediately have access to the preferred pricing for the Canvas learning management system. That would be a benefit to individual universities in the system who already are on Canvas or are considering going on Canvas. As the other tools or products are either acquired or developed by Unizin, the individual campuses would have access to those as well.

Q. I’d like to hear from John Hitt [president of UCF]. How does your university look at this proposal as it relates to online?

JH. I think the group membership for the system makes sense. I don’t think that it would make a lot of sense to have multiple institutions paying in a million bucks apiece. We would probably be interested in the $100,000 share. I doubt we would go for the full membership.

Q. Do you see the benefits they’re offering as benefiting UCF at this point, or would you use it?

JH. Yes, I think we would use some of it. We have more enthusiasm for some aspects of the membership than others. Yes, I think it would be useful.

There were no further questions, but it was apparent that some board members were not sure if they were being asked to pay $1 million for each campus or $100,000. Despite this short questioning, the motion passed as shared in the meeting minutes.

Chair Hosseini recognized Mr. Lautenbach for the Innovation and Online Committee report. Mr. Lautenbach reported the Committee heard an update from Provost Joe Glover on the Unizin Consortium and the Committee directed Chancellor Criser to work with university leadership in pursuing membership for the State University System in the consortium.

  1. The $1.050 million investment over three years, minus the alternate cost of $100,000 per year for those same three years.
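The footnote’s cost comparison works out as quick arithmetic. A sketch using the figures stated in the transcript, and assuming the “alternate cost” means the $100,000 annual associate fee paid over the same three-year window:

```python
# Cost comparison from the transcript (all figures as stated there).
# Assumption: the "alternate cost" is the $100,000/year associate fee
# paid over the same three-year window as the board-level commitment.

full_membership = 1_050_000        # board-level commitment over three years
associate_annual_fee = 100_000     # system-wide associate fee per year
years = 3

associate_total = associate_annual_fee * years      # 300,000
incremental_cost = full_membership - associate_total

print(associate_total)    # 300000
print(incremental_cost)   # 750000
```

In other words, under that reading, full board membership costs a system roughly $750,000 more over three years than associate membership would.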

The post Unizin Offering “Associate” Membership For Annual $100k Fee appeared first on e-Literate.

Instructure Is Truly Anomalous

Tue, 2015-07-14 08:54

By Michael FeldsteinMore Posts (1038)

Phil started his last post with the following:

I’m not sure which is more surprising – Instructure’s continued growth with no major hiccups or their competitors’ inability after a half-decade to understand and accept what is at its core a very simple strategy.

Personally, I vote for Door #1. As surprising as the competition’s seeming sense of denial is, Instructure’s performance is truly shocking. After five years, I continue to be surprised by it. It’s not just how well they are executing. It’s that they seem to defy the laws of physics in the LMS market. We had no reason to believe that any LMS company could rack up the numbers they are showing—in several different areas—no matter how well they execute.

Back in late 2010, I wrote a two-part series on LMS market share. For context, this was a year after Blackboard acquired ANGEL, a month before Instructure recorded its first clients on the growth graph in Phil’s previous post, six months before we wrote our first post about Instructure on e-Literate, and two years before WebCT was officially killed off. At that time, Blackboard still had dominant market share—over 50%—but it was starting to become clear for the first time that their dominance might not last forever. The posts were my attempt to figure out what might happen next. Here’s what the non-Blackboard LMS market looked like then:

Here’s what the market share looked like when the then-present trends were projected out to 2014:

What we see here is a steady decline of Blackboard’s market share getting spread out among multiple platforms. It’s worth calling out a problem with the data that we had at that time. Campus Computing, the source of the market share information in this graph, tracks market share by company, not by platform. So we had no way of knowing how much of their market share was from their Learn platform and how much of it was from WebCT. This was crucial (or, at least, it seemed crucial at the time) because Blackboard was force-migrating WebCT customers to their Learn product. The rate at which Blackboard’s market share got distributed to other platforms depended on how much of the attrition was from WebCT CE customers, how many were WebCT Vista customers, and how many were Blackboard Learn customers. The CE customers tended to be small schools with small contracts, and Blackboard wasn’t making much of an effort to keep them. To the degree that Blackboard’s losses were confined to CE going forward, the company would do just fine. On the other hand, to the degree that Blackboard lost customers from its core Learn platform, it would be a sign of impending catastrophe. LMS migrations were so hard and painful that very few schools migrated unless they felt that they absolutely had to. CE customers left Blackboard in part because it was clear that Blackboard didn’t care about them and that they therefore would never get the quality of product and service (and pricing) that they needed. Blackboard was making a real effort to keep Vista customers, but it was an open question as to whether the forced migration would cause Vista schools to look around at other options, or whether Blackboard could keep the pain of migration low enough that it would be easier to just roll over to Learn than to move to something else. 
If, on the other hand, Blackboard started losing Learn contracts, it would mean that customers on their core platform felt that the pain of staying was worse than the pain of leaving. At the time, there was strong anecdotal evidence that CE customers were leaving in droves, moderate anecdotal evidence that Vista customers were preparing to leave, and little evidence that Learn customers were leaving. My sense at the time was that Blackboard would probably lose a bunch of customers through the WebCT sunset in 2012 and then the market would more or less settle back into stasis.

That’s not what happened. To begin with, Instructure roared onto the scene in 2011 and ended up stealing the lion’s share of the market share that Blackboard was leaking. But that’s not all. Take a look at the graph Josh Coates presented at the most recent InstructureCon:

As Phil wrote,

There appear to be three periods of growth here:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~65 clients per year;
  • From May 2012 until May 2014: Average growth of ~140 clients per year;
  • From May 2014 until present: Average growth of ~190 clients per year.

So Instructure’s growth has accelerated since the end of 2012, which is the opposite of what I would have expected. Where is that growth coming from? It’s hard to tell. Unfortunately, the data we have on LMS market share is not as good as one would hope. The best indications we have right now are that they are primarily coming from former Blackboard Learn and ANGEL customers. Switching data sources from Campus Computing to Edutechnica, here’s Phil’s September 2014 analysis:

  • Blackboard’s BbLearn and ANGEL continue to lose market share in US[1] – Using the 2013 to 2014 tables (> 2000 enrollments), BbLearn has dropped from 848 to 817 institutions and ANGEL has dropped from 162 to 123. Using the revised methodology, Blackboard market share for > 800 enrollments now stands at 33.5% of institutions and 43.5% of total enrollments.
  • Moodle, D2L, and Sakai have no changes in US – Using the 2013 to 2014 tables (> 2000 enrollments), D2L has added only 2 schools, Moodle none, and Sakai 2 schools.
  • Canvas is the fastest growing LMS and has overtaken D2L – Using the 2013 to 2014 tables (> 2000 enrollments), Canvas grew ~40% in one year (from 166 to 232 institutions). For the first time, Canvas appears to have larger US market share than D2L (13.7% to 12.2% of total enrollments using table above).

But even if you assume that Instructure picked up 100% of the Learn and ANGEL customers—which is plausible, given these numbers—that’s still only 70 new customers. That’s half the ~140 new customers that Instructure is reporting. Could the rest be international? Maybe, although we have little reason to believe that to be the case. In the Edutechnica post that Phil references for the market share information, George Kroner does provide a little bit of information about Instructure’s international growth in the form of a graph of LMS market share in a few different countries:

We would need to see fully 50% of Instructure’s growth reflected in non-US markets to make the numbers square. We don’t see anything like that here. Of course, there are many other non-US markets. Maybe Canvas is all the rage in Turkmenistan. But it’s hard to square the circle. I just don’t know how to account for the company’s growth. I don’t doubt Instructure’s numbers. It’s just that there’s no way I can find to make sense of them with our current data about the market.
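The back-of-envelope check above can be made explicit. A sketch using the Edutechnica institution counts quoted earlier (the labels are mine):

```python
# Sanity check on the growth gap, using the Edutechnica 2013 -> 2014
# institution counts (> 2000 enrollments) quoted above.

learn_losses = 848 - 817      # Blackboard Learn institutions lost: 31
angel_losses = 162 - 123      # ANGEL institutions lost: 39
max_pickup = learn_losses + angel_losses    # 70, if Canvas took them all

reported_new_clients = 140    # Instructure's ~140 new clients/year in that period
unexplained = reported_new_clients - max_pickup

print(max_pickup)     # 70
print(unexplained)    # 70: half the reported growth is unaccounted for
```

Even under the most generous assumption, half of Instructure’s reported annual growth has no visible source in the US higher ed data.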

Beyond the numerical mystery, there seems to have been a change in market attitudes about LMS migration. Schools seem to be willing to look at alternatives even when they don’t have to. Nobody likes to migrate, of course, but a variety of factors, ranging from improved standards that make moving content easier to more technology maturity and experience among university faculty and staff, have reduced vendor lock-in. It’s a more fluid market now. I had hoped that would be the case someday but, in my heart of hearts, I really didn’t expect it. And at the moment, pretty much all of that new fluidity is flowing into Instructure—at least in US higher education.

Overall, Instructure’s growth is hard to explain. But there’s also another number that I can’t account for. I am in the process of writing an update to my post on the Glassdoor ratings of ed tech companies. At the moment, Instructure’s rating is 4.7. Out of 5. For reference, LinkedIn, which I used as context in last year’s post because it had one of the highest employee ratings on Glassdoor, currently rates only a 4.5. I have been to both Instructure’s and LinkedIn’s offices. LinkedIn’s is nicer. A lot nicer. I’m sure that their salaries are a lot higher as well. Instructure may be buoyed at the moment by the likelihood that they will have an IPO in the next year or two. But still. Instructure may currently be the highest-rated company on Glassdoor, not just in ed tech but across all companies.

Also weird is the fact that we don’t hear any major complaints about them from anywhere. People tell us stuff. Customers, former employees, and current employees come to us often to dish dirt. What we end up publishing is only the tip of the iceberg because we don’t publish anything unless we feel we have strong confirmation (which usually means multiple sources), we can protect our sources by preserving their anonymity, we believe the information is truly newsworthy, and so on. We hear a lot of dirt. But we hear very little about Instructure. When we poke around, we can get people to tell us things that they’re not happy with, but it’s all normal stuff—I really wish they had this feature, that feature doesn’t work as well as it could, the sales rep was a little annoying or a little unresponsive, and so on. And almost always, the person reporting the problem takes pains to tell us that he or she is generally happy with the company. As Phil wrote,

Companies change as they grow, and I have covered when the company lost both founders and a high-profile CTO. The company moves on, however, and I cannot find customers complaining (at least yet) that the company has changed and is ticking them off. They do have customer challenges, but so far these have been manageable challenges.

Pop quiz: Name the highest profile customer disaster (outage during exams or first week, broken implementation, major bugs, etc) for Canvas.

It’s not normal. And it can’t last forever. Sooner or later, gravity will assert itself and the company will start screwing up. They all do, eventually. But right now, Instructure’s performance is so good by multiple measures that it is almost literally unbelievable.

The post Instructure Is Truly Anomalous appeared first on e-Literate.

Instructure: Accelerating growth in 3 parallel markets

Mon, 2015-07-13 18:50

By Phil HillMore Posts (345)

I’m not sure which is more surprising – Instructure’s continued growth with no major hiccups or their competitors’ inability after a half-decade to understand and accept what is at its core a very simple strategy. Despite Canvas LMS winning far more new higher ed and K-12 customers than any other vendor, I still hear competitors claim that schools select Canvas due to rigged RFPs or being the shiny new tool despite having no depth or substance. When listening to the market (institutions – including faculty, students, IT staff, academic technology staff, and admin), however, I hear the opposite. Canvas is winning LMS selections despite, not because of, RFP processes, and there are material and substantive reasons for this success.

The only competitor I see that seems to understand the depth of the challenge they face is Blackboard. Other LMS solutions are adding “cloud” options or making incremental improvements to usability, but only Blackboard is going for wholesale changes to both its User Experience (UX) and cloud hosting architecture. Unfortunately, I question whether Blackboard will be able to execute this strategy, but that is a story for another post.

Like last year’s post about InstructureCon, I believe that the company growth chart[1] gives a lot more information than just “gosh, we’re doing well”.

InstructureCon 2015 Growth Slide

Education Market Growth – Canvas

The use of Canvas in higher ed (shown in blue above) has grown steadily, but not exponentially, since the product introduction more than 4 years ago. There appear to be three periods of growth here:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~65 clients per year;
  • From May 2012 until May 2014: Average growth of ~140 clients per year;
  • From May 2014 until present: Average growth of ~190 clients per year.

The use of Canvas in K-12 (shown in red above) has grown much faster; in fact Instructure has more K-12 clients than higher ed clients and more salespeople in K-12 than in higher ed. Let that sink in for a moment – it is a point that is not well understood by the market. Over the same three periods:

  • From introduction (roughly Jan 2011) until May 2012: Average growth of ~20 clients per year (much lower than higher ed);
  • From May 2012 until May 2014: Average growth of ~135 clients per year (almost the same as higher ed);
  • From May 2014 until present: Average growth of ~340 clients per year (far exceeds higher ed).
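The per-period averages above are simple slopes: clients gained divided by elapsed years. A sketch of that calculation for higher ed (the cumulative client counts below are hypothetical placeholders, chosen only to be roughly consistent with the stated rates, not Instructure’s actual figures):

```python
from datetime import date

def avg_growth_per_year(start, end, clients_start, clients_end):
    """Average clients gained per year between two dates."""
    years = (end - start).days / 365.25
    return (clients_end - clients_start) / years

# Hypothetical cumulative higher-ed client counts at the period boundaries.
periods = [
    (date(2011, 1, 1), date(2012, 5, 1), 0, 86),     # ~65 clients/year
    (date(2012, 5, 1), date(2014, 5, 1), 86, 366),   # ~140 clients/year
    (date(2014, 5, 1), date(2015, 7, 1), 366, 588),  # ~190 clients/year
]

for start, end, c0, c1 in periods:
    print(round(avg_growth_per_year(start, end, c0, c1)))  # 65, 140, 190
```

The same slope calculation applied to the red K-12 line yields the ~20, ~135, and ~340 clients/year figures above.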

It should be noted, however, that K-12 clients tend to have fewer students per contract and tend to spend far less per student. I don’t have exact numbers, but we could assume the following:[2]

  • Instructure has more than 50% of its clients in K-12;
  • Instructure has 30 – 40% of its student counts in K-12; and
  • Instructure makes 25 – 33% of its revenue in K-12.
Corporate Market Growth – Bridge

Actually, the client numbers (shown in green above) do not show significant growth in corporate markets yet – just slow growth of ~30 per year. I wrote about the recent product introduction of Bridge (their LMS for corporate markets) here and here. This is a different strategy from that of other higher-ed-originated LMS vendors, where Blackboard, D2L, and Moodle all use the same LMS for both education and corporate markets.

In discussions at the conference, however, the company certainly believes they are about to experience real growth in the corporate market with the new product, and they are hiring the sales force to lead this effort. It will be interesting to watch over the next year to see if the company succeeds in getting similar levels of growth as in higher ed and K-12.

Product Announcements

There were two main product announcements at the conference:

  • After a half-decade on the market, Canvas is gradually moving to a new UX design. I’ll cover that more in a second post.
  • Instructure introduced Canvas Data, a hosted data solution that addresses the biggest weakness in Canvas (not in terms of leapfrogging competition but rather trying to close the gap or to remove the weakness).

At its core, Canvas Data is an easily accessible native-cloud service, delivered on Amazon Web Services through Redshift. Canvas Data provides clients access to their data, including course design features, course activity, assessment and evaluation, user and device characteristics and more.

Both announcements are interesting, but mostly as they further illuminate the company’s strategy.

Market Strategy

Taken together, what we see is a company with a fairly straightforward strategy. Pick a market where the company can introduce a learning platform that is far simpler and more elegant than the status quo, then just deliver and go for happy customers.  Don’t expand beyond your core competency, don’t add parallel product lines, don’t over-complicate the product, don’t rely on corporate M&A. Where you have problems, address the gap. Rinse. Repeat.

Instructure has now solidified their dominance in US higher ed (having the most new client wins), they have hit their stride with K-12, and they are just starting with corporate learning. What’s next? I would assume international education markets, where Instructure has already started to make inroads in the UK and a few other locations.

The other pattern we see is that the company focuses on the mainstream from a technology adoption perspective. That doesn’t mean that they don’t want to serve early adopters with Canvas or Bridge, but Instructure more than any other LMS company knows how to say ‘No’. They don’t add features or change designs unless the result will help the mainstream adoption – which is primarily instructors. Of course students care, but they don’t choose whether to use an LMS for their course – faculty and teachers do. For education markets, the ability to satisfy early adopters rests heavily on the Canvas LTI-enabled integrations and acceptance of external application usage; this is in contrast to primarily relying on having all the features in one system.

Avoid Problems

From the beginning Instructure designed their products from the ground up to fully utilize a cloud architecture, and this applies to product management and support services as well. Instructure has maintained essentially one software version for each product[3] from the beginning, and unlike most other higher ed LMS providers, they reap the benefits of software release management and bug fixing simplicity. Cloud is not just an issue of cost-effective scaling; it is also a matter of getting the software out of the way – just have it work.

Companies change as they grow, and I have covered when the company lost both founders and a high profile CTO. The company moves on, however, and I cannot find customers complaining (at least yet) that the company has changed and is ticking them off. They do have customer challenges, but so far these have been manageable challenges.

Pop quiz: Name the highest profile customer disaster (outage during exams or first week, broken implementation, major bugs, etc) for Canvas.

It’s Not Complicated

I suspect that everything covered in this blog post has been said before, including at e-Literate. There is nothing complex or even nuanced here.

My biggest criticism at this year’s conference is that the keynotes were unfocused and didn’t share enough information about product roadmaps. It’s fine to not focus everything on technology and products, but come on, if you’re going to talk about empathy then tie it explicitly to how that concept affects your company’s approach to student-centered learning.

But despite the weak keynote and despite Josh Coates’ reputation as a jerk (he even referenced this in the keynote), consider the observation Michael made to me that Instructure is one of the very few companies whose employee reviews at Glassdoor rival (or even exceed) LinkedIn’s reviews. Trust me, this is not true for other ed tech companies.


I typically don’t write blog posts this positive about ed tech companies, but at this point I think the market needs to realize just how well-managed Instructure is and how positive schools are as they adopt and use its LMS. So far Instructure has been a net positive for higher ed and K-12, but change has come too slowly to the rest of the ed tech market in response to Canvas. Competition is good.

  1. The chart shows the number of clients, which is essentially the number of contracts signed with institutions, school districts, or statewide systems adopting either Canvas or Bridge LMS products.
  2. Note: this includes some personal bar-napkin estimates; student count and revenue are not reported by the company.
  3. It’s a little more complicated than just one software version based on test servers and client acceptance of changes, but the general idea holds in terms of understanding strategy.

The post Instructure: Accelerating growth in 3 parallel markets appeared first on e-Literate.

Promising Research Results On Specific Forms Of Adaptive Learning / ITS

Fri, 2015-07-10 12:45

By Phil HillMore Posts (344)

Recently I described an unpublished study by Dragan Gasevic and team on the use of the Knowillage / LeaP adaptive platform.[1] The context of the article was D2L’s misuse of the results, but the study itself is interesting in terms of its findings that adaptive learning usage (specifically LeaP in addition to Moodle within an Intro to Chemistry course) can improve academic performance. I will share more when and if the results become public.

If we look to published research reports, there are other studies that back up the potential of adaptive approaches, but the most promising results appear to be for a subset of adaptive systems that provide not just content selection but also tutoring. Last year a research team from Simon Fraser University and Washington State University published a meta-analysis on Intelligent Tutoring Systems (ITS), which they described as having origins in 1970 with the development of SCHOLAR.[2] The meta-analysis covered 107 studies involving 14,321 participants and found:

The use of ITS was associated with greater achievement in comparison with teacher-led, large-group instruction (g = .42), non-ITS computer-based instruction (g = .57), and textbooks or workbooks (g = .35). There was no significant difference between learning from ITS and learning from individualized human tutoring (g = –.11) or small-group instruction (g = .05). Significant, positive mean effect sizes were found regardless of whether the ITS was used as the principal means of instruction, a supplement to teacher-led instruction, an integral component of teacher-led instruction, or an aid to homework. Significant, positive effect sizes were found at all levels of education, in almost all subject domains evaluated, and whether or not the ITS provided feedback or modeled student misconceptions. The claim that ITS are relatively effective tools for learning is consistent with our analysis of potential publication bias.
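For readers unfamiliar with the notation: g here is Hedges’ g, a standardized mean difference between the treatment (ITS) and comparison groups, corrected for small-sample bias. Under its usual definition:

```latex
g = J \cdot \frac{\bar{x}_{\text{ITS}} - \bar{x}_{\text{comp}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}},
\qquad
J \approx 1 - \frac{3}{4(n_1 + n_2) - 9}
```

So g = .42 against large-group instruction means ITS students scored roughly four-tenths of a pooled standard deviation higher, conventionally read as a moderate effect.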

Relationship of ITS and Adaptive Learning Software

Unlike most marketing and media descriptions of Adaptive Learning, the report is quite specific on defining what an Intelligent Tutoring System is and isn’t.

An ITS is a computer system that for each student:

  1. Performs tutoring functions by (a) presenting information to be learned, (b) asking questions or assigning learning tasks, (c) providing feedback or hints, (d) answering questions posed by students, or (e) offering prompts to provoke cognitive, motivational or metacognitive change
  2. By computing inferences from student responses constructs either a persistent multidimensional model of the student’s psychological states (such as subject matter knowledge, learning strategies, motivations, or emotions) or locates the student’s current psychological state in a multidimensional domain model
  3. Uses the student modeling functions identified in point 2 to adapt one or more of the tutoring functions identified in point 1

There are plenty of computer-based instruction (CBI) methods out there, but ITS relies on a multidimensional model of the student in addition to a model of the subject area (domain model). The report also calls out that CBI approaches that only model the student in one dimension of item response theory (IRT, more or less the model of a student’s ability to correctly answer specific questions) are not ITS in their definition. IRT can be one of the dimensions but not the only dimension.
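As a concrete contrast, a unidimensional IRT model (here the standard two-parameter logistic formulation) summarizes the student with a single latent ability θ:

```latex
P(\text{correct on item } i \mid \theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
```

where a_i is the item’s discrimination and b_i its difficulty. A system whose entire student model is this single θ would not count as an ITS under the report’s definition; θ can at most be one dimension among several.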

A 2014 meta-analysis referred to by the above report further clarifies the conditions for a system to be an ITS as follows [emphasis added]:

VanLehn (2006) described ITS as tutoring systems that have both an outer loop and an inner loop. The outer loop selects learning tasks; it may do so in an adaptive manner (i.e., select different problem sequences for different students) based on the system’s assessment of each individual student’s strengths and weaknesses with respect to the targeted learning objectives. The inner loop elicits steps within each task (e.g., problem-solving steps) and provides guidance with respect to these steps, typically in the form of feedback, hints, or error messages.

For the sloppy field of Adaptive Learning, this means that the study looks at systems that model students, provide immediate feedback to students, and provide hints and support to students as they work through a specific task (inner loop). Adaptive Learning systems that only change the content or tasks presented to students adaptively (outer loop) do not qualify. Some examples of Adaptive Learning / ITS systems include McGraw-Hill’s ALEKS and AutoTUTOR. Knowillage / LeaP is an example of a system that is not an ITS.
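VanLehn’s outer-loop/inner-loop distinction can be sketched structurally. Everything here (the function names, the mastery-update rule, the tasks) is a hypothetical illustration, not any real system’s API:

```python
# Structural sketch of VanLehn's two-loop ITS architecture.
# All names and the update rule are hypothetical illustrations.

def select_next_task(student_model, tasks):
    """Outer loop: adaptively pick the task targeting the weakest skill."""
    weakest = min(student_model, key=student_model.get)
    for task in tasks:
        if task["skill"] == weakest and not task["done"]:
            return task
    return None

def tutor_step(student_model, task, answer):
    """Inner loop: give feedback/hints on one step and update the model."""
    correct = answer == task["expected"]
    skill = task["skill"]
    # Multidimensional update: nudge the per-skill mastery estimate.
    delta = 0.1 if correct else -0.05
    student_model[skill] = min(1.0, max(0.0, student_model[skill] + delta))
    return "Correct!" if correct else f"Hint: {task['hint']}"

# Usage: a two-skill student model and a tiny task bank.
model = {"stoichiometry": 0.2, "balancing": 0.6}
tasks = [
    {"skill": "stoichiometry", "expected": "2",
     "hint": "Check mole ratios.", "done": False},
    {"skill": "balancing", "expected": "H2O",
     "hint": "Balance oxygens first.", "done": False},
]

task = select_next_task(model, tasks)   # outer loop picks the stoichiometry task
print(tutor_step(model, task, "2"))     # prints "Correct!"; mastery rises to 0.3
```

An outer-loop-only adaptive system would keep `select_next_task` but drop `tutor_step`; by the report’s definition, it is the inner loop plus the multidimensional student model that makes the system an ITS.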

Promising Findings

The results showed “the use of ITS produced moderate, statistically significant mean effect sizes” compared to large-group human instruction, individual CBI, and textbooks / workbooks. The results showed no statistically significant mean effect sizes compared to small-group human instruction and individual tutoring. In other words, the study shows improvements of ITS over large lecture classes, non-ITS software tools, and textbooks / workbooks but no real difference with small classes or individual tutors.

[Figure 1: ITS effect sizes by type of comparison instruction]

What is quite interesting is that the results hold across multiple intervention approaches. Using ITS as Principal instruction, Integrated class instruction, Separate in-class activities, Supplementary after-class instruction, or Homework gives similar positive results.

Why Does ITS Give Positive Results?

The report hypothesizes the primary reasons that ITS seems to provide positive results as follows [formatting added, excerpted]:

[ITS shared characteristics with other forms of CBI] Specifically, they have attributed the effectiveness of CBI to:

  • greater immediacy of feedback (Azevedo & Bernard, 1995),
  • feedback that is more response-specific (Sosa, Berger, Saw, & Mary, 2011),
  • greater cognitive engagement (Cohen & Dacanay, 1992),
  • more opportunity for practice and feedback (Martin, Klein, & Sullivan, 2007),
  • increased learner control (Hughes et al., 2013), and
  • individualized task selection (Corbalan, Kester, & Van Merriënboer, 2006).

[snip] The prior quantitative reviews also concluded that using ITS is associated with greater achievement than using non-ITS CBI. We hypothesize that multidimensional student modeling enables ITS to outperform non-ITS CBI on each of its advantages cited in the previous paragraph.

[snip] ITS may also be more effective than non-ITS CBI in the sense that ITS can extend the general advantages of CBI to a wider set of learning activities. For example, the ability to score and provide individualized comments on a student’s essay would extend the advantage of immediate feedback well beyond what is possible in non-ITS CBI.

Student modeling also enables ITS to interact with students at a finer level of granularity than test-and-branch CBI systems.
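As a concrete illustration of that finer granularity (my sketch; the meta-analysis does not prescribe any particular technique), Bayesian Knowledge Tracing updates a per-skill probability after every observed response, rather than simply branching on right or wrong:

```python
def bkt_update(p_known, correct, slip=0.1, guess=0.2, learn=0.3):
    """Bayesian Knowledge Tracing: revise the probability that the
    student knows a skill after one response, then apply the chance
    the skill was learned during this practice opportunity."""
    if correct:
        posterior = p_known * (1 - slip) / (
            p_known * (1 - slip) + (1 - p_known) * guess)
    else:
        posterior = p_known * slip / (
            p_known * slip + (1 - p_known) * (1 - guess))
    return posterior + (1 - posterior) * learn

p = 0.5
p = bkt_update(p, correct=True)
print(round(p, 3))  # → 0.873: belief rises after a correct answer
```

A test-and-branch system discards the response once it has branched; a model like this accumulates evidence across every attempt, which is what lets the tutor react step by step.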

These are very encouraging results for the field of ITS and a subset of Adaptive Learning. I view the results not as saying adaptive learning is the way to go, but rather as evidence that adaptive learning applied in a tutoring role can improve academic performance in the right situations.

We need more evidence-based evaluation of different teaching strategies and edtech applications.

  1. When the study started, Knowillage was an independent company; midway through the study, D2L bought Knowillage and renamed the product LeaP.
  2. I would link to G+ post by George Station here if it were not for the ironic impossibility of searching within that platform.

The post Promising Research Results On Specific Forms Of Adaptive Learning / ITS appeared first on e-Literate.

Unizin One Year Later: View of contract reveals . . . nothing of substance

Thu, 2015-07-09 09:18

By Phil HillMore Posts (343)

I’ve been meaning to write an update post on Unizin, as we broke the story here at e-Literate in May 2014 and Unizin went public a month later. It’s one year later, and we still have the most expensive method to get the Canvas LMS. There are also plans for a Content Relay and Analytics Relay, as seen in an ELI presentation, but the actual dates keep slipping.

Unizin Roadmap

e-Literate was able to obtain a copy of the Unizin contract, at least for the founding members, through a public records request. There is nothing to see here. Because there is nothing to see here. The essence of the contract is for a university to pay $1.050 million to become a member. The member university then has a right (but not an obligation) to select and pay for actual services. Based on the contract, membership gets you . . . membership. Nothing else.

What is remarkable to me is the portion of the contract spelling out obligations. Section 3.1 calls out that “As a member of the Consortium, University agrees to the following:” and lists:

  • complying with Unizin bylaws and policies;
  • paying the $1.050 million; and
  • designating points of contact and representation on board.

Unizin agrees to nothing. There is literally no description of what Unizin provides beyond this description [emphasis added]:

This Agreement establishes the terms of University’s participation in the Consortium, an unincorporated member-owned association created to provide Consortium Members access to an evolving ecosystem of digitally enabled educational systems and collaborations.

What does access mean? For the past year the only service available has been Canvas as an LMS. When and if the Content Relay and Analytics Relay become available, member institutions will have the right to pay for those. Membership in Unizin gives a school input into defining those services as well.

As we described last year, paying a million dollars to join Unizin does not give a school any of the software. The school has to pay licensing & hosting fees for each service in addition to the initial investment.

The contract goes out of its way to point out that Unizin actually provides nothing. While this is contract legalese, it’s important to note this description in section 6.5 [original emphasized in ALL CAPS but shared here at lower volume].[1]

Consortium operator is not providing the Unizin services, or any other services, licenses, products, offerings or deliverables of any kind to University, and therefore makes no warranties, whether express or implied. Consortium Operator expressly disclaims all warranties in connection with the Unizin services and any other services, licenses, products, offerings or deliverables made available to University under or in connection with this agreement, both express and implied, …[snip]. Consortium Operator will not be liable for any data loss or corruption related to use of the Unizin services.

This contract appears to be at odds with the oft-stated goal of giving institutions control and ownership of their digital tools (also taken from an ELI presentation).

We have a vested interest in staying in control of our data, our students, our content, and our reputation/brand.

I had planned to piece together clues and speculate on what functionality the Content Relay will provide, but given the delays it is probably best to just wait and see. I’ve been told by Unizin insiders and heard publicly at conference presentations since February 2015 about the imminent release of Content Relay, and right now we just have slideware. I have asked for a better description of what functionality the Content Relay will provide, but this information is not yet available.

Unizin leadership and board members understand this quandary. As Bruce Maas, CIO at U Wisconsin, put it to me this spring, his job promoting and explaining Unizin will get a lot easier when there is more to offer than just Canvas as the LMS.

For now, here is the full agreement as signed by the University of Florida [I have removed the signature page and contact information page as I do not see the need to make these public].

Download (PDF, 587KB)

  1. Also note that Unizin is an unincorporated part of Internet2. Internet2 is the “Consortium Operator” and signer of this agreement.

The post Unizin One Year Later: View of contract reveals . . . nothing of substance appeared first on e-Literate.

The Importance Of Student Control Of Learning, Especially For Working Adults

Tue, 2015-07-07 13:32

By Phil HillMore Posts (343)

When giving keynotes at conferences over the past two years, I have observed that some of the best non-verbal feedback occurs when pointing out that personalized and adaptive learning does not equal black-box algorithms choosing content for students. Yes, there are plenty of approaches pitching that solution (Knewton in its early state being the best-known if not most-current example), but there are other approaches designed to give faculty or instructional designers control over learning paths or even to give students control. There seems to be a sense of relief, particularly from faculty members, when discussing the latter approach.

In the Empire State College case study on e-Literate TV, I found the conversation Michael had with [faculty member] Maya Richardson to be a great example of not just giving faculty insight into student learning but also giving students control over their own learning. As Maya explains, this is particularly important for the working adult population going back to school. The software used in this pedagogical approach is CogBooks.

Michael Feldstein: While so-called personalized learning programs are sometimes criticized for moving students lockstep through a linear process, Maya emphasizes the choice and control that students have regarding how they go through the content.

Maya Richardson: What it is—it’s a concept mapping, so they take concepts here, concepts here, and then there’s a split-off, and those concepts then split off and then split off and split off. And then, depending on the student, now students can go, “OK, I understood that concept. I already know that concept, so I don’t need to go to that one right now. I can skip and go here.” This is where the individualized and personalized learning comes in—like a smorgasbord, you pick and choose what you want to learn.

And then you come in; you do the discussion, and you either have more to add to it and a greater enrichment of the experience for yourself but also for your classmates. Then there are those who go, “OK, I need to go through each one of these, step by step, and learn each one, and then move down to learning these and then these and then these and then these,” and then at the end, they’ve gotten so much more out of it.

Maya then goes on to describe the visibility this pedagogical approach gives her – not just of which concepts the student has mastered but also of the learning process and choices that the student makes.

Maya Richardson: It’s that kind of opportunity that I can now watch and go, “OK, so you’re the kind of learner that I can just basically let you go and do what you need to do. I am not going to be interrupting your learning path because you have a very positive learning path. I can watch you do this. It’s a great pattern. You’re going for it,” and I’m just going, “Wonderful. Just come in, do the discussion, do your test,” and I’m like, “A-student, perfect, great, way to go.” Then I see the ones that are sort of sporadic. They come in, they touch and go, and I go, “OK, let me see how you’re doing.”

There’s a lot more in this conversation, but I want to skip ahead a minute or so in the conversation to this key point about student control, or agency.

Michael Feldstein:Maya and her colleagues are thoughtful about how this kind of software fits with the holistic approach that ESC takes towards education.

Maya Richardson: The personalized learning part of it is taking ownership. I think it motivates. As an adult learner, it’s really important to find that you have some control over—when I go in, I know what I want to learn. I hope I know what I want to learn, and I hope I learn it at the end.

There are disciplines and contexts where adaptive algorithms choosing appropriate content makes sense, but I find that too often this is the assumption for all of personalized learning. This example from Empire State College illuminates the importance of student control, especially for the growing population of working adults.

The post The Importance Of Student Control Of Learning, Especially For Working Adults appeared first on e-Literate.

D2L Again Misusing Academic Data For Brightspace Marketing Claims

Thu, 2015-07-02 05:56

By Phil HillMore Posts (342)

At this point I’d say that we have established a pattern of behavior.

Michael and I have been quite critical of D2L and their pattern of marketing behavior that is misleading and harmful to the ed tech community. Michael put it best:

I can’t remember the last time I read one of D2L’s announcements without rolling my eyes. I used to have respect for the company, but now I have to make a conscious effort not to dismiss any of their pronouncements out-of-hand. Not because I think it’s impossible that they might be doing good work, but because they force me to dive into a mountain of horseshit in the hopes of finding a nugget of gold at the bottom. Every. Single. Time. I’m not sure how much of the problem is that they have decided that they need to be disingenuous because they are under threat from Instructure or under pressure from investors and how much of it is that they are genuinely deluding themselves. Sadly, there have been some signs that at least part of the problem is the latter situation, which is a lot harder to fix. But there is also a fundamental dishonesty in the way that these statistics have been presented.

Well, here’s the latest. John Baker put out a blog called This Isn’t Your Dad’s Distance Learning Program with this theme:

But rather than talking about products, I think it’s important to talk about principles. I believe that if we’re going to use education technology to close the attainment gap, it has to deliver results. That — as pragmatic as it is — is the main guiding principle.

The link about “deliver results” leads to this page (excerpted as it existed prior to June 30th, for reasons that will become apparent).

Why Brightspace

Why Brightspace? Results.

So the stage is set – use ed tech to deliver results, and Brightspace (D2L’s learning platform, or LMS) delivers results. Now we come to the proof, including these two examples.


According to California State University-Long Beach, retention has improved 6% year-over-year since they adopted Brightspace.[snip]

University of Wisconsin-Milwaukee reported an increase in the number of students getting A’s and B’s in Brightspace-powered courses by over 170%

Great results, no? Let’s check the sources. Ah . . . clever marketing folks – no supporting data or even hyperlinks to learn more. Let’s just accept their claims and move along.

. . .

OK, that was a joke.

CSU Long Beach

I contacted CSU Long Beach to learn more, but I could find no one who knew where this data came from or even that D2L was making this claim. I shared the links and context, and they went off to explore. Today I got a message saying that the issue has been resolved, but that CSU Long Beach would make no public statements on the matter. Fair enough – the observations below are my own.

If you look at that Results page now, the CSU Long Beach claim is no longer there – down the memory hole[1] with no explanation, replaced by a new claim about Mohawk College.

[Screenshot: updated Results page showing Mohawk College and UW-Milwaukee claims]

While CSU Long Beach would not comment further on the situation, there are only two plausible explanations for the issue being resolved by D2L taking down the data. Either D2L was using legitimate data that they were not authorized to use (best case scenario) or D2L was using data that doesn’t really exist. I could speculate further, but the onus should be on D2L since they are the ones who made the claim.

UW Milwaukee

I also contacted UW Milwaukee to learn more, and I believe the data in question is from the U-Pace program which has been fully documented.[2][3]

The U-Pace instructional approach combines self-paced, mastery-based learning with instructor-initiated Amplified Assistance in an online environment.

The control group was traditionally-taught (read that as large lecture classes) for Intro to Psychology.

From the EDUCAUSE Quarterly article on U-Pace, for disadvantaged students the number of A’s and B’s increased 163%. This is the closest data I can find to back up D2L’s claim of 170% increase.

[Figure: U-Pace results from the EDUCAUSE Quarterly article]

There are three immediate problems here (ignoring the fact that I can’t find improvements of more than 170% – I’ll take 163%).

  1. First, the data claim is missing the context of “for underprepared students” who exhibited much higher gains than prepared students. That’s a great result for the U-Pace program, but it is also important context to include.
  2. The program is an instructional change, moving from large lecture classes to a self-paced, mastery-learning approach. That is the intervention, not the use of the LMS. In fact, D2L was the LMS used in both the control group and the U-Pace treatment group.
  3. The program goes out of its way to call out the minimal technology needed to adopt the approach, and they even list Blackboard, Desire2Learn, and Moodle as examples of LMS’s that work with the following conditions:

[Figure: U-Pace minimal LMS requirements]

This is an instructional approach that claims to be LMS neutral with D2L’s Brightspace used in both the control group and treatment group, yet D2L positions the results as proof that Brightspace gets results! It’s wonderful that Brightspace LMS worked during the test and did not get in the way, but that is a far cry from Brightspace “delivering results”.

The Pattern

We have to now add these two cases to the Lone Star College and LeaP examples. In all cases, there is a pattern.

  1. D2L makes a marketing claim implying that its LMS, Brightspace, delivers results, referring to academic outcomes data while omitting supporting data or references.
  2. I contact the school or research group to learn more.
  3. Data is either misleading (treatment group is not LMS usage but instead instructional approach, adaptive learning technology, or student support software) or just plain wrong (with data taken down).
  4. In all cases, the results could have been presented honestly, showing the appropriate context, links for further reading, and explanation of the LMS role. But they were not presented honestly.
  5. e-Literate blog post almost writes itself.
  6. D2L moves on to make their next claim, with no explanations.

I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether public research-based or internal research – and essentially take credit for their technology “delivering results”.

This is the misuse of someone else’s data for corporate gain. Institutional data. Student data. That is far different than using overly-positive descriptions of your own data or subjective observations. That is wrong.

The Offer

For D2L company officials, I have an offer.

  1. If you have answers or even corrections about these issues, please let us know through your own blog post or comments to this blog.
  2. If you find any mistakes in my analysis, I will write a correction post.
  3. We are happy to publish any reply you make here on e-Literate.
  1. Their web page does not allow archiving with the Wayback Machine, but I captured screenshots in anticipation of this move.
  2. Note – While I assume this claim derives from U-Pace, I am not sure. It is the closest example of real data that I could find, thanks to a helpful tip from UW-M staff. I’ll give D2L the benefit of the doubt despite their lack of reference.
  3. And really, D2L marketing staff should learn how to link to external sources. It’s good Internet practice.

The post D2L Again Misusing Academic Data For Brightspace Marketing Claims appeared first on e-Literate.

U of Phoenix: Losing hundreds of millions of dollars on adaptive-learning LMS bet

Tue, 2015-06-30 09:17

By Phil HillMore Posts (341)

It would be interesting to read (or write) a post mortem on this project some day.

Two and a half years ago I wrote a post describing the University of Phoenix investment of a billion dollars on new IT infrastructure, including hundreds of millions of dollars spent on a new, adaptive-learning LMS. In another post I described a ridiculous patent awarded to Apollo Group, parent company of U of Phoenix, that claimed ownership of adaptive activity streams. Beyond the patent, Apollo Group also purchased Carnegie Learning for $75 million as part of this effort.

And that’s all going away, as described by this morning’s Chronicle article on the company’s plan to go down to just 150,000 students (from a high of 460,000 several years ago).

And after spending years and untold millions on developing its own digital course platform that it said would revolutionize online learning, Mr. Cappelli said the university would drop its proprietary learning systems in favor of commercially available products. Many Apollo watchers had long expected that it would try to license its system to other colleges, but that never came to pass.

I wonder what the company will do with the patent and with Carnegie Learning assets now that they’re going with commercial products. I also wonder who is going to hire many of the developers. I don’t know the full story, but it is pretty clear that even with a budget of hundreds of millions of dollars and adjunct faculty with centralized course design, the University of Phoenix did not succeed in building the next generation learning platform.

Update: Here is full quote from earnings call:

Fifth. We plan to move away from certain proprietary and legacy IT systems to more efficiently meet student and organizational needs over time. This means transitioning an increased portion of our technology portfolio to commercial software providers, allowing us to focus more of our time and investment on educating and student outcomes. While Apollo was among the first to design an online classroom and supporting system, in today’s world it’s simply not as efficient to continue to support complicated, custom-designed systems particularly with the newer quality systems we have more recently found with of the self providers that now exist within the marketplace. This is expected to reduce costs over the long term, increase operational efficiency and effectiveness while still very much supporting a strong student experience.

The post U of Phoenix: Losing hundreds of millions of dollars on adaptive-learning LMS bet appeared first on e-Literate.

ASU Is No Longer Using Khan Academy In Developmental Math Program

Mon, 2015-06-29 17:37

By Phil HillMore Posts (340)

In these two episodes of e-Literate TV, we shared how Arizona State University (ASU) started using Khan Academy as the software platform for a redesigned developmental math course[1] (MAT 110). The program was designed in Summer 2014 and ran through Fall 2014 and Spring 2015 terms. Recognizing the public information shared through e-Literate TV, ASU officials recently informed us that they had made a programmatic change and will replace their use of Khan Academy software with McGraw-Hill’s LearnSmart software that is used in other sections of developmental math.

To put this news in context, here is the first episode’s mention of Khan Academy usage.

Phil Hill: The Khan Academy program that you’re doing, as I understand, it’s for general education math. Could you give just a quick summary of what the program is?

Adrian Sannier: Absolutely. So, for the last three-and-a-half years, maybe four, we have been using a variety of different computer tutor technologies to change the pedagogy that we use in first-year math. Now, first-year math begins with something we call “Math 110.” Math 110 is like if you don’t place into either college algebra, which has been the traditional first-year math course, or into a course we call “college math,” which is your non-STEM major math—if you don’t place into either of those, then that shows you need some remediation, some bolstering of some skills that you didn’t gain in high school.

So, we have a course for that. Our first-year math program encompasses getting you to either the ability to follow a STEM major or the ability to follow majors that don’t require as intense of a math education. What we’ve done is create an online mechanism to coach students. Each student is assigned a trained undergraduate coach under the direction of our instructor who then helps that student understand how to use the Khan Academy and other tools to work on the skills that they show deficit in and work toward being able to satisfy the very same standards and tests that we’ve always used to ascertain whether a student is prepared for the rest of their college work.

Luckily, the episode on MAT 110 focused mostly on the changing roles of faculty members and TAs when using an adaptive software approach, rather than focusing on Khan Academy itself. After reviewing the episode again, I believe that it stands on its own and is relevant even with the change in software platform. Nevertheless, I appreciate that ASU officials were proactive to let me know about this change, so that we can document the change here and in e-Literate TV transmedia.

The Change

Since the change has not been shared outside of this notification (limiting my ability to do research and analysis), I felt the best approach would be to again interview Adrian Sannier, Chief Academic Technology Officer at ASU Online. Below is the result of an email interview, followed by short commentary [emphasis added].

Phil Hill: Thanks for agreeing to this interview to update plans on the MAT 110 course featured in the recent e-Literate TV episode. Could you describe the learning platforms used by ASU in the new math programs (MAT 110 and MAT 117 in particular) as well as describe any changes that have occurred this year?

Adrian Sannier: Over the past four years, ASU has worked with a variety of different commercially available personalized math tutors from Knewton, Pearson, McGraw Hill and the Khan Academy applied to 3 different courses in Freshman Math at ASU – College Algebra, College Math and Developmental Math. Each of these platforms has strengths and weaknesses in practice, and the ASU team has worked closely with the providers to identify ways to drive continuous improvement in their use at ASU.

This past year ASU used a customized version of Pearson’s MyMathLab as the instructional platform for College Algebra and College Math. In Developmental Math, we taught some sections using the Khan Academy Learning Dashboard and others using McGraw Hill’s LearnSmart environment.

This Fall, ASU will be using the McGraw Hill platform for Developmental Math and Pearson’s MyMathLab for College Algebra and College Math. While we also achieved good results with the Khan Academy this past year, we weren’t comfortable with our current ability to integrate the Khan product at the institutional level.

ASU is committed to the personalized adaptive approach to Freshman mathematics instruction, and we are continuously evaluating the product space to identify the tools that we feel will work best for our students.

Phil Hill: I presume this means that ASU’s usage of McGraw Hill’s LearnSmart for Developmental Math will continue and also expand to essentially replace the usage of Khan Academy. Is this correct? If so, what do you see as the impact on faculty and students involved in the course sections that previously used Khan Academy?

Adrian Sannier: That’s right Phil. Based on our experience with the McGraw Hill product we don’t expect any adverse effects.

Phil Hill: Could you further explain the comment “we weren’t comfortable with our current ability to integrate the Khan product at the institutional level”? I believe that Khan Academy’s API approach is more targeted to B2C [business-to-consumer] applications, allowing individual users to access information rather than B2B [business-to-business] enterprise usage, whereas McGraw Hill LearnSmart and others are set up for B2B usage from an API perspective. Is this the general issue you have in mind?

Adrian Sannier: That’s right Phil. We’ve found that the less cognitive load an online environment places on students the better results we see. Clean, tight integrations into the rest of the student experience result in earlier and more significant student engagement, and better student success overall.


Keep in mind that ASU is quite protective of its relationship with multiple software vendors and that they go out of their way to not publicly complain or put their partners in a bad light, even if a change is required as in MAT 110. Adrian does make it clear, however, that the key issue is the ability to integrate reliably between multiple systems. As noted in the interview, I think a related issue here is a mismatch of business models. ASU wants enterprise software applications where they can deeply integrate with a reliable API to allow a student experience without undue “cognitive load” of navigating between applications. Khan Academy’s core business model relies on people navigating to their portal on their website, and this does not fit the enterprise software model. I have not interviewed Khan Academy, but this is how it looks from the outside.

There is another point to consider here. While I can see Adrian’s argument that “we don’t expect any adverse effects” in the long run, I do think there are switching costs in the short term. As instructor Sue McClure told me via email, she spent significantly more time than usual on this course due to course design and ramping up the new model. In addition, ASU added 11 TAs for the course sections using Khan Academy. These people have likely learned important lessons about supporting students in an adaptive learning setting, but a great deal of their Khan-specific time is now gone. Plus, they will need to spend time learning LearnSmart before getting fully comfortable in that environment.

Unfortunately, with the quick change, we might not see hard data to determine whether the changes were working. I believe ASU’s plan was to analyze and publish the results from this new program after the third term, which will now not happen.

If I find out more information, I’ll share it here.

  1. The terms remedial math and developmental math are interchangeable in this context.

The post ASU Is No Longer Using Khan Academy In Developmental Math Program appeared first on e-Literate.

Google Classroom Addresses Major Barrier To Deeper Higher Ed Adoption

Mon, 2015-06-29 11:28

By Phil HillMore Posts (339)

A year ago I wrote about Google Classroom, speculating whether it would affect the institutional LMS market in higher education. My initial conclusion:

I am not one to look at Google’s moves as the end of the LMS or a complete shift in the market (at least in the short term), but I do think Classroom is significant and worth watching. I suspect this will have a bigger impact on individual faculty adoption in higher ed or as a secondary LMS than it will on official institutional adoption, at least for the next 2 – 3 years.

And my explanation [emphasis added]:

But these features are targeted at innovators and early adopter instructors who are willing to fill in the gaps themselves.

  1. The course creation, including setting up of rosters, is easy for an instructor to do manually, but it is manual. There has been no discussion that I can find showing that the system can automatically create a course, including the roster, and update it over the add / drop period.
  2. There is no provision for multiple roles (student in one class, teacher in another) or for multiple teachers per class.
  3. The integration with Google Drive, especially with Google Docs and Sheets, is quite intuitive. But there is no provision for PDF or MS Word docs or even publisher-provided courseware.
  4. There does not appear to be a gradebook – just grading of individual assignments. There is a button to export grades, and I assume that you can combine all the grades into a custom Google Sheets spreadsheet or even pick a GAFE gradebook app. But there is no consistent gradebook available for all instructors within an institution to use and for students to see consistently.

Well, today Google announced a new Google Classroom API that directly addresses the limitation in bullet #1 above and indirectly addresses #4.

The Classroom API allows admins to provision and manage classes at scale, and lets developers integrate their applications with Classroom. Until the end of July, we’ll be running a developer preview, during which interested admins and developers can sign up for early access. When the preview ends, all Apps for Education domains will be able to use the API, unless the admin has restricted access.

By using the API, admins will be able to provision and populate classes on behalf of their teachers, set up tools to sync their Student Information Systems with Classroom, and get basic visibility into which classes are being taught in their domain. The Classroom API also allows other apps to integrate with Classroom.

Google directly addresses course roster management in its announcement; in fact, this appears to be the primary use case the company had in mind. I suspect this by itself will have a big impact in the K-12 market (I would love to hear John Watson’s take on this if he addresses it on his blog), making district-wide and school-wide Google Classroom adoptions far more manageable.
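To make the provisioning use case concrete, here is a rough Python sketch of the request bodies an admin sync tool might build when creating courses on behalf of teachers. The field names (`name`, `section`, `ownerId`, `courseState`) follow Google’s published Course resource, but the helper functions and the example identifiers are my own illustration, not code from the announcement.

```python
# Hypothetical sketch of the request bodies an admin provisioning tool
# might build for the Classroom API. Field names follow Google's Course
# resource; the helper functions are illustrative, not part of any
# Google library.

def build_course_payload(name, section, owner_id):
    """Build the request body for a courses.create call.

    Creating the course in the PROVISIONED state means the teacher
    must accept the course before it becomes active.
    """
    return {
        "name": name,
        "section": section,
        "ownerId": owner_id,
        "courseState": "PROVISIONED",
    }

def build_roster(student_ids):
    """Build one enrollment body per student for courses.students.create."""
    return [{"userId": sid} for sid in student_ids]

if __name__ == "__main__":
    course = build_course_payload("MAT 110", "Fall 2015", "teacher@school.edu")
    roster = build_roster(["student1@school.edu", "student2@school.edu"])
    # A sync tool would then submit these requests, e.g. with the
    # google-api-python-client (assumed here, not shown in the announcement):
    #   service.courses().create(body=course).execute()
    print(course["courseState"], len(roster))
```

The point of the sketch is the workflow shift: with the API, an SIS-driven script can build and submit these payloads for every section at a school, instead of each teacher clicking through course creation by hand.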

The potential is also there for a third party to develop and integrate a viable grade book application available to an entire institution. While this could partially be done by the Google Apps for Education (GAFE) ecosystem, that is a light integration that doesn’t allow deep connection between learning activities and grades. The new API should allow for deeper integrations, although I am not sure how much of the current Google Classroom data will be exposed.

I still do not see Google Classroom as a current threat to the higher ed institutional LMS market, but it is getting closer. Current ed tech vendors should watch these developments.

Update: Changed Google Apps for Education acronym from GAE to GAFE.

The post Google Classroom Addresses Major Barrier To Deeper Higher Ed Adoption appeared first on e-Literate.

How Student and Faculty Interviews Were Chosen For e-Literate TV Series

Mon, 2015-06-29 06:47

By Phil HillMore Posts (338)

As part of our e-Literate TV set of case studies on personalized learning, Michael and I were fully aware that Arizona State University (ASU) was likely to generate the most controversy due to ASU’s aggressive changes to the concept of a modern research university. As we described in this introductory blog post:

Which is one reason why we’re pretty excited about the release of the first two case studies in our new e-Literate TV series on the trend of so-called “personalized learning.” We see the series as primarily an exercise in journalism. We tried not to hold onto any hypothesis too tightly going in, and we committed to reporting on whatever we found, good or bad. We did look for schools that were being thoughtful about what they were trying to do and worked with them cooperatively, so it was not the kind of journalism that was likely to result in an exposé. We went in search of the current state of the art as practiced in real classrooms, whatever that turned out to be and however well it is working.

As part of the back-and-forth discussions with the ASU case study release, John Warner brought up a good point in response to my description that our goal was “Basically to expose, let you form own opinions”.

@PhilOnEdTech Can't form opinion without a more thorough accounting. Ex. How did you choose students and fac. to talk to?

— John Warner (@biblioracle) June 1, 2015

Can’t form opinion without a more thorough accounting. Ex. How did you choose students and fac. to talk to?

Let’s explore this subject for the four case studies already released. Because the majority of interviewees shared positive experiences in our case studies, I’ll highlight some of the skeptical, negative or cautionary views that were captured in these case studies.

Our Approach To Lining Up Interviews

When we contacted schools to line up interviews on campus, it was natural to expect that staff would tend to put forward the most positive examples of courses, faculty, and students. As described above, we admit that we looked for schools with thoughtful approaches (and therefore courses), but we needed to try to surface some contrary or negative views as well. This is not to play gotcha journalism nor to create a false impression of equally good / equally bad perspectives. But it is important to capture that not everyone is pleased with the changes, and these skeptics are a good source for exposing risks and issues to watch. Below is the key section of the email sent to each school we visited.

The Case Study Filming Process
Each case study will include a couple of parts. First, we will interview the college leadership—whoever the school deems appropriate—to provide an overview of the school, it’s mission and history, it’s student body, and how “personalized education” (however that school defines the term) fits into that picture. If there are particular technology-driven initiatives related to personalized learning, then we may talk about those a bit. Second, we will want to talk some teachers and students, probably in a mixed group. We want to get some sample reactions from them about what they think is valuable about the education they get (or provide) at the school, how “personalization” fits into that, and how, when, and why they use or avoid technology in the pursuit of the educational goals. We’re not trying either to show “best/worst” here or to provide an “official” university position, but rather to present a dialog representing some of the diverse views present on the campus.

Campus Input on the Filming
In order for the project to have integrity, MindWires must maintain editorial independence. That said, our goal for the case studies is to show positive examples of campus communities that are authentically engaged in solving difficult educational challenges. We are interested in having the participants talk about both successes and failures, but our purpose in doing so is not to pass judgment on the institution but rather to enable to viewers to learn from the interviewees’ experiences. We are happy to work closely with each institution in selecting the participants and providing a general shape to the conversation. While we maintain editorial control over the final product, if there are portions of the interviews that make the institution uncomfortable then we are open to discussing those issues. As long as the institution is willing to allow an honest reflection of their own challenges and learning experiences as an educational community, then we are more than willing to be sensitive to and respectful of concerns that the end product not portray the institution in a way that might do harm to the very sort of campus community of practice that we are trying to capture and foster with our work.

As an example of what “willing to be sensitive to and respectful of concerns” means in practice, one institution expressed a concern that they did not want their participation in this personalized learning series to be over-interpreted as a full-bore endorsement of pedagogical change by the administration. The school was at the early stages of developing a dialog with faculty on where they want to go with digital education, and the administration did not want to imply that they already knew the direction and answers. We respected this request and took care to not imply any endorsement of direction by the administration.

Below are some notes on how this played out at several campuses.

Middlebury College

As described in our introductory blog post:

Middlebury College, the first school we went to when we started filming, was not taking part in any cross-institutional (or even institutional) effort to pilot personalized learning technologies and is not the kind of school that is typically associated with the “personalized learning” software craze. Which is exactly why we wanted to start there. When most Americans think of the best example of a personalized college education, they probably think of an elite New England liberal arts college with a student/teacher ratio of under nine to one. We wanted to go to Middlebury because we wanted a baseline for comparison. We were also curious about just what such schools are thinking about and doing with educational technologies.

Middlebury College staff helped identify one faculty member who is experimenting with technology use in his class with some interesting student feedback, which we highlighted in Middlebury Episode 2. They also found two faculty members for a panel discussion along with two students who have previously expressed strong opinions on where technology does and does not fit in their education. The panel discussion was highlighted in Middlebury Episode 3.

As this case study did not have a strong focus on a technology-enabled program, we did not push the issue of finding skeptical faculty or students; instead, we showed that technology was not absent from the campus’s consideration of how to improve education.

The administration did express some cautionary notes on the use of technology to support “personalized learning” as captured in this segment:

Essex County College

By way of contrast, our second case study was at Essex County College, an urban community college in Newark, New Jersey. This school has invested approximately $1.2 million of its own money, along with a $100,000 Gates Foundation grant, to implement an adaptive learning remedial math course designed around self-regulated learning. Our case study centered on this program specifically.

Of course, the place where you really expect to see a wide range of incoming skills and quality of previous education is in public colleges and universities, and at community colleges in particular. At Essex County College, 85% of incoming students start in the lowest level developmental math course. But that statistic glosses over a critical factor, which is that there is a huge range of skills and abilities within that 85%. Some students enter almost ready for the next level, just needing to brush up on a few skills, while others come in with math skills at the fourth grade level. On top of that, students come in with a wide range of metacognitive skills. Some of them have not yet learned how to learn, at least not this subject in this context.

Given the controversial nature of using adaptive learning software in a class, we decided to include a larger number of student voices in this case study. Douglas Walcerz, the faculty and staff member who designed the course, gave us direct access to the entire class. We actively solicited students to participate in interviews, as one class day was turned over to e-Literate TV video production and interviews, with the rest of the class watching their peers describe their experiences.

As we did the interviews, almost all students expressed a very positive view of the new class design, particularly the self-regulated learning aspect and the resulting sense of empowerment. What was missing were the voices of students who were not comfortable with the new approach. For the second day, we actively solicited students who could provide a negative view. The result was shared in this interview:

As for faculty, it was easier to find some skeptical or cautionary voices, which we highlighted here.

As described above, our intent was not to present a false balance but rather to include diverse viewpoints that help other schools know which issues to explore.

Arizona State University

At ASU we focused on two courses in particular, Habitable Worlds highlighted in episode 2 and remedial math (MAT 110) using Khan Academy software highlighted in episode 3.

We did have some difficulty getting on-campus student interviews because both of these are online courses. For MAT 110, we did find one student who expressed both positive and negative views on the approach, as shown in this episode.

Empire State College

Like ASU, Empire State College presented a challenge for on-campus video production due to the all-online nature of its courses. We worked with ESC staff to line up student interviews, and the best stories came from the effects of prior learning assessment on students.

It was easier and more relevant to explore the different perspectives on personalized learning from faculty and staff themselves, as evidenced by the following interview. ESC offered him up–proudly–knowing that he would be an independent voice. They understood what we meant in that email and were not afraid to show the tensions they are wrestling with on-camera. Not every administration will be as brave as ESC’s, but we are finding that spirit to be the norm rather than the exception.

Upcoming Episodes

It’s also worth pointing out the role of selecting colleges in the first place, which is not just about diversity. We know that different schools are going to have different perspectives, and we pick them carefully to set up a kind of implicit dialog. We know, for example, that ASU is going to give a full-throated endorsement of personalized learning software used to scale. So we balance them against Empire State College, which has always been about one-on-one mentoring in their design.

Hopefully this description of our process will help people like John Warner who need more information before forming their own opinions. At the least, consider this further documentation of the process. We are planning to release one additional case study – the University of California at Davis, in early July – as well as two analysis episodes. We’ll share more information as new episodes are released.

The post How Student and Faculty Interviews Were Chosen For e-Literate TV Series appeared first on e-Literate.

Prior Learning Assessments Done Right

Sun, 2015-06-28 21:53

By Michael FeldsteinMore Posts (1033)

This post has nothing to do with educational technology but everything to do with the kind of humane and truly personal education that we should be talking about when we throw around phrases like “personalized education.” Prior Learning Assessments (PLAs) go hand-in-glove with the trendy Competency-Based Education (CBE). The basic idea is that you test students on what they have learned in their own lives and give them credit toward their degrees based on what they already know. But it is often executed in a fairly mechanical way. Students are tested against the precise curriculum or competencies that a particular school has chosen for a particular class. Not too long ago, I heard somebody say, “We don’t need more college-ready students; we need more student-ready colleges.” In a logical and just world, we would start with what the student knows, rather than with what one professor or group of professors decided one semester would be “the curriculum,” and we would give the student credit for whatever college-level knowledge she has.

It turns out that’s exactly what Empire State College (ESC) does. When we visited the college for an e-Literate TV case study, we learned quite a bit about this program and, in particular, about their PLA program for women of color.

But before we get into that, it’s worth backing up and looking at the larger context of ESC as an institution. Founded in 1971, the school was focused from the very beginning on “personalized learning”—but personalized in a sense that liberal intellectuals from the 1960s and 1970s would recognize and celebrate. Here’s Alan Mandell, who was one of the pioneering members of the faculty at ESC, on why the school has “mentors” rather than “professors”:

Alan Mandell: Every single person is called a mentor.

It’s valuable because of an assumption that is pretty much a kind of critique of the hierarchical model of teaching and learning that was the norm and remains the norm where there is a very, very clear sense of a professor professing to a student who is kind of taking in what one has to say.

Part of the idea of Empire State, and other institutions, more and more, is that there was something radically wrong with that. A, that students had something to teach us, as faculty, and that faculty had to learn to engage students in a more meaningful way to respond to their personal, academic, professional interests. It was part of the time. It was a notion of a kind of equality.

This was really interesting to me actually because I came here, and I was 25 years old. Every single student was older than I was, so the idea of learning from somebody else was actually not very difficult at all. It was just taken for granted. People would come with long professional lives, doing really interesting things, and I was a graduate student.

I feel, after many years, that this is still very much the case—that this is a more equal situation of faculty serving as guides to students who bring in much to the teaching and learning situation.

Unlike some of the recent adoptions of PLA, which are tied to CBE and the idea of getting students through their degree programs quickly, Empire State College approaches prior learning assessment in very much the spirit that Alan describes above. Here’s Associate Dean Cathy Leaker talking about their approach:

Cathy Leaker: What makes Empire State College unique, even in the prior learning assessment field, is that many institutions that do prior learning assessment do what’s called a “course match.” In other words, a student would have to demonstrate—for example, if they wanted to claim credit for Introduction to Psychology, they would look at the learning objectives of the Introduction to Psychology course, and they would match their learning to that. We are much more open-ended, and as an institution, we really believe that learning happens everywhere, all the time. So, we try to look at learning organically, and we don’t assume that we already know exactly what might be required.

One of my colleagues, Elana Michelson, works on prior learning assessment. She started working in South Africa where they were—there it’s called “recognition for prior learning.” And she gives the example of some of the people who were involved in bringing down Apartheid, and how they, sort of as an institution working with the government, thought it might be ridiculous to ask those students to demonstrate problem solving skills, right? How the institution might look at problem-solving skills, and then if there was a strict match, they would say, “Well, wait a second. You don’t have it,” and yet, they’re activists that brought down the government and changed the world.

Those are some examples of why we really think we need to look at learning organically.

Students like Melinda come to us, talk about their learning, and then we try to help them identify it, come up with a name for it, and determine an amount of credit before submitting it for evaluation.

This is not personalization in the sense of trying to figure out which institution-defined competencies you can check off on your way to an institution-defined collection of competencies that they call a “degree.” Rather, it’s an effort to have credentialed experts look at what you’ve done and what you know to find existing strengths that deserve to be recognized and credentialed. The Apartheid example is a particularly great one because it shows that traditional academic institutions may be poorly equipped to recognize and certify real-world demonstrations of competencies, particularly among people who come from disadvantaged or “marked” backgrounds. Here’s ESC faculty member Frances Boyce talking about why the school recognized a need to develop a particular PLA program for women of color:

Frances Boyce: Our project, Women of Color and Prior Learning Assessment, is based on a 2010 study done by Rebecca Klein-Collins and Richard Olson, “Fueling the Race to Success.” That found that students who do prior learning assessments are two and a half times more likely to graduate. When you start to unpack that data and you look at the graduation rates for students of color, for African American students the graduation rate increases fourfold. For Latino students it increases eightfold. Then, when you look at it in terms of gender, a woman who gets one to six credits in prior learning assessment will graduate more quickly than her male counterpart given the same amount of credit.

That seemed very important to us, and we decided, “Well, let’s see what we could do to improve the uptake rate for women of color.” So, we designed four workshops to help women of color, not only identify their learning—the value of their learning—but identify what they bring with them to the institution.

What’s going on here? Why is PLA more impactful than average for women and people of color? In addition to the fact that our institutions are not always prepared to recognize real-world knowledge and skills, as in the Apartheid example, people in non-privileged positions in our society are tacitly taught that college is not “for them.” That they don’t have what it takes to succeed there. By recognizing that they have, in fact, already acquired college-level skills and knowledge, PLA helps them get past the insults to their self-image and dignity and helps them to envision themselves as successful college graduates. Listen to ESC student Melinda Wills-Stallings’ story:

Michael Feldstein: I’m wondering if you can tell me, do you remember a particular moment, early on, when the lightbulb went off and you said to yourself, “Oh, that thing that’s part of my life counts”?

Melinda Wills-Stallings: I think when I was talking to my sons about the importance of their college education and how they couldn’t be successful without it and them saying to me, “But, Mom, you are successful. You run a school. You run a business.” To be told on days that I wasn’t there, the business wasn’t running properly or to be told by parents, “Oh, my, God. We’re so glad you’re back because we couldn’t get a bill, we couldn’t get a statement,” or, “No one knew how to get the payroll done.”

That’s when I knew, OK, but being told by an employer who said I wasn’t needed and I wasn’t relied on, I came to realize that it flipped on me. And I realized that’s what I had been told to keep me in my place, to keep me from aspiring to do the things that I knew that I was doing or I could do.

The lightbulb for me was when we were doing the interviews and Women of Color PLA, and Frances said to me, “That’s your navigational capital.” We would do these roundtables where you would interview with one mentor, and then you would go to another table. Then I went to another table, and she said, “Well, what do you hope to do with your college degree?” And I said, “I hope to pay it forward: to go continue doing what I love to do, but to come back to other women with like circumstances and inspire them and encourage them and support them to also getting their college degrees and always to be better today than I was yesterday, so that’s your aspirational capital.” And I went, “Oh, OK.” So, I have aspirational capital also, and then go to the next table and then I was like, I couldn’t wait to get to the next table because every table I went to, I walked away with one or two prior learning assessments.

And then to go home and to be able to put it into four- or five-page papers to submit that essay and to have it recognized as learning.

I was scared an awful lot of times from coming back to school because I felt, after I graduated high school and started college and decided I wanted to get married and have a family, I had missed the window to come back and get my college education. The light bulb was, “It’s never too late,” and that’s what I tell women who ask me, and I talk to them all the time about our school and our program. Like, “It’s never too late. You can always come back and get it done.”

Goals and dreams don’t have caps on them even though where I was, my employer had put a cap on where I could go on my salary and my position. Your goals and dreams don’t have a cap on it, so I think that was the light bulb for me—that it wasn’t too late.

It’s impossible to hear Melinda speak about her journey and not feel inspired. She built up the courage to walk through the doors of the college, despite being told repeatedly by her employer that she was not worthy. The PLA process quickly affirmed for her that she had done the right thing. At the same time, I recognize that traditionalists may feel uncomfortable with all this talk of “navigational capital” and “aspirational capital” and so on. Is there a danger of giving away degrees like candy and thus devaluing them? First, I don’t think there’s anything wrong with giving a person a degree certification if they have become genuine experts in a college-appropriate subject through their life experience. In some ways, we are all the Scarecrow, the Tin Man, or the Cowardly Lion, waiting for some wizard to magically convey upon us a symbol that confers legitimacy upon our hard-won skills and attributes, somehow making them more real. But also, a funny thing happens when you treat a formal education as a tool for helping an individual reach her goals rather than a set of boxes that must be checked. Students start thinking about the work that education entails as something that is integral to achieving those goals rather than a set of obstacles they have to get around in order to get the piece of paper that is the “real” value of college. Listen to ESC student Jessi Colón, a professional dancer who chose not to get all the credits she could have gotten for her dance knowledge because she wanted to focus on what she needed to learn for her next career working in animal welfare:

Jessi Colón: It was little bit tricky especially because I had really come here with the intention of maximizing and capitalizing on all this experience that I had. Part of the prior learning assessment and degree planning process is looking at other schools that may have somewhat relevant programs and trying to match what your learning is to those. As I was looking at other programs outside of New York or at other small, rural schools that do these little animal programs, I found that there were a lot of classes that I really wanted to take.

One of the really amazing things about Empire State is that they can also give you individualized courses, and I did a lot of those. So, once I saw these at other schools, I was like, “Man, I really want to take a class in animal-assisted therapy, and would I like to really, really indulge myself and do that or should I write another essay on jazz dance composition?” I knew that one would be more of a walk in the park than the other, but I was really excited about my degree and having this really personal degree allowed me to get excited about it. So, it made sense, though hard to let go of that prior learning in order to opt for the classes.

I could’ve written 20 different dance essays, but I wanted to really take a lot of classes. So, I filled that with taking more classes relevant to my degree, and then ended up only writing, I think, one or two dance-relevant essays.

It turns out that if you start from the assumption that the education students are coming for—not the certification, but the learning process itself—can and should have intrinsic value to them as a tool for pursuing their own ambitions, then people step up. They aspire to be more. They take on the work. If the education is designed to help them by recognizing how far they have come before they walk in the door and focusing on what they need to learn in order to do whatever it is they aspire to do after they leave, then students often come to see that gaming the system is just cheating themselves.

There are many ways to make schooling more personal but, in my opinion, what we see here is one of the deepest and most profound. This is what a student-ready college looks like. And in order to achieve it, there must be an institutional commitment that precedes the adoption of any educational technology. The software is just an enabler. If a college community collectively commits to true personalization, then technology can help with that. If the community does not make such a commitment, then “personalized learning” software might help achieve other educational ends, but it will not personalize education in the sense that we see here.

I’m going to write a follow-up post about how ESC is using personalized learning software in its context, but you don’t have to wait to find out; you can just watch the second episode of the case study. While you’re at it, you should go back and watch the full ETV episode from which the above clips were excerpted. In addition to watching more great interview content, you can find a bunch of great related links that will let you dig deeper into many of the topics covered in the discussions.

The post Prior Learning Assessments Done Right appeared first on e-Literate.