Today we reached a big milestone in our MOOC activities: one million enrollments! Just before Christmas 2012 I was asked to start a small project to develop four MOOCs. On September 15th, 2013, the first two DelftX MOOCs started on the edX platform. Now, three years later, we have developed 36 MOOCs (with more in the pipeline). I'm proud of the result and the impact this project has had.
With the development of the first courses we were figuring out the process as we went. That was truly an adventure; nowadays it is a fully organised and supported process. For most course teams it is a first-time experience, and if they listen to our advice and tips it will be an easy, but still intensive, process. Of course not all course teams listen (they're academics) and that keeps us busy ;-). More than 80 lecturers have been involved in one or more MOOCs. And don't forget all the students who assisted the lecturers in developing the courses.
In these three years we have grown from a small support team of four (Janine, Mark, Gijs and me) to the Extension School support team of more than 20 people. That group is not just supporting MOOCs, but also OpenCourseWare, Professional Education, and online and blended education.
To celebrate this milestone we have created an infographic. You can download it from our website. The data from three years of MOOCs is impressive:
1M course enrollments by 699,014 learners from 229 countries, aged between 8 and 94 (the average age is 29). These learners have watched 13,824,919 minutes of video (26 years and 104 days). We have issued 28,739 certificates. The average pass rate for the verified certificates is 69% (highest: 81.5%). More than 100k surveys were submitted. Our learners rate our MOOCs an 8 out of 10 on average. Some courses even score above 9.
When you start an initiative it is always great to receive external recognition for your work. And we did:
The small MOOC project has been a tremendous success for TU Delft. The impact is much bigger than anticipated at the start. And we will continue to develop our MOOCs in both quantity and quality. Below is a graphical view of the enrollments per course.
An interesting paper by George Veletsianos (Royal Roads University), Justin Reich (MIT), and Laura Pasquini (University of North Texas and Royal Roads University) was just published in the journal AERA Open (American Educational Research Association). The paper focuses on the activities of MOOC learners that can't be captured in tracking logs.
The authors interviewed 92 learners from around the world, of different ages and genders. The learners participated in four HarvardX MOOCs (the authors acknowledge that these learners might not be very representative). Their research revealed three domains of the MOOC learner experience that you can't see in the tracking logs.
Some of the findings:
Learners work at workstations that include not only computers but also notebooks, paper printouts, reference books, additional devices, and other people.
Students' online activities extend beyond the MOOC platform to a variety of reference resources and online social networks that support student learning. Whereas many MOOCs are designed as a comprehensive learning experience, students appear to treat them as a single node in a broader network of learning opportunities.
MOOC learning takes place in a broader learner world. This world goes beyond workstations, MOOC platforms, and online spaces, and it is a world in which students negotiate for time across multiple competing commitments.
I agree with the authors that we should be more aware of what learners are doing outside the platform. I do question whether the HarvardX MOOCs are a good representation of current MOOC course designs. I personally have the feeling they are still content-centered and not focused on the learning experience.
Veletsianos, G., Reich, J., & Pasquini, L. A. (2016). The Life Between Big Data Log Events: Learners' Strategies to Overcome Challenges in MOOCs. AERA Open, 2(3). DOI: 10.1177/2332858416657002
Yesterday I attended the policy forum on European MOOCs organised by the HOME project. This was the third HOME project event I attended, after Porto and Rome. In 22 presentations we got a good overview of MOOC policy at the European, governmental and institutional levels. The presenters submitted 19 papers.
Last week EADTU published the report MOOCs in Europe. This report is an overview of papers representing a collective European response to MOOCs, as presented during the HOME conference in Rome in November 2015. It was published as part of the project HOME (Higher education Online: MOOCs the European way). The project ends this month with a policy meeting on the 28th of June in Brussels.
The report includes 31 papers in 6 categories:
Part 1: Regional MOOC initiatives (3 papers)
Part 2: Role of media exposure on MOOC development (2)
Part 3: Supporting the selection of MOOC platforms (8)
Part 4: Business models for European MOOCs (5)
Part 5: Pedagogical approaches in European MOOCs (8)
Part 6: Shared services in European MOOC context (5)
It is interesting to see the diversity of authors who have written the papers. They represent institutes in 18 countries (Belgium, Denmark, Finland, Greece, Hungary, Iceland, Ireland, Israel, Italy, Korea, Lithuania, Netherlands, Poland, Portugal, Romania, Spain, Turkey and UK), although some large European countries (France and Germany) are not included.
Jansen, D., & Konings, L. (2016). MOOCs in Europe. ISBN 978-90-79730-19-3
This week I'm attending the EDEN annual conference in beautiful Budapest. Nelson Jorge, Sofia Dopper and I wrote a paper about the TU Delft Online Learning Experience (OLE). This is a pedagogical model that supports the development of our courses and strives to increase their quality. The creation of the OLE was an important step for TU Delft, contributing to the development of online courses in a more systematic and consistent way and guiding all course development teams through the realisation of several shared educational principles.
At the Gala Dinner of the conference we received the EDEN 2016 Best Practice Initiative Award for our paper. The award is not just for the paper, but for the whole initiative of designing the model and implementing it in our courses. I see it as a great appreciation of the work we are doing with the TU Delft Extension School.
Online Learning Experience model
The OLE comprises eight principles that support course teams in the design and development of online courses. The model was elaborated based on the foundations established by distance learning experts (Moore, 1991; Keegan, 1996; Palloff & Pratt, 1999; Garrison, 2000; Peters, 2000; Anderson, 2003; Garrison & Anderson, 2003; Salmon, 2011; Salmon, 2013; Bates, 2015) and the know-how of the TU Delft Online Learning Course Development Team.
Jorge, N., Van Valkenburg, W., & Dopper, S. (2016). The TU Delft Online Learning Experience: From Theory to Practice. In Teixeira, Szucs & Mazar (Eds.), Conference Proceedings, EDEN 2016 Annual Conference. ISBN 978-615-5511-10-3. Licensed CC BY 4.0.
Here is the presentation Nelson gave at the conference.
And here is the link to the model, explaining all the principles. Below is a video with the teachers' perspective on the Online Learning Experience.
After a European tender procedure, Delft University of Technology has selected a new LMS supplier. After 17 years we are saying goodbye to Blackboard and will migrate to the cloud-based platform of the Canadian company Desire2Learn: the Brightspace Learning System. I'm very pleased with the result of our tender, which was based on best value procurement. We have selected a partner that is eager to work with us for the next 10 years, with a product that fits our strategy and is ready for the future. The new platform not only includes the full Brightspace Learning Environment (including ePortfolio and the Learning Repository), but also their full Learning Analytics platform, including their predictive Learning Analytics system.
Before we started our tender, we noticed that the traditional method of listing all our requirements produced such a long list that no supplier would meet all of them, which would make selecting the best solution hard. That is why we switched to best value procurement. Instead of listing all the requirements, we wrote down our mission, strategy and the goals we wanted to reach, together with the conditions (including a price ceiling). It was then up to the suppliers to use their expertise and know-how to offer us the best solution they could within those conditions.
It also meant that they didn't need to provide thick offers. Each offer was limited to 2 pages for the performance substantiation, 2 pages for the risk file, and 2 pages for the opportunity file. In addition to the paperwork, each supplier could send 2 key persons to be interviewed according to a standard list of questions (the first question being: why are you a key person?).
Most interesting is that in the interviews and dossiers we look for relevant dominant data: no marketing talk, but real, measurable data that can be verified. So, for example, not "we have done many successful implementations", but "we have done 83 implementations in the last 2 years, of which 79 were on time and within budget; the industry standard is 80%". This also meant that the people who will do the implementation needed to be involved during the tender, and that really improves the quality of the dossiers (if you involve the right people).
The grades for the dossiers and interviews are based on a system that starts at 6. Dominant information can move a grade up to 8 or 10, or down to 4 or 2; no dominant data means a 6. These grades are converted into a deduction from the offered price, and combining that with the price of the offer leads to a ranking. The number one goes on to the clarification & verification phase.
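To make the mechanics concrete, here is a minimal sketch of this kind of best-value ranking. The weights, the maximum deduction, and the exact grade-to-deduction formula are illustrative assumptions, not the actual formula used in our tender:

```python
# Hypothetical sketch of best-value-procurement ranking.
# Grades use the 2-10 scale described above, where 6 is neutral
# (no dominant data); weights and max_deduction are made-up inputs.

def evaluation_price(offer_price, grades, weights, max_deduction):
    """Convert quality grades into a fictitious price deduction
    and subtract it from the offered price."""
    # Weighted average grade across dossiers and interviews.
    avg = sum(g * w for g, w in zip(grades, weights)) / sum(weights)
    # Map the scale onto a deduction: a 6 yields no deduction,
    # a perfect 10 yields the maximum deduction (negative grades
    # below 6 would effectively raise the evaluation price).
    deduction = max_deduction * (avg - 6) / 4
    return offer_price - deduction

offers = {
    "Supplier A": evaluation_price(900_000, [8, 6, 8], [2, 1, 1], 200_000),
    "Supplier B": evaluation_price(850_000, [6, 6, 6], [2, 1, 1], 200_000),
}
ranking = sorted(offers, key=offers.get)  # lowest evaluation price wins
```

In this toy example the cheaper offer loses: Supplier A's higher grades buy down its evaluation price below Supplier B's, so A moves on to clarification & verification.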
In the clarification & verification phase we worked together with the Desire2Learn team to create the plan (a whole list of deliverables) and to verify their dossiers (if that had not worked out, or the verification had shown errors, we would have moved on to the supplier ranked second). We are not buying a platform, but a plan to implement that platform. Yesterday we finished this process and provisionally awarded the tender to D2L.
Desire2Learn (D2L) is a Canadian company founded in 1999 that is still run by its founder, John Baker. According to the Ovum decision matrix for selecting an online learning platform:
Brightspace received the highest overall technology assessment score, obtaining at least a top-three rating in all 15 categories. Not unexpectedly, Brightspace received a perfect score for student performance and retention. D2L offers analytics-driven progress monitoring capabilities from within Brightspace, and in 2012 the company partnered with IBM to deliver the Smarter Education Solution, which incorporates an intervention management system and predictive analytics. Although D2L is ahead of other OLP providers when it comes to integrated analytics – and in particular predictive analytics – the company upheld its promise to drive successful learning outcomes and its reputation for providing an open learning platform that can easily integrate with other education technologies by partnering with IBM. IBM is more attuned to predictive models and data systems, and together the two companies can help institutions leverage student data in meaningful ways. Separately, D2L also achieved a perfect score for accessibility. Its accessibility program is integrated into its R&D lifecycle, and designs are regularly reviewed with its Accessibility Interest Group, which demonstrates its commitment to this category. D2L combines all of its capabilities with impressive training and support services, and a high-touch approach to customer engagement. For example, D2L has designed custom training sessions at the request of some of its customers to help institutions learn more about topics such as accessibility. Ovum anticipates that as the industry moves into the next phase of OLP purchasing, vendors with strong support services around their solutions will be particularly appealing.
Although at its core D2L is a technology provider, it also has a strong focus on pedagogy and how enhanced learning experiences can help address the skills gap when students move on to employment. As a result, D2L received the highest score for the capacity to support next-generation online teaching and learning. The Brightspace platform moves away from a one-size-fits-all approach and is instead highly personalized to meet differing student needs. Furthermore, D2L was ahead of its competitors in addressing demand for competency-based learning and adaptive learning.
Ovum recommends that as a market leader, Brightspace by D2L should be included in an institution's list of OLPs. Moving from managing to improving learning, Brightspace meets the core functionality criteria defined in this ODM, and although its brand awareness could be stronger in certain regions it is certainly strong in North America and among its competitors. The company is continuously evolving its offerings to meet the needs of the higher education market.
Implementation & Migration
So after all the formal and legal work around the tender, we can now start the actual work. We have formed a great team of people from D2L and TU Delft that will run the project under the project management of Erna Kotkamp. We are very lucky to have someone like Erna on our team. With her passion, drive, skills and eye for detail, I'm convinced this will be a successful project that will give our lecturers and students a platform for the next ten years.
Update 30 June: the stand-still period has ended and the contract is signed.
Yesterday the JRC IPTS published the report OpenCases: Case Studies on Openness in Education, the final outcome of the OpenCases study. The study was carried out by the IPTS in collaboration with the University of Bath as part of the OpenEdu project. The report is based on interviews with people with inside knowledge of their organisations. It contains 9 cases of organisations and projects around Europe:
France Université Numérique (French MOOC platform)
OER Universitas (OERu)
Universidad Carlos III de Madrid (UC3M)
Open AGH E-Textbooks
Virtual University of Bavaria (BVU)
I was one of the interviewees for the TU Delft case, and I think they have done a great job composing the report and our case.
On Thursday I presented at the Qualtrics Live Event in Amsterdam. I was asked to present our MOOCs as inspiration for the other participants (about 25). At the end of the presentation I got the question of what we are doing with Qualtrics. Although I gave the presentation, I'm not the one handling our Qualtrics activities; it was a good thing that Sara and Jan-Paul had joined me at the event!
Use of Qualtrics
We use Qualtrics in 5 different ways in our MOOCs.
In all our online courses we have pre-, mid- and post-surveys. These surveys are mostly the same, although there are some custom questions per course. Before the surveys are added to the course, we ask the course teams if they have any specific questions to ask. In total we have more than 100,000 responses to these surveys.
The second category is surveys for course teams to collect data for their research. Usually these questions are related to the topic of the course. This is a fast and cheap way to collect data from a very international group of learners. For example, the course team of the Framing course included a survey in which participants were asked to respond to a certain 'frame'. Their interest was in finding differences depending on the cultural backgrounds of the learners.
As an improvement of the edX Quiz module
The edX Quiz module is rather basic and lacks the advanced logic that Qualtrics has to create custom paths through a survey. Because we link the user ID of the edX platform to the specific survey response, we can connect learners' responses to their other results and activities in the course.
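Once the platform user ID travels with each survey response, joining the two data sets is a straightforward merge. The sketch below shows the idea with pandas; the column names and the tiny data frames are illustrative assumptions, not our actual export schema:

```python
import pandas as pd

# Hypothetical sketch: join Qualtrics survey responses to edX platform
# results on the learner ID embedded in the survey response.
surveys = pd.DataFrame({
    "user_id": [101, 102, 103],
    "survey_path": ["A", "B", "A"],  # branch taken via Qualtrics logic
})
platform = pd.DataFrame({
    "user_id": [101, 102, 104],
    "grade": [0.85, 0.40, 0.10],
})

# Inner join keeps only learners present in both data sets, so each
# survey response is paired with that learner's course results.
linked = surveys.merge(platform, on="user_id", how="inner")
```

An inner join is the conservative choice here: learners who answered the survey but never appear in the course results (or vice versa) simply drop out of the linked set.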
On our website and in direct emails to our learners we use marketing surveys to get more insights about our learners. We even offer a professional education course about this, starting in a couple of days.
Support surveys for our learning interventions
Our research team not only analyses the data, but also runs learning interventions in some of our MOOCs. Around these interventions they use surveys to get additional information from the learners. One of the interventions was a learner tracker. The research team presented this at the Learning Analytics for Learners workshop last month in Edinburgh (paper).
Dan Davis, Guanliang Chen, Ioana Jivet, Claudia Hauff, Geert-Jan Houben (2016). Encouraging Metacognition & Self-Regulation in MOOCs through Increased Learner Feedback. In Learning Analytics and Knowledge 2016 Learning Analytics for Learners Workshop. [ Bibtex ]
Our surveys include questions that are similar to those of the UW research.
If we look at the survey data (for 10 courses), the results are:
80% of students in both groups (developing and developed countries) reported they had taken an online course before;
12% of students from developed countries who had taken an online course before never completed any course, versus 16% from developing countries; which means that
88% of students from developed countries who had taken an online course before state they completed at least one course, versus 84% from developing countries.
Here we see that many students in both groups actually completed at least one course, but the overall percentages are still slightly "in favour" of students from developed countries.
If we look at the platform data (example of one course):
The average grade of students from developing countries is 3.73% vs. 5.55% for developed countries, and the passing rate is 2.93% vs. 5.26%.
There are more students from developing countries who hadn't really started the MOOC (i.e. had a grade of 0), 91% vs. 87%, but even among those who did (grade > 0), the average grade is lower (36% vs. 40%), as is the passing rate (32% vs. 47%).
Only when you look solely at people who received a certificate is the average grade basically the same.
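The slicing behind these numbers (share of non-starters, and averages conditioned on having started) can be sketched as follows. The figures in this toy data frame are made up; only the conditioning logic mirrors the analysis above:

```python
import pandas as pd

# Synthetic sketch of the per-course breakdown: split learners by
# region, then compare (a) the share who never started (grade == 0)
# and (b) the average grade among those who did start (grade > 0).
df = pd.DataFrame({
    "region": ["developing"] * 5 + ["developed"] * 5,
    "grade":  [0.0, 0.0, 0.0, 0.0, 0.30,
               0.0, 0.0, 0.0, 0.40, 0.55],
})

# Share of learners per region with grade exactly 0 (never started).
never_started = df.groupby("region")["grade"].apply(lambda g: (g == 0).mean())

# Average grade per region, restricted to learners who started.
started = df[df["grade"] > 0]
avg_if_started = started.groupby("region")["grade"].mean()
```

Conditioning on `grade > 0` matters: a group with many non-starters can look far weaker on the unconditional average even when its active learners perform similarly.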
Sara, thorough as she is, also had some comments about shortcomings of the original research:
It tries to compare the result of their survey about ever completing any course to per-course completion rates. While the authors acknowledge that they don't really have the actual data for comparison, it is still misleading to mention the two alongside each other, because in reality it doesn't tell us anything. They have no idea how many students in developed countries in general have already completed at least one course.
The comparison is even more problematic because of the very different sources of information. While the completion rates are actual, true numbers, the survey is an estimate that can be heavily biased. We know that students who end up receiving a certificate are more likely to complete even the pre-survey (compared to the actual passing rates), which may also be true in this case, i.e. more engaged students completing their survey.
Also, they compare their "completion" to per-course "completion rates". But our completion rates are actually certification rates, while their "completion" means completing the course without necessarily receiving a certificate. Furthermore, students may interpret very differently what it means to complete a course, possibly depending on their intentions. We have no idea how many students in our courses would say they completed the course. So this is another reason why these numbers can hardly be compared to regular "completion rates".
On a per-course basis, their number of "registrants" is rather high (usually only around 50% of students do anything in our courseware), which is very far from the 2% of "registrants" they identified in their sample. This further shows that their completion numbers and our course completion numbers are hardly comparable.
Their study covers only people between 18 and 35 years old. As we know there are many students above 30 in our courses (30 is usually the median age), this is hardly representative of all MOOC users.
The researchers based their conclusion solely on self-reported survey data, but tried to compare their result to actual per-course completion rates, which creates a false impression that students from developing countries actually complete more courses than their peers from developed countries. While the high completion rates among students from developing countries may still be a surprise, it is important to keep in mind what we are actually looking at. Both our platform data and our survey data show that somewhat more students from developed countries complete courses or receive a certificate. In most of our research we combine survey data with platform data to get more accurate results and less user bias.