This week I'm attending the EDEN annual conference in beautiful Budapest. Nelson Jorge, Sofia Dopper and I wrote a paper about the TU Delft Online Learning Experience (OLE). This pedagogical model supports the development of our courses and strives to increase their quality. The creation of the OLE was an important step for TU Delft: it contributes to the development of online courses in a more systematic and consistent way, guiding all course development teams through the realisation of several shared educational principles.
At the Gala Dinner of the conference we received the EDEN 2016 Best Practice Initiative Award for our paper. The award is not just for the paper, but for the whole initiative of designing the model and implementing it in our courses. I see it as a great appreciation of the work we are doing with the TU Delft Extension School.
Online Learning Experience model
The OLE holds 8 principles to support course teams in the design and development of online courses. The model builds on the foundations established by distance learning experts (Moore, 1991; Keegan, 1996; Palloff & Pratt, 1999; Garrison, 2000; Peters, 2000; Anderson, 2003; Garrison & Anderson, 2003; Salmon, 2011; Salmon, 2013; Bates, 2015) and the know-how of the TU Delft Online Learning Course Development Team.
Jorge, Nelson; Van Valkenburg, Willem; Dopper, Sofia (2016). The TU Delft Online Learning Experience: From Theory to Practice. In Teixeira, Szucs & Mazar (Eds.), Conference Proceedings of the EDEN 2016 Annual Conference. ISBN 978-615-5511-10-3. Licensed CC BY 4.0.
Here is the presentation Nelson gave at the conference.
And here is the link to the model explaining all the principles. Below is a video of a teacher's perspective on the Online Learning Experience.
After a European tender procedure Delft University of Technology has selected a new LMS supplier. After 17 years we are saying goodbye to Blackboard and will migrate to the cloud-based platform of the Canadian company Desire2Learn: the Brightspace Learning System. I'm very pleased with the result of our tender based on best value procurement. We have selected a partner that is eager to work with us for the next 10 years, with a product that fits our strategy and is ready for the future. The new platform not only includes the full Brightspace Learning Environment (including ePortfolio and the Learning Repository), but also their full Learning Analytics platform, including their predictive Learning Analytics system.
Before we started our tender we noticed that the traditional tender method of listing all our requirements produced such a long list that no supplier would meet them all, making it hard to select the best solution. That is why we changed to best value procurement. Instead of listing all the requirements, we wrote down the mission, strategy and goals we wanted to reach, together with the conditions (such as a price ceiling). It was then up to the suppliers to use their expertise and know-how to offer us the best solution they could within those conditions.
It also meant that they didn't need to provide us with thick offers. Each offer was limited to 2 pages for the performance substantiation, 2 pages for the risk file, and 2 pages for the opportunity file. In addition to the paperwork, each supplier could send 2 key persons to be interviewed according to a standard list of questions (the first question being: why are you a key person?).
The most interesting part is that in the interviews and dossiers we are looking for relevant dominant data: no marketing talk, but real, measurable data that can be verified. So not "we have done many successful implementations", but "we have done 83 implementations in the last 2 years, of which 79 were on time and within budget; the industry standard is 80%". This also meant that the people who will do the implementation had to be involved during the tender, which really improves the quality of the dossiers (if you involve the right persons).
The grades for the dossiers and interviews are based on a system that starts at 6. With dominant information the grade can go up to 8 or 10, or down to 4 or 2; no dominant data means a 6. These grades are converted into a subtraction from the price. Combining that with the price of the offer leads to a ranking, and the number one goes on to the clarification & verification phase.
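As an illustration of this ranking mechanism, here is a minimal sketch in Python. All supplier names, prices, grades and euro weights below are made up for illustration; the actual tender used its own weighting scheme.

```python
def fictitious_price(offer_price, grades, weights):
    """Subtract a grade-based amount from the offer price.

    Grades run on the 2-10 scale described above: 6 means no dominant
    data, so only deviations from 6 change the fictitious price. The
    weight expresses (hypothetically) what one grade point is worth.
    """
    adjustment = sum(weights[item] * (grades[item] - 6) for item in grades)
    return offer_price - adjustment

# Made-up offers: (offer price in euros, grades per dossier/interview)
offers = {
    "Supplier A": (1_000_000, {"performance": 8, "risk": 6, "interviews": 8}),
    "Supplier B": (950_000, {"performance": 6, "risk": 4, "interviews": 6}),
}
# Made-up euro value of one grade point per assessed item
weights = {"performance": 50_000, "risk": 30_000, "interviews": 40_000}

# Rank suppliers by fictitious price, lowest first; the number one
# proceeds to the clarification & verification phase.
ranking = sorted(
    offers,
    key=lambda s: fictitious_price(offers[s][0], offers[s][1], weights),
)
```

Note how Supplier B's lower offer price does not guarantee the top rank: a below-6 grade on the risk file raises its fictitious price above Supplier A's.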
In the clarification & verification phase we worked together with the team of Desire2Learn to create the plan (a whole list of deliverables) and to verify their dossiers (if that hadn't worked out, or the verification had shown errors, we would have moved on to the supplier ranked second). We are not buying a platform, but a plan to implement their platform. Yesterday we finished this process and provisionally awarded the tender to D2L.
Desire2Learn (D2L) is a Canadian company founded in 1999 that is still run by its founder, John Baker. According to the Ovum decision matrix for selecting an online learning platform:
Brightspace received the highest overall technology assessment score, obtaining at least a top-three rating in all 15 categories. Not unexpectedly, Brightspace received a perfect score for student performance and retention. D2L offers analytics-driven progress monitoring capabilities from within Brightspace, and in 2012 the company partnered with IBM to deliver the Smarter Education Solution, which incorporates an intervention management system and predictive analytics. Although D2L is ahead of other OLP providers when it comes to integrated analytics – and in particular predictive analytics – the company upheld its promise to drive successful learning outcomes and its reputation for providing an open learning platform that can easily integrate with other education technologies by partnering with IBM. IBM is more attuned to predictive models and data systems, and together the two companies can help institutions leverage student data in meaningful ways. Separately, D2L also achieved a perfect score for accessibility. Its accessibility program is integrated into its R&D lifecycle, and designs are regularly reviewed with its Accessibility Interest Group, which demonstrates its commitment to this category. D2L combines all of its capabilities with impressive training and support services, and a high-touch approach to customer engagement. For example, D2L has designed custom training sessions at the request of some of its customers to help institutions learn more about topics such as accessibility. Ovum anticipates that as the industry moves into the next phase of OLP purchasing, vendors with strong support services around its solutions will be particularly appealing.
Although at its core D2L is a technology provider, it also has a strong focus on pedagogy and how enhanced learning experiences can help address the skills gap when students move on to employment. As a result, D2L received the highest score for the capacity to support next-generation online teaching and learning. The Brightspace platform moves away from a one-size-fits-all approach and is instead highly personalized to meet differing student needs. Furthermore, D2L was ahead of its competitors in addressing demand for competency-based learning and adaptive learning.
Ovum recommends that as a market leader, Brightspace by D2L should be included in an institution's list of OLPs. Moving from managing to improving learning, Brightspace meets the core functionality criteria defined in this ODM, and although its brand awareness could be stronger in certain regions it is certainly strong in North America and among its competitors. The company is continuously evolving its offerings to meet the needs of the higher education market.
Implementation & Migration
So after all the formal and legal work around the tender, we can now start the actual work. We have formed a great team of people from D2L and TU Delft that will run the project under the project management of Erna Kotkamp. We are very lucky to have someone like Erna in our team. With her passion, drive, skills and eye for detail, I'm convinced this will be a successful project that will give our lecturers and students a platform for the next ten years.
Update 30 June: The stand-still period has ended and the contract has been signed.
Yesterday the JRC IPTS published the report OpenCases: Case Studies on Openness in Education, the final outcome of the OpenCases study. The study was carried out by the IPTS in collaboration with the University of Bath as part of the OpenEdu project. The report is based on interviews with people with inside knowledge of their organisations. It covers 9 cases of organisations/projects around Europe:
France Université Numérique (French MOOC platform)
OER Universitas (OERu)
Universidad Carlos III de Madrid (UC3M)
Open AGH E-Textbooks
Virtual University of Bavaria (BVU)
I was one of the interviewees for the TU Delft case and think they have done a great job in composing the report and our case.
On Thursday I presented at the Qualtrics Live Event in Amsterdam. I was asked to present about our MOOCs as inspiration for the other participants (about 25). At the end of the presentation I got the question what we are doing with Qualtrics ourselves. Although I gave the presentation, I'm not the one handling our Qualtrics activities, so it was a good thing that Sara and Jan-Paul had joined me at the event!
Use of Qualtrics
We use Qualtrics in 5 different ways in our MOOCs.
In all our online courses we have pre-, mid- and post-surveys. These surveys are mostly the same, although there are some custom questions per course: before the surveys are added to a course, we ask the course team if they have any specific questions to add. In total we have more than 100,000 responses to these surveys.
The second category is surveys for course teams to gather data for their research. Usually the questions in these surveys are related to the topic of the course. This is a fast and cheap way to collect data from a very international group of learners. For example, the course team of the Framing course included a survey where participants were asked to respond to a certain 'frame'. Their interest was in finding differences depending on the cultural backgrounds of the learners.
As an improvement of the edX Quiz module
The edX Quiz module is rather basic and lacks the advanced logic that Qualtrics has to create custom paths through a survey. Because we pass the user id of the edX platform to the specific survey response, we can link a learner's response to their other results and activities in the course.
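A minimal sketch of that linking step, with made-up user ids and field names (in reality the edX user id is stored with each Qualtrics response; nothing here reflects our actual data model):

```python
# Hypothetical survey responses, each carrying the edX user id
survey_responses = [
    {"user_id": "u123", "q_motivation": "career", "q_path": "advanced"},
    {"user_id": "u456", "q_motivation": "curiosity", "q_path": "basic"},
]

# Hypothetical platform results keyed by the same user id
platform_results = {
    "u123": {"grade": 0.82, "certified": True},
    "u456": {"grade": 0.10, "certified": False},
}

# Join each response to the learner's grade and certificate status,
# keeping only responses we can match to a platform record.
joined = [
    {**response, **platform_results[response["user_id"]]}
    for response in survey_responses
    if response["user_id"] in platform_results
]
```

The same join, scaled up, is what lets us relate survey answers to actual course behaviour instead of analysing the two in isolation.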
On our website and in direct emails to our learners we use marketing surveys to get more insights about our learners. We even offer a professional education course about this, starting in a couple of days.
Support surveys for our learning interventions
Our research team not only analyses the data, but also runs learning interventions in some of our MOOCs. Around these interventions they use surveys to get additional information from the learners. One of the interventions was a learner tracker, which the research team presented at the Learning Analytics for Learners workshop last month in Edinburgh (paper).
Dan Davis, Guanliang Chen, Ioana Jivet, Claudia Hauff, Geert-Jan Houben (2016). Encouraging Metacognition & Self-Regulation in MOOCs through Increased Learner Feedback. In Learning Analytics and Knowledge 2016 Learning Analytics for Learners Workshop. [ Bibtex ]
Our surveys include questions that are similar to those of the UW research.
If we look at the survey data (for 10 courses), the results are:
80% of both groups of students (developing and developed countries) reported they had taken an online course before;
12% of students from developed countries that had taken an online course before never completed any course, compared to 16% from developing countries, which means that
88% of students from developed countries that had taken an online course before state they completed at least one course, compared to 84% of students from developing countries.
Here we see that many students in both groups actually completed at least one course, but the overall percentages are still slightly "in favour" of students from developed countries.
If we look at the platform data (example of one course):
The average grade of students from developing countries is 3.73% vs. 5.55% for developed countries, and the passing rate is 2.93% vs. 5.26%.
There are more students from developing countries that hadn't really started the MOOC (i.e. had a grade of 0), 91% vs. 87%, but even among those that did start (grade > 0), the average grade is lower (36% vs. 40%), as are the passing rates (32% vs. 47%).
Only when you look at people who received a certificate is the average grade basically the same.
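The comparisons above boil down to simple per-group aggregation over the platform data. A sketch with synthetic learner records (the groups, grades and pass mark are invented, not our actual data):

```python
# Synthetic learner records: country group plus final grade (0.0-1.0)
learners = [
    {"group": "developing", "grade": 0.0},
    {"group": "developing", "grade": 0.36},
    {"group": "developed", "grade": 0.0},
    {"group": "developed", "grade": 0.55},
]

PASS_MARK = 0.6  # hypothetical course pass mark

def stats(group):
    """Average grade and passing rate for a group, overall and among
    learners who actually started (grade > 0)."""
    grades = [l["grade"] for l in learners if l["group"] == group]
    started = [g for g in grades if g > 0]
    return {
        "avg_grade": sum(grades) / len(grades),
        "passing_rate": sum(g >= PASS_MARK for g in grades) / len(grades),
        "avg_grade_started": sum(started) / len(started) if started else 0.0,
    }
```

Filtering on `grade > 0` is the step that separates "registered but never started" from genuine participants, which is exactly why the started-only averages above differ from the overall ones.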
Because Sara is very good, she also had some comments about the shortcomings of the original research:
It tries to compare the result of their survey about ever completing any course to per-course completion rates. While the researchers acknowledge that they don't really have the actual data for comparison, it is still misleading to mention the two side by side, because in reality it doesn't tell us anything: they have no idea how many students in developed countries have completed at least one course.
The comparison is even more problematic because of the very different sources of information. While the completion rates are actual, true numbers, the survey is an estimate that can be heavily biased. We know that students who receive a certificate in the end are more likely to complete even the pre-survey (compared to the actual passing rates), which may also be true in this case, i.e. more engaged students completing their survey.
Also, they compare their "completion" to per-course "completion rates". But our completion rates are actually certification rates, while their "completion" means completing the course without necessarily receiving a certificate. Furthermore, students may understand very differently what it means to complete a course, possibly connected to their intentions. We have no idea how many students in our courses would say they completed the course. So this is another reason why these numbers can hardly be compared to the regular "completion rates".
On a per-course basis, the number of "registrants" is rather high (usually only around 50% of students do anything in our courseware), which is very far from the 2% of "registrants" they identified in their sample. This further shows that their completion numbers and per-course completion numbers are hardly comparable.
Their study was conducted only on people between 18 and 35 years old. As we know, there are many students above 30 in our courses (30 is usually the median age), so this is hardly representative of all MOOC users.
The researchers based their conclusions solely on self-reported survey data, but tried to compare their result to actual per-course completion rates, which creates a false sense that students from developing countries actually complete more courses than their peers from developed countries. While the high completion rates among students from developing countries may still be a surprise, it is important to keep in mind what we are actually looking at. Both our platform data and our survey data revealed that somewhat more students from developed countries complete courses or receive a certificate. In most of our research we combine the survey data with the platform data to get more accurate results and less user bias.
Two weeks ago, at the OE Global Conference in Krakow, I was re-elected by the members of the consortium. It is an honour to serve on the board of directors. Sophie Touzé was also re-elected, and we welcomed four new board members:
Papa Youga Dieng, Organisation internationale de la Francophonie
Barbara Illowsky, Community College Consortium for Open Educational Resources
Allyn Radford, Corporate member
Naoko Tosa, Japan OCW Consortium
The new board members show the international character of our board: we have board members from all over the world, as has been the case since the start of the consortium. Last year we formalised this in the by-laws:
The Consortium desires a strong, internationally diverse board of directors. Election results may be weighted to ensure representation from different regions of the world. In this case, the weighting of results shall be set by the nominating committee and disclosed to members in advance. Source: OEC By-laws
The Open Education Consortium has more than 250 members from 45 countries (at the conference we had participants from 38 countries):
I can recommend any organisation (not only universities) that supports open education to join the consortium. The fees are moderate and you join a global network of educational institutions, individuals and organizations that support an approach to education based on openness, including collaboration, innovation and the collective development and use of open educational materials. Our activities include:
Networking and community development
Advocacy and advising
Capacity building and training
Want to join? You can contact me or one of the other board members, or contact the OEC staff.
As part of the Opening Up Europe initiative the European Commission has started the OpenEdu project at its own research centre IPTS in Seville. At #OEGlobal, Andreia Inamorato dos Santos gave an insightful presentation about the results of the project. Many of the reports will be published this year.
Defining Open Education
Andreia started with defining Open Education. The definition they adopted is focusing on removing barriers:
This definition is very broad, but I think it has the right focus:
not only digital technologies, although that is the most common form;
learning that is not only accessible, but also abundant and customisable;
not only formal, but also non-formal education, and bridging the two.
An important part of the project is the OpenEdu Framework, which should support higher education institutions in adopting and implementing Open Education. The framework is built on 6 core dimensions and 4 transversal dimensions. For each dimension of open education, the framework provides a definition, a rationale and components with descriptors.
6 Core: access, content, pedagogy, recognition, collaboration, research
For the maturity of Open Education it is important that we get more solid research evidence about open education and its value for policy makers, instructors, and students. In my previous blog I already mentioned the Open Education Group; another group of researchers doing great work is the OER Hub of the Open University. At the OEGlobal conference I attended a session of this group about the Open Research Agenda.
This activity is focused on forming a better understanding of research needs in open education. To do this, they published a survey (please submit!) and did sessions at #OEGlobal and #OER16 to collect information about research priorities. The results will be shared in the form of a report.
The researchers of the OER Hub will not take on all the research questions mentioned in the report; any researcher can work on the items in the agenda. The more the better, I would say!
Normally the target audience is teachers who are OER novices but are interested in help with their course design, so the workshop also introduces them to Open Education. Below are the slides we used during the workshop. This workshop will be part of the offerings of our training programme for instructors.