Programmatic assessment in vocational secondary education

In vocational secondary education (MBO) there is growing interest in programmatic assessment and “dialogue-driven” learning, and we notice that many ROCs and other MBO institutions are thinking about how to give this shape. We expect that this form of ‘programmatic assessment’, or programmatic feedback, will eventually also be applied in secondary schools and even in primary education. After all, it is a way to organize learning flexibly while keeping a good overview of the learning process, for the student as well as for teachers and supervisors.

What is programmatic assessment?

Programmatic assessment can basically be thought of as capturing the educational process in data points that give you a much better understanding of the learning process. But it is much more than that: it is about continuous assessment, or rather continuous feedback, that supports and reinforces learning. Feedback and testing become an integral part of learning. That feedback can be formative, focused on the feedback on an activity just performed, or summative, which is more evaluative in nature. In programmatic assessment it is important to strike a good balance between what and how much is measured, the instruments used and the forms of assessment employed.
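To make the idea of “data points” a little more concrete, here is a minimal sketch of how a single feedback moment might be modelled. The field names and the formative/summative flag are illustrative assumptions for this example, not the Scorion data model.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Purpose(Enum):
    FORMATIVE = "formative"    # feedback on an activity just performed
    SUMMATIVE = "summative"    # evaluative, counts toward a decision

@dataclass
class DataPoint:
    """One captured learning moment (names are illustrative, not Scorion's schema)."""
    student: str
    activity: str              # e.g. a workplace assignment or test
    instrument: str            # e.g. "rubric", "observation", "peer feedback"
    purpose: Purpose
    feedback: str              # narrative feedback given right after the activity
    recorded_on: date = field(default_factory=date.today)

# A portfolio is then simply the growing collection of such data points,
# which can later be aggregated to show progress over time.
portfolio: list[DataPoint] = [
    DataPoint("student-1", "wound care assignment", "observation",
              Purpose.FORMATIVE, "Good technique; pay attention to the hygiene protocol."),
]
```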

Programmatic assessment is increasingly also referred to as programmatic teaching, programmatic feedback or flexible teaching. The founder of programmatic assessment is Prof. Cees van der Vleuten, who has done pioneering work in this field over the past 25 years. The nice thing is that the Scorion platform, now the most widely used platform in the Netherlands for programmatic assessment and flexible, personalized teaching, was developed in the same period. At about the time Prof. Van der Vleuten delivered his 1996 inaugural address ‘Beyond intuition’, in which the concept was first presented publicly, we at Parantion started thinking about how a learning process could be ‘visualized’ using data. The metaphor below is widely used by Prof. Van der Vleuten to explain the concept.

We have outlined what programmatic assessment is, but there is of course much more to say, including about how it can be applied in vocational education. For more information we refer you to the platform ‘leren van toetsen‘.

The origins of Programmatic Assessment

The programmatic assessment methodology has its origins in medical education, and this is no coincidence. In medical education it is essential that the skills students are taught are also practiced well in the workplace; a medical error can have major consequences. Moreover, it is very difficult for an educational institution to properly assess workplace learning, which often takes place outside the educator’s immediate field of vision. Given how many skills are taught during medical training, you can imagine how important it is to have a good view of the qualitative progress of each student’s development.

Years ago, just about every health education program already recorded a lot of data, in paper booklets, both during training and during internships and continuing education. Dozens of paper forms often had to be filled out per internship. A student, if he or she was lucky and the portfolio was not lost, might see feedback on an assignment recorded in the portfolio only after six months. This was not only very demotivating but, more importantly, too late: feedback is especially meaningful when it is given right after the activity. And since a classic final grade says very little about everything you have learned, it became important to start recording the actual learning information.

This “dialogue-driven learning”, in which learning and behavioral outcomes (based on feedback) form the basis for student development, had long been the starting point for the development process in health education at all levels. The only problem was that all those paper portfolios were difficult to analyse and process. Digitization now offers a huge opportunity: feedback and valuable learning moments are captured in data points, and this data makes it possible to give the student, as well as supervisors, teachers and the program, insight into the learning process.

Isn’t programmatic assessment a lot of work?

Initially, it was quite complicated to record the learning process itself. There was no software that could do this, nor did the paper booklets offer a solution. The first generation of software programs brought a lot of extra administration and form-filling, and it was indeed a lot of extra work for teachers and tutors.

Now that we are several years down the road, assessment and feedback tools have become fully integrated into the learning-work process. Feedback can be recorded via a smartphone, for example, and a teacher can also record ‘unsolicited’ feedback. The app increasingly follows, as it were, the learning process you are currently engaged in.

It is also important to choose the number of measurement moments and the feedback instruments carefully; after all, you don’t have to record everything. The dashboard with the information you need to determine whether someone is on track is an important indicator of which data points you need. Moreover, much of the work and responsibility lies with the student. This is possible because a good e-portfolio gives the student a very clear picture of the curriculum and the steps he or she needs to take. The learning benefit for the student is greater, and it is less work for supervisors.
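As a toy illustration of how a dashboard can drive which data points you collect, the sketch below counts recorded feedback moments per skill against an assumed minimum. The skills, thresholds and status labels are hypothetical and not Scorion functionality.

```python
from collections import Counter

# Hypothetical example: each skill on the dashboard has a minimum number of
# feedback moments that must be recorded before progress is reviewed.
REQUIRED_MOMENTS = {"communication": 4, "clinical reasoning": 6, "collaboration": 3}

def on_track(recorded_moments: list[str]) -> dict[str, str]:
    """Turn raw data points (here: just the skill each moment covers) into a
    simple dashboard status per skill. Thresholds are illustrative."""
    counts = Counter(recorded_moments)
    return {
        skill: "on track" if counts.get(skill, 0) >= required else "attention needed"
        for skill, required in REQUIRED_MOMENTS.items()
    }

print(on_track(["communication"] * 4 + ["collaboration"] * 2))
# {'communication': 'on track', 'clinical reasoning': 'attention needed',
#  'collaboration': 'attention needed'}
```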

Is programmatic assessment appropriate for every course?

The answer is yes and no. Many programs still operate traditionally, with a lot of classroom teaching and grades; in that case it is often not effective to start assessing programmatically. It works best with personalized learning pathways. But more and more training programs are discovering that it is much more effective to match the learning pace of the individual student and to work with “real” assignments, and then it certainly makes sense to design the curriculum in a programmatic way. Programmatic education really does work differently from traditional, group-based education: it works much more with authentic learning-work assignments or challenges, and some training programs have a completely challenge-based curriculum. That is also one of the reasons it is especially appropriate in secondary vocational education.

How do you implement programmatic assessment?

From time to time a program tells us that they too are getting started with “programmatic assessment” and would like to begin within two months. This is possible, provided the educational concept has already been well thought out, but usually a number of design sessions come first.

These sessions are in particular about which forms of assessment are best suited to which types of assignments, and about how you, as an educator, know whether someone is on track. The latter is very important for both the student and the program. You also need to ask how so-called “entrustment decisions” can be made, in other words how to determine whether, and at what level, someone has mastered a skill. That, in turn, is important in order to adjust where it matters. Hence, when designing a programmatic curriculum, we often start at the “output side”: the desired dashboard. Once it is clear what information students, teachers and supervisors need in order to see whether someone is on track, you also know what information needs to be collected from the program. Based on the curriculum, the forms, feedback processes, tests, peer feedback and any of the other 25 to 30 ways of collecting data points are then set up in the system.
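To make the idea of basing an entrustment decision on many low-stakes data points a bit more tangible, here is a minimal sketch in which rubric scores are aggregated into a supervision level. The scale, thresholds and minimum number of data points are assumptions for illustration, not a prescribed standard.

```python
from statistics import mean

# Hypothetical entrustment scale, loosely inspired by the supervision levels
# used in medical education; thresholds and labels are assumptions.
LEVELS = [(4.5, "may perform unsupervised"),
          (3.5, "supervision on request"),
          (2.5, "direct supervision required"),
          (0.0, "may only observe")]

def entrustment_decision(scores: list[float], minimum_data_points: int = 5) -> str:
    """Aggregate many low-stakes data points (scores 1-5) into one high-stakes
    decision. Refuse to decide if there is too little evidence."""
    if len(scores) < minimum_data_points:
        return "insufficient data points for a decision"
    avg = mean(scores)
    return next(label for threshold, label in LEVELS if avg >= threshold)

print(entrustment_decision([5, 5, 4, 5, 4]))   # may perform unsupervised
print(entrustment_decision([3, 4]))            # insufficient data points for a decision
```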

What is the role and function of the Scorion e-portfolio in programmatic assessment?

We started developing Scorion, this data-supported portfolio, some 15 years ago. We wanted to capture all meaningful learning moments in order to get a good picture of how someone is really developing. Initially, educational institutions told us this was unworkable, and it was: every measurement moment had to be entered manually, which disrupted the learning process considerably.

Together with the education and professional field, we have developed Scorion further. As an assessment and feedback platform, Scorion is now fully integrated with the learning and work process: you hardly notice the data points being recorded, and Scorion guides students through the curriculum. A personal learning process does not mean that you learn alone. It is quite possible to learn and work together in projects and still follow your own learning path, so the learning process is tailored to the individual. Even with classroom or project-group learning, everyone keeps their own pace.

You could say that in 2008 we developed our first prototype for programmatic assessment: a peer assessment tool built for an educational institution. Students there started working much more on “real” assignments in the real world instead of spending most of their time in school. Very effective, but the teacher lost track of student development. With peer evaluation, students gave each other feedback using rubrics, and it turned out that this produced a good picture of performance.
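As a small, hypothetical illustration of the peer-evaluation idea, the sketch below averages rubric scores from several peers into a performance profile. The criteria, scale and scores are made up for the example and do not reflect the tool that was actually built.

```python
from statistics import mean

# Several peers score a classmate on a few rubric criteria (scale 1-4);
# the averages give an overall picture of performance.
CRITERIA = ["preparation", "collaboration", "quality of work"]

peer_scores = {
    "peer A": {"preparation": 3, "collaboration": 4, "quality of work": 3},
    "peer B": {"preparation": 2, "collaboration": 4, "quality of work": 3},
    "peer C": {"preparation": 3, "collaboration": 3, "quality of work": 4},
}

profile = {c: round(mean(scores[c] for scores in peer_scores.values()), 1)
           for c in CRITERIA}
print(profile)  # {'preparation': 2.7, 'collaboration': 3.7, 'quality of work': 3.3}
```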

Scorion is now one of the few learning platforms that actually supports a personalized learning path through programmatic feedback and provides insight into valuable learning moments in the workplace. Not surprisingly, Scorion is used by many educational institutions, and now also in secondary vocational education, where this teaching method is extremely suitable. This may create an opportunity to make the MBO system more flexible and to provide truly personalized education: a combination of the educational requirements of the qualification files and the individual, personal development of the student. The idea that every student and every program can be served by one standard set of competences may also need a refresh. If we can eventually work toward a data-based certification system, examined by professionals, this could be a big step into the future. A number of pioneers in the MBO have indicated that they are on board.

Want to learn more about our portfolio tool for programmatic assessment in secondary vocational education?