Automatic testing while entering questions

by Sjoerd Op 't Land -
Number of replies: 4

I have no idea how difficult this feature request is, but just dreaming...

I find myself spending quite a lot of time saving a new CodeRunner question, clicking preview, (automatically) filling in the right answer and checking that all tests pass. Would it be possible to automatically test a question upon saving the question? Or with a special button?

In reply to Sjoerd Op 't Land

Re: Automatic testing while entering questions

by Richard Lobb -

Yes, I'd like that too. I tried implementing it a year or two back but gave up. CodeRunner is just a question-type plug-in, so it has to be consistent with the expected behaviour (i.e. that of the generic question type). There is a callback during saving so that the question type can save its own extra data, but by then a whole heap of stuff has already been written to the database, so it's too late. I did look into using the form-data validation callback, but it looked like a lot of work: I think (and my memory's a bit vague here) the problem was that it was too early in the process, so the necessary data structures hadn't been created. Other possible ways of overriding base-class behaviour looked hacky and liable to cause long-term maintenance problems.
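
(For the record, the save-time hook I mean is the question type's save_question_options() callback. Very schematically:)

    class qtype_coderunner extends question_type {
        // Moodle calls this so the question type can save its own extra data,
        // but only after the core question record has already been written.
        public function save_question_options($question) {
            // ... save the CodeRunner-specific fields, test cases, etc. ...
            // Rejecting the question here would leave a half-saved question
            // behind, which is why this hook is too late for a sample-answer check.
        }
    }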

So while it's still vaguely on my "possibly-to-do-sometime" list, it's not about to happen any time soon.

FWIW, some colleagues and I have written question types that run the sample answer to get the expected output, using a per-test template grader. Apart from the inefficiency (which didn't prove a major problem in our applications), it's fragile, because an error in the sample answer can go undetected and totally break the question.

-- Richard

In reply to Richard Lobb

Re: Automatic testing while entering questions

by Tim Hunt -

To me (not understanding how the data structures, etc. work), this definitely should be done in the form validation.

If the expectation is that the teacher-supplied right answer should be graded as 100%, then when that is not the case, the data entered in the form is wrong, and the teacher should correct that before any data is saved. Hence the form needs to be redisplayed with the errors, and this is exactly what validation does.
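
Concretely, I'd expect the check to sit in the standard validation callback of the question editing form. Just as a sketch (the helper method and the language string here are invented):

    public function validation($data, $files) {
        $errors = parent::validation($data, $files);
        if (empty($data['skipsampleanswercheck'])) {   // Hypothetical opt-out checkbox - see below.
            // Hypothetical helper: build the question and grade the sample answer.
            $outcome = $this->test_sample_answer($data, $files);
            if (!$outcome->all_correct()) {
                // Keying the message to a real form field makes Moodle redisplay
                // the form with the error shown beside that field, exactly as for
                // any other validation failure.
                $errors['answer'] = get_string('sampleanswerfailed', 'qtype_coderunner');
            }
        }
        return $errors;
    }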

This might, on occasion, be a pain for the teacher. It might be worth having an extra check-box on the form (perhaps only visible if the validation has failed) saying "Save this question even though the right answer is not graded 100%", or something like that.
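
The checkbox itself is easy enough to add with the standard form API; something like this (field and string names made up):

    $mform->addElement('advcheckbox', 'skipsampleanswercheck',
            get_string('skipsampleanswercheck', 'qtype_coderunner'));
    $mform->setDefault('skipsampleanswercheck', 0);

Making it appear only after a failed validation would take a little more work, but that is a detail.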

As I say, I don't know about implementing this in the context of CodeRunner, but it is possible to do some pretty crazy things in forms, to get the usability you want. An example of that is the STACK question type. (I don't know if it is a comprehensible example.) https://github.com/maths/moodle-qtype_stack/blob/master/edit_stack_form.php#L656

In reply to Tim Hunt

Re: Automatic testing while entering questions

by Richard Lobb -

Hi Tim

Many thanks for the input. I agree with the principle that if there is a supplied right answer it should be graded as 100%, and therefore checked before any data is saved. And indeed form validation does seem like the appropriate place to do that, although, as you say, there would need to be a checkbox to disable the checking, e.g. if the question has a long run time (some of my colleagues have written questions that take over 30 seconds to grade) or if the author wants to do a temporary save of an incomplete problem.

However (and I may just be being cowardly), I find the prospect of implementing such a check in the form validation code rather daunting. At that stage all I have to work with is the data in the question-authoring form. The grade_response function is a method of the question object, so I need one of those for starters. Constructing this isn't as easy with CodeRunner questions as you might think, because the question fields are only partly determined by the data in the form - most are inherited from the prototype, which must be loaded from the database first. Yes, the code is all there, but it's distributed over various methods of the question-type superclass and my subclass, and I don't fully understand the sequence of operations while questions are being constructed - I just process the various callbacks you've supplied.
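
In outline, the check would have to do something like the following, and it's the first two steps that I don't have clean, reusable code for (the helper names are schematic and the field names approximate):

    $prototype = $this->load_prototype($data['coderunnertype']);    // Prototype must be fetched from the DB.
    $question  = $this->make_question($data, $prototype);           // Form fields plus fields inherited from the prototype.
    $response  = array('answer' => $data['answer']);                // The author's sample answer.
    list($fraction, $state) = $question->grade_response($response); // Standard question behaviour API.
    $allcorrect = abs($fraction - 1.0) < 1e-6;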

Then there's the problem of any test data files that might be associated with the question. These can come from the question instance and/or the prototype. In the current design, data files are assumed to have all been loaded into the Moodle file system during question saving, so they are pulled from there during testing. I'd have to temporarily upload them and delete them later, or do some other hack. Again, my understanding of the framework feels a bit fragile here.
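
At test time the current code just pulls the files out of the Moodle file API, roughly like this (schematic; the file-area name is from memory and may not be exact):

    $fs = get_file_storage();
    $files = $fs->get_area_files($contextid, 'qtype_coderunner', 'datafile',
            $questionid, 'filename', false);   // false: skip directory entries.
    $filemap = array();
    foreach ($files as $file) {
        $filemap[$file->get_filename()] = $file->get_content();
    }
    // At validation time nothing has been saved yet, so the files would have to
    // come from the author's draft file area, or a temporary upload, instead.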

Lastly, if I can solve all those problems, I have to send the run off to the Jobe server (or other sandbox) and untangle the response, feeding the errors back into the authoring form rather than into the usual result table display. A further complication is that the author might be using a template grader, so the results aren't necessarily just simple Yes/No booleans for each test case. I also need to be able to deal with the possibility that the Jobe server is down for maintenance.
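
The raw interaction with Jobe is the easy bit - something like the sketch below, which is much simplified compared with what the sandbox code really does (and the outcome codes are from memory). The hard part is untangling the result, especially with a template grader in play, and mapping it back onto form errors.

    $runspec = array('run_spec' => array(
        'language_id'    => 'python3',
        'sourcefilename' => 'prog.py',
        'sourcecode'     => $testprogram,      // The template expanded with the sample answer.
    ));
    $ch = curl_init($jobeserver . '/jobe/index.php/restapi/runs');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($runspec),
        CURLOPT_HTTPHEADER     => array('Content-Type: application/json; charset=utf-8',
                                        'Accept: application/json'),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 30,          // The server might be down for maintenance.
    ));
    $raw = curl_exec($ch);
    curl_close($ch);
    $result = ($raw === false) ? null : json_decode($raw);
    $runok = ($result !== null) && $result->outcome == 15;   // 15: run completed normally.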

I find myself thinking that all that sound and fury isn't really in keeping with the idea of form validation, is it?

All the same, if you still think it sounds do-able and appropriate - and particularly if you're volunteering to help :-) - I'll move it higher up the TODO list. It won't happen for a while though - we're just starting our academic year here and I'm very busy.

Richard

In reply to Richard Lobb

Re: Automatic testing while entering questions

by Tim Hunt -

It sounds to me like this would potentially be some good de-tangling to do at some point in time, since it should lead to the code being more loosely coupled.

However, as you say, it's a lot of work, so it is unlikely to happen soon. I am afraid I can't offer to help.

So, I think it just needs to sit somewhere on the to-do list for now.