Question Authors' Forum

Can the grader be sidestepped?

Can the grader be sidestepped?
by Michel Wermelinger - Sunday, 14 May 2017, 11:03 PM

I'm starting to write a few questions for an assignment in which each question builds on the previous one: students first have to write two independent functions A and B, then a third function C that calls A and B, and finally they have to run a program that calls C.

Without CodeRunner, I'd give them a code file to fill in and hope for the best, i.e. that they don't get stuck on A and B. With CodeRunner, each can be a separate question, so they can do C even if they were unable to do A or B, because I can put the solutions to A and B in hidden support files for C. So far so good.

But for the last question, I really just want them to run the program with their own input and then report whether the output makes sense - it's an information retrieval type of program. I can put the program in a support file and have them write as the answer

result = process('...')

where process is the main function of the program, and '...' is their chosen input. The single test can then just do print(result), but since I don't know what the expected output is, the test will always fail. Can the grader be sidestepped so that the output is printed without being graded?

I could of course hand out the program as a .pyc file so that the answers to A, B, C are not revealed, and hope nobody finds out about decompilers, but having it all in CodeRunner to avoid switching contexts/media would be better.


Re: Can the grader be sidestepped?
by Richard Lobb - Monday, 15 May 2017, 12:12 PM

As far as I know, all question types in a standard Moodle quiz need to be gradable, so I suppose the short answer is "no". But there are workarounds.

If you're concerned about the students' submissions being marked wrong, you could use a regular expression grader and match against ".*", which accepts any output. You'd probably also want to suppress the 'Expected' column in the result table to avoid confusing students, and you can set the mark for that question to 0 so the overall quiz outcome is unaffected.
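To see why ".*" sidesteps grading in practice, here's a minimal sketch using Python's re module, assuming the regex grader matches the pattern against whatever the program prints (which is how regex matching generally behaves; the exact CodeRunner matching mode may differ):

```python
import re

# ".*" matches any single line of output, including an empty one,
# so every submission would be accepted.
outputs = ["some retrieval results", "", "totally unexpected text"]
for out in outputs:
    assert re.match(r".*", out) is not None  # always matches

print("all outputs accepted")
```

The same idea holds for multi-line output if the grader matches line by line or uses a DOTALL-style match.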

If you just want the student to suggest a single string parameter for the process method, it might be simpler for the student if you ask them just to paste that string into the answer box; you could then change the template to something like

param = """{{ STUDENT_ANSWER | e('py') }}"""
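For concreteness, here is a sketch of roughly what the rendered template would execute. The `process` function below is only a stand-in for the real main function hidden in the support file, and the pasted query is a made-up example:

```python
# Stand-in for the real support-file function (assumption: it takes a
# query string and returns the retrieval results).
def process(query):
    return "results for: " + query

# If the student pastes the line `find relevant docs` into the answer
# box, the template expands to roughly this:
param = """find relevant docs"""
result = process(param)
print(result)
```

The triple-quoted string plus the `e('py')` escaping is what keeps an arbitrary pasted string from breaking out of the template.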

If you want complete control of the output displayed to the student (e.g. you don't want to show them a results table at all), and/or you want to control the grade programmatically, you could use a combinator grader: run the task in a subprocess, capture its output, and set the prelude or postlude to that output. But that's a lot harder.
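A minimal sketch of that approach, assuming a combinator-template grader that reports its outcome as a JSON object printed to stdout, with "fraction" as the mark and "epiloguehtml" as HTML shown to the student (those field names are assumptions about the grader's interface; the inline `-c` program stands in for the student's task):

```python
import json
import subprocess
import sys

# Run the student's task in a subprocess and capture whatever it prints.
# (The -c program here is a placeholder for the real task.)
completed = subprocess.run(
    [sys.executable, "-c", "print('retrieval results here')"],
    capture_output=True, text=True, timeout=10,
)

# Report full marks unconditionally -- grading is sidestepped -- and
# show the captured output back to the student.
outcome = {
    "fraction": 1.0,
    "epiloguehtml": "<pre>" + completed.stdout + "</pre>",
}
print(json.dumps(outcome))
```

The point is that the grade and the displayed output are both computed by your template code, so the built-in expected/got comparison never happens.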

One thing to be aware of: the support files you add to a question aren't really hidden. A student can easily write a program that prints all the files in the current working directory. So if you're worried about students getting hold of your sample answers to the various functions (A, B, C, etc.), you might want to put up .pyc files anyway. If even that worries you, you'll have to implement your own security within the template.
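To illustrate how little effort that takes, a submission like the following would reveal the names of every support file in the question's working directory (and reading their contents is just one `open()` away):

```python
import os

# List everything in the sandbox's current working directory,
# including any "hidden" support files attached to the question.
for name in sorted(os.listdir(".")):
    print(name)
```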