I understand that the whole point of CodeRunner is to automate grading, but I'm putting together an open-ended question type where students are expected to write a web page that meets certain criteria. For example, they must include a list with at least 3 list items, a header, a paragraph, etc. I want to allow my students a lot of freedom in what their web page looks like, so I'll write a script that checks for these items, but not the text within. Most students will take the time to do a good job, but some might put in dummy text, and several will likely neglect to check spelling, grammar, etc. What's the best way to base part of the grade on CodeRunner's assessment of their work, and part on my own assessment?
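For what it's worth, here's a minimal sketch of the kind of structural check I have in mind, using only Python's standard-library `html.parser` (the criteria names and the `check_page` helper are just illustrative, not part of any CodeRunner template):

```python
from html.parser import HTMLParser

class CriteriaChecker(HTMLParser):
    """Counts the structural elements the question asks for."""
    def __init__(self):
        super().__init__()
        self.counts = {"li": 0, "p": 0, "header": 0}

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self.counts["li"] += 1
        elif tag == "p":
            self.counts["p"] += 1
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts["header"] += 1

def check_page(html):
    """Return a dict mapping each criterion to pass/fail."""
    checker = CriteriaChecker()
    checker.feed(html)
    return {
        "list with at least 3 items": checker.counts["li"] >= 3,
        "at least one header": checker.counts["header"] >= 1,
        "at least one paragraph": checker.counts["p"] >= 1,
    }

student_page = """
<html><body>
<h1>My Page</h1>
<p>Some text.</p>
<ul><li>one</li><li>two</li><li>three</li></ul>
</body></html>
"""

for criterion, passed in check_page(student_page).items():
    print(f"{criterion}: {'PASS' if passed else 'FAIL'}")
```

The per-criterion results could then be mapped onto partial marks in whatever way the question template supports.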
I am probably under-thinking this, but if it were me I'd put the CodeRunner question in a quiz and follow it with a Moodle essay-type question to be used for the 'style & effort' component of the mark. Essay-type questions are manually marked. I would ask the student something they can answer briefly, like "What aspect of your webpage are you most proud of? What would you most like to have done more on?", and then mark it manually - and my marking would really be assessing the style and effort of their web page, not their reflection at all (though I have found that those reflection questions can give some fascinating answers...). You can easily include comments in the manual marking.