Thanks for raising this, Trent.
“Students are appear to more comfortable working in an IDE when developing code solutions and this is how one would develop code in the workplace.”

Totally agree. That has always been our model here, too. Although having said that, I think there are quite a few other users who don't operate that way. For example, the University of Auckland's Moodle site says "Because the system is on-line students do not need to have programming tools installed on their lab or private computer."
“This might be overcome with more practice with CodeRunner quiz items out of test conditions”

Yes, I think this is an important point. On-line tests and exams can be very stressful for students. In the past, when we had a special on-line exam environment that students were thrown into only for exams, a significant percentage of students crashed badly. But we now use CodeRunner for all labs, assignments, tests and exams. Students work with it both at home and in our labs from day 1 in the course. As a result, I think our on-line exams are no more stressful than written ones and possibly even significantly less stressful. It is still very important, however, to start every test or exam with some really trivial questions (as with written exams) to help students relax a bit.
“Now, I’m not even sure that it is possible to have a “file upload” option for a quiz item in Moodle, so the request might not be technically possible anyway!”

I certainly could add an option to upload data from a file rather than cutting and pasting. However, would this really reduce the stress for your students? My feeling is that the stress arises from the uncertainty and unfamiliarity with CodeRunner rather than from the particular sequence of mouse and keyboard events required to transfer the code across.
“If the quiz item had an upload file option this could allow the student to upload the package file that could be tested in the CodeRunner Quiz framework on the jobeserver still using the output text matching.”

Yes, this is a reasonably compelling use case. However, presentation of the results is now less clear, as students can no longer see on screen exactly what they submitted. They would have to download the jar file they submitted again to see what the errors were. [I'm thinking of cases where they somehow submitted an incorrect jar or one containing the wrong version of a file.]
“It could also be interesting to entertain the idea of allowing code to be retrieved from a version control repository.”

Uploading directly from a repository into Moodle is semi-doable even now. The "student answer" could be just the student's repository URL, and the template could begin with something equivalent to
    git clone {{ STUDENT_ANSWER }}

The rest of the template could then cd into the directory and test the student's code. I've done something very similar to that in a web programming course, where students developed a website to certain specs and just pasted its URL into the student answer.
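Coming back to the repository case, here's a minimal sketch of what such a template might look like. It assumes a python3-style question whose template output is a Python script run on Jobe, and it ignores the sanitising/escaping of the URL that real use would need; run_tests.sh is just a placeholder for whatever build-and-test command the question requires:

    # Hypothetical CodeRunner template body (Twig), expanded into a Python script run on Jobe.
    # The "student answer" is assumed to be just the student's repository URL.
    import subprocess

    # Clone the student's repo into a known directory in the Jobe working area.
    subprocess.run(['git', 'clone', '{{ STUDENT_ANSWER }}', 'submission'], check=True)

    # Run the tests and echo their output, which is then graded by the usual output matching.
    result = subprocess.run(['bash', 'run_tests.sh'], cwd='submission',
                            capture_output=True, text=True)
    print(result.stdout, end='')

The expected output for each test would then just be whatever run_tests.sh is supposed to print.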
However, this approach has a few problems, such as:
- As with jar file upload, it's harder to understand the table of results, as the submitted code isn't clearly visible.
- Grading is not automatically triggered by a commit to the repo; students have to independently request it.
- The state of the repo changes over time, so it would be difficult to go back through a student's submissions to see what happened, unless the template were able (somehow) to record the state of the repo at the time each submission was graded (a sketch of one possibility follows this list).
- jobe needs to have git installed and the firewall configured to allow access to the repo(s).
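On that third point, one partial workaround (just an idea, not an existing CodeRunner feature) would be for the template to report the commit it actually graded, e.g.:

    # Hypothetical fragment for the same template sketched above, run after the clone.
    commit = subprocess.run(['git', 'rev-parse', 'HEAD'], cwd='submission',
                            capture_output=True, text=True, check=True).stdout.strip()
    print('Graded commit:', commit)   # identifies the exact state of the repo that was tested

With plain output matching that extra line would of course upset the comparison, so in practice it would presumably need a template grader (or similar) so the hash appears only in the feedback, but it would at least record exactly which commit each grade refers to.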
Richard