I want to use large test data because, for some algorithm problems, an algorithm's superiority only becomes evident on large inputs. Previously, I had students complete tasks on an Online Judge (OJ) platform, where large datasets didn't noticeably slow down processing. Is Moodle more computationally intensive because it has more features and more complex logical connections between its data, so that running large datasets through CodeRunner on Moodle is relatively slower? If so, I will continue to run large-dataset problems on the OJ platform.
On an OJ management backend, test data is generally uploaded as a zip file, which allows multiple test cases to be uploaded in one go. I haven't yet figured out how to unzip files in CodeRunner. If it's too complicated, I'll stick with pasting the data in.
Thank you for your response!