Dear organizers,
I just noticed that our inference submission (SubChallenge 1, submission ID 8112333)
has been terminated after running for a week. The status is:
STATUS: INVALID
STATUS DETAILS: OVER SUBMISSION LIMIT
According to the Leaderboard Inference Submissions table for this Subchallenge, we have
two Scored submissions, therefore we should have one more. Could you explain why
8112333 was terminated?
I'll point out that I have another submission running. But that means there were two
running, and one slot left. Which one gets terminated? It would be helpful to know the exact
rules. Thank you.
Ljubomir
Created by Ljubomir Buturovic (ljubomir_buturovic)

> Yes, please requeue 8112333 and cancel 8129441.

Very well. You may get an email message saying that 8129441 encountered an error while running. That's due to manually stopping the running container. Thanks.
> It is not true that 8112333 was not begun. In fact it was in progress days before the other submission (8129441) was even submitted.
Yes, you are right. 8112333 had been running when the server running it encountered a problem, and it was re-queued to be run again. So I was wrong to say that it hadn't started running: it appeared to be in the not-yet-started state only because we had re-queued it after the server problem.
**If you would like us to use 8112333 in place of 8129441, please let us know immediately by responding to this post. We will cancel 8129441 and re-queue 8112333 for processing.**
> This isn't sufficiently clear to me. What happens if a submission fails? For example, poorly formatted input file, or the script crashes half-way through, or the output file is wrongly formatted and not scored? Do these count?
If a submission terminates due to a problem with the challenge infrastructure, we will restart it. If it terminates due to a "bug" in your code, it is marked INVALID and does not count towards your three-submission quota. However, the three *valid* submissions must be sent by the end of the round. Given the huge spike in inference submissions just before the round ends, we cannot guarantee that the inference submission queue will report any bugs in your code in time for you to send a replacement. For that reason we recommend running your code locally and using the express lane to verify code correctness before submitting to the inference queue. This is especially important when submitting near the end of the round.
> It appears that the system periodically allows more than 3 submissions.
I think I addressed this in my last post.
> Finally, ... We would appreciate if this instruction is corrected/clarified.
I will forward your request to the author.
Once more: **If you would like us to use 8112333 in place of 8129441, please let us know immediately by responding to this post. We will cancel 8129441 and re-queue 8112333 for processing.**
Appreciate your attention to this. However the responses open up a
number of questions.
First: "We terminated the one that had not yet begun."
It is not true that 8112333 had not begun. In fact, it was in progress
days before the other submission (8129441) was even submitted. Please
see the attached screenshot. As you can see, 8129441 is not in the
screenshot because it wasn't even submitted at that time. So I still
don't know why 8112333 was removed.
Next: "The exact rule is that participants are allowed to submit up to
three submissions per sub-challenge per five week round."
This isn't sufficiently clear to me. What happens if a submission
fails? For example, poorly formatted input file, or the script crashes
half-way through, or the output file is wrongly formatted and not
scored? Do these count? As you can see from the screenshot, we had
three incorrect submissions and one scored. From this, we concluded -
wrongly, it seems - that only scored submissions count. Therefore I
still would like to ask for more clarity regarding what counts as a
submission.
I hope you agree this is a very important point, not least because there
are only 3 submissions left. It appears that the system periodically
allows more than 3 submissions. This means that the only safe approach
is to keep track of submissions ourselves, to avoid removals in the
middle of processing. I don't mind keeping track, but I don't know
how. Again, what are the rules?
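In the meantime, here is the kind of bookkeeping we have resorted to. A minimal sketch, assuming, as our reading of the leaderboard suggests, that only submissions not marked INVALID consume a slot; the IDs and statuses mirror this thread, but the quota rule encoded here is our interpretation, not an official one:

```python
# Local tally of inference submissions for one sub-challenge.
# ASSUMPTION: INVALID submissions do not consume a slot; the three-per-round
# quota and the status strings are taken from this thread.

QUOTA = 3

def remaining_slots(submissions, quota=QUOTA):
    """Return how many submission slots are left.

    `submissions` maps submission ID -> status string as shown on the
    leaderboard (e.g. SCORED, INVALID, EVALUATION_IN_PROGRESS, RECEIVED).
    """
    counted = [s for s in submissions.values() if s != "INVALID"]
    return quota - len(counted)

# Example mirroring the four statuses reported by the organizers:
subs = {
    "8074996": "SCORED",
    "8129608": "SCORED",
    "8129441": "EVALUATION_IN_PROGRESS",
    "8112333": "INVALID",  # removed from the queue as over the limit
}
print(remaining_slots(subs))  # three counted submissions -> prints 0
```

If the rule turns out to be different (for example, if queued submissions do not count either), only the filter in `remaining_slots` needs to change.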
Finally, there is a reason we submitted more than 3 submissions: the
following claim in the Wiki:
This also implies that the subjects in the scoring set must be
processed sequentially and not using multiple threads.
This is not true: we now know that the multi-threaded submissions work
perfectly fine. But we did not know that before actually
trying. Therefore we had to submit both single-threaded and
multi-threaded versions (exact same models, just a different number of
threads), not knowing whether the multi-threaded ones would finish at
all. We would appreciate it if this instruction were corrected or clarified.
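To illustrate what the two versions differ in: a minimal sketch, with a placeholder per-subject function standing in for our actual model, showing that dispatching independent subjects to a thread pool yields the same results as processing them one at a time:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-subject scoring step; in the real submission this would
# run the model on one subject from the scoring set.
def process_subject(subject_id):
    return subject_id, subject_id * 2  # placeholder computation

subjects = list(range(10))

# Sequential: one subject at a time, as the Wiki instructed.
sequential = dict(process_subject(s) for s in subjects)

# Multi-threaded: same function, same subjects, several at a time.
with ThreadPoolExecutor(max_workers=4) as pool:
    threaded = dict(pool.map(process_subject, subjects))

# Because subjects are processed independently, the results are identical.
print("identical results:", sequential == threaded)  # prints: identical results: True
```

The only open question in our case was infrastructure behavior (memory, container limits), not correctness, which is why we had to try both variants.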
Thanks for your support.
[Attached screenshot: syn8242426]
@ljubomir_buturovic: As you mention, the submission limit is *three* per sub-challenge. Due to a configuration error, the enforced limit had not been reset after the temporary extension to *six* that applied only to Round 1. Most participants stopped at three submissions per sub-challenge, but a few (such as yourself) sent more. When we discovered the aforementioned configuration error, the status of your four submissions was as follows:
8074996 (scored)
8129608 (scored)
8129441 (evaluation in progress)
8112333 (received, not yet begun to be processed)
Since 8129441 was already in progress, the correction was simply to cancel the queued submission 8112333.
To answer your specific questions:
> our inference submission (SubChallenge 1, submission ID 8112333) has been terminated after running for a week. ... Could you explain why 8112333 was terminated?
8112333 was not terminated but was simply removed from the waiting queue.
> Which one gets terminated?
We terminated the one that had not yet begun.
> It would be helpful to know the exact rules.
The exact rule is that participants are allowed up to three submissions per sub-challenge per five-week round.
I hope this helps.