Do we select one (or some) of the scored models to run on the validation data? Or do we build new docker projects for the final validation? And what if a previously scored model encounters errors with the new validation data?

Created by Jifan Gao (@ggggfan)
Hi @trberg, Thanks!
Hi @arielis, Thanks for catching that. We've moved to plan B. If you go to your [submission dashboard](https://www.synapse.org/#!Synapse:syn18405991/wiki/595490), you'll see we've added a docker digest column. You can pull the specific image using the docker pull command above together with that docker digest. Thanks, Tim
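As an illustration of that pull, a command might look like the sketch below; the project ID, repository name, and digest placeholder are hypothetical, and the real values come from the repository name of your submission and the docker digest column on your dashboard:

```
# Placeholder repository path and digest -- substitute the repository name
# from your submission and the digest shown on the submission dashboard.
docker pull docker.synapse.org/syn1234567/my-model@sha256:<digest-from-dashboard>
```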
Hi @trberg, I tried to use your script to get the SHA of the submission. I got the following error: `synapseclient.exceptions.SynapseHTTPError: 403 Client Error: User lacks READ_PRIVATE_SUBMISSION access to Evaluation`
Hi @ggggfan, You will be allowed one final submission for the validation phase.

Hi @shikhar-omar, You can submit whichever version of the docker image was submitted for your high-scoring submission. If that version no longer exists in your Synapse project or your local repo, you can collect the unique SHA from Synapse by using:

```
import synapseclient

# Log in to Synapse.
syn = synapseclient.Synapse()
syn.login("username", "password")

# The ID of the submission you'd like to submit to the validation phase.
submission_id = "submission id of the submission you'd like to submit to the validation phase"

# Retrieve the submission and pull out its docker repository name and digest.
submission = syn.getSubmission(submission_id)
repo_name = submission.dockerRepositoryName
digest = submission.dockerDigest
```

In the command line, you can then pull the specific instance with docker:

```
docker pull repo_name@digest
```

You would then push that specific image to your Synapse project, which you can submit. Thanks, Tim
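For the final push-back-to-Synapse step described above, a minimal sketch of the commands might look like the following; the project ID (syn1234567), repository name (my-model), and tag are placeholders for your own project, and docker.synapse.org is the Synapse Docker registry:

```
# Log in to the Synapse Docker registry with your Synapse credentials.
docker login docker.synapse.org

# After pulling repo_name@digest, find the local image ID of that image.
docker images --digests

# Re-tag the pulled image into your own Synapse project (placeholders below),
# then push it so it appears in the project and can be submitted.
docker tag <IMAGE_ID> docker.synapse.org/syn1234567/my-model:validation
docker push docker.synapse.org/syn1234567/my-model:validation
```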
Hi @trberg, Thanks for your reply! How many submissions are allowed in the final evaluation? Thanks,
Where can we select previously scored models (if high scoring) for the validation phase? Thanks!
Hi @ggggfan, You can decide which model you'd like to submit. The purpose of the leaderboard phase was to allow you to test out different models. You may submit a previously scored model, or you may try a new model that you think will do better, although you won't see the score until we release the final leaderboard. If the model encounters errors with the new validation data, we'll work with the teams to debug those errors. Thanks, Tim

Question on Validation Phase