Just a reminder that the first leaderboard round deadline is coming up at 22:59 GMT on Wednesday (12/14). Please give yourself plenty of time to (re)submit in case you have errors.

Created by Solveig Sieberts (sieberts)
@Leo_Lu I am getting a successful submission with the same file structure and scripts you shared above, so the ONNX model location is probably not referenced correctly in your submissions. I suggest trying the file structure from the demo example (your ONNX model should be available at model/model.onnx) and updating `entrypoint.sh` accordingly. We cannot remove the ONNX model requirement. We are also getting successful submissions from multiple participants.
@vjbytes102 after changing my entrypoint.sh to:
```
#!/bin/sh
ln -sf model.onnx /output
python inference.py
```
the workflow is still failing with the same error (no model.onnx in /output). In theory we only need 'predictions.csv' to calculate the test score; could we just remove the requirement to copy the model to /output?
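One possible explanation for the "no model.onnx in /output" error: `ln -sf model.onnx /output` creates `/output/model.onnx` whose stored target is the relative path `model.onnx`, and a relative symlink target resolves against the directory containing the link, so the link points at itself and cannot be dereferenced. A plain `cp` avoids this entirely. A minimal sketch, using temp directories as stand-ins for the image's working directory and `/output`:

```shell
#!/bin/sh
# Stand-ins so the sketch runs anywhere: $workdir mimics the image's
# working directory, $outdir mimics /output.
workdir=$(mktemp -d)
outdir=$(mktemp -d)
echo "dummy onnx bytes" > "$workdir/model.onnx"
cd "$workdir"

# The symlink form: the stored target "model.onnx" resolves relative to
# $outdir, so $outdir/model.onnx points at itself and cannot be opened.
ln -sf model.onnx "$outdir"
[ -e "$outdir/model.onnx" ] || echo "symlink is dangling"

# A plain copy places a real file at the expected location.
rm -f "$outdir/model.onnx"
cp "$workdir/model.onnx" "$outdir/model.onnx"
[ -f "$outdir/model.onnx" ] && echo "real file in place"
```

Whether the challenge validator opens the file or merely lists the directory is not documented here, so this is a sketch of a safer pattern, not a guaranteed fix.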
@Leo_Lu yes, you will get the score for a successful submission after the second round is over.
@vjbytes102 Thank you so much! I'll try it later, but it looks like I'll need to wait until the second round is over to see the test scores.
@Leo_Lu I just tried the same file structure as above and got a successful submission. You just need to modify your "entrypoint.sh". It should look like this (if your model.onnx is at the same level as entrypoint.sh):
```
#!/bin/sh
ln -sf model.onnx /output
python inference.py
```
@vjbytes102 here is my file structure:
```
|--entrypoint.sh
|--Dockerfile
|--model.onnx
|--inference.py
```
entrypoint.sh:
```
#!/bin/sh
cp model.onnx -d /output
cp model/model.onnx -d /output
ln -sf model/model.onnx /output
python inference.py
```
Thank you very much for your help!
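A side note on the script above: since the commands run in order, the final `ln -sf model/model.onnx /output` replaces whatever the earlier `cp` calls put at `/output/model.onnx` with a symlink that resolves to `/output/model/model.onnx`, a path that does not exist, so the file ends up unreadable even though the copies succeeded. This is only a guess at the failure cause, but the effect is easy to reproduce with temp directories standing in for the real paths:

```shell
#!/bin/sh
# Stand-ins: $src mimics the build context, $out mimics /output.
src=$(mktemp -d)
out=$(mktemp -d)
mkdir -p "$src/model"
echo "dummy" > "$src/model.onnx"
cp "$src/model.onnx" "$src/model/model.onnx"
cd "$src"

cp model.onnx "$out"            # puts a real file at $out/model.onnx
cp model/model.onnx "$out"      # overwrites it with another real copy
ln -sf model/model.onnx "$out"  # replaces it with a link to $out/model/model.onnx
[ -e "$out/model.onnx" ] || echo "model.onnx is now a dangling symlink"
```

Dropping the `ln -sf` line (or keeping only a single `cp`) would avoid the overwrite.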
@Leo_Lu your last submission is still failing with the same error. Would it be possible for you to share your file structure? No need to share any code; just the file structure and the 'entrypoint.sh' script.
@Leo_Lu For a successful submission, we need both the ONNX model and 'predictions.csv'. We have successful submissions (with both model and csv) from multiple participants as well. Also, the demo example (with the same file structure and 'entrypoint.sh') was shared after testing, so it is correct. I suggest debugging further, or using the same file structure as the demo.
@vjbytes102 I have just checked my 'model/' folder and model.onnx was successfully copied into it. I have also used model/model.onnx to infer the test data, the same as in the demo. The question is how to copy model.onnx to /output. Since I have successfully inferred the test data via model/model.onnx, is it possible to calculate the score from 'predictions.csv' alone, without copying model.onnx to /output? Referring to the demo, I think I just need to copy model.onnx to /output, but the 'cp' command does not work either, which is very strange.
@Leo_Lu I guess you are only partially following the demo example. If you want to use the same code in entrypoint.sh, then I suggest you follow the same file structure as the demo example. In the demo, the trained ONNX model is available under 'model/model.onnx'. If you want to proceed with your own file structure, make sure to reference the ONNX model correctly in entrypoint.sh.
@vjbytes102 Thanks for the suggestion. After adding print(os.listdir('/output')) at the end, the output shows that '/output' contains 'predictions.csv' but no 'model.onnx', as I mentioned before. ([9729739 log result](https://github.com/Leo1998-Lu/CODA-TB-DREAM-Challenge/blob/main/9729740_log_txt.JPG)) The method from the demo did not work. Dockerfile:
```
RUN mkdir model
COPY model.onnx ./model/
```
entrypoint.sh:
```
ln -sf model/model.onnx /output
```
I also tried 'cp model.onnx -d /output', which did not work either.
@Leo_Lu can you please add print(os.listdir('/output')) at the end? Just to make sure what is available at the '/output' location after prediction.
@vjbytes102 Thank you for your patient reply! Here is my inference code; the docker log in 9729739.txt shows that inference on the test set completes ([9729739 log result](https://github.com/Leo1998-Lu/CODA-TB-DREAM-Challenge/blob/main/9729739_log.JPG)):
```
def get_prediction(val_X):
    model_inference = rt.InferenceSession("model/model.onnx")
    input_name = model_inference.get_inputs()[0].name
    label_name = model_inference.get_outputs()[0].name
    onnx_pred = model_inference.run([label_name], {input_name: val_X.astype(np.float32)})
    pred = np.argmax(onnx_pred[0], axis=1)
    return pred

print('predicting****')
pred = get_prediction(x_test)
train_te_meta['probability'] = pred
train_te_meta[['participant','probability']].to_csv('/output/predictions.csv', index=False)
print('*****done!!')
```
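If the entrypoint.sh symlink route keeps failing, another option is to let the inference script itself place the model with `shutil.copy` and log the output directory at the end. A sketch under hypothetical paths (a temp directory stands in for `/output`, and a dummy file stands in for `model/model.onnx`):

```python
import os
import shutil
import tempfile

# Stand-ins so the sketch runs anywhere; in the container these would be
# "model/model.onnx" and "/output" respectively.
work = tempfile.mkdtemp()
out_dir = tempfile.mkdtemp()
model_path = os.path.join(work, "model.onnx")
with open(model_path, "wb") as f:
    f.write(b"dummy onnx bytes")

# Copy the trained model next to the predictions, then log what the
# validator will see in the output directory.
shutil.copy(model_path, os.path.join(out_dir, "model.onnx"))
print(sorted(os.listdir(out_dir)))  # → ['model.onnx']
```

In the real script these two lines would run right after writing '/output/predictions.csv', so the copy and the listing appear in the same submission log.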
@Leo_Lu To debug, I suggest adding more logging to your inference script. For example, at the end of the script, print what is inside the '/output' folder.
@Leo_Lu I looked into the log file of your submissions ("No 'predictions.csv' file written to '/output'"); it looks like your inference code is not generating predictions.csv at all. In none of your submissions was a 'predictions.csv' generated by the inference code. Both the ONNX model and the predictions csv file should be available at the '/output' location. We do have successful submissions from other participants.
@vjbytes102 Yes, I am using the same method as the demo (copying the model to the /output location), but the workflow shows FAILED. Has anyone submitted successfully so far?
@Leo_Lu As per the challenge requirements, participants are expected to place their trained ONNX model in the '/output' location. A submission will be INVALID/FAILED if the ONNX model is not available at the expected location. As per your comments, you copied the ONNX model into the 'output/' location; I suggest double-checking that the model is available at '/output'. Please see the line below (copying the model to the /output location) from the demo example: https://github.com/Sage-Bionetworks/tb_challenge_demo/blob/main/inference_tb/entrypoint.sh#L3
@vjbytes102 can you help @Leo_Lu ?
Thank you for your reply! I've tried multiple commits so far and the docker results are fine; I got '/output/predictions.csv' when inferencing the test dataset. But the workflow still shows FAILED. I copied the onnx model into 'output/' and checked the workflow log (9729729_logs), which says "No 'ONNX format model' file written to /output". But I have already inferred the results from the onnx model, so I don't know if this is a bug or not.
@Leo_Lu - You may check the status of your submissions here: https://www.synapse.org/#!Synapse:syn31472953/wiki/620175. Submission scores will be posted on the day following the round deadline. In this case that would be Thursday 12/15.
How can we check whether the docker submission was successful? I received a "valid submission" email after submitting, but no information about the test score. Will the score show up after the first leaderboard round?

Leaderboard Deadline 12/14 at 22:59 GMT