Hello,
The order of columns in `input.csv` seems to vary. Specifically, the position of the column `normalization` differs between the Tumor_Deconvolution_Coarse_Fast_Lane and the Tumor_Deconvolution_R1_Coarse queues. Since it was allowed [here](https://www.synapse.org/#!Synapse:syn15589870/discussion/threadId=5922), I printed the contents of the file in my logs.
### `normalization` comes last
- example files here: https://github.com/Sage-Bionetworks/Tumor-Deconvolution-Challenge-Workflow/blob/ee2ec061319c82a91341c4f7982665cac86a3eaa/example_files/fast_lane_dir/input.csv
- Tumor_Deconvolution_Coarse_Fast_Lane
### `normalization` comes in the 5th position
- the wiki: https://www.synapse.org/#!Synapse:syn15589870/wiki/592699
- Tumor_Deconvolution_R1_Coarse queue
Moving the `normalization` column from the 5th position to the last consequently shifts the position of every column in between.
As a result, all my containers loaded the wrong expression files in the leaderboard queue, Tumor_Deconvolution_R1_Coarse, and hence could not recognize any features. Is there a way to test the prediction performance with the leaderboard files retrospectively?
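To illustrate, here is a minimal Python sketch of the mismatch. Only the position of `normalization` (5th versus last) reflects the two queues; the other column names and values are placeholders, not the actual headers of `input.csv`:

```python
import csv, io

# Hypothetical headers: only the position of `normalization` (5th vs. last)
# mirrors the two queues; every other column name is a placeholder.
fast_lane = "dataset.name,cancer.type,platform,scale,expr.file,normalization\nds1,BRCA,array,linear,expr1.csv,CPM\n"
r1        = "dataset.name,cancer.type,platform,scale,normalization,expr.file\nds1,BRCA,array,linear,CPM,expr1.csv\n"

for label, text in [("fast lane example", fast_lane), ("R1 queue", r1)]:
    header, row = list(csv.reader(io.StringIO(text)))
    # Fixed-position access: index 4 is the expression file in one layout
    # and the normalization value in the other, so a positional parser
    # hands the wrong file to the container.
    print(label, "-> column 5 is", row[4])
```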
Created by Dominik Otto (@djo)
I will have to get back to you on the release of the validation files.

Hi @andrewelamb,
Thank you for the response. Our dispatching is done by a bash script that naively submits single lines of the `input.csv` file to parallel threads. We have corrected this behavior now. However, we had hoped to benchmark the software in round 1 with regard to runtime and robustness to different normalizations on an independent dataset with different features. This would help us improve the algorithm for the next round. I wanted to find out whether there will be a way to test it against the validation files you prepared, e.g., by making them available to us.
Kind regards,
Dominik

Hi @djo,
We never had a set column order in mind and you shouldn't expect there to be one. We do expect participants to make use of the column headers to access the correct columns.
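For example, header-keyed access along these lines is robust to any column ordering (a minimal Python sketch; `normalization` is named in this thread, while the expression-file column name below is a placeholder, not the actual header):

```python
import csv

# Read input.csv keyed by the header row, so the code does not depend
# on the position of any column.
with open("input.csv", newline="") as handle:
    for row in csv.DictReader(handle):
        normalization = row["normalization"]  # column named in this thread
        expr_file = row.get("expr.file")      # placeholder column name
        print(normalization, expr_file)
```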
Can you clarify:
"Is there a way to test the prediction performance with the leaderboard files retrospectively?"
Thanks!
-Andrew