python: can't open file './dream_subchallenge2.py': [Errno 2] No such file or directory

I got the above error when I used `/data` as the working directory in my Dockerfile and copied all the files into `/data`. The files did not appear in the running container, so I got the error when I ran the synthetic data against the image built from the code at https://github.com/Sage-Bionetworks-Challenges/Anti-PD1-DREAM-Examples, as shown below:

```
docker run -v $(pwd)/CM_026_formatted_synthetic_data_subset/:/data:ro \
    -v $(pwd)/output:/output:rw image:v1
```

But when I used `/usr/project` as my working directory and made a directory `/usr/project/data` into which the contents of the synthetic data were copied, the image worked perfectly with the arguments below:

```
docker run -v $(pwd)/CM_026_formatted_synthetic_data_subset/:/usr/project/data \
    -v $(pwd)/output:/usr/project/output image:v1
```

Can I submit the Docker image from No. 2 that is running successfully? The only difference between the first and the second is the working directory.

Created by Adeolu OGUNLEYE (@adeoluokiki)
Thanks @vchung.
@adeoluokiki -- Thank you for sharing your Dockerfiles. The issue you are experiencing with your first Dockerfile is that you are copying the files and script into a directory named `/data`:

```
COPY dream_subchallenge2.py /data
COPY subchallenge2_model1.sav /data
COPY requirements.txt /data
```

During the container run, this directory gets shadowed, because the input files are mounted into a directory with the same name:

```
docker run -v $(pwd)/CM_026_formatted_synthetic_data_subset/:/data:ro \
    ...
```

Also, I noticed you are creating an `output` directory in your Dockerfiles, which is not necessary since it is already being mounted:

```
... -v $(pwd)/output:/output:rw
```

As mentioned earlier, your predictions file should be written to `/output`, with the full path being `/output/predictions.csv`. I hope this helps!
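One way to avoid the shadowing, sketched here under the assumption that the script's arguments stay the same, is to copy the code and model into a directory other than `/data` and point the `-g` argument at the mounted `/data` instead:

```dockerfile
FROM python:3.7
# Copy code and model into a directory that will NOT be shadowed by the /data mount
WORKDIR /usr/local/project
COPY dream_subchallenge2.py requirements.txt subchallenge2_model1.sav ./
RUN pip install -r requirements.txt
# Read inputs from the mounted /data; write predictions to the mounted /output
CMD ["python", "dream_subchallenge2.py", "-g", "/data", "-m", "subchallenge2_model1.sav"]
```

Because `CMD` runs in the `WORKDIR`, the relative paths resolve to `/usr/local/project`, which no mount overrides.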
```
FROM python:3.7
WORKDIR /data
COPY dream_subchallenge2.py /data
COPY subchallenge2_model1.sav /data
COPY requirements.txt /data
RUN pip install -r /data/requirements.txt
RUN mkdir -p /data/output
CMD ["python", "/data/dream_subchallenge2.py", "-g", "/data", "-m", "/data/subchallenge2_model1.sav"]
```

The Dockerfile above gave the error "[Errno 2] No such file or directory" when run on the synthetic data, while the Dockerfile below ran successfully:

```
FROM python:3.7
WORKDIR /usr/local/project
COPY dream_subchallenge2.py /usr/local/project
COPY subchallenge2_model1.sav /usr/local/project
COPY requirements.txt /usr/local/project
RUN pip install -r /usr/local/project/requirements.txt
RUN mkdir -p /usr/local/project/output
RUN mkdir -p /usr/local/project/data
CMD ["python", "/usr/local/project/dream_subchallenge2.py", "-g", "/usr/local/project/data", "-m", "/usr/local/project/subchallenge2_model1.sav"]
```

Why does the first Dockerfile not work while the second one does?
Hi @adeoluokiki,

To best assist you, can you share your Dockerfile? Based on what you have described, I am thinking the error is most likely stemming from where the files are copied and how the script is called.

Also, keep in mind that when we run your Docker image, all input files are mounted into a directory named `/data` and output files are expected in a directory named `/output`:

> #### Input files
> The challenge infrastructure will mount all the input data in the working directory of the container called `/data`. Please view the data section for all available files. The clinical data has the following code book (data dictionary).
>
> To facilitate use of input data, we have created synthetic input data that is available here.
>
> #### Output file
> An output file named `predictions.csv` should be written into a directory in the working directory of the container called `/output`, with the full path being: `/output/predictions.csv`.
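The output contract quoted above can be sketched as a minimal Python helper. The column names below are hypothetical placeholders, not the challenge's actual submission format; check the challenge documentation for the required columns:

```python
import csv
import os

def write_predictions(predictions, output_dir="/output"):
    """Write rows to <output_dir>/predictions.csv, the path the challenge expects.

    `predictions` is an iterable of (patient_id, prediction) pairs.
    """
    # /output is mounted by the infrastructure; exist_ok avoids an error either way
    os.makedirs(output_dir, exist_ok=True)
    out_path = os.path.join(output_dir, "predictions.csv")
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["patientID", "prediction"])  # hypothetical column names
        writer.writerows(predictions)
    return out_path
```

Calling `write_predictions(rows)` inside the container writes `/output/predictions.csv`, which the bind mount then exposes on the host.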
