Dear all,
I am trying to set up the MLCube Docker image for running the inference step.
I followed the steps to configure the Docker image:
```
mlcube configure -Pdocker.build_strategy=always
```
However, when running the container using
```
mlcube run --gpus device=0 --task infer data_path=PATH_TO_DATA output_path=OUT_PATH
```
I get a permission error when reading the image files. I can easily work around it by setting the right permissions from the Python inference script (`os.chmod`).
Unfortunately, after pushing the Docker image and running it through:
```
medperf --gpus=all test run --offline --no-cache --demo_dataset_url synapse:syn53368010 --demo_dataset_hash "920f00d146eadc10663d265c12176e978d1a001d64127adfd084c2bef308fb27" -p ./prep_segmentation -e ./eval_segmentation -m MY_PATH
```
I get:
```
OSError: [Errno 30] Read-only file system: '/mlcube_io0/BraTS-ZZZ-00002-000/'
```
Am I doing something wrong?
PS: I also tried setting the permissions before running the Docker container, and I also tried not changing the permissions from the Python script and pushing the image anyway. I still get a permission error:
```
PermissionError: [Errno 13] Permission denied: '/mlcube_project/additional_files/nnUNet/nnUNet_raw/Dataset005_Brats24T/imgVal/BraTS-ZZZ-00002-000_0003.nii.gz'
```
@vchung Aha yeah, it works now. Thank you so much!
@adamsyah ,
Ah, I missed this when I first looked at your config file~ Looking more closely now, I see that `output_path` is defined (again) in `parameters.inputs` when it is already defined in `parameters.outputs`:
```text
parameters:
  inputs: {
    ...
    output_path: additional_files/predictions_gli/,
    ...
  }
  outputs: {output_path: {type: directory, default: predictions}}
```
I believe this is the reason for your "Read-only file system" error, as all input folders are mounted as read-only. So that your model can write out its predictions, try removing the `output_path` defined in `parameters.inputs`.
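For reference, a minimal sketch of what the corrected `parameters` section could look like, based on the paths in your config (adjust them as needed):
```text
parameters:
  inputs: {
    data_path: additional_files/dataset/,
    parameters_file: parameters.yaml,
    checkpoint_dir: additional_files/checkpoints/
    # output_path no longer listed here
  }
  outputs: {output_path: {type: directory, default: predictions}}
```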
Hi @vchung ,
I ran
```
git checkout brats2023 && git pull
```
and then reinstalled the MedPerf CLI from source, but I'm still encountering the same error. Everything works perfectly fine when I run it locally with `mlcube run`. Since you mentioned it might be a permissions issue, I'm just going to submit my Docker image and MLCube config tarball; I hope there will be no issue when you test my submission.
Thanks a lot for your help. Really appreciate your assistance!
@abhijeetparida @adamsyah ,
Thank you for sharing - both of your config files look correct.
After some digging, the permission issues may be due to running your MLCube Docker containers as a non-root user, which is the default behavior of the MedPerf CLI starting this year. To be able to run your MLCube container as root, you will need to use the `brats2023` version instead. To update the CLI:
- Change to the `medperf` directory, check out the `brats2023` branch, and pull the latest changes:
```sh
git checkout brats2023 && git pull
```
- Switch to the virtual environment (if you're using one), then re-install the MedPerf CLI from source:
```sh
pip install --force-reinstall -e ./cli
```
Writing your Docker models to run as non-root is not a criterion for BraTS-GoAT 2024, so you do not need to be concerned with updating your Dockerfile if this was the issue.
@vchung
```
docker:
  # Image name
  image: docker.synapse.org/syn54013323/cnmc-goat:latest
  # Docker build context relative to $MLCUBE_ROOT. Default is `build`.
  build_context: "../"
  # Docker file name within docker build context, default is `Dockerfile`.
  build_file: "Dockerfile"
  gpu_args: --shm-size=2gb --gpus=all
tasks:
  infer:
    # Computes predictions on input data
    parameters:
      inputs: {
        data_path: data/,
        #parameters_file: parameters.yaml,
        # Feel free to include other files required for inference.
        # These files MUST go inside the additional_files path.
        # e.g. model weights
        # weights: additional_files/weights.pt,
      }
      outputs: {output_path: {type: directory, default: predictions}}
```
@vchung Yes, sure:
```
name: brats_goat
description: BraTS24
authors:
  - {name: Adam}
platform:
  accelerator_count: 1
docker:
  # Image name
  image: docker.synapse.org/syn54953987/biomediambzuai:latest
  # Docker build context relative to $MLCUBE_ROOT. Default is `build`.
  build_context: "../project"
  # Docker file name within docker build context, default is `Dockerfile`.
  build_file: "Dockerfile"
  gpu_args: --shm-size=2g
tasks:
  infer:
    # Computes predictions on input data
    parameters:
      inputs: {
        data_path: additional_files/dataset/,
        parameters_file: parameters.yaml,
        output_path: additional_files/predictions_gli/,
        checkpoint_dir: additional_files/checkpoints/
        # Feel free to include other files required for inference.
        # These files MUST go inside the additional_files path.
        # e.g. model weights
        # weights: additional_files/weights.pt,
      }
      outputs: {output_path: {type: directory, default: predictions}}
```
@abhijeetparida @adamsyah ,
Hm, do you mind sharing the `tasks` section of your mlcube.yaml file?
Hi. I am also facing this problem:
```
OSError: [Errno 30] Read-only file system: '/mlcube_io2/BraTS-ZZZ-00002-000.nii.gz'
? Model MLCube failed: There was an error while executing the cube.
```
EDIT: CC @vchung @dream_service
I still get a permission error: `PermissionError: [Errno 13] Permission denied: 'segresnet/data.json'`
@riky_levi @YunfeiXie @abhijeetparida @Qiaojun -
Apologies for that! I pushed up a fix that should resolve the read-only issue. Please use this updated command when testing the MLCube compatibility:
```sh
medperf --gpus=all test run \
--offline --no-cache \
--demo_dataset_url synapse:syn53368010 \
--demo_dataset_hash "18534824fb01ca92fb4eed991cb19d0f9445821bfd935b58e4935ac276c62d2c" \
-p ./prep_segmentation \
-e ./eval_segmentation \
-m FILEPATH_TO_YOUR_MLCUBE
```
If an issue still arises, please let us know~
EDIT: add another user tag
@vchung @ujjwalbaid I also have the same problem. Is there any solution to fix this?
@abhijeetparida We are experiencing a similar issue. Running Docker directly on our server works fine, but using the official command results in a permission denied error. If you have any suggestions or insights on how to resolve this, it would be greatly appreciated.
@vchung: I am also having trouble with permissions when I run the testing using the `medperf` command, whereas running the `mlcube` testing command works perfectly fine. I get a permissions error when I unzip a zip of all the model weights.
I encountered the same issues. Could you please help me fix this permissions problem? Thank you so much! @vchung @ujjwalbaid