This repository was archived by the owner on Jun 22, 2022. It is now read-only.

Commit ceb7d16
Author: minerva-ml
Merge branch 'master' of github.com:neptune-ml/open-solution-ship-detection
2 parents: ba37aed + 7b38eb6

File tree: 1 file changed, +117 -6 lines


README.md

Lines changed: 117 additions & 6 deletions
@@ -3,9 +3,6 @@
This is an open solution to the [Airbus Ship Detection Challenge](https://www.kaggle.com/c/airbus-ship-detection).

- ## More competitions :sparkler:
- Check collection of [public projects :gift:](https://app.neptune.ml/-/explore), where you can find multiple Kaggle competitions with code, experiments and outputs.

## Our goals
We are building an entirely open solution to this competition. Specifically:
1. **Learning from the process** - updates about new ideas, code and experiments are the best way to learn data science. Our activity is especially useful for people who want to enter the competition but lack the appropriate experience.
@@ -27,9 +24,123 @@ In this open source solution you will find references to the [neptune.ml](https:
| link to code | CV | LB |
|:---:|:---:|:---:|
- |[solution 0](https://github.com/neptune-ml/open-solution-ship-detection/tree/solution-1)|XXX|XXX|
- |solution 1|XXX|0.855|
- |solution 2|XXX|0.875|
|solution 1|0.541|0.573|

## Start experimenting with ready-to-use code

You can jump-start your participation in the competition by using our starter pack. The installation instructions below will guide you through the setup.

### Installation *(fast track)*

1. Clone the repository and install requirements (*use Python 3.5*): `pip3 install -r requirements.txt` (see the sketch below this list)
1. Register at [neptune.ml](https://neptune.ml) _(if you wish to use it)_
1. Run an experiment based on U-Net:
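A minimal sketch of the clone-and-install step, assuming the standard GitHub clone URL for this repository:

```bash
# Fetch the starter pack and install its Python 3.5 dependencies
git clone https://github.com/neptune-ml/open-solution-ship-detection.git
cd open-solution-ship-detection
pip3 install -r requirements.txt
```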
#### Cloud
```bash
neptune account login
```

Create a project, say Ships (SHIP).

Go to `neptune.yaml` and change:

```yaml
project: USERNAME/PROJECT_NAME
```

to your username and project name.

Prepare the metadata and overlayed target masks. This only needs to be **done once**:

```bash
neptune send --worker xs \
--environment base-cpu-py3 \
--config neptune.yaml \
prepare_metadata.py
```

They will be saved in:

```yaml
metadata_filepath: /output/metadata.csv
masks_overlayed_dir: /output/masks_overlayed
```
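To sanity-check the prepared files, a minimal sketch (assuming you are on a machine where the `/output` paths above are directly accessible):

```bash
# Peek at the generated metadata and the overlayed masks directory
head -n 5 /output/metadata.csv
ls /output/masks_overlayed | head
```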
From now on we will load the metadata by changing `neptune.yaml` to:

```yaml
metadata_filepath: /input/metadata.csv
masks_overlayed_dir: /input/masks_overlayed
```

and adding the path to the experiment that generated the metadata, say SHIP-1, to every command, for example `--input /SHIP-1/output/metadata.csv`.
Let's train the model by running `main.py`:

```bash
neptune send --worker m-2p100 \
--environment pytorch-0.3.1-gpu-py3 \
--config neptune.yaml \
--input /SHIP-1/output/metadata.csv \
--input /SHIP-1/output/masks_overlayed \
main.py
```

The model will be saved in:

```yaml
experiment_dir: /output/experiment
```

and the `submission.csv` will be saved in `/output/experiment/submission.csv`.
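If you want to push that file to Kaggle, a minimal sketch (assuming the official `kaggle` CLI is installed and configured with your API token, and that you have downloaded `submission.csv` locally):

```bash
# Hypothetical example: submit a locally downloaded submission.csv to the competition
kaggle competitions submit -c airbus-ship-detection \
  -f submission.csv \
  -m "open-solution-ship-detection submission"
```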

You can easily use models trained during one experiment in other experiments. For example, when running evaluation we need to use the model folder from a previous experiment. We do that by changing `main.py`:

```python
CLONE_EXPERIMENT_DIR_FROM = '/SHIP-2/output/experiment'
```

and running the following command:
```bash
neptune send --worker m-2p100 \
--environment pytorch-0.3.1-gpu-py3 \
--config neptune.yaml \
--input /SHIP-1/output/metadata.csv \
--input /SHIP-1/output/masks_overlayed \
--input /SHIP-2 \
main.py
```

#### Local

Log in to neptune if you want to use it:

```bash
neptune account login
```

Prepare the metadata by running:

```bash
neptune run --config neptune.yaml prepare_metadata.py
```

Run training and inference with `main.py`:

```bash
neptune run --config neptune.yaml main.py
```

You can always run it with pure Python :snake:

```bash
python main.py
```

## Get involved
You are welcome to contribute your code and ideas to this open solution. To get started:
