Update docker implementation #14

Open · wants to merge 7 commits into `master`
10 changes: 10 additions & 0 deletions .dockerignore
@@ -0,0 +1,10 @@
__pycache__/

.dockerignore
docker-compose.yml

dictionaries/*
!dictionaries/readme.txt
!dictionaries/sample_dict.npy

.git*
1 change: 0 additions & 1 deletion .gitignore
@@ -1,5 +1,4 @@
__pycache__/
docker-compose.yml
dictionaries/*
!dictionaries/readme.txt
!dictionaries/sample_dict.npy
25 changes: 17 additions & 8 deletions README.md
@@ -33,22 +33,26 @@ pip install -r requirements.txt
```

## Running with Docker

You can also use Docker if you are familiar with it. Here are the steps to run the tool with Docker.

- First you must build the container: `docker build . -t surpriver`
- Then you need to copy the contents of docker-compose.yml.template to a new file called docker-compose.yml
- Replace `<C:\\path\\to\\this\\dir>` with the directory you are working in.
- Run the container by executing `docker-compose up -d`
- Execute any of the commands below by prepending `docker exec -it surpriver` to your command line.
- First you must build the container: `make`
- Execute any of the commands below by prepending `docker-compose exec surpriver` to your command (a full example follows this list).
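
For illustration, the complete workflow might look like this (the detection command is the same one shown later in this README; adjust the flags to your needs):

```shell
# Build the image and start the surpriver container in the background
make

# Run any of the detection commands from this README inside the container
docker-compose exec surpriver \
    python detection_engine.py --top_n 25 --min_volume 5000 \
    --data_granularity_minutes 60 --history_to_use 14 \
    --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' \
    --is_save_dictionary 1 --is_test 0 --future_bars 0
```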

### Predictions for Today
If you want to directly get the most anomalous stocks for today, you can simply run the following command to get the stocks with the most unusual patterns. We will dive deeper into the command in the following sections.

#### Get Most Anomalous Stocks for Today
##### When you do not have the data dictionary saved and you are running it for the first time.
```

```shell
make today

# OR

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0
```

This command will give you the top **25 stocks** that had the highest anomaly score in the last **14 bars** of **60 minute candles**. It will also store all the data it used to make predictions in the **dictionaries/data_dict.npy** file. Below is a more detailed explanation of each parameter.
- **top_n**: The total number of most anomalous stocks you want to see.
- **min_volume**: Filter for volume. Any stock with an average volume lower than this value will be ignored.
@@ -61,8 +65,13 @@ This command will give you the top **25 stocks** that had the highest anomaly score
- **future_bars**: This number of bars will be saved from the recent history for testing purposes.
- **output_format**: The format for results. If you pass CLI, the results will be printed to the console. If you pass JSON, a JSON file will be created with the results for today's date. The default is CLI (see the example after this list).
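
As a sketch of the `output_format` option, the same command can write a JSON file instead of printing to the console (the output file name is whatever the engine chooses for today's date):

```shell
docker-compose exec surpriver \
    python detection_engine.py --top_n 25 --min_volume 5000 \
    --data_granularity_minutes 60 --history_to_use 14 \
    --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' \
    --is_save_dictionary 1 --is_test 0 --future_bars 0 --output_format 'JSON'
```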

##### When you have the data dictionary saved, you can just run the following command.
```
##### When you have the data dictionary saved, you can just run the following command.

```shell
make history

# OR

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 1 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 0 --is_test 0 --future_bars 0 --output_format 'CLI'
```
Notice the change in **is_save_dictionary** and **is_load_from_dictionary**.
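
Putting the two modes together, a typical session with the new make targets might look like this (a sketch: `make today` saves the dictionary on its first run, `make history` reuses it):

```shell
# First run: fetch fresh data and save it to dictionaries/data_dict.npy
make today

# Later runs: load the saved dictionary instead of re-downloading everything
make history
```
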
8 changes: 8 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,8 @@
version: "3"

services:
surpriver:
build: .
restart: always
volumes:
- "./:/usr/src/app"
8 changes: 0 additions & 8 deletions docker-compose.yml.template

This file was deleted.

7 changes: 5 additions & 2 deletions dockerfile
@@ -1,8 +1,11 @@
FROM python:3.8

# Setup environment
RUN cp /usr/local/bin/pip3.8 /usr/local/bin/pip3 # reenable pip3

# reenable pip3
RUN cp /usr/local/bin/pip3.8 /usr/local/bin/pip3
RUN pip3 install --upgrade pip

WORKDIR /usr/src/app

# Install requirements
@@ -13,4 +16,4 @@ COPY . .

VOLUME ["/usr/src/app"]

CMD ["/usr/src/app/entry_point.sh"]
CMD ["./entry_point.sh"]
Empty file modified entry_point.sh
100644 → 100755
Empty file.
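
The mode change above (100644 → 100755) is what lets the Dockerfile's `CMD ["./entry_point.sh"]` execute the script directly. Outside of this PR, the equivalent step would presumably be:

```shell
# Mark the entry point script as executable before building the image
chmod +x entry_point.sh
```
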
37 changes: 37 additions & 0 deletions makefile
@@ -0,0 +1,37 @@
start: dockerfile docker-compose.yml
	docker-compose up -d --build

stop:
	docker-compose stop

down:
	docker-compose down

ps:
	docker-compose ps

today: start
	docker-compose exec surpriver \
		python detection_engine.py \
		--top_n 25 \
		--min_volume 5000 \
		--data_granularity_minutes 60 \
		--history_to_use 30 \
		--is_load_from_dictionary 0 \
		--data_dictionary_path 'dictionaries/data_dict.npy' \
		--is_save_dictionary 1 \
		--is_test 0 \
		--future_bars 0

history: start
	docker-compose exec surpriver \
		python detection_engine.py \
		--top_n 25 \
		--min_volume 5000 \
		--data_granularity_minutes 60 \
		--history_to_use 14 \
		--is_load_from_dictionary 1 \
		--data_dictionary_path 'dictionaries/data_dict.npy' \
		--is_save_dictionary 0 \
		--is_test 0 \
		--future_bars 0
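
For reference, a rough sketch of how the housekeeping targets above would typically be used (target names taken from this makefile; `start` is the first target, so plain `make` builds and starts the container):

```shell
make        # default target `start`: build the image and bring the container up detached
make ps     # show the state of the compose service
make stop   # stop the container without removing it
make down   # stop and remove the container and its network
```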