
Commit 18585e7

Dev
2 parents bd92432 + 4b8467f

File tree

18 files changed (+8379, -25520 lines)


.github/workflows/build_and_deploy.yml

Lines changed: 9 additions & 1 deletion
```diff
@@ -54,7 +54,15 @@ jobs:
           curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/master/scripts/get-helm-3
           chmod 700 get_helm.sh
           ./get_helm.sh
-
+
+      - name: Check for running Kubernetes jobs
+        run: |
+          KUBECONFIG=kubeconfig.yaml
+          while kubectl get jobs -n ${{ env.NAMESPACE }} -o jsonpath='{.items[?(@.status.active)].metadata.name}' | grep "drexel-scheduler-cronjob"; do
+            echo "Waiting for jobs to complete..."
+            sleep 30
+          done
+
       - name: Deploy to Kubernetes
         run: |
           COMMIT_SHA=$(echo $GITHUB_SHA | cut -c1-7)
```
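The added step blocks the deploy while any scraper Job is still active, so a rollout can't land in the middle of a cronjob run. The same pattern works as a standalone script; here is a minimal sketch, assuming a `kubeconfig.yaml` in the working directory and a `NAMESPACE` environment variable, both standing in for what the workflow provides:

```bash
#!/usr/bin/env bash
# Wait until no active Job whose name contains "drexel-scheduler-cronjob"
# remains in the namespace. (Sketch: kubeconfig.yaml and NAMESPACE are
# assumptions mirroring what the workflow sets up.)
export KUBECONFIG=kubeconfig.yaml
NAMESPACE="${NAMESPACE:?NAMESPACE must be set}"

# The jsonpath filter prints only Jobs with .status.active set,
# i.e. Jobs that still have running pods.
while kubectl get jobs -n "$NAMESPACE" \
    -o jsonpath='{.items[?(@.status.active)].metadata.name}' \
    | grep -q "drexel-scheduler-cronjob"; do
  echo "Waiting for jobs to complete..."
  sleep 30
done
```

Note that the loop has no timeout of its own; a Job that never finishes would stall the deploy until the workflow's overall job timeout cancels the run.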

.github/workflows/functional_test.yml

Lines changed: 3 additions & 3 deletions
```diff
@@ -28,7 +28,7 @@ jobs:
         run: rm cache/*

       - name: Run the scraper
-        run: docker compose run scraper python3 main.py --db --all-colleges --ratings
+        run: docker compose run scraper python3 src/main.py --db --all-colleges --ratings

       - name: Verify courses data exists in database
         run: |
@@ -81,10 +81,10 @@
         fi

       - name: Reset database
-        run: docker compose run scraper python3 db.py create_tables
+        run: docker compose run scraper python3 src/db.py create_tables

       - name: Run scraper again (to test cache)
-        run: docker compose run scraper python3 main.py --db --all-colleges --ratings
+        run: docker compose run scraper python3 src/main.py --db --all-colleges --ratings

       - name: Verify courses data exists in database
         run: |
```
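Every invocation gains the `src/` prefix, so the test still exercises the full cycle: a cold scrape into the database, a table reset, and a second run that should hit the ratings cache. A rough local equivalent, assuming the compose service is named `scraper` as in these steps:

```bash
# Cold run: clear the cache, scrape all colleges, and load the database.
rm -f cache/*
docker compose run scraper python3 src/main.py --db --all-colleges --ratings

# Reset the tables, then run again; the second pass should be noticeably
# faster because ratings come from cache/ratings_cache.json.
docker compose run scraper python3 src/db.py create_tables
docker compose run scraper python3 src/main.py --db --all-colleges --ratings
```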

Dockerfile

Lines changed: 1 addition & 2 deletions
```diff
@@ -13,5 +13,4 @@ RUN pip install --upgrade pip
 RUN pip install -r requirements.txt

 # Run the Python script
-# CMD ["python3", "main.py"]
-CMD ["python3", "main.py", "--db", "--all-colleges", "--ratings", "--email"]
+CMD ["python3", "src/main.py", "--db", "--all-colleges", "--ratings", "--email"]
```

README.md

Lines changed: 14 additions & 14 deletions
````diff
@@ -31,30 +31,30 @@ To run the scraper, simply run:

 ###### Mac/Linux
 ```bash
-python3 main.py
+python3 src/main.py
 ```

 ###### Windows
 ```bash
-python main.py
+python src/main.py
 ```

 The scraper will output a JSON file called `data.json` in the same directory as the scraper.

-You can modify the scraper to scrape other terms by changing the `year`, `quarter`, and `college_code` variables in `config.py`.
+You can modify the scraper to scrape other terms by changing the `year`, `quarter`, and `college_code` variables in `src/config.py`.

 #### All Colleges

-To scrape all colleges instead of just the one specified in the `config.json`, run the following command:
+To scrape all colleges instead of just the one specified in the `src/config.py`, run the following command:

 ###### Mac/Linux
 ```bash
-python3 main.py --all-colleges
+python3 src/main.py --all-colleges
 ```

 ###### Windows
 ```bash
-python main.py --all-colleges
+python src/main.py --all-colleges
 ```

 #### Ratings
@@ -63,12 +63,12 @@ To also include the ratings field in `data.json` that requests data from RateMyP

 ###### Mac/Linux
 ```bash
-python3 main.py --ratings
+python3 src/main.py --ratings
 ```

 ###### Windows
 ```bash
-python main.py --ratings
+python src/main.py --ratings
 ```

 Note that this will take longer to run since the scraper has to look up the rating on RateMyProfessors. However, it will cache the ratings in a file called `ratings_cache.json` (inside the `cache` directory) so that it doesn't have to look up the same professor again, which will run much faster. If you want to clear the cache to get new ratings, simply delete the `ratings_cache.json` file.
@@ -81,21 +81,21 @@ Then run the scraper with the `--db` flag:

 ###### Mac/Linux
 ```bash
-python3 main.py --db
+python3 src/main.py --db
 ```

 ###### Windows
 ```bash
-python main.py --db
+python src/main.py --db
 ```

-This will create a new database `schedulerdb` and the necessary tables if they aren't already created, and then insert the data into the database. If the data is already populated, it will update the existing data. To delete all the data, make sure the environment variables specified in `db_config.py` are set and then run the following command (make sure you're using the Git Bash terminal if you're using Windows):
+This will create a new database `schedulerdb` and the necessary tables if they aren't already created, and then insert the data into the database. If the data is already populated, it will update the existing data. To delete all the data, make sure the environment variables specified in `src/db_config.py` are set and then run the following command (make sure you're using the Git Bash terminal if you're using Windows):

 ```bash
 ./reset_db.bash
 ```

-To view the schema for the tables, you can look at the `create_tables.sql` file.
+To view the schema for the tables, you can look at the `src/create_tables.sql` file.

 Connect to the database using the following command:

@@ -116,12 +116,12 @@ You can also combine all the options together:

 ###### Mac/Linux
 ```bash
-python3 main.py --db --all-colleges --ratings
+python3 src/main.py --db --all-colleges --ratings
 ```

 ###### Windows
 ```bash
-python main.py --db --all-colleges --ratings
+python src/main.py --db --all-colleges --ratings
 ```

 ## Docker
````
