### AddBiomechanics

[![DOI](https://zenodo.org/badge/398424759.svg)](https://zenodo.org/badge/latestdoi/398424759)

This is an open effort to assemble a large dataset of human motion. We're hoping to facilitate this by providing easy-to-use tools that can automatically process motion capture data and prepare it for biomechanical analysis. We're also working to provide large aggregate datasets in standard formats, along with tools to easily handle the data, in the near future.

### Getting Set Up (for Stanford Developers)

*A note for non-Stanford devs: these instructions probably won't help you!* We share the AddBiomechanics source code so that researchers can fully understand our methods, but we highly encourage you to use the web application rather than building the code from source. We are a small team and are not able to support individuals wishing to build from source; you're welcome to try, but it's probably going to be harder than you hope, and we're sorry about that. Part of the complexity is that the cloud application is built to interface directly with a web of AWS resources, each of which has its own (currently undocumented) IAM setup, provisioned and continually maintained by our team for the public instance of AddBiomechanics. If you are trying to run your own independent instance to avoid sharing data, then even if we gave you the permissions files referenced in these instructions, your code would by default talk to our AWS resources and effectively just join our cluster. If you want it to talk to your own resources, we cannot offer support debugging your setup.

## Getting Set Up for Development (frontend)

1. Download the [aws-exports-dev.js](https://drive.google.com/file/d/1IBr3Fm-8rYeGudyWLvIEGPkdzdpR0I90/view?usp=sharing) file, rename it `aws-exports.js`, and put it in the `frontend/src` folder.
2. Run `yarn start` to launch the app!
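The two steps above amount to dropping the right exports file into place before launching. As a sketch (the helper function and the idea of keeping both downloaded files next to the repo are our own, not part of the repo), switching environments can be made explicit:

```shell
# Hypothetical helper: keep both downloaded exports files around and copy the
# one you want into place. It only prints the copy command, so the sketch is
# safe to run anywhere; swap `echo` for a real `cp` to use it for real.
use_aws_exports() {   # usage: use_aws_exports dev|prod
  local env="$1"
  echo "cp aws-exports-${env}.js frontend/src/aws-exports.js"
}

use_aws_exports dev   # then: yarn start
```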
## Notes (frontend)

Note: the instructions above point your local frontend at the dev servers. If you would rather interact with the production servers, download the [aws-exports-prod.js](https://drive.google.com/file/d/1VZVgHHwSP-xmJW-qZeQ6U92FYWoU36aP/view?usp=sharing) file, rename it `aws-exports.js`, and put it in the `frontend/src` folder.

Because the app is designed to be served as a static single-page application (see the wiki for details), running it locally with the appropriate `aws-exports.js` will behave exactly the same as viewing it from [dev.addbiomechanics.org](https://dev.addbiomechanics.org) (dev servers) or [app.addbiomechanics.org](https://app.addbiomechanics.org) (prod servers).

## Getting Set Up For Deployment (frontend)

1. Log in with the AddBiomechanics AWS root account on your local `aws` CLI.
2. Install the Amplify CLI: `npm install -g @aws-amplify/cli` (may require `sudo`, depending on your setup).
3. From inside the `frontend` folder, run `amplify configure`, and follow the instructions to create a new IAM user for your computer (in the 'us-west-2' region).
4. From inside the `frontend` folder, run `amplify init`:
   a. When asked "Do you want to use an existing environment?" say YES.
   b. Choose the environment "dev".
   c. Choose anything you like for your default editor.
   d. Select the authentication method "AWS profile", and select the profile you created in step 3.
5. Run `yarn start` to launch the app!
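For reference, the deployment steps above can be collected into one script. This is a sketch of ours, not something the repo ships: the `run` wrapper and `DRY_RUN` guard are our own additions so the sequence can be previewed without touching AWS.

```shell
# Sketch of the deployment steps above. With DRY_RUN=1 (the default) each
# command is only printed; set DRY_RUN=0 once you're ready to run them for real.
DRY_RUN="${DRY_RUN:-1}"
run() {
  if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run npm install -g @aws-amplify/cli   # may need sudo
run amplify configure                 # create an IAM user in us-west-2
run amplify init                      # pick the existing "dev" environment
run yarn start
```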
## Getting Set Up For Development (server processing algorithm)

The core algorithm for processing data lives in `server/engine/engine.py`. To test changes to `engine.py`:

1. Run `pip3 install -r /engine/requirements.txt`
2. Download the [`test_engine.sh` script](https://drive.google.com/file/d/1n-9KSv-wZevuVNwShb1Ur36MRAZlnNhv/view?usp=share_link) and place it in this directory.
3. Download the [test_data/ folder](https://drive.google.com/drive/folders/1jGfgM1m13ksqLZByKUEoUwsy22OVtEza?usp=share_link) (ask Keenon for access) and place it in this directory.
4. Run `./test_engine.sh` to test your changes to `engine.py` on existing data. Change the line `TEST_NAME="opencap_test"` to run against other folder names you find in `test_data/` (careful: don't include the `_original` suffix, or you'll accidentally overwrite your input data).
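The `_original` caveat in step 4 is easy to trip over, so one option is to add a small guard near the top of your local copy of `test_engine.sh`. This is a hypothetical addition of ours, not something the script ships with:

```shell
# Hypothetical guard: refuse to run against a *_original folder, since those
# hold the pristine copies of the input data that a run would overwrite.
check_test_name() {
  case "$1" in
    *_original) echo "refusing '$1': that folder holds original input data" >&2
                return 1 ;;
    *)          return 0 ;;
  esac
}

TEST_NAME="opencap_test"   # the line you edit to pick a different test case
check_test_name "$TEST_NAME" && echo "ok to run against $TEST_NAME"
```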
## Hosting a Processing Server

1. Go into the `server` folder.
2. Download the `server_credentials.csv` file, which Keenon can give you a link to.
3. Run `docker build -f Dockerfile.dev .` (to run a dev server) or `docker build -f Dockerfile.prod .` (to run a prod server) to build the Docker container that runs the server. It's important to rebuild the Docker container each time you boot a new server, since the build sets it up with its own PubSub connection.
4. Run the Docker container you just built! That's your server. Leave it running as a process.
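The build-then-run cycle above can be parameterized by environment. The `SERVER_ENV` variable and the image tag here are our own conventions for illustration; the repo only specifies the two Dockerfiles. The `docker` commands are left commented so the sketch runs even without Docker installed:

```shell
# Choose dev vs prod once, and derive the Dockerfile and an image tag from it.
SERVER_ENV="${SERVER_ENV:-dev}"           # "dev" or "prod"
DOCKERFILE="Dockerfile.${SERVER_ENV}"
TAG="addbiomechanics-server:${SERVER_ENV}"
echo "would build ${TAG} from server/${DOCKERFILE}"
# Remember: rebuild every time you boot a new server (fresh PubSub connection).
# (cd server && docker build -f "${DOCKERFILE}" -t "${TAG}" .)
# docker run "${TAG}"
```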
## Switching between Dev and Prod

By default, the `main` branch is pointed at the dev servers. We keep the current prod version on the `prod` branch.