
Commit 0eb776a

Author: Felipe Parodi

Correct repo hyperlinks.
1 parent d5cc36e commit 0eb776a

File tree: 10 files changed (+44, -46 lines)

.github/CONTRIBUTING.md

Lines changed: 7 additions & 7 deletions
@@ -5,7 +5,7 @@ Thank you for your interest in contributing to PrimateFace! We welcome contribut
 ## Ways to Contribute
 
 ### 1. Report Issues
-- Use the [GitHub Issues](https://github.com/PrimateFace/primateface_oss/issues) page
+- Use the [GitHub Issues](https://github.com/KordingLab/PrimateFace/issues) page
 - Check if the issue already exists before creating a new one
 - Provide detailed information including:
   - Your environment (OS, Python version, package versions)
@@ -17,7 +17,7 @@ Thank you for your interest in contributing to PrimateFace! We welcome contribut
 
 #### Getting Started
 1. Fork the repository
-2. Clone your fork: `git clone https://github.com/YOUR_USERNAME/primateface_oss.git`
+2. Clone your fork: `git clone https://github.com/YOUR_USERNAME/PrimateFace.git`
 3. Create a new branch: `git checkout -b feature-name`
 4. Set up your development environment:
 ```bash
@@ -88,8 +88,8 @@ Please ensure:
 ### Environment Setup
 ```bash
 # Clone the repo
-git clone https://github.com/PrimateFace/primateface_oss.git
-cd primateface_oss
+git clone https://github.com/KordingLab/PrimateFace.git
+cd PrimateFace
 
 # Create conda environment
 conda create -n primateface-dev python=3.10
@@ -130,13 +130,13 @@ pytest --cov=primateface tests/
 ## Questions?
 
 - Email: primateface@gmail.com
-- GitHub Discussions: [Link](https://github.com/PrimateFace/primateface_oss/discussions)
-- Issues: [Link](https://github.com/PrimateFace/primateface_oss/issues)
+- GitHub Discussions: [Link](https://github.com/KordingLab/PrimateFace/discussions)
+- Issues: [Link](https://github.com/KordingLab/PrimateFace/issues)
 
 ## License
 
 By contributing to PrimateFace, you agree that your contributions will be licensed under the MIT License.
 
 ## Acknowledgments
 
-Thank you to all our contributors! Your efforts help advance primate behavioral research and computer vision.
+Thank you to all our contributors! Your efforts help advance primate behavioral research and computer vision.

CHANGELOG.md

Lines changed: 2 additions & 2 deletions
@@ -59,6 +59,6 @@ See [CONTRIBUTING.md](.github/CONTRIBUTING.md) for information on how to contrib
 ## Links
 
 - [Documentation](https://primateface.studio)
-- [Paper](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2)
+- [Paper](https://www.biorxiv.org/content/10.1101/2025.08.12.669927)
 - [Dataset](https://huggingface.co/datasets/fparodi/PrimateFace)
-- [GitHub](https://github.com/PrimateFace/primateface_oss)
+- [GitHub](https://github.com/KordingLab/PrimateFace)

README.md

Lines changed: 11 additions & 13 deletions
@@ -3,7 +3,7 @@
 # **PrimateFace: A Machine Learning Resource for Automated Primate Face Analysis**
 
 <p align="center">
-<a href="https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2">
+<a href="https://www.biorxiv.org/content/10.1101/2025.08.12.669927">
 <img src="https://img.shields.io/badge/Preprint-bioRxiv-orange" alt="bioRxiv">
 </a>
 &nbsp;&nbsp;
@@ -20,11 +20,11 @@
 </a>
 </p>
 
-PrimateFace contains data, models, and tutorials for analyzing facial behavior across primates ([Parodi et al., 2025](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2)).
+PrimateFace contains data, models, and tutorials for analyzing facial behavior across primates ([Parodi et al., 2025](https://www.biorxiv.org/content/10.1101/2025.08.12.669927)).
 
 This codebase enables you to use an off-the-shelf PrimateFace model for tracking facial movements or you can quickly fine-tune a PrimateFace model.
 
-Most of the PrimateFace modules require GPU access. If you don't have access to a GPU, you can still use PrimateFace in Google Colab (see [notebooks](https://docs.primateface.studio/notebooks/index.html)).
+Most of the PrimateFace modules require GPU access. If you don't have access to a GPU, you can still use PrimateFace in Google Colab (see [tutorials](https://docs.primateface.studio/tutorials/)).
 
 
 <p align="center">
@@ -35,7 +35,7 @@ Most of the PrimateFace modules require GPU access. If you don't have access to
 #### **Quick Start**
 1. Test the [Hugging Face demo](https://huggingface.co/datasets/fparodi/PrimateFace) to get a feel for the capabilities of PrimateFace on your own data.
 
-2. Run through the [Google Colab Notebook tutorials](https://docs.primateface.studio/notebooks/index.html) to explore several applications of PrimateFace.
+2. Run through the [Google Colab Notebook tutorials](https://docs.primateface.studio/tutorials/) to explore several applications of PrimateFace.
 
 3. Clone this repository, install the dependencies, and run through the different modules (e.g., DINOv2, image and video demos, pseudo-labeling GUI, etc.) to fully utilize PrimateFace.
 
@@ -118,18 +118,16 @@ Note: You may see a harmless `RequestsDependencyWarning` about urllib3 versions
 
 #### Links
 - [Documentation Homepage](https://docs.primateface.studio)
-- [Notebook Tutorials](https://docs.primateface.studio/notebooks/index.html)
-
-<!-- TODO: fix links for KordingLab -->
+- [Notebook Tutorials](https://docs.primateface.studio/tutorials/)
 
 | Tutorial | Open in Colab |
 |---------|----------------|
-| **1. Lemur Face Visibility Time-Stamping** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App1_Lemur_time_stamping.ipynb) |
-| **2. Rapid Macaque Face Recognition** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App2_Macaque_Face_Recognition.ipynb) |
-| **3. Howler Vocal-Motor Coupling** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App3_Howler_Vocal_Motor_Coupling.ipynb) |
-| **4. Human Infant Social Gaze Tracking** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App4_Gaze_following.ipynb) |
-| **5. Data-Driven Discovery of Facial Actions** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App5_Data_Driven_Discovery_of_Facial_Actions.ipynb) |
-| **6. Cross-Subject Neural Decoding of Facial Actions** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/blob/main/demos/notebooks/App6_Cross_Subject_Neural_Decoding_of_Facial_Actions.ipynb) |
+| **1. Lemur Face Visibility Time-Stamping** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App1_Lemur_time_stamping.ipynb) |
+| **2. Rapid Macaque Face Recognition** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App2_Macaque_Face_Recognition.ipynb) |
+| **4. Human Infant Social Gaze Tracking** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App4_Gaze_following.ipynb) |
+| **3. Howler Vocal-Motor Coupling** | Coming soon |
+| **5. Data-Driven Discovery of Facial Actions** | Coming soon |
+| **6. Cross-Subject Neural Decoding of Facial Actions** | Coming soon |
 
 
 #### References

demos/notebooks/App1_Lemur_time_stamping.ipynb

Lines changed: 3 additions & 3 deletions
@@ -23,7 +23,7 @@
 "\n",
 "| GitHub | Paper | Website |\n",
 "|---|---|---|\n",
-"| [Code](https://github.com/KordingLab/PrimateFace) | [Preprint](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2) | [Project](https://primateface.studio/) |\n",
+"| [Code](https://github.com/KordingLab/PrimateFace) | [Preprint](https://www.biorxiv.org/content/10.1101/2025.08.12.669927) | [Project](https://primateface.studio/) |\n",
 "\n",
 "Welcome! This tutorial notebook walks through using a **PrimateFace** detection model to automatically find and timestamp primate faces in videos, creating a 'visibility baseline' for behavioral coding.\n",
 "\n",
@@ -1568,7 +1568,7 @@
 "cell_type": "markdown",
 "source": [
 "## 9. Resources\n",
-"1. [PrimateFace](https://github.com/PrimateFace/PrimateFace)\n",
+"1. [PrimateFace](https://github.com/KordingLab/PrimateFace)\n",
 "2. [mmdetection](https://github.com/open-mmlab/mmdetection)\n",
 "3. [roboflow](https://roboflow.com/)"
 ],
@@ -1577,4 +1577,4 @@
 }
 }
 ]
-}
+}

demos/notebooks/App2_Macaque_Face_Recognition.ipynb

Lines changed: 3 additions & 3 deletions
@@ -370,7 +370,7 @@
 "\n",
 "| GitHub | Paper | Website |\n",
 "|---|---|---|\n",
-"| [Code](https://github.com/KordingLab/PrimateFace) | [Preprint](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2) | [Project](https://primateface.studio/) |\n",
+"| [Code](https://github.com/KordingLab/PrimateFace) | [Preprint](https://www.biorxiv.org/content/10.1101/2025.08.12.669927) | [Project](https://primateface.studio/) |\n",
 "\n",
 "Welcome! This tutorial notebook demonstrates a complete, closed-set face recognition pipeline. That is, given a set of images with corresponding primate IDs, we will chain together PrimateFace models with off-the-shelf face recognition models to achieve rapid identity recognition.\n",
 "\n",
@@ -2668,7 +2668,7 @@
 "cell_type": "markdown",
 "source": [
 "## 7. Resources\n",
-"1. [PrimateFace](https://github.com/PrimateFace/PrimateFace)\n",
+"1. [PrimateFace](https://github.com/KordingLab/PrimateFace)\n",
 "2. [mmdetection](https://github.com/open-mmlab/mmdetection)\n",
 "3. [mmpose](https://github.com/open-mmlab/mmpose)\n",
 "4. [InsightFace](https://github.com/deepinsight/insightface)"
@@ -2678,4 +2678,4 @@
 }
 }
 ]
-}
+}

demos/notebooks/App4_Gaze_following.ipynb

Lines changed: 3 additions & 3 deletions
@@ -24,7 +24,7 @@
 "\n",
 "| GitHub Repo | Paper | Project Page |\n",
 "|---|---|---|\n",
-"| [PrimateFace](https://github.com/PrimateFace/PrimateFace) | [PrimateFace](https://arxiv.org/abs/10000) | [PrimateFace](https://primateface.github.io/) |\n",
+"| [PrimateFace](https://github.com/KordingLab/PrimateFace) | [PrimateFace](https://www.biorxiv.org/content/10.1101/2025.08.12.669927) | [PrimateFace](https://docs.primateface.studio/) |\n",
 "\n",
 "Welcome! This tutorial notebook demonstrates a gaze-following heuristic for videos containing two primates. It uses models from the **PrimateFace** and **Gazelle** projects to analyze social attention dynamics.\n",
 "\n",
@@ -496,7 +496,7 @@
 "source": [
 "\n",
 "## 8. Resources\n",
-"1. [PrimateFace](https://github.com/PrimateFace/PrimateFace)\n",
+"1. [PrimateFace](https://github.com/KordingLab/PrimateFace)\n",
 "2. [Gazelle](https://gazelle-gaze.github.io/)\n",
 "3. [mmdetection](https://github.com/open-mmlab/mmdetection)"
 ],
@@ -505,4 +505,4 @@
 }
 }
 ]
-}
+}

demos/notebooks/README.md

Lines changed: 7 additions & 7 deletions
@@ -2,13 +2,13 @@
 
 These notebooks are designed to help you get started with PrimateFace.
 
-These correspond to the scientific applications outlined in the PrimateFace [paper](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2).
+These correspond to the scientific applications outlined in the PrimateFace [paper](https://www.biorxiv.org/content/10.1101/2025.08.12.669927).
 
 | Tutorial | Open in Colab |
 |---------|----------------|
-| **1. Lemur Face Visibility Time-Stamping** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App1_Lemur_time_stamping.ipynb) |
-| **2. Rapid Macaque Face Recognition** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App2_Macaque_Face_Recognition.ipynb) |
-| **3. Howler Vocal-Motor Coupling** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App3_Howler_Vocal_Motor_Coupling.ipynb) |
-| **4. Human Infant Social Gaze Tracking** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App4_Gaze_following.ipynb) |
-| **5. Data-Driven Discovery of Facial Actions** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App5_Data_Driven_Discovery_of_Facial_Actions.ipynb) |
-| **6. Cross-Subject Neural Decoding of Facial Actions** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/PrimateFace/primateface_oss/demos/notebooks/App6_Cross_Subject_Neural_Decoding_of_Facial_Actions.ipynb) |
+| **1. Lemur Face Visibility Time-Stamping** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App1_Lemur_time_stamping.ipynb) |
+| **2. Rapid Macaque Face Recognition** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App2_Macaque_Face_Recognition.ipynb) |
+| **4. Human Infant Social Gaze Tracking** | [![Open](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/KordingLab/PrimateFace/blob/main/demos/notebooks/App4_Gaze_following.ipynb) |
+| **3. Howler Vocal-Motor Coupling** | Coming soon |
+| **5. Data-Driven Discovery of Facial Actions** | Coming soon |
+| **6. Cross-Subject Neural Decoding of Facial Actions** | Coming soon |

docs/index.md

Lines changed: 1 addition & 1 deletion
@@ -132,7 +132,7 @@ For pressing questions or collaborations, reach out via:
 
 If you use PrimateFace in your research, please cite:
 
-[Parodi et al., 2025](https://www.biorxiv.org/content/10.1101/2025.08.12.669927v2)
+[Parodi et al., 2025](https://www.biorxiv.org/content/10.1101/2025.08.12.669927)
 
 ```bibtex
 @article{parodi2025primateface,

gui/pseudolabel.py

Lines changed: 2 additions & 2 deletions
@@ -492,7 +492,7 @@ def cmd_refine(args: argparse.Namespace) -> None:
     print(" 2. Edit COCO JSON files directly")
     print(" 3. Use annotation tools like CVAT, Label Studio, or VGG VIA")
     print(" that support COCO format import/export")
-    print("\nFor updates, check: https://github.com/PrimateFace/primateface_oss")
+    print("\nFor updates, check: https://github.com/KordingLab/PrimateFace")
     sys.exit(0)
 
 
@@ -624,4 +624,4 @@ def main() -> None:
 
 
 if __name__ == "__main__":
-    main()
+    main()

pyproject.toml

Lines changed: 5 additions & 5 deletions
@@ -123,10 +123,10 @@ all = [
 
 [project.urls]
 Homepage = "https://primateface.studio"
-Documentation = "https://primateface.studio/docs"
-Repository = "https://github.com/PrimateFace/primateface_oss"
-"Bug Reports" = "https://github.com/PrimateFace/primateface_oss/issues"
-Paper = "https://www.biorxiv.org/content/10.1101/2025.08.12.669927v1"
+Documentation = "https://docs.primateface.studio"
+Repository = "https://github.com/KordingLab/PrimateFace"
+"Bug Reports" = "https://github.com/KordingLab/PrimateFace/issues"
+Paper = "https://www.biorxiv.org/content/10.1101/2025.08.12.669927"
 "Hugging Face" = "https://huggingface.co/datasets/fparodi/PrimateFace"
 
 [tool.setuptools.packages.find]
@@ -135,4 +135,4 @@ include = ["primateface*", "dataset*", "demos*", "dinov2*", "evals*", "gui*", "l
 exclude = ["tests*", "docs*", "notebooks*"]
 
 [tool.setuptools.package-data]
-"*" = ["*.json", "*.yaml", "*.yml", "*.txt", "*.md"]
+"*" = ["*.json", "*.yaml", "*.yml", "*.txt", "*.md"]
