diff --git a/.gitignore b/.gitignore
index 7ae86ed..c6115f2 100644
--- a/.gitignore
+++ b/.gitignore
@@ -3,6 +3,7 @@
 # Build outputs
 docs-expanded/
+# docs/ - removed from gitignore to track for GitHub Pages deployment
 
 # Quarto files
 .quarto/
@@ -17,8 +18,9 @@
 __pycache__/
 venv/
 env/
-# Generated knowledge base
-pheno_knowledge_base_expanded/knowledge-base-context.txt
+
+# Lambda deployment package (build artifact)
+lambda/lambda-deployment.zip
 
 # OS files
 .DS_Store
@@ -31,3 +33,4 @@
 Thumbs.db
 *.swo
 *~
+
diff --git a/LAMBDA_SETUP.md b/LAMBDA_SETUP.md
new file mode 100644
index 0000000..7c4aaaf
--- /dev/null
+++ b/LAMBDA_SETUP.md
@@ -0,0 +1,149 @@
+# Lambda Backend Setup - Summary
+
+## What Was Created
+
+### 1. Lambda Function (`lambda/lambda_function.py`)
+- **What it does:** Converts the Flask app to a Lambda handler
+- **Features:**
+  - Handles HTTP requests (GET /api/health, POST /api/chat)
+  - CORS protection (only your GitHub Pages URL can call it)
+  - Loads the context file from the Lambda package
+  - Calls the OpenRouter API (same as before)
+  - Saves conversations to S3 (with logging)
+  - Returns answers in the same format
+
+### 2. Lambda Requirements (`lambda/requirements.txt`)
+- **What it does:** Lists the Python packages needed
+- **Packages:**
+  - `requests` - For calling the OpenRouter API
+  - `boto3` - For saving to S3
+
+### 3. Packaging Script (`lambda/package_lambda.sh`)
+- **What it does:** Creates the zip file for Lambda deployment
+- **Includes:**
+  - Lambda function code
+  - Python dependencies
+  - Knowledge base context file
+
+### 4. IAM Policy (`lambda/iam-policy.json`)
+- **What it does:** Defines the permissions Lambda needs
+- **Permissions:**
+  - Write to S3 bucket (save conversations)
+  - Write to CloudWatch logs (automatic logging)
+
+### 5. Updated Files
+- **`deploy.sh`:** Changed to use BACKEND_URL (the Lambda URL) instead of an API key
+- **`env.example`:** Updated to show the Lambda URL format
+
+## How It Works
+
+### Architecture Flow:
+```
+GitHub Pages (Frontend)
+    ↓
+CORS Check (Lambda)
+    ↓
+Lambda Function
+    ↓
+OpenRouter API
+    ↓
+S3 (Save conversation)
+    ↓
+Return answer to frontend
+```
+
+### Key Differences from Flask:
+
+1. **No Flask:** Lambda uses a handler function, not Flask routes
+2. **CORS:** Manual headers instead of Flask-CORS
+3. **Context:** Loaded from the package, not the filesystem
+4. **S3 Logging:** Conversations saved to S3 automatically
+5. **Environment:** Uses AWS environment variables, not a .env file
+
+## Next Steps
+
+### 1. Create S3 Bucket
+- Name: `pheno-chatbot-conversations` (or your choice)
+- Region: Same as Lambda
+
+### 2. Package Lambda
+```bash
+cd lambda
+chmod +x package_lambda.sh
+./package_lambda.sh
+```
+
+### 3. Deploy to AWS Lambda
+- Upload `lambda-deployment.zip`
+- Set handler: `lambda_function.lambda_handler`
+- Set environment variables:
+  - `OPENROUTER_API_KEY` - Your OpenRouter API key
+  - `S3_BUCKET_NAME` - Your S3 bucket name
+  - `FRONTEND_URL` - Your GitHub Pages URL (for CORS)
+
+### 4. Configure IAM Role
+- Attach the policy from `iam-policy.json` to the Lambda execution role
+- Update the bucket name in the policy if different
+
+### 5. Enable Function URL
+- In the Lambda console, create a Function URL
+- Copy the URL
+- Update `.env` with: `BACKEND_URL=`
+
+### 6. Deploy Frontend
+```bash
+./deploy.sh
+```
+
+## Environment Variables Needed
+
+### In Lambda (AWS Console):
+- `OPENROUTER_API_KEY` - Required
+- `S3_BUCKET_NAME` - Required (default: pheno-chatbot-conversations)
+- `FRONTEND_URL` - Required (your GitHub Pages URL for CORS)
+
+### In Local .env:
+- `BACKEND_URL` - Lambda Function URL
+
+## Security
+
+✅ **API Key:** Stored in the Lambda environment (never in the frontend)
+✅ **CORS:** Only your GitHub Pages URL can call the Lambda
+✅ **S3:** Conversations saved securely
+✅ **Logs:** CloudWatch logs (automatic)
+
+## Testing
+
+### Test the deployed Lambda:
+```bash
+# Test health endpoint
+curl https://your-lambda-url/api/health
+
+# Test chat endpoint
+curl -X POST https://your-lambda-url/api/chat \
+  -H "Content-Type: application/json" \
+  -d '{"question": "What is the population dataset?"}'
+```
+
+### Test frontend:
+```bash
+./deploy.sh
+cd docs
+python3 -m http.server 8000
+# Visit http://localhost:8000
+```
+
+## Troubleshooting
+
+### Lambda can't find the context file:
+- Make sure `create-knowledge-base.sh` was run first
+- The context file should be in `pheno_knowledge_base_expanded/knowledge-base-context.txt`
+
+### CORS errors:
+- Check that the `FRONTEND_URL` environment variable matches your GitHub Pages URL exactly
+- Include the protocol: `https://your-username.github.io`
+
+### S3 errors:
+- Check that the IAM policy is attached to the Lambda role
+- Verify the S3 bucket name matches the `S3_BUCKET_NAME` environment variable
+- Check that the bucket exists and is in the same region as the Lambda
diff --git a/README.md b/README.md
index da3a637..b9ae674 100644
--- a/README.md
+++ b/README.md
@@ -1,175 +1,213 @@
 # Pheno Knowledge Base Expanded
 
-This repository contains the **expanded version** of the Human Phenotype Project (HPP) documentation for researchers, featuring an interactive AI chatbot.
+This repository contains the **expanded version** of the Human Phenotype Project (HPP) documentation for researchers, featuring an interactive AI chatbot with AWS Lambda backend. + +## 🏗️ Architecture + +**Serverless architecture with AWS Lambda backend (Development Environment)** + +``` +GitHub Pages (Frontend) → AWS Lambda Function URL → OpenRouter API → S3 (Logs) +``` + +**Components:** +- **Frontend**: GitHub Pages (static site with chatbot widget) +- **Backend**: AWS Lambda Function (Development Environment) +- **API**: OpenRouter (Claude 3.5 Sonnet) +- **Storage**: AWS S3 (conversation logs) +- **Security**: API keys in Lambda environment variables, CORS-protected ## 🚀 Quick Start ### Prerequisites 1. **Quarto** - [Install Quarto](https://quarto.org/docs/get-started/) -2. **Python dependencies**: - ```bash - pip install -r requirements.txt +2. **Python dependencies**: `pip install -r requirements.txt` + +### Setup + +1. Copy environment file: `cp env.example .env` +2. Edit `.env` and set `BACKEND_URL` to your Lambda Function URL: ``` + BACKEND_URL="https://your-lambda-url.lambda-url.region.on.aws/api/chat" + ``` + +🔒 **Security:** `.env` is gitignored. API keys are stored in Lambda environment variables. -### Setup API Key (Required for Chatbot) +## 📝 Editing Context (Chatbot Knowledge Base) -The chatbot requires an OpenRouter API key: +The chatbot answers questions based on the knowledge base context file. -1. Copy the example environment file: +### How to Update Context + +1. **Edit source documentation files:** + - Main files: `pheno_knowledge_base_expanded/*.md` and `*.qmd` + - Dataset files: `pheno_knowledge_base_expanded/datasets/*.ipynb` + +2. **Regenerate knowledge base:** ```bash - cp env.example .env + ./create-knowledge-base.sh ``` + This creates: `pheno_knowledge_base_expanded/knowledge-base-context.txt` -2. Get your API key from [OpenRouter](https://openrouter.ai/keys) - -3. Edit `.env` and add your real API key: +3. 
**Update Lambda (if context changed):** ```bash - nano .env + cd lambda + ./package_lambda.sh + aws lambda update-function-code \ + --function-name pheno-chatbot-backend \ + --zip-file fileb://lambda-deployment.zip ``` -🔒 **Security Note:** The `.env` file is gitignored and will never be committed. +### What Gets Included in Context -## 🏗️ Building and Deploying +The `create-knowledge-base.sh` script combines: +- `about.qmd` - Project overview +- `participant_journey.md` - Participant information +- `faq.md` - Frequently asked questions +- `data_format.qmd` - Data format documentation +- `datasets_description.md` - Dataset descriptions +- `platform_tutorial.md` - Platform tutorial +- `pheno_utils.md` - Utility documentation -### Option 1: Full Deployment (Recommended) +## 🎨 Building and Editing Frontend + +### Build Frontend ```bash ./deploy.sh ``` -This script will: -- Create the knowledge base context from all documentation -- Inject your API key into the chatbot widget -- Build the Quarto site -- Restore the placeholder (so the key isn't committed) -- Output to `docs/` +**What this does:** +- Creates knowledge base context +- Injects `BACKEND_URL` from `.env` into chatbot widget +- Builds Quarto site to `docs/` folder +- Restores placeholder (safe to commit) + +### Edit Frontend Files -### Option 2: Manual Build +**Main files to edit:** +- `pheno_knowledge_base_expanded/_quarto.yml` - Site configuration +- `pheno_knowledge_base_expanded/chatbot-widget-simple.html` - Chatbot UI +- `pheno_knowledge_base_expanded/*.md` / `*.qmd` - Content pages +**After editing:** ```bash -# 1. Update chatbot knowledge base -./create-knowledge-base.sh +./deploy.sh # Rebuilds with changes +``` -# 2. Build the site -cd pheno_knowledge_base_expanded -quarto render -cd .. +### Preview Locally -# 3. 
Preview locally +```bash cd docs python3 -m http.server 8000 +# Visit http://localhost:8000 ``` -⚠️ **Note:** Manual build requires manually injecting the API key from `.env` into the chatbot widget. - -## 📦 Deployment to GitHub Pages - -1. Build the site: `./deploy.sh` -2. Configure GitHub Pages to serve from the `docs/` folder: - - Go to Settings → Pages - - Source: Deploy from a branch - - Branch: `main` (or your default branch) - - Folder: `/docs` -3. Push your changes (the API key is safely excluded via `.gitignore`) - -## 🤖 AI Chatbot (Currently Disabled) +## 🧪 Testing -The chatbot feature is currently **disabled** for security reasons. All implementation files are preserved for future use. +### Test Locally -### To Re-enable the Chatbot: - -1. **Set up your API key**: +1. **Start backend (if testing locally):** ```bash - cp env.example .env - # Edit .env and add your OpenRouter API key from https://openrouter.ai/keys + cd backend + python3 app.py + # Backend runs on http://localhost:5000 ``` -2. **Enable the chatbot in the configuration**: - - Edit `pheno_knowledge_base_expanded/_quarto.yml` - - Find line ~102-103 with the commented chatbot include - - Uncomment this line: - ```yaml - include-after-body: chatbot-widget-simple.html - ``` +2. **Update `.env` for local testing:** + ``` + BACKEND_URL="http://localhost:5000/api/chat" + ``` -3. **Deploy with the chatbot**: +3. **Build and preview:** ```bash ./deploy.sh + cd docs + python3 -m http.server 8000 ``` -### Chatbot Features (When Enabled): -- **Purple button** in bottom-right corner on all pages -- Answers questions **only from website documentation** -- No backend server required (works on GitHub Pages) -- Powered by OpenRouter API (Claude 3.5 Sonnet) -- Implementation files: - - `chatbot-widget-simple.html` - Main widget - - `create-knowledge-base.sh` - Extracts website content - - `CHATBOT_DEPLOY.md` - Detailed documentation +4. 
**Test chatbot:** + - Visit `http://localhost:8000` + - Click chatbot button (bottom right) + - Ask a question + - Check browser console (F12) for errors -## 📝 Repository Structure +### Test Lambda Backend -``` -pheno-docs-expanded/ -├── pheno_knowledge_base_expanded/ # Quarto site source -│ ├── datasets/ # Dataset documentation (Jupyter notebooks) -│ ├── _quarto.yml # Site configuration -│ └── ... # Other content files -├── deploy.sh # Main deployment script -├── create-knowledge-base.sh # Creates chatbot knowledge base -├── convert_md_to_ipynb.py # Converts markdown to Jupyter notebooks -├── env.example # API key template -├── requirements.txt # Python dependencies -└── docs/ # Built site (for GitHub Pages) +```bash +# Test health endpoint +curl https://your-lambda-url.lambda-url.region.on.aws/api/health + +# Test chat endpoint +curl -X POST https://your-lambda-url.lambda-url.region.on.aws/api/chat \ + -H "Content-Type: application/json" \ + -d '{"question": "What is the population dataset?"}' ``` -## 🔄 Updating Content +### Test on GitHub Pages -### Source Content Location +1. Build: `./deploy.sh` +2. Commit and push: `git add docs/ && git commit && git push` +3. Wait 1-2 minutes for GitHub Pages to rebuild +4. Visit your GitHub Pages URL +5. Test chatbot functionality -Dataset markdown files are maintained in a separate repository: -- **Source**: `/home/ec2-user/workspace/pheno-docs/markdowns-expanded/` -- **Converted to**: `pheno_knowledge_base_expanded/datasets/*.ipynb` +## 📦 Deployment -### Update Workflow +### Deploy to GitHub Pages -1. Edit markdown files in `/home/ec2-user/workspace/pheno-docs/markdowns-expanded/` -2. Run `python3 convert_md_to_ipynb.py` to convert to notebooks -3. Run `./deploy.sh` to rebuild the site -4. Commit and push changes -5. GitHub Pages will automatically serve the updated `docs/` folder +1. Build: `./deploy.sh` +2. 
Configure GitHub Pages: + - Repository → Settings → Pages + - Source: Deploy from branch + - Branch: `main` (or your branch) + - Folder: `/docs` +3. Push: `git add docs/ && git commit && git push` -### Recent Updates +### Update Lambda Backend -- Changed output directory from `docs-expanded/` to `docs/` for simpler deployment -- Added new "Derived Phenotypes" category with Curated Phenotypes dataset -- Fixed formatting issues (bullet points, titles, duplicate content) -- Removed "Get to Know the HPP" from navigation -- Integrated Vaginal Microbiome dataset -- Updated "Health apps" to "Wearables" -- Excluded Samples Inventory from website +```bash +cd lambda +./package_lambda.sh +aws lambda update-function-code \ + --function-name pheno-chatbot-backend \ + --zip-file fileb://lambda-deployment.zip +``` -## 💡 Tips +## 📝 Repository Structure -- **Preview locally** before deploying: - ```bash - cd pheno_knowledge_base_expanded - quarto preview - ``` +``` +pheno-docs-expanded/ +├── pheno_knowledge_base_expanded/ # Quarto site source +│ ├── datasets/ # Dataset documentation +│ ├── _quarto.yml # Site configuration +│ ├── chatbot-widget-simple.html # Chatbot widget +│ └── knowledge-base-context.txt # Generated context +├── lambda/ # AWS Lambda backend +│ ├── lambda_function.py # Lambda handler +│ ├── package_lambda.sh # Packaging script +│ └── requirements.txt # Lambda dependencies +├── deploy.sh # Deployment script +├── create-knowledge-base.sh # Context generator +└── docs/ # Built site (GitHub Pages) +``` -- **Update chatbot knowledge** when you change documentation: - ```bash - ./create-knowledge-base.sh - ``` +## 🔧 Key Configuration Files -- **Keep API key safe**: Never commit `.env` file! 
+- **`.env`** - Backend URL (gitignored) +- **`lambda/lambda_function.py`** - Backend logic and system prompt +- **`pheno_knowledge_base_expanded/_quarto.yml`** - Site config (enable/disable chatbot) +- **`create-knowledge-base.sh`** - Controls what content goes into context -## 📚 Documentation +## 💡 Tips -For more details about the Human Phenotype Project, visit the published site. +- **Update context** when documentation changes: `./create-knowledge-base.sh` +- **Rebuild frontend** after editing: `./deploy.sh` +- **Test locally** before deploying to GitHub Pages +- **Check Lambda logs** in AWS CloudWatch for debugging ## Contributing -- Updates should be made via a PR -- Please separate commits for source changes from rendering/builds +- Updates via PR +- Separate commits for source changes vs. builds diff --git a/deploy.sh b/deploy.sh index 56c7c2b..762538c 100755 --- a/deploy.sh +++ b/deploy.sh @@ -12,39 +12,55 @@ echo "" # Check if .env file exists if [ ! -f .env ]; then echo "❌ ERROR: .env file not found!" - echo "Please create a .env file with your OPENROUTER_API_KEY" + echo "Please create a .env file with your BACKEND_URL" echo "" echo "Example:" - echo "OPENROUTER_API_KEY=sk-or-v1-your-key-here" + echo "BACKEND_URL=https://your-lambda-url.lambda-url.us-east-1.on.aws" + echo "" + echo "For local development, use:" + echo "BACKEND_URL=http://localhost:5000/api/chat" exit 1 fi -# Load API key from .env +# Load backend URL from .env source .env -if [ -z "$OPENROUTER_API_KEY" ]; then - echo "❌ ERROR: OPENROUTER_API_KEY not found in .env file!" +if [ -z "$BACKEND_URL" ]; then + echo "❌ ERROR: BACKEND_URL not found in .env file!" 
+    echo ""
+    echo "Please add BACKEND_URL to your .env file:"
+    echo "BACKEND_URL=https://your-lambda-url.lambda-url.us-east-1.on.aws"
     exit 1
 fi
 
-echo "✅ Step 1: API key loaded from .env"
+echo "✅ Step 1: Backend URL loaded from .env"
+echo "   Backend URL: $BACKEND_URL"
 echo ""
 
-# Create knowledge base context
-echo "📚 Step 2: Creating knowledge base context..."
-./create-knowledge-base.sh
-echo ""
+# Create knowledge base context only if it doesn't exist (preserves a manually created file)
+CONTEXT_FILE="pheno_knowledge_base_expanded/knowledge-base-context.txt"
+if [ ! -f "$CONTEXT_FILE" ]; then
+    echo "📚 Step 2: Creating knowledge base context (file doesn't exist)..."
+    ./create-knowledge-base.sh
+    echo ""
+else
+    echo "📚 Step 2: Using existing knowledge base context (skipping regeneration)"
+    echo "   File: $CONTEXT_FILE"
+    echo "   Size: $(du -h "$CONTEXT_FILE" | cut -f1)"
+    echo "   (To regenerate, delete the file and run deploy again)"
+    echo ""
+fi
 
-# Inject API key into widget
-echo "🔑 Step 3: Injecting API key into chatbot widget..."
+# Inject backend URL into widget
+echo "🔗 Step 3: Injecting backend URL into chatbot widget..."
 WIDGET_FILE="pheno_knowledge_base_expanded/chatbot-widget-simple.html"
 WIDGET_TEMP="pheno_knowledge_base_expanded/.chatbot-widget-temp.html"
 
-# Replace placeholder with actual API key
-sed "s|__OPENROUTER_API_KEY__|$OPENROUTER_API_KEY|g" "$WIDGET_FILE" > "$WIDGET_TEMP"
+# Replace placeholder with actual backend URL
+sed "s|__BACKEND_URL__|$BACKEND_URL|g" "$WIDGET_FILE" > "$WIDGET_TEMP"
 mv "$WIDGET_TEMP" "$WIDGET_FILE"
 
-echo "✅ API key injected successfully"
+echo "✅ Backend URL injected successfully"
 echo ""
 
 # Build the site
@@ -54,12 +70,12 @@
 quarto render
 cd ..
 echo ""
 
-# Restore placeholder in widget (so we don't commit the key)
+# Restore placeholder in widget (so we don't commit the URL)
 echo "🔒 Step 5: Restoring placeholder in widget..."
-sed "s|$OPENROUTER_API_KEY|__OPENROUTER_API_KEY__|g" "$WIDGET_FILE" > "$WIDGET_TEMP" +sed "s|$BACKEND_URL|__BACKEND_URL__|g" "$WIDGET_FILE" > "$WIDGET_TEMP" mv "$WIDGET_TEMP" "$WIDGET_FILE" -echo "✅ Placeholder restored (API key not in source)" +echo "✅ Placeholder restored (backend URL not in source)" echo "" echo "╔════════════════════════════════════════════════════════════╗" @@ -69,12 +85,15 @@ echo "║ ║" echo "╚════════════════════════════════════════════════════════════╝" echo "" echo "📦 Output: docs/" -echo "🌐 Your site is ready with the chatbot (API key included)" +echo "🌐 Your site is ready with the chatbot (backend URL included)" echo "🔒 Source code still has placeholder (safe to commit)" echo "" +echo "⚠️ IMPORTANT: Make sure your Lambda backend is running and accessible!" +echo "" echo "Next steps:" -echo " 1. Test locally: cd docs && python3 -m http.server 8000" -echo " 2. Deploy docs/ to GitHub Pages" +echo " 1. Ensure Lambda backend is running at: $BACKEND_URL" +echo " 2. Test locally: cd docs && python3 -m http.server 8000" +echo " 3. Deploy docs/ to GitHub Pages" echo "" diff --git a/docs/CHATBOT_DEPLOY.html b/docs/CHATBOT_DEPLOY.html index 9bd7dde..80e459b 100644 --- a/docs/CHATBOT_DEPLOY.html +++ b/docs/CHATBOT_DEPLOY.html @@ -2,12 +2,12 @@ - + -Pheno.AI +chatbot_deploy – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/about.html b/docs/about.html index 242b10c..3ad091b 100644 --- a/docs/about.html +++ b/docs/about.html @@ -2,12 +2,12 @@ - + -Pheno.AI - About the Human Phenotype Project +About the Human Phenotype Project – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_clinical_assessments.html b/docs/category_clinical_assessments.html index 984b7b4..594f885 100644 --- a/docs/category_clinical_assessments.html +++ b/docs/category_clinical_assessments.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Clinical Assessments +Clinical Assessments – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_derived_phenotypes.html b/docs/category_derived_phenotypes.html index 27c4de7..b76db2d 100644 --- a/docs/category_derived_phenotypes.html +++ b/docs/category_derived_phenotypes.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Derived Phenotypes +Derived Phenotypes – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_medical_imaging.html b/docs/category_medical_imaging.html index ba4e30c..1facb67 100644 --- a/docs/category_medical_imaging.html +++ b/docs/category_medical_imaging.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Medical Imaging +Medical Imaging – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_medical_records_surveys.html b/docs/category_medical_records_surveys.html index 35daf48..8dd7500 100644 --- a/docs/category_medical_records_surveys.html +++ b/docs/category_medical_records_surveys.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Medical Records and Surveys +Medical Records and Surveys – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_multi_omics.html b/docs/category_multi_omics.html index ecc132b..2471174 100644 --- a/docs/category_multi_omics.html +++ b/docs/category_multi_omics.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Multi-Omics +Multi-Omics – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_sensors_app_logging.html b/docs/category_sensors_app_logging.html index 0de6961..f748b9c 100644 --- a/docs/category_sensors_app_logging.html +++ b/docs/category_sensors_app_logging.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Sensors and App Logging +Sensors and App Logging – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/category_study_core.html b/docs/category_study_core.html index 8e36af9..7930348 100644 --- a/docs/category_study_core.html +++ b/docs/category_study_core.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Study Core +Study Core – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/data_format.html b/docs/data_format.html index 4b61223..3da69c7 100644 --- a/docs/data_format.html +++ b/docs/data_format.html @@ -2,12 +2,12 @@ - + -Pheno.AI +data_format – Pheno.AI + + + + + +
+
+
Pheno.AI Assistant
+ +
+
+
+
+ Hello! I can answer questions based on the Human Phenotype Project website content. Ask me anything about our datasets, methodology, or research! +
+
+
+
+
+ + +
+
+
+ + + + + }); + diff --git a/docs/datasets/000-population.html b/docs/datasets/000-population.html index aa69434..3f0c04b 100644 --- a/docs/datasets/000-population.html +++ b/docs/datasets/000-population.html @@ -2,12 +2,12 @@ - + -Pheno.AI - Population +Population – Pheno.AI - + - + + + - - + + + - - + + + - + @@ -110,7 +115,169 @@ - +
@@ -122,7 +289,7 @@ - -
+
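For reviewers, the handler pattern that this patch's LAMBDA_SETUP.md describes (manual CORS headers, GET /api/health and POST /api/chat routing behind a Function URL) can be sketched as below. This is a minimal illustration assuming the Function URL v2.0 event shape; the default origin and the stub response body are hypothetical, and the repository's real `lambda/lambda_function.py` additionally loads the packaged context file, calls OpenRouter, and logs the conversation to S3.

```python
import json
import os

# FRONTEND_URL mirrors the environment variable listed in LAMBDA_SETUP.md;
# the fallback value here is a placeholder, not the real deployment's URL.
FRONTEND_URL = os.environ.get("FRONTEND_URL", "https://your-username.github.io")


def _cors_headers(origin):
    # Echo the origin back only when it matches the configured frontend;
    # any other origin gets no Access-Control-Allow-Origin header at all.
    if origin == FRONTEND_URL:
        return {
            "Access-Control-Allow-Origin": origin,
            "Access-Control-Allow-Methods": "GET,POST,OPTIONS",
            "Access-Control-Allow-Headers": "Content-Type",
        }
    return {}


def lambda_handler(event, context):
    # Lambda Function URL events carry the method and path under
    # requestContext.http (payload format 2.0).
    http = event.get("requestContext", {}).get("http", {})
    method = http.get("method")
    path = http.get("path")
    origin = event.get("headers", {}).get("origin", "")
    headers = {"Content-Type": "application/json", **_cors_headers(origin)}

    if method == "GET" and path == "/api/health":
        return {"statusCode": 200, "headers": headers,
                "body": json.dumps({"status": "ok"})}

    if method == "POST" and path == "/api/chat":
        question = json.loads(event.get("body") or "{}").get("question", "")
        # The real function would consult the knowledge-base context, call
        # OpenRouter, and save the exchange to S3; this stub only exercises
        # the routing and CORS logic.
        return {"statusCode": 200, "headers": headers,
                "body": json.dumps({"answer": f"stub answer to: {question}"})}

    return {"statusCode": 404, "headers": headers,
            "body": json.dumps({"error": "not found"})}
```

Keeping the CORS decision in one helper makes the "only your GitHub Pages URL can call it" rule easy to audit: every response path builds its headers from the same `_cors_headers` result.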