Preview of the website in action - Visit the live site →
Modern personal portfolio website built with Angular 20, featuring a BAML-powered AI chatbot and beautiful gradient design system. This project showcases modern web development practices with server-side rendering, serverless architecture, and AI integration.
🌐 Live Website: silviobaratto.com | silviobaratto.vercel.app
- Features
- Tech Stack
- Architecture
- Local Development
- Deployment
- Project Structure
- Environment Variables
- Troubleshooting
- Performance & Optimization
- FAQ
- Contributing
- License
- Acknowledgments
- Contact
- 🎨 Modern Design System: Beautiful gradient theme (blue → indigo → purple) built with Tailwind CSS 4
- 🤖 AI-Powered Chatbot: Interactive chat interface using BAML and OpenAI GPT-4
- 📱 Fully Responsive: Mobile-first approach ensuring perfect display on all devices
- ⚡ Server-Side Rendering: Angular Universal for optimal performance and SEO
- 🚀 Serverless Architecture: Deployed on Vercel with serverless functions
- ♿ Accessible: Following WCAG guidelines for accessibility
- 🎯 Standalone Components: Modern Angular architecture with signals and standalone components
- Angular 20: Latest version with standalone components, signals, and modern reactive patterns
- Standalone components for better modularity
- Signals for reactive state management
- Native control flow syntax (@if, @for, @switch)
- OnPush change detection for optimal performance
- Tailwind CSS 4: Utility-first CSS framework with custom gradient design system
- TypeScript: Strict type checking for maintainable code
- RxJS: Reactive programming for async operations
- Express.js: Server for Angular Universal SSR
- Angular Universal: Server-side rendering for improved SEO and performance
- Vercel Serverless Functions: Scalable backend endpoints for the chatbot
- BAML (Boundary ML): Type-safe LLM function definitions and structured outputs
- OpenAI GPT-4: Language model powering the chatbot
- Streaming Responses: Real-time chat experience with Server-Sent Events (SSE)
- Vercel: Edge network deployment with automatic HTTPS
- Git Integration: Automatic deployments on push to main branch
- Environment Variables: Secure configuration management
The application follows a modern serverless architecture with three main layers:
```
┌─────────────────────────────────────────────────────────────┐
│                      Client (Browser)                       │
│       Angular 20 SPA with SSR-rendered initial content      │
└─────────────────────┬───────────────────────────────────────┘
                      │
                      ├─ Static Assets (Vercel CDN)
                      │
                      ├─ SSR Requests
                      │    └─> Express.js + Angular Universal
                      │
                      └─ API Requests
                           └─> Vercel Serverless Functions
                                └─> BAML Client
                                     └─> OpenAI API
```
- Pages: Home, About, Side Projects, Chatbot
- Shared Components: Header, Hero, Footer, Gradient backgrounds
- Services: Chat service for API communication
- State Management: Angular signals for reactive state
- Routing: Lazy-loaded routes for optimal bundle size
- Purpose: Initial page load optimization and SEO
- Implementation: Angular Universal with Express.js
- Deployment: Vercel Edge Functions
- Benefits:
- Faster first contentful paint (FCP)
- Better SEO with crawlable content
- Social media preview cards
- Location: `/api` directory
- Functions:
  - `serverless-chatbot-stream.ts`: Streaming chat endpoint (SSE)
  - `serverless-chatbot.ts`: Non-streaming chat endpoint (fallback)
- Features:
- CORS-enabled for cross-origin requests
- Error handling and logging
- 60-second timeout for long-running requests
- Automatic scaling
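The functions above are described as CORS-enabled. As a minimal sketch of what that involves, here is a helper that builds the relevant response headers; the allowed origin, methods, and header names are illustrative assumptions, not the project's actual configuration:

```typescript
// Hypothetical CORS header builder for a serverless function response.
// A wildcard origin is shown for simplicity; a real deployment might
// restrict this to the site's own domain.
function corsHeaders(origin: string = '*'): Record<string, string> {
  return {
    'Access-Control-Allow-Origin': origin,
    'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type',
  };
}
```

A handler would typically apply these headers to every response, and answer `OPTIONS` preflight requests with them immediately.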
- Purpose: Type-safe LLM function definitions
- Location: `baml_src/` directory
- Benefits:
- Structured outputs from LLMs
- Type safety between frontend and AI responses
- Easy prompt management and versioning
- Built-in streaming support
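BAML's value here is that the LLM's output is validated against a schema before the frontend consumes it. The sketch below mimics that idea with a hand-written type guard; the `ChatReply` shape is a hypothetical example, not the project's actual BAML schema (BAML generates its TypeScript types from the `.baml` files at build time):

```typescript
// Hypothetical structured reply shape, standing in for a BAML-generated type.
interface ChatReply {
  answer: string;
  followUpQuestions: string[];
}

// Runtime guard: reject any model output that does not match the schema,
// so the UI never renders malformed data.
function isChatReply(value: unknown): value is ChatReply {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.answer === 'string' &&
    Array.isArray(v.followUpQuestions) &&
    v.followUpQuestions.every((q) => typeof q === 'string')
  );
}
```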
- User Interaction: User types a message in the chatbot
- Frontend: Angular service sends a POST request to `/api/chatbot/stream`
- Serverless Function: Vercel function receives the request
- BAML Client: Initializes with OpenAI credentials
- OpenAI API: Processes the request and streams response
- SSE Stream: Response chunks sent back to client in real-time
- Frontend Update: UI updates progressively as tokens arrive
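The streaming leg of this flow uses the standard Server-Sent Events wire format: each event is a block of `data:` lines terminated by a blank line. A sketch of how a client might split a received text chunk into event payloads (the function name is ours; a real client would also buffer incomplete trailing data across chunks, which is omitted here for brevity):

```typescript
// Split an SSE text chunk into the `data:` payloads of complete events.
// Events are separated by a blank line; multiple `data:` lines within
// one event are joined with newlines, per the SSE spec.
function parseSseEvents(chunk: string): string[] {
  return chunk
    .split('\n\n') // one SSE event per blank-line break
    .map((event) =>
      event
        .split('\n')
        .filter((line) => line.startsWith('data:'))
        .map((line) => line.slice(5).trimStart())
        .join('\n'),
    )
    .filter((data) => data.length > 0);
}
```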
- Node.js 22+
- npm or yarn
1. Clone the repository:

   ```bash
   git clone git@github.com:SilvioBaratto/personal-website.git
   cd personal-website
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Create a `.env` file in the root directory:

   ```bash
   OPENAI_API_KEY=your_openai_api_key_here
   ```

4. Start the development server:

   ```bash
   npm start
   ```

The application will be available at http://localhost:4200.
To create a production build, run:

```bash
npm run build
```

This project is optimized for deployment on Vercel. Follow these steps:
- A Vercel account (sign up at vercel.com)
- Git repository connected to your Vercel account
- OpenAI API key (get one at platform.openai.com)
```bash
git clone git@github.com:SilvioBaratto/personal-website.git
cd personal-website
```

Install the Vercel CLI:

```bash
npm install -g vercel
```

You need to set environment variables in Vercel. There are two ways to do this:
Option A: Via Vercel Dashboard
- Go to your project in the Vercel dashboard
- Navigate to Settings > Environment Variables
- Add the following variable:
  - Key: `OPENAI_API_KEY`
  - Value: Your OpenAI API key
  - Environments: Select Production, Preview, and Development
Option B: Via Vercel CLI
```bash
vercel env add OPENAI_API_KEY
# Follow the prompts to add the value for each environment
```

Important Notes:
- Make sure to add the variable to all environments (Production, Preview, Development)
- After adding environment variables, you must redeploy for changes to take effect
- Never commit your `.env` file to version control
Method 1: Automatic Deployment via Git (Recommended)

1. Connect your repository to Vercel:

   ```bash
   vercel link
   ```

2. Push to your repository:

   ```bash
   git add .
   git commit -m "Initial deployment"
   git push origin main
   ```

3. Vercel will automatically:
   - Detect the Angular project
   - Install dependencies
   - Build the application
   - Deploy to production
Method 2: Manual Deployment via CLI

```bash
# Deploy to preview
vercel

# Deploy to production
vercel --prod
```

After deploying, verify everything works:

- Check the deployment URL provided by Vercel
- Test the chatbot functionality at `/chatbot`
- Verify SSR is working by viewing the page source
- Check Vercel function logs for any errors
The project includes a `vercel.json` file with the following configuration:
```json
{
  "version": 2,
  "buildCommand": "npm run build",
  "outputDirectory": "dist/silviobaratto/browser",
  "framework": "angular",
  "functions": {
    "api/**/*.ts": {
      "maxDuration": 60
    }
  },
  "rewrites": [
    {
      "source": "/api/(.*)",
      "destination": "/api/$1"
    }
  ]
}
```

Key Configuration Points:
- `maxDuration`: Set to 60 seconds to allow time for AI response generation
- `rewrites`: Routes API requests to the serverless functions
- `outputDirectory`: Points to the Angular build output
After deployment, your API endpoints will be available at:
- Streaming Chat: `https://your-domain.vercel.app/api/chatbot/stream`
- Standard Chat: `https://your-domain.vercel.app/api/chatbot`
View Logs:
```bash
vercel logs <deployment-url>
```

View Function Logs in Dashboard:
- Go to your project in Vercel
- Click on Deployments
- Select a deployment
- Navigate to Functions tab
- Click on a function to view logs
Common Deployment Issues:
- Build Failures: Check `package.json` scripts and dependencies
- API Errors: Verify environment variables are set correctly
- Timeout Issues: Adjust `maxDuration` in `vercel.json` if needed
- CORS Errors: Check the CORS headers in the serverless functions
- Go to Settings > Domains in Vercel dashboard
- Add your custom domain
- Configure DNS records as instructed
- SSL certificate will be automatically provisioned
```
silviobaratto/
├── src/
│   ├── app/
│   │   ├── pages/                 # Page components
│   │   │   ├── home/              # Landing page
│   │   │   ├── about/             # About me page
│   │   │   ├── side-projects/     # Projects showcase
│   │   │   └── chatbot/           # AI chatbot interface
│   │   ├── shared/                # Reusable components
│   │   │   ├── header/            # Navigation header
│   │   │   ├── hero/              # Hero sections
│   │   │   ├── footer/            # Footer component
│   │   │   └── gradient-bg/       # Gradient backgrounds
│   │   ├── services/              # Angular services
│   │   │   └── chat.service.ts    # Chatbot API integration
│   │   ├── app.component.ts       # Root component
│   │   ├── app.config.ts          # App configuration
│   │   └── app.routes.ts          # Route definitions
│   ├── assets/                    # Static assets
│   │   ├── images/                # Images and icons
│   │   └── data/                  # JSON data files
│   ├── styles.css                 # Global styles & Tailwind
│   └── server.ts                  # Express server for SSR
│
├── api/                           # Vercel serverless functions
│   ├── serverless-chatbot-stream.ts  # Streaming chat endpoint
│   └── serverless-chatbot.ts         # Non-streaming endpoint
│
├── baml_src/                      # BAML AI configurations
│   ├── chatbot.baml               # Chatbot function definitions
│   └── clients.baml               # OpenAI client configuration
│
├── dist/                          # Build output (generated)
│   └── silviobaratto/
│       ├── browser/               # Client-side bundle
│       └── server/                # SSR bundle
│
├── node_modules/                  # Dependencies (generated)
│
├── .env                           # Environment variables (not in git)
├── .gitignore                     # Git ignore rules
├── angular.json                   # Angular CLI configuration
├── package.json                   # Dependencies and scripts
├── tailwind.config.js             # Tailwind CSS configuration
├── tsconfig.json                  # TypeScript configuration
├── vercel.json                    # Vercel deployment config
└── README.md                      # This file
```
Contains page-level components that represent different routes:
- Each page is a standalone component
- Implements lazy loading for optimal performance
- Uses Angular signals for state management
Reusable components used across multiple pages:
- Header with navigation and mobile menu
- Hero sections with gradient backgrounds
- Footer with social links
- All components use OnPush change detection
Injectable services for business logic:
- `ChatService`: Handles API communication with the chatbot endpoints
- Uses RxJS for reactive data streams
- Implements error handling and retry logic
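The retry logic mentioned above can be sketched as a generic helper. The real service reportedly uses RxJS for its streams (where the `retry` operator would be the idiomatic choice); this framework-free, synchronous version just illustrates the policy:

```typescript
// Generic retry sketch: run `fn`, retrying up to `attempts` times,
// and rethrow the last error if every attempt fails.
function withRetry<T>(fn: () => T, attempts: number = 3): T {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return fn();
    } catch (err) {
      lastError = err; // remember the failure and try the next attempt
    }
  }
  throw lastError;
}
```

A production version would usually add a delay with exponential backoff between attempts and only retry transient failures (timeouts, 5xx responses).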
Vercel serverless functions (Node.js runtime):
- Each file exports a default handler function
- Deployed as separate serverless functions
- Auto-scales based on traffic
- 60-second timeout configured
BAML (Boundary ML) configuration:
- Type-safe LLM function definitions
- Prompt templates and schemas
- OpenAI client configuration
- Generates TypeScript types at build time
| Variable | Description | Required |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key for the chatbot | Yes |
| `NODE_ENV` | Environment (production/development) | No |
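Since a missing `OPENAI_API_KEY` surfaces only as opaque runtime failures in the serverless functions, it can help to fail fast at startup. A small sketch of such a check (the helper name is ours; the project's actual functions may validate differently):

```typescript
// Fail fast when a required environment variable is missing, instead of
// letting the OpenAI client fail later with a less obvious error.
function requireEnv(
  env: Record<string, string | undefined>,
  key: string,
): string {
  const value = env[key];
  if (!value) {
    throw new Error(`Missing required environment variable: ${key}`);
  }
  return value;
}
```

Usage would be `const apiKey = requireEnv(process.env, 'OPENAI_API_KEY');` at the top of a function handler.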
Symptoms: The API endpoint returns a 405 error when the chatbot tries to connect
Causes:
- Environment variables are not set in Vercel
- Serverless functions are not deployed correctly
- CORS headers are blocking the request
- Incorrect API route configuration
Solutions:
- Verify `OPENAI_API_KEY` is set in the Vercel dashboard under Settings > Environment Variables
- Ensure the variable is added to all environments (Production, Preview, Development)
- Redeploy the project after adding environment variables: `vercel --prod --force`
- Check the Vercel function logs for detailed error messages: `vercel logs --follow`
- Verify `vercel.json` has the correct rewrite rules for `/api` routes
Symptoms: Chat interface loads but doesn't respond to messages
Causes:
- BAML client initialization failure
- Invalid or expired OpenAI API key
- OpenAI API rate limits or insufficient credits
- Network timeout issues
Solutions:
- Check BAML client initialization in browser console
- Verify OpenAI API key is valid at platform.openai.com
- Check OpenAI account has available credits
- Review Vercel function logs:
- Go to Vercel Dashboard > Project > Deployments
- Select latest deployment > Functions tab
- Click on `serverless-chatbot-stream` to view its logs
- Increase the timeout in `vercel.json` if responses are slow:

  ```json
  "functions": {
    "api/**/*.ts": {
      "maxDuration": 60
    }
  }
  ```
Symptoms: Deployment fails during build step
Causes:
- Missing dependencies
- TypeScript compilation errors
- Angular build configuration issues
- Node version mismatch
Solutions:
- Check that your Node.js version matches the requirement (Node 22+): `node --version`
- Clear dependencies and reinstall:

  ```bash
  rm -rf node_modules package-lock.json
  npm install
  ```
- Run the build locally to identify errors: `npm run build`
- Check Vercel build logs for specific error messages
- Verify `tsconfig.json` settings are correct
- Ensure all imports are correct and the referenced files exist
Symptoms: Console warnings about hydration mismatches
Causes:
- Different content rendered on server vs client
- Using browser-only APIs during SSR
- Timing-dependent rendering
Solutions:
- Use `isPlatformBrowser()` to check the environment:

  ```typescript
  import { isPlatformBrowser } from '@angular/common';

  if (isPlatformBrowser(this.platformId)) {
    // Browser-only code
  }
  ```
- Avoid using `window`, `document`, or `localStorage` directly
- Use Angular's platform detection for conditional rendering
- Check for timing-dependent content (dates, random numbers)
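The underlying guard that `isPlatformBrowser()` provides can also be seen in a standalone form. This sketch (the helper name is ours) shows why direct `localStorage` access breaks SSR and how a fallback avoids the crash, though inside Angular components the platform-detection route above is the idiomatic one:

```typescript
// SSR-safe localStorage read: during server-side rendering there is no
// `window`, so return the fallback instead of throwing a ReferenceError.
function safeGetItem(key: string, fallback: string): string {
  if (typeof window === 'undefined' || !window.localStorage) {
    return fallback; // server render: no browser storage available
  }
  return window.localStorage.getItem(key) ?? fallback;
}
```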
Symptoms: Application takes long to load initially
Causes:
- Large bundle sizes
- Missing lazy loading
- Unoptimized images
- Too many HTTP requests
Solutions:
- Enable lazy loading for routes in `app.routes.ts`
- Use `NgOptimizedImage` for images
- Analyze bundle size:
  ```bash
  npm run build -- --stats-json
  npx webpack-bundle-analyzer dist/silviobaratto/browser/stats.json
  ```
- Enable Vercel Analytics to monitor performance
- Optimize images and assets
- Use OnPush change detection strategy
Symptoms: API requests fail with CORS errors locally
Causes:
- Missing CORS headers in serverless functions
- Different origins (localhost:4200 vs localhost:3000)
Solutions:
- Add CORS headers to serverless functions (already configured)
- Use the proxy configuration in `angular.json`:

  ```json
  "serve": {
    "options": {
      "proxyConfig": "proxy.conf.json"
    }
  }
  ```
- Create `proxy.conf.json`:

  ```json
  {
    "/api": {
      "target": "http://localhost:3000",
      "secure": false
    }
  }
  ```
If you're still experiencing issues:
- Check the logs: Always start with Vercel function logs and browser console
- Search issues: Look through GitHub Issues
- Create an issue: If the problem persists, create a detailed issue with:
- Error messages
- Steps to reproduce
- Environment information
- Screenshots or logs
- Ask for help: Open a Discussion
To start a local development server, run:
```bash
ng serve
```

Once the server is running, open your browser and navigate to http://localhost:4200/. The application will automatically reload whenever you modify any of the source files.
Angular CLI includes powerful code scaffolding tools. To generate a new component, run:
```bash
ng generate component component-name
```

For a complete list of available schematics (such as components, directives, or pipes), run:
```bash
ng generate --help
```

To build the project, run:

```bash
ng build
```

This will compile your project and store the build artifacts in the `dist/` directory. By default, the production build optimizes your application for performance and speed.
To execute unit tests with the Karma test runner, use the following command:
```bash
ng test
```

For end-to-end (e2e) testing, run:

```bash
ng e2e
```

Angular CLI does not ship with an end-to-end testing framework by default, so you can choose one that suits your needs.
For more information on using the Angular CLI, including detailed command references, visit the Angular CLI Overview and Command Reference page.
This project implements several performance optimization techniques:
Current Setup:
- Production Build: `npm run build`
- Bundle Analysis:

  ```bash
  npm run build -- --stats-json
  npx webpack-bundle-analyzer dist/silviobaratto/browser/stats.json
  ```
Optimization Techniques:
- Lazy Loading: All routes are lazy-loaded to reduce initial bundle size
- Tree Shaking: Unused code is automatically removed during production build
- AOT Compilation: Ahead-of-time compilation for faster rendering
- Minification: Code is minified and compressed in production
- Code Splitting: Separate chunks for vendor, polyfills, and application code
Benefits:
- Faster FCP: First Contentful Paint happens on the server
- SEO: Search engines can crawl fully-rendered content
- Social Sharing: Meta tags are properly set for social media previews
Implementation:
- Angular Universal with Express.js
- Deployed as Vercel Edge Functions
- Automatic hydration on the client
Best Practices:
- Use the `NgOptimizedImage` directive for all images
- Serve WebP format with fallbacks
- Implement lazy loading for below-the-fold images
- Use appropriate sizing attributes
Example:
```html
<img
  ngSrc="/assets/images/hero.jpg"
  width="1200"
  height="800"
  priority
  alt="Hero image"
>
```

Vercel Edge Caching:
- Static assets cached at the edge
- Serverless functions cached when possible
- Stale-while-revalidate for optimal performance
Browser Caching:
- Versioned assets with long cache times
- Service Worker for offline support (optional)
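The stale-while-revalidate strategy mentioned above is expressed through the `Cache-Control` response header. A tiny sketch of building such a value (the durations are illustrative assumptions, not the project's actual settings):

```typescript
// Build a Cache-Control value for stale-while-revalidate caching:
// serve a cached copy for `maxAgeSec`, then keep serving the stale copy
// for up to `swrSec` while a fresh one is fetched in the background.
function cacheControl(maxAgeSec: number, swrSec: number): string {
  return `public, max-age=${maxAgeSec}, stale-while-revalidate=${swrSec}`;
}
```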
Vercel Analytics:
- Enable in Vercel dashboard: Analytics tab
- Install package (optional for detailed insights):
npm install @vercel/analytics
- Add to
app.component.ts:import { inject } from '@vercel/analytics'; inject();
Web Vitals to Monitor:
- LCP (Largest Contentful Paint): < 2.5s
- FID (First Input Delay): < 100ms
- CLS (Cumulative Layout Shift): < 0.1
- TTFB (Time to First Byte): < 800ms
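The thresholds above can be encoded directly, which is handy for alerting on reported metrics. A sketch (the function name and the strict-less-than comparison are our choices; official Web Vitals ratings also define a middle "needs improvement" band not modeled here):

```typescript
// "Good" thresholds for the Web Vitals listed above.
// Values are milliseconds, except CLS, which is a unitless score.
const GOOD_THRESHOLDS: Record<string, number> = {
  LCP: 2500,
  FID: 100,
  CLS: 0.1,
  TTFB: 800,
};

// True if a measured value is within the "good" range for that vital.
function isGoodVital(name: string, value: number): boolean {
  const limit = GOOD_THRESHOLDS[name];
  if (limit === undefined) {
    throw new Error(`Unknown Web Vital: ${name}`);
  }
  return value < limit;
}
```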
- All images optimized and using `NgOptimizedImage`
- Routes implement lazy loading
- Components use OnPush change detection
- Production build is minified and compressed
- SSR is working correctly
- Bundle size is under acceptable limits (< 500KB initial)
- No console errors in production
- Lighthouse score > 90 for all categories
- Web Vitals are in green zone
- API responses are cached when appropriate
For Future Improvements:
- Service Worker: Implement PWA with offline support
- HTTP/2 Push: Push critical resources
- Preloading: Preload fonts and critical CSS
- Resource Hints: Use `dns-prefetch`, `preconnect`
- CDN: Use Vercel's global CDN for static assets
- Compression: Enable Brotli compression
Contributions are welcome! This is an open-source project and I appreciate any help in making it better.
1. Fork the Repository

   ```bash
   # Click the "Fork" button on GitHub
   git clone git@github.com:YOUR_USERNAME/personal-website.git
   cd personal-website
   ```

2. Create a Feature Branch

   ```bash
   git checkout -b feature/your-feature-name
   ```

3. Make Your Changes

   - Follow the existing code style and conventions
   - Use TypeScript strict mode
   - Follow Angular best practices (see CLAUDE.md)
   - Write meaningful commit messages
   - Test your changes locally

4. Commit Your Changes

   ```bash
   git add .
   git commit -m "feat: add your feature description"
   ```

5. Push to Your Fork

   ```bash
   git push origin feature/your-feature-name
   ```

6. Create a Pull Request
- Go to the original repository on GitHub
- Click "New Pull Request"
- Select your feature branch
- Provide a clear description of your changes
- Reference any related issues
Code Style:
- Use TypeScript with strict type checking
- Follow Angular style guide
- Use standalone components (not NgModules)
- Use signals for state management
- Implement OnPush change detection
- Use native control flow (@if, @for, @switch)
Commit Messages: Follow the Conventional Commits specification:
- `feat:` New feature
- `fix:` Bug fix
- `docs:` Documentation changes
- `style:` Code style changes (formatting, etc.)
- `refactor:` Code refactoring
- `test:` Adding or updating tests
- `chore:` Maintenance tasks
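A commit-lint hook can enforce this convention automatically. As a minimal sketch, the check boils down to one regular expression over the commit subject (the pattern below covers the types this project uses, plus an optional scope; real tooling such as commitlint is more thorough):

```typescript
// Conventional Commits subject check for the types used in this project:
// `type(scope)?: description`, e.g. "feat(chatbot): add typing indicator".
const COMMIT_RE =
  /^(feat|fix|docs|style|refactor|test|chore)(\([\w-]+\))?: .+/;

function isConventionalCommit(subject: string): boolean {
  return COMMIT_RE.test(subject);
}
```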
Before Submitting:
- Code follows the project's style guidelines
- Changes have been tested locally
- No console errors or warnings
- Build completes successfully (`npm run build`)
- Commit messages follow conventional commits
- PR description clearly explains the changes
Here are some areas where contributions would be especially welcome:
- Features: New page sections, animations, interactions
- AI/Chatbot: Improved prompts, new BAML functions, chat features
- Performance: Bundle size optimization, lazy loading improvements
- Accessibility: A11y improvements, keyboard navigation
- Testing: Unit tests, E2E tests, integration tests
- Documentation: Improved docs, tutorials, code comments
- Bug Fixes: Report and fix any issues you find
Feel free to open a Discussion if you have questions about:
- How to implement a feature
- Architecture decisions
- Best practices
- General usage
This project is licensed under the MIT License - see the LICENSE file for details.
Feel free to use this project as a template for your own portfolio! If you do, I'd appreciate a link back to this repository or a mention. Happy coding!
- Angular Team: For the amazing framework and tooling
- Vercel: For seamless deployment and hosting
- OpenAI: For GPT-4 API powering the chatbot
- BAML: For type-safe LLM integration
- Tailwind CSS: For the utility-first CSS framework
- Open Source Community: For all the amazing libraries and tools
Silvio Angelo Baratto
- Email: silvio.baratto22@gmail.com
- GitHub: @SilvioBaratto
- LinkedIn: Silvio Baratto
- Website: silviobaratto.vercel.app
Star this repo if you find it helpful! Contributions and feedback are always welcome.