<div align="right">
  <a href="README_CN.md">中文</a> | <strong>English</strong>
</div>
<div align="center">
<h1>Serverless API Proxy</h1>
<p>Serverless API Proxy: a multi-API proxy gateway built on Vercel routes, Cloudflare Workers, and Netlify redirects</p>
</div>

## Supported APIs

- OpenAI
- Gemini
- Groq
- Claude
- Cohere
- Hugging Face
- Fireworks AI
- ...

## How to deploy

### Vercel

[Deploy with Vercel](https://vercel.com/new/clone?repository-url=https://github.com/lopins/serverless-api-proxy)

### Cloudflare

[Deploy to Cloudflare Workers](https://deploy.workers.cloudflare.com/?url=https://github.com/lopins/serverless-api-proxy)

### Netlify

[Deploy to Netlify](https://app.netlify.com/start/deploy?repository=https://github.com/lopins/serverless-api-proxy)

## How to use

### Configure the proxy address

Replace `self.domain` below with the domain of your own deployment.

```
# openai
https://self.domain/openai/

# gemini
https://self.domain/gemini/

# groq
https://self.domain/groq/

# claude
https://self.domain/claude/

# cohere
https://self.domain/cohere/

# huggingface
https://self.domain/huggingface/

# fireworks
https://self.domain/fireworks/
```
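
As a quick end-to-end check, a plain HTTP request to the Gemini route works with nothing beyond the Python standard library. This is only a sketch: the domain, the model name, and `YOUR_GEMINI_API_KEY` are placeholders you need to replace.

```python
import json
import urllib.request

# Placeholders: replace with your proxy domain and your own Gemini API key.
PROXY_BASE = "https://self.domain/gemini/v1beta"
API_KEY = "YOUR_GEMINI_API_KEY"

payload = {
    "contents": [{"parts": [{"text": "Say hello in one sentence."}]}]
}

req = urllib.request.Request(
    f"{PROXY_BASE}/models/gemini-pro:generateContent?key={API_KEY}",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The proxy forwards this request to generativelanguage.googleapis.com.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read().decode("utf-8")))
```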

| Name | Original API | Proxy API | Use Example |
| :---: | :--- | :--- | :--- |
| OpenAI API | <https://api.openai.com/v1> | `/openai/v1` | `/openai/v1/chat/completions` |
| Gemini API | <https://generativelanguage.googleapis.com/v1> | `/gemini/v1` | `/gemini/v1/models/gemini-pro:generateContent?key=YOUR_GEMINI_API_KEY` |
| Gemini API | <https://generativelanguage.googleapis.com/v1beta> | `/gemini/v1beta` | `/gemini/v1beta/models/gemini-pro:generateContent?key=YOUR_GEMINI_API_KEY` |
| Groq API | <https://api.groq.com/openai/v1> | `/groq/openai/v1` | `/groq/openai/v1/chat/completions` |
| Claude API | <https://api.anthropic.com/v1> | `/claude/v1` | `/claude/v1/messages` |
| Cohere API | <https://api.cohere.ai/v1> | `/cohere/v1` | `/cohere/v1/chat` |
| Hugging Face API | <https://api-inference.huggingface.co> | `/huggingface` | `/huggingface/models/meta-llama/Llama-3.1-70B-Instruct/v1/chat/completions` |
| Fireworks API | <https://api.fireworks.ai/v1> | `/fireworks/v1` | `/fireworks/v1/chat/completions` |
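
SDKs that accept a custom base URL can point at the proxy the same way. The sketch below assumes the official `anthropic` Python package; the domain, key, and model name are placeholders to adjust.

```python
import anthropic

# Placeholders: replace with your proxy domain and your own Anthropic API key.
client = anthropic.Anthropic(
    api_key="YOUR_ANTHROPIC_API_KEY",
    base_url="https://self.domain/claude",  # the SDK appends /v1/messages
)

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # any Claude model you have access to
    max_tokens=256,
    messages=[{"role": "user", "content": "Write a one-line greeting."}],
)
print(message.content[0].text)
```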

### API Usage

```python
import random
import re

from openai import OpenAI

# Replace with your own API key and your proxy domain.
ApiKey = "YOUR_OPENAI_API_KEY"
BaseUrl = "https://self.domain/openai/v1"
models = [
    "gpt-3.5-turbo",
    "gpt-4o-mini"
]

def gentext():
    # The client talks to the proxy, which forwards requests to api.openai.com.
    client = OpenAI(api_key=ApiKey, base_url=BaseUrl)
    model = random.choice(models)
    try:
        completion = client.chat.completions.create(
            model=model,
            messages=[
                {
                    "role": "system",
                    "content": "You are a smart and creative novelist."
                },
                {
                    "role": "user",
                    "content": "As the king of fairy tales, please write a short fairy tale. The theme of the story is to always keep a kind heart, to spark children's interest and imagination in learning, and to help children better understand and accept the truths and values the story carries. Output only the story content, with no title or anything else."
                }
            ],
            top_p=0.7,
            temperature=0.7
        )
        text = completion.choices[0].message.content
        # Collapse newlines before printing; the backslash is kept out of the
        # f-string expression for compatibility with Python < 3.12.
        text = re.sub(r"\n+", "", text)
        print(f"{model}:{text}")
    except Exception as e:
        print(f"{model}:{e}\n")

if __name__ == "__main__":
    gentext()
```
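
The same snippet works for any other OpenAI-compatible upstream behind the proxy, for example Groq or Fireworks: point `BaseUrl` at the matching proxy prefix from the table above and pick a model that the provider actually serves.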

## Vercel Region List

Deployments on Vercel can be pinned to specific regions; the full region list is available at <https://vercel.com/docs/edge-network/regions#region-list>.
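
To keep the functions close to the upstream APIs you call most, the deployment's `vercel.json` can pin regions. This is only an illustration with assumed region IDs; consult the region list above and the project's own `vercel.json` before changing anything.

```json
{
  "regions": ["iad1", "sin1"]
}
```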