Hey!
I've received this email from Figma, which was unfortunate, but I'm not surprised. 😁
Email from Figma:

> Hi there,
>
> On November 17, we'll roll out adjusted rate limits for apps and tokens that use Figma's REST API. We expect that your personal access token, Figma MCP, will hit these limits based on your past usage. Learn more about the updates we're making to our developer platform.
>
> The rate limits increase according to your seat and plan. If you have multiple accounts, consider using this personal access token from an account on a higher plan or from a seat with higher limits.
>
> If you're using your personal access token in a script you have written, we recommend updating the code to make fewer API calls. Read more about how to bring down API calls here.
>
> Please reach out to our team if you have any questions—we're here to help.
>
> The Figma team
I've already been downloading Figma data for LLM usage manually, using the script below, so I can throw some things out, chunk the data, or make adjustments to reduce context usage for the task, or just restart the process if the LLM outputs garbage code.
It's been especially useful when syncing UI kit variants with Figma, e.g. a huge flat array of button nodes. With links you would either make multiple requests, one per button variant, or get a response for a frame that's too big for the LLM to process in one go.
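The chunking step above can be sketched like this; `chunkNodes` and the character budget are hypothetical helpers of mine, not part of the script or the MCP package:

```javascript
// Split a flat array of Figma nodes (e.g. button variants) into batches
// small enough for an LLM context. maxChars is a rough budget for the
// serialized size of each batch, not an exact token count.
function chunkNodes(nodes, maxChars = 40000) {
  const batches = [];
  let current = [];
  let size = 0;
  for (const node of nodes) {
    const len = JSON.stringify(node).length;
    // Start a new batch when adding this node would blow the budget.
    if (current.length > 0 && size + len > maxChars) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(node);
    size += len;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each batch can then be fed to the LLM separately, and only the failing batch needs to be retried.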
Figma MCP wrapper script
```js
#!/usr/bin/env node
import { spawn } from 'child_process';

const figmaUrl = process.argv[2];
if (!figmaUrl) {
  console.error('Usage: node get-figma-data.js <figma-url>');
  process.exit(1);
}

// Persistent server process & messaging helpers
function startServer() {
  const proc = spawn('npx', ['figma-developer-mcp', '--stdio'], {
    stdio: ['pipe', 'pipe', 'inherit'],
  });

  let buffer = '';
  const resolvers = {};

  // Messages are newline-delimited JSON; buffer stdout and parse line by line.
  proc.stdout.on('data', (chunk) => {
    buffer += chunk;
    let index;
    while ((index = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, index).trim();
      buffer = buffer.slice(index + 1);
      if (!line) continue;
      try {
        const msg = JSON.parse(line);
        if (resolvers[msg.id]) {
          resolvers[msg.id](msg);
          delete resolvers[msg.id];
        }
      } catch (e) {
        console.error(e);
        process.exit(1);
      }
    }
  });

  let messageId = 1;
  function send(msgObj) {
    const id = messageId++;
    msgObj.id = id;
    proc.stdin.write(JSON.stringify(msgObj) + '\n');
    return new Promise((resolve) => {
      resolvers[id] = resolve;
      // Optionally add timeout/reject logic here
    });
  }

  function cleanup() {
    proc.kill();
  }

  return { send, cleanup };
}

async function main() {
  const url = new URL(figmaUrl);
  const nodeId = url.searchParams.get('node-id');
  const fileKey = url.pathname.split('/')[2];
  if (fileKey.length !== 22) {
    console.error('Invalid URL: the file key should be 22 characters long');
    process.exit(1);
  }

  const { send, cleanup } = startServer();

  await send({
    jsonrpc: '2.0',
    method: 'initialize',
    params: {
      protocolVersion: '2024-11-05',
      capabilities: {},
      clientInfo: { name: 'test', version: '1.0' },
    },
  });

  await send({
    jsonrpc: '2.0',
    method: 'tools/list',
  });

  const result = await send({
    jsonrpc: '2.0',
    method: 'tools/call',
    params: {
      name: 'get_figma_data',
      arguments: { fileKey, nodeId },
    },
  });

  cleanup();

  const content = result.result?.content?.[0]?.text;
  console.log(content ?? JSON.stringify(result, null, 2));
}

main().catch((e) => {
  console.error(e);
  process.exit(1);
});
```

What do you think about having a CLI alongside the MCP server in the package? 🤔
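For the CLI idea, the package could expose a `bin` entry in its package.json pointing at a wrapper like the one above; the command name and file path here are hypothetical suggestions, not existing package contents:

```json
{
  "bin": {
    "figma-data": "./bin/get-figma-data.js"
  }
}
```

That way users could run something like `npx figma-developer-mcp` for the server and a `figma-data <figma-url>` command for one-off fetches, without keeping a local script around.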
I might try making a PR if you support the idea.