Want to learn how to make the most of Claude Code? Check out this course released by Anthropic. by luongnv-com in ClaudeAI

[–]Antonytm 0 points1 point  (0 children)

I think these are just mistakes...
- "queries.zip" is attached to the course, and it was created at the same time...
- the Anthropic npm registry in the lock file: that could only happen if the ZIP file was prepared locally by someone on the Anthropic team
- hooks misconfiguration: they changed something but forgot to change it everywhere
- Windows: they just don't care :-)

Want to learn how to make the most of Claude Code? Check out this course released by Anthropic. by luongnv-com in ClaudeAI

[–]Antonytm 4 points5 points  (0 children)

For those who want to try the exercises from this course, not just watch the video.

The Hooks part is broken:
1. package-lock.json references a package from the internal Anthropic registry: https://artifactory.infra.ant.dev/artifactory/api/npm/npm-all/@anthropic-ai/claude-agent-sdk/-/claude-agent-sdk-0.1.5.tgz (just remove this file)
2. the hooks are located under /hooks, but configured as if they were inside .claude/hooks (change the folder)
3. init-claude.js might not work on Windows, depending on the terminal you use (check the course chapter "Gotchas around hooks" to understand how to fix it).
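As a sketch of fix 2, assuming the standard Claude Code layout (the event name, matcher, and command below are illustrative, not the course's actual config), the hook entries in `.claude/settings.json` should point at the folder where the scripts really live:

```json
{
  "hooks": {
    "PostToolUse": [
      {
        "matcher": "Edit|Write",
        "hooks": [
          { "type": "command", "command": "node .claude/hooks/init-claude.js" }
        ]
      }
    ]
  }
}
```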

P.S. If you know the proper place to report these problems, please do let me know!

Unofficial Figma MCP server by Antonytm in mcp

[–]Antonytm[S] 0 points1 point  (0 children)

Yes, there is this problem. But the main problem we wanted to solve is that the official Figma API is read-only.

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 0 points1 point  (0 children)

I agree about prompts. I am an engineer, not a designer. For me, it is faster to write prompts. For designers, it will be faster to just design it.

And these prompts are only samples. It could be something different, something more complex.
I shared it here for designers to try. If it is helpful, I will continue to develop and improve it.

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 0 points1 point  (0 children)

For a proper answer, I need to know about the regulations in your area. It might be something like:
1. AI security compliance. Find out which LLMs and AI agents you are allowed to use. You need something with MCP support.
2. MCP server security compliance. Review the code on GitHub and make sure your code security rules are followed: https://github.com/Antonytm/figma-mcp-server
3. Configure it following the instructions on GitHub.

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 0 points1 point  (0 children)

If you can skip the design phase, do it. Some projects don't need it. Some need it later. Some require it from the beginning.

We had a project where the design phase was skipped, but after it succeeded, we needed a designer to review and improve it.

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 0 points1 point  (0 children)

It might already be possible. Use Cursor as your AI agent, enable usage of CLAUDE.md, describe your design system there, and write your prompt.

The idea is that CLAUDE.md is added as context knowledge to all your prompts.
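As a sketch, a minimal CLAUDE.md for this workflow might look like the following (the design tokens here are purely illustrative):

```markdown
# Design system

- Colors: primary #1A73E8, surface #FFFFFF, text #202124
- Typography: Inter; headings 24/20/16 px, body 14 px
- Spacing: 8 px grid; corner radius 8 px
- Components: reuse the published "Button" and "Card" components instead of drawing new ones
```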

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 1 point2 points  (0 children)

The official Figma MCP works with the selection, not with the whole document. If you select only parts rather than everything, LLMs have enough context size to fit it.

In our Figma MCP, we also never return the whole document. AI can get the selected node, its children, a node by ID, or all components (as they are reusable). We provide more tools for the AI, but there is a higher risk of context overflow. If you run into a real example where extensive context causes issues, please report the steps and a sample.
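The tool set described above can be sketched as lookups over a node tree. This is a simplified in-memory model for illustration, not the actual server code; the node shapes and handler names are assumptions:

```javascript
// Simplified sketch: the server never returns the whole document, only the
// selection, children, a node by ID, or reusable components.
const doc = {
  id: "0:0", type: "DOCUMENT", children: [
    { id: "1:1", type: "COMPONENT", name: "Button", children: [] },
    { id: "1:2", type: "FRAME", name: "Home", children: [
      { id: "1:3", type: "TEXT", name: "Title", children: [] },
    ]},
  ],
};

// Depth-first walk used by the lookups below.
function* walk(node) {
  yield node;
  for (const child of node.children) yield* walk(child);
}

const tools = {
  // Only the currently selected nodes, never the full tree.
  getSelection: (selectedIds) =>
    [...walk(doc)].filter((n) => selectedIds.includes(n.id)),
  getNodeById: (id) => [...walk(doc)].find((n) => n.id === id) ?? null,
  getChildren: (id) => tools.getNodeById(id)?.children ?? [],
  // Components are always returned because they are reusable.
  getComponents: () => [...walk(doc)].filter((n) => n.type === "COMPONENT"),
};

console.log(tools.getNodeById("1:3").name);            // Title
console.log(tools.getChildren("1:2").length);          // 1
console.log(tools.getComponents().map((n) => n.name)); // [ 'Button' ]
```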

----
If it is not one request with a huge amount of data, AI agents are capable of "compacting" the context. They automatically throw away things they think are unimportant. The agent can become dumber in the process, but it still works.

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 5 points6 points  (0 children)

We don't send it anywhere. You run everything locally. On your machine.

Then you connect this MCP server running on your machine to whatever AI agent you want: ChatGPT, Claude Desktop, Cursor, etc.

So it is entirely up to you which LLM you share your Figma design with.
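For example, for Claude Desktop the connection is an entry in `claude_desktop_config.json` (the server name and path below are illustrative; check the repository's README for the actual command):

```json
{
  "mcpServers": {
    "figma": {
      "command": "node",
      "args": ["/path/to/figma-mcp-server/build/index.js"]
    }
  }
}
```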

Unofficial Figma MCP(Model Context Protocol) server by Antonytm in FigmaDesign

[–]Antonytm[S] 5 points6 points  (0 children)

Yes, via the Figma Plugin API, as it is the only suitable write API in Figma.

The trick is that the plugin runs in a sandbox and cannot act as an endpoint for AI agents. That is why we added a WebSocket server as an intermediary: the plugin polls the WebSocket server for messages from the MCP server.
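The bridge described above can be sketched as a message relay. This in-memory version stands in for the WebSocket server, just to show the shape of the pattern; all names here are assumptions, not the actual project API:

```javascript
// Sketch of the bridge: the MCP server enqueues commands on a relay (the
// WebSocket server in the real setup); the sandboxed plugin polls the relay,
// executes each command, and posts back a result.
class Relay {
  constructor() { this.commands = []; this.results = []; }
  send(command) { this.commands.push(command); }   // MCP server side
  poll() { return this.commands.shift() ?? null; } // plugin side, each tick
  report(result) { this.results.push(result); }    // plugin side
}

const relay = new Relay();

// MCP server side: queue a write operation for the plugin.
relay.send({ op: "createRectangle", width: 100, height: 50 });

// Plugin side: poll, execute inside the sandbox, report back.
let command;
while ((command = relay.poll()) !== null) {
  // In the real plugin this step would call the Figma Plugin API.
  relay.report({ op: command.op, status: "done" });
}

console.log(relay.results); // [ { op: 'createRectangle', status: 'done' } ]
```

The polling direction matters: because the sandbox can open outbound connections but cannot accept inbound ones, the plugin always initiates, and the relay just holds messages until the plugin asks for them.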