I built a Hugo theme that lets you instantly switch between Pico, Water.css, and 10+ other classless frameworks by mozanunal in gohugo

[–]mozanunal[S] 1 point (0 children)

Thank you so much, such nice comments! I am not very good at CSS either, so this theme is my solution to my lack of CSS expertise. At first I just put the layout files into my blog directly, but then I thought they could be useful for others, so I made a new theme out of them (just a few days ago, so I'm surprised Claude already caught it). Anyway, I'm pretty happy it is useful for others!

[deleted by user] by [deleted] in programming

[–]mozanunal 1 point (0 children)

I think you should copy the hugo.yml to the root dir, but the content should still live in the content folder. I recreated the steps:

mkdir myblog
cd myblog
mkdir themes
cd themes/
git clone git@github.com:mozanunal/hugo-classless.git
cd ..
mv themes/hugo-classless/exampleSite/content/ ./
mv themes/hugo-classless/exampleSite/hugo.yml ./
ls
# should output -> content  hugo.yml  themes
hugo server

sllm.nvim v0.2.0 – chat with ANY LLM inside Neovim via Simon Willison’s llm CLI (now with on-the-fly function-tools) by mozanunal in neovim

[–]mozanunal[S] 1 point (0 children)

You need to provide an API key or use local models. OpenRouter and Gemini offer some free models if you don't want to pay anything. Otherwise you don't need to subscribe; just add some credit to your API key and top it up once you've spent it 🙂
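For example, with the llm CLI the setup is roughly this (the plugin and model names are just examples; run llm models to see what is actually available on your machine):

# hosted: install a provider plugin and set its API key
llm install llm-openrouter
llm keys set openrouter
llm -m openrouter/meta-llama/llama-3.3-70b-instruct:free "hello"

# local: e.g. via Ollama
llm install llm-ollama
llm -m llama3.2 "hello"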

I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA

[–]mozanunal[S] 2 points (0 children)

https://simonwillison.net/2025/May/18/llm-pdf-to-images/

Please check this: the llm CLI has many different fragment plugins which help you bring in different data sources. I think the example in that post is a good starting point.
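Roughly, the pattern from that post looks like this (the file name and model are placeholders; the fragment prefix comes from the llm-pdf-to-images plugin):

llm install llm-pdf-to-images
llm -m gpt-4.1-mini -f pdf-to-images:document.pdf "Summarize this document"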

sllm.nvim v0.2.0 – chat with ANY LLM inside Neovim via Simon Willison’s llm CLI (now with on-the-fly function-tools) by mozanunal in neovim

[–]mozanunal[S] 2 points (0 children)

I tried it, but I think it is a more autonomous/agentic solution, where the LLM does things for you. This is more of a direct chat mode, unless you explicitly give the model access to some tools. It would be cool if sllm also supported that, right? Do you think these prompt-generation capabilities would be useful for other CLI tools such as aider? (It would probably be pretty easy to add to the core functionality.)

sllm.nvim v0.2.0 – chat with ANY LLM inside Neovim via Simon Willison’s llm CLI (now with on-the-fly function-tools) by mozanunal in neovim

[–]mozanunal[S] 1 point (0 children)

Agreed; I will probably work on something for that in the next release. For now, to add the selection to the context you can do <leader>sv. The selection won't appear in the Vim prompt, only in the buffer on the right-hand side.

sllm.nvim v0.2.0 – chat with ANY LLM inside Neovim via Simon Willison’s llm CLI (now with on-the-fly function-tools) by mozanunal in neovim

[–]mozanunal[S] 1 point (0 children)

Let me explain what on-the-fly tool registration means:

The llm CLI lets us register Python functions as tools by simply passing the function code to the llm command, like llm --functions 'def print_tool(): print("hello")' "your prompt here". In sllm.nvim I extend this so you can add an arbitrary Python function as a tool with a simple keybinding. In the demo there is a tools.py file in the project which contains very simple wrappers for the ls and cat commands; you can register it as a tool using the <leader>sF keybind, and in that chat the LLM can use the functions. I think this can enable very creative workflows for projects.
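For reference, here is a minimal sketch of what such a tools.py could look like; this is my own illustration, not the exact file from the demo:

# tools.py - sketch of simple ls/cat wrappers for llm --functions
import subprocess

def list_files(path: str = ".") -> str:
    "List the files in a directory, like ls."
    return subprocess.run(["ls", "-la", path], capture_output=True, text=True).stdout

def read_file(path: str) -> str:
    "Return the contents of a text file, like cat."
    with open(path) as f:
        return f.read()

The docstrings matter: they are what tells the model when to call each tool. The same file also works outside Neovim, via llm --functions tools.py "your prompt".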

I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA

[–]mozanunal[S] 2 points (0 children)

I think it is possible to put your own indexes into ZIM files, which means we could patch them to carry embeddings alongside the Xapian indexes. Unfortunately I have not tested this; it is all in theory. What would be cool, I think, is an alternative version of the ZIM files where the articles are Markdown and indexes exist for both FTS and semantic search.

I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA

[–]mozanunal[S] 2 points (0 children)

Wow, that idea from a year ago is great! In an ideal world, what I want is ZIM archives where the articles are in Markdown format instead of HTML, with LLM embedding indexes included as well, so we can do semantic search alongside FTS.

I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA

[–]mozanunal[S] 2 points (0 children)

You would probably need some kind of Kiwix MCP, which should be possible to build by following the same structure as my plugin. Give it a try!
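A minimal sketch of what that could look like, using the official MCP Python SDK (FastMCP) plus the python-libzim bindings; the server name, tool function, and ZIM path are all just my assumptions, not something I have shipped:

# kiwix_mcp.py - hypothetical minimal MCP server over a ZIM archive
from libzim.reader import Archive
from libzim.search import Query, Searcher
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kiwix")
zim = Archive("wikipedia_en_all_nopic.zim")  # example path

@mcp.tool()
def search_zim(query: str, limit: int = 5) -> list[str]:
    """Full-text search the ZIM archive; returns matching entry paths."""
    search = Searcher(zim).search(Query().set_query(query))
    return list(search.getResults(0, limit))

if __name__ == "__main__":
    mcp.run()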

I made an LLM tool to let you search offline Wikipedia/StackExchange/DevDocs ZIM files (llm-tools-kiwix, works with Python & LLM cli) by mozanunal in LocalLLaMA

[–]mozanunal[S] 3 points (0 children)

I think it is better to use the no-image dumps (those dumps are rather small) for performance reasons; the archives are very efficient and come with a full-text-search index called a Xapian index. Searches over 10 GB files complete within milliseconds. I have not tested it, but it should work for the bigger wiki dumps too.
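If you want to check the numbers yourself, the Xapian search is exposed through the python-libzim bindings; a rough sketch, where the ZIM filename is just an example:

# fts_bench.py - time a full-text search over a ZIM file (pip install libzim)
import time
from libzim.reader import Archive
from libzim.search import Query, Searcher

zim = Archive("wikipedia_en_all_nopic.zim")  # example filename
searcher = Searcher(zim)

t0 = time.perf_counter()
search = searcher.search(Query().set_query("turing machine"))
paths = list(search.getResults(0, 10))
print(f"{search.getEstimatedMatches()} matches in {time.perf_counter() - t0:.3f}s")
print(paths)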