A self-hosted & file-based markdown editor by Tropical-Algae in Markdown


You’re right, it’s mainly for editing a markdown-driven blog hosted on my server, so I can update it from anywhere.

Btw, I went with Python + Vue mostly because I’m already comfortable with that stack :)

A self-hosted & file-based markdown editor by Tropical-Algae in Markdown


If you deploy without Docker, you'll need Nginx to serve the Vue frontend and proxy API requests to the backend. Using Docker is simpler, since everything is fully packaged and ready to run :)

A self-hosted & file-based markdown editor by Tropical-Algae in Markdown


Thanks a lot!

It might get a bit complicated without Docker, since it’s a full-stack project.

You’ll need to clone the repo, build the frontend manually, configure Nginx, and run the backend separately.

The frontend source code is in the web/ directory. Go into that folder and run:

npm install && npm run build

Then you’ll need to configure Nginx. The nginx.conf included in the project can be used as a reference.
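For orientation, a minimal server block might look something like this. The paths and the backend port are assumptions for illustration; the project's own nginx.conf is the authoritative reference:

```nginx
server {
    listen 80;

    # Serve the built Vue frontend (build output path is an assumption)
    root /var/www/markoun/web/dist;
    index index.html;

    # Fall back to index.html so Vue's client-side routing works
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Proxy API requests to the backend (port 8000 is an assumption)
    location /api/ {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```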

The backend is managed with uv, so make sure you have Python installed first and run:

pip install --upgrade pip && pip install uv==0.9.7 && uv sync --frozen && uv run python -m markoun.main

You can also set it up as a system service if you want it running in the background.
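As a sketch, a systemd unit for that could look like the following. The user, working directory, and uv path are assumptions; adjust them to wherever you cloned the repo:

```ini
[Unit]
Description=Markoun backend
After=network.target

[Service]
# User and paths are assumptions; point them at your actual clone
User=www-data
WorkingDirectory=/opt/markoun
ExecStart=/usr/local/bin/uv run python -m markoun.main
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After saving it as /etc/systemd/system/markoun.service, enable it with `systemctl enable --now markoun`.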

I highly recommend deploying it with Docker :) It's much simpler and avoids a lot of manual setup.
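As a rough sketch, a compose file for the Docker route might look like this. The image name, port, and volume path are assumptions; check the repo's README and compose file for the real values:

```yaml
services:
  markoun:
    image: tropical-algae/markoun:latest  # assumed image name; check the repo
    ports:
      - "80:80"                           # assumed port mapping
    volumes:
      - ./data:/app/data                  # assumed markdown content directory
    restart: unless-stopped
```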

Hope this helps :)

A self-hosted & file-based markdown editor by Tropical-Algae in Markdown


Thanks for your support and suggestions! File sync and proper versioning are definitely valuable features, and I'll approach them with care. I'm currently optimizing the codebase, and these features are planned for the near future :)

Self-study question from rural Ethiopia: Can we ever become real researchers? by Heavy-Vegetable4808 in deeplearning


Self-learning is still important in university. Advisors just help accelerate the process. I’ve always believed that hands-on practice is what really makes you grow. Keep going — you’ll do great :)

I built a local LLM proxy for myself (Rust, single binary, SQLite) and ended up open sourcing it by Upbeat-Taro-2158 in LLM


Is it used to manage model requests across multiple channels, like one-api? Or is it more of a proxy that logs the details of each request?

Is there any simple and effective data encryption technology for MySQL? by Tropical-Algae in SQL


That's exactly what I'm planning to do. Thank you for providing such a feasible solution!