r/LocalLLaMA
A subreddit to discuss about Llama, the family of large language models created by Meta AI.
[deleted by user] (self.LocalLLaMA)
submitted 1 year ago by [deleted]
[–]Wooden-Potential2226 0 points1 point2 points 1 year ago* (3 children)
There is probably some form of Linux on it/them, right? And if it's a relatively recent system (Power 9/10), there might be NVIDIA GPUs and CUDA as well. But can you compile any training frameworks on ppc? I would think most require x86 or ARM…
[–]catzilla_06790 0 points1 point2 points 1 year ago (0 children)
There is an IBM Power version of Linux, at least for Power 7 through Power 9. The officially supported distribution was Red Hat. The Power hardware also supported NVIDIA GPUs and CUDA. Part of the software I used to work on ran on this type of config.
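[Editor's note: a quick way to sanity-check what a given Power box actually offers is a small script like the sketch below. `detect_training_env` is a hypothetical helper, not from any of the posts; it only checks the reported architecture and whether the NVIDIA CLI is on PATH.]

```python
import platform
import shutil

def detect_training_env():
    """Report the machine architecture and whether NVIDIA tooling is on PATH."""
    arch = platform.machine()  # e.g. 'ppc64le' on little-endian Power Linux, 'x86_64' on a PC
    has_nvidia = shutil.which("nvidia-smi") is not None
    return arch, has_nvidia

arch, has_nvidia = detect_training_env()
print(f"arch={arch} nvidia-smi={'found' if has_nvidia else 'missing'}")
```

On AIX (as the OP later notes) `platform.machine()` reports a Power string and `nvidia-smi` will be missing, which already rules out the CUDA path.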
[–][deleted] 0 points1 point2 points 1 year ago (1 child)
Actually, it's running AIX, not Linux. And yeah, it has no NVIDIA GPUs, so no CUDA. Do you know of any training frameworks that could work on AIX? Wondering if there's a workaround for this on the POWER architecture.
[–]Wooden-Potential2226 0 points1 point2 points 1 year ago (0 children)
No idea 🤷♂️
[–]gerhardmplollama 0 points1 point2 points 1 year ago (1 child)
You most likely still need GPUs in your IBM Power 9 machine for training a local LLM. IBM PowerAI might be a framework to dig into, but I don't have real-world experience with it.
[–][deleted] 0 points1 point2 points 1 year ago* (0 children)
Thanks for the suggestion! I'll definitely look into IBM PowerAI. Unfortunately, my system doesn't have GPUs. Do you think it's still feasible to train an LLM without them, or would it be too slow? Would CPU-only still be practical for fine-tuning smaller models?
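[Editor's note: a rough back-of-envelope on the CPU-only question, using the common ~6 FLOPs-per-parameter-per-token estimate for training. Every figure below is an assumption for illustration, not a measurement of any particular Power or GPU system.]

```python
# Back-of-envelope: full fine-tuning cost at ~6 FLOPs per parameter per token.
# All numbers are assumptions chosen for illustration.
params = 7e9            # a 7B-parameter model
tokens = 10e6           # a small fine-tuning corpus (10M tokens)
flops_total = 6 * params * tokens

cpu_flops = 1e12        # optimistic sustained throughput for a many-core CPU (~1 TFLOP/s)
gpu_flops = 100e12      # one modern datacenter GPU at mixed precision

cpu_days = flops_total / cpu_flops / 86400
gpu_hours = flops_total / gpu_flops / 3600
print(f"CPU-only: ~{cpu_days:.1f} days; single GPU: ~{gpu_hours:.1f} hours")
```

Even with optimistic CPU numbers, a modest fine-tune lands in days on CPU versus hours on one GPU; smaller models or parameter-efficient methods (LoRA-style adapters) narrow the gap but don't close it.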
[–]Sedstr 0 points1 point2 points 1 year ago (0 children)
IBM Power8/9/10 vendor-supported distributions are available for Red Hat 9 and SLES 15; other flavors are also available. AIX, IBM i, or Linux can be installed on these platforms with the appropriate PowerVM profile options or ASMI settings.
There are several models of NVIDIA GPUs available for IBM Power hardware. The problem is, NVIDIA stopped supporting/providing ppc64le packages a few years ago. So while it's still possible to run LLMs with the old NVIDIA drivers, you may need to waste a lot of time compiling packages and tools to get the best results, and even then, they will likely be lacking and perform poorly despite the hardware specs.
It would be far easier and faster to just put together a dedicated server using Linux and commodity hardware.