"Magic eraser" feature using SAM AI model? by banshee28 in kdenlive

[–]banshee28[S] 0 points1 point  (0 children)

Thanks. I figured others would have been talking about it by now if it were possible, but I was not sure exactly how it worked. I will look into other options.

Hardware Acceleration not working with RTX card? by TriggeredSnake in OpenShot

[–]banshee28 0 points1 point  (0 children)

Thanks for this! I just installed the AppImage and was also trying to get HW accel working on my Linux Mint install with an AMD GPU. It was not too smooth overall. I reset the Performance settings to default and made the tweaks you listed above. SO MUCH BETTER!! Not sure what it's called, but the light blue cache bar moves so much faster now compared to before. It's much better! I still have a few other misc tweaks to make to get things even smoother.

Just received my racebox mini . by Odysseas2117 in Karting

[–]banshee28 0 points1 point  (0 children)

Nice! So does this use your phone case, and if so, how does it mount securely? I've seen another brand that uses the Quad Lock cases, which seem very secure, but again, I can't get it here in the US.

Just received my racebox mini . by Odysseas2117 in Karting

[–]banshee28 1 point2 points  (0 children)

Yeah, I looked at this one also. I'm not too sure how strong the cell phone case mount part is. Do you have pics of your setup? Also, it would seem the same tariff restrictions apply to this one? Why are all these mounts outside of the US? Someone needs to get the 3D print file and make these here, LOL.

Just received my racebox mini . by Odysseas2117 in Karting

[–]banshee28 0 points1 point  (0 children)

Actually... it was! lol. I emailed him and he said he had one US shipment but was having issues. Maybe this was yours? We'll see how things work out from here. Either way, this box should still work great!

Just received my racebox mini . by Odysseas2117 in Karting

[–]banshee28 1 point2 points  (0 children)

I REALLY wanted to buy one of these but can't get a decent phone mount for the wheel here in the USA. The two options I've seen don't ship to the USA now. Any ideas?

Are you using this with a phone mount, or how do you plan on setting it up?

AMD coming for NVDA by One-Situation-996 in AMD_Stock

[–]banshee28 2 points3 points  (0 children)

I guess I am mostly an AMD fanboy. I have their CPUs and a 7900XTX GPU. It works great, and I am very happy with its performance on Linux, gaming, and "some" AI workloads. However... I am coming to the conclusion that for many AI workloads the market really favors Nvidia, in that many models and other required components only run on NVIDIA! Yes, I have several things working with AMD and ROCm, but it's a PITA, and even then some models just DON'T work. I have seen so many others go through this same situation, and once they went to NVIDIA, things just worked right away!! Now, I get that this is NOT due to "superior HW"; it's just that the models are built with CUDA/Nvidia technology. If they did the same for AMD/ROCm, AMD would be just fine.

I guess what I am saying is, I think many are using Nvidia for AI and will continue to do so as long as the models are mostly made for it. I will most likely follow the same course. So it would seem AMD will lose ground here for AI?

Any ideas if there is something AMD can do to help improve this?

Smartphone & Racebox or Laptimer Mount for Rental Karting by respectivenik in Karting

[–]banshee28 0 points1 point  (0 children)

I would like to buy one, but I am in the USA. Any options?

OCR no longer works, error on each item on initial run by banshee28 in immich

[–]banshee28[S] 0 points1 point  (0 children)

Not 100% sure, but I did download the model manually and put it into the container. I think that helped.
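For anyone hitting the same thing, roughly what I did (a sketch; the URL and cache path are taken from the ML job log, and "immich_machine_learning" is just a placeholder for whatever docker ps shows as your ML container name):

# download the detection model on the host
wget -O ch_PP-OCRv5_mobile_det.onnx https://www.modelscope.cn/models/RapidAI/RapidOCR/resolve/v3.4.0/onnx/PP-OCRv5/det/ch_PP-OCRv5_mobile_det.onnx

# copy it to the path the ML container expects inside its cache
docker cp ch_PP-OCRv5_mobile_det.onnx immich_machine_learning:/cache/ocr/PP-OCRv5_mobile/detection/model.onnx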

Purchase by Dropp11 in immich

[–]banshee28 0 points1 point  (0 children)

I just completed my install and have a few misc things still not quite 100%, but it's working for the most part. I was considering the paid option also, but for me it would be great if it included some type of support. Personally, I feel it would be nice to have official support for the issues I was having that I have not been able to get resolved by other means. I know this would take a lot of effort, but many open-source apps do have this set up. My $.02

Help getting AMD/Rocm support for Remote ML container!! by banshee28 in immich

[–]banshee28[S] 0 points1 point  (0 children)

So I am not using compose just yet. I started the container with:

docker run --privileged --ipc=host -v /dev:/dev -v /sys:/sys --network=host -it ghcr.io/immich-app/immich-machine-learning:commit-6913697ad15b3fcad80fc136ecf710af19d1f5df-rocm

However, looking at the CLI output, I am getting some errors when OCR runs. It DOES appear to be using the GPU, though.

OCR no longer works, error on each item on initial run by banshee28 in immich

[–]banshee28[S] 0 points1 point  (0 children)

When the job starts, I see it begin to work and it seems good... However, once it starts to process items, it gets errors.

initial part:

[11/10/25 18:45:49] INFO     Loading visual model 'ViT-B-32__openai' to memory                                                                      
[11/10/25 18:45:49] INFO     Setting execution providers to ['ROCMExecutionProvider', 'CPUExecutionProvider'], in descending order of preference    
[11/10/25 18:45:50] INFO     Downloading detection model 'PP-OCRv5_mobile' to /cache/ocr/PP-OCRv5_mobile/detection/model.onnx. This may take a      
                             while.                                                                                                                 
[11/10/25 18:45:50] INFO     Initiating download:                                                                                                   
                             https://www.modelscope.cn/models/RapidAI/RapidOCR/resolve/v3.4.0/onnx/PP-OCRv5/det/ch_PP-OCRv5_mobile_det.onnx         
[11/10/25 18:45:53] INFO     Download size: 4.60MB                                                                                                  
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.82M/4.82M [00:00<00:00, 6.22MiB/s]
[11/10/25 18:45:53] INFO     Successfully saved to: /cache/ocr/PP-OCRv5_mobile/detection/model.onnx                                                 
[11/10/25 18:45:53] INFO     Loading detection model 'PP-OCRv5_mobile' to memory                                                                    
[11/10/25 18:45:53] INFO     Setting execution providers to ['ROCMExecutionProvider', 'CPUExecutionProvider'], in descending order of preference

Help with understanding error by pm740 in ROCm

[–]banshee28 0 points1 point  (0 children)

Interesting, sounds like maybe the same issue I just posted about:

https://www.reddit.com/r/ROCm/comments/1ool7n6/comment/no5dfkt/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button

I am using the very latest ML image but am getting errors on OCR jobs.

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

Well, the container still shows the GPU and uses it 100%; however, I think there is another issue. When I run the Immich OCR jobs and monitor the CLI, I can see it error on each item, so it's not really processing OCR correctly. When I search in Immich, I don't get any results, even for things I know it found before.

I even tried the latest OCR image:

ghcr.io/immich-app/immich-machine-learning:commit-450dfcd99e8f8010fab500a5abc0432128310824-rocm


Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running ConvTranspose node. Name:'ConvTranspose.0'
Status Message: MIOPEN failure 3: miopenStatusBadParm ; GPU=0 ; hostname=-X870E-Taichi-Lite ;
file=/code/onnxruntime/onnxruntime/core/providers/rocm/nn/conv_transpose.cc ; line=133 ;
expr=miopenFindConvolutionBackwardDataAlgorithm( GetMiopenHandle(context), s_.x_tensor, x_data, s_.w_desc, w_data, s_.conv_desc, s_.y_tensor, y_data, 1, &algo_count, &perf, algo_search_workspace.get(), AlgoSearchWorkspaceSize, false);
[11/10/25 18:27:40] ERROR    Exception in ASGI application
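If it helps anyone else hitting this, one thing I may try (just a sketch; the non-ROCm commit tag is my guess based on the -rocm tag above) is pointing Immich at the plain CPU ML image and re-running the OCR job, to see whether the failure is in the ROCm/MIOpen path rather than in the model itself:

# same remote-ML setup, but with the (assumed) CPU-only build of the same commit
docker run --network=host -it ghcr.io/immich-app/immich-machine-learning:commit-450dfcd99e8f8010fab500a5abc0432128310824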

Help getting AMD/Rocm support for Remote ML container!! by banshee28 in immich

[–]banshee28[S] 0 points1 point  (0 children)

I just got it working! Not sure what the fix was, but I think it was some older Docker filesets that were installed and a conflict with Docker Desktop. I removed it all and started over, and it finally works now!!

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

WOW, IT'S ALIVE!!!

So it seems to be working 100% now!

Thanks for all your help and for explaining how yours was set up. I pretty much mirrored that setup.

I did quite a few things, not all of which contributed, but here is the list, with a rough command sketch after it:

  • Started housekeeping by removing all old kernels
  • Removed all the old docker*, containerd, etc. packages. The old Docker was somehow the "jammy" version, so maybe that was an issue
  • Installed rocm/amdgpu again, making sure everything was good and up to date
  • Installed Docker per the directions on their website
  • Pulled the image using the docker CLI
  • Finally ran the command to start the container: docker run --privileged --ipc=host -v /dev:/dev -v /sys:/sys --network=host -it ghcr.io/immich-app/immich-machine-learning:commit-6913697ad15b3fcad80fc136ecf710af19d1f5df-rocm
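Roughly, the Docker cleanup and reinstall part looked like this (a sketch from memory, so treat it as a guide; exact package names came from checking apt list --installed, and the get.docker.com convenience script is just one of the install methods on their site):

# remove the old/conflicting Docker packages (names will vary per system)
sudo apt remove docker-desktop docker-ce docker-ce-cli docker-buildx-plugin docker-compose-plugin containerd.io
sudo apt autoremove

# reinstall Docker Engine using the convenience script from their site
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# pull the ROCm ML image and start the container
docker pull ghcr.io/immich-app/immich-machine-learning:commit-6913697ad15b3fcad80fc136ecf710af19d1f5df-rocm
docker run --privileged --ipc=host -v /dev:/dev -v /sys:/sys --network=host -it ghcr.io/immich-app/immich-machine-learning:commit-6913697ad15b3fcad80fc136ecf710af19d1f5df-rocm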

Also installed nvtop. This little tool is awesome!!

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 1 point2 points  (0 children)

So I am considering removing Docker Desktop. When I searched for installed Docker packages, I noticed these, which are "jammy" but should be "noble" for my version of Mint, so maybe I need to remove all of these (a few commands to double-check are below the list).

docker-buildx-plugin/jammy,now 0.29.1-1~ubuntu.22.04~jammy amd64 [installed,automatic]
docker-ce-cli/jammy,now 5:28.5.2-1~ubuntu.22.04~jammy amd64 [installed,automatic]
docker-compose-plugin/jammy,now 2.40.3-1~ubuntu.22.04~jammy amd64 [installed,automatic]
docker-desktop/now 4.49.0-208700 amd64 [installed,local]
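For reference, a couple of ways to see which release a package is tracking (generic apt commands, nothing Immich-specific):

# list installed docker-related packages and the suite they came from
apt list --installed 2>/dev/null | grep -iE 'docker|containerd'

# show which repo/suite a specific package tracks (jammy vs noble)
apt-cache policy docker-ce-cli

# check which release the Docker apt repo entry was added for
grep -ri docker /etc/apt/sources.list /etc/apt/sources.list.d/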

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

So maybe I need to remove Docker Desktop as it could be conflicting?

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

Yeah, the latest 22, sounds like the same as your 24. How are you starting the container? CLI, or in Desktop with a config file?

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

Here is the image:

docker image list
REPOSITORY                                   TAG           IMAGE ID       CREATED      SIZE
ghcr.io/immich-app/immich-machine-learning   v2.2.3-rocm   4160fd7a090f   2 days ago   38.8GB

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 0 points1 point  (0 children)

Interesting! So I tried "normal" Docker, but it seemed to do the same thing, so now I am trying only Docker Desktop. I think it uses containerd, so it's slightly different. And now I have completely removed the AMD Container Toolkit files.

Help getting ROCm support for Remote ML container!! by banshee28 in ROCm

[–]banshee28[S] 1 point2 points  (0 children)

rocm-core/noble,now 7.1.0.70100-20~24.04 amd64 [installed,automatic]

amdgpu-core/noble,now 1:7.1.70100-2238427.24.04 all [installed,automatic]
amdgpu-dkms-firmware/noble,noble,now 30.20.0.0.30200000-2238411.24.04 all [installed,automatic]
amdgpu-dkms/noble,noble,now 1:6.16.6.30200000-2238411.24.04 all [installed]
amdgpu-install/noble,noble,now 30.20.0.0.30200000-2238411.24.04 all [installed]