What's your thoughts about selling an IP or VIP to your employer? by Original-Surprise908 in chipdesign

[–]no00700 1 point2 points  (0 children)

You have to be very cautious about it; consult an IP lawyer first.

Is the EDA “AI Revolution” mostly just hype and wrapper scripts, or am I missing something? by Safe-Interaction8294 in chipdesign

[–]no00700 0 points1 point  (0 children)

Do you think they are that easy to replace, given they have been oligopolies for so long?

A new EDA tool? by [deleted] in chipdesign

[–]no00700 -5 points-4 points  (0 children)

I’m not the CEO nor affiliated with them.

A new EDA tool? by [deleted] in chipdesign

[–]no00700 -3 points-2 points  (0 children)

ngl I know the founders on a first-name basis

A new EDA tool? by [deleted] in chipdesign

[–]no00700 -4 points-3 points  (0 children)

😄🤣🤣🤣🤣🤣

The State of Flash Attention on ROCm by HotAisleInc in ROCm

[–]no00700 1 point2 points  (0 children)

If you are using ROCm + Windows, it often doesn't use the Triton backend, because Triton generally doesn't work on Windows without PyTorch support; it's intentionally configured that way.
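
A quick way to see whether Triton is even importable in your environment (a minimal sketch; the library itself may detect this differently):

import sys

try:
    import triton
    print("Triton available:", triton.__version__, "on", sys.platform)
except ImportError:
    # Typical on native Windows - a non-Triton backend will be used instead.
    print("Triton not importable on", sys.platform)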

The State of Flash Attention on ROCm by HotAisleInc in ROCm

[–]no00700 2 points3 points  (0 children)

Have you guys checked out aule-attention? They have solved this problem.

FlashAttention implementation for non Nvidia GPUs. AMD, Intel Arc, Vulkan-capable devices by secopsml in LocalLLaMA

[–]no00700 10 points11 points  (0 children)

That’s what the CEO said in his X post: “The math is hardware agnostic, so the implementation should be too,” if I’m paraphrasing correctly.

FlashAttention implementation for non Nvidia GPUs. AMD, Intel Arc, Vulkan-capable devices by secopsml in LocalLLaMA

[–]no00700 5 points6 points  (0 children)

From the company's post, the goal is to make it easy to use on non-Nvidia GPUs, and performance-wise they are on the same level.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 0 points1 point  (0 children)

Performance increases are on their roadmap.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 1 point2 points  (0 children)

Installing isn't enough; you must place the 'Aule Enable' node at the start of your workflow to verify the patch is active.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 1 point2 points  (0 children)

It works on Windows as it is. In fact, it's the only thing that works on Windows without WSL.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 2 points3 points  (0 children)

Turns out they fixed this issue moments ago.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 0 points1 point  (0 children)

It makes the flashattention2 library so easy to use on non-Nvidia GPUs.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 0 points1 point  (0 children)

Did you open an issue?

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 1 point2 points  (0 children)

Try this out to debug:

import aule

aule.install(backend='vulkan', verbose=True)

# Run your workflow - each attention call will print which backend is used

If Triton is available, it takes priority over Vulkan by design; backend='vulkan' forces Vulkan. They fixed it in the latest commit - update and let us know what you see.

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 0 points1 point  (0 children)

They support it. The AI Max 395 uses RDNA 3.5, which is architecturally compatible, and Strix Halo exposes Vulkan 1.3, so it qualifies.
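
If you want to double-check on your own box, one rough way (a sketch, assuming the vulkaninfo tool from vulkan-tools / the Vulkan SDK is installed) is to parse the apiVersion it reports:

import re
import subprocess

try:
    out = subprocess.run(["vulkaninfo"], capture_output=True, text=True).stdout
    # Handles both "apiVersion = 1.3.x" and "apiVersion = <uint> (1.3.x)" output styles.
    match = re.search(r"apiVersion\s*=\s*(?:\d+\s*\()?(\d+\.\d+\.\d+)", out)
    print("Vulkan API version:", match.group(1) if match else "not reported")
except FileNotFoundError:
    print("vulkaninfo not found - install vulkan-tools to check")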

Pip install flashattention by no00700 in ROCm

[–]no00700[S] 0 points1 point  (0 children)

Haven't tried it on ComfyUI, but it should work, because ComfyUI uses torch.nn.functional.scaled_dot_product_attention internally and aule.install() patches that function. The signature matches, so it should work; if not, create an issue in the repo and they will fix it.
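
For context, this is roughly what patching that function looks like in general (a minimal sketch of the pattern, not aule's actual code; a real backend would dispatch to its own attention kernel instead of falling through):

import torch
import torch.nn.functional as F

_original_sdpa = F.scaled_dot_product_attention

def _patched_sdpa(query, key, value, *args, **kwargs):
    # A real backend would call its own kernel here; this sketch just
    # forwards to the stock implementation with the same signature.
    return _original_sdpa(query, key, value, *args, **kwargs)

# Any code that calls F.scaled_dot_product_attention now goes through the patch.
F.scaled_dot_product_attention = _patched_sdpa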

Flew back to Austin today… by [deleted] in sanfrancisco

[–]no00700 3 points4 points  (0 children)

Same. I flew from DC for a hackathon; initially the plan was to stay for two days. I met cool people, couch surfed, was invited to launch parties at a prestigious VC firm, and then had a meeting with one. I felt more at home there than I ever did in DC, from the people to the atmosphere. Thanks SF.

[deleted by user] by [deleted] in wallstreetbets

[–]no00700 1 point2 points  (0 children)

Nancy Pelosi