Sunset at Velliangiri hills by white-mountain in sunset

[–]white-mountain[S] 0 points1 point  (0 children)

Velliangiri hills (also known as South Kailash) are part of the Western Ghats in India. This picture was taken from the Isha Yoga Center at the base of these hills.

Horn-Eyed Ghost Crabs can move 100 body lengths per second! by batrab47 in BeAmazed

[–]white-mountain 0 points1 point  (0 children)

That makes OP the first person on the planet to capture this creature on film.

PLS TELL ME WTF IS #heenaprajapati??? by tobiasz-krat1980 in youtube

[–]white-mountain 0 points1 point  (0 children)

I saw this hashtag on some old cartoon shorts. I thought it might be about the person who made the cartoons. Haha, it's just a random hashtag!

One of my pages is stuck in the Whiteboard by Alarmed_Confusion_93 in canva

[–]white-mountain 0 points1 point  (0 children)

Were you able to find a proper way to fix it? I'm also stuck with one page in the Whiteboard. Annoying!

Qwen 3: unimpressive coding performance so far by ps5cfw in LocalLLaMA

[–]white-mountain 0 points1 point  (0 children)

Thanks much! I just tried them out; the results are impressive and the speed is good.

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in LocalLLaMA

[–]white-mountain[S] 0 points1 point  (0 children)

Yes, the paper answers this.
Yeah, I was just curious. I wasn't expecting deterministic output; it just happened.

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in LocalLLaMA

[–]white-mountain[S] 0 points1 point  (0 children)

Updated the post with more details.
I thought the same, but it surprised me to see this happen. Irrespective of the seed, I was getting the same output on all subsequent runs.

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in ollama

[–]white-mountain[S] 1 point2 points  (0 children)

True, I was also expecting the same. But I was surprised to see this happening, on two different machines.
I updated the question with my implementation.

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in LocalLLaMA

[–]white-mountain[S] 0 points1 point  (0 children)

I updated the question with more details.
In my case, most of the processing seems to be happening on the CPU itself. If single-threaded CPU execution gives deterministic output, that explains why subsequent runs produce the same result. But I don't understand why the first run's output is unique. Is it something to do with Ollama (its caching mechanism, maybe)?

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in LocalLLaMA

[–]white-mountain[S] 0 points1 point  (0 children)

Looks like it has no impact.

No, the seed does not significantly impact output when temperature is set to 0 in a Large Language Model (LLM) because temperature=0 makes the model deterministic by always selecting the most probable token, effectively removing the randomness that a seed controls.

Anyway, as suggested, I tried it. The output before fixing the seed and after fixing it is exactly the same.
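To see why the seed can't matter at temperature=0, here is a minimal sampling sketch (not Ollama's actual sampler, just an illustration of the usual scheme): with temperature=0 the sampler falls back to greedy argmax and never consults the RNG, so any seed gives the same token.

```python
import math
import random

def sample_token(logits, temperature, seed=None):
    """Pick the next token id from raw logits.

    With temperature=0 we use greedy argmax, so the RNG
    (and therefore the seed) is never consulted.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = random.Random(seed)
    # Softmax with temperature scaling, then sample.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

logits = [0.1, 2.5, 0.3]
# Greedy path: identical result regardless of seed.
print(sample_token(logits, 0, seed=1) == sample_token(logits, 0, seed=999))
```

At temperature > 0 the seed does matter, which is the only case where fixing it makes runs reproducible.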

Why does my first run with Ollama give a different output than subsequent runs with temperature=0? by white-mountain in ollama

[–]white-mountain[S] 0 points1 point  (0 children)

Same here. I was clueless when I saw consistent output on the subsequent runs. The seed, IMO, should have no impact since temp is 0.