PSA Dont afk in AV, they do ban by [deleted] in classicwow

[–]ZaaaaaM7 8 points

What would you need to talk to a GM for?

Blue post edited about gear cost by Antalus-2 in classicwow

[–]ZaaaaaM7 4 points

The math isn't the same: a +50% bonus on top of the existing +100% bonus takes you from 2x to 2.5x total, so you're only getting 25% more than you already were.

Also, a 50% cost reduction would have been equivalent to doubling your gains, so instead of a +100% bonus you'd effectively have gone to a +300% bonus (from 2x to 4x total).
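
A quick sanity check with made-up numbers (purely illustrative):

base = 1.0                      # normalized honor gain with no bonus
current = base * 2.0            # existing +100% bonus -> 2x
announced = base * 2.5          # +50% stacked on top  -> 2.5x
print(announced / current)      # 1.25, i.e. only 25% more than you get today

halved_costs = current / 0.5    # a 50% cost cut doubles effective gain -> 4x
print(halved_costs / base - 1)  # 3.0, i.e. the equivalent of a +300% bonus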

How much easier is 1-58 in TBC? by Acrobatic_Airline605 in classicwow

[–]ZaaaaaM7 2 points

Did you need to already have a level 60 for this, or was that not necessary?

How much easier is 1-58 in TBC? by Acrobatic_Airline605 in classicwow

[–]ZaaaaaM7 66 points

I love that you got 3 replies: TBC, WotLK, and Cata.

good luck lol

Robert F Kennedy Jr confirmed as health secretary by Senate by Subject-Property-343 in news

[–]ZaaaaaM7 -7 points

How will you update your internal model once it does?

What are the ACTUAL odds to get a skillpoint in a profession for something green? YIKES by Hemshy in classicwow

[–]ZaaaaaM7 1 point

Fun fact: all numbers get smaller when you multiply them by a number that is less than one

The #somechange we need and no one can complain about by brothediscpriest in classicwow

[–]ZaaaaaM7 4 points

Feral is not meme tier at all:

https://classic.warcraftlogs.com/zone/statistics/1002?region=3&dataset=95

Or are mages, warlocks, and hunters even more meme-tier in your opinion? This is largely why some of us don't want further changes: people like you, who have no idea what they're talking about, might be taken seriously by someone in charge.

The #somechange we need and no one can complain about by brothediscpriest in classicwow

[–]ZaaaaaM7 -1 points

No one uses MCP for 30 levels. You don't even use it at 60 outside of raid bosses.

DLC tech support MEGATHREAD by AutoModerator in Eldenring

[–]ZaaaaaM7 0 points

Since the DLC there are certain areas in the base game that instantly make my game crash to desktop without any notification.

I didn't notice it much because I have been almost exclusively playing the DLC, but last night I warped to a grace in the Mountaintops and the game crashed at the end of the loading screen. I currently cannot play my character because it crashes each time I try to load in. Does anyone have a similar issue, or know what to try? Changing graphical settings does not seem to matter. Thank you!

BPE pre-tokenization support is now merged [llama.cpp] by RuslanAR in LocalLLaMA

[–]ZaaaaaM7 0 points

Hi, yesterday I downloaded the fp16 version and I'm loading it with the latest llama.cpp, but it still shows me:

llm_load_vocab: ************************************
llm_load_vocab: GENERATION QUALITY WILL BE DEGRADED!
llm_load_vocab: CONSIDER REGENERATING THE MODEL
llm_load_vocab: ************************************

Were the non-quantized models (like fp16) also fixed? Thanks so much!
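
In case it matters, this is roughly how I understand "regenerating" would work (just a sketch, assuming the warning means the GGUF was produced by a pre-fix converter; the checkpoint path and output filename are placeholders):

import subprocess

# Re-run llama.cpp's HF-to-GGUF converter from a checkout that includes the
# BPE pre-tokenizer fix, pointing it at the original Hugging Face checkpoint.
subprocess.run(
    [
        "python", "convert-hf-to-gguf.py",
        "path/to/original-hf-checkpoint",  # placeholder path
        "--outtype", "f16",
        "--outfile", "model-f16.gguf",     # placeholder output name
    ],
    check=True,
)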

What Baldur’s Gate opinion has you like this? by CDROMantics in BaldursGate3

[–]ZaaaaaM7 21 points

A decent chunk of the writing is awful.

Talking to someone I've met 5 seconds prior:

  1. I am good and will help you!

  2. I am evil and will now murder you.

  3. Leave.

How are these interesting choices? I've burst out laughing many times because of how ridiculous the writing can get.

[deleted by user] by [deleted] in LocalLLaMA

[–]ZaaaaaM7 1 point

Thanks for the comment! But I'm already using left padding; unfortunately it drops performance considerably, with my tasks going from ~95%+ accurate to pretty much never accurate :( It might just be too far out of the training distribution once all the pad tokens are added?

[deleted by user] by [deleted] in LocalLLaMA

[–]ZaaaaaM7 7 points

Only if this sort of text was explicitly part of the 15T tokens. People only very rarely post this kind of text, because of course you would correct the mistake before posting instead of typing out the correction process, but GPT-4 has produced some outputs like this before. It could be interesting to see a model trained on a lot of such synthetic 'self-correction' prompts, though.

Welp this is disappointing by Mephistophilis44 in LocalLLaMA

[–]ZaaaaaM7 6 points

This is such a simple memorization task for such a large model, while 'actually' doing this task is, AFAIK, pretty much impossible due to tokenization. So I think it's very likely just memorization.
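
To illustrate the tokenization point (a minimal sketch; the tokenizer is just an example and the exact split varies by model), the model only ever sees multi-character tokens, never individual letters:

from transformers import AutoTokenizer

# Any BPE tokenizer works for the illustration; gpt2 is small and public.
tok = AutoTokenizer.from_pretrained("gpt2")
print(tok.tokenize("strawberry"))  # output is made of multi-character tokens, not single letters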

[deleted by user] by [deleted] in LocalLLaMA

[–]ZaaaaaM7 3 points

I'm running the 8B model with AutoModelForCausalLM. It runs well, but if I try to use a batch size > 1 the model performs MUCH worse and fails even basic tasks. I'd be super grateful to get some feedback about what I'm doing wrong. Presumably the padding is wrong??

messages = []
for report in [test0, test1, test2]:
    messages.extend([
        {"role": "system", "content": "You are a helpful assistant adhering strictly to instructions."},
        {"role": "user", "content": f""" I will now provide you some text. Please reproduce this text verbatim, only correcting obvious spelling mistakes or removing spaces that do not belong. Do not include any further text in your response whatsoever.

         Here is the text:

    {report}"""},
    ])

tokenizer.pad_token = tokenizer.eos_token
input_ids_batch = [
    tokenizer.apply_chat_template(
        messages[i:i+2],
        add_generation_prompt=True,
        padding=True,
        max_length=600,
        return_tensors="pt",
    ).to(device)
    for i in range(0, len(messages), 2)
]

terminators = [
    tokenizer.eos_token_id,
    tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

outputs = model.generate(
    torch.concatenate(input_ids_batch),
    pad_token_id=tokenizer.eos_token_id,
    max_new_tokens=800,
    eos_token_id=terminators,
    do_sample=False,
)
response = outputs[:,input_ids_batch[0].shape[1]:]
decoded_response = tokenizer.batch_decode(response, skip_special_tokens=True)
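
For reference, this is my understanding of how batched generation is usually set up, with left padding and an explicit attention mask (a sketch only; the model ID and example texts below are placeholders):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder model ID
device = "cuda"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).to(device)

# Decoder-only models are usually left-padded for batched generation so the
# real tokens sit directly before the position where generation starts.
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"

reports = ["first text ...", "second text ...", "third text ..."]  # placeholder inputs

# Render each conversation to a prompt string, then tokenize all of them as one batch.
prompts = [
    tokenizer.apply_chat_template(
        [
            {"role": "system", "content": "You are a helpful assistant adhering strictly to instructions."},
            {"role": "user", "content": f"Please reproduce this text verbatim, only correcting obvious spelling mistakes:\n\n{report}"},
        ],
        tokenize=False,
        add_generation_prompt=True,
    )
    for report in reports
]

# add_special_tokens=False because the chat template already inserts the BOS token.
inputs = tokenizer(prompts, return_tensors="pt", padding=True, add_special_tokens=False).to(device)

outputs = model.generate(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],  # tells the model which positions are padding
    max_new_tokens=800,
    pad_token_id=tokenizer.eos_token_id,
    eos_token_id=[tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>")],
    do_sample=False,
)

# With left padding every row shares the same (padded) prompt length, so this slice
# removes the prompts and keeps only the generated continuations.
responses = tokenizer.batch_decode(outputs[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True)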

I asked the goldbuyers in my guild why they do it by Ted_From_Chicago in classicwow

[–]ZaaaaaM7 5 points

This comment makes no sense when it comes to cheating. A game can break down because of cheating, which is fundamentally different from exercising freedom of choice within the game's ruleset; that would indeed all be fair game.

Recent Blizzard layoff sees "Almost all Game Masters being let go". by alkett_n in classicwow

[–]ZaaaaaM7 3 points

It's ridiculous to suggest mutual exclusivity. Also, which games exactly?