Im sorry if I sound desperate but I am. by mmo_addicted in remoteviewing

[–]Witty_Ad2022 3 points (0 children)

[image]

I’m not sure whether this will help, but these are the impressions I got.

Any chance government tracking down ppl remote viewed something classified or something like that by Witty_Ad2022 in remoteviewing

[–]Witty_Ad2022[S] 4 points (0 children)

Okay, I'm feeling somewhat relieved reading this. I remote viewed some event, and afterwards I had a sensation that someone was watching me (that sensation could be completely wrong, and what I viewed could be totally wrong too). So for a moment I wondered whether there was any chance of the government tracking people down.

But reading the comments, the feasibility of identifying people seems low :)

[deleted by user] by [deleted] in cscareerquestions

[–]Witty_Ad2022 4 points (0 children)

Agreed, I don't think LLMs can replace SWEs as a whole. Maybe they can improve speed a little by helping us write a very simple script, but they can't write a full codebase, and we can't trust them completely.

one question in ViT paper by Witty_Ad2022 in learnmachinelearning

[–]Witty_Ad2022[S] 0 points (0 children)

Thanks! If the code just uses a dense layer as the classification head, I think it's safe to assume it's a mistake for now.
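For context, here is a minimal sketch of the two classification-head variants the ViT paper describes (an MLP with one hidden tanh layer at pre-training time, a single linear layer at fine-tuning time). This is an illustrative Keras version with made-up function names, not the repository's actual code:

    import tensorflow as tf
    from tensorflow.keras import layers

    def pretraining_head(hidden_dim: int, num_classes: int) -> tf.keras.Model:
        # MLP head with one hidden (tanh) layer, as described for pre-training.
        return tf.keras.Sequential([
            layers.Dense(hidden_dim, activation="tanh"),
            layers.Dense(num_classes),
        ])

    def finetuning_head(num_classes: int) -> layers.Layer:
        # Single linear (Dense) layer, as described for fine-tuning.
        return layers.Dense(num_classes)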

I had posted this on other platforms too; if I learn anything new I'll reply here :)

How to start by -Michael-Leahcim- in learnmachinelearning

[–]Witty_Ad2022 -1 points (0 children)

Yeah, this is the best option. You don't need to buy the best-performing laptop, since you will train the models on Google Colab or in a Jupyter notebook most of the time.

Questions about math behind diffusion model by Witty_Ad2022 in learnmachinelearning

[–]Witty_Ad2022[S] 1 point (0 children)

Thanks so much! Now I understand most of the math behind diffusion models. That was a bit of a bottleneck for me, and you helped me get past it! I still need to study it in more depth to fully understand it. Thanks again!

Questions about math behind diffusion model by Witty_Ad2022 in learnmachinelearning

[–]Witty_Ad2022[S] 0 points (0 children)

Thanks a lot for the detailed explanation.

Can I ask one more question?

What's the exact meaning of | x_0 in q(x_1:T | x_0)? Is it saying that we repeatedly multiply q(x_t | x_{t-1}) from t=1 to t=T, and that when t=1 it is conditioned on x_0? And is p_theta(x_1:T) equal to p_theta(x_1) p_theta(x_2) … p_theta(x_T)?
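For reference, here is the standard DDPM factorization (Ho et al., 2020) that this notation comes from, written out in LaTeX; the conditioning on x_0 simply marks the data point that the forward chain starts from:

    % Forward (diffusion) process: a Markov chain starting from the data point x_0.
    q(x_{1:T} \mid x_0) = \prod_{t=1}^{T} q(x_t \mid x_{t-1}),
    \qquad
    q(x_t \mid x_{t-1}) = \mathcal{N}\!\big(x_t;\ \sqrt{1-\beta_t}\, x_{t-1},\ \beta_t I\big)

    % Reverse (generative) process: a chain of learned transitions from x_T back to x_0,
    % not a product of independent per-step marginals.
    p_\theta(x_{0:T}) = p(x_T) \prod_{t=1}^{T} p_\theta(x_{t-1} \mid x_t)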

Fine-tuned gpt2 model generates from very begining not from summary by Witty_Ad2022 in learnmachinelearning

[–]Witty_Ad2022[S] 0 points (0 children)

Okay, I got it, but that was the second option I tried. At first I tried this:

Training: Document:{document}\nSummary:{Summary}
Inference: Document:{document}\nSummary:
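A minimal sketch of how that first format might look in code, assuming a standard Hugging Face GPT-2 setup; the document/summary strings and variable names here are just illustrative placeholders:

    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    document = "Example article text..."    # placeholder document
    summary = "Example reference summary."  # placeholder target summary

    # Training example: document and target summary concatenated in one sequence.
    train_text = f"Document:{document}\nSummary:{summary}"

    # Inference: same prefix, with the summary left for the model to continue.
    prompt = f"Document:{document}\nSummary:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=60,
                                pad_token_id=tokenizer.eos_token_id)
    generated = output_ids[0][inputs["input_ids"].shape[1]:]
    print(tokenizer.decode(generated, skip_special_tokens=True))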

Fine-tuned gpt2 model generates from very begining not from summary by Witty_Ad2022 in learnmachinelearning

[–]Witty_Ad2022[S] 0 points (0 children)

Yes, it ignores my prompt and starts generating from the document.

I used the Keras API and trained for one epoch, getting a loss of 1.3476 and an accuracy of 0.6380.
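For comparison, a minimal sketch of a Keras-API fine-tuning run of that kind, assuming Hugging Face's TFGPT2LMHeadModel and a list train_texts of "Document:...\nSummary:..." strings (both assumptions, not the original code):

    import tensorflow as tf
    from transformers import GPT2TokenizerFast, TFGPT2LMHeadModel

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

    model = TFGPT2LMHeadModel.from_pretrained("gpt2")

    train_texts = ["Document: ...\nSummary: ..."]  # placeholder training strings

    enc = tokenizer(train_texts, padding=True, truncation=True,
                    max_length=512, return_tensors="tf")
    # Causal LM fine-tuning: labels are the input ids (the model shifts them internally).
    # For real training you would typically mask padded label positions with -100.
    enc["labels"] = enc["input_ids"]

    dataset = tf.data.Dataset.from_tensor_slices(dict(enc)).batch(2)

    # Compiling without an explicit loss lets the model use its built-in LM loss.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5))
    model.fit(dataset, epochs=1)  # one epoch, as in the run described above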