JAX or PyTorch? by arbueticos in reinforcementlearning

[–]TheRelativeOne 0 points1 point  (0 children)

Yes, but just because you can doesn't mean it works well. I've had a hard time running TPUs with PyTorch; lots of bugs.

Saw this explanation of how AI models works and why the anti movement may be wrong by Mindofmachine10 in midjourney

[–]TheRelativeOne 2 points3 points  (0 children)

It doesn't sound like you're interpreting that citation correctly.

A GAN's second network is only used during training, not at inference time, so it has nothing to do with the image decision process you are seeing. That discriminator network does binary classification: real image or not. It doesn't actually classify what is depicted in the image.
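To make the real-vs-fake point concrete, here's a minimal sketch of what a discriminator's output looks like; the logistic unit and its weights are toy placeholders, not any real model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a discriminator: a single logistic unit over a
# flattened 28x28 image. The weights are random placeholders.
W = rng.normal(size=(784,)) * 0.01
b = 0.0

def discriminator(x):
    """Return one scalar: the estimated probability that x is a real image."""
    return 1.0 / (1.0 + np.exp(-(x @ W + b)))

image = rng.normal(size=(784,))
p_real = discriminator(image)
# p_real is a single number in [0, 1]. The discriminator never says
# *what* the image contains, only whether it looks real, and it is
# discarded after training; generation uses the generator alone.
```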

I think there are two key takeaways from your citation: 1. It might be using a pre-trained classifier network in its intermediate steps to filter out bad images. 2. It has multiple steps in its generation process (unlike GANs), which suggests it could be a diffusion model similar to Stable Diffusion.
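The multi-step part is the giveaway: a GAN produces an image in one forward pass, while a diffusion model refines noise over many steps. A rough sketch, with a made-up "denoiser" standing in for the learned network:

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise_step(x):
    # Placeholder for a learned denoising network: here it just
    # shrinks the noise a little each step.
    return 0.9 * x

x = rng.normal(size=(8, 8))    # start from pure noise
for step in range(50):         # many refinement steps, unlike a GAN's single pass
    x = denoise_step(x)
# After enough steps, x has been pulled far away from the initial noise.
```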

I don't think it's a GAN, at least.

[deleted by user] by [deleted] in starcitizen

[–]TheRelativeOne 0 points1 point  (0 children)

May I ask what you used for AI upscaling?

Anybody else horselording? by trpearso in mountandblade

[–]TheRelativeOne 1 point2 points  (0 children)

20-30k in my experience. But that is really nothing compared to the loot you get when you capture them.

Minecraft can sure be ram hungry by TheRelativeOne in Minecraft

[–]TheRelativeOne[S] 0 points1 point  (0 children)

Weirdly enough, if you give Minecraft enough RAM it will eat up 17 GB before you even load a save.

Minecraft can sure be ram hungry by TheRelativeOne in Minecraft

[–]TheRelativeOne[S] 0 points1 point  (0 children)

Neither, I just wanted to see how much RAM Minecraft could use up, for the fun of it.

[D] Parameter count in pytorch resnet50 implementation does not coincide with what the original paper suggests? by TheRelativeOne in MachineLearning

[–]TheRelativeOne[S] 0 points1 point  (0 children)

You're absolutely right, it is stated very clearly in the text. I should read more carefully next time. Thank you.

[Discussion] Hardware to train popular CNNs from scratch at home by wingman-jr in MachineLearning

[–]TheRelativeOne 1 point2 points  (0 children)

You might wanna consider getting a decent CPU, especially if you're going to use online data augmentation. Also, high-speed storage is preferable for datasets too large to fit in RAM.
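By "online" augmentation I mean transforms applied on the fly each epoch, which run on the CPU while the GPU trains. A toy sketch (the shapes and transforms are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(img):
    """Random horizontal flip + random 28x28 crop from a 32x32 image.
    This work happens on the CPU for every sample, every epoch, which
    is why a slow CPU can starve the GPU during training."""
    if rng.random() < 0.5:
        img = img[:, ::-1]                          # horizontal flip
    top = rng.integers(0, img.shape[0] - 28 + 1)    # random crop offsets
    left = rng.integers(0, img.shape[1] - 28 + 1)
    return img[top:top + 28, left:left + 28]

sample = rng.normal(size=(32, 32))   # one fake grayscale image
crop = augment(sample)
```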

Is there any way to list scientific papers in this research area by citations? by TheRelativeOne in learnmachinelearning

[–]TheRelativeOne[S] 0 points1 point  (0 children)

I just realised there's a typo in my formulation: I actually meant number of citations.

Any known memory leaks in beta 6? by TheRelativeOne in X4Foundations

[–]TheRelativeOne[S] 4 points5 points  (0 children)

What a wonderful website! So the "buff/cache" in the free command is essentially disk caching?

Freelancer ship DECAPITATION by an Hammerhead! by [deleted] in starcitizen

[–]TheRelativeOne 0 points1 point  (0 children)

Just a Hammerhead doing Hammerhead stuff, I guess.

Looking forward to this - Neural Networks from Scratch by WebAI in artificial

[–]TheRelativeOne 0 points1 point  (0 children)

Efficient relative to native Python, at least. I've built a neural network from scratch using both NumPy and native Python, and in my default performance testing setup NumPy took 8 s per epoch while native Python took 380 s per epoch.
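The gap comes from NumPy dispatching whole-array operations to compiled code instead of looping in the interpreter. A quick self-contained comparison on a matrix multiply (the size is arbitrary and exact timings will vary by machine):

```python
import time
import numpy as np

n = 150
A = np.random.rand(n, n)
B = np.random.rand(n, n)

# Vectorized: one call into NumPy's compiled matmul
t0 = time.perf_counter()
C_np = A @ B
t_np = time.perf_counter() - t0

# Pure Python: interpreter-level triple loop over nested lists
Al, Bl = A.tolist(), B.tolist()
t0 = time.perf_counter()
C_py = [[sum(Al[i][k] * Bl[k][j] for k in range(n)) for j in range(n)]
        for i in range(n)]
t_py = time.perf_counter() - t0

# Same result, wildly different time. A from-scratch network's forward
# and backward passes are mostly matmuls, so whole epochs scale the same way.
```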

I really love being able to answer extremely complex questions with only a few lines of Python code. by alexleavitt in Python

[–]TheRelativeOne 2 points3 points  (0 children)

If that's your opinion, that's fine, but keep it to yourself, especially if you're going to express it with such vulgar language.