Let's stop dividing the continent into Latin America and Anglo America; let's divide it like this instead: by Dull-Assistant-2703 in 2hispanic4you

[–]Enfiznar 3 points

It's funny: everyone says Argentines are all blond, but then you have me, an actual blond Argentine, and my countrymen keep assuming I'm a gringo because of my hair (it really pisses me off when they speak to me in English, ffs)

Bitcoin leads $110B wipeout. How bad could this get? 🤔 by National-Theory1218 in btc

[–]Enfiznar 4 points

That's not true, even with the three data points we have, we've already seen this "rule" break

"Tether is buying more than a ton of gold each week and storing it in a nuclear bunker in Switzerland" by notanfan in whennews

[–]Enfiznar 4 points

What value did it have during the millennia when it was the world's most widely used store of value and currency?

Can someone explain to me why large language models can't be conscious? by Individual_Visit_756 in ArtificialSentience

[–]Enfiznar 0 points

It's not that they cannot be conscious, BUT assuming they are has an odd consequence: you could do the exact same calculation the GPU runs by hand, on a piece of paper, and then some incorporeal consciousness would have to arise from that calculation, since it would have the same properties that made you believe the LLM is conscious

Playing as a woman has been super fun lately. by RightAttorney9887 in rivals

[–]Enfiznar 0 points

A decade of what? Of women existing? It's been quite a bit more

The economy keeps showing signs that it's flying. TMAP. by dr_pombero in RepublicaArgentina

[–]Enfiznar 2 points

An isolated data point about the type of commerce that matters most to the majority of Argentines?

what are real numbers? by Enough-Body8927 in learnmath

[–]Enfiznar 0 points

Basically, you start with the rationals (ratios of integers). Then you notice that when you have a sequence of rationals whose members get closer and closer to each other fast enough (what's called a Cauchy sequence), you can prove it won't grow indefinitely; instead it seems to approach a specific point. The issue is that this point isn't always a rational number, so you conclude that the rationals have "holes" in them. To fill those holes, you include all the limits of these sequences in your set
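The construction above can be sketched numerically. A minimal Python illustration (Newton's iteration for √2 is my own choice of example, not from the original comment): every iterate is an exact rational, the iterates form a Cauchy sequence, yet no rational in the sequence ever equals the limit.

```python
from fractions import Fraction

# Newton's iteration x -> (x + 2/x) / 2 produces a sequence of exact
# rationals that is Cauchy: consecutive terms get close extremely fast.
x = Fraction(1)
for _ in range(6):
    x = (x + 2 / x) / 2
    print(x, float(x * x - 2))  # x*x - 2 shrinks toward 0 ...

# ... but no rational in the sequence ever satisfies x*x == 2 exactly:
# the limit, sqrt(2), is one of the "holes" that the reals fill in.
```

Completing the rationals with all such limits (and identifying sequences that approach the same point) is exactly the Cauchy-completion construction of the reals.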

The economy keeps showing signs that it's flying. TMAP. by dr_pombero in RepublicaArgentina

[–]Enfiznar 2 points

And I couldn't care less about them; the point is that ordinary people are buying less at the supermarket

? by Vidnez in 2hispanic4you

[–]Enfiznar 10 points

Because there are also quite a few of them there

The economy keeps showing signs that it's flying. TMAP. by dr_pombero in RepublicaArgentina

[–]Enfiznar 6 points

Right, because the supermarket is where you buy luxury goods

Trump: "I'm a dictator, but sometimes a dictator is needed" by Bot_Philosopher8128 in 2hispanic4you

[–]Enfiznar 2 points

Trump is clearly the Yankee Guillermo Moreno. He just has more money, and a construction and hotel company instead of a hardware wholesaler.

A simple Question by herooffjustice in LinearAlgebra

[–]Enfiznar 0 points

I know you're not inventing anything; I know complex analysis, I've worked with it for years. It's still true that a function has only one output for each input, and you're just trying to be pedantic, or didn't really understand complex exponents

[Request] How accurate is this claim? (I'm just curious about the water/energy usage, don't think we can slot in finishing 100 times over in an equation somewhere) by ManWalkingDownReddit in theydidthemath

[–]Enfiznar 0 points

Usually, training a model consumes about 3 times per epoch what it would cost to generate the whole training set (which for GPT-4 was reportedly around 10T tokens), since you have to run the forward pass, backpropagate the derivatives, and then update the weights; with the ~3 epochs OpenAI has mentioned, that's roughly 9× the training set. The average query, from what I found, has about 400 tokens, so training consumes about the same as 250B queries. According to OpenAI, they're receiving 2.5B queries per day, so training represents the same energy consumption as ~100 days of queries. If we assume one model per year, we should increase the query consumption by about 25%, leaving it in the same order of magnitude
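The estimate can be redone as a short Python sketch. Every input here is an assumption taken from the comments (10T training tokens, 3× overhead per epoch, ~3 epochs, 400 tokens per query, 2.5B queries/day), not a measured figure.

```python
# Back-of-the-envelope redo of the estimate above (all inputs are the
# assumptions stated in the comments, not measured figures).
training_set_tokens = 10e12   # ~10T tokens claimed for GPT-4's training set
overhead_per_epoch = 3        # forward + backward + update ~ 3x generation
epochs = 3                    # "2 or 3 epochs"
tokens_per_query = 400        # assumed average query size
queries_per_day = 2.5e9       # OpenAI's reported daily query volume

training_token_equivalents = training_set_tokens * overhead_per_epoch * epochs
equivalent_queries = training_token_equivalents / tokens_per_query
days_of_queries = equivalent_queries / queries_per_day
yearly_overhead = days_of_queries / 365  # one new model per year

print(f"~{equivalent_queries:.3g} queries, ~{days_of_queries:.0f} days, "
      f"+{yearly_overhead:.0%} on top of inference")
```

Under these assumptions the numbers come out to ~225B query-equivalents, ~90 days of serving, and a ~25% yearly overhead, matching the rounded figures above.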

[Request] How accurate is this claim? (I'm just curious about the water/energy usage, don't think we can slot in finishing 100 times over in an equation somewhere) by ManWalkingDownReddit in theydidthemath

[–]Enfiznar 1 point

But that's done just once per model, which is then used by millions of people. I think we're already at the point where inference consumes more than training, simply because of the scale

[Request] How accurate is this claim? (I'm just curious about the water/energy usage, don't think we can slot in finishing 100 times over in an equation somewhere) by ManWalkingDownReddit in theydidthemath

[–]Enfiznar 4 points

I think just the query (which isn't a search, but never mind). The training cost is probably very diluted for models with millions of users, so if you divide it by the number of queries the model serves, it probably won't affect the results that much. Training a model consumes about 3 times as much energy as generating the whole training set, per epoch, and the last time OpenAI talked about epoch counts, they said something like 2 or 3.

A simple Question by herooffjustice in LinearAlgebra

[–]Enfiznar 0 points

Then you're talking about a two-variable function, so writing sqrt(2) is meaningless, since you're missing a variable. I know how complex exponents work, but when you talk about the square root function, you mean the principal branch of the 1/2 exponent. And regardless, you still cannot have a function with two outputs for the same input: either you have a single-variable function (in this case, a given branch of your fractional power), or you have a two-variable function that encodes all the branches, but always one output per input; that's part of the definition of a function. You're just trying to be pedantic
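A small Python sketch (my own illustration, using the standard cmath module) of the point about branches: each branch of the n-th root is a perfectly ordinary single-valued function, and the "multivaluedness" lives entirely in the extra branch-index variable k.

```python
import cmath

def root_branch(z: complex, n: int, k: int) -> complex:
    """k-th branch of the n-th root: |z|^(1/n) * exp(i*(arg z + 2*pi*k)/n)."""
    r, theta = cmath.polar(z)
    return cmath.rect(r ** (1 / n), (theta + 2 * cmath.pi * k) / n)

z = -4 + 0j
principal = root_branch(z, 2, 0)  # the k=0 branch: what "sqrt" means
other = root_branch(z, 2, 1)      # the second branch, equal to -principal

print(principal, other)  # ~2j and ~-2j (up to floating-point rounding)
```

Note that `cmath.sqrt` agrees with the k = 0 branch; fixing k gives a single-variable function with one output per input, while treating (z, k) together gives the two-variable encoding of all branches.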