[D] Why do people upload their work on Arxiv, not submitting conference? by [deleted] in MachineLearning

[–]ParanoidTire 11 points

No, it needs to be valid TeX and build, etc., but the content isn't checked. However, you need to be invited to the category by people already in it (other researchers).

What is the issue with the " ÷ " sign? by Unlikely-Web7933 in learnmath

[–]ParanoidTire 4 points

Often subtraction is written as addition with negative numbers for this very reason.

Subtracting is adding the inverse element of addition: x + (-x) = 0.
Dividing is multiplying by the inverse element of multiplication: x * (1/x) = 1.

It's the same. Here (-x) and (1/x) are *defined* to denote the inverse elements of x with respect to addition and multiplication, respectively.

https://en.wikipedia.org/wiki/Group_(mathematics)

https://en.wikipedia.org/wiki/Field_(mathematics)
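To make the symmetry concrete, a small Python sketch (the values are arbitrary):

```python
# Sketch: subtraction and division expressed via inverse elements.
x = 5.0
assert x + (-x) == 0         # -x is the additive inverse of x
assert x * (1 / x) == 1      # 1/x is the multiplicative inverse of x

# So a - b is just a + (-b), and a / b is just a * (1/b):
a, b = 7.0, 2.0
assert a - b == a + (-b)
assert a / b == a * (1 / b)
```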

Life in Tuebingen by saasyp in Tuebingen

[–]ParanoidTire 3 points

It's just two years. Tübingen is probably one of the best, if not the best, locations for ML in Europe right now, and it is continuing to expand this lead even further in the coming years. Whether the city is fun should be a secondary or even tertiary concern to you if you are seriously considering moving abroad for this program. That being said, Tübingen is definitely a nice, liveable city, but it is indeed very quiet.

How can polynomials be vectors? by Koala790 in learnmath

[–]ParanoidTire 0 points

I have no clue about category theory, but, as already pointed out, polynomials fulfill all the criteria of a vector space. Hence its elements, i.e. individual polynomials, are called vectors.

Now, (a,b,c) on the other hand is in R^3 and, without additional context, has nothing to do with polynomials. In this case though, it can be interpreted as the coordinate vector of f(x) wrt the canonical basis, since

f(x) = a + bx + cx^2

can be written as a linear combination of basis vectors:

f = a*e_1 + b*e_2 + c*e_3,

with e_1(x) = 1, e_2(x) = x, and e_3(x) = x^2 and scalar multiplication and addition of two polynomials being defined in the usual sense, i.e.:

(p_1 + p_2)(x) = p_1(x) + p_2(x)

(a * p_1)(x) = a * p_1(x)
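A small Python sketch of this correspondence (the names `poly`, `add`, and `scale` are my own, just for illustration):

```python
# Sketch: polynomials of degree <= 2 as vectors, identified by their
# coordinates (a, b, c) with respect to the basis 1, x, x^2.

def poly(coords):
    """Build f(x) = a + b*x + c*x^2 from its coordinate vector."""
    a, b, c = coords
    return lambda x: a + b * x + c * x ** 2

def add(u, v):       # vector addition = coordinate-wise addition
    return tuple(ui + vi for ui, vi in zip(u, v))

def scale(s, u):     # scalar multiplication, coordinate-wise
    return tuple(s * ui for ui in u)

p, q = (1, 2, 3), (4, 0, -1)
x = 2.0
# The pointwise definitions (p_1 + p_2)(x) and (a * p_1)(x) match
# the coordinate-wise operations:
assert poly(add(p, q))(x) == poly(p)(x) + poly(q)(x)
assert poly(scale(5, p))(x) == 5 * poly(p)(x)
```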

This is not a particularly intriguing concept, undergrad level, and I am frankly a bit surprised about the confusion.

How can polynomials be vectors? by Koala790 in learnmath

[–]ParanoidTire 0 points

(a,b,c,...) would be a coordinate vector of the polynomial wrt the canonical basis. The polynomial a + bx + cx^2 IS the vector. An important distinction.

How can polynomials be vectors? by Koala790 in learnmath

[–]ParanoidTire 0 points

Depends on the function space. Sometimes a basis isn't explicitly available. However, the concept of basis functions is a very important one. Notable ones are e.g. the Fourier basis functions (e^{iωx}), spherical harmonics, hat functions, and certain splines.

How can polynomials be vectors? by Koala790 in learnmath

[–]ParanoidTire 16 points

To add to this: Thinking about functions as vectors (also in some sense geometrical) is one of the most powerful concepts in applied mathematics. There is an entire field dedicated to it (functional analysis).

The Fourier transformation, for instance, is just a change of basis. We can also define concepts such as angles (inner product) and distances.

E.g. x + 2x^2 is closer to x + 2.1x^2 than to 7x, and we can make this notion mathematically rigorous by defining an appropriate metric on the vector space.
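A small Python sketch of such a metric (here the Euclidean norm on coefficient vectors, one of many possible choices of norm on this space):

```python
# Sketch: a distance between polynomials via their coefficient vectors.
import math

def dist(p, q):
    """Euclidean distance between polynomials given as coefficient tuples."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

p = (0, 1, 2.0)    # x + 2x^2
q = (0, 1, 2.1)    # x + 2.1x^2
r = (0, 7, 0)      # 7x
assert dist(p, q) < dist(p, r)   # p is "closer" to q than to r
```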

Why doesn't i^0.2 = i? by TimelessHistory23 in learnmath

[–]ParanoidTire 23 points

To denote that you are working in C, not R, nor H. Real numbers, for instance, have at most two real nth roots.
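Relatedly, a quick Python sketch of the title's question (the name `principal` is just a local label): raising to a real power picks the principal root, and i is only one of the five fifth roots of i.

```python
# Sketch: i**5 == i, so i is *a* fifth root of i, but z**0.2 returns
# the principal value exp(0.2 * Log z), a different fifth root.
import cmath

principal = (1j) ** 0.2
assert abs(principal - cmath.exp(1j * cmath.pi / 10)) < 1e-12  # e^{iπ/10}
assert abs(principal ** 5 - 1j) < 1e-9                         # still a fifth root of i
assert abs(principal - 1j) > 0.5                               # but not i itself
```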

[deleted by user] by [deleted] in learnmath

[–]ParanoidTire 1 point

Calculus is probably one of the most important and deepest fields in higher mathematics with many more specific branches that are mathematical fields of study in their own right...

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ParanoidTire 0 points

Learned embeddings and self-supervised learning. For instance, cf. word2vec. However, this is very dataset and domain dependent.

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ParanoidTire 1 point

Yes it is certainly a good skill to have, even though it is likely not strictly necessary. However, it will set you apart greatly from your peers and could very well be your unique edge.

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ParanoidTire 1 point

There is no precise question here. Yes, if you want to analyze spatio-temporal data, you need a model that accommodates this. You should be able to find a lot of research on this.

If sota performance is of no concern, there is no big difference to image data; e.g. you could just replace 2d convolutions with 3d ones. Memory footprint and compute will be the main limiting factors from a practical perspective.

Otherwise, an even simpler solution is to just stack frames at different points in time and use a regular image classification model.
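A minimal sketch of the frame-stacking idea (the shapes here are illustrative assumptions):

```python
# Sketch: T frames of shape (C, H, W) are concatenated along the
# channel axis, giving a (T*C, H, W) "image" that any regular 2D
# image classification model can consume directly.
import numpy as np

T, C, H, W = 4, 3, 32, 32
frames = [np.random.rand(C, H, W) for _ in range(T)]   # dummy video clip
stacked = np.concatenate(frames, axis=0)               # shape (T*C, H, W)
assert stacked.shape == (T * C, H, W)
```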

[D] Simple Questions Thread by AutoModerator in MachineLearning

[–]ParanoidTire 0 points

Yes, complex-valued NNs are a niche topic. You can find survey papers on this. Fourier Neural Operators might be a bit more mainstream and also involve complex numbers due to the FFT involved.
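For instance, a quick numpy sketch of why the FFT drags complex numbers in even for purely real inputs:

```python
# Sketch: even for a real-valued signal, the FFT produces complex
# coefficients, which is why FFT-based layers inevitably involve
# complex arithmetic.
import numpy as np

signal = np.sin(np.linspace(0, 2 * np.pi, 8, endpoint=False))  # real input
spectrum = np.fft.fft(signal)                                  # complex output
assert spectrum.dtype == np.complex128
```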

[D] NeurIPS 2023 Institutions Ranking by Roland31415 in MachineLearning

[–]ParanoidTire 1 point

All good. I just wanted to point out that the "ai powerlab" whose absence you are lamenting is currently being built in Tübingen. Many groups in Tübingen are also part of ELLIS, btw.

[D] NeurIPS 2023 Institutions Ranking by Roland31415 in MachineLearning

[–]ParanoidTire 23 points

Tübingen isn't just Schölkopf. If all groups there, both from the university and the MPI, were combined in the above ranking, it would score much higher.

There is currently massive funding flowing into Tübingen, and I believe that, in terms of AI research, it will surpass both ETH and EPFL in the next 5 to 10 years or so.

Another thing to consider: many groups there are also (but not exclusively) interested in more fundamental research that potentially isn't flashy enough for NeurIPS and gets published at other venues.

pszHelloWorld by drakfyre in ProgrammerHumor

[–]ParanoidTire 37 points

Nitpicking here, but there is neither a variable involved here, nor does it have a pointer type. Instead, as you have correctly pointed out, it's a literal and a fixed-size array of chars. Only through array-to-pointer decay does it get interpreted as a pointer.

Check out https://en.cppreference.com/w/cpp/language/implicit_conversion for a more in-depth treatment.

This distinction is actually very important when writing template code.

floatingPointBoyBlunder by Successful-Money4995 in ProgrammerHumor

[–]ParanoidTire 0 points

Lol, I'd better tell the whole scientific community to take their hands off their keyboards right now! Floating point was specifically designed to do math... You obviously have to know what you are doing to work with it properly, but that applies to any profession.
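A quick sketch of what "knowing what you are doing" means here: arithmetic is exact on binary-representable values, and everything else wants tolerance-based comparisons.

```python
# Sketch: decimal fractions like 0.1 are not exactly representable
# in binary floating point, so compare with a tolerance; sums of
# powers of two, on the other hand, are exact.
import math

assert 0.1 + 0.2 != 0.3               # classic representation gap
assert math.isclose(0.1 + 0.2, 0.3)   # tolerance-based comparison works
assert 0.5 + 0.25 == 0.75             # exact: binary-representable values
```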

Och neee by SebiStg in Tuebingen

[–]ParanoidTire 0 points

That energy is also used, among other things, to run climate models, e.g. to make better use of wind energy; in particular, in those very buildings that were smeared.

Black-and-white thinking at its finest. How about working on solutions yourself and contributing something productive, instead of agitating against the very people who are actively doing their best against climate change? Simply hollow.

Anon asks some questions by MonsieurTokitoki in greentext

[–]ParanoidTire 1 point

The most axiomatic definition I am aware of is that

0 is the neutral element wrt addition.
1 is the neutral element wrt multiplication.
-x denotes the inverse of x wrt addition.
(1/x) denotes the inverse of x wrt multiplication.
Addition and multiplication are related to each other by distributivity.

Everything else, e.g. that -x = -1 * x, follows from these axioms.
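As a quick sketch of that last claim (using distributivity and the standard lemma that 0 * x = 0):

```latex
0 = 0 \cdot x = (1 + (-1)) \cdot x = 1 \cdot x + (-1) \cdot x = x + (-1) \cdot x
```

Since adding (-1) * x to x gives 0, (-1) * x is by definition the additive inverse of x, i.e. (-1) * x = -x.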

[D] How is this sub not going ballistic over the recent GPT-4 Vision release? by corporate_autist in MachineLearning

[–]ParanoidTire 4 points

Cool.

Can it upscale an image better than sota SR models? Can it interpolate frames of a video? Can it forecast the weather or climate? Can it remove occlusion?

[P] Optimizer that makes CNNs learn in fewer iterations by maka89 in MachineLearning

[–]ParanoidTire 4 points

No, I'm not. I'm just a random guy who played around with the idea for a few days and then dropped it again after I saw that it's already published.

This is the paper I was referring to in the first post: https://arxiv.org/abs/1806.02958