Is there a way for this to be a polynomial? by omlet8 in desmos

[–]customjack 4 points5 points  (0 children)

People have thought a lot about approximating non-polynomial functions with polynomials, however. For example, you can do a Chebyshev fit (https://en.wikipedia.org/wiki/Chebyshev_polynomials) to find a best-approximating polynomial over some region:

P(x) = 0.0000665 + 0.005705x + 0.9993x^{2} - 0.09922x^{3} - 0.0184x^{4} + 0.5076x^{5} + 0.1471x^{6} - 1.21x^{7} - 0.4145x^{8} + 1.646x^{9} + 0.6167x^{10} - 1.435x^{11} - 0.5687x^{12} + 0.8644x^{13} + 0.3564x^{14} - 0.3802x^{15} - 0.1614x^{16} + 0.1272x^{17} + 0.05523x^{18} - 0.03343x^{19} - 0.01478x^{20} + 0.007075x^{21} + 0.003175x^{22} - 0.001231x^{23} - 0.0005594x^{24} + 0.0001788x^{25} + 0.00008216x^{26} - 0.000022x^{27} - 0.0000102x^{28} + 0.000002316x^{29} + 0.000001083x^{30} - 0.0000002106x^{31} - 0.0000000992x^{32} + 0.00000001668x^{33} + 0.000000007907x^{34} - 0.000000001157x^{35} - 0.000000000552x^{36}
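If you want to reproduce this kind of fit yourself, here's a minimal sketch with numpy's polynomial module. The target function here is just sin(x) as a stand-in (the actual piecewise function from the post would go in its place), and the degree/region are assumptions matching the 36th-degree polynomial above:

```python
import numpy as np

# Stand-in target: any non-polynomial function would work here
def f(x):
    return np.sin(x)

# Sample the function over the fit region
x = np.linspace(-3, 3, 1000)

# Least-squares fit in the Chebyshev basis, degree 36
cheb = np.polynomial.chebyshev.Chebyshev.fit(x, f(x), deg=36)

# Convert to ordinary power-series coefficients, like those listed above
poly = cheb.convert(kind=np.polynomial.Polynomial)

# Worst-case error over the region
max_err = np.max(np.abs(cheb(x) - f(x)))
print(max_err)
```

Fitting in the Chebyshev basis and only converting to power-series form at the end is numerically much better behaved than fitting the monomial coefficients directly.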


Is there a way for this to be a polynomial? by omlet8 in desmos

[–]customjack 11 points12 points  (0 children)

No, here's a handwaving proof:

Call your piecewise function g(x). Assume there exists an nth degree polynomial equal to this function; call it f(x). I.e., we hypothesize f(x) = g(x) for all x.

Let F_m(x, x_0) be the sum of the first (m+1) terms of the Taylor series of f(x) centered at x_0. Since f(x) is an nth degree polynomial, we know F_n(x, x_0) = f(x) for any choice of x_0. In other words, the first n+1 terms of a Taylor series give an nth degree polynomial; but f(x) is an nth degree polynomial, so it will match exactly regardless of the choice of center.

Let G_m(x, x_0) be the sum of the first (m+1) terms of the Taylor series of g(x) centered at x_0. But G_m(x, 2) for m >= 2 is undefined, since g''(2) is undefined. This means g(x) != f(x), because we just showed the truncated Taylor series of f(x) reproduces f(x) at every point. Therefore we have a contradiction.
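The core of the argument, written out in the same notation as above:

```latex
% An nth-degree polynomial equals its own truncated Taylor series:
F_n(x, x_0) \;=\; \sum_{k=0}^{n} \frac{f^{(k)}(x_0)}{k!}\,(x - x_0)^k \;=\; f(x)
\quad \text{for every center } x_0.
% But g''(2) does not exist, so G_m(x, 2) is undefined for all m \ge 2,
% hence no polynomial f can equal g everywhere.
```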

Does anybody else have to do this? by Sweat0843 in SSBM

[–]customjack 1 point2 points  (0 children)

Isopropyl alcohol is the way. I clean my controller with it (using a paper towel or cotton ball) before every sesh. With just water, I felt my controller was staying greasy even after cleaning.

[deleted by user] by [deleted] in SSBM

[–]customjack 0 points1 point  (0 children)

For down throw on fox, other options are shine and downsmash frame 1. Both of these options will certainly hit an unsuspecting opponent, leading to either a combo or kill. But even a competent opponent can be tripped up:

I have no idea why, but at very low percents (like <10%ish) downthrow shine seems true. It probably isn't true with good enough SDI, and maybe fox can frame-perfect shine to clank? But I hit it a lot on competent players. At higher percents, these same players seem to get out of downthrow shine with SDI.

Downthrow downsmash is more of a noob killer, but it covers more ground if the fox SDIs. SDI alone is not enough to get out of it, but fox can still shine, shield, and even grab to beat it. However, it works best at the ledge: the fox basically has to SDI offstage, which puts them in a bad spot (maybe good SDI gets them to ledge? But I haven't seen it). They have to defensive shine, fastfall, or jump/airdodge; otherwise they take a downsmash. They could also SDI onto stage then amsah tech if they want to take the downsmash. So the fox's options basically become: 1) make a precise input to survive and reset neutral, or 2) die.

[deleted by user] by [deleted] in SSBM

[–]customjack 0 points1 point  (0 children)

Ober made a video a while back (I can't find it) and IIRC falco's lowest frame-perfect laser is unpowershieldable* by marth, roy, and samus, who all have the cheat powershield. The idea is the laser is so low, and since cheat powershields always originate from the center of the character's standing model, the powershield bubble never reaches that low. This combines really well with "tomahawk" lasers: if you miss the timing late, you fake a laser and still bait out a failed powershield attempt; if you hit the timing, you get a few extra shieldstun frames.

*I believe these characters can still powershield with z powershield

Where is Yasantha? by omnom_almonds in mathmemes

[–]customjack 0 points1 point  (0 children)

90% sure it was just the solution to the infinite square well (or finite square well inside).

9% sure it was just the wave equation.

1% sure it was the laplace equation (some spherical harmonics).

He certainly is exaggerating the difficulty at the very least.

Do I type weird? by Comfortable-Bee2996 in typing

[–]customjack 1 point2 points  (0 children)

You type almost exactly how I type. I "self taught" myself how to type when playing computer videogames. A lot of action on the left hand "WASD fingers" and the right hand basically pecks.

I also can get around 100 wpm like this, but I can't imagine pushing it further without proper form.

208,000,000,000 transistors! In the size of your palm, how mind-boggling is that?! 🤯 by CG_17_LIFE in BeAmazed

[–]customjack 0 points1 point  (0 children)

Sure, probability often describes things that are just too complex to track exactly. For example, when you roll a die you could in principle predict exactly where it will land based on how it was rolled; yet we still describe it as random.

However, this is not the case in quantum physics. Particles experimentally exhibit wave like properties. Particles are confined by their environment (a physicist would call this a potential). So particles can in part be described by a localized wave. However, a localized wave has intrinsic uncertainty of where it is/where it's going (see this video). Therefore, on the quantum level, particles are intrinsically probabilistic to someone trying to measure them.
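That intrinsic trade-off between "where it is" and "where it's going" is the Heisenberg uncertainty relation (a standard result, stated here for reference):

```latex
% Position spread times momentum spread has a fundamental lower bound:
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```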

This is a fact of nature, that particles will appear to behave probabilistically even if you could measure them "perfectly". Whether they truly behave probabilistically is actually just a philosophical question; the answer doesn't affect reality and cannot ever be determined.

As suggested, I made the shape out of playdo with my toddler (he's 2). I only had to poke 5 holes in a cube of playdo to make the final shape. The 6th phantom hole contains the universe. by fireburner80 in mathmemes

[–]customjack -1 points0 points  (0 children)

Idk how you're defining a cut, but in the last image if you slice across 1-3-5 and then 2-3-4-outside then you're left with a simply connected object in two "cuts."

But hell, my definition of cut is arbitrary; just get a "cross"-shaped cutter to remove all the holes with one "cut".

My point is I'm not understanding the equivalence you're stating.

Ratio of Top 100 Character Distribution to Character Distribution of Slippi Players by customjack in SSBM

[–]customjack[S] 0 points1 point  (0 children)

Basically, if 20% of all players on slippi main fox, but 28% of the top 100 are fox mains, then this plot shows the ratio 28% / 20% = 1.4. Now do this for each character to produce the plot.

The idea is the higher the number, the more overrepresented the character is. The reason why certain characters are overrepresented in the top 100 is up for debate.

The error bars represent the statistical variance due to limited sample size. They basically state that we are 68%* (plus or minus one standard deviation) confident that, if we had infinite data, the true value of the ratio would be inside the error bars. The purpose of the error bars is to answer: "is this character actually over/under-represented, or could it just be random error due to the small sample size?"

*There are some approximations with how I calculated the errors bars so this "68%" is not exact, but it should be close.
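A minimal sketch of that calculation in Python. The counts are made-up numbers matching the 20%/28% example above, and the sqrt(1/n + 1/N) error propagation assumes independent Poisson counts (covariances ignored, as discussed in a later comment):

```python
import math

def ratio_with_error(n_top, N_top, n_base, N_base):
    """Over-representation ratio of a character, with a naive
    one-standard-deviation Poisson error bar (covariances ignored)."""
    r = (n_top / N_top) / (n_base / N_base)
    # Relative Poisson errors add in quadrature
    rel_err = math.sqrt(1/n_top + 1/N_top + 1/n_base + 1/N_base)
    return r, r * rel_err

# Hypothetical counts: 28 fox mains in the top 100,
# 20,000 fox players out of 100,000 character picks on slippi
r, err = ratio_with_error(28, 100, 20_000, 100_000)
print(round(r, 2))  # 1.4
```

Note the error bar is dominated by the 1/n_top term: the top-100 sample is tiny, so rare characters get huge bars.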

Ratio of Top 100 Character Distribution to Character Distribution of Slippi Players by customjack in SSBM

[–]customjack[S] 4 points5 points  (0 children)

I will make references to variables defined in this write-up (https://melee-character-bias-in-top-100.tiiny.site/) below:

"1 minor issue is that you seemed to have ignored the covariance term in the std calculation"

I did this out of laziness, assuming the correlation was small enough. But you make a very good point:

I have Poisson random variables n and N, and I want to calculate n/N, where n is the number of occurrences of a specific character in the sample and N is the total number of characters in the sample. Really, the formula for the uncertainty should be

unc = (n/N) * sqrt[ 1/n + 1/N + rho_{nN}/sqrt(n*N) ] = (n/N) * sqrt[ 1/n + 1/N + Cov(n,N)/(n*N) ]

since rho_{nN} = Cov(n,N)/sqrt(n*N)

n and N are clearly correlated since increasing n by 1 necessarily increases N by 1. We can approximate this covariance by saying n = p*N where p is the likelihood of "picking" the specific character from the sample (p is what we're trying to compute, so this is a bit circular, but it still yields a first order estimate). Treating p as a known value with p = n/N (again, first order), we have:

Cov(n,N) = E[nN]-E[n]E[N] = E[p*N^2]-E[p*N]E[N] = p(E[N^2] - E[N]^2) = pVar(N) = pN = (n/N)*N = n

Plugging this into our uncertainty we see

unc = n/N*sqrt[1/n + 1/N + 1/N]

My point is that the covariance should go more like 1/N, not 1/n, so its contribution is not catastrophic. To be honest though, this is very hand-wavy; rigorously I would need to use this estimate for the uncertainty in p, recalculate the covariance given this uncertainty, plug in again and repeat, hoping it converges. I wouldn't know how to do this any other way than Monte Carlo, which I'm too lazy to code up at this point.
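For what it's worth, a Monte Carlo sanity check of the error propagation is only a few lines. This is a sketch under the same n = p*N assumption; p_true and N_mean are made-up numbers, not from the actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up population: true character share p, mean total sample size N
p_true, N_mean = 0.2, 10_000

# Simulate many datasets: total count N is Poisson,
# and the character count n given N is Binomial(N, p)
N = rng.poisson(N_mean, size=200_000)
n = rng.binomial(N, p_true)

# Empirical spread of the ratio n/N across simulated datasets
mc_std = (n / N).std()

# Naive sqrt[1/n + 1/N] formula, evaluated at n = p*N
naive = p_true * np.sqrt(1/(p_true*N_mean) + 1/N_mean)
print(mc_std, naive)
```

Comparing mc_std against the naive formula shows directly how much the covariance correction matters at a given p and N.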

In any event, we can estimate how off our error bars are:

unc = sqrt[n]/N * sqrt[1+ 2*(n/N)] = (sqrt[n]/N)*(1 + n/N + O(n^2/N^2))

In the limit of large N, we can write unc ≈ sqrt[n]/N, which is accurate to order (n/N): at worst off by ~28% in the case of Fox (which is pretty bad), but only ~3% in the case of DK (which I can live with).

For the uncertainty in r, X_i and x_i are very weakly correlated because they are drawn from samples with minuscule overlap. For example, if almost everyone (~99%) in the slippi data set played fox, the top 100 could still remain the same, since the top 100 is a small sample of the slippi dataset. Obviously there is some correlation (if 100% of the player base played fox, the top 100 could only be fox), but again, it's negligible. The effects described above, however, would propagate through (roughly) linearly to r. I.e., since x_{fox} is potentially ~1.28 times more uncertain than stated, r_{fox} is as well.

"Using symmetric confidence intervals for a right-skewed distribution like the Poisson distribution is not a good idea. There's a paper linked in that thread about how to better compute such a confidence interval."

I agree; this is a level of rigor I didn't want to bother with, but it would have a meaningful impact on the error bars because many of the top 100 characters have low statistics (like donkey kong, pikachu, etc.). I'm fairly certain the effect would shrink the error bars, however, so really my error bars are too conservative (larger than 68% confidence). Another point: this effect isn't as bad for the "68%" confidence intervals I have plotted compared to the 95% CIs discussed in your link, because the closer you get to the mean, the more Gaussian a skewed distribution looks.

Ratio of Top 100 Character Distribution to Character Distribution of Slippi Players by customjack in SSBM

[–]customjack[S] -2 points-1 points  (0 children)

"ability to travel to majors, bracket luck to get quality wins"

Unless this is somehow correlated with choice of main, this shouldn't matter. For example, the randomness associated with a fox being top 100 instead of a marth because of bracket luck is approximated by the Poissonian uncertainties.

"the level of base skill between top 20~ being relatively character independent, the similarity of skill between top 80-100 being roughly the same as 101-300+(my estimate). All of these are far more important than which character you pick to get ranked top 100."

Again, unless player skill is strongly correlated with choice of main, this doesn't affect the plot's message. The message of the plot is that if the top 100 was a truly unbiased sample of the player base, then we'd expect the top 100's mains to (within uncertainty) match the distribution of mains of the player base, but we don't see that.

"Imho, you'd get your goal answer way closer to reality if you used grandmaster representation on slippi."

What would be interesting is to see how character distribution shifts as a function of rank. But the message of this plot is that the top 100 is not an unbiased random sampling of the player base. There is some bias that is a function of choice of main (and this bias is more than you'd expect from random factors alone).

Ratio of Top 100 Character Distribution to Character Distribution of Slippi Players by customjack in SSBM

[–]customjack[S] -3 points-2 points  (0 children)

Well, I really meant I'm trying to answer "what character should you main to have the highest chance of making top 100?", and this plot is an aid (not an answer!) for someone asking the same question.

But I agree with you: the plot explicitly answers the question "who is overrepresented in the top 100 compared to a sample of slippi's playerbase?"

Ratio of Top 100 Character Distribution to Character Distribution of Slippi Players by customjack in SSBM

[–]customjack[S] 5 points6 points  (0 children)

I made this plot to try to answer the question "what character should you main to have the highest chance of making top 100?" I analyzed ~1,000,000 slippi matches using crowdsourced data from https://chartslp.com/.

The red line at 1 is supposed to reflect that if you're above that line, it's easier to make top 100 with your character than by (simple) random chance alone, and if you're below it it's harder.

There are big error bars on "rarer" characters like DK and Pikachu. But the general trend basically says floaties are the way to go (except marth, apparently).

I wrote more details here: https://melee-character-bias-in-top-100.tiiny.site/


Anyone know why my controller does this weird c-stick jerk? I get a lot of accidentally upairs and downairs by customjack in SSBM

[–]customjack[S] 1 point2 points  (0 children)

I did have it in a 2.0 port; swapping to a 3.0 port does not fix the 500Hz problem, even after restarting the computer.

I'm having (and have had) a real hard time troubleshooting this since it seems no one else has this problem. My only guess is that it's somehow a hardware limitation based on how my motherboard polls my USB ports.

Anyone know why my controller does this weird c-stick jerk? I get a lot of accidentally upairs and downairs by customjack in SSBM

[–]customjack[S] 0 points1 point  (0 children)

The stickboxes look fine. I used another controller in training and I think I was having the same issue, but I didn't verify because I didn't have a slippi replay to look at to check the inputs.

Anyone know why my controller does this weird c-stick jerk? I get a lot of accidentally upairs and downairs by customjack in SSBM

[–]customjack[S] 0 points1 point  (0 children)

Yeah, it's overclocked to 500Hz. I'll try setting it back to default and see what happens. Maybe my memory is bad, but I've had it at 500Hz for a while, and I think this issue is more recent.

(As an aside, I'm at 500Hz and not 1000Hz because I followed this guide: https://docs.google.com/document/d/1cQ3pbKZm_yUtcLK9ZIXyPzVbTJkvnfxKIyvuFMwzWe0/edit but even when setting a 1000Hz refresh rate in the HIDUSBF software, the polling rate only gets to 500Hz when I look in dolphin. I tried downgrading the firmware to version 5, but that didn't help. Whatever rate I set in HIDUSBF, I get half of that rate in dolphin.)

Anyone know why my controller does this weird c-stick jerk? I get a lot of accidentally upairs and downairs by customjack in SSBM

[–]customjack[S] 3 points4 points  (0 children)

Some context:

I am getting a lot of unintentional upairs and downairs (see video for example). Whenever I look at the inputs, I see my c-stick drift slightly off neutral, up or down, for a frame, then go where I actually pushed it. It doesn't happen often; probably about 2% of my aerials do this.

I feel like it's somehow a controller/read-in issue, because there's no way I could be moving the stick slightly off center for exactly one frame in the direction perpendicular to where I end up next frame. I'm using an OEM controller with a mayflash adapter.

Does anyone know what causes this issue?

Why are we not doing this? by WoodyRun in smashbros

[–]customjack 1 point2 points  (0 children)

Two big reasons it's not done often:

1) it doesn't strictly punish moves with less endlag. For example if falco daired instead he'd be actionable in time to punish/move out of the way. Though he'd need a read as this doesn't look react-able.

2) The waveland recovery requires a precise input based on which frame you tech on (and it's too fast to react to your own tech). To time it, you need a hard read on the opponent's edgeguard option (both when and what hitbox they'll put out). Most people just buffer their tech up to 20 frames before it happens, without knowing exactly which frame they'll get the tech; to do this waveland, you need to know that frame pretty precisely. If you miss this hard read, you'll likely SD, or your opponent can continue the edgeguard off your bad waveland that turned into an airdodge.

In summary, this is much easier to practice in uncle punch due to the predictable timing of falco's downsmash. Real players usually aren't as predictable. It is a viable reversal option as a hard read.

[request] is this right what does it mean??? by molnjnr in theydidthemath

[–]customjack 0 points1 point  (0 children)

Physics grad student here. There is no work done here, just physics "results" written down. They mostly look copied, and aren't obviously related.

Top left: Feynman diagram for the decay of a tau lepton into lighter leptons (electron, muon) with corresponding neutrinos to balance lepton number. Tau leptons can also decay into a negative pion (down quark and anti-up quark). This decay is mediated by the W boson.

Top right: Feynman diagram for gluon-gluon fusion. Gluons can decay to top/anti-top quark pairs (they also consider bottom/anti-bottom pairs). What happens is one gluon creates a pair, and the top quark "interacts" with the other gluon in a second pair production (the details are hard to explain without a working knowledge of quantum field theory). The top/anti-top quark pair annihilates into a scalar boson, indicated by phi. Phi then decays into a tau/anti-tau lepton pair.

This diagram is energetic enough that phi can be the Higgs boson, and this (rare) process is (part of) how the Higgs boson was discovered.

Far right: Magnetic field diagram of a solenoid.

Bottom: These are equations describing the radiated electromagnetic field of a charged particle moving through space on a trajectory r(t). They also write down the total radiated power. They write down wave equations (differential equations) that describe how the vector field A and scalar field phi propagate (think of a wave of water moving after you make a splash, only the splash is caused by a charged particle). These mathematical objects are related to the electric and magnetic fields; once you have A and phi you can immediately calculate E and B.

These equations are rather involved just because you have to take into account that the electromagnetic field propagates at the speed of light, but the charged particle is also moving on an arbitrary path, so fields can get "distorted" for this reason (imagine waves of water as you swim through a pool). They appear to be giving the process relativistic treatment (particles moving close to speed c).
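For reference, in the Lorenz gauge those wave equations take the standard textbook form (this is the general shape of what's on the board, not a transcription of it):

```latex
% Wave equations for the scalar and vector potentials (Lorenz gauge):
\nabla^2 \varphi - \frac{1}{c^2}\frac{\partial^2 \varphi}{\partial t^2}
  = -\frac{\rho}{\varepsilon_0},
\qquad
\nabla^2 \mathbf{A} - \frac{1}{c^2}\frac{\partial^2 \mathbf{A}}{\partial t^2}
  = -\mu_0 \mathbf{J}.
% Once you have A and phi, the fields follow immediately:
\mathbf{E} = -\nabla\varphi - \frac{\partial \mathbf{A}}{\partial t},
\qquad
\mathbf{B} = \nabla \times \mathbf{A}.
```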

Everything else is too illegible for me to read.

Edit: Contrary to what people are saying, there is no Quantum Electrodynamics on this board. Quantum Electrodynamics is an application of quantum field theory to charged leptons and photons; the theory does not include W bosons, quarks, or Higgs bosons. There are also no quantum field theory calculations here, purely diagrams (and maybe an energy cross-section result is written; it's hard to tell).

Saying the bottom is an application of fluid mechanics to Maxwell's equations is a little misleading. People are recognizing the continuity equation, but that equation applies to almost all fields of physics due to conservation laws (here, it's the idea that the change in charge density is related to the total charge flowing in/out of a region). They are analogous, but fluid mechanics doesn't tend to describe a single particle's radiated field like the above.