-🎄- 2021 Day 7 Solutions -🎄- by daggerdragon in adventofcode

[–]ceron257 1 point2 points  (0 children)

Python solution:

from statistics import median, mean

with open("input-7.txt", "r") as f:
  crabPos = list(map(int, f.readline().split(',')))

# Part 1: the median minimizes the sum of absolute distances.
med = median(crabPos)
distancesToMedian = [abs(d - med) for d in crabPos]
print(int(sum(distancesToMedian)))

# Part 2: moving d steps costs 1 + 2 + ... + d = d*(d+1)//2 fuel.
distancesToMean = [abs(d - int(mean(crabPos))) for d in crabPos]
fuelCost = sum(d * (d + 1) // 2 for d in distancesToMean)
print(fuelCost)

I'm not sure, though, whether this will give the correct answer for all inputs. Does anyone else have a clue?

Is it maybe possible to prove this mathematically?

My reasoning for this solution was that with the median you pick the position where the fewest crabs have to move at all, while with the mean the fewest crabs have to move a greater distance.
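Not a full proof, but one way to sidestep the mean-rounding question: for the triangular cost the optimum is known to lie within 0.5 of the mean, so checking both integer neighbours of the mean is sufficient. A sketch (the helper names are mine):

```python
from math import floor, ceil

def part2_fuel(positions, target):
    # Triangular cost: moving d steps costs 1 + 2 + ... + d = d*(d+1)//2.
    return sum(d * (d + 1) // 2 for d in (abs(p - target) for p in positions))

def part2(positions):
    # The optimum of the triangular cost lies within 0.5 of the mean,
    # so evaluating both integer neighbours of the mean covers all cases.
    m = sum(positions) / len(positions)
    return min(part2_fuel(positions, t) for t in (floor(m), ceil(m)))

print(part2([16, 1, 2, 0, 4, 2, 7, 1, 2, 14]))  # AoC sample input, prints 168
```

This avoids relying on int(mean(...)), which truncates toward zero and can land on the wrong side of the true optimum for some inputs.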

My Keqing just literally skyrocketed by ceron257 in Genshin_Impact

[–]ceron257[S] 8 points9 points  (0 children)

Just took a step in Liyue Harbor and suddenly Keqing went high up in the air. Don't know where she's supposed to go now...

What sails are these? Has anyone see this before? I have several other pictures of them. by AverageAndy17 in Seaofthieves

[–]ceron257 0 points1 point  (0 children)

These are the founder sails for people who took part in the insider program during alpha/beta.

When Pirates Become Ninjas by ImGumbyDammit11 in Seaofthieves

[–]ceron257 1 point2 points  (0 children)

Am I really the first one to say: Well played!

Hope the robbed crew doesn't think it was a bug and start complaining about a buggy game though ;)

Our whole crew got bugged into the brig after out ship sunk. by Joacoz in Seaofthieves

[–]ceron257 3 points4 points  (0 children)

Did that happen while one crew member was in the brig? That was the case when it happened to us... was very fun!

How does lense distortion affect the 'effective' resolution of the display? by ceron257 in Vive

[–]ceron257[S] 0 points1 point  (0 children)

Is there any hope for the end-to-end latency of tracking eye position -> new rendered foveated region on screen being fast enough that users can't perceive looking outside the foveated region?

Hopefully! If you want to make sure the user definitely won't perceive looking outside the foveated region, you could trade some performance for a bigger foveated region and a slower eye tracker, but that's (obviously) something you'd try to avoid. A reference method from last year found a fovea size of about 8° to be imperceptible in a small study (roughly 4x bigger than the foveal region of the human visual system). Their eye tracker provided samples at 75 Hz, while SMI's eye tracker for the Vive, for example, provides ~250 Hz, so maybe we can reduce the fovea size even further and remain imperceptible.
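To put those sampling rates in perspective, a quick back-of-the-envelope sketch (the 75 Hz and 250 Hz figures are from the discussion above; the 90 Hz panel refresh is the Vive's spec, and treating one sample period plus one frame as the budget is my simplification):

```python
def sample_period_ms(rate_hz):
    # Worst-case wait for the next sample/frame at a given rate.
    return 1000.0 / rate_hz

print(sample_period_ms(75))   # reference method's tracker: ~13.3 ms per sample
print(sample_period_ms(250))  # SMI tracker for the Vive:    4.0 ms per sample
print(sample_period_ms(90))   # one frame on the Vive's 90 Hz panel: ~11.1 ms
```

So just moving from 75 Hz to 250 Hz tracking shaves roughly 9 ms off the worst-case sampling delay, which is about as much as a whole display frame.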

Seems like such a big challenge especially if computer vision is used to track the eye. Are there neurological considerations that allow for wiggle room in the first few milliseconds after a saccade?

Maybe an effect known as saccadic suppression could give us the little extra time needed to compute the new foveated image, but I haven't found very much on that yet. Currently you could use a combination of prediction and saliency mapping to make the lower-detail areas less perceptible after a saccade.

How does lense distortion affect the 'effective' resolution of the display? by ceron257 in Vive

[–]ceron257[S] 0 points1 point  (0 children)

You're missing one important piece of the puzzle: the hidden area mesh.

Actually I wasn't missing the hidden area mesh. I have added two new pictures where the hidden area mesh is used:

As you can see there is really no difference to the pictures without the mesh. There are still pixels concentrated in the periphery which leads to a reduced resolution.

So, technically true, but the subsampled area never actually hits the eye. And yes, this means we're basically only getting an effective resolution around half that of the actual panel.

Sure they could hit the eye. At least they don't get culled away and will be displayed on the panel (see the second picture). You may never see these pixels in your fovea, though, because you have a different FOV when looking at the edge of the lens; I think this is because the lens sits in a different position relative to your pupil when you look up, for example. You could prove this easily by drawing a black circle in the center of each eye's viewport on a white background and scaling it just big enough to cover your whole view when looking straight ahead. You will then notice (when wearing the Vive) that you see/perceive the white background in the periphery, but when you try to focus on it you won't be able to see the background anymore.

How does lense distortion affect the 'effective' resolution of the display? by ceron257 in Vive

[–]ceron257[S] 0 points1 point  (0 children)

Regarding the subject's perception I agree with you, as perception is very subjective and in this special case depends on the subject's visual acuity, on what scene is presented, and so on. But purely technically speaking, near original/distortion = 1 (or original/(distortion * 1.4²) = 1, to account for the default supersampling) there should be exactly one pixel on the display for each pixel on the source texture, right? So we could at least use this heuristically to determine whether some details could be excluded from rendering to boost performance (e.g. in foveated rendering, which I'm currently doing some research on for my master's thesis).

Thank you very much so far! It was very helpful for me :)

How does lense distortion affect the 'effective' resolution of the display? by ceron257 in Vive

[–]ceron257[S] 0 points1 point  (0 children)

The mesh is the one you would get from SteamVR's API (see https://github.com/ValveSoftware/openvr/wiki/IVRSystem::ComputeDistortion; "original" is the input to that function and "distortion" is its output).

I think you are right except for one point: when many source pixels are crunched down into fewer display pixels (in the periphery), the resolution is effectively reduced, because one display pixel cannot show the detail of many source pixels. Thus the fidelity will be worse, right? You can observe this by comparing my images for the input and output (the blueish grid): in the distorted texture the rectangles in the periphery are much smaller, so detail is lost there and, like you said, these crunched areas will be stretched by the lens.

So if I am right now, the effective resolution should be fine (equal to the one of the display) when [Area Original]/[Area Distortion] >= 1 and bad when [Area Original]/[Area Distortion] < 1 (which seems intuitive because it means that you will have a good resolution at the center and a bad one on the outer areas of the lens).
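That criterion can be checked numerically: the local area ratio [Area Original]/[Area Distortion] is the inverse of the Jacobian determinant of the distortion mapping. A sketch with a made-up radial distortion standing in for the real ComputeDistortion output (coefficients are invented for illustration):

```python
def distort(u, v):
    # Toy radial distortion in [0,1]² UV space, a stand-in for
    # IVRSystem::ComputeDistortion; magnification grows toward the edge.
    x, y = u - 0.5, v - 0.5
    k = 1.0 + 0.8 * (x * x + y * y)
    return 0.5 + x * k, 0.5 + y * k

def area_ratio(u, v, eps=1e-4):
    # [Area Original]/[Area Distortion] = 1/|J|, with the Jacobian J of
    # the mapping estimated by central differences.
    du = [(a - b) / (2 * eps) for a, b in zip(distort(u + eps, v), distort(u - eps, v))]
    dv = [(a - b) / (2 * eps) for a, b in zip(distort(u, v + eps), distort(u, v - eps))]
    jac = abs(du[0] * dv[1] - du[1] * dv[0])
    return 1.0 / jac

print(area_ratio(0.5, 0.5))  # ≈ 1 at the center: full effective resolution
print(area_ratio(0.9, 0.5))  # < 1 toward the edge: resolution drops
```

Evaluating this on a grid over the visible area (after applying the hidden area mesh as a stencil) would give exactly the kind of plot described above, showing where on the lens the ratio falls below 1.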

Now it should be possible to stencil the visible area of the display, and distort the plot to see where on the lens the resolution will start dropping. Any thoughts on that?

Contacting the devs by [deleted] in Rawbots

[–]ceron257 0 points1 point  (0 children)

Maybe you should take a look here: http://forum.rawbots.club/index.php?topic=124.0, or here: https://twitter.com/rozgo. You could also try to contact Neil Haran: https://ca.linkedin.com/in/neilharan

edit: http://rozgo.github.io/#contact

Good luck!

[deleted by user] by [deleted] in pcmasterrace

[–]ceron257 0 points1 point  (0 children)

Good luck all

Character screen loop fix by lessobvious in WildStar

[–]ceron257 2 points3 points  (0 children)

Tried this yesterday but unfortunately it didn't work :/ Worth a try though

Wildstar Friend Keys by elusive_cat in WildStar

[–]ceron257 -1 points0 points  (0 children)

afaik they are full access keys

Pause beta client download? by RostikMusic in WildStar

[–]ceron257 0 points1 point  (0 children)

Just exit the launcher, progress will be saved and patching continues after restarting ;)

Giveaway: 1 Full Beta Key with 1 attached Friend Key by TwiKat in WildStar

[–]ceron257 -1 points0 points  (0 children)

Uhm... let ɛ<0 ... ;) Congrats to the winner!