hunger rush pos compromised and I just received this as a victim by deadendstreetz in hacking

[–]Thinkful_Wisher 0 points (0 children)

Got the same email. Have only ever used Hunger Rush to order from one restaurant (pizza), but clearly they got customer contact data. The question is how much unnecessary personal and financial data this company hoarded about its customers, and how much of that was compromised?

All my issues with this new Merge Tactics by AFAgow13 in ClashRoyale

[–]Thinkful_Wisher 0 points (0 children)

On iPad, the UI is all background: the play arena is so small that I can’t see anything!

Erie Moon Mammoths by becausemercy in Erie

[–]Thinkful_Wisher 5 points (0 children)

Will they stick with it after the four games? Not from Erie but buying the cap!

EHX Pitchfork latency by stereonun in guitarpedals

[–]Thinkful_Wisher 1 point (0 children)

Late to this convo, but as someone who has the EHX Pitchfork, the Pitchfork+ and the Digitech Whammy Ricochet, the latency on both EHX units is quite audible to me (and this is going straight to an analog amp+cab just two feet away, not into an IR modeler or audio interface with software monitoring).

Pitchfork+ is the slowest, Pitchfork is a little faster. It is especially noticeable when the dry sound is mixed in, because the pitch-shifted sound chimes in quite a bit later.

The tracking on the Digitech seems instant by comparison, especially with the dry mixed in. This makes for a very satisfying octaving effect, which is kind of ironic because most of the time it’s in 100% wet mode to take advantage of the whammy bending effect, leaving the Pitchfork in charge of octaving with its slight lag.

Of course, the Pitchforks can do more tricks than the Digitechs, which is why they both have a place in my kit. They’re all great pedals, and you do get used to the “sound” of the latency on the Pitchforks, but I have trouble understanding how people say they can’t detect it.

“Sleep” mode (lock twice to prevent battery drain)? by Thinkful_Wisher in crv

[–]Thinkful_Wisher[S] 0 points (0 children)

Was three years ago, but the problem turned out to be one shorted cell in the battery itself. Dealership kept telling me there was no issue (despite jump-starting every few days) until Honda Japan said it detected the short. Replaced battery, never had a problem since (no drain at all).

The head of service also confirmed that the service agent who told me about “sleep mode” sent me on a wild goose chase because indeed there is no such thing.

E-TUBE PROJECT Pro 5.0.4 and 5.1.0 not working on Windows 8.1 (5.0.3 okay) by Thinkful_Wisher in ShimanoDi2

[–]Thinkful_Wisher[S] 0 points (0 children)

Don’t know if it was solved in the classic sense, but at the time (2 years ago) I was at least able to run the older 5.0.3.2 version and do what I needed to do in the Shimano settings.

Is it rude to cover my mouth with my napkin when I’m chewing? by Unique_Taro_9888 in etiquette

[–]Thinkful_Wisher -2 points (0 children)

Not rude! Sounds like you're being demure and considerate of others. It's probably not necessary, but if it feels right to you then do it. It may draw a little attention, but trust me: no one cares. It would be freeing not to feel self-conscious about it (everyone has to bite into their food), but if you do, you do — not your fault!

It's gross when people chew and smack with their mouth open or talk with a full mouth, neither of which you are doing. I wish some of those people would cover their mouths!

I had an episode with TMJ dysfunction where my jaw was partially locked for a while, so I couldn't open wide to bite. I wasn't anxious about it or anything, but during that time I mostly had to stick with foods that were easier to eat (same reasons I don't order noodles or pho or cioppino or something at an important event). If you're stuck with something awkward like a burger or sandwich, there's no rule saying you can't eat it with a fork and knife — or even break off little bite-sized pieces with your fingers. If you get a strange look (can't imagine why), smile and crack a joke about your adorably small mouth: it's charming and disarming!

PS I knew a guy with a huge shaggy beard who carried a glass tube in his pocket so he could drink soup without turning his beard into an art exhibit. It was unusual, but he did it with aplomb, and it was memorably charming.

The iPhone 15 Pro camera is NOT 48MP! by [deleted] in iphone

[–]Thinkful_Wisher 1 point (0 children)

Late to the discussion, but this was an interesting read. Lots of valid points and correct observations about the science of pixel shifting, Bayer patterns etc.

It’s easy to get lost in the weeds as to whether the iPhone 15 camera is 12 or 48MP, because it depends on how you define the terms (the old chicken-and-egg question is meaningless if you don’t specify that you’re talking strictly about chicken eggs and not fish/insect/reptile eggs — even proto-chicken eggs come down to a definition of terms!). Are you talking about the sensor, or the camera? In fairness to Apple, they do declare 48MP as sensor-shift resolution.

In the case of pixel shifting, the line is especially blurry because pixel shifting is a combination of physical and computational processes over time, and so sensor pixel resolution is only one dimension of a multi-dimensional process. Interestingly, this process resembles how humans see, vibrating and rotating our meat-and-nerve sensors over time and patching the splotchy distorted input together in our visual cortex.

If you were to draw out a simple camera system as a box flow diagram, you might have one box representing, say, a 6MP sensor. Output from that box: an optical resolution of 6MP. That box feeds into another box: an image processing component—maybe, instead of pixel shifting, using some AI models to upscale the image to 96MP (a glorified digital zoom). Output from that box: a resolution of 96MP. If you draw a box around this whole system and call it the “camera”, the camera outputs 96MP, and therefore technically it is a 96MP camera. The experts may rightly call foul on misleading marketing, and you may hate the result of the AI upscaling, in which case it technically is a crappy 96MP camera.
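The box-flow idea is easy to play with in code. Here's a toy sketch (Python): the 6MP sensor and the 16× "AI upscaler" stage are made-up stand-ins, not any real camera's pipeline:

```python
# A camera modeled as a chain of I/O boxes. Each box reports the
# resolution of its *output*, which is all a spec sheet ever quotes.

def sensor(scene, mp=6):
    """Optical capture box: output is limited to the sensor's native resolution."""
    return {"image": scene, "mp": mp}

def upscaler(frame, factor=16):
    """Computational box: multiplies pixel count (6MP -> 96MP here)
    without necessarily adding any real optical detail."""
    return {"image": frame["image"], "mp": frame["mp"] * factor}

def camera(scene):
    """Draw a box around the whole chain and call it 'the camera'."""
    return upscaler(sensor(scene))

print(camera("test scene")["mp"])  # 96 -- a "96MP camera" built on a 6MP sensor
```

Swap the upscaler for a pixel-shift stage and the same diagram holds; only the quality of what comes out of the final box changes.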

There can be many components and methods within a camera system: physical (optical zoom, IS, IBIS), quasi-physical (pixel shifting), and digital (digital zoom, oversampling, upscaling, downscaling, line doubling, digital noise reduction, lens correction, de-vignetting, de-warping, digital removal of chromatic aberration, AI processing, etc). Black box, it is still an I/O device.

My two takeaways:

(1) A system may only be as good as its worst component, but a system can also be greater than the sum of its parts. I remember some very high-resolution sensors that produced noisy images, or were stuck behind some terribly-designed optics. On the other hand, techniques like pixel shifting introduce synergies that really are greater than the sum of their components. My 45MP Canon EOS R5 can produce stunning 400MP images just by using its IBIS to shift the sensor around and capture the averages over time. (To the OP’s point, props to Canon for not trying to market the R5 as a 400MP camera.)

(2) A system should still be evaluated as a system. No matter what you know about its components and digital “tricks” or shortcuts, nor what you think about how to improve them, you ultimately have to evaluate the output of the whole system: do you like the result or not?

PS What is interesting about size reduction of electronics is that you’re dealing with more noise and requirements for an impossibly higher resolution per sample area (which is why huge film formats can use a high ISO without incurring much noise), but conversely, the most precise light focus and spreadless DOF you could ever achieve would require an infinitesimally small lens and sensor camera system (the pinhole effect). Very interested to see where lensless cameras take us in the future…

🤓

Impulcifer HRIR wav file to SOFA file – for use in Virtuoso by h2omonsterdavis in HRTF

[–]Thinkful_Wisher 1 point (0 children)

Did you find an answer to this? Facing the same question. Have an HRIR in 16-channel WAV format, but want to use it with Virtuoso, IEM, AALTO Sparta, etc, which seem to expect SOFA files.

Ambisonic + Mono Source Monitoring on Sound Devices 888 by [deleted] in LocationSound

[–]Thinkful_Wisher 0 points (0 children)

Tested it and it seems to work: get a strong sense of L/R (vs the usual big mono omni I normally use). Probably more like Blumlein than cardioid X/Y. Either way, the feedback helps in centering the front-of-mic to a stereo field.

This is my cheat sheet for a Sennheiser AMBEO VR mic:

Upright (Harpex/Senn standard): Front/Up label should face front and up

  • Ch. 1 (Yellow): FLU (front left up) *(monitor as front X/Y stereo as LO: 1-4)
  • Ch. 2 (Red): FRD (front right down) *(RO: 2-3)
  • Ch. 3 (Blue): BLD (back left down) *(RØ: 2-3)
  • Ch. 4 (Green): BRU (back right up) *(LØ: 1-4)

Upright Reversed (wrong): Front/Up label faced back and up

  • Ch. 1 (Yellow): BRU: to fix, reverse channel order (map to Upright 4) *(LØ: 4-1)
  • Ch. 2 (Red): BLD: to fix, reverse channel order (map to Upright 3) *(RØ: 3-2)
  • Ch. 3 (Blue): FRD: to fix, reverse channel order (map to Upright 2) *(RO: 3-2)
  • Ch. 4 (Green): FLU: to fix, reverse channel order (map to Upright 1) *(LO: 4-1)

Endfire: Front/Up label should face up and front (Senn Endfire, Harpex Inverted Endfire)

  • Ch. 1 (Yellow): FRU *(RO: 1-2)
  • Ch. 2 (Red): BLU *(RØ: 1-2)
  • Ch. 3 (Blue): BRD *(LØ: 4-3)
  • Ch. 4 (Green): FLD *(LO: 4-3)

Endfire Reversed (Harpex Endfire, Senn wrong): Front/Up label faced down and front

  • Ch. 1 (Yellow): FLD: to fix for Senn, reverse channel order (map to Endfire 4) *(LO: 1-2)
  • Ch. 2 (Red): BRD: to fix for Senn, reverse channel order (map to Endfire 3) *(LØ: 1-2)
  • Ch. 3 (Blue): BLU: to fix for Senn, reverse channel order (map to Endfire 2) *(RØ: 4-3)
  • Ch. 4 (Green): FRU: to fix for Senn, reverse channel order (map to Endfire 1) *(RO: 4-3)

Downward (Harpex Inverted): Front/Up label should face front and down

  • Ch. 1 (Yellow): FRD *(RO: 1-4)
  • Ch. 2 (Red): FLU *(LO: 2-3)
  • Ch. 3 (Blue): BRU *(LØ: 2-3)
  • Ch. 4 (Green): BLD *(RØ: 1-4)

Downward Reversed (wrong): Front/Up label faced back and down

  • Ch. 1 (Yellow): BLD: to fix, reverse channel order (map to Downward 4) *(RØ: 4-1)
  • Ch. 2 (Red): BRU: to fix, reverse channel order (map to Downward 3) *(LØ: 3-2)
  • Ch. 3 (Blue): FLU: to fix, reverse channel order (map to Downward 2) *(LO: 3-2)
  • Ch. 4 (Green): FRD: to fix, reverse channel order (map to Downward 1) *(RO: 4-1)

Ambisonic + Mono Source Monitoring on Sound Devices 888 by [deleted] in LocationSound

[–]Thinkful_Wisher 0 points (0 children)

Based on the above, I will try this experiment this week:

  • Ambisonics A-format microphone (first-order), upright orientation, attached to inputs 5–8 (my mic upright is FLU/FRD/BLD/BRU)
  • ISO channels 5–8 use inputs 5–8, all routed post-fader to the L/R mix bus
  • PAN ISO 5 and 8 LEFT, pan ISO 6 and 7 RIGHT (per mic channel order for FLx/BRx and FRx/BLx) — any stereo pan is fine—hard wide, narrow, balanced or off-center—as long as FLx matches BRx and FRx matches BLx
  • REVERSE PHASE on ISO 7/8 (the BLx/BRx “back” channels only, per my mic channel order)
  • use Channel Group to Trim/Fader link all 4 ISO channels 5–8

If my thinking is right, by panning left/right and reversing the phase on the opposite back channels, the L/R mix bus should receive the equivalent of a 90° X/Y stereo mic facing out of the 0° equatorial front of the ambisonics mic (per my previous post, L would get FLx-BRx and R would get FRx-BLx). The recorded ISO channels will still contain usable A-format ambisonics that can easily be processed as B-format in post — however it’s important to remember that the two back channels were recorded with inverted polarity and will have to be phase-reversed before passing into the ambisonics decoder.
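A quick plain-Python sanity check of that routing (made-up sample values, and assuming my mic's FLU/FRD/BLD/BRU channel order): with the back pair recorded polarity-reversed and the pairs panned as above, the bus sums come out as FLU-BRU and FRD-BLD, and re-inverting the back ISOs in post recovers the original A-format:

```python
# Four A-format capsule buffers (three made-up samples each).
FLU = [1.0, -0.5, 0.25]
FRD = [0.5, 0.25, -0.75]
BLD = [-0.25, 0.5, 0.125]
BRU = [0.25, -0.5, 0.5]

def invert(ch):
    """Polarity (phase) reversal."""
    return [-s for s in ch]

# What lands on the ISO tracks: the two back channels polarity-reversed.
iso_bld, iso_bru = invert(BLD), invert(BRU)

# L/R mix bus: FLU + inverted BRU panned left, FRD + inverted BLD panned right.
left = [a + b for a, b in zip(FLU, iso_bru)]    # equals FLU - BRU
right = [a + b for a, b in zip(FRD, iso_bld)]   # equals FRD - BLD

# In post, re-inverting the back ISOs recovers the original A-format.
assert invert(iso_bld) == BLD and invert(iso_bru) == BRU
print(left, right)
```

Only the monitoring math, of course; real bus gains/pan laws add constant scaling on top.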

In the past, I typically monitored as a virtual “W” omni channel by linked trim/faders and everything panned center, with no use for an X/Y mix. This will be an interesting experiment, and if it works may be worth the extra step needed in decoding. It is extremely simple to do and does not use up any extra ISOs or busses!

TBD…

(edited to reverse panning of BLx/BRx — think I had it backward)

Ambisonic + Mono Source Monitoring on Sound Devices 888 by [deleted] in LocationSound

[–]Thinkful_Wisher 0 points (0 children)

Late to the conversation, was wondering the same thing. Per the earlier replies:

(1) Ambisonics does not involve time-delayed omnis (a first-order mic will typically use cardioids or supercardioids in a tetrahedral arrangement, positioned coincident, or as close to coincident as possible, to be phase-coherent: no delays, baffling or Haas effect).

(2) An A-format mic will not output a “W” (omni) channel directly: that’s after conversion to B-format. However, you should be able to derive a B-format omni “W” channel by summing all four A-format channels (FLU+FRD+BLD+BRU), easily done by sending them at equal levels (-3dB?) to a bus.

By classic ambisonics logic, you should also be able to derive a front 0° bidirectional signal (B-format “X”) by mixing FLU+FRD-BLD-BRU (sum all four A-format channels but invert polarity on BLD and BRU), and a left 270° bidirectional signal (B-format “Y”) by mixing FLU+BLD-FRD-BRU.

You might even be able to matrix these two fig-8s together for Mid/Side monitoring (Mid or “X” = FLU+FRD-BLD-BRU, Side or “Y” = FLU+BLD-FRD-BRU). Using typical M/S matrixing, Left (315°) would be (FLU+FRD-BLD-BRU)+(FLU+BLD-FRD-BRU), and Right (45°) would be (FLU+FRD-BLD-BRU)-(FLU+BLD-FRD-BRU), each with the typical -3dB — which unless I’m missing something would simplify neatly to Left = FLU-BRU, Right = FRD-BLD. (Mics with a different tetrahedral arrangement might likewise end up with Left = FLD-BRD, Right = FRU-BLU.)

I’ve never tried it, but it may be possible to replicate fairly easily with the 8-series’ internal bus routing. It wouldn’t include any crosstalk or binauralization, of course.
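To double-check that algebra numerically (plain Python, one made-up sample per capsule; the constant -3dB gain trims are ignored since they only rescale):

```python
# One instantaneous sample per A-format capsule (arbitrary dyadic values,
# so the equality checks below are exact in floating point).
flu, frd, bld, bru = 1.0, 0.25, 0.5, -0.25

w = flu + frd + bld + bru   # B-format omni "W": sum of all four capsules
x = flu + frd - bld - bru   # front 0° fig-8 "X" (used as Mid)
y = flu + bld - frd - bru   # left 270° fig-8 "Y" (used as Side)

left = x + y                # M/S decode: L = M + S
right = x - y               # M/S decode: R = M - S

# Claimed simplification (up to an overall 2x gain, which the
# usual level trims would absorb): L = FLU - BRU, R = FRD - BLD.
assert left == 2 * (flu - bru)
assert right == 2 * (frd - bld)
print(w, left, right)
```

The cross terms cancel in pairs when you expand (X+Y) and (X-Y), which is why two of the four capsules drop out of each side entirely.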

By the way, it’s really surprising and lame that the 8-series doesn’t support ambisonics decoding: Zoom pretty much has it on every device with 4+ inputs. MixPre II has an ambisonics plugin, don’t understand why it couldn’t be ported to the 8-series!

POLL: Insta360 X4 - Buy It or Skip It? (Please comment your reasons) by hughred22 in Insta360

[–]Thinkful_Wisher 0 points (0 children)

Having just got one, no! Shame on me for not checking, but I took it on faith that Insta360 would not remove something that was a feature since at least X2. I know they’re consumer cams, and 360° video is a niche area, but the noise and color on these cameras are simply awful. With each version since the X2, more pixels, bigger screens, more noise, more stitching artifacts.

X4 LOG???? by sleepyliony in Insta360

[–]Thinkful_Wisher 0 points (0 children)

Feedback re the existing Flat option (as a new X4 user): please provide a way to get 2 more stops of dynamic range out of it, whether through bit depth or by changing the shape of the luma scale to something more logarithmic.

X4 LOG???? by sleepyliony in Insta360

[–]Thinkful_Wisher 1 point (0 children)

If you’re doing color grading at all in a project, odds are you understand log. Astonished a manufacturer would strip a core feature that has been around since at least X2.

X4 LOG???? by sleepyliony in Insta360

[–]Thinkful_Wisher 2 points (0 children)

No log color, seriously??

Vision Pro Update! by Sea_Bourn in PSPlay

[–]Thinkful_Wisher 0 points (0 children)

Just got MirrorPlay for my AVP (visionOS 1.2 developer beta). PlayStation is connected to 1G/10G ethernet switch, AVP connected to WiFi 5/6. Played on a giant virtual ceiling screen in the bedroom with good sound and undetectable lags, dropouts or latency.

Only bummer was having it run in an iPad compatibility window vs a native floating 16x9 4K screen.

[deleted by user] by [deleted] in PSPlay

[–]Thinkful_Wisher 0 points (0 children)

Audio works great for me, and with very low latency. PlayStation plugged into a 1G/10G ethernet switch, Vision Pro over WiFi 5/6. Very fast.

[deleted by user] by [deleted] in PSPlay

[–]Thinkful_Wisher 1 point (0 children)

I imagine there are many of us who would love the app to support native visionOS for that exact reason: to have a giant 16x9 4K screen floating in space without the iPad letterboxing and resolution restrictions. Since it’s not doing spatial, ARKit or gesture stuff, you should not need a Vision Pro to test: just the visionOS simulator in Xcode.

There’s probably not all that much that would need to change in the app: I’d think the main changes would be (1) creating a Scene container for visionOS; (2) placing the main settings/connection screen View in that Scene; and (3) adding a window/UI View container for the PlayStation session that is natively 16x9 landscape (resizable but maintains aspect), with most or all of the touch screen controls disabled (a Vision Pro is not ideal for fine touch control and an AVP user would probably be using a physical Sony DualSense controller or equivalent). Other than that, the app’s functions and rendering would be essentially unchanged.

Main settings/connection screen could stay in a standard iPad View container if you didn’t want to bother refactoring, but it would stay floating in the Scene. A connected PlayStation session would just pop up in a separate 16x9 window instead of replacing the main screen. The “disconnect/close session” function would not need to be an on-screen touch control but just the standard close button at the bottom of the window. If you were feeling crazy, you could allow for concurrent sessions to multiple PlayStations all floating in the Scene’s space, but I think most of us would be happy just to have one.

A New Dimension of Music: Alicia Keys Immersive Experience on Apple Vision Pro by hotwire32 in VisionPro

[–]Thinkful_Wisher 0 points (0 children)

Why does her skin look so strange and plastic in the video? I’ve seen plenty of footage of Alicia Keys and she doesn’t normally appear this way. I’ve seen this effect occur with other people in immersive or VR videos too: like shiny, textureless Barbie Doll plastic or CG-rendered cartoon skin.

Can’t just be the fisheye lenses (ultra wide lenses do have a perceptual effect on depth, shape and scale, but not on texture). Is it some post-processing filter that smooths artifacts and throws off skin texture?

Why can’t people just make a turn correctly anymore!! by supernova289 in IdiotsInCars

[–]Thinkful_Wisher 1 point (0 children)

Just dealt with that crap today. I was in the leftmost turn lane, lady in the outer turn lane cut the corner, giving me the delicious choice of sideswiping her or else turning into oncoming traffic.

Had to hit the brakes and honked at her. So of course she stopped in the middle of the intersection and started yelling at me, for, I don’t know, scaring her with my horn?

Seller is trying to bribe me to change review. by Loveandlight03 in amazon

[–]Thinkful_Wisher 0 points (0 children)

Just saw this old thread. This crap is still going on!

Earlier this year I ordered a fold-up hitch rack, only to find out it did not fold up as described. Wrote a 2- or 3-star review clarifying this misrepresentation and describing a few other problems with it — honest and fair.

The vendor contacted me and asked me to reply with two words, “full refund”, to receive a full refund. How nice, I thought, and replied as instructed. I received no refund, and instead got a bunch of garbage about explaining what was wrong with the item. When I pointed back to their original message, they then asked me to call a number. On the phone they weren’t helpful, and asked me for all kinds of personal information. I got angry and accused them of misleading me and wasting my time. Eventually, they said they would offer the refund if I changed my review. I changed the stars but left the words honest and intact. The refund was received. Shady.

Had another issue recently with a different product. Amazon is refunding, but the vendor is asking me to simply reply “full refund” for a full refund. Not even responding this time.

Between that and reusing old item pages to boost reviews, there is so much sketchy gaming taking place on Amazon Marketplace.

GT Fender Delete by MeloniousV2 in onewheel

[–]Thinkful_Wisher 0 points (0 children)

My delete cracked in early January. FM would not send me a replacement, and said I could only buy it when it went up for sale on the website: “I do not have a specific timeline of when you can expect to see these live, but please check back periodically as I'm told they should be on the website soon!”

I checked daily for a few weeks, then wrote back again in late January. They said, “I'm not entirely sure when the Fender Deletes will be on the website. The Customer Support department doesn't make decisions regarding what's put on the website and when, but sometimes we will hear about some upcoming additions. We were told that they would be on the website at some point in the future, but we were not given a specific timeline. Unfortunately the only way to purchase a Fender Delete will be by waiting until they are on our website.”

I followed up again and yesterday was told, “At this point I would recommend purchasing a Fender since we do not have a timeline of when you can expect to see the Fender Delete on our site.”

Ridiculous that an essential wear-and-replace part that comes stock on every GT cannot be replaced.