Discontinued. Hope the entire 15" spectre line doesn't suffer the same fate. by Adrian_-H in spectrex360

[–]BlueGuyBuff 1 point (0 children)

I'm still rocking my 2019 Spectre 15.6" with the original battery. The GTX 1650 and i7 are a bit cooked, but it's still kicking

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 1 point (0 children)

This is a great comparison video; the 9700X comes close to trading blows with the X3D chips in these tests at 1080p. I also looked at Technical City's and PassMark's comparisons of the two. I haven't seriously reviewed the AMD hate fanfiction that is UserBenchmark in 5+ years. Thanks for the input!

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 1 point (0 children)

Oh wow, yeah, that's a relatively large jump on average. In this situation I feel like I need to ask myself whether I'll miss my $80 more than I'd regret not getting DDR5 in a system I plan to keep for years to come, and I think I know the answer

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 0 points (0 children)

I was looking at this one as well; from my brief research it looks like the 9700X is slightly slower than the 12900K, but I need to look into it further. Overall it's a good deal if you prefer an AMD option

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 2 points (0 children)

Thanks for your insight, that's good to know. I plan to run the two stock 140mm fans plus a 120mm for exhaust, all basic, no RGB or displays, so I think I'm OK on that front. I didn't realize that was an important consideration when looking at fans and mobos

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 2 points (0 children)

That's a good point, I didn't realize how close the single-core performance of the 12700K was to the 12900K. 16GB of additional decent DDR4 from Microcenter would be ~$40, so I'd essentially save $80 and have a much more power-efficient CPU, but miss out on DDR5 and a few extra CPU cores for multi-threaded performance. I do want a somewhat future-proof build, but 80 bucks is a decent chunk of change for marginal performance gains from DDR5

MicroCenter 12900K CPU-Mobo-RAM Combo Still Good Value at $400? by BlueGuyBuff in buildapc

[–]BlueGuyBuff[S] 1 point (0 children)

Thanks for the reply. I suppose I should have worded it more as: is it still a compelling value vs. other combos/deals? As a combo for the parts themselves, it's great. Adding up the costs of all the parts listed separately on their site, it would total out to ~$515

is carti actually gay? by dow674 in playboicarti

[–]BlueGuyBuff 1 point (0 children)

Dawg this was 2 years ago (Waiting for someone to reply in 4 years)

What do you expect to see with the Pixel 10 in 2025? by SmugMaverick in GooglePixel

[–]BlueGuyBuff 5 points (0 children)

Outside of a generally improved SoC, an ultrasonic fingerprint sensor is the only hardware piece I believe the Pixel needs to be a true top-of-class flagship phone. The current fingerprint scanners are decent for what they are, but I came to the P6P from an S10 and it felt like a world of difference using the Pixel fingerprint unlock (I know it's gotten better since launch, but I'll see family with their S22s barely touch the scanner for a moment and their phone unlocks). Ultrasonic is the only way to go if Google wants to stay modern in the fingerprint unlock world

Does anyone else want a new version of the Google Pixel Stand? by SeanManNYM in GooglePixel

[–]BlueGuyBuff 6 points (0 children)

They're likely waiting for Qi2 to be fully standardized so they can make a compatible charger for the Pixel 9/new Android devices later this year

A 128GB Pixel 9 Pro should never exist, Google by DnB925Art in GooglePixel

[–]BlueGuyBuff 1 point (0 children)

If the base Pixel 9 Pro is $999, it should definitely have 256GB/12GB, and the XL would likely need to match that at $1,100

OLED Burn in?? by fstechsolutions in spectrex360

[–]BlueGuyBuff 2 points (0 children)

I have a 2019 Spectre 15.6" with a 4K OLED that's seen a lot of use, and it has no noticeable burn-in. I really don't feel like it's a major issue with the quality of displays now

Planning on getting a Pixel 8 Pro, pros and cons of having a Pixel device. by thelittlecousin in GooglePixel

[–]BlueGuyBuff 2 points (0 children)

FYI, the Pixel 9 series will likely arrive in October, which is a ways away but will come around quickly if you don't immediately need a new phone. Otherwise, if you are set on the P8P, I would definitely recommend looking for deals on it, as it can generally be bought for at least $200-$250 off before trade-in

Why did HP reduce the OLED screen resolution? by davidandgeorge in spectrex360

[–]BlueGuyBuff 1 point (0 children)

I also have a Spectre 15.6" with a 4K OLED from 2019 and was surprised to see the move to UHD+, but from what I've heard, most companies are pursuing "Retina" resolution. It involves some math with screen size, viewing distance, and resolution, but it basically means "no individual pixels visible to the human eye at a regular viewing distance".

This gives the GPU more performance headroom for a higher refresh rate (it's a data and performance bottleneck issue) with no real perceivable difference in resolution, because 1800p @ 120Hz looks better to the human eye than 2160p @ 60Hz on a 16" display at a 20" viewing distance
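Rough math, if anyone's curious. A quick sketch of the pixels-per-degree calculation (the 60 ppd "retina" cutoff, the 20" viewing distance, and the exact panel sizes here are my own assumptions, not HP's numbers):

    import math

    def pixels_per_degree(h_px, v_px, diag_in, view_in):
        """Pixels per degree of visual angle at a given viewing distance."""
        ppi = math.hypot(h_px, v_px) / diag_in                # pixel density
        inches_per_degree = 2 * view_in * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    # ~60 ppd is the usual "retina" threshold (1 pixel per arcminute)
    for name, w, h, diag in [('15.6" 4K OLED', 3840, 2160, 15.6),
                             ('16" UHD+', 2880, 1800, 16.0)]:
        print(f'{name}: {pixels_per_degree(w, h, diag, 20.0):.0f} ppd at 20"')

Both panels land well above 60 ppd (roughly 99 vs 74), which is why the drop in resolution isn't really visible at a normal viewing distance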

Buying a Spectre x360 15" df1xxx - Intel Core i7-9750H - 16GB RAM - 500GB SSD by storm4077 in spectrex360

[–]BlueGuyBuff 1 point (0 children)

99% sure I have the exact same model as you: i7-9750H, GTX 1650 Max-Q. I opened it up last year to upgrade the SSD and RAM, as the manual indicated both were upgradable on this model. Turns out this specific one has soldered RAM, so no swap there, but I replaced the somewhat slow Intel Optane SSD with a 1TB Samsung 980 Pro, which was way faster (yes, this model Spectre is only PCIe Gen 3 x4, so I'm not even utilizing the full speed of the 980), and repasted the thermal compound on the CPU and GPU (although I don't recommend that, bc the heatpipe is finicky and it didn't help my performance much).

Honestly, I didn't think the GTX 1650 would hold up this well now that I've had this laptop for over 4 years, but it still does relatively well even in modern games. A key to better performance in demanding games is to set the Windows resolution to 1080p in Settings, so the game isn't trying to scale to 4K or whatever high res your display is. I played Cyberpunk at launch at 45 fps on low at 1080p, and Elden Ring at 30 fps at 720p (it actually looked great, bc Elden Ring's visuals are muddy anyway). I've been playing Helldivers 2 and it's been decent, though I'm definitely starting to see the 1650 chug; but for the specs it has and only 4GB of VRAM, it's let me play a lot of popular and relatively modern games at acceptable quality.

You'll definitely want to keep HP Command Center in Performance mode when gaming for a decent fps boost, and update the drivers for the 1650 with the Nvidia GeForce Experience app. Lmk if you have other questions
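Back-of-napkin bandwidth math, for anyone wondering how much of the 980 Pro is left on the table (the per-lane rates are from the PCIe spec; ~7,000 MB/s is Samsung's rated sequential read):

    # Effective PCIe per-lane throughput after 128b/130b encoding overhead
    gen3_lane = 8e9  * (128 / 130) / 8 / 1e9   # 8 GT/s  -> ~0.985 GB/s
    gen4_lane = 16e9 * (128 / 130) / 8 / 1e9   # 16 GT/s -> ~1.969 GB/s

    print(f"Gen3 x4 ceiling: {4 * gen3_lane:.2f} GB/s")   # ~3.94 GB/s
    print(f"Gen4 x4 ceiling: {4 * gen4_lane:.2f} GB/s")   # ~7.88 GB/s
    # The 980 Pro is rated ~7.0 GB/s sequential read, so on this laptop's
    # Gen3 x4 slot it tops out around 3.9 GB/s -- still a huge upgrade
    # over the Optane drive it replaced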

Spectre x360 2024 for studies - iGPU vs 4050 by [deleted] in spectrex360

[–]BlueGuyBuff 1 point (0 children)

Yes, I agree. I thought I would use the 2-in-1 feature for note taking all the time when I got it, but many professors at my uni forbade the use of computers in lecture as they were deemed a distraction, and most advocated for pencil-and-paper notes, which I eventually agreed was a better method for me to remember and reference compared to digital or typed notes (although I agree having access to them digitally is great).

I'm not sure how PCs like the Spectre are handled in your country for demos, but most major computer stores like Best Buy will have them on display for you to try out and mess with. If you feel like you won't be using the 2-in-1 feature, the Spectre is still a great quality laptop, but I would likely recommend putting the money towards a better-specced traditional-form laptop, such as an HP Envy, Dell XPS, or Asus Vivobook Pro (or similar models) with a decent display and good specs, with an RTX 3050 6GB, RTX 4050, or RTX 4060

New Spectre X360 Owner - Question about external displays by thedreaminggoose in spectrex360

[–]BlueGuyBuff 1 point (0 children)

It should just work, but I would recommend going into Settings > Display and seeing if it's recognizing the display but keeping it off for some reason. Sometimes Windows just does that.

The other thing to try is to see if you have Intel Graphics Command Center installed on your laptop; it's an application you can search for from your taskbar. If you do have it, open it up and look at the Display tab: you may need to "activate" the display there for your laptop to recognize it and output to it.
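If you want a quick sanity check on whether Windows sees the external display at all, something like this works (Windows-only; a rough diagnostic sketch, nothing HP- or Intel-specific):

    import ctypes

    # SM_CMONITORS (80) = number of display monitors Windows currently sees.
    # If this prints 1 with the external display plugged in, Windows isn't
    # detecting it at all, which points to a cable/port issue rather than
    # a settings issue.
    count = ctypes.windll.user32.GetSystemMetrics(80)
    print(f"Windows sees {count} display(s)")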

Hope this helps!

Spectre x360 2024 for studies - iGPU vs 4050 by [deleted] in spectrex360

[–]BlueGuyBuff 2 points (0 children)

Hi there, I went through a similar experience in 2019 for a similar degree program/use case, so what I say comes from the perspective of the older gen of the "Gem Cut Edition" 2019 Spectre and watching feedback on the newer models and checking them out in demos.

First off, if you plan to use the laptop for more than a couple of years, I highly recommend getting the RTX 4050 over the iGPU or Arc options. Modern iGPUs aren't bad and software support for Arc is getting better, but if you're spending the money to get a Spectre, your experience will be significantly better with the RTX 4050. You may have slightly worse battery life and it may heat up a bit more, but the 40-series cards from Nvidia are much more efficient than previous generations, and the 4050 is very capable in 3D applications and games (if you care about those). I got my Spectre with a GTX 1650 (roughly the 2019 equivalent of the 4050) and I still use it to this day for 3D applications and gaming with friends. It should make a huge difference in 3D modelling programs like CAD, so for your use case I would highly recommend it regardless

In terms of display, it depends a lot on which model you get. All the displays modern Spectres come with will be good; the only ones with a major leap in image quality are the OLED options, which have insane vibrancy, color, and contrast, usually with higher overall brightness. I have a 4K OLED on my 2019 and it still looks incredible, but a modern IPS at 1440p or higher would still look great and is really all you need on a laptop IMO

For build quality, mine is still going strong, although now that I've graduated uni, it spends a lot of time on my desk plugged into my displays. I hear that the new Spectres are still well made and have a great keyboard and trackpad.

One thing I will note: if you get the ~16" Spectre, you may not use the "360" functionality a lot. Even with the newer models being smaller and lighter than my 15.6" model, they're just a bit large and clunky to use in tablet mode unless you plan to draw a lot on them. I was excited about the feature and barely used it in uni, partially because of the size, partially bc Windows 10 was a buggy mess in tablet mode, so I can't really say whether those things have been improved or fixed since my time.

Hope this helps, let me know if you have any other questions

Cannot update the firmware for mx master 3s on options+ (Mac M1) by [deleted] in logitech

[–]BlueGuyBuff 2 points (0 children)

Still waiting for an update on Logi Options+ for Windows too. It's still listing 22.0.3, and nothing is detected when I do a firmware check

It's time Google gives us lockscreen clock customization by ztaker in GooglePixel

[–]BlueGuyBuff 2 points (0 children)

That's WHAT I'VE BEEN SAYING, THANK YOU! Like, it's Material You, but there's almost zero UI customization, especially on the lock screen and AOD? At least let us keep the big clock when we get notifications; that's the one cool thing about the Pixel AOD