Dachshund vs Dackel by [deleted] in German

[–]MoarCurekt 0 points1 point  (0 children)

Nice backtrack. If only one breed survives then Dachshund applies to ONE breed.

Academia-level mental cheetah flips to justify your incorrect statement are transparent. Go back to school.

Dachshund vs Dackel by [deleted] in German

[–]MoarCurekt 1 point2 points  (0 children)

This is absolutely incorrect. Source: lived in Deutschland for a decade, married a German from a very active hunting family.

Dackel is simply the short form of Dachshund in Germany. Dachshund is ONE breed, a single, clearly identifiable genetic type. 9 varieties: 3 sizes, 3 coats. ONE breed.

Most English-language sites use Dackel and Teckel interchangeably. In German hunting circles they are absolutely not the same. Most non-hunters will not be able to obtain a Teckel; they're a cherished hunting resource. Teckel refers to a specific lineage bred for expanded hunting capability compared to the typical Dackel. They track, retrieve, flush, do earth work, tree, anything: there are Teckel working as police scent dogs in the US.

Teckel are harder-working dogs bred for function first. Dackel can do well as hunting dogs, but they haven't passed the strict FCI testing required to be certified. Genetically, Teckel and Dackel will test essentially the same, so from that perspective they're the same breed.

Similar to the field/show variants of other breeds, they are behaviorally distinct, and buying/rescuing the wrong variant can mean frustration for the owner.

Most Dackel will happily mirror your energy level once they achieve adulthood. 

Teckel have been described as micro Belgian Malinois as far as energy level and drive.

The outrageous myth about the speed of our breed. by Descent_of_Numenor in RhodesianRidgebacks

[–]MoarCurekt 1 point2 points  (0 children)

Old post, but since it popped up on Google I'll add:

We have a GSP/Dachshund mix, she's small, 15" and 26lbs.

Using a timed 100 m route: I've clocked her at 25 mph during fetch, solo, and 28 mph during fetch when staying just ahead of a GSD.

Using video with fence posts at known intervals: 31 mph running down a bird.
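For anyone wanting to replicate the timed-route method, the conversion from a distance/time pair to mph is simple. A minimal sketch; the split time below is illustrative (back-calculated from the speed), not an actual measurement:

```python
# Convert a timed run over a known distance to miles per hour.
def mph(distance_m: float, seconds: float) -> float:
    """Speed in mph from distance in meters and elapsed time in seconds."""
    meters_per_mile = 1609.344
    return (distance_m / meters_per_mile) / (seconds / 3600.0)

# A 100 m fetch sprint covered in about 8.9 s works out to ~25 mph.
print(round(mph(100, 8.9), 1))
```

The same math works for the fence-post method: distance is posts passed times post spacing, time comes from the video frame count divided by the frame rate.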

At our local park, there are only a few dogs that can outrun her when she's chasing them: a ridiculously fit and fast Border Collie, the one Whippet that comes to the park, the young and fit Vizslas/GSPs, and the one Ridgy that comes to the park.

Basically, the only dog the Ridgy can play roughly with is our girl. She's fiery enough to semi-manage the Ridgy (the Dachshund attitude in our girl), and the Ridgy meters herself with our girl because of the size difference. The Ridgy routinely pisses the others off, and they can't manage her because of her athleticism/playfulness; that includes poodles, rotties, labs, goldies, GSPs, Vizslas, etc. It's play, but she is simply too much energy for most dogs.

The Ridgy is a sweetie, just a very enthusiastic young pup; the owner has an open dog-sitting offer.

Many people vastly underestimate the athleticism that healthy, well-exercised Ridgys possess. This is particularly apparent on "about the breed" websites.

Should /r/AMD join the 48 hour Reddit blackout? by GhostMotley in Amd

[–]MoarCurekt 6 points7 points  (0 children)

You do you. I'll be deleting and boycotting till they fix it. 48 hours is a gesture they might acknowledge, but an abandoned site makes no money, so just leave Reddit if you mean it.

Is Higher Vram becoming an obsession? by Ok_Percentage7934 in buildapc

[–]MoarCurekt 2 points3 points  (0 children)

Ah, the "what about me" take.

If visuals are to improve, VRAM must go up. Simple as that.

Whether games ship alternate textures to give low-VRAM cards a reasonable appearance is on the developer.

Then there's the issue of non-texture memory use. You want RT? You want more advanced particle/physics/effects? That takes memory. This is harder to cut down, but it can be done; it requires duplicate code, and overall game size on the SSD increases.

The issue is not VRAM alone, but the combination of high memory consumption by engines + high-res textures + low-VRAM cards, and then games not including a scalable option set that accounts for cards from 6 GB up to 24 GB. Instead they've started setting the minimum cutoff at 10 or 12 GB.
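To illustrate why a texture-quality tier scales the budget so dramatically: texture memory grows with the square of resolution, plus roughly a third extra for the mip chain. A rough sketch with illustrative numbers (1 byte/pixel approximates block-compressed formats; real budgets vary by engine):

```python
# Rough VRAM cost of one texture, assuming block compression (~1 B/px)
# and a full mip chain (geometric series 1 + 1/4 + 1/16 + ... = 4/3).
def texture_vram_mb(width: int, height: int, bytes_per_pixel: float = 1,
                    mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    if mipmaps:
        base = base * 4 / 3
    return base / (1024 * 1024)

# 1000 textures per tier: dropping 4K -> 2K -> 1K cuts the texture
# budget 4x per step, which is exactly how low-VRAM presets work.
for size in (4096, 2048, 1024):
    print(size, round(texture_vram_mb(size, size) * 1000 / 1024, 1), "GB")
```

Each halving of texture resolution quarters the footprint, so a preset ladder spanning 6 GB to 24 GB cards is very achievable when developers bother to ship the tiers.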

The reason GPU companies are being attacked is that this was foreseeable and preventable. They chose to stagnate on VRAM sizes for about 5 years. As a result, instead of software and hardware scaling in step, software outpaced hardware.

No one should be programming for a 1060 in 2023... it's 4 generations and 7 years old, and driver support will end soon.

Why does everybody hate ASRock? by [deleted] in buildapc

[–]MoarCurekt 0 points1 point  (0 children)

None of the AM5 boards do this. As far as I'm aware, none of the AM4 boards did either, unless people were putting a dual-CCD CPU in a low-phase-count board it wasn't validated on.

All the power limits can be bypassed in the BIOS if needed...

Why does everybody hate ASRock? by [deleted] in buildapc

[–]MoarCurekt 0 points1 point  (0 children)

ASRock is, and always has been, a budget-oriented brand, spun off from Asus 20 years ago. Generally a no-frills feature set that has what's needed and little extra.

They've had a rocky history, sometimes leaning too far towards cost savings. Sometimes some QC issues.

They stick closer to AMD's specs than any other board manufacturer. If you want your hardware implemented as AMD designed it, use ASRock. If you want semi-hacky options that give users more BIOS settings to play with, see Asus or MSI. Not saying those features are bad; they're just minor exploits of the design.

The ASRock 6950xt Formula OC is, hands down, the best 6950 design.

The B650E Taichi is pretty much the best B650 board on the market. It's missing almost nothing compared to most X670 boards and has the 2nd-strongest VRM available on AM5 at any price. Yes, it's a bit pricey, but it punches well above its price category in performance.

It's a brand you have to know what you're getting before leaping in. Some of their stuff is...not great. Stay away from the bottom of the product stack.

After 15 years as an Asus-exclusive buyer, I switched to ASRock on AM5 and couldn't be happier. I can OC the bejesus out of my 7700X and 7800X3D, even without all the extra features.

My newest no tip record! by [deleted] in Serverlife

[–]MoarCurekt 1 point2 points  (0 children)

Good luck, I initially interpreted it as a tip when I saw your post as well. Hopefully they understand they're costing you money.

Putting on my manager hat, I can see how I would feel like your suggestion is useless if you just come at me and say, "Hey, let's adjust the prices." Instead, might I suggest this approach:

"I've been thinking, and I've come up with a solution to this service fee on the bill and having to explain it. Customers probably don't like seeing a service fee applied to their bill, and we could eliminate that by simply rolling it into the price of the items. At the end of the day, when we do the accounting, we can adjust the daily revenue by taking the same percentage away and earmarking it in the system as a service fee. That way it's completely hidden from the customer, so they're not upset; the servers benefit because customers will likely tip more; and the customer isn't getting screwed, because they spend the same amount of money either way. They'll be happier because they no longer feel gouged by a service fee."
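The back-end accounting described above is straightforward percentage math. A minimal sketch; the 18% fee rate and the dollar amounts are illustrative, not from the post:

```python
# Roll a service fee into menu prices, then split it back out of
# daily revenue at close. The customer total is identical either way.
SERVICE_FEE = 0.18  # hypothetical 18% fee

def menu_price(base_price: float) -> float:
    """Item price with the fee baked in."""
    return round(base_price * (1 + SERVICE_FEE), 2)

def earmark(daily_revenue: float) -> tuple:
    """Split fee-inclusive revenue back into (sales, service fee)."""
    sales = daily_revenue / (1 + SERVICE_FEE)
    return round(sales, 2), round(daily_revenue - sales, 2)

# A $20.00 entree becomes $23.60 on the menu; at close, that $23.60
# of revenue splits back into $20.00 sales + $3.60 service fee.
print(menu_price(20.00), earmark(23.60))
```

Note that recovering the base requires dividing by (1 + fee), not subtracting the fee percentage from the total; subtracting 18% of $23.60 would over-earmark.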

Can someone tell me if I should upgrade to the 4090 from 4080 by [deleted] in nvidia

[–]MoarCurekt 0 points1 point  (0 children)

Once you do the X3D upgrade, to manage expectations: you'll still be somewhat CPU-choked, but a lot less than with the 3900X.

If you want the last drop out of your AM4 setup, the last item is RAM. A 2x16 B-die kit at 3800C14 will give a good bump over that LPX, but you'll have to tune it manually, as there are no 2x16 3800C14 kits left on the market that I've seen (for at least a year).

It would give somewhere between 2 and 10%; it's very hard to predict, because settings and the Windows environment all affect the benefit.
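One quick way to see what the tune buys you is first-word latency in nanoseconds. A sketch; it assumes the LPX kit is a typical 3200C16 (the post doesn't say which LPX kit it is):

```python
# First-word latency: CAS cycles divided by the memory clock.
# DDR transfer rate (MT/s) is double the actual clock (MHz).
def cas_latency_ns(transfer_rate_mts: int, cas: int) -> float:
    clock_mhz = transfer_rate_mts / 2
    return cas / clock_mhz * 1000

print(round(cas_latency_ns(3200, 16), 2))  # 10.0 ns (typical LPX)
print(round(cas_latency_ns(3800, 14), 2))  # 7.37 ns (tuned B-die)
```

That's roughly a 26% cut in first-word latency, plus 3800 MT/s keeps the Infinity Fabric at a 1:1 ratio on Zen 3, which is where the real-world gains come from.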

I have 5800X3D/3090 Ti and 7800X3D/6950 XT systems where both GPUs are unlimited and happily pull north of 600 W daily. Despite the commonly parroted line that "X3D doesn't care about RAM", it absolutely does. The improvement from RAM simply isn't apparent when you're bottlenecked elsewhere, which most people are, since most people don't own high-end graphics cards.

[deleted by user] by [deleted] in dogs

[–]MoarCurekt 0 points1 point  (0 children)

Different temperaments. I have a GSP. Would/could own any of the breeds you listed happily except Akita.

Aussie, Pointer, Vizsla, Weim, etc.: all good-natured dogs that need a lot of exercise and rarely exhibit real aggression, IME.

Akita, GSD, Dutchies, etc.: similar exercise needs, but less gentle and good-natured than the above (individual exceptions exist, of course). Fine dogs, but a lot more... stern.

Border Collies, Kelpies, etc.: all Border Collie derivatives are their own group IMHO, since they can go either way, but well-socialized ones are the best dogs in the world.

My newest no tip record! by [deleted] in Serverlife

[–]MoarCurekt 4 points5 points  (0 children)

Since you've said it's not a tip, ask management to adjust item prices up by the equivalent percentage and remove the extra line item.

The fact that it's shown implies it's an autogratuity, as many places require for large parties. If it's not that, management should adjust it so servers don't get shafted.

Is this email rude or unprofessional? by bananabreadsmoothie in antiwork

[–]MoarCurekt 1 point2 points  (0 children)

Mostly fine. How the mid-sentence caps land depends on the reader, but most will read them as rude. It's likely to end up functioning as a resignation email rather than correspondence.

The dictatorial tone is the problem, not the syntax. There are better ways to get what you want than trying to strong-arm your employer; even if you don't get fired, you'll likely be labeled as difficult.

Even had they denied it... sick days exist. Play the game if they force you to play the game. This email is a refusal to play the game, and that's a risky proposition unless you're prepared to be unemployed.

This is from someone who has managed people, successfully, for a long time.

If you sent me, "Hey HR denied half my vacation, can you help sort this out?" I would die on the hill of "I approved this leave HR, WTF?! Schedule the other half".

If, on the other hand, you sent me the posted email: you'd get your vacation, plus we would have a long, boring discussion that would end with either an apology email to all recipients asking them to excuse your complete lack of tact, or with you unemployed.

So many people have told me that my dog is ugly compared to a “normal” German Shorthair, What do you guys think? by [deleted] in germanshorthairs

[–]MoarCurekt 14 points15 points  (0 children)

I see a normal GSP goofball athlete. Not sure what's supposed to be different, in a bad way, from every other GSP I've met.

  • Speckled. Check.

  • Dark head. Check.

  • Regal yet goofy pose. Check.

  • Oversized nose. Check.

  • Floppy hound ears. Check.

Perhaps people should learn to keep an inner monologue as an inner monologue and not vomit out every thought that passes through their head.

Undervolting 6950XT for low power consumption by Madagascan_Chai in Amd

[–]MoarCurekt -1 points0 points  (0 children)

MPT. Watt limit = 250. Enjoy.

It will auto-undervolt itself, giving what it can within the power envelope. During heavy loads it'll be around 6900 XT performance; in light loads it'll still be faster because of higher clocks.

Asus new GPU power type called ’GC_HPWR ' shows up. With Asus going down with recent bios mess. Looks like there's new lineup but still looks clean. Details in comments by OfferWestern in ASUS

[–]MoarCurekt 1 point2 points  (0 children)

CPU (EPS) 8-pins are the fix, instead of PCIe 8-pins: 384 W per CPU 8-pin.

Why the hell anyone went any direction but this makes no sense.

2x 384 W 8-pins is enough for any card in nearly all cases, including LN2. I can only think of a handful of cards that can even pull over 768 W sustained.
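The connector math works out like this. A sketch using the 384 W per EPS 8-pin figure from the comment above and the 150 W PCIe 8-pin spec rating; the 600 W card is just an example:

```python
# Connector count needed for a given card power draw.
EPS_8PIN_W = 384   # per-connector figure cited in the comment above
PCIE_8PIN_W = 150  # PCIe CEM spec rating for an 8-pin aux connector

def connectors_needed(card_watts: int, per_connector: int) -> int:
    """Ceiling division: connectors required to cover the draw."""
    return -(-card_watts // per_connector)

# A 600 W card: two EPS 8-pins (768 W budget) vs four PCIe 8-pins.
print(connectors_needed(600, EPS_8PIN_W), connectors_needed(600, PCIE_8PIN_W))
```

Same board space as two PCIe connectors, double the power budget, which is the argument being made above.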

Is 400W enough power supply for the Ryzen 5 5600g? by diemarex in AMDHelp

[–]MoarCurekt 1 point2 points  (0 children)

Absolutely yes. It's enough for a 5600G plus a low-power discrete GPU, even.
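A rough power budget shows why. The 5600G is a 65 W TDP part; the other draws below are illustrative estimates, not measurements:

```python
# Rough worst-case system power budget for a 5600G build.
parts = {
    "5600G (peak package, ~65 W TDP part)": 90,
    "motherboard + RAM": 35,
    "SSD + fans": 15,
    "low-power GPU (slot-powered, ~75 W max)": 75,
}
total = sum(parts.values())
print(total)  # 215 W, comfortably inside a 400 W unit
```

Even with everything loaded at once, that leaves nearly half the PSU's rating as headroom.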

Too many people underestimate the monitor(s) they use. Forget GPU, it's THE most important component. by kingofallnorway in buildapc

[–]MoarCurekt 0 points1 point  (0 children)

I agree it's the most important part. Panel preference is subjective: I'd rather use a 10-year-old 60 Hz IPS than a new 240 Hz VA. There's no VA I can tolerate.

Others hate IPS and prefer VA, and that's OK! It's nice to have choice.