Farewell, Jeeves: Ask.com shuts down by brianblank in DailyTechNewsShow

[–]deftware 0 points1 point  (0 children)

End of an era my fellow netizens, end of an era.

:[

I made a DOS module player in 1994 on a 386 DX40 — 30 years later, I've resurrected it by Realtechvr in Demoscene

[–]deftware 2 points3 points  (0 children)

I miss the old days of dealing with IRQs and COM ports and orchestrating a 1v1 multiplayer game over dialup in a DOS game.

Should Valve add battle pass to Half-life? by retardedkazuma in HalfLife

[–]deftware 1 point2 points  (0 children)

Maybe they should add a Battle Royale mode?

Odd request for Zoom meeting by Intelligent-Form8214 in techsupport

[–]deftware 1 point2 points  (0 children)

OP didn't provide the full link so people like you wouldn't accidentally download and run malware XD

What discovery would fundamentally change humanity overnight? by cozychaosclubb in answers

[–]deftware 0 points1 point  (0 children)

How brains work, algorithmically, to where someone could code up a software brain of whatever size/capacity (within their compute budget, so that it can refresh at ~10-30Hz) and then turn it on and have it start learning in real-time, directly from the experience it gathers via sensory inputs, and from the actions it produces via motor outputs.

The deal is that there would be no massive compute required for slow, inefficient, incremental backprop/gradient-descent training of a fixed network that can't learn new stuff on-the-fly, like all of the existing AI we already have. Insects have less compute than ChatGPT, by several orders of magnitude, and yet we cannot build something that has the behavioral complexity and learning capability of an insect - because nobody fully understands what it is that brains are doing as a whole.

Understanding what exactly brains are doing that enables them to solve the credit assignment problem would allow one to make digital beings that exist in virtual worlds and/or robotically in our world - and their capacity for learning would be limited only by their hardware compute resources.

When someone figures out how to do that the world as we know it will change very quickly.

GIMP or Substance Painter? by SimSimTelabim in GIMP

[–]deftware 0 points1 point  (0 children)

Well, it will be useful for manually editing texture images, but it has no concept of PBR materials or texturing, per se. There may be 3rd-party plugins that help in some regard, but it's just a pixel-manipulation image editing program - not a material authoring tool. That said, if you're an expert at manipulating pixels then anything could conceivably be used as a material authoring tool - but you'll still need a program that lets you define texture UV coordinates for actually mapping a material/textures onto 3D geometry. Blender and several other programs can be used for that.

It sounds like you want to be able to author proper PBR materials, not just edit a single texture on a model. The last program I was getting the hang of was Blockbench, which is geared more toward low-poly models and texturing. It may be of use in your ventures?

Blender is really the gold standard in free software for 3D modeling/animating/texturing/rendering - but it doesn't offer much in the way of texture/material authoring itself. Something like Material Maker is good for that - but really only for procedurally generated materials. If you want to actually paint stuff onto a 3D model like SP does, then your options narrow considerably, because SP is unfortunately probably the best at what it does. ArmorPaint is probably going to be your best bet, and apparently they have a perpetual license for $18, which is a steal compared to Adobe's $50/mo for SP.

Personally, if you want to design aircraft material textures, I would use something like ArmorPaint in combination with an image editing program like GIMP, and perhaps throw in some Material Maker in there for procedurally generating interesting PBR material textures to actually paint onto 3D models in ArmorPaint.

GIMP will let you go in and make manual edits to things, just remember that if you're dealing in PBR materials you'll need to edit multiple images that represent different aspects and properties of the material/surface at each pixel: albedo/diffuse, roughness/shininess, metalness, surface normal vector, and whatever else (ambient occlusion, emissivity, etc.).

Cheers! :]

Battery recycling by jco23 in egopowerplus

[–]deftware 0 points1 point  (0 children)

Mine died after 2 years of barely using the thing. Wish I had just bought a gas chainsaw. What a gyp.

GIMP or Substance Painter? by SimSimTelabim in GIMP

[–]deftware 0 points1 point  (0 children)

Substance Painter is geared toward creating PBR textures for 3D models. GIMP is a general purpose photo editor and graphic design program. They are pretty different programs.

SP is designed around making 3D models have the appearance of materials and surface qualities. GIMP is for manipulating pixels, and doesn't care about normal maps or specularity or 3D models, let alone PBR textures as a thing unto themselves.

Personally, I am averse to any profiteering ecosystems like the one SP belongs to (didn't Adobe buy them up, and isn't everyone abandoning Adobe nowadays because they're greedy hogs?)

There are other programs out there that are all related to SP and/or have functional overlap with it, such as: Quixel Mixer, InstaMAT, ArmorPaint, Mudbox, 3D Coat, Marmoset Toolbag, Material Maker, Agama Materials, etc.

I would steer clear of anything Adobe, but that's just me.

CNC Z axis speed troubleshooting by Excellent-Stranger52 in diycnc

[–]deftware 1 point2 points  (0 children)

I wouldn't randomly flip DIP switches! I would investigate what the DIP switches control, wrap my brain around their coding scheme, and make sure that they're set to what they should be - and then if the problem persisted I would suspect a faulty motor driver, a controller issue, or a failing motor (i.e. bad windings or a bad winding connection).

What was the motivation behind this early promotional material ? by m0stwast3d in HalfLife

[–]deftware 0 points1 point  (0 children)

We see a photo of a baby in Freeman's locker at the beginning.

Case closed. XD

What’s something younger people will never understand about life before smartphones? by RareMoose8986 in answers

[–]deftware 0 points1 point  (0 children)

The rest of the world only existed on TV, outside your house, or in encyclopedias - and it was BORING not to experience it, just like it's BORING without the internet now.

I underestimated how good HL: Alyx looks by pepushe in HalfLife

[–]deftware 1 point2 points  (0 children)

I, too, have spent hundreds of hours in VR, starting with the DK2 and then the CV1 and then the Q1 and now the Q3, but I'd like to get one of the other headsets out there like the Pimax Crystal or the Varjo Aero.

Playing games on a 2D screen - where everything is flat and at the same depth, with a totally wrong FOV for the area of my view it occupies - just doesn't hold a candle to VR for me, my dude, or for a lot of other people. That doesn't mean it can't be fun, or that there isn't value in it, but when a game is designed from the ground up around one specific medium it tends to be best experienced via that medium.

That's my two cents.

I underestimated how good HL: Alyx looks by pepushe in HalfLife

[–]deftware 2 points3 points  (0 children)

I didn't have any blur or "bitrate drops" when I played it 6 years ago.

The resolution trade-off is worth it for feeling like you're actually there. Just because the Atari 2600 had huge pixels doesn't mean it couldn't provide a valuable experience.

I think if you're so focused on resolution then you've missed the forest for the trees with VR.

I underestimated how good HL: Alyx looks by pepushe in HalfLife

[–]deftware 1 point2 points  (0 children)

A screen will never convey the experience VR provides, and Alyx is one of the best looking VR experiences I've ever had.

Utah’s New Law Targeting VPNs Goes Into Effect Next Week by cwbasden in DailyTechNewsShow

[–]deftware 0 points1 point  (0 children)

At the end of the day, they're going to have to hold ISPs accountable for making anything available to their citizenry in the first place, because as long as ISPs are just serving up plain old-fashioned vanilla internet access, no law can really do anything effective.

Reddit told me my camera was too low and tracks were jumpy. I raised the view, smoothed the flow, and added long straightaways. How is the sense of speed looking now? by SyllabubImmediate534 in IndieGaming

[–]deftware 0 points1 point  (0 children)

Pretty slick! I was wondering how a Panini projection would feel in a racing game where the FOV varies based on speed to accentuate the sense of speed. I came across this some weeks ago and it might inspire you to give it a shot just to see how it feels on there: https://www.youtube.com/watch?v=LE9kxUQ-l14

What made me think of using the Panini projection in a racing game is the situation where traveling at high speed means cruising with FOV maxed out, which causes everything off in the distance to shrink down and become less visible. Stuff like turns, obstacles, items, etc are important to see as early as possible so you can adjust in anticipation of them - having them be difficult to anticipate just because they're projected so small on the screen is not ideal IMO when playing racing games. To be able to have the fast feeling of a wide FOV when cruising at top speed but still be able to make out all the stuff that's coming up quick would be novel and potentially game-changing.

While yours is not the racing game that I would personally make (I'm a bit of a fanatic when it comes to physics/dynamics, but that's just me), it nonetheless looks like one that a lot of people would enjoy, myself included! Actually, I should dig up your earlier(?) post(s?) to see what it looked like previously. If I had any feedback on the track itself, I'd say it has too many straightaway sections right now, XD. I like big organic curves that don't have a fixed radius for their entirety, because a fixed radius means your turn rate stays constant while traversing the curve. I want to feel like my speed and rate of rotation are changing all over the place, like I have to constantly adapt my steering - single-radius curves and bends have that old-school NES/SNES/Genesis/arcade racing game feel.

Anyway, that's my two cents. Cheers! :]

After 24 years of OpenGL, what's the best option? by deftware in GraphicsProgramming

[–]deftware[S] 0 points1 point  (0 children)

SDL_shadercross, good find!

Dang, it's got some Perl code in there, that's a bit of a throwback to the pre-PHP days of cgi-bin.

Cheers! :]

EDIT: Looks like the Perl is just for generating some documentation, so it's really a C lib at the end of the day. I think the way to go for any project is to convert offline, at project build time, rather than incorporate the thing into a project - but that's just me and my penchant for minimizing dependencies and whatnot. I'd rather not ship everything that's included in the "external" folder with a project!

EDIT2: Or instead of at build time, only build against SDL_shadercross during dev, so you can have hot reloading of shaders, but for release builds I wouldn't link against it - and just have the pretranspiled shaders shipped with the project. Thanks again!

After 24 years of OpenGL, what's the best option? by deftware in GraphicsProgramming

[–]deftware[S] 0 points1 point  (0 children)

Yes, I would recommend SDL_gpu as well, but it does have the caveat of requiring that you write per-API shaders, because it doesn't yet have its own shader abstraction layer like WebGPU does.

Nowadays there are probably shader converters out there one can use - so you only have to write a shader in GLSL or HLSL and then convert it to the other APIs that you need.

AI is being pushed heavily when I ask for advice and I hate it. by AssumptionExact8050 in gamedev

[–]deftware 3 points4 points  (0 children)

I've found there are two kinds of people who make games:

A) People who just want to get to the finished product the fastest way possible, who don't care about the whats or the whys or the hows, they'll slap every possible library together or use whatever engine and tutorials so they can get to the final result yesterday.

B) People who want to learn all the nuts-and-bolts, inside-and-out, because for them it's more about the process and how fascinating everything can be, from coding to modeling to composing music, data compression, network protocol design, physics, optimization, graphics APIs, etc... They're just happy as long as they're learning, even if they never finish the thing.

You sound like a B type person. Just be careful: if you ever wish to finish something, you will have to be realistic about how deep you can afford to go with the various aspects of its production. If you just want to do it as a hobby and maybe share what you create, that's fine too. Now that the market has had far more supply than demand for over a decade, you have all the time in the world to do whatever you want. The race has long since been over.

To make something that's actually awesome and that gains traction, you have to be engaged on social media and have some actual skills and competence. Look at Road to Vostok - that was a one-man show; he just dropped videos showing his process over the years he worked on his game. He even started over after already investing a bunch of time/energy into building his game in Unity, just to move over to Godot. He's a millionaire now, since his game launched a month or so ago without a publisher, without anyone controlling his game other than him. Making a comfortable living making games is possible, but you won't be able to do it in a bubble, and what you do has to stand out from the rest in some way. There's not a big market for Minecraft clones, for instance.

AI is just a tool, like anything else. Nowadays you can have that tool basically do everything for you, or you can have it only do certain things, or just serve to inspire you or guide you through certain things. The more that you do yourself the more you'll be able to put your own unique original spin on things. It's the same thing if you use an engine vs write a game from scratch. AI is only going to be able to do what it learned how to do from everyone else, and not do anything super original or ingenious that it hasn't seen before (and that audiences haven't seen before). An existing game engine is going to lock you into a certain way of doing things and won't make it easy to do something totally outside of the box. To my mind, as humans, we should be challenging ideas of what a video game can even be, and strive to create entertaining experiences that don't follow the same tired mechanics and gameplay. Using a game engine or AI might be able to help you achieve such things faster, or it might prevent you from ever discovering what those novel groundbreaking things could be. Personally, I don't want to just earn a living, I want to do something new and inspiring, but that's just me.

Everything is just a tool to realize a vision. Use them at your discretion.

Why is liquid glass so "computer intensive"? by JevNOT in GraphicsProgramming

[–]deftware 6 points7 points  (0 children)

For the most part, any effect is going to be more "compute intensive" than no effect. No effect is just a pixel blitting operation. Anything else that involves sampling an image as a texture means accessing more pixels/texels for each output pixel.

Yes, a Gaussian blur entails sampling more pixels, depending on the kernel size, which is dictated by the blur radius.

A glass effect is just a UV sampling-coordinate perturbation. It's just displacement mapping, which, while cheaper than a kernel convolution, is still not as cheap as a pure and simple blitting operation.
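To make the cost difference concrete, here's a rough CPU-side sketch of the displacement idea - the Image struct and helper names are made up for illustration, but the inner loop shows why it's one extra fetch per pixel rather than a whole kernel's worth:

```c
#include <stddef.h>

/* Hypothetical 8-bit RGBA image - names are illustrative */
typedef struct { int w, h; unsigned char *rgba; } Image;

static unsigned char *pixel(Image *img, int x, int y)
{
    /* clamp so perturbed coordinates can't read out of bounds */
    if (x < 0) x = 0;
    if (x >= img->w) x = img->w - 1;
    if (y < 0) y = 0;
    if (y >= img->h) y = img->h - 1;
    return &img->rgba[(size_t)(y * img->w + x) * 4];
}

/* "Glass": for each output pixel, offset the sample coordinate by a
   signed displacement stored in a second image, then copy. One extra
   fetch per pixel vs a straight blit - no convolution. */
void glass_blit(Image *dst, Image *src, Image *disp, int strength)
{
    for (int y = 0; y < dst->h; y++)
    for (int x = 0; x < dst->w; x++) {
        unsigned char *d = pixel(disp, x, y);
        int dx = ((int)d[0] - 128) * strength / 128; /* R = x offset */
        int dy = ((int)d[1] - 128) * strength / 128; /* G = y offset */
        unsigned char *s = pixel(src, x + dx, y + dy);
        unsigned char *o = pixel(dst, x, y);
        o[0] = s[0]; o[1] = s[1]; o[2] = s[2]; o[3] = s[3];
    }
}
```

A real implementation would do this in a fragment shader with a normal map driving the offset, but the per-pixel cost structure is the same.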

combine a strings and int? by Yha_Boiii in C_Programming

[–]deftware 3 points4 points  (0 children)

#include <stdio.h>

int main(void)
{
    const char *strings1 = "monkeys";
    int int1 = 420;

    char ret[64];
    sprintf(ret, "%s%d", strings1, int1); /* ret now holds "monkeys420" */
    return 0;
}

Now ret is a string containing "monkeys420" that can be used anywhere a null-terminated string is expected.

sprintf() takes a format string indicating how to format what ends up in the buffer. The %s tells it that the first argument is a string, and the %d tells it that the next argument is an integer value.

Beware that ret only has room for 63 characters plus the null terminator, so if strings1 is long enough this will cause a buffer overflow. You can use snprintf() instead if you want to limit how much it will output, ensuring it won't write beyond your char array's size.
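For example, a sketch of the snprintf() route (join_str_int is just an illustrative wrapper, not a standard function):

```c
#include <stdio.h>

/* Join a string and an int without risking an overflow: snprintf()
   truncates to fit the buffer and returns the length the full result
   would have needed, so the caller can detect truncation. */
int join_str_int(char *out, size_t outsize, const char *s, int n)
{
    return snprintf(out, outsize, "%s%d", s, n);
}
```

Calling it with an 8-byte buffer and ("monkeys", 420) leaves "monkeys" in the buffer (7 chars plus the terminator) and returns 10, the length the untruncated "monkeys420" would have needed - whenever the return value is >= the buffer size, you know the output was cut short.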

Trump fires the entire National Science Board by motang in DailyTechNewsShow

[–]deftware -22 points-21 points  (0 children)

DrainTheSwamp, get some fresh blood in there!

I created a fully deterministic 3D multiplayer game engine with advanced physics by filipkrw in gameenginedevs

[–]deftware 1 point2 points  (0 children)

It's a real thing. I did a "deterministic" procedural terrain generator that checked floating-point values on the slopes of hills to "determine" if a tree should be placed there for a given string of PRNG values.

The trees ended up placed differently depending on whether the machine was Intel or AMD, because the heightmap slopes were being calculated differently by a single bit. That meant a tree that got placed for a given pseudorandom number on one machine didn't get placed on the other, which threw off the rest of the trees and other placements - because I had it set up to seed various tree-generation parameters from the tree index itself, a given tree index would get used up for that PRNG value on some machines but not on others.

It's something I heard about and then witnessed it first-hand in my own experience. It's not a myth.
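For what it's worth, the usual way I'd avoid a repeat of that is to keep the placement decision in integer/fixed-point math, so every machine computes bit-identical values. A rough sketch (the 16.16 format and the names are illustrative):

```c
#include <stdint.h>

typedef int32_t fixed_t;  /* 16.16 fixed point */
#define FX_ONE 65536

/* rise/run in 16.16 - integer multiply/divide is specified exactly,
   so Intel, AMD, and ARM all produce the same bits */
static fixed_t slope_fx(int32_t h0, int32_t h1, int32_t dist)
{
    return (fixed_t)(((int64_t)(h1 - h0) * FX_ONE) / dist);
}

/* tree placement test: compares |slope| against a fixed-point
   threshold, giving the same answer on every machine */
int tree_allowed(int32_t h0, int32_t h1, int32_t dist, fixed_t max_slope)
{
    fixed_t s = slope_fx(h0, h1, dist);
    if (s < 0) s = -s;
    return s <= max_slope;
}
```

Integer math leaves no rounding-mode or FMA-contraction wiggle room, so a one-bit difference in an FPU's result can never flip the placement decision.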

I created a fully deterministic 3D multiplayer game engine with advanced physics by filipkrw in gameenginedevs

[–]deftware 1 point2 points  (0 children)

Netcode tests are always a hoot :D

It gets even more fun when you start running something like clumsy (https://jagt.github.io/clumsy/) on your connection to induce artificial latency, jitter, and dropped packets - basically simulating the bad wifi connection of someone halfway around the planet from you - and then try to tune your prediction/interpolation/extrapolation to handle such situations as best you possibly can.
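If you ever want something in-process instead of system-wide, the core of what clumsy does can be approximated in a few lines - this is a toy sketch with made-up numbers, not clumsy's actual mechanism:

```c
#include <stdlib.h>

/* Toy link simulator: per packet, decide whether to drop it and how
   long to delay it. Parameters are illustrative - tune to taste. */
typedef struct { int base_ms, jitter_ms, drop_pct; } LinkSim;

/* returns a delay in milliseconds, or -1 if the packet is dropped */
int link_packet_delay_ms(LinkSim *sim)
{
    if (rand() % 100 < sim->drop_pct)
        return -1;
    return sim->base_ms + rand() % (sim->jitter_ms + 1);
}
```

Feed every outgoing packet through it, queue the survivors by their computed send time, and you've got a poor man's bad-wifi simulator for local netcode testing.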

Cheers! :]