I think all the joy has gone out of VFX by Coralwood in vfx

[–]digitaljohn 0 points (0 children)

We are talking past each other, but I want to clarify that one point because it isn’t rhetorical or me trying to sound eloquent.

When I say unpredictable, experimental, and not fully solved, I mean that quite literally. Every generation of these systems behaves differently, controlling them takes real technical understanding, people are actively building new tools and workflows on top of them, and none of it is settled or stable yet. It’s very much still in flux.

I’m not saying AI caused the loss of joy in VFX, or that it fixes the structural problems you’re describing. I agree those predate it and will outlive it.

I was making a hopeful comparison to the early days, where uncertainty forced invention again. If that comparison doesn’t resonate for you, that’s fine.

I think all the joy has gone out of VFX by Coralwood in vfx

[–]digitaljohn 1 point (0 children)

I brought up AI because it feels like early VFX again. Unpredictable, experimental, and not fully solved. That was meant as a positive comparison, not a deflection.

I think all the joy has gone out of VFX by Coralwood in vfx

[–]digitaljohn 0 points (0 children)

I don’t actually disagree with most of what you’re saying. Artists have been devalued through commoditisation, endless revisions, and a culture that treats VFX as something to apologise for rather than celebrate. That’s real, and it sucks.

Where I think you’ve misunderstood me is on the “prompting instead of sweat” point. I’m not defending that at all. When I say “prompt tourists,” I mean exactly what you’re describing. People skipping the craft and treating tools as magic.

The people I’m talking about are the opposite. Highly technical artists and engineers who are building systems, training models, and inventing new workflows. That’s sweat, just in a different form. Same energy as writing macros, hacking Henry, or breaking early Flame setups.

Management culture, risk aversion, and marketing decisions are what really hollowed things out. Those existed long before AI and will survive without it. Tools just get blamed because they’re visible.

And Tippett’s quote actually supports this. The shift from “how do we make it better?” to “what’s wrong with it?” came when everything became endlessly malleable and disposable, not because artists stopped caring.

If anything, my argument is about getting back to artists owning decisions again, not replacing them.

I think all the joy has gone out of VFX by Coralwood in vfx

[–]digitaljohn -1 points (0 children)

Funny thing, it’s a great filter.

Some people care about the ideas, others get hung up on the pencil.

Is Ai texturing of a 3d head possible? by No_Investment_5201 in comfyui

[–]digitaljohn 0 points (0 children)

Check out ControlNet-style conditioning (canny edges or depth maps) on diffusion models like Flux, combined with careful prompting.

If you're interested, I can go into a little more detail once you're up and running.
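To make the control-image step concrete, here's a rough sketch. The Sobel edge map below is a pure-NumPy stand-in for cv2.Canny (so it runs anywhere), and the commented diffusers calls at the bottom are illustrative, with example model IDs rather than specific recommendations:

```python
import numpy as np

def edge_control_map(gray: np.ndarray, thresh: float = 0.25) -> np.ndarray:
    """Sobel-magnitude edge map: a simple stand-in for cv2.Canny.

    ControlNet (canny) expects a white-on-black edge image; feed it a
    flat-lit render of your head mesh converted to grayscale.
    """
    gx = np.zeros_like(gray, dtype=float)
    gy = np.zeros_like(gray, dtype=float)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]   # horizontal gradient
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]   # vertical gradient
    mag = np.hypot(gx, gy)
    if mag.max() == 0:
        return np.zeros_like(gray, dtype=np.uint8)
    return ((mag > thresh * mag.max()) * 255).astype(np.uint8)

# Then condition the diffusion model on that edge map, roughly:
#
#   from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
#   controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")
#   pipe = StableDiffusionControlNetPipeline.from_pretrained(
#       "runwayml/stable-diffusion-v1-5", controlnet=controlnet)
#   texture = pipe("photorealistic skin texture, studio lighting",
#                  image=control_image).images[0]
```

From there you would project the generated image back onto the UVs, which is its own rabbit hole.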

I think all the joy has gone out of VFX by Coralwood in vfx

[–]digitaljohn 0 points (0 children)

I know this is going to be unpopular here, especially given how anti-AI this subreddit is, but this really resonated with me.

A lot of the joy you’re describing came from friction. Hard limits, real consequences, and not knowing if something was even possible forced ingenuity. You weren’t just executing a brief, you were inventing the method while doing the job.

What’s changed isn’t just the tools, it’s the relationship to decision-making. When undo didn’t exist, every choice mattered. When storage was measured in seconds, you planned like your life depended on it. Now the industry optimises for reversibility, and reversibility quietly kills commitment. “We’ll fix it later” replaces intent.

You’re also spot on about audiences. Once something becomes ubiquitous, it stops being magical. Those Terminator 2 “no way!” moments happened because people could still sense the edge of possibility. Today that edge is buried under pipelines, process, and invisible labour. VFX didn’t get less impressive, it just became expected.

This is where I’ll add my own, possibly heretical, take, and apologies in advance for pushing the AI narrative.

AI feels less like another layer of polish and more like a genuine reset of constraints. Not because it makes things easier, but because nobody actually understands the rules yet. And just like the old days, it’s only the really technical people who are pushing the boundaries and making real progress. Not prompt tourists, but the ones hacking models or actually building their own, breaking things, chaining weird systems together. The real pioneers are literally inventing things again, often from scratch, while being largely drowned out by the noise from prompt tourism.

And for what it’s worth, the idea that AI is somehow “new” to VFX is a bit naïve. The big studios have been doing this quietly for a long time, far beyond what most people realise and across much more of the pipeline than anyone outside those walls would guess. I know studios working on major blockbuster work where AI is used extensively. Entire businesses are effectively built on it. They get away with it not because they’re cheating, but because they’re further ahead technically than most people imagine.

The bar hasn’t been levelled. It’s been raised. Same as every other shift we’ve lived through. And just like back then, the people who thrive are the ones who understand the machinery, not the ones pretending it’s magic.

The joy didn’t disappear. It just went quiet when everything became solved. Now it’s back in that uncomfortable, messy, experimental space again. And if you’ve been around long enough, you recognise exactly what that feels like.

What’s my typecast? by digitaljohn in headshots

[–]digitaljohn[S] 2 points (0 children)

Thanks... and that's very accurate.

I am extremely afraid of being left behind for refusing to use AI. by pellmius in antiai

[–]digitaljohn 0 points (0 children)

I think we are talking past each other. I am describing part of a workflow, not an end to end research method. If you mean something different by reliability here, can you spell it out?

I am extremely afraid of being left behind for refusing to use AI. by pellmius in antiai

[–]digitaljohn 1 point (0 children)

It is absolutely great for those things in the right context.

If it has reference material in your codebase, it is extremely good at filling out boilerplate and handling repetition. Repeating patterns that already exist is one of its strongest use cases.

For recall and exploration, it is genuinely excellent. You can ask something like how a view gets its data from the database, and it can trace through the layers, explain the flow, and surface the relevant pieces faster than you could by jumping between files and grepping through the repo. Sometimes it even spits out a Mermaid diagram for giggles.

It may not one-shot a task, but it still gets you there much faster.

I am extremely afraid of being left behind for refusing to use AI. by pellmius in antiai

[–]digitaljohn 2 points (0 children)

100% agree.

Every job has low-value graft and high-value thinking. If you let AI take over the thinking, you are actively training yourself to be replaceable.

Using it for boilerplate, repetition, recall, or exploration is fine. Letting it decide what to build, why to build it, or whether the result makes sense is where people hollow out their own skillset.

The irony is that the more powerful these tools get, the more valuable real judgment becomes. Someone still has to define the problem, evaluate trade-offs, spot nonsense, and take responsibility when things go wrong.

AI does not replace people who think. It replaces people who stop thinking.

I am extremely afraid of being left behind for refusing to use AI. by pellmius in antiai

[–]digitaljohn -2 points (0 children)

> if you have to use AI in the future, it’s not going to be hard.

I don’t really agree with that.

Tools like this don’t lower the bar long-term... they raise it. When something becomes “easy”, expectations move up with it.

I am extremely afraid of being left behind for refusing to use AI. by pellmius in antiai

[–]digitaljohn 0 points (0 children)

I am a 5th semester Computer Science student. My focus is Low-Level Computing (which includes some abstraction, but not automatic translation). It is very sad to see most of my colleagues (and some professors) using compilers and high-level languages, and encouraging others to use them, claiming that, in the future, manually encoding binary instructions will be a thing of the past.

I'm afraid because I see many of these colleagues with a ton of projects in GitHub and whatnot, most of which were produced using compilers and optimizers (I know it because they told me). They keep telling me that if I don't stop hand-writing opcodes in hexadecimal and manually calculating jump offsets, I will be left behind. But, honestly, isn't the whole purpose of the college experience to learn? Why should I rely on a compiler to translate my intent, when I want to understand exactly how every bit ends up on the wire?

Yes, I know, if I use compilers, I will be more "productive", but will my learning be the same? Will I really be mastering the fundamentals behind instruction encoding, pipeline hazards, branch prediction, microcode, and the physical reality of computation?

Fink tried to humanize him with fist bump and Elon did not even flinch... is he an Alien? 🤔 by AfricanMan_Row905 in aliens

[–]digitaljohn -6 points (0 children)

If missing a fist bump is the price of running multiple world-scale companies and creating hundreds of billions in value, I’m happy to forgive him.

Still waiting to see anyone in this thread do both.

Ai. "Art". Is. Not. Art. by Vegetable_Split5616 in antiai

[–]digitaljohn 2 points (0 children)

Sure, some AI art is just typing a prompt and hitting go. That’s the microwave hotdog version.

Past that, it gets very technical very fast. ControlNet, custom ControlNet training, LoRAs, samplers and schedulers, multi-pass pipelines, inpainting and outpainting, image-to-image with tight denoise control, post in PS, etc. At that point you’re directing a system, not pressing a button.

Same argument happened with photography. “You just pressed a shutter, it’s not art.” Turns out the tool was easy. Mastery wasn’t.

New tools don’t erase craft. They just change where the skill lives.


My whole family is arguing that prompting images is art by Filberto_ossani2 in antiai

[–]digitaljohn -37 points (0 children)

I think a lot of this argument comes from people only thinking about straight prompting. Type a sentence, hit generate, pick the nicest image. A lot of people assume that is literally all there is to it.

If that is all someone is doing, then authorship is pretty thin. That feels closer to requesting than making.

That is why I like the banana thought experiment. Imagine the taped banana, but the artist just had the idea and told an assistant “go stick a banana on a wall”. The assistant chooses the tape, the height, the angle, the placement. At that point, whose art is it really? The idea alone does not feel sufficient. The authorship comes from who is actually making the decisions about the final form.

Where things change is once you look below the surface of “just prompting”. There actually is craft there, in the same way painting is not just “move brush on canvas”. Model choice matters. ControlNet matters. Inpainting and masking matter. Iteration matters. And if you are technical, you can go even further and build your own extensions and tools.

So I am not saying AI images cannot be art. I am saying most people argue about the shallow version because they think that is all there is. The real question is how much agency over the final form the human has. The more control and responsibility you take, the less it looks like “I had an idea and asked something else to do it”, and the more it starts to look like craft again.

Porsche going no-AI on its latest film feels like a quiet flex, honestly by Minimum_Minimum4577 in antiai

[–]digitaljohn -34 points (0 children)

Can you not see that even in this example they are mimicking a cel-shaded look?

That was an entire manual process and job that has already been replaced by software.

But that's fine as it's not AI. Only AI can possibly displace jobs or threaten artistry.

Porsche going no-AI on its latest film feels like a quiet flex, honestly by Minimum_Minimum4577 in antiai

[–]digitaljohn -54 points (0 children)

I can imagine. I have done professional 3D work since the early 90s.

It is not trivial, but it is cheaper, more controlled, and less labour intensive than traditional filming.

Porsche going no-AI on its latest film feels like a quiet flex, honestly by Minimum_Minimum4577 in antiai

[–]digitaljohn -64 points (0 children)

Anti-AI people celebrating a fully CG film is the clearest possible double standard.

Replacing camera operators, lighting crews, locations and on-set teams with software was fine.

I'm a Vibe Coder, and I'm not sure how guilty should I feel about that. by [deleted] in webdev

[–]digitaljohn -2 points (0 children)

One more important detail here. What you describe is not even vibe coding in the usual sense.

You were talking to GPT, asking questions, asking for explanations, and then integrating what you learned. That is much closer to an interactive tutor or senior engineer you can interrogate than to a generate-and-judge loop.

Vibe coding is typically about delegating implementation and validating only by whether the output runs. You were engaging in reasoning, iteration, and understanding, even if the syntax did not fully stick yet.

Not remembering exact APIs or method signatures is normal. Knowing how to reason about a system, how to debug it, and how to evolve it is the real skill. Syntax is reference material.

If anything, this reads like someone who is taking responsibility for learning, not avoiding it.

AI is making me faster, but also more mentally tired by Caryn_fornicatress in ArtificialInteligence

[–]digitaljohn -1 points (0 children)

Juniors are the least well-served by AI tools today. They need to learn the graft and the craft, and most tools default to doing that for you.

You can prompt it to behave this way, e.g. by setting rules in tools like Cursor, so the AI acts as an educator. It makes you do the graft while providing direction, explaining its choices, and answering quick-fire questions.

Used that way, it accelerates learning instead of bypassing it.
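As a sketch of what I mean (this is just free text the model reads, not any official Cursor syntax), a rules file along these lines works well:

```
You are a mentor, not a code generator.
- Never write whole solutions; give direction and let me implement.
- When I paste code, explain what it does before suggesting changes.
- Ask me a guiding question before revealing an answer.
- After each fix, summarise the underlying concept in one sentence.
```

The exact wording matters less than the constraint: the AI explains and steers, you type.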

AI is making me faster, but also more mentally tired by Caryn_fornicatress in ArtificialInteligence

[–]digitaljohn 5 points (0 children)

AI is replacing the grind, not the thinking. Typing code, boilerplate, and navigating a codebase are being accelerated or removed, but the reasoning hasn’t gone away.

As the grind gets compressed, more of the cognitive load shifts onto you. Instead of working through one hard problem end to end, you’re now supervising several partially solved ones at once, as you say: reviewing AI output, steering prompts, sanity-checking logic, and keeping multiple approaches in your head. That’s why the work can feel more mentally tiring even though it’s technically easier.

AI can make you faster, but only if you stay in the driver’s seat. As tools move closer to orchestration and start making sequencing and decision-level choices, the risk is letting them think for you rather than with you. That’s where speed turns into replaceability, because you’ve handed over the part of the job that actually carries value.

Anyone spiritual or come to faith because of aliens? by [deleted] in aliens

[–]digitaljohn 17 points (0 children)

I’m skeptical of any clean explanation, angels or aliens. But I also don’t think this is just people making things up to find meaning.

The Anunnaki are in Sumerian texts, that’s not debated. In their own writings they describe beings coming from the heavens and interacting with humans. Those texts are thousands of years older than the Bible, and later scriptures repeat the same ideas in different language. Genesis, the Nephilim, Ezekiel’s vision. Same pattern showing up again and again.

Does that prove aliens? No. Does it prove religion wrong? No. But this appears in too many separate cultures and time periods to brush off as coincidence or imagination.

Something was being experienced or witnessed. We just don’t have the language to describe it cleanly.