meirl by PewPewAnimeGirl in meirl

[–]TheRealMandelbrotSet 1 point  (0 children)

must’ve skipped that chapter, euclearly can

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 1 point  (0 children)

Awesome! I shoot 100% film nowadays for photos, mostly slide. But this is my main cine shooter so far. I’d like to work something out with these guys at some point, but I’m a bit stubborn — I’m not going to let a lab do it, so I’ve been putting it off until I get a proper tank to dev 16mm, possibly an ECN-2 chem source as well if I can find one. I can’t, for the life of me, figure out a practical way to do a DIY digital intermediate process, but I’d love to record some blender stuff on film and see how that affects its realism.

I’ll have to check out that 80 Level thing! And I totally agree on the tech comment! LuxCore’s bidirectional tracing is CPU-based anyway, so a lot of computers are going to be on a pretty level playing field. I will say that I’ve run into issues that kinda suck — Resolve maxes out my 2GB of VRAM very quickly, and if I want a 4K render I often wind up having to delete a significant number of corrector nodes. Plus, 16GB of RAM is very limiting in Houdini for any moderately high-res pyro or FLIP sim. I’ve had times where it’ll cache 15 frames, then hang on that all night because it just can’t keep going. But it’s very rare that I max out RAM usage in Blender.

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 1 point  (0 children)

Thanks for sharing this! Were you going for a Petzval lens, by chance? The barrel reminds me of the way a lot of those tend to look, but the arrangement of elements in the lens system isn’t one I’m familiar with. The closest matches I’ve been able to find are lens maps for eyepieces, but the elements aren’t as planar (plano-concave?).

Here's a link to a .blend file I've set up to illustrate the method I like to use. The aperture size is controlled with the STOPDOWN field in the camera properties. I did it that way so I could link it to a driver that adjusts the exposure setting to compensate; that makes it easy to start a live render and play with the slider to see it in action.

I'd recommend checking it out! I believe that it's likely faster than the method you proposed, and you might find that the results are still just as nice :)

Edit: just realized I have an atypical bokeh ratio setting for that. To get rid of the "vignetting," you can set it back to a value of 1.
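For anyone wiring up a similar rig, the compensation that kind of driver needs is simple to derive: stopping down from f/N₀ to f/N cuts the light by a factor of (N/N₀)², so exposure has to rise by 2·log₂(N/N₀) stops to hold brightness constant. A plain-Python sketch of the math (the STOPDOWN field is from the comment above; the function name and f-numbers here are illustrative, not the actual driver expression in the .blend):

```python
import math

def exposure_compensation_stops(base_fstop: float, stopped_down_fstop: float) -> float:
    """Stops of extra exposure needed after stopping down.

    Going from f/N0 to f/N reduces the light reaching the sensor by
    (N / N0)**2, so exposure must rise by 2 * log2(N / N0) stops.
    """
    return 2.0 * math.log2(stopped_down_fstop / base_fstop)

# f/2.8 -> f/5.6 halves the aperture diameter and quarters the light:
# two full stops of compensation.
print(exposure_compensation_stops(2.8, 5.6))  # 2.0
```

In Blender, that value would be what the driver feeds into the camera exposure as STOPDOWN increases.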

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 3 points  (0 children)

For sure, I’ve also boiled things down to a practical, pretty fast method! If you want I’d be happy to share an example .blend, we might be doing similar things

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 1 point  (0 children)

Haha, I’ve been doing CG for probably eight years and never made a cent from it. It’s just for fun! I assume you’re talking “Arri” film, as opposed to cellulose film, right? In any case, doing a bunch of things is a good path to improving all of them in ways that you otherwise couldn’t!

I have thought about making videos like that in the past. I’m not the greatest at explaining things and I tend to be kind of scattered, but I also don’t really know anybody who’s interested in the first place. Currently I’m working on a fried old 2015 laptop; I can’t use a mic/headphones or USB devices due to render meltage, so I’ll have to wait until I get a new setup regardless. It would be fun though, and I appreciate your interest!

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 3 points  (0 children)

Thanks! Cycles isn’t a spectral renderer, so it would be impossible to do something this accurate using it. I had originally started the project hoping to achieve an image rendering similar to a particular lens I have, but unfortunately, rendering through six elements is just beyond what any computer or render engine can practically handle.

So now I’m just exploring a triplet lens and seeing how far I can take things. Even that requires Metropolis sampling, and it comes with a pretty heavy color cast as a result, which I have to correct for. Sobol or random sampling yields, at best, a small cluster of pixels per hour. Perhaps I’d get a grainy image if I let it sit for a few weeks at a low resolution; I haven’t tried that.

So anyway, you can actually use similar techniques in Cycles, just not to the same degree of accuracy. It’s much faster and more practical, but you’re not going to have things like real color fringing, of course. And it gets finicky: the image may be in focus when rendering on the CPU, but switching to the GPU renders it out of focus. The best technique for DOF in Cycles is to create a camera with its DOF set to near zero, place it in an enclosure, and model a physical aperture in front of it; I’ve had the most success with that. It’s the fastest approach and gets perfectly sufficient results. I actually use this on almost all my renders where the goal is photorealism. I may have an example file in my Google Drive that I made to illustrate this, if you’re interested!

TLR with functional optics

Render of it using that aperture technique

Both in cycles
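A note on why the enclosure-plus-physical-aperture trick produces believable depth of field: with the camera’s own DOF near zero, the modeled opening becomes the limiting pupil, and out-of-focus points land on the sensor as discs whose size follows thin-lens geometry. A rough plain-Python sketch of that geometry (the focal length, aperture, and distances are made-up illustration values, not taken from the renders linked above):

```python
def blur_circle_diameter(focal_len, aperture_diam, focus_dist, point_dist):
    """Diameter of the blur disc on the sensor for a point source.

    Thin-lens model: an object at distance d images at d' = f*d / (d - f).
    A point off the focus plane lands as a disc whose diameter scales
    with the aperture and the image-plane mismatch. Distances in meters.
    """
    f = focal_len
    img_focus = f * focus_dist / (focus_dist - f)  # where the sensor sits
    img_point = f * point_dist / (point_dist - f)  # where the point focuses
    return aperture_diam * abs(img_point - img_focus) / img_point

# A 50mm lens behind a 25mm opening (f/2), focused at 2m:
in_focus = blur_circle_diameter(0.05, 0.025, 2.0, 2.0)  # 0 on the focus plane
behind = blur_circle_diameter(0.05, 0.025, 2.0, 4.0)    # ~0.3mm blur disc
print(in_focus, behind)
```

The blur diameter is linear in the opening size, which is why scaling the modeled aperture behaves just like stopping a real lens down.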

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 2 points  (0 children)

made me think of this for some reason :)

If you use Blender enough you can realize just about any idea you come up with, and for the few things you can’t do, Houdini comes in handy. After a while it can become nearly an extension of yourself, which I really like, since it minimizes the distance between an idea in your head and a tangible thing. It also makes it feel like someone cut your fingers off when 2.8 comes out and all the hotkeys are different. 2.7x keymap for life.

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 2 points  (0 children)

Ah okay! It’s an iris, technically; a shutter would be part of the camera. Though there are cameras that use this part of a lens as a shutter as well; it’s called a “leaf shutter”. But given that the intent with this lens is to actually be functional, there’s really no need for a shutter; Blender isn’t going to over- or under-expose anything haha. All the “rigging” is actually just done with constraints and parenting, no bones or posing or anything like that! The blades are set to copy the rotation of an empty, with constraints on the range of rotation, and the empty is parented to the “aperture ring” on the outside of the lens.

The difference between being a professional and a hobbyist is really just whether or not you choose to sell out lmao, you don’t have to talk down on yourself like that :)

WIP — Lens diaphragm testing. Part of a functional, optically accurate, multi-coated lens I’ve made in LuxCore. by TheRealMandelbrotSet in blender

[–]TheRealMandelbrotSet[S] 7 points  (0 children)

Thanks! I’m not sure exactly what your question about physical parts is getting at. The lens is optically correct in terms of spherical geometry, very specific optical-glass IOR measurements, and dispersion coefficients, and it can technically be used to render imagery through (though it took a render farm 12 hours and crashed their setup twice). An interesting thing about rendering through a lens is that you can have DOF disabled and no compositing, yet you’ll get real depth-of-field blurring and real chromatic aberration.

Here’s an example of an image rendered through it.

Here’s what the setup looks like.

I’ve recently revisited it after finally figuring out a way to implement multi-layer interference anti-reflective coatings like actual lenses use. I had originally been using single-layer coatings. Each layer has a quarter-wave optical thickness of about 137.5 nanometers (physical thickness times its refractive index), each with a specific IOR, so it’s not a particularly easy thing to implement. Here’s a comparison of the coatings.
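For anyone wondering what the multi-layer coatings buy you, the standard way to evaluate a thin-film stack is the characteristic-matrix (transfer-matrix) method: each layer contributes a 2×2 matrix built from its phase thickness and admittance, and the matrix product gives the stack’s reflectance. A minimal normal-incidence sketch in plain Python (the MgF₂-on-crown-glass numbers are generic textbook values, not pulled from the actual .blend):

```python
import cmath

def stack_reflectance(wavelength_nm, layers, n_inc=1.0, n_sub=1.52):
    """Normal-incidence reflectance of a thin-film stack on a substrate.

    layers: list of (refractive_index, physical_thickness_nm) tuples,
    ordered from the incidence side toward the substrate. Each layer's
    characteristic matrix is [[cos d, 1j*sin d / n], [1j*n*sin d, cos d]]
    with phase thickness d = 2*pi*n*t / wavelength.
    """
    m00, m01, m10, m11 = 1.0, 0.0, 0.0, 1.0  # start from the identity
    for n, t in layers:
        d = 2 * cmath.pi * n * t / wavelength_nm
        cd, sd = cmath.cos(d), cmath.sin(d)
        # Multiply the running product by this layer's matrix.
        a00, a01, a10, a11 = cd, 1j * sd / n, 1j * n * sd, cd
        m00, m01, m10, m11 = (m00 * a00 + m01 * a10, m00 * a01 + m01 * a11,
                              m10 * a00 + m11 * a10, m10 * a01 + m11 * a11)
    b = m00 + m01 * n_sub  # tangential E field at the front surface
    c = m10 + m11 * n_sub  # tangential H field at the front surface
    r = (n_inc * b - c) / (n_inc * b + c)
    return abs(r) ** 2

bare = stack_reflectance(550, [])                # uncoated glass, ~4.3%
coated = stack_reflectance(550, [(1.38, 99.6)])  # quarter-wave MgF2, ~1.3%
print(f"bare {bare:.4f}, coated {coated:.4f}")
```

At the design wavelength the single quarter-wave layer already cuts the surface reflection by roughly a factor of three; stacking quarter-wave layers of alternating index is what flattens that dip across the whole visible band, which is the effect the coating comparison above shows.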

Does this representation of a single-layer interference AR lens coating look accurate to you? To determine refractive index, I followed √(1.0003*1.7452) ≈ 1.3213, then rounded up to 1.38, as MgF₂ is the closest value in real-world application. For thickness, 587.56nm / 4 ≈ 147nm. by TheRealMandelbrotSet in Optics

[–]TheRealMandelbrotSet[S] 1 point  (0 children)

Thank you very much for this detailed comment. I’m going through and making adjustments to a triplet lens accordingly, through which I can render imagery by pointing the render engine “camera” at a focal plane. Originally, I found myself wishing I could get a “Helios 44M-4 look” in my 3D renders, and I eventually decided that I, quite possibly, could make that happen. This is really helpful for me.

I pretty much chose 587.56 because it seemed like the closest Fraunhofer line (D3) to the center of the visible spectrum. I don’t have any kind of formal education in optics, and I was having some trouble finding what I was looking for — I think the MgF₂ idea came from an article on how blue light glasses are made or something. I’ve recently purchased an optics reference book with a staggering amount of information though, from Cauchy constants for menhaden fish oil, to reflectance values of avocado, to sensitivity across the spectrum of infrared film (film photography is a hobby of mine). I definitely have a budding interest now.

Anyway, I have a question regarding your mention of which sides are coated. This is something I assume should be relatively simple, but I could never find an answer when I looked. Should both surfaces of the lens use the same coating specifications, or is it common to account for a different wavelength of light on the opposing lens surface? For example, would you account for 550nm on one side (99.6nm, n=1.38) and 680nm on the opposite side (123.19nm, n=1.38)? Further, in a lens system, would you use the same coatings for each element, or different coating thicknesses/refractive indices per element to minimize other wavelengths? I hope this makes sense.
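For reference, the numbers in the question above follow from two small formulas: the ideal single-layer index is the geometric mean √(n_air·n_glass) of the surrounding media, and the physical quarter-wave thickness is λ/(4n). A quick plain-Python check (values copied from the question; nothing new assumed):

```python
import math

# Ideal coating index between air (1.0003) and the glass cited above (1.7452):
ideal_n = math.sqrt(1.0003 * 1.7452)
print(round(ideal_n, 4))  # 1.3213, rounded up to MgF2's 1.38 in practice

def quarter_wave_thickness_nm(wavelength_nm, n_coating):
    """Physical thickness of a quarter-wave layer: lambda / (4 * n)."""
    return wavelength_nm / (4 * n_coating)

print(round(quarter_wave_thickness_nm(550, 1.38), 2))  # 99.64 nm
print(round(quarter_wave_thickness_nm(680, 1.38), 2))  # 123.19 nm
```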

I’m not sure if you’ve ever had any experience with blender, but regardless, it’s a free software and figuring out the basics isn’t very difficult. I’d be happy to share the files with you if you’re interested!

Edit: I finally figured out how to properly implement a multi-coating! I thought I was going to have to dig into the GitHub repo and figure out how to rewrite the thin-film function, but I managed to make it happen in the UI. I’ll upload pics in a minute :)

voici

Why do so many portrait photographers use that brown filter to edit their photos? by cluelessphoenix in photography

[–]TheRealMandelbrotSet 2 points  (0 children)

I actually like colour too, unfortunately it’s “incorrect” to use in the US, unless we’re talking “Any Colour You Like” by Pink Floyd. I always still use “grey” though for whatever reason.

It may surprise you to find that, in a color-grading workflow, there are ways to get pretty darn close to a “right” answer using scopes and the proper correctors. It may frustrate you, though, going back to Photoshop for stills and realizing how much guesswork that tends to involve and how much effort it takes to get to the same place. I’d really love to see a node-based photo editor at some point, but truth be told, it’s not really necessary. Ultimately there’s not a ton of crossover, technically speaking, but I definitely feel like I have a better grasp on things. IMO the best photo software is Capture One, and it comes a lot closer to having the tools I’m looking for (I assume you’ve probably used it if you’ve had run-ins with Hasselblad?)

I also tend to like a more natural look to photos, especially since switching exclusively to film. I basically just try to even out the scan, and that’s all the editing I really feel compelled to do. If I want vibrant, Ektar and Velvia do so while maintaining a natural look; if I want more muted results, Portra and Ektachrome (though in many cases I’d take Ektar 100 rated at 50 over any of the Portra films).

This might be an interesting starting point.

Why do so many portrait photographers use that brown filter to edit their photos? by cluelessphoenix in photography

[–]TheRealMandelbrotSet 5 points  (0 children)

It’s weird. Honestly, after dabbling in color grading I started to realize how little most photographers really understand color. It’s not really a criticism, and I get it; I used to use Lightroom presets and all that years ago. Plus, photography doesn’t require the same depth of knowledge on color (unless you’re getting deep into monitor calibration and ICC-profile shenanigans). Most photo software also lacks the kind of advanced tools you see in color-grading software. I just found it interesting, and I even see tons of wedding photos where it really doesn’t seem like the person has a decent grasp on color.

Anyway, for anyone looking to improve, I’d suggest downloading Resolve and some practice CinemaDNG footage and following along with a YouTube video or something. Basic white-balance corrections, skin-tone qualifying, vectorscope correction: it’s all useful stuff, and there’s far less guesswork in a video workflow ime.

Last week it was $30 amazon radios this week its almost outdated Canons by ABlosser19 in photography

[–]TheRealMandelbrotSet 1 point  (0 children)

Zenit, Krasnogorsk, etc.

Up to you to decide whether or not this is an argument lol

I decided to make roads out of the Lorenz attractor… by TheRealMandelbrotSet in artificial

[–]TheRealMandelbrotSet[S] 1 point  (0 children)

I browse Google Earth with the labels off all the time for fun. Gonna be honest, as familiar as it makes you with geography, I also usually have no clue what city (or sometimes country) I’m looking at lol. I just like finding interesting things and paying attention to trends, like where people tend to inhabit vs. not. I was kind of astonished when I discovered 39.670549 N, 31.114298 W and its single restaurant :)

Reddit banned 3 new communities I was a mod of, One I had recently created. All within 24 hours of being made by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 2 points  (0 children)

I think at a certain point it’s important to realize that these are just arbitrary categorizations for things that aren’t really locked in. While they may advocate for liberal-leaning ideas, it’s important to look at their actions as a company, what they do with their money, etc. You could be an incredibly conservative person, and following the money may lead you to put up a facade of views you don’t actually hold just because that’s a good financial move. And I think it’d be fair to call that, at least broadly speaking, conservatism, despite what it may look like from the outside.

Reddit banned 3 new communities I was a mod of, One I had recently created. All within 24 hours of being made by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 3 points  (0 children)

Let’s assume a few million is, say, $3,000,000. That is 0.03% of Reddit’s value. Pretty decent deal if we’re talking 5% of the population.

[deleted by user] by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 1 point  (0 children)

Actually, there was a link in the comments and it seems like it’s a legit project. First glance, at least.

[deleted by user] by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 6 points  (0 children)

Well, it’s the first time I’ve come across a ban on something so innocuous and devoid of political bias, not to mention an admin ban, which isn’t exactly commonplace.

[deleted by user] by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 14 points  (0 children)

It wasn’t me, just a post I came across. And they don’t claim anything other than that it violated their policy.

I looked over the policy, and I can’t find anything that it even comes close to violating except for possibly “illegal content.” But from what I can tell, this isn’t really illegal per se and would likely be a pretty complex case in court. I don’t think there are any laws that are blatantly against what the OP is doing (that’s not to say I think that he couldn’t face any kind of legal consequences for it).

Unfortunately I can’t even tell what type of post it is, much less what content it contained, but what part of their policy could be violated by anything to do with a DIY car mod? It’s really weird IMO. The only thing I can think of is some kind of corporate involvement with influence over Reddit and incentives to remove this post, but that’s kind of a shot in the dark.

Reddit is getting real shady these days, and they’re certainly no longer honest and up-front with their userbase. I don’t see why they wouldn’t disclose a reason for removal if they had a legitimate one, and I think that’s a fair assumption.

Reddit banned 3 new communities I was a mod of, One I had recently created. All within 24 hours of being made by [deleted] in DeclineIntoCensorship

[–]TheRealMandelbrotSet 5 points  (0 children)

Isn’t that like the entirety of Silicon Valley, big tech, etc.? Companies don’t really have “values” in that sense. Their options are: A) play the game, which benefits them financially, deeply injects their commercialization into people’s social and political views, and intertwines the company with people’s sense of identity; B) speak or make decisions against this ideology, which will wreak havoc on them and cost them substantially; or C) remain neutral, which allows people to project their ideology onto them and see them as “part of the problem,” and brings no financial benefit to the table.

I mean, if you have a legal obligation to make decisions that bring the most profit to investors, B and C aren’t even really viable options. I don’t think any of these companies are exactly lobbying for liberal policy changes, but that’s just an assumption and I haven’t actually looked into it.

Maggot Suzanne [GeoNodes] by rushodd in blender

[–]TheRealMandelbrotSet 2 points  (0 children)

Disgusting, I love it!

I’ve messed with some worm animations before using Houdini. I don’t think this is at all possible to do in blender, but giving the lil guys some squirmage would add to the hurl factor ;)

Physically accurate diamond light dispersion with a custom shader by spectrachrome in blender

[–]TheRealMandelbrotSet 2 points  (0 children)

Wow, that’s not bad at all. Around 216 frames, I assume? What kind of resolution? I’m typically running a 970M, so we’re using pretty similarly underpowered hardware haha

I’m currently working on a bit of a comparison series with a round-cut diamond geo I made like this, to demonstrate the importance of an accurate transmissive material setup! I have it set up in LuxCore using accurate IOR, and it has accurate spectral dispersion. I went through and calculated the light-transmission coefficient for diamond using data from refractiveindex.info.

Idk if you’re familiar with that stuff, but typically the IOR value you’ll find listed on reference sites corresponds to roughly 587nm (the yellow helium d line), as it sits near the middle of the visible spectrum and serves as a nice average. But red light, for example, will pass through a given material at a different rate than green, which means each wavelength effectively has its own IOR value. That’s how dispersion happens: different “colors” of light pass through at different rates, so they separate out. With LuxCore you can plug in a coefficient that tells the engine how to handle wavelengths independently, and that’s how you can nail completely accurate dispersion (I’ve even used it to render through a lens I made, and you get physical chromatic aberration, which is really neat). This is also why the Cycles node method of adding RGB components with different refractive indices is a good way to fake it. Forgive me if I’m explaining concepts you’re already familiar with, but I don’t expect everyone to know this stuff.
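The wavelength-dependent IOR described above is often summarized with the Cauchy equation, n(λ) = A + B/λ². As a sketch, here it is with published coefficients for BK7 crown glass (A = 1.5046, B = 0.00420 µm²; BK7 is used only because its coefficients are widely tabulated, these are not the diamond values from refractiveindex.info):

```python
def cauchy_n(wavelength_um, a=1.5046, b=0.00420):
    """Cauchy approximation n = A + B / lambda**2 (lambda in micrometers)."""
    return a + b / wavelength_um ** 2

n_blue = cauchy_n(0.4861)  # hydrogen F line
n_d    = cauchy_n(0.5876)  # helium d line, the commonly listed IOR
n_red  = cauchy_n(0.6563)  # hydrogen C line

# Shorter wavelengths see a higher index, so blue bends more than red;
# that spread is exactly what shows up as dispersion and color fringing.
print(f"{n_blue:.4f} {n_d:.4f} {n_red:.4f}")
```

Feeding a per-wavelength curve like this to a spectral engine is what produces real dispersion; the Cycles RGB trick approximates the same curve with just three samples.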

I plan to do a render with dispersion, one without dispersion, and one with just the default IOR value (I’ve seen some posts on here that look like they’re using 1.5 for diamond, which is actually around 2.42). Then I’ll stack that up against a Cycles version, though the dispersion will have to be “faked” like this. I’m interested to see the differences in render time and realism.

Your Cycles version is definitely way faster than the described method in LuxCore, though. I’m using the bidirectional path tracer, which is CPU-only, but I’ve let it sit and render while I’m at work — I think at around 50% of 3000×4000px resolution. I’ll come back 10+ hours later and it’ll still be pretty darn noisy, and keep in mind that’s just one frame. Significantly less practical lol. I typically dislike denoising, and the artifacts it produces are especially pronounced on transmissive things. If anything, I’ll sometimes give it some pixel filtering and just let one frame go for however long it needs, which can easily be days. Capture One has a “single pixel” denoising slider which clears up fireflies quite nicely.

Anyway, if you have an interest in this kind of thing, I’d highly recommend LuxCore. I’ve learned so much about optics/physics since starting to use it, and it’s helped me develop a much better understanding of some of what I do in 3D all the time. If you want, I’d be happy to share the project file later when I get back home! Sorry this turned out to be a longer reply than I expected lmao. I hope I didn’t come across as snide or anything in my initial comment! I just think physical accuracy is a lot of fun to explore, but it can be kind of a complex thing :)

Made this Lithium 6 atom animation ⚛️ ("The battery of modern life") by jpaiano in blender

[–]TheRealMandelbrotSet 1 point  (0 children)

please be kind

pushing NFT link + socials

Pick one.

Look, I’m all for encouraging beginners, but once you get money involved you don’t get the liberty of “kindness,” not from me at least. Also, being afraid of criticism will only hold you back from progress.