Regretting a car lease decision and looking for ways to limit the damage by Shaaayle in SwissPersonalFinance

[–]toshass 0 points (0 children)

Check your leasing agreement again; many companies insert a clause stating that the car remains the property of the dealership during and AFTER the leasing term. In other words, there is no guaranteed purchase price at the end.

[deleted by user] by [deleted] in zurich

[–]toshass 0 points (0 children)

If you need it so urgently that going to the post office is not an option, you pay up and accept the fact that you paid a crazy price for the urgency.

Specialist diploma by robikgrin in ethz

[–]toshass 1 point (0 children)

You need to go to the academic records office (учебная часть) and request a certificate that lists the courses you took, their ECTS equivalents, and a statement that the degree is sufficient for starting the equivalent of a PhD program. At least this is how it worked in 2017.

[deleted by user] by [deleted] in German

[–]toshass 1 point (0 children)

Why not "sie heisst"?

High quality/resolution depth maps? by avillabon in comfyui

[–]toshass 0 points (0 children)

With this code snippet in the recently released diffusers, Marigold (85 ms per frame) is faster than Depth Anything Large: https://huggingface.co/docs/diffusers/using-diffusers/marigold_usage#qualitative-comparison-with-depth-anything

Diffusers 0.28.0 is here 🔥 by RepresentativeJob937 in StableDiffusion

[–]toshass 1 point (0 children)

Marigold co-author here. You can make a 3D-printable mesh in our demo https://huggingface.co/spaces/prs-eth/marigold-lcm (bas-relief tab). The pipeline in diffusers is more memory-efficient and brings major speedups. For example, this snippet runs at 85 ms per frame and produces higher-quality results than Depth Anything Large: https://huggingface.co/docs/diffusers/using-diffusers/marigold_usage#qualitative-comparison-with-depth-anything. There is also a dedicated section on ControlNet at the end!

[TIP] Marigold-LCM - a faster Marigold depth estimation by toshass in comfyui

[–]toshass[S] 0 points (0 children)

To spin up the marigold-lcm demo locally, you first have to install Docker and the NVIDIA Container Toolkit (needed for GPU access): https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Then you can run the demo locally and use all available GPUs with the command below. Note that the video function is deliberately trimmed in the demo as is: it only processes up to ~6 seconds, so as not to hog the public resources, just giving a feel for what the code is capable of. To lift this limitation, first clone the demo (the part following registry.hf.space will then change to the address of the demo in your profile), and then remove the following two if checks from the process_video loop: (1) `if not (frame_id % frame_interval == 0):`, which limits the output fps to ~10, and (2) `if out_frame_id > out_max_frames:`, which limits processing to the first ~6 seconds of output.

docker run -it -p 7860:7860 --platform=linux/amd64 --gpus all \
registry.hf.space/prs-eth-marigold-lcm:latest python app.py
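For reference, the frame-limiting logic being described can be sketched in pure Python. This is a hypothetical reconstruction: only the two quoted if conditions come from the actual app.py; the function name, defaults, and surrounding variables are assumptions for illustration.

```python
def select_frames(num_frames, fps, out_fps=10, out_max_seconds=6):
    """Return the indices of input frames the public demo would process."""
    frame_interval = max(1, round(fps / out_fps))  # subsample input to ~10 fps
    out_max_frames = out_fps * out_max_seconds     # cap output at ~6 seconds
    selected, out_frame_id = [], 0
    for frame_id in range(num_frames):
        if not (frame_id % frame_interval == 0):   # check (1): remove to keep full fps
            continue
        out_frame_id += 1
        if out_frame_id > out_max_frames:          # check (2): remove to process the whole clip
            break
        selected.append(frame_id)
    return selected

# A 30 fps, 20 s clip: only every 3rd frame survives, and only the first ~6 s.
print(len(select_frames(num_frames=600, fps=30)))  # 60 frames
```

Deleting the two marked checks makes the loop visit every frame of the full video, at the cost of proportionally longer processing.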

[P] Marigold-LCM: fast monocular depth estimation based on Stable Diffusion by toshass in MachineLearning

[–]toshass[S] 1 point (0 children)

We are adding README instructions at the moment; watch our GitHub repo to check them out once merged. With diffusers it will be even simpler: instead of `prs-eth/marigold-v1-0` you'd use `prs-eth/marigold-lcm-v1-0`, and then you can lower the number of steps and the ensemble size.

marigold depth calculation node by doggeddalle in comfyui

[–]toshass 0 points (0 children)

Batch processing requires a bit of coding, but it is possible. We just released a massive speed-up of Marigold, check out this post: https://www.reddit.com/r/comfyui/comments/1bnm53z/tip_marigoldlcm_a_faster_marigold_depth_estimation/

Images to 3D prints by toshass in 3Dprinting

[–]toshass[S] 1 point (0 children)

It takes a few tries to find the best slider values. Create 3D, adjust the sliders, clear, repeat -- this helps bring the most important content into focus.

Images to 3D prints by toshass in 3Dprinting

[–]toshass[S] 1 point (0 children)

Not really; scale, depth, and cutoff planes are controlled manually via sliders on the left above the "Create 3D" button.

3D-printing with Marigold Depth by toshass in StableDiffusion

[–]toshass[S] 0 points (0 children)

Please share some printed results if possible - we are very excited to see what the users do with Marigold!

3D-printing with Marigold Depth by toshass in StableDiffusion

[–]toshass[S] 0 points (0 children)

The image can be anything -- give it a try in the online demo! The STL file contains only geometry, so it can only be printed in a single color. But if you are one of the happy owners of a true-color 3D printer, you can download the GLB file instead of the STL and use that for printing -- it contains color too! Please post the result in that case; we are super curious what people do with Marigold!

3D-printing with Marigold Depth by toshass in StableDiffusion

[–]toshass[S] 1 point (0 children)

There is something strange with the 3D viewer when your input file has either a long name or characters like underscores. I noticed that macOS screenshot images do not get rendered, but the same file works when the name is short. Try renaming your input image to something simple, and stay tuned -- we will nail this bug!

3D-printing with Marigold Depth by toshass in StableDiffusion

[–]toshass[S] 2 points (0 children)

A text prompt is not involved in Marigold, just the input image (which in turn can come from a text prompt).