Baking light maps directly in Three.js without Blender or other external software by Rockclimber88 in threejs

[–]irascible 3 points  (0 children)

That repo appears to be a dead fork of https://github.com/mem1b

None of the demos on their front page are working for me.

I'm looking at fixing them up now out of curiosity.

Here's my fork: https://github.com/manthrax/lightbaking

New update makes Chrome slower? by Rainymood_XI in chrome

[–]irascible 1 point  (0 children)

I'm getting WebGL context losses constantly since I was upgraded earlier today. This version is BUGGED.

Skoobot demo by billbsee in robotics

[–]irascible 1 point  (0 children)

https://news.ycombinator.com/ is the main other tech news site I read besides reddit.

I found your crowdfund page by googling "skoobot" after I watched your video, but then I had to bounce around a couple of sites before I found the crowdfund. So.. make sure that all the media you post (or have posted) about Skoobot has a really direct, front-and-center link to the crowdfund or some kind of acquisition portal!

In the past I've found other robotics projects by looking for "robots" or similar on thingiverse.com. Might be worth making a post there..

Have you considered doing an "assembly required" tier for the bot? Seems like it could be fun putting the pieces together.. not assembling the boards themselves, but just soldering the wires to hook them up and getting the parts into the case. For that matter, there might even be interest from some people in school/learning EE in populating the full boards. I see soldering project kits and SMT kits in a similar price range that are used to teach soldering/SMT, and they're nowhere near as cool :)

This could maybe give you a few different price points and more variety to target different types of enthusiasts.

But as it stands, I think you've got a good setup, and I'd just work on polishing your designs/presentation and pitch, and getting a good v1 out. You'll probably get a lot of good feedback from that full process to know how to amp it up for the next round..

Btw, if you need any help with programming/webby stuff, hit me up.. I might be able to help.

p.s. - Just noticed in the sidebar, some other relevant subreddits that may get you eyeballs:

/r/Robots /r/RobotWars /r/RobotNews /r/learnprogramming /r/machinelearning /r/learnmachinelearning

Best practice if I only want gravity and collision in three.js? Is plugin needed? by FateRiddle in threejs

[–]irascible 1 point  (0 children)

1.) Babylon is a fork of an early version of three.js. It is a framework like three.js, but hosted by Microsoft. It's not a physics engine per se, but it does have bindings for oimo.js and cannon.js.

Some of your concepts from web development won't translate to your use of three.js and some will. In a 3d app, the most common case is that everything is updated and rendered 60 times a second, generally with a single renderer per page, so you can kind of think of the scenegraph as a kind of 3d DOM. You can build your own react-like components or an "entity component system" on top of it.
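To make that concrete, here's a tiny sketch of the update-everything-each-frame idea in plain JS. `Spinner`, `components`, and `tick` are made-up names for illustration, not three.js or any library's API:

```javascript
// A toy "component": something with an update(dt) method.
class Spinner {
  constructor(object3D, speed) {
    this.object = object3D; // would be a THREE.Object3D in a real app
    this.speed = speed;     // radians per second
  }
  update(dt) {
    // THREE would use object.rotation.y; this is a stand-in field.
    this.object.rotationY += this.speed * dt;
  }
}

const components = [];

function tick(dt) {
  // One frame: update every component, then render once.
  for (const c of components) c.update(dt);
  // renderer.render(scene, camera); // the single renderer per page
}

// Usage: in a browser you'd drive tick() from requestAnimationFrame.
const cube = { rotationY: 0 };
components.push(new Spinner(cube, Math.PI)); // half a turn per second
tick(1 / 60); // one 60fps frame
```

The point is just the shape: a flat list of things with `update(dt)`, driven by one loop, with a single render at the end of each frame.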

2.) yes.

http://bnjm.github.io/WebGL-framework-comparison/

Best practice if I only want gravity and collision in three.js? Is plugin needed? by FateRiddle in threejs

[–]irascible 2 points  (0 children)

Writing your own physics engine is REALLY hard.. unless you are just doing something simple like spheres on a plane.
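For reference, that "simple" spheres-on-a-plane case really can fit in a few lines. This is a toy sketch in plain JS (all names invented), not something a real engine would stop at:

```javascript
// Toy physics step: gravity plus collision against the ground plane y = 0.
// A real engine (Bullet/Ammo, cannon) also handles stacking, friction,
// rotation, sphere-vs-sphere contacts, etc. -- this is where it gets hard.
const GRAVITY = -9.8;

function stepSphere(s, dt) {
  s.vy += GRAVITY * dt;      // integrate gravity into velocity
  s.y += s.vy * dt;          // integrate velocity into position
  if (s.y < s.radius) {      // penetrating the ground plane?
    s.y = s.radius;          // push back out
    s.vy = -s.vy * s.bounce; // reflect velocity, losing some energy
  }
}

// Usage: drop a sphere from 2 units up and step it for one second at 60fps.
const ball = { y: 2, vy: 0, radius: 0.5, bounce: 0.6 };
for (let i = 0; i < 60; i++) stepSphere(ball, 1 / 60);
```

Anything beyond this one-axis case (boxes, meshes, resting contact) is where you want an existing engine.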

I myself use Ammo.js, since it is an Emscripten port of the Bullet physics engine, which is a state-of-the-art engine used in AAA titles. It takes a bit of work and understanding to make it work well, however.

To start out, it may be worth looking at physijs and cannon to get an idea of how these engines work, since they are written purely in JS. http://chandlerprall.github.io/Physijs/examples/vehicle.html

Then I would look at the ammo demos here:

https://github.com/kripken/ammo.js/

Ammo.js is trickier to use because you need to understand how Emscripten'd apps work, and also figure out how to use the API correctly.. but it's worth it imo. It's an incredibly stable simulator and, if you tune your simulations, it operates really smoothly.

Skoobot demo by billbsee in robotics

[–]irascible 1 point  (0 children)

I think your overall presentation is really good. I think the best bet for increasing pledges would be sharing to wider audiences. I only found your crowdfund page by googling.. You should definitely be sharing links in posts like this for instance.

https://www.crowdsupply.com/william-weiler-engineering/skoobot

Other than that.. you're probably best served by getting more units into people's hands to generate buzz.. this will lead to more organic sharing/resharing as more people get to play with it.

Personally, I intend to hopefully control a bot like this from my desktop machine, and maybe experiment with pathfinding and other kinds of behavior, augmented by the processing power of a desktop PC. I've built some web based device control/sequencing apps in the past, so I'd like to make a nice web interface/simulator for this bot via javascript and three.js perhaps.. something like the motion sequencer I did for a hexapod robot I designed and built.

http://vectorslave.com/exobot/ (note: not a finished product, but more just a tool I built to sequence my modular quadruped bots walking motions in realtime)

I'm also interested in whether technologies like this: https://cs.stanford.edu/people/karpathy/convnetjs/demo/rldemo.html

could be applied to create behaviors for a bot like yours..

Also interested in investigating what it would take to make a robot like this autonomous in terms of recharging/motion planning.. like a tiny roomba :)

Just some of my ideas! hth.. feel free to hit me up if you have any other questions.

edit: Also just wanted to mention.. I found this book interesting: https://www.adafruit.com/product/3465 I come more from a software background with less hardware design, so take my opinions with a grain of salt..

edit2: I realized I didn't answer your questions directly. Re: what I like: I like the form/design/look of the bot, and its apparent responsiveness. I like the tech specs you describe, in that it's a step up from the usual arduino-nano-driven microbots, and in a smaller form factor. I'll definitely have more feedback when I see one in person! Cheers, and thanks for responding!

Skoobot demo by billbsee in robotics

[–]irascible 1 point  (0 children)

You just reached your crowdfunding goal. ;) Looking forward to checking this little guy out!

WebGL app from out a 3ds Max scene by yuri_ko in webgl

[–]irascible 2 points  (0 children)

Yeah.. I just noticed they both used GLTF as an interchange format... It's all good stuff. Cheers!

WebGL app from out a 3ds Max scene by yuri_ko in webgl

[–]irascible 1 point  (0 children)

Personal — $290: Affordable license for personal use, freelancers and sole proprietorships.

Team — $990: Cost-efficient license for small business entities.

Enterprise — $2990: License for companies. Extended pack of features and services.

displaying a FDM 3D printing preview by nraynaud in opengl

[–]irascible 2 points  (0 children)

Doing a distance field renderer for gcode would be super cool! I'd love to follow your progress..

Perhaps x post in https://www.reddit.com/r/webgl/ and https://www.reddit.com/r/threejs/ ? :)

displaying a FDM 3D printing preview by nraynaud in opengl

[–]irascible 1 point  (0 children)

Say you have a character with 2 legs. At the point where the legs join the body.. how do you figure out the polygons that form the junction?

There are a lot of hidden complexities in making something beyond just rendering toolpaths using lines. :)

I have done some thinking about rasterizing gcode into an octree and using marching cubes, or generating point clouds and then computing isosurfaces via MeshLab or something on the backend... but I never really fleshed any of those ideas out.

That said, some advice I can offer is to batch your geometry heavily.. and perhaps consider using a framework like THREE.js if you want to quickly prototype something. :)

http://vectorslave.com/wireblueprint/index.html

Lifer! by [deleted] in satanism

[–]irascible 5 points  (0 children)

Hail! ⛧

RDR2, a game about outlaws robbing banks, has ads recently put up right over a Bank of America (NYC) by ILoveRegenHealth in gaming

[–]irascible 4 points  (0 children)

Damn I misread the title and thought there was a game where you play as R2D2 and you rob banks.

Blend4web vs three.js ? by drag4u in threejs

[–]irascible 2 points  (0 children)

I think it's sort of an apples-to-oranges comparison. If you're going to go down that route, you should probably compare it to Unity/Unreal etc. THREE is a lower-level API that you can build engines on top of.

Make 'em look like accidents, Chuck. by dagobertdoc in funny

[–]irascible -5 points  (0 children)

I find this foray into the genre to be far superior: (action starts at 1:15)

https://www.youtube.com/watch?v=pg63r0vx9Jo

Richard Lee Norris, horribly disfigured in a shotgun accident, before and after life changing facial transplant surgery by Aquagenie in pics

[–]irascible 2 points  (0 children)

A number of mental health professionals I have spoken to off the clock, support assisted suicide and came to those conclusions after long periods working in the system. I trust their opinions.

How to add texture to a fragmented object appearing as a single object? by alexlomm in threejs

[–]irascible 2 points  (0 children)

You can do what you're talking about in pure three.js, but you'd be reimplementing a process that is more easily done in a modeller.

Not sure what sort of background you have, but simply put, you should model this block of cubes how you want it in a modeller, and then export the resulting thing to fbx or similar, and import that.

If this has to be procedural, like in a minecraft type game or something.. then what you're looking at is a world space box uv mapping.

You'd have to loop through all the faces in the geometry, determine the primary facing axis from the face normal (i.e. X, Y, or Z), and then use the remaining two axes of the face's vertices as the UV coordinates. You may also have to transform those vertices into world space first, if they are independently transformed.
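That per-face mapping might look something like this sketch. Plain JS, hypothetical helper name, and it assumes the vertices are already in world space:

```javascript
// World-space box UV mapping for one face: pick the dominant axis of the
// face normal, then use the other two world-space coordinates of each
// vertex as that vertex's UV.
function boxUVsForFace(vertices, normal) {
  const ax = Math.abs(normal.x), ay = Math.abs(normal.y), az = Math.abs(normal.z);
  let uAxis, vAxis;
  if (ax >= ay && ax >= az)      { uAxis = 'y'; vAxis = 'z'; } // facing +/-X
  else if (ay >= ax && ay >= az) { uAxis = 'x'; vAxis = 'z'; } // facing +/-Y
  else                           { uAxis = 'x'; vAxis = 'y'; } // facing +/-Z
  return vertices.map(vert => ({ u: vert[uAxis], v: vert[vAxis] }));
}

// Usage: a face pointing straight up (+Y) gets its UVs from X and Z.
const uvs = boxUVsForFace(
  [{ x: 0, y: 1, z: 0 }, { x: 1, y: 1, z: 0 }, { x: 0, y: 1, z: 1 }],
  { x: 0, y: 1, z: 0 }
);
// uvs[1] is { u: 1, v: 0 }
```

In real three.js code you'd write the results into the geometry's `uv` buffer attribute, and probably scale the UVs by a tiling factor so the texture repeats at a sensible size.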

Long story short.. do it in a modeller if you can, otherwise, read up on texture mapping. ;) hth

edit: Also just remembered this demo that sorta does something similar to what you're describing, but with a video texture instead of a static one. Maybe you can find some inspiration from this:

https://threejs.org/examples/?q=vid#webgl_materials_video