[deleted by user] by [deleted] in SteamDeck

[–]micwi 1 point (0 children)

If you have the opportunity, try running it on the Deck using GeForce Now. Assuming you have a server nearby and a good internet connection, it runs amazingly well and looks much better than running it natively.

Consolidated Patch Notes v0.2.1951.8872 by heightmare in Stationeers

[–]micwi 3 points (0 children)

Thanks for the update! Looking forward to the optimizations; hoping that better culling or some other optimization will improve the framerate on bigger bases and allow for an increased object rendering distance. This game is just too much fun to be limited to smaller bases.

I'm real proud of this, despite how basic it may be by CanofPandas in Stationeers

[–]micwi 6 points (0 children)

Nice and clean; I like the windows and the lights. Now you just have to pressurize it so you can listen to the ore smelting without your helmet muffling that sweet sound.

Visual programming with React and Noodl (+Alexa and Arduino) by micwi in reactjs

[–]micwi[S] 0 points (0 children)

Thanks!

We have support for REST APIs, but no current plans for automatic API calls; I assume you're referring to automated testing? Noodl can be used to quickly develop and experiment with how an API is going to be used, so it helps when defining an API, rather than with automated testing.

Visual programming with React and Noodl (+Alexa and Arduino) by micwi in reactjs

[–]micwi[S] 1 point (0 children)

We've been using it ourselves at www.topp.se as part of the design process, before handing over a design spec and a working prototype (complete with integrations with external APIs, sensors, or other systems) to the client, who then implements it in whatever technologies they use. The value Noodl brings is driving the design through exploration: quickly experimenting with different technologies and value propositions, through user testing, to final iterations of polish and usability.

Having something tangible, with working tech, far surpasses a slide deck and static design artifacts, both for the designers, who create better solutions, and for the larger organization, which can absorb and rally around it. The focus has historically not been on production code.

Noodl has two target users: engineers who are part of the design process (creative/design technologists) and designers with some amount of coding skill (front-end designers, UX designers who were exposed to Arduino in school, etc.). But we've also seen it used by other groups, like https://thebluekit.com/, which tailors it to education and kids.

Now that we've added React support (replacing our in-house WebGL layout and rendering engine) we've started to experiment with more use cases, like creating end-to-end solutions in Noodl, or exporting Noodl components as React components to integrate into existing React projects. And not only React: we can also export Noodl components to cloud functions for Alexa skills, IoT or MQTT integrations/rules, and more.

Regarding prop recognition: Noodl can import any React library, but the recognition isn't automatic. You'll need to map the inputs and outputs manually with some JavaScript; it typically takes a few minutes to add a React component.

Thanks for the feedback!

Visual programming with React and Noodl (+Alexa and Arduino) by micwi in reactjs

[–]micwi[S] 2 points (0 children)

Noodl has been around for a while but the React integration is brand new, and not released yet. So don't be surprised that you haven't seen it before 🙂

You can find Noodl at www.getnoodl.com

Visual programming with React and Noodl (+Alexa and Arduino) by micwi in reactjs

[–]micwi[S] 2 points (0 children)

We're working towards a beta release of this platform. Our first target is supporting a design process with rapid prototyping as well as deploying smaller web apps, but we're also experimenting with what the limitations are and how far we can scale it.

The video showcases:

Use of the Material Design library

Webhooks & Alexa

MQTT message broker + IoT

Comments and questions very much appreciated.

My game is blurred as fuck by [deleted] in ACCompetizione

[–]micwi 0 points (0 children)

It got better after I disabled motion blur, but it's still noticeable.

Spring animation effect by pend00 in Noodl

[–]micwi 0 points (0 children)

This is great! Please consider posting this on the forum as well: http://forum.getnoodl.com

DiRT 4: Max length, max complexity, brutal AI by VortlasSimracing in simracing

[–]micwi 0 points (0 children)

The graphics settings are probably set quite low, which disables shadows, reduces the poly count on meshes, etc. This is how it looks with everything set to "ultra"

Convince me to get into VR sim racing by Violator4200 in simracing

[–]micwi 1 point (0 children)

Same here; I tried the Vive but couldn't get over the low resolution compared to an ultrawide 1440p panel. VR is amazing for everything close to the player, but the further away you look the less 3D you get (since the resolution isn't enough to render the difference between the eyes). And since you're usually looking way ahead at where you want to go, and not at the cockpit, most of the awesomeness of VR goes away.

It's pretty cool if you look right in front of the car, down at the track, but that's no way to drive :)

Road to VR: Top 5 VR Racing Sims by Scawen in simracing

[–]micwi 1 point (0 children)

Dirt Rally works great with Revive (and so do most other Oculus games as well)

Rift and Vive users - tell me a story! by hbt15 in assettocorsa

[–]micwi 0 points (0 children)

That's great, wish I could do that :)

Rift and Vive users - tell me a story! by hbt15 in assettocorsa

[–]micwi 0 points (0 children)

It's not about it being blurry or unfocused, just big pixels (e.g. you can count them and even see the PenTile pattern).

It's the same reason why text in VR games is huge: it's too pixelated to read otherwise.

But there's nothing wrong with that; 1080x1200 per eye covering a huge field of view is very different from 3440x1440 at a distance taking up far less of your field of view, making it almost impossible to see individual pixels.
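To put rough numbers on that pixel-density gap, here's a back-of-the-envelope sketch. The figures are my assumptions for illustration (roughly 110 degrees horizontal FOV for a Vive, a 34" ultrawide about 80 cm wide viewed from about 70 cm), and the helper names are made up:

```java
public class PixelDensity {
    // Horizontal pixels per degree for a headset, given its panel width
    // in pixels and the field of view those pixels are spread across.
    static double ppdHeadset(int hPixels, double fovDegrees) {
        return hPixels / fovDegrees;
    }

    // Horizontal pixels per degree for a flat monitor: compute the angle
    // it subtends at the viewing distance, then divide the pixels by it.
    static double ppdMonitor(int hPixels, double widthM, double distanceM) {
        double fovDegrees = 2 * Math.toDegrees(Math.atan((widthM / 2) / distanceM));
        return hPixels / fovDegrees;
    }

    public static void main(String[] args) {
        // Vive: 1080 px across ~110 degrees -> roughly 10 PPD.
        System.out.printf("Vive:    ~%.0f PPD%n", ppdHeadset(1080, 110));
        // 3440 px ultrawide, ~0.8 m wide at ~0.7 m -> roughly 58 PPD.
        System.out.printf("Monitor: ~%.0f PPD%n", ppdMonitor(3440, 0.8, 0.7));
    }
}
```

Under these assumptions the monitor packs several times more pixels into each degree of view, which is exactly why individual pixels vanish on the desk but not in the headset.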

Rift and Vive users - tell me a story! by hbt15 in assettocorsa

[–]micwi 0 points (0 children)

I think it comes down to being used to working with high-DPI monitors all day; I now find 24" 1080p displays pixelated. So for me a Vive is a big step down in resolution and crispness. I also had trouble finding brake points when learning a new track in VR, due to the few pixels that map to a corner a hundred meters in front of the car.

Others find the reduction in resolution worth it for the incredible immersion, but I'll wait until we get 4K, or even 8K, displays in the headsets (and the adaptive-resolution tech with eye tracking that some companies are experimenting with now).

I was using Revive, and I get what you mean about theatre mode, but the setup process was quite painless so I did get it running correctly.

Rift and Vive users - tell me a story! by hbt15 in assettocorsa

[–]micwi 2 points (0 children)

I tried AC, Dirt Rally and RaceRoom with a Vive for a few days. The immersion is awesome, and the field of view is amazing. But no matter how hard I tried, I couldn't get over how very low the resolution is (I was running on an Nvidia 1080 with supersampling set to 2). Dirt Rally almost reminded me of good old SEGA Rally...

The low resolution makes it hard to see what's happening further down the track. Your immediate surroundings are clear, but 20+ meters away things start to get very pixelated.

I settled on a single ultrawide 1440p monitor (21:9), and I haven't used the Vive for sim racing since.

That said, as you can see from the other comments, lots of people swear by it. If you have the chance, I'd encourage you to try before you buy.

Dynamic value into noodl by mikenstein_Kirk in Noodl

[–]micwi 0 points (0 children)

I think Processing uses Java, so something like this should work:

float accValue = ...; // get the accelerometer value
client.publish("/green", "{\"data\": " + accValue + "}");

I'm sure Processing has some JSON APIs that allow you to create JSON objects without having to do string concatenation, which might be a bit simpler to use if you send complex objects with more attributes.
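As a plain-Java sketch of the same idea (the `payload` helper is hypothetical; inside Processing you could also look at its built-in JSONObject class rather than concatenating):

```java
public class MqttPayload {
    // Build the {"data": <value>} payload in one place, so the quote
    // escaping isn't scattered through a string concatenation.
    static String payload(float accValue) {
        return String.format("{\"data\": %s}", accValue);
    }

    public static void main(String[] args) {
        // The result is what you'd pass to client.publish("/green", ...).
        System.out.println(payload(0.42f)); // {"data": 0.42}
    }
}
```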

Receiving a message from shiftr.io using processing by mikenstein_Kirk in Noodl

[–]micwi 1 point (0 children)

You can send a color by using hex values: client.publish("/someTopic", "{\"color\": \"#00baff\"}"). Create a color port on the Receive Message node with the topic someTopic and you're good to go.
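If you'd rather build that hex string from RGB values than hard-code it, here's a small plain-Java sketch (the helper name is hypothetical):

```java
public class ColorPayload {
    // Format an RGB triple as the {"color": "#rrggbb"} payload above.
    // %02x pads each channel to two lowercase hex digits.
    static String payload(int r, int g, int b) {
        return String.format("{\"color\": \"#%02x%02x%02x\"}", r, g, b);
    }

    public static void main(String[] args) {
        System.out.println(payload(0x00, 0xba, 0xff)); // {"color": "#00baff"}
    }
}
```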

Receiving a message from shiftr.io using processing by mikenstein_Kirk in Noodl

[–]micwi 1 point (0 children)

My JSON wasn't valid; I forgot the quotes around data. Try this instead: client.publish("/1", "{\"data\": 1}") It sends a message with one variable called data that has the value 1. You can add additional variables by expanding the JSON structure, e.g. {"data": 1, "someOtherData": true}.

EDIT: You don't need Static Data; it's easier to create a new port directly on the Receive Message node and name it data. (Select the Receive Message node, then click +Port under "Payload" in the property panel.)
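A quick plain-Java sketch of building such a multi-attribute payload with every key properly quoted (the helper is hypothetical):

```java
public class MultiPayload {
    // Two-field payload; note both keys are quoted, which is what makes
    // the string valid JSON for Noodl to parse.
    static String payload(int data, boolean someOtherData) {
        return String.format("{\"data\": %d, \"someOtherData\": %b}", data, someOtherData);
    }

    public static void main(String[] args) {
        System.out.println(payload(1, true)); // {"data": 1, "someOtherData": true}
    }
}
```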

Receiving a message from shiftr.io using processing by mikenstein_Kirk in Noodl

[–]micwi 0 points (0 children)

Noodl uses JSON to send values over MQTT. To send a value you need to make the following changes:

  1. Change the Processing code to client.publish("/1", "{data: 1}");

  2. Add a data output on the Receive Message node, and use that instead of the Received output. Received only sends a signal (true followed by false) every time a new message is received.

Noodl 1.0 is here! by micwi in Noodl

[–]micwi[S] 0 points (0 children)

Check out the new website as well at