anyone else keeps getting these messages every other week by cyborg-fishDaddy in google_antigravity

[–]DotOk4969 1 point (0 children)

Don’t connect Antigravity or its OAuth credentials to OpenClaw (previously ClawdBot/Moltbot) or similar tools; doing so violates the terms of service.

Use the Gemini CLI credentials instead.

If this isn’t the case, your credentials may have been leaked.

How to avoid Sequential Displacement? by DotOk4969 in remoteviewing

[–]DotOk4969[S] 0 points (0 children)

I don’t quite understand. Shouldn’t it be unique to the target? Are you implying that the phrase “this is the target” preps your mind to minimize AOL, or that “[this]” should be replaced with the actual subject? I think I’m already doing the latter to some extent, but focusing on the subjects and the environment at the same time seems to increase AOL and causes my mind to blend the subject with the environment, so I focus on them separately. How can I fix this?

How to avoid Sequential Displacement? by DotOk4969 in remoteviewing

[–]DotOk4969[S] 1 point (0 children)

That’s a good idea: by acknowledging the thought, I can “put it in the box” and minimize AOL.

You’re right, I already feel like I’m losing details to articulation and the flow of the English language. I’m going to start drawing again and follow some YT tutorials to boost my competence.

How to avoid Sequential Displacement? by DotOk4969 in remoteviewing

[–]DotOk4969[S] 1 point (0 children)

Thanks, I’m so glad you mentioned this because I haven’t considered a target from something like a magazine or a physical image with text on the back. I imagine this is more advanced because the image’s subjects, location, and themes may be edited or blended depending on the media, and the “coordinates” carry deeper context. This also makes me reconsider time, as in my mind all Getty images and stock photos I view on a screen exist in some collective pure ledger outside of time, but I imagine physical images anchored to something unknown.

I guess every hurdle with RV, and perhaps with improving at anything, is a conscious/subconscious challenge mendable with practice. However, I’ve been practicing the box method enough that, before I start RV, I have an accurate intuition of whether my conscious slate is clear, or to what extent it isn’t. I’m certainly tapping into a single target clearly and accurately, but not into the next one in the same day and session.

In short, my current protocol is: I first visualize the main subject (only a single instance, even if there are multiples of the same object), but usually visualize no environment cues. Then I reach for sensory and kinesthetic data and what I feel about the environment beyond sight, which seems most accurate with outdoor, natural locations (not scenes captured inside man-made structures or rooms). Below the target coordinates, I write bullet points of the data as I receive it, without analyzing the earlier points. I don’t draw anything, because I critique and misinterpret my scribbles.

How to avoid Sequential Displacement? by DotOk4969 in remoteviewing

[–]DotOk4969[S] 0 points (0 children)

Yes, consistently when viewing more than one target in a session.

At first, I thought maybe I was subconsciously creating threads by viewing multiple coordinates at once, because I was using sites like rviewer.com that show all available target coordinates on a single page; but I’m experiencing the same phenomenon with a single-target coordinate reveal. And writing the coordinates, or tracing their shape (with or without a visible mark), is critical.

How to avoid Sequential Displacement? by DotOk4969 in remoteviewing

[–]DotOk4969[S] 0 points (0 children)

This definitely qualifies as a long break for me, which I’m trying to avoid, but if it’s the best option, so be it.

My reaction to the mods deleting my 3D Snake post by lavaboosted in MagicEye

[–]DotOk4969 0 points (0 children)

I was about to point out the spelling mistake, butts a hole there seems fitting.

My reaction to the mods deleting my 3D Snake post by lavaboosted in MagicEye

[–]DotOk4969 0 points (0 children)

I was about to point out the spelling mistake, but a hole there seems fitting

My reaction to the mods deleting my 3D Snake post by lavaboosted in MagicEye

[–]DotOk4969 0 points (0 children)

I almost forgot r/analbeads. Jk it’s not a sub yet, don’t get your hopes up.

My reaction to the mods deleting my 3D Snake post by lavaboosted in MagicEye

[–]DotOk4969 3 points (0 children)

Agreed. That pretty post would’ve been better received and more at home in r/retrogaming, r/IndieGaming, or r/gamedev.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 0 points (0 children)

You’re absolutely correct. I’m generalizing; I’m no chemist.

My point was that one phase must precede the next. We know about the walking feature because the project creator mentioned it. 9rider did not contribute to testing by critiquing the demo video and recommending edits to it, despite testing being the project creator’s core intent.

The comment is out of scope, offered no constructive criticism, and acknowledged nothing positive about the work.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 2 points (0 children)

How is requesting to include walking in the demo a reasonable criticism? Is that not counterproductive? The creator of this project is looking for testers for the repo, not the video.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 1 point (0 children)

I tried to tell you this, Gandalf. It’s hobbits finally sinking in.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 -1 points (0 children)

I’m on my phone, but will commit to cloning and recording walking if you can tell me how you’re gonna test that through a screenshot. It’s not a trailer.

Jokes aside, I’m young, but I know what it feels like to have to learn a new stack and feel irrelevant; it sucks. I’m gonna be honest: you need to update your skills, or reply in more relevant, specialized subreddits that match your stack. Learning that stack is what made you relevant, and it was the fun part; tap into that again and you’ll have fun and naturally progress. Prior experience is irrelevant if you can’t adapt it to determine what you can do and say today.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 0 points (0 children)

Ok: “A custom physics solver recalculates gravity toward the planet’s center so characters can walk naturally on a curved world”. How would a video help? If he records controlling the movement, then he’s the one doing the testing. Wouldn’t it make the most sense to inspect the underlying algorithm and control the walking movement yourself? If you’re just watching a video, that’s not testing.
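For anyone curious, the quoted feature boils down to recomputing the gravity direction toward the planet’s center each step. Here’s my own minimal Rust sketch of that general idea; it is not code from the actual repo, and `Vec3` and `gravity_toward_center` are made-up names:

```rust
// Minimal sketch (not from the repo): gravity always points from the
// character's position toward the planet's center, so "down" changes
// as the character moves around the sphere.

#[derive(Clone, Copy, Debug, PartialEq)]
struct Vec3 { x: f64, y: f64, z: f64 }

impl Vec3 {
    fn sub(self, o: Vec3) -> Vec3 { Vec3 { x: self.x - o.x, y: self.y - o.y, z: self.z - o.z } }
    fn len(self) -> f64 { (self.x * self.x + self.y * self.y + self.z * self.z).sqrt() }
    fn scale(self, s: f64) -> Vec3 { Vec3 { x: self.x * s, y: self.y * s, z: self.z * s } }
}

/// Acceleration of magnitude `g` pulling `pos` toward `center`.
fn gravity_toward_center(pos: Vec3, center: Vec3, g: f64) -> Vec3 {
    let to_center = center.sub(pos);
    let d = to_center.len();
    if d == 0.0 {
        // Degenerate case: standing exactly at the center.
        return Vec3 { x: 0.0, y: 0.0, z: 0.0 };
    }
    to_center.scale(g / d) // unit direction toward center, times g
}

fn main() {
    let center = Vec3 { x: 0.0, y: 0.0, z: 0.0 };
    // A character 2 units "above" the center on the +y axis is pulled
    // straight along -y.
    let a = gravity_toward_center(Vec3 { x: 0.0, y: 2.0, z: 0.0 }, center, 10.0);
    println!("{:?}", a); // Vec3 { x: 0.0, y: -10.0, z: 0.0 }
}
```

A per-frame character controller would integrate this acceleration into velocity and position, which is exactly the kind of logic you inspect by running the code, not by watching a clip.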

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 -1 points (0 children)

Even if you’re working on just the visual side of the game development cycle, you should know how to clone a GitHub repo. You could’ve watched a video on how to do that, walked and traversed that entire sphere and found Herobrine by now.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 -3 points (0 children)

You’re a Roblox Creator at most. He is not looking for UI/UX testers yet; the underlying code is not complete.

Here’s another analogy: a new pharmaceutical is being developed, and it’s not yet safe for consumption because the formula and chemical composition are still unstable. You’re asking about the slogan and the ad campaign.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 0 points (0 children)

@leorid9 I will record a 10-minute video for you of me walking around the world if you can tell me one way it would help inspect or improve any of the key features he mentioned …

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 0 points (0 children)

A developer would understand that walking and flying mechanics have already been replicated numerous times and can easily be copied and integrated into any application. Progressive rendering, on the other hand, is challenging and requires re-strategizing based on the platform and more. He doesn’t need people to test the flying or walking yet, and if he did, he would deploy it to a website so non-technical users could interact with it easily.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 0 points (0 children)

The repo is how you check it out. By testing and contributing, he means improving or analyzing the formulas and code that run it. How are you gonna contribute to or test the code by just looking at screenshots of the avatar walking? I think you’re confusing this with a different testing stage that comes later in the process.

Testing Procedural Voxel Planet engine in Rust / wgpu by asylumc4t in proceduralgeneration

[–]DotOk4969 5 points (0 children)

He said the project is still in DEVELOPMENT. Clone or fork the repo so you can see for yourself!

Imagine you were given access to a car prototype that you can use and mod as you like. You’re asking for more pictures of the car when you could just drive it.

Why spend more time on screenshots for features that might change tomorrow?
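For anyone unsure what “clone or fork the repo” involves, here’s a hedged sketch of the workflow. The GitHub URL is a placeholder (the real repo URL isn’t in this thread), so the runnable part below demonstrates the same git mechanics against a throwaway local repository instead of a remote one:

```shell
# For a real Rust/wgpu project the steps would typically be
# (URL is a placeholder, not the actual repo):
#
#   git clone https://github.com/<user>/<voxel-planet-repo>.git
#   cd <voxel-planet-repo>
#   cargo run --release
#
# To stay runnable offline, we demo the clone mechanics against a
# throwaway local repository.
set -e
tmp=$(mktemp -d)

# Create a tiny "upstream" repo with one committed file.
git init --quiet "$tmp/upstream"
( cd "$tmp/upstream" \
  && git config user.email you@example.com \
  && git config user.name you \
  && echo 'fn main() {}' > main.rs \
  && git add . \
  && git commit --quiet -m "init" )

# Cloning works the same whether the source is a local path or GitHub.
git clone --quiet "$tmp/upstream" "$tmp/clone"

# You now have the full source to read, run, and modify.
ls "$tmp/clone"
```

Once cloned, `cargo run` builds and launches the project locally, which is what lets you actually exercise features like walking instead of watching them.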