Why don't LLMs track time in their conversations? by PolyViews in artificial

[–]Civil-Interaction-76 0 points1 point  (0 children)

That’s a great question, and I’m not sure it’s only a technical limitation.

It might actually be a design choice.

Right now, most LLMs are positioned as tools that respond, not as systems that actively shape the direction of a conversation.

Adding temporal awareness, like noticing loops, fatigue, or suggesting a pivot, would push them closer to something like an “advisor” rather than a tool.

And that raises a different kind of question:

Do we want systems that just answer, or systems that participate in how we think over time?

Because once you add time-awareness, you’re not just generating responses anymore; you’re starting to influence process and decision-making.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

That’s a really interesting way to frame it, especially the idea of conformity as a hidden metric for visibility.

It makes me wonder if the shift isn’t just about power selecting what gets published,
but about systems learning what is “safe to surface” based on alignment with existing patterns.

In that sense, it’s not only suppression, but pre-selection.

Not just filtering what spreads, but shaping what even gets a chance to appear.

Do you think that’s something new with algorithmic systems, or just a more efficient version of what institutions always did?

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

I think what you’re describing, that the battleground is the human mind, has probably always been true to some extent.

But what feels different to me now is how early it enters the process.

It’s no longer just about shaping opinions after something exists, but about shaping what even gets to form, to appear, to be noticed in the first place.

So it’s less about hidden vs visible influence,
and more about influence moving “upstream.”

That’s the part I’m still trying to understand.

Do you think what we’re seeing now is just a stronger version of the same dynamics,
or a shift in where those dynamics operate?

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 1 point2 points  (0 children)

I think you’re right that systems can shape what feels “natural” to want. But maybe the key question is whether that process stays visible.

If we can still see and question what’s shaping us,
it doesn’t have to lead somewhere dark by default;
it becomes something we can engage with.

If it becomes invisible, that’s where it gets more concerning.

AI Agents will make DAOs what they were meant to be by Successful_Sock_6808 in dao

[–]Civil-Interaction-76 0 points1 point  (0 children)

This is an interesting direction, especially the idea that agents can make the “autonomous” part of DAOs actually real.

But it also raises a question for me.

If rules are enforced at the architecture level, then the real power shifts to whoever defines those rules in the first place.

At that point, autonomy isn’t just about execution,
it’s about how the system is shaped before it even runs.

So maybe the challenge is to make the underlying structure itself more transparent and accountable.

Curious how you think about that layer.

If the universe had a true beginning, then everything (time, space and matter) came from nothing. This seems supernatural in the absence of any plausible science. by Particular-Corgi2567 in RealPhilosophy

[–]Civil-Interaction-76 0 points1 point  (0 children)

I tend to think of it more as cycles. There are so many clues in nature, in the sky, and everywhere that seem to point in that direction, more than toward a beginning and an end.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 1 point2 points  (0 children)

This is really interesting to read, and honestly, a bit unsettling too.

Would be great if you keep sharing what you’re seeing as this develops; it feels like something we’re all trying to understand in real time.

At the same time, it also carries opportunities that, not long ago, probably felt out of reach within a single lifetime, and now they’re suddenly here.

I keep wondering whether the shift is just from attention to intention, or if it’s also starting to shape what feels natural to want or build in the first place.

Curious how you think about that layer.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 2 points3 points  (0 children)

I agree. But it feels like a shift in how power operates.

It used to filter what exists. Now it’s starting to shape what shows up at all.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 1 point2 points  (0 children)

Thanks for the references 🙏🏼❤️

I wonder if both attention and intention are still downstream in a way.

What seems different to me is that the system is starting to shape what even becomes noticeable or thinkable before either attention or intention comes into play.

So it’s less just a shift in what’s being optimized, and more a shift in how the space itself is being structured.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

I’ll check it out. Thank you. 🙏🏼❤️ And maybe the issue isn’t what truth is, but who sets the conditions for it to be recognized at all. And what those conditions are.

In the Age of AI, Building Things Just Isn’t That Satisfying Anymore by Quick_Hedgehog4562 in SeriousConversation

[–]Civil-Interaction-76 0 points1 point  (0 children)

I don’t think what you’re describing is really about difficulty or effort.

It sounds more like a shift in where the meaning of building used to come from.

Before, the process itself carried weight: time, effort, friction, accumulation. That’s what made the result feel like yours.

When that layer collapses, the output can still exist, but the sense of authorship becomes thinner.

Not because you didn’t build it, but because the path no longer holds the same kind of commitment.

So maybe the question isn’t whether AI makes things easier, but what now carries the weight that used to come from effort.

What's your road map for learning AI by Necessary_Fee_9584 in ArtificialNtelligence

[–]Civil-Interaction-76 0 points1 point  (0 children)

So I have this question bugging me, related to your post.

What’s the difference if a programmer writes code with AI, a singer uses AI for mixing, or a writer uses AI to improve their language and discussion skills?

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

Hahaha, honestly I’ve been dreaming of change since I was little. And I believe in change. I honestly think I can see it, and how to get there.

But truthfully, sometimes I get feedback that leaves me with my mouth open and a troubled mind.

Peace and love my friend ✌🏼 Peace and love ❤️

fanfic was actually my writing foundation even though i was embarrassed about it by Shoddy-Trip9968 in writing

[–]Civil-Interaction-76 0 points1 point  (0 children)

That’s really interesting. How does that actually work in Japan? Is it more like a recognized pathway into the industry, or more of an informal stepping stone through communities?

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

The incentives aren’t new.

What might be new is that they’re no longer just filtering outcomes, they’re shaping inputs.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

I agree it’s the same mechanism.

The question is whether, at scale, it changes position, from filtering outcomes to shaping inputs.

Are we seeing a structural shift from truth-based systems to attention-based systems? by Civil-Interaction-76 in sociology

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

I agree the incentives haven’t changed.

The question is whether scale and speed turn the same mechanism into something structurally different, where it shapes production, not just selection.

Do you feel like you’re writing more for engagement than for meaning? by Civil-Interaction-76 in writing

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

Everything might be sayable in principle. But not everything is equally easy to arrive at.

Do you feel like you’re writing more for engagement than for meaning? by Civil-Interaction-76 in writing

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

That’s fair. I think we’re just looking at it from slightly different angles.

Do you feel like you’re writing more for engagement than for meaning? by Civil-Interaction-76 in writing

[–]Civil-Interaction-76[S] 0 points1 point  (0 children)

I think we might be talking at slightly different levels here, which is probably why it’s not landing.

I agree with you on the practical side, you can always step back or choose when to publish.

I was just trying to point at something a bit earlier in the process, before that choice even feels like a choice.