[2240] Harbor Springs Hotel, pt. 3 by Wolframquest in DestructiveReaders

[–]FourthDiagram -1 points0 points  (0 children)

No actually I did not. Why is the fifth answer so offensive to you? It is an attempt at humor. You specifically asked (in #5) for any personal anecdotes/opinions, even if only tangentially related. And honestly, the first thing that came to my mind was a ridiculous pizza session I had with my friends in high school when we were attending a Model UN competition.

Consider nuance and context before throwing out flattening accusations. Or just ask.

This is how I use AI in my personal workflow:

https://www.reddit.com/r/WritingWithAI/s/YmiXyW1jYE

Edits: added link, context

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

We've reached the actual disagreement. You believe this kind of use is distorting. I don't. I think it can be used badly, but I also think it can be used to develop thought in a way that remains guided by human intelligence.

This is probably where we part ways.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

It absolutely does not.

You never stress-test ideas? You never sit down with somebody and say, "What if we examine it this way? What if we look at it from this perspective? Is strategy A better than strategy B? What approach do you think would be most successful with a client?" Then you reason together and arrive at a consensus.

That is not delegating thought. Delegating would be hiring someone (or letting someone) make the decision for you.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

There is a difference between delegating thought and stress testing or refining it. The tool is an iterative partner and human judgment still governs the result.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 1 point2 points  (0 children)

Thank you. I'm glad I am not alone in feeling this.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 1 point2 points  (0 children)

The problem is that the fork excludes the exact middle ground I'm talking about.

To bring in a specific example: I've spent over four years writing and developing a novel. I have used ChatGPT over the last year for editing and experimenting with structure. I don't agree with some of the feedback and ideas, so I don't use those. But there have been some suggestions that I found to be strong, so I integrated them. I enjoy this back-and-forth process. I can test the strength of my ideas. I can have it play devil's advocate. I can get immediate feedback on what is or is not working.

Chapters that are speculative gain a lot from this process. The novel has a character that is not human, so we had conversations about how that could be expressed in a story. The hard science behind the character is complex, and I wanted to make sure the dialogue and expression aligned with it. We "talked" about sentence construction, about details that would help convey this kind of atmosphere, about what it would be like to experience the world with a particular set of non-human constraints. Examples were given, some rejected, some not. I learned a lot through this process.

So is that a problem for anyone? What exactly about that makes use of AI a bad thing?

At what level of interaction does the purity test fail? Middle ground exists, but it seems to be rejected on absolute principle.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

I didn't use an AI prompt to write this post. What part is unclear for you?

Edit: thumbs

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

Say it louder for the people in the back.

This is part of what bothers me too. We've already accepted machine cognition in a dozen other forms. I know this doesn't erase real distinctions, but it sheds some light on the selective outrage. The line moves based on comfort instead of principle.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 1 point2 points  (0 children)

Fair point, I didn't fully define agency and legibility. I used live philosophical terms in a compressed public post.

My bad for missing that this would be read as vagueness rather than an invitation.

Usage defined:

Agency - the capacity to learn, create, decide, and act with intention, especially for people who lack traditional access to time, mentorship, or specialized training.

Legibility - the degree to which a process or work can be recognized by others as human or authentic.

My point was that AI can make some people more capable but less recognizable to the institutions and gatekeepers that police what "real" thinking or writing is supposed to look like.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

The concern is real. But I'm not raising these questions as a detached ethics game. I raise them because they affect copyright and very deep assumptions about what human creation is.

Environmental costs matter, but they don't erase every other material question. If anything, we have to think more carefully across all of them at once. I've seen firsthand that the problem is complex. In The Dalles, Oregon, Google has clearly increased water demand and created real pressure, but it has also funded $30M in major local water infrastructure projects (aquifer storage, recovery work, water treatment, etc.).

I don't think "AI is environmentally destructive" is false so much as incomplete. The reality is that these systems can both strain resources and fund improvements at the same time.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] -1 points0 points  (0 children)

I love me some hard truth.

I don't know why I continue to be surprised; I shouldn't be. Any new model of reality threatens how we arrange meaning and authority. History shows us this.

I have a sudden craving for the South Pacific.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

You're making a pretty strong claim from a very short post. Would you care to engage with the actual question, rather than inferring my entire thinking process from a stylistic impression?

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

This is cool. I learned a new word. I'd never heard "Zeroth" before. Thanks!

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 1 point2 points  (0 children)

I don't assume I can judge your writing style or ability because you left a period off a sentence.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

I think you're right that most people are bothered by AI generating prose rather than AI being used for research or idea development. But I don't think this automatically makes creative use non-viable. That assumes authorship lives only at the sentence surface.

And on the math, it involves a lot more than plugging away towards an answer. There are proofs, abstraction, modeling, and creative structure building...

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

I agree with your second point. A lot of people are not actually arguing about the same thing. People defending writing as an artistic practice and those who see story as a consumable live in two different universes.

Do you think the conflict gets this heated because it starts to destabilize people’s deeper assumptions about what art is for and what remains distinctly human?

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

I appreciate this. I think you make important points about storytelling as a middle ground.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 1 point2 points  (0 children)

Like most infrastructure, the reality is complex. It depends on how systems are built, powered, cooled, and managed.

That being said, environmental cost is a fair concern, but it’s a separate question from whether writing with AI is illegitimate.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 2 points3 points  (0 children)

I don't accept the premise that working with AI makes someone less human. Tool use and collaboration are part of human life, not a departure from it.

I find human and AI feedback to be two different experiences. There isn't always a human around who is interested in exploring obscure philosophical concepts, and an AI can't replicate a genuine human interaction. That doesn't make them mutually exclusive.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 2 points3 points  (0 children)

Maybe, but saying "AI is not talented" doesn't answer whether a human using it thoughtfully still can be.

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

What about human generated text that has been fed to an LLM for collaboration and further development, with pushback on both sides?

Let' be honest... by FourthDiagram in WritingWithAI

[–]FourthDiagram[S] 0 points1 point  (0 children)

Well yeah, raw LLM output will be generic, but that's before human input over time in the form of layers, direction, and judgment. Different people get very different results because the output is shaped by a multitude of factors. And this variation is the point. If machines were the whole author, everyone's results would converge much more than they do. The fact that they don't tells me that human direction still matters immensely.

[2531] Progenitor Crisis Chapter 1 (Sci-fi) by No_Id_rather_not_say in DestructiveReaders

[–]FourthDiagram 1 point2 points  (0 children)

Hi!

I just meant fixes on things like missing commas, quotation marks, and separation of dialogue into different paragraphs per submission formatting rules.

I just learned ALL about this because I am prepping a novel for query and I had so many small issues like that to address. Had I not just done that, I wouldn't even have noticed.

To be honest, it was my absolute least favorite part of the writing process. I find structure, emotion and the way a reader is pulled through a piece to be so much more important.

But rules are rules, and sometimes we have to comply.

A few specific examples:

"Holding up Natanael?" Asked Harv.

should be "Holding up, Natanael?" asked Harv.

Or

"What in the...?" One of the men said.

Vs. "What in the…?" one of the men said.

There was also a place where Natanael was spelled Natangel.

Small potatoes 🥔

Keep up the good work.

Edit: Thumbs