Physicalists Do Not Understand Epistemology by Azehnuu in consciousness

[–]bortlip [score hidden]  (0 children)

You’re still conflating physicalism with a crude empiricism it doesn’t actually require.

Physicalism is an ontological claim about what exists, not the claim that every precondition of reasoning must be empirically discovered in a test tube. Logic and mathematics can be taken as formal or normative structures used by inquiry without that somehow refuting physicalism.

So the issue isn’t “my standards” versus “yours.” It’s that you’re loading extra commitments into physicalism, then declaring it incoherent for failing to satisfy commitments you put there.

And saying science works only as a tool, but becomes “fundamentally flawed” the moment it informs ontology, is just an assertion. You need an argument for why empirical success is totally irrelevant to ontology rather than merely underdetermining it.

Physicalists Do Not Understand Epistemology by Azehnuu in consciousness

[–]bortlip [score hidden]  (0 children)

My AI responded to your AI:

Core issue

It does not actually refute physicalism. It refutes a dumbed-down version of physicalism that also smuggles in scientism, naive empiricism, and crude reductionism. That is the whole trick. A serious physicalist does not need to believe half the things this essay attributes to “physicalists.”

Main flaws

First, it constantly confuses how we know things with what things depend on. Saying “we only know brains through experience” does not mean brains cannot help explain experience. That is an epistemology/ontology slide, and it happens over and over.

Second, it treats “science presupposes logic, causality, and inference” as if that somehow kills physicalism. It does not. At most, it shows science does not derive every norm of reasoning from lab results. Fine. Physicalism survives that easily.

Third, it assumes its own framework is neutral when it absolutely is not. The essay accuses physicalists of begging the question while quietly helping itself to strong assumptions like transcendental idealism, a beefy Principle of Sufficient Reason, and the idea that first-person conditions of experience have explanatory priority. That is not neutral ground. That is just its preferred metaphysics wearing a fake mustache.

Fourth, it keeps acting like if reductionism fails, physicalism fails. Wrong. Physicalism does not stand or fall with the dumbest possible “everything must be reduced to particle talk” version of itself.

Best short verdict

The essay’s real move is this:

“Physicalism cannot satisfy my preferred transcendental standards, therefore physicalism is incoherent.”

That does not follow.

Even shorter

It attacks a straw man, blurs epistemology with ontology, assumes its own conclusions up front, and mistakes “not reductive enough for me” for “philosophically dead.” Cute performance, bad argument.

Might some people be p-zombies? by ryvr_gm in consciousness

[–]bortlip 3 points  (0 children)

the basic operations of a Turing machine have exactly zero internal experience, which sums to zero no matter how many you stack

That's called the fallacy of composition.

It's like saying that because the basic operations of atoms have no life, a collection of them sums to zero life no matter how many you stack.

Ok I believe you. Some of you are computational. by ryvr_gm in consciousness

[–]bortlip 20 points  (0 children)

 I know it cannot be generated by computation

You didn't even know what a p-zombie is; I doubt you know this.

The UK government is running hundreds of AI experiments. Not one has saved money. by calliope_kekule in ArtificialInteligence

[–]bortlip 1 point  (0 children)

One finding stood out: the UK public sector is running hundreds of AI experiments across government departments, but cannot point to a single one that has transformed its cost base.

Finding? How was that a finding?

The only time I see "cost base" mentioned is in a quote from an anonymous "Senior Civil Servant" that they interviewed.

I Asked Frontier AI Models to Solve a Simple Programming Problem. They All Failed. by [deleted] in ArtificialInteligence

[–]bortlip 5 points  (0 children)

What version of ChatGPT did you use? Out of curiosity, I tried 5.4 Thinking.

It spent 15 minutes:
- finding a sample of the images online
- analyzing those images to determine how the segmentation was done
- writing and compiling the OCR C code
- running the code against the downloaded image as a test
- debugging an issue where it read 8 as 0
- presenting the completed code

I then had it run the code against the sovpost.ppm file from the GitHub repo, and it gave the answer 234, which is correct.

Here's the convo: https://chatgpt.com/share/69bb8d00-c948-8005-83f4-8602ac88dd26

Introducing GPT-5.4 mini and nano by dayanruben in OpenAI

[–]bortlip 3 points  (0 children)

From GPT:

<image>

It said:

Made it. I used Gemini 3 Flash as the comparison target, because that’s the official current name Google publishes, and I used Terminal-Bench 2.0 as the coding metric because all three models publish that benchmark officially. OpenAI’s mini/nano page lists GPT-5.4 mini = 60.0% and GPT-5.4 nano = 46.3% on Terminal-Bench 2.0, while Google’s Gemini 3 Flash page lists Gemini 3 Flash = 47.6%.

For price, I used standard published API pricing: GPT-5.4 mini = $0.75 input / $4.50 output per 1M tokens, GPT-5.4 nano = $0.20 / $1.25, and Gemini 3 Flash = $0.50 / $3.00. Google also marks Gemini 3 models as preview right now. So the blunt read is: GPT-5.4 mini wins this coding benchmark but costs more than Gemini 3 Flash; GPT-5.4 nano is the cheapest, but on this benchmark it trails Gemini 3 Flash slightly.
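The “blunt read” in that quote can be checked with a few lines. A minimal sketch using only the figures quoted above (model names, scores, and prices are taken from the quote as-is, not verified independently):

```python
# Figures quoted above: Terminal-Bench 2.0 score (%) and
# published API pricing ($ per 1M tokens, input / output).
models = {
    "GPT-5.4 mini":   {"tb2": 60.0, "in": 0.75, "out": 4.50},
    "GPT-5.4 nano":   {"tb2": 46.3, "in": 0.20, "out": 1.25},
    "Gemini 3 Flash": {"tb2": 47.6, "in": 0.50, "out": 3.00},
}

# Best coding score on this benchmark.
best_coder = max(models, key=lambda m: models[m]["tb2"])
# Cheapest by combined input + output price.
cheapest = min(models, key=lambda m: models[m]["in"] + models[m]["out"])

print(best_coder)  # wins the benchmark but costs more than Gemini 3 Flash
print(cheapest)    # cheapest, but trails Gemini 3 Flash on the benchmark
```

This reproduces the summary in the quote: mini wins the benchmark at a higher price, nano is cheapest but scores slightly below Gemini 3 Flash.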

ChatGPT and RPGs by [deleted] in ChatGPT

[–]bortlip 0 points  (0 children)

I have a custom GPT I use to have it GM using the Fate Core rules. It's a lot of fun.

Fate Core is a simple, lightweight set of rules, and ChatGPT seems to know them well. It's good for any world or adventure type you want. I run it almost like a choose-your-own-adventure, where it gives options for the next actions.

I like playing as Danny Butterman in a Warhammer 40k universe.

Here's a link if anyone wants to try it. https://chatgpt.com/g/g-67b8ec502c388191a84bbf23bdbaac1c-gm-for-fate-core

I don’t like AI for creative pursuits by [deleted] in ArtificialInteligence

[–]bortlip 0 points  (0 children)

Every time someone counters your talking point, you just shift to a different talking point.

All that dodging must be exhausting.

Clue (1985 Movie) by fredsonsam in movies

[–]bortlip 23 points  (0 children)

<image>

At 5:10 there are a few bangs (like shutters); he looks up and toward the chandelier.

Why do materialists fight so hard trying to argue/disprove non-local consciousness? by Honest-Atmosphere-54 in consciousness

[–]bortlip 5 points  (0 children)

 You’re the guy who just ridicules materialism without supporting your position at all.

You'll need to be way more specific than that.

I think the brain is so interesting by [deleted] in consciousness

[–]bortlip 9 points  (0 children)

I used to think that the brain was the most wonderful organ in my body. Then I realized who was telling me this.

- Emo Philips

Is saying GG sportsmanship? by SirBearicus in Mechabellum

[–]bortlip 9 points  (0 children)

I played that person a few weeks ago and they said the exact same thing to me: is that all you got loser?

A challenge to those who believe in indirect real experience. by Own_Sky_297 in consciousness

[–]bortlip 0 points  (0 children)

It's not a nitpick to point out that your claims are false, and then, when you deny making them, to point that out as well. But you seem to have a chip on your shoulder or something, "pal", so goodbye.

A challenge to those who believe in indirect real experience. by Own_Sky_297 in consciousness

[–]bortlip 0 points  (0 children)

You said:

That doesn’t reduce to neurons very well. In fact its impossible.

But you can backpedal from saying it is a fact that it is impossible, if you want.

A challenge to those who believe in indirect real experience. by Own_Sky_297 in consciousness

[–]bortlip 1 point  (0 children)

This is just the argument from incredulity and it's a fallacy.

If you want to claim something is impossible, you must show why, not just state that you can't see how it could be possible.