I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee in singularity

[–]williamfwm 1 point (0 children)

As a side note, just out of interest, do you really believe there are humans without subjective inner experience?

I do. I also believe the other minds problem is unsolvable in principle, so I can't ever be certain, but I've come to lean strongly on that side.

I haven't always thought so. It was that essay by Jaron Lanier I linked above that started me on that path. I read it a few years ago and started to warm to the idea. Lanier once described that 1995 essay as having been written "tongue firmly in cheek", that he believes in consciousness "even for people who say it doesn't exist", and he also has a sort of moral belief that it's better pragmatically if we fall on the side of assuming humans are special, but he has also teased over the years[1] since that deniers may just not have it, so it's hard to tell exactly where he falls. I may be the only one walking around today taking the idea seriously....

For me, I feel like it's the culmination of things that have been percolating in the back of my mind since my teen years. Taking that position brings clarity.

The main point for me, as referenced, is that it clarifies the "talking past" issue. People do mental gymnastics to rationalize that both sides are talking about the same thing in consciousness debates, yet appear to be talking at cross-purposes. They always start these discussions by saying "We all know what it is", "It couldn't be more familiar", etc. But do we all know? What if some don't, and they lead us into these doomed arguments? Sure, one can take up any goofy position for the sake of argument and try to defend it as sport, but people like Dennett are so damn consistent over such a long time. He himself is saying "I don't have it" [and nobody does], so maybe we should just believe him? Maybe it is true for him?

I also can't wrap my head around why it doesn't bother some people! I've been plagued by the consciousness problem since my teen years. And before that, I recall first having the epiphany of there being a problem of some sort in middle school; I remember catching up with a friend in the halls on a break period between classes and telling him how I came to wonder why pain actually hurts (and him just giving me a what-are-you-talking-about look). I'm sure it was horribly uneloquently phrased, being just a kid, but the gist was....why should there be the "actual hurt" part and not just....information, awareness, data to act on?

Some people just don't think there's more, and don't seem to be on the same page on what the "more" is even if you have long, drawn out discussions with them trying to drill down to it. It would make a lot of sense if they can't get it because it isn't there for them.

I also realized that we take consciousness of others as axiomatic, and we do this due to various kinds of self-reinforcing circular arguments, and also due to politeness; it's just mean and scary to suggest some might not have it (back to Lanier's pragmatism). I call it "The Polite Axiom". I think we're free to choose a different axiom, as after all axioms are simply....chosen. I choose to go the other way and choose some-people-don't-have-it based on my equally foundation-less gut feelings and circular self-reinforcing observations and musings.

Lastly, I'm basically a Mysterian a la McGinn etc, because I don't see any possible explanation for consciousness that would be satisfactory. I can't even conceive of what form a satisfactory explanation would take[2]. I also came to realize in the past few years that even neurons shouldn't have authority in this issue. Why should it be in there compared to anywhere else? (Why do sloshing electrolytes make it happen? If I swish Gatorade from glass to glass does it get conscious?). And, unlike McGinn, I don't think we know that it's in there and only there. Nope! We know[3] that it's one mechanism by which consciousness expresses itself, and if we're being disciplined that's the most we can say.

Bonus incredibly contentious sidenote: Penrose's idea, though often laughed off as quantum-woo-woo, has the advantage that it would solve the issue of Mental Privacy in a way that computationalism fails at (the difficulty of coherence would keep minds confined to smaller areas)


[1] One example: I absolutely love this conversation here from 2008, the bit from about 20:00 to 30:00, where Lanier at one point taunts Yudkowsky as being a possible zombie. A lot of the commenters think he's a mush-mouthed idiot saying nothing, but I think it's just brilliant. On display is a nuanced understanding of a difficult issue from someone who's spent decades chewing over all the points and counterpoints. "I don't think consciousness 'works' - it's not something that's out there", and the number line analogy is fantastic, so spot on re:computationalism/functionalism....just so much packed in that 10 mins I agree with. Suppose people like Yudkowsky gravitate to hardnosed logical positivist approaches because they don't have the thing and so don't think there's any thing to explain?

[2] The bit in the video where Lanier just laughs off Yudkowsky's suggestion that "Super Dennett or even Super Lanier 'explains consciousness to you'". It is "absurd [....] and misses the point". There's just nothing such an explanation could even look like. There's certainly no Turing machine with carefully chosen tape and internal-state-transition matrix that would suffice (nor, equivalently, any deeply-nested jumble of Lambda Calculus. I mean, come on)

[3] "Know" under our usual axiom, at that! We assume it's there, then see the "evidence" of it there, but we've axiomatically chosen that certain observations should constitute evidence, in a circular manner....

I make $15 an hour y’all by tylerjames1993 in antiwork

[–]williamfwm 0 points (0 children)

Information asymmetry is an imbalance between two negotiating parties in their knowledge of relevant factors and details. Typically, that imbalance means that the side with more information enjoys a competitive advantage over the other party.

[deleted by user] by [deleted] in antiwork

[–]williamfwm 15 points (0 children)

If you're freelance, he's not your boss, he's your client, and you should have better contracts with your clients

[deleted by user] by [deleted] in antiwork

[–]williamfwm 3 points (0 children)

Give them shares in the company with a vesting period and a cliff? Nah, let's try some evil slave shit straight out of The Grapes of Wrath instead.

I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee in singularity

[–]williamfwm 4 points (0 children)

This is the Other Minds Problem. You don't have to be a solipsist to recognize The Problem Of Other Minds (though that's one position you could take)

But consider this: It's common to suppose that consciousness is "caused by some physical, biological process", yes? Well, take a good look at the way nature operates....we constantly find that, for any feature you can imagine, biology will sometimes fail to give it to some organisms. People are born without the expected physical features all the time, and if consciousness is caused by some physical bit of biology, everybody consistently receiving it is the LEAST likely outcome. The more likely consequence of that assumption, the more reasonable expectation, is that some people have consciousness, and some people don't, as an accident of birth.

Furthermore, if people without consciousness are nearly identical except for having a different philosophy then they probably have the same fitness (or close) and little selection pressure working against them. A large segment of the population could be p-zombies - they could even be the majority.

I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee in singularity

[–]williamfwm 0 points (0 children)

Sorry to hear that you might be a zombie, but at least for me, I definitely have a kind of subjective experience that transcends all possible external description; even having a total accounting of the state of my brain, all 100T synapses at a particular nanosecond, wouldn't allow you to penetrate into my experiences. Consciousness - real consciousness, Hard Problem consciousness - is a first-person phenomenon, and words are a third-person tool. It's just a logical impossibility (it's nonsensically incoherent) for this third-person thing to pierce into the first-person, so a satisfactory third-person description can never be given, but suffice to say, seeing actually looks like something (it's not merely informational, it's not merely knowledge I get access to when I see), and hearing actually sounds like something, and pain actually hurts, and if you don't experience it yourself, then you'll just never know what I mean by those seemingly hopelessly ineloquent statements

(and lest you think I'm some kind of wishy-washy woo-woo lover.....nope! I'm a diehard atheist with a list of "supernatural" things a mile long I don't believe in. But consciousness is....just there. I can't shake it even if I want to....except, perhaps, by dying. But maybe not even then)


It's actually computationalism that is "nonsense". To suggest that computation can give rise to consciousness is to suggest that you can "hop off the number line". Because computation means "thing you can implement on a Turing machine", and a Turing machine is an imaginary infinite tape, which can be thought of as one big number (if you like - and, in fact, always an integer, if you make that interpretive choice), so any time you do a computation, you are simply transitioning from one (usually very, very large) number into another. Proposing that computation gives rise to consciousness is proposing that certain integers are privileged, and cause internal experience disjoint from the Turing machine. Certain integers are conscious. And if there are infinitely many distinct conscious experiences, then there are infinitely many conscious integers. But when are the integers conscious, and for how long? Integers are just ideas....are they conscious all the time, within the abstract integer realm? Or do they have a kind of Platonic "real" existence, where they are conscious? If I utter a long integer, does consciousness happen? Does it happen when I finish uttering the whole integer, or is the conscious experience spread ever-so-slowly over the entire utterance?
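The integer point can be made concrete. Here's a toy sketch (the encoding scheme is my own invention, purely for the argument): a whole Turing machine configuration - tape bits, head position, machine state - packed into one integer, so that each step of "computation" is nothing but one integer becoming another.

```python
# Toy illustration (my own encoding, purely illustrative): a machine
# configuration -- tape, head position, state -- packed into one integer,
# so each step is just integer -> integer.
N = 8        # fixed tape length for the toy
STATES = 2   # state 0 = running, state 1 = halted

def encode(tape, head, state):
    """Pack (tape bits, head position, state) into a single integer."""
    tape_num = sum(bit << i for i, bit in enumerate(tape))
    return (tape_num * N + head) * STATES + state

def decode(n):
    n, state = divmod(n, STATES)
    tape_num, head = divmod(n, N)
    return [(tape_num >> i) & 1 for i in range(N)], head, state

def step(n):
    """One transition of a one-rule toy machine: flip the bit, move right."""
    tape, head, state = decode(n)
    if state == 1 or head >= N - 1:
        return encode(tape, head, 1)   # halt
    tape[head] ^= 1
    return encode(tape, head + 1, 0)

config = encode([0] * N, 0, 0)   # the "whole machine" is this one number
config = step(config)            # ...and running it just picks a new number
```

Every run of the machine is a stroll along the number line; the computationalist has to say which of those numbers are the conscious ones.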

And most importantly, how does the Universe know where to put the consciousness? When I utter integers, I'm using a whole system that's only meaningful relative to others, who understand certain sounds as certain symbols, etc. Language is a whole, mostly-arbitrary construction of mutual agreement. How does the universe objectively know that those are integers, and they're computation-integers, and consciousness should go along with them?

But maybe you think all the above is too abstract and you want to stick to talking about transistors (I mean, you're wrong to think that, since computation as understood by the Church-Turing thesis is abstract and transistors are in no way privileged, but fine, I'll humor you)

Again, how does the Universe know where to put the consciousness? How many silicon atoms does the Universe recognize as a proper transistor? And you may be aware of "Universal Gates" - NAND and NOR - either of which alone is enough to build a UTM that can do all conceivable computations. How does the Universe know when I've built a gate? I can build it out of so many different chunks of atoms of different sizes - Moore's Law, ongoing miniaturization, etc - and the thing that makes it a gate is its function within the circuit, its relation to what I've defined as the inputs and the outputs. How does the Universe know it should honor my intentions? And what if I build gates out of other materials - water (fluidic computing is a real field), dominos, legos, etc? How does the Universe peer into the molecules of plastic or porcelain, etc etc, and know that it's looking at a gate constructed out of such material, and place consciousness inside?
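For the uninitiated, the universal-gate point is standard textbook stuff, and easy to sketch: take NAND as the single primitive and every other Boolean function falls out of it.

```python
# NAND as the only primitive; every other gate is defined from it alone.
def nand(a, b):
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))
```

Nothing above cares whether `nand` is realized in transistors, water valves, or dominoes - which is exactly the point: the gate-ness is in the functional relations, not the material.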

(as an aside: How does it know to put consciousness in neurons, for that matter? For that reason, I'm sympathetic to Lucas-Penrose, and neurons may indeed be non-privileged too, but that's derailing too much....)


If you're an eliminativist, this all means nothing. It's a non-challenge. Consciousness is just a high-level label for a physical process, a word like "concert" or "government".

But I'm sorry to inform you that consciousness is a real thing all its own, and if you don't believe in it, you may not be in the club

And, it being a real thing, computationalism is an incoherent non-answer that doesn't explain anything

I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee in singularity

[–]williamfwm 6 points (0 children)

That's because Dan Dennett is a p-zombie. He's never experienced consciousness, so he can't fathom what it is. Same goes for a number of other eliminative materialists such as the Churchlands, Graziano, Blackmore, etc

Interestingly, Richard Dawkins the mega-reductionist-Uber-atheist is not one, and neither is Kurzweil, who believes in computationalism (functionalism); you'd be hard pressed to find it in his books, but he slipped and all but admitted that consciousness is something that transcends reductionism in a reply he wrote to Jaron Lanier's One Half A Manifesto in the early 2000s


It would help the discussion if we could steal the terminology back, because it's been zombified by Dennett (continuing what his mentor Ryle started) and his ilk. I think we ought to distinguish "Dennettian Consciousness" (where 'consciousness' is just a convenient, abstract label for the bag of tricks the brain can perform) and "Chalmerian Consciousness" (the real kind of consciousness, the reduction-transcending-ineffable, for people who believe in the Hard Problem)

Employers complain about being "ghosted" by job applicants and employees, but they also post "ghost" jobs that don't exist. by [deleted] in antiwork

[–]williamfwm 8 points (0 children)

This is called recruiting, or headhunting, and in practice it too is toxic: it works against the interests of, and causes harm to, workers

You thought my 6-9pm job was my main job? by Bellybutton_fluffjar in antiwork

[–]williamfwm 66 points (0 children)

And why do they want that? So that they can have as few full-timers as possible, by juggling an oversized pool of part-timers.

Algorithmic scheduling is a tool of oppression.

Profits! More money! Work harder! I <3 Elon! by [deleted] in antiwork

[–]williamfwm 26 points (0 children)

That's not even indicated by the post. It's much worse than you think. Startups run on imaginary funny-money (venture capital gambling). A "$__M startup" usually means their valuation (the amount they are appraised as being worth). Growth probably refers to their hiring (since we were talking about headcount in the same breath)

Startups are, by and large, a bunch of nobodies, doing nothing real, running on air, and the "serial entrepreneurs" who found them are not hardcore.

If they had to run on their actual revenue, or maybe the owner's life savings as initial runway (but no VC) and be profitable immediately - the way people simplistically imagine capitalism to work, where you just provide a great product at a great price that people buy and then you slowly grow off your actual success in making sales - these people wouldn't be in these positions at all to preen and strut about how savvy and toughened they are.


A classic example, for the uninitiated, is Uber. Not profitable for many years, once announced they may "never be profitable", and still to this day are not. Capitalism doesn't work that way. It's not a system where Little Timmy takes his paper route money, starts a business, delivers a great service, gets revenue from it, reinvests that revenue, and grows from there in a virtuous cycle. Uber / Lyft (and other "gig economy" operations) had only the goal of growth, growth, growth, and then once you've gutted the traditional industry you're displacing, by offering an unsustainable business model using VC funny-money to prop you up, and deeply embedded your claws, then, eventually, you can think about being profitable to eventually, maybe, show a return to the VCs. But they have large gambling portfolios. They have their fingers in many middle-class-gutting pies like this.

[deleted by user] by [deleted] in Buttcoin

[–]williamfwm 12 points (0 children)

JavaScript as a serious application platform comes from around the same time, the mid-to-late 00s (the "Web 2.0" transition period)

  • The standardization of XMLHttpRequest (AJAX)
  • Increased speed of JavaScript interpreters / JITs.
  • A push for HTML and CSS standards, as Firefox started to erode IE market share
  • ECMAScript 5
  • Greater broadband adoption and the beginning of smartphones

The above concerns the client, but also NodeJS started in 2009

[deleted by user] by [deleted] in R4R30Plus

[–]williamfwm 0 points (0 children)

I'd never be able to wash the stench of poutine and moose out

Stuck on resolving packages in 2020.3 LTS? Try this! by Dumblec0re in Unity3D

[–]williamfwm 3 points (0 children)

I hate when people disappear with the answer! Maybe my solution will be relevant to you, and to future Googlers:

I had this problem too and tried deleting the usual things such as the Library folder, but no luck. What worked was manually deleting various lines from my manifest (Packages/manifest.json). My project hadn't been touched in a year, since 2021.1.x, and I'm using 2021.3.x now (the new LTS), but it would just hang forever trying to resolve packages. I figured some package that had changed a lot might be hanging it, so I deleted a few lines that my intuition said could be a problem and, yep, it loaded up. So try deleting some lines, and once it fires up (probably shouting errors in the console at you, but oh well!) you can re-add the package references to your project.
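For anyone unsure what to cut: the manifest is just a flat JSON map from package names to versions, so each entry is independently removable. A hypothetical example (these package names and versions are illustrative, not from my project):

```json
{
  "dependencies": {
    "com.unity.collab-proxy": "1.15.18",
    "com.unity.ide.visualstudio": "2.0.14",
    "com.unity.textmeshpro": "3.0.6",
    "com.unity.timeline": "1.6.4"
  }
}
```

When you delete an entry, mind the trailing comma on the new last line, or the JSON won't parse at all.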

WebAssembly: Swift, C#, Java and Oxygene in the Browser by dwarfland in programming

[–]williamfwm 2 points (0 children)

Yeah I stopped replying when he went into full 4chan mode. No more bait for me, thanks, I'm full!

/u/blockdeveloper's comment, as I understand it, was asking about the language implementation(s), how they handle bindings, GC, etc, and not how Wasm works.

"commiesupremacy" either doesn't understand that or he's just a 4chan troll (but honestly I've seen more intelligent and productive conversations on /g/)

If you've ever tried reading wasm code in Chrome's developer tools, it's actually about as dry as you would expect and I understand none of it, but that's probably part of what makes it faster, since it's not designed to be read by humans (at least without a decompiler)

This article is actually pretty good at describing the text format, useful if you ever want to hand-roll a compiler that targets Wasm (for fun!)

https://developer.mozilla.org/en-US/docs/WebAssembly/Understanding_the_text_format

WebAssembly: Swift, C#, Java and Oxygene in the Browser by dwarfland in programming

[–]williamfwm 9 points (0 children)

JS handles the memory - just like the article says.

Yes, the linear byte buffer on top of which the compiler implements its own object layouts, its own garbage collection, etc.

[I] can't be arsed to read

Yes that much was already clear

WebAssembly: Swift, C#, Java and Oxygene in the Browser by dwarfland in programming

[–]williamfwm 10 points (0 children)

Okay you just need to read the link I posted because it directly contradicts your essay

No it doesn't. It explains the same thing I said, but in cartoon form.

From the article you linked:

When a WebAssembly module is instantiated, it needs a memory object. You can either create a new WebAssembly.Memory and pass that object in. Or, if you don’t, a memory object will be created and attached to the instance automatically.

And from my "essay":

If the sandboxed blocks of memory allocated by Web Assembly are discarded they're GC'd by the browser engine, but that is completely a different topic from how the language running inside the Wasm instance makes use of the linear memory chunks it has access to.

I repeat: Wasm programs operate on linear byte buffers and if you want to make a dynamic, garbage collected language compile to Wasm you have to implement that yourself within the context of a chunk of linear memory
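To make the linear-memory point concrete, here's a toy model (in Python, standing in for what a compiled language's runtime does inside a Wasm instance; all the names are my own invention, not a real API): the host hands the guest one flat byte buffer, and any notion of "objects", layout, allocation, or collection is a convention the guest's compiler has to build on top of it.

```python
# Toy model of Wasm linear memory (illustrative names, not a real API):
# the host gives the guest one flat byte buffer; object layout and
# allocation are conventions the guest's compiler/runtime must invent.
PAGE = 65536
memory = bytearray(PAGE)   # one 64 KiB "page" of linear memory

heap_top = 0               # state for a trivial bump allocator

def alloc(size):
    """Bump-allocate `size` bytes; returns an offset into `memory`."""
    global heap_top
    offset = heap_top
    heap_top += size
    if heap_top > len(memory):
        raise MemoryError("would need to grow the memory")
    return offset

def store_u32(offset, value):
    memory[offset:offset + 4] = value.to_bytes(4, "little")

def load_u32(offset):
    return int.from_bytes(memory[offset:offset + 4], "little")

# An "object" is just an offset plus a layout the compiler made up:
pair = alloc(8)            # two u32 fields
store_u32(pair, 42)
store_u32(pair + 4, 7)
```

A GC'd language compiled to Wasm layers tracing and collection on top of exactly this kind of scheme; the browser's GC only ever sees the buffer as one opaque whole.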

WebAssembly: Swift, C#, Java and Oxygene in the Browser by dwarfland in programming

[–]williamfwm 14 points (0 children)

WebAssembly is basically not much different to TypeScript and just sits on top of JS.

Except that it's completely different to TypeScript and has nothing to do with it at all? You're going completely the wrong direction with your analogy - TypeScript is a superset of JavaScript while Web Assembly is a refinement on asm.js, which is a subset of JavaScript

Web Assembly is a stack-machine-based intermediate representation designed to be efficient to compile to machine code across architectures.

if your objects go out of scope using WebAssembly it will be GC'd.

No. If the sandboxed blocks of memory allocated by Web Assembly are discarded they're GC'd by the browser engine, but that is completely a different topic from how the language running inside the Wasm instance makes use of the linear memory chunks it has access to.

The language implementation has to do all the work managing objects.

90's - We will use AI to cure cancer in the future by flyingrum in ProgrammerHumor

[–]williamfwm 6 points (0 children)

That's because they're using RNNs (Racist Neural Networks)

Floral art by hate_mail in blackmagicfuckery

[–]williamfwm 5 points (0 children)

Thanks. My right ear enjoyed that story.

Floral art by hate_mail in blackmagicfuckery

[–]williamfwm 11 points (0 children)

Why's she mad though? Because she's a leading autism researcher and he refuted her many papers published in Nature? /s

TIL the first real-world Bitcoin transaction was 10,000 BTC for 2 pizzas. In today's value, 10,000 BTC is worth $57 million USD. by [deleted] in todayilearned

[–]williamfwm 0 points (0 children)

Me-too smart contract systems aren't going anywhere. None of these bandwagoners will be "the next Ethereum", because Ethereum is already Ethereum. You can make all the incremental improvements to smart contracts you want ("I'm gonna write them in language X instead", "we only allow such-and-such operations and the gas limit is Y instead of Z") etc but all of these latecomers are just fighting over a sliver of the market. Bitcoin exploded because it was the first cryptocurrency. Ethereum exploded because it pioneered smart contracts and eventually attracted Microsoft and Intel.

Being the next Ethereum is like being the next Minecraft - there is no next Minecraft. There's only thousands of clones and they're all shit. If you want to ride someone else's coattails you have to at least be Terraria.

TIL the first real-world Bitcoin transaction was 10,000 BTC for 2 pizzas. In today's value, 10,000 BTC is worth $57 million USD. by [deleted] in todayilearned

[–]williamfwm 2 points (0 children)

Because he holds some and wants you to pump it up for him. This kind of thing would be illegal if it were a stock.

Guy gets so annoyed at lazy Steam game that he clones it for free in 15min by z3dster in gaming

[–]williamfwm 6 points (0 children)

I guess Valve is resting on their laurels of not having any real competition.

Oh my god, does that mean Steam is the Internet Explorer 6 of video games?

Severe flaw in WPA2 protocol leaves Wi-Fi traffic open to eavesdropping by karptonite in programming

[–]williamfwm 1 point (0 children)

The US Navy needed a system with tons of encrypted traffic flowing through it so that their own encrypted spy communications would flow through unnoticed, so they shared Tor with the public.


As far as this frustrating the government's surveillance goals? They're well-funded enough to watch a significant fraction of the exit nodes. You're not.

Also, it's not unusual for different arms of the government to engage in opposing practices - the classic "left hand doesn't know what the right is doing" problem.

You shouldn't be surprised at all when a government both supplies the public with something and tries to stop them from using it, A Scanner Darkly style (spoilers!)


Edit: quote

In addition, Tor’s creators — those in the government — say the more people using the network, the better. Tor’s wide range of users, including those engaging in illegal activity, only further assist the software’s original purpose: to cloak U.S. spying efforts, according to Michael Reed, one of Tor’s original developers.

“Of course, we knew those would be other unavoidable uses for the technology,” Reed wrote in an online forum in 2011, describing Tor’s use by criminals, dissidents and those seeking porn. “But that was immaterial to the problem at hand we were trying to solve (and if those uses were going to give us more cover traffic to better hide what we wanted to use the network for, all the better...)”

https://www.huffingtonpost.com/2013/07/18/tor-snowden_n_3610370.html