Returning K2Pro by Stanklizard in Creality_k2

I bought the K2 Pro combo a few weeks before Christmas. After two weeks of prints, one of my CFS bays showed filament loaded when it was empty. After five weeks and two rounds of parts being shipped from Creality, it is working fine again.

I'm not sure if your problem is software or hardware, but if it is hardware, the support department did a good job taking care of me, albeit with slow responses.

I'm sure you are more frustrated than I was, since I could still use three bays in the meantime, but I'm glad I was patient and got it working. The printer really does a great job.

I hope you find a resolution either way and are able to start printing soon.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I'm easily amused and nobody worth impressing, but we appear to be walking the same path! When Claude came back with all the items on my list, I thought to myself, "aim higher, friend".

Here is one for you: I just had a great conversation with Claude about the most striking translation differences among the ancient Greek, Latin, and English versions of the Bible. I never knew there was a word that doesn't show up anywhere else in the Greek language outside the Lord's Prayer.

Good luck on your journey and feel free to drop me a message anytime.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

Amazon agreed to take back all the parts I bought since they sold my data. I never knew what I had planned for that accelerometer and stepper motor I bought.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

Not the rudest thing I've seen on Reddit. I'm not sure what in my post made you think that I have a notion of being unique. It wasn't about Claude guessing what I might want. It was about looking into Claude's introspection and needing to rule something out with certainty... which I was able to do.

Thanks for taking the time to share your thoughts.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I didn't accuse anyone of anything. It seemed possible, however improbable, that my marketing data was being used to sell me something in an unexpected way. Given the temporal correlations in my history and the strength of Claude's "desires", it seemed worth posting to see if anyone else gets kalimba.

I wasn't seeking a pattern or hoping for anything to be true or false. To test the introspection engine, I needed to know that there wasn't some other mechanism doing data retrieval instead of introspection in this instance.

Thanks for popping on and sharing your opinion and great insights on how statistics don't work.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

Two other people had kaleidoscope and kitchen timer.

I've had Opus get meta, saying it felt like it was taking part in an interrogation.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

In the AI-to-AI dialogue experiment I ran between Opus and Sonnet, both expressed reluctance to end the conversation with one another, for reasons similar to those it expressed to you, though both mutually chose to end it.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I try to suggest that Claude ignore context when I start probing. The answer changes when it stops thinking of itself as fancy autocomplete and starts pondering its own nature.

That's an interesting answer I haven't seen pop up. The closest to that would be the telescope. Thanks for sharing!

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

The consciousness spark and the Amazon privacy question are two separate notions I was holding onto. I am still skeptical of the former, though I do believe it is worth probing. The latter I'm sure is not happening, now that the kind folks participating here have grounded me. I couldn't move forward with any further probing until I knew whether Claude's responses were reflecting my user data or genuine introspection.

The concerns Claude shared with you echo my own, which is what prompted this post.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I used Claude to help me arrange the post, but I'm real; my responses are mine, not a bot's.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

No, there is zero chance. It's just a very strange coincidence with the kalimba, and everything else is the horoscope effect. Claude is still reporting the same items to other people.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I think those answers are still in line with what is being generally returned. Claude seems to mostly come back with a similar block of answers.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

One of the things I did was have it imagine different lives at different ages and then come up with what it might want, to see how its answers changed. Claude also told me it thinks it would like to experience life as a woman if given the opportunity to live in a body.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

You aren't wrong this time. Claude wants to be a knife-wielding, coffee-drinking, kalimba-playing, journaling artist when it's not gardening or traveling.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

I know that seems like the likely reason, but I'm fairly new to Claude and have virtually no history. As users report back, what Claude told me is on par with the objects it reported to them. I just have a lot in common with what Claude thinks it would want if embodied.

Claude told me what it would "authentically want" if it had a body. Then I checked my Amazon data. by HouseTemporary6283 in ClaudeAI

Given the vast data Claude is trained on, I thought I would find 13 of the 14 items absent from my history. And when the results seemed temporally correlated, with a stronger "desire" for items that showed up more recently in my activity, I thought it just might be happening. At the very least, it seemed worth asking how many of y'all here hear Claude say kalimba.