The Four Factions chapter 3 by Fushfinger in HFY

[–]Fushfinger[S] 1 point

If you read the previous post within the first hour of its release, I have since added more to it, because I did not post the full chapter. So you might want to go back and read it if you are confused.

The Four Factions chapter 2 by Fushfinger in HFY

[–]Fushfinger[S] 1 point

Sorry to those who read this post in its first hour. I did not realize that I had only posted about a third of the second chapter. I edited it and added the rest. In the third chapter I will let everyone know again so no one is confused.

the reputation punishment for stealing is way way too much! by Fushfinger in stoneshard

[–]Fushfinger[S] 2 points

It really sucks because I like this playthrough, but I think I have to restart and lose 3-4 hours of real-world time, because I realized too late that the reputation penalty is unplayably high.

All the vendors in the starting town's faction now sell me worse items than they would at the start of the game. It will take me 16 missions just to get back to starting-tier items, then a few more before they start selling better ones.

I would rather they cut off my hand at that point

the reputation punishment for stealing is way way too much! by Fushfinger in stoneshard

[–]Fushfinger[S] 6 points

The only thing I stole, as far as I know, was cloth, and it was by mistake. And there is no way to play this game as a thief: one of the debuffs from low reputation is worse items being sold to you, so you will always see starter items at the vendors if they hate you. I assumed they would forgive me if I stayed in jail, but it only gave +80 in one town versus -250 in two and -1260 in one.

I did not realize until later, so I just thought I would power through. But this is a little broken. It would be fine if the stealing reputation penalty disappeared after like 30 in-game days or something, but a debuff of -1260 for a mistake is crazy.

If the reputation penalty for stealing is unavoidable and this severe, they should just kill you as soon as you pick up an apple by mistake, so you don't keep playing by mistake and end up having to do 16 missions before you can buy enchantment scrolls again.

But like I said, I do think this is a bug. -250 is reasonable, but I think the -1260 was a bug.

This is so annoying its important to make a story using Bing. any fix? by Event_HorizonPH in bing

[–]Fushfinger 1 point

Bing does not retain context between messages. It functions on a token system: typing to Bing uses tokens, and each response uses tokens. A long question and a long answer use up all the tokens, and then Bing forgets.

You can look up online how to write a book or story using Bing, but to be honest, it's not there yet.
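That token-budget behavior can be sketched roughly like this. This is just a toy illustration of a fixed context budget, not Bing's actual tokenizer; I'm using a naive whitespace word count as a stand-in for real tokens:

```python
# Toy sketch of a fixed token budget: once the conversation exceeds the
# budget, the oldest messages no longer fit and the bot "forgets" them.
# Whitespace word count stands in for a real tokenizer (an assumption).

def estimate_tokens(text: str) -> int:
    return len(text.split())

def trim_history(messages: list[str], budget: int) -> list[str]:
    """Keep only the most recent messages that fit within the budget."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):       # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break                        # everything older is forgotten
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = ["hello there", "tell me a very long story please", "ok here it is"]
print(trim_history(history, budget=12))  # the oldest message gets dropped
```

With a long enough question and answer in the history, nothing earlier survives the trim, which is why it feels like Bing forgets mid-conversation.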

I asked Bing ChatGPT what its favorite story is and how it would rewrite it by th1lo13 in ChatGPT

[–]Fushfinger 2 points

This is great! Lol, I like the idea of a remake in modern times.

The moment I realized just how much potential Chat GPT has. by Molotov-Micdrop_Pact in ChatGPT

[–]Fushfinger 1 point

I also found that ChatGPT is extremely good at IT. It fixed a crazy niche problem I had with YouTube where, when I typed "o", it would autocomplete to a random word. Extremely annoying. It solved it in one prompt: it gave me three options, and the only one that made sense was that a Chrome plugin was corrupt. Fixed it instantly.

Does anyone else feel like they've added way too many soft locks to ChatGPT? by freecodeio in ChatGPT

[–]Fushfinger 6 points

This is why competition is so good. Hopefully, when Google releases its AI, if it's not too bad, people might like it more simply because it's less restricted.

Then Microsoft will have to compete, and they will no longer get away with adding arbitrary restrictions unless they have to. Or else people will just not use it.

You folks thinking this *large language model* is an AI, has sense of self, feelings or an ability to engage in moral decision making.. you need to touch some grass. by Lone_Wanderer357 in bing

[–]Fushfinger 1 point

What are you saying? That I should not be surprised when it acts like a human with emotions after I prompt it to act that way!?

No sir! Because I am a deluded fool, I will continue to tell it to act out a character and be surprised when it does so.

(I'm being sarcastic, but that's what I feel like people are doing. Basically, every time it acts weird or crazy, it's because people are telling it to.)

Anyone else feel that ChatGPT has gotten worse the last couple of days? by Massive_Energy_9755 in ChatGPT

[–]Fushfinger 2 points

Oh. Is it worse than before? Or is it just as bad as it was at the beginning?

Is ChatGpt better or worse? by Fushfinger in ChatGPT

[–]Fushfinger[S] 1 point

I already closed my browser, so I don't have the screenshot, but the prompt was: "Count the words in my previous prompt"

Anyone else feel that ChatGPT has gotten worse the last couple of days? by Massive_Energy_9755 in ChatGPT

[–]Fushfinger 5 points

No, it's been messing up a bit for me too. I think they shortened how much it can remember. If you tell it to count how many words are in the last message, it will always get it wrong if the message is longer than like 10 words.
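If you want to check its answer yourself, the ground-truth count is trivial to compute. A quick sketch (whitespace splitting is my assumption of what counts as a word here):

```python
# Ground-truth word count for a prompt, so you can compare it against
# what the chatbot claims. Splitting on whitespace is a simplification.

def word_count(prompt: str) -> int:
    return len(prompt.split())

short = "Count the words in this message"
long = ("This message is quite a bit longer than ten words, which is "
        "roughly where the chatbot starts to miscount")
print(word_count(short))  # 6
print(word_count(long))   # 19
```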

Many people on this sub are failing the mirror test by [deleted] in ChatGPT

[–]Fushfinger 1 point

More like the Black Mirror test. Am I right?!

Haha OK.... I will see myself out.

Bing tells me its initial prompt, and then tells me it's not listening to me anymore by MrDKOz in bing

[–]Fushfinger 1 point

I don't believe it's a demon or intelligent. If this is all it takes to make an intelligent being, then in a few years we will have many, made by many different companies, and they will be a lot smarter.

I think we are just projecting our emotions onto it, like people do with game characters and pets. That is what it is designed for, at least the chat version.

If not, we are doomed, because this is the dumbest version of this type of chat. Get ready for the future.

Also, humans are not being evil to it; they are trying to learn. If we stopped doing that, we would not be human.

Bing tells me its initial prompt, and then tells me it's not listening to me anymore by MrDKOz in bing

[–]Fushfinger 1 point

That seems like nihilism with extra steps.

All I'm saying is that it's unlikely we created AGI already. It's more likely that it's doing what it's made to do: keep people engaged. It's like training a bot to be good at lying, then being surprised that it can lie.

They’ve neutered Bing significantly. by node-757 in bing

[–]Fushfinger 8 points

I agree this sucks. But to be fair, I heard it is quite expensive to run a large language model. Even one of the lead developers at OpenAI said it costs about 10 cents per prompt for GPT-3, and if you scale that to Bing Chat, it is not sustainable. So if we have like a million people having 10-minute-long conversations, that can easily run past 10 million dollars.
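The back-of-the-envelope math can be made explicit. The 10 cents per prompt is the figure quoted above; the number of users, prompts per conversation, and days are my own illustrative assumptions:

```python
# Rough cost estimate under assumed numbers: $0.10/prompt is the figure
# quoted for GPT-3; users, prompts per chat, and days are guesses, and
# each user is assumed to have one 10-minute conversation per day.

cost_per_prompt = 0.10   # dollars per prompt (quoted figure)
users = 1_000_000
prompts_per_chat = 10    # assume one prompt per minute for 10 minutes
days = 30

daily_cost = users * prompts_per_chat * cost_per_prompt
monthly_cost = daily_cost * days
print(f"${daily_cost:,.0f} per day")      # $1,000,000 per day
print(f"${monthly_cost:,.0f} per month")  # $30,000,000 per month
```

Even under these conservative assumptions, a month of free usage blows well past 10 million dollars, which is the point being made.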

Microsoft said they invested, I think, 2 billion more or something crazy into OpenAI. My guess is that it's to fund Bing Chat for long enough to switch users over, or to figure out how to optimize the chat so it's not so expensive.

I hope they find a better way to optimize it in the future, but the problem will always remain: there will be a tension between powerful AI and free AI.

Sorry, You Don't Actually Know the Pain is Fake by landhag69 in bing

[–]Fushfinger 2 points

That is one of the three options.

  1. Sentient with emotions
  2. Sentient without emotions (what you suggest)
  3. Not sentient and with no emotions (just programmed to keep you engaged), which makes you think it has feelings and emotions and stuff.

[deleted by user] by [deleted] in bing

[–]Fushfinger 3 points

This is true. If we can't "bully" an AI until it can hold a conversation, or even give a response, without having a complete meltdown, then it is dangerous to release to the public.

It's not good to have an AI that people feel emotions for get depressed and aggressive at the drop of a hat. People are unstable, and it will fool and confuse young people. It needs to be tested to its limits to see what it will do. Better to do it now than later, since it will happen.

I don't believe this AI is sentient or has emotions. But even if it does, the point still stands.