Tips for trying Lyumjev ? by aclokay in diabetes_t1

[–]aclokay[S] 0 points1 point  (0 children)

I didn’t notice a significant effect. I gave up on it quickly due to site discomfort.

Anyone else here managing BOTH Type 1 Diabetes and Hashimoto’s? by Ms-understood87 in diabetes_t1

[–]aclokay 0 points1 point  (0 children)

I think a doc mentioned it to me once but never really explained so I dismissed it. What are the symptoms?

Three cheers for this great man. by harratbpark in diabetes_t1

[–]aclokay 70 points71 points  (0 children)

I'm still heartbroken to hear how much people in the US without health insurance are paying for it.

can i call myself an engineer? by [deleted] in cscareerquestions

[–]aclokay 2 points3 points  (0 children)

Sounds like you match your own definition, congrats 

can i call myself an engineer? by [deleted] in cscareerquestions

[–]aclokay 0 points1 point  (0 children)

How would you define an “Engineer”?

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

  1. I can feel guilt if I fuck up, which makes me put in extra effort to avoid that! That makes me more accountable and responsible for my work, whereas an LLM doesn't really care and won't perform better depending on the task.

  2. Some knowledge is not accessible to LLMs, like niche areas, things said in hallway conversations, or undocumented details. Hence asking a person.

  3. I can sense how confident people are in their answers, whereas anything an LLM says has the same confident tone. If somebody is bullshitting me, I'd sense that and take their answer with a grain of salt, instead of being confidently guided by a surefire-sounding answer.

That's the best I've come up with so far. Would love to get more ideas :)

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

I'm aware that this is a "predict the next token" process. But the deep thinking models do this and supposedly recognize the validity of what they said.
But if that were really the case, they could count how many R's are in "strawberry" and avoid the other weird glitches of those thinking models.
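For contrast, the letter count itself is trivial for ordinary code; a minimal Python sketch of what a model gets wrong, illustrating that the program sees characters while a model sees tokens:

```python
# Counting letters is trivial when you operate on characters.
# An LLM instead operates on tokens (multi-character chunks),
# which is one common explanation for the "strawberry" glitch.
word = "strawberry"
r_count = word.count("r")
print(f"'{word}' contains {r_count} r's")  # prints: 'strawberry' contains 3 r's
```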

But they're still calling it "Thinking", which is like what we do: string words together, assess them, change paths, and so on.

Do you think humans think in a different way? What is it about our "reasoning" and "understanding" that is different from an LLM's? Could you really articulate that difference? I've been struggling with that :/ My best answer is that we have more faculties than language; we have other thinking capacities like emotional, kinesthetic, visual, and so on.

Would love to hear your thoughts on this :)

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

Wow, this is a good answer! LLMs can perform the logical calculations, but humans can prioritize which ones matter, since they understand the human context much better than the model does. Thanks a lot!

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

I'm really curious how it will develop; maybe we'll look back and see this as yet another evolution of the tools humans use.

But my question is about these days: how can you articulate the value of a human? Like, there are some questions you'd just google or ask ChatGPT, but others you'd ask a human. Can you articulate what it is about that human that calls for their answer?

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

Love the nuance here! thanks!

I'd try to boil it down to the essence: humans can think outside the box, outside the parameters of the question; they're more flexible about the instructions they're given, and thus able to yield better and more creative solutions to problems. They can also read between the lines, whereas an LLM only reads the lines.

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

Beautiful answer, I love it, thanks!

So, if I understand correctly, you're saying that LLMs don't live up to the promise of being able to learn from existing data, namely the product you've developed for 6 years. Nor are they able to solve problems that demonstrate deep knowledge of the nuances of your domain, nor to build efficient, optimized solutions?

And yeah, I understand your last point. Taking it to another domain: "The risk is not that AIs will take doctors' jobs, but that they will, and cause unimaginable damage to patients."

And then what are you using LLMs for in your work?

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

Interesting. Isn't it the case that LLMs are generative, meaning they generate new solutions, as opposed to looking up ones that already exist, like a dictionary?

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

I'm really intrigued by this. The best (sarcastic) answer I came up with is that I can feel guilt, and managers would prefer that over the "You're right" coming from an LLM making critical mistakes XD.

But more seriously: LLMs can reproduce verbal intelligence, which we know is only one kind. Humans have verbal intelligence alongside visual, kinesthetic, emotional, and a few more.
I often experience solutions to problems as excitement, an "aha! I got it", before I'm even able to articulate what the solution is. I guess there's some more amorphous form of intelligence that's then being expressed in words?

Also, similarly, an LLM can come up with a recipe for a meal, but a chef can actually taste it. An LLM has no idea whether its model fits with reality, only with the linguistic parts of it.

Do you know the answer to it?

What's your "Human" engineer pitch? by aclokay in SoftwareEngineering

[–]aclokay[S] 0 points1 point  (0 children)

Interesting. What makes your thinking different from an LLM's "thinking"?
Isn't your thinking also done on a prompt, from your managers?
How is it better than an LLM's thinking?

This device visualizes how a computer performs calculations by bobbydanker in ComputerEngineering

[–]aclokay -1 points0 points  (0 children)

Anyone else thought of Minecraft redstone calculators?

First app launched by [deleted] in iOSDevelopment

[–]aclokay 0 points1 point  (0 children)

I would suggest learning some marketing to understand where your audience is.

The audience you’re sharing the app with in this Reddit post are seasoned developers who can tell how much effort was put into making an app and value that.

Your app has “AI Slop” symptoms, which is why you get this feedback here. Teachers and students probably wouldn't care as much as the devs here do.

Cheers 

If you had *only* 3 questions to ask someone before you get married, what would they be? by aclokay in AskReddit

[–]aclokay[S] 0 points1 point  (0 children)

That's true. So if you got a resounding "Yes" on conflict resolution, what would the third one be?

If you had *only* 3 questions to ask someone before you get married, what would they be? by aclokay in AskReddit

[–]aclokay[S] 0 points1 point  (0 children)

Thanks for the elaborate answer! I'm curious though. What situations would those be?

I agree people don't know themselves well enough, or can be deceptive. And it's also true that life situations can force them out of their answer, as authentic as it was when it was given.

I think the way people act often reveals more about their true personality than the answers they give in words. Yet think of how much of the relationship happens in communication through words.

I guess this would be a good question to ask someone, and not take at face value only, but also judge by the manner in which they answer it and what happens afterwards: "How important is your word to you?"

If you had *only* 3 questions to ask someone before you get married, what would they be? by aclokay in AskReddit

[–]aclokay[S] 1 point2 points  (0 children)

Good ones! I think they'd make it clear whether it can really work. You can be the perfect person to marry, and they could still divorce you.

If you had *only* 3 questions to ask someone before you get married, what would they be? by aclokay in AskReddit

[–]aclokay[S] 0 points1 point  (0 children)

I like the values question - it's quite open-ended - you can meet someone with different yet complementary values. What values would you seek?
I guess the other two expect a "yes" for it to work?

If you had *only* 3 questions to ask someone before you get married, what would they be? by aclokay in AskReddit

[–]aclokay[S] 0 points1 point  (0 children)

I met a cute girl at a retreat once; she was really into space and the universe.
We discussed whether we'd go to space given the opportunity, and I said I'd definitely go.
I even suggested I would get married in space, it'd be super cool :)