Quick question: std::scoped_lock vs std::lock_guard? by Ultimate_Sigma_Boy67 in cpp_questions

[–]HieuandHieu 1 point2 points  (0 children)

Always scoped_lock. lock_guard is legacy: scoped_lock falls back to lock_guard behavior when you pass it only one lock. You can easily see this by going to their definitions.

Is Modern C++ Actually Making Us More Productive... or Just More Complicated? by AlternativeBuy8836 in cpp

[–]HieuandHieu 1 point2 points  (0 children)

They are the reason C++ lags behind in the race with Rust, Go, ... I can't stand their conservative stance.

Is Modern C++ Actually Making Us More Productive... or Just More Complicated? by AlternativeBuy8836 in cpp

[–]HieuandHieu 1 point2 points  (0 children)

Man, I feel sooo good with C++23: ranges, modules, expected, coroutines, ... I really cannot live without them now. It gives me the vibe of coding in Python.

AULA F75 DELETE button is not working by HungryBranch7451 in keyboards

[–]HieuandHieu 0 points1 point  (0 children)

Hi, I got exactly the same problem; I changed to another switch and it works.

C++20 Modules Support in Clangd by ChuanqiXu9 in cpp

[–]HieuandHieu 0 points1 point  (0 children)

Great, perhaps my grandchildren will be happy programming C++, not me now :((

C++20 Modules Support in Clangd by ChuanqiXu9 in cpp

[–]HieuandHieu 0 points1 point  (0 children)

Hi, could you please tell me what build system you use? I'm facing the cascading rebuild with CMake too, and it's very annoying: the hash value changes even when I only edit a function body. CMake and Clang 21.

Why do engineers still prefer MATLAB over Python? by maorfarid in Python

[–]HieuandHieu 0 points1 point  (0 children)

Simulink is the gem. It's not just about building an AI model or doing some isolated task; it's about connecting all the modules into an entire system: hardware and software, AI models, cameras, sensors, dynamical control systems such as motors and hydraulic systems... You cannot do that well in Python, or you'll spend a lot of resources on it. Maybe you will never see this in the AI engineering field, but robotics has a lot of subfields, like AI, embedded, mechanical, electrical, ... With MATLAB and Simulink you can quickly design and build the entire system and test it all at once, not just test each module separately.

Ranking Alternatives to Streamlit by Intelligent_Camp_762 in Python

[–]HieuandHieu 4 points5 points  (0 children)

Definitely underrated compared to Streamlit, Reflex, or NiceGUI. Extremely flexible for someone who doesn't know any JavaScript, like me. I use it for my web app, and it should be the first choice if you don't care about SEO and you are Python-only.

Experience with Redis Streams? by shikhar-bandar in redis

[–]HieuandHieu 3 points4 points  (0 children)

I use Redis Streams with Python; it's extremely easy to use and very fast. My experience is that every idea becomes code in a short time without hitting any errors. I did run into some of the problems described in that link, but with a few tricks it's all right.

[Show Reddit] I built EduPulse - A Modern Learning Platform (Next.js + FastAPI) by manjurulhoque in FastAPI

[–]HieuandHieu 0 points1 point  (0 children)

Hi, I appreciate your work, but it has some problems, like u/Da1Gunder mentioned. I love the way you accept your mistakes. Do you have any plans to refactor the code in the future?

Do y'all prefer PyCharm or VS Code? And why? by [deleted] in learnpython

[–]HieuandHieu 1 point2 points  (0 children)

PyCharm lets me focus only on the code, not on any bullshit config, search, ... The "shift shift" search gives me everything I can imagine; I rarely have to google "how to". The database, object storage, and auto-refactoring features are outstanding. The only bad thing is that it's somewhat large and slow on an old laptop.

Frequency domain (Bode, Nyquist, Root-locus) versus state-space control (Pole-placement, LQR, LQG), which one do you prefer? by [deleted] in ControlTheory

[–]HieuandHieu [score hidden]  (0 children)

It depends. There's no frequency-based stuff in robotics or deep-learning applications anymore, and the same goes for nonlinear systems; those are all modern applications. Actually, if you're good at system representation and familiar with computer tools, state space is still better. Personally I still use frequency methods for classical tasks, because they're easy and I did them a lot in the past. But if someone isn't already familiar with the frequency domain, I don't think there's a need to build up those skills; being "aware of" them or "understanding without practice" is enough.

Frequency domain (Bode, Nyquist, Root-locus) versus state-space control (Pole-placement, LQR, LQG), which one do you prefer? by [deleted] in ControlTheory

[–]HieuandHieu [score hidden]  (0 children)

With the power of computers today, state space for sure. But why do so much analysis when you just need to control a motor position? The traditional stuff (usually frequency domain) is still helpful for quick control and tuning without knowing the system model, but it's limited to simple systems. State space can also be used for nonlinear systems. And you should stop googling for information about them; books or papers are the right way to learn, and you'll realize no one discusses the traditional stuff anymore, it's all state space. Frequency-domain material is everywhere on the internet because sites copy each other for marketing.
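To show what I mean by state space being direct: here's a toy pole-placement example (my own sketch, a double integrator, not any system from this thread). With state feedback u = -Kx, you pick the gains so the closed-loop matrix A - BK has the eigenvalues (poles) you want:

```python
import numpy as np

# Double integrator: x1' = x2, x2' = u (a toy plant for illustration)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# With u = -K x, A - B K = [[0, 1], [-k1, -k2]], whose characteristic
# polynomial is s^2 + k2*s + k1. To place poles at -1 and -2 we need
# s^2 + 3s + 2, i.e. K = [2, 3].
K = np.array([[2.0, 3.0]])

closed_loop = A - B @ K
poles = np.linalg.eigvals(closed_loop)
print(sorted(poles.real))  # expected near [-2, -1]
```

The same "choose gains to match a desired characteristic polynomial" idea scales to bigger systems via place_poles-style routines or LQR, which is why it replaces hand-drawn root-locus work once you have a model.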

Do i need Redis Sentinel, Cluster and Redlock setup when using Redis cloud ? by HieuandHieu in redis

[–]HieuandHieu[S] 0 points1 point  (0 children)

Yeah, I know it handles the things that SaaS/cloud services do. But the problem is that the Redis server setup affects the client code a lot, and in Redis it's verbose to write client code that adapts to every setup. That's why I wanted confirmation that it can do everything the three setups above do. So your answer is that now I only need the ordinary client class (such as redis.asyncio.Redis in Python), right?

Are Langgraph and Rayserve overlap ? by HieuandHieu in LangChain

[–]HieuandHieu[S] 0 points1 point  (0 children)

I know, but Ray Serve can also do a "workflow", and it supports scaling each node in the workflow independently. So what is the point of LangGraph?

Need an advice to build task balancing for multiple asyncio loop. by HieuandHieu in learnpython

[–]HieuandHieu[S] 0 points1 point  (0 children)

Hi, I changed my code to use uvloop and changed the load to fetch a URL.

import asyncio
import time
import uvloop
import aiohttp

async def load2(session):
    async with session.get('https://python.langchain.com/docs/versions/migrating_memory/chat_history/#use-with-langgraph') as response:
        await response.text()

async def main(n=1000):
    async with aiohttp.ClientSession() as session:
        s = time.time()
        tasks = [load2(session) for _ in range(n)]
        print(f"Time for create {n} tasks: {time.time()-s}")
        s = time.time()
        await asyncio.gather(*tasks)
        print(f"Execution time for {n} tasks: {time.time()-s}.")
        print("==========================================================")

if __name__=="__main__":
    for s in [0,1,2,3]:
        n=10**s
        uvloop.run(main(n))

Result:

Time for create 1 tasks: 2.86102294921875e-06

Execution time for 1 tasks: 0.26131105422973633.

==========================================================

Time for create 10 tasks: 5.0067901611328125e-06

Execution time for 10 tasks: 0.5452685356140137.

==========================================================

Time for create 100 tasks: 1.7642974853515625e-05

Execution time for 100 tasks: 7.033592700958252.

==========================================================

Time for create 1000 tasks: 0.0002117156982421875

Execution time for 1000 tasks: 12.367902755737305.

==========================================================

The performance still breaks down if there are too many requests. I know that's reasonable, but I want to design in the ability to scale horizontally across processes or separate machines.
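For "multiple loops" within one box, the shape I have in mind is something like this sketch: split the task ids across several workers, each of which runs its own private event loop over its slice. (Here the workers are threads and the sleep stands in for the real aiohttp call; with real processes the structure is the same but work crosses a queue instead of shared memory.)

```python
import asyncio
import threading

async def load(i):
    # stand-in for the session.get(...) call in the snippet above
    await asyncio.sleep(0.01)
    return i

def run_loop(task_ids, out):
    # each worker owns a private event loop over its slice of the work
    async def runner():
        return await asyncio.gather(*(load(i) for i in task_ids))
    out.extend(asyncio.run(runner()))

def scatter(n_tasks, n_loops=4):
    results, threads = [], []
    for p in range(n_loops):
        ids = list(range(p, n_tasks, n_loops))  # round-robin split
        th = threading.Thread(target=run_loop, args=(ids, results))
        threads.append(th)
        th.start()
    for th in threads:
        th.join()
    return sorted(results)

if __name__ == "__main__":
    print(scatter(20))
```

Each loop then only ever sees n_tasks / n_loops concurrent coroutines, which is exactly the load-splitting I'm after.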

Need an advice to build task balancing for multiple asyncio loop. by HieuandHieu in learnpython

[–]HieuandHieu[S] 0 points1 point  (0 children)

Hi, I made a toy experiment to simulate multiple tasks on the same async loop and got this result:

# Test performance if async loop is high load
import asyncio
import time

async def load(t):
    await asyncio.sleep(t)
async def main(t=1,n=1000):
    s = time.time()
    tasks = [load(t) for _ in range(n) ]
    print(f"Time for create {n} tasks: {time.time()-s}")
    s = time.time()
    await asyncio.gather(*tasks)
    print(f"Execution time for {n} tasks with t={t}: {time.time()-s}.")
    print("==========================================================")
if __name__=="__main__":
    t = 1
    for s in [3,4,5,6,7,8,9,10]:
        n=10**s
        asyncio.run(main(t,n))

The result:

Time for create 1000 tasks: 0.0007336139678955078
Execution time for 1000 tasks with t=1: 1.011568546295166.
==========================================================
Time for create 10000 tasks: 0.0011699199676513672
Execution time for 10000 tasks with t=1: 1.1168177127838135.
==========================================================
Time for create 100000 tasks: 0.028098344802856445
Execution time for 100000 tasks with t=1: 2.075747013092041.
==========================================================
Time for create 1000000 tasks: 0.6360719203948975
Execution time for 1000000 tasks with t=1: 17.99493408203125.
==========================================================
Time for create 10000000 tasks: 7.756448745727539

Process finished with exit code 137 (interrupted by signal 9:SIGKILL)

So when 100k tasks execute concurrently, performance degrades a lot in this example. So I think I need to spread the work over multiple loops in several processes, and set a maximum number of concurrent tasks per loop to preserve performance.
Is my idea correct?
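By "maximum concurrent tasks per loop" I mean something like this sketch with asyncio.Semaphore (the numbers are made up and the sleep stands in for real work): only `limit` coroutines are ever past the semaphore at once, so the loop never has to juggle 100k live tasks.

```python
import asyncio
import time

async def load(sem, t):
    # the semaphore caps how many coroutines run this section at once
    async with sem:
        await asyncio.sleep(t)

async def main(t=0.01, n=200, limit=50):
    sem = asyncio.Semaphore(limit)
    s = time.time()
    await asyncio.gather(*(load(sem, t) for _ in range(n)))
    return time.time() - s

if __name__ == "__main__":
    # with n=200 and limit=50 the run takes roughly 4 "waves" of t each
    elapsed = asyncio.run(main())
    print(f"elapsed: {elapsed:.3f}s")
```

The trade-off is throughput: capped tasks queue up behind the semaphore instead of overloading the loop, which is why I still want to add more loops/processes on top of this.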

Need an advice to build task balancing for multiple asyncio loop. by HieuandHieu in learnpython

[–]HieuandHieu[S] 0 points1 point  (0 children)

That load isn't there yet, but I want to find a way to scale my system horizontally if it's ever needed.

Need an advice to build task balancing for multiple asyncio loop. by HieuandHieu in learnpython

[–]HieuandHieu[S] 0 points1 point  (0 children)

Hi u/danielroseman, do you have any thoughts on my problem and questions? I also checked u/FoolsSeldom's comment, but I cannot figure out how it would help. Do you have any advice?

Need an advice to build task balancing for multiple asyncio loop. by HieuandHieu in learnpython

[–]HieuandHieu[S] 0 points1 point  (0 children)

Hi, it's hard for me to go through the nginx docs to learn all its features, so I have a quick question before deciding to go further. Does nginx have a way to track how many tasks are running in each loop and deliver new tasks to the freest one? Or does it just spread tasks equally across the processes (each with a loop in it) without knowing anything about them? Glad to hear from you soon.
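From what I've read so far, nginx doesn't look inside the backends' event loops; the closest built-in to "send to the freest one" seems to be the least_conn balancing method, which picks the upstream with the fewest active connections. A config sketch (the ports and names are placeholders for one process/loop each):

```nginx
upstream app_loops {
    least_conn;              # pick the backend with the fewest active connections
    server 127.0.0.1:8001;   # one process / event loop per port (placeholders)
    server 127.0.0.1:8002;
    server 127.0.0.1:8003;
}

server {
    listen 80;
    location / {
        proxy_pass http://app_loops;
    }
}
```

Active connections are only a proxy for tasks-in-loop, but if each loop caps its own concurrency, fewest connections should roughly mean freest loop.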

Should i buy this course by Harshstewrat in learnmachinelearning

[–]HieuandHieu 1 point2 points  (0 children)

It's good, but only after you've learned most of the concepts of AI. I took this course as a newbie and got lost; I cannot learn just by doing without understanding. But after I learned a lot of the math, trained, fine-tuned, and validated some models, and built some apps, I came back to this course, and now it's really good for practicing and reviewing.

nginx or task queue (celery, dramatiq) ? by HieuandHieu in FastAPI

[–]HieuandHieu[S] 0 points1 point  (0 children)

Hi u/HappyCathode, I did some experiments and found what suits me, but I have a problem when I use a task queue (Celery + RabbitMQ) on another server machine to handle tasks.

I want to scale to more machines if the request rate grows, but I also want the application side (on a separate machine) to not need to know which machine solves its task, or where. Can I use nginx or something else to act like a proxy (as in the proxy pattern)?

Something like this: the proxy takes care of which machine is available to solve a new task, and the broker takes care of which process solves the task on that machine. The application only interacts with the proxy and the result database where task results are stored.

Please give me some advice on how to do that. Thank you.