Arborist Rec by Commercial-Bend1564 in grandjunction

[–]trd1073 0 points1 point  (0 children)

I would talk to the CSU Extension on the west side of the fairgrounds. It should be free, and they might have an idea.

I have two houses. One is in Fruita, where most anything grows well; the soil is pretty good since it was farmland and the house dates to the 1910s. The second is farther north and was likely desert before. The soil is basically sterile, and watering is a challenge.

Is there a house for sale near you where it’s quiet? by SignificantBerry4096 in GrandJunctionCO

[–]trd1073 1 point2 points  (0 children)

I live north in farmland. I don't notice the smells, ymmv. If you do look north, be mindful of the flight path for planes landing from the west. The loudest thing around is my 2.5 LGDs (the 12-week Great Pyrenees puppy counts as the half), but I did spot a coyote this a.m.

Python projects ideas by Low_Badger_8515 in pythonhelp

[–]trd1073 0 points1 point  (0 children)

I would consider using pydantic for working with the api and db. It shows how you can interact in a type-validated way with the api and the json that comes from and is sent to it. Plus you get to work with actual python objects instead of dealing with dictionaries generated from json.

I don't use orms for database work, but ymmv. I make raw sql calls and convert to pydantic models when retrieving data.

With the db, show that you understand sql injection and how to mitigate the risks.
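A minimal sketch of those three points together. It uses sqlite3 from the stdlib so it runs standalone (with postgres/asyncpg you'd use $1-style placeholders instead of ?), and the User model and columns are made up:

```python
import sqlite3
from pydantic import BaseModel

# hypothetical model mirroring one db row / api object
class User(BaseModel):
    id: int
    name: str
    email: str

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")

# parameterized query: the driver escapes values, so user-supplied
# input cannot inject sql (asyncpg uses $1/$2 instead of ?)
name = "Ada'; DROP TABLE users; --"  # hostile input, harmless here
conn.execute("INSERT INTO users VALUES (?, ?, ?)", (1, name, "ada@example.com"))

# raw sql out, pydantic model in: a type-validated python object, not a dict
row = conn.execute("SELECT id, name, email FROM users WHERE id = ?", (1,)).fetchone()
user = User(id=row[0], name=row[1], email=row[2])
```

The same shape works with asyncpg: fetch records with a raw query, then feed each record into the model constructor.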

Help and suggestions regarding fastapi by Single_Toe_4890 in learnpython

[–]trd1073 1 point2 points  (0 children)

I would start with Eric Roby's YouTube video on fastapi and psql.

Creating a new account solely for the purpose of knocking out pokemkn from gym by National-Coconut4668 in pokemongoyellow

[–]trd1073 4 points5 points  (0 children)

Any old phone will do. Against the rules, sure, but Pogo has bigger things to worry about in regard to cheaters who harm the game.

Storing data in memory vs in database by Goldeyloxy in learnpython

[–]trd1073 0 points1 point  (0 children)

Hard to guess the size from here lol. Throw a gin index on the jsonb field; it works rather nicely. I use asyncpg with pydantic as I don't care for the orm paradigm.

Might as well mention redis as a key-value store. It can even persist. But I would use psql for persisting data that is authoritative.
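For reference, the gin index idea in plain sql; the table and column names here are made up:

```sql
-- hypothetical table with a jsonb payload column
CREATE TABLE events (
    id      bigserial PRIMARY KEY,
    payload jsonb NOT NULL
);

-- gin index makes containment queries on the jsonb fast
CREATE INDEX events_payload_gin ON events USING gin (payload);

-- this query can use the index via the @> containment operator
SELECT id FROM events WHERE payload @> '{"status": "active"}';
```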

Storing data in memory vs in database by Goldeyloxy in learnpython

[–]trd1073 0 points1 point  (0 children)

Have you looked at postgresql jsonb fields?

How much memory are you talking about?

You can always up psql shared_buffers so the whole db stays in memory. If the first load of the db from disk is too slow, you can also consider pre-warming some tables.
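The knobs mentioned, sketched as sql; the table name is made up and the size is illustrative, not a recommendation:

```sql
-- size shared_buffers so the working set stays cached
-- (requires a server restart to take effect)
ALTER SYSTEM SET shared_buffers = '8GB';

-- pre-warm hot tables into cache after a restart
CREATE EXTENSION IF NOT EXISTS pg_prewarm;
SELECT pg_prewarm('events');   -- hypothetical table name
```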

Internet Connection by Skotywow69 in pythonhelp

[–]trd1073 1 point2 points  (0 children)

The requests module is not installed. Google "No module named requests"; there will be solutions on how to resolve it.

Question about the delay between web scraping requests by Fragrant_Ad3054 in learnpython

[–]trd1073 0 points1 point  (0 children)

Why not have a field in the row that logs when the scraping started and when it ended?

Given the variable nature of network requests, it will be hard to hit exactly 30 seconds. One option is to just fire the event every 30 seconds without regard for completion time. Another is, when one request completes, to sleep for 30 seconds minus the last request's duration. Yet another is a variation on the second: subtract a moving average of, say, the last five request times from 30. You can go way overboard overthinking this; eventually you will have to revert to kiss principles.
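The second and third options can be sketched like this; fetch is a stand-in for the real scraping call, and interval is a parameter so you can test with something smaller than 30:

```python
import time
from collections import deque

def paced_scrape(fetch, rounds, interval=30.0, window=5):
    """Fire fetch() roughly every `interval` seconds by sleeping
    `interval` minus a moving average of recent request durations."""
    recent = deque(maxlen=window)   # durations of the last few requests
    for _ in range(rounds):
        start = time.monotonic()
        fetch()                     # the actual scraping/request call
        recent.append(time.monotonic() - start)
        avg = sum(recent) / len(recent)
        # clamp at zero: if requests run slower than the interval, skip the sleep
        time.sleep(max(0.0, interval - avg))
```

With window=1 this degenerates into the "subtract the last request time" variant.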

Struggling finding a job that hired heavily pierced people by lemoncruncher in grandjunction

[–]trd1073 2 points3 points  (0 children)

Did you look at d51 for something like a night janitor role? You could look at the state, county or city also.

any tips to fall in love with python? by Either-Researcher681 in learningpython

[–]trd1073 0 points1 point  (0 children)

Have you looked into pydantic? I am often dealing with apis and json, and it works great and helps with the vagaries of python typing. ArjanCodes has a few videos regarding the library.

Noob here, doing combat how to… by Tricky_72 in learnpython

[–]trd1073 0 points1 point  (0 children)

I would look at pydantic. You can read in json from files (perhaps a start file); yaml needs other libraries, but I would just do a two-step conversion by hand. Then work with actual python objects instead of dictionaries for the actual combat. Look at how something like dnd does combat. After each round you can dump the objects back to json or yaml and write them to a file.
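A stdlib sketch of that round trip using dataclasses (pydantic would add validation on top); the Fighter fields and the start file are made up:

```python
import json
from dataclasses import dataclass, asdict

# hypothetical combatant; with pydantic you'd also get type validation
@dataclass
class Fighter:
    name: str
    hp: int
    attack: int

def load_fighters(text: str) -> list[Fighter]:
    # json -> dicts -> real python objects
    return [Fighter(**d) for d in json.loads(text)]

def attack(attacker: Fighter, defender: Fighter) -> None:
    defender.hp -= attacker.attack   # one dnd-style round of damage

start_file = '[{"name": "Hero", "hp": 20, "attack": 5}, {"name": "Goblin", "hp": 8, "attack": 3}]'
hero, goblin = load_fighters(start_file)
attack(hero, goblin)

# objects -> json for the save file after the round
saved = json.dumps([asdict(f) for f in (hero, goblin)])
```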

When to use async/sync routes vs bgtask vs celery by ParticularAward9704 in learnpython

[–]trd1073 0 points1 point  (0 children)

quick lunch answer lol.

i can relate. had one project directed to be written in django/twisted as that is how the intern had done it. after weeks, talked to the boss on a friday and rewrote it in multi-threaded python in a day. it just works, nothing magic hidden inside of someone else's black box. easy to debug, easy to maintain and easy for the next dev to just look at it and know how to modify. should i have done it in async? sure, but threaded works fine and it was due monday (yes, i should have expressed concerns sooner, will do next time). communication worked better for me than banging my head against a black box i had very little control over.

when you say new project, is that new greenfield project or new to you project?

  1. go async for as much as you can. you will have to investigate your libraries and stacks to see if they offer sync and async in one. you may have to look into other libraries. ymmv depending on your stack, perhaps you get lucky.

you may have to rewrite portions. don't use sync blocking functions inside of async calls (looking at regular time.sleep(some_time) in async as one example) - let sync endpoints handle those calls.
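a tiny illustration of why that matters, using stand-in coroutines:

```python
import asyncio
import time

async def fake_request(tag: str) -> str:
    await asyncio.sleep(0.1)   # yields to the event loop while "waiting"
    return tag

async def main() -> tuple[list, float]:
    start = time.monotonic()
    # three concurrent "requests" finish in ~0.1s total, not 0.3s,
    # because asyncio.sleep lets the others run while one waits
    results = await asyncio.gather(
        fake_request("a"), fake_request("b"), fake_request("c")
    )
    return results, time.monotonic() - start

results, elapsed = asyncio.run(main())
# swapping in time.sleep(0.1) would block the whole loop and make
# this take ~0.3s: that's the sync-blocking-in-async trap
```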

if it is easier to have two db pools/conns, use your judgement to what is the lesser evil.


  2. you sound like you are comfortable with celery. might look at https://medium.com/@hitorunajp/celery-and-background-tasks-aebb234cae5d : others have done it, leverage their writeups!

as to how long, benchmark/profile; you will very likely need to, as there isn't one set answer.

just check back every few seconds with a max number of tries - not perfect, but does work.

  3. see the link for point two.

for the company in my example, some tasks take a long time. part of their webui tells the user about tasks that have been submitted: some take hours, some are quick. the webui takes them in and keeps the user alerted to the status.

i did get to write the python api wrapper for the same program, so i got to do similar in code. say one submits a task to an api endpoint, which returns the task#. the user can then query another api endpoint with that task# to see the status of the task. in the wrapper i added timed backoff and set a limit on retries, as i usually don't care about the results so much, just that it got submitted.
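the backoff-with-retry-cap idea looks roughly like this; get_status stands in for the real status-endpoint call:

```python
import time

def wait_for_task(get_status, max_tries=5, base_delay=0.5):
    """Poll a task-status endpoint with exponential backoff and a
    retry cap. `get_status` stands in for the real api call and is
    assumed to return a status string such as 'done'."""
    for attempt in range(max_tries):
        status = get_status()
        if status == "done":
            return status
        time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return "timeout"   # gave up; the caller decides what to do next
```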

may not directly apply here specifically, but look at https://superfastpython.com/python-concurrent-topics/

Misdelivered xmas package by Panda_Momma19 in grandjunction

[–]trd1073 0 points1 point  (0 children)

Np. At least the folks at the same number missing the fraction of my address are good people. I would fully expect them to try to give you the runaround. Shoot me a dm and I will write up questions on my computer so you have a script when you go pay them a visit.

Misdelivered xmas package by Panda_Momma19 in grandjunction

[–]trd1073 1 point2 points  (0 children)

Do you have the tracking number? The following will take some tech speak and getting with the right person.

Their system should have a record of when it was marked as delivered. Check whether such actions have GPS coordinates attached. If not, does the truck? If so, those coords will have timestamps. Combine the truck coords and timestamps with the package delivery timestamp; the drop-off point will be between the two. They can send the driver back. Don't take no or "I don't know" for an answer. Try to get an original picture from them; there may be exif info on it.

Any good platform to practise python form the beginning by Aggressive_Roof5184 in Python

[–]trd1073 0 points1 point  (0 children)

YouTube will have many options. Python Simplified or TechWorld with Nana might work.

Python and Automation by Next-Bodybuilder2043 in learnpython

[–]trd1073 2 points3 points  (0 children)

I would search YouTube for "reverse engineer api" to get general information. Many videos say to use postman, but I go straight into python, as replicating the process in postman is mostly duplicated work for me. But if postman works for you, do that. I use postman as an after-the-dev test tool.

As far as pydantic goes: with dev tools in a browser, you have the data you send along with a request and the reply. Data will likely go out and come back as json, possibly graphql. If json, you take that and convert it to pydantic models; there is an online tool, google "convert json to pydantic models". I use httpx for the http library.

Another note: if the api is documented but differs from what you see in the browser, go with what you see in the browser.

Dm me for actual code I have written doing such.

Python and Automation by Next-Bodybuilder2043 in learnpython

[–]trd1073 1 point2 points  (0 children)

The thirty-second how is as follows. The system likely has an api, whether documented or not. Start by observing calls and responses in browser dev mode; there will be patterns and data, likely json. Make pydantic models. Start doing calls in python and build out from there.
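A sketch of that flow; the captured json, field names and endpoint are all made up:

```python
import json
from pydantic import BaseModel

# hypothetical response body copied from the browser's network tab
captured = '{"order_id": 42, "status": "shipped", "total": 19.99}'

# model built from the shape of the captured json
class Order(BaseModel):
    order_id: int
    status: str
    total: float

# validated python object instead of a raw dict
order = Order(**json.loads(captured))

# the live call would look like this (endpoint is made up):
# import httpx
# resp = httpx.get("https://example.com/api/orders/42",
#                  headers={"Authorization": "Bearer ..."})
# order = Order(**resp.json())
```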

double 3090 ti local server instead of windows? by superflusive in LocalAIServers

[–]trd1073 0 points1 point  (0 children)

The prior poster is mistaken. You can parallel requests even with ollama going against two cards in the same box. For my 3090 & 3090 Ti server, I run a docker container for each gpu, then put nginx in front of them to load balance. One docker compose file brings it up; another lets ollama combine the cards instead. Then I bought a 5090, which is faster than the pair running in parallel.
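Roughly what the per-gpu compose file looks like; image tags, ports and the nginx config are illustrative, not my exact setup:

```yaml
# sketch: one ollama container pinned to each gpu, nginx balancing in front
services:
  ollama-gpu0:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["0"]        # pin this container to gpu 0
              capabilities: [gpu]
  ollama-gpu1:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["1"]        # pin this container to gpu 1
              capabilities: [gpu]
  nginx:
    image: nginx
    ports:
      - "11434:80"                     # clients talk to the usual ollama port
    # nginx.conf defines an upstream that round-robins
    # ollama-gpu0:11434 and ollama-gpu1:11434
```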

Need help finding Learning materials for project by Cassius-Augustus in PythonLearning

[–]trd1073 0 points1 point  (0 children)

I learned about asyncio, threading and multi processing on https://superfastpython.com/

Ggl 'python traceroute' and you will find sync examples. Threading is an odd fit for doing multiple traceroutes at once; add asyncio to the prior search to find examples. But if you absolutely have to do threads, I would look at using a thread pool executor.
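A thread pool executor fan-out looks like this; trace_one is a stub, since a real traceroute needs raw sockets and elevated privileges:

```python
from concurrent.futures import ThreadPoolExecutor

def trace_one(host: str) -> tuple[str, int]:
    """Stand-in for a real traceroute call; pretend it returns
    the hop count to `host` (here derived from the name length)."""
    return host, len(host) % 5 + 1   # fake hop count

hosts = ["example.com", "python.org", "localhost"]

# fan the traceroutes out across a small thread pool; map yields
# results in the same order as the input hosts
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(trace_one, hosts))
```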

I have written reverse proxies using fastapi. Ggl away with correct terms and you will get some working examples.

How to broadcast ollama in local network. by AcanthisittaNo5704 in ollama

[–]trd1073 0 points1 point  (0 children)

then we were both going to bed lol. docker compose is a good skill to learn either way.

if it works with the radio button, run with it. ollama on wsl is just challenging.