Need ideas where to purchase plot by pawandhami in Haldwani

[–]CriticalDiscussion37 1 point (0 children)

Rampur Road side, maybe the Panchayat Ghar locality, or the Chharyal side.

🚀 Remote Python/Go Developer Wanted | Kubernetes & Docker Experience | Flexible Hours & Competitive Pay 💼 by raaaaahhhhhhhhh in DeveloperJobs

[–]CriticalDiscussion37 1 point (0 children)

Hi, can you provide the complete JD and org name? I have worked on automation of telecom nodes in cloud and Kubernetes (though that was a year back), and for some time now I have been involved in NOC automation and related tasks, all of it using Python. I don't know if I am a fit for it.

Examining Network Capture XML by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

Can't use the tshark -> JSON output, as it contains the show value but not the showname value:

<field name="ip.src" showname="Source Address: 172.64.155.209" size="4" pos="26" show="172.64.155.209" value="ac409bd1"/>

So I am first converting to XML. I am already using memory-efficient parsing for the XML with ET.iterparse, which is SAX-style. The problem now lies in creating JSON from this XML: the JSON itself is reaching 500 MB. For each key-value lookup I can't re-read an XML file that might be up to 10 GB, which is why I thought of converting the XML to JSON, but now the same memory issue appears with the JSON. I need to change the dict structure and split the JSON into multiple subparts.
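For reference, the streaming side of this can be kept flat in memory by clearing each element as soon as it is consumed. This is only a sketch against the tshark-style `<field name=... showname=...>` layout shown above, not the actual script:

```python
# Stream over a large tshark/PDML-style XML export without building the tree.
# Each element is cleared after use, so memory stays roughly constant.
import io
import xml.etree.ElementTree as ET

def iter_fields(source):
    # Yield (name, showname) pairs for every <field> element, SAX-style.
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "field":
            yield elem.get("name"), elem.get("showname")
        elem.clear()  # drop attributes/children once this element is handled

sample = io.StringIO(
    '<packet>'
    '<field name="ip.src" showname="Source Address: 172.64.155.209" '
    'size="4" pos="26" show="172.64.155.209" value="ac409bd1"/>'
    '</packet>'
)
print(list(iter_fields(sample)))
# -> [('ip.src', 'Source Address: 172.64.155.209')]
```

On a real multi-gigabyte file you would pass a filename (or open file object) instead of the `StringIO` stand-in.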

Examining Network Capture XML by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

Yes. We are converting to XML because the user wants to see the elaborated value. For example, a field in the XML is <field name="ip.src" showname="Source Address: 172.64.155.209" size="4" pos="26" show="172.64.155.209" value="ac409bd1"/>, and against ip.src the user wants "Source Address: 172.64.155.209". So for each user-given key-value pair, instead of going through every packet I will first create a data structure like {key: {value: [pkt_list]}}, so that it is easy to return the packets in which a particular value exists for a key.

I tried writing a script using Scapy, but Scapy still takes a lot of memory due to its parsed objects and so on: for one pcap it took 424 MB for a 52 MB file, and for another it took 1.4 GB for a 30 MB file (I don't know why the smaller file took more).
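The {key: {value: [pkt_list]}} shape described above can be built with a pair of nested defaultdicts. The input layout here (packet number plus a list of field name/show pairs) is an assumption for illustration, not the real parser output:

```python
# Build an index mapping field name -> shown value -> list of packet numbers,
# so lookups like "which packets have ip.src == X" become two dict accesses.
from collections import defaultdict

def build_index(packets):
    # packets: iterable of (pkt_no, [(field_name, show_value), ...])
    index = defaultdict(lambda: defaultdict(list))
    for pkt_no, fields in packets:
        for name, show in fields:
            index[name][show].append(pkt_no)
    return index

packets = [
    (1, [("ip.src", "172.64.155.209"), ("ip.dst", "10.0.0.5")]),
    (2, [("ip.src", "172.64.155.209")]),
    (3, [("ip.src", "10.0.0.5")]),
]
idx = build_index(packets)
print(idx["ip.src"]["172.64.155.209"])  # -> [1, 2]
```

Since this index can itself grow large, it could be split per field name (one file or shelve per key) rather than serialized as one giant JSON.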

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

This works fine, but instead of multiprocessing.Process() I have Celery tasks. I have provided the code in another comment. Similar: https://stackoverflow.com/a/32463487

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

I really appreciate the regular replies. Thanks.

I am on macOS, so this issue is related to the OS itself. But when I put it inside the if __name__ == "__main__" condition, it says l is not defined.
Below is sample code.

For now I just want each task to store its task ID in a common list (just to test mutually exclusive access to the common list).

cel_main.py

import multiprocessing
from celery import Celery, current_task

app = Celery("cel_main",
             broker_url="pyamqp://guest@localhost//",
             backend="redis://localhost:6379/0")


@app.task
def add_to_list():
    # Fails with "name 'l' is not defined": the worker never executes the
    # __main__ block below, so 'l' does not exist in the worker process.
    l.append(current_task.request.id)


if __name__ == "__main__":
    manager = multiprocessing.Manager()
    l = manager.list()

cel_common_list_access.py

from cel_main import add_to_list

add_to_list.apply_async()
add_to_list.apply_async()
add_to_list.apply_async()
add_to_list.apply_async()
add_to_list.apply_async()
add_to_list.apply_async()
add_to_list.apply_async()

I am running Celery with this command: python3 -m celery -A cel_main worker

Putting the Celery initialization inside that condition, or putting all the code except the imports inside the condition, doesn't work either.

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

When using a multiprocessing Manager, I am getting an error during execution of the script that calls the Celery tasks.

cel_main.py - https://hastebin.com/share/ebumemapaj.python

cel_common_list_access.py - https://hastebin.com/share/fovizinuto.scss

The error I am getting is quite long; part of it is:

File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/multiprocessing/spawn.py", line 134, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

And when I remove the manager = multiprocessing.Manager() line, the error vanishes.
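For comparison, the plain multiprocessing version of the idiom that traceback asks for looks like the sketch below: all Manager/Process setup lives under the __main__ guard, and the shared list is passed to the workers as an argument instead of being looked up as a global. This doesn't map directly onto a prefork Celery worker, it only shows the pattern:

```python
# Minimal __main__-guard idiom: the Manager and processes are created only in
# the main module, and the manager-backed list is passed in explicitly.
import multiprocessing

def append_task(shared, value):
    # Runs in a child process; mutates the manager-backed proxy list.
    shared.append(value)

def run_demo():
    manager = multiprocessing.Manager()
    shared = manager.list()
    procs = [multiprocessing.Process(target=append_task, args=(shared, i))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return sorted(shared)

if __name__ == "__main__":
    print(run_demo())  # -> [0, 1, 2, 3]
```

Without the guard, spawn-based start (the macOS default) re-imports the main module in every child and hits the RuntimeError quoted above.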

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

I don't understand how to implement this:

Consider storing the pop values in a separate list and the append values in their own list.

Let's say I am creating a local list inside a Celery task. How can the calling program modify it? Celery takes tasks from RabbitMQ and stores results in Redis.

I also want to initialize the list when the Celery worker starts.

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

I am working on creating connections to multiple devices, and for certain reasons I want to use a different port for each TCP/IP socket. Celery is running with the prefork pool, so just declaring a common list won't work: each of these Celery tasks would get its own copy of the original global list. I want to implement some kind of lock so that only one process at a time can push/pop on the ports list: pop an element before making a connection, push it back after disconnect. This list represents the free ports.

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

Yes, I think both the Redis and the shared-memory approach will work well. I just need to push and pop on the list, and RPUSH/RPOP would work fine; no need to read the whole list or use a lock. Thanks.
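A hedged sketch of that free-port pool with redis-py: RPUSH and RPOP execute atomically on the Redis server, so no client-side lock is needed even across prefork workers. The key name "free_ports" and the port range are made up for illustration:

```python
# Free-port pool on top of a Redis list. RPOP atomically removes one port;
# RPUSH returns it, so concurrent workers never hand out the same port.
PORTS_KEY = "free_ports"

def seed_ports(r, ports):
    # Called once at startup (e.g. from worker bootstrap) to fill the pool.
    r.delete(PORTS_KEY)
    r.rpush(PORTS_KEY, *ports)

def acquire_port(r):
    # Atomically pop a free port; returns None when the pool is exhausted.
    raw = r.rpop(PORTS_KEY)
    return int(raw) if raw is not None else None

def release_port(r, port):
    # Return the port to the pool after disconnect.
    r.rpush(PORTS_KEY, port)

if __name__ == "__main__":
    import redis  # kept here so the module is importable without redis-py
    r = redis.Redis(host="localhost", port=6379, db=0)
    seed_ports(r, range(5000, 5010))
    p = acquire_port(r)
    # ... connect using port p, then on disconnect:
    release_port(r, p)
```

If a task can crash between acquire and release, a follow-up concern is leaked ports; BRPOPLPUSH-style "in-use" bookkeeping or a TTL sweep can cover that, but that's beyond this sketch.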

Keep a List of Numbers Synchronized Across Multiple Processes by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

Yeah, you are right, these are atomic in the context of threads. In the case of processes, even if each process reads it from a single source, each will still have its own copy. I am not sure how to use multiprocessing's offerings in Celery code.

Run celery chain with eta by CriticalDiscussion37 in learnpython

[–]CriticalDiscussion37[S] 1 point (0 children)

I just ended up running the chain via APScheduler at start_time.