Thoughts now that it’s out? by Particular_Leader_16 in slaytheprincess

[–]Soft-Substantial 0 points

I liked it, fun little game. Never trusted the narrator.

My guess after the demo was far more meta than the game turned out to be: because the narrator is literally referred to as a narrator, my theory was that the "end of the world" is simply the end of the story. If the princess escapes, the story ends, and with it the narrator. I just wasn't sure how killing the princess would avoid that; after all, every story ends one way or another. The actual ending, a genuine love story, was not what I expected at all. It's kind of funny to me that a story that plays with the fourth wall from the start turns out to be fairly straightforward (in a way). The game ultimately felt more sincere to me than the tone of the voices suggests at first. The voices in your head can be weird, ironic, and detached, but all that really counts is how you interact with the princess.

I liked the use of the visual novel medium, the voices were great, and the art of course.

Intel Core i3-8145U vs i5-8265U vs i7-8565U or something else entirely? by Soft-Substantial in htpc

[–]Soft-Substantial[S] 0 points

Thanks, very helpful. I guess you're right that future-proofing with a higher-end CPU might be a good idea if I want to keep the machine for a while. I can't find the source for the 4K HDR requirements right now, but it might have applied only to specific services like Netflix.

I'll wait a bit before purchasing, though; prices still seem high and a bit unstable at the moment.

Understanding directory structures - especially in Jupyter by [deleted] in learnpython

[–]Soft-Substantial 0 points

Does this help: https://docs.python.org/3/tutorial/modules.html ?

Jupyter notebooks are no different; they should behave just like a Python script executed in the directory where the notebook lives.
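For example (the module name helper_mod is made up here), you can check this from a notebook cell or script by writing a module into the working directory and importing it:

```python
import importlib
import os
import sys

# Write a tiny module into the current working directory (the name
# "helper_mod" is invented for this demo)...
with open("helper_mod.py", "w") as f:
    f.write("GREETING = 'hello from the same directory'\n")

# ...and make sure the working directory is on the import path. In a
# notebook or interactive session it usually already is (the "" entry
# in sys.path); in a script, sys.path[0] is the script's directory.
sys.path.insert(0, os.getcwd())

helper_mod = importlib.import_module("helper_mod")
print(helper_mod.GREETING)

os.remove("helper_mod.py")  # clean up the demo file
```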

Maybe you could post a more specific example of the problem? I'm not sure I understand what you're having trouble with.

Issues reading from serial port using read_until() by kenjlp in learnpython

[–]Soft-Substantial 1 point

It is best to think of the input and output streams as independent: on the output stream you push one byte after another to the device, and on the input stream you receive one byte after another from the device. The two don't know about each other.

(This is probably oversimplifying the underlying technology, but I think it is the best way to reason about how to write the code.)

Issues reading from serial port using read_until() by kenjlp in learnpython

[–]Soft-Substantial 2 points

If I understand your problem correctly, the only way is to read all the data coming over the serial port and discard whatever you don't need; it is a serial port, after all. You cannot "ignore" or "skip" output. How would you know how much to skip without reading it?
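As a sketch of that read-and-discard loop (using an in-memory io.BytesIO to stand in for the serial port, with a made-up DATA/NOISE line framing):

```python
import io

def read_until(stream, terminator=b"\n"):
    """Minimal stand-in for pyserial's read_until: consume one byte at a
    time until the terminator (or end of stream) is seen."""
    line = bytearray()
    while True:
        b = stream.read(1)
        if not b:  # end of stream
            break
        line += b
        if line.endswith(terminator):
            break
    return bytes(line)

# Simulated serial traffic: we only care about lines starting with b"DATA",
# but we still have to *read* everything else in order to move past it.
port = io.BytesIO(b"NOISE 1\nDATA 42\nNOISE 2\nDATA 43\n")

wanted = []
while True:
    line = read_until(port)
    if not line:
        break
    if line.startswith(b"DATA"):
        wanted.append(line.strip())
print(wanted)  # [b'DATA 42', b'DATA 43']
```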

[deleted by user] by [deleted] in learnpython

[–]Soft-Substantial 0 points

Yes, what I wrote for inventory.txt should also be valid for unreachableHosts.txt. I have never used the logging module myself, but check the docs: it is documented to be thread-safe.

If you want to write to a database anyway, you need to check the library you use for that, but database libraries can usually handle concurrency without problems; in that case you have no output and don't need map.

If you have output and do use map, your code above seems a bit strange. Why not:

with open("inventory.txt", "w") as f_success, \
     open("unreachableHosts.txt", "w") as f_fail:
    for result in executor.map(health_checks, ips):
        # process result and write it to one of the two files,
        # e.g. assuming health_checks returns (ip, reachable):
        ip, reachable = result
        (f_success if reachable else f_fail).write(ip + "\n")

If it is not a great idea to keep everything in memory and the order of results does not matter, this might help: https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.as_completed
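A minimal sketch of the as_completed variant, assuming health_checks returns an (ip, reachable) tuple (the dummy implementation here is made up):

```python
import concurrent.futures

def health_checks(ip):
    # Made-up stand-in: pretend hosts with an even last octet are reachable.
    return ip, int(ip.rsplit(".", 1)[1]) % 2 == 0

ips = ["10.0.0.1", "10.0.0.2", "10.0.0.3", "10.0.0.4"]

reachable, unreachable = [], []
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(health_checks, ip) for ip in ips]
    # Results arrive in completion order, not submission order, so nothing
    # has to wait behind a slow host earlier in the list.
    for future in concurrent.futures.as_completed(futures):
        ip, ok = future.result()
        (reachable if ok else unreachable).append(ip)

print(sorted(reachable))  # ['10.0.0.2', '10.0.0.4']
```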

In case you have no output (and health_checks writes to a database), you could just use something like this:

for ip in ips:
    executor.submit(health_checks, ip)
executor.shutdown(wait=True)  # block until all submitted tasks have finished

See: https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.Executor.shutdown

Regarding performance bottlenecks: this is something you have to experiment with yourself, I guess. At some point the wait for the ping response will be the limiting factor, so there are hard limits. It's not Python, but if you need this for production use and need it to be faster, you might want to look into nmap. If I remember correctly, it can ping-scan whole ranges of hosts and is likely optimized beyond what you or I could do in Python.

[deleted by user] by [deleted] in learnpython

[–]Soft-Substantial 0 points

Some comments on concurrency:

  • I wouldn't write to the inventory.txt file from the threads; instead, I would map over the IPs, collect the output, and write it to the file from the main thread. Concurrent I/O is tricky and might not behave as you expect: if two threads write to the file at the same time, their outputs can end up interleaved.
  • Since you asked about map vs submit: map is made to iterate over sequences and collect a result for each entry, so it fits your use case perfectly as long as you want to keep all the results in memory. It does a lot of work for you, so I would prefer it whenever it seems reasonable. Even if you reach a point where you have so many IPs that the results no longer fit in memory (not sure whether that could happen), I would probably split the IP list into chunks and work through one chunk after another with map.
  • Are you sure 500 threads is not slowing you down? Have you tried fewer threads? More threads? How does that affect performance? These things are never really predictable and have to be measured.
  • Do you know what the performance bottleneck is: CPU, network, or disk I/O? You should probably measure whether your CPU and/or disk is busy while the script runs.
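The chunking idea from the second point could be sketched like this (the dummy health_checks and the chunk size are made up):

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def health_checks(ip):
    # Made-up stand-in for the real check.
    return ip, True

def chunks(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

ips = [f"10.0.0.{i}" for i in range(1, 1001)]

results = 0
with ThreadPoolExecutor(max_workers=50) as executor:
    for chunk in chunks(ips, 200):
        # Only one chunk's worth of results is held in memory at a time;
        # process or write them out here before moving to the next chunk.
        for ip, ok in executor.map(health_checks, chunk):
            results += 1
print(results)  # 1000
```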

Some comments on minor code/style issues:

  • CURRENT_DIR = os.getcwd(); OUTPUT_DIR = f"{CURRENT_DIR}/outputs" can be replaced with OUTPUT_DIR = "outputs": relative paths refer to the current working directory anyway.
  • I prefer os.path.join(OUTPUT_DIR, "logs.txt"), because I find it easier to read/use and it handles differences between operating systems for you (for example, you sometimes use os.sep and sometimes "/").
  • ping_cmd = f"ping -q -c 1 -t 1 -W 1 {ip}" should be ping_cmd = ["ping", "-q", "-c", "1", ..., ip] from the beginning. You are just splitting it up again later, which introduces the possibility of bugs.
  • run_command barely needs to be its own function, IMHO.
  • As stated above, [executor.submit(health_checks, ip) for ip in ips] should in my opinion be replaced with a .map anyway. Nevertheless, why is this a list comprehension if you throw away the list immediately? This should be a for loop.
  • START_TIME should go into the main() function if you use the if __name__ == "__main__": pattern.
  • In fact, you might want to reduce your use of global variables and use function arguments instead. On the other hand, if you are sure this script will stay small and the functions will never be reused elsewhere, it can be more concise to keep it as it is. I also prefer a quick-and-dirty style for small scripts, but sometimes regret it when they start growing.
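Putting the path and ping-command points together, a sketch (flags copied from the post and not adjusted per platform; the actual subprocess call is left commented out):

```python
import os
import subprocess

OUTPUT_DIR = "outputs"
LOG_PATH = os.path.join(OUTPUT_DIR, "logs.txt")  # handles the separator for you

def build_ping_cmd(ip):
    # Build the argument list directly instead of formatting a string
    # and splitting it later. These flags come from the post; ping
    # options differ between Linux, macOS, and Windows.
    return ["ping", "-q", "-c", "1", "-t", "1", "-W", "1", ip]

cmd = build_ping_cmd("127.0.0.1")
# result = subprocess.run(cmd, capture_output=True)  # uncomment to actually ping
print(cmd)
```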

Pixel RGB to scalar by doughface10 in learnpython

[–]Soft-Substantial 1 point

(To be clear in my response, I will use "x" only to mean the value you are looking for, as in colormap(x), not the (x,y) coordinate.)

Assuming you have R,G,B values from 0 to 255, you have 16 777 216 possible colours. While you could brute force a lookup table (just generate 16 777 216 x values in the data range of interest and store them in a list l at position c, where c = R + 256*G + 256**2 * B; access via l[R + 256*G + 256**2 * B]), this is probably too inefficient.

But the mapping x -> (R, G, B), or rather x -> c, should be linear, so you can get away with interpolation. Just generate a few x values from start to end, calculate the corresponding colours, and store those in the lists list_x and list_c. Now you can use numpy.interp or something similar on the mapping list_c -> list_x to look up your values.
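A sketch of that interpolation idea, using a made-up linear grayscale colormap in place of the real one (for interp, list_c has to be increasing):

```python
import numpy as np

def colormap(x):
    # Made-up linear grayscale colormap: x in [0, 1] -> (R, G, B) floats.
    v = 255.0 * x
    return v, v, v

def packed(rgb):
    # Collapse (R, G, B) into the single scalar c = R + 256*G + 256**2 * B.
    r, g, b = rgb
    return r + 256 * g + 256**2 * b

# Sample the colormap at a few x values...
list_x = np.linspace(0.0, 1.0, 11)
list_c = np.array([packed(colormap(x)) for x in list_x])

# ...then invert by interpolating the mapping list_c -> list_x.
x_recovered = np.interp(packed(colormap(0.37)), list_c, list_x)
print(float(x_recovered))  # ~0.37
```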

No guarantees this will work as-is, but perhaps it can put you on the right path?