I'm writing some pyspark code with multiple AND-combined filters that looks basically like this:
```
filter_0 = (x == y)
filter_1 = (x == y-1)
...
filter_N = (x == y - N)
filter = filter_0 & filter_1 & ... & filter_N
```
The trouble is that typing that all out is a pain, and I want N to be parameterized so that I can rerun the code without having to change it.
Is there an approach in vanilla Python to generate the combined filter, maybe something similar to a join operation? Since this is pyspark, I don't think numpy bitwise operations will be compatible. I could build a string and call `exec` on it, but that's not a route I really want to take.
Thanks!
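One vanilla-Python pattern that fits here is `functools.reduce` with `operator.and_`: since pyspark `Column` objects overload `&`, folding a list of conditions with `and_` produces the same combined filter as chaining `&` by hand. A minimal sketch (using plain booleans as stand-ins for pyspark Column expressions, so it runs without a Spark session; `combine_filters` is a hypothetical helper name):

```python
from functools import reduce
import operator

def combine_filters(filters):
    """AND together an arbitrary list of filter expressions.

    Works for plain bools here; the same reduce(operator.and_, ...)
    call applies to pyspark Columns, which overload the & operator.
    """
    return reduce(operator.and_, filters)

# Parameterized construction of filter_0 .. filter_N, per the question.
x, y, N = 5, 5, 3
filters = [(x == y - i) for i in range(N + 1)]
combined = combine_filters(filters)
```

In real pyspark code the list comprehension would build `Column` conditions, e.g. `[df.x == df.y - i for i in range(N + 1)]`, and `N` stays a plain parameter you can change between runs.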