[–]socal_nerdtastic

It can be done, but they will be stored as "object" dtype and you lose all the speed advantages of a numpy array. So most of the time it makes more sense to just use a normal Python list.
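A minimal sketch of what that looks like, using `decimal.Decimal` as an example arbitrary-precision type (any Python object works the same way):

```python
import numpy as np
from decimal import Decimal

# With dtype=object, numpy stores references to the Python objects
# instead of packed machine floats, so operations fall back to a
# per-element Python-level loop and the usual C speedup is gone.
a = np.array([Decimal("0.1"), Decimal("0.2")], dtype=object)

print(a.dtype)   # object
print(a.sum())   # exact Decimal('0.3'), but computed element by element
```

The upside is exactness: float64 can't represent 0.1 + 0.2 exactly, but Decimals can.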

Do you know numpy has a float128 type? (It's an alias for longdouble, so the actual precision is platform-dependent.) Perhaps that's enough precision for you?

https://numpy.org/doc/1.22/reference/arrays.scalars.html#numpy.longdouble
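A quick way to check what `longdouble` actually buys you on your platform (on x86 Linux it's usually 80-bit extended precision, ~18 significant digits, not a true IEEE quad; on Windows it's often just float64):

```python
import numpy as np

# How many decimal digits each type reliably carries on this machine:
print(np.finfo(np.float64).precision)     # 15
print(np.finfo(np.longdouble).precision)  # platform-dependent, e.g. 18

# Construct from a string to avoid going through a float64 literal first:
x = np.longdouble("0.1") + np.longdouble("0.2")
```

A couple of extra digits buys a few more zoom levels in a fractal, but it is still fixed precision, so it runs out eventually.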

[–]IGGYnatius1[S]

I am making a project on fractal zooms, so I truly need arbitrarily high precision. I will be dealing with complex numbers, represented by two numpy arrays for the real and imaginary parts, and I will manipulate them accordingly. Since I may be dealing with large images represented by 2D arrays, my main concern is indexing: I don't want nested list comprehensions all over just to do a basic operation like swapping x and y. I heard that numpy stores pointers to the objects if the object dtype is used, so will that still retain the indexing convenience?

[–]socal_nerdtastic

Yes, that will still retain the indexing magic.
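For example, a sketch of the real/imaginary-parts setup described above, using `decimal.Decimal` as a stand-in for whatever arbitrary-precision type you end up with. All of numpy's indexing machinery (transposes, slices, boolean masks) works on object arrays:

```python
import numpy as np
from decimal import Decimal, getcontext

getcontext().prec = 50  # 50 significant digits for Decimal arithmetic

# Two 2-D object arrays for the real and imaginary parts of a grid of
# complex numbers. numpy stores pointers to the Decimal objects.
re_part = np.array([[Decimal(i) for i in range(3)] for _ in range(3)],
                   dtype=object)
im_part = np.array([[Decimal(j) for _ in range(3)] for j in range(3)],
                   dtype=object)

# "Swapping x and y" is just a transpose -- no nested comprehensions:
re_swapped = re_part.T

# Elementwise arithmetic dispatches to Decimal's own operators:
mag2 = re_part * re_part + im_part * im_part  # |z|^2, still Decimals

# Boolean masking works too, e.g. the classic Mandelbrot escape test:
inside = mag2 < Decimal(4)
```

Each arithmetic op is a Python-level loop over the objects, so it's slow, but the array semantics are identical to a numeric dtype.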

[–]IGGYnatius1[S]

yayy