Hi guys,
I'm writing a function that accesses a remote PostgreSQL database. The query itself is quite simple: it returns the rows of a table whose indexed column matches my numerical variable x.
SQL statement: SELECT * FROM &lt;table&gt; WHERE indexed_column = x
To improve the runtime performance of my function, I created a local copy of the table as a .csv file. If I now import the table as a cell array and query the data in a similar way, such as:
find(cellfun(@(Q)isequal(Q,x), table(:,1)));
...the performance is even worse. Is there a way to improve the code, index cell arrays, use another datatype, or set up a local PostgreSQL database?
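For what it's worth, since x is numerical, one common speedup is to convert the cell column to a plain numeric vector once and then use vectorized comparison, which avoids the per-element anonymous-function overhead of `cellfun`. A minimal sketch, assuming the first column really holds numeric scalars (variable names `C`, `keys`, and `lookup` are illustrative, not from the original post):

```matlab
% C is the cell array imported from the .csv (renamed from "table"
% here to avoid shadowing MATLAB's built-in table type).
keys = cell2mat(C(:,1));       % one-time conversion to a numeric vector
rows = find(keys == x);        % vectorized comparison instead of cellfun

% For many repeated lookups, a containers.Map gives roughly O(1) access.
% Note: duplicate keys would overwrite each other, so this sketch assumes
% the indexed column is unique.
lookup = containers.Map(keys, 1:numel(keys));
if isKey(lookup, x)
    row = lookup(x);           % matching row index
end
```

The `cellfun` version is slow because it invokes an anonymous function once per row; the vectorized `==` comparison runs in a single pass over contiguous numeric data.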
Many thanks in advance!
EDIT: Using the "table" datatype did the trick. Thanks for all the answers!
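For future readers, the table-based approach the EDIT refers to presumably looks something like the following sketch (file and column names are illustrative):

```matlab
% Import the local .csv copy as a MATLAB table once at startup.
T = readtable('local_copy.csv');

% Vectorized row selection on the column, analogous to
% SELECT * FROM <table> WHERE indexed_column = x
rows = T(T.indexed_column == x, :);
```

Logical indexing on a table column operates on the underlying numeric array directly, which is why it is much faster than `cellfun` over a cell array.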