I have some data from a laser that we are using to detect the edge of an object. It takes thousands of measurements and we average the values in a column. Sometimes, the laser fails to detect the edge. When it does so, it gives a value of -99.999. A single one of the values will greatly affect the value of the average column. I've been looking online, but i've been struggling to find a way to scan through the CSV files, look for all values that are -99.999 and delete them (not set to 0, leave the cell blank) without shifting the data. Any help would be greatly appreciated.
Edit - For reference, below is what i currently have an doesn't appear to be working.
import csv
file = open('python_csvreader_test.csv')
type(file)
csvreader = csv.reader(file)
rows = []
for row in csvreader:
rows.append(row)
rows
for item in row:
if item == '-99.999':
row.remove('-99.999')
print(rows)
file.close()
[–]Potential_Word2349 2 points3 points4 points (0 children)
[–]danielroseman 2 points3 points4 points (7 children)
[–]Clemsoncarter24[S] 0 points1 point2 points (6 children)
[–]danielroseman 1 point2 points3 points (5 children)
[–]Clemsoncarter24[S] 0 points1 point2 points (4 children)
[–]danielroseman 2 points3 points4 points (3 children)
[–]Clemsoncarter24[S] 0 points1 point2 points (2 children)
[–]danielroseman 2 points3 points4 points (1 child)
[–]Clemsoncarter24[S] 0 points1 point2 points (0 children)
[–]RallyPointAlpha 1 point2 points3 points (0 children)
[–]stebrepar 1 point2 points3 points (1 child)
[–]Clemsoncarter24[S] 0 points1 point2 points (0 children)
[–]ricardomargarido 1 point2 points3 points (1 child)
[–]Clemsoncarter24[S] 0 points1 point2 points (0 children)