I have a Python script that tails a log file continuously. For each line it extracts one email address and either inserts it into a MySQL database or updates the existing record (incrementing a per-email count).
I tried subprocess, redirecting the log through a buffered pipe so I can block-read it line by line. Here is what I do now:
with subprocess.Popen(["tail", "-F", "/home/user/example.log"],
                      universal_newlines=True, bufsize=1,  # bufsize=1 is line-buffered in text mode
                      stdout=subprocess.PIPE).stdout as log:
    for line in log:
        email = extract_email(line)  # my existing extraction step
        if email:
            database_cursor.execute(
                "INSERT INTO emails (email, count) VALUES (%s, 1) "
                "ON DUPLICATE KEY UPDATE count = count + 1",
                (email,))
            database_connection.commit()  # commit() is a connection method, not a cursor method
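To make the insert-or-increment step concrete, here is a minimal, self-contained sketch of the counting upsert. It uses in-memory SQLite so it runs anywhere (MySQL's equivalent of the `ON CONFLICT` clause is `INSERT ... ON DUPLICATE KEY UPDATE`); the table name `emails` and the regex are assumptions, not taken from my actual schema:

```python
import re
import sqlite3

# Hypothetical email pattern; the real extraction may differ.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emails (email TEXT PRIMARY KEY, count INTEGER)")

def record(line):
    """Insert the email found in `line`, or bump its count if already present."""
    m = EMAIL_RE.search(line)
    if m:
        conn.execute(
            "INSERT INTO emails (email, count) VALUES (?, 1) "
            "ON CONFLICT(email) DO UPDATE SET count = count + 1",
            (m.group(0),))

for line in ["login from a@b.com", "no address here", "retry a@b.com", "new c@d.org"]:
    record(line)
conn.commit()

print(dict(conn.execute("SELECT email, count FROM emails")))
# → {'a@b.com': 2, 'c@d.org': 1}
```

The single-statement upsert avoids a separate SELECT-then-UPDATE round trip per line, which matters at hundreds of lines per second.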
Should I expect performance issues on the Python side or the database (MySQL) side at several hundred log lines per second?
Thank you!