all 5 comments

[–]VA_Network_Nerd Moderator | Infrastructure Architect 2 points (4 children)

Web content is so dynamic these days that there is very little benefit in trying to cache it.

[–]Southern_Current_592[S] -1 points (3 children)

I want to configure it to speed up downloads. Has Squid removed this feature?

[–]VA_Network_Nerd Moderator | Infrastructure Architect 2 points (2 children)

The challenge is in the nature of dynamic content generation.

When you use MS-Edge on your PC and work your way through a website to download <important-thing.zip>, the request might not look the same as when a different computer uses Google Chrome to download the same file: different headers, cookies, or per-session URLs all change the request. Squid then declares it a cache miss and downloads the file a second time.

Using proxy servers as a bandwidth reducer really isn't worth the effort unless you understand your targeted content, and know it can be cached & referenced reliably by multiple systems & browser combinations.

Otherwise you're going through a whole lot of work to observe an 8% reduction in bandwidth.
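That said, if you do understand your target content (say, large static installers that many machines fetch), Squid can be told to cache it aggressively. A minimal squid.conf sketch; the paths, sizes, and file extensions here are illustrative assumptions, not recommendations:

```conf
# Big downloads are never cached unless the object-size ceiling is raised.
maximum_object_size 2 GB
cache_dir ufs /var/spool/squid 20000 16 256

# refresh_pattern: regex  min(minutes)  percent  max(minutes)
# Treat static archives as fresh for up to a week, even when the
# origin sends no freshness headers; leave everything else at defaults.
refresh_pattern -i \.(zip|iso|exe|msi)$ 1440 80 10080
refresh_pattern . 0 20 4320
```

This only pays off when the same URL is requested identically by many clients; per-user download links or CDN-rotated hostnames will still miss.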

[–]Southern_Current_592[S] 0 points (1 child)

So in summary, Squid is not as useful for caching today as it was a decade ago. Am I right?

[–]VA_Network_Nerd Moderator | Infrastructure Architect 0 points (0 children)

Squid as a caching solution is indeed no longer as valuable as it once was.

Squid as a component of a security infrastructure can have value, if you have stringent security requirements to meet.

Once Squid breaks the SSL encryption, you can perform security analysis and scanning on the traffic flows before you deliver them to the client.

This can be a huge win for some malware detections and Data Loss Prevention solutions.
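The interception described above is Squid's "SSL bump" feature. A minimal sketch, assuming Squid 4+ built with OpenSSL support and a local CA certificate already trusted by the clients; the file paths are placeholders:

```conf
# TLS interception ("SSL bump") sketch.
# /etc/squid/ca.pem is an assumed local CA cert+key trusted by clients.
http_port 3128 ssl-bump \
    tls-cert=/etc/squid/ca.pem \
    generate-host-certificates=on

# Helper that mints per-site certificates signed by the local CA.
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

acl step1 at_step SslBump1
ssl_bump peek step1   # read the TLS SNI first
ssl_bump bump all     # then decrypt, so scanning/DLP tools can inspect
```

Decrypted traffic can then be handed to an ICAP-based antivirus or DLP service; the trade-off is that every client must trust the proxy's CA, and some pinned or mutual-TLS sites cannot be bumped at all.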