Squid abandoned? And what are the alternatives?

After writing this post, I read online that Squid still has years-old vulnerabilities and that some people are thinking about disabling it.
What about IPFire?
Are there any alternatives, especially for managing content filtering and antivirus?

I read that Squid installations are decreasing; do you see the same trend?
In your opinion, why?

Where did you read all those negative remarks about Squid? It’s well-maintained, widely used and secure software; see the handling of CVEs on their site: Security Advisories · squid-cache/squid · GitHub


Squid can only analyze HTTP connections, unless you configure it as a man-in-the-middle, which is a bad idea for many reasons (most importantly the security implications of this mode).

This is why Squid is nearly useless today: most traffic between servers and clients is encrypted, so no caching and no filtering is possible with Squid anymore.


This might be true for home use and small offices. But if you want to bundle and channel the web traffic of many clients for security or other reasons, Squid is still a preferred solution.

1 Like

URL-based content filtering should still be possible, right?
In that case, public blacklists that can be integrated into tools like SquidGuard should still work.
However, I wonder if it is still possible to use ClamAV.
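For what it’s worth, here is a minimal sketch of how a public blacklist can be wired into SquidGuard; the paths and the category name are illustrative assumptions, not IPFire defaults:

```
# squidGuard.conf sketch (illustrative paths and category names)
dbhome /var/lib/squidguard/db      # where the downloaded blacklists live
logdir /var/log/squidguard

# One destination group per blacklist category
dest ads {
    domainlist blacklists/ads/domains
    urllist    blacklists/ads/urls
}

acl {
    default {
        pass !ads all                               # block "ads", allow the rest
        redirect http://proxy.example.lan/blocked.html
    }
}
```

The domains/urls files are exactly the format the common public blacklists ship in, which is why they drop in so easily.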

I agree, but if Squid isn’t usable, are there alternatives?


I read in the forums that pfSense and OPNsense are reconsidering their projects, limiting or abandoning Squid.
Other proprietary firewalls appear to have already done so.

1 Like

Only for HTTP traffic. But most traffic is now HTTPS, which is encrypted, so no filtering can be done on the content, only at the IP address level, for which there is the IP Blocklist function.

I understand that the content is encrypted, but can the URL not be filtered either?
If not, IPFire’s URL Filter doesn’t work either.
And OpenDNS filters shouldn’t work either. I haven’t used them in a long time, but if I remember correctly there were category-based filters.

The URL is also encrypted. Only the destination IP and the server domain name are readable, but there can be multiple URLs associated with them.
In 2019 there was a draft RFC for also encrypting the server domain name: Encrypted SNI (ESNI). This only works with TLS 1.3 and is not backwards compatible with TLS 1.2.
However, if someone is sniffing your traffic, a reverse DNS lookup of the IP that was used will probably still give that domain name.
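As an illustration of that last point, a reverse (PTR) lookup needs only the IP address; here is a minimal Python sketch (the helper name is mine):

```python
import socket

def reverse_lookup(ip):
    """Return the PTR hostname registered for an IP, or None if there is none."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return hostname
    except OSError:
        return None

# A public server IP will often reveal the hosting domain this way,
# even when the SNI in the TLS handshake is encrypted.
print(reverse_lookup("127.0.0.1"))
```

Note that on shared hosting or behind a CDN the PTR record often points at the provider rather than the site, so this only “probably” gives the domain name, as said above.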

Encrypted SNI (ESNI) can now be enabled in several browsers. In Firefox you have to set the


preference to true in about:config, i.e. it is not enabled by default.
Setting it to true means that the server domain name of the site being visited will also be encrypted.

That is correct. The IPFire URL Filter only works with HTTP traffic, not HTTPS.

The Update Accelerator likewise only works with HTTP update sites, not ones that use HTTPS.

If you have defined OpenDNS as a DNS server for IPFire, then it is the destination of your DNS requests, and therefore the encrypted DNS traffic is decrypted by OpenDNS. However, if the DNS provider is also filtering the results, this causes a problem with DNSSEC validation.
This blog post
says the following about filtering DNS results:

Another question frequently asked is why IPFire does not support filtering DNS replies for certain FQDNs, commonly referred to as a Response Policy Zone (RPZ). This is because an RPZ does what DNSSEC attempts to secure users against: Tamper with DNS responses. From the perspective of a DNSSEC-validating system, a RPZ will just look like an attacker (if the queried FQDN is DNSSEC-signed, which is what we strive for as much of them as possible), thus creating a considerable amount of background noise. Obviously, this makes detecting ongoing attacks very hard, most times even impossible - the haystack to search just becomes too big.

So you can use DNS filtering from OpenDNS and other DNS servers, but by doing so you will make it hard, if not impossible, to detect improper tampering with the DNS responses, because as far as DNSSEC is concerned that tampering looks the same as the DNS filtering.

1 Like

If, instead of activating the transparent proxy for HTTP and HTTPS, I allow browsing with authentication, can I still filter the traffic and use ClamAV?
Or will I still have the “man in the middle” problem?
Do I always have to install a certificate?

Whatever you do, the HTTPS traffic will always be encrypted.

Squid has an option to decrypt and re-encrypt the traffic, but that option is not enabled in the IPFire Squid build. That is the man-in-the-middle approach, where the certificates presented to clients differ from those the actual web sites provide.

If you had that, bank logins and some others won’t work, because they check that the connection was made with the correct certificate and, if not, will not let you proceed.
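For reference, the decrypt/re-encrypt option is Squid’s “SSL Bump” feature. A minimal sketch of what enabling it would look like in squid.conf (the certificate path is a placeholder, and again, this is not available in the IPFire build):

```
# squid.conf sketch: SSL Bump (MITM) mode - NOT enabled in IPFire's Squid
http_port 3128 ssl-bump \
    tls-cert=/etc/squid/mitm-ca.pem \
    generate-host-certificates=on

acl step1 at_step SslBump1
ssl_bump peek step1     # step 1: read the client's SNI
ssl_bump bump all       # then decrypt and re-encrypt with our own CA
```

Every client would also have to trust that CA certificate, which is exactly why pinned logins such as banking sites break.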

1 Like


Hi Adolf,

Most probably I do not understand this Squid thing properly. I still use Squid with the URL blacklist, and the logs show that HTTPS (port 443) and even DoH connections are intercepted.

Arne’s comment says “most traffic”.
Is it a case of “most”? Are those logged connections on port 443 not encrypted, even though they use an HTTPS address?


Log file:

URL filter
Total number of URL filter hits for November 18: 79
Time	Category	Client	Username	Destination
19:16:05	publicite	-	static.adsafeprotected.com:443
17:54:17	publicite	-	static.adsafeprotected.com:443
17:50:13	publicite	-	static.adsafeprotected.com:443
17:18:28	doh	-	cloudflare-dns.com:443
15:18:22	publicite	-	ads.rubiconproject.com:443
15:18:21	publicite	-	securepubads.g.doubleclick.net:443
15:17:34	publicite	-	securepubads.g.doubleclick.net:443
15:17:34	publicite	-	ads.rubiconproject.com:443
15:17:17	publicite	-	ads.rubiconproject.com:443
15:17:16	publicite	-	securepubads.g.doubleclick.net:443
15:15:31	publicite	-	securepubads.g.doubleclick.net:443
15:15:25	publicite	-	securepubads.g.doubleclick.net:443
15:15:03	publicite	-	securepubads.g.doubleclick.net:443
15:14:58	publicite	-	securepubads.g.doubleclick.net:443
15:14:45	publicite	-	securepubads.g.doubleclick.net:443
15:13:01	publicite	-	securepubads.g.doubleclick.net:443
15:12:07	publicite	-	securepubads.g.doubleclick.net:443
15:12:04	publicite	-	securepubads.g.doubleclick.net:443
15:11:23	publicite	-	securepubads.g.doubleclick.net:443
15:11:15	publicite	-	securepubads.g.doubleclick.net:443
15:08:46	publicite	-	securepubads.g.doubleclick.net:443
15:08:42	publicite	-	securepubads.g.doubleclick.net:443
15:08:19	publicite	-	securepubads.g.doubleclick.net:443
15:08:10	publicite	-	securepubads.g.doubleclick.net:443
15:07:53	publicite	-	securepubads.g.doubleclick.net:443
15:05:31	publicite	-	securepubads.g.doubleclick.net:443
15:04:47	publicite	-	securepubads.g.doubleclick.net:443
15:04:25	publicite	-	securepubads.g.doubleclick.net:443
15:01:50	publicite	-	securepubads.g.doubleclick.net:443
15:01:28	publicite	-	securepubads.g.doubleclick.net:443
15:01:21	publicite	-	securepubads.g.doubleclick.net:443
15:01:16	publicite	-	securepubads.g.doubleclick.net:443
15:01:06	publicite	-	securepubads.g.doubleclick.net:443
14:58:30	publicite	-	securepubads.g.doubleclick.net:443
14:57:59	publicite	-	securepubads.g.doubleclick.net:443
14:56:26	publicite	-	ads.rubiconproject.com:443
14:56:26	publicite	-	securepubads.g.doubleclick.net:443
14:54:39	publicite	-	ads.rubiconproject.com:443
14:54:38	publicite	-	securepubads.g.doubleclick.net:443
14:53:09	publicite	-	ads.rubiconproject.com:443
14:53:09	publicite	-	securepubads.g.doubleclick.net:443
14:50:54	publicite	-	ads.rubiconproject.com:443
14:50:54	publicite	-	securepubads.g.doubleclick.net:443
14:48:41	publicite	-	ads.rubiconproject.com:443
14:48:40	publicite	-	securepubads.g.doubleclick.net:443
14:47:05	publicite	-	ads.rubiconproject.com:443
14:47:05	publicite	-	securepubads.g.doubleclick.net:443
14:46:17	publicite	-	ads.rubiconproject.com:443
14:46:17	publicite	-	securepubads.g.doubleclick.net:443
14:45:56	publicite	-	ads.rubiconproject.com:443
14:45:56	publicite	-	securepubads.g.doubleclick.net:443
14:43:50	publicite	-	ads.rubiconproject.com:443
14:43:49	publicite	-	securepubads.g.doubleclick.net:443
04:05:47	publicite	-	analytics.google.com:443
02:08:04	publicite	-	analytics.google.com:443
02:07:59	publicite	-	analytics.google.com:443
02:07:42	publicite	-	analytics.google.com:443
02:07:42	publicite	-	analytics.google.com:443
02:07:41	publicite	-	analytics.google.com:443
02:06:59	publicite	-	analytics.google.com:443
02:06:59	publicite	-	analytics.google.com:443
01:56:05	publicite	-	analytics.google.com:443
01:55:59	publicite	-	analytics.google.com:443
01:55:56	publicite	-	analytics.google.com:443
01:55:50	publicite	-	analytics.google.com:443
01:54:52	publicite	-	analytics.google.com:443
01:54:46	publicite	-	analytics.google.com:443
01:54:42	publicite	-	analytics.google.com:443
01:54:36	publicite	-	analytics.google.com:443
01:54:28	publicite	-	analytics.google.com:443
01:54:28	publicite	-	analytics.google.com:443
01:54:28	publicite	-	analytics.google.com:443
01:54:21	publicite	-	analytics.google.com:443
01:53:56	publicite	-	analytics.google.com:443
01:53:56	publicite	-	analytics.google.com:443
01:53:41	publicite	-	analytics.google.com:443
01:53:40	publicite	-	stats.g.doubleclick.net:443
01:53:40	publicite	-	analytics.google.com:443
01:53:40	publicite	-	edge.fullstory.com:443

For HTTPS you can only see the server (domain) but not the rest of the URL, such as file names or parameters, because they are transmitted after the TLS connection has been established.

If your client tries to get https://test.server.com/index.html, it asks Squid “CONNECT test.server.com:443”, and after this it sends the full GET request for /index.html via the encrypted TLS channel, which Squid cannot see anymore, so only the CONNECT to the server is logged.
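A small Python sketch of that split (the function is mine, just illustrating which part of the URL ever reaches the proxy):

```python
from urllib.parse import urlparse

def proxy_visible_part(https_url):
    """Return the only request line a non-MITM proxy sees for an HTTPS URL."""
    parts = urlparse(https_url)
    port = parts.port or 443
    # The path and query travel inside the TLS tunnel and never reach Squid.
    return f"CONNECT {parts.hostname}:{port}"

print(proxy_visible_part("https://test.server.com/index.html"))
# prints "CONNECT test.server.com:443" - the "/index.html" part is invisible
```

That is exactly why the URL filter log above only ever shows host:443 entries for HTTPS sites.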



Thanks for taking the time to enlighten me!

I have seen this DNS filter. Is it good for IPFire?
Is it good as a replacement for Squid?

That is a totally different type of programme.

Squid is a web proxy, which can also do URL filtering on HTTP traffic.

Privaxy is a tracker and advertisement blocker.

Also, the About statement for Privaxy on its GitHub site says:

Privaxy is the next generation tracker and advertisement blocker. It blocks ads and trackers by MITMing HTTP(s) traffic.

The last part means that to use it for HTTPS traffic it has to be set up in Man-in-the-Middle (MITM) mode, which is back to breaking the encryption of traffic going through IPFire in order to do its job.

1 Like

Thank you for the answer.
If Man-in-the-Middle is no good, is an RPZ for Unbound acceptable?

A Man-in-the-Middle (MITM) system is based on HTTPS certificates and on breaking the encryption.

RPZ is not. It has nothing to do with encryption or certificates.

I cannot state that RPZ is acceptable because I am still testing it and I have not presented it to the Developers for review.

To me RPZ is very similar to IP Blocklists. But others more skilled may not agree.

Thank you for the answer.
I use RPZ and I have deactivated Squid; I can control connections at the DNS level more simply than with Squid.
A web GUI would be good for Unbound.
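For anyone curious, a minimal sketch of an RPZ hookup in Unbound, which has native rpz: support since version 1.10 (the zone name and file paths here are made up, not IPFire defaults):

```
# unbound.conf fragment: load a local response policy zone
rpz:
    name: "rpz.block.local."
    zonefile: "/etc/unbound/rpz.block.local.zone"
    rpz-log: yes
    rpz-log-name: "rpz-block"

# rpz.block.local.zone entries: CNAME to the root means NXDOMAIN,
# for a blocked domain and all of its subdomains:
#   ads.example.com.      CNAME .
#   *.ads.example.com.    CNAME .
```

Unlike MITM proxying, this never touches the encrypted traffic; it only rewrites the DNS answer, which is also why it conflicts with DNSSEC validation as discussed above.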

I’m reading all your answers and also others online, but I still can’t find a solution.

The main problem is not caching Internet traffic.
Especially in small towns the lines are still ADSL, but if managing HTTPS traffic becomes difficult, you can do without caching.

The real problem is filtering content by preventing users from accessing certain categories of sites.
It’s not just about blocking porn, which is already difficult given the huge number of sites, but also other categories.

A company may want to block webmail or services like Dropbox to prevent company documents from being taken out.
Or it may want to block news or entertainment sites. In certain cases it may want to block competitors’ sites.

Obviously the blacklists must be differentiated by user category: I block porn and entertainment for everyone, but leave social media available only to marketing users.

These filters cannot be based on hand-created lists; they need public lists such as those of the University of Toulouse, or other paid lists.
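With Squid plus SquidGuard, the per-group policy described above can be sketched like this (user and category names are illustrative; the category directories follow the layout of public lists like the Toulouse one):

```
# squidGuard.conf sketch: different blacklist categories per user group
src marketing {
    userlist users/marketing          # authenticated proxy user names
}

dest porn   { domainlist blacklists/porn/domains }
dest social { domainlist blacklists/social_networks/domains }

acl {
    marketing {
        pass !porn all                # social media stays available
    }
    default {
        pass !porn !social all        # blocked for everyone else
        redirect http://proxy.example.lan/blocked.html
    }
}
```

The src blocks tie proxy-authenticated users to a group, and each acl block picks which dest categories that group may pass.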

With Squid this can be done. Without Squid, which packages can use such public lists?