URL Filter whitelist ignored - Core Update 153

I've recently updated an IPFire box from Core Update 146 to 153, and suddenly all whitelisted domains in URL Filter are ignored.
I've blocked the "searchengines" and "socialnet" categories and then added domains to the "Custom whitelist - allowed domains":


Browsing to google_com, linkedin_com or accounts_google_com is then blocked with a TUNNEL ERROR.
Removing the "searchengines" and "socialnet" categories lets clients browse correctly again.
I tried disabling and re-enabling the URL Filter addon, disabling and re-enabling the custom whitelist, and rebooting several times, but nothing worked.
Web proxy is not transparent.

Here are some screenshots of the entries found in the "URL Filter Logs" and of the "Custom whitelist":


PS: I had to write the domains with an underscore because, as a new user, this new forum wouldn't let me post more than two "URLs"…

Have you tried allowing accounts.google.com and it.linkedin.com? These are the domains being blocked.
Just to see whether URL Filter matches the whitelist more exactly.

Yes, adding the exact domain will match the whitelist and let me browse the website.
But the whitelist is no longer working as intended; it should work as it did up to Core 146.
If I put google_com into the whitelist, it has to whitelist google_com and all of its bazillion subdomains. I tried adding wildcards before google_com without success.
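To be clear, this is the matching behaviour I expect from the whitelist (a minimal Python sketch of the rule, not squidGuard's actual code): a whitelisted domain should cover the domain itself and every subdomain under it.

```python
def whitelisted(host, whitelist):
    """Return True if host equals a whitelisted domain or is a subdomain of one."""
    labels = host.lower().rstrip(".").split(".")
    # Check every suffix of the hostname, e.g. accounts.google.com ->
    # "accounts.google.com", "google.com", "com"
    for i in range(len(labels)):
        if ".".join(labels[i:]) in whitelist:
            return True
    return False

wl = {"google.com"}
print(whitelisted("google.com", wl))           # True
print(whitelisted("accounts.google.com", wl))  # True
print(whitelisted("linkedin.com", wl))         # False
```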

Hi @logharn,

The urlfilter.cgi page was last changed in Core Update 132.

The proxy.cgi page, which uses the urlfilter.cgi page, has had the following changes:

Core Update 147   proxy.cgi: remove old CVS licence clutter
Core Update 152   modified proxy.cgi to make it possible that all subnets declared in "network access control" will be translated from cidr to subnet notation in proxy.pac

Neither of these changes look like they could have impacted the whitelist operation after Core Update 146.

Is there any more information about the blocking in the main message logs in /var/log/?

I just tried with the Shalla list.
Enabling ‘searchengines’ blocks login.web.de, for example. Including ‘web.de’ in the whitelist allows the access again.
There must be some other problem.

Sorry for the late reply…
Anyway, I'm also using the Shalla Secure Services list.
I tried your example: login.web.de was blocked under the ISP category, so I included web.de in the whitelist and it allowed the site.
login.web.de redirected to hilfe.web.de anyway…
Did you try with google.com?
I also tried on another IPFire installation, and www.google.com and accounts.google.com are unreachable until the "searchengines" category is unflagged or the specific subdomain is added…
LinkedIn, strangely, works today…
In /var/log there's nothing more than what the WUI shows :frowning:

Where could the issue be?

URLFilter blocking is logged in /var/log/SquidGuard/*.log

OK, that's the full path…
But as I said before, they show the same output as the WUI, just in text form…

Logs.zip (26.2 KB)

I see curious "private" IPs requesting web access.
Do you really use for your green interface?
The network is allocated to carrier-grade NAT (CGNAT).
See wikipedia:

Dedicated space for carrier-grade NAT deployment

Main article: IPv4 shared address space

In April 2012, IANA allocated the block ( to, netmask for use in carrier-grade NAT scenarios.[4]

This address block should not be used on private networks or on the public Internet. The size of the address block (2^22, approximately 4 million addresses) was selected to be large enough to uniquely number all customer access devices for all of a single operator's points of presence in a large metropolitan area such as Tokyo.[4]

Yes, that's the range we are using…
And, as you pointed out, those addresses belong to a specific operator NAT space which our carrier isn't using, so it seems to me you are going a little off topic here, unless you think that the private addressing is the problem (and it is not).
In any case, I have solved the issue myself by starting from a blank IPFire installation and then finding the real catch; a real WTF moment…
When the whitelist contains both a domain and one of its subdomains, in my specific case google.com and play.google.com, URL Filter goes crazy.
I was able to reproduce this behavior multiple times with every domain, including the web.de you used as an example…
I don't know if this is specific to Core 153, but up to Core 146 URL Filter definitely worked without issues with that whitelist.
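Until this is fixed, a possible workaround is to keep only the parent domains in the whitelist, since a parent entry already covers its subdomains. A hypothetical pre-processing sketch in Python (not part of IPFire or squidGuard, just an illustration):

```python
def prune_redundant(domains):
    """Drop entries that are subdomains of another entry in the list,
    e.g. play.google.com is redundant when google.com is also listed.
    (Illustrative workaround only -- not part of IPFire or squidGuard.)"""
    listed = set(domains)
    pruned = []
    for d in domains:
        labels = d.split(".")
        # Drop d if any proper parent domain of it is also listed.
        parents = {".".join(labels[i:]) for i in range(1, len(labels))}
        if not parents & listed:
            pruned.append(d)
    return pruned

print(prune_redundant(["google.com", "play.google.com", "web.de"]))
# ['google.com', 'web.de']
```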


Congrats on finding the solution!

I do not know whether the selection of LAN IPs influences the behaviour of squidGuard. I only supposed there might be a problem if the networks were classified somewhere. Nothing more, nothing less.

I can imagine that specifying different levels of the same domain (as in your google.com example) can produce errors in squidGuard's matching engine. One more proof of the principle "Keep it as simple as possible!"

Hi Luca - Welcome to the IPFire Community!

Would you be so kind as to file a bug report in IPFire Bugzilla?

As suggested, I opened a bug report, which can be found here:


With Core 147, 'squidguard' was upgraded from 1.4.1 to 1.6.0. It sounds as if this could be the reason.

But I don't know if 'squidguard' is still maintained by the "Shalla squidGuard team"; the last message on the mailing list seems to be from Thu Jan 21 19:00:52 CET 2016.