Squid setup


1. Squid

Squid is a caching proxy for the Web supporting HTTP, HTTPS, FTP, and more. It reduces bandwidth and improves response times by caching and reusing frequently-requested web pages. Squid has extensive access controls and makes a great server accelerator.

Being a CentOS geek, I am starting from the Squid RPM installation.

Edit squid.conf (FC8: conf, diff; RHEL4: conf, diff):
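While editing, these are the directives most likely to need attention. A minimal sketch only; the ACL name and network ranges are assumptions based on the networks used later on this page:

```
# squid.conf sketch -- not the full distribution file.
# Network ranges are assumptions; adjust to your LANs.
acl localnet src 10.20.0.0/16 192.168.0.0/16 172.16.0.0/12

http_port 3128

# Allow the internal networks, deny everybody else
http_access allow localnet
http_access deny all

cache_dir ufs /var/spool/squid 1024 16 256
```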

Create caches:

service squid stop
squid -z

Start Squid:

chkconfig --add squid
chkconfig squid on
service squid restart

2. Transparent proxying

Tell Squid to emit packets only from the Internet-facing address:


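In squid.conf this is done with the tcp_outgoing_address directive; a sketch with a placeholder address (1.2.3.4 stands for your Internet-facing IP):

```
# Placeholder: replace 1.2.3.4 with your Internet-facing address
tcp_outgoing_address 1.2.3.4
```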
Configure /etc/sysconfig/iptables (an ACCEPT rule for the Internet-facing address, then a REDIRECT rule for every internal network):

-A PREROUTING -s -p tcp --dport 80 -j ACCEPT
-A PREROUTING -s -p tcp --dport 80 -j REDIRECT --to-port 3128
-A PREROUTING -s -p tcp --dport 80 -j REDIRECT --to-port 3128
-A PREROUTING -s -p tcp --dport 80 -j REDIRECT --to-port 3128
-A PREROUTING -s -p tcp --dport 80 -j REDIRECT --to-port 3128

Restart iptables

/etc/init.d/iptables restart

3. Clients

Create a proxy autodiscovery script as explained here and configure browsers to use the script at


As a fallback, you can manually instruct browsers to use the unified proxy on host proxy.ourdom.com, port 3128, for all traffic, excluding:

localhost, localhost.localdomain,,, .ourdom.com
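The autodiscovery script could look roughly like the following sketch (usually served as wpad.dat or proxy.pac with MIME type application/x-ns-proxy-autoconfig); proxy.ourdom.com:3128 and the exclusion names are taken from the text above, everything else is an assumption:

```javascript
// Sketch of a PAC file for the setup above; adjust names to your domain.
function FindProxyForURL(url, host) {
  // Local names and the internal domain bypass the proxy
  if (host === "localhost" ||
      host === "localhost.localdomain" ||
      host.substring(host.length - ".ourdom.com".length) === ".ourdom.com") {
    return "DIRECT";
  }
  // Everything else goes through the unified proxy
  return "PROXY proxy.ourdom.com:3128";
}
```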

Edit /etc/httpd/conf.d/vhosts/admin.conf:

<VirtualHost *:443>
ScriptAlias /Squid/cgi-bin/cachemgr.cgi /usr/lib/squid/cachemgr.cgi
<Location /Squid>
  Order Allow,Deny
  Allow from 10.20 192.168 172.16 .ourdom.com
</Location>
</VirtualHost>

Create wiki links

5. Statistics

A Google search returns quite a few tools which claim to parse and analyze Squid logs: sarg, squidalyser, mysar and many commercial tools. However, their quality varies. In the past I used SARG (see installation tips), but it proved to be buggy and inflexible. Currently I maintain SAWstats, an improved version of AWstats. AWstats is a great log parser and analyzer with a nice web interface, but after years it still lacks Squid support, so I decided to fork.

6. Squidguard

Dag's squidguard package for EL4 is version 1.2, which is too old.

I created the squidguard-1.3-1.vit.el4.src.rpm package for 1.3 with patches from squidguard.org.

Install the package

rpm -ivh squidguard-1.3-1.vit.el4.x86_64.rpm

Download blacklists from: shallalist.de, rejik.ru, Université Toulouse

I created a complete configuration file squidguard-shalla.conf as a reference for the shalla list.

Install new blacklists:

mkdir /var/lib/squidguard
cd /var/lib/squidguard
tar xzf shallalist.tar.gz
tar xzf banlists-2.x.x.tar.gz
tar xzf squidguard-localsite.tar.gz

Create squidguard config file /etc/squid/squidguard.conf

Adjust config file...
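For orientation, a heavily trimmed squidguard.conf sketch; the paths match the directories created above, but the category name and redirect URL are assumptions (the squidguard-shalla.conf linked above is the authoritative reference):

```
# /etc/squid/squidguard.conf -- trimmed sketch
dbhome /var/lib/squidguard
logdir /var/log/squidguard

dest porn {
    domainlist porn/domains
    urllist    porn/urls
}

acl {
    default {
        pass !porn all
        # Assumed stub URL under the /guarded alias configured below
        redirect http://www.ourdom.com/guarded/blocked.html
    }
}
```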

Verify config file paths:

squidGuard -c /etc/squid/squidguard.conf -d -C xxx < /dev/null

Note 1: without the /dev/null redirection squidGuard will stop on errors.

Note 2: -C defines which databases to rebuild. When a nonexistent database xxx is provided, only checks are performed; this way we avoid reindexing the huge porn lists.

Index the blacklists:

find /var/lib/squidguard -name '*.db' -exec rm '{}' ';'
squidGuard -c /etc/squid/squidguard.conf -d -C all < /dev/null

Verify that squidguard really protects

echo "http://rose.ixbt.com/ root GET" | squidGuard -d
echo "http://www.sex.com/ root GET" | squidGuard -d

Restore directory rights

chown -R squid:squid /var/lib/squidguard /var/log/squidguard /etc/squid/squidguard

Create redirection stubs in /var/www/squidguard

cd /
tar xpzf squidguard-www.tar.gz

Edit /etc/httpd/conf.d/vhosts/www.inc and add the /guarded location to www.ourdom.com:

Alias /guarded /var/www/squidguard

Tell squid about squidguard via a line in /etc/squid/squid.conf:

redirect_program /usr/bin/squidGuard -c /etc/squid/squidguard.conf

Restart squid

service squid restart

Create a script /etc/localsite/squid/update-guard-rules which should be invoked after each modification of the squidguard rules:

echo "Reindexing rules - Please be patient, it may take a while ..."
squidGuard -c /etc/squid/squidguard.conf -d -C all < /dev/null 2>&1 |
    grep -vF '.....' | grep -vF '100 % done'
echo "Fixing permissions ..."
chown -R squid:squid /var/lib/squidguard/
echo "Signaling squid ..."
squid -k reconfigure

7. Hints

7.1. Transparency

Since HTTP headers do not carry destination port information, redirecting several ports to a single proxy port loses the destination port. Therefore, only port 80 can be redirected transparently.

-A PREROUTING -s -p tcp --dport 80 -j ACCEPT
-A PREROUTING -s ! -p tcp --dport 80 -j REDIRECT --to-port 3128

7.2. SSL traffic goes direct

There can be situations where Squid works behind an upstream proxy, and all targets but local servers (listed in always_direct allow) should be sent through it. However, for unknown reasons Squid decides that SSL traffic should be sent directly to origin hosts. This can be helped by:

prefer_direct off

and (especially!)

nonhierarchical_direct off

7.3. Transparent SSL

Transparent proxying of SSL is not possible with current Squid. Port 443 should be routed around Squid.

7.4. Masquerading

Simplistically, masquerading can be organized via this iptables line in the *nat section:


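A minimal catch-all form, assuming eth0 as the external interface and the internal networks used elsewhere on this page:

```
# Catch-all masquerading for the internal networks via the external interface
-A POSTROUTING -s 192.168.117/24 -o eth0 -j MASQUERADE
-A POSTROUTING -s 10.20/16 -o eth0 -j MASQUERADE
```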
For better security one can limit port use for masquerading:

-A POSTROUTING -s 192.168.117/24 -p tcp -m multiport --dports 80,443 -o eth0 -j MASQUERADE
-A POSTROUTING -s 10.20/16 -p tcp -m multiport --dports 80,443 -o eth0 -j MASQUERADE

Other ports, e.g. 8000, 8008, 8080, 8100, 8888, can be considered too.

7.5. Squid 2.6 syntax change


http_port 79
httpd_accel_port 81
httpd_accel_with_proxy off
httpd_accel_uses_host_header off

changed to

http_port 79 defaultsite= vhost vport=81