Improving DNS Queries
During the transitional period between my office desktop's motherboard dying and installing a dedicated file and media server, I started improving my local admin scripts to accommodate the temporary changes while keeping the long-term goal of supporting the new server. That is, I tried to anticipate as much as reasonably possible.
One of those adjustments was shifting DNS queries to a single system. For years I have run dnsmasq on every computer I use. This has worked well, but I was always aware of the inefficiency of several computers individually querying upstream servers for the same domain names. With a dedicated LAN server on the horizon, I saw a way to finally improve the situation. I could configure the server as my sole DNS caching server through dnsmasq.
I wanted to use the same generic resolv.conf on all systems, which means less maintenance. Each machine is restricted to the list of name servers in resolv.conf, and that list contains only two IP addresses: 127.0.0.1 and the IP address of the server.
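The shared resolv.conf is therefore short. A sketch, assuming the server lives at 192.168.1.2 (the actual address is site-specific): on the server, 127.0.0.1 reaches its own dnsmasq; on the other machines nothing answers on loopback, so the resolver falls through to the server.

```
# Generic resolv.conf deployed unchanged to every machine.
# First try a local dnsmasq, then the LAN server (address assumed).
nameserver 127.0.0.1
nameserver 192.168.1.2
```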
There was one caveat: a ThinkPad T400 laptop. Occasionally I use the laptop outside my home, and I wanted DNS queries to keep working when the LAN server was unavailable, without modifying resolv.conf.
The solution was to hard-code my preferred upstream DNS servers in dnsmasq.conf rather than resolv.conf. By excluding that list from resolv.conf, I ensured each system on my network used the server's dnsmasq cache.
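In dnsmasq.conf this takes two directives: no-resolv tells dnsmasq to ignore resolv.conf entirely, and server= lines name the upstreams. A fragment, using well-known public resolvers as stand-ins for my preferred servers:

```
# /etc/dnsmasq.conf (fragment)
# Do not read upstream servers from resolv.conf.
no-resolv
# Hard-coded upstream DNS servers (addresses here are illustrative).
server=9.9.9.9
server=1.1.1.1
```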
When I take the T400 outside the home, a test in rc.local looks for the LAN server and starts dnsmasq locally when the server is unavailable. Also from rc.local I run a script that continually looks for the LAN server, using the at scheduling daemon to reschedule itself. I can then close the lid on the T400, return home, open the lid, and within one minute the script finds the LAN server, stops the local instance of dnsmasq, and switches back to the dnsmasq on the LAN server.
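The logic can be sketched as below. This is a sketch only: the server address, the ping-based reachability test, and the placeholder start/stop commands are all assumptions standing in for the real site-specific values.

```shell
#!/bin/sh
# Sketch of the T400 rc.local logic. SERVER_IP and the init commands
# are assumptions; substitute the real values for your network.
SERVER_IP="${SERVER_IP:-192.168.1.2}"

lan_server_up() {
    # One ping with a one-second timeout; true only if the host answers.
    ping -c 1 -W 1 "$1" >/dev/null 2>&1
}

start_local_dnsmasq() {
    # Placeholder for the real command, e.g. /etc/init.d/dnsmasq start.
    echo "starting local dnsmasq"
}

stop_local_dnsmasq() {
    # Placeholder for the real command, e.g. /etc/init.d/dnsmasq stop.
    echo "stopping local dnsmasq"
}

watch_for_server() {
    if lan_server_up "$SERVER_IP"; then
        # Back home: the LAN server answers, so drop the local cache.
        stop_local_dnsmasq
    else
        # Still away: requeue this script one minute from now via at.
        echo "$0" | at now + 1 minute 2>/dev/null
    fi
}
```

At boot, rc.local would call lan_server_up once, start the local dnsmasq if the server is absent, and then launch the watcher, which keeps rescheduling itself through at until the server reappears.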
A big reason I always ran dnsmasq on each machine is the simplicity of adding secondary hosts files, which I use to block undesirable web sites. A weekly sync script keeps these secondary files updated from online sources.
One of these secondary files is a huge generic block list. Another contains known Facebook domains. A third contains known Microsoft domains, mostly to block the Windows 10 telemetry nonsense that has been backported into Windows 7. The relevant section of my dnsmasq.conf looks like this:
# A huge generic block file.
addn-hosts=/etc/hosts-blocked

# Block known Facebook domains.
addn-hosts=/etc/hosts-fb

# Block known Microsoft domains associated with telemetry and phoning home.
addn-hosts=/etc/hosts-ms
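The weekly sync script itself can be simple. A sketch, where the source URLs are placeholders for the lists I actually pull; the SIGHUP at the end makes dnsmasq clear its cache and re-read its hosts files without a restart.

```shell
#!/bin/sh
# Sketch of the weekly block-list sync. The URLs below are placeholders;
# the real lists come from online sources of my choosing.

fetch() {
    # Download $1 to $2, replacing the old copy only on success.
    curl -fsSL "$1" -o "$2.tmp" && mv "$2.tmp" "$2"
}

update_all() {
    fetch "https://example.com/generic-hosts.txt"   /etc/hosts-blocked
    fetch "https://example.com/facebook-hosts.txt"  /etc/hosts-fb
    fetch "https://example.com/microsoft-hosts.txt" /etc/hosts-ms
    # SIGHUP tells dnsmasq to clear its cache and re-read hosts files.
    kill -HUP "$(pidof dnsmasq)" 2>/dev/null
}
```

A weekly cron entry then calls the script, e.g. `0 4 * * 0 /usr/local/sbin/sync-blocklists.sh` (the installed path is an assumption).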
With a sole instance of dnsmasq running and every machine on the network using it, my DNS queries are now more efficient. Blocking undesirable web sites is also more efficient, since the block lists are maintained in one place.
I use an 8-port 1 Gbps switch for my local network systems, which frees ports on my router's built-in 10/100 Mbps switch. I configured two ports on the router as VLANs. I connect my refurbished Windows 7 system to one of the VLANs, thereby keeping the system isolated from my LAN. I configured the VLANs to use my server for DNS queries. That means the Windows 7 system's DNS queries are limited by how I configure dnsmasq on the server. While I cannot block all Microsoft IP addresses without iptables magic, I can stop most unwanted telemetry domains that use normal DNS lookups.
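For the server's dnsmasq to answer the VLAN clients at all, it has to listen on the LAN-facing address rather than only on loopback. A fragment, with the interface name and address as assumptions:

```
# /etc/dnsmasq.conf (fragment)
# Answer queries arriving from the LAN and VLANs, not just loopback
# (interface name and address are assumed).
interface=eth0
listen-address=127.0.0.1,192.168.1.2
```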
I wish dnsmasq had the ability to store its cache as a file. Further, without recompiling dnsmasq, the cache size is capped at 10,000 entries, but that has not yet proved to be a limitation.
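Within that ceiling, the cache is still worth enlarging from its small default (150 entries) with a single option:

```
# Raise the in-memory cache from the default 150 entries to the maximum.
cache-size=10000
```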
When I get the final server configured I want to investigate the squid caching proxy.
Posted in: Tutorial | Tagged: General