Note that these do include retries on timeouts, which explains why the total responses might exceed the 5k number.
I've done some investigating and it seems like some DNS servers will refuse to serve records for certain domains. For example:
- 162.159.11.139 will refuse example.com, youtube.com, and facebook.com, but not slack.com or lesser-known domains.
- 162.159.34.74 will refuse example.com, youtube.com, and facebook.com, but not slack.com or lesser-known domains.
dnsvalidator relies on bet365.com, which works, but that provides little guarantee that random domains will work as well. I think it's best not to use the default configuration, to improve the quality of the resulting resolver list. Additionally, dnsvalidator has some minor issues that may affect results: vortexau/dnsvalidator#40
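This per-domain refusal is easy to reproduce by sending the same A query to a resolver for several names and comparing the response codes. A minimal sketch using only the Python standard library (the packet layout follows RFC 1035; the resolver IPs below are the ones mentioned above):

```python
import socket
import struct

def build_query(domain, txid=0x1234):
    """Build a minimal DNS A-record query packet (RFC 1035)."""
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)  # RD flag set, 1 question
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.rstrip(".").split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def rcode_of(response):
    """RCODE is the low 4 bits of the second flags byte (0 = NOERROR, 5 = REFUSED)."""
    return response[3] & 0x0F

def query_rcode(resolver_ip, domain, timeout=5.0):
    """Send the query over UDP and return the resolver's response code."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_query(domain), (resolver_ip, 53))
        data, _ = sock.recvfrom(512)
        return rcode_of(data)
    finally:
        sock.close()
```

If the behavior described above holds, `query_rcode("162.159.11.139", "example.com")` would return 5 (REFUSED) while the same query for slack.com would return 0, though actual results will depend on the resolver's current configuration.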
Thanks for bringing this up! I'm trying to put together a plan of what we can do to improve the results. So apparently the root cause of the problem is dnsvalidator's baseline checks. I got the same results as you and did a couple more tests. Some resolvers actually resolve root domains like slack.com but not any of their subdomains like api.slack.com.
I've been thinking that a good baseline domain could be something like nip.io (which serves dynamic DNS records depending on the subdomain/query). dnsvalidator can test with a few variations and predetermine the results without needing to check for a "good result" using a trusted resolver. This should solve the geolocated IP addresses issue as well.
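The nip.io idea can be made concrete: since `<ip>.nip.io` resolves to `<ip>`, the expected answer is derivable from the query name itself, so no trusted resolver is needed to judge the result. A minimal sketch (the function names are illustrative, not part of dnsvalidator):

```python
import random

def random_nip_probe():
    """Pick a random IPv4 address and derive the nip.io name that must resolve to it.

    Because nip.io encodes the answer in the subdomain, the expected record
    is known in advance, and random names make cached/poisoned answers unlikely.
    """
    ip = ".".join(str(random.randint(1, 254)) for _ in range(4))
    return f"{ip}.nip.io", ip

def answers_match(expected_ip, answered_ips):
    """A resolver passes the baseline only if it returned exactly the encoded IP."""
    return set(answered_ips) == {expected_ip}
```

A validator would generate a fresh probe per resolver, query it, and keep only resolvers whose answers satisfy `answers_match`.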
Yes, using a different root domain as a baseline can lead to better results, which is what I did in my second run :)
Interesting tactic! I've been thinking it through for a bit and it could certainly help as an initial test. Though it may not be enough to filter out malicious servers attempting to poison DNS since nip.io is not a high profile domain and less likely to be a target for attacks. Perhaps a multi-step workflow where, at first, nip.io is used and then a second run on the resulting list for DNS poisoning checks could work well in terms of speed and reliability?
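That two-pass workflow could be sketched like this; `lookup` and `trusted_answers` are hypothetical stand-ins for the actual query machinery and a set of answers pre-fetched from a trusted resolver such as 8.8.8.8:

```python
def validate_resolvers(candidates, lookup, trusted_answers):
    """Two-pass filter: cheap correctness baseline first, poisoning check second.

    `lookup(resolver, domain)` returns a set of A-record answers (hypothetical).
    `trusted_answers` maps a few high-profile domains to answer sets from a
    trusted resolver (hypothetical input).
    """
    # Pass 1: self-verifying nip.io baseline (198.51.100.7 is a TEST-NET-2 example).
    name, expected = "198.51.100.7.nip.io", "198.51.100.7"
    stage1 = [r for r in candidates if lookup(r, name) == {expected}]

    # Pass 2: poisoning check on the survivors only, against trusted answers.
    # Requiring a non-empty intersection tolerates geolocated answer variation.
    return [
        r for r in stage1
        if all(lookup(r, d) & ips for d, ips in trusted_answers.items())
    ]
```

Running the cheap pass first keeps the expensive poisoning checks confined to resolvers that already proved they answer correctly at all.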
Additionally, I think reducing dnsvalidator's default timeout from 600 seconds to 5 could give better output. Some resolvers simply don't respond to queries within a realistic timeframe, and most round trips take no more than a few hundred milliseconds, so 5 seconds should be plenty.
These are the results I'm getting when resolving the A records of the Alexa top 5k domains using the resolvers from this repository:
Further filtering on resolvers.txt using custom dnsvalidator options results in more stable output: