
DNS Benchmark

Frequently Asked Questions & Answers


A page that hopes to eliminate any confusion that's been created.

You can't optimize it until you can measure it


Frequently Asked Questions

Q: Why does the DNS Benchmark's built-in default list contain many red (offline or actively rejecting) DNS resolvers?

A: Some well-known, fast (and useful) DNS resolvers are only accessible to a subset of users on the Internet. An example might be those belonging to specific ISPs who have configured their resolvers for access only by their own customers. Rather than exclude some potentially fast resolvers from the list, we decided to provide any that would probably be worthwhile, and allow the DNS Benchmark to determine whether they were accessible.

Leaving the red-marked resolvers in the Benchmark's list does not slow down the benchmarking, and they can be easily removed by right-clicking on the list and choosing "Remove dead nameservers." Also note that using the Benchmark's Custom List creator will select a list of the best resolvers that all work from your location.
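For the curious, here is a minimal sketch of how a "red" classification might be approximated: a probe of the resolver either times out, is actively refused, or answers. This is not the Benchmark's own code; it uses the third-party Python dnspython package, and the resolver addresses shown are placeholders (192.0.2.1 is a reserved TEST-NET address that will never answer).

```python
# Minimal sketch (not the Benchmark's own logic): probe a resolver and report
# whether it looks offline, actively rejecting, or responsive.
# Requires the third-party dnspython package; resolver IPs are placeholders.
import dns.exception
import dns.message
import dns.query
import dns.rcode

def probe(resolver_ip, name="www.example.com", timeout=3.0):
    query = dns.message.make_query(name, "A")
    try:
        reply = dns.query.udp(query, resolver_ip, timeout=timeout)
    except dns.exception.Timeout:
        return "offline / unreachable (no reply)"      # would be marked red
    if reply.rcode() == dns.rcode.REFUSED:
        return "actively rejecting (REFUSED)"          # would also be marked red
    return "responding (rcode %s)" % dns.rcode.to_text(reply.rcode())

for ip in ("8.8.8.8", "192.0.2.1"):   # 192.0.2.1 is a TEST-NET address, never reachable
    print(ip, "->", probe(ip))
```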

Q: I'd like to use some fast DNS resolvers, but the Benchmark has colored them orange and their status is "Bad domain names are intercepted by provider." Does this mean I should not use them?

A: The DNS Benchmark colors working DNS resolvers orange when they do not return DNS errors in response to queries for erroneous and non-existent domain names. Instead, such resolvers are configured to return the IP address of a web intercept page that some ISPs and third-party DNS providers use to generate advertising and marketing revenue from their users' domain name typos. If you're curious, you can give such resolvers a try by configuring your system to use only orange DNS resolvers, then deliberately mistype a domain name in your web browser and see how you feel about whatever it is that happens.

Note also that this behavior has raised enough of a stir among annoyed users that the ISPs and third-party providers who do this often offer some means of turning it off for individual customers. So if you would like to use orange resolvers, check whether there's a way to deactivate this behavior at the provider's end. For example, OpenDNS is notorious for doing this, but it can be turned off even with their completely free accounts.
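For illustration only, the sketch below shows the kind of query that exposes this behavior: ask a resolver for a name that cannot exist and see whether it admits that with an NXDOMAIN error or hands back an intercept address instead. This is not the Benchmark's own test; it uses the third-party Python dnspython package, and the gibberish domain and resolver IP are placeholders.

```python
# Minimal sketch: does a resolver return NXDOMAIN for a non-existent name,
# or does it intercept the typo with an answer of its own?
# Requires the third-party dnspython package.
import dns.message
import dns.query
import dns.rcode

def intercepts_typos(resolver_ip, timeout=3.0):
    bogus = "this-name-should-not-exist-xk7q2.example"   # .example is reserved, never resolves
    reply = dns.query.udp(dns.message.make_query(bogus, "A"), resolver_ip, timeout=timeout)
    if reply.rcode() == dns.rcode.NXDOMAIN:
        return False                     # proper error returned -- not intercepting
    return len(reply.answer) > 0         # answers for a bogus name suggest a web intercept page

print(intercepts_typos("8.8.8.8"))       # Google Public DNS returns NXDOMAIN, so this prints False
```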

Q: After running a benchmark, the Conclusions states that four public resolvers are faster than the slowest of my system's currently configured resolvers. But the bar chart clearly shows that seven public resolvers are faster. So am I confused or is the Benchmark broken?

A: The confusion arises because the Benchmark tries very hard not to be alarmist and not to make dubious recommendations for changes that might not really be useful. The Benchmark takes 50 individual readings (samples) of each resolver's performance and displays the average of those 50 readings. But the question is: how certain are we that the average of those 50 samples is meaningful enough to draw a conclusion? In other words, if we were to take another 50 samples, how sure can we be that we'd get the same average?

If we took three samples and each one was 40, then from everything we could tell, we could be very confident that the average (which would be 40) reflects the truth. In other words, if we were to take another three samples, we'd probably get 40 each time again, for another average of 40. But what if we took three samples and received 20, 40 and 60? The average of these three is still 40, but we would be far less sure that another three samples would average 40 again. We might get an average of 20 or 60 if we happened to get 20 three times, or 60 three times.

The good news is, sampling theory understands these things... and so does the DNS Benchmark. The Benchmark carefully analyzes and develops a statistical model of each resolver. It computes the standard deviation of the sample set to characterize the statistical spread seen within the set of samples obtained. The larger the spread, the less certain we can be that the average of those samples is highly representative of the resolver's repeatable behavior. The Benchmark takes all of this into account and uses it to determine how sure it can be about any conclusions it draws. It sets a confidence threshold of 95% for everything it concludes. In other words, it says "given the data we have seen, we can be 95% confident that the following is true..."

Therefore, to answer the question posed above: when the Benchmark says that four resolvers are faster, even though the average performance of seven was faster, given the spread seen among the samples we can only be at least 95% confident that four of those seven were repeatedly and reliably faster. (And also note that those four would be the top four of the apparently faster seven when the list is sorted fastest-first.)
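The short Python sketch below is a rough illustration of this idea, not the Benchmark's actual algorithm. Two sets of fifty made-up latency samples share the same 40 ms average, but the spread in one of them makes the approximate 95% confidence interval around that average much wider, which is exactly why a faster-looking average can fail to support a confident conclusion.

```python
# Rough illustration (made-up numbers, not the Benchmark's algorithm):
# same average, very different confidence in that average.
import statistics

def mean_and_95ci(samples):
    m = statistics.mean(samples)
    s = statistics.stdev(samples)                  # sample standard deviation
    half_width = 1.96 * s / (len(samples) ** 0.5)  # ~95% interval for the mean
    return m, half_width

tight = [40.0] * 50                                # fifty identical 40 ms readings
spread = [20.0, 40.0, 60.0] * 16 + [40.0, 40.0]    # fifty readings, same 40 ms mean, wide spread

for label, samples in (("tight", tight), ("spread", spread)):
    m, hw = mean_and_95ci(samples)
    print("%-6s mean=%.1f ms, 95%% CI = %.1f .. %.1f ms" % (label, m, m - hw, m + hw))
```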

Q: I had hoped to use the DNS Benchmark to compare the performance of different DNS servers on our internal network. From what I can tell this is not really possible. Is that actually the case?

A: The DNS Benchmark issues recursive DNS queries (standard DNS lookup requests) for publicly accessible domains (like amazon.com, cnn.com, etc.) to any resolvers it is benchmarking. The resolvers to be benchmarked are identified by their public or private IP addresses. Therefore, as long as the resolvers whose IP addresses have been given to the Benchmark will resolve public domains, the Benchmark should produce comparative performance results without any trouble.
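As a rough illustration (not the Benchmark's own code), the sketch below uses the third-party Python dnspython package to time a recursive lookup of a public domain against a resolver identified only by its IP address. An internal resolver at a private address would be timed exactly the same way, provided it resolves public domains for this client. The addresses and domain shown are placeholders.

```python
# Minimal sketch of the kind of measurement described above: time one
# recursive lookup of a public domain against a resolver given by IP address.
# Requires the third-party dnspython package; addresses are placeholders.
import time
import dns.message
import dns.query

def lookup_ms(resolver_ip, name, timeout=3.0):
    query = dns.message.make_query(name, "A")
    start = time.perf_counter()
    dns.query.udp(query, resolver_ip, timeout=timeout)
    return (time.perf_counter() - start) * 1000.0

# A private resolver address such as 10.0.0.53 could be substituted here.
print("%.1f ms" % lookup_ms("8.8.8.8", "www.amazon.com"))
```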

Q: It appears that my computer's third-party firewall is blocking traffic and causing trouble for the Benchmark. The Benchmark works if I completely disable the firewall. What firewall rules do I need to add so that the Benchmark can work?

A: All of the Benchmark's DNS performance benchmarking is performed over the UDP protocol to remote port 53 (DNS). This includes the Benchmark's optional automatic or manual check for any newer version of itself, which is also performed through a DNS query. Additionally, the Benchmark may attempt to acquire GRC's current master list of publicly available resolvers if the user hasn't yet built their own custom list. This requires a connection using the TCP protocol to remote port 80 (HTTP). And if the user does build a custom list, the Benchmark will return statistics on the top 200 resolvers found, sending them back to GRC over an encrypted HTTPS (SSL) connection. So the Benchmark might also need to connect over TCP to remote port 443 (SSL).

Therefore, in summary, the following local firewall permissions should be provided:

Protocol  Remote Port  Name   Required  Purpose
UDP       53           DNS    Yes       Benchmarking and Version Self-check
TCP       80           HTTP   No        Obtaining GRC's Public Master List
TCP       443          HTTPS  No        Returning Master List Statistics to GRC
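If you want to sanity-check a firewall rule set before running the Benchmark, the sketch below exercises the same three kinds of outbound access: UDP to remote port 53, TCP to remote port 80, and TCP to remote port 443. It is a rough stand-in rather than anything from GRC; it uses the third-party Python dnspython package for the DNS probe, and the hosts queried are simply convenient examples.

```python
# Minimal sketch: confirm the three kinds of outbound access listed above
# are permitted by the local firewall. Requires the third-party dnspython package.
import socket
import dns.message
import dns.query

def udp53_ok(resolver_ip="8.8.8.8", timeout=3.0):
    try:
        dns.query.udp(dns.message.make_query("www.example.com", "A"),
                      resolver_ip, timeout=timeout)
        return True
    except Exception:
        return False

def tcp_ok(host, port, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("UDP 53  (required):", udp53_ok())
print("TCP 80  (optional):", tcp_ok("www.grc.com", 80))
print("TCP 443 (optional):", tcp_ok("www.grc.com", 443))
```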

Q: What information does the GRC Benchmark send back to GRC, why, and can I block it?

A: We would like the Benchmark's most time-consuming operation, the scanning of thousands of possible resolvers to create a custom list, to take less time. Since we're already scanning thousands of remote DNS resolvers as quickly as possible, the only way to speed it up is to reduce the total number of resolvers scanned. And the only way to reduce the total number of resolvers scanned, without eliminating any that might be useful to someone, is to acquire intelligence about the past usefulness of the resolvers in our master list.

So, for this purpose, and only for this purpose, upon completing the scanning of the master list the DNS Benchmark returns to GRC a sorted list of the top 200 resolvers found during that scan. When GRC receives that list, the usefulness counts in the master database are incremented. No other record or log of any sort is retained of anyone's specific top 200 submission.

Therefore, allowing the Benchmark to update GRC's master list is helpful for the future of the Benchmark and is appreciated. However, if for some reason you object to anything, of any sort, being returned to GRC, we respect that, and in the interest of total privacy we have provided a means for suppressing the return of the top 200 list: the /nosend command-line option, when present, instructs the Benchmark not to return the final top 200 list to GRC. Simply start the Benchmark with this option when creating a new custom master list and the results will not be returned.

GRC's DNS Benchmark Pages:

1. DNS Benchmark Introduction
2. Features & Operation Walkthrough
3. System Menu Options & Commands
4. Command-Line Operation Reference
5. Building a Custom Nameserver List
6. DNS Benchmark Resource Files
7. Configuring your DNS Nameservers
8. Benchmark Questions & Answers
9. DNS Benchmark Version History
10. Running GRC Apps under WINE
11. DNS Spoofability Test Introduction
12. Please Send Us Your Feedback

Gibson Research Corporation is owned and operated by Steve Gibson. The contents of this page are Copyright (c) 2012 Gibson Research Corporation. SpinRite, ShieldsUP, NanoProbe, and any other indicated trademarks are registered trademarks of Gibson Research Corporation, Laguna Hills, CA, USA. GRC's web and customer privacy policy. Last Edit: Nov 12, 2010
