#1 Which of the following is the least important area in which to include your keyword(s)?
Your Answer: Meta Keywords Correct Answer: Meta Keywords The meta keywords tag is least important among these because search engines do not consider it in ranking calculations and it's never seen by visitors or searchers (unlike the meta description tag, which displays beneath listings in the SERPs).

#2 Which of the following would be the best choice of URL structure (for both search engines and humans)?
Your Answer: www.wildlifeonline.com/563 Correct Answer: www.wildlifeonline.com/animals/crocodile The best choice would be www.wildlifeonline.com/animals/crocodile - it provides the most semantic information, the best description of the content on the page and contains no parameters or subdomains that could cause issues at the engines. For more on URL structuring, see this post on SEOmoz.

#3 When linking to external websites, a good strategy to move up in the rankings is to use the keywords you're attempting to rank for on that page as the anchor text of the external-pointing links. For example, if you were attempting to rank a page for the phrase "hulk smash" you would want to use that phrase, "hulk smash" as the anchor text of a link pointing to a web page on another domain.
Your Answer: False Correct Answer: False The biggest problem with linking out to other websites with your targeted keyword phrases in the anchor text is that it creates additional competition for your page in the search results, as you give relevance through anchor text and link juice to a competing page on a competing site. Thus, FALSE is the correct answer.

#4 Which of the following is the best way to maximize the frequency with which your site/page is crawled by the search engines?
Your Answer: Search for your website more frequently in the major engines Correct Answer: Frequently add new content Adding new content on a regular basis is the only one of the methods listed that will promote more frequent spidering and indexing. Tags like crawl delay have never been shown to be effective (and aren't even supported by many of the major engines). The other "partially" correct answer would be to turn up crawl frequency inside Webmaster Central at Google, but this only works if Google wants to crawl your site more actively and is restricted from doing so.

#5 Which of the following is a legitimate technique to improve rankings & traffic from search engines?
Your Answer: Hosting your site on a "search engine optimized" web hosting platform Correct Answer: Re-writing title tags on your pages to reflect high search volume, relevant keywords Of the choices, only the option to change title tags to reflect better keywords is a legitimate and effective SEO technique.

#6 Danny Sullivan is best known (in the field of web search) as:
Your Answer: The recently-promoted CEO of Ask.com Correct Answer: A journalist and pundit who covers the field of web search Although there's an answer we'd love to choose :), the correct answer is that Danny's a journalist and pundit on web search who currently operates the SearchEngineLand blog and runs the SearchMarketingExpo event series.

#7 Which of the following is the WORST criterion for estimating the value of a link to your page/site?
Your Answer: The number and quality of other external links on the page Correct Answer: The popularity of the domain on which the page is hosted according to Alexa Since Alexa data is typically less useful than monkeys throwing darts at a laptop, that's the obvious choice for worst metric. The others can all contribute at least some valuable insight into the value a link might pass.

#8 How can Meta Description tags help with the practice of search engine optimization?
Your Answer: They serve as the copy that will entice searchers to click on your listing Correct Answer: They serve as the copy that will entice searchers to click on your listing The correct answer is that they serve as the copy in the SERPs and are thus valuable for influencing click-through rates.

#9 Which of the following content types is most easily crawled by the major web search engines (Google, Yahoo!, MSN/Live & Ask.com)?
Your Answer: Executable Files (EXE) Correct Answer: XHTML XHTML is the obvious choice as the other file types all create problems for search engine spiders.

#10 Which of the following sources is considered to be the best for acquiring competitive link data?
Your Answer: MSN/Live Correct Answer: Yahoo! Since Yahoo! is the only engine still providing in-depth, comprehensive link data for both sites and pages, it's the obvious choice. Link commands have been disabled at MSN, throttled at Google, never existed at Ask.com and provide only a tiny subset of data at Alexa.

#11 Which of the following site architecture issues MOST impedes the ability of search engine spiders to crawl a site?
Your Answer: Dynamic pages with 2 or more variables in the URL string Correct Answer: Pages that require form submission to reach database content Since search engines assume a site is crawlable when it has no robots.txt file, have no crawl-specific issues with paid links, can read iFrames perfectly well, and can spider and index plenty of pages with multiple URL parameters, the correct answer is clear. Pages that require form submission effectively block spiders, as automated bots will not complete form submissions to attempt to discover web content.

#12 What is the generally accepted difference between SEO and SEM?
Your Answer: SEM implies association with a traditional marketing company, while SEO is usually independent or unaffiliated with traditional marketing. Correct Answer: SEO focuses on organic/natural search rankings, SEM encompasses all aspects of search marketing SEO - Search Engine Optimization - refers to the practice of ranking pages in the organic results at the search engines. SEM - Search Engine Marketing - refers to all practices that leverage search engines for traffic, branding, advertising & marketing.

#13 Which of these is NOT generally considered to be a highly important factor for ranking for a particular search term?
Your Answer: Keyword usage in the title tag of the page Correct Answer: HTML Validation (according to W3C standards) of a page As this document would indicate, W3C validation is clearly the odd man out in this bunch.

#14 When creating a "flat architecture" for a site, you attempt to minimize what?
Your Answer: The KB size of search-targeted pages Correct Answer: The number of links a search engine must follow to reach content pages Flat site architecture refers to the link structure of the site, and thus, the only answer is "the number of links a search engine must follow to reach content pages."

#15 In the search marketing industry, what is traditionally represented by this graph?

Your Answer: Number of users as ranking in the SERPs goes down Correct Answer: The "long tail" theory of keyword demand The graph shown represents the long tail concept, which is most frequently applied to keyword demand in the search marketing world. The theory is explained in detail here.

#16 Which of the following is NOT a "best practice" for creating high quality title tags?
Your Answer: Write compelling copy that encourages users to "click" your listing Correct Answer: Include an exhaustive list of keywords Since all the rest are very good ideas for title tag optimization (see this post for more), the outlier is to include an exhaustive list of keywords. Title tags are meant to describe the content on the page and to target 1-2 keyword phrases in the search engines, and thus, it would be terribly unwise to stuff many terms/phrases into the tag.

#17 Which of the following character limits is the best choice to use when limiting the length of title tags (assuming you want those tags to fully display in the search results at the major engines)?
Your Answer: 80 Correct Answer: 65 As Google & Yahoo! both display between 62-68 characters (there appears to be some variance depending on both the country of origin of the search and the exact query), and MSN/Live hovers between 65-69, the best answer is... 65!
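To keep titles within that display limit, a site's templates can trim long titles at a word boundary. A minimal sketch (the helper name and the 65-character cutoff are assumptions drawn from the answer above, not any engine's published spec):

```python
def trim_title(title, limit=65):
    """Trim a title tag so it fits within the SERP display limit.

    Cuts at the last word boundary before the limit and appends an
    ellipsis, mimicking how engines truncate overly long titles.
    """
    if len(title) <= limit:
        return title
    # Reserve three characters for the "..." suffix, then avoid
    # cutting in the middle of a word.
    cut = title[: limit - 3].rsplit(" ", 1)[0].rstrip()
    return cut + "..."
```

Anything under the limit passes through untouched; longer titles come back at 65 characters or fewer.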

#18 PageRank is so named because it was created by Larry Page, not because it ranks pages.
Your Answer: FALSE Correct Answer: TRUE As you can read on Google's fun facts page, PageRank was named for its co-creator, Larry.

#19 A page on your site that serves as a "sitemap," linking to other pages on your domain in an organized, list format, is important because...
Your Answer: It reduces the crawl rate of spiders to your pages Correct Answer: It may help search engine crawlers to easily access many pages on your site As none of the others are remotely true, the only correct answer is that a sitemap page may help search engine crawlers easily access many pages on your site, particularly if your link structure is otherwise problematic.

#20 Which of the following search engines patented the concept of "TrustRank" as a methodology for ranking web sites & pages?
Your Answer: Google Correct Answer: Yahoo! The correct answer comes via the patent guru himself, Bill Slawski, who notes: The citation that I've seen most commonly pointed at regarding TrustRank is this paper - Combating Web Spam with TrustRank (pdf). The authors listed on that paper are the named inventors on this Yahoo! patent application:
1. Link-based spam detection (20060095416)
The remaining four describe an expansion of the TrustRank process, referred to as dual TrustRank, which adds elements of the social graph to the use of TrustRank:
2. Using community annotations as anchortext (20060294085)
3. Realtime indexing and search in large, rapidly changing document collections (20060294086)
4. Trust propagation through both explicit and implicit social networks (20060294134)
5. Search engine with augmented relevance ranking by community participation (20070112761)

#21 Why are absolute (http://www.mysite.com/my-category) URLs better than relative ("/my-category") URLs for on-page internal linking?
Your Answer: When scraped and copied on other domains, they provide a link back to the website Correct Answer: When scraped and copied on other domains, they provide a link back to the website None of the answers makes sense, except that which refers to scrapers, who often copy pages without changing links and will thus link back to your site, helping to reduce duplicate content issues, and potentially provide some link value as well.
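The scraping point is easy to see with Python's urljoin, which resolves an href against whatever page it appears on (the scraper domain below is hypothetical):

```python
from urllib.parse import urljoin

# A relative href resolves against the page it appears on, so a
# scraped copy ends up pointing at the scraper's own domain ...
original_page = "http://www.mysite.com/articles/avocado"
scraper_page = "http://scraper.example/stolen/avocado"

relative_href = "/my-category"
print(urljoin(original_page, relative_href))  # http://www.mysite.com/my-category
print(urljoin(scraper_page, relative_href))   # http://scraper.example/my-category

# ... while an absolute href survives copying intact, linking back
# to the original site.
absolute_href = "http://www.mysite.com/my-category"
print(urljoin(scraper_page, absolute_href))   # http://www.mysite.com/my-category
```

The same resolution rule is what browsers apply, which is why relative links work fine for visitors but lose you the scraper backlink.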

#22 How can you avoid the duplicate content problems that often accompany temporal pagination issues (where content moves down a page and from page to page, as is often seen in lists of articles, multi-page articles and blogs)?
Your Answer: Link to paginated pages with rel="nofollow" in the link tag Correct Answer: Add a meta robots tag with "noindex, follow" to the paginated pages The only method listed in the answers that's effective is to use "noindex, follow" on the paginated, non-canonical pages.

#23 If you update your site's URL structure to create new versions of your pages, what should you do with the old URLs?
Your Answer: 301 redirect them to the new URLs Correct Answer: 301 redirect them to the new URLs The correct move is to 301 the pages so they pass link juice and visitors to the new, proper locations.
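A common way to implement this is a lookup table mapping old paths to new ones, consulted before normal routing. A minimal sketch with hypothetical paths (a real site would typically do this in the web server or framework configuration):

```python
# Hypothetical old-to-new URL mapping for a restructured site.
REDIRECTS = {
    "/old-category/widget.php?id=7": "/widgets/blue-widget",
    "/old-category/widget.php?id=8": "/widgets/red-widget",
}

def respond(path):
    """Return (status, location) for a request path.

    Moved URLs get a 301 (permanent), which passes link value and
    sends visitors to the new location; everything else serves as-is.
    """
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

The key design choice is the 301 status: a 302 at this step would leave the accumulated link value stranded on the retired URLs.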

#24 When you have multiple pages targeting the same keywords on a domain, which of the following is the best way to avoid keyword cannibalization?
Your Answer: Restrict search engines from crawling/indexing any of the less important pages Correct Answer: Place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links As this blog post explains, it's best to "place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links."

#25 The de-facto version of a page located on the primary URL you want associated with the content is known as:
Your Answer: Empirical Version Correct Answer: Canonical Version The only answer that is generally accepted in the search community is "canonical version."

#26 Which domain extensions are more often associated with greater trust and authority in the search engines?

Your Answer: .org and .net Correct Answer: .edu, .mil and .gov Although the search engines themselves have said there are no specific algorithmic elements that make domains from .gov, .edu and .mil more trustworthy or authoritative, these sites, due to the restriction of the TLD licensing, certainly have an association with more trust in webmasters' eyes (and, very often, the search results).

#27 High quality links to a site's homepage will help to increase the ranking ability of deeper pages on the same domain.
Your Answer: TRUE Correct Answer: TRUE The answer is "TRUE" as the properties of PageRank, domain trust, authority and many other search ranking factors will cause internal pages on a well-linked-to domain to rank more highly.

#28 The practice of showing one version of content on a URL to search engines, and another, different version to human visitors of the same URL is known as?

Your Answer: Cloaking Correct Answer: Cloaking As WebmasterWorld notes, this practice is called cloaking.

#29 Which HTTP server response code indicates a file that no longer exists? (File Not Found)
Your Answer: 404 Correct Answer: 404 The W3C standards for HTTP status codes tell us that 404 is the correct answer.

#30 Spammy sites or blogs begin linking to your site. What effect is this likely to have on your search engine rankings?
Your Answer: It will boost your rankings at Live/MSN, keep them the same at Yahoo! and lower them in Google as the three engines treat spammy links in different ways Correct Answer: A very slight positive effect is most likely, as search engines are not perfectly able to discount the link value of all spammy sites The correct answer is that a very slight positive effect is most likely. This is because search engines do NOT want to penalize for the acquisition of spammy links, as this would simply encourage sites to point low quality links at their competition in order to knock them out of the results. The slight positive effect is typical because not all engines are 100% perfect at removing the link value from spam.

#31 A link from a PageRank "3" page (according to the Google toolbar) hosted on a very strong, trusted domain can be more valuable than a link from a PageRank "4" page hosted on a weaker domain.
Your Answer: TRUE Correct Answer: TRUE Since PageRank is no longer an overwhelmingly strong factor influencing search rankings at Google these days, the answer is definitely "TRUE."

#32 What's the largest page size that Google's spider will crawl?
Your Answer: 250K Correct Answer: No set limit exists - Google may crawl very large pages if it believes them to be worthwhile As evidenced by the many very large (500K+) pages in Google's index, there is no set limit, and the search engine may spider unusually large documents if it feels the effort is warranted (particularly if many important links point to a page).

#33 Is it generally considered acceptable to have the same content resolve on both www and non-www URLs of a website?
Your Answer: No, this may cause negative indexing/ranking issues Correct Answer: No, this may cause negative indexing/ranking issues This is generally considered a bad idea, and may have negative effects if the search engines do not properly count links to both versions (the most common issue) or even view the two as duplicate, competing content (unlikely, though possible).
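The usual fix is to pick one hostname as canonical and 301 the other to it. A sketch of the normalization step using only the standard library (the preference for the www version is an arbitrary assumption for illustration):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize_host(url, prefer_www=True):
    """Normalize www/non-www so each page has a single canonical URL.

    In practice this decision is enforced server-side with a 301
    redirect from the non-canonical hostname to the canonical one.
    """
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))
```

Either choice works; what matters is that links and indexing consolidate on one version instead of splitting between two.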

#34 Which HTTP server response code indicates a page that has been temporarily relocated and links to the old location will not pass influence to the new location?
Your Answer: 301 Correct Answer: 302 The W3C standards for HTTP status codes tell us that 302 is the correct answer.

#35 Which of these is least likely to have difficulty ranking for its targeted terms/phrases in Google?
Your Answer: An established domain moving to a newly registered domain Correct Answer: A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links This is a tough question, and the answer is even somewhat debatable. However, as phrased, the MOST correct answer is almost certainly "A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links," as each of the other situations has many examples of sites having a very difficult time ranking well.

#36 What is the advantage of putting all of your important keywords in the Meta Keywords tag?
Your Answer: They will be bolded in searches for that term Correct Answer: There is no specific advantage for search engines The answer is that no advantage is conferred upon sites who include their terms in the meta keywords tag. For more on the subject, read Danny Sullivan's excellent post.

#37 Which of the following link building tactics do search engines tacitly endorse?

Your Answer: Renting pages from trustworthy domains and placing links on them Correct Answer: Viral content creation & promotion As representatives from each of the major engines have acknowledged publicly, viral content creation and promotion is viewed as a legitimate and preferred tactic for link acquisition.

#38 Which HTTP server response code indicates a page that has been permanently relocated and all links to the old page will pass their influence to the new page location?
Your Answer: 302 Correct Answer: 301 The W3C standards for HTTP status codes tell us that 301 is the correct answer.
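Questions #29, #34 and #38 boil down to a small table. Summarizing the quiz's framing of which responses pass link influence (the tuple layout is just an illustration, not any engine's documented behavior):

```python
# status code -> (meaning, passes link influence to the new location?)
# per the quiz's framing of 301 vs. 302 behavior at the engines
REDIRECT_SEMANTICS = {
    301: ("Moved Permanently", True),        # links to the old URL count for the new one
    302: ("Found (temporary move)", False),  # influence stays with the old URL
    404: ("Not Found", False),               # no new location at all
}
```

The practical rule of thumb: when a move is permanent, always prefer 301, since it is the only response here that transfers the old URL's accumulated value.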

#39 Which of the following factors is considered when search engines assign value to links?
Your Answer: The date/time the link was created and the temporal relationship between that link's appearance and other time-sensitive criteria Correct Answer: The date/time the link was created and the temporal relationship between that link's appearance and other time-sensitive criteria The only one of these that search engines would consider (and have mentioned in patent applications like this one) is the temporal data.

#40 There is no apparent search engine rankings benefit to having a keyword-matched domain name (eg www.example.com for keyword "example").
Your Answer: FALSE Correct Answer: FALSE This is "FALSE," as many examples of keyword-targeted domains have been shown to have a phenomenal amount of ranking success in the engines, despite other factors not being nearly as strong as the competition.

#41 If you want a page to pass value through its links, but stay out of the search engines' indices, which of the following tags should you place in the header?
Your Answer: Use meta robots="index, nofollow" Correct Answer: Use meta robots="noindex, follow" As Google tells us here, the proper format would be to use meta robots="noindex, follow".
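A meta robots content attribute is just a comma-separated list of directives, so a small parser makes the index/follow combinations concrete (the helper is illustrative, not any engine's actual logic):

```python
def robots_directives(content):
    """Split a meta robots content string into a set of lowercase directives."""
    return {token.strip().lower() for token in content.split(",") if token.strip()}

# The tag recommended in the answer above:
#   <meta name="robots" content="noindex, follow">
directives = robots_directives("noindex, follow")
can_index = "noindex" not in directives          # False: page stays out of the index
may_follow_links = "nofollow" not in directives  # True: its links still pass value
```

This is exactly the combination the question asks for: the page itself is excluded from the index, while the links on it continue to be followed and to pass value.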

#42 Which of these factors is LEAST likely to decrease the value of a link?
Your Answer: The linked-to domain has a link somewhere that points at the linking domain (each domain links to pages on the other's site) Correct Answer: The linked-to domain has a link somewhere that points at the linking domain (each domain links to pages on the other's site) The right answer is "a link from the domain being linked to, pointing at the linking site, already exists (each domain links to pages on the other's site)." This is because despite the fact that these links are technically "reciprocal," they don't fit any pattern of penalization for such links (such as being listed on link-list style pages). The search engines are least likely to devalue these because of all the natural patterns in which such linking occurs (blogrolls, news sites, forums, hobbyists, schools, etc.)

#43 Which of the following is a requirement for getting in the Google Local listings?
Your Answer: A 3-digit numerical string in the URL of content pages Correct Answer: A physical mail address in your claimed location The only one that's a must-have is the physical mailing address.

#44 Which of the following engines offers paid inclusion services for their main web index (not advertising):
Your Answer: Google Correct Answer: Yahoo! Currently, only Yahoo! offers paid inclusion through their search submit program.

#45 When is it advisable to leave the meta description off of a page?

Your Answer: When the page is very focused toward one keyword, and a description wouldn't help. Correct Answer: When a large number of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all The correct answer is "When a large number of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all." Duplicate meta description tags aren't the worst thing in the world, but they're certainly not providing any value and may have downsides from a duplicate content perspective (particularly if page content is very similar). Besides that, the other answers simply don't make sense :)

#46 A domain will not be hurt by having a penalized site or page 301'd to it.
Your Answer: TRUE Correct Answer: TRUE This is "TRUE," and has been tested by many a black hat. The danger here is that, once again, crafty spammers could use this technique to hurt their competitors if the search engines did penalize the receiving domain.

#47 Which of the following strategies is the best way to lift a page out of Google's supplemental index?
Your Answer: Link to it internally from strong pages Correct Answer: Link to it internally from strong pages As "supplemental" has been defined by engineers at Google as being a page with very little PageRank, the best way to lift it out, from the options given, is to link to it internally from strong pages.

#48 Which of the following is NOT speculated to be a contributing factor in achieving "breakout" site results in Google?

A sample of "breakout" results for the query, Comedy Central, at Google

Your Answer: Global link popularity Correct Answer: Having an active AdWords campaign The only one that doesn't fit is the use of an AdWords campaign, which Google has said has no impact on organic listings.

#49 Which of the following is the best method to ensure that a page does not get crawled or indexed by a search engine?
Your Answer: Restrict the page using robots.txt Correct Answer: Restrict the page using robots.txt The clear best method above, and the one prescribed by the engines, is to use the robots.txt file to restrict access.
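Python's standard library ships a parser for exactly this file, which makes it easy to verify a rule before deploying it (the blocked path below is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt blocking one private page for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private-page.html
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "http://www.mysite.com/private-page.html"))  # False
print(parser.can_fetch("*", "http://www.mysite.com/public-page.html"))   # True
```

Well-behaved crawlers check these rules before fetching, which is why robots.txt is the engines' prescribed mechanism for restricting access.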

#50 If you want to rank for a country specific TLD/Top-Level-Domain extension (such as Yahoo.jp or Google.ca) which of the following is NOT important?
Your Answer: Registering with local search extensions of the major engines to confirm your geographic location Correct Answer: Linking out only to other sites with the targeted TLD extension Linking out only to other sites with the targeted TLD extension is certainly not a requirement nor a suggested method for inclusion into a country-specific search engine's results. See this recent video for more.

#51 Which of the following CANNOT get you penalized at the major search engines?
Your Answer: Linking to spammy sites Correct Answer: Using "nofollow" internally on your site to control the flow of link juice As Matt Cutts has noted recently, using "nofollow" to sculpt the flow of link juice is perfectly acceptable.

#52 Which of the following directories had its ability to pass link value removed?
Your Answer: www.bluefind.org - The Bluefind Web Directory Correct Answer: www.bluefind.org - The Bluefind Web Directory Only BlueFind suffered this penalty - having had its ability to pass link value removed by Google, ostensibly for "selling PageRank."

#53 Which of the following is an acceptable way to show HTML text to search engines while creating a graphical image to display to users?

Your Answer: IP-based Cloaking - show search engine IPs the text and other IPs the image Correct Answer: CSS image replacement - create a rule in the CSS file that replaces the text with an image based on a given class The only method that's approved by search engines is to use CSS image replacement with the exact copy in both the image and the HTML text.

#54 For high-volume search phrases, the Search Engines usually will not differentiate between singular and plural versions of a term (eg "cell phone" vs. "cell phones" or "bird feeder" vs. "bird feeders").
Your Answer: FALSE Correct Answer: FALSE As we can see from searches on the various phrases - cell phone vs. cell phones and bird feeder vs. bird feeders - this is FALSE. There are clear differentiations.

#55 If your site is ranked in the #1 organic position for a given query, advertising in the top paid position for that search result will generally not produce an additional volume of search traffic.
Your Answer: FALSE Correct Answer: FALSE Research from several sources, including this eye-tracking research report from MarketingSherpa, indicates that the correct answer is FALSE. You get more traffic and click-throughs with both the top paid and organic results than either individually.

#56 What's likely to happen if multiple accounts on a single IP address vote up a story at Digg in a short time period?
Your Answer: Your accounts will be suspended Correct Answer: Your accounts will be suspended The most likely result, particularly if this is done multiple times, is to have the accounts suspended.

#57 Let's assume that you're running SEO for an auction website with many listings, sorted by categories and subcategories. To achieve the maximum search engine traffic benefit, what should you do with individual product/auction pages after the auction has expired and the product is no longer available?

Your Answer: 301 redirect the pages to the home page to keep any external link juice flowing through the site Correct Answer: 301 redirect them to the most appropriate category page associated with the product The "best" answer of the choices given is to 301 redirect the pages to the most appropriate category page associated with the product - this ensures that link value won't be lost, and visitors who come to the old page will get the best user experience as well.

#58 Which factor is most likely to decrease the ranking value of a link?
Your Answer: Comes from a page with many reciprocal and paid links Correct Answer: Comes from a page with many reciprocal and paid links All of the answers can provide significant link value except "comes from a page with many reciprocal and paid links," which is very likely to have a strong negative effect on the value of the link.

#59 Which of the following search engine and country combination does not represent the most popular search engine in that country?
Your Answer: Japan / Yahoo Correct Answer: Japan / Yahoo All of the other pairings are correct; in Japan, Google appears to now have a dominant search market share, despite Yahoo! getting more web traffic and visits. See also this piece from Multilingual-Search.com.

#60 Where do search engines consider content inside an iFrame to be located?

Your Answer: Search engines cannot spider content in iFrames Correct Answer: On the source page the iFrame pulls from Engines judge iFrame content the same way browsers do, and consider it to be part of the source page the iFrame pulls from (not the URL displaying the iFrame content).

#61 If the company you buy links from gets "busted" (discovered and penalized) by a search engine, the links you have from them will:
Your Answer: Cause your site to be banned from the engine Correct Answer: Stop passing link value Since search engines don't want to give webmasters the ability to knock their competitors out with paid links, they will simply devalue the links they discover to be part of paid networks, such that they no longer pass value.

#62 Which of these queries would not have an "Instant Answer" or "Onebox Result" on Google?
Your Answer: Number of Horns on a Unicorn Correct Answer: Best Chinese Restaurant in San Francisco Not surprisingly, the only correct answer is "Best Chinese Restaurant in San Francisco."

#63 Which major search engine serves advertising listings (paid search results) from the PPC program of one of the other major engines?
Your Answer: Ask.com Correct Answer: Ask.com Ask.com is the only major engine that shows ad results from another engine: specifically, Google.

#64 Duplicate content is primarily an off-site issue, created through content licensing deals and copyright violations of scraped and republished content, rather than a site-internal problem.
Your Answer: TRUE Correct Answer: FALSE The answer is FALSE, as on-site duplicate content issues can be serious and cause plenty of problems in the search engines.

#65 Links from 'noindex, follow' pages are treated exactly the same as links from default ('index, follow') pages.
Your Answer: TRUE Correct Answer: TRUE This is TRUE - according to Matt Cutts in a comment here, links on pages with "noindex, follow" are treated exactly the same as links from default ("index, follow") pages.

#66 Which metric is NOT used by the major search engines to measure relevance or popularity in their ranking algorithms?

Your Answer: Anchor text of links pointing to a page Correct Answer: Keyword density in text on the page Keyword density is the outlier here. Dr. Garcia explains why search engines don't use the metric here.

#67 If they have the same content, the Search Engines will consider example.com/avocado and example.com/avocado/ to be the same page.
Your Answer: TRUE Correct Answer: TRUE The answer is TRUE, as engines don't consider the trailing slash to create a different page (examples here and here).

#68 Which Search Engines currently allow the 'nocontent' attribute?

Your Answer: Google Correct Answer: Yahoo! To date, only Yahoo! has implemented the nocontent parameter.

#69 In which of the following countries does Ask.com have the most significant percentage of search engine market share?
Your Answer: Australia Correct Answer: United States Surprisingly, the answer is the US, where Ask.com has an estimated 5% market share.

#70 For search engine rankings & traffic in Google & Yahoo!, it is generally better to have many, small, single topic focused sites with links spread out between them than one, large, inclusive site with all the links pointing to that single domain.
Your Answer: TRUE Correct Answer: FALSE This is FALSE, primarily because the search engines' current algorithms place a great deal of weight on large, trusted domains, rather than small, niche sites.

#71 The 4 major search engines - Google, Yahoo!, MSN/Live and Ask serve what approximate percentage of all searches performed in the US?
Your Answer: ~80% Correct Answer: ~95% According to nearly every study reported (including ComScore's), the four major networks, when AOL is included (serving Google results), provide ~95% of all searches in the US.

#72 The linkfromdomain operator displays what information and is available at which search engine(s)?
Your Answer: Data on who is linking to a given website - available at Google & Yahoo! Correct Answer: Data on what websites are linked-to from a given domain - available at MSN/Live only As can be seen here, Microsoft/Live is the only engine to provide the command, and it shows what pages are linked-to by a given domain.

#73 Which of the following social media websites is the least popular (as measured by active users & visitors)?
Your Answer: Newsvine Correct Answer: Newsvine Newsvine is the smallest of the above, both in terms of traffic and users.

#74 Which of the following pieces of information is NOT available from current keyword research sources?
Your Answer: Estimated volume of searches per month Correct Answer: Cost per click paid by PPC advertisers Since all of the current search engines have blind bid systems, the cost-per-click paid by advertisers is currently unavailable anywhere.

#75 The use of AJAX presents what common problem for search engines and websites?
Your Answer: It creates multiple pages with unique content without enabling new, spiderable, linkable URLs Correct Answer: It creates multiple pages with unique content without enabling new, spiderable, linkable URLs The largest problem for search engines is that AJAX frequently "creates multiple pages with unique content without enabling new, spiderable, linkable URLs."
