A solution to a drop in web traffic

How to solve a drop in website traffic:

1. The total number of indexed pages

Checking the search engine every day to see how many of your pages it has indexed takes effort, and even a healthy website will see its indexed page count fluctuate considerably. With everything else unchanged, a drop in the number of indexed pages is the most likely explanation for a drop in traffic, unless you can show a better reason, for example that the de-indexed pages brought in little traffic or that the two are unrelated. In other words, if your Google-indexed pages shrink by 30% but the removed pages were not sending your site any traffic, then the traffic drop was not caused by the loss of those pages.

If your website is reachable under two hostnames, with and without WWW (for example www.mntp123.cn and mntp123.cn), you should redirect one of them to the other with a permanent 301 redirect. In the short run this may slow your search traffic and reduce the number of indexed pages (especially if both hostnames rank well), but in the long run your website will benefit from it. A minimal sketch of such a redirect is shown below.
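As a rough illustration only, here is a minimal Python sketch (standard library) of a permanent 301 redirect from the bare domain to the www host. In practice this is normally configured in the web server itself; the port and the mntp123.cn hostnames are placeholders taken from the example above.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.mntp123.cn"  # the host we keep; the bare domain redirects here


class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send a permanent (301) redirect to the www host, preserving the
        # requested path and query string.
        self.send_response(301)
        self.send_header("Location", f"https://{CANONICAL_HOST}{self.path}")
        self.end_headers()


if __name__ == "__main__":
    # Run this on the bare domain (mntp123.cn) so every request is redirected.
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```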

If you can work out which pages have dropped out of the index, try adjusting the site's navigation structure so that search engine crawlers have a better chance of finding those hidden pages. Also check whether the pages that are not indexed duplicate other content on your own site or closely resemble pages on other sites. This can be difficult for classified-ad websites, because their content is mostly merchants' sales listings. In general, unique user-generated content (ideally with several internal and external links pointing to it) should not be ignored by search engines.

Last week, the search engines that support Sitemaps.org gave users another way to get content crawled: add the address of your XML sitemap feed to robots.txt, and the search engine will fetch it automatically. With Ask.com also joining the list of supporting search engines, the XML sitemap becomes crucial for pages that crawlers do not always reach.
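A minimal sketch of the idea, assuming the www.mntp123.cn URLs and local file paths are just placeholders: generate a bare-bones sitemap.xml and append the Sitemap directive that the paragraph above describes to robots.txt.

```python
from xml.sax.saxutils import escape

# Hypothetical list of pages to expose in the sitemap.
urls = [
    "https://www.mntp123.cn/",
    "https://www.mntp123.cn/about.html",
]

# Build a minimal sitemap following the Sitemaps.org 0.9 schema.
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "".join(f"  <url><loc>{escape(u)}</loc></url>\n" for u in urls)
    + "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

# The line the article describes: point robots.txt at the sitemap feed so
# supporting search engines discover it automatically.
with open("robots.txt", "a", encoding="utf-8") as f:
    f.write("Sitemap: https://www.mntp123.cn/sitemap.xml\n")
```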

2. The number of internal links

Each search engine counts links in its own way, so expecting the link counts reported by every engine to stay stable with only small fluctuations is unrealistic.

If you are sure the traffic drop is related to links within the site, it means the search engine no longer recognizes certain pages. The pages in question may have been linking to other pages or been linked from them. So check whether the URLs of those pages are still included in the search engine's index; the sketch below is one way to collect the internal links you need to audit.
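This helper is not from the article, just an assumed illustration using only the Python standard library: it lists the internal links found on one page so you can check each URL against the search engine's index by hand. The starting page is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collect absolute URLs of links that stay on the same host."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.internal_links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)
        # Keep only links pointing at the same host (internal links).
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.internal_links.add(absolute)


if __name__ == "__main__":
    page = "https://www.mntp123.cn/"  # hypothetical starting page
    collector = LinkCollector(page)
    collector.feed(urlopen(page).read().decode("utf-8", errors="replace"))
    for link in sorted(collector.internal_links):
        print(link)
```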
