First, keyword analysis.
Keyword analysis is a skill every SEO must master. Although large websites hold massive amounts of data, every page still needs keyword analysis. Beyond SEO staff, planners and editors also need some keyword-analysis ability.
Let's look at the basic principles of keyword analysis:
1. Investigate users' search habits: this is essential. Only by understanding users' search habits can we know what our users need, what they like to search for, which search engines they use, and so on.
2. Don't go too broad: overly broad keywords face fierce competition, so you may spend a great deal of time without getting the desired result, and they can also reduce keyword relevance.
3. Don't go too obscure: think about it, if no users ever search for a keyword, is it worth optimizing?
4. Keywords should be highly relevant to the page content: this benefits both optimization and users.
Next, the steps of keyword selection:
1. Determine the core keywords: which one or two words describe the page content most accurately? Which words do users search for most?
2. Expand the core keywords: for example, aliases of the core keywords, combinations built around them, and supporting terms.
3. Design keywords by simulating user thinking: imagine yourself as a user; what keywords would you search for?
4. Study competitors' keywords: analyze the websites of leading competitors. What keywords do they use?
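The expansion step above can be sketched in code. This is a minimal illustration; the alias and modifier lists are made-up placeholders, and real candidates would come from search-suggestion data, competitor pages, and user research.

```python
# Sketch: expanding a core keyword into long-tail candidates.
# The example inputs below are illustrative, not real keyword data.
def expand_keywords(core, aliases, modifiers):
    """Combine a core keyword, its aliases, and common modifiers
    into a flat list of long-tail keyword candidates."""
    bases = [core] + aliases
    candidates = []
    for base in bases:
        candidates.append(base)                     # the bare keyword
        for mod in modifiers:
            candidates.append(f"{mod} {base}")      # modifier in front
            candidates.append(f"{base} {mod}")      # modifier behind
    return candidates

# Example: one core keyword, one alias, two modifiers.
print(expand_keywords("laptop", ["notebook"], ["cheap", "best"]))
```

Each candidate would then be checked against search-volume data before being assigned to a page.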
Second, search-engine-friendly page design
1. First, the search engine friendliness of the UI design: the main points are clear navigation and careful use of Flash and images. Generally speaking, navigation and keyword-bearing areas should not use Flash or images, because most search engines cannot read the text inside them.
2. Then, search-engine-friendly front-end code, which includes the following points.
A. Concise code: search engines prefer concise HTML, which is easier to parse.
B. Important information first: information that carries keywords or is updated frequently should appear as early as possible in the HTML.
C. Filter out interference: pages on large websites are usually complex and carry much irrelevant content, such as advertisements and partner material. We can load this content with techniques most search engines cannot parse, such as JS or iframes, so it is filtered out.
D. Basic on-page SEO: the fundamentals of avoiding HTML errors and using semantic tags.
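A few of these basics can be checked automatically. Below is a minimal sketch using Python's standard-library HTML parser; the two rules it checks (a single h1, alt text on images) are illustrative examples of "avoid HTML errors and use semantic tags", not an exhaustive audit.

```python
# Sketch: a minimal on-page check with the stdlib HTML parser.
from html.parser import HTMLParser

class BasicSEOCheck(HTMLParser):
    """Counts <h1> tags and <img> tags lacking alt text."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

# Example run on a tiny fragment: one h1, one image without alt text.
checker = BasicSEOCheck()
checker.feed('<h1>Title</h1><img src="a.png"><img src="b.png" alt="b">')
print(checker.h1_count, checker.images_missing_alt)  # → 1 1
```

In practice such a check would run in the publishing pipeline so editors see problems before a page goes live.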
Third, link strategy:
It is divided into two parts: internal link strategy and external link strategy.
Internal link strategy:
1. The massive data of large websites makes the advantage of internal links far greater than that of external links. External links may number in the tens or hundreds of thousands, but a large website has millions or even hundreds of millions of pages. Using these massive pages to build internal links gives an obvious advantage.
2. It is easy to place links between pages of the same site.
3. Internal links improve the efficiency with which search engines crawl and index the site, increase the number of indexed pages, and help pass PageRank.
4. They concentrate the topic, giving that topic's keywords a ranking advantage in search engines.
Internal link building should follow these principles:
1. Control the number of links per article: depending on the content, keep the number of inserted links at roughly 3-8.
2. The linked pages should be highly relevant.
3. Give more attention to important pages: let important pages with higher keyword value get better rankings.
4. Use absolute paths.
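The first and fourth principles can be sketched as a small helper that inserts internal links into article text. The keyword-to-URL mapping is hypothetical, and the naive substring match would need word-boundary handling before production use.

```python
# Sketch: inserting internal links for known keywords into article text,
# capped at a small number per article (the 3-8 guideline above).
def add_internal_links(text, keyword_urls, max_links=5):
    """Link the first occurrence of each mapped keyword, using
    absolute URLs, until the per-article cap is reached."""
    linked = 0
    for kw, url in keyword_urls.items():
        if linked >= max_links:
            break
        if kw in text:
            # Naive match: links only the first occurrence of the keyword.
            text = text.replace(kw, f'<a href="{url}">{kw}</a>', 1)
            linked += 1
    return text
```

A real system would also skip keywords already inside tags and prefer the most important target pages first (principle 3).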
External link strategy:
We emphasize internal link building for large websites, but external link building cannot be ignored at the same time. Although external links are not as important as they are for small and medium-sized websites, they still have high value. We can usually build external links through link exchanges, link bait, and promotional articles that carry links.
1. First, the principles to follow in link exchanges:
A. The link text should contain keywords.
B. Exchange links with highly relevant sites and channels whenever possible.
C. The partner site should not export too many links, or each individual link carries little value.
D. Avoid exchanging links with sites that are not indexed or have been penalized by search engines.
2. Link bait: making link bait is a labor-saving approach that gets other sites to add links to us on their own initiative. There are many techniques for making link bait, but they can be summed up in one word: creativity.
3. Promotional articles with links: publishing soft articles that carry links, either as business promotion or purely to obtain external links.
Fourth, sitemap strategy:
Many large websites pay no attention to sitemap construction; their sitemaps are perfunctory, mere decoration. In fact, sitemaps are very important for large websites. Their massive data, complex URL and navigation structures, and extremely fast update frequency make it impossible for search engines to crawl all the pages completely. This is an important reason why some large websites have millions or even hundreds of millions of pages yet only half, a third, or even less of them indexed by search engines. If you cannot even guarantee indexing, how can you rank?
HTML sitemap:
1. Build a good navigation structure for search engines.
2. An HTML sitemap can be divided into horizontal and vertical navigation. Horizontal navigation mainly links to channels, columns, and special topics, while vertical navigation mainly focuses on keywords.
3. Every page should carry a link to the sitemap.
XML sitemap:
This is mainly for Google, Yahoo, Live, and other search engines. Because a large website holds so much data, a single sitemap would produce a huge sitemap.xml file that exceeds the search engines' limits. We therefore need to split sitemap.xml into several files, each staying within the range the search engines recommend.
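The splitting described above can be sketched as follows. The 50,000-URLs-per-file cap reflects the commonly cited sitemap protocol limit; the file names and base URL are illustrative.

```python
# Sketch: split a large URL list into several sitemap files plus a
# sitemap index. Names and base URL are example values.
def build_sitemaps(urls, base="https://example.com", per_file=50000):
    """Return a dict of file name -> XML content."""
    files = {}
    for i in range(0, len(urls), per_file):
        name = f"sitemap-{i // per_file + 1}.xml"
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls[i:i + per_file])
        files[name] = ('<?xml version="1.0" encoding="UTF-8"?>\n'
                       '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                       f"{body}\n</urlset>")
    # The index file lists every sitemap fragment generated above.
    index_body = "\n".join(f"  <sitemap><loc>{base}/{n}</loc></sitemap>" for n in files)
    files["sitemap-index.xml"] = ('<?xml version="1.0" encoding="UTF-8"?>\n'
                                  '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                                  f"{index_body}\n</sitemapindex>")
    return files
```

The index file is what gets submitted to the search engines; it points at each fragment, so new fragments can be added as the site grows.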
Fifth, content strategy:
Search-engine-friendly writing is a key part of getting massive amounts of content to rank well. SEO staff cannot offer SEO suggestions or plans for every single page, so training the writers is especially important. If every writer writes according to search-engine-friendly principles, the results will be remarkable.
1. Give writers repeated SEO training: writers are not SEOs and have no SEO experience, so they cannot master SEO writing skills all at once. Only repeated training achieves results.
2. Before creating content, first think about what users will search for, and write to meet those search needs.
3. Pay attention to titles and writing style: for example, practices such as keyword stuffing or using keywords unrelated to the content will have a negative impact, even where their weight in search engines is already very low. The title carries high weight, so try to incorporate keywords into it.
4. Integrate keywords into the content: work keywords in naturally so that they appear in appropriate positions and at an appropriate density.
5. Add links to keywords: linking related keywords, or linking keywords that appear on this page to other pages, makes good use of the internal link advantage.
6. Use semantic tags for keywords.
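Point 4's "appropriate keyword density" can be approximated with a rough check like the one below. The occurrences-over-total-words formula is a common rule-of-thumb approximation, not anything specified by search engines.

```python
# Sketch: a rough keyword-density check for a draft article.
def keyword_density(text, keyword):
    """Return keyword occurrences divided by total word count.
    Note: substring counting is crude; it also matches the keyword
    inside longer words, which is acceptable for a rough check."""
    words = text.lower().split()
    if not words:
        return 0.0
    return text.lower().count(keyword.lower()) / len(words)
```

An editor tool could flag drafts whose density falls outside a house range, prompting a rewrite rather than rejecting the piece outright.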
Create special topic pages for keywords:
Apart from the final detail pages, special topic pages for popular keywords should be the website's second-largest source of search engine traffic. After mining the hot data, we can build topic pages for these hot keywords. Where does the topic page content come from? We generally filter and aggregate information related to the keyword programmatically, so that the content closely matches the keyword and provides what users and search engines need.
Of course, without supporting measures it is hard to get a topic page ranked in search engines just by creating it. We can feed it links through internal links in articles, recommendations on channel pages, or special recommendations on final pages.
1. Build special topics for popular keywords.
2. Aggregate keyword-related information.
3. Support them with internal links from articles.
Sixth, log analysis and data mining:
Log analysis and data mining are often ignored. In fact, they are very meaningful work for large and small websites alike; they are simply harder on large websites because the data volume is so big, so we need enough patience and clear targets to do this work well.
1. Website log analysis: there are many kinds of log analysis, such as traffic source, browser, client screen size, entry page, bounce rate, and PV. Three are most relevant to SEO work:
A. Search engine traffic.
B. Search engine keyword analysis.
C. Statistical analysis of user search behavior.
2. Hot data mining: we can mine hot data through our own log analysis, external tools, and the SEO's own feel for trends. Hot data mining mainly includes the following:
A. Track industry hot spots: editors and SEOs can do this together.
B. Predict potential hot spots: this requires high sensitivity to information and the ability to foresee topics about to trend.
C. Create your own hot spots, for example through publicity campaigns.
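The log-analysis types under point 1 above can be sketched as a small referrer parser that counts visits sent by search engines. The log line format and the list of engine domains are simplified assumptions; real access logs and engine lists vary.

```python
# Sketch: count search-engine-referred visits from access-log lines.
import re
from collections import Counter

# Illustrative engine domain fragments, not an exhaustive list.
ENGINES = ("google.", "bing.", "baidu.", "yahoo.")

def count_search_referrals(log_lines):
    """Extract the quoted referrer host from each line and tally
    lines whose referrer host matches a known engine fragment."""
    counts = Counter()
    for line in log_lines:
        # Match a quoted URL and capture its host part.
        m = re.search(r'"https?://([^/"]+)[^"]*"', line)
        if m:
            host = m.group(1)
            for e in ENGINES:
                if e in host:
                    counts[e.rstrip(".")] += 1
    return counts
```

Running this daily and comparing the keyword portion of the referrer URLs over time gives the trend data that the hot-spot mining in point 2 builds on.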