LEARN SEO PDF


Read on if you would like to learn how to do SEO. Google has updated their SEO starter guide, although this version is not in PDF. Off the start, learn what keywords convert well for you and which do not. You'll get the most out of this guide if your desire to learn search engine optimization (SEO) is exceeded only by your willingness to execute and test concepts.




Learn the Basics of Search Engine Optimization, by Search Engine Journal: learn the basics of SEO, including how to find the right keywords using the Wordtracker Keywords tool and how to test their value using PPC (pay per click). Each stop is a unique document (usually a web page, but sometimes a PDF, JPG, or other file). Learning the foundations of SEO is a vital step in achieving these goals.

Using appropriate technologies is also important.

Make expertise and authoritativeness clear

Expertise and authoritativeness of a site increase its quality.

Be sure that content on your site is created or edited by people with expertise in the topic. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.

Avoid: Providing insufficient content for the purpose of the page.

Avoid distracting advertisements

We expect advertisements to be visible. However, you should not let the advertisements distract users or prevent them from consuming the site content. For example, advertisements, supplementary content, or interstitial pages (pages displayed before or after the content you are expecting) that make it difficult to use the website.

Learn more about this topic.

Use links wisely

Write good link text

Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites.

In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about. With appropriate anchor text, users and search engines can easily understand what the linked pages contain.

Best Practices

Choose descriptive text

The anchor text you use for a link should provide at least a basic idea of what the page linked to is about.

Avoid:

- Writing generic anchor text like "page", "article", or "click here".
- Using text that is off-topic or has no relation to the content of the page linked to.
- Using the page's URL as the anchor text in most cases (although there are certainly legitimate uses of this, such as promoting or referencing a new website's address).
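To illustrate (the URL and link text here are hypothetical), compare descriptive anchor text with the generic kind to avoid:

    <!-- Descriptive: tells users and Google what the linked page is about -->
    <a href="https://www.example.com/keyword-research/">beginner's guide to keyword research</a>

    <!-- Generic: says nothing about the destination -->
    <a href="https://www.example.com/keyword-research/">click here</a>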

Write concise text

Aim for short but descriptive text, usually a few words or a short phrase.

Avoid: Writing long anchor text, such as a lengthy sentence or short paragraph of text.

Format links so they're easy to spot

Make it easy for users to distinguish between regular text and the anchor text of your links.

Your content becomes less useful if users miss the links or accidentally click them.

Avoid: Using CSS or text styling that makes links look just like regular text.

Think about anchor text for internal links too

You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and Google navigate your site better.

Avoid: Using excessively keyword-filled or lengthy anchor text just for search engines. Creating unnecessary links that don't help with the user's navigation of the site.

Be careful who you link to

You can confer some of your site's reputation to another site when your site links to it.

Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog.

You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow. If you are using a third party's widget to enrich the experience of your site and engage users, check if it contains any links that you did not intend to place on your site along with the widget.

Some widgets may add links to your site which are not your editorial choice and contain anchor text that you as a webmaster may not control. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet. You can find more details about the robots meta tag on the Webmaster Blog.

Combat comment spam with "nofollow"

Setting the value of the "rel" attribute of a link to "nofollow" will tell Google that certain links on your site shouldn't be followed or pass your page's reputation to the pages linked to.

If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.

Automatically add "nofollow" to comment columns and message boards

Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this.

This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site.
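As a minimal sketch (the commenter URL is hypothetical), a nofollowed user-submitted link looks like this:

    <!-- rel="nofollow" asks Google not to follow this link or pass your page's reputation to it -->
    <a href="http://commenter-site.example.com/" rel="nofollow">Commenter's site</a>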

The "alt" attribute allows you to specify alternative text for the image if it cannot be displayed for some reason.


Why use this attribute? If a user is viewing your site using assistive technologies, such as a screen reader, the contents of the alt attribute provide information about the picture. Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
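For illustration (the filename and alt text are made up), a descriptive filename and alt attribute might look like this:

    <!-- A descriptive filename and alt text help screen readers and image search alike -->
    <img src="golden-retriever-puppy-fetch.jpg" alt="Golden retriever puppy playing fetch in a park">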

Avoid: Using generic filenames like "image1.jpg". Writing extremely lengthy filenames. Stuffing keywords into alt text or copying and pasting entire sentences.

What about the other way around? Imagine you have neatly trimmed your lawn, but the inside of your house is a mess.

When a visitor leaves your site after viewing only one page, Google considers that a bounce. You can do several things on your page to get the former right, and even more things outside of it (off the page, if you will) to ace the latter. The first and most important is content.

A Google search engine customer is happy when they find the result that serves their needs in the best way. Google tries to give you exactly what you asked for, and always tries to give you the best experience possible by directing you to the greatest content it can find.

This means that your number one job to do well with SEO is to produce great content. You still have to put in a ton of work. SEO is no different than any other skill: great results will always come from big effort.

But coming up with great content is not easy. After all, it means that you have to become a teacher — and a good one at that. Out of all on-page SEO factors, this is the one you should spend the most time learning. While you should, of course, use your keyword throughout your content, jamming your keyword into your text as much as possible will hurt your rankings rather than improve them.


Today, the use of keywords is much more about semantics. However, posting new content is only one way to signal freshness to Google. Brian Dean from Backlinko, for example, has only published around 30 posts in two years.

Yet, he keeps all of his posts up to date by rewriting them and adding new information as he finds it. While it is important to publish regularly, you can still get great results by posting once a month as long as your content is thorough and in-depth.

If you write your content clearly enough for Google to recognize it as an answer to a particular question, it will show up directly beneath the search bar.

So make sure you clear up your writing. Fancy buzzwords and complex sentence constructions will neither make you sound smart nor help your SEO game.

I just made that stat up, but you get the point. Keywords dictate what each piece of content is about. They dictate what you call your site and how you describe your brand online.

Keywords even dictate how you build links, including everything from the tactics you choose to how you plan on implementing them. Another common mistake people make is that they stop. They do it for a week or two, update their pages, and then stop. They think keyword research is a one-and-done thing.

The best SEOs are constantly doing keyword research. That one keyword could send your site thousands of people each month. By default, Googlebot will index a page and follow links to it. At a page level, the robots meta tag is a powerful way to control whether your pages are returned in search results.
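A minimal sketch of that page-level control. The default behaviour (index, follow) needs no tag at all; the example below tells search engines to do the opposite:

    <!-- Placed in the page's <head>: do not index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">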

I have never experienced any problems using CSS to control the appearance of the heading tags, making them larger or smaller. How many words in the H1 tag? As many as I think is sensible: as short and snappy as possible, usually.
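A sketch of a short, snappy H1 whose size is controlled with CSS (the class name is hypothetical):

    <h1 class="page-title">Learn SEO: The Basics</h1>

    <style>
      /* Appearance is handled by CSS; the heading level stays semantically correct */
      .page-title { font-size: 1.5em; }
    </style>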

As always, be sure to make your heading tags highly relevant to the content on that page and not too spammy, either. Use ALT tags (or rather, ALT attributes) for descriptive text that helps visitors — and keep them unique where possible, as you do with your titles and meta descriptions.

The title attribute should contain information about what will happen when you click on the image. Does Google count the text in these attributes as ranking text? From my tests, no.
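A sketch of an image link carrying both attributes (paths and filenames are hypothetical): the alt text is treated like anchor text, while the title attribute describes what happens on click:

    <a href="/downloads/seo-checklist.pdf" title="Download the SEO checklist (PDF)">
      <img src="seo-checklist-cover.jpg" alt="Cover of the SEO checklist">
    </a>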

From observing how my test page ranks, Google is ignoring keywords in the acronym tag. You do not need clean URLs in site architecture for Google to spider a site successfully (confirmed by Google), although I do use clean URLs as a default these days, and have done so for years. However, there is a demonstrable benefit to having keywords in URLs. The thinking is that you might get a boost in Google SERPs if your URLs are clean, because you are using keywords in the actual page name instead of a parameter or session ID number (which Google often struggles with).

I optimise as if they do, and when asked about keywords in URLs, Google did reply: "I believe that is a very small ranking factor." It is also fair to say you get a boost because keywords are in the actual anchor text link to your site, and I believe this is the case, but again, that depends on the quality of the page linking to your site.

That is, if Google trusts it and it passes PageRank! Sometimes I will remove the stop-words from a URL and leave the important keywords as the page title, because a lot of forums garble a URL to shorten it. Most forums will be nofollowed, to be fair, but some old habits die hard. It should be remembered that although Googlebot can crawl sites with dynamic URLs, it is assumed by many webmasters that there is a greater risk it will give up if the URLs are deemed unimportant and contain multiple variables and session IDs (a theory, at least).
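To illustrate the difference (both URLs are hypothetical):

    https://www.example.com/page.php?id=73&sessionid=a8b3c   (dynamic: parameters and a session ID)
    https://www.example.com/seo-tutorial/                    (clean: the keyword is the page name)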

As standard, I use clean URLs where possible on new sites these days, and try to keep the URLs as simple as possible, without obsessing about it.

Having a keyword in your URL might be the difference between your site ranking and not — potentially useful to take advantage of long tail search queries. I prefer absolute URLs. Google will crawl either if the local setup is correctly developed.
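For clarity (URLs hypothetical), the same link written both ways:

    <a href="https://www.example.com/services/">Our services</a>   <!-- absolute URL -->
    <a href="/services/">Our services</a>                          <!-- relative URL -->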

This is entirely going to be a choice for your developers. Some developers on very large sites will always prefer relative URLs. I have not been able to decide if there is any real benefit in terms of ranking boost to using either. I used to prefer flat files like .html. Google treats some subfolders….. Personally, as an SEO, I prefer subdirectories rather than subdomains if given the choice, unless it really makes sense to house particular content on a subdomain rather than the main site (as in the examples John mentions).

I thought that was a temporary solution. If you have the choice, I would choose to house content on a subfolder on the main domain. Recent research would still indicate this is the best way to go. I prefer PHP these days, even with flat documents, as it is easier to add server-side code to that document if I want to add some sort of function to the site. It is important that what Google (Googlebot) sees is exactly what a visitor would see if they visit your site.

Blocking Google can sometimes result in a real ranking problem for websites. If Google has problems accessing particular parts of your website, it will tell you in Search Console. If you are a website designer, you might want to test your web design and see how it looks in different versions of Microsoft Windows Internet Explorer.

Does Google rank a page higher because of valid code? I love creating accessible websites but they are a bit of a pain to manage when you have multiple authors or developers on a site.

If your site is so badly designed with a lot of invalid code even Google and browsers cannot read it, then you have a problem.


Where possible, if commissioning a new website, demand, at least, minimum web accessibility compliance on a site (there are three levels of priority to meet), and aim for valid HTML and CSS.

It is one form of optimisation Google will not penalise you for. I link to relevant internal pages in my site when necessary.

I silo any relevance or trust mainly via links in text content and secondary menu systems, and between pages that are relevant in context to one another. I do not obsess about site architecture as much as I used to…. Consider sitelinks: these are normally triggered when Google is confident this is the site you are looking for, based on the search terms you used. Sitelinks are usually reserved for navigational queries with a heavy brand bias: a brand name or a company name, for instance, or the website address.

Google seems to like to mix this up a lot, perhaps to offer some variety, and probably to obfuscate results to minimise or discourage manipulation. Sometimes it returns pages that leave me scratching my head as to why Google selected a particular page.

Sitelinks are not something that can be switched on or off, although you can control to some degree which pages are selected as sitelinks. This works for me: it allows me to share the link equity I have with other sites while ensuring it is not at the expense of pages on my domain. Try it. Check your pages for broken links. Seriously, broken links are a waste of link power and could hurt your site, drastically in some cases. Google is a link-based search engine — if your links are broken and your site is chock full of 404s, you might not be at the races.

For example (and I am talking internally here), if you took a page and I placed two links on it, both going to the same page, will Google ignore the second link? Or will it read the anchor text of both links, and give my page the benefit of the text in both links, especially if the anchor text is different in each? OK — hardly scientific, but you should get the idea. What is interesting to me is that knowing this leaves you with a question: if your navigation array has your main pages linked to in it, perhaps your links in content are being ignored, or at least, not valued.

I think links in body text are invaluable. Does that mean placing the navigation below the copy to get a wide and varied internal anchor text to a page? Also, as John Mueller points out, Google picks the best option to show users depending on who they are and where they are.

So sometimes, your duplicate content will appear to users where relevant. This type of copying makes it difficult to find the exact matching original source.


These types of changes are deliberately done to make it difficult to find the original source of the content. How do you get two listings from the same website in the top ten results in Google, instead of one in the normal view with 10 results?

Generally speaking, this means you have at least two pages with enough link equity to reach the top ten results — two pages very relevant to the search term. You can achieve this with relevant pages, good internal structure and of course links from other websites. Some SERPs feature sites with more than two results from the same site. It is incredibly important to create useful and proper 404 pages. This will help prevent Google recording lots of autogenerated thin pages on your site (both a security risk and a rankings risk).

I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Think about providing a way for users to report a broken link. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.
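On an Apache server, a minimal sketch (the path is hypothetical) of serving a custom 404 page while still returning the real 404 status code:

    # .htaccess: serve a custom error page for missing URLs.
    # Use a local path, not a full URL, so Apache still returns a genuine 404 status code.
    ErrorDocument 404 /404.html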


A good 404 page and proper setup prevents a lot of this from happening in the first place. Pages may lack MC (main content) for various reasons. Sometimes, the content is no longer available and the page displays an error message with this information. This is normal, and those individual non-functioning or broken pages on an otherwise maintained site should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.

The issue here is that Google introduces a lot of noise into that Crawl Errors report, making it unwieldy and not very user-friendly. A lot of the broken links Google tells you about can often be totally irrelevant legacy issues. Google could make it instantly more valuable by telling us which 404s are linked to from only external websites. I also prefer to use Analytics to look for broken backlinks on a site with some history of migrations, for instance.

John has clarified some of this before, although he is talking specifically, I think, about errors found by Google in Search Console (formerly Google Webmaster Tools). If you are making websites and want them to rank, the Quality Raters Guidelines document is a great guide for webmasters to avoid low-quality ratings and potentially avoid punishment algorithms. You can use 301 redirects to redirect pages, sub-folders or even entire websites and preserve the Google rankings that the old page, sub-folder or website enjoyed.

This is the best way to ensure that users and search engines are directed to the correct page. Redirecting multiple old pages to one new page works too if the information is there on the new page that ranked the old page.

Pages should be thematically connected if you want the redirects to have an SEO benefit. My general rule of thumb is to make sure the information and keywords on the old page are featured prominently in the text of the new page — stay on the safe side. You need to keep these redirects in place (for instance, on a Linux Apache server, in your .htaccess file) forever (John Mueller, Google). If you need a page to redirect old URLs to, consider your sitemap or contact page.
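A minimal .htaccess sketch of such permanent redirects (all paths are hypothetical):

    # 301 = permanent redirect: send an old page to its new home
    Redirect 301 /old-page.html https://www.example.com/new-page/

    # Redirect an entire old sub-folder, preserving the rest of the path
    RedirectMatch 301 ^/old-folder/(.*)$ https://www.example.com/new-folder/$1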

As long as the intention is to serve users and create content that is satisfying and more up-to-date, Google is OK with this. As a result, that URL may be crawled and its content indexed. However, Google will also treat certain mismatched or incorrect redirects as soft 404 type pages, too.

And this is a REAL problem today, and a marked change from the way Google worked, say, ten years ago. It essentially means that Google is not going to honour your redirect instruction, and that means you are at risk of knobbling any positive signals you are attempting to transfer through a redirect.

Sometimes it is useful to direct visitors from a usability point of view, but sometimes that usability issue will impact SEO benefits from old assets.

If I want to boost that page's relevance for the KEYWORD at the center of any redirects, I will ensure the new page content is updated and expanded upon if it is of genuine interest to a user. Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want used for your site in the search results. Simply put, pick one version and stick with it; it keeps it simple when optimising for Google.
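One way to declare that preferred version is a canonical link element in the head of each page; a sketch, with a hypothetical URL:

    <!-- Tells Google which version of this page you want indexed -->
    <link rel="canonical" href="https://www.example.com/seo-tutorial/">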

If you want to know more, see how to use canonical tags properly. Other pages, every couple of months. Again — best practice. Google has said very recently that XML and RSS are still a very useful discovery method for them to pick out recently updated content on your site. Remember — Google needs links to find all the pages on your site, and links spread PageRank, which helps pages rank — so an XML sitemap is never a substitute for great website architecture.
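A minimal XML sitemap sketch, with one hypothetical URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/seo-tutorial/</loc>
        <lastmod>2019-03-01</lastmod>
      </url>
    </urlset>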

Google wants evaluators to find out who owns the website and who is responsible for the content on it. The above information does not need to feature on every page, but it should appear on a clearly accessible page.


If the business is a member of a trade or professional association, membership details, including any registration number, should be provided.

Consider also the Distance Selling Regulations, which contain other information requirements for online businesses that sell to consumers (B2C, as opposed to B2B, sales). While you are editing your footer, ensure your copyright notice is dynamic and will change year to year — automatically. A little bit of code will display the current year. You can take the information you have from above and transform it with Schema.org markup. I got yellow stars in Google within a few days of adding the code to my website template — directly linking my site to information Google already has about my business.
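Two minimal sketches (the business name, URL and rating figures are placeholders, and review stars require genuine review data): the dynamic copyright year in PHP, and a Schema.org block in JSON-LD:

    <!-- Dynamic copyright year: no manual edit needed each January -->
    &copy; <?php echo date("Y"); ?> Example Company

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Company",
      "url": "https://www.example.com/",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "27"
      }
    }
    </script>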

Flash is a proprietary plug-in created by Macromedia to infuse (albeit fantastically) rich media into your websites. The W3C advises you avoid the use of such proprietary technology to construct an entire site. "We will remove Flash completely from Chrome toward the end of 2020." Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with mobile devices.

Note that Google sometimes highlights if your site is not mobile friendly on some devices. And on the subject of mobile-friendly websites, note that Google has alerted the webmaster community that mobile friendliness is a search engine ranking factor (as of 2015): "This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results."
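Mobile friendliness starts with a responsive layout; one common first step (a sketch, not the whole job) is the viewport meta tag:

    <!-- Tells mobile browsers to use the device width rather than a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">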

HTML5 is the preferred option over Flash these days, for most designers. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc.

These elements work fine for TV — but only cause problems for website visitors. Keep layouts and navigation arrays consistent and simple too. Site speed matters for two reasons: first, I have witnessed VERY slow websites (10 seconds and more to load) negatively impacted in Google, and second, there are statements made by Googlers:

Google might crawl your site slower if you have a slow site. My latest research would indicate: as fast as possible. Easier said than done. In the video above, you hear from at least one spam fighter who would confirm that at least some people are employed at Google to demote sites that fail to meet policy. So, for the long term, on primary sites, once you have cleaned all infractions up, the aim is to satisfy users.

It still leaves out some of the more complicated technical recommendations for larger sites. I usually find it useful to keep an eye on what Google tells you to avoid in such documents, which are:.

Users dislike clicking a search engine result only to land on another search result page on your site. Of course, you combine the above together with the technical recommendations in Google's guidelines for webmasters. On-page SEO is no longer as simple as a checklist of keyword here, keyword there.

Now, consultants need to be page-centric (abstract, I know), instead of just keyword-centric, when optimising a web page for Google. One filter may be kicking in, keeping a page down in the SERPs, while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa. You might have very good content, but a very poor technical organisation of it. The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query.

Next time you are developing a page, consider that what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages on the site are emphasised in the site architecture?

Which pages would you ignore? You can help a site along in any number of ways (including making sure your page titles and meta tags are unique), but be careful. There are no hard and fast rules to long-term ranking success, other than developing quality websites with quality content and quality links pointing to them.

The aim is to build a satisfying website and build real authority! Make mistakes and learn from them by observation. If there were, I would be a black hat full time. So would everybody else trying to rank in Google. The majority of small to medium businesses do not need advanced strategies because their direct competition has not employed these tactics either.

This site was a couple of years old, had a clean record in Google, and already had a couple of organic links from trusted sites. So, if you are helping visitors who come from Google — and not just directing them to another website — you are probably doing at least one thing right.

But he points out that he has more links to the page than the competition. He tallied up the answers for a leaderboard, but you can see each response too.
