Perhaps I should have started with this topic instead of jumping right in. Hindsight, right? This is by no means meant to be comprehensive, just a starting point for any blogger looking to learn more about SEO and how it affects their site.
How Search Engines Work
Search engines have two jobs: crawling and indexing data, and then serving that data to users.
Crawling and Indexing: Think of the internet like a subway system. Each web page is a stop, and the subway is a crawler. (Or spider, or bot, depending on how I’m feeling, which word feels right in my mouth, how the stars are aligned, et cetera.) The subway moves from stop to stop, page to page, connected by links. At each website stop, the crawler takes in as much information as it can to understand the page, and then moves on to the next stop.
Serving Data: Once a page has been crawled and indexed, the engine has to sort through everything it knows and serve the most relevant results for a given search. It does this by holding a giant popularity contest for every query. Results that seem to satisfy users, with a lower bounce rate, a higher click-through rate, or fewer instances of pogo-sticking (bouncing back to the results page to click a different result), get served more often. But if a page is served for a query and users regularly don’t click on it, or click on it and immediately bounce, the search engine assumes that wasn’t what the user was looking for, and won’t serve it as often (if at all) for the next search.
What Do Search Engines Look For?
Google has said that their algorithm evaluates more than 200 different signals in order to serve you the best result possible. Of course, they’ll never tell us exactly what they all are, but we have some ideas.
Search engines have gotten a LOT smarter, but they’re still dumb robots, so we have to make our websites as comprehensible as we can. That means being clear in how we write our content and how it’s organized on the site, and structuring our data in a way that makes it as easy as possible for those dumb bots to understand.
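One concrete way to hand structured data to the bots is schema.org markup embedded in your page. This is just a minimal sketch, not something you have to do by hand (every value below is made up, and most blogging platforms or plugins can generate something like it for you):

```html
<!-- Hypothetical example: structured data describing a post for crawlers -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "SEO Basics for Bloggers",
  "datePublished": "2019-03-02",
  "author": { "@type": "Person", "name": "Your Name" }
}
</script>
```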
Page-level checklist:
- Use subheadings to separate different topics in one post.
- Make sure your images have descriptive alt text or captions.
- Ensure your URL is descriptive if you can. Avoid example.com/post01234.
- Link to relevant pages on your website, and make sure other pages link back.
- Add rel="nofollow" to advertising links, or update your robots.txt, to keep crawlers from following them (see the markup example after this list).
- Do a bit of keyword research to craft your title and URL.
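To make a few of those items concrete, here’s roughly what the markup could look like inside a post. Everything below (file names, URLs, link text) is invented for illustration, and your blogging platform may handle some of it for you:

```html
<!-- Descriptive alt text tells a crawler (and a screen reader) what the image shows -->
<img src="/images/2019/glass-toy-review.jpg"
     alt="The clear glass toy reviewed in this post, photographed on a grey blanket">

<!-- An internal link to a relevant page elsewhere on your own site -->
Read <a href="/guides/choosing-body-safe-materials/">my guide to body-safe materials</a> first.

<!-- rel="nofollow" tells crawlers not to follow (or pass credit to) an advertising link -->
<a href="https://example.com/affiliate-offer" rel="nofollow">Buy it through my affiliate link</a>
```

If all of your ad or affiliate links run through a redirect path like /go/, one common alternative is to block that path in your robots.txt instead of tagging every single link.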
I’ll just put a ton of keywords in my post. That’ll be fine.
No. Did you get that? No. You cannot do that. There used to be an industry rule of thumb that your primary keyword shouldn’t exceed 3-5% of your total word count for any given page. That’s dropped now to closer to 0.5-1.5%; in a 1,000-word post, that’s roughly 5 to 15 uses of your main keyword. Stuffing keywords is black-hat SEO, and it will get you penalized. Believe me, you don’t want to be penalized by the Overlord Google. Besides, it just doesn’t read well! I don’t want to read a post with the same wording over and over and over again. It doesn’t provide a good user experience. I’ll write about penalties at some point, but don’t risk it.
Ultimately, your site should be built for users, not for rankings. If it’s useful information presented in a clear way and users like it, chances are a search engine will like it too.
On Wednesdays we wear pink. On Saturdays, we talk about SEO for Sex Bloggers! Follow me on Twitter for the latest updates.