

SEO Through Site Organization

Many bloggers seem to cram their posts into as many categories and tags as possible with the idea that it will generate more traffic by being "search-able" under each term. I'm here to tell you that what you've been doing has been hurting you more than helping you.


To understand this, you have to picture Googlebot as an easily confused three-year-old. Yep, I'm comparing the almighty Google, which I regard so highly, to a toddler. When Googlebot crawls your pages for indexing, it sees your site as a giant tree. Whenever it finds the same content in multiple categories, it's confused beyond belief and doesn't know where to index it. So what does someone do when they have too many options? They often choose nothing over something.

To fix this, I've compiled a few quick and easy steps to backtrack out of the rut you've landed yourself in.

First, organize each post under the single category that best describes it. This way, no matter how easily confused Googlebot is, it can only find each piece of content (post) in one category, leaving no room for Googlebot to guess at a category and get it wrong. I know this might be hard for those of you who have categories that overlap, or posts that don't fit under just one, but that has more to do with how you organized your categories than with the strategy itself. More on organizing your categories in a later post. Putting your post in more than one category is bad practice anyway, because it dilutes "link juice", which is roughly how much priority a link carries.
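To make this concrete, here's a sketch of what the site tree looks like either way. I'm assuming a typical /category/post-name/ permalink structure here; your URLs may differ.

```
Bad — post filed under two categories, so Googlebot finds two paths to it:
  /category/seo/site-organization/
  /category/blogging/site-organization/

Good — one category, one unambiguous path:
  /category/seo/site-organization/
```

With a single path, there is exactly one version of the post for Googlebot to index, and all of its link juice flows to that one URL.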

Second, summarize your posts on your main page, and make the dedicated page for each article the only place the full text appears. This goes back to the same reasoning: Googlebot is easily confused. If it sees your full post under a category as well as on the front page, it's gonna say "Uh-Oh" in its Teletubby voice and screw up the search-index placement for your article indefinitely. By summarizing your posts, you make sure the full article is listed only on its dedicated page, so a link from Google leads straight to that article when clicked, instead of to the main page, where your post has probably drowned to the bottom by now. If you need help with summarizing your posts, check out the tutorial I wrote on How To Summarize your Posts.
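If you're on WordPress (an assumption on my part; other platforms have their own excerpt features), the quickest way to do this is the built-in "more" tag, placed right in the post editor:

```
My opening paragraph. This is all that shows on the front page
and on the category pages.

<!--more-->

Everything after the tag appears only on the post's dedicated page,
so the full article lives in exactly one place.
```

The front page then shows just the teaser with a "read more" link pointing at the dedicated post URL.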

Third, use your robots.txt file to block Google from crawling duplicate content. For example, what if you use your categories for other clever schemes on your blog, such as having a category labeled Popular and having a widget display the "Most Popular" posts by pointing it at the "Popular" category? You can still do this, but you have to take an extra step to stop Googlebot from indexing those articles in both the Popular category and their original categories. The robots.txt file lives in the root of your site's directory and is basically a barbed-wire fence for our easily confused toddler, Mr. Googlebot. Any directory we list in robots.txt is off-limits to Googlebot, and its contents will be left out of the search index entirely.


Here's a quick tutorial on how to block Google from accessing a certain directory. I'll explain what each part means as simply as possible, so the average user can integrate this into their site right away.

If your robots.txt file is empty, copy and paste the code below into it. If something is already written there, skip a few lines before pasting it in.
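Here's a minimal example using the "Popular" category from earlier. I'm assuming it lives at /category/popular/ on your site; adjust the path to match your own setup.

```
User-agent: Googlebot
Disallow: /category/popular/
```

The User-agent line names which crawler the rule applies to (Googlebot here; a * would mean every crawler), and each Disallow line lists a path that crawler is not allowed to enter. Everything under /category/popular/ is now behind the fence, so those duplicate listings stay out of the index while the posts themselves remain indexed under their original categories.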

More resources:

Summarising Posts
Does Age Reflect Intelligence?
Adding Pictures to a Blog Post
10+ Resources For Increasing Blog Backlinks
How to Increase Traffic To a Website

