As the 21st century marches onward and things like “smart devices” and “drones” and “3D printers” become the latest buzz, one truth remains in the digital age: Google has retained its steadfast grip (read: complete monopoly) on all things search.
Despite initial scares from Facebook’s widely announced Open Graph technology, and Microsoft’s desperate attempts at challenging the search king with its re-branding of Bing.com (and now Windows 10), the dust keeps settling and owners of Google stock are still laughing their way to the bank. Why? Because despite new AI assistants such as IBM’s Watson, Apple’s Siri, and Microsoft’s Cortana, Google’s dominance is not letting up, even as tech bloggers continue to theorize about how Windows 10 and/or Cortana’s AI search “should” be eating into Google’s market share right about now…
“Never assume the obvious is true.” ― William Safire
In short, if you are a small business owner who relies (at least in part) on Google for sales or new leads, it is simply imperative that the key pages on your website are quickly and properly indexed in Google results.
Now, this sounds so basic that many readers may not even know what I’m talking about. So, let’s back up and imagine for a second that you just launched a new website. How are you going to get it listed in Google? Most webmasters just sit and wait, hoping the internet gods show them favor and Google somehow discovers their new site. In most cases, Google does indeed eventually find your website (one way or another). But let’s take it a step further: what if you just launched a WooCommerce store in September that sells Halloween costumes (for next month’s holiday), or a line of niche health products that were mentioned on CNN last week and are starting to go viral online? Oh no, what to do? How are you going to get those new products indexed in Google search results ASAP?
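Before reaching for any tricks, the one baseline step every webmaster controls is publishing an XML sitemap so crawlers at least know which URLs exist. Here is a minimal sketch in Python using only the standard library; the store domain and product pages are hypothetical:

```python
# Build a minimal XML sitemap (per the sitemaps.org protocol) for a
# handful of hypothetical product pages on a brand-new store.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Return sitemap XML (bytes) listing each URL in a <url><loc> entry."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
    return tostring(urlset, encoding="utf-8", xml_declaration=True)

pages = [
    "https://example-store.com/",                      # hypothetical domain
    "https://example-store.com/halloween-costumes/",
    "https://example-store.com/product/witch-hat/",
]
xml = build_sitemap(pages)
print(xml.decode("utf-8"))
```

Save the output as `sitemap.xml` at your site root (most platforms, WooCommerce included, generate this for you via an SEO plugin) so it is there for crawlers whenever they arrive.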
You have a few different options for getting indexed in Google, but read through to the end:
1. High PR Backlinks. In the earlier days of SEO, say about 5-10 years ago, a powerful trick that many gurus used was to place a link to that “new” website or page on a very reputable or high PageRank website, usually one of their other established web properties. Not only would that “new” page get indexed very quickly, but it would also immediately rank highly for whatever keyword was used in the hyperlink anchor text. This still works extremely well; however, after Google’s infamous Panda and Penguin updates, which aimed to cut down on unnatural linking methods, it’s important not to interlink unrelated websites for more than a brief period of time, or to use overly spammy keywords as the anchor. (And in any regard, there is strong consensus in the SEO community that PageRank is largely being replaced by “authority” metrics.)
2. Social Bookmarking. As the Web 2.0 era began to calm down, and wannabe social media networks began dying off in droves, Google’s algorithm took notice. Specifically, around five years ago, Google’s bots began selectively crawling certain social communities and “bookmarking” websites that it had manually deemed to be of good quality, with top-notch HUMAN moderation systems. This is still true today (especially because Google is rumored to lack any insight into sharing data from Facebook, the largest social network in the world), so getting a link submitted to reputable social bookmarking sites such as Hacker News, Reddit, Digg, or a few other publicly accessible, high-reputation aggregators will result in being immediately indexed into Google, and usually has the side benefit of giving that page a boost in reputation as well. (Strangely, despite it being completely ineffective and in fact harmful, thousands of webmasters continue to spam low-quality “bookmarking” sites.)
3. Indexing Services. Okay then, but what happens if you don’t have access to a high-authority/PR website, or aren’t able to submit a link to a popular bookmarking site, for whatever reason? Or maybe you have 100+ links that you need to get indexed; what then? This is where “indexing” services began taking over a few years back, as Google’s algorithm got smarter about ignoring most bookmarking and low-quality “social” websites. In other words, jumping on Fiverr.com and ordering 5,000 “bookmarks” no longer helped you get indexed per se; if anything, this sort of approach began penalizing your web page and overall domain reputation. In their place (or sometimes alongside them) popped up services like Lindexed, Linkalicious, etc. that offer to get your links (even thousands of them) properly indexed into Google within a relatively short period of time. These nearly pointless services take the attitude of, “Google doesn’t trust our spammy bookmarks, so we will turn them into a spammy RSS feed instead, and/or combine them with High PR Backlinks, and see if that works.” This isn’t necessarily a bad approach, as it does work a lot of the time (for now), but thankfully, there is still a better option…
4. Submit Directly To Google. Holy… what in the name is this? Yes, you read correctly! It turns out that while millions of people around the world have spent the past decade trying to figure out how to trick Google, Google actually created a free tool by which anyone, anywhere, can submit a link to the Google index. “Surely, this must be some sort of black magic or entrapment by Google,” you are probably thinking. Wrong! You see, put yourself in Google’s shoes: this internet thing that you are trying so hard to crawl and organize is being overwhelmed by spam and/or SEO scams being sold to small business owners. What do you do? You make an easily accessible tool whereby anyone can suggest a page that your robots should index, and if it’s a high-quality enough page, you add it to your index immediately. Don’t believe me? Try it now:
Keep in mind that this is different from submitting a “sitemap” to Google Webmaster Tools; it also appears to be different from the older tool where you can “fetch as Google” and then “submit” a page to Google’s index. Those two approaches, from everything I’ve witnessed, are largely the same, as they merely “ping” Google to request that robots re-visit your domain, but don’t necessarily add the specific page to Google’s index (and even Google’s own Help pages seem to vaguely suggest that the submit tool is different from the re-crawl tool). In contrast, the rather secretive tool linked above adds new pages to Google almost immediately, literally within seconds in many cases where the domain is already trusted (such as your company’s new Yelp profile). It also works amazingly well for instances where you have updated the title, description, or content of a web page and want Google’s search results to reflect those changes; again, such updates are often visible in search results within SECONDS of submission. For one recent SEO client of mine, I was able to publish a new page on his website (for a service he offers), submit it to Google, and rank #1 in search results for that topic, all within 30 minutes.
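The sitemap “ping” style of request mentioned above can also be automated. A hedged sketch, assuming the classic google.com/ping endpoint that Google has long documented for sitemap resubmission; note this is a re-crawl request, not the submit-URL tool, and it will not force a page into the index:

```python
# Build (and optionally fire) a Google sitemap "ping" -- the programmatic
# equivalent of asking Google's robots to re-visit your sitemap. This is
# the re-crawl style of request, NOT the submit-URL form, which has no
# public API. The sitemap URL below is hypothetical.
from urllib.parse import urlencode
from urllib.request import urlopen

def sitemap_ping_url(sitemap_url: str) -> str:
    """Return the GET URL that asks Google to re-read a sitemap."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

def ping_google(sitemap_url: str) -> int:
    """Fire the ping; returns the HTTP status code (200 = received)."""
    with urlopen(sitemap_ping_url(sitemap_url), timeout=10) as resp:
        return resp.status

# Uncomment to actually send the request:
# ping_google("https://example-store.com/sitemap.xml")
print(sitemap_ping_url("https://example-store.com/sitemap.xml"))
```

A 200 response only means the ping was received; whether and when the pages show up in results is still Google’s call, which is exactly why the submit tool above is the more interesting option.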
Now stop reading, go write a killer new blog post, and try out this tool ;)