Monthly Archives: May 2017

How Faceted Navigation Can Impact SEO

If you want to give your prospects the best possible experience, you likely use a faceted navigation system so that visitors can browse your website with ease. The problem is that this layout can harm your search engine rankings more than you might think. Unless you take steps to address the issue, you will get less organic traffic than before, hurting your bottom line. The good news is that you can keep your website easy to navigate without damaging your position in the search engines.

Faceted Navigation Overview

Before you do anything else, let’s review what a faceted navigation system looks like so we have a clear picture of the issue at hand. The layout lets visitors find the exact product they want by filtering and narrowing results. The catch is that each filter combination gets its own URL, and those near-identical pages can lead Google to flag your website for duplicate content. A faceted navigation system also spreads your domain authority across pages that you don’t need indexed, wasting your SEO potential.
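
To see how quickly those URLs pile up, here is a minimal sketch assuming a hypothetical category page with three filter groups; every optional selection multiplies the number of crawlable URL variants.

```typescript
// Hypothetical facets for one category page; the names and values are made up.
const facets: Record<string, string[]> = {
  color: ["red", "blue", "green"],
  size: ["s", "m", "l"],
  brand: ["acme", "globex"],
};

// Each facet contributes (number of values + 1) choices, the "+ 1" being "no selection".
function countFacetUrls(filters: Record<string, string[]>): number {
  return Object.values(filters).reduce((total, values) => total * (values.length + 1), 1);
}

console.log(countFacetUrls(facets)); // 4 * 4 * 3 = 48 URL variants of a single category page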

Noindex Tags

Those who want to reduce the harm of a faceted navigation system often turn to noindex tags to get the job done. On the surface, this solution seems to make a lot of sense, but you need to look closely to get a clear picture. Using noindex tags will prevent Google from flagging you for duplicate content, but it won’t stop those extra URLs from consuming your crawl budget. If the search engines spend too much time crawling links that don’t matter, the pages you actually want ranked may be crawled less often.
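
As a rough illustration, here is a sketch of how you might confirm that a filtered URL carries a noindex directive, either in an X-Robots-Tag response header or in the meta robots tag; the URL and the simple regex are illustrative assumptions, not a production crawler.

```typescript
// Check one filtered URL for a noindex directive. The URL is a hypothetical example.
async function hasNoindex(url: string): Promise<boolean> {
  const response = await fetch(url);
  const headerDirective = response.headers.get("x-robots-tag") ?? "";
  const html = await response.text();
  const metaMatch = /<meta[^>]+name=["']robots["'][^>]+content=["']([^"']+)["']/i.exec(html);
  const metaDirective = metaMatch ? metaMatch[1] : "";
  return /noindex/i.test(headerDirective) || /noindex/i.test(metaDirective);
}

hasNoindex("https://example.com/shoes?color=red&size=10").then(console.log);
```

Note that even when this check passes, the crawler still has to fetch the page to see the tag, which is why noindex alone does not protect your crawl budget.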

Nofollow Tags

Nofollow tags are another option you can try when your goal is to protect your rankings. These tags tell the search engines not to follow the links to your faceted pages, which helps keep them from wasting your crawl budget. But this solution is not without its setbacks: it won’t stop Google from picking up duplicate content on your site, which still puts you at risk of a penalty.
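
For illustration, a minimal sketch of emitting facet links with a rel="nofollow" attribute; the paths and labels are made-up examples. As noted above, this only discourages crawling of those links and does nothing about the duplicate pages themselves.

```typescript
// Render a facet link with rel="nofollow" so crawlers are discouraged from following it.
function facetLink(href: string, label: string): string {
  return `<a href="${href}" rel="nofollow">${label}</a>`;
}

console.log(facetLink("/shoes?color=red&size=10", "Red, size 10"));
// -> <a href="/shoes?color=red&size=10" rel="nofollow">Red, size 10</a>
```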

JavaScript

If you are building a new website, you can implement your faceted navigation with JavaScript to avoid these pitfalls. Doing so gives you the same filtering experience without putting your rankings in harm’s way. With JavaScript, you can serve every filter combination from the same URL, so you don’t need to worry about wasting crawl budget or domain authority on pages that don’t matter. When you use this method, you will need to make sure that Google and the other search engines can still discover and index your key pages, or they may be overlooked.
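
Here is a minimal sketch of what client-side faceting can look like, assuming hypothetical data-color and data-size attributes on the product cards: the filters hide and show products in place, so every combination is served from the single category URL that search engines index.

```typescript
// Client-side faceting: filters update the visible product list in place,
// without generating new URLs. Element IDs and data attributes are hypothetical.
type Filters = { color?: string; size?: string };

function applyFilters(filters: Filters): void {
  document.querySelectorAll<HTMLElement>("#products .product").forEach((card) => {
    const matches =
      (!filters.color || card.dataset.color === filters.color) &&
      (!filters.size || card.dataset.size === filters.size);
    card.style.display = matches ? "" : "none";
  });
}

// Example: show only red products in size 10 without navigating to a new URL.
applyFilters({ color: "red", size: "10" });
```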

Final Thoughts

Using faceted navigation without running into SEO problems is not impossible, but you will need to weigh your options and decide what matters most to you. Each path forces you to give up benefits elsewhere, and making the wrong choice can hurt your progress. If you are not sure which solution makes the most sense, review your website and ask yourself whether you most need to protect your crawl budget, your domain authority, or your time, and the right answer will become apparent. If you don’t get the outcome you hoped for, you can always try another approach until you find one that meets your needs.


Why Do People Misunderstand XML Sitemaps?


XML Sitemaps are one of the many search engine optimization tools available. However, many people fail to understand what the tool can actually achieve, and another group has hardly any idea of what XML Sitemaps entail. Like any other tool, it takes some learning before you come to terms with what sitemaps are capable of.

A lot of people think that it is the XML Sitemap that gets your web pages indexed. This is not the case. Google only indexes web pages that its crawlers have had the chance to crawl, and indexing only happens once the crawled pages are found to contain meaningful content. Working from this misconception, many people have submitted their pages to Google expecting automatic indexation, only to be disappointed, since Google approves pages based on quality rather than quantity. Indexation, in turn, is what helps other people find your content quickly.

Sometimes, when people submit their pages to Google, they produce quality content but forget to update their meta robots settings. In such cases they create a contradiction and end up not getting indexed because of the inconsistency. Setting your meta robots tag to noindex, nofollow tells Google not to index your page, so you should avoid doing this on any page you submit.
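
As a rough sketch, you could scan the HTML of each page you plan to submit and flag any meta robots tag that contradicts the submission; the markup string below is a hypothetical example.

```typescript
// Flag a page whose meta robots tag contradicts its inclusion in a sitemap.
// The markup string is a hypothetical example.
const pageHtml = `<head><meta name="robots" content="noindex, nofollow"></head>`;

const match = /<meta[^>]+name=["']robots["'][^>]+content=["']([^"']+)["']/i.exec(pageHtml);
const robotsDirective = match ? match[1] : "";

if (/noindex/i.test(robotsDirective)) {
  console.warn("Meta robots says noindex; listing this page in a sitemap is inconsistent.");
}
```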

Google usually indexes pages based on the type of content found on them. If your content is high quality and speaks directly to users, Google is inclined to index your pages. If you produce ten pages of great content and eleven pages of poor-quality text, Google has little incentive to index you, since it sees no value in sending users to your site.

An XML Sitemap is, therefore, a tool that helps you discover which of your pages Google has indexed and which it has not. In this way, you can improve the content on the pages that have not yet been indexed. Low-ranking pages end up at the bottom of the results. For the best outcome, use meta robots or robots.txt to block the low-value pages you never want indexed, so they don’t muddy the picture. It is only after cleaning up your pages in this way that you will start seeing significant changes.
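
For reference, a sitemap is simply an XML file listing the URLs you want considered; here is a minimal sketch that builds one from a hypothetical list of paths.

```typescript
// Build a basic XML sitemap from a list of paths. The domain and paths are hypothetical.
const pages = ["/", "/services/", "/blog/faceted-navigation-and-seo/"];

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  pages.map((path) => `  <url><loc>https://example.com${path}</loc></url>\n`).join("") +
  `</urlset>\n`;

console.log(sitemap);
```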

When you submit particular content to Google, you are hinting that this information is important to your site, and Google can then index it. For the best results, sub-group your pages so that Google finds it easier to index them; in doing so, you also learn which groups of pages contain quality work and which need an immediate review. One more thing worth noting is that your file does not need an .xml extension in order to be submitted to Google.
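
One common way to sub-group pages is to keep a separate sitemap per content section and tie them together with a sitemap index file, so indexation can be tracked per group; the file names and domain below are hypothetical.

```typescript
// A sitemap index referencing one sitemap per content section.
const sectionSitemaps = ["sitemap-products.xml", "sitemap-blog.xml", "sitemap-pages.xml"];

const sitemapIndex =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  sectionSitemaps.map((file) => `  <sitemap><loc>https://example.com/${file}</loc></sitemap>\n`).join("") +
  `</sitemapindex>\n`;

console.log(sitemapIndex);
```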

The best outcomes, then, call for consistent content, the ability to use XML Sitemaps efficiently, and the capacity to sub-group content for easy indexing. Hopefully this article helps you avoid mistaking XML Sitemaps for something they are not.


Link Building And Analysis


Every business operating online must make sure it has taken steps to use the right links and link-building program. The program itself is quite easy to use, and it can be deployed by anyone attempting to widen their Internet presence. This article explains the steps someone can take to change their overall brand image and looks at how links can help.

#1: Links Are Built Slowly

Links must be built slowly with a number of different partners, and each partner must be given the opportunity to carry links for the company for some time. Many people think they can start a link program today and see instant results, but that is not the case. Links take time, and those who maintain longstanding link programs see better results than those chasing quick wins.

#2: Links Must Have Variety

A link profile will be flagged as spam if it lacks variety. Posting the same thing time and again signals to the search engines that something may be amiss: there is nothing wrong with the links themselves, but they look untoward, and that is a difficult problem to undo. Everyone running a link program should build in as much variety as possible, and they will see better results with fewer red flags.

#3: The Overall Partnership

Many partnerships can be started through a link program, and they all come back to simple linking arrangements for sharing information. A link partner places their links on another site while that site does the same in return, so the two share link space. They send customers back and forth as a way of extending goodwill, and over time the arrangement can build a partnership in which the two companies are thought of almost as one.

#4: SEO Content

Links must be written into SEO articles that have been produced properly. Someone unfamiliar with the SEO writing style must learn how to write around helpful keywords, and they must track the success of all the work they have done. The writer may revise their articles often to reflect which keywords should be targeted, and they may do so many times during the year as trends change.

#5: Links Must Be Subtle

Links must be as subtle as possible when placed online, and they are much easier to manage when they are dropped into articles without much fanfare. A link should catch the reader’s eye, but not be so heavy-handed that the reader decides not to click it. A link placed subtly in the right location will be far more effective.

Link-building programs are helpful to many people operating online. A business can start a new link program today, and it will help show how best to use its resources. It may partner with other companies that host its links, or it may place its links across the Internet wherever readers are likely to find them interesting.

From Shifting Rules To Managing Expectations: Common SEO Challenges


Change is the norm in the SEO business. The digital landscape is constantly evolving, and any marketing strategy needs flexibility ingrained in its very DNA to remain viable, let alone be successful. From the old days of the early internet, when content didn’t matter and spam was king, SEO has come a long way in the past 10-15 years. But it still faces considerable challenges at all levels, from managing daily operations to managing budgets and the expectations of clients.

Amidst constant change, one thing has remained a near constant: the challenges faced by SEO professionals. It doesn’t matter if we are talking about in-house SEO activities at an organization or SEO as a digital marketing agency service for hire. They all face the same kind of hurdles.

The central issue concerns search engines, especially Google, and their near-constant updates to search algorithms. It is not just the major updates like Panda or Penguin, which roll out every year or so, but the innumerable unnamed ones, major and minor, implemented almost daily. SEO is one tough game where the rules can change overnight, without warning. Something previously considered a legitimate white-hat tactic can turn into spam, literally overnight. Rare is the day when SEO professionals don’t have to spend more than 15 minutes in the morning just to stay abreast of everything that changed in the past 24 hours.

All this constant activity and change brings up the next big challenge: resource scarcity. Whether it is financial backing or total staff hours, almost every SEO campaign struggles to cover all the bases. The online marketing process alone can involve multiple platforms, from email to social media and websites. Combine that with the fact that SEO has to stay in sync with the organization’s other advertising and PR initiatives, and you reach a point where it feels like playing whack-a-mole with ever-shifting objectives on multiple fronts.

Staying in sync with organizational activities, budgets, and goals opens up the next big can of worms: managing the expectations of your client or boss and showing tangible ROI for the money and labor spent on a campaign. SEO is different from traditional marketing. It’s the new kid on the block, it is online, and everything online is supposed to be fast, which creates plenty of misconceptions about what the process involves and how quickly results can be achieved. Impatience from senior management and/or clients is a real issue for SEO professionals, because it can take months of constant effort before any tangible benefits start showing up in the SERPs.

Things have reached the point where even Google has taken the initiative to caution SEO clients against unrealistic expectations. Calculating ROI in the meantime, to keep those holding the purse strings happy, is another major challenge. Benefits, if any, often show up in revenue streams and transactional data, but it is uncommon for SEO teams to have direct access to this data. The only viable recourse is to try to gain access to that information from the organization’s CRM platform or accounting system.

In SEO, the challenges are many and constant. An effective SEO strategy should involve regular updates as well as audits, analytics, and tracking of multiple datasets across the organization. Keeping on top of all this takes considerable effort and resources, but it is the only way to make life easier in this business. Everything else is just a shortcut to even more pain and failure.


What Is So Complicated About SEO Redirects?


SEO redirects are a subject that eludes many experts. The reason is that there are a great many ways to carry out redirects, so it is nearly impossible for any one practitioner to master all of them, and the methods change depending on the technology in use. In principle, an SEO redirect is the process of transferring a visitor, and the signals attached to a URL, from one URL to another. Most of us have had the experience of clicking a result and landing on a different page that better covers the topic we were interested in; what we often fail to realize is that this is one of the ways redirects are handled.

Redirects consolidate signals, grouping the links pointing at one page and passing their value on to another. Because redirects preserve usability and link equity, it is vital that they are maintained into the future. How you implement a redirect largely depends on the technology you are using at the time: domain-level redirects are quite different from individual page redirects, and redirects can be placed at various points in the stack, which is what creates these variations.

As useful as redirects are, it is vital to know where to place them and when. It is bad practice to chain many redirects together, since this leads to overcrowding. You can set up redirects at the DNS, CDN, and server levels, and the process can also be handled in the HTTP header response or through language-based redirects.
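
As a simple illustration of a server-level redirect, here is a minimal sketch of a tiny Node server (with hypothetical paths) that answers requests for old URLs with a 301 status and a Location header pointing at the new address.

```typescript
// A tiny server-level redirect: old paths get a 301 and a Location header.
// The paths and port are hypothetical examples.
import { createServer } from "node:http";

const redirects: Record<string, string> = {
  "/old-services": "/services/",
  "/old-blog": "/blog/",
};

createServer((req, res) => {
  const target = redirects[req.url ?? ""];
  if (target) {
    res.writeHead(301, { Location: target }); // 301 = permanent move; use 302 for temporary
    res.end();
    return;
  }
  res.writeHead(404);
  res.end("Not found");
}).listen(8080);
```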

The procedure is more complicated for a business site than for smaller ventures. Since a business might have many pages, it is important to develop a methodology that lets you set up a large number of redirects without overcrowding them. Before creating your redirects, it is critical to understand where you intend to use them.

As you set up these redirects, make sure you do not block or noindex anything involved. Blocking of any kind prevents Google from crawling the content, which makes it impossible for your pages to be indexed. Remember, too, that redirects vary from one situation to another, and each scenario has its own set of rules. If you change your site from HTTP to HTTPS, you also need to update your redirect settings; keeping these settings current helps keep your redirects from breaking.

Beyond that, you need some basic knowledge of web development to integrate redirects into your sites; without it, you are prone to run into problems with your domains down the road, especially 404 errors. Using the right response codes is also imperative, such as a 301 for a permanent move and a 302 for a temporary one, because the code dictates the kind of signal your redirect sends.

Finally, recheck your redirects for errors. This is your opportunity to correct any broken links well before the redirects go live. Since there are many ways of doing redirects, it pays to learn at least one method thoroughly, as it will make the other methods easier to understand.
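
A quick recheck can be as simple as following each redirected URL and confirming the final destination does not return a 404; the URLs below are hypothetical examples.

```typescript
// Follow each redirected URL and flag any that end at a missing page.
const redirectedUrls = [
  "https://example.com/old-services",
  "https://example.com/old-blog",
];

async function recheckRedirects(): Promise<void> {
  for (const url of redirectedUrls) {
    const response = await fetch(url); // fetch follows redirect chains by default
    console.log(`${url} -> ${response.url} (${response.status})`);
    if (response.status === 404) {
      console.warn(`Broken redirect: ${url} ends at a missing page`);
    }
  }
}

recheckRedirects();
```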

Common Benefits of Listening to Social Media


Monitoring social media allows a business to be aware of everything being said about it online and to make use of all the information it finds in those comments. Through this, a business can improve its reputation and increase how often its content generates new leads. There are several other, more specific reasons why listening to social media can benefit a business, however.

It is commonly understood that your business needs to listen to what is being said about you on social media in order to improve your service quality. Your clientele can give you a picture of how your products are being used and how people regard them. With that picture, you can work on the things people say need improving, and once you do, you can expect better sales from marketing higher-quality products.


Another benefit of monitoring social media, one regarded as highly significant to any business’ short-term and long-term success, is that you can gain more customers based on what people say about you. Generally speaking, potential customers do not blindly buy on impulse. Instead, they look into whether a business has a reputation for serving its customers well and responding promptly to concerns raised on its social media profiles. By listening to social media, you can improve how your current stable of customers views you, which will lead to more customers in the future.

The last of what can be described as the three primary benefits of social media monitoring is to learn how to keep your customer service prompt and relevant to addressing legitimate problems and concerns your customers will likely have. You need to make yourself aware of issues with your products so that you can provide solutions to your customers’ problems quickly when they use social media to communicate with you directly regarding these problems. If you are always ready to serve your customer base at a moment’s notice, its continued enthusiasm about your service will reflect well on your company.

There are many other benefits to listening to what social media says about a company.

One that we’ve noted is comparing how well each of your different types of products and services performs relative to the others. With that information in mind, you can improve your overall content, which encourages people to browse your company directly online as well as share information about your products with each other.

Another is the fact that you can get an idea on how to become better than your competition – or at least differentiate yourself from competitors – by listening to people on social media as they comment on the service quality of other companies. You can essentially monitor your competition by reading how people on social media vocally regard the businesses selling similar products to yours.

Disavowing and Penguin 4.0


Google has released a new version of Penguin, known as Penguin 4.0, and according to some SEO practitioners it is fairer than previous versions. Now part of Google’s core algorithm, Penguin penalizes websites that use black-hat link schemes to manipulate search rankings. Unlike earlier versions, which would affect the whole site whenever the algorithm suspected manipulative link building, Penguin 4.0 devalues spam by adjusting the ranking of the offending page only. Previously, even cleaning up wouldn’t save a site until Google re-ran the Penguin algorithm and recognized the clean-up efforts. Whole-site suppression is no longer common, unlike before, when many businesses were suppressed. It is impressive that Google found a way to locate and devalue spam this precisely.

Many have wondered, and still wonder, whether there is any reason to disavow links anymore if Penguin can simply devalue spam. The disavow tool can still be used if you find that your site’s ranking is being hurt by low-quality links you cannot control: you ask Google not to take them into account when assessing your site by uploading a disavow file containing a list of the offending domains. Google employees still recommend disavowing, despite the fact that Penguin can devalue spam links and spare people the work of disowning them.
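
The disavow file itself is plain text: one domain or URL per line, with optional comments. Here is a minimal sketch that writes one from a hypothetical list of low-quality domains.

```typescript
// Build a disavow file with "domain:" lines and a comment. The domains are hypothetical.
import { writeFileSync } from "node:fs";

const lowQualityDomains = ["cheap-links.example", "spammy-directory.example"];

const disavowFile = [
  "# Low-quality domains we could not get removed",
  ...lowQualityDomains.map((domain) => `domain:${domain}`),
].join("\n");

writeFileSync("disavow.txt", disavowFile + "\n", "utf8");
```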

As much as disavowing is recommended, it is used less since the launch of the new Penguin. Even so, the following are some reasons why disavowing is still worthwhile:

The existence of manual actions – Leaving the whole task of devaluing your links to Penguin instead of disavowing exposes you to the risk of a manual action. That can bring a penalty you could have avoided by taking action and removing or disavowing the bad links yourself.

Other algorithms use links – Penguin is just one of Google’s defenses against link spam. When making its calculations, Google looks at all links apart from those that have been disavowed. Another algorithm said to use links is the Payday Loans algorithm, which affects sites in several high-competition verticals.

It is not guaranteed that Penguin will devalue all the spam – Although it is clear that Penguin 4.0 devalues spam rather than penalizing, many don’t understand how the algorithm decides which unnatural links pointing at a site to ignore. From statements by Google employees, there is no clear indication whether people can stop worrying about their links entirely or simply need to worry less than before.

The disavow tool is useful when an SEO company has built low-quality links such as article, directory, or bookmark-site links and they need to be cleaned up. If you are not sure whether a link is spam, it can still be added to the disavow list. Another situation is when a site has a history of manual action for unnatural links or appears to be under a negative SEO attack. However, you should not disavow a link merely because it looks strange; Google takes action against sites that have been continually manipulating its rankings, so it is the links built mostly for SEO reasons that should be disavowed to avoid a manual action.