
Interview with the SEO Scientist Branko Rihtman

For this Blogvember feature, I had the pleasure of interviewing Branko Rihtman.

The answers are best read in order, but we’ve also added a list so you can jump straight to each one:

  1. Are rankings just as important as ever? What do you typically report on for a client?
  2. How much rigour do you apply to proving that changes were a result of your efforts?
  3. What approach or steps do you normally take during a site audit?
  4. What kind of relevancy signals are you keen to attach to your links?
  5. Have you got any experiments planned for the near future you can share with us?
  6. Could the online marketing community benefit from taking a more scientific approach to what it does?
  7. Are you likely to move away from SEO once you’ve completed your Marine Biology PhD?



Q1: I’ve seen people this week stating they’ll drop rankings as a KPI altogether in the near future, whilst others proclaim they’re more important than ever due to the [not provided] situation. I know you don’t consider them the ‘only’ KPI, but what’s your take on the above?

Some of our team also had an interesting round-robin style email exchange with a bunch of industry associates recently on what they like to include most in clients’ reports (assuming they have budget and no help in-house). Are you able to share your typical checklist, or do you look to tailor it every time?


A1: I don’t think rankings are more important than ever. They were always important, and [not provided] has nothing to do with it. At the moment, a lot of SEOs out there are claiming that rankings are not, and never were, important, since they are not what SEOs should provide their clients with anyway; SEOs should provide their clients with increased organic traffic, which in turn increases conversions. However, that statement is a red herring. It argues against a false premise: that SEOs, as an industry, report rankings as a standalone KPI. In all my years as an SEO, I have not encountered many serious SEOs who use the “we’ve improved your rankings for XXXX keywords last month” spiel to impress their clients. I was going through the old reports I was sending to clients back in 2003-4, and not a single ranking change was reported without connecting it to a change in traffic or conversions. We even had a special section of our SEO reports in which we outlined keywords that achieved top rankings but provided no significant converting traffic, purely to show the client what the true KPI is and that some keywords are not worth actively pursuing.

Without rankings, we are not able to attribute KPI improvements to our work. If you ran a content outreach campaign while, at the same time, searches for your brand spiked due to some offline campaign, then without rankings you wouldn’t know whether your work contributed to that increased visibility. Over time, Google has introduced a number of keyword-specific penalties; without reporting ranking drops, we would have no idea whether we were outranked by a competitor, pushing us to the second page, or thrown down to page #5 due to some penalty. Furthermore, scraping the SERPs and following the progress of your competitors was always a reliable way to assess your niche and the efforts invested by the other side. If you are not checking rankings, you do not know that your competitor has conducted a content outreach effort which has propelled him from the bottom of the 4th page to the top of the 2nd. Wouldn’t you want to know what made that campaign successful? Without rankings, you wouldn’t even know to ask that question.

Regarding the client reports, I pretty much tailor them every time. The majority of reports I have written lately have been site audits rather than SEO work reports, so I am a bit rusty on that front, but generally speaking, I like to include the following things:

  1. Changes in monthly organic traffic
  2. Changes in traffic from other sources
  3. Conversions from organic traffic, per landing page and per keyword
  4. List of all the SEO activities performed (on-page, backlinks, content promotion)
  5. Changes in backlink numbers/numbers of linking domains
  6. Reports on ranking movements of main competitors

Other than that, it is specific to each client, depending on their needs and their niche.




Q2: It makes sense that ranking changes have a causative effect on traffic and conversions, so how much scientific rigour do you apply to proving that these changes were a result of your efforts?


A2: I don’t always prove that the ranking changes are solely responsible for the increase in traffic and conversions. At some point you have to assume that the chances of increased rankings being solely responsible for an organic traffic increase are high enough to draw that conclusion. When we had the keyword data, we could increase that probability by looking at the differential increase of organic traffic coming from keywords for which rankings had indeed increased vs. those which remained relatively static. Now what we are left with is looking at the landing page level and trying to discount potential increases from other sources of traffic.
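To illustrate that differential comparison, here is a minimal pandas sketch. It assumes a hypothetical keyword-level CSV export with invented file and column names; in a post-[not provided] world you would run the same comparison at the landing page level instead.

```python
import pandas as pd

# Hypothetical export: one row per keyword, with average rank and organic
# sessions for two consecutive periods. File and column names are assumptions.
df = pd.read_csv("keyword_data.csv")
# columns: keyword, rank_before, rank_after, sessions_before, sessions_after

df["rank_delta"] = df["rank_before"] - df["rank_after"]    # positive = improved
df["traffic_delta"] = df["sessions_after"] - df["sessions_before"]

improved = df[df["rank_delta"] >= 3]      # rankings moved up noticeably
static = df[df["rank_delta"].abs() < 3]   # rankings roughly unchanged

# If traffic grew mainly where rankings improved, the SEO work is the more
# probable cause; if static keywords grew just as much, look for other
# explanations (seasonality, brand campaigns, etc.).
print("Improved keywords, mean traffic change:", improved["traffic_delta"].mean())
print("Static keywords, mean traffic change:", static["traffic_delta"].mean())
```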

I save the proving for the experiments I do, in which it is important to make sure that there is a causal relationship between the change I make on the page/links and the ranking changes. I have presented at SMX Advanced about introducing the scientific method into SEO experiments, and there I talked about multidirectional experiments (slide 7), in which we try to reverse the effect of the change by reverting to the previous state. As I try to mention at every opportunity, just like in science, we can never prove things definitively, only increase the probability that the explanation we have for a certain observation is indeed the correct one.




Q3: OK, so going back to the audits you’re doing, we tend to find that whilst the on-site technical audits are pretty logical and consistent, we almost have to take a slightly different approach each time for the off-site analysis in order to determine what the exact problems are (assuming it’s not spelled out) or what the quickest wins will be.

Do you find this too and can you outline the approach or steps you normally take?


A3: Yes, definitely. There is a limited variety of recommendations in on-site audits, and they are usually limited to technical issues (crawlability, page loading speed, canonicalization of URLs, etc.) and keyword targeting issues (grouping of keywords, topical hierarchy, page interlinking, etc.). There are some cases in which more fundamental changes need to be implemented, but those cases are slowly disappearing as search engines get better at indexing many different kinds of content.

As for off-site analyses, they usually have a constant and a variable component. The constant component has to do with the way I look at backlink profiles: I usually look at the spread of sitewide links, how many domains provide 90% of the links, how many keyword anchors are represented in those 90% of links, how many branded anchors the link profile has, what the spread of anchors per linking domain is, which on-page elements contain the links, etc.
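As a concrete illustration of that constant component, the sketch below computes two of those numbers (how many domains provide 90% of the links, and the anchor spread within them) from a flat backlink export such as you might download from Majestic or Ahrefs. The file and column names are assumptions.

```python
import pandas as pd

# Hypothetical flat backlink export, one row per link.
links = pd.read_csv("backlinks.csv")  # columns: source_domain, anchor_text

# How many domains provide 90% of all links?
per_domain = links["source_domain"].value_counts()        # descending by count
cumulative_share = per_domain.cumsum() / len(links)
domains_for_90pct = int((cumulative_share < 0.90).sum()) + 1
print(f"{domains_for_90pct} of {per_domain.size} domains provide 90% of links")

# Anchor spread within those top domains
top = links[links["source_domain"].isin(per_domain.index[:domains_for_90pct])]
print("Distinct anchors in that 90%:", top["anchor_text"].nunique())
print("Mean anchors per linking domain:",
      round(top.groupby("source_domain")["anchor_text"].nunique().mean(), 2))
```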

The variable component of the analysis comes into play when looking at the ecosystem in which the site operates. There are some niches in which implementing the Google Webmaster Guidelines word for word would result in SERPs with 0 results. In such niches, it is important to balance the need for a natural link profile against the volumes and quality of links that competitors are achieving in order to rank. Therefore, having 80% sitewide links (for example) in a niche in which all the competitors have 99% of their links coming from 2-3 domains is a less alarming situation than an 80% sitewide link profile in which the majority of links come from a large percentage of the total linking domains. The same holds true for the anchor text profile – it is hugely important to balance keyword targeting against a natural profile. I also like to look at the traffic and social media parameters of linking URLs and add that layer of quality analysis – links that provide traffic and are shared on social media worry me less than links that show no signs of organic behaviour.

A similar process applies when implementing link building campaigns – risk mitigation is balanced against potential value, and social media data is invaluable when looking for organic, future-proof links. Identifying the interested users who have the best share-to-link ratio and placing your content in front of their eyes is a great way to build links for which, at the end of the day, you may be less interested in Google’s opinion. On the other hand, you still want to provide relevancy signals with links which are acquired in less organic ways, so you operate in a space defined by the perceived “naturalness” of your link profile as compared to the competing sites in your niche.




Q4: I agree with those areas, and it’s that ‘variable component’ where a knowledgeable SEO is really valuable to a business, given that tools like Screaming Frog and WooRank can automate much of the on-site checks for you, and checking the off-site constants is relatively easy using Majestic or Ahrefs with a little time; you really can’t shortcut that variable area, and you have to have historic knowledge and a deeper understanding of SEO to assess it correctly.

Can you give us an idea of the kind of relevancy signals you are keen to attach to your links just now and whether you think Hummingbird will change them?


A4: There are two main types of signals I look at when evaluating links:

1. Trust – in addition to the obvious estimation of the value of the link, this signal also indicates susceptibility to future penalties. Low trust links, even if they are contributing at the moment, will be targets of future algorithm updates, so it is desirable to move your backlink profile towards higher trust sources. Obviously, we can only guess how trust is assigned to links, but I try to look at the following parameters:

  1. Traffic data – how much traffic a link is sending. Obviously this can only be estimated after the link is placed; however, it can be an important parameter when thinking of disavowing links after a penalty.
  2. Social data – a link from a page that has social media activity attached to it will probably be more trusted than a link from a page which is not shared
  3. On-page activity – whether there are any crawlable comments on the page, a signal which can mean that the page is live and trustworthy, due to the interest it raises with visitors
  4. Backlink acquisition activity – whether the linking page acquires links over time
  5. Ranking data – whether the linking page is ranking for some terms with traffic potential. The fact that Google ranks a page in its SERPs could be a good proxy for the level of trust it has

2. Relevancy – this is important for deciding which keyword rankings the links will influence. There is the immediate signal of whether a desired keyword (or a derivative of it – cars vs. automobiles, for example) appears in the page copy or meta tags, but we can look past that. For example, we can look at the backlinks of the linking page, both by analyzing the anchor profile and by estimating the keyword relevancy of the linking domains. This last bit can be done by getting a list of linking domains, creating a custom Google Search Engine based on those domains, and searching for the desired keywords to find the percentage of linking domains that are deemed relevant to each keyword.
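As a sketch of that last step, the snippet below uses Google’s Custom Search JSON API rather than the manual CSE interface. The API key, engine ID, domain list, and the crude relevancy test (any result at all for the keyword restricted to that domain) are all illustrative assumptions, not part of the workflow described above.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: Google API key
CSE_ID = "YOUR_CSE_ID"    # placeholder: a custom engine configured to search the web

def domain_mentions_keyword(domain: str, keyword: str) -> bool:
    """Crude relevancy test: does a search for `keyword`, restricted to
    `domain`, return any results at all?"""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CSE_ID, "q": keyword, "siteSearch": domain},
        timeout=10,
    )
    resp.raise_for_status()
    total = resp.json().get("searchInformation", {}).get("totalResults", "0")
    return int(total) > 0

linking_domains = ["example-blog.com", "example-news.com"]  # hypothetical list
keyword = "cars"
relevant = sum(domain_mentions_keyword(d, keyword) for d in linking_domains)
print(f"{relevant}/{len(linking_domains)} linking domains look relevant to '{keyword}'")
```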

You can also look at the social relevancy of the people who have shared the potential linking page. There are tools out there, like Tweetcloud, that will allow you to create word clouds based on the Twitter feeds of users who are sharing a certain URL (data which can be mined from Topsy).

Regarding the response to Hummingbird, I am not sure how it is supposed to influence link building. It is an algorithm which is supposed to help Google interpret a larger number of queries in a way that will help it return meaningful results. So it is about understanding intent. I can see how it could possibly widen the definition of what is considered relevant to a search keyphrase and thus potentially increase the available pool of sites relevant for link outreach campaigns, but we don’t know much about it – not how it works, nor at what level. Does it take a previously incomprehensible query and match it to a whole list of popular and relevant queries, returning a larger variety of sites? Does it only impact the variety and richness of SERPs, or does that relevancy also influence the link graph? We just don’t know, so there needs to be a lot of testing in order to better understand how it impacts link building.




Q5: Yes, it’s going to be a while until we have anything concrete to work from. I know you’re a fan of running experiments; have you got anything planned for the near future you can share with us?


A5: Yes, definitely. I am planning some new things as well as revisiting some old ones. I have really started to dig into alternative sources of data, primarily to check exactly how reliable GWT information is. There was an excellent post by Yehoshua Coren, a guy with amazing Google Analytics skills, on exactly how different the data in GWT is compared to what is available to AdWords advertisers; however, there was little analysis of the quality of the data provided in GWT, and I will try to see if I can put some numbers on that. Additionally, I am quite interested in analysing how the canonical tag behaves: whether there is complete link equity and relevance transfer or whether there is dampening, how similar its behaviour is to a 301 redirect in terms of link equity transfer, what the minimal content duplication requirements between the source and target of the canonical tag are, etc.

From my old research, I am planning on revisiting the backlink data provider analysis. There have been some major advances in the capabilities of each of the tools analysed, and I would be interested in comparing the outcome of a similar analysis today to the original results.

Another piece of work that I would like to revisit is the research I did on using social media analysis for link building purposes. I presented that strategy at LinkLove London 2012. Firstly, I am sure a lot of the APIs have changed, and that needs updating. Secondly, there are some new tools out there that could add new data layers to the analysis. And finally, while I have used bits and pieces of the strategy to complement more traditional link building campaigns, I have never used it as the only strategy on a single website, and I would like to try that.




Q6: Do you feel like the online marketing community as a whole should or could benefit from taking a more scientific approach to what they do or do you see your approach as being the Yang to the creative Yin?


A6: Hm. That is a loaded question. As a general rule, I think the world should take a more scientific approach to a lot more things. Way too many times we are swayed by manipulation and interests rather than pure data and truthful analysis.

That said, the last thing I wish to see is creativity taken out of the marketing process. As a matter of fact, a marketing process without creativity will simply fail, regardless of how great the data analysis being fed into it is. Neither approach can work well on its own merits. Creativity, supported by data and kept in check by a scientific, unbiased approach, is, in my opinion, a very solid way of going about search marketing.




Q7: Are you likely to abandon or move away from SEO once you’ve completed your Marine Biology PhD, or will you keep doing SEO? And even if you pursue a career in biology, do you think your deep understanding of search will somehow play a role in your future?


A7: I have stopped “deciding” what I am going to do in the future, at least when it comes to my career. Biology has been my passion since I was about 5 years old. I fell into SEO by chance and loved it. Actually, just after I finished my MSc in Biology, I was very close to switching to a full-time SEO career. It sure as hell is more lucrative, and I got to meet some amazing, gifted and caring people in the industry. Then came the PhD offer and I was back in the lab, where I realised how much I had missed it.

My ideal career would have both full-time SEO and microbiology in it. Unfortunately, there are only 24 hours in the day, and some smartass thought it would be a good idea to waste some of those on being unconscious every night, so I am aware that at different points in life one of those callings will be put on the back burner for a period of time. I will, however, do my best to keep my feet in both pools, as I find both extremely intellectually challenging and consider myself incredibly lucky to have not one but two amazing professions which fulfil me.




Thank you very much for your time, I hope our readers enjoy the interview as much as I did.

You can catch up with Branko on Twitter or Google+.


About Branko Rihtman
Branko has been an active SEO consultant since 2001. In the course of his SEO career, he has consulted for companies in niches ranging from online poker and hotels to large eCommerce sites and avionics, as well as numerous small and mid-sized businesses. Over time, he has developed an analytical approach to SEO, providing him with novel insights in an otherwise purely marketing-driven niche. He has conducted a number of highly praised and widely quoted SEO research studies, which have helped provide competitive advantages to both clients and SEO strategists.

Over the years he has spoken at, moderated, and helped organize some of the leading online marketing conferences, such as SMX Advanced, Distilled LinkLove, NOAH Conference, MIT Forum, and Affilicon. He currently resides in the UK where he is pursuing his PhD in Environmental Microbiology at the University of Warwick. He sometimes blogs at http://www.seo-scientist.com.
