Search Engine Marketing Consulting Blog

Exclusive Interview with Googler: Kaspar Szymanski

January 20th, 2009 | Author: Cezary Lech

Who is the Googler?

Kaspar Szymanski is the author of several entries on the official Google blogs and works within Google’s Search Quality Team. Among other things, he announced the end of spammy directory pages, shared guidance on linking, and explained how to report spam.

In SEOs’ minds, Kaspar is responsible for the quality of the Polish SERPs. Here the Googler responds to a series of difficult questions.

This is a translation of an interview originally published on SprawnyMarketing.pl (a Polish SEM blog). There is also a German translation of the interview on SEOptiker.de.

- You go by the name Guglarz [“googler” in Polish] on the forum. Can you introduce yourself and give your full name and surname?

I thought that question might come up :-) My name is Kaspar Szymanski. I monitor the Polish Webmaster Help Group as Guglarz. On Google blogs, however, I support our users on a regular basis using my full name.

- What are your responsibilities at Google?

I work at Google within the Search Quality Team. I focus on preventing spam from coming up in the SERPs, so our users will not be affected by it. This is my main objective. However, communication with webmasters, particularly in Europe, is also an important part of my scope of work. You can read Udi Manber’s recent post if you want to learn more about our work. In terms of webmaster communication, I would like to share a few more details about my own and my colleagues’ efforts.

For about two years now, the Search Quality Team in Dublin has been reinforcing Google’s efforts to engage in a dialogue with webmasters in their native languages. Currently, my team and I support a number of communication channels in sixteen languages. I mentioned the Google Help Groups, which my colleagues and I monitor. We also run a Google Webmaster Blog in German and contribute to country blogs in Polish, English, French, Spanish, Italian, Russian and Turkish.

Recently, we have also started publishing educational videos for webmasters. My friends Rebecca and Alvar came up with the initial one in Spanish. So far, the feedback from the webmaster community has been very positive :-)

The part of my communication work I am really excited about is meeting webmasters personally at events like conferences or Google Days. On occasions like the SMX or SES conferences we learn a lot about what is currently on people’s minds and where we need to improve the information we provide to webmasters. I like this channel best because I enjoy talking to webmasters and having an instant interaction. Unfortunately, so far we have not had a chance to participate in an event in Poland. However, I am optimistic about catching up on this before the end of the year.

- How did you get into Google and which Google office do you work in?

About two and a half years ago, I was looking for a change, both job- and country-wise, after working outside Europe for a communication company for some time. I came across a job description that was both sort of cryptic and exciting. Once I applied, things went pretty smoothly: I quickly completed the recruitment process, and about a month after the initial contact Google gave me one of the coolest Christmas gifts ever, an offer to work at this unique organization’s European headquarters in Dublin :-)

- How do you feel in the role of the conqueror of the Polish directories, and at the same time as the author of the first Polish entry on the official Google Webmaster Central blog?

To answer your first question, I did not know I had that sort of fierce reputation! :-) I guess you are referring to one of my blog posts from last year. At that point in time we felt we had to communicate to the Polish webmaster community that site “enhancement”, for instance scraped, off-topic directories added randomly to content-light sites, is not considered a white hat SEO method by Google. Looking back, I have to say it was the right thing to do, since we were getting more and more user complaints about this particular method. Please don’t get me wrong: I think thematic, moderated directories can be a great way to generate added value for users. Unfortunately, some people abused good, existing ones like the DMOZ project to artificially increase the number of pages on their own or their clients’ sites. As a result, small but original sites abruptly had hundreds or even thousands of off-topic, scraped content pages. Users were not delighted to see these in the SERPs, and neither were the site owners who saw their sites temporarily removed for the use of this spammy technique. Eventually, Matt Cutts and the Search Quality Team decided that we had to make sure people knew about the impact site “enhancement” can have on a site. It was a team effort involving several people rather than a one-man action. So, in the end, I don’t see myself much as a directory conqueror.

Regarding your second question, I did not know I was the first one to write in Polish on the Google Webmaster Blog either :-) Thank you for letting me know that! And yes, in that case, I am delighted to hear this. I think reaching out to webmasters in their native language and talking to each other is very important for both parties, Google as much as the people building web sites, since we can only benefit from open communication.

- Are Google’s recently intensified efforts to remove websites of doubtful quality from the index also a form of fighting automated link exchange systems?

We have always been fighting black hat SEO techniques, no matter which ones you have in mind. As you might remember, on the group as much as on the Google Webmaster Blog, we have been communicating how Google views link schemes. Since, for Google, links represent a sort of vote for a site’s content, link exchange schemes, as much as buying and selling links that pass PageRank, are not within our guidelines. Of course, any webmaster is free to design his or her site the way he or she desires and link in any way he or she feels good about. But please keep in mind that it is also fair for Google to reserve the right to protect its users and to maintain the high quality of the index. Those webmasters who want their sites to perform well in the Google index might want to check their linking strategies against our guidelines. And in case you are not 100% sure, please join our Webmaster Help Group. There are plenty of savvy, smart people willing to help each other. And there is a Googler monitoring the group and throwing in his 2 cents once in a while ;-)

- Is Google going to hide the real number of indexed sub-pages, as it does with the “link:” operator? If so, are you going to add a special tool to Google Webmaster Tools?

My friend Juliane and I wrote about both search operators, “site:” and “link:”, on Google’s Polish country blog in May this year. Basically, doing a search on Google using the operator “link:” before a site name shows a sample of the incoming links for that site. You will see a much larger sample, although still not all of them, if you check the incoming links to your verified site in Google Webmaster Tools.
The reason the results displayed for “site:” may vary is the asynchronous, sequential updating of Google’s data centers. Depending on the traffic load and the location of the data center, Google returns results generated at different points in time.

- Matt Cutts’ team has changed the Webmaster Guidelines lately. Although lots of issues were explained, some webmasters still argue about the nofollow attribute. Can you explain whether the nofollow attribute on internal links favours internal linking and PageRank passing?

I have come across some webmasters almost obsessed with the idea of actively directing link juice on their sites. So let me explain this: there is more than one way for a webmaster to modify PageRank flow on his site. You can use the nofollow attribute at a link level, or the robots meta tag or robots.txt at a page level. If you want to promote certain parts of your site, feel free to do so; go ahead and use any of the options described above. Just keep in mind that nofollow’ed links are dropped out of Google’s link graph, so the choice to direct link juice should be taken carefully.

For example, it is my opinion that nofollowing a link to your blog’s sign-in page makes sense, since Googlebot is not going to sign into your site’s back-end; this step might also contribute to your site’s security. However, nofollowing other pages, like “About Us”, because you think users don’t care or because you would rather have them go to a product page, is something you may want to think twice about. I would like to point out, though, that while PageRank sculpting with the nofollow attribute certainly is permissible, it might not be the best use of an SEO’s time.
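
To make the distinction concrete, here is a minimal Python sketch (standard library only) of how a link extractor might separate nofollow’ed links, which drop out of the link graph, from regular ones. The HTML sample and the class are my own illustration, not Google’s implementation.

# A minimal sketch (Python standard library only) of how a link extractor
# might drop rel="nofollow" links from its link graph. The HTML sample is a
# made-up illustration, not Google's implementation.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []    # links that would stay in the link graph
        self.nofollowed = []  # links dropped from the link graph

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel_values = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_values:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

sample = """
<a href="/products">Products</a>
<a href="/wp-login.php" rel="nofollow">Sign in</a>
<a href="/about">About Us</a>
"""

parser = LinkExtractor()
parser.feed(sample)
print("followed:  ", parser.followed)
print("nofollowed:", parser.nofollowed)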

- Can you confirm that if there are two links on the same page, the first one nofollowed and the second one without the nofollow attribute, pointing to the exact same destination, Google(bot) would ignore both of the links?

As you know, Google’s algorithms are constantly being updated and use a large number of factors to determine connections between pages. The only constant is, in fact, evolution. That is why I would not focus too much on these details, as they are likely to change over time, while our desire to crawl and index natural content will not. Of course, any webmaster can run experiments regarding the current situation at any given time.

- Is web positioning SPAM?

No, white hat SEO is not spam. Recently, we have added an article to the Webmaster Help Center, describing Google’s view on Search Engine Optimization and offering some advice to site owners on how to find a white hat SEO.

- What are your definitions of SEO and web positioning?

We had a discussion about this on the Webmaster Help Group the other day. In the end, I think everyone has their own perspective on this. From what I see across Europe and beyond, SEO and SEM are increasingly becoming portfolio components of web consultants. The main difference to point out is the natural search that SEO focuses on versus the paid traffic that comes through SEM. I do not have my own creative definition of SEO, but personally I think that if someone optimizes a site to be visible for search engines and more crawlable for their bots, that person is an SEO. Of course there is more that makes a good one, such as accessibility and technical fine-tuning that can be worked on in order to improve the site’s rankings.

- Do Polish Quality Raters also read the spam reports?

Unfortunately, I have never had a chance to talk to a Polish quality rater, so I can neither confirm nor deny this. I can imagine the SEO industry’s curiosity about Google’s spam fighting team. Let me take advantage of this opportunity and clarify that the Quality Rater team and the Search Quality team are not the same. If you’re asking whether I or other Polish spam fighters read spam reports, the answer is definitely yes, we do.

- If access to my site is blocked for Googlebot, is access also blocked for other Google bots? I mean those from AdSense and those calculating Quality Score for AdWords.

With the robots.txt file, spiders can be blocked on an individual basis. If you wish to participate in AdSense as a publisher without granting access to your site to any other bots, all you need to do is add the following two lines of text to the top of your robots.txt file:

User-agent: Mediapartners-Google
Disallow:

Please keep in mind that any changes you might make to your robots.txt file will not be reflected in our index until our crawlers visit your site again.

The same applies to the AdWords AdsBot-Google, our automated landing page review system. However, it is important to remember that, in order to avoid increasing CPCs for advertisers who do not intend to restrict AdWords visits to their pages, the system ignores blanket exclusions (User-agent: *) in robots.txt files.

Since I am more into search than advertising, I suspect you will find more comprehensive information on the Google Webmaster Help Center site.
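
If you would like to double-check how a particular set of rules is interpreted, the short Python sketch below uses the standard library’s urllib.robotparser to test a few user agents against a hypothetical robots.txt. The rules and URL are illustrative only, and note that AdsBot-Google, as mentioned above, ignores blanket exclusions.

# A quick sanity check of robots.txt rules using Python's standard library.
# The rules and the URL below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Note: as mentioned above, AdsBot-Google ignores blanket (User-agent: *)
# exclusions, so this check is only meaningful for the other crawlers.
for agent in ("Mediapartners-Google", "Googlebot", "SomeOtherBot"):
    allowed = parser.can_fetch(agent, "http://www.example.com/page.html")
    print(agent, "->", "allowed" if allowed else "blocked")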

- Does Google still follow the links and pass PageRank when the tag is used and the sub-page is no longer indexed?

Generally, Googlebot follows pages with this attribute the same way as pages without it. The difference is that these pages are displayed in Google’s search results without a link to the cached version.

- How does a 301 redirect work? Some time ago I noticed an error in the Google Webmaster Tools panel showing that I had redirected my whole domain, including sub-pages, to the main page of the new website… Does this situation affect the position of the new website in the SERPs?

As you know, a 301 redirect says “moved permanently”. Without investigating the particular case you are mentioning, I could not tell what the source of the error message was. However, even in the worst case I would expect no impact on the new domain. A working 301, on the other hand, should be rather beneficial for your new site if you have previously built up a reputation and gathered a community of users visiting your site on a regular basis. It is worth mentioning that a redirect should be done on a URL level.
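
A quick way to verify that a migration really redirects on a URL level, rather than sending everything to the new homepage, is a script along these lines. It uses only Python’s standard library; the hostnames and paths are hypothetical placeholders.

# A minimal check (Python standard library only) that old URLs answer with a
# 301 pointing to the matching URL on the new domain. Hostnames and paths are
# hypothetical placeholders.
from http.client import HTTPConnection

OLD_HOST = "www.old-example.com"
NEW_HOST = "www.new-example.com"
PATHS = ["/", "/about.html", "/products/widget.html"]

for path in PATHS:
    conn = HTTPConnection(OLD_HOST, timeout=10)
    conn.request("HEAD", path)
    response = conn.getresponse()
    location = response.getheader("Location", "")
    expected = "http://%s%s" % (NEW_HOST, path)
    status = "OK" if response.status == 301 and location == expected else "CHECK"
    print("%s: %s -> %s [%s]" % (path, response.status, location or "(none)", status))
    conn.close()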

- If a 301 redirect is used, will the position of the new site be the same as that of the old one?

I often come across this question. A 301 redirect is a great way to tell Google that a project has been moved to a new domain. While it should not be the cause of any disadvantage for the webmaster, no one should assume their position in the SERPs will be exactly the same. Change is the only constant on the web. Competing sites could develop in the same time frame, gaining links through great content, or new sites about the same topic could come up. For this reason I cannot promise exactly the same ranking, though I strongly encourage using the 301 for the purpose described above. I would not recommend it if I were not using it myself occasionally.

If a webmaster is very concerned about migrating a project, it’s probably a good idea to run an experiment, where you migrate one subdirectory to the new domain first. If that works fine and you don’t see any dramatic ranking changes for that subdirectory, then I wouldn’t worry about moving the entire site over.

- Does Google index only some parts of the content on pages in the Supplemental Index? There is a rumor that Google indexes only so called “important” words on pages in the Supplemental Index.

In the past, the Supplemental Index was used for unusual documents that would come up for rather non-mainstream queries. From the webmaster’s perspective, the lower crawl rate was the main difference between pages in our main index and those in the Supplemental Index. Google’s engineers worked hard on solving this issue and levelling the differences between the indexes, eventually succeeding in December 2007. Currently, there is very little difference. To anyone who wants to find out more about how Google indexes and ranks pages, I recommend visiting this link :-)

- Does adding deep URLs to a Sitemap in Google Webmaster Tools cause those pages to go to the Supplemental Index? Can a Sitemap be the main or even the only element of the indexation process?

Adding your URL to a Sitemap does not affect whether the URL enters our Supplemental Index either way. Just because you add a URL to a Sitemap doesn’t mean that Google will crawl that URL, either. One area where submitting your URL can help: if Google sees duplicate pages on the same site, it is more likely that the URL submitted in a Sitemap will be preferred.
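
For reference, building a Sitemap that lists deep URLs takes only a few lines of code. The sketch below uses Python’s standard library and hypothetical example URLs, following the sitemaps.org 0.9 schema.

# A minimal Sitemap generator (Python standard library only) following the
# sitemaps.org 0.9 schema. The URLs listed are hypothetical examples; a real
# Sitemap would list your site's own canonical URLs.
from xml.etree.ElementTree import Element, SubElement, ElementTree

NAMESPACE = "http://www.sitemaps.org/schemas/sitemap/0.9"

deep_urls = [
    "http://www.example.com/",
    "http://www.example.com/category/widgets/",
    "http://www.example.com/category/widgets/blue-widget.html",
]

urlset = Element("urlset", xmlns=NAMESPACE)
for address in deep_urls:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = address

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(open("sitemap.xml", encoding="utf-8").read())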

- Is it true that links from pages in the Supplemental Index do not pass the value of the anchor text? What do you think about the value of links from such pages?

I would encourage webmasters not to get too hung up on the Supplemental index. Google tries to make sure that links which we find in the course of crawling the web are counted appropriately. Personally, I consider the value of these links equal to other ones.

- Why are banned sites still being crawled by Googlebot?

This really depends on the individual case. But let’s say a webmaster has been hiding keywords on his site in order to artificially increase keyword density, which is a violation of the Google guidelines, but his site also provides decent content that users might be interested in. If he or she uses our free Webmaster Tools, Google might notify the webmaster about the violation and remove the site from the search results displayed to users for a certain period of time. Googlebot, however, can still crawl the site, so the most recent cached version can be indexed once the problem has been fixed and reconsideration has been requested.

- Is reciprocal linking e.g. A<->B, B<->A a violation of Google Webmaster Guidelines?

If linking is natural and based on the relevancy of the sites’ content, I don’t see a violation of any Google Webmaster Guidelines. I remember the case of a small vet site and a pet food wholesaler located close by linking to each other. That seems pretty reasonable, since visitors of one site could be interested in the content or offer of the other, right? On the other hand, if you think of a games community forum cross-linking with a bridal shop in Warsaw and an all-inclusive hotel chain in South Asia, the chances are low that the links have been set up for users. Obviously, the second example would be a violation of our Webmaster Guidelines. Let me add that the algorithms we have been working on are fairly precise in determining the relevancy of links.

- Assuming two links with different anchor texts, located on different pages of a site, point to the same URL, which one is going to pass PageRank?

PageRank is independent of anchor texts. Having multiple links to the same URL may change the PageRank that’s being passed slightly, but I wouldn’t focus on this when designing the website. The site should be made with the user in mind, not the search engines. So add your links to the content where they make sense and don’t worry about the PageRank calculations. For more information about PageRank in general, feel free to read the original document at http://dbpubs.stanford.edu:8090/pub/1999-66. And if you are still concerned, this is another area where webmasters can run an experiment to verify the expected behavior.
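
For readers curious about the original paper linked above, here is a toy Python sketch of the classic PageRank iteration on a made-up four-page link graph. The graph and the damping factor are illustrative only; the point to notice is that the calculation looks solely at which page links where, never at anchor text.

# A toy implementation of the classic PageRank iteration from the paper linked
# above. The link graph and damping factor are illustrative only; note that
# anchor text plays no role in the calculation.
DAMPING = 0.85
LINKS = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(LINKS)
rank = {page: 1.0 / len(pages) for page in pages}

for _ in range(50):  # iterate until the values settle
    new_rank = {}
    for page in pages:
        incoming = sum(rank[source] / len(targets)
                       for source, targets in LINKS.items() if page in targets)
        new_rank[page] = (1 - DAMPING) / len(pages) + DAMPING * incoming
    rank = new_rank

for page, value in sorted(rank.items(), key=lambda item: -item[1]):
    print("%s: %.3f" % (page, value))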

- How does Google treat SEO friendly partner programs?

As I said before, links that pass PageRank should be set based on merit only in order to be compliant with the Google Webmaster Guidelines. While it is legitimate for a webmaster to monetize great content, for performing well in Google it is important to take technical steps to prevent random PageRank flow, e.g. by either using the nofollow attribute or by using a robots.txt file.

- What is Google’s policy on pay-per-post? Should, for example, Polish services like krytycy.pl put into their guidelines the mandatory use of the nofollow attribute on links in sponsored articles?

To quote Matt: “Paid links that affect search engines, whether paid text links or a paid review, can cause a site to lose trust in Google.” Not referring to any service in particular, if a webmaster wants to build up reputation and trust with Google, he or she should make sure that the linking on the website would not be perceived as deceptive. Let me add that the Search Quality team has made a substantial effort in paid-link detection over the last years, and we keep working on it.

- What do you think about the idea of tagging all of the outgoing links from wykop.pl (a Polish version of digg.com) with the nofollow attribute? Theoretically, that is a set of interesting, organic and useful links that are worth indexing… Is wykop.pl the starting point of the organic link bait that is said to be acceptable to Google? What about the so-called SEO 2.0 in the context of the biggest Polish Web 2.0 service having its links nofollowed?

While Google doesn’t comment on specific companies, I would like to point out that using the nofollow attribute on sites based on user-generated content seems to be becoming more and more of an industry standard, ever since respected sources like Wikipedia incorporated it. In my personal opinion it can also be a good way to discourage spammers from abusing social bookmarking services for deceptive linking methods.

But I also think that initially nofollowing links and then, either manually or automatically, removing the attribute when certain trust-based criteria are met, let’s say when a user has consistently provided high-quality, relevant links, is appropriate and an effort Google would see in a positive way.
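
As a sketch of that idea, the Python snippet below decides whether a user-submitted link should carry rel="nofollow" based on a hypothetical trust score; the threshold and the scoring criteria are invented purely for illustration.

# A sketch of the trust-based approach described above: user-submitted links
# start out nofollow'ed, and the attribute is dropped once a hypothetical
# trust threshold is met. The scoring criteria are invented for illustration.
from html import escape

TRUST_THRESHOLD = 50  # hypothetical cut-off

def trust_score(user):
    # Made-up criteria: account age and previously approved links raise the
    # score, spam flags lower it drastically.
    score = min(user["days_registered"], 30)
    score += 2 * user["approved_links"]
    score -= 25 * user["spam_flags"]
    return score

def render_link(url, text, user):
    rel = "" if trust_score(user) >= TRUST_THRESHOLD else ' rel="nofollow"'
    return '<a href="%s"%s>%s</a>' % (escape(url, quote=True), rel, escape(text))

new_user = {"days_registered": 3, "approved_links": 1, "spam_flags": 0}
trusted_user = {"days_registered": 400, "approved_links": 40, "spam_flags": 0}

print(render_link("http://example.com/article", "interesting read", new_user))
print(render_link("http://example.com/article", "interesting read", trusted_user))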

- Should different language versions of sites (placed in sub-directories or sub-domains) link to each other in a way that is readable for Google?

This is a question that comes up on a regular basis, frequently in connection with travel sites designed for different target groups. Cross-linking different language versions located in sub-directories or sub-domains in a way visible to Googlebot is not going to cause any disadvantage for your site in terms of indexing or ranking.
Speaking of different language versions: there is a way to show Google more specifically which country your sub-domain or directory is targeted at, by taking advantage of the Geographic Target tool in Google Webmaster Tools. My colleague Susan explains precisely how the tool works in her video released a few months back. Essentially, the Geographic Target tool is your best choice if you want to make sure you reach users from a particular area, rather than users speaking a particular language. I don’t want to explain it in detail redundantly, but I warmly recommend it to any local businesses or local communities.

- Regarding the Google Webmaster Tools API: what other elements will be accessible via the API?

Google is always trying to provide great solutions for users and to improve existing products, so I would not be surprised if the API team had some new features on its roadmap; however, I am not aware of any upcoming releases.

- What is the future of the Google Webmaster Help groups? Do you want to make them more popular than forum.optymalizacja.com (the most popular Polish forum about SEO)? ;) There is a “search” section in the Google Help service just launched on the Polish market; do you think it might be competition?

An interesting question :-) I have no plans to compete with forum.optymalizacja.com in any way. As you know, there are some heavy users contributing to both platforms. But I think the purposes and the target groups of these two communities are a little different. The Google Webmaster Help Group is a place where less savvy users can find initial help and escalate issues they have come across while using our products. Google Help is a new service I am very excited about. I was lucky to watch its development over the last couple of months, and I think the outcome is very promising. Not to mention, I am happy that the beta test is being run in Polish as the first language globally! I don’t see the future of the Google Webmaster Help Group and Google Help in terms of competition but rather co-existence, and possibly even a fusion at some point. I recommend that everyone interested in the future of both platforms stay tuned for news to come.

- What do you think about the future of SEO, what can happen let’s say in a year?

I have given up on trying to figure out which creative ideas SEOs will come up with. It is a high-speed industry and there are interesting things happening almost every day. The only thing I am sure of is that it will stay exciting.

- In your opinion, how has SEO in Poland been evolving, and what is the level of awareness among Polish companies in this area, especially big e-commerce companies?

I am glad you are asking about this. Compared with European neighbours like Germany or the Czech Republic, Poland is one of the countries where awareness of SEO among some large companies is only just emerging. At SMX Munich I had a chance to talk to agency representatives who confirmed my impression. We are not quite there yet, but there is definitely a positive trend. In terms of SEO development, Poland is an emerging market, rapidly catching up and closing the gap to its neighbours.

- Do you think it is possible that all big e-commerce companies will one day have their websites optimized to the same level, and that more or less spammy techniques or link baiting will be the only way to compete?

This is a speculative question and I hesitate to make a strong prognosis. Based on what I have seen happening in different markets over the course of the last years, I doubt there will ever be a standstill in the search engine industry. There are highly intelligent, creative people constantly working on tweaking their sites, and I doubt they will ever stop thinking. In my personal opinion, neither shady techniques nor link baiting will be the final stage of the evolution.
