Architecting Search-Engine Friendly Websites – Shari Thurow.

Next up, a talk from Shari Thurow on search engines and IA.

Thurow is starting with the basics of SEO. If you don’t do these 4 basic building blocks, you can forget it. Doing them makes your content easy to find — both on Google and your internal site search.

Thurow hates search engine spam. She turns people in regularly for it.

Why should IAs care about SEO? Because people search.

Thurow shows a couple of pages from a website — beautiful images — but no copy. Search engines can’t evaluate them.

Talks about a usability test on a site with a high-tech audience, with a huge Flash presentation at the top. It was the shortest usability test she ever did — the users hated the Flash so much they wouldn’t look at the site.

SEO is optimizing for people who use search engines.

SEO is not magic pixie dust.

Two great sites for SEO: Apple and Mayo Clinic.

SEO covers architecting, designing, writing, and programming.

What search engine optimizers need to do:

  • Need to label content
  • Organize website content so it is easy to find
  • Ensure search engines can access right content
  • Ensure search engines can’t access undesirable content
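The last two points are commonly handled with a robots.txt file at the site root. A minimal sketch, with hypothetical paths:

```text
User-agent: *
Disallow: /admin/        # back-office pages crawlers shouldn't fetch
Disallow: /search        # internal site-search results pages
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow only blocks crawling; a page that must stay out of the index entirely also needs a noindex robots meta tag.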

4 Building Blocks of SEO

  • Keywords: Text
  • Architecture and design
  • Link development
  • Searcher goals

Most important text: Title tag, Most important [top] content, URL structure
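In HTML terms, those three elements sit here (the URL, title, and copy are placeholder examples, not Thurow’s):

```text
<!-- URL structure: https://www.example.com/orchid-care -->
<head>
  <title>Orchid Care: Light and Watering Basics</title>  <!-- title tag -->
</head>
<body>
  <h1>Orchid Care</h1>
  <p>The most prominent copy at the top of the page should carry the same keywords.</p>
</body>
```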

The first two, keywords and architecture, are on-the-page and entirely within the site owner’s control.

The latter two are off-the-page criteria. In off-the-page criteria, quality trumps quantity. Who links to you matters more than how many links you have.

Many SEOs don’t pay attention to searcher goals. There are three: navigational [people want to go to a website], informational, and transactional.

For navigational queries, people rarely look past slots 1-2. When sitelinks show up, something in the search indicated navigational intent.

Up to 80% of searches are informational. When Wikipedia shows up in results, something in the search demonstrated informational intent.

Least common is the transactional query. Here, people don’t always type in the words they expect to see on the page. [They don’t type “watch” or “cart” when they want a video or to purchase something.]

Thurow shows a horrific result: when you search for

United.com FAQ

you get what looks like the right page in Google, but when you go to the page, there are no FAQs. Total fail.

Are you communicating to humans and technology the right information scent and aboutness of your content?

We know that content can be organized in multiple ways, but we have to determine how the target audience would organize it. SEOs think everything needs to be organized by topic. Keyword research tools shouldn’t be used to create architecture. But it is helpful to find out how people label things.

SEOs and IAs both work on labeling — are we using the right keywords on our labels? Are we properly indicating aboutness?

Prioritizing: Don’t put too many links/labels on the page, but definitely put them in the right order. Usability testing can help you figure this out.

Don’t put glossary content in a popup window — you’ve orphaned that really informative content to Google.
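The usual fix, sketched here with hypothetical markup, is to give the glossary entry a real URL that a script enhances into a popup, rather than a script-only link crawlers cannot follow:

```text
<!-- Orphaned: crawlers can't follow a javascript: pseudo-link -->
<a href="javascript:openPopup('taxonomy')">taxonomy</a>

<!-- Crawlable: a real href, progressively enhanced into a popup -->
<a href="/glossary/taxonomy.html" class="popup">taxonomy</a>
```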

Information architecture decisions have a direct impact on SEO. Don’t wait until a site is ready for launch to bring in an SEO.

Most pages on a website should be treated as a point of entry. Do you give users enough context to figure out how to get where they need to go?

Some findability solutions can cause search engine problems — faceted classification, tagging and site search pages.
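A common mitigation (a sketch, not from the talk): keep site-search results and near-duplicate faceted pages out of the index while still letting crawlers follow their links, or point variants at a canonical URL:

```text
<!-- on an internal site-search results page -->
<meta name="robots" content="noindex, follow">

<!-- on a faceted variant (e.g. /shoes/?color=red&sort=price) -->
<link rel="canonical" href="https://www.example.com/shoes/">
```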

Thurow is an amusing and educational speaker. Great info.

Google Search Updates: Good or bad?

Read a great post this morning from InfoCommerce, a consulting group that focuses on business information content and database publishing. Here’s a snippet, but I think you should head over to their blog for the whole post on Google’s recent spam-prevention search updates.

…let’s think a little more about this new filtering capability offered by Google. What if it were to truly catch on? The basic concept is that you can now easily and permanently take out any domain from your Google search results. Consider what this means: suddenly, nobody is seeing the same search results. What is the implication for search engine optimization programs and providers? What happens to search engine marketing?

more at the InfoCommerce blog

Of course, the clear message from Google, Bing and other major search engines is that already, no one gets the same search results. [Despite that, some SEO firms are still trying to sell you rankings. Snake oil alert!] But the greater ability to customize your own results on the fly means that not only will you not see what I see, but that we may accidentally be giving ourselves poorer results.

Personally, I’m not sorry to see Google act on the content farm spam problem. I’ve noticed it a lot in my personal searching, as I have tried to teach my 5-year-old and my 11-year-old how you judge a quality website to use in research for school projects. To an untrained researcher, lots of content farm posts look really useful — but they usually just give you half an answer or even the wrong information.

But the implications of fiddling with the mechanism, and of allowing us to fiddle with it, also concern me. Will we start to see campaigns to have people “vote down” certain sites out of malice, hoping to get Google to dump them? I suspect that Google will figure out when humans need to intervene in those kinds of cases, but Google’s got a track record in other areas of letting the machines act first and then letting the humans correct when necessary — with no regard to the downstream consequences. So I hope that Google is treading very, very carefully here.

Recommendation Engines: Going Beyond the Social Graph

Hunter Walk. Leads product team at YouTube.
Tom Conrad. Product engineering at Pandora.
Garrett Camp. Cofounder of StumbleUpon.
Lior Ron from Google’s Hotpot project. Works on local recommendation engine.
Liz Gannes, journalist, moderating.

Conrad: Pandora has 8 billion thumbs-up/thumbs-down data points, completely contextualized.

Walk says that YouTube knows a lot about where their videos are embedded. They could personally review videos, or use algorithms to analyze them, but they also look at what the top blogs/sites are pulling from YouTube to understand what videos are popular with whom.

Walk: About 50% of searches on YouTube are “broad,” meaning the person is looking for an experience, not a particular video. Google has to figure out what the best videos are to help someone understand/experience a topic. It’s very different from trying to answer a question, like we think of in traditional search.

Camp: We want to get away from 10 blue links. We want to be surprised, have serendipitous experience.

Conrad: They looked at the most common starting points a couple of years ago. One of the top ones was called Christmas. The station was seeded with an indie rock band called Christmas. Oops. So they started playing the station to see what happened, and it was playing all holiday music. The crowd had very quickly weeded out the data error by thumbing down the band on the holiday station.

Walk shares story of a teenager who told them that she wanted to know what her friends hadn’t watched yet on YouTube, so she knew what to share. It’s a hard problem, but they want to figure out what’s not yet spreading, but will.

Camp: StumbleUpon tests new, non-socially-recommended stuff in streams to figure out this kind of question. When you’re just looking at the social graph, you’re in a closed loop.

Ron: Social is really important in recommendations because of the trust factor. Getting a friend recommendation still beats the site telling you, hey “other people like you” like this.

Walk: Early on, they just trusted what the uploader said about what the video is. Now, they use a lot of technology to understand a piece of content. What does perfect metadata look like? What’s everything I COULD know about it? And then the challenge is, you take all that data to try to create an experience, not just spit out data.

Walk: One of the biggest changes in perceived search relevance came when they started showing context for recommendations. One, people immediately thought recommendations were more relevant. And two, if a recommendation was wrong, they blamed themselves, not YouTube.

Conrad: Pandora has broken out of the PC into mobile and now car implementations. The difference in environment between listening at work via headphones and listening in the car with the whole family, where lots of people are making the music decisions, is very complex.

Camp: StumbleUpon is often a free-time application, so their new mobile app [6mo old] is doing well.

Ron: Interesting patterns in how people are following people for recommendations. Some people follow only celebrities, for instance.

Camp: For analytics, they look at thumbs up/thumbs down, length of time on resource, comparing time to type of resource. SU has an 80-85% thumbs up rate.

Walk: You have to be careful with analytics. You don’t want to introduce features that push up your positive stats to the detriment of user experience.

Conrad: They religiously test all changes to the algorithms now, after making several changes in early days that “everyone” agreed would be great, that instead tanked numbers.

Ron: Recommendation is very vertical-oriented. The required data is so specific that it’s hard to have a general recommendation engine.

Camp: Also, the UI strongly affects what kind of data you get, so that’s part of why people build the engines themselves.

Ron: We’re not living in a world yet where we’re bombarded by awesome recommendations and we have to tune them. Part of the problem right now is getting coverage for everywhere.

Camp: We do a combination of social and similarity in your recommendation list.

The Misplaced SEO-Content Strategy Fight

I’ve been working on content strategy and management for a good long time, so I’ve had my rounds with disreputable SEO practitioners. In the early 2000s, there were a LOT of snake oil salesmen out there. I could tell you some crazy stories, but we all love to sit around and grouse about the other guy.

To be clear: I think reputable SEO practitioners today contribute significantly to the web.

For some reason, the past few days have been the content strategy vs. SEO throwdown of the century. There are posts popping up all over. I’ve got my favorites, but there are some for all sides, whether you fall on the SEO or the content strategy side of the fence.

But I’d argue we all ought to spend more time worrying about the problems created by spam content from Demand Media and similar companies. That’s a much bigger problem for legitimate SEO and content strategy practitioners. When the internet is overflowing with junk, it becomes far more difficult to share real knowledge, no matter which side of the fence you’re sitting on.
