At the Corner of UX, SEO, and Accessibility

Your search engine of choice, whether it is Google, Bing, or my recent favorite, DuckDuckGo, uses pieces of software called “robots” that crawl around the web and map the Internet.

Your job is to ensure that when a user is looking for X on the map of the Internet, your web page marks the spot.

Ideally, these robots traverse the web just as humans do: get your bearings, read the content, and follow relevant hyperlinks. They are always getting better at this, but there are a few things one can do to help the little HAL-to-be out. The side effect is that the experience for flesh-and-blood users will get better as well.

One more thing. As I mentioned previously, accessibility for the vision-impaired also depends on software being able to make sense of your website. Screen readers and search-engine crawlers read web sites in fundamentally the same way.

In the end, you’ll have a web site that is simply easier to find and use. Onward!

Please, please, please don’t use Flash.

Content inside Adobe Flash is not easily searchable. That alone should disqualify it for everything but for the most dire circumstance. You would also be shunning your iOS-using audience, along with the increasing number of users who decide to forego Flash entirely. (Macs no longer come with Flash pre-installed.) If you need animation or other whiz-bang effects, there are newer, HTML-based solutions available today.
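For instance, a simple fade-in that once called for a Flash movie can be done with a few lines of CSS. A minimal sketch (the `fade-in` class name is just an illustration):

```html
<style>
  /* Fade an element in over one second — no plugin required */
  @keyframes fade-in {
    from { opacity: 0; }
    to   { opacity: 1; }
  }
  .fade-in { animation: fade-in 1s ease-in; }
</style>

<p class="fade-in">This text fades in using plain CSS.</p>
```

Unlike a Flash movie, the text here remains plain HTML, so robots and screen readers can read it just fine.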

Use HTML5's new semantic structural tags.

HTML5 introduces new tags that just plain make more sense than <div> ever could.

<header>, <nav>, <article>, <section>, <aside>, <footer>

Hardly any explanation required.
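For example, a page skeleton that once relied on `<div id="header">` and friends can be written with the new tags directly. A sketch, with placeholder content:

```html
<body>
  <header>
    <h1>My Site</h1>
    <nav>
      <a href="/">Home</a>
      <a href="/about">About</a>
    </nav>
  </header>

  <article>
    <section>
      <h2>An Actual Piece of Content</h2>
      <p>Robots and screen readers can now tell this is the main story.</p>
    </section>
    <aside>Related links live here, clearly marked as tangential.</aside>
  </article>

  <footer>&copy; My Site</footer>
</body>
```

The structure now describes itself: no software has to guess which `<div>` holds the navigation and which holds the content.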

Title everything appropriately.

Your page title is what will stand out the most in a sea of search results or browser tabs. Give each page a distinct title that accurately describes what a person can expect to find there. Sadly, this is going to preclude a lot of the wit that comes from lede-writing in general, but try to work in the key words at least.
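A hypothetical example of a title that would hold up in a search-results page or a crowded tab bar:

```html
<!-- Distinct, descriptive, keywords up front -->
<title>Choosing a Screen Reader — Accessibility Notes</title>
```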

Eradicate “click here”.

The clickable text for links should have something to do with where it leads. “Click here” itself tells you nothing about the page behind it. A robot will better understand the relationship between pages if you use real words in link text.
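A quick before-and-after, with a made-up destination:

```html
<!-- Vague: the link text says nothing about the destination -->
<a href="/pricing">Click here</a> to see our pricing.

<!-- Better: the destination is clear from the text alone -->
See our <a href="/pricing">pricing plans</a>.
```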

Fill in metadata.

You hear this over and over again about images and their alt and description attributes, but metadata is useful for entire pages as well. Within the <head> of each page, you should have these:

<meta name="description" content="This is the text that most search engines use as the excerpt in their search results. Stick to fewer than 155 characters.">

<meta name="keywords" content="these,are,not,used,as,much,but,can,still,help">
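The same principle applies to the per-image metadata mentioned above. A hypothetical example (the file name and wording are placeholders):

```html
<!-- The alt text describes the image for robots and screen readers alike -->
<img src="team-photo.jpg" alt="The five members of the design team standing in front of the office">
```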

Tell the search engines where you are.

Release your creation unto the world and tell the search engines about it directly:
* Google Site Submission
* Bing Site Submission
* (DuckDuckGo claims it doesn’t need site submissions.)
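Site submission usually goes hand in hand with a sitemap, a plain XML file that both Google and Bing understand. A minimal sketch, with a placeholder URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```

Save it as sitemap.xml at the root of your site, and point the webmaster tools below at it.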

Both Google and Bing provide webmaster tools that allow you to see what they see. They’ll tell you about the relevant keywords they found in your content, as well as any errors and broken links at your site.

Content that matters

In the end, though, nothing is going to help you more than having content that’s relevant in the first place. Write with clarity of thought and purpose, and you’ll reward both yourself and the users who find your content. That will only increase the likelihood of their sharing it with their friends.

As search robots get more sophisticated at mimicking human behavior, they too will get better and better at recognizing quality.

Michał Sterzycki

Michał has spent over 10 years ensuring end-users and administrators alike have the best experience possible, in both consumer- and enterprise-level applications. He anticipates what they want, before they want it.

If you catch him in the hallway, he will likely wax prophetic on the nature of software and hardware design. He watches the software and mobile industries, closely monitoring trends. He also dabbles in film criticism, gaming, history, and religious study.

Michał graduated from Rensselaer Polytechnic Institute with a Bachelor's Degree in both Information Technology and Psychology, with a concentration in Software Usability. Figures.

