Friday, December 29, 2006

Clicking into the Future, Predictions for 2007

By Jim Hedger (c) 2006

While 2006 was a great year for webmasters, it was also a time
of great change and challenge in the search engine sector. The
most notable trend of 2006 was Google's continued advance
against its natural rivals Yahoo, MSN and Ask, as well as its
persistent movement into other, less obvious advertising
venues. Next week we will look back at the ghosts of 2006, but
as the New Year approaches, thoughts turn towards the spirits
of things to come.

Predictions for 2007

1/ Google's early entry into radio advertising will bear fruit,
prompting Google to start examining television advertising.

When it comes to the allocation of advertising dollars, Google
fundamentally altered the rules of the game. In the past, mass
market advertisers would gamble on a shotgun approach to
ad-spending. The options consisted of newspaper, magazine,
telephone directory, radio or television. While each medium
could provide statistics showing their print-run and audience,
tracking the success or failure of these types of advertising
has always posed a problem for manufacturers and retailers.
When Google popularized the pay-per-click model, in which
advertisers pay only when a person actually clicks on an ad, a
new model was born.

Google's billing model will have to change slightly to fit the
radio and video formats; however, as digital radio and
television become the norm, tracking will become far easier.
Google will be able to charge by the listener or viewer, though
it remains to be seen whether listeners or viewers will
actually listen to or view advertisements on digital radio or
television.

2/ A consortium of major newspapers and publishers will form a
consolidated advertising group to combat Google. The initiative
will fail.

In 2006, Google siphoned much of the money out of the pool of
resources shared by publishers on and offline. With its
venture into newspaper advertising, Google has threatened the
classic golden goose of traditional news publishing, the
classified ad section. News-gathering organizations, already
reeling from a loss of advertising income to other facets of
the online marketplace, should band together, much as several
airlines have done to reduce costs and simplify ticket booking.
Together, newspapers and magazines could promote local,
regional, national and even international advertising
opportunities at a much lower cost than they could
individually. Whether they can band together to undercut
Google's prices is another question; regardless, they will
either figure out how to collectively raise advertising income
or they will become Google advertising affiliates.

3/ Congress will initiate an open investigation of pay-per-click
advertising models.

We already know of closed-door congressional inquiries into the
PPC market. Stemming from the AIT vs. Google case, we have
learned of FBI and Secret Service investigations, and in-camera
hearings before congressional committees. We also know of at
least two Republican senators and one Democratic senator
interested in opening further investigations.

How this one plays out is a mystery, but it will likely play out
this year. Thus far, the investigations have been kept behind
closed doors, a silent testimony to the strength of Google's
lobby effort in Washington, DC. No matter how talented and
strenuous the lobby, an investigation of click fraud will be
made public sometime this year.

4/ An investigation of pay-per-click advertising models will
threaten Google's stock values.

Hence the strong lobby effort. If common sense doesn't threaten
Google's stock values, the threat of congressional oversight
will.

5/ Google will drop below $450 before the end of first quarter
2007. If Google drops below $425, watch for a major landslide in
California.

The market is softening. With expected slowdowns in the housing,
building, manufacturing and retail sectors, tech investors are
showing signs of skittishness going into 2007. The first part of
January is traditionally a time when the market readjusts to
decisions made by investors in the last weeks of December. Many
sell shares towards the end of the year in order to clarify
their incomes for tax time. This year, a number of analysts have
advised clients to unload some of their Google shares as a hedge
against an expected downturn in share values.

While Google remains far above the $450 range, expect it to
drop to around $450 by mid-March. Its earnings-to-assets ratios
are far too extended and, though it has better prospects than
any of its competitors, the search advertising market is
changing rapidly. The only thing that could save Google from
dropping is a rapid entry into the television advertising
market. The first signs of a weakening AdWords market will be
the catalyst for the first round of a GOOG landslide. If the
share price hits $425, watch for major selling. If it goes
below $400, buy a boat.

6/ Yahoo will find focus and direction.

Having freed itself from the sticky peanut butter mess it fell
into late in 2006, Yahoo has taken the first steps towards
reorganizing its messy structure. The battle between
Hollywood and Sunnyvale has been settled with the geeks coming
out on top of the agents. This year is Yahoo's year to stage a
serious comeback or to languish into obscurity. I suspect the
former. However, all the major algorithmic search engines will
face a new challenger this year.

7/ The major search engines will face a new and socially adept
competitor this year.

Wikipedia founder Jimmy Wales has hooked up with Amazon to
produce a human-powered search engine that will be released
sometime in 2007. This collaborative effort to dethrone Google
will be moderately successful. Wikiasari will quickly become one
of the biggies, but Google will still be king.

8/ 2007 will be the year Internet marketers discover the power
of online video.

The use of video will become a standard component of online
communication. Advertisements, long-distance conferencing and
public relations in general will be affected. The advancement of
video online will set the stage for a major realignment in the
traditional entertainment industry.

9/ DIY Infotainment Distribution for fun and profit.

Using tools like the Yahoo Publisher Network, YouTube, Flickr
and distributed podcasts, webmasters and private websites will
start to replace professionally developed network broadcasts as
an information and entertainment source.

Add efficient monetization to the mix through Yahoo Search
Marketing ads and you have an effective enticement for
adventurous webmasters to try entering the market.

10/ YPN will see challengers this year.

While the Yahoo Publisher Network is by far the most advanced
program of its kind, expect to see other search entities
introducing their own webmaster friendly monetized information
and entertainment distribution programs.

11/ The search marketing environment will further fragment,
with the following sectors seeing:

Stronger gains: Video, Social Networking, Niche marketing,
Vertical search

Weaker margins: Traditional domain-driven SEO, Small-scale PPC

12/ Changes to the SEO Billing Model

Many traditional SEOs will be forced to adopt PPC-style billing.
Fees will be charged based on the success of a campaign as
opposed to flat fees charged up front or on a monthly basis.
Greg Boser has written and commented on this several times, most
recently in a conversation with WebmasterRadio's Daron Babin on
the last Rainmaker show of 2006.

13/ Consolidation in the Search Marketing Sector

In 2007, financial pressures faced by firms in the SEO part of
the search marketing industry will produce a sector that looks
very different from the one we work in now. Watch for
consolidation between allied SEO, SEM, Public Relations and
link-building firms as clients demand full-service vendors.

14/ Hasty La Vista

Early Microsoft Vista users will eat a number of worms, catch a
bunch of viruses and end up as zombies. As a result, savvy IT
departments will put off the upgrade until Microsoft can prove
it has created a master-patch.

15/ Patch This Microsoft

That big patch will be ready for download in 2008, effectively
delaying the rollout of Vista for another year.

2 Bonus Predictions

16/ Duke Nukem Forever

Duke Nukem Forever will be noted as the greatest game that never
was.

17/ Planet Fortune

Webmasters will have more rewarding opportunities this year than
ever. 2007 will see the fruition of several major initiatives
started in 2005 or 2006 such as the YPN, online video
advertising, and the combination of on and offline advertising.
With the largest public relations and marketing firms in the
world taking serious note of search, the search marketing
industry will be truly considered part of the mainstream ad
industry. This will empower webmasters as more attention, energy
and assistance is directed towards helping webmasters present
interesting content framed by increasingly expensive
advertising. There is a lot of fortune to be found online this
year and much of it will trickle down to the grassroots.

2007 is going to be an intense and interesting year. Next week,
we'll take a look at how I did in my 2006 predictions.

Tuesday, December 26, 2006

Embracing Cascading Style Sheets Makes Good Sense

By Fred Black (c) 2006

Why do I like Cascading Style Sheets (CSS), and what makes CSS
so great? The answer only makes sense if you know what CSS is.

The Holy Grail of CSS is to separate the content of a web
page from the instructions that control what it looks like.
This makes it much easier for various devices to display
the web page correctly. The same page would display correctly
on all standard web browsers (Internet Explorer, Firefox,
Opera, Netscape, etc.), on devices used by people with
disabilities, on cell phones, and on yet-to-be-developed
interfaces. The web site designer would not have to make
separate pages for these devices. However,
reality is different and here in the real world CSS does
not do all these things. It does have enough positive
points to make it worthwhile to learn and incorporate it
into your web pages.

There are multiple ways to control how something looks on a
web page. The color, size, and font used for a headline or
for a paragraph of text can be defined with in-line styles
and tags. The term "in-line" means that the commands for
controlling the color, size, and font are mixed in with the
content. This makes the source code for the page cluttered
and hard to read and edit when you want to update it or fix
something. Also, because you're repeating the same commands
over and over throughout the page, the file size of the page
gets larger and the page loads more slowly for those browsing
your site.
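
To make the clutter concrete, here is a hedged sketch of what
in-line styling looks like (the tags are standard HTML; the text
and style values are invented for illustration):

    <h1 style="color: blue; font-size: 24pt; text-align: center;">Welcome</h1>
    <p style="color: black; font-size: 12pt; text-align: justify;">First paragraph...</p>
    <p style="color: black; font-size: 12pt; text-align: justify;">Second paragraph...</p>

Every paragraph repeats the same three commands, and every new
paragraph you add repeats them again.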

As opposed to in-line styles, CSS is not repeated
throughout the page. CSS can be defined in the head
section, or put in a separate file and referenced from the
HTML file, or both. CSS consists of definitions of how a
page component should render itself on the page. For
example, you can define that a headline should be blue, 24
point, centered text and that a paragraph should be black,
12 point, justified text. Once that is defined, any normal
HTML paragraph tags or headline tags would use these
definitions when rendered.
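
Here is a minimal sketch of that example written as CSS in the
head section (the values are taken from the example above):

    <style type="text/css">
      h1 { color: blue; font-size: 24pt; text-align: center; }
      p  { color: black; font-size: 12pt; text-align: justify; }
    </style>

With these two rules in place, plain <h1> and <p> tags in the
body render with the defined look, and the repeated in-line
commands from the earlier sketch disappear.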

You can define almost all the normal HTML objects this way:
background color, background image, background image position,
tables, cells, images, divs, etc. This leaves your HTML code
clean and much easier to read. Just like those Ronco TV
commercials, there's more! If you have a multi-page web site
and you use CSS and all your CSS definitions are in a separate
file, you have one place to go to change the look and feel of
all the pages in your site. Imagine you have a 50-page site and
you learn that the text in all your paragraphs is too small or
the wrong color to maximize sales: instead of having to edit 50
pages and change the definition of each paragraph, you simply
edit the CSS file and you're done!
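
With the separate-file approach, each page references the
stylesheet with a single line in its head section ("styles.css"
is a hypothetical filename):

    <link rel="stylesheet" type="text/css" href="styles.css" />

Change styles.css once, and all 50 pages pick up the new look.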

But with CSS, what do you do if you want one paragraph or a
set of paragraphs to look different? You define a class. If
you have a right column where you display ads, in your CSS you
would define a class and give it a name such as ".rcol", then
define the necessary items that you want to look different
(p tags, for example). ".rcol p" would be used to control how
a paragraph tag inside that column is rendered. You simply add
class="rcol" to the paragraph tag, or to the table tag if it's
in a table, or to the div tag if it's in a div, etc.

This is also where the cascading in CSS comes into play:
the default definitions cascade down into a class as long
as the class does not contain something that overrides the
default. This means that in our example text rendered in a
paragraph tag looks different for the rcol class, but
because that's the only thing we've defined for rcol,
everything else would use the same styles as the rest of
the page.
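
A hedged sketch of that right-column example (the 10-point size
is invented; only the font size is overridden, so everything
else cascades down from the page defaults):

    p       { color: black; font-size: 12pt; }  /* page-wide default */
    .rcol p { font-size: 10pt; }                /* rcol overrides size only */

    <div class="rcol">
      <p>This ad paragraph renders at 10 points but is still
      black, because the color cascades down from the default.</p>
    </div>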

You can also define size and positioning for objects in CSS.
This is one place where we hit the real world of CSS pretty
hard: not all browsers support the size and position commands
the same way. This leads to hacks that exploit browser quirks.
For example, you can write a rule that causes Internet Explorer
to bail out of the CSS, then follow it with a position command
that, say, Netscape understands. Because CSS uses the last
definition of an object, this technique can be used to "trick"
or "hack" CSS into working across more browsers than it
normally would.
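
As a hedged illustration of that last-definition-wins behaviour
(the "star-html" filter was one well-known hack of this era;
the selector name and values are invented):

    #menu { position: fixed; }            /* standards browsers use this */
    * html #menu { position: absolute; }  /* only IE 6 and below match
                                             "* html", so for IE alone the
                                             later definition wins */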

I don't recommend doing this. One reason is that
it's messy and easy to forget why you did something. The
other reason is because as browsers are updated, or new
devices come online, they may not follow these unwritten
and unsupported hacks and your pages are apt to be all
messed up. To get around this I usually use CSS as much as
I possibly can and then use tables and in-line definitions
to control positioning and size. Some people will go to
great lengths to use CSS for everything, even replacing all
tables, but here in the real world, you should get the
page built, functioning, and in a form that can be used
reliably on as many platforms as possible.

Not all web site software packages (Microsoft FrontPage,
Dreamweaver, Adobe GoLive, etc.) fully support CSS.
You'll have to do some coding manually. Don't worry, it's
not that hard. I have an online course that can teach you
how, just follow the link at the end of this article.

Take the time to learn CSS and implement it in your web
pages. It will be time well spent.

Future Evolution of Search

By Scott Van Achte, Senior SEO,
StepForth Search Engine Placement Inc. (http://www.stepforth.com) (c) 2006

The search engine world never rests. As online marketing
professionals discover new ways to obtain top rankings, the
algorithms evolve right alongside them. There are two primary
reasons for updating ranking algorithms: to increase the
quality and relevancy of the results, and to weed out the many
pages of search spam.

As the algorithms are updated, new ways to affect the results
are discovered, and the algorithms must then be adjusted again.
This is a cycle that has been around since the early days of
search, and one that won't be going away any time soon. A lot
has changed over the years, and the future is sure to also
deliver its plethora of surprises, but there are three main
factors that will always have some level of impact on your
search results.

SEO, Content and Links

Some people say that the world of search engine optimization is
over and that the entire basis behind successful rankings lies
in the power of incoming links. While incoming links do play a
significant role, and in most cases are a necessity, they are
far from the only determining factor.

There are many determining factors behind what will affect the
ranking of a site. The three largest contributing factors are
SEO, links, and site content. To compete in highly competitive
industries a site needs numerous on-topic pages of content,
relevant incoming links from a variety of sources, and solid
site optimization. While search is always changing, these three
factors will remain constant. Each may change in the level of
impact they have, but they will always contribute to the top
listings.

Site content and SEO go hand in hand. Content is very important,
but without the SEO to add focus, it can go unnoticed. Proper
keyword densities, link paths and keyword placement will always
play a role in having the content discovered and ranked by the
search engines. If the fundamental SEO aspects are not in place,
there is a strong chance that the content may never see the
light of day. Incoming links add focus and relevance for the
site overall, but if the content is not relevant to the desired
phrases the odds of obtaining a top ranking are very bleak.

Links play, and will continue to play a strong role in the
future of search rankings as they add that important vote of
confidence. When site A links to site B, that tells the search
engines site B is worth considering. Value is passed, based on
relevance and the overall authority of site A.

As more and more webmasters develop new linking schemes, the
algorithms responsible for displaying top sites have to
continually evolve to weed out the ever increasing amounts of
spam. While Google's current algorithm relies heavily on
incoming links, especially for sites in highly competitive
markets, this algorithm will have to change and mutate over time
as the internet continues to evolve. If rankings were determined
100% by inbound links where would this leave us? Thousands, if
not millions, of valuable websites would go completely unnoticed.
We would also see many sites ranking that are not relevant to
the actual search term due to issues related to Google bombing.

Political opinions aside, the single word "failure" does not
accurately represent the George Bush bio page; however, it
continues to rank #1 in Google. This was made possible by the
anchor text used in links posted by thousands of bloggers and
webmasters. If links were solely responsible for rankings, we
would see a lot more examples of Google Bombing as the actual
number of links required to 'bomb' would decline.

Where is Search Going?

For us to know the exact future of search we will have to wait
and see what happens, but some things are certain to grow in
popularity.

The future will undoubtedly see more advances in localized
search, serving results relevant to the locality of the
searcher. Is this the best way? Only time will tell, but even if
this is the future, we will still see SEO, links & content
dictating the results. The SEO and content will have to be in
part geared towards local information such as zip codes, city
names, etc, but they still will be important contributors.

Links will undoubtedly contribute to rankings long into the
future, but quite possibly with a reduced role as SEO
fundamentals make a comeback. For one example, take a look at
MSN Live Search. As reported by Ross Dunn in the SEO Blog
(http://news.stepforth.com/blog/2006/11/msn-algorithm-update-nov-3rd-2006.php)
just this past weekend, an algorithm update has placed
increased value on fundamentals such as title tags and domain
names. These two areas were once incredibly powerful tools for
obtaining rankings but had declined in value. Now, at least in
MSN, they are gaining ground once again.

Still in its infant stages, Mobile Search is growing as more and
more people turn to their cell phones and other mobile devices
for search. Mobile search will likely be most beneficial for
localized searching: people looking for an address, a weather
report, a local business, entertainment information, and so on.
As time goes on, the number of people using Mobile Search will
continue to grow, and optimized sites will be the ones found by
these
searchers. A whole new level of optimizing mobile websites will
likely emerge.

In 10 years' time search will certainly look very different.
While it has become a staple in the lives of millions, in the
big scheme of things the internet is still very young and search
even younger.

Why SEO will always be important

SEO will always play an important role in having sites found in
the search engines. Regardless of how search algorithms evolve,
they will always require a level of on-site content in order to
correctly rank websites. As long as this content is considered,
proper keyword placement and frequencies will play a role.

SEO in itself will continue to change. The proper frequencies of
keyword placement, linking techniques and URL structure may
alter, but will always have an impact.

As we move into the future and search engine algorithms
continue to evolve, SEO will always play an important role in
helping your websites obtain top rankings. While the small
things will always change, it is important to have the basic
fundamentals in place; doing so will help sustain consistent
rankings into the future.

Thursday, December 21, 2006

The Life and Near Death of DMOZ

By Jim Hedger (c) 2006

The casket was all but closed on the venerable Open Directory
Project (ODP, or dmoz.org (http://www.dmoz.org/) ). A December
16 blog post by ODP founder Rich Skrenta, "DMOZ had 9 lives.
Used up yet?"
(http://www.skrenta.com/2006/12/dmoz_had_9_lives_used_up_yet.html),
suggested that the directory at DMOZ is now, like Marley's
ghost, dead as a doornail. DMOZ was down and, for over a month
and a half, it looked like it was down for the count.

In reality, DMOZ is not dead, though the rumours of its demise
were not exactly exaggerated either. Because this six-week
unscheduled outage followed several years of consumer
dissatisfaction, lagging editorial energy, and layoffs at AOL,
many made the logical assumption that the plug had been pulled.

While the website still functions as a searchable directory, its
editing functions have only just been restored after six weeks
of downtime. Since the last week in October, editors and
submitters have been greeted by versions of a customized DMOZ
404 page (http://www.dmoz.org/unavailable.html). DMOZ was
basically a dead directory referencing over 4 million websites
spanning nearly 600,000 categories. Though editing has been
restored, it is still not possible to submit new sites.

Even if webmasters could submit new sites, chances are they
would not receive timely editorial attention. For the last few
years, webmasters have complained about the now legendary
backlog of sites awaiting review and inclusion. It can take
months or even years for spelling mistakes to be corrected and
an enormous number of the 590,000 categories that make up the
directory do not even have editors. Though many webmasters
consider the Open Directory useless because of that backlog, it
still carries a great deal of weight in the search sector.

The greatest success of the Open Directory Project stems from
the free database access offered to any other search entity. The
majority of search engines and directories use the ODP's
RDF-esque data-dump (http://esw.w3.org/topic/DmozRdf) to help
populate their databases. As every ODP listing is human edited,
Google and other search engines have tended to treat ODP-listed
sites as trustworthy. With dmoz.org carrying a PageRank of 8,
links from the ODP continue to be considered Google gold by
SEOs.
Other search engines receiving results from the DMOZ directory
include Ask, Yahoo and AOL. Clearly the ODP remains an important
entity in the search space.

It has certainly earned its status as an important entity. The
Open Directory Project has a long history that dates back to
1998. Since the day it went online as GnuHoo in June 1998 it has
played a crucial, defining role in the evolution of the search
sector and of the Internet.

GnuHoo appeared on June 5, 1998 in response to the rapid growth
of the web. The number of new sites coming online in 1998 far
exceeded the capacity of Yahoo's editorial staff, which was
rumoured to number fewer than 200. GnuHoo co-founders Richard
Skrenta and Bob Truel believed they could create a better
directory using an unlimited supply of volunteer editors than
Yahoo could with its limited team of professional editors.

They were right. NewHoo grew faster than Yahoo did in the last
half of 1998. Less than a year after it went online, the
all-volunteer project had acquired 8,000 editors and over 430,000
websites. By then it had undergone two name changes and had been
acquired by one of the largest emerging online entities.

Within days of going online, GnuHoo had attracted enough
attention to force a rapid succession of name changes. First,
the Free Software Foundation objected to the use of the term
GNU after a Slashdot article incorrectly connected the two
projects. GnuHoo was thus renamed NewHoo. A few days later,
Yahoo raised issues about the use of the suffix "Hoo". At the
same time, Netscape Communications Corporation opened a
dialogue with Skrenta about acquiring the upstart directory
project as a major asset in its competitive phase with
Microsoft.

Promising to respect the founders' original intentions to keep
the site a non-commercial entity, Netscape acquired the
directory for $1 million in October 1998 and renamed it the Open
Directory Project. ODP data was released freely under the Open
Directory Licence. A month after Netscape bought ODP, America
Online (AOL) purchased Netscape. AOL agreed to honour the Open
Directory Licence, formalizing it in a Social Contract
(http://dmoz.org/socialcontract.html) with the web community.
This marks the real start of the ODP's rise. By early spring
1999, most of the major search engines were pulling data from
the ODP.

1998 and 1999 was a special time in the history of the Internet.
Billions of dollars were invested as eager speculators and
venture capitalists moved to cash in on the promise of instant
riches. Start-up companies with no functional business plans
became multi-million dollar concerns overnight. The first
generation of instant online millionaires was spawned and talk
of breaking the traditional business cycle was taken seriously.
The bottom was about to fall out of what had become a
stratospheric marketplace but at the time, very few saw the
danger through the haze of the hype. When the sky fell, it fell
hard. In a tangential way, the ODP was directly involved. Though
it is technically a non-profit society, ownership of the ODP is
considered a business asset.

The trigger event that led to the crash of 2000 was the most
significant deal in the history of global publishing. In January
2000, less than a year after it had acquired Netscape and DMOZ,
AOL purchased the Time Warner media empire for approximately
$160 billion in an all-stock deal. The excess of that deal, one
in which an upstart tech firm absorbed the largest brick and
mortar information and entertainment business in the world, made
a number of analysts look at the silliness of it all. Within
three months, the shares AOL used to buy Time Warner would be
worth a fraction of their value when the deal was struck.

The Tech-crash of 2000 had a cascading effect across the web.
Most, if not all, of those new businesses without business
plans were quickly put out of business as their value declined
and no new sources of investment were forthcoming.
Online properties supported by shareholders, such as Yahoo and
AOL/Time Warner, were in sudden, desperate trouble. Eighteen
months of tech-sector doldrums set in as the investment world
started
looking for a revenue source that could sustain the staggering
costs of the sector.

A new search engine appeared on the scene around this time. It
had a funny name and appeared to disregard the dominant portal
or directory structure favoured by most search engines. Hidden
behind its sparse front page and childish logo was a
revolutionary way of producing what everyone agreed at the time
were extraordinarily accurate search results. The age of Google
began in late 2000. A year later, the power of viral marketing
had propelled Google into the big leagues, making it a serious
challenger to AltaVista, Lycos and Yahoo.

Google populated itself in part by using DMOZ data. In its
earliest years, Google used DMOZ as its directory, displaying
virtually mirrored results. Google's unique method of judging
page content by the number and value of incoming links made a
listing at the Open Directory critically important for SEOs and
webmasters. As Google's popularity and reach grew, the value of
a DMOZ link grew. Because ODP listings are human reviewed,
Google has traditionally tended to trust them, thus producing
stronger placements faster. From 2001 into 2005, Google was
responsible for over 80% of all organic search listings, either
directly or through feeding competitors such as Yahoo and MSN.

When Google figured out how to make oodles of money from the
paid-advertising system Overture was using, all hell broke
loose again and we rapidly advanced to where we are today.

When Google became the most important search engine, search
marketers began targeting the Open Directory with site
submissions, often with several sites for the same company. As
one ODP editor put it, "We never asked to be used by Google like
this." As the decade progressed, new methods of creating web
documents (html editors, CMS, blogs, etc...) spurred another
period of extraordinary growth that far surpassed the ability of
DMOZ editors to keep up.

A classic dilemma existed. A link from DMOZ could mean the
difference between weeks and months waiting for a good placement
at Google. The ODP was never supposed to have such influence.
The relationship Google's algorithm created between itself, the
Open Directory and webmasters wanting a DMOZ listing ended up
threatening the open editorial policies originally envisioned.
It was difficult to enlist new editors when many applicants were
primarily motivated by the ability to insert their own sites.

Though it boasts almost 75,000 editors, the directory contains
over 590,000 categories and sub-categories. The Open Directory
is enormous and continues to be driven by volunteers. One of
two things happens: either its volunteer editors deal with an
average of eight categories each, or some categories go
unedited. The latter tends to happen more often than the
former, and the public and search engines are left with a
less-than-complete directory to draw from. Such has been the
case for the past two or three years.

In their defence, the ODP editorial staff would suggest that
the majority of sites they continue to see are junk advertisement
pages designed for SEO or PPC purposes. Similar comments appear
in any number of threads found at the Open Directory Resource
Zone (http://www.resource-zone.com/forum/), a public chat forum
designed to promote communication between editors and users.

With a massive backlog of unreviewed submissions and a huge
demand from search marketers hungry for the rankings boost
expected from a DMOZ listing, many felt the ODP was becoming an
elite, secretive society. Editorial applicants reported their
requests were going unanswered and allegations of corruption
(http://www.sitepronews.com/archives/2005/may/30.html) amongst
rogue editors emerged. By the end of 2005, the ODP appeared to
be in
total disarray with more sites in the review process than were
actually in the directory. Throughout 2006, the ODP has become
less and less relevant to the search marketing community until,
towards the end of the year, it was gone.

Most of the directory appears to be functioning again, though
it is likely a version cobbled together using data from the
last RDF file. When the server at AOL crashed, it took most of
the current directory and all of its records with it. A number
of meta editors have spent the past six weeks rebuilding the
directory with the help of a few friendly AOL techs. The
"submit a site" feature is, as of this writing, not
functioning.

Outwardly, the importance of the Open Directory was obvious, but
the greatest contributions to the Internet from the Open
Directory team come from the people involved with the movement
and the open-source philosophy that has descended from them.

When Netscape absorbed it, the Open Directory Project became
part of an amazingly influential environment. Founded by the
legendary Marc Andreessen, Netscape was already part of the
open-source movement. Netscape launched the Mozilla project in
January 1998, nearly a year before it acquired DMOZ. That
project later gave rise to the Mozilla Foundation, which
introduced and marketed the Firefox browser.

The ODP was arguably the first successful long-term project
that could fall under the general heading of Web 2.0. Its
philosophy set the stage for Wikipedia and other
community-based websites.
Unlike other collaborative projects that predate it, the ODP was
a truly grassroots endeavour. Participants didn't need to be
extraordinary technicians; they just had to be able to
understand the editing techniques used by their community.

Though rumours of its death are obviously exaggerated,
complaints about its demise are not. The ODP is a wonderful
entity, but the power it inadvertently exerts is far greater than
its ability to edit itself. Many have suggested the ODP should
shut its doors for good, but perhaps this downtime has given
its meta-editorial collective a chance to consider its role in
the search community.
================================================================
Search marketing expert Jim Hedger is one of the most prolific
writers in the search sector with articles appearing in numerous
search related websites and newsletters, including SiteProNews,
Search Engine Journal, ISEDB.com, and Search Engine Guide.

He is currently Executive Editor for the Jayde Online news sources
SEO-News (http://www.seo-news.com) and SiteProNews
(http://www.sitepronews.com). You can also find additional tips
and news on webmaster and SEO topics by Jim at the SiteProNews
blog (http://blog.sitepronews.com/).
================================================================


Copyright © 2006 Jayde Online, Inc. All Rights Reserved.

SEO-News is a registered service mark of Jayde Online, Inc.

Wednesday, December 20, 2006

Anatomy Of A Web-Advertising Campaign

By Jerry Bader (c) 2006

In The Beginning There Was Marketing

Anyone in business who has any interest in using the Web to
further his or her business is well aware of "search engine
optimization." Not a day goes by that my email in-box isn't
loaded with information on how to get the best search engine
results, and not a week goes by that a client or potential
client doesn't request that their website be not just search
friendly but search engine fanatical.

For some time we have been preaching the importance of
delivering the marketing message, and warning that it should
not be corrupted or distorted by techniques aimed at attracting
search engine robots while driving away the real people who may
actually be potential customers.

Now I realize that in many circles this attitude is considered
outright heresy, but hopefully there are a few marketing types
around that understand websites have to deliver more than
miscellaneous random eyeballs; websites have to deliver a
message that is memorable, understandable, useable and, if
you're really good at your job, capable of being incorporated
into your audience's belief system.

With that in mind, we were pleasantly surprised when Google,
the primary target of this obsessive-compulsive frenzy of SEO
sleight-of-hand, announced that it was instituting Google Video
Ads and, to add a little icing to the cake, purchased YouTube,
adding to its already considerable investment in Google Video.
Somebody at the big "G" thinks video is a viable Web-medium,
even if the purveyors of search engine fool's gold would have
you believe otherwise.

The list of companies, including Forbes, Amazon, Wyeth, and
Ford, delivering Web-audio and Web-video grows daily and we are
not just talking about major corporations. Small companies are
using multimedia to get the edge on their larger competitors
who still have their heads buried in the search engine
optimization sand.

Acknowledging All The Issues

In developing our campaign to promote the use of Web-audio and
Web-video as an effective method of delivering marketing
messages over the Web, we identified four key issues that would
have to be addressed:

(a) We had to demonstrate that website design was about
delivering the marketing message and not just search engine
optimization.

(b) We had to demonstrate that even small and medium-sized
companies could afford professional Web-audio and Web-video and
that it wasn't cost prohibitive.

(c) We had to demonstrate that professional Web-audio and
Web-video required more than just the ability to use a video
camera and that professional multimedia story-telling required
a unique set of creative skills and technical ability not often
found in-house in most businesses.

(d) We had to demonstrate that the development and production
of creative multimedia marketing and professional webmedia
content had to do with talent and experience, not size.

These were the challenges that informed all our subsequent
decisions.

The Concept

In order to make people pay attention to what we had to say we
needed a concept that was both familiar and edgy. Sure we were
sticking a finger in the eye of all the search engine
optimizers, but you can't be afraid to make a strong statement
if you want people to sit up and take notice, especially if you
are fighting a tidal wave of misconception.

The fact that we were telling people that delivering your
marketing message on the Web using multimedia was more
important than search engine optimization was enough to make
what we were doing controversial, but we also needed a vehicle
that allowed us to present the opposing point of view. What we
needed was a recognizable style that demonstrated our ability
to deliver a memorable, comprehensible, useable,
belief-altering message in the medium we were promoting.

Since we primarily use Macintosh computers for all our work and
only use PCs to check for compatibility, we thought we would pay
homage to the brilliant Mac commercials running on television.
The format worked for us because it allowed us to create two
characters of our own that would present opposing points of
view over a series of videos that would comprise the campaign.
We knew that some people would react unfavorably to our using
such a familiar format but we figured it would demonstrate how
even small but talented production companies can deliver high
quality multimedia Web-based marketing on tight budgets.

A Market Primed and Ready

Our efforts in advocating the power of using the human voice
and image to deliver marketing stories over the Web were
finally getting through to companies that were fed up with the
cost and ineffectiveness of continually chasing the holy grail
of search engine optimization. Company presidents and marketing
managers
were starting to listen, starting to realize there was another
way. This campaign was aimed at pushing these business
executives to act on what they already knew: good marketing is
about delivering the message, not keyword density.

Preproduction, Production, and Post Production

We wanted to make sure we had a distinctive sound by composing
our own signature theme music and creating our own cast of
characters with a distinctive message promoting the concept of
multimedia. In fact, these planned web-commercials really don't
sell anything; all they do is make people aware that search
engine optimization is not the only thing they should be
thinking about when developing a website or webmedia campaign.
In short, the medium was the message.

The use of Web-audio and Web-video is the best way to implement
this kind of marketing presentation. We sat down and started
to write and, before we knew it, we had eighteen scripts, each
featuring a different issue in the search engine optimization
versus multimedia controversy.

The next step was finding the right actors to play the part.
Whereas Web-audio allows us to draw upon a vast number of
voice talents across North America, video is much more
limiting, especially if we wanted to keep the cost down to a
reasonable amount. Even if we were prepared to blow the budget
on actors, we knew our clients wouldn't, so it was important to
demonstrate that we could cast the job at a sensible cost.
The casting proved to be an interesting exercise in frustration
and humor. We had all types of applicants ranging from the
sublime to the ridiculous to the outright bizarre, but
ultimately we were able to find two fine young actors who
understood exactly what we were doing and who took to the parts
as if they were written specifically for them.

One of our greatest assets as a firm is that we do everything
from concept to implementation, including writing, videography,
editing, and graphic, motion, and website design; but if you
want to produce a campaign at a sensible price, you still have
to be careful you don't write overly elaborate scripts that
require multiple sets, locations or hard-to-acquire props. That
said, we
still had to find a cute dog we could trust on set, links of
various kinds of sausages, a hard to put together toy, and best
of all a real straightjacket from an interesting website that
specialized in rather strange items of clothing.

The shoot itself went extremely smoothly and we ended up
shooting all eighteen videos in less than two days. We assumed
some of the videos that looked good on paper just wouldn't
translate to the screen, but to our surprise every one of the
scripts worked. We knew what we wanted to say and weren't
afraid to say it, even though we were flying in the face of
conventional wisdom.

While Josh Bader, our Director of Photography, was digitizing,
color-correcting and editing the raw footage, Simon Bader, our
Director of Audio, composed a number of theme music compositions
to choose from for our signature sound. Once all the pieces were
put together into a series of finished videos, we were ready to
implement the campaign.

Implementation

The first set of six videos was uploaded to Google Video and
YouTube as well as onto a webpage
(http://www.mrpwebmedia.com/ads/) that was created to house the
full campaign of eventually eighteen videos, each presenting a
different issue in the search engine optimization versus
multimedia controversy. Versions of the videos were also used
to create a Google Video Ad campaign.

Credits

Produced by MRPwebmedia
Executive Producer: Jerry Bader
Written By: Jerry and Josh Bader
Director of Photography and Visual Design: Josh Bader
Director of Audio and Music Composer: Simon Bader
SEO Guy: Sean Kaufmann
Multimedia Guy: Erez Bowers

Tuesday, December 19, 2006

Web Development with SEO in Mind

By Adam McFarland of iPrioritize
(http://www.iprioritize.com/) (c) 2006

When a business owner decides to bring their business to the
web, generally the last thing that they think about is search
engine optimization. They assume that whomever they hire to do
their web design will put up a site and then submit it to the
search engines and the traffic will magically pour in.
Unfortunately it takes more than that to drive search engine
traffic to your site and, even more unfortunately, most
developers don't program with SEO in mind, nor do they educate
the client about the process involved in gaining traffic from
search engines.

Whether it's carelessness, a lack of knowledge, or a
combination of the two, this often leads to a client who,
several months down the road, doesn't understand why their site
doesn't get any traffic and isn't helping their business. A
good designer will not only program with SEO in mind, but will
also educate the client about the basic principles of SEO,
whether they are the one who executes it or not.

Many times the clients I inherit have gone through this scenario
and then face drastic on-site changes to get their site search
engine friendly before we are even able to begin the arduous
process of link building. Whether you are designing a site for
yourself or for a client, following the simple steps below when
programming will ultimately save the business time and money and
result in a search engine friendly site that truly maximizes the
online potential of the business.

Use proper tags for headings, bold text, italic text, and lists
– HTML has heading tags, bold tags, italic tags, and ordered and
unordered lists for a reason, and you should use them. Using
CSS you can style them practically however you like, but
actually using a heading tag for your headings, and bold tags
for important text, will help search engines understand which
text on a page is a heading or more important than the
surrounding text. Simply applying a CSS style that makes text
larger or bold doesn't do that.
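
A minimal sketch of the difference (the style values are
hypothetical):

    <style type="text/css">
      h1 { font-size: 18px; color: #336699; }  /* style the real heading freely */
    </style>

    <!-- A real heading: a spider knows this text matters -->
    <h1>Blue Widgets</h1>

    <!-- Looks the same to visitors, but is ordinary text to a spider -->
    <div style="font-size: 18px; color: #336699;">Blue Widgets</div>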

Optimize your images – search engine spiders can't read text
within an image. Adding ALT text to your image tag helps, but
ideally you should remove all wording from the image and style
it using CSS, adding the remaining portion of the image as a
background image to the text. Here is a side-by-side comparison
(http://www.seo-playbook.com/image_example.php) of two images
that look the same in your browser, but much different to a
search engine spider.
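
A hedged sketch of the idea (the filenames and dimensions are
invented):

    <!-- Before: the headline is locked inside the image -->
    <img src="header.jpg" alt="Acme Custom Guitars" />

    <!-- After: real text a spider can read, with the artwork behind it -->
    <div style="background-image: url(header-bg.jpg); width: 600px; height: 80px;">
      <h1>Acme Custom Guitars</h1>
    </div>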

Avoid canonical problems – believe it or not, search engines can
see http://yoursite.com, http://www.yoursite.com, and
http://www.yoursite.com/index.html as three different pages. A
simple solution is to use a 301 redirect
(http://www.webconfs.com/how-to-redirect-a-webpage.php) to point
all of your pages to their "www" counterpart. You can also select
the preferred domain that Google shows in the new Google
Webmaster Tools (http://www.google.com/webmasters/) console.
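
The linked article covers several redirect methods; as one
hedged sketch in PHP (the domain is a placeholder), a site
could bounce non-"www" requests to their "www" counterpart like
this:

    <?php
    // If the request arrived without the "www", issue a permanent redirect
    if (strtolower($_SERVER['HTTP_HOST']) == 'yoursite.com') {
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: http://www.yoursite.com' . $_SERVER['REQUEST_URI']);
        exit();
    }
    ?>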

Get rid of Session IDs if you have a PHP site – have you ever
seen a PHPSESSID variable added to the end of a URL on a PHP
page (it looks something like PHPSESSID=34908908)? This happens
because PHP will add a unique PHPSESSID to URLs within your site
if cookies aren't available. This can be extremely problematic
for your site's search engine ranking. Google and Yahoo will see
a unique PHPSESSID in the URL every time they visit a page on
your site, and in turn think that said page is a different page
each time. At worst, this could be viewed as duplicate content
and get your site banned, and at best it will reduce the
perceived value of each page. One solution that I've used
successfully is to utilize url_rewriter.tags
(http://www.php.net/session).
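
A hedged sketch of that fix (url_rewriter.tags is the PHP
setting named above; emptying it leaves PHP's URL rewriter no
tags to append PHPSESSID to):

    <?php
    // Stop PHP from appending PHPSESSID to links when cookies are off.
    // The same setting can also be placed in php.ini.
    ini_set('url_rewriter.tags', '');
    session_start();
    ?>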

Put CSS and JavaScript in external files – nearly every site
nowadays uses CSS and JavaScript for something. While both are
great for enhancing user experience, neither will help your
search engine ranking if left on your page. One of the factors
that search engines consider when ranking your site is the
percentage of code relevant to the search term. CSS and
JavaScript can take up hundreds of lines of code, minimizing the
importance of your text and in turn hurting your ranking. By
putting them in separate files and simply including them in your
page by reference, you can reduce hundreds of lines down to one
and increase the amount of code in the file that is relevant
content.
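
In practice, the head of each page then carries two short
references instead of hundreds of lines of code (the filenames
are hypothetical):

    <head>
      <link rel="stylesheet" type="text/css" href="/css/site.css" />
      <script type="text/javascript" src="/js/site.js"></script>
    </head>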

Minimize the use of tables in layouts – the debate about whether
or not tables should be used in site design has been going on
for years and there's no end in sight. I fall somewhere in the
middle – there are certain circumstances (like organizing
tabular data) where I think tables still make the most sense,
but I also appreciate the SEO benefits of using CSS layouts.
CSS layouts drastically reduce the amount of code in your site
that isn't content that the user sees. Just like moving CSS and
JavaScript to an external file, the less on-page code that isn't
content, the better. Check out search engine friendly layouts
(http://www.searchenginefriendlylayouts.com/) for some free
example layouts.
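
As a hedged sketch of how little markup a simple CSS layout
needs (the ids and widths are invented; the linked site has
fuller examples):

    <style type="text/css">
      #sidebar { float: right; width: 190px; }  /* right column, no table required */
      #content { margin-right: 200px; }         /* main column flows beside it */
    </style>

    <div id="sidebar">Ads and navigation</div>
    <div id="content">The content users and spiders actually care about</div>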

Validate your site – a site doesn't have to be perfectly coded
to rank high in the search engines (there are many, many other
factors), but valid HTML will help ensure that search engines
and browsers alike will accurately see your page. Try using the
official W3C Validator (http://validator.w3.org/) or install
this handy Firefox extension (https://addons.mozilla.org/firefox/249/). Validating generally identifies areas of code
that are redundant, unnecessary, or not accepted across all
browsers. All of which will help make your site more search
engine friendly.
================================================================
Adam McFarland owns iPrioritize - simple to-do lists
(http://www.iprioritize.com/) that can be edited at any time
from any place in the world. He also provides SEO consulting
(http://www.seo-playbook.com/) for small businesses looking for
a cost-effective way to drive more traffic to their site and
convert more visitors into customers.