
SEARCH ENGINE OPTIMIZATION for PR

Search is the dominant force on the web, and content that ranks highly in a Google search will de facto have more hits, more impact and more value. There are currently six searches conducted via Google for every two on Yahoo! and one on Microsoft Live Search.
PR practitioners must therefore take account of this and consider how they use digital PR to support good search rankings. PR activity creates content that is increasingly relevant to the way that search engines work.
A typical Internet search has three components. The first is a huge database that contains every word from every page of every website in the world. This database can be searched and matched very quickly against any word(s) that you enter into a search engine.
The second component of a search engine is its spiders, robots or bots. These terms are metaphors for automated computer programs that go out onto the Internet, find a website, and crawl from page to page, indexing and cataloguing each page's content. The search engine's computer simply opens a home page, captures the content, moves on to the next page, and does the same thing there.
The third part of the process is the search interface, which is what you see when you go to Google or Yahoo! to enter your query and view the results.
How search engine optimization evolved
As AltaVista, Yahoo!, Lycos and other early search engines began to rise to pre-eminence, everybody involved in the Internet became subject to the power and influence of search. Around the middle of the 1990s, the idea of search engine optimization (SEO) was born. In the early days, web designers submitted pages or links to all of the search engines, which in turn would send a web crawler to the site to collect information that would then be indexed. The web crawler, or spider, would extract keywords that provided the basis for future searches. Very quickly, site owners and administrators started to work out ways of getting their sites placed in searches, ideally ranked as highly as possible. By the end of the 1990s this process had become known as SEO.
Early searches relied on the information provided by the website itself in the form of tags or keywords, and content providers could manipulate these tags in an attempt to rank highly in searches. Search engines therefore had to improve the way they found information, or searching would become increasingly unreliable. Two students at Stanford University, Larry Page and Sergey Brin, created a search technique based on mathematical algorithms that measured links from one website to another. This was the basis for Google, the search engine they launched in 1998.
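Their link-counting idea became the PageRank algorithm. In the simplified form published in their 1998 paper, the rank of a page A is assembled from the ranks of the pages T_1, ..., T_n that link to it:

    PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \dots + \frac{PR(T_n)}{C(T_n)} \right)

where C(T_i) is the number of outbound links on page T_i and d is a damping factor, usually set around 0.85. In effect, each link is a weighted vote: a link from a highly ranked page with few outbound links counts for far more than a link from an obscure or link-cluttered one.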
The search engine optimization business soon found ways to manipulate this new form of search. In short, practitioners created spurious links to the sites they wanted to promote, using devices such as link farms: thousands of websites whose only function was to provide links to the original site and so improve its page rank. SEO became an important element in any digital marketing campaign, expanding to examine the actual functioning of the search algorithms and to review the code, presentation and structure of websites in order to improve their ranking. All of the leading search engines (Google, Microsoft Search and Yahoo!) use web crawlers to find pages automatically.
High page rankings can have huge commercial value, and there therefore continues to be a tension between the SEO business and the optimal functioning of search engines. Techniques are regarded either as good design that search engines approve of, or as "black hat": attempts to trick search engines into providing a higher rank than a site actually merits, for example by using text that is hidden from human eyes but visible to a search engine. Sites are usually banned, temporarily or permanently, once the search engines discover black hat techniques. Some major international companies, such as BMW, have been accused of using such techniques and have been subject to temporary bans by Google. To deal with this, the major search engine organizations have not only kept confidential exactly how their searches work, but have also changed their processes on a regular basis.



Universal search

In May 2007, Google introduced a radical change to the system with the concept of universal search, which blends listings from its news, video, image, local and book search engines with those gathered from web crawlers. One of the changes is the use of vertical search: if the search is about sport, the engine searches sports sites; if it is about a medical issue, medical sites are promoted.
News, for example, works differently in universal search: the results are blended in with the traditional search results, which greatly elevates the importance of news stories in the search rankings for any commercial organization. Images and video are now also included in the main search.
Before the introduction of universal search, Google gave a list of ten web results, and items such as news headlines did not appear (unless you used the separate, still available, News search) because of the way the web search algorithms worked. Universal search changed this by running a simultaneous news search and blending in the results.

Social search
An article that appeared in Popular Mechanics magazine in April 2008 began with the words "Search is dead". The argument the article posed was that the huge escalation in social networks would eventually make algorithm-based search engines redundant. This seems a bold claim when Google has become arguably the world's most powerful brand.
The core of the argument is that as sites like Facebook, Twitter and LinkedIn grow, web users will find what they want through their social network rather than through search, because they trust the people in their network; indeed, people in general may know the answer you want better than a mathematical equation does. This is clearly starting to happen with micro-blogging.
Social networks or online communities are often built
or reinforced around the notion of shared interests.
We create an enormous amount of data when we participate in social networks. The principles of the social
web have crossed over into algorithmic search.
Launched in mid-2008, Scour is a social search engine that combines the results of three popular search engines and allows users to vote on them. Scour takes results from Google, Yahoo! and Live Search, ranking them according to the recommendations of its users. It is also possible to customize Scour and filter by any combination of the three search engines. The user-generated aspect of Scour means that the user gives a search result a thumbs up if it is relevant or a thumbs down if it isn't. Votes and comments directly affect the search rankings. Scour also includes a rewards system in which users collect points for searching, commenting and voting, as well as for inviting friends to the site. By providing a platform for users to vote and comment on relevancy, searchers connect with one another, creating a true social search community.

PR and natural search

Natural search describes the process of searching that produces results based on their actual relevance, rather than because their ranking has been boosted by paid-for search engine optimization techniques. The changes to the way that Google and other search engines operate are highly relevant for the public relations industry.
In the first instance, universal search elevates the importance of news, and PR is about news. The principal function of the public relations press office is to support journalists and to provide editorial content for news stories. To verify the importance of this, all you need to do is enter some search terms, or the name of an organization that you are working for, and you will see news content ranking highly in Google's first page of results.
The other sources of content that are starting to rank highly in searches are the new web channels used in PR: blogs, wikis and podcasts. Social networks have also become important in terms of creating searchable and relevant content. LinkedIn, the business-orientated social network, is a very good example. If you or someone you know has a LinkedIn profile, try searching on a combination of the person's name and the name of the business where they work. You should see the LinkedIn profile ranking highly in that search. If you think about it, most of us perform this type of search on a regular basis, usually whenever we are considering entering into a business partnership with someone new.
Having considered the increasing importance of public relations-generated content, we should give some thought to how we deliver that content in a way that is itself optimized for search. The public relations industry needs to start adopting some of the techniques that the white hat search engine optimizers have been using for years. For example, the language that we use in our written output needs to take account of the terms our audiences are most likely to type into search engines. We will need to move away from the convoluted terms and phrases that some branding campaigns have favoured in the past, towards more straightforward, descriptive terminology that will raise our search rankings. This is important as the industry moves away from the old-fashioned press release in favour of the more web-relevant social media release, or SMR.

There are several techniques (tactics) of SEO. Some of them are legitimate (white hat), some are not (black hat) and some fall in between (grey hat).

Black hat SEO techniques

1. Hidden Content comes in many guises, but the basic principle is that the code of the site contains content stuffed with keywords which is not visible to the end user. One way of doing this is by using comment tags. Content can also be hidden from the end user by using CSS, excessively small text, or coloured text on a same-coloured background. All of these techniques are frowned upon by search engines and, if detected, can mean your website is penalised or even banned. To the untrained eye it can be very difficult to spot the use of some of these techniques.
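To make this concrete, here is a minimal sketch of what hidden content can look like in a page's source. The keywords and colours are invented for illustration; each line uses one of the hiding tricks described above.

    <!-- keyword stuffing inside a comment tag:
         cheap flights, cheap flights london, budget flights -->
    <p style="display:none">cheap flights cheap flights london budget flights</p>
    <!-- white text on a white background -->
    <p style="color:#fff; background-color:#fff">cheap flights budget flights</p>
    <!-- excessively small text -->
    <p style="font-size:1px">cheap flights london budget flights</p>

A human visitor sees none of this, but a crawler reading the raw markup indexes every word, which is exactly why search engines treat such pages as spam.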

2. Meta Keyword Stuffing. There are two meta tags that are generally used to inform search engines of the content of a page. They reside within the <head> element of a page, and when used incorrectly they can alert a search engine that a site is using spam techniques in an attempt to improve its ranking.

3. Meta Keywords should be a short list of words that indicate the main focus of the page. Meta keywords have been so misused in the past that there are few, if any, search engines that take any heed of them.
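As an illustration, the two <head> extracts below contrast the abuse with the intended use. The shop and its keywords are invented.

    <!-- stuffed: long, repetitive, partly irrelevant terms can flag the page as spam -->
    <meta name="keywords" content="bags, bags, bags, cheap, free, money, holidays, loans">

    <!-- intended use: a short list reflecting the actual focus of the page -->
    <meta name="keywords" content="handmade leather bags, leather satchels">
    <meta name="description" content="Handmade leather bags and satchels, made to order in London.">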
4. Doorway or Gateway Pages are pages designed for search engines and not for the end user. They are basically fake pages that are stuffed with content, highly optimised for one or two keywords, and linked to a target or landing page. The end user never sees these pages, because they are automatically redirected to the target page. Off-the-shelf SEO software often encourages the use of gateway pages, as do SEO firms that don't know what they're talking about. Search engine spiders are continually being enhanced to detect these pages, which will be ignored or, worse still, will flag your site as spam and get you banned altogether.
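A doorway page can be as simple as a keyword-stuffed page carrying an instant redirect. The sketch below uses a meta refresh; the business and URL are invented.

    <head>
      <title>emergency plumber london | cheap plumber london</title>
      <!-- zero-second refresh: the visitor is sent straight to the landing page -->
      <meta http-equiv="refresh" content="0; url=https://www.example.com/">
    </head>
    <body>
      emergency plumber london cheap plumber london 24 hour plumber london
    </body>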
5. Link Farming. In the real world, if you were to build your house in a bad neighbourhood, your house would be affected by its surroundings. The same is true of the virtual world. Link farms, or free-for-all (FFA) pages, have no other purpose than to list links to unrelated websites. They won't provide you with any traffic, and you run the risk of having your site banned for participating. Don't participate in link farming.

White hat SEO techniques

1. Quality Content. When we first started looking at SEO as an entity separate from website build, there was one phrase we would continually hear: "content is King", and it's true. There is nothing more valuable you can do to optimise your site for search engines than offer unique, well-written content. A search engine's aim is to serve up what it believes to be the most appropriate website for any given search.

2. Use Structural (Semantic) Mark-up and Separate Content from Presentation. Semantically structuring your mark-up helps search engines understand the content of your webpage, which is of course a good thing. Making proper use of heading elements is essential, because search engines give more weight to the content within heading elements. Using CSS to separate the design elements from the content makes for much leaner code and makes it easier for search engines to find what they're looking for, which is content. Remember: content is king!
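As a sketch of what this looks like in practice (the organization and stylesheet name are invented): the headings carry the weight, and all presentation is moved out to CSS.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <title>Press Office | Example PLC</title>
      <!-- design rules live in an external stylesheet, not in the markup -->
      <link rel="stylesheet" href="styles.css">
    </head>
    <body>
      <h1>Example PLC press office</h1>   <!-- one main heading per page -->
      <h2>Latest news releases</h2>       <!-- subheadings outline the content -->
      <p>Plain, descriptive paragraphs carry the substance of the page.</p>
    </body>
    </html>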

3. Titles and Meta Data. Providing pages with proper titles and meta data is essential. As discussed in the black hat SEO techniques section, the meta description and meta keywords elements have been so misused in the past that search engines now regard them as less important, but it is still important to use them, and to use them properly. Titles, however, still carry a lot of weight, and when we think of semantic mark-up it is obvious why: the title of anything is a declaration of what the content might be, so make sure your page titles are a true representation of the content of the page.
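For example, a well-formed head for an invented report page might look like this:

    <head>
      <!-- the title is a plain declaration of what the page contains -->
      <title>Annual Report 2008 | Example PLC</title>
      <meta name="description" content="Example PLC's 2008 annual report, including financial results and the chairman's statement.">
      <meta name="keywords" content="Example PLC annual report, 2008 financial results">
    </head>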
4. Keyword Research and Effective Keyword Use. Create your website with keywords and key phrases in mind. Research the keywords and key phrases you think people might use to find your site. Single words are not always the most effective target; try multi-word phrases that are much more specific to your product or service, and you will be targeting end users who are much more likely to want what you are offering. Use the keywords and key phrases you have identified effectively throughout your website. Assign each page two or three of the keywords you have identified, and use them throughout all the important elements of the page.

5. Quality Inbound Links. Inbound links to your website can be likened to votes, but there are good links and bad links, and therefore good votes and bad votes. Good links are links from web pages that are regarded highly by the search engines and are contextually relevant to the content of your page. Bad links are links from web pages that are not regarded highly by, or are potentially banned by, the search engines, and that have no relevance to the content of your page.
Grey hat SEO techniques

1. Cloaking. There are times when cloaking is considered a legitimate tactic by users and search engines alike. Basically, if there is a logical reason why you should be allowed to present different information to the search engines than to the visitor (if you have content behind a "members only" area, for example), you are relatively safe. Even so, this tactic is very risky, and it is recommended that you contact each search engine, present your reasoning, and allow them the opportunity to approve its use. Arguably, another example of a site legitimately using cloaking is a site that is mainly image-based, such as an art site. In this event, provided that the text used to represent the page accurately defines the page and the image(s) on it, this could be considered a legitimate use of cloaking. As cloaking has often been abused, other methods, such as adding visible text to the page, are recommended where possible. If there are no other alternatives, it is recommended that you contact the search engine before adopting this tactic and explain your argument.

2. Paid Links. The practice of purchasing links on websites solely for the increase in link popularity that they can bring has grown steadily over the last year or so, with link auction sites such as LinkAdage making the practice easier. When links are purchased as pure advertising, the practice is considered legitimate; purchasing links only for the increase in link popularity is considered an abuse, and efforts will be made either to discount the links or to penalize the site (usually the seller's, though not always). As a general rule, if you are purchasing links you should do so for the traffic that they will yield, and consider any increase in link popularity an added bonus.

3. Duplicate Content. Due primarily to the increase in popularity of affiliate programs, duplicate content on the web has become an increasingly significant problem for search engines and search engine users alike, with the same or similar sites dominating the top positions in the search engine results pages. To address this problem, many search engines have added filters that seek out pages with the same or very similar content and eliminate the duplicates. Even when duplicate content is not detected by the search engines, it is often reported by competitors and the site's rankings penalized. There are times when duplicate content is considered legitimate by both search engines and visitors, and that is on resource sites. A site that consists primarily of an index of articles on a specific subject will not be penalized for posting articles that occur elsewhere on the net, though the weight they are given as additional content will likely not be as high as that of a page of unique content. If you find competitors using these tactics, it is not unethical to report them to the search engines. You are helping yourself, the search engines, and the visitors by ensuring that only legitimate companies, providing real information and content, appear at the top of the search engine results.