Internet Marketing Glossary
Take a few minutes to review this glossary to get familiar with key terms and acronyms. (Just think how informed you’ll sound at the next meeting with your team!)
A 301 redirect tells Google that a web page has permanently moved. When you build a new website, it is important to 301 redirect all of the individual pages to the new page names, so that people looking for your website don’t get confused by “broken links” (links that don’t work) in the search results.
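For example, on an Apache web server (one common setup; the exact method depends on your host), a 301 redirect can be added to the site's .htaccess file with a single line like this, where the page names are just placeholders:

    Redirect 301 /old-services-page.html https://www.example.com/services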
A 404 error is when the server is unable to locate a website address (URL). Correct any pages on your website that produce a 404 error. If people link to pages that you have deleted, this will generate a 404 error and the link energy will be wasted.
The act of testing two different options of creative (such as a headline in an ad or a call to action) to see which one performs best.
“Above the fold” generally means the portion of the web page that is viewable before you have to scroll down to see the rest of the page.
Google AdSense allows publishers to place ads on their websites and share in the profits with Google.
Google AdWords is an advertising program using text ads and display ads where the advertiser only pays when someone actually “clicks” on the ad and is taken to the advertiser’s website.
Affiliate marketing allows merchants to expand their reach by paying a commission to publishers (such as blogs) willing to place advertisements on their websites that link to the merchant. It is not uncommon for an affiliate program to have over 10,000 publishers displaying your ads/banner ads, which cost you nothing unless the clicks convert into sales within a few days.
Asynchronous JavaScript and XML (AJAX) allows a web page to request additional data from a server without requiring a new page to load.
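As a minimal sketch of the idea (not production code; the file and element names are made up for illustration), a page might use JavaScript to ask the server for extra content without reloading:

    var request = new XMLHttpRequest();
    request.open("GET", "/latest-reviews.html");  // a file on your own server
    request.onload = function () {
      // Drop the returned snippet into the page without a full page load
      document.getElementById("reviews").innerHTML = request.responseText;
    };
    request.send();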
A set of parameters a search engine uses to match the keywords in a query with the content of each web page, so that the web pages found can be ordered suitably in the query results.
The “alt” attribute in HTML is code that displays alternative text when a user “mouses” over an image on a web page. It is especially helpful to visually impaired people. It also helps search engines understand what an image is about.
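For example, an image tag with an alt attribute looks like this (the file name and wording are placeholders):

    <img src="red-running-shoes.jpg" alt="Red Italian leather running shoes">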
The text that a user clicks on to follow a link to another web page.
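In HTML, the anchor text is the wording between the opening and closing link tags (the URL and wording below are placeholders):

    <a href="https://www.example.com/running-shoes">Italian leather running shoes</a>

Here, “Italian leather running shoes” is the anchor text.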
An Application Program Interface (API) allows developers to access software functions. Most major search and social software products have an API program.
Active Server Pages (ASP) is a Microsoft technology for creating dynamic web pages on the server.
Black hat search engine optimization is the practice of doing just about anything—including things that are illegal—to achieve top rankings on the search engine results pages.
Bookmarking allows you to make note of something you found online and save the URL so you can return to it later.
A copy of a web page that is stored by a search engine. Search engines store website data for analysis at a later date, as opposed to analyzing it on the fly. You can enter the entire query (including the word “cache”) “cache:www.yourdomain.com” in a search engine or browser address bar to see when your site was last stored by Google.
A graphic or text block that directs visitors to act in some specific manner, such as requesting a brochure, filling out a form, or making a purchase.
The canonical version of a URL (website address) is the most authoritative version indexed by major search engines.
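One common way to tell search engines which version you consider canonical is a link tag in the page’s HTML header (the URL below is a placeholder):

    <link rel="canonical" href="https://www.example.com/running-shoes">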
Cascading Style Sheets is a method for adding visual styles such as fonts, colors, and formatting to web documents.
The click-through rate is the percentage of people who view an advertisement online and click on an embedded link in the ad to be taken to a particular web page.
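For example, an ad that is shown 1,000 times and receives 20 clicks has a click-through rate of 20 ÷ 1,000, or 2%.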
Cloaking is the practice of showing search engines different content than human searchers see at the same web address, in order to fool the search engines.
Content management systems like WordPress allow website owners to easily edit the content of their pages without needing help from a web developer.
A conversion is achieved when a goal is completed, such as when you turn a lead or prospect into a customer.
A group of tactics designed to increase the percentage of website visitors who convert into leads or sales.
The ratio of visitors that turn into a sale after taking an action on a website.
A small data file written to a user’s local machine to track their activity.
Cost per action is the amount it costs to get one lead or sale.
Cost per click is the amount the advertiser pays each time a user clicks on a paid ad and is pulled to the advertiser’s website.
Cost per thousand ad impressions is what it costs an advertiser to show 1,000 people an ad.
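For example, if you pay $500 to show an ad 100,000 times, your cost per thousand impressions is $500 ÷ 100, or a $5 CPM.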
Turning ad campaigns on or off (starting or stopping ad campaigns), changing an ad bid price, or changing other budget constraints based on the time of day you think your prospects will be most likely to buy.
A link that points to a page other than the home page within a website.
A categorized listing of websites. Most directories are compiled or built by having website owners submit listings.
A “dofollow” link is one that search engines follow and that passes link value.
Domain Name Servers are the servers used to “point” a domain name/host name to a specific IP address on the Internet.
Material on a web page that is the same (or almost the same) as material on another page. The search engines do not like copies of information.
Database-driven content (.asp, .cfm, .cgi, or .shtml, etc.), such as an online shopping cart where products are “stored,” or a site that draws on a database when building its pages.
This is a small graphic or illustration that appears next to URLs in a web browser address bar.
File Transfer Protocol allows you to transfer data between computers using FTP software, such as when uploading pages to your website/server from your desktop computer.
An HTML technique for combining two or more separate HTML documents within a single web browser screen. Avoid frames if at all possible, or use workarounds, or your site may not be properly indexed in the search engines.
Making a page rank well for a specific search query by pointing hundreds or thousands of links at it with the keywords in the anchor text.
Google’s search engine “spider” or web-crawling robot.
In the past, Google updated their index of web pages roughly once a month. Those updates were called Google Dances. Since Google shifted to a constantly updating index, Google no longer does this. Major search indexes are constantly updating. Google refers to this continuous refresh as “everflux.”
A hashtag symbol (#) used on Twitter as a way to group messages (“tweets”) by a theme.
Apache directory-level configuration file which can be used to password protect or redirect files. This is one place you can set up 301 redirects.
The practice of marketing by sharing helpful content instead of pushing ads on people. SEO, social media, blogging, lead nurturing, conversion optimization, online PR, and analytic tracking are examples of inbound marketing tactics.
A company that sells end users access to the web. Some of these companies also sell usage data to web analytics companies.
Internet Protocol Address. Every computer connected to the Internet has a unique IP address.
A website scripting language that can be embedded into HTML documents to add dynamic features such as drop down menus or rotating images.
The practice of writing web page text that uses the core keyword an excessive number of times per page.
Klout is an online tool that allows you to measure your influence in social media.
The web page on which a visitor arrives (or “lands”) after clicking on a link or advertisement.
Latent Semantic Indexing is a way for search engines to mathematically understand and represent language based on the similarity of web pages and keyword co-occurrence. A relevant search result may not even contain the search term, but may come up based solely on the fact that it contains many words that are similar to those appearing on relevant pages containing the search words.
Underlined text that the user can click on to move from one web page to another or to another position on the same web page. Most major search engines consider links from other sites pointing to you as a vote of trust.
The amount of authority your site has based on how many quality sites link to it.
The art of targeting, creating, and formatting information that provokes your target audience to point high-quality links to your site. Many link baiting techniques are targeted at social media users and bloggers.
The process of building high-quality linkage data that search engines will evaluate to determine whether your website is authoritative, relevant, and trustworthy.
A domain builds link equity over time as other sites link to it.
Link juice is the amount of ranking energy passed from one page to another through a link.
Long tail keywords are longer, more precise, and more specific, and thus they are easier to rank for and tend to convert better. For example, a long tail keyword would be something like “Italian leather running shoes” as opposed to the head term “running shoes.”
Online, a lurker is a person who reads discussions on a message board, newsgroup, or social network but rarely or never participates by adding comments of his/her own.
All major search engines combine a manual review process involving actual humans with their automated relevancy algorithms to help catch search spam and train the relevancy algorithms. Abnormal usage data or link growth patterns may also flag a website for manual review.
Meta data is information in the code that is not visible to the user, such as meta tags.
The meta description tag is typically a sentence or two of content that describes the content of the web page. This tag often dictates the description users see in the search results, so definitely use it. Make sure each page’s description relates to the concepts on that specific page.
The meta keywords tag can be used to list the keywords and keyword phrases that the page is about and hopes to rank for. This tag is no longer used by Google for ranking.
A meta tag used to make a browser “refresh” and take the user to another URL.
Code placed in the HTML header of a web page, providing information that is not visible to browsers. The most common meta tags (and those most used in SEO) are meta title, keywords, and description.
The line of code that dictates the text that is at the very top of the browser window. Title tags are one of the top quick fixes for SEO.
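Pulling the last few tags together, a typical HTML header might look something like this (the store name and wording are placeholders, not recommendations):

    <head>
      <title>Italian Leather Running Shoes | Example Shoe Store</title>
      <meta name="description" content="Handmade Italian leather running shoes with free shipping and easy returns.">
      <meta name="keywords" content="running shoes, Italian leather running shoes">
    </head>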
A measure of the number of people who think of you or your product when thinking of products in your category.
A website that duplicates the contents of another website.
The act of testing multiple creative elements simultaneously to see which combination of elements will perform the best.
On Facebook, the News Feed is the page where users can see all the latest updates from their Facebook “friends.”
A piece of code used to prevent a link from passing “link authority.” Commonly used on sites with user-generated content, such as in blog comments.
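In HTML, the nofollow attribute is added to the link tag itself (the URL is a placeholder):

    <a href="https://www.example.com" rel="nofollow">Example link</a>

A link without this attribute is treated as a “dofollow” link.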
Software that is written and distributed in such a way that any software developer can modify it as they see fit.
Most major search engines have results that consist of paid ads and unpaid listings. The unpaid listings are called the organic search results and are organized by relevancy (which is largely determined based on linkage data, page content, usage data, domain history, and trust-related data). Some studies have shown that 60% to 80% of clicks are on the organic search results.
A logarithmic scale based on link equity which estimates the importance of web documents. Google’s relevancy algorithms have moved away from heavily relying on PageRank and place more emphasis on trusted links via algorithms such as TrustRank.
Pay per click is the pricing model through which most search ads and many contextual ad programs are sold. PPC ads only charge advertisers if a potential customer clicks on the advertiser’s ad.
A permalink is the address (URL) of a specific post within a blog or site.
Altering the search results based on a person’s location, search history, content they recently viewed, or other factors relevant to them on a personal level.
PHP Hypertext Preprocessor is an open source server-side scripting language used to render web pages or add interactivity to them.
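As a tiny illustration (a sketch only), PHP code can be embedded directly in a web page; the server runs it and sends plain HTML to the visitor:

    <p>Today is <?php echo date("F j, Y"); ?>.</p>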
A podcast is a pre-recorded audio or video file, often released episodically and downloaded via an RSS feed.
A measure of how close words are to one another in text on a web page.
A measure used by Google to help filter bad ads out of their AdWords program.
When the search engines index content as it comes out from a site (like Twitter, for example) with no time delay.
Nepotistic link exchanges where websites try to build false authority by trading links, using three-way link trades, or other low-quality link schemes. When sites link naturally, there is some amount of cross-linking within a community, but if most or all of your links are reciprocal in nature, it may be a sign of ranking manipulation. Sites that trade links that are off-topic or on links pages that are stashed away deep within their sites probably do not pass much link authority, and may add more risk than reward in terms of SEO strategy.
If a website has been penalized by a search engine for spamming, the website owner may fix the infraction and ask for reinclusion in the search engine indexes. Depending on the severity of the infraction and the brand strength of the website, it may or may not be added back to the search index.
The source that drove a visitor to your website.
The process of informing a search engine or directory that a new web page or website should be indexed.
A measure of how useful/relevant searchers find the search engine results they get. Many search engines may bias organic search results to informational resources, since commercial ads also show up in the search results.
The practice of making sure the websites that appear in the search engines when someone searches on keywords such as your name or company name have content that speaks positively of you.
A measure of the efficiency of an investment, as in how much return you receive on each marketing dollar spent.
Any automated browser program that follows hypertext links and accesses web pages. Such robots include search engine spiders.
A file that sits in the root directory of a site and tells search engines which files or folders not to crawl. Some search engines will still list your URLs as URL-only listings even if you block them using a robots.txt file.
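A simple robots.txt file might look like this (the folder name is a placeholder); it asks all search engine spiders to stay out of the /private-reports/ folder:

    User-agent: *
    Disallow: /private-reports/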
Setting up an RSS (Really Simple Syndication) feed allows users to subscribe to online content and read it when they please.
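Under the hood, an RSS feed is just an XML file listing your latest content. A stripped-down feed might look something like this (the titles and URLs are placeholders):

    <rss version="2.0">
      <channel>
        <title>Example Company Blog</title>
        <link>https://www.example.com/blog</link>
        <description>News and tips from Example Company</description>
        <item>
          <title>Five Ways to Improve Your Landing Pages</title>
          <link>https://www.example.com/blog/landing-page-tips</link>
        </item>
      </channel>
    </rss>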
An RSS reader allows users to group articles from various sites into one place using RSS feeds. Google Reader, for example, allows you to quickly and easily view large volumes of content without having to jump from site to site.
A web page is considered scan-and-skim-friendly when it is set up with more bullet points and smaller chunks of information, instead of many long, dense paragraphs.
A scrapline is a short phrase used to express benefits or a value proposition. It is like a tagline but not used near the logo.
Also known as Search Marketing. Can mean paid search marketing or search marketing in general, whether paid or organic/SEO.
The art and science of publishing information online and marketing it in a manner that helps search engines understand that your information is relevant to particular search queries.
The web page on which the search engines show the results for a search query.
A person’s attitude as indicated by their online comments about a brand.
A computer that hosts files and “serves” (supplies) them to the web.
A web page that can be used to help give search engines (and human visitors) a secondary route to navigate through your site.
Social media refers to the methods people use to interact online and how they create, share, and exchange information and ideas in online communities and networks. Examples of social media sites include Facebook, Twitter, Google+, and LinkedIn.
Unsolicited email messages that the recipient does not want. Search engine spam refers to deceptive tactics used to game the search results.
A search engine robot that searches or “crawls” the web for pages to include in the index and/or search results.
A feature-rich or elegantly designed, beautiful web page that typically offers poor usability and does not offer much content for the search engines to index.
Content that does not change frequently. May also refer to content that does not have any social elements to it and does not use dynamic programming languages.
When a search engine thinks your pages are not of high quality, it trusts them less, and they will be buried in a lower tier of search results that users rarely see.
A classification system of controlled vocabulary used to organize topical subjects, usually hierarchical in nature.
Search relevancy algorithm that places additional weighting on links from trusted websites that are controlled by major corporations, educational institutions, or governmental institutions.
The unique address of any web document.
A real, individual visitor to a website, as distinct from the number of pages that visitor views.
Search engines frequently revise their algorithms and data sets to help keep their search results fresh and make their relevancy algorithms hard to reverse-engineer. Most major search engines are continuously revising both their relevancy algorithms and search index.
A technique used to help make URLs more unique and descriptive in an effort to facilitate better site-wide indexing by major search engines.
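On an Apache server (one common setup; the rule below is a sketch with placeholder names), URL rewriting is often handled with mod_rewrite rules in the .htaccess file:

    RewriteEngine On
    RewriteRule ^shoes/([0-9]+)$ product.php?id=$1 [L]

This lets visitors and search engines see a cleaner address like /shoes/42 while the server quietly loads product.php?id=42.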
How easy or difficult it is for website visitors to perform the desired actions. The proper structure and formatting of text and hyperlink-based calls to action can drastically increase your website’s usability, and thus your conversion rates.
Self-propagating promotion techniques commonly transmitted through email, blogging, and word-of-mouth marketing channels.
A server setup that allows multiple domains to be hosted from a single computer.
The SEO practices used by the “good guys.” Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers. Within that highly profitable framework, search engines consider certain marketing techniques deceptive in nature, and label them as “black hat SEO.” Practices considered within the guidelines are called “white hat SEO.” Since search guidelines are not a static set of rules, SEO practices considered legitimate one day may be considered deceptive the next.
Software that allows information to be published online using collaborative editing.
A free, online collaborative encyclopedia that is built using wiki software.
Popular open-source blogging software platform that offers both a downloadable blogging program and a hosted solution.