SEO

Traditional

Make it easy for crawlers/bots to scan a website's pages, examine content, and map all links (outbound and inbound), and don't interfere with this goal. Site content is qualitatively assessed on whether it is written for and valuable to people, not machines. Additionally, site content is evaluated for subject consistency and relatedness so search engines can prioritize results based on perceived subject-matter authority. This entails having "targeted and unique content with relevant keywords and terminology".

All content, including text, headers, images, and multimedia, is there to keep visitors interested and on your web page, preferably for longer periods than a quick bounce, which trackers can measure from search engine result pages. However, images and multimedia alone don't inherently carry rich metadata that engines can use to infer why a visitor stayed on a page for an extended period of time. Therefore, having enough supplementary text around images/multimedia (like explanations and transcripts) is helpful for engines.

Engines also prioritize content by how 'new' it is. If a site publishes new content on a regular basis, the website and the individual content have a higher probability of ranking higher in SERPs. Correlated with 'new' is how functional a website/page is. Broken links or missing images are an additional qualitative indicator of maintenance that is also factored into ranking.

In the Old Days

Keyword stuffing, meta tag manipulation, and other black-hat tricks that got you banned.

PageRank

The number and quality of inbound links has been the predominant factor of authority and relevance since Google was founded. Their PageRank algorithm was developed to rank the entities with the greatest number of inbound links as the most empirically authoritative. When supernodes (heavily linked-to sites) link to other entities, they indirectly tell engines that the entity is trustworthy for the particular context in which it was mentioned.

On the flip side, a link from a page with few outbound links carries greater weight than a link from a page that links out more often, since a page's authority is divided among all of its outbound links.
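For intuition, the original PageRank paper (Brin and Page, 1998) defines a page A's rank roughly as follows, where the T_i are the pages linking to A, C(T_i) is the number of outbound links on T_i, and d is a damping factor (commonly 0.85):

```latex
PR(A) = (1 - d) + d \sum_{i} \frac{PR(T_i)}{C(T_i)}
```

The division by C(T_i) is what dilutes each individual link's weight on link-heavy pages. Modern ranking blends many more signals on top of this.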

Keywords

Top keywords represent only 20% of all search queries. The rest are long-tail words and phrases that are domain-specific and have relatively higher value if a site can position itself well for those keywords. It is generally accepted that the more specific a query is, the more interested and ready to act the searcher is. Search services like wordtracker.com and semrush.com can help orgs find keywords with variable popularity and competitiveness.

Additional services: Google Keyword Planner, Google Trends.

For advertising, keywords are priced on popularity since intent cannot be measured. As such, it could very well be more efficient to pick keywords that are less specific but whose bid price is reasonable for a visitor's potential value.

Pick: low competition, high number of average monthly searches, high relevance to org.

One tactic used to find a high number of relevant keywords and phrases is to scrape the search engines' advertising services, then use those keywords within webpages and (inbound) links for better positioning.

'Key Pages' are your brand endpoints: the pages you want people to talk about and links to point to. They can be product/service pages, categories, documentation, brand story, press releases, blog posts, and other areas deemed interesting. A page's 'link velocity' is the rate at which it earns links over time.

Ensuring that canonical URLs are clean (no parameters), keyword optimized, never break, and redirect (if necessary) is absolutely critical for SEO.

In some ways, this is an extension of digital PR to continuously generate inbound links to meaningful key pages. Aside from typical product PR, additional methods used include writing guest blog posts for trustworthy websites that correlate with your brand.

A common technique to build rank is collective linking among similar domains. Bloggers often do/did this with 'blogrolls' that listed other favorite bloggers, usually out of natural appreciation for one another's writing. While this may work on a small scale, the more orchestrated such a strategy appears, the higher the likelihood that algorithms will detect the non-randomness and penalize the group's domains.

Consider that all links are practically endorsements through their link text. If domain operators want to link to, but not endorse, a 3rd party, they can add a 'rel=nofollow' attribute to the link.
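A minimal sketch, with hypothetical URLs:

```html
<!-- A normal link reads as an endorsement and passes authority -->
<a href="https://partner.example/reviews">In-depth desk reviews</a>

<!-- rel="nofollow" tells crawlers the link is not an endorsement -->
<a href="https://forum.example/user-post" rel="nofollow">a user's post</a>
```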

Dead links are pages that are no longer reachable on one's domain because they were removed or their addresses changed. Tools can show which inbound links are dead; those pages can then be restored or redirected to an appropriate page. This is good for UX and lets search engines appropriately forward visitors.

To better understand competitors activity and find link opportunities, consider tools like Spyfu.

Use Google Search Console to review links and search traffic. Links from content farms, link rings, and disreputable websites (according to Google) will be penalized through mere association, even if the association is inferred rather than actual. One should try to contact the owners of bad links and ask that the links be removed. If that is impossible, Google allows declaring non-association with bad links through its 'Disavow Tool'.
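The disavow file itself is plain text, one domain or URL per line; a sketch with hypothetical domains:

```text
# disavow.txt, uploaded via Google's Disavow Tool
# Disavow all links from an entire domain
domain:spammy-link-farm.example

# Disavow links from a single page
https://some-directory.example/paid-links.html
```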

Single Page Applications

While crawlers can execute JS to index a website, they may time out. Because the process is opaque, it's advised to use server-side rendering with JS apps so that key endpoints are always available on initial load.

White vs. Black Hat

White Hat: making great content for humans, improving rank sustainably.

Black Hat: tries to appeal to crawlers. Though potentially lucrative, it carries a high risk of penalty through a drop in PageRank.

Ad Campaigns

As ads are another means of reaching search engine users, they can be cost effective, particularly for their ability to target viewers at specific times, based on location, demo- and psychographics, and other attributes that can be inferred.

Pricing

CPM: Cost per 1,000 impressions

CPC: Cost per click (more expensive; requires making great impressions to convert visitors). Bidding too low, though, can result in underused budgets. Requires fine tuning and competitive analysis, akin to selling on eBay.

CPA: Cost per action, implying conversion to a marketing goal. Typically used for referral-based marketing and paying out commissions for verified actions.

CPI: Cost per install. Similar to CPA though used within software app stores.

HTML Pages

Optimize all pages to be descriptive and unique, and to link to other relevant pages.

  • Canonical URL: represents a human-readable, structured directory path

  • Title: 30-65 characters

  • Meta description: 120-150 characters

  • Keywords: once useful in the web's very early days, now irrelevant

Other considerations: use semantic HTML to give structure and relative importance to content. Header tags, for example, indicate an ordered outline of what a page is about. Describe what links point to in their text. Keyword-optimize image filenames for indexing in search engines.
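A sketch of these on-page elements together, with hypothetical page content and URLs:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Title: roughly 30-65 characters, unique per page -->
  <title>Handmade Oak Desks | Example Furniture Co.</title>
  <!-- Meta description: roughly 120-150 characters, often shown as the SERP snippet -->
  <meta name="description" content="Solid oak standing desks, handmade to order, with free shipping, a 10-year warranty, and sustainably sourced wood.">
  <!-- Canonical URL: clean, human-readable, no parameters -->
  <link rel="canonical" href="https://example.com/desks/oak-standing-desk">
</head>
<body>
  <!-- Semantic header tags give an ordered outline of the page -->
  <h1>Oak Standing Desks</h1>
  <h2>Why solid oak?</h2>
  <!-- Link text describes the target, not "click here" -->
  <p>See our <a href="/desks/care-guide">oak desk care guide</a>.</p>
  <!-- Keyword-optimized image filename, plus alt text -->
  <img src="/images/oak-standing-desk-walnut.jpg" alt="Oak standing desk with a walnut finish">
</body>
</html>
```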

Strategies

Orgs with Physical Presence

Use Google My Business.

  • Ask for reviews on Google and Facebook; ratings will appear on SERPs.

  • Submit to directories (moz.com/local).

  • A local Chamber of Commerce listing you as a member and linking to you can be beneficial, especially if it has a .gov domain.

  • If you change location or contact data, get it updated in local citations. Business info that people find and report as incorrect gets treated as slightly less relevant.

  • Earn local backlinks from trusted 3rd parties.

  • Use Google Search Console for testing, analysis, and setting up geographic targeting.
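Location and contact data can also be embedded directly in pages as structured data (see 'Structured Data' below) so engines don't have to infer it; a sketch with hypothetical business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Furniture Co.",
  "url": "https://example.com",
  "telephone": "+1-555-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```

Keep this consistent with the name, address, and phone used in local citations.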

Metadata

Tools

  • Google Structured Data Testing Tool

  • Browser extensions: 'SEO Meta' for Chrome

  • Facebook OpenGraph Debugger

  • Twitter Card Validator

  • Pinterest Rich Pins Validator

Social

Inbound links from social media and related platforms usually go beyond text, and automatically build mini previews of what the link is about.

With Facebook OpenGraph, developers can insert meta tags in their HTML pages which provide metadata to be used for these previews. Twitter can also use the FB OpenGraph data for generating previews, though additional Twitter-specific tags are helpful.
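A sketch of both tag sets for a hypothetical page; the og:* tags are OpenGraph, and the twitter:* tags are the Twitter-specific additions:

```html
<!-- OpenGraph tags used to build link previews -->
<meta property="og:title" content="Oak Standing Desks">
<meta property="og:description" content="Solid oak standing desks, handmade to order.">
<meta property="og:image" content="https://example.com/images/oak-standing-desk.jpg">
<meta property="og:url" content="https://example.com/desks/oak-standing-desk">
<meta property="og:type" content="website">

<!-- Twitter falls back to OpenGraph but honors its own tags -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@examplefurniture">
```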

Pinterest Rich Pins are another source of social content, though they require a user account and a verification process for quality control. Pinterest provides the ability to designate 4 types of pins: Product, Recipe, Article, and App.

Structured Data

Beyond standard HTML structure, additional data formats have arisen over time; due to their consistent structure and use across the web, they can be relied upon to specifically mark and categorize content. Three formats proposed by internet standards bodies are Microdata, RDFa, and JSON-LD, the last of which promises simpler implementation since it doesn't entail wrapping HTML elements like Microdata does.

Attributes are provided using schema vocabulary, which organizations like Schema.org have promoted for use.
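As an illustration, a hypothetical product page might embed JSON-LD like this, using the schema.org vocabulary:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Oak Standing Desk",
  "image": "https://example.com/images/oak-standing-desk.jpg",
  "description": "Solid oak standing desk, handmade to order.",
  "brand": { "@type": "Brand", "name": "Example Furniture Co." },
  "offers": {
    "@type": "Offer",
    "price": "899.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Nothing in the visible HTML needs to change, which is the practical advantage over wrapping elements in Microdata attributes.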

Benefits include enhanced features in apps, less work for crawlers to make relational connections, and greater appearance control of search engine results including info about products, events, recipes, and reviews.

See this article for more information on JSON-LD

Mobile

While a device type may not seem relevant for SEO, Google cares about product UX (including indexed webpages) across all touchpoints. Since mobile search traffic has exceeded desktop, they have split their indexing into mobile and non-mobile so users get better results. The mobile index pays greater attention to content, design, and speed. They analyze these factors in combination with abandonment metrics to adjust their PageRank algorithm accordingly.

Since Progressive Web Apps enhance mobile experiences, we should probably consider any associated features with the potential to improve PageRank.

Content

TODO:

  • JSON-LD, Google ecosystem accounts (Maps, Places...)

  • Site structure, location, hours, contact info

  • Other microdata

Design

Use a responsive design.

HTML header meta tags (see the sketch after this list):

  • Viewport corresponds with responsive design

  • Additional tags related to when a website is saved as a bookmark or PWA to a phone. Includes: fullscreen, status bar, home screen title
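A sketch of the tags described above; the title and color values are hypothetical:

```html
<!-- Viewport: pairs with a responsive layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- iOS: behavior when saved to the home screen -->
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
<meta name="apple-mobile-web-app-title" content="Example">

<!-- Android/Chrome: browser toolbar theme color -->
<meta name="theme-color" content="#4a90d9">
```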

Speed

See sections on Performance and SPA performance for reference.

PWA Features

Although there are many potential enhancements, noteworthy considerations include (see the sketch after this list):

  • Web App Manifests

  • Home Screen Icons

  • 'Add to Homescreen' guidance
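A sketch of a web app manifest and the tag that references it; names, paths, and icon files are hypothetical:

```html
<!-- Referenced from every page's <head> -->
<link rel="manifest" href="/manifest.webmanifest">
```

```json
{
  "name": "Example Furniture Co.",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#4a90d9",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```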

Platforms

Google AMP, Facebook's Instant Articles, and Apple News are web publishing services that take existing website content, optimize it for mobile, and serve it through CDNs for faster delivery.

Since large digital platforms and telecoms already aggressively cache resources, they've created business services that allow organizations to not only reach visitors faster, but also use other platform products, notably advertising.

The relation to SEO is that platform-optimized websites have a higher probability of being favorably curated and indexed.

Google AMP

Accelerated Mobile Pages (AMP) promises to be a solution for publishers, advertisers, ecommerce vendors, and adtech platforms. For basic compliance, it involves adding markup to webpages, removing 3rd-party JS and external CSS, and other steps. Google has tried to simplify the work needed for compliance by providing participants with auditing and validation tools.

For interactivity and other design features, AMP has its own markup language and scripts to call so that pages remain compliant. Because Google repackages and serves the webpages, it can add additional interaction and media features when end users also use Chrome.
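A rough sketch of a minimal AMP page; the URLs and image are hypothetical, and the mandatory `<style amp-boilerplate>` block is omitted for brevity:

```html
<!DOCTYPE html>
<html amp lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width">
  <!-- Each AMP page links back to its canonical (non-AMP) version -->
  <link rel="canonical" href="https://example.com/desks/oak-standing-desk">
  <!-- AMP's runtime replaces custom 3rd-party JS -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <!-- Built-in components replace standard tags, e.g. <amp-img> for <img> -->
  <amp-img src="/images/oak-standing-desk.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```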

A common complaint among web publishers, though, is that the browser address bar for a loaded AMP page shows Google, not the publisher, as the root domain. This raises the issue of whether visitors know who the original publisher is and how fresh the content is. Additionally, there are security issues regarding authenticity when an entity publishes webpages under another's identity. Google began working on this issue in 2018 and is adapting its cache to align with an emerging web packaging standard.
