
Smarter ways to book meeting rooms using Slack

At AnalogFolk we decided to improve the process of booking meeting rooms using a Slack bot and Google Calendar.

Check out the article: https://analogfolk.com/blog/smarter-meeting-rooms


Search Engine Optimization (SEO) Tips – Technical, Social and Links

“It’s pointless having a website if no-one can find it!”

Before we continue, I’d just like to state that there is no silver bullet for SEO; these recommendations come from personal experiments. The algorithms that rank the pages are not publicly available, and even though I might use Google as a synonym for search engine, you should always consider who you’re targeting. E.g. Yahoo has a relevant market share in the US compared to the UK, and Baidu is the “Google” of China.

So what is Search Engine Optimization (SEO)?

“SEO is a way of creating more visibility on search engines such as Google, Bing and Baidu, by ranking websites higher in search results, which in turn leads to more click-throughs and conversions.”

To improve your Search Engine Results Page (SERP) ranking, there are on-the-page factors like code / markup and content, and off-the-page factors, such as links / social. By ranking higher in the SERP, you increase your visibility and reputation, which in turn leads to sales / engagement. See below a great infographic about SEO.

[Image: Search Engine Land’s Periodic Table of SEO Success Factors (2013)]

Technical Improvements – Why should I care?

Although the technical improvements have the smallest impact in terms of reputation of all the improvements I’ll be mentioning, they are the first step to getting your website’s pages visible / indexed.

Case-study – shopcade.com

Shopcade is a social / e-commerce platform (something like Facebook meets Amazon). At the time they had a catalog of 100M products, but only a fraction of those products were indexed / discoverable, so the mission was to get most of the catalog showing up on the SERPs.

After implementing most of the technical improvements (code / markup), the site went from 34K indexed pages to 85M on Google in a year! I believe I have your attention now…
[Image: Shopcade indexed-pages graph from Google Webmaster Tools]

We got this graph from Google Webmaster Tools (GWT), and we could monitor which improvements had better results (we also added annotations in Google Analytics every time we made an improvement, so we could track the impact better).

Crawlers / Bots / Spiders

A web crawler is an Internet bot that systematically browses the World Wide Web, typically for the purpose of web indexing. A web crawler may also be called a web spider, a bot, an ant or an automatic indexer.
A spider is a program run by a search engine to build a summary of a website’s content (a content index): it creates a text-based summary of the content and an address (URL) for each webpage.

Below is an example of how humans perceive a webpage. We can clearly understand what is important, what the title is, where the Call to Action (CTA) is, etc.
[Image: The Venture homepage as a human perceives it]

And below is an example of how a spider would “understand” the same website. We need to make use of the appropriate HTML tags to help spiders understand the content and importance of the page: title tags, semantic tags and, because spiders don’t “read” image content, alt attributes for images.
[Image: The Venture homepage markup, as a spider sees it]

Technical Improvements – What can I do?

  • Set up the “robots.txt” to allow “crawlers” to index your pages

Improper configuration might prevent pages from being indexed
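
A minimal sketch of a permissive robots.txt (the disallowed path and the sitemap URL are placeholders):

    # Allow all crawlers; keep private sections out of the index
    User-agent: *
    Disallow: /admin/

    # Point crawlers at your sitemap
    Sitemap: http://example.com/sitemap.xml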

  • Avoid User-Agent Cloaking

Showing different content depending on the User-Agent might get the site penalized

  • Improve site speed

Slower sites will be penalized. See more in Why Should We Care About Website Performance?

  • Every image should have an “alt” attribute

This helps the automated “crawlers” understand the content of the images
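
For example (image path and description are illustrative):

    <img src="/img/zx-flux-side.jpg" alt="Adidas Originals ZX Flux trainers, side view">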

  • Every page should have a canonical URL and make use of pagination meta tags

The canonical URL is the owner of the content, even if multiple URLs serve the same content. This avoids penalization for content duplication. E.g. <link rel="canonical" href="http://example.com/page-a">. Pagination tags example: <link rel="next" …> and <link rel="prev" …>
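
A sketch of what page 2 of a paginated listing might declare (URLs are placeholders):

    <!-- This URL owns the content of page 2 -->
    <link rel="canonical" href="http://example.com/products?page=2">
    <!-- Tell crawlers how the series is connected -->
    <link rel="prev" href="http://example.com/products?page=1">
    <link rel="next" href="http://example.com/products?page=3">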

  • Make use of http redirection (301) when necessary to avoid duplicate content

E.g., http://www.example.com, https://www.example.com, http://example.com and http://example.com/ should all redirect to only one of them, like https://www.example.com/. This avoids content duplication, i.e. multiple URLs serving the exact same content. An alternative is to make use of the canonical meta tag.
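
As a sketch, assuming an Nginx setup, a catch-all server block can 301 every variant to the preferred host:

    # Redirect all http:// and bare-domain traffic to https://www.example.com
    server {
        listen 80;
        server_name example.com www.example.com;
        return 301 https://www.example.com$request_uri;
    }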

  • Make use of the HTML5 semantic Tags

Use tags like article, section, aside, summary, … (see the sketch after the next item)

  • Use H1-H6 in order to demonstrate hierarchy

H1 being the most important and H6 the least
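
A minimal sketch combining the semantic tags and heading hierarchy from the two items above (content is placeholder):

    <article>
      <h1>Smarter ways to book meeting rooms</h1>
      <section>
        <h2>The problem</h2>
        <p>…</p>
      </section>
      <aside>
        <h2>Related articles</h2>
      </aside>
    </article>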

  • Add a Meta Title tag

This should be relevant to the actual content of the page, and should not be longer than 50-60 characters, as Google truncates the rest in the search results. Avoid duplicate meta titles. (A combined example follows the next item.)

  • Add a Meta Description tag

This should be relevant to the actual content of the page, and should not be longer than 150-160 characters, as Google truncates the rest in the search results. Avoid duplicate meta descriptions.
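
Putting the title and description together, a sketch for an article like this one (wording illustrative):

    <title>SEO Tips – Technical, Social and Links</title>
    <meta name="description" content="Practical SEO tips covering technical markup,
      content and link building, with results from a shopcade.com case-study.">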

  • Avoid broken links and have a helpful 404 (page not found) error page

Whenever a page is moved, we should implement a 301 (permanent) redirect to the new page to keep the value it had before (link juice). 404 pages should return the HTTP status 404 and should have useful links back into the site. Broken links are bad user experience (UX) and are therefore penalized.

  • Add Social Meta Tags (Facebook Open Graph, Twitter Cards, and Google Publisher)

These tags help the spider understand the content of the website, and allow those companies to better represent the content of the page in a snippet. Google Publisher will eventually add more credibility to the site, by showing your latest posts and profile on the top right side of the search results page
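
A sketch of the basic tags (all values are placeholders; Google Publisher linked a site to its Google+ page at the time):

    <!-- Facebook Open Graph -->
    <meta property="og:title" content="Smarter meeting rooms">
    <meta property="og:description" content="Booking meeting rooms with a Slack bot.">
    <meta property="og:image" content="http://example.com/img/share.jpg">
    <meta property="og:url" content="http://example.com/smarter-meeting-rooms">

    <!-- Twitter Cards -->
    <meta name="twitter:card" content="summary">
    <meta name="twitter:title" content="Smarter meeting rooms">

    <!-- Google Publisher -->
    <link rel="publisher" href="https://plus.google.com/+YourPage">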

  • Submit sitemaps regularly

Submitting sitemaps helps search engines be aware of your latest pages, updates, etc.
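
A minimal sitemap sketch following the sitemaps.org protocol (URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/awesome-article</loc>
        <lastmod>2015-06-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>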

  • Make use of rich snippets by using Microformat, Schema.org, …

By using this special markup, “spiders” will be able to better understand the content and provide rich snippets in the search results. Available formats can describe a Product, a Person, Comments, Reviews, etc. Also take advantage of Google Now Cards by using schema.org for email (airline tickets, bills, …).

Below is an example of a rich snippet with ratings, price, etc.:
[Image: search result with a rich snippet showing star ratings and price]
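
The markup behind a snippet like that could look like this schema.org microdata sketch (the product, rating and price are made up):

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Adidas Originals ZX Flux</span>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        <span itemprop="ratingValue">4.5</span> stars,
        <span itemprop="reviewCount">120</span> reviews
      </div>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        £<span itemprop="price">64.99</span>
        <meta itemprop="priceCurrency" content="GBP">
      </div>
    </div>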

  • Think Mobile

Optimize the website for mobile, by creating a specific mobile website when possible. Search results vary depending on whether the search is made from a desktop or a mobile device
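
At a minimum, a mobile-friendly page declares a viewport:

    <meta name="viewport" content="width=device-width, initial-scale=1">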

  • Use Google Analytics (GA) and Google Webmaster Tools (GWT)

They provide very insightful data about how your site is performing. GWT also provides SEO recommendations

  • SEO friendly URL structure

Which one is sexier? http://example.com/page.php?p=918726 or http://example.com/awesome-article
The URL string is also searchable, which helps your page get discovered.
Avoid making URLs too deep: if it takes many clicks to get there, then the content is less relevant or important. Also, if we create categories like http://example.com/product/product-name, then http://example.com/product/ should also be available as a page that lists products. Keep in mind that URLs are case sensitive.

Below we have an example of URLs being highlighted in search results:
[Image: URL keywords highlighted in search results]

How do I know my site is being indexed?

So, after implementing those improvements, it’s time to see if they’re working.
Getting your site indexed is just one of the steps. You can verify that your site is being indexed by searching for “site:www.sitename.com”
[Image: site: search results for AnalogFolk]

Content – Why should I care?

Now that we’re sure the site is being indexed, we want to rank higher than our competitors for the selected keywords. About 85% of users find what they’re looking for on the first search results page. We should clearly define which keywords are relevant to our content, e.g. Lucozade as a keyword vs energy drink or feeling down… If a user searches for the brand, they already know what they want, so we should also target the related keywords in order to increase discoverability.

Case-study – shopcade.com

The website was being properly indexed, but performing poorly (not showing in the top results).
By implementing a series of non-technical improvements, Shopcade consistently started to rank even higher than the actual brands that also sold the products, consequently improving the click-through rate by 50%–80%.
[Image: Shopcade ranking above the brand’s own site in search results]

Content – What can I do?

  • Content is King

Write fresh, well-written, unique quality content. This is one of the most important factors for SEO. Sell through content

  • Start a Blog

Use the blog to keep users engaged and to add more content, and use it to cross-sell products / articles. Some blogs are engaging and funny, and as a side effect they lead users into your website

  • Inform the client

Users who are informed are more likely to buy. As an example, we have the eBay Buying Guides, which explain what to look for in a certain product. They also use them as an opportunity to cross-sell by suggesting a few products

  • Tweak the page title to provide more information about the product

E.g. simply prepending “35% Off” to the product name in Shopcade product page titles made users click this site instead of others

  • Link Juice

Keep users engaged by suggesting similar / recent / top products / articles. This means users will spend more time on the site and will be more likely to come back. It also allows search engines to discover your content more easily. Create lists about your products, and create articles like “Which famous ‘TV SERIES’ character are you?” – target the intended keywords to drive traffic into your site

  • Allow users to add comments / reviews

Not only does it add unique content and freshness, it also helps other users make a purchase / keep engaged. Be aware of spam comments and links (you should consider adding rel="nofollow" to those links, e.g. <a href="http://spam.example" rel="nofollow">…</a>). One of the downsides of allowing users to add comments is the risk of abusive users, and it might imply an overhead due to moderation

  • Use Google Trends and other similar tools

Find out which keywords are trending and create content around them to try to capture a few more users into your website and then convert them. E.g. for the Super Bowl, create an article or a product list targeting the competition, and then try to convert those users

  • Use Google Webmaster Tools

Find out how your users are discovering your website and try to exploit it even further, either by creating content around it or by doing more of what is currently converting better

  • Target long tail queries

These are easier to rank for due to less competition, and the more specific the query, the higher the intent. E.g. “Adidas Originals ZX Flux” as opposed to “adidas running shoes”

Links / Social – What can I do?

“Each link back to your site is a vote…”

  • Social Media

Share your articles / content on social media accounts and allow users to share and spread the word. Consider owning a #hashtag. Create competitions to keep users engaged and go viral

  • Link Building / Establish Relations

Engage other users / bloggers to write content / reviews about your product (this should be natural, not paid)

  • Link Quality

Links from authority websites will count more than links from regular or “spammy” websites

  • Link Without Any Conditions

All links should feel natural. Having links to other websites is like voting for those websites. Avoid linking to spammy websites and don’t demand a link back

What have I missed? Feel free to comment.


Website Performance – Real Speed vs Apparent Speed

Even if we optimize every aspect, we might still have a website that takes longer to load than we’d wish. We can make use of several techniques to make the site apparently faster.

We’ve seen in Website Performance – Optimizations how we can make the website load faster. But there are times when we can combine those improvements with others that help the site apparently load faster. Even if the website is still loading, the user visualises most of the page in the shortest time possible, which translates into a good user experience and the perception that the site is fast.

Introducing webpagetest.org Speed Index

The Speed Index is the average time at which visible parts of the page are displayed. It takes the visual progress of the visible page loading and computes an overall score for how quickly the content painted. The lower the number the better.
[Image: WebPagetest Speed Index diagram]

Let’s say the following website takes 6s to load. We can see the difference between a good apparent speed, where 93% of the content is rendered in the first second and the remaining 7% over the next 5s, and a not-so-good Speed Index, where only 18% has loaded after 1s and only around 90% by 5s.
[Image: filmstrip comparison of the two loading sequences]

If we plot the completeness of a page over time, we will end up with something that looks like this:
[Image: visual completeness of a page plotted over time]
The progress is then converted into a number by calculating the area above the curve, i.e. the not-yet-rendered portion of the page integrated over time (so lower is better):
[Image: Speed Index comparison chart of two visual progress curves]
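
As a rough worked example, approximating the two pages above as step functions:

    Speed Index ≈ Σ (1 − visual completeness) × interval duration (in ms)

    Page A: 93% complete at 1s, 100% at 6s
      1000 × (1 − 0) + 5000 × (1 − 0.93) = 1000 + 350 = 1350

    Page B: 18% complete at 1s, 90% at 5s, 100% at 6s
      1000 × (1 − 0) + 4000 × (1 − 0.18) + 1000 × (1 − 0.90)
      = 1000 + 3280 + 100 = 4380

Lower is better: Page A paints most of its content early, so very little unrendered area accumulates.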

Case-study – shopcade.com

When I worked at shopcade.com, as part of a decision to provide a better user experience and also rank higher in search results, we decided to tackle the page loading time (real and apparent). These improvements led to higher conversion rates, but we also started getting more and more pages indexed by search engines, and indexed more frequently. Together with other specific SEO improvements, this eventually brought a lot of organic traffic to the website.
So, by applying the already mentioned performance optimizations (we also tweaked the backend framework to get a much faster response time – Time to First Byte), we got these results:
[Image: before / after load-time comparison chart]
The DOM Content Loaded event was triggered just after 0.6s, and we still triggered some scripts asynchronously, after the Content Download event. Check the before and after results on Webpagetest.org.
N.B. 1.3 of those seconds were spent just downloading a web font.

Below we can see an example of a poorly optimized website:
[Image: request waterfall of a poorly optimized page]
And here a better one:
[Image: request waterfall of a well optimized page]

What have I missed? Feel free to comment.


Website Performance – Optimizations

Following on from Why Should We Care About Website Performance?, I’ve come up with a list of potential optimizations:

  • Cleanup Code

Remove comments and unnecessary HTML, JS and CSS.

  • Combine and Minify Assets (JS and CSS)

Combine the JS files and libraries into one JS file and minify it; do the same for CSS. This reduces the number of requests.
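
For example, going from three script requests to one (file names illustrative):

    <!-- Before: three requests -->
    <script src="/js/jquery.js"></script>
    <script src="/js/plugins.js"></script>
    <script src="/js/app.js"></script>

    <!-- After: one minified bundle -->
    <script src="/js/app.min.js"></script>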

  • Load CSS on the <head> and JS just before </body>

Loading CSS first prevents additional repaints and reflows and makes the page look right from the beginning; loading JS at the end allows the page to be rendered without being blocked while the scripts load.
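
A sketch of the recommended layout (paths illustrative):

    <html>
    <head>
      <!-- CSS first, so the page renders styled from the first paint -->
      <link rel="stylesheet" href="/assets/9833e4/css/style.css">
    </head>
    <body>
      <p>…page content…</p>
      <!-- JS last, so it doesn't block rendering -->
      <script src="/assets/9833e4/js/app.js"></script>
    </body>
    </html>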

  • Try to load scripts asynchronously

This way, the page rendering won’t be blocked, and the document loaded event (e.g., $(document).ready()) triggers much sooner. All social media plugins and analytics should be loaded asynchronously.
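
For example (URL illustrative):

    <!-- Doesn't block rendering; executes whenever it arrives -->
    <script async src="http://example.com/js/analytics.js"></script>

Use defer instead of async when scripts depend on each other, as defer preserves execution order.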

  • Make use of sprites whenever possible

This way we avoid an excessive number of requests. Each request has costs: DNS lookup, SSL negotiation, content download. If the image is small enough, the cost of DNS and SSL is higher than that of the asset itself.
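
A CSS sprite sketch, assuming a sprite sheet with 16x16px icons laid out in a row:

    .icon {
      background-image: url(/img/sprite.png);  /* one request for all icons */
      width: 16px;
      height: 16px;
    }
    .icon-search { background-position: 0 0; }
    .icon-cart   { background-position: -16px 0; }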

  • Cache

Make effective use of caching. Enable caching for assets, DB queries and templates / pre-rendered templates, but also implement cache busters so the cache can be invalidated when assets are updated. One preferred URL fingerprint scheme is /assets/9833e4/css/style.css, as some of the other solutions might have problems with proxies and CDNs (e.g., some CDNs ignore the query string parameters).

  • Download assets from cookieless domains

It may save a lot of time, since cookies are sent with every request and may add as much as 4KB of overhead per request.

  • Download assets from multiple subdomains / CDNs

E.g., static.domain.com, static1.domain.com, etc., as browsers usually limit how many concurrent connections they establish with each domain, which means the first set of assets needs to be downloaded before new connections are started.

  • Consider using Google CDN for common libraries

Google’s CDN is usually very fast and physically close to the client, and the client might already have the asset cached.
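
For example, with a local fallback in case the CDN is unreachable (version and fallback path illustrative):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
    <script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>')</script>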

  • Enable Compression

Enabling compression (e.g. GZIP) makes the file size much smaller to download. With jQuery you can get a gain of 88% compared to the original size – jQuery (273KB), minified (90KB), compressed (32KB).
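
A sketch, assuming Nginx (text/html is compressed by default once gzip is on):

    gzip on;
    gzip_min_length 1024;
    gzip_types text/css application/javascript application/json image/svg+xml;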

  • Remove inline scripts and styles

Move them into files to make them cacheable. *Depending on each specific case.

  • Serve adequately sized images

If we only need a 50x50px image, serve exactly that image. Don’t rely on the browser to resize it, as it will still download the full-size image.

  • Optimize images

Remove unnecessary data from images (strip irrelevant metadata), compress them and, for JPEGs, use the progressive version, as this makes the image start appearing sooner.
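
As a sketch using two common open-source tools (file names are placeholders):

    # Strip metadata and convert to a progressive JPEG (libjpeg's jpegtran)
    jpegtran -copy none -optimize -progressive photo.jpg > photo-optimized.jpg

    # Losslessly recompress a PNG
    optipng -o5 logo.png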

  • Prevent excessive redirects

Each redirect costs time, so avoid unnecessary redirections.

  • Consider using Nginx for serving static content

Nginx is very fast and optimized for serving static content.
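
A sketch of a dedicated static host, which can double as the cookieless domain mentioned above (paths are placeholders):

    server {
        listen 80;
        server_name static.example.com;
        root /var/www/static;

        # Fingerprinted assets never change, so cache them aggressively
        location /assets/ {
            expires max;
            add_header Cache-Control public;
        }
    }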

  • Consider using techniques like lazy loading

If the content is not important for SEO or some other reason, consider triggering the load only after the page is served.
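
A minimal lazy-loading sketch: real image URLs sit in data-src and are swapped in after the load event (a production version would also check whether the image is in the viewport, or use a library):

    <img class="lazy" src="/img/placeholder.gif"
         data-src="/img/product-photo.jpg" alt="Product photo">

    <script>
    // Once the page has finished loading, swap in the real images
    window.addEventListener('load', function () {
      var images = document.querySelectorAll('img.lazy');
      for (var i = 0; i < images.length; i++) {
        images[i].src = images[i].getAttribute('data-src');
      }
    });
    </script>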

  • Consider hosting the images, web server and database server on different machines

This reduces the load on each server, which translates into faster response times.


Why Should We Care About Website Performance?

Nearly half of web users expect a site to load in 2 seconds or less, and they tend to abandon a site that isn’t loaded within 3 seconds, according to Akamai and Gomez.com.

The performance of websites is critical to the success of businesses. A well-performing website greatly improves the user experience: it’ll keep your audience coming back, staying longer and converting a whole lot better. Page speed is also one of the measured signals for search rankings, and faster websites usually appear higher than less performant ones. Mobile devices have become so significant today that a website should always consider the limitations of those devices.

Disclaimer
These optimizations should be considered on a project-by-project basis. Some improvements might not be worth considering, depending on budget, project duration, time schedules, available resources, etc.

A few facts and stats

  • 47% of consumers expect a web page to load in 2 seconds or less
  • 40% of people abandon a website that takes more than 3 seconds to load
  • 75% of the 1,058 people asked would not return to websites that took longer than four seconds to load
  • A 1 second delay in page response can result in a 7% reduction in conversions
  • If an e-commerce site is making £100,000 per day, a 1 second page delay could potentially cost you £2.5 million in lost sales every year
  • The average weight of a web page today is 1.9MB
  • Page speed is one of the measured SEO signals and can affect conversion rate
  • The average web page size in 2014 was 1,953KB, with 95 HTTP requests per page

Most Common Reasons For a Poor Performance

  • Bloated CMS Templates

Typical WordPress themes are crammed full of features. Many will be third-party plugins, styles and widgets the author has added to make the theme more useful or attractive to buyers. Many features will not be used but the files are still present.

  • HTML5/CSS3 Boilerplates

A boilerplate may save time but it’s important to understand they are generic templates. The styles and scripts contain features you’ll never use and the HTML can be verbose with deeply-nested elements and long-winded, descriptive class names. Few developers bother to remove redundant code.

  • Carelessness

Developers are inherently lazy; we write software to make tasks easier. However, we should always be concerned about the consequences of page weight.

  • Too many requests

Each request takes time to process, as it includes time for DNS lookup, SSL negotiation, server response, content download (depending on content size and connection speed), etc. Also, browsers impose a limit on simultaneous connections.

  • Extremely large images

High-quality images may look very nice, but we should consider the cost of downloading them on slow connections and/or mobile devices.

  • Websites not optimized for mobile users

Websites not optimized for mobile users usually suffer from issues like bloated graphics, non-playable videos and irrelevant cross-linking. Google recently implemented a change to the ranking algorithm that favours mobile-optimized websites in mobile searches.
