Don’t Assume All Web Users Have a Fiber-like Connection

I have pointed out for more than a decade how poorly many web pages are coded. They often assume a very low latency connection and without it the user experience is poor.

Most of the web really sucks if you have a slow connection

The flaw in the “page weight doesn’t matter because average speed is fast” is that if you average the connection of someone in my apartment building (which is wired for 1Gb internet) and someone on 56k dialup, you get an average speed of 500 Mb/s. That doesn’t mean the person on dialup is actually going to be able to load a 5MB website.

This is so true. While living in Asia I had a perfectly good connection for watching high resolution video, but the latency alone made many websites unusable (or so frustrating you wouldn't want to use them). The huge number of files that must be downloaded, as well as the large size of that content, is something most sites don't seem to care about. I can only assume they just test the pages on their fiber connection and if it works they are ok with it. This is a very bad idea for nearly every website.
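A quick way to see the effect yourself is to time a page request and look at how much of the total is connection overhead versus actual download. This is a minimal sketch using curl's built-in timing variables (substitute the url you want to check); on a high latency connection the connect and first byte numbers balloon, and a page that requires dozens of such requests multiplies that delay.

[bash]curl -o /dev/null -s -w "DNS: %{time_namelookup}s connect: %{time_connect}s first byte: %{time_starttransfer}s total: %{time_total}s size: %{size_download} bytes\n" https://example.com/[/bash]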

When I was at Google, someone told me a story about a time that “they” completed a big optimization push only to find that measured page load times increased. When they dug into the data, they found that the reason load times had increased was that they got a lot more traffic from Africa after doing the optimizations. The team’s product went from being unusable for people with slow connections to usable, which caused so many users with slow connections to start using the product that load times actually increased.

This illustrates some additional interesting points: learning from data requires thought, and potential customers may be avoiding you for reasons you are unaware of. Learn to get a deep understanding of your customers and potential customers; focus on the customer’s “Jobs to be Done.” Also learn to be thoughtful about the use of data: Understanding Data, Simpson’s Paradox, Managing to Test Result Instead of Customer Value, All Data is Wrong, Some is Useful, Data is only a Proxy – it isn’t reality.

The quoted post is good. But it doesn’t display a date 🙁 This is a very bad oversight for such an article (where the date of the article can greatly impact what you take from it). By looking at the RSS feed I was able to see it was published in 2017.

Related: Functional Websites are Normally Far Superior to Apps – The Edge-case Excuse for Poor Software Coding Practices – Delighting Customers – Focus on End Users – Use Urls: Don’t Use Click x, Then Click y, Then Click z Instructions

Chrome Remote Desktop

I am more often frustrated by Google the last few years than pleased with them. But they do still provide some pretty awesome tools. For example, Chrome Remote Desktop lets you access a computer over the internet (and lets you allow another user to access your computer securely over the internet).

Chrome Remote Desktop allows users to remotely access another computer through the Chrome browser or a Chromebook. Computers can be made available on a short-term basis for scenarios such as ad hoc remote support, or on a more long-term basis for remote access to your applications and files.

screen shot of chrome remote desktop

Chrome Remote Desktop is fully cross-platform. Provide remote assistance to Windows, Mac and Linux users, or access your Windows (XP and above) and Mac (OS X 10.6 and above) desktops at any time, all from the Chrome browser on virtually any device, including Chromebooks, Android phones and iPhones. The iPhone app is new.

Some users worry about installing such an app given all the spying and hacking scandals. That is not a completely crazy worry. Google, and others, have been taking advantage of weak user control (and even bugs and workarounds to avoid stated user preferences) to track users and use that information to make money selling ads. With many cool and useful tools there is a risk of them being misused. And the practices of governments and huge corporations have been egregious enough to give a sensible person pause. Still, in the right situations this is a pretty cool looking tool (similar things exist, but the combination of price [this being free] and simplicity makes this interesting).

Related: Chrome Remote Desktop support forum – Using scp (secure copy) to Copy Files Between Computers – Using Rsync to copy Files Between Servers and Computers – Linux/Ubuntu File and Directory Permissions – Governments Shouldn’t Prevent Citizens from Having Secure Software Solutions

VPS, Cloud and Server Colocation Options

Finding the right place to host your content is important. Thankfully there are several excellent providers. For virtual private servers (one physical server shared among multiple virtual servers), Linode and DigitalOcean stand out. There are lots of good choices, but those two are widely appreciated for excellent service at a good price.

AWS EC2 (Amazon’s Elastic Compute Cloud) is not great for minimal hosting in my opinion – it adds extra complexity and is likely more expensive. But it is a great solution when you have the resources to manage it and you have significantly variable demand. Because you can add capacity on the fly as you need it, you can maintain a low baseline, add capacity only when needed and drop that extra capacity as soon as it isn’t needed.
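As a rough sketch of what adding and dropping capacity on the fly can look like with Amazon’s command line tool (the image id and instance id below are placeholders, not real values):

[bash]
# start an extra web server instance when demand spikes (placeholder ami id)
aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t2.micro --count 1
# shut it down again once the burst has passed (placeholder instance id)
aws ec2 terminate-instances --instance-ids i-xxxxxxxx
[/bash]

In practice this is usually automated with auto scaling rules rather than run by hand.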

Rackspace is another good option for hosting. Rackspace and AWS are often used for very large applications and sites, but Linode and DigitalOcean can also serve those needs and provide similar options to add capacity on the fly.

All of these options require you to manage your server (which may well be a virtual server – that is, just a portion of an actual physical server that you control).

Rackspace also offers colocation, where your physical server is placed in their network operations center: electricity, cooling, network and internet connections, and physical security are managed by them and the server is managed by you.

As colocation has evolved, what is included and the level to which things like physical security and redundancy are handled have changed. It has become quite complex to understand all the options for those organizations that need more than a simple virtual private server. As often happens when there is a business need, people offer solutions. And there are companies that specialize in helping you find the best colocation options for your needs.

Today the cloud options have led many organizations to eliminate (or greatly reduce) their own network operations centers and colocation needs. But cloud options are not always the right choice. And for some needs cloud options are not appropriate yet (mainly due to security or legal issues stemming from security concerns).

Managing your own servers with a colocation arrangement can be significantly cheaper than cloud hosting options (especially if you don’t need to massively increase capacity to deal with short term bursts of demand). Of course, technology continues to change so quickly it is hard to predict what the future will bring.

Service quality is absolutely critical for colocation. While saving money is important, the reason colocation was selected (over virtual private servers or the cloud) is normally how critical the function is. Using experts to help sort through the options and assure the quality of service of providers is wise.

Related: Keeping Your Hosted Ubuntu Web Server Software Up to Date – Checklist for Setting Up a New Domain on VPS – System Monitoring Tools for VPS

Very Simple Process to Claim Authorship of Pages with Google

Update: In August 2014 Google abandoned this project, so there is no point wasting your time on it. As I have stated elsewhere, Google’s efforts are so unreliable (Google nearly always fails and then closes whatever you invested your time in) that you should carefully consider whether to bother with their stuff (much more so than with any other company). This has been true since Larry Page became CEO, and it means I avoid Google initiatives. I also abandon, or at least don’t waste more of my time investing in, anything Google buys. Without a strong extra case to justify bothering with something from an organization proven to be so unreliable, it just isn’t sensible to treat Google efforts as reliable.

With authorship I figured the odds were that, even with Google’s poor track record, this was something worth bothering with. Oops. For future efforts I have to re-calibrate and assume even less chance of Google making something that won’t just be a waste of my time.

There are lots of posts explaining how to claim authorship of pages with Google. I find them very complex, by and large.

I was able to use this simple process.

  1. Include a “by [author name]” byline with a link to the author page using rel="author". In WordPress you can just edit the theme template so the author link includes the rel="author" attribute.

     [html]<a rel="author" href="http://johnhunter.com/">John Hunter</a>[/html]

     For a WordPress blog with multiple authors, here is the syntax to use (this pulls the author url from their profile; the author can update their web site url when logged into the blog):

     [php]<a rel="author" href="<?php the_author_meta('user_url'); ?>"><?php the_author(); ?></a>[/php]

  2. Then connect the author page to your Google+ profile. You can make this most any page (Google may exclude some free sites). Most people use the about page or author page on the same blog they are trying to claim authorship of. But all that really matters is linking this page to your Google+ profile (obviously substitute your url). I have many sites with my material, so my home page is what I used; I just added the following to that page.

     [html]<a href="https://plus.google.com/u/0/123" rel="me">Google+</a>[/html]

  3. Then link your Google+ profile to the authorship page you want to use. You can add it under “Other profiles” or “Contributor to.” This is the trickiest part due to the very bad UI of Google+: you have to go into edit mode and then click on the areas (there is no indication they are editable until you click – extremely bad UI; Google seems to very much like these hard-to-understand hidden UI elements lately, hopefully that will end soon). Likely Google will fix this at some point so this part will no longer matter.

To test whether things are working you can go into Google Webmaster Tools to the rich snippets testing area. Test a url you claimed authorship of and you should see something like:

search result with photo

Google can now include a photo from your Google+ profile next to search results

With those simple changes over 1,000 posts on my blog were updated. Searching for and reading a bunch of more complex ways of accomplishing this took far longer than actually doing it once I could see how little had to be done. Writing this post took way longer. Now I just have to do step one for other sites and blogs and they will all be updated to show my authorship (for those already listed on my Google+ profile; if they are not listed there yet they have to be added).

There are many benefits to establishing authorship through Google.

  • Google can include your photo next to search results. Especially until people get tired of this, it will likely increase clicks on your links. And even after people are tired of it, if they notice your photo (and respect you) that may well increase clicks.
  • Builds your personal brand
  • Google can use authorship as an additional factor in calculating the worth of a page for a specific search.
  • Google can get a feel for what areas you are an expert in. First by just analyzing what you write about. But more importantly they will be able to use this extra layer of information to determine AuthorRank (a similar idea to PageRank) by seeing links to your authored material.
  • Hopefully this can allow for RSS feeds by author (no matter where the content is published) eventually, if it doesn’t already.

Knowing authorship will allow Google to improve search result quality, so I can understand why they are pushing for it. It is a bit annoying how they keep pushing Google+ but this implementation seems tolerable. Some posts make it sound like you have to make your “authorship page” the Google+ profile, but really you can decide the authorship home page (for me johnhunter.com). They do pull your profile photo from Google+ to use, which is less than ideal.

System Monitoring Tools for VPS

Tools for monitoring performance and troubleshooting Ubuntu VPS web servers

  • Munin – graphs of system resources over time. Very nice. Can be a bit difficult to set up (a minimal install sketch follows this list).
  • top – system stats
  • iotop – like top, but for io stats. Install: [bash]sudo apt-get install iotop[/bash] Useful setup: [bash]iotop -b -o -d 30 -t[/bash] -b (batch – so you can keep a running tally of results), -o (only show processes doing io), -d (delay in seconds – how often to print out stats), -t (include time in printout)
  • vmstat – stats on memory, io, swap, cpu and system. Example: [bash]vmstat 10[/bash] (prints out stats every ten seconds).
  • iostat – cpu and per-device io statistics (part of the sysstat package; see the sketch after this list)
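Munin and iostat are not installed by default on Ubuntu. A minimal install sketch (iostat is part of the sysstat package):

[bash]
# Munin graphing (munin-node collects the stats, munin builds the graphs)
sudo apt-get install munin munin-node
# sysstat provides iostat (along with sar, mpstat and friends)
sudo apt-get install sysstat
# extended per-device io stats every 10 seconds
iostat -x 10
[/bash]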

Error logs

  • [bash]sudo nano /var/log/apache2/error.log[/bash]
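  • To watch errors as they happen rather than opening the file in an editor, follow the log: [bash]sudo tail -f /var/log/apache2/error.log[/bash]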

Apache web server access log statistics

  • Webalizer – [bash]sudo apt-get install webalizer[/bash] GeoIP is required for Webalizer: [bash]sudo apt-get install geoip-bin[/bash] (detailed instructions)

Using the Host File in Ubuntu

You can use the hosts file to have your computer resolve names to whatever addresses you desire (instead of using your nameserver). For example, open the hosts file:

[bash]sudo nano /etc/hosts[/bash]

Then add a line to the file with the ip address and the name you will use.
[bash]204.11.50.136 wastetime[/bash]

One useful way to use this is to test out a website on a new host prior to changing the nameserver to point to the new host. In this case, if you want to make sure your hosts file is being read, you can ping wastetime and if it is working it will show the results for a ping to 204.11.50.136.
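For example (-c 3 sends three pings and stops); if the hosts file entry is being read, the replies will come from 204.11.50.136:

[bash]ping -c 3 wastetime[/bash]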