There are no affiliate links on PagePipe.
If I use ByteCheck to run a test against PagePipe’s homepage, it returns the result of ~663 milliseconds TTFB.
If I use PageSpeed Insights, which seems to be the go-to tool, it returns ~740 milliseconds.
Is a TTFB difference of 77 milliseconds high?
That 77-millisecond deviation between measurement methods is common. Time to First Byte is server overhead. It’s affected by many things that slow down the server. Some servers fluctuate wildly: SiteGround hosting in particular. Some oddities come from security and link-checker plugins, which are server-resource intensive.
All speed tests give different results. If you’re getting an offset of only 161 ms on two different tests, that’s amazing.
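You can see this run-to-run variance yourself by taking repeated TTFB samples with a short script. Below is a minimal sketch using only Python’s standard library (it is not one of the tools named above). It spins up a throwaway local server so the demo is self-contained; against a real site over the network, the spread would be far larger.

```python
import http.server
import threading
import time
import urllib.request

def ttfb_ms(url):
    """Approximate time to first byte: request start until the first byte is read."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # blocks until the first byte of the body arrives
    return (time.perf_counter() - start) * 1000.0

# Throwaway local server on an ephemeral port, just for a self-contained demo.
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Even against localhost, repeated samples are never identical.
samples = [ttfb_ms(f"http://127.0.0.1:{port}/") for _ in range(3)]
print(max(samples) - min(samples))  # spread between repeated runs, in ms
server.shutdown()
```

The point is not the absolute number: it is that even one tool, hitting one server, never returns the same figure twice.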
What we trust most is our desktop browser timer addon. And our eyes.
We don’t use a real stop watch. We use a browser load timer addon or extension.
Google doesn’t use PageSpeed Insights – or its criteria – for page ranking. It’s not connected to their algorithm.
It’s a separate idealistic and ivory-tower test created by egghead scientists. It’s an inexplicably complex toy puzzle to solve.
A good score or bad score on PageSpeed Insights doesn’t affect the ranking in any way.
We’ve seen pages that load in 12 seconds get all “greens” and a high score. It is impossible to have WordPress pass the test with a score of 100. Yet, 1/3 of the internet is built with WordPress. What kind of test is that? A biased one.
The limit of human tolerance for machine delay is 10 seconds. A page that slow can still pass their test as “green.” Not always. But it can deceive with trickery.
This silly speed test causes website-owner anxiety.
The only valid measurement is a browser timer (stopwatch). It’s only accurate for its geographic location. All speed tests are flawed. There are too many physical variables. Most use simulation. They don’t give absolute results but are instrumental in measuring incremental improvements. Relative measurements.
You cannot improve if you can’t measure. Cruddy machine measurements are better than no measurement.
Many things cause TTFB delays. Most variables are beyond the control of the site owner. Except one. Move your site to a different host and wait for their great TTFB to deteriorate. Will throwing money at the server problem make you more profit? Doubtful. The ROI is poor.
What affects page ranking is the user experience.
Speed is a primary blocker of a good user experience. Number 2 is aesthetics. User experience is how people *feel* when they use your site. Google can’t do the primary measurement of that *feeling*. But they can measure secondary things with machines like:
- bounce rate
- dwell time
- return visits
These are indicators of inferred quality content that satisfies the user’s need. But if users don’t wait because of long delays to see the content, what good is the content if never viewed?
The bounce rate goes up. The site’s credibility as an authority drops.
Speed affects SEO indirectly over time. Not directly. Not instantly.
So, “If tools such as Google PageSpeed Insights return a metric of 95+ consistently, is this a leading indicator of a site that is performant?”
Performant means functioning well or as expected. This test is not an indicator of “performant.”
It doesn’t even mean “good enough.”
As mentioned, we’ve seen horrible slow sites fake out this test. We never use this test to check page speed. Nor do any other professionals we know. Paid optimizer services gladly guarantee a passing Google PageSpeed score (an easy out). But not actual load time in milliseconds (real hard work). They know how to game the test. So do we.
But we’re not playing that Google game.
You are experiencing the frustration of the PageSpeed Insights test. It’s a gaslighting ploy. Google makes you question your sanity.
I understand that Time to First Byte is a constraint or limitation that the server is experiencing or exhibiting.
I am assuming TTFB is a key indicator in improving page speed.
If a site is highly optimized and the server is the only remaining factor, I presume that this impacts page ranking. In reading through your articles though, it seems to indicate otherwise.
If tools such as Google PageSpeed Insights return a metric of 95+ consistently, is this a leading indicator of a site that is performant?
Should I avoid tools such as Pingdom, ByteCheck, GTmetrix, and Google PageSpeed Insights if they’re not an accurate measure of speed?
We want to locate and correct speed issues. The waterfall many of these tests produce is most useful. You can examine what is being loaded. We can then do value analysis on the components.
Borrowed Industrial Concept
Value analysis is a manufacturing method for streamlining processes and components.
We use these methods to optimize websites.
“Value engineering began at General Electric Co. during World War II. Because of the war, there were shortages of skilled labour, raw materials, and component parts. Lawrence Miles, Jerry Leftow, and Harry Erlicher at G.E. looked for acceptable substitutes. They noticed that these substitutions often reduced costs, improved the product, or both. What started out as an accident of necessity was turned into a systematic process. They called their technique ‘value analysis’.”
Online speed tests are relative – not absolute. Using the same test, we watch (not measure) improvements from our site changes. Reducing requests is a false goal. It doesn’t always improve speed (load time) in milliseconds. In fact, concatenation (minification plugins) often breaks your site (unstyled HTML).
- Relative changes on small numbers often look big.
- Relative changes on big numbers often look small.
- Absolute changes on small numbers often look small.
- Absolute changes on big numbers often look big.
- Explore both types of changes when looking at data.
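The rules of thumb above can be checked with trivial arithmetic. Here is a hedged sketch in Python, using made-up load-time numbers for illustration:

```python
def relative_change(old, new):
    """Percentage change from old to new."""
    return (new - old) * 100.0 / old

def absolute_change(old, new):
    """Raw difference between new and old."""
    return new - old

# Small baseline: 100 ms -> 150 ms.
# The same 50 ms looks big in relative terms (+50%).
print(absolute_change(100, 150), relative_change(100, 150))

# Big baseline: 5000 ms -> 5050 ms.
# The same 50 ms looks tiny in relative terms (+1%).
print(absolute_change(5000, 5050), relative_change(5000, 5050))
```

The same 50-millisecond absolute change reads as a dramatic +50% on a fast page and a negligible +1% on a slow one, which is why both views of the data matter.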
A browser timer or stopwatch is a valid tool to measure speed for its geographic location. How does one measure for sites that have customers globally?
Good question. Again, you can only use approximation testing techniques. WebPagetest.org allows for selecting many different locations and browsers.
Pingdom.com also has a selection of geographic regions.
That is enough. It’s not accurate but results are repeatable. That means you can detect improvements from changes.
Assuming I’m on the right track, if speed is a primary blocker for a good user experience, is a stopwatch the only method of measuring it?
We rely on Pingdom most and WebPagetest.org second. Only because Pingdom gives faster results. Pingdom is a best-case scenario and WebPagetest.org is a worst-case scenario. Pingdom results are always faster. WebPagetest.org is always slower. Different methods of timing.
The browser timer is our acid test of the test. We trust it more.
You can learn about the differences between the two tests above on our Site Tuning services page:
If yes, are tools such as the ones found in browsers that offer a measure of network speed also inaccurate and ineffective?
We have no evidence. Repeatable results are most important. If there is a conflict of results, I always trust the timer solution most.
If actual load time is the key determinant, what factors play a role?
Page weight is the total sum of all files creating a browser page. This is usually expressed in “k” (kilobytes). Page weight is an indicator of optimization and speed.
Page weight is key for mobile user experience. Mobile load times run 2X to 3X longer than desktop. Much slower. The image shows the time tolerances of impatient visitors for browser response time.
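As a rough illustration of page weight as the sum of all files making up a page, here is a sketch with hypothetical resource sizes (the filenames and kilobyte figures are invented for the example):

```python
# Hypothetical resources for one page, sizes in kilobytes ("k").
resources = {
    "index.html": 18,
    "style.css": 42,
    "main.js": 95,
    "hero.jpg": 310,
    "logo.svg": 6,
}

# Page weight is simply the total of every file the browser must fetch.
page_weight_kb = sum(resources.values())
print(page_weight_kb)  # 471
```

Note how one unoptimized image (hero.jpg here) can dwarf everything else on the page: the waterfall view mentioned earlier is what surfaces these heavy components for value analysis.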
If speed indirectly affects SEO over time, are the only factors content and design of a site (aesthetics)?
Borrowed Science Concept
Hurdle technology ensures contaminants in food products are eliminated or controlled: mold, bacteria, and fungi. For food products, hurdles include pH and water-activity measurements. These are obstacles the pathogen has to overcome, ensuring safe food production.
In the end, “how fast is fast enough” is human perception of waiting or impatience.
If the user leaves because the page is too slow, they never see the beautiful page. Or read the riveting content.
Speed is about hospitality and being polite.
The hurdles to detoxifying web user experience:
1. Bad speed
load time in milliseconds
2. Ugly aesthetics
halo bias forms within 50 milliseconds of page load
3. Poor content
readable and findable solutions
Your best return on investment always comes most from improving content.
The truest user experience is good content. You want to improve the profitability of your site: focus on content. Not aesthetics. Not speed. And not SEO gimmicks.
Good content is original, actionable, and answers a question. It’s sourced, unique, and concise, with correct grammar and proper formatting.
9 Ingredients for great content:
- Create Original Content. Not deception.
- Always Focus On Creating Strong Headlines.
- Make Your Content Actionable.
- Be Able to Provide Answers.
- Be Accurate in Your Reporting and Sourcing of Information.
- Create Engaging and Thought-Provoking Content.
- Communicate Better by Adding Images and Video.
- Write Short and Pointed Content.
- Make Continual Updates to Your Website or Blog.
Instead of band-aid approaches, we drill down to the root cause of your slow site. This is origin optimization. Also known as site tuning. To do this, we analyze site components:
- Scripts and third-party services.
- Images and media library.
- We minimize globally loading plugin effects.
Find out more details about Site Tuning – Get Speed!