Major difference in time metrics between different versions of sitespeed
Hi @soulgalore,
We are currently on version 11.0.0 of sitespeed and are looking to upgrade to a newer version. We have attempted this before, but have repeatedly run into issues with the newer versions, e.g. major differences in how sitespeed measures rendering and major differences in the metrics produced.
For example, when testing version 24.0.0 against 11.0.0, the First Visual Change and Last Visual Change metrics are way off, e.g. 1.6 seconds vs 9.6 seconds. I am running the exact same script against both versions locally using the following docker command:
docker run --shm-size=2g -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:24.0.0 --outputFolder assignment_output8 --speedIndex true --video true --multi ./tests/lp-multi-v3_cp.js --browsertime.url https://perf-aws.front.develop.squads-dev.com --browsertime.login testTe --block js-agent.newrelic.com --block www.google-analytics.com --block www.googletagmanager.com --block bam.nr-data.net --browsertime.viewPort 1366X768 --spa --browsertime.iterations 1
The same issue exists when I try running with versions from 18.0.0 upwards:
11.0.0:

24.0.0:

In the waterfalls for both runs, I can see that with version 24.0.0 the firstVisualChange and lastVisualChange measurements are captured at different points than they were in version 11.0.0. Also, 26 requests were made in version 11.0.0 but only 23 in version 24.0.0. See diagrams below:
11.0.0:

24.0.0:

Please advise
Hi @shawneenc, I think you should try to update more often. The 11.0.0 Docker container uses Chrome 78 (and Firefox 70), and a lot has changed in the browsers since then. For the projects I test, I try to stay up to date with the current stable browser version so it matches what most of my users use.
If you want to check what's changed in sitespeed.io since 11.0.0 you can do that in the changelog: https://github.com/sitespeedio/sitespeed.io/blob/main/CHANGELOG.md
If you want to see what changed in Chrome since 78 I think the best way is to check the Chromium commit log.
@soulgalore is there any way to run a version of sitespeed with a newer version of Chrome from Docker? With each version we run (from Docker), the Chrome version that was current when that sitespeed version was released gets pulled, not the newest one.
You need to build the container yourself then. You can base that container on https://hub.docker.com/r/sitespeedio/webbrowsers/tags (the tags say which browser versions exist in each container).
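For example, something roughly like this; the base image tag is a placeholder you pick from the Docker Hub page above, and the install steps are only my assumption of a minimal setup, not the official Dockerfile:

```Dockerfile
# Sketch only: pick a real tag from hub.docker.com/r/sitespeedio/webbrowsers/tags
FROM sitespeedio/webbrowsers:<tag-with-the-chrome-version-you-want>

# Assumes Node.js/npm are available in the base image; if not, install them first.
# The official sitespeed.io Dockerfile in the GitHub repo shows the real setup,
# including the start script that handles things like Xvfb.
RUN npm install --global sitespeed.io@24.0.0

ENTRYPOINT ["sitespeed.io"]
```

Then build and run it the same way as the official image, e.g. `docker build -t my-sitespeedio .` followed by `docker run --shm-size=2g -v "$(pwd)":/sitespeed.io my-sitespeedio https://www.example.org`.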
In your metrics, the First Visual Change of 167 ms seems really fast; I think that may be broken. If you look at the video/screenshot, does that metric look correct? You can do the same for Last Visual Change and see if you can spot whether it is correct or not.
One thing that I remember changed (but I haven't checked the changelog) is that a couple of years ago we added 3 extra seconds before ending the test. Before it was loadEventEnd + 2 s, now it is loadEventEnd + 5 s, and that could pick up more requests.
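If you want to control that wait time yourself there is configuration for it; a rough sketch (the exact flag semantics and default can differ between versions, so check --help in the version you run):

```bash
# Sketch: wait longer before the test is considered finished (value in ms).
# Check --help in your sitespeed.io version for the exact behaviour of this flag.
docker run --shm-size=2g -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:24.0.0 \
  --browsertime.pageCompleteWaitTime 8000 \
  https://www.example.org
```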
Hi @soulgalore,
Can you please explain why sitespeed sometimes captures the Last Visual Change metric at a later point? In some instances it is captured along with the visual completeness metrics, and at other times it is captured at a later stage. The screenshots are included above. Do you know why sitespeed does this? Have you encountered issues like this before?
Last Visual Change comes from the Visual Metrics script that analyses the video. The easiest is to check the video and filmstrip and see what that last change actually is. Sometimes it can be hard; over the years we have tuned how many pixels need to change to count as the last change. If you find something that doesn't match, it would be great if you could create an issue for it in https://github.com/sitespeedio/browsertime. There's a configuration you can use, --browsertime.videoParams.keepOriginalVideo, that will keep the original video (the raw recording of the screen); it will be named something ending in -original. Attach that to the issue, then it's easy to reproduce.
By screenshot I mean the screenshot from the video "filmstrip":

Hi @soulgalore,
On further investigation, it looks like with the newer version of sitespeed (24.0.0) the viewport size passed in the docker command is being ignored, and sitespeed waits for the full page to render/load rather than just the viewport-sized screen. See filmstrips below:
version 24.0.0:

version 11.0.0 - correct:

We cannot switch to a new version with this issue. It would throw all of our metrics off, as we compare browser render times sprint over sprint; we use Last Visual Change for this.
Ah, I see now, thanks. In your example you use --browsertime.viewPort 1366X768; the correct way according to the docs is a lower case x: --browsertime.viewPort 1366x768. It seems like a couple of years ago (four?) the capital X also worked, but I have no memory of why it should have; I wonder if that was just by accident. Running --help in 11.0.0 also shows a lower case x, right?
Hi @soulgalore, when we run from docker this is the command we use, which has the lower case 'x':
docker run --shm-size=2g -v /home/ubuntu/workspace/rowser-rendering_shawneen_branch:/sitespeed.io sitespeedio/sitespeed.io:24.0.0 --outputFolder output --speedIndex true --video false --config ./config.json --multi ./tests/lp-multi-v3_cp.js --browsertime.url https://url/?performancetest=sitespeed --block js-agent.newrelic.com --block www.google-analytics.com --block www.googletagmanager.com --block bam.nr-data.net --browsertime.cacheClearRaw --browsertime.login performance --browsertime.viewPort 1366x768 --spa --budget.output junit -n 1
I did a run including the following to capture the original video:
--browsertime.videoParams.keepOriginalVideo true ${url} -n 1 --video --visualMetrics
This produced the following:
Original Video Filmstrip (v24) - page did not finish loading before last visual change was captured:

Normal filmstrip (v24)

Sorry, I'm slow; I don't follow what you are showing me here? So using the lower case x you get the correct viewport, right?
No @soulgalore, I get the results displayed above when using the lower case 'x' (--browsertime.viewPort 1366x768). It looks like in the more recent versions of sitespeed the viewport size gets ignored and it waits for the full page to render, not just the viewport-sized screen.
Yeah, but I don't see any difference in the images? The easiest is to try with something small like --browsertime.viewPort 200x200 and see what it looks like. When I tried yesterday it worked, and when I tried with a capital X on 11.0.0 it surprisingly worked too.
When the test ends depends on JavaScript that runs in the browser, check out https://www.sitespeed.io/documentation/sitespeed.io/browsers/#choose-when-to-end-your-test
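For example, you can supply your own small JavaScript snippet that decides when the test is done; this is just a sketch (the built-in default check may differ between versions, and https://www.example.org is a placeholder):

```bash
# Sketch: end the test roughly 5 s after loadEventEnd using a custom page complete check.
docker run --shm-size=2g -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:24.0.0 \
  --browsertime.pageCompleteCheck 'return (function() {
      var t = window.performance.timing;
      return t.loadEventEnd > 0 && (Date.now() - t.loadEventEnd) > 5000;
    })();' \
  https://www.example.org
```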
We run these scripts against 3 separate environments. In 1 of the environments it worked last week; this week, however, the exact same script run with the exact same parameters is showing the behaviour outlined above. The other 2 environments we run the test against have always shown the above behaviour.
It looks like the later versions of sitespeed are flaky.
> the behaviour outlined above
Do I understand correctly that you are seeing that the Last Visual Change does not end when something is painted on the screen, and instead it picks up the scrollbar? If that's the case, could you please run with --browsertime.videoParams.keepOriginalVideo and then, instead of attaching the screenshots, attach the video, so I can try to reproduce it. If you look in the video folder you will have a file named 1-original.mp4.

With that original recording I can run visual metrics locally to get the metrics and see what's going on.
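If you want to try that yourself: visual metrics is a Python script that ships with Browsertime. A rough sketch (the script location and flags are from memory and may differ in your version, so check its --help):

```bash
# Sketch: analyse the original recording locally with the visual metrics script
# bundled with Browsertime (path and flags are approximate).
python visualmetrics.py --video 1-original.mp4 --orange
```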
Hi @soulgalore,
Both videos are attached: the first shows the actual incorrect results and the second is the original video.
1st: https://user-images.githubusercontent.com/55998417/175037247-9429dec0-2465-421e-aa19-03cd07718db2.mp4
2nd - original video: https://user-images.githubusercontent.com/55998417/175037319-ffe664b0-2bea-4e96-b23a-5d1ac0dbcd27.mp4
Are you sure that's the original video? It should always start with some orange frames and not include any timing metrics. Check out https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1163058795 for how to get it.
@soulgalore I did a run locally as well; these are the videos from that run:
https://user-images.githubusercontent.com/55998417/175046201-d022c9f5-7508-4781-ad87-570076f45fbd.mp4
https://user-images.githubusercontent.com/55998417/175046215-c5c55219-c1b6-4d11-9649-55a5b43d2129.mp4
https://user-images.githubusercontent.com/55998417/175046222-1df1a1fb-b387-4bc9-a121-e1a3bd559067.mp4
https://user-images.githubusercontent.com/55998417/175046229-80eceba6-4bf6-4837-8fde-1e1628b3f3ae.mp4
Great, I'll have a look tonight.
Just to be totally clear, which of these two is wrong? Looking at the metrics, they both look right, just that one loads slower? Did you check the TTFB for both?
With the latest set of videos sent from the local run, these are all original videos.
In the comment before that, which had 2 videos, the first video displays the incorrect Last Visual Change.
I need the original video from the run that gets the incorrect visual change; if I have that, I can try to reproduce it.
@shawneenc one thing you could try is the latest version of visual metrics. We have a version that @gmierz has been working on that we are going to make the default later this year; maybe that will pick up the last change better. If you use the Docker container you can enable it with --browsertime.visualMetricsPortable true.
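So on top of your existing command, something like this (keeping the rest of your flags as they are; https://www.example.org is just a placeholder):

```bash
# Sketch: enable the newer portable visual metrics and keep the original video
docker run --shm-size=2g -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:24.0.0 \
  --browsertime.visualMetricsPortable true \
  --browsertime.videoParams.keepOriginalVideo true \
  --video true --visualMetrics true \
  https://www.example.org
```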
Otherwise I need the original video from the run that fails; that's the only way for me to have a look and see if I can fix it.
Hi @soulgalore, the link to download the original video does not work in the results produced from the local run. I can send you the original video from a run within Jenkins; however, this will not be the exact same run the videos above came from. Jenkins also does not produce the exact set of videos you asked for above (i.e. no orange strip at the beginning of the run). It is still the same page and measurement though. Video from Jenkins included below:
https://user-images.githubusercontent.com/55998417/175244558-e11984d7-b082-4c1d-8a0d-1bb30daf081a.mp4
Yes, there's no HTML link to the original video, so you need to copy it from Jenkins or from wherever you upload the result. I need the original video; it works like this:
- We record a video of the screen from the browser (the original video).
- The video is analysed using visual metrics.
- We remove the orange frames, convert the video, and add the timer and metrics on top of the video.
To be able to reproduce I need the original untouched video.
@soulgalore from the local run there is no link to download/copy the actual original video. From Jenkins, the originally requested 'original' video link is not the one you require.
The video I have sent you is the only one I have that matches the original video you are requesting. Are you saying this is not the correct video?
Yes, you need to supply the original video (named -original) from the run where you see that something is wrong. See my comment in https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1163058795 and the screenshot there of which video.
@soulgalore I know what you are asking for. Can you understand what I am trying to explain?
There are issues producing what you are asking for both locally and from Jenkins, a different issue with each:
- From Jenkins, the 'original' videos (with the orange strip) from --browsertime.videoParams.keepOriginalVideo are not produced.
- From the local run, the original video is not produced as a standalone file to send, and the download link for this video within the sitespeed results does not work. Is there a way to store this video through a command-line option like the one above (--browsertime.videoParams.keepOriginalVideo)?
Do you understand what I am saying here?
I'll try :)
> From Jenkins, the 'original' videos (with the orange strip) from --browsertime.videoParams.keepOriginalVideo are not produced.
You mean it's not created, so there's a bug? Or what do you mean by not produced? Or that you cannot change the configuration?
> the original video is not produced as a standalone file ... Is there a way to store this video through a command-line option like
Can't you add --browsertime.videoParams.keepOriginalVideo? Or how do you run it locally?
The video from this (--browsertime.videoParams.keepOriginalVideo) is produced (I have sent it). You are also looking for another video though, right?
I need the original video, where the orange frames are still there and no text has been added, from a test where you see that the Last Visual Change is wrong. I can see two attached videos that have the orange frames, but they do not have any late Last Visual Change. Including videos that do not have the issue with the late visual change will not help, and screenshots will not help. The original video from a run where you see metrics that are off is the one I need.
The original video is named 1-original.mp4 from the first run, 2-original.mp4 from the second run etc.
@soulgalore
Video generated from --browsertime.videoParams.keepOriginalVideo: https://user-images.githubusercontent.com/55998417/175292737-0c34e1ac-02ea-47c3-a0cc-18588788cb10.mp4