Scott and Tim do a live audit of Enda Sportswear, talking about third-party risks, the importance of testing from different geographies, and testing user journeys with custom scripting.
Want more info? Please follow us on Twitter: Tim Kadlec, Scott Jehl, and WebPageTest
Sign up for a FREE WebPageTest account and start testing
(04:48) Scott: Hello?
(04:50) Tim: That is so much better.
(04:50) Scott: Oh, wow. All right.
(04:53) Tim: So I think I've mentioned to you before, and, and this was, I mean, it was dangerously close here again. I, I just about lost it, but, when I was in like school and in high school and stuff, in fact, I did tell you this. I used to, have these giggle fits right where I just find something and I just would not be able to stop laughing for literally minutes. I had a whole thing with my college professor, where we went to go check a project one time. And for whatever reason, I just started laughing in his office when we were trying to get help from him. And he was looking at me like, I was just like, what was wrong with me? My coding partner was like, giving me the dirty look, like, what are you doing? And I had to excuse myself from the room. I couldn't stop laughing. I was that close to it with the deranged robot thing going on.
(05:38) Scott: Man. I'm going to have to tune into this show for the first time and see how I sounded.
(05:44) Tim: Yeah, no, it was so you didn't hear it at all?
(05:47) Scott: No, I sounded to me very, great, I sounded great.
(05:54) Tim: All right. Yeah, you'll have to check it out. Anyway, that was a great start. That was exciting.
(06:00) Scott: Yeah. How you doing otherwise?
(06:02) Tim: Not too bad. How about you?
(06:03) Scott: Doing well. Happy almost-Halloween.
(06:06) Tim: Yeah, no, I think actually when you go back and watch the thing we should have just played this off as a Halloween, like thing, cuz it did sound, it was par for the course. It worked for that. Yeah.
(06:16) Scott: I do have a hat.
(06:19) Tim: Nice, nice. Yeah, I wasn't sure. That looks good. That looks good. I wasn't sure what to expect honestly, from you on this stream, because I know that like, we had the request, I believe. What was it yesterday? You got the, the new wetsuit and I know I saw the request on Twitter to have you pop on the stream like this. Actually I think they requested that I get one too, which is, maybe why we didn't do it, but--
(06:50) Scott: Yeah. Well it also takes me about 37 minutes to put that suit on and I didn't see that request until just now, sorry.
(06:57) Tim: Yeah. Selective viewing of the request. Yeah. Oh good. Awesome. Yeah, no, so it's yeah, so, and like this has been a busy, it's been a busy, there's been some cool stuff happening like perf wise, the last couple weeks.
(07:14) Scott: Some browser releases, some browser updates as well.
(07:18) Tim: Like what, Safari, right? Kind of had the Technology Preview release.
(07:23) Scott: Yeah. I saw that yesterday, that looked pretty exciting. Not so much on the performance front necessarily, but dialog element updates. Well, I think that would end up coming back to potentially being a performance improvement, just with all the dialog JavaScript no longer being needed.
(07:43) Tim: Polyfills out there, right? Yeah. For this kind of stuff, right? Yeah. Yeah. And anytime you can remove a couple polyfills is good.
(07:51) Scott: Yeah. Yeah. And one I was thinking of in particular, because I worked on a dialog web component last year: inert, and needing to use the inert attribute, which, at least I haven't checked recently, but I don't think it's quite settled in every browser yet. That polyfill was pretty expensive, because it has to basically touch all of the non-dialog elements in the DOM. So it'll be nice to see a native implementation of that. I haven't tested yet, but that was cool. Yeah. What else did you see?
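For context on why the native element removes the need for that expensive polyfill: when a native `<dialog>` is opened with `showModal()`, the browser makes everything outside the dialog inert automatically, with no script walking the DOM. A minimal sketch (element IDs and content are illustrative):

```html
<dialog id="signup">
  <p>Join the newsletter</p>
  <button id="close">Close</button>
</dialog>
<button id="open">Open dialog</button>
<script>
  const dialog = document.getElementById('signup');
  // showModal() traps focus and makes the rest of the page inert natively
  document.getElementById('open').addEventListener('click', () => {
    dialog.showModal();
  });
  document.getElementById('close').addEventListener('click', () => {
    dialog.close();
  });
</script>
```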
(08:30) Tim: Well there was the whole, I mean, that's always good. There was the whole WordPress thing too, where they did basically the WordPress core team. Yeah. The WordPress core team kind of put out this proposal to build up a performance team, like for improving the performance of WordPress core. Which is cool, like WordPress is open source. You've got the team. I mean, you did jQuery and all that stuff, but like for the folks who are kind of tuning in, like you've got the core team that kind of contributes back to the core functionality, and I liked, they had this whole proposal for a performance team and why and stuff like that. And yeah, I think what was cool about this for me is like, it's a trend to some extent.
(09:15) Scott: A lot of teams, starting to build performance into their workflows and into their tooling too, I guess.
(09:23) Tim: Yeah. And then they mention a few here, like they mentioned comparing to other platforms. Like using the HTTP Archive report, they mentioned comparing themselves to Wix, who has been very vocal on Twitter and stuff about how they're doing in HTTP Archive data, Shopify, which has their own speed score, Vercel yesterday at their event announced some performance checks. Like it's like a thing now where platforms are trying to find ways to have these performant defaults and also sort of encourage the folks using those platforms, by giving them the information and the tools to like build really performant experiences.
(10:03) Scott: And yeah, [Inaudible 10:06] comes to mind as well, as one always does. Performance first. Yeah.
(10:12) Tim: Which is awesome. I dig that. I think that's like, I mean, there was a whole conversation about this for a few years now where, this idea that like, if you really wanted to shift performance or accessibility or security or anything like that on the web, there's the part of it where you kind of go after the individual developers and do your, that's important, the education and advocation that way, but to really make like big shifts especially when you're talking about that, like long tail of the web, requires really that the platforms and the tools themselves are doing things to get better.
(10:47) Scott: Yep, absolutely. Yeah. So that was exciting to see.
(10:50) Tim: I like that. And then this week, this week was, TPAC as well going on.
TPAC is all-- what's that?
(11:01) Scott: I was going to say, I have not caught up on, the events, anything perf related that came out of that.
(11:08) Tim: Yeah. So TPAC is like the W3C groups, like all the working groups. So there's actually the web perf working group every day for like two, in fact, they're doing it right now, every day for a few hours have been getting together and having these conversations, and it's kind of fun because it's like, I mean, it's a mix, like some of it is pretty heavy. Like you're getting your-- it's a lot of browser vendors and standards people talking about like, what are those next standards and stuff like that. But particularly during this week, what they're also doing is they're bringing in, they have people from different companies that are talking about, like, they had a whole session about RUM, real user monitoring, like, where does RUM fall short? Some of those metrics are really hard to attribute to like what actually triggered the event, or get some sort of attribution, and like things like that that are complicated.
Or, there was a talk from somebody at Microsoft Excel about how they're using the new event timing API to try and measure responsiveness inside their app, cuz it's a single complex SPA. Yeah, or things like that. So new APIs, like there's the whole JavaScript self-profiling API, which sounds pretty cool. Different levels of compression. There's just a lot of really interesting stuff that comes out of it. And it is interesting, and apparently goats from the looks of it here, [Inaudible 12:29] is involved, and for anybody who doesn't know [Inaudible 12:30], he does have goats.
It's interesting cuz you get at like a really interesting blend between, again like the talks from the specific companies on how they're doing something or trying to measure something. But you also get folks who are doing like deep dive in the weeds browser stuff. So I'll be honest. Like sometimes I just kind of sit there and I'm like, when they're talking about, well, if we were going to implement this inside the browser, it means we'd have to do X and Y and I'm like, all of that's just flying over the head, but it's just, yeah. There's a lot of cool stuff still that comes up.
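As a quick aside on the Event Timing API mentioned above: it can be consumed from a `PerformanceObserver` to surface slow interactions. A browser-only sketch (the 100 ms threshold is illustrative, not a recommendation from the talk):

```html
<script>
  // Log any input event whose total duration (input delay + processing
  // + time to next paint) exceeds 100 ms.
  const observer = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log(entry.name, Math.round(entry.duration) + 'ms');
    }
  });
  // buffered: true also delivers qualifying events from before observe()
  observer.observe({ type: 'event', durationThreshold: 100, buffered: true });
</script>
```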
(13:03) Scott: Yeah, yeah, yeah. Yeah. I mean, there are just so many facets of web development that it's impossible to be on top of all of them, but yeah, it's great to see these notes. This is some interesting stuff.
(13:15) Tim: Yeah. And so yeah, I'll, we can share the doc for this, but the, I would say that the, I guess it's going to be a really long link, isn't it? That might not be super helpful. Yeah. That's not going to be helpful, but basically.
(13:31) Scott: Type that in,
(13:33) Tim: Yeah, we'll type it into the comments here, but the, all of these things will also link off to the presentations. Like the videos were recorded for each of the talks. The only thing they don't record are the actual, Q and A part after, because they want the companies and the people involved to feel comfortable like getting into the weeds without, having sayings something that maybe they're going to regret later on. So
(13:59) Scott: Got it. Yeah.
(14:00) Tim: Yeah. Yeah.
(14:04) Scott: Also in perf news, our team got together in person, or a lot of our--
(14:08) Tim: No, that was good. It was good news. It was good news. Yeah.
(14:12) Scott: Actually it was really nice
(14:15) Tim: Gina posted a picture of that, which I thought was cool. Like there's the core, there's the group, there's the team right there.
(14:24) Scott: Yeah. In the same place, on a rooftop. Amazing.
(14:26) Tim: Amazing. No, it was fun. It was like, I think that's the first time. I mean, I'd met you before. That's the first time I've met anybody. Like, I've been working with these amazing people for, what is it, like 10 months, 11 months now for some of 'em, and like finally getting to hang out in person was incredible.
(14:42) Scott: It's a very good thing. Yeah, yeah. And there's Jeff Lubeck actually who joined what like three weeks ago?
(14:49) Tim: Where's Jeff
(14:51) Scott: Coming up on a month now. Yeah, yeah,
(14:54) Tim: Yeah, yeah. Think it's about a month. Yeah. Just about a month just under maybe. Yeah. No, it's a, I mean it's a rock solid. I mean, the team's been awesome and it was fun. Like, I don't want to say too much, but, there's some cool stuff. I think that we came up with while we were there.
(15:10) Scott: Yeah. I think there's a, there's a good, there's an exciting roadmap ahead for WebPageTest.
(15:14) Tim: Yeah. So stay tuned. I'm particularly excited about redacted, so
(15:23) Scott: Yeah. Is that on the new framework?
(15:26) Tim: Oh yeah, yeah. The new WebPageTest React-driven model. Yeah. Oh, we are excited too. For what it's worth, like Jeff, I don't, I don't even know if Jeff is watching this, but he's fantastic. Like, yep.
(15:43) Scott: Commits landing right from the start.
(15:46) Tim: Exactly. Yeah. And he is like doing things like actually testing, like putting unit tests and stuff for our code base. What is that? It's weird. It's nice.
(15:54) Scott: Yep. Yep. Cool. Really good. Yeah. So what else do we have here?
(16:01) Tim: Yeah. So I thought like today we'd go. Like, it's been a while since we did just like a pick the site did a, like we, I mean, in fact, I don't know that we've done it since you would kind of come on the team, but like early on, we'd like pick a site and just kind of walk through. And I thought we'd do that today.
(16:14) Scott: I like that idea
(16:15) Tim: and there's a couple things I think that we can kind of show along the way, but I wanted to do, I'm not sure if you're familiar cuz you like to run right. Or
(16:23) Scott: I do like to run. Yep.
(16:25) Tim: Yeah. Okay. So then
(16:28) Scott: Pick a running site.
(16:31) Tim: I did. It is running. Yeah. Hold on. Yeah. Let's see. Are you familiar with Enda at all?
(16:43) Scott: Enda? Hmm. Yeah. I think it's new to me looks interesting.
(16:48) Tim: Yeah. So this was, I put out there on Twitter. This was months ago, and I was like at the point where like the shoes that I was using to run were like, chunks of the bottom were falling off, and it's like, so I put out on Twitter like, oh, what kind of stuff should I, you know, what type of shoes do people like? And one of the folks came back with this company, which I had never heard of either, but it's really cool. They are based out of Kenya. They, I guess, started with like a crowdfunding campaign to make the first shoe that they produced, and it got really, really popular. And it's cool. Like it's all out of there, like made in Kenya, they take the running very seriously. They take like the responsibility as like a company very seriously. You can see these notes here around like what they do around climate neutral and all that kind of stuff. There's just a lot, I think, that Enda does that's really neat. And like I said, the shoes have been really well reviewed.
(17:49) Scott: Interesting. Cause I am in the market. I'm at that stage with my running shoes right now with the, well, I don't know how many hundred miles is the recommendation anymore, but I think they've blown past that. I think. So
(18:03) Tim: I was going to Say whatever the recommendation I tend to ignore it. I think like I pointed out it's when mine fall apart. But yeah, no, it's nice. And I like, so they've got these shoes that seem to be really popular. So I've spent a fair amount of time looking at the site and I thought, well, it might be nice to kind of profile that. Because like I say, cool company, cool shoes, kind of fun.
(18:24) Scott: Nice. All right. Good one.
(18:27) Tim: yeah. So where do you want to start? Should we start with like a, Hmm. Start with like a collect page or a homepage or product, what you feel
(18:33) Scott: That's a good idea. Would you do several or, like a stepped kind of thing?
(18:39) Tim: we can do that too. Yeah. I think let's do both. Let's start with like an initial landing and then like that's a good point. Like this is a commerce shop. Right. And so it's not really a cool I guess. And so, there's the flow matters a lot. Like when we talk about, measuring a performance, we have a tendency to kind of zoom in on like single pages or single points in time, but really if I'm Enda, and I'm going to be doing anything in terms of like monitoring performance and I want to make sure that I'm providing the best user experience possible. It's really, the users are going to be coming and doing this. I'm going to be coming and looking at this page, I'm going to be like, oh, this shoe looks awesome. That’s really neat.
Let me choose my size. Maybe I go to add it to my cart, and then go to the checkout. And this is the bit that really matters from the financial side of things. Is like, not just that page, but this whole process to the point where like I've added something to the cart and I got to the checkout point. The more friction and the more time and the more work that comes in there, like that's where you're going to start to see your-- if you're not monitoring that flow, particularly that last bit, the add to cart and checkout.
(19:55) Scott: Yeah.
(19:57) Tim: Probably losing stuff.
(19:59) Scott: Yeah. Yeah, I think there are different performance considerations too, beyond just the first view. Right. So, often we run a test with an initial run and a repeat run to see how the caching is working, hopefully working well. That sort of thing you can test pretty practically with a multistep kind of test, right?
(20:25) Tim: Yeah, yeah, exactly. So I think that makes a lot of sense. So let's start with, let's start first with, I guess the individual page kind of get a lay of the land and then let's go ahead and, and, use like scripting to go through the rest of the process. so I'm going to kick off a page for, I guess this would be your, what your PLP, your product listing page, I think is what they would like to call this in the ecom land. We’ll do mobile. We could, do we want to test from the US or do we want to test for Africa in this case?
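A multi-step WebPageTest script for the listing → product → add-to-cart flow described above might look like the sketch below. Commands and their parameters are tab-separated; the URL and the link/button text here are illustrative placeholders, not taken from the live site:

```
navigate	https://www.endasportswear.com/collections/mens
setEventName	productPage
clickAndWait	innerText=Example Shoe
setEventName	addToCart
clickAndWait	innerText=Add to Cart
```

Each `setEventName` labels the following navigation as a separate step in the results, so the filmstrip and metrics are broken out per stage of the journey.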
(21:02) Scott: Hmm, that seems like a, yeah. I would imagine the, the audience would match for, yeah. Trying that out, that’s great. Very nice that we're able to do that from different regions. Like that's
(21:18) Tim: Pretty. Yeah. I mean, and we, we haven't talked a lot about that on the Twitch stuff at all, but like, I do think that's, an important bit there, like the fact that there's like getting that representative when we talk about like representative connection and the browser type and all that stuff matters, but the geography is really important too.
(21:37) Scott: I would, mention there too, that, in, past client work that, I recall doing, there's often this, idea of testing the audience that you already have, we don't have any users on, mobile phones, they're all on desktop, so let's focus there. Or we don't really have any users outside of the US. Something like that. Whereas, maybe optimizing your site for those use case is that you don't currently have, would enable people who are in those regions or on those devices to start using your product better. So yeah. Being able to test around the world, even if you don't have a worldwide audience yet, probably a pretty good idea.
(22:25) Tim: Yeah, it is actually, that's, that's a really good point. I wonder if there, let me, I want to show something. Because I think it's kind of interesting. I think, or it could be kind of interesting, I guess I haven't actually looked for them. I'm going to grab, let's see, I keep that off in a separate thing, cause I had a shortcut under a different browser profile, but I'm going to look at the CrUX dashboard for the site really quick. To see if they're in CrUX, because CrUX, like what you just pointed out about like the things not, maybe they're not showing up from a geography or a device as much because you haven't served that audience. I think CrUX is actually a really interesting, and obvious, example of that sometimes. And so, because of the way CrUX works, and, there's a question here, by the way, from Shawn: does CrUX, the Chrome User Experience Report, break down by device and network? The dashboard doesn't do geography, but if you query the database directly, you can get a geographic breakdown.
But this is where I think it gets interesting, because the Chrome User Experience Report is like this anonymized user experience data. Like cuz they're collecting anonymized data through sessions, it's like a whole thing. There's a lot of stuff that happens to make sure that with what they report, data wise, they're not doing anything that could potentially be a privacy risk. One of those is there's like a threshold, there's this idea of like, if you don't have enough traffic from any subset of the audience, so let's say that your traffic in China or your traffic in Africa is below whatever this threshold is of users, they just won't report it. So it'll be completely nonexistent. And that goes for connection as well. And so what I often see--
(24:18) Scott: And device types too.
(24:18) Tim: Yeah. And device types too, would be the same thing. Right?
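Querying the CrUX BigQuery dataset directly, as Tim mentions, does expose per-country tables. A sketch of the kind of geographic/connection breakdown being discussed (the origin and the month in the table name are assumptions for illustration):

```sql
-- Effective-connection-type distribution for one origin,
-- using the Kenya country table for October 2021.
SELECT
  effective_connection_type.name AS ect,
  ROUND(SUM(bin.density), 4) AS density
FROM
  `chrome-ux-report.country_ke.202110`,
  UNNEST(first_contentful_paint.histogram.bin) AS bin
WHERE
  origin = 'https://endasportswear.com'
GROUP BY ect
ORDER BY density DESC;
```

If the origin has too little traffic from a given country or connection type, rows simply won't appear, which is the reporting-threshold behavior described above.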
(24:23) Scott: Yeah. So if I had my user agents set to Scott Jehl's phone, whatever might not pop yeah. Right. There would have to be a lot of those. And I think there's only one.
(24:37) Tim: Yeah. If there's not enough users. Right. Because the idea is if there's a small enough number, then that becomes a privacy risk, because now you could start to splice and dice it. And I mean, security and privacy is a whole thing. Right? This actually is a pretty good example, as it turns out, I think. Like, so when we look at CrUX data, a lot of the time we see like big, big percentages for connection distribution of 4G, which is effectively the fastest that they're going to report, cuz that's all the effective connection type standard supports. And then you see like these other percentages that look really, really small, like 3.86% being 3G. And like even less than that, if anything, coming in, like there's nothing being reported as 2G or slow 2G here. This is an interesting example, I think, because you can see how they alternate, like they have, it's pretty standard 95, 96, 97%. But then you have these two months where like nothing gets reported for 3G.
So my guess is this is one of those examples where there's probably people on this slower connection speed. And maybe for whatever reason, it just, not enough of those sessions are getting recorded. And so if you're looking at your data and not like aware of how that's coming across, you may think, oh, we don't have any 3g users, but like it's probably that there's a decent chunk. It's just like, they alternate between like not having enough sessions from that network to be reported. So it doesn't show up in your analytics. So you think, and kind of ignore it, but it's actually, they're there. It's just yeah. Yeah. And maybe if you improve the experience for them, you'd get more there too. So anyway, it's just, I think an Interesting,
(26:17) Tim: Yeah, it's interesting. Yeah. Interesting when you have sort of, public analytics, like different considerations
(26:25) Scott: for sure.
(26:28) Tim: Anyway, that was a nice little aside. I like that though. Good. Okay. So this is the result we've got for the Enda site.
(26:40) Scott: All right. So we're on 4g on uh,
(26:42) Tim: Yeah, so I ran it on 4G here, out of Cape Town, which is the only node that we had in Africa. And an emulated Moto G4.
(26:54) Scott: Okay. Might be worth kicking off another round from somewhere else. Just to see, maybe there are CDN differences, maybe it's hosted somewhere other than Kenya. I'm not sure, but yeah,
(27:06) Tim: That's a valid point.
(27:07) Scott: Interesting to see the timing.
(27:10) Tim: Yeah. Let's run one from the US. You would think, between the two different, at least you'd have, if there is going to be a CDN issue, I suspect it's likely we'll surface it that way. So yeah, it's a good point. And if we look down, I think we see that like our test result is a little bit more negative for this page than their 75th percentile of CrUX data is. But again, I'm always more worried about, they still do look like they have some room for improvement on the Largest Contentful Paint in the wild. I'm always more concerned when it's like the opposite. Like if our results are too good, that to me is the real problem.
(27:49) Scott: Yeah. Interesting. Okay. So yeah pretty high observed, LCP here at least. Okay. And then time
(28:00) Tim: Blocking time is pretty high, and then that CLS is right at needs improvement, which looks like they're right on the edge, for this URL at least.
(28:11) Scott: Alright Okay. All right.
(28:12) Tim: So let's jump to run 2 and let's dive right into the film strip. There's a lot of red here. We'll get to that in a second. It looks, I think, wonder if there's, we also may have timed that interestingly, what do we got here? Oh no, these are so red is 404, right? So these are all 404 errors, like resource not found.
(28:31) Scott: Hmm. Interesting. Yeah.
(28:34) Tim: Yeah. And sure enough, it just doesn't, its not-
(28:36) Scott: Shopify site.
(28:37) Tim: Yeah. It looks like it's a Shopify site,
(28:43) Scott: Which, I've found with past clients, often means there are sometimes quick ways to improve your performance. Cause there's like that whole ecosystem of Shopify plugins, that you can pretty quickly apply a fix. So yeah. Yeah. So yeah, it looks like a bunch of 404s, but what were our initial timings? Time to first byte was not too bad.
(29:15) Tim: No, time to first byte wasn't bad. We're okay there. Our bigger issues were around-- we did have a bit of a gap, so I'm going to hop back to the summary for a second here. We did have a little bit of a gap between that time to first byte and start render, usually an indication that we've got render blocking resources, and then we definitely had a gap between largest contentful paint and when we get that first contentful paint.
(29:42) Scott: Yeah. Pretty big one there that, that suggests maybe, maybe images are either served from JavaScript or they're too large,
(29:53) Tim: Yeah. That's usually-- you're right. That's a solid point. If there's that big of a gap, there's probably something like that factoring in. So that's like a, that start render delay, I guess, first. So there are a couple blocking resources. We've got the CSS, which is always going to be blocking unless you're getting super creative with it. I think we talked about that on a previous episode, or one of the approaches that you'd come up with.
(30:23) Scott: Yeah. Well I think, just looking at this, even at a glance. I mean, one of the blocking CSS files is a fonts file. Maybe that one could move its font declarations into the HTML and reach out to the fonts a little sooner without that double
(30:41) Tim: That's a good point chain I mean it's, and it is small. Look at that
(30:45) Scott: Yeah. Just drop that right into the head of the page in a style element and yeah, that would be at least one of 'em, but they're all in the same domain. Right. So we're kind of,
(30:54) Tim: Yeah. They're all off cdn.shopify.com. So you're still going to have to make the request, and it's not like the, like the connection time, it's not like the request itself is super long in that one, but it's still a little bit, right. Like drop that in the head, I think you're right. Actually all their CSS is pretty small. Like this is what, this colors stuff. Pretty small.
(31:17) Scott: Could use some well it's so it's so small anyway. Doesn't really need to be
(31:27) Tim: The Theme itself. Okay. That's pretty typical. But yeah, you could roll, you could easily roll, the colors or fonts parts either into the theme for that single request or in line in the page and get those, you know
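The change being discussed — dropping the small fonts stylesheet inline — would look roughly like this. File paths and the font name are illustrative, not the site's actual values:

```html
<head>
  <!-- Instead of a render-blocking request for a tiny fonts stylesheet:
       <link rel="stylesheet" href="https://cdn.shopify.com/.../fonts.css">
       inline its rules so the browser can discover the font files sooner. -->
  <style>
    @font-face {
      font-family: "BrandFont"; /* illustrative name */
      src: url("https://cdn.shopify.com/.../brand.woff2") format("woff2");
      font-display: swap; /* show fallback text while the font loads */
    }
  </style>
</head>
```

This removes one blocking CSS request from the critical path and lets the font downloads start as soon as the HTML arrives.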
(31:39) Scott: Yeah, Is there compression on that? I am looking.
(31:44) Tim: Yeah, there is. So we do have, this right here, you can see the uncompressed size compared to the bytes in. Oh, that's good. Yeah. Okay. We're good there. Google Tag Manager does not look like it's, it looks like it's being loaded, probably asynchronously. Chrome's reporting a potentially blocking status, which means that that's usually async. That's like, if it arrives before we get something out onto the screen-- like, the way async, let me back up. The way async scripts work is, as soon as they arrive, then they get executed. So they're downloaded sort of in the background, they don't block during the download phase, but the execution phase they do. So if they arrive before you get something painted on the screen, well then the execution has to fire and it's blocking. If they arrive after you get something out on the screen, then it didn't technically block, like it'll execute, but you've already got something on the screen. So that's why it's a potentially blocking. Yeah.
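The loading behaviors Tim walks through map onto the script attributes like so (the src values are placeholders):

```html
<!-- Blocking: HTML parsing halts while this downloads and executes -->
<script src="https://example.com/vendor.js"></script>

<!-- Async: downloads in the background, but executes as soon as it
     arrives; if that happens before first paint, the execution still
     delays rendering - hence "potentially blocking" -->
<script async src="https://example.com/analytics.js"></script>

<!-- Defer: downloads in the background and executes only after the
     document is parsed, so it won't hold up the initial render -->
<script defer src="https://example.com/vendor.js"></script>
```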
(32:47) Scott: Yeah that makes sense. But that next script,
(32:49) Tim: This one,
(32:50) Scott: Yeah. That looks problematic.
(32:52) Tim: Yeah, it does. And this one is pulling. Yeah, this one is long. We've got a long connection time up front there.
(32:59) Scott: So another domain.
(33:02) Tim: Yeah. A separate domain and yeah, it's not like massively big file, but yes, all that cost right there and that is loaded in a blocking manner. So you can see there's that tiny little sliver of pink here. That bit of JavaScript execution before we ever get anything painted off here too. One thing that I would, we could do there is kick off a couple tests just to test impact of that.
(33:30) Scott: Such as blocking it maybe?
(33:34) Tim: Yeah. So I'm going to go back. This is run one, going to go back to the waterfall view, cuz there's a little bit of a shortcut on the view. We could add it to the film strip view. We should make a note of that. But in here, what I can do is we can block the request. We'll just block the URL, it doesn't matter too much, I suppose. This is cozy country, cozy country, and we'll run that. Like the other way to do this is, if you're on the homepage, there's the ability to block under advanced settings where you could drop in a URL or a domain. So there's a couple different ways to fire a test like this off. But what this is going to do is this will let us run the page, have that URL, block it from being loaded entirely, so that we can see, like, with that out of the way, does that make any [Inaudible 34:35] for us?
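Besides the waterfall shortcut and the advanced-settings field, the same experiment can be expressed in a WebPageTest script: the `block` command takes space-delimited substrings to match against request URLs, and any matching request is blocked for the run. Both values below are illustrative placeholders:

```
block	cozycountry
navigate	https://www.endasportswear.com/collections/mens
```

Comparing this run against an unmodified baseline shows the cost of the blocked third-party script.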
(34:38) Scott: And in theory, unless that tag manager is blocking, it could save a couple seconds, right?
(34:43) Tim: Yeah. Right. This is where the race is going to come in and this is why it's good to test this kind of thing. I'm going to close this off for a bit if, to your point with that
To your point with that async one, it's going to be a little bit of a race between the browser. Like how quickly it gets that CSS down to where it can like parse through the CSS and get to the rendering stage. And can it get there before Google tag manager arrives? And if it does then you're right. Like we could potentially shift that render quite a bit earlier. If it doesn't, then it might be one of those situations where like, if we can't complete the actual style calculation, part of the process, maybe we only gain a little bit of time here and it's probably one of those that's going to be like, depending, because it's a race condition. Sometimes it's going to be a bigger improvement than others. Like even in the real world I would imagine on this.
(35:37) Scott: Okay. Interesting. So yeah, that could potentially impact the gap between what, first byte and start render, right?
(35:46) Tim: Yeah. So this is the result of that one with it blocked came back. So we went from, what, we got about a 1.1 second gap. And here we had about a 2 second gap.
(35:58) Scott: Could drop that in the film strip. Maybe
(36:01) Tim: That's a good call. So I'm going to go to my test history. We'll actually take two views of this. We’ll select the two of 'em. We can compare that'll hop to our film strip. We can also plot, which we only did three runs. So it's not the most exciting, but it's, just to show it off. So
(36:25) Scott: Yeah. Look at that.
(36:26) Tim: Yeah. So our film strip, yeah, definitely out a lot earlier on the screen. Which is nice and that's, with similar time to first bytes, nearly identical actually, which is nice. Nice.
(36:38) Scott: So yeah, almost identical. The next thing to know, I guess, is how important it is for that script to block. Right. And if it could be deferred, maybe just add a defer attribute and they'd save a second or more, right, on initial render. So that's great.
(37:01) Tim: Yeah. And it is a vendor script, so yeah. You'd have to yeah. Look at the docs, talk to the vendor. Although there may be in my experience, at least like sometimes the vendors are aware that they can be deferred. Other times it's like a thing where they're trying to just get themselves to fire earlier. And that's where you may want to run your own experiments too, to your point. Like if you can defer it, even if it's not in the docs or even if the vendor doesn't think it's possible, sometimes you can.
(37:26) Scott: Yeah.
(37:26) Tim: But yeah, there's a nice little improvement there.
(37:29) Scott: Yeah. That's a good one.
(37:31) Tim: And then if we look at the plotting, this is one of the things I like to do for the variability. Let's see which test we had for the... so, the one ending with six was our blocked one. So we're going to compare it to the original test. And I'm just going to force the vertical axis to zero, just 'cause for me it's at least a little easier to see. What this will do is plot the results of each run. And in this case it was very consistent: when we had the stuff blocked, it was statistically significant in terms of how much faster it was, and it was consistent too. There wasn't a situation where it was ever slower. For this change I wouldn't expect it to be; it's just, sometimes there are changes that introduce a high level of variability in the results, and yeah, that doesn't seem like that's the case here. So
(38:26) Scott: Yeah. Nice. Yeah. That's a nifty little view that, I don't know. I, I think it probably doesn't get a lot of use, but um,
(38:34) Tim: No, and we, I mean, we hid it, to be fair. It was never very prominent. I think we only recently, like in the last couple weeks, put it here on this, I'm not going to plot all that, but we've only recently put it on the test history to make it a little easier. Nice. So that's a nice improvement.
(38:51) Scott: Yeah. Great.
(38:54) Tim: let's go back to our original test on this one and we can look at the film strip again. I probably have this open in another page. Don't I? Yeah, I do. This is me and my tabs, man.
(39:10) Scott: Are you starting to look at that? That LCP?
(39:13) Tim: Yeah. The LCP would be the next thing here. So actually, let's go back to the actual result for LCP, 'cause we've got that Core Web Vitals shortcut that'll make it a little easier. So yeah, just to review: we've got, in this case at least, a good ten second gap. Now, in the real world it doesn't look like the gap is as high, certainly not 10 seconds, but you can see we're in that "needs improvement" range, and there is a good 1.7 second gap. So there is a gap that occurs here. So if we click on that and come through... oh, in this case it's firing on this div, actually, this little text change up here.
(39:55) Scott: Interesting. Yeah. Okay.
(39:58) Tim: Which looks like it's the dynamic thing for folks. If you can't read that, it says: "Heads up! If you're outside of the US and Canada, shop from the Kenya store for better shipping rates." So there's potentially a dynamic message that's triggering LCP here.
(40:14) Scott: Okay. Yeah.
(40:16) Tim: Which probably explains the gap. At least the 10 seconds versus 1.7.
(40:24) Scott: Is that two jumps we're looking at?
(40:27) Tim: In terms of the shifting?
(40:30) Scott: Yeah. I wonder is this, is it playing different, ads like in rotation or is it, I guess those are steps early, right?
(40:41) Tim: I think it's steps.
(40:44) Scott: Yeah [Inaudible40:43] pretty close together. Yeah.
(40:46) Tim: We had a little bit of a shift happen, right. And yeah, I think it's a step thing. So we can see the little bit of shift we saw was from, yeah, not really ads. It's just kind of popping in that dynamic message sort of after the fact.
(41:00) Scott: Right. Good example
(41:00) Tim: So in this case, it's about holding the space for that div. But this is also, again, a good point of why understanding the difference between the results we might see synthetically versus real user matters. If we're reporting a 12 second LCP and real user monitoring, the CrUX data, is 2.7, well, that message itself might not show. It's only going to show for this version of the site, for one, not for the version that would be targeted at the local audience, it looks like. And the other thing is, that's one of those things that may or may not show depending on when it comes in, and if it doesn't pop in, that's maybe why we sometimes see a faster LCP. 'Cause if that message isn't there, then it's one of those images.
(42:01) Scott: So I guess we could... did you dig into what maybe is a script that's loading a little later that triggers that?
(42:10) Tim: That text? I did not yet. So let's see. Right, we've got a div here. Let's see, this waterfall truncates at the point where LCP fires. So it would presumably be something happening close-ish to that event, probably some bit of JavaScript execution, or a request that completes with data. There is a little bit of JavaScript execution here. In fact, this might be it.
(42:43) Scott: Yeah. That's interesting.
(42:44) Tim: So there's this [Inaudible 42:46], I think that's a personalization thing, right? Familiar with that?
(42:52) Scott: No, that's a new one to me.
(42:54) Tim: An integrated experience for collecting and displaying UGC, customer reviews, visual marketing, potentially.
(43:07) Scott: Yep.
(43:07) Tim: So if you look at the waterfall, the reason why I zeroed in on that one is, let's see if we can get to the
(43:15) Scott: LCP line.
(43:16) Tim: Yeah. The LCP line is right at the end of the waterfall here, and this is the tiniest little bit, you can barely see it 'cause it's not a super long execution, but we have some script execution from that script right before LCP. And then we also have some other execution, but it looks like it's from this one.
(43:35) Scott: Just Uno.
(43:36) Tim: Yeah. But that one looks like it's just after LCP, like maybe right after. So my guess is it's this Yotpo thing doing something. So this is one of those situations where, if that is the case, maybe that third-party script is pulled in that late. This is one of those rare cases where maybe we want to bump the third-party script up a little higher, or...
(44:07) Scott: Yeah Well, It's interesting. Like what kicks off the request to that script? Is it triggered by another script?
(44:15) Tim: It is, triggered by a loader, another loader from the loyalty CDN. Yotpo again. So now we're further up. That would be this one, it looks like; that one looks like it's actually pulled in.
(44:33) Scott: Okay. But that was requested pretty late. So maybe that could move up in the source.
(44:39) Tim: Right? Yeah. That would be an idea of, we talked a little bit about priority hints, I think the other last week, but that might, that might be an interesting solution there because you probably don't want it blocking display of your page necessarily, but you want a little higher. So I wonder if this is one of those situations where you could vote it deferred like this, but then give it a priority, hint, to tell it to come in a little bit earlier on the network, at least.
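The combination being described, deferred execution but an earlier, higher-priority fetch, might look something like this. The URL is a placeholder, and fetchpriority (the Priority Hints attribute) is a hint rather than a guarantee, with support mainly in Chromium-based browsers at the time of recording:

```html
<!-- Doesn't block rendering, but asks the browser to fetch it earlier -->
<script src="https://cdn.example.com/reviews-loader.js" defer fetchpriority="high"></script>
```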
(45:04) Scott: Yeah.
(45:06) Tim: And get that head start or yeah.
(45:08) Scott: Yeah. That's a tricky one, but yeah, that'd be a good place to start. Yeah. Interesting to see an LCP. That's not just a hero image. Yeah.
(45:23) Tim: It's always the hero image, right? It's a little, yeah. There's actually a question in here about identifying the LCP element in the waterfall when it's text. In the waterfall, that's trickier, right? 'Cause it's not an actual individual request, like an image.
(45:40) Scott: But we can show the element, right?
(45:43) Tim: Yes.
(45:44) Scott: In that other view.
(45:46) Tim: Yeah. Like if we're in the event summary, we can see the element itself, so we can see the div. We can see a little bit of the outer HTML. Sometimes there's going to be a hint in here; often if it's dynamically injected, there's some sort of signal in an attribute somewhere about where that came from. That didn't really seem to be the case here, which is why we then had to kind of dive in and say, okay, right before LCP fires, which is the end of this truncated waterfall, what happens? Is there a request that comes back right before that, maybe an API response with some content? Or is there some JavaScript execution, which is often the case? And then that's kind of your clue that, okay, it's probably triggered by that thing that occurred. But it is a little bit more work. Yeah, it's definitely not as straightforward as, oh, it's this image, boom.
(46:31) Scott: Right. Interesting. Okay. Yeah. Anything else to look at here? I think [Inaudible 46:38]... I was going to say, we mentioned a multistep test.
(46:41) Tim: That's what I was going to say. Let's do that, 'cause I think that's kind of fun, and I don't want to necessarily lose that here. So we're going to go through a flow: we'll start with this collection, then we'll go to an individual product, we'll add it to the cart, and then check out. So the first thing will be dropping that URL here, making sure we've got our usual test settings and stuff all fine. I'm just going to do multistep here.
(47:15) Scott: Should we do three runs again or just one for,
(47:19) Tim: Yeah, we could bring it down to one, just for expediting the results a little bit. I think that would make sense.
(47:25) Scott: I agree. All right.
(47:27) Tim: Okay. So the first thing we've got to do... we're going to have to do this inside of a script. This is the custom scripting functionality that WebPageTest has, to do flows and things like that. The first thing is just going to be to navigate to that page. I could drop the URL in, but we also have this sort of pass-through variable we can use that says grab the URL that's up here and substitute it for this. So we'll do that. So the next thing we need to do is get from this page to an individual product.
(48:05) Scott: You need a selector of some sort.
(48:07) Tim: Exactly. Yeah, we need to find a selector that's unique, or at least gets us to one of these links. So what I'm going to do is inspect the source. I tend to test these out as I'm doing it in the console; if it works in the console, it's going to work in the script. So for now, if we can just grab that first image, maybe the first product. We could grab any one we wanted, but I think this works fine. For links, href works really, really well. So if we did, let's see, my console's all full of stuff... if we did a document.querySelector of 'a' and then drop that href in there. Let me size that up a little bit just to make it a little easier to see.
(49:03) Scott: All right. Yeah, that looks good.
(49:06) Tim: I missed my closing there. Okay. So now we get a preview. This is why I like doing it in the console too, because it'll automatically highlight the element right away so I can see, yep, that's the one I wanted. I can see that I got the thing. And then we're going to have to fire a click event on it. So I'm going to copy that, assuming this works, and then hit enter just to make sure it does what I wanted it to. And that looks about right.
We'll go back to our script, and we'll add an execAndWait and just drop that right in.
(49:40) Scott: Slick.
(49:44) Tim: Yeah. Sorry about that. I realized partly through demoing this that I have, it's a little bit of a tease, stay tuned yeah. Stay tuned.
(49:55) Scott: Got that admin flag on.
(49:58) Tim: Yeah. Yeah, exactly. We'll get something out yet. My bad. Anyway, let's go back to just where we... that was the thing that confused me. Okay. So now we're on this page, so now we should be at this step. Our next thing is going to be to add one to the cart. If we wanted to, you could go ahead and choose a different size, but since we just want to measure the flow in this case, I don't know, we could probably just add to cart directly. So we'll just look at this button. Yeah, the ID looks like an option, or the name too, right?
(50:42) Scott: Yeah.
(50:43) Tim: Yeah. The only thing that throws me on the ID is this big number; it makes me nervous. Yeah, like I'm not sure if that's a [Inaudible 50:49] dynamic thing.
So let's try this, and then let's do the exact same thing: document.querySelector, and we're going to grab a button with a name of "add".
(51:09) Scott: Looks good.
(51:09) Tim: Looking good. And then click
(51:12) Scott: Love those little previews that you get in depth.
(51:13) Tim: I do. The previews are so nice. So I click through that. Great. Now we get a little checkout thing. So let's go back. Oh, go away. I don't think I'm set... I think we can ignore that. We could probably find a cookie to get rid of that, actually, but for now, for simplicity, we'll just leave it out of there.
(51:30) Scott: And you need an execAndWait on there as well.
(51:33) Tim: Yes, I do. Good call. I would've run this test and then been like, why isn't this working? Yeah, we could space this out just for readability too.
(51:44) Scott: Yeah. And if you're interested in these scripting options, all these various commands you can write are over in the docs.
(51:53) Tim: Yeah. Under Custom Scripting, right there in front of me. I was looking for it.
(52:06) Scott: Yep. All right there. Lots you got to do.
(52:09) Tim: Yeah, there's quite a bit there, actually. We use scripting for things like advanced Firefox preferences and all sorts of fun stuff.
(52:16) Scott: Yeah. Very cool.
(52:18) Tim: Okay. So now we're here, so the last thing we need to do is check out. So let's grab this one and we'll do the same. We'll just use that name attribute again and then do document.querySelector.
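Put together, the multistep script ends up looking roughly like this. The product href is a stand-in for whatever link the console experiment confirmed; the button names are the ones found by inspecting the page; and each execAndWait tells WebPageTest to run the JavaScript, then wait for the resulting activity to settle before moving on:

```
navigate %URL%

execAndWait document.querySelector('a[href="/products/example-product"]').click()

execAndWait document.querySelector('button[name="add"]').click()

execAndWait document.querySelector('button[name="checkout"]').click()
```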
(52:35) Scott: Good naming on this site. Everything's very clear.
(52:39) Tim: It's not always this easy to grab, especially when you're getting into some of the stuff where it's Angular, React, or whatever. Angular in particular, and these forms can be driven by that. Then you have to get into... sometimes it's funky, like where does the click need to be fired from, and is it even the right event? Yeah, this is nice. This is so much easier. That's the part I'm curious about, 'cause that transition there seems to take a little bit. So we'll grab that. So now we know the script worked in the browser. We've got our commands here. We're only doing the one test, so let's fire it off. I can zoom out. Zooming in flexes those responsive chops, though. I do like that.
(53:36) Scott: Yeah. And, and in my, my opinion, it highlights things that I need to fix
(53:44) Tim: There is that too. Yeah. It's nice. It's a good stress test.
(53:47) Scott: Yeah. Yeah. Usability was the first aim, yeah.
(53:53) Tim: Exactly. It's a good step in the right direction for that.
(53:57) Scott: All right. So we've got a multistep kicked off.
(53:59) Tim: Yeah, I know. This might take a little bit, right? 'Cause even though we're only doing the one test run, it's going to be a lot more involved. It's going to have to load the page, wait till the page loads, then exec the JavaScript, wait until any related activity concludes, then keep kind of moving through. But yeah, this time at least we get to see that entire flow. And what we're hoping for here is... it's one thing to land directly on a product page, it's another thing to land directly on a checkout page, but the actual process in real life, the way users are going to go through it, is those interactions in between. And then you want to check and make sure there's not a significant gap or unnecessary friction.
(54:41) Scott: Yeah. And it's the sort of thing that you could, run, periodically to check how that particular flow is performing. Um right. Which seems pretty useful, especially coming into holiday seasons, things like that. So,
(54:55) Tim: Oh yeah. When I was doing consulting and stuff, I definitely worked with a few orgs where that was the specific problem they would have: they'd see a big drop off when people got to the point where they'd added something to the cart and then went to the checkout. And if they were monitoring the pages individually, it looked okay. But when they looked at that actual flow, from "I've added" to jumping to my checkout cart, then it became apparent that there were some performance issues. And when you cleaned those up, you were fine. But yeah.
(55:27) Scott: Here we go.
(55:28) Tim: All right. So it breaks it out by step. Now, there's a couple things to note about the steps. It's not so bad here because most of these are distinct. See, we probably should have cleared that dialog thing, and see, it popped its head up a little bit, but oh well. It still gets at the actual work that's happening behind the scenes to get to our checkout process.
(55:52) Scott: Yeah. But it would've been pretty realistic to have to close a dialog, or...
(55:56) Tim: True. True. Yeah, maybe I shouldn't have skipped that, or at least added a cookie or something to clean it up, but hindsight's 20/20. The other thing is, on these steps, if it's a step that does not trigger another page navigation, you're not going to get metrics like Largest Contentful Paint and stuff like that, because those are based on page navigations. So we have the one where we added to cart, which brought that sidebar in; that's not a page nav, which is why we're not going to see any sort of LCP for that step or anything like that.
(56:30) Scott: Interesting. Yeah.
(56:32) Tim: Yeah. So now you've got the results. What's nice is we've got the waterfall for each. We already kind of saw that initial landing page, but now we're starting to see the impact of caching from one page to another. Like, we jumped from collections to the PDP, and now we're seeing a lot of that was probably cached, thankfully, from the collection stuff. But you still have all these other requests. Most of them look like little third-party things. It's hopping off to the cart quite a bit, actually, off to the cart here.
(57:02) Scott: That's true.
(57:04) Tim: Got a little adjacent connections there-
(57:08) Scott: Almost [Interposed talking57:09] kind of thing.
(57:10) Tim: Yeah. And this would be when we add it to the cart. We could've named these too, by the way; I did not do that in the script, but we could have used setEventName and then said "PDP" or "Add to Cart", which would've just made it a little easier to distinguish what these steps are. But
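The naming being described would just mean a setEventName line ahead of each step, something like this (selectors are the same hypothetical stand-ins as before):

```
setEventName Collection
navigate %URL%
setEventName PDP
execAndWait document.querySelector('a[href="/products/example-product"]').click()
setEventName AddToCart
execAndWait document.querySelector('button[name="add"]').click()
```

Each label is then shown on the corresponding step in the results, instead of the generic step numbers.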
(57:32) Scott: Yeah. Have you seen clients add, say, a delay to sort of mimic how long it would take to even find that button to click? To sort of...
(57:44) Tim: Good question.
(57:45) Scott: Cause this is I
(57:46) Tim: I haven’t yeah. I haven't seen many do it, but
(57:49) Scott: Right. It's as fast as a robot can do it. Yeah.
(57:56) Tim: Exactly. Yeah. I mean, that's also a really good point. It's going to be jumping around pretty darn fast, because it's going to find that button immediately; there's not the delay of sort of looking through the page or anything like that. I think there's probably a point, and the answer is probably different for different sites and situations, where you have to get to "realistic enough" and accept that there are going to be some misses, right? It's not going to perfectly mimic the real user experience, but...
(58:35) Scott: It's also about how you're communicating this. Like, if it's something where you're saying "it takes five seconds for the average user to reach our site and check out," you probably wouldn't want to rely on a test like this for that kind of statement, sure. Yeah.
(58:51) Tim: That's really good point too,
(58:52) Scott: But it's really good for testing every pull request and making sure you don't regress, that kind of thing.
(59:01) Tim: No, that's a really good point. And I do see, for what it's worth, this last step, which was the one I was a little curious about, which is: I've added to cart and now we went and clicked through to the checkout. That is a little bit of a lengthy process. We have all of this kind of fire before we ever get to, in this case, our actual checkout page. So we've got some delay there, just tracking and beaconing and stuff going on, that's adding a fair amount of time. And then we're making requests for some JavaScript files that apparently are related to the checkout, and a style sheet and stuff, that we haven't used before. One of them is pretty big; this JavaScript file is 250 KB. And those are all done in a blocking way.
And you can see the loading of that is going to delay that checkout page coming in. The other thing is we have this pink bar, which is sort of that DOMContentLoaded activity; that's where jQuery would typically trigger a lot of work. So there is a fair amount of stuff happening in that transition period. I don't know if this is an issue for them or not, but if there is potentially some drop off here, it'd be worth zeroing in on all this activity, basically lines one through maybe 19 or 20.
(1:00:32) Scott: Yeah. Are we using a tracking service that we don't need to use anymore?
(1:00:35) Tim: Yep. Or can we do some prefetching for some of these checkout things? Like, if I've added to cart, there's a pretty strong likelihood that I'm going to check out. Could I, at that point where I'm adding to cart, maybe prefetch some of these scripts or style sheets to get those in cache, so I'm not waiting for the request?
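A sketch of that idea in page JavaScript. The asset URLs and the add-to-cart selector are hypothetical stand-ins for the real checkout resources spotted in the waterfall:

```javascript
// On add-to-cart, warm the HTTP cache for checkout assets the user
// will likely need next, so the checkout page isn't waiting on them.
const CHECKOUT_ASSETS = [
  'https://cdn.example.com/checkout.js',  // placeholder for the ~250 KB checkout bundle
  'https://cdn.example.com/checkout.css', // placeholder for the checkout style sheet
];

document.querySelector('button[name="add"]').addEventListener('click', () => {
  for (const url of CHECKOUT_ASSETS) {
    const link = document.createElement('link');
    link.rel = 'prefetch'; // low-priority fetch into cache; 'preload' would be more aggressive
    link.href = url;
    document.head.appendChild(link);
  }
});
```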
(1:00:54) Scott: Also looks like a "preorder now" domain makes an appearance here that maybe could be at least preconnected, yep, on the prior page, 'cause it's pretty costly there.
(1:01:09) Tim: Yeah. No, that's a good point. That's another 350... that's almost, what, 700 milliseconds or so to do the connection, the DNS, TCP, and SSL there. So yeah, preconnecting off to that might be another option. That's a really good point. And those are where we can get dynamic. Like, we talk about the hints that have to be in the head, or you could use headers, but you can dynamically inject preconnects, preloads, prefetches. So again, at that add to cart, fire off a couple preconnects to that, or maybe some of these key payment services, which it looks like we might be doing DNS resolution for upfront; preconnecting to these might help.
Because you can see the DNS resolution, the little teal, happening, but a full-blown preconnect should also do our TCP connection and SSL negotiation. And for those key payment services, that might be a nice thing to speed those up too. So yeah, there looks like there's some opportunity here to close the gap a little bit and make that a little less friction.
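Dynamically injected preconnects for the payment and third-party origins could follow the same pattern (the origins below are placeholders, not the site's actual vendors):

```javascript
// Open DNS + TCP + TLS ahead of time for origins the checkout will hit,
// so the later requests skip connection setup entirely.
const CHECKOUT_ORIGINS = [
  'https://payments.example.com',    // placeholder for a payment provider
  'https://cdn-loyalty.example.com', // placeholder for the loyalty/reviews CDN
];

for (const origin of CHECKOUT_ORIGINS) {
  const link = document.createElement('link');
  link.rel = 'preconnect';
  link.href = origin;
  // Set link.crossOrigin = 'anonymous' if the resources are fetched with
  // CORS (e.g. fonts), since those need a separate CORS connection.
  document.head.appendChild(link);
}
```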
(1:02:11) Scott: Yeah. Yeah. Definitely. Those, those third parties, especially.
(1:02:14) Tim: Yeah. Yeah. Nice. So that's the scripting stuff. I think you know about where we are in terms of time and stuff, but yeah, the scripting thing is fun. I like digging into that.
(1:02:27) Scott: Yeah. That's nice. We looked at that site from a few different angles, and it seems like they're already in pretty good shape, but...
(1:02:37) Tim: Yeah. Their Core Web Vitals looked pretty solid in terms of what we were seeing there. I think they've got a pretty solid foundation to build on too. Sometimes you look at a site and you're like, this is going to require quite a bit of re-architecture to get there. I don't think that's the case here. I think we saw there are probably just a couple things here and there that, if they make those improvements, can get them some really big wins without having to do a full-blown re-architecture or anything like that.
(1:03:05) Scott: Right. Right take little steps
(1:03:09) Tim: Cool. I like it. Awesome. Thanks. Yeah. Nice, nice chat with you. I mean, we can do the wetsuit thing next time. Maybe.
(1:03:16) Scott: Yeah. I'll ship it to you. I don't know. It should take less than a week and
(1:03:21) Tim: Okay. I appreciate that. That would be good. Yeah, we could get, I wonder if we could get webpage test branded wetsuits, just the next team photos. All of us in webpage test wetsuits.
(1:03:31) Scott: I'll you, you try it on first and yeah, maybe we can all order 'em
(1:03:34) Tim: Sounds good. Yeah. Sounds like a good plan. All right. Thanks, man. And thanks, everybody, for tuning in. We'll be back in another two weeks, and as always, if there's anything you want us to look into, let us know.
(1:03:49) Scott: All right. See you later, Tim.