Introducing The Brand New UI, Auditing The Athletic with Scott Jehl and Tim Kadlec

About

Today was a big day at WebPageTest LIVE! Weeks of behind-the-scenes work were unveiled today: a stunning all-new UI. To walk us through the new updates, Scott Jehl joined Tim Kadlec to run through the changes as they profiled TheAthletic.com.

You can also read the blog post published today that discusses the updates: https://bit.ly/wptlive-new-ui-blog. Enjoy.

Want more info? Please follow us on Twitter: Tim Kadlec, Scott Jehl, and WebPageTest.

Sign up for a FREE WebPageTest account and start profiling


Speakers

Tim Kadlec
Director of DevEx Engineering
WebPageTest
Scott Jehl
Sr. DevEx Engineer
WebPageTest


Transcription

(00:08) Scott: Are you live already by the way?  


(00:11) Tim: I think we might be live. We're live, right? Aren't we?


(00:15) Scott: I believe we're live. Can you guys hear me now?


(00:17) Tim: Oh yeah. There's that sweet baritone voice.


(00:20) Henri: Oh no, it's the baritone mic. Mic-free I sound like a young Mike Tyson. Mic Tyson.


(00:32) Tim: Nice. I see the mic thing. Great. Yeah, that was nice. That was nice. That was good.  


(00:38) Henri: I'll let you guys take over for a hot second as I deal with some tech issues here.


(00:43) Tim: Yeah. Okay.


(00:50) Scott: Yeah. So, just gonna put a tweet up about something we're working on. Do we have any viewers yet?


(00:56) Tim: Yeah, there are a few people it looks like coming in.


(00:59) Scott: Okay. Give it a few minutes.


(01:02) Henri: Yeah, I believe there are. Actually, let me, as you guys sort that out, I'm gonna bring up the rest of the viewers here. Sorry, I see the comments. Yeah, some people are getting in. Happy, Walter, good morning. Hey Marco, Bruno dev, what's up? Glad you guys could join us. I'm actually gonna open up another window here. Bingo, I'm gonna go here actually. It's been quite the morning. I have a thousand windows open, so I have to apologize.


(01:43) Tim: All right. I think I do the same thing. Yeah. I mean, it's better than the end of the day. By the end of the day, I'll have a thousand windows with a thousand tabs each, and then I'll complain about the memory usage of the browser and why the machine is slow. But yeah. So yeah, I mean, I guess, Henri, you had a couple of things you wanted to bring up right away.


(02:06) Henri: I did. And I have to go back and check, something was being uploaded and I'm not sure if it's ready yet. But I will say this much right now: for those who came, I guess two weeks ago, I was gonna say July, can you imagine that, I'm just so excited. For those who came two weeks ago, on June... God, January 13th. It was myself and Paul Calvano of Akamai, who was nice enough to spend the entire hour with us here at WebPageTest LIVE. And we ended up auditing a bunch of EV sites and it was a lot of fun. Paul enjoyed it, I loved it. And the video should be done in the next 5/10 minutes at the very most, if it's not done already.


So, the minute that's done and processed, I'll share that in the chat. People have been asking when it'd go up, and it's pretty much done. It's just being processed, like I said, as we speak. So yeah, I'm excited to have that up and ready for folks to view. And a lot of insights: we discovered that Tesla was on Drupal, which was really cool, among other things. But yeah, that's really what I wanted to make sure I got across. I'm gonna hang out in the chat anyways, so I'll be able to share the link and see what's going on.


(03:37) Tim: Yeah. It's always nice to hear from Paul. He's one of the-- there's a bunch of people, I think, that are kind of unsung heroes in the web community. I would consider Paul to be one of those.


(03:46) Henri: Absolutely.


(03:48) Tim: I would like to just, this is actually true. We didn't plan this. But you're right. There's a little bit of a, probably starts at Scott and goes up to Henri.


(03:57) Henri: Yeah, something like that.


(04:00) Tim: I just get lazy. Yeah, it takes six months to get to this too. It's like, it's a thing. It's a commitment.


(04:07) Henri: I mean, I feel like, Tim, you look like you might break out into a song soon, it's like, so nicely just sort of like, yeah, [Inaudible 04:17] you about to, I don't know.


(04:22) Tim: All right. Yeah, so, I mean, that's awesome about the video. Yeah, definitely drop that in the chat. I think today we were gonna do kind of an old school live audit, but maybe we could, yeah, just drop in.


(04:37) Scott: Do you wanna share your screen?


(04:41) Henri: Definitely. And I'm gonna bounce myself outta here. I wanna make sure I do this right. Okay, perfect. I saw it. So I'll be in the chat, folks, hanging out with the people, and, guys, have a good time and I'll see you soon.


(04:55) Tim: See you, buddy.  


(04:55) Scott: All right. Yeah. Thanks.


(05:02) Tim: There we go. Look at that. So this is looking a little fresh here, Scott.


(05:08) Scott: It's looking a little different. Yeah. So to anyone who has tuned in today, we've put a bit of an update online on the site. You'll notice it throughout the various views, homepage, result page, we really almost touched every view on the site. We've reorganized some things, we haven't really changed a whole lot, but we've started the process of better highlighting and featuring some of the best parts of WebPageTest. And also we wanted to make the homepage a little more welcoming to people who aren't already power users of the tool, to sort of explain what it is and give some helpful starting points. So you'll see if you go to the homepage, we've got some suggested groups of tests that you might choose to start with. But all the options that you're used to finding are still there if you go to the advanced configuration. Under simple, we've highlighted some new locations. We've got Mumbai in there. We've got Toronto, Germany. And you might notice some browsers that are new, as of what, two days ago you put Edge on the live site, I think.


(06:40) Tim: Yeah, we started slowly rolling that out to the test agents and stuff a couple of days ago.


(06:46) Scott: Yeah. So you can run Edge, and Firefox we've had for a while. And there are some more browsers as well. If you go into advanced, you can do Safari and nightly versions of the browsers that are listed in that simple config. But what we wanted to do with the homepage was just sort of give a little more information about what you're looking at if you're coming to WebPageTest for the first time: see what it looks like when you submit a test, get a result page, a little screenshot up top to show you what that looks like. And yeah, that's that for the homepage. Maybe we could talk a little bit about the changes to the results as well.


(07:27) Tim: So, yeah, let's do that. Okay. So I did run, Henri suggested we look at The Athletic, which is a sports publication site. And actually really good sports writing, like if you like reading about sports, they write very, very well. So yeah, I ran a couple of tests, I think: one on Edge desktop, and one on a Moto G4. Yeah, Scott, if you wanna.


(07:55) Scott: Yeah, so I think the immediate thing you'll notice is that, while all of the result pages that you're used to finding are still there, they're kind of tucked into this new menu, just to clean up the page a little bit and make it clear what you can find right off the bat from the first summary page. We have a header that is new. It gives a little better information about the test conditions that were run. A lot of this information was sort of sprinkled throughout the UI before, and we've moved it into a central place, even in templates that formerly didn't really feel quite as connected to the other results, such as the film strip page.


So a lot of WebPageTest users' favorite view is the compare film strip view. We haven't changed it a whole lot, but we did add a header up top that gives a little more information about the test and lets you get back to the result pages of the test that you're looking at, which was always possible, but just a little harder to find. So again, not a whole lot of new features going on quite yet, we're just better surfacing the things that we have already. If you go back to the summary, we've brought the film strips into the summary page. They were always a little hard to find before: you had to scroll down and there was a link to the compare view, which of course was one of the most popularly shared pages of WebPageTest results. So we've brought those in. If you run multiple runs, or multiple first view/repeat view kinds of tests, you'll get metrics and timelines for those as well, and multi-step tests are all the same in this page. We've just sort of kept the tables we had with the metrics before, but punched up the type, made it a little easier to scan than it was in prior versions. So that's the summary.


(10:08) Tim: Do you also want to go through the individual one?


(10:10) Scott: Yeah. I should point out that a lot of this hasn't changed yet, and it's sort of an ongoing process. As you scroll down, that table is similar to how it was yesterday. So you'll notice some familiar parts for sure.


(10:27) Tim: Which I think was part of the goal, right? With the UI, we wanted to make this more approachable and friendly and easier to digest, like you said, for everyone. But that also means making sure that the guts of it are there. So for the folks who are familiar with it, you can still get to everything you're used to.


(10:49) Scott: Exactly. Yeah. What else is in that?  


(10:54) Tim: Well, this is new, like the little compare links. Kind of nice.


(10:59) Scott: Oh yeah, so we added some helpful links to be able to get to the timeline, or the film strip rather, in that particular button. Some of those links were elsewhere in the result pages, we just wanted to make 'em a little more visible. Let's see, what else do we have here? If you go into the result menu, I think, did we add any other pages? Oh, you're probably familiar with the link to find out what a site costs. External sites like that were in the body of the page before and linked out. So like in the metrics table we had a link to What Does My Site Cost, which is a site that Tim runs. That's moved into the menu here so that we could have a little more room to punch up the type in the metrics area. Let's see, what else. The details view has gotten a little local nav, which will help with being able to move between runs and repeat runs in this view. Before it was a little tricky, you had to know to edit the URL, really. I don't think we really surfaced a way to easily get between runs in the details view.


(12:23) Tim: Yeah, you always had to kind of go back to the summary and jump off from there. And this is nice, cuz I'm a fan of this. One of the things I've talked about in some of the other streams is, there's a lot of information here. We tend to zero in on that median run, but there's a lot of information in the variability between runs. Like if you're seeing that variability between run one and two and three, that's usually because there's some characteristic of the site that makes it slightly variable. And so there's, in my opinion, a lot of value to be gleaned by jumping between those different runs and seeing what's going on.


(12:58) Scott: Exactly. Yeah. And then what was another thing I was going to mention? Maybe that was it. A lot of the work went into just sort of playing up a lot of the features that we had already, like bringing in iconography for the locations and the browsers that you're using, and those are on the homepage and in here as well. Just trying to bring a little more modernized feel to the tool. And part of this is to set the stage for some features that are planned that needed a better frame, so to speak, in the UI. So those are coming very soon and we're really excited to release those.


(13:47) Tim: I was waiting to see if I was gonna have to mute you for a little bit. Actually, we could do that: I could mute you and you could talk about the features for a few seconds and we'll just play music over the top, censored or whatever. But yeah, no, you're right, some of this is setting the stage for a few things that we've got coming out in the next weeks, in the next months. And they're very exciting. So laying the groundwork for this, I'm super, super excited for what's coming next there.


(14:15) Scott: Yeah, me too. Yeah. So to be clear, we hit the launch button, what, like one minute before we went online here.


(14:26) Tim: Literally, yeah. It was literally within a few minutes of the Twitch stream going live. So yeah, we hit it and then jumped over here where Henri was waiting for us already. So, obviously we did do a little bit of walking through and testing and stuff like that, but it has been up for 5 minutes, 10 minutes, 20 minutes, I guess, at this point. So if there are things you find, please hit us up, probably the GitHub repo is the best place to do that.


(15:03) Scott: Yeah, absolutely. Feedback on Twitter or whatever is great. But any ideas or bugs, I'm sure there are going to be little tweaks that we still need to make. It's a very iterative process, ongoing design. So just let us know.


(15:24) Tim: I'll drop that actually in the chat, just so the link's handy. We have a nice little issue template and stuff that you can use to hopefully make that easier. Yeah, so, I mean, we can.


(15:35) Scott: I don't really have a whole lot other than that planned for today. Did you? We could do some.


(15:42) Tim: I think honestly, after kind of showing it off, it was diving into one of these tests, looking at some results and seeing if there's anything interesting, or fielding questions, cuz I'm seeing a couple of those coming across too, and that's kind of fun. We don't usually get, or field, a lot of questions on here about potential things that are coming in the pipe and stuff like that. And maybe we can talk about that. Yeah, if you're comfy.


(16:07) Scott: Sounds good. Yeah.


(16:09) Tim: So like this one, I think, Matt asking about the waterfall chart. Matt, be honest, you're just concerned that you're gonna have to update your impressively thorough and accurate how-to-read-a-WebPageTest-waterfall post, aren't you? That's what that question is about.


(16:24) Scott: Yeah, I can speak to that one. I think that was one of the high priority issues that we've had in the tracker for quite a while now. And it'll be a little easier to get to the waterfall now that we have the UI a little better framed to hold some updates there. Jeff in particular, I know he's excited to look at the waterfall. We were thinking of possibly moving things into SVG. I think in the process of updating, it would be nice to figure out what features should change about it or shouldn't change, so we're not just necessarily reproducing what's already there in SVG, but actually making it more interesting to use. I think we've got an issue for that.


(17:20) Tim: There's actually a couple of issues, I think, probably related to things about the waterfall, either features or functionality requests. And there are a few things that we're thinking of there too. If we can get it to something that is a little bit more text-based, like SVG or whatever, that should unlock some of that stuff. But I think it'll just make it a lot easier for folks to look through the waterfall and find what they're looking for and stuff like that.


(17:45) Scott: Yeah. So yeah, we're pretty excited about that one. I don't know about a timeline for that, though.


(17:58) Tim: But yeah, it's something, so jump in. A big part of the planning, for sure, is looking at GitHub and seeing which issues are getting discussed and what requests are coming in. We have our own ideas and things that we'd like to do, but when we get stuff like that in the GitHub repo, we pay close attention to that, and that helps us prioritize. So if there are things like that that you want to see tweaked or adjusted, that is the place to have that conversation, and we'll jump in for sure.


(18:27) Scott: Yeah, definitely. Do you see any other questions? I love the waterfall one though, that's been a priority for us.


(18:35) Tim: Yeah. I was glad to bring that one up, and we'll give you a little bit of a heads up so you don't have to panic. For anybody who's not familiar with Matt: Matt has written an incredibly detailed post with great information on how to read the WebPageTest waterfall, and there's a ton going on there. So whenever that happens, Matt, for sure we'll be giving you a heads up ahead of time so that you can get a little bit of fair warning and stuff like that.


(19:07) Scott: Yeah. I know I learned a great deal of how to use the waterfall personally, just from Matt's post about it.


(19:17) Tim: Yeah, for sure.


(19:21) Scott: Let's see, what else do we have here?


(19:29) Tim: A lot of the waterfall stuff, huh? What's that?


(19:30) Scott: A lot of comments.


(19:32) Tim: Yeah. No, that's good. I'm glad to see both the excitement and the questions about stuff.


(19:38) Scott: Script recipes.


(19:39) Tim: Script recipes is an interesting one. Like first party only. That's a good idea. We've got a few things cooking there.


(19:48) Scott: Yeah. I would say the answer is yes. We're thinking about that one.


(19:52) Tim: We'll leave it at that, but yeah. There are a few things cooking there. Yeah.


(20:00) Scott: Let's see. A good one in the film strip compare.


(20:06) Tim: Film strip compare: if there are no tags assigned, add some meta info, like the timestamp, to differentiate between the two. Cool. So I assume, yeah, that's what we're looking at. Yeah, it is smart.


(20:18) Scott: Yeah. And  I should elaborate a little bit on,  


(20:22) Tim: This is a terrible comparison. I picked two different sites. What was I--


(20:26) Scott: Actually, that's useful. Yeah, cuz that was the point I was about to bring up. So typically, or up until about half an hour ago, the film strip page looked more like this, where there was no header up top. And that'll continue to be the case if you're comparing sites that are from different test IDs. But if we happen to find that all the tests in the URL are from the same test, like multiple runs of the same test, for example, then we'll show it in a header like any other result page. But I agree, I think some level of labeling, yeah. We've got the edit title there, and I think that feature is in flux anyway. We were thinking of--


(21:16) Tim: It is, yeah, it's a bit in flux at the moment. We'll be doing some stuff there to make that a little bit more resilient than it is today and give you the ability to actually edit labels, cuz that doesn't actually change the label in your test history and stuff like that. So we'll be working on some of that stuff as planned as well.


(21:35) Scott: Yeah. What else do we have here? "Update looks nice." Thanks for saying that. Yeah, this has been a couple months in the making. It's a particularly challenging site, I think, to re-approach, because we have so many features that are baked into these views that we really wanna preserve. So a lot of this update wasn't very deep on the template-change level. It was more like CSS updates and finding areas where we could lightly change templating without compromising anything that people are expecting to find on there.


(22:29) Tim: Yeah, I actually think that's one of the fun, and I legitimately mean fun, things about when we've gone through and made these sorts of UI changes: you get an even deeper appreciation for just how many different things, Pat, I think you're in the chat, but how many different things Pat has added to this tool over the years. Things that can pop up, contextual information and stuff like that, or things that can change depending on how the test was run. There are a lot of fun little rabbit trails to go down with WebPageTest for different kinds of analysis, depending on what you're looking at. And whenever we're doing the UI changes, we're reminded of some of the ones that might be a little bit more hidden. One of the things I really liked about the UI changes was surfacing some of those favorite features, just making 'em a little bit more connected and easier to get to.


(23:19) Scott: Right.


(23:22) Tim: Yeah. Great. Awesome.


(23:25) Scott: Yeah. Do you wanna run some tests? Yeah. What about running the same site across a couple of those new locations that we feature, maybe, to see how it impacts things?


(23:40) Tim: Should we just run The Athletic on a bunch of 'em? Let me close off all these tabs, because I'm already getting to the point where I'm a little tab happy. Alright, so let's fire off, let's see. So one default here is mobile, 4G network, Virginia. We'll do desktop, cable, Virginia. Why not? We're just gonna do all of them. It sounds fun.


(24:07) Scott: Oh, I forgot to mention the vitals. The Web Vitals page has a little update too.


(24:13) Tim: Oh yeah, we'll show that in one of these results. So, I kicked off all those simple configs. The other thing to note on the simple configs: I think there are two things. I mean, you mentioned we wanna make it easy for folks. There was always a webpagetest.org/easy in the past that had a default, which was that mobile 4G kind of thing. But what we're hoping to accomplish with the multiple configurations is twofold. It's partly exposing, making people aware that, hey, there are these other locations, and then sort of encouraging folks to test on different locations and different browsers, because that's really critical and healthy, I think, for the web, to not always be looking at the same browser and the same location.


Certainly, the actual ones you might want to test on are gonna vary based on your own traffic. But we did try with our defaults here to use ones that are representative. So like Edge in Canada, I think it's somewhere around seven and a half percent market share or something like that, but it's higher than it is in the US. And other locations: Firefox in Germany is somewhere, what was it, like 12%, 13%, 15% maybe, it was even higher. Yeah, pretty high compared to the US, where I think it's 5% or something. So the defaults here are not arbitrary. They're picked based on what we see in terms of global traffic, to give some sort of smart defaults for that.
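
[Editor's note: if you'd rather script this multi-location spot check than click through the UI, WebPageTest's HTTP API can fire the same URL at several locations. A minimal sketch, assuming Node 18+ (built-in fetch) and an API key in a WPT_API_KEY environment variable; the location strings are illustrative placeholders in the location:browser.connectivity format, so query /getLocations.php for the identifiers your instance actually exposes.]

```typescript
// Fire the same URL at several WebPageTest locations via the HTTP API.
const API_KEY = process.env.WPT_API_KEY ?? "";
const TARGET = "https://theathletic.com/";

// location:browser.connectivity triples -- names below are assumptions,
// loosely mirroring the "simple config" defaults discussed above.
const CONFIGS = [
  "Dulles:Chrome.4G",               // mobile-ish default, Virginia
  "Dulles:Chrome.Cable",            // desktop cable, Virginia
  "ec2-eu-central-1:Firefox.Cable", // Frankfurt, Germany (assumed name)
  "ec2-ap-south-1:Chrome.3G",       // Mumbai, India (assumed name)
];

async function runTest(location: string): Promise<string> {
  const params = new URLSearchParams({
    url: TARGET,
    k: API_KEY,
    location,
    runs: "3", // multiple runs so run-to-run variability is visible
    f: "json",
  });
  const res = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
  const body: any = await res.json();
  if (body.statusCode !== 200) throw new Error(body.statusText);
  return body.data.userUrl; // link to the result page for this test
}

async function main() {
  for (const loc of CONFIGS) {
    console.log(loc, "->", await runTest(loc));
  }
}

main().catch(console.error);
```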


(25:43) Scott: Yeah. And I think it's just a small nudge. We're starting to move in this direction of encouraging this idea of testing in places where you're not currently seeing a lot of traffic, perhaps, sort of seeing how you perform in markets that could one day be your audience.


(26:08) Tim: There's that whole thing, what is that, there's a whole law, like one of those science laws or whatever, around that too. I mean, to put it in layman's terms: analytics can be a little bit of a self-fulfilling prophecy. If you're not testing on Firefox in Frankfurt, don't be surprised if you don't have Firefox users in Frankfurt. If there are gaps in the experience, that makes things in analytics not look awesome.


(26:34) Scott: Exactly. Yeah.  


(26:36) Tim: Yes we did. Yeah. First off, hats off, by the way, to Pat, who actually was the one who landed the PR. It kinda landed by surprise, I wasn't expecting it when it did, to get Edge actually in the agent. So yeah, that all rolled out, snuck out there, and that was exciting.


(26:53) Scott: Yeah. Thanks for that, Pat. That was really a surprise this week.


(27:00) Tim: So we've got, I think, results for all of the defaults now at this point.


(27:05) Scott: You've got your scroll bars turned on at all times.


(27:07) Tim: Oh, do I? I might have, yeah. I'm sorry about that.


(27:11) Scott: No, that's okay. It actually brings up something: this is an ongoing process of updating the design. And one thing that you'll notice is, if you open the site on a smaller screen size, a mobile phone, or just resize your browser, a lot of the features will overflow. So the metrics, you'll have to scroll through them. And we did that for a number of reasons. Partly because the result pages are designed to support multiple runs, sometimes many-stepped runs, so in those cases, reflowing all the numbers would happen for every run and it would get quite tall. So with that in mind, I'm still working on a way, if anyone has ideas I'd love to hear 'em, but a way to show that there's more to scroll to see, because sometimes it's not so obvious. So I'm thinking maybe a little drop shadow when there's more to scroll to on the left or the right could help with the metrics table.


(28:21) Tim: That'd be kind of awesome.


(28:22) Scott: Yeah. Maybe later this week that'll drop in. But just note, there might be more metrics. Also worth pointing out, and this is not a new thing: the metrics that you see are going to vary depending on the browser that you tested, and this is a good example, right? Firefox, you're not going to see the CLS metric or LCP, so some of those Core Web Vitals.


(28:49) Tim: Yeah. Those are only supported in Chromium browsers at the moment, which means we get 'em in Edge, we get 'em in our Chrome tests, and we'll surface those when we can. But if you're testing in Firefox or Safari, you'll get first contentful paint, that's supported across the board. You'll get total blocking time, because that's something that we can measure synthetically. But CLS and largest contentful paint, no. So yeah, you'll get some differences there, which is fine. I actually think it's probably good to do the multiple testing and realize it, because I've definitely talked to organizations who don't realize that when they're monitoring Core Web Vitals from their real user data, they're only actually looking at Chrome stuff at the moment.
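
[Editor's note: you can check that support gap yourself from any browser's console. PerformanceObserver.supportedEntryTypes is a standard API that lists which timing entry types the engine can report, which is exactly why the metrics tables differ. A quick sketch:]

```typescript
// Which of the entry types behind these metrics does this browser report?
const wanted = [
  "paint",                    // first-contentful-paint: broad support
  "largest-contentful-paint", // LCP: Chromium-only today
  "layout-shift",             // CLS: Chromium-only today
  "longtask",                 // long tasks feed TBT-style calculations
];

const supported = PerformanceObserver.supportedEntryTypes;
for (const type of wanted) {
  console.log(type, supported.includes(type) ? "supported" : "not supported");
}
```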


(29:32) Scott: Yeah. So Germany, Frankfurt, what are the timings looking like?


(29:38) Tim: Yeah. So let's see here. In Firefox, total blocking time is a little over 2.9, start render at about 3.7, and first byte is 2.5. I'm curious on the first byte, because we've gone global here a little bit. Time to first byte is 3.2 here, 1.6 out of India, which is interesting. And then of course these are gonna be on a faster connection here for the cable one. So there's definitely a little bit of variability in terms of that time to first byte, at least from different locations, it looks like. Unless it could also be a caching thing on the server that could be kicking in.


(30:23) Scott: Yeah. And it's not quite enough to trigger our flag, it looks like.


(30:31) Tim: Yeah. So there is a flag if the time to first byte is over a certain level across all test runs, I believe, in that test. Because we've seen bot detection identify WebPageTest as a bot and then artificially slow down that response.


(30:47) Scott: Some CDNs do. Right.


(30:50) Tim: So if we see that, we'll trigger an alert up here to let folks rerun with the original UA preserved, so that it doesn't identify as WebPageTest.


(31:00) Scott: Yeah. And I think we have that threshold set pretty high. I think it's over three seconds, cause it's not even kicking in here. Might be something we adjust down later.


(31:10) Tim: I think it's three, and it's three across all the runs. So for example, run 3 here, the time to first byte is much, much faster, and that's why it doesn't trigger. Because if it were bot detection, it would hit on all of these. And yeah, thanks, Pat, for jumping in. I saw Walter had the question about time to first byte including TLS, DNS, and all that kind of stuff. And as Pat noted, at the page level, time to first byte does include everything from the start of the navigation until that first response comes back, including, yes, all of that stuff.
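
[Editor's note: as a rough sketch of that "across all the runs" logic, paraphrasing the behavior described here rather than WebPageTest's actual source: real bot detection would slow every run, so the flag only fires when all runs clear the threshold, while a single slow run is treated as ordinary variability.]

```typescript
// Illustrative only -- not WebPageTest's implementation.
const TTFB_FLAG_THRESHOLD_MS = 3000; // "I think it's three" seconds

function shouldFlagBotDetection(ttfbPerRunMs: number[]): boolean {
  // Flag only when *every* run's TTFB exceeds the threshold.
  return (
    ttfbPerRunMs.length > 0 &&
    ttfbPerRunMs.every((ttfb) => ttfb > TTFB_FLAG_THRESHOLD_MS)
  );
}

// The Frankfurt test above: two slow runs, one fast run -> no flag.
console.log(shouldFlagBotDetection([3200, 2500, 618]));  // false
console.log(shouldFlagBotDetection([3400, 3150, 3600])); // true
```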


(31:45) Scott: Yeah. And Matt had a good question in there about highlighting metrics that aren't supported. And this is something we've begun to call out a little bit, just in the setup paragraph. Each result page has a little explanation now that didn't used to be there. That paragraph gives you an idea of what you'll find in that result page below, and the summary in particular mentions that the metrics you see will vary by the browser that you run. That said, I'd like to do a little more in that regard. Maybe alongside the metrics table, add a little tooltip or something that mentions the metrics that aren't supported by this browser. Something like that. Not seeing what you expect, that kind of thing.


(32:40) Tim: Yeah. So I think in general, that's one of the things that we'll be continuing to do and improve: having that contextual information to make it easier for folks to see why X is different here, or why this isn't there, or whatever. I think that's one thing that's probably safe to say. And the other thing is the cross-browser stuff. We've had some really good conversations with the Firefox and Edge teams and stuff like that. So I'd say it's pretty safe to expect that over the next few months we'll be leveling up a little bit in terms of how we handle cross-browser differences, and perhaps even providing the ability to grab different metrics if they're available in the different browsers, exposing them differently, but also underlying dev tools. There are certain things you can do, like Edge has been doing some interesting stuff around memory and things like that. So I'd be curious to see if we can tap into some of the tools that they're building and surface them at that point. So there'll be more coming there.


(33:44) Scott: So yeah, that's a great question.


(33:54) Tim: So let's see. So we saw the time to first byte had a lot of variability, which is fair. I mean, you're comparing across different networks, different locations. It kind of makes sense. I don't see--


(34:06) Scott: It's interesting. It's seemingly faster in Asia for this site, right?


(34:13) Tim: Yeah. Now this could be, these are all just looking at the high level of these waterfalls. Actually here, you know what, I'm gonna use your new link, because this is handy, to jump to all three of them across this test. What's interesting to me here is, if we're looking at time to first byte, we're looking specifically at this teal, orange, and purple section right up to that first dark shaded blue that we're getting. So not the full response, but just that first-byte time, and it's fairly consistent across these tests. It's significantly slower on one of them. But we saw some variability too on the other one. Like we had two that were really slow time to first byte from the server, and then the third one looked okay. So if we were to click on the test run details on one of them, another handy little link, this is one where it was slower.


Yeah. So we had 2.5 seconds at this point. And on the other one, let's grab this one, we had a time to first byte of 618 milliseconds. So that's quite a bit of difference there just between those two test runs. That tells me we might have something from a caching layer. Okay, so we've got a Cloudflare cache status of "hit" in this case. In the slow one, let's see what we got: "expired". Okay. So we've got some cache layer that kicks in from Cloudflare in this case. And when it's there, we have a really fast time to first byte, when we got that cache hit. And when we're running into this "expired" status, we end up with a very slow response from the server. So this means we could probably tweak some of the settings in Cloudflare to cache more aggressively, potentially. It depends how important it is that when The Athletic publishes new content, it's out there right away. But you could probably get a little bit more aggressive there. And there's also potentially work that could be done on the actual server itself, so that when the cache doesn't hit on Cloudflare, at least that response is a little faster.


You'll see this a lot. This is why the variability stuff comes in. If you're looking at just that median run, you might miss this. But if you're looking at the discrepancy between those different runs, that's when you're gonna note that, hey, we've got potentially a time to first byte issue that's gonna be impacting a high percentage of runs, based on how many times we're seeing it across these. And that's something worth digging into.
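
[Editor's note: one way to poke at the same cache behavior outside of WebPageTest is to request the page a few times and log Cloudflare's cf-cache-status response header (a real Cloudflare header with values like HIT, MISS, EXPIRED, DYNAMIC) next to a rough timing. HIT should line up with the fast responses, EXPIRED or MISS with the slow ones. A small sketch, assuming Node 18+ for the built-in fetch; the client-side time here is only a coarse stand-in for the TTFB a test run reports.]

```typescript
// Probe a Cloudflare-fronted page and log cache status alongside timing.
async function probe(url: string): Promise<void> {
  const start = performance.now();
  const res = await fetch(url);
  await res.arrayBuffer(); // drain the body so timing covers the response
  const elapsed = Math.round(performance.now() - start);
  console.log(
    res.status,
    res.headers.get("cf-cache-status") ?? "n/a",
    `${elapsed}ms total`, // coarse client-side time, not a true TTFB
  );
}

async function main() {
  for (let i = 0; i < 5; i++) {
    await probe("https://theathletic.com/");
  }
}

main().catch(console.error);
```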


(37:08) Scott: Yeah. Interesting. What do the vitals pages look like on these?


(37:15) Tim: Yeah. Let's pop to one of those.


(37:17) Scott: Well, first of all, you won't see them on each one, right? So the Firefox page won't have those metrics.


(37:23) Tim: Yeah. So the vitals link just doesn't show up on a few of them, just because there's nothing to show you. Edge, though, should have it, because Edge reports those metrics, it's Chromium under the hood.


(37:36) Scott: Look at that little Edge icon up there.  


(37:39) Tim: Yeah, it's pretty nice. So the Core Web Vitals page got a little cleanup as well.


(37:49) Scott: Yeah, a little bit. Again, it's an ongoing process, we're touching all these templates and there are just a lot of moving parts here. So you can expect to see things continue to shift around, particularly on a page like this that has a great deal of data. But yeah, a cleanup of sorts already. This page is just loaded with interesting information.


(38:14) Tim: It is. There are actually a few things I know we were holding back on until after this UI change, to add to this page too, in terms of contextual information. So there are a few other things I think we can surface to make it easier for folks to jump in, like on this one. And so we've got, just like we have had before, the comparison of these metrics for the WebPageTest run itself. And then down here we can see how that compares to the Chrome User Experience Report data. And so WebPageTest, in this case, was a bit slower, for example, on LCP. But the CrUX data shows that even at P75, it's still in that "needs improvement" area. So we're probably fairly representative, at least, of what the issue is.


(39:01) Scott: And that's which location? That was Edge?


(39:03) Tim: This is out of India. And I mean, just to compare, I guess right now we're pulling this in on, let's see here.


(39:14) Scott: What would that mean in that case? Because the India location was set up to throttle to 3G. So would that land it slower than the CrUX data just by nature of being on a [Interposed talking]


(39:33) Tim: This is actually a really good example of how this CrUX data can be used, I think. So let's compare. We've got three tests here from Chrome: one was from Virginia on a 4G connection, one is from Virginia on a cable connection and a desktop device, and then one is Mumbai on a Moto G4 and a 3G connection. So first off, the CrUX data is only going to be showing you desktop or mobile. In this case it's a desktop test, so what we're pulling here is the actual desktop CrUX numbers for this URL. And that's why, if you look at the P75 for the mobile tests, the CrUX data is actually a different, I got click happy, it's actually a different number. The 75th percentile for mobile actually looks faster for The Athletic than the LCP on desktop.


Then it also impacts our test results. So this desktop result, testing on a cable connection, was too fast. This doesn't look very representative of what Chrome is at least seeing in the wild for this site on desktop devices. It's seeing significantly slower scores, which suggests that if we really wanted to test this accurately, we'd probably need to slow this connection speed down on this test, at least enough that we're seeing the same issue that it's seeing in the real world. Cause otherwise, this looks good, right? Like the 2.3 seconds looks good. Everything's looking great, but the [Inaudible 41:08] isn't backing that up. So we'd probably need to slow this one down.


For this one, let's see, the 3G and 4G both ended up in the red. They're a bit slower than that P75. So maybe, in this case, maybe we do go a little bit faster on the network, or maybe the device could be a little bit more powerful. But I generally find it less concerning if WebPageTest is a bit slower than real user data, because at least I'm seeing issues that actually exist. I'm more worried about the other kind of scenario, cuz then I'm gonna get a false positive, basically. I'm gonna get this reassurance that things are great, and that's maybe not the case.
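
[Editor's note: that lab-versus-field sanity check can also be automated against the public CrUX API, which serves the same dataset the vitals page compares against. A hedged sketch, assuming a CRUX_API_KEY environment variable and the response shape documented for the API; the 0.75 cutoff is an arbitrary illustration, not an established rule.]

```typescript
// Compare a lab LCP against the field p75 from the CrUX API.
async function cruxLcpP75(
  url: string,
  formFactor: "DESKTOP" | "PHONE",
): Promise<number> {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${process.env.CRUX_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url, formFactor }),
    },
  );
  const body: any = await res.json();
  return Number(body.record.metrics.largest_contentful_paint.percentiles.p75);
}

async function compare(labLcpMs: number): Promise<void> {
  const fieldP75 = await cruxLcpP75("https://theathletic.com/", "DESKTOP");
  // Lab much faster than field p75 -> likely the "false positive" case
  // above: the test config is too generous and should be throttled down.
  // Lab slower than field is less worrying: the issue still reproduces.
  if (labLcpMs < fieldP75 * 0.75) {
    console.log(`Lab LCP ${labLcpMs}ms << field p75 ${fieldP75}ms: throttle the test down.`);
  } else {
    console.log(`Lab LCP ${labLcpMs}ms vs field p75 ${fieldP75}ms: roughly representative.`);
  }
}

compare(2300).catch(console.error);
```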


(41:51) Scott: Right, that makes sense. Thanks. Yeah. What else is there to look at? Oh, for power users, so to speak, who are running scripted tests or custom metrics: a lot of that information was found in various parts of the UI, and now you'll find it in that More menu. At least the custom metrics link goes here. That used to be a page of its own, and now it's in the details view, and what's nice about that is it's easy to change between detail runs. So like first view, second view, and then first run, second run, those will all pair with the right custom metrics. So we figured, since that local nav is there already, we might as well put that right in the page instead of having a separate page for it. So that'll redirect. You might recall the screenshot page, that one now redirects to details as well. And the screenshot that you see is actually live per the run that you're looking at. So if you were to go to run 3, or say you were to go to step three of a stepped run, you would see potentially a different screenshot, depending on what you're looking at.


(43:21) Tim: Which is nice.


(43:23) Scott: Yeah. That looks good. Still some tweaks to make, but I think it was definitely ready to go for now. Yeah.


(43:36) Tim: So yeah, I guess just another shout out: for anybody who's digging through, whether it's feature recommendations that you've got, and there are quite a few good ones here in the chat that people are bringing up that are awesome, or if you come across anything that's not quite looking right or working right, please head over to the GitHub repo and file an issue. It's the best way to get it in front of us and the best way to get it prioritized. As Scott pointed out, we'll watch Twitter too and get feedback there, but that introduces at least the layer of, oops, we have to go file it on GitHub so we can track it. So if it's on GitHub, we know it's in front of us and we can prioritize from there.


(44:23) Scott: Yeah, that's great. Yeah.


(44:24) Tim: But yeah, stay tuned too, because like I said, there'll be more coming. We can't tease too much, but it won't be long before you start seeing a few things, some major features. But then, as always, WebPageTest takes the approach of trying to get quick iterative improvements and things out there too. And so that's always going to be the case as well.


(44:52) Scott: Yeah. We'll be pushing tweaks as we find them.


(44:53) Tim: Yeah, for sure.  


(44:57) Scott: Yeah. Is that about all you wanted to cover for today?


(45:00) Tim: I think for today that's probably pretty good. We've only got about eight minutes, I think, at this point, and trying to dive too awful deep on an audit would not be great. So let's bring my man right back on.


(45:11) Scott: Henri


(45:12) Tim: Hey. Surprise.


(45:15) Scott: He's always got the mic like just ready.


(45:19) Henri: Yeah, that was a surprise. I was typing, I was like, what just happened here? I mean, thank you, actually. Cause I was trying to keep an eye on the chat and Twitter and stuff, and I noticed the Paul video is finally done. So I'm posting that into the chat right now for anyone who wants to go check it out. Boom. There you go. Yeah, that was actually funny, I was like, oh, he brought me in, cool.


(45:58) Scott: That show you recorded is up, there it is.


(46:03) Henri: Yeah, it is on YouTube. It took a while to upload. Everyone's got the amazing download speeds; it's really the upload speeds that matter, and mine is not gangster, so I'll just leave it at that. But what I did wanna say is, it's been amazing kind of reading and seeing some of the comments online with regards to the new layout and everything that's happening. So it's been a very amazing watch, and if I could see eyes light up, I mean, that was happening in the comments. So it's been pretty awesome, I must say.


(46:48) Scott: Glad to see some positive feedback. It's good.


(46:51) Tim: For sure. Yeah.


(46:53) Scott: Exciting stuff on the way.


(46:55) Tim: It's a perpetual thing, by the way: we are always game for you shooting us URLs to audit and stuff. It's always fun. So yeah, if anybody does have sites, feel free to hit us up on Twitter at any point, or here or whatever, and just drop 'em in. We keep a list and we'll pull from it when we get a chance. So yeah. Perfect. All right. Well, thanks everyone. And yeah, if you wanna learn more about the UI changes and stuff like that, we do have a blog post up that you can check out. I'll drop the link here in the chat. But yeah, thanks everybody for tuning in, and looking forward to catching up with everybody next time.


(47:33) Henri: Amazing.


(47:35) Tim: Thanks, gang.


(47:36) Scott: Have a good day, all right.


(47:37) Henri: Totally. Thanks. Thanks for one great show, guys. All right.


 


Tags
Site Speed
Core Web Vitals
Website Performance
