Many sports involve some element of timekeeping. There are moments when timekeeping is done with immense precision and accuracy, and other moments when it is sloppy at best. If you ever watch the last two minutes of a college basketball game, you will see endless reviews of when exactly a ball went out of bounds. The refs will ensure that the time remaining on the game clock is accurate within a tenth of a second. Yet throughout that same game, seconds are gained and lost due to random human error in managing the clock. The ball is inbounded and the clock starts a second too early or late. Someone commits a foul and there is a half second before the clock is actually stopped. What about this one — in the clip below, LeBron commits an 8 second violation, but you can see it took 9.1 seconds off the game clock (from 33.7 seconds remaining to 24.6 seconds remaining). Should we care? Probably not.
There are cases where we don’t really care about being precise with time measurements even when it’s blatantly baked into the rules of the game. In basketball, supposedly a player only gets 10 seconds to shoot a free throw. That rule seems to be rarely enforced, though to be fair I found this two minute compilation of Giannis getting called on it. Even in this case, there is no official 10 second counter. We are just relying on refs doing their best to count to 10. (I tested myself counting to 10 with a stopwatch a few times and I’m proud to say I am decent at it. Give it a try for yourself sometime.)
In the NFL, it’s not uncommon to see the center hike the ball after the play clock expires without any penalty being called. The offense is usually given an unspoken extra half second or so to get the play off after the clock expires! One very noteworthy case happened in the closing moments of a Ravens vs. Lions game. Lamar Jackson committed a very obvious delay of game. There was no call. On the next play, Baltimore won the game with a record-setting 66 yard field goal.
The reality is that there are trade-offs when it comes to timekeeping and watchability. No one cares about the 1.1 seconds of lost time after LeBron’s 8 second violation. No one cares about the split seconds of time gained and lost all throughout the game. You just want to watch the freaking game. If anything, there are too many reviews and slowdowns as it is.
Even though we only sometimes care about timekeeping, it’s clear that over the last 50 years we have continued to take steps to care more about it. Prior to the 1989 NBA season, game clocks did not even display tenths of seconds. That meant if the game clock showed 0:01, no one really knew if there was a full second left or just some fraction of a second. You can see a fun example here in maybe the most famous NBA one second buzzer beater pre-1989. No one will ever really know how much time was left here.
Moments before this shot, Celtics fans stormed the court thinking the game was over, and one even decided to fight a ref. One can only imagine how passionate this guy was about timekeeping.
Post 1989, we have the luxury of seeing tenths of seconds on game clocks, and we know that Derek Fisher got this shot off in less than 0.4 seconds.
That’s what got me thinking though. Just like in 1976 when a clock showed 0:01 and we didn’t know if that meant 0.2 seconds, 0.5 seconds, or a full 1.0 seconds, when Derek Fisher took that shot, we didn’t know if there were 0.32 seconds, 0.37 seconds, or 0.40 seconds left. How do these clocks actually work? Do they round up or do they round down?
Well it turns out even that has changed over time as we will examine more closely.
If a clock only tracks whole seconds, I think (I hope) we can all agree that 0:00 means truly 0.0 seconds remaining. So as the clock flips from 0:01 to 0:00, the buzzer sounds and no time is remaining. From that, we can infer that on “whole-second clocks” time is always rounded up. If there is half a second left, the clock will still display 0:01. If there is a tenth of a second left, the clock will still display 0:01. We can also extrapolate that out to any other point in the game. If the clock reads 1:23, then technically there is somewhere between 1:22 and 1:23 left, or at least that’s the way it used to be.
Something curious happened when the NBA started introducing clocks that display tenths of seconds. Up until the final minute of the game, the clock displays whole seconds only, but in the last minute you will see the clock go from 1:00 to 0:59.9. What you will notice in the clip below, though, is that the clock actually hangs on 1:00 for a full second before switching to 59.9. From that we can infer that when the clock read 1:00, it actually meant some time between 1:00 and 1:01 was left on the clock. That means time is now rounded down!
We can take this to an even weirder case next. It was not until the 2011-2012 season that the NBA started adding tenths of seconds to the final 5 seconds of a shot clock. So that meant from 1990 until 2011, the game clock had tenths of seconds displayed in the final minute of a quarter, but the shot clock still had whole seconds only. Of course, it was still necessary for shot clocks to be rounded up to guarantee that 0 seconds actually meant 0 seconds — it would be unacceptable to have 0.3 seconds displayed simply as 0 seconds on a shot clock. So in summary, from 1990 to 2011, game clocks rounded down and shot clocks rounded up. Remember that for Final Jeopardy. During this era, you would often have this fun situation:
A possession might start with between 24 and 25 seconds on the clock like this.
After holding the ball for 0.9 seconds, the shot clock would still read 24.
This looks a bit nonsensical because the shot clock can never have more time than the game clock. Of course, it didn’t actually have more time. Time was just rounded up on shot clocks back then. A moment later, order would be restored with the game clock again appearing to have more time.
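To make the era’s mismatch concrete, here is a minimal sketch (in Python, with made-up clock values) of how a floor-rounded game clock and a ceiling-rounded shot clock could briefly appear out of order:

```python
import math

def display_game_clock(true_seconds):
    """1990-2011 era game clock: whole seconds rounded DOWN."""
    return math.floor(true_seconds)

def display_shot_clock(true_seconds):
    """Pre-2012 shot clock: whole seconds rounded UP."""
    return math.ceil(true_seconds)

# Hypothetical possession: 24.6 true seconds on the game clock,
# a fresh 24.0 second shot clock. After 0.9 seconds of play:
game_remaining = 24.6 - 0.9   # 23.7 true seconds
shot_remaining = 24.0 - 0.9   # 23.1 true seconds

print(display_game_clock(game_remaining))  # 23
print(display_shot_clock(shot_remaining))  # 24 -- "more" than the game clock!
```

Neither clock is wrong here; the two displays just round the same instant in opposite directions.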
Given that this was in the 2011 NBA Finals, this may have even been the last known occurrence of this phenomenon, as the next season would see tenths of seconds introduced to the final 5 seconds of a shot clock. Just like how the game clock second rounding changed, so did the shot clock. Shot clocks are now also rounded down when displaying whole seconds. One way to see this again is by simply noticing that the shot clock hangs on 5 seconds for a full second before switching to 4.9.
This rounding concept comes up from time to time and confuses everyone when it comes to enforcing an 8 second violation. When taking possession of the ball in the backcourt, the offense has 8 seconds to bring the ball to the front court. Given that the shot clock starts with 24 seconds on it, what will the shot clock read when an 8 second violation occurs? As a casual fan watching a game, it’s tempting to see it read 16 seconds remaining and believe that a violation has occurred. The problem as we just discovered is that when displaying whole seconds, shot clocks are rounded down. So when it says 16 seconds, there are really somewhere between 16 and 17 seconds left. If there are 16.1 seconds left, then only 7.9 seconds have come off the clock and there is no 8 second violation yet. It’s not until the clock switches from 16 to 15 that refs will actually blow the whistle. Of course, by the time the clock has switched to 15, technically just a fraction of time more than 8 seconds has elapsed, so the whistle will be blown too late! That assumes the ref blows the whistle at all — can we really expect a ref to keep one eye on the shot clock and one eye on the ball at the same time? No. You try getting something like this right in real time!
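As a sketch of that logic (assuming the shot clock starts at exactly 24.0 and whole seconds are floored for display, per the rounding behavior described above), a “violation” keyed off the displayed 16 would fire too early; the true rule only trips once fewer than 16.0 real seconds remain:

```python
import math

def shot_clock_display(true_seconds):
    # Modern whole-second display: rounded down (floor)
    return math.floor(true_seconds)

def eight_second_violation(true_seconds):
    # More than 8 of the 24 seconds have elapsed, i.e. fewer than
    # 16.0 TRUE seconds remain (assumes the clock started at 24.0)
    return true_seconds < 16.0

for t in (16.4, 16.1, 15.9):
    print(t, shot_clock_display(t), eight_second_violation(t))
# 16.4 -> displays 16, no violation
# 16.1 -> displays 16, no violation
# 15.9 -> displays 15, violation (and the whistle is already a hair late)
```

The gap between the displayed 16 and the true sub-16.0 threshold is exactly why the casual read of the shot clock misleads here.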
So to recap, we now live in a world where:
Shot clocks are rounded down when displaying whole seconds. If 23.5 seconds remain, then the clock will say 23. Under 5 seconds, the clock will begin displaying tenths of seconds.
Game clocks are rounded down when displaying whole seconds. If 4:21.7 remains, then the clock will say 4:21. Under a minute, the clock will begin displaying tenths of a second.
This brings us to the final piece of the puzzle. We will focus on the shot clock example, but this same problem applies to the game clock. Based on this whole concept of rounded down shot clocks, I have been assuming that the 5 second mark is displayed when there are between 5.0 seconds and 5.9999 seconds (I’ll use four 9’s for simplicity). The moment there are less than 5 seconds left — let’s call it 4.9999 seconds — I assume it would flip to 4.9. However, if true, that creates a new problem. If 4.9999 is displayed as 4.9, then that would mean the moment there is just less than 0.1 seconds remaining — let’s call it 0.0999 seconds — the clock would read 0.0. That simply cannot be true. There is no way a clock would read 0.0 and a buzzer would sound if there was actually closer to 0.1 seconds remaining.
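Here is a tiny sketch of that strict round-down assumption (hypothetical display logic, not the NBA’s actual implementation) showing exactly where it breaks down:

```python
import math

def strict_round_down_display(true_seconds):
    """The assumption under test: floor to whole seconds above 5.0,
    floor to tenths of a second at or below 5.0."""
    if true_seconds >= 5.0:
        return f"{math.floor(true_seconds)}"
    return f"{math.floor(true_seconds * 10) / 10:.1f}"

print(strict_round_down_display(4.9999))  # 4.9 -- fine so far
print(strict_round_down_display(0.0999))  # 0.0 -- buzzer with ~0.1 s left?!
```

The same rule that makes the 4.9999 case look right produces an impossible 0.0 at the other end, which is the whole puzzle.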
So where the hell has this missing 0.1 seconds gone? It seems like there are a few possibilities, some more plausible than others:
Up until this point, I have believed the shot clock, when displaying whole seconds, is strictly rounded down. So when the shot clock starts, it displays 24 seconds for a moment in time but immediately drops to 23 seconds after a fraction of a second — since 23.9999 would be rounded down to 23. I suppose we have to consider the possibility that it actually drops to 23 at 23.90. That is, maybe the shot clock is only rounded down when the decimal is <.90 and otherwise is rounded up. So the clock goes from 6 seconds to 5 seconds at the 5.90 second mark, and exactly one second later the clock drops to 4.9 seconds. I will call this the “0.9 rounding theory”. If true, it would make sense why the 5 to 4.9 transition looks like it takes a second — because it does. But it also means that 8 second violations are actually 8.1 second violations — because in this world, the clock doesn’t transition from 16 to 15 until there are 15.90 seconds left.
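To see how the two display rules diverge, here is a quick side-by-side (hypothetical implementations of both rules, fed made-up clock values):

```python
import math

def strict_floor_display(t):
    """Strict round-down: 23.9999 already shows 23."""
    return math.floor(t)

def point_nine_display(t):
    """The '0.9 rounding theory': round down while the decimal is
    below .90, otherwise round up -- so the clock holds 24 until
    23.90, then drops to 23."""
    whole = math.floor(t)
    frac = t - whole
    return whole + 1 if frac > 0.90 else whole

for t in (24.00, 23.95, 23.90, 15.95, 15.90):
    print(f"{t:.2f}  strict={strict_floor_display(t)}  point_nine={point_nine_display(t)}")
# Under the 0.9 theory the clock still reads 16 at 15.95 and first
# shows 15 at 15.90 -- making an "8 second violation" really an
# 8.1 second violation.
```

The observable difference between the theories is just where each whole-second flip lands, which is exactly what the frame counting below tries to pin down.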
Maybe the simpler theory — which I will call the “1.1 second transition theory” — is that the transition from 5 seconds to 4.9 seconds is actually taking 1.1 seconds, and it’s just visually hard to tell that when watching on TV. In this world, all of our previous rounding assumptions hold true. When displaying whole seconds, the shot clock is rounded down, but when displaying tenths of seconds, the shot clock is rounded up. There is just a special case to account for transitioning from 5 to 4.9 only.
Maybe the transition from 0.1 seconds to 0.0 seconds is actually taking 0.2 seconds. I call this “the big finish theory” because the final moment somehow would carry more weight.
Maybe shot clocks are really only 23.9 seconds and we’ve been duped since 2012. If true, maybe quarters are only 11:59.9 long, and we’ve unknowingly taken our first step towards Adam Silver’s 10 minute quarter idea. I call this “the missing time theory”.
I’ve considered how we might prove these theories:
The “0.9 rounding theory”
We would need to show that the shot clock takes 0.1 seconds to go from 24 to 23. That is really hard to prove because there is so much human error in starting the shot clock. It might take 0.1 seconds simply because the clock operator waits for the player to touch the ball before clicking the button
We would need to show that the 5 second to 4.9 second transition in fact takes 1 second
The “1.1 second transition theory”
This one is easier. Prove that the 5 to 4.9 second transition actually takes 1.1 seconds
The “big finish theory”
Also easy to test. If we can definitively see that the 0.1 to 0.0 transition is taking twice as long as the other tenth of second transitions, then this would be true. It would create a lot of other issues though and call into question the Trent Tucker Rule
The “missing time theory”
I guess if we can’t prove any of the other theories are true, this is my only one to fall back on
We are up against some real limitations in actually testing these theories, but I at least have some fun examples to consider. Unfortunately, when looking at video clips, if the video is 60 frames per second (about 0.0167 seconds per frame), we can only get so precise. Nevertheless, we can start with this mind-bending example, a clip that I posted above and will repost here.
I examined this video frame by frame, and inserted markers at the following spots:
The frame that the shot clock switched from 6 seconds to 5 seconds (Marker 1)
The frame that the shot clock switched from 5 seconds to 4.9 seconds (Marker 2)
The frame that the shot clock switched from 4.0 seconds to 3.9 seconds (Marker 3)
The frame that the shot clock switched from 3.0 seconds to 2.9 seconds (Marker 4)
The frame that the shot clock switched from 2.0 seconds to 1.9 seconds (Marker 5)
The frame that the shot clock switched from 1.0 seconds to 0.9 seconds (Marker 6)
The frame that the shot clock switched from 0.1 seconds to 0.0 seconds (Marker 7)
Going from “Marker 1” to “Marker 2” represents the all-important 5 to 4.9 second transition. By my measurements, that took exactly 60 frames, or 1 second. Interesting. Well, that’s one in the loss column for the “1.1 second transition theory”. Not surprisingly, we similarly see 60 frames between Markers 2 and 3, 3 and 4, 4 and 5, and 5 and 6. The final 0.9 seconds (Marker 6 to 7) we would expect to take 54 frames, but it actually took 56 frames. That represents an extra 0.033 seconds beyond what I would’ve thought. So the last 0.9 seconds of shot clock really lasted 0.933 seconds. That is weird, I have to admit, but we would’ve needed to see another 3-4 frames in that final second to consider “the big finish theory”. I actually went one level deeper: between Markers 6 and 7, I counted the number of frames that elapsed between each tenth of a second. I would’ve expected 6 frames per tenth of a second, and although that was the most common case, some were 8 frames and some were 4 frames. A key data point against “the big finish theory” was that the final 0.1 seconds was a normal 6 frames, so there didn’t seem to be anything special about it.
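For reference, the frame-to-seconds arithmetic above is just a division by the frame rate (assuming the clip really is a constant 60 fps):

```python
# Convert frame counts from a 60 fps clip into elapsed seconds.
# The 60 fps figure is an assumption about the video, and each
# frame is 1/60 ~= 0.0167 seconds.
FPS = 60

def frames_to_seconds(frames):
    return frames / FPS

print(frames_to_seconds(60))            # 1.0   -- the 5 -> 4.9 transition
print(round(frames_to_seconds(56), 3))  # 0.933 -- the measured final 0.9 s
print(frameses := frames_to_seconds(54))  # 0.9 -- what 0.9 s "should" take
```

Note that “the big finish theory” would have predicted roughly 60 frames for that final stretch (54 plus an extra 6 for a doubled last tenth), so 56 falls well short.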
Now, if you are beginning to mistrust clocks, this clip is only about to make things worse. This is a special example where, throughout the entirety of the clip, you can see both the clock on the scorebug and the clock on the actual basket. Would you believe it that they aren’t even aligned? Below is a quick snapshot where you can see 1.3 seconds on the scorebug, and 1.1 on the hoop. You can go frame by frame and see a consistent 0.2 second misalignment.
Although this is deeply disturbing, it didn’t matter in this case for Darren Collison. Just imagine how Wisconsin fans felt when Ryan Evans made a game-tying buzzer beater that was released before the TV clock expired, but upon further review it was released after the scoreboard clock expired.
So I guess we can’t trust TV scoreboard clocks? Or maybe we can trust them to count time but just not be in sync with the stadium? Anyways, coming back to the Darren Collison shot, because we have the luxury of seeing the stadium clock, I went ahead and looked at the same key markers, but this time with the stadium clock:
The frame that the shot clock switched from 6 seconds to 5 seconds (Marker 1)
The frame that the shot clock switched from 5 seconds to 4.9 seconds (Marker 2)
The frame that the shot clock switched from 4.0 seconds to 3.9 seconds (Marker 3)
The frame that the shot clock switched from 3.0 seconds to 2.9 seconds (Marker 4)
The frame that the shot clock switched from 2.0 seconds to 1.9 seconds (Marker 5)
The frame that the shot clock switched from 1.0 seconds to 0.9 seconds (Marker 6)
The frame that the shot clock switched from 0.1 seconds to 0.0 seconds (Marker 7)
There were two differences when looking at this data compared to the TV clock data. The key transition between Marker 1 and 2, which was 60 frames in the earlier analysis, was only 58 frames here. We are going in the wrong direction now! I wanted to see 66 frames to prove the “1.1 second transition theory”. Somehow in this case that transition took just under a second. The other difference with the TV clock data came between Markers 6 and 7. There were in fact 54 frames between Marker 6 and 7 as expected, compared to 56 from the previous analysis. So that final 0.9 seconds of time took the number of frames I would’ve expected. All other key markers had 60 frames in between, also as expected.
I know it’s only one clip, but so far the “1.1 second transition theory” is not looking good. Neither is “the big finish theory”. I’m not ready yet to believe “the missing time theory”. The most plausible explanation seems to be the “0.9 rounding theory”, but it’s quite difficult to prove. The only way to actually prove it would be to show the shot clock takes 0.1 seconds to go from 24 to 23. However, in any given video, it’s virtually impossible to test this.
Kyrie Irving famously hit the game-winning 3 pointer in Game 7 against the Warriors in the 2016 NBA Finals. After he made the shot, the clock stopped at 53.0 seconds before Draymond Green inbounded to Steph Curry.
The shot clock then dropped to 23 at the same moment the game clock dropped to 52.9.
Does that prove that the 24 to 23 second transition takes 0.1 seconds? It’s certainly possible, but unfortunately this isn’t enough evidence for me. There are too many other possibilities, like:
Some smaller fraction of game clock actually ran off, but it just so happened to be enough to flip the transition between 53.0 and 52.9
Human error in starting the shot clock too late
This is sadly where the story ends for me, and, unless you get an actual NBA clock expert to write a blog, we may never know the truth. I guess you have to decide for yourself if you subscribe to “0.9 rounding theory” or “the missing time theory”. Heck, maybe Kyrie Irving was put on this earth to prove there may be another theory lurking in the shadows.
I would welcome any comments if I made any mistakes here or if you know the real truth. Maybe it’s better if we never really know though.