Behind the NFL's yellow first down line, and what's next for sports TV
Everything started with a simple yellow line. On September 27, 1998, Sportvision debuted its yellow first down marker on the ESPN broadcast of the Week 4 game between the Ravens and Bengals. For the first time, fans watching at home could see at a glance whether the ball had crossed the line to gain.
Sixteen years later, Sportvision can weave almost anything into a football broadcast, from down and distance arrows to virtual video screens; it can even reveal yard lines completely obscured by snow during winter games.
The chroma key and camera modeling technologies on which the yellow first down line was built, though, still lie at the heart of almost everything the company has brought to football since 1998. “That whole concept just blew everybody away,” says Mike Jakob, president of Sportvision. “It still remains one of the foundational [improvements] that we think enhance the viewing experience.”
Chroma key works by identifying which pixels in a video frame should be replaced with graphics. It is essentially the same sort of green-screen technology used by news channels to insert a weather map behind a weather forecaster, but with the added complications of working outdoors. For each game, Jakob explains, “we have a palette of colors, and we look at that and we select all the different shades of grass, and whether it’s grass or turf.” The computer also has to be able to deal with the field being in sunlight or shade, and to handle the curve of the ground -- football fields are actually higher in the center than on the sidelines.
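The per-pixel logic Jakob describes can be illustrated with a minimal sketch: classify each pixel as "field" if it sits close to any color in a sampled palette of grass and turf shades, then draw the line only over field pixels so that players and officials stay on top of the graphic. This is an assumption-laden toy version -- function names, the distance test, and the tolerance value are all illustrative, not Sportvision's actual pipeline.

```python
import numpy as np

def chroma_key_field(frame, field_palette, tolerance=30.0):
    """Mark pixels that are close to any sampled field color.

    frame: H x W x 3 uint8 video frame
    field_palette: N x 3 array of sampled grass/turf shades
    Returns a boolean H x W mask: True where graphics may be drawn.
    """
    pixels = frame.reshape(-1, 1, 3).astype(float)           # (H*W, 1, 3)
    palette = field_palette.reshape(1, -1, 3).astype(float)  # (1, N, 3)
    # Distance from every pixel to every palette color
    dist = np.linalg.norm(pixels - palette, axis=2)          # (H*W, N)
    # A pixel counts as field if it is near ANY sampled shade
    return (dist.min(axis=1) < tolerance).reshape(frame.shape[:2])

def composite_line(frame, mask, line_region, color=(255, 255, 0)):
    """Paint the line color only where the field mask allows it,
    so a player standing on the line occludes the graphic."""
    out = frame.copy()
    out[mask & line_region] = color
    return out
```

A real system would sample multiple shades per game (sunlit grass, shaded grass, painted logos) and update the palette as conditions change, which is why the operator builds the palette from that day's field.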
After the success of the first down line, Sportvision first added a down and distance arrow, taking data that was already available on the scoreboard or at the top of the TV screen and relocating it to the center of the action. Over time, more and more basic data from the stadium -- such as how much time is left on the play clock or how much time is left in the game -- were incorporated into the video feed. The aim was “for the fan to instantly have the context of what’s happening in the game at the moment,” Jakob says.
Advances in technology made a lot of this much easier, and drastically reduced the amount of equipment and personnel that had to be taken to each game. Processing power improved, graphics cards got better, and the computer algorithms that made the adjustments in the picture were refined. At first Sportvision would travel with a truck full of gear and several computer operators, but now everything can be done by just one person and a small stack of computers. Not everything got easier, though. “We had to revamp the entire system to handle HD because there were so many more pixels,” Jakob says. “Our process goes pixel by pixel, so we had a lot more pixels to figure out in each frame of video, and that required more computational power.”
The other key technology behind the first down line was camera modeling. “You have to be very good at camera modeling,” Jakob explains. “You have to be able to look through the lens and know what you’re looking at and where everything in the real world is in your virtual world.” As the camera pans, tilts and zooms, the graphics have to be kept in perspective so that they appear to be part of the picture. To do this, Sportvision fits each of the broadcaster’s cameras with instruments that measure pan, tilt and zoom data. The system can then adjust the size, shape and location of the graphic to have it fit seamlessly into each frame.
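The pan/tilt/zoom data feeds a camera model that maps field coordinates into pixel coordinates, so a graphic anchored to a spot on the field lands in the right place in every frame. A minimal pinhole-camera sketch of that mapping follows; the rotation convention, the use of focal length as a stand-in for the zoom reading, and all names are assumptions for illustration, not Sportvision's calibration model.

```python
import numpy as np

def rotation(pan, tilt):
    """World-to-camera rotation for a given pan (yaw) and tilt (pitch),
    angles in radians."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    R_pan = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    R_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    return R_tilt @ R_pan

def project(world_point, camera_pos, pan, tilt, focal_px, center=(960, 540)):
    """Project a 3-D field coordinate into pixel coordinates.

    focal_px stands in for the zoom reading: a longer focal length
    means a tighter zoom, so the same point lands further from center.
    Returns (u, v) in pixels, or None if the point is behind the camera.
    """
    p = np.asarray(world_point, float) - np.asarray(camera_pos, float)
    p_cam = rotation(pan, tilt) @ p
    if p_cam[2] <= 0:           # behind the image plane
        return None
    u = center[0] + focal_px * p_cam[0] / p_cam[2]
    v = center[1] + focal_px * p_cam[1] / p_cam[2]
    return (u, v)
```

With a model like this, re-projecting the four corners of a fixed field rectangle every frame is what keeps a graphic "pinned" in place as the camera pans, tilts, and zooms.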
Having solved the problem of augmenting reality on the field of play, Sportvision moved on to other spots in the stadium. Using the same camera modeling technology allowed virtual billboards or video screens to be placed around the field. These are used to display stats or replays, and they stay fixed in place as the camera moves around, giving the appearance that they are actual physical structures, not just computer graphics.
The ball itself has also had the augmented-reality treatment. “We did that a number of years ago with the pass,” Jakob says. “We introduced something called Pass Track where we can show the trail on a ball… Showing how a quarterback dropped a ball right over a receiver’s shoulder, and the arc of the pass, whether it was thrown on a hard trajectory, or whether it was lofted and dropped in.”
Pass Track works by asking an operator to go through a replay and click on a couple of the frames, so that the computer can identify the ball. Using algorithms that model the physics, the full trajectory of the ball can then be calculated and displayed on a replay. And the same can be done for kicks, making it possible to determine exactly where and when a ball crosses between the uprights of the goal.
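The underlying math can be sketched with the simplest possible model: drag-free projectile motion, where two clicked samples of the ball's position at known frame times are enough to pin down the full arc. This is a deliberately simplified illustration under stated assumptions (no air resistance, 2-D motion, made-up function names), not the physics model Sportvision actually uses.

```python
G = 9.81  # gravitational acceleration, m/s^2

def fit_trajectory(t1, p1, t2, p2, g=G):
    """Recover launch parameters from two (time, position) samples.

    p1, p2 are (x, y) pairs: downfield distance and height in meters.
    Assumes drag-free projectile motion, so two samples fully
    determine the arc. Returns (x0, vx, y0, vy).
    """
    x1, y1 = p1
    x2, y2 = p2
    # Horizontal motion is linear in time
    vx = (x2 - x1) / (t2 - t1)
    x0 = x1 - vx * t1
    # Add gravity's contribution back; the residual is linear in time
    y1c = y1 + 0.5 * g * t1**2
    y2c = y2 + 0.5 * g * t2**2
    vy = (y2c - y1c) / (t2 - t1)
    y0 = y1c - vy * t1
    return x0, vx, y0, vy

def position(params, t, g=G):
    """Evaluate the fitted trajectory at any time t."""
    x0, vx, y0, vy = params
    return (x0 + vx * t, y0 + vy * t - 0.5 * g * t**2)
```

Once the parameters are fitted, the same curve can be evaluated at any instant, which is how a trajectory clicked on two replay frames can also answer where and when a kicked ball passes between the uprights.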
But neither Pass Track nor Kick Track works with live data, and that is the next challenge. “In a sport like football there are so many collisions, players obscuring each other, that [the players are] hard to distinguish. Their jerseys are the same, their helmets are the same on the same team.” Sportvision, the NFL and other companies are experimenting with active-tracking technology, but seeing that rendered live into a football broadcast could still be a few seasons away.
However, Sportvision demonstrated what this could look like at last Sunday’s NHL All-Star game. An infrared system was used to track chips placed on players’ jerseys and on the puck, allowing the company to record such data as speed and location, and then integrate that directly into the video. “We were live tracking the players, putting tags on them and trails,” Jakob says. “That video was being put on a Jumbotron for fans at the arena. They were sitting there watching the live tracking and the graphics from their seats. You could glance down at the ice and look up and have players identified.”
So, 16 years after the yellow first down line first made it to our TV screens, everything it made possible might finally make it to our stadiums.