Scrapping the Local Loop, by the Numbers

11 June 2024 at 14:00

A few years back I wrote an “Ask Hackaday” article inviting speculation on the future of the physical plant of landline telephone companies. It started innocently enough; an open telco cabinet spotted during my morning walk gave me a glimpse into the complexity of the network buried beneath my feet and strung along poles around town. That in turn begged the question of what to do with all that wire, now that wireless communications have made landline phones so déclassé.

At the time, I had a sneaking suspicion that I knew what the answer would be, but I spent a good bit of virtual ink trying to convince myself that there was still some constructive purpose for the network. After all, hundreds of thousands of technicians and engineers spent lifetimes building, maintaining, and improving these networks; surely there must be a way to repurpose all that infrastructure in a way that pays at least a bit of homage to them. The idea of just ripping out all that wire and scrapping it seemed unpalatable.

With the decreasing need for copper voice and data networks and the increasing demand for infrastructure to power everything from AI data centers to decarbonized transportation, the economic forces arrayed against these carefully constructed networks seem irresistible. But what do the numbers actually look like? Are these artificial copper mines as rich as they appear? Or is the idea of pulling all that copper out of the ground and off the poles and retasking it just a pipe dream?

Phones To Cars

There are a lot of contenders for the title of “Largest Machine Ever Built,” but it’s a pretty safe bet that the public switched telephone network (PSTN) is in the top five. From its earliest days, the PSTN was centered around copper, with each and every subscriber getting at least one pair of copper wires connected from their home or business. These pairs, referred to collectively and somewhat loosely as the “local loop,” were gathered together into increasingly larger bundles on their way to a central office (CO) housing the switchgear needed to connect one copper pair to another. For local calls, it could all be done within the CO or by connecting to a nearby CO over copper lines dedicated to the task; long-distance calls were accomplished by multiplexing calls together, sometimes over microwave links but often over thick coaxial cables.

Fiber optic cables and wireless technologies have played a large part in making all the copper in the local loops and beyond redundant, but the fact remains that something like 800,000 metric tons of copper is currently locked up in the PSTN. And judging by the anti-theft efforts that Home Depot and other retailers are making, not to mention the increase in copper thefts from construction sites and other soft targets, that material is incredibly valuable. Current estimates are that PSTNs are sitting on something like $7 billion worth of copper.

That sure sounds like a lot, but what does it really mean? Assuming that the goal of harvesting all that largely redundant PSTN copper is to support decarbonization, $7 billion worth of copper isn’t really that much. Take EVs for example. The typical EV on the road today has about 132 pounds (60 kg) of copper, or about 2.5 times the amount in the typical ICE vehicle. Most of that copper is locked up in motor windings, but there’s a lot in the bus bars and wires needed to connect the batteries to the motors, plus all the wires needed to connect all the data systems, sensors, and accessories. If you pulled all the copper out of the PSTN and used it to do nothing but build new EVs, you’d be able to build about 13.3 million cars. That’s a lot, but considering that 80 million cars were put on the road globally in 2021, it wouldn’t have that much of an impact.

Farming the Wind

What about on the generation side? Thirteen million new EVs are going to need a lot of extra generation and transmission capacity, and with the goal of decarbonization, that probably means a lot of wind power. Wind turbines take a lot of copper; currently, bringing a megawatt of on-shore wind capacity online takes about 3 metric tons of copper. A lot of that goes into the windings in the generator, but that also takes into account the wire needed to get the power from the nacelle down to the ground, plus the wires needed to connect the turbines together and the transformers and switchgear needed to boost the voltage for transmission. So, if all of the 800,000 metric tons of copper currently locked up in the PSTN were recycled into wind turbines, they’d bring a total of 267,000 megawatts of capacity online.

To put that into perspective, the total power capacity in the United States is about 1.6 million megawatts, so converting the PSTN to wind turbines would increase US grid capacity by about 16% — assuming no losses, of course. Not too shabby; that’s over ten times the capacity of the world’s largest wind farm, the Gansu Wind Farm in the Gobi Desert in China.

There’s one more way to look at the problem, one that I think puts a fine point on things. It’s estimated that to reach global decarbonization goals, in the next 25 years we’ll need to mine at least twice the amount of copper that has ever been mined in human history. That’s quite a lot; we’ve taken 700 million metric tons of copper in the last 11,000 years. Doubling that means we’ve got to come up with 1.4 billion metric tons in the next quarter century. The 800,000 metric tons of obsolete PSTN copper is therefore only about 0.06% of what’s needed — not even a drop in the bucket.
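
For anyone who wants to poke at these figures, here’s the whole back-of-the-envelope calculation gathered into a few lines of Python. Every input is just the rough estimate quoted above, so treat the outputs as sanity checks rather than hard numbers.

```python
# Back-of-the-envelope check of the figures quoted in this article.
# All inputs are rough published estimates, not authoritative numbers.

PSTN_COPPER_T = 800_000       # metric tons of copper locked up in the PSTN
EV_COPPER_KG = 60             # ~132 lb of copper in a typical EV
WIND_T_PER_MW = 3             # metric tons of copper per MW of onshore wind
US_CAPACITY_MW = 1_600_000    # rough total US generating capacity
MINED_TO_DATE_T = 700e6       # copper mined over the last ~11,000 years

evs = PSTN_COPPER_T * 1_000 / EV_COPPER_KG            # tons -> kg, then per car
wind_mw = PSTN_COPPER_T / WIND_T_PER_MW
grid_share = wind_mw / US_CAPACITY_MW
mining_share = PSTN_COPPER_T / (2 * MINED_TO_DATE_T)  # vs. 1.4 billion tons needed

print(f"EVs built from PSTN copper: {evs / 1e6:.1f} million")  # ~13.3 million
print(f"Onshore wind capacity:      {wind_mw / 1e3:.0f} GW")   # ~267 GW
print(f"Share of US grid capacity:  {grid_share:.1%}")         # ~16.7%
print(f"Share of 25-year demand:    {mining_share:.2%}")       # ~0.06%
```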

Accepting the Inevitable

These are just a few examples of what could be done with the “Buried Fortune” of PSTN copper, as Bloomberg somewhat breathlessly refers to it in the article linked above. It goes without saying that this is just back-of-the-envelope math, and that a real analysis of what it would take to recycle the old PSTN copper and what the results would be would require a lot more engineering and financial chops than I have. Even if it is just a drop in the bucket, I think we’ll probably end up doing it, if for no other reason than it takes something like two decades to bring a new copper mine into production. Until those mines come online and drive the price of copper down, all that refined and (relatively) easily recycled copper just sitting there is a tempting target for investors. So it’ll probably happen, which is sad in a way, but maybe it’s a more fitting end to the PSTN than just letting it sit there and corrode.

How Facebook Killed Online Chat

By: Lewin Day
29 May 2024 at 14:00

In the early days of the internet, online conversations were an event. The technology was novel, and it was suddenly possible to socialize with a whole bunch of friends at a distance, all at once. No more calling your friends one by one, you could talk to them all at the same time!

Many of us would spend hours on IRC, or pull all-nighters bantering on MSN Messenger or AIM. But then, something happened, and many of us found ourselves having shorter conversations online, if we were having any at all. Thinking back to my younger days, and comparing them with today, I think I’ve figured out what it is that’s changed.

Deliberate Choices

Having the right nick, profile image, and personal message was a big part of looking cool on MSN Messenger. You needed something that would make you seem interesting, hip, and worth talking to. Song lyrics were common. Credit: Screenshot, MSN Messenger history

Twenty-five years ago, a lot more of us were stuck getting by with dialup. The Internet wasn’t always on back then. You had to make the decision to connect to it, and sit at your computer to use it.

Similarly, logging into an IRC room was a deliberate action. It was a sign that you were setting aside time to communicate. If you were in a chat room, you were by and large there to talk. On AIM or MSN Messenger, it was much the same deal. If you wanted to have a chat, you’d leave your status on available. If you didn’t wanna talk, you’d set yourself to Busy or Away, or log off entirely.

This intentionality fostered meaningful interactions online. Back then, you’d sign in and you’d flick through your list of friends. If someone’s icon was glowing green, you knew they were probably up to talk. You might have a quick chat, or you could talk for hours. Indeed, logging on to a chatroom for an extended session was a pastime enjoyed by many.

If you were on Linux, or used multiple chat services, you might have experimented with multi-chat clients like Pidgin back in the day. Credit: Uberushaximus, GPL

Back then, people were making the conscious decision to set aside time to talk. Conversations were more focused and meaningful because both parties had set aside time to engage. This intentionality led to richer, more engaging discussions because participants were fully present.

Furthermore, the need to log in and out helped create a healthy boundary between life online and off. Users balanced their online interactions with other responsibilities and activities. There was a clear distinction between online and offline life, allowing for more complete engagement in both. When you logged off, that was it. There was no way for your online friends to get a message to you in real time, so your focus was fully on what was going on in front of you.

Critical Shift

’Twas the endless march of technology that changed the meta. Broadband internet would keep our computers online round the clock. You could still log in and out of your chat apps, of course, and when you walked away from your computer, you were offline.

But technology didn’t stop there. Facebook came along, and tacked on Messenger in turn. The app would live on the smartphones in our pockets, while mobile data connections meant a message from the Internet could come through at any time.

If your buddies were green, you could hit ’em up for a chat! Facebook kind of has us all defaulting to available at all times, though, and it throws everything off. Credit: Pidgin.IM

Facebook’s always-on messaging was right there, tied to a website many of us were already using on the regular. Suddenly, booting up another app like AIM or MSN seemed archaic when we could just chat in the browser. The addition of the app to smartphones put Messenger everywhere we went. For many, it even started to supplant SMS, in addition to making other online chat platforms obsolete.

Always-on messaging seemed convenient, but it came with a curse. It’s fundamentally changed the dynamics of our online interactions, and not always for the better.

Perpetual availability means that there is a constant pressure to respond. In the beginning, Facebook implemented “busy” and “available” status messages, but they’re not really a thing anymore. Now, when you go to message a friend, you’re kind of left in the dark as to what they’re doing and how they’re feeling. Maybe they’re chilling at home, and they’re down for a deep-and-meaningful conversation. Or maybe they’re working late at the office, and they don’t really want to be bothered right now. Back in the day, you could easily infer their willingness to chat simply by noting whether they were logged in or not. Today, you can’t really know without asking.

That has created a kind of silent pressure against having longer conversations on Facebook Messenger. I’m often reluctant to start a big conversation with someone on the platform, because I don’t know if they’re ready for it right now. Even when someone contacts me, I find myself trying to close out conversations quickly, even positive ones. I’m inherently assuming that they probably just intended to send me a quick message, and that they’ve got other things to do. The platform provides no explicit social signal that they’re happy to have a proper conversation. Instead, it’s almost implied that they might be messaging me while doing something else more important, because hey, Messenger’s on all the time. Nobody sits down to chat on Facebook Messenger these days.

Do any of these people want to chat? I can’t tell, because they’re always online!

It’s also ruining the peace. If you’ve got Messenger installed, notifications pop up incessantly, disrupting focus and productivity. Conversations that might have once been deep and meaningful are now often fragmented and shallow because half the time, someone’s starting them when you’re in the middle of something else. If you weren’t “logged on” or “available”, they’d wait until you were ready for a proper chat. But they can’t know that on Facebook Messenger, so they just have to send a message and hope.

In a more romantic sense, Facebook Messenger has also killed some of the magic. The ease of starting a conversation at any moment diminishes the anticipation that once accompanied online interactions. Plenty of older Internet users (myself included) will remember the excitement when a new friend or crush popped up online. You could freely leap into a conversation because just by logging on, they were saying “hey, wanna talk?” It was the equivalent social signal of seeing them walk into your local pub and waving hello. They’re here, and they want to socialize!

It’s true that we effectively had always-on messaging before Facebook brought it to a wider audience. You could text message your friends, and they’d get it right away. But this was fine, and in fact, it acted as a complement to online messaging. SMSs used to at least cost a little money, and it was generally time consuming to type them out on a limited phone keypad. They were fine if you needed to send a short message, and that was about it. Meanwhile, online messaging was better for longer, intentional conversations. You could still buzz people at an instant when you needed to, but SMS didn’t get in the way of proper online chats like Facebook Messenger would.

The problem is, it seems like we can’t really go back. As with so many technologies, we can try and blame the creators, but it’s not entirely fair. Messenger changed how we used online chat, but Facebook didn’t force us to do anything. Many of us naturally flocked to the platform, abandoning others like AIM and MSN in short order. We found it more convenient in the short term, even if some of us have found it less satisfying in the long term.

Online platforms tend to figure out what we respond to on a base psychological level, and game that for every last drop of interaction and attention they can. They do this to sell ads and make money, and that’s all that really matters at the end of the day. Facebook’s one of the best at it. It’s not just online chat, either. Forums went the same way, and it won’t end there.

Ultimately, for a lot of us, our days of spending hours having great conversations online are behind us. It’s hard to see what could ever get the broader population to engage again in that way. Instead, it seems that our society has moved on, for the worse or for the better. For me, that’s a shame!

The Great Green Wall: Africa’s Ambitious Attempt To Fight Desertification

By: Lewin Day
9 May 2024 at 14:00

As our climate changes, we fear that warmer temperatures and drier conditions could make life hard for us. In most locations, it’s a future concern that feels uncomfortably near, but for some locations, it’s already very real. Take the Sahara desert, for example, and the degraded landscapes to the south in the Sahel. These arid regions are so dry that they struggle to support life at all, and temperatures there are rising faster than almost anywhere else on the planet.

In the face of this escalating threat, one of the most visionary initiatives underway is the Great Green Wall of Africa. It’s a mega-sized project that aims to restore life to barren terrain.

A Living Wall

Concentrated efforts have helped bring dry lands back to life. Credit: WFP

Launched in 2007 by the African Union, the Great Green Wall was originally an attempt to halt the desert in its tracks. The Sahara Desert has long been expanding, and the Sahel region has been losing the battle against desertification. The Green Wall hopes to put a stop to this, while also improving food security in the area.

The concept of the wall is simple. The idea is to take degraded land and restore it to life, creating a green band across the breadth of Africa which would resist the spread of desertification to the south. Intended to span the continent from Senegal in the west to Djibouti in the east, it was originally intended to be 15 kilometers wide and a full 7,775 kilometers long. The hope was to complete the wall by 2030.

The Great Green Wall concept moved past initial ideas around simply planting a literal wall of trees. It eventually morphed into a broader project to create a “mosaic” of green and productive landscapes that can support local communities in the region.

Reforestation is at the heart of the Great Green Wall. Millions of trees have been planted, with species chosen carefully to maximise success. Trees like Acacia, Baobab, and Moringa are commonly planted not only for their resilience in arid environments but also for their economic benefits. Acacia trees, for instance, produce gum arabic—a valuable ingredient in the food and pharmaceutical industries—while Moringa trees are celebrated for their nutritious leaves.

 

Choosing plants with economic value has a very important side effect that sustains the project. If random trees of little value were planted solely as an environmental measure, they probably wouldn’t last long. They could be harvested by the local community for firewood in short order, completely negating all the hard work done to plant them. Instead, by choosing species that have ongoing productive value, it gives the local community a reason to maintain and support the plants.

Special earthworks are also aiding in the fight to repair barren lands. In places like Mauritania, communities have been digging  half-moon divots into the ground. Water can easily run off or flow away on hard, compacted dirt. However, the half-moon structures trap water in the divots, and the raised border forms a protective barrier. These divots can then be used to plant various species where they will be sustained by the captured water. Do this enough times over a barren landscape, and with a little rain, formerly dead land can be brought back to life. It’s a traditional technique that is both cheap and effective at turning brown lands green again.

Progress

The project has been an opportunity to plant economically valuable plants which have proven useful to local communities. Credit: WFP

The initiative plans to restore 100 million hectares of currently degraded land, while also sequestering 250 million tons of carbon to help fight against climate change. Progress has been sizable, but at the same time, limited. As of mid-2023, the project had restored approximately 18 million hectares of formerly degraded land. That’s a lot of land by any measure. And yet, it’s less than a fifth of the total that the project hoped to achieve. The project has been frustrated by funding issues, delays, and the degraded security situation in some of the areas involved. Put together, this all bodes poorly for the project’s chances of reaching its goal, given that 17 years have already passed and 2030 draws ever closer.

While the project may not have met its loftiest goals, that’s not to say it has all been in vain. The Great Green Wall need not be seen as an all or nothing proposition. Those 18 million hectares that have been reclaimed are not nothing, and one imagines the communities in these areas are enjoying the boons of their newly improved land.

In the driest parts of the world, good land can be hard to come by. While the Great Green Wall may not span the African continent yet, it’s still having an effect. It’s showing communities that with the right techniques, it’s possible to bring some barren zones back from the brink, turning them back into useful, productive land. That, at least, is a good legacy, and if the project’s full goals can be realized? All the better.

NASA Is Now Tasked With Developing A Lunar Time Standard, Relativity Or Not

By: Lewin Day
2 May 2024 at 14:00

A little while ago, we talked about the concept of timezones and the Moon. It’s a complicated issue, because on Earth, time is all about the Sun and our local relationship with it. The Moon and the Sun have their own weird thing going on, so time there doesn’t really line up well with our terrestrial conception of it.

Nevertheless, as humanity gets serious about doing Moon things again, the issue needs to be solved. To that end, NASA has now officially been tasked with setting up Moon time – just a few short weeks after we last talked about it! (Does the President read Hackaday?) Only problem is, physics is going to make it a damn sight more complicated!

Relatively Speaking

You know it’s serious when the White House sends you a memo. “Tell NASA to invent lunar time, and get off their fannies!”

The problem is all down to general and special relativity. The Moon is in motion relative to Earth, and it also has a lower gravitational pull. We won’t get into the physics here, but it basically means that time literally moves at a different pace up there. Time on the Moon passes on average 58.7 microseconds faster over a 24-hour Earth day. It’s not constant, either—there is a certain degree of periodic variation involved.

It’s a tiny difference, but it’s cumulative over time. Plus, as it is, many space and navigational applications need the utmost in precise timing to function, so it’s not something NASA can ignore. Even if the agency wanted to just use UTC and call it good, the relativity problem would prevent that from being a workable solution.
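
To get a feel for the scale involved, here’s a rough sketch of how that quoted 58.7 microseconds per Earth day adds up, and what the offset would mean for anything that converts clock error into distance at the speed of light, the way satellite navigation does. The real drift varies periodically rather than being constant, and real navigation systems model and correct for clock bias, so this is only an illustration of why the correction matters.

```python
# Rough illustration of the average lunar clock drift quoted above.
# The real offset varies periodically; this treats it as a constant average.

C = 299_792_458           # speed of light, m/s
DRIFT_PER_DAY = 58.7e-6   # seconds a lunar clock gains per 24-hour Earth day

for days in (1, 30, 365):
    offset = DRIFT_PER_DAY * days
    # A navigation receiver converts clock error into range error at the
    # speed of light, so microseconds of drift become kilometers of error.
    range_error_km = offset * C / 1_000
    print(f"{days:4d} days: offset {offset * 1e3:7.3f} ms "
          f"-> ~{range_error_km:,.0f} km of range error if uncorrected")
```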

Without a reliable and stable timebase, space agencies like NASA would struggle to establish useful infrastructure on the Moon. Things like lunar satellite navigation wouldn’t work accurately without taking into account the time slip, for example. GPS is highly sensitive to relativistic time effects, and indeed relies upon them to function. Replicating it on the Moon is only possible if these factors are accounted for. Looking even further ahead, things like lunar commerce or secure communication would be difficult to manage reliably without stable timebases for equipment involved.

Banks of atomic clocks—like these at the US Naval Observatory—are used to establish high-quality time standards. Similar equipment may need to be placed on the Moon to establish Coordinated Lunar Time (LTC). Credit: public domain

Still, the order to find a solution has come down from the top. A memo from the Executive Office of the President charged NASA with delivering a standard solution for lunar timing by December 31, 2026. Coordinated Lunar Time (LTC) must be established in a way that is traceable to Coordinated Universal Time (UTC). That will enable operators on Earth to synchronize operations with crews or unmanned systems on the Moon itself. LTC is required to be accurate enough for scientific and navigational purposes, and it must be resilient to any loss of contact with systems back on Earth.

It’s also desired that the future LTC standard will be extensible and scalable to space environments we may explore in future beyond the Earth-Moon system itself. In time, NASA may find it necessary to establish time standards for other celestial bodies, due to their own unique differences in relative velocity and gravitational field.

The deadline means there’s time for NASA to come up with a plan to tackle the problem. However, for a federal agency, less than two years is not exactly a lengthy time frame. It’s likely that whatever NASA comes up with will involve some kind of timekeeping equipment deployed on the Moon itself. This equipment would thus be subject to the time shift relative to Earth, making it easier to track differences in time between the lunar and terrestrial time-realities.

The US Naval Observatory doesn’t just keep careful track of time, it displays it on a big LED display for people in the area. NASA probably doesn’t need to establish a big time billboard on the Moon, but it’d be cool if they did. Credit: Votpuske, CC BY 4.0

Great minds are already working on the problem, like Kevin Coggins, NASA’s space communications and navigation chief. “Think of the atomic clocks at the U.S. Naval Observatory—they’re the heartbeat of the nation, synchronizing everything,” he said in an interview. “You’re going to want a heartbeat on the moon.”

For now, establishing LTC remains a project for the American space agency. It will work on the project in partnership with the Departments of Commerce, Defense, State and Transportation. One fears for the public servants required to coordinate meetings amongst all those departments.

Establishing new time standards isn’t cheap. It requires smart minds, plenty of research and development, and some serious equipment. Space-rated atomic clocks don’t come cheap, either. Regardless, the U.S. government hopes that NASA will lead the way for all spacefaring nations in this regard, setting a lunar time standard that can serve future operations well.

 

VAR Is Ruining Football, and Tech Is Ruining Sport

By: Lewin Day
29 April 2024 at 14:00
The symbol of all that is wrong with football.

Another week in football, another VAR controversy to fill the column inches and rile up the fans. If you missed it, Coventry scored a last-minute winner in extra time in a crucial match—an FA Cup semi-final. Only, oh wait—computer says no. VAR ruled Haji Wright was offside, and the goal was disallowed. Coventry fans screamed that the system got it wrong, but no matter. Man United went on to win and dreams were forever dashed.

Systems like the Video Assistant Referee were brought in to make sport fairer, with the aim that they would improve the product and leave fans and competitors better off. And yet, years later, with all this technology, we find ourselves up in arms more than ever.

It’s my sincere belief that technology is killing sport, and the old ways were better. Here’s why.

The Old Days

Moments like these came down to the people on the pitch. Credit: Sdo216, CC BY-SA 3.0

For hundreds of years, we adjudicated sports the same way. The relevant authority nominated some number of umpires or referees to control the game. The head referee was the judge, jury, and executioner as far as rules were concerned. Players played to the whistle, and a referee’s decision was final. Whatever happened, happened, and the game went on.

It was not a perfect system. Humans make mistakes. Referees would make bad calls. But at the end of the day, when the whistle blew, the referee’s decision carried the day. There was no protesting it—you had to suck it up and move on.

This worked fine until the advent of a modern evil—the instant replay. Suddenly, stadiums were full of TV cameras that captured the play from all angles. Now and then, it would become obvious that a referee had made a mistake, with television stations broadcasting incontrovertible evidence to thousands of viewers across the land. A ball at Wimbledon was in, not out. A striker was on side prior to scoring. Fans started to groan and grumble. This wasn’t good enough!

And yet, the system held strong. As much as it pained the fans to see a referee screw over their favored team, there was nothing to be done. The referee’s call was still final. Nobody could protest or overrule the call. The decision was made, the whistle was blown. The game rolled on.

Then somebody had a bright idea. Why don’t we use these cameras and all this video footage, and use it to double check the referee’s work? Then, there’ll never be a problem—any questionable decision can be reviewed outside of the heat of the moment. There’ll never be a bad call again!

Oh, what a beautiful solution it seemed. And it ruined everything.

The Villain, VAR

The assistant video assistant referees are charged with monitoring various aspects of the game and reporting to the Video Assistant Referee (VAR). The VAR then reports to the referee on the ground, who may overturn a decision, hold firm, or look at the footage themself on a pitchside display. Credit: Niko4it, CC BY-SA 4.0

Enter the Video Assistant Referee (VAR). The system was supposed to bring fairness and accuracy to a game fraught with human error. The Video Assistant Referee was an official that would help guide the primary referee’s judgement based on available video evidence. They would be fed information from a cadre of Assistant Video Assistant Referees (AVARs) who sat in the stadium behind screens, reviewing the game from all angles. No, I didn’t make that second acronym up.

It was considered a technological marvel. So many cameras, so many views, so much slow-mo to pore over. The assembled VAR team would look into everything from fouls to offside calls. The information would be fed to the main referee on the pitch, and they could refer to a pitchside video replay screen if they needed to see things with their own eyes.

A VAR screen mounted on the pitch for the main referee to review as needed. Credit: Carlos Figueroa, CC BY-SA 4.0

The key was that VAR was to be an assistive tool. It was to guide the primary referee, who still had the final call at the end of the day.

You’d be forgiven for thinking that giving a referee more information to do their job would be a good thing.  Instead, the system has become a curse word in the mouths of fans, and a scourge on football’s good name.

From its introduction, VAR began to pervert the game of football. Fans were soon decrying the system’s failures, as entire championships fell the wrong way due to unreliability in VAR systems. Assistant referees were told to hold their offside calls to let the video regime take over. Players were quickly chided for demanding video reviews time and again. New rules would see yellow cards issued for players desperately making “TV screen” gestures in an attempt to see a rival’s goal overturned. Their focus wasn’t on the game, but on gaming the system in charge of it.

Fans and players are so often stuck waiting for the penny to drop that celebrations lose any momentum they might have had. Credit: Rlwjones, CC BY-SA 4.0

VAR achieves one thing with brutal technological efficiency: it sucks the life out of the game. The spontaneity of celebrating a goal is gone. Forget running to the stands, embracing team mates, and punching the air in sweet elation. Instead, so many goals now lead to minute-long reviews while the referee consults with those behind the video screens and reviews the footage. Fans sit in stunted silence, stuck in the dreaded drawn-out suspense of “goal” or “no goal.”

The immediacy and raw emotion of the game has been shredded to pieces. Instead of jumping in joy, fans and players sit waiting for a verdict from an unseen, remote official. The communal experience of instant joy or despair is muted by the system’s mere presence. What was once a straightforward game now feels like a courtroom drama where every play can be contested and overanalyzed.

It’s not just football where this is a problem, either. Professional cricket is now weighed down with microphone systems to listen out for the slightest snick of bat on ball. Tennis, weighed down by radar reviews of line calls. The interruptions never cease—because it’s in every player’s interest to whip out the measuring tape whenever it would screw over their rival. The more technology, the more reviews are made, and the further we get from playing out the game we all came to see.

Making Things Right

Enough of this nonsense! Blow the whistle and move on. Credit: SounderBruce, CC BY-SA 4.0

With so much footage to review, and so many layers of referees involved, VAR can only slow football down. There’s no point trying to make it faster or trying to make it better. The correct call is to scrap it entirely.

As it stands, good games of football are being regularly interrupted by frustrating video checks. Even better games are being ruined when the VAR system fails or a bad call still slips through. Moments of jubilant celebration are all too often brought to naught when someone’s shoelace was thought to be a whisker’s hair ahead of someone’s pinky toe in a crucial moment of the game.

Yes, bad calls will happen. Yes, these will frustrate the fans. But they will frustrate them far less than the current way of doing things. It’s my experience that fans get over a bad call far faster when it’s one ref and a whistle. When it’s four referees, sixteen camera angles, and a bunch of lines on the video screen? They’ll rage for days that this mountain of evidence suggests their team was ripped off. They won’t get over it. They’ll moan about it for years.

Let the referees make the calls. Refereeing is an art form. A good referee understands the flow of the game, and knows when to let the game breathe versus when to assert control. This subtle art is being lost to the halting interruptions of the video inspection brigade.

Football was better before. They were fools to think they could improve it by measuring it to the nth degree. Scrap VAR, scrap the interruptions. Put it back on the referees on the pitch, and let the game flow.
