Is the end nigh for the cheap universal internet?
THE MOST SUCCESSFUL TECHNOLOGY adopted in the last 50 years has undoubtedly been the internet. The mid-60s co-operative project between the US Defense Advanced Research Projects Agency and the UK's National Physical Laboratory was built on new thinking about computer networks, an approach the British team had called "packet switching".
The ARPANET, as it was originally called, was designed to be massively extensible, extremely rugged and, famously, capable of surviving a nuclear attack.
The pivotal change came in the early 90s, when the internet, until then restricted to the defence and academic communities, was opened up for commercial use.
It has become a worldwide broadcasting system, a mechanism for massive information dissemination and a medium for collaboration and interaction between individuals and their computers.
The internet has already revolutionised the world's giant telecoms companies, destroying their old business model based on charging by time and distance. It is in the process of killing off the traditional business model for much of the newspaper industry by eliminating lucrative classified advertising revenues, and it is now moving in on broadcasting, where the traditional model of delivering mass audiences to advertisers is under threat.
The fact that the internet has coped with this expansion does not mean there have been no strains. Its original architects did not envisage spam, viruses and phishing attacks. The absence of any meaningful way to guarantee the identity of a message's originator has caused huge security difficulties.
The original packet-switched model, in which the route a packet takes matters far less than its eventual arrival, was fine for the early internet, where e-mail and documents were the dominant forms of traffic. Nowadays, it is different.
Currently, the largest volume of traffic on the internet is generated by downloading audio and video files, many from unregulated sites, but other data types, such as streaming TV programmes, are growing fast.
However, the current internet was not designed to carry this level of traffic with the efficiency, timeliness and reliability that successful TV viewing demands. New techniques, such as peer-to-peer distribution, help the traditional internet cope, but in reality the old model is getting increasingly creaky.
The major telecoms companies see an opportunity here. By building a parallel, high-speed network designed to carry services such as television transmissions, they could ensure quality delivery of programmes and add extra charges for using it. Anyone using a large amount of time-sensitive data, such as TV programmes, could be expected to pay.
This would mean the end of the universal internet, where everybody pays only for the connection to the service, regardless of what, or how much, they do with it.
The telecoms companies would dearly love to get back some of the billing opportunities that they lost as the internet swept all before it.
It will be fascinating to see whether they succeed.