
“Danger Will Robinson” – Internet Capacity Crunch in 8 Years’ Time

Monday, September 21st, 2015 (11:38 am)
[Image: broken UK internet connection and a running man]

Professor Andrew Ellis of Aston University’s School of Engineering and Applied Science has once again warned that the Internet “could be on the brink … of a capacity crunch”, which might force ISPs to throttle your broadband, and it could happen within the next 8 years. Best get that tinfoil hat.

The warning is largely a repeat of the same comments that were made earlier this year (here) and there is often a small semblance of truth to such remarks, although they usually make the incorrect assumption that the industry won’t adapt or evolve to resolve it.

Professor Ellis said:

Demand for internet capacity keeps soaring, and we’re now reaching the point where it’s increasingly difficult to stay ahead of that demand using current approaches. It’s incredible we’ve managed to stay ahead this long, but now researchers are finding they just cannot fit much more data down traditional fibre optic lines.

Soon, unless we increase costs by deploying more fibres, we may need radical changes to the way we either use or distribute data if we are to overcome this capacity crunch. We should start having the conversation now – are consumers willing to accept higher charges for increased bandwidth or can we be more considered about the capacity we consume? Will we lay additional cables, or will we look to the likes of Netflix to help us manage demand?

Suffice to say that comments like this have consistently cropped up ever since the days of dial-up. Somebody is always warning that the Internet can’t cope or will run out of capacity, yet strangely it never quite happens. Well, not so strangely, perhaps.

In reality the industry usually adjusts through a mix of innovation and/or spending on the construction of new international fibre optic cable links (e.g. this one). Granted, we haven’t seen as many of these links being built in recent years, but that’s largely because scientific innovation has found new ways to push more data down existing cables.
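
As a very rough illustration of the kind of headroom that innovation keeps finding, wavelength division multiplexing (WDM) packs many optical channels onto a single strand of glass. The sum below is purely a back-of-the-envelope sketch in Python, using assumed figures that are broadly typical of 2015-era coherent DWDM kit rather than data for any specific cable:

    # Back-of-the-envelope DWDM capacity for a single fibre (all figures assumed).
    channel_spacing_ghz = 50      # assumed ITU grid spacing per wavelength
    usable_c_band_ghz = 4000      # roughly 4 THz of usable C-band spectrum
    per_channel_gbps = 100        # assumed coherent 100G transponders

    channels = usable_c_band_ghz // channel_spacing_ghz
    total_tbps = channels * per_channel_gbps / 1000
    print(f"{channels} wavelengths x {per_channel_gbps} Gb/s = {total_tbps:.0f} Tb/s per fibre")
    # Prints: 80 wavelengths x 100 Gb/s = 8 Tb/s per fibre

Tighter channel spacing or higher-order modulation multiplies that figure again, which is why upgrading the electronics at each end of an existing cable has so far kept pace with demand.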

Going forwards there is still a fair bit of additional innovation to come that should continue to improve existing cables. If the time comes when we need more physical connections, which it almost certainly will, then no doubt somebody will build them, because the demand will exist to pay for it.

In the meantime Professor Ellis will be warning the world about this impending crunch as part of Lightfest 2015, which is being held on Friday 25th September 2015 at the Library of Birmingham. But you really need not worry.

By Mark Jackson
Mark is a professional technology writer, IT consultant and computer engineer from Dorset (England). He founded ISPreview in 1999 and enjoys analysing the latest telecoms and broadband developments. Find me on Twitter, Facebook and LinkedIn.
9 Responses
  1. Steve Jones

    I don’t take this too seriously. There are lots of things being done behind the scenes to increase capacity, especially in the area of content delivery networks. Also, much work has been done to vastly increase the throughput of existing fibre through the use of WDM. Yes, it’s an engineering challenge, but I don’t really see a “crunch” where it will all get stuck in a sort of internet gridlock.
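
As a purely illustrative aside on that CDN point, the assumed numbers below show why a local cache takes so much strain off the long-haul links: a request served from the cache never has to cross the backbone at all.

    # Illustrative only: long-haul traffic left once a regional cache absorbs most requests.
    peak_demand_gbps = 1000     # assumed peak streaming demand from one region
    cache_hit_ratio = 0.9       # assumed share of requests answered from the local cache

    backbone_gbps = peak_demand_gbps * (1 - cache_hit_ratio)
    print(f"Long-haul traffic: {backbone_gbps:.0f} Gb/s instead of {peak_demand_gbps} Gb/s")
    # Prints: Long-haul traffic: 100 Gb/s instead of 1000 Gb/s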

  2. Checkplease

    YEAR 2K bug!!!!! oh, hang on……
    IPv4 running out!!! oh, hang on…….
    Running out of capacity!!!! etc, etc…..

    • Steve Jones

      The year 2K bug was real (I was there and remember the vast amount of work involved). However, as with this, there were engineering solutions and the work was put in place, albeit at a cost. In my case I recall that dealing with the Y2K bug was used to do a huge number of much needed uplifts on both software and hardware which had long been held back, because the business was always reluctant to spend money on maintenance. That was a very common story across business; it’s often almost impossible to get finance for maintenance and uplifts unless there’s an imminent prospect of a disaster. There’s simply no perceived return – the go-getters higher up the chain just see it as a waste. I guarantee that if you go into any large company or organisation that has been around for many years, you will find it has a huge burden of obsolete systems.

      In that respect, the Y2K bug was a major gift to IT people (and I don’t just mean all those contractors who got rich), I mean in terms of a once-in-a-lifetime (or even once-in-a-century) chance to clear the decks of a lot of rubbish. It’s also worth noting that at least some of the application fixes didn’t fix a millennium bug. They actually fixed a century bug and, come the year 2100, they might need another bodge.

    • tonyp

      I agree the Y2K bug required much examination of systems code. A vast number of systems needed to be checked or modified to handle four-digit years. I guess the next serious date bug will come in 2036, when the 32-bit timestamp counting seconds from 01/01/1900 becomes all ones – a value which many systems today treat as an error return. I’m sure there will be another panic when the due date draws near, and I hope the life support systems I will be relying on then (if I’m still here) will not crash and switch me off!

      Still, who can know how much capacity we will actually need in 8 years’ time? Who would have predicted in 1990 that we would run out of IPv4 numbers? It’s all a wet finger in the air, I think. There might well be a new app/method/device that we haven’t devised or brought to market yet that will be all the rage then. Perhaps something from the bio-sciences?

    • MikeW

      Isn’t the next big timer bug due to 32-bit constraints within Unix, which counts from an epoch of 01/01/1970?

      That overflows during Jan 19th 2038.

      However, because it is an overflow of a signed integer, it becomes a negative value which (depending on software) might then skip backwards to December 1901 … but not 01/01/1900.

      As for Y2K itself … beyond the “ordinary” fixes we did to make sure our code worked and survived correctly, I also remember preparing to provide operational support over Y2K itself, just in case the worst happened and we needed to be recovering the country’s networks while all else was falling in flames around us. We had generators installed for the building so that we could work, and provide support for our telco clients, even if the power grid went off.

      A huge amount of preparation for something we never expected to need…
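
The rollover dates mentioned above are easy to check. A short sketch in Python (purely illustrative, standard library only) reproduces both the signed 32-bit Unix wrap in January 2038 and the 32-bit count of seconds since 1900 running out in February 2036:

    # Illustrative check of the 32-bit date rollovers discussed above.
    from datetime import datetime, timedelta, timezone

    unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    epoch_1900 = datetime(1900, 1, 1, tzinfo=timezone.utc)

    # Signed 32-bit time_t: last valid second, then the wrap to the most negative value.
    print(unix_epoch + timedelta(seconds=2**31 - 1))   # 2038-01-19 03:14:07+00:00
    print(unix_epoch + timedelta(seconds=-2**31))      # 1901-12-13 20:45:52+00:00

    # Unsigned 32-bit count of seconds since 01/01/1900 (as used by NTP): it wraps here.
    print(epoch_1900 + timedelta(seconds=2**32))       # 2036-02-07 06:28:16+00:00

The wrap lands in December 1901 rather than 1900 precisely because the Unix counter is signed, as noted above.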

  3. AndrewH

    I thought professors were supposed to be clever?
    Just another person thinking linearly and not exponentially….
    A bit like this daft Government spending billions on a new nuclear power station when we’ll have nearly 100% solar in 15 years and energy will be clean, abundant and probably about 1-2p per kWh or less.

    • 100% solar – as in actual power delivery – will never happen in the UK because a) Nimbys always seem to object to the plans for related sites / stations (much as they do with wind farms) and b) my roof panels don’t work so well during the night or on cloudy days… and we get A LOT of those (read: useless during those periods, and battery backup is pretty naff).

    • Steve Jones

      If you were to design a power system with the worst possible fit for UK requirements, it would be solar. It produces power when demand is least, on sunny days (and we don’t have lots of domestic air conditioning like the US to soak up the excess), and we don’t exactly have guaranteed sunny weather. The panels produce no power in the dark evenings when demand is highest, and relatively little during the winter.

      One of the costs of solar which nobody ever talks about is that it has to be backed up by conventional generation using plant that is otherwise idle. The power companies are compelled to take the excess via (very expensive) feed-in tariffs. Stopping and restarting thermal power generator systems is troublesome, and only things like diesel generators and some types of gas-powered plant can do it. Needless to say, the electricity from these sources gets very expensive, as they have high fixed costs to cover and those have to be recovered from the periods when they are actually generating. In consequence, the price for “on demand” power is high. There are some techniques to smooth sudden peaks by shutting down demand for short periods (there have been trials where companies allow things like cooling systems to be turned off for short, tolerable periods in exchange for lower power costs), but these don’t deal with the main issue.

      Wind is a much better fit for UK conditions, though it still suffers from intermittent supply and needs backup; at least it produces power at night and during the winter.

      There is talk of grid-level storage systems, but it’s largely talk and nobody has yet produced a cost effective system for it. Electricity storage is expensive and has low energy density.

      So yes, solar will never meet anything remotely like all the UK’s electricity generation requirements. It can save a bit of CO2, but it’s very expensive when the requirements for backup systems are included. George Monbiot was scathing about domestic solar panels in the UK and saw them as a means of subsidising middle class householders, often at the expense of rather less well off families who have to pay the elevated cost of power.

      http://www.theguardian.com/commentisfree/2010/mar/01/solar-panel-feed-in-tariff

      Although he modified that view (a bit) here

      http://www.theguardian.com/environment/georgemonbiot/2015/jan/23/community-energy-companies-big-six-big-society

    • comnut

      OR we may have eventually got ‘clean fusion’ power going… 🙂
