
Speaking at the Westminster eForum on Web 2.0 in London, Jim Cicconi, AT&T's VP of legislative affairs, claimed that without investment the Internet's current network architecture will be at full capacity by 2010. That's just 24 short months from now and, given the ever-increasing volume of video and user-generated content constantly being uploaded, it is hardly surprising.

VeriSign is certainly not surprised, and has been arguing for some time that the Internet is full and we need to get off - or at least get on with increasing capacity to cope with the growing demand. It has already announced an expansion of its Project Titan initiative, designed to strengthen, protect and structurally upgrade the Internet's infrastructure, increasing its capacity tenfold by 2010.

Among the upgrades announced are:

  • Adding additional network operations centers in the eastern United States and Northern Europe to manage and provide increased redundancy for Internet traffic. These sites expand VeriSign's data center capacity and diversify its locations to improve Internet traffic management and counter region-specific cyber attacks and threats.
  • Increasing its daily Domain Name System (DNS) query capacity from 400 billion queries a day to more than 4 trillion queries a day and scaling its proprietary constellation of resolution systems to increase their bandwidth from over 20 gigabits per second (Gbps) to greater than 200 Gbps.
  • Distributing its infrastructure to more than 100 locations around the globe, providing redundancy and reducing latency, which improves the experience for users by easing bottlenecks and increasing speed.
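Taken at face value, the headline DNS figure works out to tens of millions of queries per second. The daily capacities below come from the announcement; the per-second rates are a back-of-the-envelope derivation, not VeriSign's own numbers:

```python
# Convert VeriSign's stated daily DNS query capacity into
# queries per second (derived figures, for scale only).
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

old_daily = 400 * 10 ** 9   # 400 billion queries a day
new_daily = 4 * 10 ** 12    # 4 trillion queries a day

old_qps = old_daily / SECONDS_PER_DAY
new_qps = new_daily / SECONDS_PER_DAY

print(f"Before: ~{old_qps / 1e6:.1f} million queries/second")
print(f"After:  ~{new_qps / 1e6:.1f} million queries/second")
```

In other words, the upgrade takes peak theoretical capacity from roughly 4.6 million to roughly 46 million queries per second.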

"VeriSign is working to stay ahead of the constantly changing demands on its Internet infrastructure and threats to its security," said Ken Silva, chief technology officer at VeriSign. "The first stage of Project Titan was focused on the speed of the Internet and range of our infrastructure. This next stage will focus on ensuring that the level of security exceeds demands, such as new attacks coming from wireless devices, to keep the infrastructure stable and operational."

As Editorial Director and Managing Analyst with IT Security Thing I am putting more than two decades of consulting experience into providing opinionated insight regarding the security threat landscape for IT security professionals. As an Editorial Fellow with Dennis Publishing, I bring more than two decades of writing experience across the technology industry into publications such as Alphr, IT Pro and (in good old fashioned print) PC Pro. I also write for SC Magazine UK and Infosecurity, as well as The Times and Sunday Times newspapers. Along the way I have been honoured with a Technology Journalist of the Year award, and three Information Security Journalist of the Year awards. Most humbling, though, was the Enigma Award for 'lifetime contribution to IT security journalism' bestowed on me in 2011.


As you say, nothing new.
And had the responsible agencies and companies that manage the core infrastructure not been vigilant and constantly increasing capacity we'd have hit the limit years ago.

As it is, the main problem we're facing today is not that the network's bandwidth capacity is getting critically congested, but that the address space is filling up rapidly.
And with IPv6 adoption still going at a glacial pace I don't see that problem going away any time soon; in fact it may not go away until people have their noses pressed into it, when they can't acquire an IP address because there are none to be had.
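The scale of the exhaustion problem described above is easy to make concrete with Python's standard ipaddress module - a quick illustrative sketch of the raw address-space sizes, not tied to any registry's actual allocation data:

```python
# Compare the IPv4 and IPv6 address spaces, using only the
# Python standard library.
import ipaddress

ipv4_total = 2 ** 32   # ~4.3 billion addresses in total
ipv6_total = 2 ** 128  # ~3.4 x 10^38 addresses

print(f"IPv4 addresses: {ipv4_total:,}")
print(f"IPv6 is {ipv6_total // ipv4_total:,} times larger")

# A /8 block (the unit IANA delegated to the regional
# registries) holds only 2**24 addresses - small change
# against billions of connected devices.
block = ipaddress.ip_network("10.0.0.0/8")
print(f"A /8 holds {block.num_addresses:,} addresses")
```

The entire IPv4 space is a rounding error next to IPv6, which is the point of the glacial-adoption complaint: the fix has existed for years, it just isn't being deployed.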
