Here is the article. It's on the Yahoo homepage. But I thought that the internet was like a ghost in the shell or something, because it was a result of all the servers being connected together and interacting with each other. So how are they going to replace it?
NEW YORK - Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over.
The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.
The Internet "works WELL in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."
No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.
Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."
One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.
There's no evidence they are meddling yet, but once any research looks promising, "a number of people (will) want to be in the drawing room," said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. "They'll be wearing coats and ties and spilling out of the venue."
The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.
Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.
The European Union has also backed research on such initiatives, through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.
A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.
These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years — assuming Congress comes through with funding.
Guru Parulkar, who will become executive director of Stanford's initiative after heading NSF's clean-slate programs, estimated that GENI alone could cost $350 million, while government, university and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.
And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.
Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.
"The network is now mission critical for too many people, when in the (early days) it was just experimental," Zittrain said.
The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.
But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn't have built-in mechanisms for knowing with certainty who sent what.
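(To make that gap concrete: packets carry a source address, but nothing in the core design proves who actually sent them. Below is a minimal sketch, in Python and assuming a pre-shared key, of the kind of per-message sender check the original architecture leaves out; the key handling and names here are purely illustrative, not anything the researchers have proposed.)

    import hashlib
    import hmac

    # Illustrative only: a pre-shared secret between sender and receiver.
    # The current IP layer has no equivalent, which is why a source address
    # can be claimed by anyone.
    SECRET_KEY = b"example-shared-secret"

    def sign_message(payload):
        # Attach a keyed digest so the receiver can tell the payload really
        # came from someone holding the key.
        tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
        return tag + payload

    def verify_message(packet):
        # Return the payload only if the digest checks out; otherwise treat
        # the packet as spoofed and discard it.
        tag, payload = packet[:32], packet[32:]
        expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
        return payload if hmac.compare_digest(tag, expected) else None

    packet = sign_message(b"hello from a known host")
    print(verify_message(packet))                      # b'hello from a known host'
    print(verify_message(b"X" * 32 + b"forged data"))  # None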
The network's designers also assumed that computers are in fixed locations and always connected. That's no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.
Engineers tacked on improvements to support mobility and improved security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.
Workarounds for mobile devices "can work quite well if a small fraction of the traffic is of that type," but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford's clean-slate program.
The Internet will continue to face new challenges as applications require guaranteed transmissions — not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.
Think of a doctor using teleconferencing to perform a surgery remotely, or a customer of an Internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.
And one day, sensors of all sorts will likely be Internet capable.
Rather than create workarounds each time, clean-slate researchers want to redesign the system to easily accommodate any future technologies, said Larry Peterson, chairman of computer science at Princeton and head of the planning group for the NSF's GENI.
Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the get-go. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.
"We made decisions based on a very different technical landscape," said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.
"Now, we have the ability to do all sorts of things at very high speeds," he said. "Why don't we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?"
Of course, a key question is how to make any transition — and researchers are largely punting for now.
"Let's try to define where we think we should end up, what we think the Internet should look like in 15 years' time, and only then would we decide the path," McKeown said. "We acknowledge it's going to be really hard but I think it will be a mistake to be deterred by that."
Kleinrock, the Internet pioneer at UCLA, questioned the need for a transition at all, but said such efforts are useful for their out-of-the-box thinking.
"A thing called GENI will almost surely not become the Internet, but pieces of it might fold into the Internet as it advances," he said.
Think evolution, not revolution.
Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.
These days, Carnegie Mellon professor Hui Zhang said he no longer feels like "the outcast of the community" as a champion of clean-slate designs.
Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.
FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI.
These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.
Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf's TCP/IP communications protocols.
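(A rough illustration of that virtualization idea: several independent networks can share one physical pipe if every packet is tagged with a virtual-network ID, which is part of why a new architecture could run alongside the old one on the same links. The header layout in this Python sketch is made up for illustration, not a proposed standard.)

    import struct

    # Made-up header for illustration: 2-byte virtual-network ID plus a
    # 4-byte payload length, so many networks can be carried over one link.
    HEADER = struct.Struct("!HI")

    def encapsulate(vnet_id, payload):
        # Tag a payload with its virtual-network ID before it enters the
        # shared pipe.
        return HEADER.pack(vnet_id, len(payload)) + payload

    def demultiplex(stream):
        # Split the shared pipe back into per-network traffic at the far end.
        offset, by_network = 0, {}
        while offset < len(stream):
            vnet_id, length = HEADER.unpack_from(stream, offset)
            offset += HEADER.size
            by_network.setdefault(vnet_id, []).append(stream[offset:offset + length])
            offset += length
        return by_network

    # Two virtual networks sharing one link: an experimental one and a
    # legacy one, each oblivious to the other.
    pipe = encapsulate(1, b"clean-slate test traffic") + encapsulate(2, b"ordinary web traffic")
    print(demultiplex(pipe))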
"Almost every assumption going into the current design of the Internet is open to reconsideration and challenge," said Parulkar, the NSF official heading to Stanford. "Researchers may come up with wild ideas and very innovative ideas that may not have a lot to do with the current Internet." del /s /q HTTP:\\tehinternets.lol
? I don't get it.

Information about the del command.

Quote from: Raptor on April 14, 2007, 12:12:56 PM
del /s /q HTTP:\\tehinternets.lol
LOL

By the way, this is a repost of one of honvetops' latest news articles in the News section.

I wasn't surprised to see this article. I knew about this plan for a while now. They have been talking about restructuring it for at least the last 3 or 4 years that I know of.
The biggest push now on restructuring it is for homeland security, so that they can get rid of anonymous connections and trace traffic back to the origin of a problem. There was also some talk about signatures embedded in wireless devices, so that the traffic of hackers etc. would carry a hidden signature that could be traced back to whoever purchased the piece of hardware... Good theory, but what if your laptop was stolen by a hacker, or stolen and then sold to a hacker or terrorist? Their activities would look like you were doing it, not them... Just a new form of identity theft... Computer Identity Theft!!!
The other gist I get from the stories and blogs I've read is that the CIA, Homeland Security, and the federal government want to track everyone's habits better. Everyone has daily routine habits which make up a pattern. When someone does something outside of their normal pattern, they want to know about it. This kind of hints at the NWO (New World Order) crap... Big Brother is already watching us all, and they just want more detailed coverage of our everyday lives, to have control over us and have us pop up in a report of potential law violators when we do something unusually outside of our pattern... Just think: you go on vacation outside the US, which is outside your pattern. Would you be black-flagged for a cruise to Bermuda, etc.?
I personally have nothing to worry about. It is good that the devices they want to add everywhere could make us safer from another attack, if implemented in time and effective at tracking people's profiles and daily routines. BUT... it does take away your freedom to just go out and enjoy life without someone always knowing where you are, about how long you would normally be there, etc. It is going to be an invasion of privacy.
The Discovery Channel had a good episode on a month or so ago where they showed a system in England that used the cameras at intersections, along with a computerized system, to profile people's lives. The software would take measurements of facial features and create a fingerprint of each person. It would then track your x and y coordinates and follow you from camera to camera. PRETTY SCARY STUFF.
The other reason for the change goes back to the government once again. They want to tax the internet and turn it into a telephone-like service, where you would no longer have just your ISP bill but also a usage fee to generate revenue for the federal government. They also want a customs-like supercomputer which would not allow data to pass to and from the US without being approved by the system, which right now is a free-for-all.
Also, they want to free up bandwidth of the existing internet infrastructure for corporations and the "RICH," and have tiers that you can pay for, where basic service at the lowest tier would be a connection whose data has the least priority of all the tiers above it. This gives the corporations and the rich all the bandwidth they want with extremely little congestion, since their data would have the highest priority. The middle tier or tiers obviously wouldn't be that bad for data, but would be laggy because traffic from the higher tiers has priority over them, and the basic lowest tier would suck big time with the least priority, making a broadband service at the lowest tier act like dial-up.
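To picture how those tiers would work, here is a rough sketch of my own (not from any actual proposal): traffic from a higher tier always goes out first, so the lowest tier only moves when nothing above it is waiting.

    import heapq

    # Rough sketch of strict-priority tiers, illustrative only: lower tier
    # number = higher priority, and a lower-tier packet is only sent when
    # nothing above it is waiting.

    class TieredLink:
        def __init__(self):
            self._queue = []
            self._seq = 0  # keeps first-in-first-out order within a tier

        def enqueue(self, tier, packet):
            heapq.heappush(self._queue, (tier, self._seq, packet))
            self._seq += 1

        def transmit(self):
            # Drain the link, always sending the highest-priority packet next.
            while self._queue:
                tier, _, packet = heapq.heappop(self._queue)
                print("tier %d: %s" % (tier, packet))

    link = TieredLink()
    link.enqueue(3, "basic-tier download")   # lowest tier, sent last
    link.enqueue(1, "premium video stream")  # highest tier, jumps the queue
    link.enqueue(2, "mid-tier web page")
    link.transmit()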
All the large communication monopolies of the world want the taxation and tiers to generate more revenue.
I would also like to add: don't try to fix what is not broken. Over the last 10 years, I have only heard of the internet partially going down once, and it was not due to its design. It was only a small segment, caused by a dig in NYC that cut through a main communications cable feeding the entire northeast US.
Has anyone heard anything else in detail regarding anything I shared, or other details I haven't touched upon?
Dave

Quote from: DaveLembke on April 14, 2007, 01:06:43 PM
Awesome info. I figured the first two... then you get situations like this and things start to add up... Believe it or not, the new Core 2 Duo and multiple-core CPUs are said to have some very serious bugs awaiting them in old, hidden code in software. Take this for example:
"The new US stealth fighter, the F-22 Raptor, was deployed for the first time to Asia earlier this month. On Feb. 11, twelve RAPTORS flying from Hawaii to Japan were forced to turn back when a software glitch crashed all of the F-22s' on-board computers as they crossed the international date line. The delay in arrival in Japan was previously reported, with rumors of problems with the software. CNN television, however, this morning reported that every fighter completely lost all navigation and communications when they crossed the international date line. They reportedly had to turn around and follow their tankers by visual contact back to Hawaii. According to the CNN story, if they had not been with their tankers, or the weather had been bad, this would have been serious. CNN has not put up anything on their website yet." But I thought that the internet was a resulte of all the servers in the world being up and running. I didnt even know that the internet was really coded. I thought that it was just a way the servers interacted with each other.your probably right(no code) , i just went off on a bend there, i though I was in the sandbox!