Advanced Meta Tag Generator

Monday, August 08, 2005
  Experts plan a more secure Internet: Internet 2.0
Originally developed by the Defense Department, the Internet is now a global electronic communications network made up of hundreds of millions of computers, servers and other devices run by various governments, academic institutions, companies and individuals.

Because no one entity owns it, the network depends on goodwill to function smoothly.

The internet has become so huge – and so misused – that some worry that its power to improve society has been undermined.

A new internet

Now a movement is gathering steam to upgrade the network, to create an Internet 2.0. How, or even whether, that could be done is the subject of much debate. But experts are increasingly convinced that the Internet’s potential will never be met unless it is reinvented.

The Internet is stuck in the flower-power days of the 1960s, when people thought the world would be beautiful if everyone were just nice.

Many of the bugs in the Internet are in its top layers of software, the jazzy, graphics-heavy, shrink-wrapped programs that come loaded on new computers or are sold in retail stores.

But some of the most critical issues were built into the network’s core design, written decades ago and invisible to the average user.

For example, a way to verify the identity of a sender of e-mail or other communications is just beginning to become available, meaning that many criminals roam the network with relative anonymity.
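The idea behind such verification can be sketched in a few lines. The example below uses a shared-secret HMAC purely for illustration; the real proposals of the era (such as SPF and DomainKeys) published records in DNS and used public-key signatures instead, and all names and keys here are invented.

```python
import hashlib
import hmac

# Hypothetical shared key between a mail server and verifiers,
# standing in for the public-key machinery of real schemes.
SECRET_KEY = b"example-shared-secret"

def sign_message(sender: str, body: str) -> str:
    """Attach a signature binding the message to its claimed sender."""
    mac = hmac.new(SECRET_KEY, f"{sender}:{body}".encode(), hashlib.sha256)
    return mac.hexdigest()

def verify_message(sender: str, body: str, signature: str) -> bool:
    """Reject mail whose signature does not match the claimed sender."""
    expected = sign_message(sender, body)
    return hmac.compare_digest(expected, signature)

sig = sign_message("alice@example.com", "Hello")
print(verify_message("alice@example.com", "Hello", sig))    # True
print(verify_message("mallory@example.com", "Hello", sig))  # False
```

Without a check like this, nothing in the original mail protocols ties a message to the address it claims to come from, which is why spammers can forge senders so freely.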

And the system that matches names to website addresses is vulnerable to hackers, who can redirect users to sites they never wanted to visit.
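A toy model makes the vulnerability concrete. The dictionary below stands in for a DNS resolver's cache; the names and addresses are made up. Real resolvers of the era accepted answers with little verification, which is what made this kind of "cache poisoning" possible.

```python
# Toy name-to-address cache standing in for a DNS resolver.
dns_cache = {"bank.example": "203.0.113.10"}

def resolve(name: str) -> str:
    """Look up a name the way a user's computer would."""
    return dns_cache.get(name, "unknown")

print(resolve("bank.example"))  # the legitimate address

# An attacker who can inject a forged record silently redirects users:
# the name looks the same, but it now points at the attacker's server.
dns_cache["bank.example"] = "198.51.100.66"
print(resolve("bank.example"))  # the attacker's address
```

The user types the same name either way; the redirection happens invisibly, below the layers the average person ever sees.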

Technological solutions for many of those problems have existed for years, but it’s been difficult to build a consensus to implement them.

Arguments about global politics, potential profits and ownership of intellectual property have plagued groups trying to fix things.

The problem with the Internet is that anything you do with it now is worth a lot of money. It’s not just about science anymore. It’s about who gets to reap the rewards of bringing safe technologies to people.

As the number of users exploded to more than 429 million in 2000 from 45 million in 1995, Lynch remembered watching in horror as hackers defaced popular websites and shady marketers began to bombard people’s e-mail inboxes with so much spam that real messages couldn’t get through.

A simpler time

When the Internet’s founding fathers were designing the network in the 1960s and 1970s, they thought a lot about how the network would survive attacks from the outside – threats like tornados, hurricanes, even nuclear war.

What they didn’t spend much time thinking about was internal sabotage. Only several hundred people had access to the first version of the Internet, and most knew each other well.

Years passed before the internet’s founders realized what they had created. All this was an experiment. They were trying to figure out whether the technology would work. They weren’t anticipating this would become the telecommunications network of the 21st century.

Even as he marveled at the wonders of instant messaging, Napster and other revolutionary tools that would not have been possible without the Internet, Leonard Kleinrock, 71, a professor at the University of California at Los Angeles who is credited with sending the first message – “lo,” for “log on” – from one computer to another in 1969, began to see the Internet’s dark side.

Some technologists have said the Internet, or parts of it, is so far gone that it should be rebuilt from scratch, and over the past decade there have been several attempts to do so.
But most now agree that the network has become too big and unruly for a complete overhaul. For now, groups are working on what are essentially bandages for the network.

Improved standards

Today, a complicated bureaucracy of groups known by their abbreviations helps govern the network: the IETF (the Internet Engineering Task Force, which comes up with the technical standards), ICANN (the Internet Corporation for Assigned Names and Numbers, which manages the naming system for websites) and the W3C (the World Wide Web Consortium, which develops technologies for the Web).

But their power is limited and their legal standing murky. Some have recently argued that the United Nations should take over some regulatory functions.

Firms have set up their own standards groups to suit their own interests.

The one thing everyone seems to agree on is that security must be the priority when it comes to the next generation Internet.
Major companies are promoting technology that will give recipients of e-mail “return addresses”, or a better way of ensuring that senders are who they say they are, though the companies disagree on whose technology should be used.
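One family of these "return address" proposals worked roughly like this: a domain publishes a list of servers allowed to send mail on its behalf, and receivers check the connecting server against that list. The sketch below mirrors that idea in miniature; the domains, addresses and the `sender_policy` table are all hypothetical, and real schemes such as SPF and Sender ID published their policies in DNS.

```python
# Hypothetical published sender policies, keyed by domain: each domain
# lists the IP addresses of servers permitted to send mail for it.
sender_policy = {
    "example.com": {"192.0.2.1", "192.0.2.2"},
}

def sender_is_authorized(claimed_domain: str, connecting_ip: str) -> bool:
    """Check a connecting server against the claimed domain's policy."""
    allowed = sender_policy.get(claimed_domain)
    if allowed is None:
        return False  # no policy published; cannot vouch for the sender
    return connecting_ip in allowed

print(sender_is_authorized("example.com", "192.0.2.1"))     # True
print(sender_is_authorized("example.com", "198.51.100.9"))  # False
```

The companies' disagreement was largely over who would define and control such policy records, not over the basic lookup shown here.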

A group of scientists from the IETF, perhaps the most important standards-making body for the network, is working on a way to better collect and share information on computer intrusions.

The Internet2, a consortium of mostly academic institutions that has built a screaming-fast network separate from the public Internet, is testing a technology that allows users to identify themselves as belonging to some sort of group.

Douglas E. Van Houweling, president of Internet2 and a professor at the University of Michigan, thinks the system could be used to limit access without using passwords to, say, chat rooms for women with children on a certain soccer team, or to subscribers of certain magazines or newspapers.

You’ve heard the saying that on the Internet nobody knows you’re a dog, and that’s of course the problem. Authentication will allow communities to form where people are known and therefore can be trusted.
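The access model Van Houweling describes can be sketched simply: a resource admits users based on asserted group attributes rather than a per-site password. The example below is a minimal illustration of that idea, not Internet2's actual system; the user names and attribute labels are invented.

```python
# Each user carries a set of vouched-for attributes, the kind a
# federated identity provider might assert on their behalf.
users = {
    "carol": {"soccer-parents", "times-subscribers"},
    "dave": {"times-subscribers"},
}

def may_enter(user: str, required_attribute: str) -> bool:
    """Admit a user to a resource if they hold the required attribute."""
    return required_attribute in users.get(user, set())

print(may_enter("carol", "soccer-parents"))  # True
print(may_enter("dave", "soccer-parents"))   # False
```

No password for the chat room itself is involved; the trust lives in whoever asserted the attributes.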


But there is a trade-off for such security. The network becomes Balkanized, with more parts of it closed to most people.

He believes the Internet will never truly be secure, though, because of the diversity of software and devices that run on it. If one has a flaw, the others are vulnerable.

For years computer designers have tried to build a machine that lives up to the “orange book”, a specification written by technologists at the predecessor to the National Institute of Standards and Technology.

It describes a bug-free, completely secure computer that would have to be built in a clean room by designers who have gone through extensive background checks and are not allowed to communicate with anyone.

There have been a few computer systems built like this for the military, and they vanish – just vanish. Nobody talks about them anymore. They have been created, but for the average person they may as well not exist.

Until that perfect machine is built for consumers, it will be up to people like those at the Internet Storm Centre to keep the network up and running.

The centre is operated by the SANS Institute, a Bethesda, Maryland-based nonprofit dedicated to computer security. But most of its work is done by an eclectic group of volunteers who sign on remotely from around the world, including a former National Security Council staff member and a grandmother in Iowa.

-the Internet 2.0-
