Computable.nl
Peer-to-peer reappears

17 May 2001 - 22:00 · 4 minutes reading time · Opinion · Cloud & Infrastructuur
Martin Healey

I don’t think there is a single definition of "peer-to-peer" computing. As far as I am concerned, it simply means that the resources of one machine connected to a network can be used by another connected machine.

It should also imply that each participant has similar resources, for instance that all can be both clients and servers. This is in stark contrast to the classical computer architecture, in which specific machines provide the services for other devices: specialised servers and equally specialised clients. The latter is the long-established terminal/computer architecture, but it was also the architecture adopted by the first PC local area networks, dominated by Novell Netware. Most client/server implementations still use the same basic architecture, although the servers provide far more functionality than the simple file and print services of Netware, e.g. databases and application servers.
Most servers are in fact normal computers running appropriate software. Since the servers use multi-tasking operating systems, most run a mixture of services concurrently, although there is some argument for dedicated servers. For an n-user network the normal arrangement requires n+1 computers. This makes sense with larger networks, since the server must be significantly more powerful than the clients. The server machine should also have more expensive features such as redundancy and higher build quality, since the system is vulnerable to that single point of failure. In the days of Netware, however, this was a problem for smaller networks, due to the extra cost of the server machine. The demand for lowering the cost of small PC-based LANs could be met by making one or more of the PCs both a client and a server. The problem was that the PC-DOS operating system and its bastard child, Windows 3.1, was not a protected-mode, multi-tasking system! Microsoft eventually released Windows 3.11, in which peer networking was a standard feature. It worked for simple applications with low usage, but we all know just how easy it was to crash Windows 3.1 with the simplest GUI application, Word being a prime culprit. Once a client application crashed the machine, its server capability disappeared with it.
By the time Windows 95/98/NT appeared, the cost of PC hardware had fallen so far that there was not much interest in the peer-to-peer LAN solution. But then came the Internet and a new generation of enthusiastic users, and now there is a reawakening of interest in peer-to-peer computing across the Internet.
The basic idea is to belong to a user group. Each individual’s computer is configured so that certain facilities are declared as available to other users whenever it is connected to the Net. So far the only practical application is file sharing. Personally I think this is a bad idea, prone to misuse and vulnerable to viruses, but I did experiment with the most famous (or infamous) system, Napster. Napster specialised in making files of music available, encoded in MP3 format. When connected to the network via Napster, the subset of files declared as Napster files is accessible to any other Napster member who is logged on. Napster provided a directory of the file names of all MP3 files available at any time. The search engine would thus enable a user to trace whether a specific track was available anywhere.
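The directory-plus-peers arrangement described above can be sketched in a few lines. This is a hypothetical illustration, not Napster's actual protocol: a central index maps shared file names to the peers currently logged on, and a search simply consults that index, so a peer's files appear when it connects and vanish when it disconnects.

```python
class Directory:
    """Hypothetical central directory in the Napster style."""

    def __init__(self):
        self.index = {}  # file name -> set of peer ids currently online

    def login(self, peer, shared_files):
        # A peer declares its shared files when it connects.
        for name in shared_files:
            self.index.setdefault(name, set()).add(peer)

    def logout(self, peer):
        # When a peer disconnects, its files leave the directory.
        for peers in self.index.values():
            peers.discard(peer)

    def search(self, term):
        # Return peers holding any file whose name contains the term.
        return {name: sorted(peers)
                for name, peers in self.index.items()
                if term.lower() in name.lower() and peers}

d = Directory()
d.login("alice", ["Track One.mp3", "Another Song.mp3"])
d.login("bob", ["Track One.mp3"])
print(d.search("track one"))  # both peers offer the track
d.logout("bob")
print(d.search("track one"))  # only alice remains
```

Note that the directory only holds file names; the actual transfer happens peer to peer, which is why a peer disconnecting mid-transfer aborts the copy.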
The machine I was using was a fast one with a megabit-per-second link. When I found another user with a file I fancied, I could copy it to my machine (which of course led to all the copyright problems), and other members could copy tracks I had encoded onto my machine to their own. The problem with the whole concept then became clear. If the file I fancied was on another machine with a Mbps link, the transfer took about ten minutes. For machines using dial-up modems, however (most of them, in fact), a typical five-minute sound track took hours to copy. Whenever a member had done his or her searching and disconnected, the server disappeared and the file transfer aborted in mid-stream. In practice it would have been totally impractical with a modem connection. Until all computers are secure and have permanent, high-speed connections, the idea is going to remain for enthusiasts only, and by then the application service providers will have got their act together.
