A few people here have asked me to do a bit of digging into what type of
machine/server we should get. As most of you will probably know, I'm
biased towards serving with NCSA on a Sun and caching with CERN on
another Sun. So, in an attempt to be a bit more balanced before I wander
off and spout off, here is the fundamental question ;)
There is going to be a new server set up, with a choice of machine and
software to run on it. The initial suggestion is a P90 running Solaris 2.4.
This can obviously run either the CERN or NCSA server (my leaning here
would be towards NCSA, since we don't need it to cache; however, there
is a minor problem with the expected load - more on that in a minute!).
However, there are one or two other possibilities. We have some RISC iX
machines (Acorn Archimedes 440/540s) which quite happily run the NCSA
server and, even when running X11, can easily cope with 120+ requests/hour.
This is on a 12Mb machine with around 12Mb of swap.
Another possibility, as I see it, is to not bother with a new machine at
all, and instead use some form of dynamic IP addressing to spread the
service over a number of machines that we already have.
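(If by "dynamic IP addressing" we mean having one hostname resolve to
several of the existing boxes in turn, the usual way is round-robin DNS:
multiple A records for the same name. A minimal sketch of a zone-file
fragment - the hostnames and addresses here are made up for illustration:

```
; hypothetical zone fragment: "www" rotates across three existing hosts
; (a resolver will be handed the A records in rotating order)
www    IN    A    192.0.2.10    ; host 1
www    IN    A    192.0.2.11    ; host 2
www    IN    A    192.0.2.12    ; host 3
```

The catch is that every host listed has to serve the same document tree,
and a dead host still gets its share of requests.)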
BTW, I should add that the likes of Linux and FreeBSD are apparently out,
since the people who will have to run the server want a "commercially
supported" OS on the machine.
Then there's the question of which server to run. As most of you will
know, I can handle the CERN and NCSA servers pretty well, BUT it won't
be my job to do so *phew*. Still, I feel that the NCSA server is a tad
easier to use and that is the way to go. But what about the other
servers out there?
There is also the possibility that there may be some sensitive/secure
information that needs to be served. Are the current free servers up to
this, or does that mean we need to go commercial (I'm thinking of the
Netscape server here)?
Suggestions and comments on my reasoning would be appreciated. I'm not
after any holy-war stuff here, just a few comments. :)