
UNCC ITIS 3100 - Web Servers

Web Servers: Generic Overview
(Adapted from http://en.wikipedia.org/wiki/Web_servers)

A web server can be:
- A computer program
  - Responsible for accepting HTTP requests from clients (web browsers)
  - Returns HTTP responses with optional data content
  - Usually web pages: HTML documents and linked objects (images, etc.)
- A computer that runs a computer program providing the above functionality

Common Features

HTTP
- Accepts HTTP requests from a client
- Provides HTTP responses to the client
  - Typically an HTML document
  - Can also be a raw text file, an image, or some other type of document, as defined by MIME types
- If an error is found in the client request, or occurs while trying to serve the request, the web server has to send an error response
  - May include custom HTML and text messages to better explain the problem to end users

Logging
- Web servers keep detailed information about client requests and server responses in log files
- Allows the webmaster to collect data by running log analyzers

Additional Features

Authentication
- Optional authorization before allowing access to some or all resources
- Requires a user name and password

Static and dynamic content
- Handles both static and dynamic content
- Supports one or more related interfaces: SSI, CGI, SCGI, FastCGI, JSP, PHP, ASP, ASP.NET, or a server API such as NSAPI, ISAPI, etc.

HTTPS support
- Via SSL or TLS; allows secure (encrypted) connections
- Uses port 443 instead of port 80

Content compression, e.g.
by gzip encoding
- Reduces the size of responses
- Lowers bandwidth usage, etc.

Virtual hosting
- Serves many web sites using one IP address

Large file support
- Serves files greater than 2 GB, the typical 32-bit OS restriction

Bandwidth throttling
- Limits the speed of responses so as not to saturate the network, allowing the server to serve more clients

Origin of Returned Content

The origin of the content may be:
- Static: comes from a file pre-existing in a file system
- Dynamic: generated by some other program or script, or by an Application Programming Interface (API) called by the web server

Static content is usually delivered much faster than dynamic content (2 to 100 times), especially if the latter involves data pulled from a database.

Path Translation

Web servers map the path component of a Uniform Resource Locator (URL) onto:
- A local file system resource, for static requests
- An internal or external program name, for dynamic requests

For a static request, the URL path specified by the client is relative to the web server's root directory.

Consider the following URL requested by a client:

  http://www.example.com/path/file.html

The client's web browser translates it into a connection to www.example.com with the following HTTP 1.1 request:

  GET /path/file.html HTTP/1.1
  Host: www.example.com

The web server on www.example.com then appends the given path to the path of its root directory. On Unix machines, this is commonly /var/www/htdocs.
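This append step can be sketched in Python. The docroot and function name here are illustrative, not from any particular server; note that a real server must also reject ".." segments so a request cannot escape the document root (directory traversal).

```python
from pathlib import PurePosixPath

# Illustrative docroot, matching the /var/www/htdocs example above.
DOCROOT = PurePosixPath("/var/www/htdocs")

def translate(url_path: str) -> PurePosixPath:
    """Map the path component of a URL onto the document root."""
    rel = PurePosixPath(url_path.lstrip("/"))
    if ".." in rel.parts:
        # Without this check, /../etc/passwd would escape the docroot.
        raise ValueError("path escapes document root")
    return DOCROOT / rel

print(translate("/path/file.html"))  # /var/www/htdocs/path/file.html
```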
The result is the local file system resource:

  /var/www/htdocs/path/file.html

The web server then reads the file, if it exists, and sends a response to the client's web browser. The response describes the content of the file and contains the file itself.

Performance

Web servers must serve requests quickly, from more than one TCP/IP connection at a time. The main key performance parameters are:
- Number of requests per second (depends on the type of request, etc.)
- Latency: response time in milliseconds for each new connection or request
- Throughput in bytes per second (depends on file size, cached or uncached content, available network bandwidth, etc.)

These parameters are measured under a varying load of clients and varying requests per client.

Performance may vary noticeably with the number of active connections, so a fourth parameter is the concurrency level supported by the web server under a specific configuration. The server model used to implement the web server program can also bias the performance and scalability reachable under heavy load or when using high-end hardware (many CPUs, disks, etc.).

Load Limits

A web server (program) has defined load limits:
- It can handle only a limited number of concurrent client connections per IP address (and IP port): usually between 2 and 60,000, with defaults between 500 and 1,000
- It can serve only a certain maximum number of requests per second, depending on:
  - its own settings
  - the HTTP request type
  - the content origin (static or dynamic)
  - whether the served content is cached or not
  - the hardware and software limits of the host OS

When a web server is near to or over its limits, it becomes overloaded and thus unresponsive.

Overload Causes

[Figure: a sample daily graph of a web server's load, indicating a spike in the load early in the day.]

At any time, web servers can be overloaded because of:
- Too much legitimate web traffic: thousands or even millions of clients hitting the web site in a short interval
of timeDDoS (Distributed Denial of Service) attacksComputer wormsAbnormal traffic because of millions of infected computers (not coordinated)XSS virusesMillions of infected browsers and/or web serversInternet web robotsTraffic not filtered / limited on large web sites with very few resources (bandwidth, etc.)Internet (network) slowdownsClient requests are served more slowly and the number of connections increases so much that server limits are reachedWeb servers (computers) partial

