
Integrating Web Caching and

Web Prefetching
In Client-Side Proxies
Objective

 The motivation for our project is to design an innovative
cache replacement algorithm that not only considers the
caching effect in the Web environment but also evaluates the
prefetching rules provided by various prefetching schemes.
Introduction (Cont.)

Purpose
• Reduce user-perceived latency
Key Issue
• Efficiently predict, and fetch in advance, the future
URLs likely to be visited by the user
 Requires efficient prediction algorithms

Cost
• Increased network and server load due to
errors in prediction
 Controlled by an appropriate threshold value
Web caching

Web Caching:
Satisfying user Web requests by servers other
than the original Web servers publishing the
requested Web objects
Reduces:
1. Latency
2. External traffic
3. Load on web servers and routers

Deployed at: corporate network boundaries, ISPs,
Web servers, etc.
Web Cache

[Diagram: browsers with local browser caches on client machines in a
corporate LAN reach the Internet through a centralized Web cache
server]
Web caching

 Benefits:

 Improves web performance (reduces access latency)
 Increases web capacity
 Alleviates traffic congestion (reduces network
bandwidth consumption)
 Reduces the number of client requests reaching origin
servers (workload)
 Possibly improves failure tolerance and robustness of the
Web (cached copies of web objects remain available when
the origin networks are unreachable)
Web Prefetching

• Prefetching caches web objects in anticipation of users'
future needs
• Focuses on making cache-related storage capacity
decisions (storage capacity limits the number of prefetched
web objects)
 Part of the cache storage must therefore be allocated
to prefetched objects


Categories

• DNS Prefetching
 The Web client initiates the name resolution process in
advance of the user's request
• Connection Prefetching
 Sets up the TCP connection before the user's request
 The timing issue is important
• HTTP Prefetching
 The Web client issues the HTTP request in advance and
caches the response (see the sketch below)
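
A minimal sketch of HTTP prefetching, assuming an in-memory prefetch
cache keyed by URL (the HttpPrefetcher class name and its methods are
hypothetical): the request is issued ahead of the user with
java.net.HttpURLConnection and the response body is kept so that a
later user request can be answered locally.

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical illustration: fetch a predicted URL ahead of time and
// keep the response body in memory so a later request is a cache hit.
public class HttpPrefetcher {
    private final Map<String, byte[]> cache = new ConcurrentHashMap<>();

    public void prefetch(String url) {
        try {
            HttpURLConnection conn =
                (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("GET");
            try (InputStream in = conn.getInputStream();
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                cache.put(url, out.toByteArray()); // cache the prefetched body
            }
        } catch (Exception e) {
            // A failed prefetch only wastes the speculative request; ignore it.
        }
    }

    // Serve from the prefetch cache if present; null means the caller
    // falls back to a normal on-demand fetch.
    public byte[] lookup(String url) {
        return cache.get(url);
    }
}

A prediction module would call prefetch() for each URL it expects the
user to visit; the client checks lookup() before going to the network.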
Web prefetching techniques

 Client-initiated policies
 Learn from the client's own access history, e.g. user A is
likely to access URL U2 right after URL U1 (see the sketch
below)

 Server-initiated policies
 Anticipate future requests based on server logs and
proactively send the corresponding Web objects to
participating cache servers or client browsers

 Hybrid policies
 Combine user access patterns from clients and general
statistics from servers to improve the quality of
prediction
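
A minimal sketch of a client-initiated policy, assuming a first-order
model of the user's own history (the TransitionPredictor class name and
its methods are hypothetical): it counts which URL tends to follow
which and predicts the most frequent successor of the current URL.

import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of client-initiated prediction: count the
// observed transitions U1 -> U2 in the user's own access history and
// predict the most frequent successor of the current URL.
public class TransitionPredictor {
    private final Map<String, Map<String, Integer>> transitions = new HashMap<>();
    private String previousUrl;

    // Called on every access the user makes.
    public void recordAccess(String url) {
        if (previousUrl != null) {
            transitions
                .computeIfAbsent(previousUrl, k -> new HashMap<>())
                .merge(url, 1, Integer::sum);
        }
        previousUrl = url;
    }

    // Returns the URL most often seen right after the given URL,
    // or null if no transition has been observed yet.
    public String predictNext(String url) {
        Map<String, Integer> successors = transitions.get(url);
        if (successors == null) {
            return null;
        }
        return successors.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse(null);
    }
}

The predicted URL would then be handed to a prefetcher such as the
HttpPrefetcher sketched earlier.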
Who Initiates the Prefetching

Server Side Prefetching

• Uses information available at the web server to predict the
documents that will be requested soon
 Prefetches the object from disk into the main
memory of the server
• Provides hints to the client to improve the client's
response time
 The client can then initiate a prefetch (see the sketch
below)
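
One way a server can pass such a hint is a Link response header with
rel="prefetch"; a minimal servlet sketch, assuming the predicted next
document is already known (the NextPageServlet name and the
/next-page.html URL are hypothetical):

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical illustration: the server attaches a prefetch hint to its
// response; a cooperating client may then fetch /next-page.html early.
public class NextPageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Hint: the client is likely to need /next-page.html next.
        resp.addHeader("Link", "</next-page.html>; rel=\"prefetch\"");
        resp.setContentType("text/html");
        resp.getWriter().println("<html><body>Current page</body></html>");
    }
}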
Who Initiates the Prefetching (Cont.)

Proxy Side Prefetching

 The proxy prefetches documents from web servers and stores
them in the proxy cache
 The proxy can gather information from many clients to many
servers, which makes good prediction possible
Who Initiates the Prefetching (Cont.)

Client Side Prefetching

• Works with the browser
• Learns the user's personal profile and predicts future requests
• Uses the available bandwidth to retrieve documents
while the user has paused to read web content (see the
idle-time sketch below)
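
A minimal sketch of idle-time prefetching, assuming a hypothetical idle
threshold and reusing the HttpPrefetcher sketched earlier: a background
thread drains a queue of predicted URLs only while the user has been
inactive for a while, so speculative fetches do not compete with
on-demand traffic.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Hypothetical illustration: prefetch predicted URLs in the background,
// but only while the user appears idle (reading). IDLE_MILLIS is an
// assumed tuning parameter, not a value taken from this project.
public class IdleTimePrefetcher implements Runnable {
    private static final long IDLE_MILLIS = 2000;

    private final BlockingQueue<String> predicted = new LinkedBlockingQueue<>();
    private final HttpPrefetcher prefetcher = new HttpPrefetcher();
    private volatile long lastUserActivity = System.currentTimeMillis();

    public void onUserActivity() { lastUserActivity = System.currentTimeMillis(); }
    public void enqueue(String url) { predicted.offer(url); }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                String url = predicted.poll(500, TimeUnit.MILLISECONDS);
                if (url == null) {
                    continue; // nothing predicted yet
                }
                // Wait until the user has been idle long enough.
                while (System.currentTimeMillis() - lastUserActivity < IDLE_MILLIS) {
                    Thread.sleep(200);
                }
                prefetcher.prefetch(url); // fetch during the idle period
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}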
Modules
• System environment
 System model
 Prefetching rule

• Cache replacement algorithm


 Normalized profit function
 Algorithm IWCP

• Performance analysis
 Impact of cache capacity
 Impact of confidence threshold
 Execution overhead
Module 1

• System environment
 System model
 Prefetching rule
[Diagram: the Web client exchanges requests and responses (with
prefetching hints) with the proxy server; inside the proxy, a request
processing module works together with a prediction engine, a request
log, a prefetching rule repository, and a Web object depository; the
proxy forwards requests to the Web server]

System model of web prefetching and web caching


Prefetching rule

 Rule 1: A prefetching rule is an implication of the form
o1, ..., oi →c oi+1, where o1, ..., oi, oi+1 ∈ O, (o1, ..., oi, oi+1) is an
access sequence in D, and c is the confidence of the
prefetching rule.

 Rule 2: The confidence c of the prefetching rule o1, ..., oi →c oi+1
is the conditional probability p(o1, ..., oi, oi+1) / p(o1, ..., oi).

 An object oi+1 is referred to as an implied object if and only if
the prefetching rule is triggered by some client that has
already referenced the objects o1, ..., oi in its preceding
requests.
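
A minimal sketch of estimating the confidence in Rule 2 from the
access-sequence database D, under the simplifying assumptions that D is
a list of per-client sessions and that an access sequence means a
contiguous run of requests (the RuleConfidence class name and its
methods are hypothetical):

import java.util.Arrays;
import java.util.List;

// Hypothetical illustration of Rule 2: the confidence of the rule
// o1,...,oi -> oi+1 is estimated as
//   count(sessions containing o1,...,oi,oi+1) / count(sessions containing o1,...,oi),
// i.e. p(o1,...,oi,oi+1) / p(o1,...,oi) over the sessions in D.
public class RuleConfidence {

    public static double confidence(List<List<String>> sessions,
                                    List<String> antecedent, String implied) {
        List<String> full = new java.util.ArrayList<>(antecedent);
        full.add(implied);

        long antecedentCount = sessions.stream()
                .filter(s -> containsRun(s, antecedent)).count();
        long fullCount = sessions.stream()
                .filter(s -> containsRun(s, full)).count();

        return antecedentCount == 0 ? 0.0 : (double) fullCount / antecedentCount;
    }

    // True if `run` occurs as a contiguous subsequence of `session`.
    private static boolean containsRun(List<String> session, List<String> run) {
        for (int i = 0; i + run.size() <= session.size(); i++) {
            if (session.subList(i, i + run.size()).equals(run)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<List<String>> d = Arrays.asList(
                Arrays.asList("A", "B", "C"),
                Arrays.asList("A", "B", "D"),
                Arrays.asList("A", "B", "C"));
        // The rule A,B -> C holds in 2 of the 3 sessions containing A,B,
        // so the printed confidence is 2/3.
        System.out.println(confidence(d, Arrays.asList("A", "B"), "C"));
    }
}

Only rules whose confidence reaches the chosen threshold would be kept
in the prefetching rule repository.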
Hybrid Prefetching-System Model

[Figure: system model of the prefetching proxy]


System Specification:

 Hardware Specification:
Processor type : Pentium IV
Speed : 1.2 GHz
RAM : 128 MB
Hard disk : 20 GB
 Software Specification:
Operating system : Linux, Windows 2000
Programming package : Java / J2EE
Protocol : HTTP
Web server : Apache Tomcat 5.0
