As the buffer cache is used to bridge the speed gap between the processor and storage devices, its performance is a deciding factor in overall system performance. The need for an improved buffer cache hit ratio and the shortcomings of the Least Recently Used (LRU) replacement algorithm motivate the development of the proposed algorithm. Data reuse and program locality are the basis for determining cache performance. The proposed algorithm identifies temporal locality by detecting access patterns both in the program context from which the I/O requests are issued, identified by a program counter signature, and in the files to which the I/O requests are addressed. For accurate pattern detection and improved cache performance, the re-reference behavior observed in the cache blocks is associated with a unique signature. The proposed algorithm supports the use of multiple caching policies so that the portion of the cache holding each pattern can be best utilized.
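A minimal sketch of how a program-counter-signature table could track re-reference behavior is given below. The signature folding, the per-signature counters, the pattern labels, and the thresholds are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (assumed details): associate re-reference behavior of
# buffer cache blocks with a program-counter signature and classify patterns.

from collections import defaultdict

def pc_signature(call_stack_pcs):
    """Fold the program counters of the I/O call site into one signature."""
    sig = 0
    for pc in call_stack_pcs:
        sig = (sig * 31 + pc) & 0xFFFFFFFF
    return sig

class SignatureTable:
    """Tracks how often blocks referenced under each signature are re-referenced."""
    def __init__(self):
        self.seen = defaultdict(set)           # signature -> blocks seen so far
        self.rereferenced = defaultdict(int)   # signature -> re-reference count
        self.total = defaultdict(int)          # signature -> total references

    def record(self, sig, block):
        self.total[sig] += 1
        if block in self.seen[sig]:
            self.rereferenced[sig] += 1
        else:
            self.seen[sig].add(block)

    def classify(self, sig):
        """Label a signature by the fraction of its references that are re-references."""
        if self.total[sig] < 8:                    # assumed warm-up threshold
            return "unidentified"
        ratio = self.rereferenced[sig] / self.total[sig]
        if ratio < 0.1:
            return "once"        # blocks touched a single time (sequential-like)
        if ratio > 0.6:
            return "repeated"    # blocks revisited regularly (looping-like)
        return "frequent"
```

The same table can be keyed by a (signature, file) pair to capture the per-file component of the detection described above.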
WSEAS Transactions on Information Science and Applications
Changing the cache size or architecture is the usual way to improve cache performance, but a single replacement policy cannot adapt to changing workloads. Non-detection-based policies cannot exploit reference regularities and suffer from cache pollution and thrashing. Cache Access Pattern (CAP) is a policy that detects patterns, at the file and program-context level, in the references issued to buffer cache blocks. Precise identification is achieved because frequently and repeatedly accessed patterns are distinguished through reference recency. The cache is partitioned so that each sub-cache holds the blocks of one identified pattern. Blocks of the once-accessed pattern are not stored, the repeatedly accessed pattern is managed by MRU, and the frequently accessed and unidentified patterns are managed by ARC. Future block references are predicted from the detected patterns. This improves the hit ratio, which in turn reduces the time spent in I/O and the overall execution time.
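The routing of blocks to sub-caches could look roughly like the sketch below. The MRUCache class, the injected ARC implementation, the detector interface, and the capacities are hypothetical stand-ins for illustration only.

```python
# Illustrative sketch (assumed interfaces): a partitioned buffer cache that
# routes each block to a sub-cache according to its detected access pattern.

class MRUCache:
    """Evicts the most recently used block -- suits repeatedly accessed (looping) patterns."""
    def __init__(self, capacity):
        self.capacity, self.order = capacity, []   # front = most recently used

    def access(self, block):
        if block in self.order:
            self.order.remove(block)
            hit = True
        else:
            hit = False
            if len(self.order) >= self.capacity:
                self.order.pop(0)                  # evict the most recently used block
        self.order.insert(0, block)
        return hit

class PatternPartitionedCache:
    def __init__(self, mru_capacity, arc_cache, detector):
        self.mru = MRUCache(mru_capacity)
        self.arc = arc_cache          # assumed ARC implementation with an access() method
        self.detector = detector      # assumed pattern detector (e.g., the signature table above)

    def access(self, sig, block):
        pattern = self.detector.classify(sig)
        if pattern == "once":
            return False              # bypass the cache: avoid polluting it with one-touch blocks
        if pattern == "repeated":
            return self.mru.access(block)
        return self.arc.access(block) # "frequent" and "unidentified" go to ARC
```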
2013 6th International Conference on Emerging Trends in Engineering and Technology, 2013
Meeting deadlines in a real-time environment is a genuine practical difficulty. Real-time scheduling algorithms spend considerable time making scheduling decisions, which increases deadline misses, and the problem is further complicated in a virtualized environment. This motivates the design of an efficient algorithm for predicting future block access requests in a virtualized environment. Disk block access requests made by real-time applications in the virtualized environment are analyzed, and future disk access requests are predicted by a non-work-conserving disk scheduling algorithm in offline mode. The pre-fetched disk blocks are moved to the buffer cache, and a pattern-based detection technique is applied to predict future accesses to the buffer cache blocks. Executing processes access large amounts of data and therefore require many disk accesses. In the real-time virtualized environment the requesting processes are scheduled using an adaptive real-time scheduling algorithm that reduces deadline misses. The non-work-conserving algorithm reduces seek time, the pattern-based technique improves the hit ratio and reduces I/O time, and the adaptive real-time scheduling algorithm helps processes meet their deadlines. The performance of applications in the virtual environment is thus enhanced, which makes the non-work-conserving, pattern-based, adaptive real-time scheduling approach very useful for hard real-time applications.
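A non-work-conserving dispatcher may deliberately keep the disk idle for a short window when a nearby request is anticipated, and prefetch the predicted block into the buffer cache. The sketch below shows that idea only; the wait window, the prediction callback, and the prefetch hook are illustrative assumptions rather than the paper's actual parameters.

```python
# Illustrative sketch (assumed parameters): a non-work-conserving dispatcher
# that briefly idles the disk when a block near the head position is predicted,
# prefetching it into the buffer cache instead of seeking to a farther request.

import time

class NonWorkConservingScheduler:
    def __init__(self, predict_next, prefetch, wait_window=0.002):
        self.predict_next = predict_next   # assumed: returns a predicted block number or None
        self.prefetch = prefetch           # assumed: loads a block into the buffer cache
        self.wait_window = wait_window     # assumed anticipation window in seconds
        self.queue = []                    # pending block requests

    def submit(self, block):
        self.queue.append(block)

    def dispatch(self, head_position):
        """Pick the next request, possibly idling briefly for a closer predicted block."""
        nearest = min(self.queue, key=lambda b: abs(b - head_position), default=None)
        predicted = self.predict_next(head_position)
        if predicted is not None and (
            nearest is None or abs(predicted - head_position) < abs(nearest - head_position)
        ):
            # Non-work-conserving step: keep the disk idle for a short window and
            # prefetch the predicted block rather than seeking to a farther request.
            time.sleep(self.wait_window)
            self.prefetch(predicted)
        if nearest is not None:
            self.queue.remove(nearest)
        return nearest
```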