QueryFind: Search Ranking based on Users' Feedback and Expert's Agreement

Bibliographic Details
Main Authors: Po-Hsiang Wang, 王柏翔
Other Authors: Hahn-Ming Lee
Format: Others
Language: en_US
Published: 2003
Online Access: http://ndltd.ncl.edu.tw/handle/77693347842502043066
Description
Summary: Master's === National Taiwan University of Science and Technology === Department of Electronic Engineering === 91 === Given a query word, search engines can retrieve a vast number of Web pages from the World Wide Web for users. However, the main challenge for search engines is to rank these retrieved pages effectively so that they meet users' needs. Traditional ranking methods are content-oriented: each Web page receives a ranking score computed by sophisticated analysis of the page itself, independent of users' query words. As a result, the relation between Web pages and the information users actually require cannot be fully captured, and the pages most relevant to a user's query may not appear at the top of the search result list, so users still spend time seeking out the pages they need. Therefore, a novel ranking method named QueryFind, based on learning from historical query logs, is proposed to predict users' information needs and reduce the time spent searching the result list. Our method uses not only users' feedback but also the source search engine's recommendation. Based on this ranking method, we exploit users' feedback to judge the quality of Web pages implicitly, and we apply the meta-search concept to give each Web page a content-based ranking score. Consequently, the time users spend seeking out their required information from the search result list can be reduced, and more relevant Web pages can be presented. In our experiments, one week of query logs from the Yam Search Engine is used for evaluation. We also propose a novel evaluation approach to verify the feasibility of our ranking method: it captures the ranking order and the Web pages that users have clicked in the search result list. Finally, our experiments show that the time users spend seeking out their required information can be reduced significantly.
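
The abstract does not give QueryFind's scoring formulas, but the overall idea can be sketched: combine an implicit feedback score derived from query-log clicks with the source search engine's content-based recommendation, then evaluate by the rank positions users actually click. The Python sketch below is illustrative only; the click-through measure, the mixing weight alpha, and all function names are assumptions rather than the thesis's actual method.

```python
# A minimal sketch of QueryFind-style re-ranking, assuming a weighted mix of
# click-derived feedback and the source engine's own ranking. The exact
# formulas are not given in the abstract; everything here is hypothetical.

def feedback_score(click_log, query, url):
    """Implicit quality judgment: fraction of impressions of `url` for `query`
    in which users clicked it (a hypothetical click-through measure)."""
    impressions = 0
    clicks = 0
    for log_query, log_url, was_clicked in click_log:
        if log_query == query and log_url == url:
            impressions += 1
            clicks += 1 if was_clicked else 0
    return clicks / impressions if impressions else 0.0


def engine_score(source_rank):
    """Content-based recommendation from the source search engine,
    approximated here by the reciprocal of its rank position (1 = top)."""
    return 1.0 / source_rank


def queryfind_rank(query, results, click_log, alpha=0.7):
    """Re-rank `results` (a list of (url, source_rank) pairs) by mixing users'
    feedback with the source engine's recommendation; `alpha` is assumed."""
    scored = [
        (alpha * feedback_score(click_log, query, url)
         + (1.0 - alpha) * engine_score(rank), url)
        for url, rank in results
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [url for _, url in scored]


def average_click_position(sessions):
    """Evaluation sketch: mean rank position of clicked pages over query-log
    sessions, a proxy for the time users spend seeking required information."""
    positions = [
        rank
        for ranked_urls, clicked_urls in sessions
        for rank, url in enumerate(ranked_urls, start=1)
        if url in clicked_urls
    ]
    return sum(positions) / len(positions) if positions else float("nan")
```

In such a setup, a lower average click position after re-ranking would correspond to the reduced seeking time reported in the experiments.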