Different strategies of decision support can be identified in a consultation process and implemented in an expert system. This paper reports experiments carried out with an expert system developed in the area of information retrieval to perform the job of an information specialist who assists users in selecting query terms for database searches. Three support strategies are implemented in the system. The first is a “participative” strategy, in which the system searches its knowledge base while interacting with the user: the system reports intermediate findings, and the user judges their relevance and directs the search. The second is a more “independent” strategy, in which the system performs the search and evaluates its findings without consulting the user until the search is completed. The third is a “conventional” strategy (not an expert system), in which the system only provides information in response to the user's requests and makes no judgments or decisions; the user is expected to evaluate and decide. Three main questions are examined in the experiments: (a) which of the three support strategies is more effective in suggesting appropriate query terms; (b) which of the strategies is preferred by users; and (c) which of the two expert systems is more efficient, i.e., more “accurate” and “fast” in performing its consultation job. The experiments reveal that the system's performance under the first two strategies is similar and significantly better than under the third strategy. Users likewise generally prefer these two strategies over the “conventional” one. Between the first two, the more “independent” system behaves more “intelligently” than the more “participative” one.
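
The three strategies can be viewed as different control loops over the same term-suggestion search: who evaluates the candidates, and when. The sketch below is illustrative only; the toy knowledge base, function names, and scoring callback are hypothetical and do not reflect the actual system's search or evaluation logic.

```python
# Hypothetical toy knowledge base mapping a topic to candidate query terms.
KNOWLEDGE_BASE = {
    "expert systems": ["knowledge base", "inference", "consultation"],
    "information retrieval": ["query terms", "database search", "relevance"],
}

def conventional(topic):
    """'Conventional' strategy: return candidates verbatim;
    the user alone evaluates and decides."""
    return KNOWLEDGE_BASE.get(topic, [])

def independent(topic, score):
    """'Independent' strategy: search and evaluate internally,
    reporting only the final ranking once the search is complete."""
    candidates = KNOWLEDGE_BASE.get(topic, [])
    return sorted(candidates, key=score, reverse=True)

def participative(topic, user_judges_relevant):
    """'Participative' strategy: report each intermediate finding
    and let the user's relevance judgment direct the search."""
    accepted = []
    for term in KNOWLEDGE_BASE.get(topic, []):
        if user_judges_relevant(term):  # user steers at each step
            accepted.append(term)
    return accepted
```

For example, `participative("information retrieval", lambda t: "query" in t)` models a user who accepts only terms mentioning “query”, while `independent` defers all judgment to an internal scoring function until the search finishes.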