SEEFOR 16(2): early view
Article ID: 2521
DOI: https://doi.org/10.15177/seefor.25-21
ORIGINAL SCIENTIFIC PAPER
Evaluating the Effectiveness of Google Scholar, ScienceDirect, Mendeley, and eLibrary for Interdisciplinary Research
Natalya Ivanova1, Svetlana Ivanova1,2, Sergey Ivanchikov1, Sezgin Ayan3,*
(1) Institute Botanic Garden Ural Branch of RAS, Forestry Department, 8 Marta Street, 202a, RU-620144 Yekaterinburg, Russia;
(2) Institute of Natural Sciences and Mathematics, Ural Federal University, Department of High-Performance Computer Technologies, 19 Mira Street, RU-620002 Yekaterinburg, Russia;
(3) Kastamonu University, Faculty of Forestry, Silviculture Department, Kuzeykent Campus, TR-37200 Kastamonu, Türkiye
Citation: Ivanova N, Ivanova S, Ivanchikov S, Ayan S, 2025. Evaluating the Effectiveness of Google Scholar, ScienceDirect, Mendeley, and eLibrary for Interdisciplinary Research. South-east Eur For 16(2): early view. https://doi.org/10.15177/seefor.25-21.
Received: 27 May 2025; Revised: 8 Sep 2025; Accepted: 19 Sep 2025; Published online: 27 Nov 2025
Abstract
As the volume of new scientific evidence, ideas and practical developments increases, so does the risk that researchers will fall behind the current state of the science and its practical applications. High-quality synthesis of information in the form of systematic reviews and meta-analyses aims to address this problem and can significantly facilitate researchers' orientation in the current state of a scientific field. Four search engines were selected for this study: Google Scholar, ScienceDirect, Mendeley and eLibrary. Information was searched across several scientific disciplines, including machine learning, economics, and ecology. For the comparative analysis of the search engines, a keyword search was performed for the following terms: "climate warming", "ecological service", "reforestation", "wildfires", "ecological indicator", "cryptocurrency", "financial market", "machine learning", "market prediction", and "blockchain". For more complex queries, the following combinations were searched: "machine learning" and "cryptocurrency", "machine learning" and "financial market", "machine learning" and "reforestation", "machine learning" and "wildfires". The search was conducted in February 2025 and covered the last 10 years (2015 to 2024). The results show that Google Scholar, ScienceDirect, Mendeley and eLibrary differ substantially in ways that need to be taken into account when organizing searches. It is therefore not recommended to rely on a single search engine in a study, as this may lead to false conclusions. It was confirmed that Google Scholar returns significantly more records than the other search engines. However, it should not be used as the main source of information because of the low reproducibility of its results and the considerable effort needed to verify the relevance of the records. ScienceDirect and Mendeley provide similar results and may be suitable as a primary search engine. Google Scholar and eLibrary can be used as additional resources to increase the completeness of coverage of relevant publications. In conclusion, the analysis of the searches carried out can help researchers choose the most appropriate search engine for a specific study and may contribute to the writing of high-quality systematic reviews.
Keywords: Google Scholar; ScienceDirect; Mendeley; eLIBRARY.RU; ecology; economics; machine learning
INTRODUCTION
Researchers face unprecedented challenges in tracking past and present research results due to the exponential growth of new publications (Larsen and von Ins 2010). This problem can be especially challenging in interdisciplinary research. The growing volume of new scientific results, ideas and practical developments increases the risk that researchers will lag behind the current development of their field and the practical applications being implemented. As a consequence, the relevance and the scientific and practical significance of some studies may be reduced (Gusenbauer and Haddaway 2020). A high-quality synthesis of information in the form of systematic reviews and meta-analyses can help to solve this problem and significantly ease researchers' orientation in the current state of a scientific field. The purpose of reviews is to highlight the most important theoretical and applied developments from the abundance of publications in a particular field of research (Konno and Pullin 2020, Gusenbauer 2021). Systematic reviews and meta-analyses are essential tools for academic researchers, policy makers and decision makers.
Finding information in the literature is the most important task for any systematic review, and without web-based literature search systems it is currently impossible to carry out academic work (Gusenbauer and Haddaway 2020, Gusenbauer and Gauster 2025). The initial task in a systematic review is a thorough, objective, and reproducible search that identifies as many relevant studies as possible. Such a search establishes the basis for the subsequent synthesis, helps to minimize bias, and therefore contributes to reliable research results; this stage is accordingly recognized as pivotal. The meticulous and reproducible nature of the search is also a hallmark that distinguishes systematic reviews from conventional narrative reviews, which often pay insufficient attention to information retrieval, so that their findings may be too subjective and unreliable (Gusenbauer and Haddaway 2020, Gusenbauer and Gauster 2025).
Relying on a single search engine is often considered an inadequate approach, because it does not provide comprehensive and reliable information on the state of the issue under investigation (Kugley et al. 2017): no search engine is perfect. Each search engine functions in a distinct manner, has its own strengths and weaknesses, and the same search query entered into different search engines may yield different results. It is therefore important to organize online searches efficiently and to select a set of search engines that is optimally suited to the specific research objectives. This enables researchers to identify relevant studies with a given level of completeness and publication quality, and to do so in a procedurally unbiased manner.
A significant amount of research has already been conducted on this issue (Falagas et al. 2008, Gehanno et al. 2013, Gusenbauer 2019, 2022, Gusenbauer and Haddaway 2020). A number of studies have compared only a small number of search engines. For instance, Gehanno et al. (2013) conducted a comparative analysis of the effectiveness of Google Scholar and PubMed, and Mongeon and Paul-Hus (2016) compared the journal coverage of the Web of Science and Scopus. Several publications have focused on the effectiveness of a single search engine (Bramer 2016, Hug and Braendle 2017). Harzing and Alakangas (2016) conducted a comparative analysis of the search engines of three databases: Web of Science, Scopus and Google Scholar. Singh et al. (2021) compared the journal coverage of Web of Science, Scopus and Dimensions. Other scholars have examined a broad spectrum of search engines (Gusenbauer 2019, Gusenbauer and Haddaway 2020). These researchers have underscored that the accuracy, completeness, and reliability of results depend on the specific search engine employed. However, the optimal search engine for a given meta-analysis or research area remains unclear. In each instance, a compromise must be found between accuracy, completeness, reliability and the complexity of data extraction. This necessitates further research in a range of scientific disciplines, as well as an assessment of the strengths and weaknesses of various search engines.
Four search engines/databases were tested (Google Scholar, ScienceDirect, Mendeley and eLibrary) on a specific example of information retrieval from different scientific fields: machine learning, economics and ecology. Google Scholar, ScienceDirect, Mendeley and eLIBRARY.RU are databases, each searched by its own dedicated search engine; in what follows, these names refer to the search engines of the respective databases. Despite its high popularity, the Web of Science search engine was not included in the analysis since not all researchers have access to it: the study gave preference to openly accessible resources. The study deliberately focused on fundamentally different scientific domains, thereby testing the hypothesis that a search system operates consistently across such domains. It is important to note that the effectiveness of the Mendeley search engine has not previously been evaluated in comparison with other resources. The effectiveness of a search engine is determined here by its breadth of coverage, relevance, ease of access and reproducibility; the most effective search engine is one that reliably extracts the largest number of relevant records with the fewest errors, and the search must also be reproducible. The present study is further distinguished by its analysis of the national search engine eLibrary which, to the best of the authors' knowledge, has not been analyzed previously. Therefore, although some search engine comparisons have been made before, this research has a relatively high degree of originality and scientific novelty.
This research area is pertinent and will be of interest to a diverse readership across scientific disciplines, since almost every researcher is aware of the limitations of the search systems they use, and every researcher needs a sound basis for deciding which search engine to use in a given study. It is also noteworthy that search engines are continuously developed and refined: new ones are being created and existing ones are being improved. Consequently, a comparative analysis of their effectiveness will remain relevant for many years to come. The aim of this research is to initiate the development of a separate scientific field that will meet researchers' needs in this respect.
MATERIALS AND METHODS
A comparative analysis was undertaken of Google Scholar, ScienceDirect, Mendeley and eLibrary. Google Scholar is a freely available search engine that searches not only peer-reviewed scientific journals and books, but also dissertations, preprints, abstracts, technical reports and other scientific literature. A study undertaken in 2014 demonstrated that approximately 90% of all academic documents published in English on the Internet can be found using Google Scholar (Khabsa and Giles 2014). The alternative search engines were therefore compared with Google Scholar. ScienceDirect is a digital library that provides access to high-quality, peer-reviewed full texts of research articles and book chapters, with the exception of grey literature. It is recognized as an important information resource for scientists around the world, and its bibliographic metadata can be read free of charge. Mendeley is a software application designed for managing and sharing research papers, and it is also highly convenient for creating bibliographies for scientific articles. This resource has been employed in systematic reviews (Ivanova 2024, Ivanova and Zolotova 2024), and it was hypothesized that it could be used as the primary search engine, primarily because of how convenient it is for working with bibliographic data. However, we are not aware of any comparison between the capabilities of its search engine and those of other resources; the present study aims to address this gap in the scientific literature. eLibrary is the largest Russian information and analytical portal in the fields of science, technology, medicine and education, containing abstracts and full texts of articles. In contrast to the resources described above, eLibrary facilitates the discovery of scientific literature published in Russian, thereby complementing the information obtained from other databases. However, a study comparing the effectiveness of the eLibrary search engine with the resources listed above has not yet been conducted. These factors contribute to the uniqueness and originality of our research.
For the comparative analysis of the search engines, a keyword search was performed for the following terms: "climate warming", "ecological service", "reforestation", "wildfires", "ecological indicator", "cryptocurrency", "financial market", "machine learning", "market prediction", and "blockchain". For more complex queries, the following combinations were searched: "machine learning" and "cryptocurrency", "machine learning" and "financial market", "machine learning" and "reforestation", "machine learning" and "wildfires". Each term was searched both with and without quotation marks to check whether the use of quotation marks is appropriate. These keywords reflect some of the most pressing issues in various scientific fields at the moment; it was therefore assumed that they are searched for very frequently, making our analysis relevant and in demand among researchers. Of course, this research covers only a small part of current problems, but it enables us to draw initial conclusions and to stimulate further research. The strategy of selecting the most relevant contemporary issues for the analysis aimed to encompass a wide range of scientific disciplines. The search was conducted in February 2025 and covered the last 10 years (2015 to 2024). The search results were recorded separately for each year, after which visual diagrams were created and a comparative analysis was carried out.
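As a purely illustrative sketch (not part of the original study protocol), the Python snippet below shows one way the search protocol just described could be organized: every simple term is generated with and without quotation marks, the compound queries are built as two quoted phrases, and an empty table is prepared into which the yearly record counts read from each search engine's interface would be entered by hand. The table layout and variable names are assumptions made for this example.

```python
# Illustrative sketch of the search protocol; the query lists come from the
# Materials and Methods, while the table layout is an assumption of this example.
import pandas as pd

ENGINES = ["Google Scholar", "ScienceDirect", "Mendeley", "eLibrary"]
YEARS = range(2015, 2025)  # the ten-year window searched in February 2025

simple_terms = [
    "climate warming", "ecological service", "reforestation", "wildfires",
    "ecological indicator", "cryptocurrency", "financial market",
    "machine learning", "market prediction", "blockchain",
]
compound_terms = [
    ("machine learning", "cryptocurrency"),
    ("machine learning", "financial market"),
    ("machine learning", "reforestation"),
    ("machine learning", "wildfires"),
]

# Each simple term is searched with and without quotation marks;
# compound terms are searched as two quoted phrases combined with AND.
queries = [variant for term in simple_terms for variant in (f'"{term}"', term)]
queries += [f'"{a}" AND "{b}"' for a, b in compound_terms]

# Empty table: one row per (engine, query, year); the "count" column is
# filled in manually with the number of records each engine reports.
rows = [(engine, query, year)
        for engine in ENGINES for query in queries for year in YEARS]
records = pd.DataFrame(rows, columns=["engine", "query", "year"])
records["count"] = pd.NA
```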
ANOVA was used to test whether the choice of search engine is a significant factor in the number of records returned. Tukey's honestly significant difference (HSD) test was then used to identify the specific pairs of search engines whose search results differ statistically from each other.
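The following is a minimal sketch of this statistical workflow in Python, assuming the yearly record counts for a single keyword are available as ten values per search engine; the counts shown are hypothetical placeholders, not the values reported in this paper.

```python
# Sketch of the ANOVA + Tukey HSD workflow; the yearly counts are hypothetical
# placeholders (in thousands of records), not the values reported in the paper.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

counts = {
    "Google Scholar": [52, 58, 63, 70, 77, 85, 90, 95, 99, 104],
    "ScienceDirect":  [10, 11, 13, 15, 17, 19, 22, 24, 27, 30],
    "Mendeley":       [ 9, 10, 12, 14, 16, 18, 20, 23, 25, 28],
    "eLibrary":       [ 2,  3,  3,  4,  5,  5,  6,  7,  8,  9],
}

# One-way ANOVA: is the search-engine factor significant?
# With 4 engines and 10 yearly observations each, this gives F(3, 36),
# matching the degrees of freedom reported in the Results.
f_stat, p_value = f_oneway(*counts.values())
print(f"ANOVA: F(3, 36) = {f_stat:.2f}, p = {p_value:.5f}")

# Tukey's HSD: which pairs of engines differ significantly?
values = np.concatenate(list(counts.values()))
engines = np.repeat(list(counts.keys()), 10)
print(pairwise_tukeyhsd(values, engines, alpha=0.05))
```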
RESULTS
Climate Warming
Climate warming represents one of the most pressing environmental issues in the world today, and all of the search engines examined returned a substantial number of publications on this topic (Figure 1). It was confirmed that Google Scholar provides the largest number of entries. It is noteworthy, however, that the two similar queries "climate warming" and climate warming yielded divergent results. The quoted keyword "climate warming" returned a significantly lower number of records than the same keyword without quotes. Interestingly, in the first case the number of records found increased over the study period, while in the second case it decreased (Figure 1). For the other search engines, the number of records increased over time in both cases, indicating a persistent upward trend. A further noteworthy finding is the similarity in the number of records returned by ScienceDirect, Mendeley, and eLibrary for the query "climate warming" over most of the studied period. Conversely, for the query climate warming, ScienceDirect yields a greater number of records than Mendeley and eLibrary over most of the studied period.
ANOVA showed significant differences in the search results from different search engines for the keyword "climate warming" (F(3, 36) = 89.958, p = 0.00000). To clarify the ANOVA results, Tukey's HSD test was conducted (Table 1).
Statistically significant differences were found between Google Scholar and all the other search engines studied based on the search results for the keyword "climate warming". Similar results were obtained for the keyword climate warming (without quotes) (F(3, 36) = 126.54, p = 0.0000). Tukey's test also showed a similar result (Table 1).
Ecological Indicator
Ecological indicators are likewise the focus of many current studies, and the search engines returned a substantial number of records (Figure 2). The keyword "ecological indicator" yielded an intriguing outcome that did not align with our expectations: ScienceDirect provided the largest number of records for almost the entire studied period, with Google Scholar in second place. This result was obtained for this keyword only. Conversely, the same query without quotation marks yielded a remarkably divergent result: Google Scholar returned more than four times as many entries as the other search engines. The quoted query showed a clear upward trend in the number of publications, while the unquoted query showed an initial increase followed by a downward trend starting in 2020.
ANOVA revealed significant differences in the search results for the keyword "ecological indicator", both with and without quotation marks (F(3, 36) = 57.052, p = 0.00000 and F(3, 36) = 108.66, p = 0.0000, respectively). Tukey's test revealed pairs of search engines for which there were statistically significant differences in search query results (Table 2).
Ecological Service
The significance of ecosystem functions in ensuring the stability of the biosphere and in maintaining a favorable human habitat is evident in the substantial number of records identified by search engines using the keywords "ecological service" and ecological service (Figure 3).
The quoted term "ecological service" showed an increasing trend in the number of records, while the unquoted term ecological service showed a decline. According to the number of entries returned, the search engines were ranked as follows: Google Scholar, ScienceDirect, Mendeley and eLibrary. It is noteworthy that for the quoted term "ecological service" the differences between the numbers of records obtained with Google Scholar, ScienceDirect, and Mendeley are considerably smaller than for the unquoted term.
ANOVA revealed significant differences in the search results for the keyword "ecological service", both with and without quotation marks (F(3, 36) = 23.796, p = 0.00000 and F(3, 36) = 91.58, p = 0.0000, respectively). Tukey's test revealed statistically significant differences between the search results obtained using Google Scholar and those obtained using all the other search engines. Statistically significant differences were also found between eLibrary and ScienceDirect for the keyword "ecological service" (Table 3).
Reforestation and Wildfires
A keyword search for "reforestation" and "wildfires" yielded a greater number of results with Google Scholar than with the other search engines (Figure 4). ScienceDirect and Mendeley, on the other hand, yielded similar results.
Figure 4. Search results for the keywords: (a) reforestation; (b) wildfires.
The choice of a search engine was also a significant factor (F(3, 36) = 279.70, p = 0.00000 for reforestation and F(3, 36) = 117.00, p = 0.0000 for wildfires). Tukey's test revealed statistically significant differences in the search results obtained using Google Scholar compared to those obtained using all other search engines (Table 4).
Table 4. Tukey's HSD test for the keywords reforestation and wildfires.
Financial Market and Market Prediction
When conducting a search for the term "financial market", Google Scholar returns the highest number of results (Figure 5). However, it should be noted that Mendeley ranked second in terms of the number of records for most of the studied years. A similar pattern was identified for the search for "market prediction" (Figure 6).
The choice of a search engine was also a significant factor (F(3, 36) = 404.28, p = 0.00000 and F(3, 36) = 60.794, p = 0.00000 for financial market with and without quotation marks, respectively; F(3, 36) = 29.452, p = 0.00000 and F(3, 36) = 322.13, p = 0.00000 for market prediction with and without quotation marks, respectively).
Tukey's test revealed statistically significant differences in the search results obtained using Google Scholar compared to those obtained using all other search engines (Table 5).
Blockchain and Cryptocurrency
In searches on blockchain and cryptocurrency, Google Scholar emerges as the leading source of information, with Mendeley ranking second (Figure 7). A notable feature is that for these keywords eLibrary ranked third in the number of entries, whereas in all other instances the eLibrary search engine returned the lowest number of entries of the engines analyzed. ANOVA and Tukey's test revealed similar results (F(3, 36) = 16.264, p = 0.00000 for blockchain, F(3, 36) = 18.311, p = 0.0000 for cryptocurrency) (Table 6).
Table 6. Tukey's HSD test for the keywords blockchain and cryptocurrency.
Figure 7. Search results for the keywords: (a) blockchain; (b) cryptocurrency.
Machine Learning
It was confirmed that machine learning is widely discussed in the literature: all search engines returned a large number of entries. At the same time, Google Scholar clearly prevailed in terms of the number of identified records (Figure 8). Mendeley was in second place, followed by ScienceDirect and/or eLibrary, depending on the year. One feature stands out: ScienceDirect, Mendeley, and eLibrary showed an increasing trend in the number of identified records over time, while Google Scholar showed an increase until 2019–2020 followed by a rather sharp decrease.
A search for the more complex queries "machine learning" and "cryptocurrency", "machine learning" and "financial market", "machine learning" and "reforestation", and "machine learning" and "wildfires" revealed a clear increasing trend in the number of entries (Figure 9). This finding was consistent across all the search engines examined. Google Scholar was the most comprehensive source of entries, with ScienceDirect in second place. The exception was the search for the terms "machine learning" and "financial market", where Mendeley ranked second in terms of the number of entries.
ANOVA (Table 7) and Tukey's test (Table 8) revealed statistically significant differences.
Table 7. ANOVA results for the term machine learning.
Table 8. Tukey's HSD test for the term machine learning.
DISCUSSION
The analysis confirmed that Google Scholar identifies a significantly higher number of entries than the other search engines. Exceptions to this rule are possible, but extremely rare. These findings are consistent with those of previous research (Bramer 2016, Gusenbauer and Haddaway 2020, Martín-Martín et al. 2021). Coverage is clearly a key factor when selecting a search engine, and Google Scholar is therefore frequently chosen by researchers as the primary data source (Harzing and Alakangas 2016). However, as demonstrated in our own research (Ivanova 2024) and in the research of other authors (Bramer 2016, Gusenbauer and Haddaway 2020), this search engine is not suitable for use as the primary source of information, because its search results vary greatly depending on the search strategy employed (Bramer 2016). Consequently, the reproducibility of its results is suboptimal, which cannot be considered acceptable for rigorous systematic reviews. Guidelines on systematic reviews specify three fundamental quality requirements for a literature search (Livoreil et al. 2017): first, the search should aim to identify all relevant entries (or as many as the author's capabilities allow); second, the search process must be transparent; third, it must be reproducible (Livoreil et al. 2017). There is broad consensus on the importance of adhering to these requirements (Wanyama et al. 2022, Hiebl 2023, Gusenbauer and Gauster 2025). The coverage of a search engine's database is undoubtedly an important criterion, but its relevance depends on whether the search engine can also provide accurate searches. If a search query yields a substantial number of irrelevant records that require meticulous scrutiny and manual deletion, as observed in our previous studies (Ivanova 2024), the use of such a search engine becomes difficult and the search ineffective. It is precisely this issue that has been identified as one of the main disadvantages of Google Scholar, both by our team and by other researchers in the field (Gusenbauer and Haddaway 2020). Consequently, it is not recommended to use Google Scholar as the primary source of information; it can, however, serve as a useful additional source.
A comparison of ScienceDirect and Mendeley revealed that they produce similar results in terms of the number of relevant records found. Furthermore, the quality of the search results on ScienceDirect and Mendeley is significantly higher than that of Google Scholar. As demonstrated by Gusenbauer and Haddaway (2020), ScienceDirect can be selected as the primary search engine. The data presented here show that Mendeley can also function effectively as the primary search engine: the completeness of its search is comparable to that of ScienceDirect, and in the authors' view Mendeley is the more convenient resource of the two. It is characterized by good precision, defined as the percentage of relevant records in the result set, so that the time spent on checking the relevance of records is small. Furthermore, Figures 5–8 show that Mendeley offers a more extensive set of relevant entries than ScienceDirect in the domains of economics and machine learning, while in the field of ecology Mendeley is only marginally inferior to ScienceDirect in the comprehensiveness of its search (Figures 1, 3, 4). In assessing the effectiveness of this resource, however, it is important to recognize a significant limitation of Mendeley: it allows a maximum of 2,000 records to be extracted.
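For reference only (the formula below is not given in the original text), precision as used here corresponds to the standard information-retrieval definition:

```latex
\mathrm{precision} = \frac{\lvert \{\text{relevant records}\} \cap \{\text{retrieved records}\} \rvert}{\lvert \{\text{retrieved records}\} \rvert} \times 100\%
```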
The search engine employed by eLibrary is not as comprehensive as it could be. In terms of ease of use, it is our opinion that eLibrary is significantly inferior to Mendeley. It can thus be concluded that this resource is not suitable for use as the primary search engine. However, eLibrary can be utilized in order to provide a more extensive coverage of Russian-language literature.
CONCLUSIONS
The selection of an appropriate search engine is crucial for the outcome of the research and the quality of the systematic review. The search engines Google Scholar, ScienceDirect, Mendeley and eLibrary show substantial disparities when performing search queries, and it is not recommended to rely on only one search engine in a study. The analysis demonstrated that Google Scholar identifies a substantially higher number of entries than the other search engines. However, using this search engine as the primary source of information is not recommended, primarily because of the suboptimal reproducibility of its results and the considerable effort required to ascertain the relevance of the records. ScienceDirect and Mendeley were shown to produce similar search results, making them potentially suitable as the primary search engine. Google Scholar and eLibrary can be used as additional resources to increase the completeness of coverage of relevant publications. The findings of this research can help researchers select the most appropriate search engine for a specific study, thereby enabling the production of high-quality systematic reviews.
Author Contributions
NI and SI conceived and designed the research, NI and SI processed the data and performed the statistical analysis, NI, SA and SI wrote the manuscript.
Funding
This research was funded by the state assignment of the Institute Botanic Garden, the Ural Branch of the Russian Academy of Sciences.
Conflicts of Interest
The authors declare no conflicts of interest.
REFERENCES
Bramer WM, 2016. Variation in number of hits for complex searches in Google Scholar. Journal of the Medical Library Association 104(2): 143-145. https://doi.org/10.3163/1536-5050.104.2.009.
Falagas ME, Pitsouni EI, Malietzis GA, Pappas G, 2008. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: Strengths and weaknesses. FASEB J 22: 338-342. https://doi.org/10.1096/fj.07-9492LSF.
Gehanno J-F, Rollin L, Darmoni S, 2013. Is the coverage of Google Scholar enough to be used alone for systematic reviews. BMC Med Inform Decis Mak 13: 7. https://doi.org/10.1186/1472-6947-13-7.
Gusenbauer M, 2019. Suitable for systematic reviews and meta-analyses? The capacity of 23 Academic search engines. Academy of Management Proceedings 1: 12759. https://doi.org/10.5465/AMBPP.2019.12759abstract.
Gusenbauer M, 2021. The age of abundant scholarly information and its synthesis–A time when ‘just google it’ is no longer enough. Res Synth Methods 12(6): 684-691. https://doi.org/10.1002/jrsm.1520.
Gusenbauer M, 2022. Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases. Scientometrics 127: 2683-2745. https://doi.org/10.1007/s11192-022-04289-7.
Gusenbauer M, Gauster SP, 2025. How to search for literature in systematic reviews and meta-analyses: A comprehensive step-by-step guide. Technol Forecast Soc 212: 123833. https://doi.org/10.1016/j.techfore.2024.123833.
Gusenbauer M, Haddaway NR, 2020. Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Res Synth Methods 11: 181-217. https://doi.org/10.1002/jrsm.1378.
Harzing A-W, Alakangas S, 2016. Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics 106: 787-804. https://doi.org/10.1007/s11192-015-1798-9.
Hiebl MRW, 2023. Sample selection in systematic literature reviews of management research. Organ Res Methods 26(2): 229-261. https://doi.org/10.1177/1094428120986851.
Hug SE, Braendle MP, 2017. The coverage of Microsoft Academic: Analyzing the publication output of a university. Scientometrics 113: 1551-1571. https://doi.org/10.1007/s11192-017-2535-3.
Ivanova N, 2024. Global overview of the application of the Braun-Blanquet approach in research. Forests 15: 937. https://doi.org/10.3390/f15060937.
Ivanova N, Zolotova E, 2024. Vegetation dynamics studies based on Ellenberg and Landolt indicator values: A Review. Land 13: 1643. https://doi.org/10.3390/land13101643.
Khabsa M, Giles CL, 2014. The number of scholarly documents on the public web. PLoS ONE 9(5): e93949. https://doi.org/10.1371/journal.pone.0093949.
Konno K, Pullin AS, 2020. Assessing the risk of bias in choice of search sources for environmental meta-analyses. Res Synth Methods 11(5): 698-713. https://doi.org/10.1002/jrsm.1433.
Kugley S, Wade A, Thomas J, Mahood Q, Jørgensen A-MK, Hammerstrøm K, Sathe N, 2017. Searching for studies: Guidelines on information retrieval for Campbell Systematic Reviews. Campbell Systematic Reviews 13: 1-73. https://doi.org/10.4073/cmg.2016.1.
Larsen PO, von Ins M, 2010. The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics 84(3): 575-603. https://doi.org/10.1007/s11192-010-0202-z.
Livoreil B, Glanville J, Haddaway NR, et al., 2017. Systematic searching for environmental evidence using multiple tools and sources. Environmental Evidence 6: 1-14. https://doi.org/10.1186/s13750-017-0099-6.
Martín-Martín A, Thelwall M, Orduna-Malea E, Delgado López-Cózar E, 2021. Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: A multidisciplinary comparison of coverage via citations. Scientometrics 126: 871-906. https://doi.org/10.1007/s11192-020-03690-4.
Mongeon P, Paul-Hus A, 2016. The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics 106: 213-228. https://doi.org/10.1007/s11192-015-1765-5.
Singh VK, Singh P, Karmakar M, Leta J, Mayr P, 2021. The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis. Scientometrics 126: 5113-5142. https://doi.org/10.1007/s11192-021-03948-5.
Wanyama SB, McQuaid RW, Kittler M, 2022. Where you search determines what you find: the effects of bibliographic databases on systematic reviews. Int J Soc Res Method 25(3): 409-422. https://doi.org/10.1080/13645579.2021.1892378.
© 2025 by the Croatian Forest Research Institute. This is an Open Access paper distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0).
