Correlations between submission and acceptance of papers in peer review journals

10/13/2019, by Marcel Ausloos, et al.

This paper provides a comparative study of seasonal influences on editorial decisions for papers submitted to two peer review journals. We distinguish a specialized one, the Journal of the Serbian Chemical Society (JSCS), and an interdisciplinary one, Entropy. Dates of electronic submission of about 600 papers to JSCS and 2500 to Entropy have been recorded over 3 recent years. Time series of either accepted or rejected papers are subsequently analyzed. We take either the editors' or the authors' viewpoint into account, thereby considering magnitudes and probabilities. In this sample, it is found that there are distinguishable peaks and dips in the time series, demonstrating preferred months for the submission of papers. It is also found that papers are more likely to be accepted if they are submitted during a few specific months, - these depending on the journal. The probability of having a paper rejected also appears to be seasonally biased. In view of clarifying reports with contradictory findings, we discuss previously proposed conjectures for such effects, like holiday effects and desk rejection by editors. We conclude that, in this sample, the type of journal, specialized or multidisciplinary, seems to be the decisive criterion for distinguishing the outcome rates.


1 Introduction

In the peer review process, two "strategic" questions have to be considered: on one hand, for editors, what is the load due to the number (and what is the relative frequency) of papers submitted at a given time of the year?; on the other hand, for authors, is there any bias in the probability of acceptance of their (by assumption high quality) paper when it is submitted in a given month, because of the (to be polite) mood of editors and/or reviewers? A study of such a time concentration (and dispersion) of submitted papers and their subsequent acceptance (or rejection) seems appropriate from a scientometrics point of view, in line with recently found "effects" known through the media, like coercive citations or faked research reports.

In fact, the mentioned question of paper submission timing is of renewed interest nowadays in informetrics and bibliometrics, due to the flurry of new journals published by electronic means. Moreover, the paper acceptance rate is of great concern to authors, who may perceive bias at certain times. Needless to say, the peer review process is sometimes slow, with reasons found in the editors' and reviewers' workload, whence in the difficulty of finding reviewers. Tied to such questions are the open access policy and the submission fees imposed by publishers, on one hand, but also doubts or constraints about the efficiency in managing the peer review of scientific manuscripts, from the editors' perspective [1] and from the authors' [2]. Thus, one may wonder if there is some "seasonal" or "day of the week" effect.

Very recently, Boja et al. [3], in this journal, showed that "the day of the week when a paper is submitted to a peer reviewed journal correlates with whether that paper is accepted", when looking at a huge set of cases for high Impact Factor journals. However, there was no study of rejected papers.

From the seasonal point of view, Shalvi et al. [4] earlier discussed the case of monthly electronic submission frequencies to two psychology journals, Psychological Science (PS) and Personality and Social Psychology Bulletin (PSPB), over 4 and 3 years, respectively. Shalvi et al. [4] found a discrepancy between the "submission-per-month" and "acceptance-per-month" patterns for PS, - but not for PSPB. More papers were submitted to PS during "summer months", but no seasonal bias effect (based on a test for percentages) was found in the subsequent acceptance; nevertheless, the percentage of accepted papers submitted in Nov. and Dec. was found to be very low. In contrast, many papers were submitted to PSPB during "winter months", followed by a dip in April, but the percentage of published papers was found to be greater if the submission to PSPB occurred in [Aug.-Sept.-Oct.]. Moreover, a marked "acceptance success dip" occurred if the submission took place in "winter months". The main difference between such patterns was conjectured to stem from different rejection policies, i.e. employing desk rejection or not.

Later, Schreiber [5] examined submissions to a specialized journal, Europhysics Letters (EPL), over 12 years. He observed that the number of submitted manuscripts had been steadily increasing while the number of accepted manuscripts had grown more slowly. He claimed to find no statistical effect. However, from Table 2 in [5], there is a clearly visible maximum in the number of submissions in July, more than 10% above the yearly mean, and a marked dip in submissions in February, - even taking the "short month length" into account. Examining the acceptance rate (roughly ranging between 45 and 55%, according to the month of submission), he concluded that strong fluctuations can be seen between different months. One detects a maximum in July and a minimum in January for the most recent years.

Alikhan et al. [6] had a similar concern: they compiled submissions, in 2008, to 20 journals pertaining to dermatology. It was found that May was the least popular month, while July was the most popular. We have estimated a χ² value from the Fig. 1 data in Alikhan et al. [6], thereby suggesting a far from uniform distribution. There is no information on acceptance rates in [6].

Other papers have appeared claiming to discuss seasonal or similar effects, drawing conclusions from fluctuations but finding no effect, based on standard deviation arguments, - instead of χ² tests. Yet, it should be obvious to the reader that a χ² test performs better in order to find whether a distribution is uniform or not, - our research question. In contrast, a test based on the standard deviation and the confidence interval can only allow some claim about some percentage deviation of (month) outliers; furthermore, such studies tacitly assume normality of the (submission or acceptance) fluctuation distributions, - which is far from being the case. Usually, the skewness and kurtosis of the distributions, mandatory complements, are not provided in such authors' "fluctuation studies".

In order to contribute answers to the question of a "monthly bias", we have been fortunate to get access to data on submitted, and later either accepted or rejected, papers for a specialized (chemistry) scientific journal and for a multidisciplinary journal. Two coauthors of the present report, ON and AD, are Sub-Editor and Manager of the Journal of the Serbian Chemical Society (JSCS). One coauthor, MA, is a member of the editorial board of Entropy. It will be seen that comparing features of these two journals allows one to lift some of the veil on the apparent discrepancies reported in other cases.

Thus, here below, we explore the fate of papers submitted for peer review during a given month, including their publication fate. We find that, in the cases at hand, fluctuations of course occur from one year to another. However, for JSCS, submission peaks occur in July and September, while many fewer papers are submitted in May and December. For Entropy, a marked dip in submissions occurs in August, - the largest numbers of submissions occur in October and December.

However, if the number of submitted papers is relevant for editors and handling machinery, the probability of acceptance (and rejection) much concerns authors. Relative to the number of submitted papers, it is shown that more papers are accepted for publication if they are submitted in January (and February), - but less so if submitted in December, for JSCS; the highest rejection rates occur for papers submitted in December and March. For Entropy, the acceptance rate is lowest in June and December, but is high for papers submitted during the spring months, February to May. Statistical tests, e.g. χ² and confidence intervals, are provided to ensure the validity of the findings.

Due to different desk rejection policies, and in order to discuss the effect of such policies as conjectured in [4], we discuss a possible specific determinant for the JSCS data: possible effects of religious or holiday bias (in Serbia) are commented upon.

2 Data

The JSCS and Entropy peer review processes are both mainly managed electronically, - whence the editorial work is only weakly tied to the editors' working days. (N.B. Nevertheless, there are day of the week effects [7].)

2.1 The Journal of the Serbian Chemical Society

JSCS contains 14 sub-sections and many sub-editors, as can be seen from the journal website http://shd.org.rs/JSCS/.

The (36 data point) time series of the monthly submissions to JSCS in a given month (m) of year y, for y in {2012, 2013, 2014}, is shown in Fig. 1. The total number of submissions decreased by about 15% from y = 2012 or 2013 (317 or 322 papers) to y = 2014.

Next, let us call N_a the number of papers later accepted and N_r the number rejected. Among the total number N_s = 913 of submitted papers, N_a = 424 (= 162 + 146 + 116) were finally accepted for publication. In view of further discussion, let it be pointed out that among the total number N_r = 474 (= 149 + 172 + 153) of rejected papers (by peer review or otherwise), i.e. 52%, 202 papers (N_d) were desk rejected, without going to a peer review process, i.e. 22.1%. For completeness, let it be recorded that several papers were rejected because the authors did not reply to the reviewers' remarks in due time and a few submissions were withdrawn. (Thus, N_a + N_r ≠ N_s: 424 + 474 ≠ 913.)

The time series of the positive fate (acceptance) of papers submitted in a specific month is also shown in Fig. 1.

Figure 1: Time series of (left) the number of submitted papers and (right) the number of accepted papers, when submitted to JSCS during a given month (m) in 2012, 2013, and 2014.

The statistical characteristics of the N_s, N_a, N_r, and N_d distributions for JSCS are given in Table 1 - Table 4. (Hereafter, the indices m and y are not written, for simplicity, if there is no ambiguity.)

2.2 Entropy

Entropy covers research on all aspects of entropy and information studies. The journal home page is http://www.mdpi.com/journal/entropy.

The (36 data point) time series of the monthly submissions to Entropy over the years 2014, 2015, and 2016 is shown in Fig. 2. The number of submissions increased by about 60% from 2014 to 2015 (604 to 961 papers), but not by much (about 5%) between 2015 and 2016.

Figure 2: Time series of (left) the number of submitted papers and (right) the number of accepted papers, when submitted to Entropy during a given month in 2014, 2015, and 2016.

Among the N_s = 2573 submitted papers, N_a were finally accepted for publication. The time series of the positive fate (acceptance) of papers submitted in a specific month is also shown in Fig. 2.

In view of further discussion below, let it be pointed out that there were (110 + 162 + 246 =) 518 peer review rejected papers, i.e. 20.1%; 805 papers were desk rejected at submission, i.e. 31.2%.

The statistical characteristics of the N_s, N_a, N_r, and N_d distributions for Entropy are given in Table 5 - Table 8.

3 Data analysis

The most important value to discuss is the calculated χ², for checking whether or not the distribution is uniform over the whole year.

Notice that we can discuss the data not only comparing different years, but also through the cumulated data, N_s(m) = Σ_y N_s(m, y), and similarly for N_a, N_r, and N_d, as if all years were "equivalent". For further analysis, we provide the statistical characteristics of the cumulated distributions in Table 1 - Table 8.

We have also taken into account that months have a different number of days, normalizing all months as if they were 31 days long (including the special case of February in 2016). The fact that the number of papers is then no longer an integer is not a drastic point; more importantly, such a data manipulation does not at all disagree with our conclusions below. Thus, we do not report results from such a "data normalization".
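As an illustration of both the year-cumulation and the month-length normalization just described, the following sketch (with hypothetical counts, not the journals' data) rescales each month to 31 days before re-running the χ² test.

```python
import numpy as np
from scipy import stats

# Hypothetical 3 x 12 table of monthly submissions (rows = years).
N = np.array([[30, 25, 28, 22, 18, 27, 45, 24, 41, 29, 26, 17],
              [33, 21, 30, 25, 20, 24, 48, 22, 44, 31, 28, 19],
              [28, 24, 26, 20, 17, 25, 41, 20, 39, 27, 24, 16]])

cumulated = N.sum(axis=0)                    # N(m) = sum over years y
days = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
normalized = cumulated * 31.0 / days         # as if every month had 31 days

for label, series in [("raw", cumulated), ("normalized", normalized)]:
    # Expected value under the uniform null: the series' own mean,
    # so that observed and expected totals match, as scipy requires.
    chi2_stat, p = stats.chisquare(series, f_exp=np.full(12, series.mean()))
    print(f"{label:>10}: chi2 = {chi2_stat:.2f}, p = {p:.4g}")
```

The non-integer "counts" produced by the rescaling are harmless here, since the χ² statistic only compares observed and expected magnitudes.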

3.1 JSCS Data analysis

In all JSCS cases, the mean of each distribution decreases from 2013 to 2014; so does the standard deviation σ. This also holds for the cumulated time series data, which necessarily differ from the yearly ones. The coefficient of variation (CV = σ/μ) is always quite small, indicating that the data is reliable beyond statistical sampling errors. (The coefficient of variation is usually used to compare distributions, even if the means are drastically different from one another; its value also points toward a possible anomalous spread of the distribution or a multipeak effect.) Each coefficient of variation, for N_s, N_a, N_r, or N_d, is lower for the cumulated data than the other CVs; this a posteriori points to the (statistical) interest of accumulating data for each month of different years, - besides looking at the more dispersed data over a long time span.

Next, observe the summary of statistical characteristics in Table 1 - Table 4; they show that the distributions are positively skewed, except those of the submitted papers, which are negatively skewed. The kurtosis of each distribution is usually negative, except for a few anomalous cases (see the Tables). It can be concluded that the distributions are quite asymmetric, far from a Gaussian, but rather peaked.

Almost all measured values fall within the classical confidence interval. However, in five cases, a few extreme values fall above the upper limit, as can be deduced from the Tables.

Finally, notice that all χ² values reported in Table 1 - Table 4 are much larger than the 95% critical value: they markedly allow one to reject the null hypothesis, i.e. a uniform distribution, for each examined case. Thus a monthly effect exists beyond statistical errors for all N_s, N_a, N_r, and N_d cases.

3.2 Entropy Data analysis

In the case of Entropy data, the CV is usually low, - and much lower than in the case of JSCS. The skewness and kurtosis are not systematically positive or negative. The number of outliers outside the confidence interval is also "not negligible"; this is hinted at by the number of maximum and minimum values falling outside the confidence interval, yet "not too far" from the relevant interval border. Nevertheless, this implies that the distribution behaviors are influenced by the number of data points, to a larger extent for Entropy than for JSCS.

Nevertheless, notice that all χ² values reported in Table 5 - Table 8 are also much larger than the 95% critical value: they markedly allow one to reject the null hypothesis, i.e. a uniform distribution, for each examined case. A month anomaly effect exists beyond statistical errors in all cases, though it is weaker for some of them. The large χ² values obviously point to distinguishable peaks and dips, thereby markedly promoting the view of a monthly bias effect.

4 Discussion

Let us first recall that the journals examined here have different aims: one is a specialized journal, the other an interdisciplinary journal. To our knowledge, this is the first time that a journal with such "broadness" is considered within the question of monthly bias. It seems that one should expect an averaging effect, due to the varied constraints on research schedules pertaining to different topics and data sources. One subquestion is whether focussed research influences the timing of paper submission, and later acceptance (or rejection). One would expect more bias in the JSCS case than in the Entropy case. Comparing journals (in psychology) with different "specializations", Shalvi et al. [4] had indeed found different behaviors. Let us observe what anomalies are found in the present cases.

4.1 JSCS

Comparing months in 2012, 2013 and 2014, it can be noticed that the most similar months (the least change of positions in the decreasing order of "importance") are Dec., May, and June for the lowest submission rate, while Sept. and July are those remaining at the top of the month list, for the highest submission rate; see the figures. A specific deduction seems implied: there is a steady academic production of papers immediately before and after holidays, but a quieter (production and) submission of papers in the months preceding holidays. This finding of July production relevance is rather similar to that found for most other journals, - except PSPB [4].

Concerning the May dip anomaly, one may recall that in most countries (including Serbia), lectures and practical work at faculties end by June; since many authors (professors, assistants) are very engaged with students at that time, May is probably not the month when they are focused on writing papers; rather, they "prefer" finishing regular duties. In fact, corroborating this remark, it has been observed that most papers submitted to JSCS come from academia researchers [8].

The huge peak in January 2013 is intriguing. We searched for whether something special occurred in January 2013; it was checked that the submission system worked properly: there was no special clogging a month before. Moreover, there were no special invitations or collections of articles for a special issue. Therefore, the peak can be correlated to that found for PS. From such a concordance, it seems that more quantitative correlation aspects could be searched for through available data.

Notice that, on a month rank basis, the Kendall coefficient τ can be computed between 2013 and 2014 for submitted papers and for accepted papers, and likewise for the correlation between the cumulated N_s and N_a.
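Such a month-rank comparison takes only a few lines; the sketch below (with hypothetical counts, not the JSCS data) computes Kendall's τ between two years of submissions and between cumulated submissions and acceptances.

```python
from scipy import stats

# Hypothetical monthly counts (Jan..Dec); not the JSCS data.
subs_2013 = [30, 25, 28, 22, 18, 27, 45, 24, 41, 29, 26, 17]
subs_2014 = [28, 24, 26, 20, 17, 25, 41, 20, 39, 27, 24, 16]
sub_cum = [91, 70, 84, 67, 55, 76, 134, 66, 124, 87, 78, 52]
acc_cum = [40, 35, 30, 28, 25, 27, 38, 26, 42, 36, 30, 20]

# Kendall's tau compares the month rankings induced by each series.
tau_years, p_years = stats.kendalltau(subs_2013, subs_2014)
tau_sa, p_sa = stats.kendalltau(sub_cum, acc_cum)
print(f"tau(2013 vs 2014 submissions) = {tau_years:.3f} (p = {p_years:.3f})")
print(f"tau(cumulated N_s vs N_a)     = {tau_sa:.3f} (p = {p_sa:.3f})")
```

A τ near +1 means the months keep their relative order across the two series; a value near 0 means the rankings are essentially unrelated.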

Two other points related to JSCS are discussed in Sect. 5.1 and 6: (i) the possible influence of the desk rejection policy, a conjecture of Shalvi et al. [4] for distinguishing patterns, and (ii) the acceptance and rejection rates, which are tied to the submission patterns, but also pertain to the "entrance barrier" (editor load/mood) conjecture proposed by Schreiber [5].

4.2 Entropy

In the case of Entropy, the cumulated measure (over the 3 years examined here) points to more frequent submissions in December, and a big dip in August. From a more general viewpoint, there are more papers submitted during the last 3 months of the year. A marked contrast occurs for the accepted papers, for which a wide dip exists over 4 months, from June till September. The discussion on desk rejection and the better chance of acceptance is also found in Sect. 5.1 and 6.

Notice that, for the correlation between the cumulated N_s and N_a, a Kendall coefficient τ can be computed as well.

Finally, comparing the cumulated numbers of submitted and accepted papers to JSCS and to Entropy, and ranking the months accordingly, the Kendall coefficient is weakly negative: -0.333 and -0.1818, respectively.

5 Constraint determinants

5.1 Seasonal desk rejection by editor

Desk rejection is often controversial, even frowned upon; the desk rejection patterns at JSCS and Entropy can be discussed now. Table 4 and Table 8 provide the relevant data, respectively. Notice that, for either JSCS or Entropy, we do not discuss the reasons why editors (and reviewers) reject papers; these reasons are outside the present considerations; see [9, 10, 11] for some information.

Let us consider JSCS first. It can be observed that "only" 27% (160/596) of papers are desk rejected, - this is interestingly compared to the papers rejected after peer review: 325/596, i.e. about 55% (the ratio of the two is thus about 0.49). The highest desk rejection rate occurs for papers submitted in Nov., while the lowest is for those submitted in May; see Fig. 3. Distinguishing years, a high rejection rate occurs for papers submitted in Nov. 2014 and Aug. 2013, while a low rejection rate occurred for papers submitted in February and May 2013.

Figure 3: Aggregated distribution of the number of desk rejected papers when submitted (left) to JSCS during a given month, in 2012 or in 2013 or in 2014 and (right) to Entropy during a given month in 2014 or in 2015, or in 2016.

There are no apparent month correlations. For example, the month with the greatest number of submissions (overall) is Sept.; the rejection rate in Sept. 2013 was 0.469, out of which 0.250 were desk rejected. In Sept. 2014, these percentages were 0.555 and 0.333. On the other hand, the month with the lowest number of submissions is May. In May 2013, the rejection rate was 0.500, but the desk rejection rate was only 0.111. In May 2014, the rejection rate was 0.562, and the desk rejection rate was 0.250.

For completeness in arguing, let it be known that official holidays in Serbia are on Jan. 1-2 and 7 (Christmas day), Feb. 15-16, in April (usually) one Friday and one Monday (Easter holiday), May 1-2, and Nov. 11, - at which time one does not expect editors to be on duty for desk rejection.

Next, concerning Entropy, 31% (805/2573) of papers are desk rejected at submission, much more than those rejected by the reviewers (and the editor), i.e. 20% (518/2573). The greatest desk rejection numbers occur in December and January, - the lowest in February, May, and August. However, in terms of the percentage of desk rejections with respect to the number of submitted papers, the months of December, September and June are the most probable, while in February and May the editors seem more lenient.

Conclusions: there seems to be no effect of holidays on the editorial workflow, as the months most often containing holidays (January, July and August) exhibit no special statistical anomaly, - with respect to either submission or decision rate as compared to other months, for JSCS. Yet, the χ² is quite large (16.55; see Table 4). Thus, the seasonal effect might have another origin. The Entropy data distribution is even more uniform (χ² = 6.52; see Table 8). If any, some seasonal effect on N_d might occur during winter time.

5.2 Entrance barrier editor load effect

Schreiber [5] considers that an entrance barrier can be set up by editors due to their workload. We understand such a bias as resulting from an accumulation of submitted papers at some time, thereafter correlated with a large rate of desk rejection. One can, without much restriction, assume that the correlation has to be observed at zero month-time lag, since both journals usually reply quickly to authors.

A visual comparison of the correlation between the number of desk rejected papers and the number of submitted papers to JSCS during a given month, distinguishing 2013 from 2014, or to Entropy during a given month in 2014, 2015, or 2016, is shown in Fig. 4. For JSCS, the number of desk rejected papers is roughly proportional to the number of submissions during a given month, near the ratio already noticed above, - except in a few months, when the N_d/N_s ratio can be as large as 30 - 50%. However, both in the fall of 2013 and the spring-summer of 2014, there are months for which N_s is large but N_d is low, casting doubt on an editor barrier load effect.

For Entropy, it appears that there are two clusters, separated by borders in N_s and N_d. When N_s exceeds the border value, the number of desk rejected papers increases much more than proportionally. That was surely the case in 2015.
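One way to operationalize this reading of Fig. 4 is sketched below (hypothetical counts; the 1.5x cut on the typical ratio is an arbitrary illustration, not the authors' criterion): flag months whose desk rejection ratio N_d/N_s jumps well above its typical level.

```python
import numpy as np

# Hypothetical monthly totals; not the journals' data.
N_s = np.array([91, 70, 84, 67, 55, 76, 134, 66, 124, 87, 78, 52])
N_d = np.array([20, 14, 18, 15, 11, 17, 48, 13, 44, 19, 16, 12])

ratio = N_d / N_s                      # monthly desk rejection ratio
typical = np.median(ratio)             # robust "typical" level
flagged = np.where(ratio > 1.5 * typical)[0] + 1   # arbitrary 1.5x cut
print("desk rejection ratios:", np.round(ratio, 2))
print(f"typical ratio ~ {typical:.2f}; months well above it: {flagged}")
```

Months flagged this way in high-submission periods would be consistent with an entrance barrier; high ratios in low-submission months would argue against it.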

Conclusions: JSCS or Entropy editors may raise some entrance barrier due to overload, whatever the season.

Figure 4: Entrance barrier load conjecture effect. Visual correlation between the number of desk rejected papers and the number of submitted papers to (left) JSCS during a given month, between 2012 and 2014 or to (right) Entropy during a given month between 2014 and 2016.

6 Optimal submission month, - for later paper acceptance

The above data and discussion on the number of papers are relevant for editors and the automatic handling of papers. Of course, this holds partially true as well for authors, who do not want to overload editorial desks with many submissions at a given time, since authors expect a rather quick (and positive) decision on their submission. However, another point is of great interest for authors, somewhat bearing on the reviewers' and desk editors' mood. The most relevant question, on a possible seasonal bias, for authors, is whether a paper has a greater chance of being accepted if submitted during a given month. Thus, the probability of acceptance, the so called "acceptance rate", is a relevant variable to be studied!

The relative number (i.e., monthly percentage) of papers accepted or rejected after submission in a specific month, p_a(m, y) = N_a(m, y)/N_s(m, y) or p_r(m, y) = N_r(m, y)/N_s(m, y), is easily obtained from the figures. The months (m) can be ranked, e.g. in decreasing order of importance, according to such a relative probability (thereafter called p_a) of having a paper accepted if submitted in a given month (m) to JSCS or to Entropy in given years; see Table 9. One easily obtains the corresponding p_r for rejected papers; see Table 10. This holds true for any yearly time series, whence allowing one to compare journals according to

D_y(m) = p_a(m, y) - p_r(m, y) = [N_a(m, y) - N_r(m, y)] / N_s(m, y).    (6.1)

One could also consider

D(m) = [Σ_y N_a(m, y) - Σ_y N_r(m, y)] / Σ_y N_s(m, y)    (6.2)

for the corresponding cumulated data over each specific time interval. A comment on the matter is postponed to the Appendix.
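For concreteness, the following sketch (with hypothetical yearly tables, not the journals' data) evaluates Eqs. (6.1) and (6.2) directly from monthly counts of accepted, rejected, and submitted papers.

```python
import numpy as np

# Hypothetical counts: rows = years, columns = months (Jan..Dec).
N_a = np.array([[12, 10, 9, 8, 7, 9, 15, 8, 14, 11, 10, 6],
                [11, 9, 10, 7, 6, 8, 14, 7, 13, 10, 9, 5]])
N_r = np.array([[10, 8, 11, 9, 8, 10, 18, 9, 17, 11, 10, 8],
                [10, 9, 10, 8, 7, 9, 16, 8, 16, 10, 9, 7]])
N_s = np.array([[30, 25, 28, 22, 18, 27, 45, 24, 41, 29, 26, 17],
                [28, 24, 26, 20, 17, 25, 41, 20, 39, 27, 24, 16]])

D_y = (N_a - N_r) / N_s                        # Eq. (6.1): one curve per year
D = (N_a.sum(0) - N_r.sum(0)) / N_s.sum(0)     # Eq. (6.2): cumulated over years
print("per-year D_y(m):\n", np.round(D_y, 3))
print("cumulated D(m):\n", np.round(D, 3))
```

A positive D(m) means that a paper submitted in month m is more likely to be accepted than rejected; a negative value means the opposite.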

6.1 JSCS case

The relevant percentage differences between the numbers of accepted and rejected papers submitted to JSCS in 2013 and 2014 are given in Fig. 5.

From this difference-in-probability perspective, it does not seem recommended that authors submit their paper to JSCS in March or December. They should rather submit their papers in January, with some non-negligible statistical chance of acceptance for submissions in February or October.

Figure 5: Monthly aggregated percentage difference between the accepted (N_a) and rejected (N_r) numbers of papers, normalized to the number of submitted papers, in a given month, (left) to JSCS over [2012 - 2014], (right) to Entropy over [2014 - 2016].

6.2 Entropy case

For Entropy, an equivalent calculation of D(m) can be made, - from aggregating the data in Fig. 2 over a 12 month interval, leading to Fig. 5. Even though the best percentage of accepted papers occurs if the papers are submitted from January till May (with a steady increase, in fact) and in October and November, the percentage of papers submitted in December is the largest of the year, while the probability of acceptance is the lowest for such papers.

Thus, a marked dip in acceptance probability occurs if the papers are submitted during the summer months [June - Sept.], as seen in Fig. 5, suggesting that one avoid such months for submission to Entropy.

7 Warnings and Discussion

For fully testing seasonal effects, one might argue that one should correlate the acceptance/rejection outcome to the hemisphere, and/or to the nationality of authors, and notice the influence of co-authors. (The nationalities of editors might also be of concern: most are Serbian for JSCS; the variety is large for Entropy.)

We apologize for not having searched for the affiliations (in either the southern or the northern hemisphere, - since seasons are different) of submitting authors to Entropy; we expect that such a "hemisphere effect", if it exists, is hidden in the statistical error bar of the sample. Concerning the nationalities of authors (and reviewers) of JSCS in the period Nov. 2009 - Oct. 2014, those have been discussed by Nedic and Dekanski [8]; see Fig. 3 and Fig. 2, respectively, in [8].

For completeness, data on papers submitted, accepted, rejected, or withdrawn, to JSCS from mainly Serbian authors and from authors outside Serbia, in given years, can be found in Table 11. From such a reading, it appears that JSCS editors are fair, not biased in favor of or against papers whose corresponding author is from Serbia.

At this level, more importantly, a comparison resulting from the observation of Fig. 5 points to a marked difference between a specialized journal and a multidisciplinary one, - at least from the editorial aim and the peer reviewers' points of view. The difference between the probability of acceptance and that of rejection on a monthly basis, i.e. D(m), has an astoundingly different behavior: the value is positive over only 3 months for JSCS, but is always positive for Entropy. This can be interpreted in terms of peer review control. Recall that the percentage of desk rejection is approximately the same for JSCS and Entropy, but the peer review rejection is much higher (about 55%) for JSCS, in contrast with a reviewer rejection rate of about 20% for Entropy. In terms of seasonal effect, one has a positive value in January (and February) for JSCS, but a positive effect in the spring and fall months for Entropy. We consider that such a spread is likely due to the multidisciplinary nature of the latter journal, reducing the strong monthly and seasonal bias on the fate of a paper.

8 Conclusion

Two remarks seem to be of interest for attempting some understanding of these different findings. On one hand, statistical procedures (either χ² or confidence interval bounds) need not lead to identical conclusions: both can point to deviations, but the former indicates the presence (or absence) of peaks and dips with respect to the uniform distribution, while the latter points to statistical deviations when the distribution of residuals is expected to be Gaussian-like. In the latter case, an extension of the discussion including skewness and kurtosis is mandatory [12]. We have pointed out such departures from Gaussianity. The second remark is that the monthly and/or seasonal bias, in view of the contradistinctions hereby found between the chemistry and the multidisciplinary journal, might not be mainly due to desk rejection effects, as proposed by Shalvi et al. [4], but rather pertains to the peer reviewers having different statuses within the journals' aims spread.

In so doing, by considering two somewhat "modest, but reliable" journals, we have demonstrated seasonal effects in paper submission and also in subsequent acceptance. (The acceptance rate is often reported by publishers and editors as a "quality criterion" for a given journal. It is easy to find from the above data that it is about 0.44 and 0.49 for JSCS and Entropy, respectively; JSCS had an impact factor (IF) of 0.970 in 2015, which is the result of articles published in [2013 - 2014]; its 5-year IF is 0.997, and h(2017) = 33; Entropy had an IF of 1.821 (2016) and a 5-year IF of 1.947 (2016), while h(2017) = 37.) A seasonal bias effect is stronger in the specialized journal. Recall that one can usually read when an accepted paper was submitted, but the missing set, the rejected papers and their submission dates, is usually unknown. Thanks to our editorial status, we have provided a statistical analysis of such information. Our outlined findings and intrinsic behavioral hypotheses markedly take into account the scientific work environment and point, in the present cases, to seasonal bias effects, mainly due to authors, in the submission/acceptance stage of the peer review process.

In order to go beyond our observations, we are aware that more data must be made available by editors and/or publishers. Avoiding debatable hypotheses on the quality of papers, the ranking of journals, the fame of editors, open access publicity, submission fees, publication charges, and so on, we may suggest more work on time lag effects, beyond Mrowinski et al. [13, 14], in order to better pinpoint the role of both editors' and reviewers' quality and concern. In so doing, it might be wise to consider some ARCH-like modeling of seasonal effects, as has been done for observing the day of the week effect in paper submission/acceptance/rejection to/in/by peer review journals [15]. This suggestion of ARCH econometric-like modeling is further supported by arguments in related bibliometric studies. Indeed, one could develop a Black–Scholes–Schrödinger–Zipf–Mandelbrot model framework for studying seasonal effects, - instead of the coauthor core score as in [16].


Acknowledgements

MA greatly thanks the MDPI Entropy Editorial staff for gathering and cleaning up the raw data, and in particular Yuejiao Hu, Managing Editor.

Appendix

In this Appendix, we discuss Eq. (6.2), graphically displayed in Fig. 6. In some sense, this equation assumes that all years are equivalent and that data for each month can be superposed whatever the year. It has been shown in the main text that such an aggregation process leads to a more comfortable statistical analysis.

The y-axis scales appear to be markedly different in Fig. 5 and Fig. 6. However, the patterns are very similar, thus a priori allowing for such an aggregation process. The conclusions on seasonal effects drawn from both figures or equations, Eq. (6.1) and Eq. (6.2), are therefore qualitatively similar.

Figure 6: Difference between the accepted (N_a) and rejected (N_r) percentages of papers, when the numbers of such papers are aggregated in a given month over the examined time interval, (left) over [2012 - 2014] for JSCS or (right) over [2014 - 2016] for Entropy.

References

  • [1] Nedić, O., Drvenica, I., Ausloos, M., & Dekanski, A. B. (2018). Efficiency in managing peer-review of scientific manuscripts–editors’ perspective. J. Serb. Chem. Soc. 83, 1391-1405.
  • [2] Drvenica, I., Bravo, G., Vejmelka, L., Dekanski, A., & Nedić, O. (2019). Peer Review of Reviewers: The Author’s Perspective. Publications 7, 1.
  • [3] Boja, C. E., Herţeliu, C., Dârdală, M., & Ileanu, B. V. (2018). Day of the week submission effect for accepted papers in Physica A, PLOS ONE, Nature and Cell. Scientometrics 117, 887-918.
  • [4] Shalvi, S., Baas, M., Handgraaf, M.J.J., & De Dreu, C.K.W. (2010). "Write when hot - submit when not: seasonal bias in peer review or acceptance?" Learned Publishing 23, 117-123.
  • [5] Schreiber, M. (2012). ”Seasonal bias in editorial decisions for a physics journal: you should write when you like, but submit in July”. Learned Publishing 25, 145-151.
  • [6] Alikhan, A., Karan, A., Feldman, S.R., & Maibach, H.I. (2011). ”Seasonal variations in dermatology manuscript submission”. Journal of Dermatological Treatment 22, 60.
  • [7] Ausloos, M., Nedic, O., & Dekanski, A. (2016). ”Day of the week effect in paper submission/acceptance/rejection to/in/by peer review journals”. Physica A 456, 197-203.
  • [8] Nedic, O. & Dekanski, A. (2015). ”A survey on publishing policies of the Journal of the Serbian Chemical Society - On the occasion of the 80th volume”. J. Serb. Chem. Soc. 80, 959-969.
  • [9] Callaham, M.L., Baxt, W.J., Waeckerie, J.F., & Wears, R.L. (1998). ”Reliability of Editor’s subjective quality ratings of Peer reviews of manuscripts”. JAMA, 280, 229-231, and refs therein.
  • [10] Cole, S., Cole, J.R., & Simon, G.A. (1981). ”Chance and Consensus in Peer Review”. Science 214, 881-886.
  • [11] Hargens, L.L. (1988). "Scholarly Consensus and Journal Rejection Rates". American Sociological Review 53, 139-151.
  • [12] Doane, D.P. & Seward, L.E. (2011), Applied Statistics in Business and Economics, 3rd ed., McGraw-Hill/Irwin pp. 154-156.
  • [13] Mrowinski, M.J., Fronczak, A., Fronczak, P., Nedic, O., & Ausloos, M. (2016). ”Review times in peer review: quantitative analysis and modelling of editorial work flows”, Scientometrics 107, 271-286.
  • [14] Mrowinski, M.J., Fronczak, A., Fronczak, P., Nedic, O., & Ausloos, M. (2017). "Artificial intelligence in peer review: how can evolutionary computation support journal editors?". PLOS ONE 12, e0184711.
  • [15] Ausloos, M., Nedic, O., Dekanski, A., Mrowinski, M.J., Fronczak, A., & Fronczak, P. (2017). "Day of the week effect in paper submission/acceptance/rejection to/in/by peer review journals. II. An ARCH econometric-like modeling". Physica A: Statistical Mechanics and its Applications 468, 462-474.
  • [16] Rotundo, G. (2014). ”Black–Scholes–Schrödinger–Zipf–Mandelbrot model framework for improving a study of the coauthor core score”. Physica A: Statistical Mechanics and its Applications, 404, 296–301.