This tactic relies on the so-called modus tollens, a form of logical inference widely used in both positivist and interpretive IS research. A second manifestation of incomplete logic is that NHST neglects predictions under the alternative hypothesis H1. A widespread misconception is that rejecting H0 permits accepting a specific H1. But NHST does not require specifying the data that H1 would predict; it only computes probabilities conditional on H0. Rejecting H0 therefore offers no insight into how well the data might fit a general or specific H1. It also favors vaguely defined hypotheses, because these are harder to assess definitively against credible alternatives, and it makes it difficult and unlikely that theories are ever conclusively falsified.

Notably, in the social sciences the vast majority of papers now focus on statistically significant results, often without fully disclosing information about results that do not meet the commonly established thresholds. The risk of such threshold-based reporting is that the publication of negative or insignificant results is impeded, which leads to publication bias: the systematic suppression of research findings because of their small magnitude, statistical insignificance, or contradiction of prior findings or theory. Shifts in academic culture, the availability of scholarly performance metrics, and regulatory moves toward measuring research impact have created ample pressure on academics to publish significant contributions in order to meet expectations for promotion and tenure and to demonstrate research impact.
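The point that NHST conditions only on H0 can be made concrete with a minimal sketch (an illustrative example with assumed numbers, not from the original text): the p-value is computed entirely from the sampling distribution implied by H0, so no alternative hypothesis ever enters the calculation, and a small p-value cannot tell us which alternative generated the data.

```python
# Minimal sketch: the NHST p-value conditions only on H0.
# All numbers (mu values, n, sigma) are hypothetical for illustration.
import random
from statistics import NormalDist, mean

random.seed(42)
n = 100

# Two very different alternatives could have generated the data ...
sample_a = [random.gauss(0.3, 1.0) for _ in range(n)]  # H1:  mu = 0.3
sample_b = [random.gauss(0.6, 1.0) for _ in range(n)]  # H1': mu = 0.6

def p_value_under_h0(sample, mu0=0.0, sigma=1.0):
    """Two-sided z-test p-value. Only H0 (mu = mu0) defines the
    reference distribution; the true generating process of the
    sample never appears in this computation."""
    z = (mean(sample) - mu0) / (sigma / len(sample) ** 0.5)
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Both samples may reject H0, but the p-value alone cannot
# distinguish mu = 0.3 from mu = 0.6 or any other alternative.
print(p_value_under_h0(sample_a), p_value_under_h0(sample_b))
```

Note that the function body never references the alternatives at all; specifying what H1 would predict is simply not part of the procedure.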
One consequence of these pressures has been the emergence of a dominant type of research design in which directional hypotheses are proposed alongside null hypotheses claiming there is no effect. This type of research design has been referred to as the mid-range script: a legitimate, popular, reasonable, and safe way of constructing knowledge with good prospects of publishability, but one that also limits richer theorizing, constrains freedom in relating theory and empirics, and weakens alternative forms of knowledge construction, such as data-driven research or blue-ocean theorizing. A second consequence of the publication pressure in academic culture is the growing prevalence of so-called questionable research practices, which skirt the line between ethical and unethical behavior. The adoption of these practices is often understated, but mounting evidence suggests they are prevalent in academia today. This runs the risk of misconstruing hypotheses that predicted false positives as theory, accounting for what is effectively an illusory effect. It also risks favoring weaker theories that accommodate results post hoc rather than correctly predicting them, which in turn promotes the development of narrow theory at the expense of broader, richer theorizing and inhibits the generation of plausible alternative hypotheses.

The emergence of big data and the growing prevalence of digital trace data (evidence of activities and events that is logged and stored digitally) increasingly allow researchers to obtain very large amounts of data, often to the point that the data collected resemble entire populations, or at least very large fractions of populations. Yet NHST was originally conceived as a small-sample statistical inference technique. In contexts involving population-level digital trace data, statistical inferences become increasingly meaningless because parameters of the data closely or fully resemble the parameters of the studied populations.
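Why inference degenerates at population scale can be sketched with the finite-population correction (an illustrative example with assumed numbers): as the sample approaches the full population, the standard error of the mean shrinks toward zero, so the observed statistic simply is the population parameter and any nonzero difference becomes "statistically significant".

```python
# Minimal sketch: standard error of the mean with the
# finite-population correction. N and sigma are hypothetical.
import math

N = 1_000_000   # population size (e.g., a platform's full user base)
sigma = 1.0     # assumed population standard deviation

def standard_error(n, N, sigma):
    """SE of the sample mean when sampling n of N without
    replacement, using the finite-population correction."""
    fpc = math.sqrt((N - n) / (N - 1))
    return (sigma / math.sqrt(n)) * fpc

# As n approaches N, the standard error collapses toward zero,
# leaving nothing for statistical inference to quantify.
for n in (1_000, 100_000, 999_999, N):
    print(n, standard_error(n, N, sigma))
```

At n = N the correction factor is exactly zero: the "sample" mean coincides with the population mean, and confidence intervals and p-values no longer convey sampling uncertainty.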