The advanced technology and unprecedented computing power of the 21st century have done wonders for our never-ending pursuit of understanding the world around us. Scientific discovery has always followed a process of formulating a theory to explain an observed phenomenon, then devising an experiment to test that theory and, perhaps, improve upon it. It is a perpetual feedback loop that has led to many extraordinary discoveries and sophisticated theories. The scientific method is now being revolutionized as we enter a new era of experimentation, one that merges with computer science. In a world still highly dependent on a handful of state-of-the-art laboratories where much of the most significant research gets done, large gaps of productivity are left unfilled in many areas of study. Merging modeling and simulation with the scientific method is crucial to producing the most significant experimental results, while simultaneously offering an accessible platform for scientists to ground their research.
Modeling and simulation live at the intersection of theory and experimentation. On one hand, a model and its implementation in software, such as a simulation tool, are an expression of theory. On the other hand, executing that software, i.e., running the simulation, and collecting its results can be viewed as a virtual experiment. It is worth noting that modeling and simulation should never be viewed as substitutes for real-world experimentation. Rather, they are valued in scientific discovery for providing insights that are often impractical or impossible to obtain through real-world experiments or theoretical analysis. Better yet, modern laptops are more than capable of running simulation software that explores complex scenarios: muons traveling through our atmosphere, X-rays passing through the human body, or gamma rays interacting with a gas in a controlled chamber.
As we peer deeper into the smallest and largest scales of the universe, we tend to spend large sums of money on new equipment that gives us the capacity to make measurements that would otherwise be impossible. Post-World War II money and prestige encouraged the construction of powerful, expensive particle accelerators, housed at facilities including the Stanford Linear Accelerator Center (SLAC) in Menlo Park, California; Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois; the European Organization for Nuclear Research (CERN) near Geneva; and several others. This was a drastic move away from traditional tabletop experimentation, and it brought its own struggles in how we go about conducting an experiment. Researchers from across the world spend months, even years, pitching proposals just to acquire a designated block of time at these facilities. Setting up a well-controlled experiment, let alone executing it, is an arduous task. This is why we need detailed simulations to predict the outcome of physical experiments before they run.
Modern computers can simulate events many orders of magnitude more numerous than what we could realize in the real world, at a fraction of the cost. Determining how likely certain outcomes are in the presence of unknowns is how we reduce uncertainty in both computational and real-world applications. For example, if we want to predict the deceleration of a human body in a head-on collision between two cars, then even if the speed is known, small differences in the manufacturing of individual cars, in how tightly each bolt has been fastened, and in other slight variations will lead to different results that can only be predicted statistically. There are many types of computer simulation; their common feature is the attempt to generate a sample of representative scenarios for a model when a complete enumeration of all possible states would be prohibitive.
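The collision example can be sketched as a small Monte Carlo simulation. The physics model, the stiffness parameter, and its spread below are all hypothetical illustrations, not real crash data; the point is that propagating small manufacturing variations through even a toy model yields a distribution of outcomes rather than a single answer:

```python
import random
import statistics

def crash_deceleration(speed_mps, stiffness):
    """Toy model: average deceleration of a car stopping over a crush
    distance that shrinks as structural stiffness grows (hypothetical)."""
    crush_distance = 1.0 / stiffness       # stiffer car -> shorter crush zone
    # From v^2 = 2 * a * d, the average deceleration is a = v^2 / (2 * d).
    return speed_mps ** 2 / (2 * crush_distance)

random.seed(42)
speed = 15.0  # impact speed in m/s, assumed known exactly

# Manufacturing variation: stiffness differs slightly from car to car.
samples = [crash_deceleration(speed, random.gauss(1.0, 0.05))
           for _ in range(10_000)]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
print(f"mean deceleration: {mean:.1f} m/s^2, std dev: {spread:.1f} m/s^2")
```

Even with the speed fixed, the answer is a distribution: the simulation reports a mean outcome and a spread, which is exactly the statistical sense of prediction described above.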
Simulations provide experimental test beds that let us lay the groundwork for real experiments, and together the two provide insight neither could alone. We assume that experimental results represent the real behavior of the object under test up to some measurement error, and as scientists we must make sure that this error is bounded and lies within an acceptable margin. This is where simulation comes into play, representing the behavior of the same object based on its theoretical model. The calculated discrepancy between the two brings us that much closer to understanding the nature of reality. It is imperative to use every tool at our disposal to get the clearest picture.
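The discrepancy check can be made concrete with a minimal sketch. The numbers and the two-sigma threshold below are illustrative assumptions, not values from any particular experiment; the idea is simply to ask whether measurement and simulation agree within their combined uncertainties:

```python
def consistent(measured, sigma_measured, simulated, sigma_model, k=2.0):
    """Return True when the measured and simulated values agree within
    k combined standard deviations (a simple consistency criterion)."""
    combined = (sigma_measured ** 2 + sigma_model ** 2) ** 0.5
    return abs(measured - simulated) <= k * combined

# Hypothetical numbers: a measured quantity vs. its simulated prediction.
print(consistent(4.98, 0.05, 5.02, 0.03))  # agreement within 2 sigma
print(consistent(4.70, 0.05, 5.02, 0.03))  # a genuine discrepancy
```

When the check fails, the interesting work begins: either the measurement error was underestimated or the theoretical model behind the simulation needs revision.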