Computer software, or simply software, is that part of a computer system that consists of encoded information or computer instructions, in contrast to the physical hardware from which the system is built. The term is roughly synonymous with computer program, but is more generic in scope.
The theory underlying software was first proposed by Alan Turing in his 1935 essay on computable numbers; the term “software” itself was first used in print in this sense by John W. Tukey in 1958. In computer science and software engineering, computer software is all information processed by computer systems: programs and data.
Computer software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. Computer hardware and software require each other and neither can be realistically used on its own.
At the lowest level, executable code consists of machine language instructions specific to an individual processor—typically a central processing unit (CPU). A machine language consists of groups of binary values signifying processor instructions that change the state of the computer from its preceding state. For example, an instruction may change the value stored in a particular storage location in the computer—an effect that is not directly observable to the user. An instruction may also (indirectly) cause something to appear on a display of the computer system—a state change which should be visible to the user. The processor carries out the instructions in the order they are provided, unless it is instructed to “jump” to a different instruction, or interrupted.
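The behavior described above can be sketched with a toy interpreter. The instruction set below (STORE, ADD, JUMP, HALT) is invented for illustration and does not correspond to any real processor; it only shows how instructions mutate stored state, execute in order, and can jump to a different instruction.

```python
# A toy "machine" illustrating sequential execution, state change,
# and jumps. The instruction set is hypothetical.

def run(program, memory):
    pc = 0                      # program counter: index of the next instruction
    while pc < len(program):
        op, *args = program[pc]
        if op == "STORE":       # STORE addr, value: change a storage location
            memory[args[0]] = args[1]
        elif op == "ADD":       # ADD dst, src: memory[dst] += memory[src]
            memory[args[0]] += memory[args[1]]
        elif op == "JUMP":      # JUMP target: continue at another instruction
            pc = args[0]
            continue
        elif op == "HALT":      # stop execution
            break
        pc += 1
    return memory

state = run([("STORE", 0, 2), ("STORE", 1, 3), ("ADD", 0, 1), ("HALT",)], {})
print(state)   # {0: 5, 1: 3}
```

Each instruction changes the machine's state (here, the `memory` dictionary) from its preceding state, and control flows sequentially unless a JUMP redirects it, mirroring the description above.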
The majority of software is written in high-level programming languages, which are easier and more efficient for programmers to use because they are closer to natural language than machine language is.
High-level languages are translated into machine language using a compiler, an interpreter, or a combination of the two. Software may also be written in a low-level assembly language: a mnemonic representation of machine language using a natural-language alphabet, which is translated into machine language using an assembler.
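CPython is a concrete example of the compiler-plus-interpreter combination: source code is compiled to an intermediate bytecode, which a virtual machine then interprets. The sketch below uses the standard-library `dis` module to show the lower-level instructions produced for a one-line function; the exact opcode names vary between Python versions.

```python
import dis

# A simple high-level function...
def add(a, b):
    return a + b

# ...and the lower-level bytecode instructions CPython compiles it to.
# (Opcode names differ across interpreter versions.)
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)
```

The printed list contains instructions such as loads of the arguments, an addition, and a return, loosely analogous to the machine instructions a compiler for a language like C would emit for the same operation.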
Perfusion computed tomography (CT) is a technique that allows rapid qualitative and quantitative evaluation of cerebral perfusion by generating maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT). The technique is based on the central volume principle (CBF = CBV/MTT) and requires the use of commercially available software employing complex deconvolution algorithms to produce the perfusion maps.
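The central volume principle is a simple ratio, which can be shown with an arithmetic sketch. The values below are illustrative only, not clinical data; the actual perfusion software derives CBV and MTT per voxel via deconvolution before applying this relationship.

```python
# Central volume principle: CBF = CBV / MTT.
# Units: CBV in mL per 100 g of tissue, MTT in minutes,
# giving CBF in mL per 100 g per minute.
# The numbers below are made up for illustration.

def cerebral_blood_flow(cbv_ml_per_100g, mtt_min):
    return cbv_ml_per_100g / mtt_min

# e.g. CBV = 4.0 mL/100 g and MTT = 6 s = 0.1 min
cbf = cerebral_blood_flow(4.0, 0.1)
print(cbf)   # 40.0 (mL/100 g/min)
```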