Algorithms
We now know that to make a computer do stuff, we need to write a specific series of instructions for it in a way that can be translated into something it understands.
An algorithm is a finite sequence of well-defined, computer-implementable instructions, typically to solve a class of specific problems or perform a computation.
Let's unpack the parts of this definition:
- finite sequence means that the algorithm can be expressed in a finite number of steps, though these steps may repeat indefinitely;
- well-defined means that each step is defined precisely and unambiguously;
- computer-implementable means that an algorithm doesn't have to be written in a programming language, but we have to be able to write it in one - a process called implementation;
- performing a computation means calculating the value of a specific expression, such as 1+2; and
- in contrast, solving a class of specific problems means creating an algorithm that addresses a more general case, such as x+y (see the sketch after this list).
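To make the last two points concrete, here is a minimal sketch in Python (the function name `add` is just an illustrative choice, not anything standard): the first line performs one specific computation, while the function below it solves the whole class of "add two numbers" problems.

```python
# A one-off computation: the value of one specific expression.
result = 1 + 2                 # always 3

# An algorithm for the general case: add any two numbers x and y.
def add(x, y):
    """Return the sum of x and y, whatever values they happen to be."""
    return x + y

print(result)      # 3
print(add(1, 2))   # 3 - the specific computation is one instance of the class
print(add(40, 2))  # 42
```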
Algorithms are always technically deterministic because they produce the same output for the same input. Some people call algorithms that involve randomness non-deterministic because they give an illusion of non-determinism, but in truth they also rely on inputs - seeds - for their random number generators. A seed can be the current system time or something else, but if we reused the exact same seeds, we would get the same result every time.
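A minimal Python sketch of this idea (the `roll_dice` helper and the seed value 42 are made up for illustration): seeding the generator explicitly makes the "randomness" repeatable, because the seed is just another input.

```python
import random

def roll_dice(seed):
    """Roll three six-sided dice using a generator seeded explicitly."""
    rng = random.Random(seed)   # the seed is just another input to the algorithm
    return [rng.randint(1, 6) for _ in range(3)]

print(roll_dice(42))  # some fixed sequence of three rolls
print(roll_dice(42))  # the exact same sequence: same input (seed), same output
```

When a program does not pass a seed explicitly, the generator typically picks one for us (for example, from the system clock), which is what creates the illusion of non-determinism.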
