Algorithm

In a loop, subtract the smaller number from the larger one. Halt the loop when a further subtraction would make a number negative. Then check whether one of the two numbers equals zero: if so, take the other number as the greatest common divisor; if not, put the two numbers through the subtraction loop again (see the sketch below).
Flowchart of using successive subtractions to find the greatest common divisor of numbers r and s
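Read as a program, the flowchart above might be sketched as follows. This is a minimal Python sketch under the assumption that r and s are positive integers; the function name gcd_by_subtraction is an illustrative choice and not part of the original description.

```python
def gcd_by_subtraction(r, s):
    """Greatest common divisor of two positive integers, found by
    successive subtraction as in the flowchart above."""
    while r != 0 and s != 0:
        # Subtract the smaller number from the larger one; the loop halts
        # once one of the numbers reaches zero, i.e. once a further
        # subtraction would drive a number below zero.
        if r > s:
            r -= s
        else:
            s -= r
    # One number is now zero; the other is the greatest common divisor.
    return r if r != 0 else s

print(gcd_by_subtraction(1071, 462))  # 21
```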

In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation.[1] Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), eventually achieving automation. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".[2]
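As a small illustration of conditionals diverting execution through different routes, the following Python sketch decides whether a Gregorian calendar year is a leap year; the example and the function name are illustrative and not drawn from the sources cited above.

```python
def is_leap_year(year):
    """Decide whether a Gregorian calendar year is a leap year,
    using conditionals to route execution through different branches."""
    if year % 400 == 0:   # years divisible by 400 are leap years
        return True
    if year % 100 == 0:   # other century years are not
        return False
    return year % 4 == 0  # otherwise, leap years are multiples of 4

print(is_leap_year(2000), is_leap_year(1900), is_leap_year(2024))
# True False True
```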

In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result.[3] For example, social media recommender systems rely on heuristics; although widely characterized as "algorithms" in 21st-century popular media, they cannot deliver correct results due to the nature of the problem.
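To make the contrast concrete, the following Python sketch shows a greedy coin-change heuristic: it is fully specified and always terminates, but it does not guarantee an optimal result for every coin system. The coin values and the function name are illustrative assumptions, not taken from the cited sources.

```python
def greedy_change(amount, coins):
    """Heuristic: repeatedly take the largest coin that still fits.
    Fast and simple, but not guaranteed to use the fewest coins."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

# With coins {1, 3, 4}, the greedy heuristic returns [4, 1, 1] for 6,
# even though the optimal answer is [3, 3].
print(greedy_change(6, [1, 3, 4]))
```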

As an effective method, an algorithm can be expressed within a finite amount of space and time[4] and in a well-defined formal language[5] for calculating a function.[6] Starting from an initial state and initial input (perhaps empty),[7] the instructions describe a computation that, when executed, proceeds through a finite[8] number of well-defined successive states, eventually producing "output"[9] and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.[10]
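A minimal Python sketch of such a randomized algorithm is given below, using the well-known random-pivot selection technique (quickselect); the choice of example and the function name are illustrative, not drawn from the cited sources. The random pivot makes the sequence of intermediate states non-deterministic, yet the procedure still terminates with a well-defined output.

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (k starting at 0) of a list.
    The pivot is chosen at random, so the intermediate states vary
    between runs, but the final output is always the same."""
    pivot = random.choice(items)
    below = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    above = [x for x in items if x > pivot]
    if k < len(below):
        return quickselect(below, k)
    if k < len(below) + len(equal):
        return pivot
    return quickselect(above, k - len(below) - len(equal))

print(quickselect([7, 2, 9, 4, 1, 5], 2))  # 4 (third-smallest element)
```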

  1. ^ "Definition of ALGORITHM". Merriam-Webster Online Dictionary. Archived from the original on February 14, 2020. Retrieved November 14, 2019.
  2. ^ Blair, Ann, Duguid, Paul, Goeing, Anja-Silvia and Grafton, Anthony. Information: A Historical Companion, Princeton: Princeton University Press, 2021. p. 247
  3. ^ David A. Grossman, Ophir Frieder, Information Retrieval: Algorithms and Heuristics, 2nd edition, 2004, ISBN 1402030045
  4. ^ "Any classical mathematical algorithm, for example, can be described in a finite number of English words" (Rogers 1987:2).
  5. ^ Well defined with respect to the agent that executes the algorithm: "There is a computing agent, usually human, which can react to the instructions and carry out the computations" (Rogers 1987:2).
  6. ^ "an algorithm is a procedure for computing a function (with respect to some chosen notation for integers) ... this limitation (to numerical functions) results in no loss of generality", (Rogers 1987:1).
  7. ^ "An algorithm has zero or more inputs, i.e., quantities which are given to it initially before the algorithm begins" (Knuth 1973:5).
  8. ^ "A procedure which has all the characteristics of an algorithm except that it possibly lacks finiteness may be called a 'computational method'" (Knuth 1973:5).
  9. ^ "An algorithm has one or more outputs, i.e. quantities which have a specified relation to the inputs" (Knuth 1973:5).
  10. ^ Whether or not a process with random interior processes (not including the input) is an algorithm is debatable. Rogers opines that: "a computation is carried out in a discrete stepwise fashion, without the use of continuous methods or analogue devices ... carried forward deterministically, without resort to random methods or devices, e.g., dice" (Rogers 1987:2).