Alan Turing
Turing was prosecuted in 1952 for homosexual acts; the Labouchere Amendment of 1885 had made “gross indecency” a criminal offence in the UK. He accepted chemical castration treatment, with DES, as an alternative to prison.
Turing died in 1954, 16 days before his 42nd birthday, from cyanide poisoning. An inquest determined his death as a suicide, but it has been noted that the known evidence is also consistent with accidental poisoning.
In 2009, following an Internet campaign, British Prime Minister Gordon Brown made an official public apology on behalf of the British government for “the appalling way he was treated”. Queen Elizabeth II granted Turing a posthumous pardon in 2013.
The “Alan Turing law” is now an informal term for a 2017 law in the United Kingdom that retroactively pardoned men cautioned or convicted under historical legislation that outlawed homosexual acts. Turing has an extensive legacy with statues of him and many things named after him, including an annual award for computer science innovations. He appears on the current Bank of England £50 note, which was released to coincide with his birthday.
A 2019 BBC series, as voted by the audience, named him the greatest person of the 20th century.
*23 June 1912, Maida Vale, London, England
†7 June 1954, Wilmslow, Cheshire, England
Alan Mathison Turing OBE FRS was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence.
Born in Maida Vale, London, Turing was raised in southern England. He graduated from King’s College, Cambridge, with a degree in mathematics. Whilst he was a fellow at Cambridge, he published a proof demonstrating that some purely mathematical yes–no questions can never be answered by computation, defined the Turing machine, and went on to prove that the halting problem for Turing machines is undecidable.
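The Turing machine mentioned above can be sketched as a finite transition table acting on an unbounded tape. The following Python toy is an illustration of that model, not Turing’s 1936 notation; all names here are invented for the example:

```python
# Illustrative sketch: a Turing machine as a transition table over a tape.
from collections import defaultdict

def run_turing_machine(transitions, tape_input, start="q0", halt="halt", max_steps=1000):
    """transitions: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, +1}."""
    tape = defaultdict(lambda: "_")          # "_" is the blank symbol
    for i, s in enumerate(tape_input):
        tape[i] = s
    state, head = start, 0
    for _ in range(max_steps):
        if state == halt:
            break
        state, tape[head], move = transitions[(state, tape[head])]
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# A machine that inverts a binary string, then halts at the first blank.
invert = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", +1),
}
```

The `max_steps` cap is the practical face of the halting problem: in general there is no procedure to decide in advance whether a given table will ever reach its halting state.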
In 1938, he obtained his PhD from the Department of Mathematics at Princeton University. During the Second World War, Turing worked for the Government Code and Cypher School (GC&CS) at Bletchley Park, Britain’s codebreaking centre that produced Ultra intelligence. For a time he led Hut 8, the section that was responsible for German naval cryptanalysis. Here, he devised a number of techniques for speeding the breaking of German ciphers, including improvements to the pre-war Polish bombe method, an electromechanical machine that could find settings for the Enigma machine.
Turing played a crucial role in cracking intercepted coded messages that enabled the Allies to defeat the Axis powers in many crucial engagements, including the Battle of the Atlantic. Official war historian Harry Hinsley estimated that this work shortened the war in Europe by more than two years and saved over 14 million lives.
After the war, Turing worked at the National Physical Laboratory, where he designed the Automatic Computing Engine (ACE), one of the first designs for a stored-program computer. In 1948, Turing joined Max Newman’s Computing Machine Laboratory, at the Victoria University of Manchester, where he helped develop the Manchester computers and became interested in mathematical biology.
He wrote a paper on the chemical basis of morphogenesis and predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, first observed in the 1960s. Despite these accomplishments, he was never fully recognised in his home country during his lifetime because much of his work was covered by the Official Secrets Act.
Dennis Ritchie
During the 1970s, Ritchie collaborated with James Reeds and Robert Morris on a ciphertext-only attack on the M-209 US cipher machine that could solve messages of at least 2000–2500 letters. Ritchie relates that, after discussions with the NSA, the authors decided not to publish it, as they were told that the principle was applicable to machines still in use by foreign governments.
Ritchie was also involved with the development of the Plan 9 and Inferno operating systems, and the programming language Limbo.
As part of an AT&T restructuring in the mid-1990s, Ritchie was transferred to Lucent Technologies, where he retired in 2007 as head of System Software Research Department.
Ritchie was found dead on October 12, 2011, at the age of 70 at his home in Berkeley Heights, New Jersey, where he lived alone. First news of his death came from his former colleague, Rob Pike. He had been in frail health for several years following treatment for prostate cancer and heart disease. News of Ritchie’s death was largely overshadowed by the media coverage of the death of Apple co-founder Steve Jobs, which occurred the week before.
*9 September 1941, Bronxville, New York, U.S.
†12 October 2011, Berkeley Heights, New Jersey, U.S.
Dennis MacAlistair Ritchie was an American computer scientist. He created the C programming language and, with long-time colleague Ken Thompson, the Unix operating system and B programming language. Ritchie and Thompson were awarded the Turing Award from the ACM in 1983, the Hamming Medal from the IEEE in 1990 and the National Medal of Technology from President Bill Clinton in 1999.
Ritchie was the head of Lucent Technologies’ System Software Research Department when he retired in 2007. He was the “R” in K&R C and was commonly known by his username, dmr.
Dennis Ritchie was born in Bronxville, New York. His father was Alistair E. Ritchie, a longtime Bell Labs scientist and co-author of The Design of Switching Circuits on switching circuit theory. As a child, Dennis moved with his family to Summit, New Jersey, where he graduated from Summit High School. He graduated from Harvard University with degrees in physics and applied mathematics.
Version 7 Unix for the PDP-11, including Dennis Ritchie’s home directory: /usr/dmr
In 1967, Ritchie began working at the Bell Labs Computing Sciences Research Center, and in 1968, he defended his PhD thesis on “Computational Complexity and Program Structure” at Harvard under the supervision of Patrick C. Fischer. However, Ritchie never officially received his PhD degree, as he did not submit a bound copy of his dissertation to the Harvard library, a requirement for the degree. In 2020, the Computer History Museum worked with Ritchie’s and Fischer’s families and located a copy of the lost dissertation.
During the 1960s, Ritchie and Ken Thompson worked on the Multics operating system at Bell Labs. Thompson then found an old PDP-7 machine and developed his own application programs and operating system from scratch, aided by Ritchie and others. In 1970, Brian Kernighan suggested the name “Unix”, a pun on the name “Multics”. To supplement assembly language with a system-level programming language, Thompson created B. Later, B was replaced by C, created by Ritchie, who continued to contribute to the development of Unix and C for many years.
John von Neumann
After the war, he served on the General Advisory Committee of the United States Atomic Energy Commission, and consulted for organizations including the United States Air Force, the Army’s Ballistic Research Laboratory, the Armed Forces Special Weapons Project, and the Lawrence Livermore National Laboratory.
As a Hungarian émigré, concerned that the Soviets would achieve nuclear superiority, he designed and promoted the policy of mutually assured destruction to limit the arms race.
In 1955, von Neumann was diagnosed with what was either bone, pancreatic or prostate cancer, after physicians examining him for a fall found a mass growing near his collarbone.
The cancer was possibly caused by his radiation exposure during his time in Los Alamos National Laboratory. He was not able to accept the proximity of his own demise, and the shadow of impending death instilled great fear in him.
He invited a Catholic priest, Father Anselm Strittmatter, O.S.B., to visit him for consultation. Von Neumann reportedly said, “So long as there is the possibility of eternal damnation for nonbelievers it is more logical to be a believer at the end,” referring to Pascal’s wager. He had earlier confided to his mother, “There probably has to be a God. Many things are easier to explain if there is than if there isn’t.”
Father Strittmatter administered the last rites to him. Some of von Neumann’s friends, such as Abraham Pais and Oskar Morgenstern, said they had always believed him to be “completely agnostic”. Of this deathbed conversion, Morgenstern told Heims, “He was of course completely agnostic all his life, and then he suddenly turned Catholic—it doesn’t agree with anything whatsoever in his attitude, outlook and thinking when he was healthy.”
Father Strittmatter recalled that even after his conversion, von Neumann did not receive much peace or comfort from it, as he still remained terrified of death.
He died at age 53 on February 8, 1957, at the Walter Reed Army Medical Center in Washington, D.C.
*28 December 1903, Budapest, Kingdom of Hungary
†8 February 1957, Washington, D.C., United States
John von Neumann was a Hungarian-American mathematician, physicist, computer scientist, engineer and polymath. Von Neumann was generally regarded as the foremost mathematician of his time and said to be “the last representative of the great mathematicians”. He integrated pure and applied sciences.
Von Neumann made major contributions to many fields, including mathematics (foundations of mathematics, functional analysis, ergodic theory, representation theory, operator algebras, geometry, topology, and numerical analysis), physics (quantum mechanics, hydrodynamics, and quantum statistical mechanics), economics (game theory), computing (Von Neumann architecture, linear programming, self-replicating machines, stochastic computing), and statistics.
He was a pioneer of the application of operator theory to quantum mechanics in the development of functional analysis, and a key figure in the development of game theory and the concepts of cellular automata, the universal constructor and the digital computer.
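Von Neumann’s cellular-automaton work used a 29-state universal constructor; a far simpler relative, a one-dimensional two-state automaton (here in Python, using Wolfram’s much later rule numbering, purely as an illustration of the update scheme), conveys the basic idea of local rules driving global behaviour:

```python
# One-dimensional binary cellular automaton on a fixed-width row with
# zero boundaries. Each cell's next value depends only on itself and
# its two neighbours, looked up in the bits of `rule`.
def step(cells, rule=110):
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        idx = (left << 2) | (cells[i] << 1) | right   # neighbourhood as a 3-bit index
        out.append((rule >> idx) & 1)
    return out

row = [0, 0, 0, 1, 0, 0, 0]
for _ in range(3):
    row = step(row)          # a single seed cell grows a patterned region
```

Von Neumann’s own construction was vastly richer: enough states and rules for a configuration to read a description of itself and build a copy, the self-replication result noted below.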
Von Neumann published over 150 papers in his life: about 60 in pure mathematics, 60 in applied mathematics, 20 in physics, and the remainder on special mathematical subjects or non-mathematical ones. His last work, an unfinished manuscript written while he was in the hospital, was later published in book form as The Computer and the Brain.
His analysis of the structure of self-replication preceded the discovery of the structure of DNA. In a shortlist of facts about his life he submitted to the National Academy of Sciences, he wrote, “The part of my work I consider most essential is that on quantum mechanics, which developed in Göttingen in 1926, and subsequently in Berlin in 1927–1929. Also, my work on various forms of operator theory, Berlin 1930 and Princeton 1935–1939; on the ergodic theorem, Princeton, 1931–1932.”
During World War II, von Neumann worked on the Manhattan Project with theoretical physicist Edward Teller, mathematician Stanislaw Ulam and others, problem-solving key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.
He developed the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon and coined the term “kiloton” (of TNT) as a measure of the explosive force generated.
Tim Berners-Lee
After leaving CERN in late 1980, Berners-Lee went to work at John Poole’s Image Computer Systems, Ltd, in Bournemouth, Dorset. He ran the company’s technical side for three years. The project he worked on was a “real-time remote procedure call”, which gave him experience in computer networking. In 1984, he returned to CERN as a fellow.
In 1989, CERN was the largest Internet node in Europe and Berners-Lee saw an opportunity to join hypertext with the Internet.
Berners-Lee wrote his proposal in March 1989 and redistributed it in 1990. It was then accepted by his manager, Mike Sendall, who called the proposal “vague, but exciting”.
Berners-Lee published the first web site, which described the project itself, on 20 December 1990; it was available to the Internet from the CERN network.
He devised and implemented the first Web browser and Web server, and helped foster the Web’s subsequent explosive development. He currently directs the W3 Consortium, developing tools and standards to further the Web’s potential. In April 2009, he was elected as Foreign Associate of the National Academy of Sciences.
In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work.
Sir Timothy John Berners-Lee (born 8 June 1955), also known as TimBL, is an English computer scientist best known as the inventor of the World Wide Web. He is a Professorial Fellow of Computer Science at the University of Oxford and a professor at the Massachusetts Institute of Technology (MIT).
Berners-Lee proposed an information management system on 12 March 1989, then implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet in mid-November.
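The request/response shape that HTTP settled into can be sketched in a few lines; the original 1990 protocol was even simpler (a request was essentially just `GET <path>`). The helper names below are illustrative, not CERN’s code:

```python
# Build and parse an HTTP request line (illustrative helpers, assuming
# the later HTTP/1.0 request-line format "METHOD PATH VERSION").
def build_request(path):
    return f"GET {path} HTTP/1.0\r\n\r\n"

def parse_request_line(raw):
    method, path, version = raw.split("\r\n", 1)[0].split(" ")
    return {"method": method, "path": path, "version": version}

# The path of the first web page, served from info.cern.ch.
req = build_request("/hypertext/WWW/TheProject.html")
```

The point of the design is visible even in this toy: a plain-text line names a resource by path, so any client that can open a TCP connection and split strings can speak to any server.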
Berners-Lee is the director of the World Wide Web Consortium (W3C), which oversees the continued development of the Web. He co-founded (with his wife-to-be Rosemary Leith) the World Wide Web Foundation. He is a senior researcher and holder of the 3Com founder’s chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).
He is a director of the Web Science Research Initiative (WSRI) and a member of the advisory board of the MIT Center for Collective Intelligence. In 2011, he was named as a member of the board of trustees of the Ford Foundation. He is a founder and president of the Open Data Institute and is currently an advisor at social network MeWe.
He attended Sheen Mount Primary School, and then went on to attend south-west London’s Emanuel School from 1969 to 1973.
A keen trainspotter as a child, he learnt about electronics from tinkering with a model railway. He studied at The Queen’s College, Oxford, from 1973 to 1976, where he received a first-class Bachelor of Arts degree in physics. While at university, Berners-Lee made a computer out of an old television set, which he bought from a repair shop.
After graduation, Berners-Lee worked as an engineer at the telecommunications company Plessey in Poole, Dorset. In 1978, he joined D. G. Nash in Ferndown, Dorset, where he helped create typesetting software for printers.
Berners-Lee worked as an independent contractor at CERN from June to December 1980. While in Geneva, he proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers. To demonstrate it, he built a prototype system named ENQUIRE.
Grace Hopper
Hopper’s idea that a programming language could be based on English words inspired the COBOL language. In 1966, she retired from the Naval Reserve, but in 1967 the Navy recalled her to active duty. She retired from the Navy in 1986 and worked as a consultant for the Digital Equipment Corporation, sharing her computing experiences.
On New Year’s Day 1992, Hopper died in her sleep of natural causes at her home in Arlington, Virginia; she was 85 years of age.
The U.S. Navy Arleigh Burke-class guided-missile destroyer USS Hopper was named for her, as was the Cray XE6 “Hopper” supercomputer at NERSC.
During her lifetime, Hopper was awarded 40 honorary degrees from universities across the world. A college at Yale University was renamed in her honor. In 1991, she received the National Medal of Technology. On November 22, 2016, she was posthumously awarded the Presidential Medal of Freedom by President Barack Obama.
*9 December 1906, New York City, U.S.
†1 January 1992, Arlington, Virginia, U.S.
Grace Brewster Murray Hopper was an American computer scientist and United States Navy rear admiral.
One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first linkers. Hopper was the first to devise the theory of machine-independent programming languages, and the FLOW-MATIC programming language she created using this theory was later extended to create COBOL, an early high-level programming language still in use today.
Prior to joining the Navy, Hopper earned a Ph.D. in mathematics from Yale University and was a professor of mathematics at Vassar College. Hopper attempted to enlist in the Navy during World War II but was rejected because she was 34 years old.
She instead joined the Naval Reserve. Hopper began her computing career in 1944 when she worked on the Harvard Mark I team led by Howard H. Aiken. In 1949, she joined the Eckert–Mauchly Computer Corporation and was part of the team that developed the UNIVAC I computer. At Eckert–Mauchly she managed the development of one of the first COBOL compilers.
She believed that a programming language based on English was possible. Her compiler converted English terms into machine code understood by computers. By 1952, Hopper had finished her program linker (originally called a compiler), which was written for the A-0 System. During her wartime service, she co-authored three papers based on her work on the Harvard Mark I.
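The translation idea in the paragraph above, English-like verbs turned into lower-level code, can be caricatured in a few lines of Python. This is a hypothetical toy, vastly simpler than A-0 or FLOW-MATIC, with invented statement forms:

```python
# Toy "English to target code" translator. Statements have the shape
# "VERB OPERAND connective TARGET", e.g. "ADD A TO B" or "SUBTRACT A FROM B".
OPS = {"ADD": "+", "SUBTRACT": "-", "MULTIPLY": "*"}

def translate(line):
    """Translate e.g. 'ADD A TO B' into a target-code string like 'B = B + A'."""
    verb, operand, _connective, target = line.split()
    return f"{target} = {target} {OPS[verb]} {operand}"

translate("ADD A TO B")
```

The real systems did far more (symbol tables, subroutine linkage, data layout), but the core conviction is the same: the human-facing notation need not look like machine code.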
In 1954, Eckert–Mauchly chose Hopper to lead their department for automatic programming, and she led the release of some of the first compiled languages like FLOW-MATIC. In 1959, she participated in the CODASYL consortium, which consulted Hopper to guide them in creating a machine-independent programming language.
John McCarthy
McCarthy was instrumental in the creation of three of the very earliest time-sharing systems (Compatible Time-Sharing System, BBN Time-Sharing System, and Dartmouth Time Sharing System).
In 1961, he was perhaps the first to suggest publicly the idea of utility computing, in a speech given to celebrate MIT’s centennial: that computer time-sharing technology might result in a future in which computing power and even specific applications could be sold through the utility business model (like water or electricity).
McCarthy spent most of his career at Stanford University. He received many accolades and honors, such as the 1971 Turing Award for his contributions to the topic of AI, the United States National Medal of Science, and the Kyoto Prize.
In 1982, he seems to have originated the idea of the space fountain, a type of tower extending into space and kept vertical by the outward force of a stream of pellets propelled from Earth along a sort of conveyor belt which returns the pellets to Earth. Payloads would ride the conveyor belt upward.
McCarthy died at his home in Stanford on October 24, 2011.
*4 September 1927, Boston, Massachusetts, U.S.
†24 October 2011, Stanford, California, U.S.
John McCarthy was an American computer scientist and cognitive scientist. McCarthy was one of the founders of the discipline of artificial intelligence. He co-authored the document that coined the term “artificial intelligence”, developed the Lisp programming language family, significantly influenced the design of the ALGOL programming language, popularized time-sharing, and invented garbage collection.
McCarthy showed an early aptitude for mathematics; during his teens he taught himself college mathematics by studying the textbooks used at the nearby California Institute of Technology.
McCarthy graduated from Belmont High School two years early. McCarthy was accepted into Caltech in 1944.
McCarthy initially completed graduate studies at Caltech before moving to Princeton University. He received a Ph.D. in mathematics from Princeton in 1951 after completing a doctoral dissertation, titled “Projection operators and partial differential equations”, under the supervision of Donald C. Spencer.
In 1958, he proposed the advice taker, which inspired later work on question-answering and logic programming.
McCarthy invented Lisp in the late 1950s. Based on the lambda calculus, Lisp soon became the programming language of choice for AI applications after its publication in 1960.
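The flavour of Lisp’s recursive, conditional evaluation can be conveyed by a miniature evaluator. This is a sketch in Python, not McCarthy’s eval; nested Python lists stand in for S-expressions, and the operator set is an arbitrary sample:

```python
# Miniature evaluator for Lisp-style prefix expressions such as
# (* (+ 1 2) 4), written here as ["*", ["+", 1, 2], 4].
import operator

ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    if isinstance(expr, (int, float)):
        return expr                           # atoms evaluate to themselves
    op, *args = expr
    if op == "if":                            # conditional expression, as in Lisp
        test, then_branch, else_branch = args
        # Note: Python truthiness is used here; real Lisp treats only nil as false.
        return evaluate(then_branch) if evaluate(test) else evaluate(else_branch)
    return ENV[op](*[evaluate(a) for a in args])

evaluate(["*", ["+", 1, 2], 4])
```

The recursion in `evaluate` mirrors the recursion and conditional expressions McCarthy championed: the evaluator itself is a short recursive function over the expression tree.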
In 1958, McCarthy served on an ACM Ad hoc Committee on Languages that became part of the committee that designed ALGOL 60. In August 1959 he proposed the use of recursion and conditional expressions, which became part of ALGOL.
He then became involved with developing international standards in programming and informatics, as a member of the International Federation for Information Processing (IFIP) IFIP Working Group 2.1 on Algorithmic Languages and Calculi, which specified, maintains, and supports ALGOL 60 and ALGOL 68.
Edsger W. Dijkstra
Until the mid-1960s computer programming was considered more an art (or a craft) than a scientific discipline. In Harlan Mills’s words (1986), “programming [before the 1970s] was regarded as a private, puzzle-solving activity of writing computer instructions to work as a program”. In the late 1960s, computer programming was in a state of crisis.
Dijkstra was one of a small group of academics and industrial programmers who advocated a new programming style to improve the quality of programs. Dijkstra, who had a background in mathematics and physics, was one of the driving forces behind the acceptance of computer programming as a scientific discipline.
He coined the phrase “structured programming” and during the 1970s this became the new programming orthodoxy. As the originator of the structured programming movement (the first remarkable movement in the history of computer programming), his ideas about programming methodology helped lay the foundations for the birth and development of the professional discipline of software engineering, enabling programmers to organize and manage increasingly complex software projects.
As Bertrand Meyer (2009) noted, “The revolution in views of programming started by Dijkstra’s iconoclasm led to a movement known as structured programming, which advocated a systematic, rational approach to program construction. Structured programming is the basis for all that has been done since in programming methodology, including object-oriented programming.”
*11 May 1930, Rotterdam, Netherlands
†6 August 2002, Nuenen, Netherlands
Edsger Wybe Dijkstra was a Dutch computer scientist, programmer, software engineer, systems scientist, science essayist, and pioneer in computing science.
A theoretical physicist by training, he worked as a programmer at the Mathematisch Centrum (Amsterdam) from 1952 to 1962.
A university professor for much of his life, Dijkstra held the Schlumberger Centennial Chair in Computer Sciences at the University of Texas at Austin from 1984 until his retirement in 1999. He was a professor of mathematics at the Eindhoven University of Technology (1962–1984) and a research fellow at the Burroughs Corporation (1973–1984). In 1972, he became the first person who was neither American nor British to win the Turing Award.
One of the most influential figures of computing science’s founding generation, Dijkstra helped shape the new discipline both as an engineer and a theorist.
His fundamental contributions cover diverse areas of computing science, including compiler construction, operating systems, distributed systems, sequential and concurrent programming, programming paradigm and methodology, programming language research, program design, program development, program verification, software engineering principles, graph algorithms, and philosophical foundations of computer programming and computer science.
Many of his papers are the source of new research areas. Several concepts and problems that are now standard in computer science were first identified by Dijkstra or bear names coined by him.
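The best-known of those eponymous concepts is his shortest-path algorithm. A standard textbook rendering with a priority queue (modern idiom, not Dijkstra’s 1959 formulation, which predated such data structures):

```python
# Dijkstra's single-source shortest paths with a binary heap.
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]}; returns {node: distance}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry, skip
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w               # found a shorter route to v
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
```

The greedy invariant, that the node popped with the smallest tentative distance is already settled, holds only for non-negative edge weights.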
Donald Knuth
Knuth left the Institute for Defense Analyses to join the Stanford University faculty in 1969, where he is now Fletcher Jones Professor of Computer Science, Emeritus.
Knuth has also delved into recreational mathematics. He contributed articles to the Journal of Recreational Mathematics beginning in the 1960s, and was acknowledged as a major contributor in Joseph Madachy’s Mathematics on Vacation.
As a writer and scholar, Knuth created the WEB and CWEB computer programming systems designed to encourage and facilitate literate programming, and designed the MIX/MMIX instruction set architectures. Knuth strongly opposes the granting of software patents, having expressed his opinion to the United States Patent and Trademark Office and European Patent Organisation.
By 2011, the first three volumes and part one of volume four of his series had been published. Concrete Mathematics: A Foundation for Computer Science (2nd ed.), which originated as an expansion of the mathematical preliminaries section of Volume 1 of TAoCP, has also been published.
In April 2020, Knuth said he was hard at work on part B of volume 4, and he anticipated that the book will have at least parts A through F.
*10 January 1938, Milwaukee, Wisconsin, U.S.
Donald Ervin Knuth is an American computer scientist, mathematician, and professor emeritus at Stanford University. He is the 1974 recipient of the ACM Turing Award, informally considered the Nobel Prize of computer science. Knuth has been called the “father of the analysis of algorithms”.
He is the author of the multi-volume work The Art of Computer Programming. He contributed to the development of the rigorous analysis of the computational complexity of algorithms and systematized formal mathematical techniques for it.
In the process he also popularized the asymptotic notation. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces.
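The analysis-of-algorithms viewpoint Knuth championed, count a dominant operation exactly and then read off the asymptotic growth, can be illustrated with insertion sort (a generic textbook example, not one of Knuth’s MIX programs):

```python
# Insertion sort instrumented to count key comparisons, the operation
# that dominates its running time.
def insertion_sort_comparisons(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1                 # one key comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comparisons

# Reversed input of length n costs n*(n-1)/2 comparisons: Theta(n^2) worst case.
# Already-sorted input costs n-1 comparisons: Theta(n) best case.
```

The exact counts are what distinguish this style of analysis from mere big-O labelling: the asymptotic notation summarises a formula that has been derived precisely.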
While studying physics at Case, Knuth was introduced to the IBM 650, an early commercial computer. After reading the computer’s manual, Knuth decided to rewrite the assembly and compiler code for the machine used in his school, because he believed he could do it better.
After receiving his PhD, Knuth joined Caltech’s faculty as an assistant professor.
Just before publishing the first volume of The Art of Computer Programming, Knuth left Caltech to accept employment with the Institute for Defense Analyses’ Communications Research Division, then situated on the Princeton University campus, which was performing mathematical research in cryptography to support the National Security Agency.
Vint Cerf
In 1997, Cerf joined the Board of Trustees of Gallaudet University, a university for the education of the deaf and hard-of-hearing. Cerf himself is hard of hearing.
Cerf has worked for Google as a vice president and Chief Internet Evangelist since October 2005. In this function he has become well known for his predictions on how technology will affect future society, encompassing such areas as artificial intelligence, environmentalism, the advent of IPv6 and the transformation of the television industry and its delivery model.
Cerf helped fund and establish ICANN, the Internet Corporation for Assigned Names and Numbers. He joined the board in 1999, and served until November 2007. He was chairman from November 2000 to his departure from the Board.
In 2015 Cerf co-founded (with Mei Lin Fung), and is currently chairman of, People-Centered Internet.
In June 2016, his work with NASA led to Delay-tolerant networking being installed on the International Space Station with an aim towards an Interplanetary Internet.
Since at least 2015, Cerf has been raising concerns about the wide-ranging risks of digital obsolescence, the potential of losing much historic information about our time – a digital “dark age” or “black hole” – given the ubiquitous digital storage of text, data, images, music and more.
Among the concerns are the long-term storage of, and continued reliable access to, our vast stores of present-day digital data, along with the programs, operating systems, computers and peripherals required to access them.
*23 June 1943, New Haven, Connecticut, U.S.
Vinton Gray Cerf is an American Internet pioneer and is recognized as one of “the fathers of the Internet”, sharing this title with TCP/IP co-developer Bob Kahn.
He has received honorary degrees and awards that include the National Medal of Technology, the Turing Award, the Presidential Medal of Freedom, the Marconi Prize and membership in the National Academy of Engineering.
Cerf received a Bachelor of Science degree in mathematics from Stanford University. After college, Cerf worked at IBM as a systems engineer supporting QUIKTRAN for two years.
He left IBM to attend graduate school at UCLA, where he earned his M.S. degree in 1970 and his PhD in 1972. Cerf studied under Professor Gerald Estrin and worked in Professor Leonard Kleinrock’s data packet networking group that connected the first two nodes of the ARPANet, the first node on the Internet, and “contributed to a host-to-host protocol” for the ARPANet.
While at UCLA, Cerf met Bob Kahn, who was working on the ARPANet system architecture. Cerf wrote the first TCP protocol with Yogen Dalal and Carl Sunshine, called Specification of Internet Transmission Control Program (RFC 675), published in December 1974.
As vice president of MCI Digital Information Services from 1982 to 1986, Cerf led the engineering of MCI Mail, the first commercial email service to be connected to the Internet.
In 1992, he and Kahn, among others, founded the Internet Society (ISOC) to provide leadership in education, policy and standards related to the Internet.
Cerf rejoined MCI during 1994 and served as Senior Vice President of Technology Strategy. In this role, he helped to guide corporate strategy development from a technical perspective.
Geoffrey Hinton
Although Hinton’s work with David E. Rumelhart and Ronald J. Williams was important in popularizing backpropagation, it was not the first to suggest the approach. Reverse-mode automatic differentiation, of which backpropagation is a special case, was proposed by Seppo Linnainmaa in 1970, and Paul Werbos proposed using it to train neural networks in 1974.
During the same period, Hinton co-invented Boltzmann machines with David Ackley and Terry Sejnowski.
His other contributions to neural network research include distributed representations, time delay neural network, mixtures of experts, Helmholtz machines and Product of Experts.
In 2007 Hinton coauthored an unsupervised learning paper titled Unsupervised learning of image transformations.
An accessible introduction to Geoffrey Hinton’s research can be found in his articles in Scientific American in September 1992 and October 1993.
In October and November 2017 respectively, Hinton published two open access research papers on the theme of capsule neural networks, which according to Hinton are “finally something that works well.”
*6 December 1947, Wimbledon, London
Geoffrey Everest Hinton is a British-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks.
Since 2013, he has divided his time working for Google (Google Brain) and the University of Toronto. In 2017, he co-founded and became the Chief Scientific Advisor of the Vector Institute in Toronto.
Hinton was educated at King’s College, Cambridge, graduating in 1970 with a Bachelor of Arts in experimental psychology. He continued his study at the University of Edinburgh where he was awarded a Ph.D. in artificial intelligence in 1978 for research supervised by Christopher Longuet-Higgins.
He was the founding director of the Gatsby Charitable Foundation Computational Neuroscience Unit at University College London, and is currently a professor in the computer science department at the University of Toronto.
He holds a Canada Research Chair in Machine Learning, and is currently an advisor for the Learning in Machines & Brains program at the Canadian Institute for Advanced Research. Hinton taught a free online course on Neural Networks on the education platform Coursera in 2012.
While Hinton was a professor at Carnegie Mellon University (1982–1987), David E. Rumelhart, Hinton, and Ronald J. Williams applied the backpropagation algorithm to multi-layer neural networks. Their experiments showed that such networks can learn useful internal representations of data.
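The 1986 experiments trained multi-layer sigmoid networks by gradient descent. A minimal pure-Python sketch of the same idea on the XOR task (illustrative, not the paper’s code; network size, learning rate and seed are arbitrary choices):

```python
# Backpropagation through a tiny 2-2-1 sigmoid network trained on XOR.
# The squared error should fall as training proceeds; convergence to a
# perfect XOR solution depends on the random start, so only the error
# trend is checked, not the final predictions.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(W1, W2, x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in W1]   # hidden layer
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2])          # output unit
    return h, y

def total_error(W1, W2):
    return sum((forward(W1, W2, x1, x2)[1] - t) ** 2 for (x1, x2), t in DATA)

def train(epochs=2000, lr=0.5, seed=1):
    rnd = random.Random(seed)
    W1 = [[rnd.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # 2 inputs + bias
    W2 = [rnd.uniform(-1, 1) for _ in range(3)]                      # 2 hiddens + bias
    before = total_error(W1, W2)
    for _ in range(epochs):
        for (x1, x2), t in DATA:
            h, y = forward(W1, W2, x1, x2)
            dy = (y - t) * y * (1 - y)            # error signal at the output
            for j in range(2):                    # propagate back through W2
                dh = dy * W2[j] * h[j] * (1 - h[j])
                W1[j][0] -= lr * dh * x1
                W1[j][1] -= lr * dh * x2
                W1[j][2] -= lr * dh
                W2[j] -= lr * dy * h[j]
            W2[2] -= lr * dy
    return before, total_error(W1, W2)
```

XOR is the classic demonstration because no single-layer network can represent it; the hidden units must discover an internal representation, which is exactly the point the 1986 experiments made.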