Michael Jordan is an American scientist, a professor at the University of California, Berkeley, and a leading researcher in machine learning and artificial intelligence.
Jordan received a bachelor’s degree with honors in psychology from Louisiana State University in 1978, a master’s degree in mathematics in 1980, and a doctorate in cognitive science from the University of California, San Diego in 1985.
Jordan is currently a professor at the University of California, Berkeley. He was previously a professor at MIT from 1988 to 1998.
In the 1980s, Jordan began developing recurrent neural networks as a cognitive model. He popularized Bayesian networks in machine learning and became known for pointing out the links between machine learning and statistics. Jordan was also prominent in the formalization of variational methods for approximate inference and in popularizing the expectation-maximization (EM) algorithm in machine learning.
Jordan has received numerous awards, including a best student paper award at the International Conference on Machine Learning, a best paper award at the American Control Conference, and a young investigator award. In 2010 he was named a Fellow of the Association for Computing Machinery for his contributions to the theory and application of machine learning.
Professor Jordan is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences.
In 2016, Jordan was identified as the most influential computer scientist, based on an analysis of published literature.
Michael I. Jordan
Michael Irwin Jordan (born February 25, 1956 in Maryland) is an American computer scientist and one of the leading specialists in the field of machine learning. He is a professor at the University of California, Berkeley.
Life and work
He studied psychology at Louisiana State University, earning a bachelor’s degree magna cum laude in 1978, and mathematics at Arizona State University, earning a master’s degree in statistics in 1980. He received his PhD in cognitive science from the University of California, San Diego in 1985 under David Rumelhart (thesis: The Learning of Representations for Sequential Performance); there he was a member of the PDP Group. He was a postdoctoral researcher at the University of Massachusetts Amherst. In 1988 he became an assistant professor, in 1992 an associate professor, and in 1997 a professor at the Massachusetts Institute of Technology; in 1998 he became a professor at the University of California, Berkeley, where he is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Sciences and the Department of Statistics.
He works on machine learning, applying statistical methods, neural networks, and kernel machines, with applications in distributed computing, natural language processing, signal processing, nonparametric Bayesian statistics, statistical genetics, and probabilistic graphical models (such as Bayesian networks). With David Blei (his PhD student) and Andrew Ng, he introduced Latent Dirichlet Allocation in 2002. With Jeff Elman, he was a pioneer of recurrent neural networks in the 1980s.
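To give a sense of what Latent Dirichlet Allocation does in practice, here is a minimal topic-modeling sketch. It uses scikit-learn's implementation as a stand-in (an assumption for illustration; the original Blei, Ng, and Jordan paper does not prescribe any particular library, and the toy documents below are invented):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical toy corpus, purely for illustration.
docs = [
    "neural networks learn representations",
    "bayesian networks model probability",
    "graphical models encode probability distributions",
    "networks of neurons process signals",
]

# Convert the documents into a bag-of-words count matrix.
counts = CountVectorizer().fit_transform(docs)

# Fit LDA with two latent topics; each document is modeled as a
# mixture of topics, each topic as a distribution over words.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)

# Each row is one document's distribution over the two topics,
# so each row sums to 1.
print(doc_topics.shape)
```

The key idea, due to Blei, Ng, and Jordan, is that the per-document topic proportions are drawn from a Dirichlet prior, which is what lets the model share topics across an entire corpus while keeping each document's mixture sparse.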
He is a member of the National Academy of Sciences, the National Academy of Engineering, the American Association for the Advancement of Science, and the American Academy of Arts and Sciences (2011). He is a fellow of the Association for Computing Machinery, the AAAI, SIAM, the Institute of Mathematical Statistics (IMS), and the IEEE, and a member of the International Statistical Institute. In 2006 he received the IEEE Neural Networks Pioneer Award and in 2009 the ACM-AAAI Allen Newell Award; he also received a National Science Foundation Presidential Young Investigator Award and gave the 2011 IMS Neyman Lecture. In 2015 he received the Rumelhart Prize. He was a Medallion Lecturer of the IMS and received the IJCAI Award for Research Excellence in 2016. In 2012 he held the Chaire d’Excellence of the Fondation Sciences Mathématiques de Paris. In 2018 he was a plenary speaker at the International Congress of Mathematicians (Dynamical, symplectic and stochastic perspectives on gradient-based optimization). Jordan was awarded the John von Neumann Medal for 2020.