Sepp Hochreiter

January 10, 2024
Computer Scientist

Quick Facts

Full Name: Sepp Hochreiter
Occupation: Computer Scientist
Date Of Birth: February 14, 1967
Age: 57
Birthplace: Mühldorf
Country: Germany
Horoscope: Aquarius

Sepp Hochreiter Biography

Name: Sepp Hochreiter
Birthday: February 14
Birth Year: 1967
Place Of Birth: Mühldorf
Birth Country: Germany
Birth Sign: Aquarius

Sepp Hochreiter is one of the most popular and richest Computer Scientists. He was born on February 14, 1967 in Mühldorf, Germany. Sepp Hochreiter (born Josef Hochreiter in 1967) is a German computer scientist. Since 2018, he has been the head of the Institute for Machine Learning at the Johannes Kepler University of Linz, after having headed the Institute of Bioinformatics from 2006 until 2018. In 2017, he was appointed director of the Linz Institute of Technology (LIT) AI Lab, which focuses on advancing research in artificial intelligence. Previously, he was at the Technical University of Berlin, at the University of Colorado at Boulder, and at the Technical University of Munich.

Neural networks are mathematically simplified models of biological neural networks such as those in the human brain. In feedforward neural networks (NNs), information flows in one direction only: from the input layer, which receives information from the environment, through the hidden layers, and finally to the output layer, which delivers the result back to the environment. In contrast to feedforward NNs, recurrent neural networks (RNNs) can use an internal memory to process sequences of inputs. When data mining relies on neural networks, overfitting reduces the network's ability to handle future data correctly. To avoid overfitting, Sepp Hochreiter developed algorithms for finding neural networks with low complexity, such as "Flat Minimum Search" (FMS), which searches for a "flat" minimum: a large connected region in parameter space in which the network function remains approximately constant. The network parameters can therefore be specified with low precision, which implies a low-complexity network that avoids overfitting. Low-complexity neural networks are well suited to deep learning because they control the complexity in each layer and thereby learn hierarchical representations of the input.

Sepp Hochreiter's group introduced "exponential linear units" (ELUs), which speed up learning in deep neural networks and lead to higher classification accuracy. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs) and parametrized ReLUs (PReLUs), ELUs alleviate the vanishing gradient problem via the identity for positive values. However, they improve learning compared to ReLUs because their negative values push mean unit activations toward zero. Mean activations close to zero speed up learning because they bring the standard gradient closer to the unit's natural gradient by reducing the bias shift.

Sepp Hochreiter also introduced self-normalizing neural networks (SNNs), feedforward networks that abstractly represent the input at different levels. SNNs avoid the problems of batch normalization because the activations across samples automatically converge to zero mean and unit variance. SNNs make it possible to (1) train very deep networks, that is, networks with many layers, (2) use novel regularization techniques, and (3) learn robustly across many layers.

In unsupervised deep learning, generative adversarial networks (GANs) are very popular because they produce new images that look more realistic than those obtained with other generative methods. Sepp Hochreiter proposed the two time-scale update rule (TTUR) for learning GANs with stochastic gradient descent on any differentiable loss function. Methods from stochastic approximation were used to prove that the TTUR converges to a stationary local Nash equilibrium, which is the first proof of convergence for GANs in a general setting. Another contribution of this work is the "Fréchet Inception Distance" (FID), a more appropriate quality measure for GANs than the previously used Inception Score.

He further developed rectified factor networks (RFNs) to construct very sparse, non-linear, high-dimensional representations of the input. RFN models identify rare and small events in the input, have low interference between code units, have a small reconstruction error, and explain the data's covariance structure. RFN learning is a generalized alternating minimization algorithm derived from the posterior regularization method, which enforces non-negative and normalized posterior means. RFNs have been widely used in bioinformatics and genetics.
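
To make the activation functions described above concrete, here is a minimal NumPy sketch of ELU and of its scaled variant SELU, the activation used in self-normalizing networks. The SELU constants are the commonly published values, and the small demonstration at the end (random layers with variance-preserving initialization) is only an illustration of the self-normalizing behaviour, not code from Hochreiter's group.

    import numpy as np

    def elu(x, alpha=1.0):
        """Exponential linear unit: identity for x > 0, alpha*(exp(x)-1) otherwise."""
        return np.where(x > 0, x, alpha * np.expm1(x))

    def selu(x):
        """Scaled ELU used in self-normalizing networks.
        The constants are chosen so that activations converge toward
        zero mean and unit variance across layers."""
        alpha = 1.6732632423543772
        scale = 1.0507009873554805
        return scale * np.where(x > 0, x, alpha * np.expm1(x))

    if __name__ == "__main__":
        # Push standard-normal inputs through several random SELU layers and check
        # that the mean stays near 0 and the variance near 1 (self-normalization).
        rng = np.random.default_rng(0)
        x = rng.standard_normal((1024, 256))
        for _ in range(8):
            # weights with variance 1/fan_in, as assumed by the SNN analysis
            w = rng.standard_normal((256, 256)) / np.sqrt(256)
            x = selu(x @ w)
        print("mean %.3f, var %.3f" % (x.mean(), x.var()))

Running the script should print a mean close to 0 and a variance close to 1, which is the property that lets SNNs dispense with batch normalization.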

In the pharmaceutical industry, many chemical compounds (drug candidates) fail in the late stages of the drug development pipeline. The reasons are insufficient effectiveness on the target biomolecule (lack of the on-target effect), undesired interactions with other biomolecules (off-target or side effects), or other unintended adverse effects. The Deep Learning and biclustering methods developed by Sepp Hochreiter identified new on-target and off-target effects in several drug design projects. In 2013, Sepp Hochreiter's lab won the DREAM subchallenge of predicting the average toxicity of compounds. In 2014, this success with Deep Learning was continued by winning the "Tox21 Data Challenge" of NIH, FDA and NCATS. The goal of the Tox21 Data Challenge was to correctly predict the off-target and toxic effects of chemicals found in environmental sources such as household chemicals and drugs. These impressive results show that Deep Learning can be superior to other methods of virtual screening. In addition, the Hochreiter group was able to identify synergistic effects of drug combinations.

Sepp Hochreiter Net Worth

Net Worth: $5 Million
Source Of Income: Computer Scientist
House: Living in own house.

Sepp Hochreiter is one of the richest Computer Scientists from Germany. According to our analysis, Wikipedia, Forbes & Business Insider, Sepp Hochreiter's net worth is $5 Million. (Last Update: December 11, 2023)

Sepp Hochreiter developed the long short-term memory (LSTM), for which the first results were reported in his diploma thesis in 1991. The main LSTM paper was published in 1997 and is regarded as a breakthrough that marks an important milestone in the history of machine learning. LSTM overcomes the problem of recurrent neural networks (RNNs) and deep networks forgetting information over time or, equivalently, through layers (vanishing or exploding gradient). LSTM learns from training sequences in order to process new sequences and produce an output (sequence classification) or generate an output sequence (sequence-to-sequence mapping). Neural networks with LSTM cells have solved numerous tasks in biological sequence analysis, drug design, automatic music composition, machine translation, speech recognition, reinforcement learning and robotics. LSTM with an optimized architecture was successfully applied to very fast protein homology detection without requiring a sequence alignment. LSTM has also been used to learn a learning algorithm; that is, LSTM serves as a Turing machine, i.e. as a computer on which a learning algorithm is executed. Since the LSTM Turing machine is an artificial neural network, it can develop novel learning algorithms by learning on learning problems, and the newly learned techniques have turned out to be superior to those designed by humans. LSTM networks are used in Google Voice transcription and Google voice search, in Google's Allo, and as the core technology for voice searches and commands in the Google App (on Android and iOS), as well as for dictation on Android devices. Additionally, Apple has used LSTM in its "QuickType" function since iOS 10.
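
For readers who want to see what an LSTM cell actually computes, the following NumPy sketch implements a single step of the common modern LSTM formulation with input, forget and output gates (the forget gate was added after the original 1997 paper). All variable names, shapes and the toy driver loop are illustrative assumptions, not code from the original work.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h_prev, c_prev, params):
        """One forward step of an LSTM cell.

        x:      input vector at the current time step
        h_prev: hidden state from the previous step
        c_prev: cell state (the 'memory') from the previous step
        params: dict with one weight matrix W_* (acting on the concatenated
                [x, h_prev]) and one bias b_* per gate
        """
        z = np.concatenate([x, h_prev])
        i = sigmoid(params["W_i"] @ z + params["b_i"])   # input gate
        f = sigmoid(params["W_f"] @ z + params["b_f"])   # forget gate
        o = sigmoid(params["W_o"] @ z + params["b_o"])   # output gate
        g = np.tanh(params["W_g"] @ z + params["b_g"])   # candidate cell update
        c = f * c_prev + i * g    # gated memory lets information (and gradients) persist
        h = o * np.tanh(c)        # hidden state / output of the cell
        return h, c

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_in, n_hidden = 4, 8
        params = {}
        for name in ("i", "f", "o", "g"):
            params[f"W_{name}"] = rng.standard_normal((n_hidden, n_in + n_hidden)) * 0.1
            params[f"b_{name}"] = np.zeros(n_hidden)
        h = np.zeros(n_hidden)
        c = np.zeros(n_hidden)
        for t in range(5):                  # process a short input sequence
            x = rng.standard_normal(n_in)
            h, c = lstm_step(x, h, c, params)
        print(h.round(3))

The additive cell-state update (c = f * c_prev + i * g) is the mechanism that lets the LSTM keep information over long time spans instead of erasing it step by step.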

Sepp Hochreiter has made numerous contributions to machine learning, deep learning and bioinformatics. He invented the long short-term memory (LSTM), the first results of which were reported in his 1991 diploma thesis. The main LSTM paper came out in 1997 and is seen as a breakthrough that marks an important milestone in the history of machine learning. The foundations of deep learning were shaped by his analysis of the vanishing or exploding gradient. He also contributed to meta-learning and proposed flat minima as preferable solutions when learning artificial neural networks, in order to ensure a low generalization error. He devised new activation functions for neural networks, such as exponential linear units (ELUs) and scaled ELUs (SELUs), to improve learning. He also contributed to reinforcement learning through actor-critic approaches and his RUDDER method. He applied biclustering methods to drug discovery and toxicology. He extended support vector machines to handle kernels that are not positive definite with his "Potential Support Vector Machine" (PSVM) model, and applied this model to feature selection, in particular to the selection of genes in microarray data. Additionally, in biotechnology, he developed "Factor Analysis for Robust Microarray Summarization" (FARMS).
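
The vanishing and exploding gradient problem mentioned above can be illustrated with a deliberately simplified scalar sketch: the gradient reaching a layer (or time step) k steps back behaves roughly like a product of k per-step factors, so factors below one shrink it exponentially and factors above one blow it up. The numbers below are purely illustrative and are not taken from his original analysis.

    # Simplified scalar illustration of vanishing/exploding gradients:
    # multiply a unit gradient by the same per-step factor many times.
    for factor in (0.9, 1.1):
        grad = 1.0
        for step in range(1, 101):
            grad *= factor
            if step in (10, 50, 100):
                print(f"factor {factor}: after {step:3d} steps, gradient scale ~ {grad:.3e}")

With a factor of 0.9 the gradient collapses to about 2.7e-05 after 100 steps, while a factor of 1.1 inflates it to roughly 1.4e+04, which is why plain RNNs struggle to learn long-range dependencies and why the LSTM's gated, additive memory was such an important fix.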

Height, Weight & Body Measurements

Sepp Hochreiter's height is not available right now. His weight is not known; body measurements will be updated soon.

Who is Sepp Hochreiter Dating?

According to our records, Sepp Hochreiter is possibly single and has not been previously engaged. As of December 1, 2023, Sepp Hochreiter is not dating anyone.

Relationships Record: We have no records of past relationships for Sepp Hochreiter. You may help us build the dating records for Sepp Hochreiter!

Facts & Trivia

Sepp Hochreiter is ranked on the list of the most popular Computer Scientists. He is also ranked in the elite list of famous people born in Germany. Sepp Hochreiter celebrates his birthday on February 14 every year.
