
Recurrent Neural Networks [electronic resource] : From Simple to Gated Architectures / by Fathi M. Salem.

By: Fathi M. Salem
Material type: Text
Language: English
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2022
Edition: 1st ed. 2022
Description: 1 online resource
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9783030899295
Subject(s):
LOC classification:
  • TK7867
Contents:
Introduction -- 1. Network Architectures -- 2. Learning Processes -- 3. Recurrent Neural Networks (RNN) -- 4. Gated RNN: The Long Short-Term Memory (LSTM) RNN -- 5. Gated RNN: The Gated Recurrent Unit (GRU) RNN -- 6. Gated RNN: The Minimal Gated Unit (MGU) RNN.
Summary: This textbook provides a compact but comprehensive treatment of recurrent neural networks, developing analytical and design steps from scratch. It treats general recurrent neural networks with principled training methods that yield (generalized) backpropagation through time (BPTT). The author focuses on the basics and nuances of recurrent neural networks, giving a technical and principled treatment of the subject with a view toward using coding and deep-learning computational frameworks, e.g., Python and TensorFlow-Keras. Recurrent neural networks are treated holistically, from simple to gated architectures, adopting the machinery of adaptive non-convex optimization with dynamic constraints to leverage its systematic power in organizing the learning and training processes. This permits a flow of concepts and techniques that gives grounded support for design and training choices. The author's approach enables strategic co-training of output layers, using supervised learning, and hidden layers, using unsupervised learning, to generate more efficient internal representations and better accuracy. As a result, readers will be able to create designs tailoring proficient procedures for recurrent neural networks to their targeted applications. The book:
  • Explains the intricacy and diversity of recurrent networks, from simple to more complex gated recurrent neural networks;
  • Discusses the design framing of such networks, and how to redesign the simple RNN to avoid unstable behavior;
  • Describes forms of RNN training framed in adaptive non-convex optimization with dynamic constraints.
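To illustrate the contrast the book develops between simple and gated architectures, the following is a minimal NumPy sketch (not code from the book; all names and dimensions are hypothetical) of one time step of a simple RNN cell next to a GRU-style gated cell:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simple_rnn_step(x, h, Wx, Wh, b):
    # Simple RNN: the new state is a squashed affine map of the
    # current input and the previous hidden state.
    return np.tanh(x @ Wx + h @ Wh + b)

def gru_step(x, h, params):
    # GRU-style cell: an update gate z and a reset gate r control how
    # much of the old state is kept versus replaced by the candidate
    # state h_tilde. (Conventions for z vary across sources.)
    Wz, Uz, bz, Wr, Ur, br, Wc, Uc, bc = params
    z = sigmoid(x @ Wz + h @ Uz + bz)                # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)                # reset gate
    h_tilde = np.tanh(x @ Wc + (r * h) @ Uc + bc)    # candidate state
    return (1 - z) * h + z * h_tilde                 # gated interpolation

# Tiny demo with hypothetical sizes.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
x = rng.standard_normal(d_in)
h = np.zeros(d_h)
Wx = rng.standard_normal((d_in, d_h))
Wh = rng.standard_normal((d_h, d_h))
b = np.zeros(d_h)
h_rnn = simple_rnn_step(x, h, Wx, Wh, b)
params = tuple(rng.standard_normal(s)
               for s in [(d_in, d_h), (d_h, d_h), (d_h,)] * 3)
h_gru = gru_step(x, h, params)
print(h_rnn.shape, h_gru.shape)  # both (4,)
```

The gating is what lets the cell pass state through many time steps largely unchanged when z is near zero, which is the mechanism the book's gated chapters (LSTM, GRU, MGU) build on.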
Holdings
Item type: E-Book
Current library: Merkez Kütüphane
Home library: Merkez Kütüphane
Collection: E-Kitap Koleksiyonu
Call number: TK7867EBK
Copy number: 1
Status: Not applicable (e-Book)
Barcode: EBK03424


Developed and installed by Devinim Yazılım Eğitim Danışmanlık by adapting the original version of Koha.