Recurrent Neural Networks [electronic resource] : From Simple to Gated Architectures / by Fathi M. Salem.
Material type:
- text
- computer
- online resource

ISBN: 9783030899295
LOC classification: TK7867
Item type | Current library | Home library | Collection | Call number | Copy number | Status | Notes | Date due | Barcode
---|---|---|---|---|---|---|---|---|---
E-Book | Merkez Kütüphane | Merkez Kütüphane | E-Kitap Koleksiyonu | TK7867EBK | 1 | Geçerli değil-e-Kitap / Not applicable-e-Book | | | EBK03424
Introduction -- 1. Network Architectures -- 2. Learning Processes -- 3. Recurrent Neural Networks (RNN) -- 4. Gated RNN: The Long Short-Term Memory (LSTM) RNN -- 5. Gated RNN: The Gated Recurrent Unit (GRU) RNN -- 6. Gated RNN: The Minimal Gated Unit (MGU) RNN.
This textbook offers a compact but comprehensive treatment, developing the analytical and design steps for recurrent neural networks from scratch. It treats general recurrent neural networks with principled training methods that yield (generalized) backpropagation through time (BPTT). The author focuses on the basics and nuances of recurrent neural networks, providing a technical and principled treatment of the subject with a view toward using coding and deep learning computational frameworks, e.g., Python and TensorFlow-Keras. Recurrent neural networks are treated holistically, from simple to gated architectures, adopting the technical machinery of adaptive non-convex optimization with dynamic constraints to leverage its systematic power in organizing the learning and training processes. This permits a flow of concepts and techniques that provides grounded support for design and training choices. The author's approach enables strategic co-training of output layers, using supervised learning, and hidden layers, using unsupervised learning, to generate more efficient internal representations and better accuracy. As a result, readers will be able to create designs that tailor proficient procedures for recurrent neural networks to their targeted applications. The book explains the intricacy and diversity of recurrent networks, from simple to more complex gated recurrent neural networks; discusses the design framing of such networks and how to redesign the simple RNN to avoid unstable behavior; and describes the forms of training of RNNs, framed as adaptive non-convex optimization with dynamic constraints.
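Since the summary notes that the treatment is oriented toward Python and TensorFlow-Keras, the sketch below is a purely illustrative example (not taken from the book) of the layer types covered by the chapters: a simple RNN versus the gated LSTM and GRU layers. The `make_model` helper, the layer sizes, and the toy binary-classification head are assumptions for demonstration only; the MGU of Chapter 6 has no built-in Keras layer and is therefore omitted.

```python
# Illustrative sketch only: simple vs. gated recurrent layers in TensorFlow-Keras.
# Shapes, units, and the output head are arbitrary placeholder choices.
import tensorflow as tf

def make_model(cell="lstm", timesteps=20, features=8, units=32):
    """Return a small sequence classifier built on a simple or gated RNN layer."""
    layer = {
        "simple": tf.keras.layers.SimpleRNN,  # simple (vanilla) RNN
        "lstm": tf.keras.layers.LSTM,         # Long Short-Term Memory RNN
        "gru": tf.keras.layers.GRU,           # Gated Recurrent Unit RNN
    }[cell]
    return tf.keras.Sequential([
        tf.keras.Input(shape=(timesteps, features)),  # (time, feature) input
        layer(units),                                  # recurrent layer under comparison
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

# Example usage: swap "simple", "lstm", or "gru" to compare architectures.
model = make_model("gru")
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```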