Protein secondary structure prediction with bidirectional recurrent neural nets: Can weight updating for each residue enhance performance?
Date: 2010
Authors:
Agathocleous, Michalis
Christodoulou, Georgia
Promponas, Vasilis J.
Christodoulou, Chris C.
Vassiliades, Vassilis
Antoniou, Antonis
ISSN: 1868-4238
Source: 6th IFIP WG 12.5 International Conference on Artificial Intelligence Applications and Innovations, AIAI 2010
Volume: 339 AICT
Pages: 128-137
Abstract
Successful protein secondary structure prediction (PSSP) is an important step towards modelling protein 3D structure, with several practical applications. Even though several PSSP algorithms have been proposed over the last four decades, prediction accuracy remains limited. The Bidirectional Recurrent Neural Network (BRNN) architecture of Baldi et al. [1] is currently considered one of the best-performing neural network architectures for this problem. In this paper, we implement the same BRNN architecture, but with a modified training procedure. More specifically, our aim is to identify the contribution of local versus global information by varying the length of the segment on which the recurrent neural networks operate for each residue position considered. The network is trained with the backpropagation learning algorithm in an online fashion, where the weights are updated after every amino acid, as opposed to Baldi et al. [1], where the weight updates are applied only after the presentation of the entire protein. Our results with a single BRNN are better than those of Baldi et al. [1] by three percentage points (Q3) and comparable to their results with an ensemble of six BRNNs. In addition, our results improve even further when the sequence-to-structure output is filtered in a post-processing step with a novel Hidden Markov Model-based approach. © 2010 IFIP.
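The methodological difference highlighted in the abstract is the weight-update schedule: per residue (online) versus per protein (one accumulated update after the whole sequence, as in Baldi et al. [1]). The toy sketch below illustrates only that contrast; it uses a simple linear softmax classifier on synthetic data as a stand-in for the BRNN, and all names and sizes are illustrative, not the authors' implementation:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def grad_step(W, x, y):
    """Softmax cross-entropy gradient dL/dW for one residue position."""
    p = softmax(W @ x)
    p[y] -= 1.0               # dL/dz for softmax + cross-entropy
    return np.outer(p, x)     # shape (n_classes, n_features)

def train_per_residue(W, protein, labels, lr=0.1):
    """Online schedule: apply a weight update after EVERY residue."""
    W = W.copy()
    for x, y in zip(protein, labels):
        W -= lr * grad_step(W, x, y)
    return W

def train_per_protein(W, protein, labels, lr=0.1):
    """Per-sequence schedule: accumulate gradients over the whole
    protein, then apply a single averaged update at the end."""
    W = W.copy()
    g = np.zeros_like(W)
    for x, y in zip(protein, labels):
        g += grad_step(W, x, y)
    return W - lr * g / len(protein)

# Synthetic "protein": 50 residue positions, 20 features each,
# 3 output classes standing in for the H/E/C secondary-structure states.
rng = np.random.default_rng(0)
n_classes, n_features, length = 3, 20, 50
W0 = np.zeros((n_classes, n_features))
protein = rng.normal(size=(length, n_features))
labels = rng.integers(0, n_classes, size=length)

W_online = train_per_residue(W0, protein, labels)
W_batch = train_per_protein(W0, protein, labels)
```

The two schedules visit the same residues but produce different weights, because the online schedule computes each gradient at the already-updated parameters, whereas the per-protein schedule computes all gradients at the same starting point.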
Related items
Showing items related by title, author, creator and subject.
-
Article
Classification capacity of a modular neural network implementing neurally inspired architecture and training rules
Poirazi, Panayiota; Neocleous, Costas K.; Pattichis, Constantinos S.; Schizas, Christos N. (2004) A three-layer neural network (NN) with a novel adaptive architecture has been developed. The hidden layer of the network consists of slabs of single neuron models, where neurons within a slab, but not between slabs, have the ...
-
Article
Samba, a Xenopus hnRNP expressed in neural and neural crest tissues
Yan, C. Y. I.; Skourides, Paris A.; Chang, Christopher C.; Brivanlou, A. (2009)RNA binding proteins regulate gene expression at the posttranscriptional level and play important roles in embryonic development. Here, we report the cloning and expression of Samba, a Xenopus hnRNP that is maternally ...
-
Article
Erratum: Dynamical neural networks that ensure exponential identification error convergence (Neural Networks (1997) 10:2 (299-314)) PII S0893608098000409
Kosmatopoulos, E. B.; Christodoulou, Manolis A.; Ioannou, Petros A. (1998)