| 2008 |
| 31 | EE | Jack Raymond, David Saad: Composite CDMA - A statistical mechanics analysis. CoRR abs/0811.2403 (2008) |
| 2007 |
| 30 | EE | Jack Raymond, David Saad: Sparsely-spread CDMA - a statistical mechanics based analysis. CoRR abs/0704.0098 (2007) |
| 29 | EE | K. Y. Michael Wong, David Saad: Minimizing Unsatisfaction in Colorful Neighborhoods. CoRR abs/0704.3835 (2007) |
| 2006 |
| 28 | EE | K. Y. Michael Wong, C. H. Yeung, David Saad: Message-Passing for Inference and Optimization of Real Variables on Sparse Graphs. ICONIP (2) 2006: 754-763 |
| 2005 |
| 27 | EE | Stéphane Bounkong, Borémi Toch, David Saad: Optimal Embedding for Watermarking in Discrete Data Spaces. Information Hiding 2005: 77-90 |
| 26 | EE | K. Y. Michael Wong, David Saad, Zhuo Gao: Message passing for task redistribution on sparse graphs. NIPS 2005 |
| 25 | EE | Juan P. Neirotti, David Saad: Improved message passing for inference in densely connected systems. CoRR abs/cs/0503070 (2005) |
| 2003 |
| 24 | EE | Stéphane Bounkong, Borémi Toch, David Saad, David Lowe: ICA for Watermarking Digital Images. Journal of Machine Learning Research 4: 1471-1498 (2003) |
| 2002 |
| 23 | EE | Stéphane Bounkong, David Saad, David Lowe: Independent Component Analysis for Domain Independent Watermarking. ICANN 2002: 510-515 |
| 2001 |
| 22 | EE | Jort van Mourik, David Saad, Yoshiyuki Kabashima: Weight vs. Magnetization Enumerator for Gallager Codes. IMA Int. Conf. 2001: 148-157 |
| 21 | EE | David Saad, Yoshiyuki Kabashima, Tatsuto Murayama, Renato Vicente: Statistical Physics of Low Density Parity Check Error Correcting Codes. IMA Int. Conf. 2001: 307-316 |
| 2000 |
| 20 | | Renato Vicente, David Saad, Yoshiyuki Kabashima: Error-correcting Codes on a Bethe-like Lattice. NIPS 2000: 322-328 |
| 1999 |
| 19 | EE | Yoshiyuki Kabashima, Tatsuto Murayama, David Saad, Renato Vicente: Regular and Irregular Gallager-type Error-Correcting Codes. NIPS 1999: 272-278 |
| 1998 |
| 18 | EE | Anthony C. C. Coolen, David Saad: Dynamics of Supervised Learning with Restricted Training Sets. NIPS 1998: 197-203 |
| 17 | EE | Yoshiyuki Kabashima, David Saad: The Belief in TAP. NIPS 1998: 246-252 |
| 1997 |
| 16 | | Magnus Rattray, David Saad: Globally Optimal On-line Learning Rules. NIPS 1997 |
| 15 | | Todd K. Leen, Bernhard Schottky, David Saad: Two Approaches to Optimal Annealing. NIPS 1997 |
| 14 | EE | Jason A. S. Freeman, David Saad: Online Learning in Radial Basis Function Networks. Neural Computation 9(7): 1601-1622 (1997) |
| 13 | EE | Barak Cohen, David Saad, Emanuel Marom: Efficient Training of Recurrent Neural Network with Time Delays. Neural Networks 10(1): 51-59 (1997) |
| 1996 |
| 12 | EE | David Saad, Sara A. Solla: Learning with Noise and Regularizers in Multilayer Neural Networks. NIPS 1996: 260-266 |
| 11 | EE | Ansgar H. L. West, David Saad, Ian T. Nabney: The Learning Dynamics of a Universal Approximator. NIPS 1996: 288-294 |
| 10 | EE | David Saad: General Gaussian Priors for Improved Generalization. Neural Networks 9(6): 937-945 (1996) |
| 9 | EE | Jason A. S. Freeman, David Saad: Radial Basis Function Networks: Generalization in Over-realizable and Unrealizable Scenarios. Neural Networks 9(9): 1521-1529 (1996) |
| 1995 |
| 8 | EE | David Saad, Sara A. Solla: Dynamics of On-Line Gradient Descent Learning for Multilayer Neural Networks. NIPS 1995: 302-308 |
| 7 | EE | Ansgar H. L. West, David Saad: Adaptive Back-Propagation in On-Line Learning of Multilayer Networks. NIPS 1995: 323-329 |
| 6 | EE | Jason A. S. Freeman, David Saad: Learning and generalization in radial basis function networks. Neural Computation 7(5): 1000-1020 (1995) |
| 1994 |
| 5 | EE | Glenn Marion, David Saad: Hyperparameters Evidence and Generalisation for an Unrealisable Rule. NIPS 1994: 255-262 |
| 4 | EE | Peter Sollich, David Saad: Learning from queries for maximum information gain in imperfectly learnable problems. NIPS 1994: 287-294 |
| 1993 |
| 3 | EE | N. Shamir, David Saad, Emanuel Marom: Neural Net Pruning Based On Functional Behavior Of Neurons. Int. J. Neural Syst. 4(2): 143-158 (1993) |
| 1992 |
| 2 | EE | David Saad: Training Recurrent Neural Networks - The Minimal Trajectory Algorithm. Int. J. Neural Syst. 3(1): 83-101 (1992) |
| 1 | EE | David Saad, R. Sasson: Examining the CHIR Algorithm Performance for Multilayer Networks and Continuous Input Vectors. Int. J. Neural Syst. 3(2): 157-165 (1992) |