2008
10. Katsuyuki Hagiwara, Kenji Fukumizu: Relation between weight size and degree of over-fitting in neural network regression. Neural Networks 21(1): 48-58 (2008)

2007
9. Katsuyuki Hagiwara: Orthogonal Shrinkage Methods for Nonparametric Regression under Gaussian Noise. ICONIP (1) 2007: 537-546

2006
8. Katsuyuki Hagiwara, Hiroshi Ishitani: On the Expected Prediction Error of Orthogonal Regression with Variable Components. IEICE Transactions 89-A(12): 3699-3709 (2006)

2002
7. Katsuyuki Hagiwara: On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario. Neural Computation 14(8): 1979-2002 (2002)
6. Katsuyuki Hagiwara: Regularization learning, early stopping and biased estimator. Neurocomputing 48(1-4): 937-955 (2002)

2001
5. Katsuyuki Hagiwara, Taichi Hayasaka, Naohiro Toda, Shiro Usui, Kazuhiro Kuno: Upper bound of the expected training error of neural network regression for a Gaussian noise sequence. Neural Networks 14(10): 1419-1429 (2001)

2000
4. Katsuyuki Hagiwara, Kazuhiro Kuno: Regularization Learning and Early Stopping in Linear Networks. IJCNN (4) 2000: 511-516
3. Katsuyuki Hagiwara, Kazuhiro Kuno, Shiro Usui: On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario. IJCNN (6) 2000: 461-466

1998
2. Katsuyuki Hagiwara, Kazuhiro Kuno, Shiro Usui: Upper Bounds on the Expected Training Errors of Neural Networks Regressions for a Gaussian Noise. ICONIP 1998: 502-505

1994
1. Qi Jia, Katsuyuki Hagiwara, Naohiro Toda, Shiro Usui: Equivalence relation between the back propagation learning process of an FNN and that of an FNNG. Neural Networks 7(2): 411- (1994)