2008
24. Shidong Li, Hidemitsu Ogawa: Optimal noise suppression: A geometric nature of pseudoframes for subspaces. Adv. Comput. Math. 28(2): 141-155 (2008)
2006
23. Marko Jankovic, Hidemitsu Ogawa: Modulated Hebb-Oja learning rule - a method for principal subspace analysis. IEEE Transactions on Neural Networks 17(2): 345-356 (2006)
22. Masashi Sugiyama, Hidemitsu Ogawa: Constructing Kernel Functions for Binary Regression. IEICE Transactions 89-D(7): 2243-2249 (2006)
2005
21. Aqeel Syed, Hidemitsu Ogawa: Optimal Sampling Operator for Signal Restoration in the Presence of Signal Space and Observation Space Noises. IEICE Transactions 88-D(12): 2828-2838 (2005)
2004
20. Marko Jankovic, Hidemitsu Ogawa: Time-oriented hierarchical method for computation of principal components using subspace learning algorithm. Int. J. Neural Syst. 14(5): 313-323 (2004)
2003
19. Ganka Petkova Kovacheva, Hidemitsu Ogawa: Radial basis function classifier for fault diagnostics. ISICT 2003: 64-69
18. Marko Jankovic, Hidemitsu Ogawa: A New Modulated Hebbian Learning Rule - Biologically Plausible Method for Local Computation of a Principal Subspace. Int. J. Neural Syst. 13(4): 215-223 (2003)
2002
17. Masashi Sugiyama, Hidemitsu Ogawa: Theoretical and Experimental Evaluation of the Subspace Information Criterion. Machine Learning 48(1-3): 25-50 (2002)
16. Masashi Sugiyama, Hidemitsu Ogawa: Optimal design of regularization term and regularization parameter by subspace information criterion. Neural Networks 15(3): 349-361 (2002)
15. Masashi Sugiyama, Hidemitsu Ogawa: A unified method for optimizing linear image restoration filters. Signal Processing 82(11): 1773-1787 (2002)
14. Hidekazu Iwaki, Hidemitsu Ogawa, Akira Hirabayashi: Optimally generalizing neural networks with the ability to recover from single stuck-at r faults. Systems and Computers in Japan 33(7): 114-123 (2002)
2001
12. Masashi Sugiyama, Hidemitsu Ogawa: Subspace Information Criterion for Model Selection. Neural Computation 13(8): 1863-1889 (2001)
11. Masashi Sugiyama, Hidemitsu Ogawa: Incremental projection learning for optimal generalization. Neural Networks 14(1): 53-66 (2001)
10. Masashi Sugiyama, Hidemitsu Ogawa: Properties of incremental projection learning. Neural Networks 14(1): 67-78 (2001)
9. Akiko Nakashima, Akira Hirabayashi, Hidemitsu Ogawa: Error correcting memorization learning for noisy training examples. Neural Networks 14(1): 79-92 (2001)
8. Akiko Nakashima, Hidemitsu Ogawa: Noise suppression in training examples for improving generalization capability. Neural Networks 14(4-5): 459-469 (2001)
7. Akira Hirabayashi, Hidemitsu Ogawa: A family of projection learnings. Systems and Computers in Japan 32(5): 21-35 (2001)
2000
13. Masashi Sugiyama, Hidemitsu Ogawa: Incremental Active Learning for Optimal Generalization. Neural Computation 12(12): 2909-2940 (2000)
6. Masashi Sugiyama, Hidemitsu Ogawa: A new information criterion for the selection of subspace models. ESANN 2000: 69-74
5. Masashi Sugiyama, Hidemitsu Ogawa: Incremental Active Learning with Bias Reduction. IJCNN (1) 2000: 15-20
1999
4. Masashi Sugiyama, Hidemitsu Ogawa: Training Data Selection for Optimal Generalization in Trigonometric Polynomial Networks. NIPS 1999: 624-630
3. Sethu Vijayakumar, Hidemitsu Ogawa: RKHS-based functional analysis for exact incremental learning. Neurocomputing 29(1-3): 85-113 (1999)
1995
2. D. Liu, Yukihiko Yamashita, Hidemitsu Ogawa: Pattern recognition in the presence of noise. Pattern Recognition 28(7): 989-995 (1995)
1969
1. Hidemitsu Ogawa, Yoshinori Isomichi: Optimum Spatial Filter and Uncertainty. Information and Control 14(2): 180-216 (1969)