2009

52. Taiji Suzuki, Masashi Sugiyama: Estimating Squared-Loss Mutual Information for Independent Component Analysis. ICA 2009: 130-137
51. Shinichi Nakajima, Masashi Sugiyama: Analysis of Variational Bayesian Matrix Factorization. PAKDD 2009: 314-326
50. Takeaki Uno, Masashi Sugiyama, Koji Tsuda: Efficient Construction of Neighborhood Graphs by the Multiple Sorting Method. CoRR abs/0904.3151 (2009)
2008

49. Hirotaka Hachiya, Takayuki Akiyama, Masashi Sugiyama, Jan Peters: Adaptive Importance Sampling with Automatic Model Selection in Value Function Approximation. AAAI 2008: 1351-1356
48. Liwei Wang, Masashi Sugiyama, Cheng Yang, Zhi-Hua Zhou, Jufu Feng: On the Margin Explanation of Boosting Algorithms. COLT 2008: 479-490
47. Masashi Sugiyama, Shinichi Nakajima: Pool-Based Agnostic Experiment Design in Linear Regression. ECML/PKDD (2) 2008: 406-422
46. Shohei Hido, Yuta Tsuboi, Hisashi Kashima, Masashi Sugiyama, Takafumi Kanamori: Inlier-Based Outlier Detection via Direct Density Ratio Estimation. ICDM 2008: 223-232
45. Akiko Takeda, Masashi Sugiyama: ν-support vector machine as conditional value-at-risk minimization. ICML 2008: 1056-1063
44. Neil Rubens, Vera Sheinman, Takenobu Tokunaga, Masashi Sugiyama: Order Retrieval. LKR 2008: 310-317
43. Takafumi Kanamori, Shohei Hido, Masashi Sugiyama: Efficient Direct Density Ratio Estimation for Non-stationarity Adaptation and Outlier Detection. NIPS 2008: 809-816
42. Masashi Sugiyama, Tsuyoshi Idé, Shinichi Nakajima, Jun Sese: Semi-Supervised Local Fisher Discriminant Analysis for Dimensionality Reduction. PAKDD 2008: 333-344
41. Yuta Tsuboi, Hisashi Kashima, Shohei Hido, Steffen Bickel, Masashi Sugiyama: Direct Density Ratio Estimation for Large-scale Covariate Shift Adaptation. SDM 2008: 443-454
40. Masashi Sugiyama, Neil Rubens: Active Learning with Model Selection in Linear Regression. SDM 2008: 518-529
39. Tsuyoshi Kato, Hisashi Kashima, Masashi Sugiyama: Integration of Multiple Networks for Robust Label Propagation. SDM 2008: 716-726
38. Masashi Sugiyama, Hirotaka Hachiya, Christopher Towell, Sethu Vijayakumar: Geodesic Gaussian kernels for value function approximation. Auton. Robots 25(3): 287-304 (2008)
37. Masashi Sugiyama, Motoaki Kawanabe, Gilles Blanchard, Klaus-Robert Müller: Approximating the Best Linear Unbiased Estimator of Non-Gaussian Signals with Gaussian Noise. IEICE Transactions 91-D(5): 1577-1580 (2008)
36. Masashi Sugiyama, Neil Rubens: A batch ensemble approach to active learning with model selection. Neural Networks 21(9): 1278-1286 (2008)
2007

35. Keisuke Yamazaki, Motoaki Kawanabe, Sumio Watanabe, Masashi Sugiyama, Klaus-Robert Müller: Asymptotic Bayesian generalization error when training and test distributions are different. ICML 2007: 1079-1086
34. Masashi Sugiyama, Hirotaka Hachiya, Christopher Towell, Sethu Vijayakumar: Value Function Approximation on Non-Linear Manifolds for Robot Motor Control. ICRA 2007: 1733-1740
33. Masashi Sugiyama, Shinichi Nakajima, Hisashi Kashima, Paul von Bünau, Motoaki Kawanabe: Direct Importance Estimation with Model Selection and Its Application to Covariate Shift Adaptation. NIPS 2007
32. Tsuyoshi Kato, Hisashi Kashima, Masashi Sugiyama, Kiyoshi Asai: Multi-Task Learning via Conic Programming. NIPS 2007
31. Neil Rubens, Masashi Sugiyama: Influence-based collaborative active learning. RecSys 2007: 145-148
30. Shun Gokita, Masashi Sugiyama, Keisuke Sakurai: Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion. IEICE Transactions 90-A(11): 2584-2592 (2007)
29. Masashi Sugiyama: Generalization Error Estimation for Non-linear Learning Methods. IEICE Transactions 90-A(7): 1496-1499 (2007)
28. Yasushi Hidaka, Masashi Sugiyama: A New Meta-Criterion for Regularized Subspace Information Criterion. IEICE Transactions 90-D(11): 1779-1786 (2007)
2006

27. Masashi Sugiyama, Benjamin Blankertz, Matthias Krauledat, Guido Dornhege, Klaus-Robert Müller: Importance-Weighted Cross-Validation for Covariate Shift. DAGM-Symposium 2006: 354-363
26. Motoaki Kawanabe, Gilles Blanchard, Masashi Sugiyama, Vladimir Spokoiny, Klaus-Robert Müller: A Novel Dimension Reduction Procedure for Searching Non-Gaussian Subspaces. ICA 2006: 149-156
25. Masashi Sugiyama: Local Fisher discriminant analysis for supervised dimensionality reduction. ICML 2006: 905-912
24. Amos J. Storkey, Masashi Sugiyama: Mixture Regression for Covariate Shift. NIPS 2006: 1337-1344
23. Akira Tanaka, Masashi Sugiyama, Hideyuki Imai, Mineichi Kudo, Masaaki Miyakoshi: Model Selection Using a Class of Kernels with an Invariant Metric. SSPR/SPR 2006: 862-870
22. Masashi Sugiyama, Keisuke Sakurai: Analytic Optimization of Shrinkage Parameters Based on Regularized Subspace Information Criterion. IEICE Transactions 89-A(8): 2216-2225 (2006)
21. Masashi Sugiyama, Hidemitsu Ogawa: Constructing Kernel Functions for Binary Regression. IEICE Transactions 89-D(7): 2243-2249 (2006)
20. Masashi Sugiyama: Active Learning in Approximately Linear Regression Based on Conditional Expectation of Generalization Error. Journal of Machine Learning Research 7: 141-166 (2006)
19. Gilles Blanchard, Motoaki Kawanabe, Masashi Sugiyama, Vladimir Spokoiny, Klaus-Robert Müller: In Search of Non-Gaussian Components of a High-Dimensional Distribution. Journal of Machine Learning Research 7: 247-282 (2006)
2005

18. Masashi Sugiyama, Klaus-Robert Müller: Model Selection Under Covariate Shift. ICANN (2) 2005: 235-240
17. Masashi Sugiyama: Active Learning for Misspecified Models. NIPS 2005
16. Gilles Blanchard, Masashi Sugiyama, Motoaki Kawanabe, Vladimir Spokoiny, Klaus-Robert Müller: Non-Gaussian Component Analysis: a Semi-parametric Framework for Linear Dimension Reduction. NIPS 2005
2004

15. Masashi Sugiyama, Motoaki Kawanabe, Klaus-Robert Müller: Regularizing generalization error estimators: a novel approach to robust model selection. ESANN 2004: 163-168
14. Masashi Sugiyama: Estimating the error at given test input points for linear regression. Neural Networks and Computational Intelligence 2004: 113-118
13. Masashi Sugiyama, Motoaki Kawanabe, Klaus-Robert Müller: Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression. Neural Computation 16(5): 1077-1104 (2004)
2002

12. Masashi Sugiyama, Klaus-Robert Müller: Selecting Ridge Parameters in Infinite Dimensional Hypothesis Spaces. ICANN 2002: 528-534
11. Masashi Sugiyama, Klaus-Robert Müller: The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces. Journal of Machine Learning Research 3: 323-359 (2002)
10. Masashi Sugiyama, Hidemitsu Ogawa: Theoretical and Experimental Evaluation of the Subspace Information Criterion. Machine Learning 48(1-3): 25-50 (2002)
9. Masashi Sugiyama, Hidemitsu Ogawa: Optimal design of regularization term and regularization parameter by subspace information criterion. Neural Networks 15(3): 349-361 (2002)
8. Masashi Sugiyama, Hidemitsu Ogawa: A unified method for optimizing linear image restoration filters. Signal Processing 82(11): 1773-1787 (2002)
2001

7. Masashi Sugiyama, Hidemitsu Ogawa: Incremental Active Learning for Optimal Generalization. Neural Computation 12(12): 2909-2940 (2001)
6. Masashi Sugiyama, Hidemitsu Ogawa: Subspace Information Criterion for Model Selection. Neural Computation 13(8): 1863-1889 (2001)
5. Masashi Sugiyama, Hidemitsu Ogawa: Incremental projection learning for optimal generalization. Neural Networks 14(1): 53-66 (2001)
4. Masashi Sugiyama, Hidemitsu Ogawa: Properties of incremental projection learning. Neural Networks 14(1): 67-78 (2001)
2000

3. Masashi Sugiyama, Hidemitsu Ogawa: A new information criterion for the selection of subspace models. ESANN 2000: 69-74
2. Masashi Sugiyama, Hidemitsu Ogawa: Incremental Active Learning with Bias Reduction. IJCNN (1) 2000: 15-20
1999

1. Masashi Sugiyama, Hidemitsu Ogawa: Training Data Selection for Optimal Generalization in Trigonometric Polynomial Networks. NIPS 1999: 624-630