ImageNet Large Scale Visual Recognition Competition 2010 (ILSVRC2010)

Submissions ranked by flat cost

team name | team members | filename | flat cost | hierarchical cost | description
NEC-UIUC | NEC: Yuanqing Lin, Fengjun Lv, Shenghuo Zhu, Ming Yang, Timothee Cour, Kai Yu; UIUC: LiangLiang Cao, Zhen Li, Min-Hsuan Tsai, Xi Zhou, Thomas Huang; Rutgers: Tong Zhang | flat_opt.txt | 0.28191 | 2.1144 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for top-5 hit rate
NEC-UIUC | (as above) | hierarchical.txt | 0.28192 | 2.1107 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for hierarchical cost
NEC-UIUC | (as above) | flat_rerank.txt | 0.28299 | 2.1258 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for top-5 hit rate with re-ranking
NEC-UIUC | (as above) | flat_avg.txt | 0.28782 | 2.164 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM
XRCE | Jorge Sanchez, Florent Perronnin, Thomas Mensink (all XRCE) | xrce_res_18aug.txt | 0.33649 | 2.5553 | system based on the Fisher kernel framework as described in F. Perronnin, J. Sanchez and T. Mensink, "Improving the Fisher Kernel for Large-Scale Image Classification", ECCV 2010
NEC-UIUC | (as above) | flat_single.txt | 0.34579 | 2.5986 | SIFT and LBP features with one non-linear coding representation and a stochastic SVM
Intelligent Systems and Informatics Lab., The University of Tokyo | Tatsuya Harada, Hideki Nakayama, Yoshitaka Ushiku, Yuya Yamashita, Jun Imura, Yasuo Kuniyoshi (all The Univ. of Tokyo) | result_hie2_sift_ssim_surf_phog_rgb_hlac_ccd025_250.txt | 0.44558 | 3.6536 | image features: BoF (SIFT and dense SURF), SSIM, PHOG, RGB histogram, and HLAC; label feature: a 1676-dimensional binary vector where the elements of the label featu…
UCI | Hamed Pirsiavash, Deva Ramanan, Charless Fowlkes | test400.pred.txt | 0.46624 | 3.6288 | dense SIFT + color features + grid-based bag of features + approximated chi^2 kernel SVM (training uses only the first 400 samples per class)
hminmax | Jim Mutch, Sharat Chikkerur, Hristo Paskov, Ruslan Salakhutdinov, Stan Bileschi, Hueihan Jhuang (MIT) | submit_sbow_gist_color_bestc101_cat_plus_sbow_liblinear.txt | 0.54433 | 4.4359 | all features concatenated and combined with sbow (liblinear) using PoE; validation 0.76
hminmax | (as above) | russ_test.txt | 0.57109 | 4.8581 | scores combined using logistic regression (russ); validation 0.7684
NTU_WZX | Zhengxiang Wang, Liang-Tien (Clement) Chia (CeMNet, SCE, NTU, Singapore) | test_LI2C_Tr100_Triplet100.txt | 0.58313 | 4.9933 | the provided raw SIFT features as input, with the LI2C algorithm learning the weight associated with each feature (submitted to the PR journal)
hminmax | (as above) | submit_sbow_gist_color_bestc101_cat.txt | 0.58678 | 4.8035 | all features concatenated; validation 0.8
hminmax | (as above) | submit_sbow_gist_color_bestc101_poe.txt | 0.59085 | 4.9916 | scores combined using PoE; validation 0.8
LIG | Georges Quénot (LIG) | LIG_knn_opp_sift_color_texture_rerank.top | 0.60723 | 5.0544 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, with re-ranking, optimized for the hierarchical measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_rerank.top | 0.60723 | 5.0544 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, with re-ranking, optimized for the flat measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_hierarchical.top | 0.62325 | 5.3676 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, optimized for the hierarchical measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_flat.top | 0.62571 | 5.4182 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, optimized for the flat measure
LIG | (as above) | LIG_knn_color_texture_flat.top | 0.69458 | 6.1162 | KNN with color histogram and Gabor texture, optimized for the flat measure
LIG | (as above) | LIG_knn_color_texture_hierarchical.top | 0.69843 | 6.0582 | KNN with color histogram and Gabor texture, optimized for the hierarchical measure
ibm-ensemble | Lexing Xie, IBM Research; Hua Ouyang, Georgia Tech; Apostol Natsev, IBM Research | f18_0p9_k20.txt | 0.70093 | 6.0742 | fast KNN ensemble 20
ibm-ensemble | (as above) | f18_0p9_k15.txt | 0.71411 | 6.2223 | fast KNN ensemble 15
hminmax | (as above) | sbow_liblinear.txt | 0.73079 | 6.1884 | baseline liblinear using 600k examples
National Institute of Informatics | Cai-Zhi Zhu (National Institute of Informatics, Tokyo, Japan), Xiao Zhou (Hefei Normal Univ., Hefei, China), Shin'ichi Satoh (National Institute of Informatics, Tokyo, Japan) | flat.caizhizhu.nii.txt | 0.74165 | 6.4535 | color LBP features, flat class representation, canonical correlation analysis, random walk ranking
LIG | (as above) | LIG_knn_opp_sift_flat.top | 0.74425 | 6.9932 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, optimized for the flat measure
Regularities | Omid Madani, Brian Burns (SRI International) | labels.test.aggregation.inImageNet.txt | 0.75067 | 6.6738 | similar to file 1, but an aggregation of several models from different passes (so may give better accuracy within 5)
Regularities | (as above) | labels.test.pass6.boost001marg004maxl400.inImageNet.txt | 0.75411 | 6.9017 | a flat (linear) sparse model (online index learning, after 6 passes) with induced features (built on top of the provided 1000 discretized SIFT features)
Regularities | (as above) | labels.test.pass6.boost001marg004maxl400.inImageNet.txt | 0.75411 | 6.9017 | same as file 1
LIG | (as above) | LIG_knn_opp_sift_hierarchical.top | 0.77154 | 6.9959 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, optimized for the hierarchical measure
National Institute of Informatics | (as above) | cost.caizhizhu.nii.txt | 0.78659 | 6.8725 | color LBP features, canonical correlation analysis, random walk ranking
Regularities | (as above) | labels.test.pass3.s4.inImageNet.txt | 0.80295 | 6.9414 | similar to file 1 except at pass 3, with smaller outdegree (fan-out) and higher learning rate (so the performance should be inferior)
ITNLP_HIT | Deyuan Zhang, Wenfeng Xuan, Xiaolong Wang, Bingquan Liu, Chengjie Sun (all HIT) | predict_label_2.txt | 0.9883 | 10.345 | resize the image to 160 pixels keeping the aspect ratio, extract geometric blur and SIFT features, reduce feature dimensionality with a neural network, naive Bayes with the co-occurrence of the descriptors
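The flat cost above is the top-5 error rate: each submission gives five guesses per test image, and an image counts as an error only if none of the five matches the ground-truth label. A minimal sketch of that computation (function and variable names are illustrative, not taken from the evaluation kit):

```python
def flat_cost(predictions, ground_truth):
    """predictions: one list of 5 guessed labels per image.
    ground_truth: the true label per image.
    Returns the fraction of images whose true label is missing
    from all five guesses (the top-5 error rate)."""
    errors = sum(1 for guesses, truth in zip(predictions, ground_truth)
                 if truth not in guesses)
    return errors / len(ground_truth)

# Toy example: the third image's true label (1) is not among its guesses,
# so 1 error out of 3 images.
preds = [[3, 7, 1, 9, 2], [5, 5, 5, 5, 5], [8, 0, 4, 6, 2]]
truth = [7, 5, 1]
print(flat_cost(preds, truth))  # ≈ 0.333
```

Ranking by this number rewards any hit in the top five equally, which is why several teams submitted runs "optimized for top-5 hit rate" separately from their hierarchical-cost runs.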

Submissions ranked by hierarchical cost

team name | team members | filename | flat cost | hierarchical cost | description
NEC-UIUC | NEC: Yuanqing Lin, Fengjun Lv, Shenghuo Zhu, Ming Yang, Timothee Cour, Kai Yu; UIUC: LiangLiang Cao, Zhen Li, Min-Hsuan Tsai, Xi Zhou, Thomas Huang; Rutgers: Tong Zhang | hierarchical.txt | 0.28192 | 2.1107 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for hierarchical cost
NEC-UIUC | (as above) | flat_opt.txt | 0.28191 | 2.1144 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for top-5 hit rate
NEC-UIUC | (as above) | flat_rerank.txt | 0.28299 | 2.1258 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM, optimized for top-5 hit rate with re-ranking
NEC-UIUC | (as above) | flat_avg.txt | 0.28782 | 2.164 | SIFT and LBP features with two non-linear coding representations and a stochastic SVM
XRCE | Jorge Sanchez, Florent Perronnin, Thomas Mensink (all XRCE) | xrce_res_18aug.txt | 0.33649 | 2.5553 | system based on the Fisher kernel framework as described in F. Perronnin, J. Sanchez and T. Mensink, "Improving the Fisher Kernel for Large-Scale Image Classification", ECCV 2010
NEC-UIUC | (as above) | flat_single.txt | 0.34579 | 2.5986 | SIFT and LBP features with one non-linear coding representation and a stochastic SVM
UCI | Hamed Pirsiavash, Deva Ramanan, Charless Fowlkes | test400.pred.txt | 0.46624 | 3.6288 | dense SIFT + color features + grid-based bag of features + approximated chi^2 kernel SVM (training uses only the first 400 samples per class)
Intelligent Systems and Informatics Lab., The University of Tokyo | Tatsuya Harada, Hideki Nakayama, Yoshitaka Ushiku, Yuya Yamashita, Jun Imura, Yasuo Kuniyoshi (all The Univ. of Tokyo) | result_hie2_sift_ssim_surf_phog_rgb_hlac_ccd025_250.txt | 0.44558 | 3.6536 | image features: BoF (SIFT and dense SURF), SSIM, PHOG, RGB histogram, and HLAC; label feature: a 1676-dimensional binary vector where the elements of the label featu…
hminmax | Jim Mutch, Sharat Chikkerur, Hristo Paskov, Ruslan Salakhutdinov, Stan Bileschi, Hueihan Jhuang (MIT) | submit_sbow_gist_color_bestc101_cat_plus_sbow_liblinear.txt | 0.54433 | 4.4359 | all features concatenated and combined with sbow (liblinear) using PoE; validation 0.76
hminmax | (as above) | submit_sbow_gist_color_bestc101_cat.txt | 0.58678 | 4.8035 | all features concatenated; validation 0.8
hminmax | (as above) | russ_test.txt | 0.57109 | 4.8581 | scores combined using logistic regression (russ); validation 0.7684
hminmax | (as above) | submit_sbow_gist_color_bestc101_poe.txt | 0.59085 | 4.9916 | scores combined using PoE; validation 0.8
NTU_WZX | Zhengxiang Wang, Liang-Tien (Clement) Chia (CeMNet, SCE, NTU, Singapore) | test_LI2C_Tr100_Triplet100.txt | 0.58313 | 4.9933 | the provided raw SIFT features as input, with the LI2C algorithm learning the weight associated with each feature (submitted to the PR journal)
LIG | Georges Quénot (LIG) | LIG_knn_opp_sift_color_texture_rerank.top | 0.60723 | 5.0544 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, with re-ranking, optimized for the hierarchical measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_rerank.top | 0.60723 | 5.0544 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, with re-ranking, optimized for the flat measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_hierarchical.top | 0.62325 | 5.3676 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, optimized for the hierarchical measure
LIG | (as above) | LIG_knn_opp_sift_color_texture_flat.top | 0.62571 | 5.4182 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, color histogram, and Gabor texture, optimized for the flat measure
LIG | (as above) | LIG_knn_color_texture_hierarchical.top | 0.69843 | 6.0582 | KNN with color histogram and Gabor texture, optimized for the hierarchical measure
ibm-ensemble | Lexing Xie, IBM Research; Hua Ouyang, Georgia Tech; Apostol Natsev, IBM Research | f18_0p9_k20.txt | 0.70093 | 6.0742 | fast KNN ensemble 20
LIG | (as above) | LIG_knn_color_texture_flat.top | 0.69458 | 6.1162 | KNN with color histogram and Gabor texture, optimized for the flat measure
hminmax | (as above) | sbow_liblinear.txt | 0.73079 | 6.1884 | baseline liblinear using 600k examples
ibm-ensemble | (as above) | f18_0p9_k15.txt | 0.71411 | 6.2223 | fast KNN ensemble 15
National Institute of Informatics | Cai-Zhi Zhu (National Institute of Informatics, Tokyo, Japan), Xiao Zhou (Hefei Normal Univ., Hefei, China), Shin'ichi Satoh (National Institute of Informatics, Tokyo, Japan) | flat.caizhizhu.nii.txt | 0.74165 | 6.4535 | color LBP features, flat class representation, canonical correlation analysis, random walk ranking
Regularities | Omid Madani, Brian Burns (SRI International) | labels.test.aggregation.inImageNet.txt | 0.75067 | 6.6738 | similar to file 1, but an aggregation of several models from different passes (so may give better accuracy within 5)
National Institute of Informatics | (as above) | cost.caizhizhu.nii.txt | 0.78659 | 6.8725 | color LBP features, canonical correlation analysis, random walk ranking
Regularities | (as above) | labels.test.pass6.boost001marg004maxl400.inImageNet.txt | 0.75411 | 6.9017 | a flat (linear) sparse model (online index learning, after 6 passes) with induced features (built on top of the provided 1000 discretized SIFT features)
Regularities | (as above) | labels.test.pass6.boost001marg004maxl400.inImageNet.txt | 0.75411 | 6.9017 | same as file 1
Regularities | (as above) | labels.test.pass3.s4.inImageNet.txt | 0.80295 | 6.9414 | similar to file 1 except at pass 3, with smaller outdegree (fan-out) and higher learning rate (so the performance should be inferior)
LIG | (as above) | LIG_knn_opp_sift_flat.top | 0.74425 | 6.9932 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, optimized for the flat measure
LIG | (as above) | LIG_knn_opp_sift_hierarchical.top | 0.77154 | 6.9959 | fusion of KNNs over dense and Harris-Laplace-filtered opponent SIFTs, optimized for the hierarchical measure
ITNLP_HIT | Deyuan Zhang, Wenfeng Xuan, Xiaolong Wang, Bingquan Liu, Chengjie Sun (all HIT) | predict_label_2.txt | 0.9883 | 10.345 | resize the image to 160 pixels keeping the aspect ratio, extract geometric blur and SIFT features, reduce feature dimensionality with a neural network, naive Bayes with the co-occurrence of the descriptors
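The hierarchical cost behind this second ranking penalizes a wrong guess by how far up the WordNet-derived class hierarchy one must go to relate it to the true class: the height of their lowest common ancestor (0 for an exact hit), taking the minimum over the five guesses and averaging over images. A toy sketch with an illustrative three-level hierarchy (the `children` map is made up for demonstration; the real metric walks the ILSVRC synset tree):

```python
# Toy class hierarchy (illustrative, not WordNet).
children = {
    "root": ["animal", "vehicle"],
    "animal": ["dog", "cat"],
    "vehicle": ["car", "truck"],
}
parent = {c: p for p, cs in children.items() for c in cs}

def height(node):
    """Height of a node: longest downward path to a leaf (leaves are 0)."""
    kids = children.get(node, [])
    return 0 if not kids else 1 + max(height(k) for k in kids)

def lca(a, b):
    """Lowest common ancestor of two classes, found via ancestor sets."""
    seen = set()
    while a is not None:
        seen.add(a)
        a = parent.get(a)
    while b not in seen:
        b = parent[b]
    return b

def hier_cost(guesses, truth):
    """Per-image cost: the cheapest of the guesses, where an exact hit
    costs 0 and a miss costs the height of the LCA with the true class."""
    return min(0 if g == truth else height(lca(g, truth)) for g in guesses)

# Mistaking "cat" for "dog" stays within "animal" (height 1), while
# "car" forces us up to "root" (height 2); the minimum is reported.
print(hier_cost(["dog", "car"], "cat"))  # 1
```

Under this measure, confusing two dog breeds is cheap while confusing a dog with a car is expensive, which is why several teams (NEC-UIUC, LIG, NII) submitted runs optimized separately for the flat and hierarchical measures.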