The tables report the results achieved by each team for Task 1A. The Annealing Time measures the execution time of the approach; in the case of Quantum Annealing, it comprises the programming time, the sampling time, and the post-processing time. In the Type column, S denotes Simulated Annealing, Q Quantum Annealing, and H a hybrid solver.
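For reference, ndcg@10 rewards rankings that place highly relevant documents near the top of the result list. The following is a minimal sketch of the metric using made-up relevance grades, not the official evaluation script:

```python
import math

def dcg_at_k(rels, k=10):
    # Discounted Cumulative Gain over the top-k results,
    # with the common (2^rel - 1) gain and log2 position discount.
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(rels[:k]))

def ndcg_at_k(rels, k=10):
    # Normalise by the DCG of the ideal (relevance-sorted) ranking,
    # so the score lies in [0, 1] and 1 means a perfect ordering.
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0

# Hypothetical relevance grades of the top-ranked documents for one query.
ranking = [2, 1, 0, 1, 2, 0, 0, 1, 0, 0]
print(round(ndcg_at_k(ranking), 4))
```

A feature-selection run that keeps fewer features but preserves this score is therefore doing useful work, which is what the tables below measure.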
MQ2007

| Team | Submission id | ndcg@10 | Annealing Time (µs) | Type | n° features |
|---|---|---|---|---|---|
| DS@GT qClef | 1A_MQ2007_SA_ds-at-gt-qclef_pfi-k-25-cmi | 0.4500 | 2218797 | S | 25 |
| DS@GT qClef | 1A_MQ2007_SA_ds-at-gt-qclef_pfi-k-20-cpfi | 0.4318 | 2184864 | S | 20 |
| DS@GT qClef | 1A_MQ2007_SA_ds-at-gt-qclef_mi-k-25 | 0.4510 | 2213641 | S | 25 |
| DS@GT qClef | 1A_MQ2007_SA_ds-at-gt-qclef_mi-k-15 | 0.4485 | 2136012 | S | 15 |
| DS@GT qClef | 1A_MQ2007_SA_ds-at-gt-qclef_pfi-k-30-cmi | 0.4523 | 2156998 | S | 30 |
| DS@GT qClef | 1A_MQ2007_QA_ds-at-gt-qclef_mi-1 | 0.4436 | 182844 | Q | 15 |
| DS@GT qClef | 1A_MQ2007_QA_ds-at-gt-qclef_mi-2 | 0.4552 | 159783 | Q | 13 |
| FAST-NU | MQ2007_SA_FAST-NU_SA-2918 | 0.4212 | 4072677 | S | 15 |
| FAST-NU | 1A_MQ2007_SA_FAST-NU_SA-2915 | 0.3358 | 4163986 | S | 15 |
| FAST-NU | 1A_MQ2007_QA_FAST-NU_ae194be3-5267-45dd-aa0e-36a58579d719 | 0.4311 | 338763 | Q | 15 |
| FAST-NU | 1A_MQ2007_QA_FAST-NU_26065450-e42a-4d92-bfb9-ff367d132142 | 0.4409 | 286966 | Q | 15 |
| FAST-NU | 1A_MQ2007_QA_FAST-NU_1bba5207-9919-4048-b4a0-80f89b03f603 | 0.4375 | 274721 | Q | 15 |
| SINAI-UJA | response_k21_nr3000 | 0.4530 | 3448310 | S | 21 |
| SINAI-UJA | response_k23_nr3000 | 0.4478 | 6631729 | S | 23 |
| SINAI-UJA | response_k25_nr3000 | 0.4510 | 2998083 | S | 25 |
| SINAI-UJA | response_k27_nr3000 | 0.4438 | 6636607 | S | 27 |
| SINAI-UJA | response_k29_nr3000 | 0.4491 | 6614343 | S | 29 |
| SINAI-UJA | response_k21_nr100 | 0.4580 | 33508 | Q | 21 |
| SINAI-UJA | response_k23_nr100 | 0.4437 | 36553 | Q | 23 |
| SINAI-UJA | response_k25_nr100 | 0.4550 | 30580 | Q | 25 |
| SINAI-UJA | response_k27_nr100 | 0.4425 | 34324 | Q | 27 |
| SINAI-UJA | response_k29_nr100 | 0.4528 | 34259 | Q | 29 |
| BASELINE | RFE_BASELINE_HALF | 0.4450 | - | - | 23 |
| BASELINE | BASELINE_ALL | 0.4473 | - | - | 46 |
Istella

| Team | Submission id | ndcg@10 | Annealing Time (µs) | Type | n° features |
|---|---|---|---|---|---|
| DS@GT qClef | 1A_Istella_SA_ds-at-gt-qclef_mi_25 | 0.6025 | 126630661 | S | 25 |
| DS@GT qClef | 1A_Istella_SA_ds-at-gt-qclef_mi_30 | 0.5104 | 133814280 | S | 30 |
| DS@GT qClef | 1A_Istella_SA_ds-at-gt-qclef_mi_50 | 0.6524 | 159964262 | S | 50 |
| DS@GT qClef | 1A_Istella_SA_ds-at-gt-qclef_mi_60 | 0.6682 | 173222227 | S | 60 |
| DS@GT qClef | 1A_Istella_SA_ds-at-gt-qclef_mi_70 | 0.6523 | 184046761 | S | 70 |
| DS@GT qClef | 1A_Istella_QA_ds-at-gt-qclef_mi_50 | 0.5586 | 9987158 | H | 50 |
| BASELINE | RFE_BASELINE_HALF | 0.5560 | - | - | 110 |
| BASELINE | BASELINE_ALL | 0.7146 | - | - | 220 |
The tables report the results achieved by each team for Task 1B. The Annealing Time measures the execution time of the approach; in the case of Quantum Annealing, it comprises the programming time, the sampling time, and the post-processing time.
ICM_100

| Team | Submission id | ndcg@10 | Annealing Time (µs) | Type | n° features |
|---|---|---|---|---|---|
| Malto | 1B_100_ICM_SA_MALTO_1B - 100_ICM submission | 0.0207 | 6148929 | S | 51 |
| BASELINE | ALL_FEATURES | 0.0226 | - | - | 100 |
ICM_400

| Team | Submission id | ndcg@10 | Annealing Time (µs) | Type | n° features |
|---|---|---|---|---|---|
| Malto | 1B_400_ICM_SA_MALTO_1B - 400_ICM submission - 200 | 0.0294 | 80780716 | S | 200 |
| Malto | 1B_400_ICM_SA_MALTO_1B - 400_ICM submission | 0.0182 | 70269351 | S | 53 |
| BASELINE | ALL_FEATURES | 0.0328 | - | - | 400 |
The tables report the results achieved by each team for Task 2, averaged over the 5 folds. The Annealing Time measures the execution time of the approach; in the case of Quantum Annealing, it comprises the programming time, the sampling time, and the post-processing time. Asterisks (*) mark submissions for which not all 5 folds were provided.
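The Macro F1 values below average the per-class F1 scores with equal weight per class, and each cell appears to follow the mean(standard deviation) convention over the folds. A minimal sketch, with made-up fold scores and an illustrative (not official) implementation of the metric:

```python
from statistics import mean, stdev

def macro_f1(y_true, y_pred, labels):
    # Macro F1: per-class F1 scores averaged with equal weight per class,
    # so rare classes count as much as frequent ones.
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return mean(f1s)

# Hypothetical per-fold scores, formatted the same way as the table cells.
fold_scores = [99.2, 99.5, 99.6, 99.3, 99.4]
print(f"{mean(fold_scores):.1f}({stdev(fold_scores):.1f})")  # 99.4(0.2)
```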
Yelp

| Group | Submission id | Macro F1 Avg | Avg Reduction | Avg Fine-Tuning + Prediction Time (s) | Avg Annealing Time (µs) | Type |
|---|---|---|---|---|---|---|
| DS@GT qClef | Yelp_SA_qclef_bcos_075 | 99.5(0.2) | 0.25 | 1548.5(2.8) | 25996916 | S |
| DS@GT qClef | Yelp_SA_qclef_it_del_075 | 99.3(0.3) | 0.25 | 1549.2(1.5) | 25784129 | S |
| DS@GT qClef | Yelp_SA_qclef_svc_075 | 99.3(0.4) | 0.25 | 1550.5(2.6) | 25916782 | S |
| DS@GT qClef | Yelp_QA_qclef_bcos | 99.4(0.2) | 0.274 | 1500(54.7) | 1767195 | Q |
| GPLSI | Yelp_SA_gplsi_2-SentimentPairs(docs=just-final,reads=2000,limit=True) | 90.8(5.7) | 0.963 | 170.8(3.8) | 35810108 | S |
| GPLSI | Yelp_SA_gplsi_2-SentimentPairs(docs=pair-related,reads=2000,limit=True) | 99.2(0.3) | 0.627 | 822.2(395) | 35810108 | S |
| GPLSI | Yelp_SA_gplsi_2-LocalSets | 99.4(0.2) | 0.512 | 1045.5(5.3) | 28788950 | S |
| GPLSI | Yelp_SA_gplsi_2-SentimentKmeansCard | 98.5(1.1) | 0.875 | 338.8(21) | 17823652 | S |
| GPLSI | Yelp_SA_gplsi_2-emoconflictCard | 98.6(0.5) | 0.728 | 628.2(65.9) | 34024297 | S |
| GPLSI | Yelp_QA_gplsi_2-SentimentKmeansCard | 98.7(0.2) | 0.869 | 351(25.1) | 553306 | Q |
| GPLSI | Yelp_QA_gplsi_2-emoconflictCard | 98.8(0.6) | 0.702 | 678.8(80.9) | 549364 | Q |
| Malto | Yelp_SA_MALTO_2 - vader_nyt_2L_0 | 99.2(0.2) | 0.751 | 582(2) | 142948641 | S |
| BASELINE | BASELINE_ALL | 99.4(0.1) | - | 2027.1(1.1) | - | - |
Vader

| Group | Submission id | Macro F1 Avg | Avg Reduction | Avg Fine-Tuning + Prediction Time (s) | Avg Annealing Time (µs) | Type |
|---|---|---|---|---|---|---|
| DS@GT qClef | Vader_SA_qclef_combined_075 | 65.9(4.7) | 0.25 | 1529.4(3) | 25299992 | S |
| DS@GT qClef | Vader_SA_qclef_bcos_075 | 62.5(10.4) | 0.25 | 1528.6(2.2) | 25735414 | S |
| DS@GT qClef | Vader_SA_qclef_it_del_075 | 65.6(3) | 0.25 | 1529.5(2.3) | 25347707 | S |
| DS@GT qClef | Vader_SA_qclef_svc_075 | 65.4(7.1) | 0.25 | 1529(2.4) | 25530256 | S |
| DS@GT qClef | Vader_QA_qclef_bcos | 62.6(7.5) | 0.283 | 1493.3(83) | 1873818 | Q |
| GPLSI | Vader_SA_gplsi_2-SentimentPairs-docs=just-final,reads=2000,limit=True- | 47.4(5.4) | 0.962 | 172.8(5.7) | 42407652 | S |
| GPLSI | Vader_SA_gplsi_2-SentimentPairs-docs=pair-related,reads=2000,limit=True- | 62.2(4.1) | 0.7 | 671.8(352.8) | 42407652 | S |
| GPLSI | Vader_QA_gplsi_2-SentimentPairs-docs=just-final,reads=2000,limit=True- | 50(64)* | 0.835* | 172.9(26.9)* | 545303* | Q |
| GPLSI | Vader_QA_gplsi_2-SentimentPairs(docs=pair-related,reads=2000,limit=True) | 62.1(1.8)* | 0.658* | 750.7(2653.2)* | 545303* | Q |
| GPLSI | Vader_SA_gplsi_2-LocalSets | 63.3(4.9) | 0.505 | 1048.3(6.7) | 29109748 | S |
| Malto | Vader_SA_MALTO_2 - vader_nyt_2L_0 | 63.1(2.5) | 0.751 | 574.5(1.7) | 126087244 | S |
| BASELINE | BASELINE_ALL | 88.9(0.8) | - | 1997.3(5.7) | - | - |
The table reports the results achieved by each team for Task 3. The Annealing Time measures the execution time of the approach; in the case of Quantum Annealing, it comprises the programming time, the sampling time, and the post-processing time.
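The DBI column is presumably the Davies-Bouldin Index, a cluster-quality measure for which lower values indicate more compact, better-separated clusters. A minimal sketch of its computation on toy 2-D data (the points and labels are made up):

```python
import math

def davies_bouldin(points, labels):
    # Davies-Bouldin Index: for each cluster, take the worst-case ratio
    # (scatter_i + scatter_j) / distance(centroid_i, centroid_j),
    # then average over clusters; lower values mean better separation.
    clusters = {}
    for p, l in zip(points, labels):
        clusters.setdefault(l, []).append(p)
    centroids = {l: tuple(sum(coord) / len(ps) for coord in zip(*ps))
                 for l, ps in clusters.items()}
    scatter = {l: sum(math.dist(p, centroids[l]) for p in ps) / len(ps)
               for l, ps in clusters.items()}
    total = 0.0
    for i in clusters:
        total += max((scatter[i] + scatter[j]) /
                     math.dist(centroids[i], centroids[j])
                     for j in clusters if j != i)
    return total / len(clusters)

# Two hypothetical, well-separated 2-D clusters should yield a low DBI.
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(davies_bouldin(points, [0, 0, 0, 1, 1, 1]))
```

Unlike ndcg@10, a smaller DBI is better, so the two columns below should be read with opposite orientations.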
| N° centroids | Team | Submission id | ndcg@10 | DBI | Annealing Time (µs) | Type |
|---|---|---|---|---|---|---|
| 10 | GPLSI | 10_SA_gplsi_3-FPS-Medoids | 0.5783 | 7.5147 | 15374699 | S |
| 10 | GPLSI | 10_SA_gplsi_3-SubMedoidsQUBO | 0.5579 | 6.8779 | 15304643 | S |
| 10 | GPLSI | 10_SA_gplsi_CLARA-CLARANS | 0.5444 | 6.6710 | 15395337 | S |
| 10 | GPLSI | 10_SA_gplsi_MBK-Medoids | 0.5600 | 6.4258 | 15510116 | S |
| 10 | DS@GT qClef | 10_SA_ds-at-gt-qclef_1 | 0.5800 | 7.4776 | 83073 | S |
| 10 | DS@GT qClef | 10_SA_ds-at-gt-qclef_2 | 0.0172 | 4.4706 | 82843 | S |
| 10 | BASELINE | BASELINE_10 | 0.5509 | 7.9892 | - | - |
| 25 | GPLSI | 25_SA_gplsi_3-FPS-Medoids | 0.5475 | 5.5577 | 20875484 | S |
| 25 | GPLSI | 25_SA_gplsi_3-SubMedoidsQUBO | 0.5298 | 5.6255 | 40686713 | S |
| 25 | GPLSI | 25_SA_gplsi_CLARA-CLARANS | 0.5310 | 5.6507 | 20531723 | S |
| 25 | GPLSI | 25_SA_gplsi_MBK-Medoids | 0.5193 | 5.3755 | 20757746 | S |
| 25 | BASELINE | BASELINE_25 | 0.5284 | 6.1201 | - | - |
| 50 | GPLSI | 50_SA_gplsi_3-FPS-Medoids | 0.5592 | 4.4531 | 9869029 | S |
| 50 | GPLSI | 50_SA_gplsi_3-SubMedoidsQUBO | 0.5148 | 4.9325 | 23718874 | S |
| 50 | GPLSI | 50_SA_gplsi_CLARA-CLARANS | 0.5017 | 5.1703 | 9976090 | S |
| 50 | GPLSI | 50_SA_gplsi_MBK-Medoids | 0.5383 | 4.5025 | 24003792 | S |
| 50 | DS@GT qClef | 50_SA_ds-at-gt-qclef_3 | 0.0064 | 3.4217 | 228376 | S |
| 50 | BASELINE | BASELINE_50 | 0.4656 | 5.3679 | - | - |