Commit af4fe336 by Tooba Mukhtar

adding datasets

parent 81199142
"Sepal.Length","Sepal.Width","Petal.Length","Petal.Width","Species"
5.1,3.5,1.4,0.2,"setosa"
4.9,3,1.4,0.2,"setosa"
4.7,3.2,1.3,0.2,"setosa"
4.6,3.1,1.5,0.2,"setosa"
5,3.6,1.4,0.2,"setosa"
5.4,3.9,1.7,0.4,"setosa"
4.6,3.4,1.4,0.3,"setosa"
5,3.4,1.5,0.2,"setosa"
4.4,2.9,1.4,0.2,"setosa"
4.9,3.1,1.5,0.1,"setosa"
5.4,3.7,1.5,0.2,"setosa"
4.8,3.4,1.6,0.2,"setosa"
4.8,3,1.4,0.1,"setosa"
4.3,3,1.1,0.1,"setosa"
5.8,4,1.2,0.2,"setosa"
5.7,4.4,1.5,0.4,"setosa"
5.4,3.9,1.3,0.4,"setosa"
5.1,3.5,1.4,0.3,"setosa"
5.7,3.8,1.7,0.3,"setosa"
5.1,3.8,1.5,0.3,"setosa"
5.4,3.4,1.7,0.2,"setosa"
5.1,3.7,1.5,0.4,"setosa"
4.6,3.6,1,0.2,"setosa"
5.1,3.3,1.7,0.5,"setosa"
4.8,3.4,1.9,0.2,"setosa"
5,3,1.6,0.2,"setosa"
5,3.4,1.6,0.4,"setosa"
5.2,3.5,1.5,0.2,"setosa"
5.2,3.4,1.4,0.2,"setosa"
4.7,3.2,1.6,0.2,"setosa"
4.8,3.1,1.6,0.2,"setosa"
5.4,3.4,1.5,0.4,"setosa"
5.2,4.1,1.5,0.1,"setosa"
5.5,4.2,1.4,0.2,"setosa"
4.9,3.1,1.5,0.2,"setosa"
5,3.2,1.2,0.2,"setosa"
5.5,3.5,1.3,0.2,"setosa"
4.9,3.6,1.4,0.1,"setosa"
4.4,3,1.3,0.2,"setosa"
5.1,3.4,1.5,0.2,"setosa"
5,3.5,1.3,0.3,"setosa"
4.5,2.3,1.3,0.3,"setosa"
4.4,3.2,1.3,0.2,"setosa"
5,3.5,1.6,0.6,"setosa"
5.1,3.8,1.9,0.4,"setosa"
4.8,3,1.4,0.3,"setosa"
5.1,3.8,1.6,0.2,"setosa"
4.6,3.2,1.4,0.2,"setosa"
5.3,3.7,1.5,0.2,"setosa"
5,3.3,1.4,0.2,"setosa"
7,3.2,4.7,1.4,"versicolor"
6.4,3.2,4.5,1.5,"versicolor"
6.9,3.1,4.9,1.5,"versicolor"
5.5,2.3,4,1.3,"versicolor"
6.5,2.8,4.6,1.5,"versicolor"
5.7,2.8,4.5,1.3,"versicolor"
6.3,3.3,4.7,1.6,"versicolor"
4.9,2.4,3.3,1,"versicolor"
6.6,2.9,4.6,1.3,"versicolor"
5.2,2.7,3.9,1.4,"versicolor"
5,2,3.5,1,"versicolor"
5.9,3,4.2,1.5,"versicolor"
6,2.2,4,1,"versicolor"
6.1,2.9,4.7,1.4,"versicolor"
5.6,2.9,3.6,1.3,"versicolor"
6.7,3.1,4.4,1.4,"versicolor"
5.6,3,4.5,1.5,"versicolor"
5.8,2.7,4.1,1,"versicolor"
6.2,2.2,4.5,1.5,"versicolor"
5.6,2.5,3.9,1.1,"versicolor"
5.9,3.2,4.8,1.8,"versicolor"
6.1,2.8,4,1.3,"versicolor"
6.3,2.5,4.9,1.5,"versicolor"
6.1,2.8,4.7,1.2,"versicolor"
6.4,2.9,4.3,1.3,"versicolor"
6.6,3,4.4,1.4,"versicolor"
6.8,2.8,4.8,1.4,"versicolor"
6.7,3,5,1.7,"versicolor"
6,2.9,4.5,1.5,"versicolor"
5.7,2.6,3.5,1,"versicolor"
5.5,2.4,3.8,1.1,"versicolor"
5.5,2.4,3.7,1,"versicolor"
5.8,2.7,3.9,1.2,"versicolor"
6,2.7,5.1,1.6,"versicolor"
5.4,3,4.5,1.5,"versicolor"
6,3.4,4.5,1.6,"versicolor"
6.7,3.1,4.7,1.5,"versicolor"
6.3,2.3,4.4,1.3,"versicolor"
5.6,3,4.1,1.3,"versicolor"
5.5,2.5,4,1.3,"versicolor"
5.5,2.6,4.4,1.2,"versicolor"
6.1,3,4.6,1.4,"versicolor"
5.8,2.6,4,1.2,"versicolor"
5,2.3,3.3,1,"versicolor"
5.6,2.7,4.2,1.3,"versicolor"
5.7,3,4.2,1.2,"versicolor"
5.7,2.9,4.2,1.3,"versicolor"
6.2,2.9,4.3,1.3,"versicolor"
5.1,2.5,3,1.1,"versicolor"
5.7,2.8,4.1,1.3,"versicolor"
6.3,3.3,6,2.5,"virginica"
5.8,2.7,5.1,1.9,"virginica"
7.1,3,5.9,2.1,"virginica"
6.3,2.9,5.6,1.8,"virginica"
6.5,3,5.8,2.2,"virginica"
7.6,3,6.6,2.1,"virginica"
4.9,2.5,4.5,1.7,"virginica"
7.3,2.9,6.3,1.8,"virginica"
6.7,2.5,5.8,1.8,"virginica"
7.2,3.6,6.1,2.5,"virginica"
6.5,3.2,5.1,2,"virginica"
6.4,2.7,5.3,1.9,"virginica"
6.8,3,5.5,2.1,"virginica"
5.7,2.5,5,2,"virginica"
5.8,2.8,5.1,2.4,"virginica"
6.4,3.2,5.3,2.3,"virginica"
6.5,3,5.5,1.8,"virginica"
7.7,3.8,6.7,2.2,"virginica"
7.7,2.6,6.9,2.3,"virginica"
6,2.2,5,1.5,"virginica"
6.9,3.2,5.7,2.3,"virginica"
5.6,2.8,4.9,2,"virginica"
7.7,2.8,6.7,2,"virginica"
6.3,2.7,4.9,1.8,"virginica"
6.7,3.3,5.7,2.1,"virginica"
7.2,3.2,6,1.8,"virginica"
6.2,2.8,4.8,1.8,"virginica"
6.1,3,4.9,1.8,"virginica"
6.4,2.8,5.6,2.1,"virginica"
7.2,3,5.8,1.6,"virginica"
7.4,2.8,6.1,1.9,"virginica"
7.9,3.8,6.4,2,"virginica"
6.4,2.8,5.6,2.2,"virginica"
6.3,2.8,5.1,1.5,"virginica"
6.1,2.6,5.6,1.4,"virginica"
7.7,3,6.1,2.3,"virginica"
6.3,3.4,5.6,2.4,"virginica"
6.4,3.1,5.5,1.8,"virginica"
6,3,4.8,1.8,"virginica"
6.9,3.1,5.4,2.1,"virginica"
6.7,3.1,5.6,2.4,"virginica"
6.9,3.1,5.1,2.3,"virginica"
5.8,2.7,5.1,1.9,"virginica"
6.8,3.2,5.9,2.3,"virginica"
6.7,3.3,5.7,2.5,"virginica"
6.7,3,5.2,2.3,"virginica"
6.3,2.5,5,1.9,"virginica"
6.5,3,5.2,2,"virginica"
6.2,3.4,5.4,2.3,"virginica"
5.9,3,5.1,1.8,"virginica"
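The 150 rows above are the standard iris table: four numeric measurements plus a species label, 50 rows per species. A minimal sketch of reading it back, assuming the file is stored as iris.csv (the filename is an assumption about this repository) and pandas is available:

import pandas as pd

# "iris.csv" is an assumed filename for the table shown above.
iris = pd.read_csv("iris.csv")

# Quick sanity checks: 150 rows, four measurements plus the Species label.
print(iris.shape)                      # expected: (150, 5)
print(iris["Species"].value_counts())  # 50 setosa, 50 versicolor, 50 virginica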
"","x"
"1",9007
"2",8106
"3",8928
"4",9137
"5",10017
"6",10826
"7",11317
"8",10744
"9",9713
"10",9938
"11",9161
"12",8927
"13",7750
"14",6981
"15",8038
"16",8422
"17",8714
"18",9512
"19",10120
"20",9823
"21",8743
"22",9129
"23",8710
"24",8680
"25",8162
"26",7306
"27",8124
"28",7870
"29",9387
"30",9556
"31",10093
"32",9620
"33",8285
"34",8466
"35",8160
"36",8034
"37",7717
"38",7461
"39",7767
"40",7925
"41",8623
"42",8945
"43",10078
"44",9179
"45",8037
"46",8488
"47",7874
"48",8647
"49",7792
"50",6957
"51",7726
"52",8106
"53",8890
"54",9299
"55",10625
"56",9302
"57",8314
"58",8850
"59",8265
"60",8796
"61",7836
"62",6892
"63",7791
"64",8192
"65",9115
"66",9434
"67",10484
"68",9827
"69",9110
"70",9070
"71",8633
"72",9240
Citation Request:
This dataset is publicly available for research. The details are described in [Cortez et al., 2009].
Please include this citation if you plan to use this database:
P. Cortez, A. Cerdeira, F. Almeida, T. Matos and J. Reis.
Modeling wine preferences by data mining from physicochemical properties.
In Decision Support Systems, Elsevier, 47(4):547-553. ISSN: 0167-9236.
Available at: [@Elsevier] http://dx.doi.org/10.1016/j.dss.2009.05.016
[Pre-press (pdf)] http://www3.dsi.uminho.pt/pcortez/winequality09.pdf
[bib] http://www3.dsi.uminho.pt/pcortez/dss09.bib
1. Title: Wine Quality
2. Sources
Created by: Paulo Cortez (Univ. Minho), Antonio Cerdeira, Fernando Almeida, Telmo Matos and Jose Reis (CVRVV) @ 2009
3. Past Usage:
P. Cortez, A. Cerdeira, F. Almeida, T. Matos and J. Reis.
Modeling wine preferences by data mining from physicochemical properties.
In Decision Support Systems, Elsevier, 47(4):547-553. ISSN: 0167-9236.
In the above reference, two datasets were created, using red and white wine samples.
The inputs include objective tests (e.g. pH values) and the output is based on sensory data
(the median of at least 3 evaluations made by wine experts). Each expert graded the wine quality
between 0 (very bad) and 10 (excellent). Several data mining methods were applied to model
these datasets under a regression approach. The support vector machine model achieved the
best results. Several metrics were computed: MAD, confusion matrix for a fixed error tolerance (T),
etc. Also, we plot the relative importances of the input variables (as measured by a sensitivity
analysis procedure).
4. Relevant Information:
The two datasets are related to red and white variants of the Portuguese "Vinho Verde" wine.
For more details, consult: http://www.vinhoverde.pt/en/ or the reference [Cortez et al., 2009].
Due to privacy and logistic issues, only physicochemical (inputs) and sensory (the output) variables
are available (e.g. there is no data about grape types, wine brand, wine selling price, etc.).
These datasets can be viewed as classification or regression tasks.
The classes are ordered and not balanced (e.g. there are many more normal wines than
excellent or poor ones). Outlier detection algorithms could be used to detect the few excellent
or poor wines. Also, we are not sure that all input variables are relevant, so
it could be interesting to test feature selection methods.
5. Number of Instances: red wine - 1599; white wine - 4898.
6. Number of Attributes: 11 + output attribute
Note: several of the attributes may be correlated, thus it makes sense to apply some sort of
feature selection.
7. Attribute information:
For more information, read [Cortez et al., 2009].
Input variables (based on physicochemical tests):
1 - fixed acidity
2 - volatile acidity
3 - citric acid
4 - residual sugar
5 - chlorides
6 - free sulfur dioxide
7 - total sulfur dioxide
8 - density
9 - pH
10 - sulphates
11 - alcohol
Output variable (based on sensory data):
12 - quality (score between 0 and 10)
8. Missing Attribute Values: None
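As a quick illustration of the regression framing described above, here is a minimal sketch, assuming the red-wine file is stored as winequality-red.csv with the semicolon separator used in the UCI distribution (both assumptions about this repository) and that pandas and scikit-learn are available; the random forest is a stand-in, not the support vector machine from the paper:

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Filename and separator are assumptions; adjust sep="," if this copy was re-exported.
wine = pd.read_csv("winequality-red.csv", sep=";")

# 11 physicochemical inputs, one sensory output (quality, scored 0-10).
X = wine.drop(columns="quality")
y = wine["quality"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Rough stand-in for the sensitivity analysis mentioned above:
# impurity-based feature importances from the fitted forest.
print(model.score(X_test, y_test))
print(sorted(zip(model.feature_importances_, X.columns), reverse=True)[:5])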
Normalized handwritten digits, automatically
scanned from envelopes by the U.S. Postal Service. The original
scanned digits are binary and of different sizes and orientations; the
images here have been deslanted and size normalized, resulting
in 16 x 16 grayscale images (Le Cun et al., 1990).
The data are in two gzipped files, and each line consists of the digit
id (0-9) followed by the 256 grayscale values.
There are 7291 training observations and 2007 test observations,
distributed as follows:
          0     1     2     3     4     5     6     7     8     9   Total
Train  1194  1005   731   658   652   556   664   645   542   644    7291
Test    359   264   198   166   200   160   170   147   166   177    2007

or as proportions:

          0     1     2     3     4     5     6     7     8     9
Train  0.16  0.14  0.10  0.09  0.09  0.08  0.09  0.09  0.07  0.09
Test   0.18  0.13  0.10  0.08  0.10  0.08  0.08  0.07  0.08  0.09
Alternatively, the training data are available as separate files per
digit (and hence without the digit identifier in each row).
The test set is notoriously "difficult", and a 2.5% error rate is
excellent. These data were kindly made available by the neural network
group at AT&T research labs (thanks to Yann LeCun).
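Given that format (one digit label followed by 256 grayscale values per line, in gzipped text files), a minimal parsing sketch, assuming NumPy is available and the training file is named zip.train.gz as in the ESL distribution (an assumption about this repository):

import gzip
import numpy as np

# Each line: digit id (0-9) followed by 256 grayscale values.
with gzip.open("zip.train.gz", "rt") as f:
    rows = np.loadtxt(f)

labels = rows[:, 0].astype(int)           # digit identifier
images = rows[:, 1:].reshape(-1, 16, 16)  # 256 values -> one 16 x 16 image

print(rows.shape)           # expected: (7291, 257) for the training file
print(np.bincount(labels))  # should match the per-digit counts above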
"","Ozone","Solar.R","Wind","Temp","Month","Day"
"1",41,190,7.4,67,5,1
"2",36,118,8,72,5,2
"3",12,149,12.6,74,5,3
"4",18,313,11.5,62,5,4
"5",NA,NA,14.3,56,5,5
"6",28,NA,14.9,66,5,6
"7",23,299,8.6,65,5,7
"8",19,99,13.8,59,5,8
"9",8,19,20.1,61,5,9
"10",NA,194,8.6,69,5,10
"11",7,NA,6.9,74,5,11
"12",16,256,9.7,69,5,12
"13",11,290,9.2,66,5,13
"14",14,274,10.9,68,5,14
"15",18,65,13.2,58,5,15
"16",14,334,11.5,64,5,16
"17",34,307,12,66,5,17
"18",6,78,18.4,57,5,18
"19",30,322,11.5,68,5,19
"20",11,44,9.7,62,5,20
"21",1,8,9.7,59,5,21
"22",11,320,16.6,73,5,22
"23",4,25,9.7,61,5,23
"24",32,92,12,61,5,24
"25",NA,66,16.6,57,5,25
"26",NA,266,14.9,58,5,26
"27",NA,NA,8,57,5,27
"28",23,13,12,67,5,28
"29",45,252,14.9,81,5,29
"30",115,223,5.7,79,5,30
"31",37,279,7.4,76,5,31
"32",NA,286,8.6,78,6,1
"33",NA,287,9.7,74,6,2
"34",NA,242,16.1,67,6,3
"35",NA,186,9.2,84,6,4
"36",NA,220,8.6,85,6,5
"37",NA,264,14.3,79,6,6
"38",29,127,9.7,82,6,7
"39",NA,273,6.9,87,6,8
"40",71,291,13.8,90,6,9
"41",39,323,11.5,87,6,10
"42",NA,259,10.9,93,6,11
"43",NA,250,9.2,92,6,12
"44",23,148,8,82,6,13
"45",NA,332,13.8,80,6,14
"46",NA,322,11.5,79,6,15
"47",21,191,14.9,77,6,16
"48",37,284,20.7,72,6,17
"49",20,37,9.2,65,6,18
"50",12,120,11.5,73,6,19
"51",13,137,10.3,76,6,20
"52",NA,150,6.3,77,6,21
"53",NA,59,1.7,76,6,22
"54",NA,91,4.6,76,6,23
"55",NA,250,6.3,76,6,24
"56",NA,135,8,75,6,25
"57",NA,127,8,78,6,26
"58",NA,47,10.3,73,6,27
"59",NA,98,11.5,80,6,28
"60",NA,31,14.9,77,6,29
"61",NA,138,8,83,6,30
"62",135,269,4.1,84,7,1
"63",49,248,9.2,85,7,2
"64",32,236,9.2,81,7,3
"65",NA,101,10.9,84,7,4
"66",64,175,4.6,83,7,5
"67",40,314,10.9,83,7,6
"68",77,276,5.1,88,7,7
"69",97,267,6.3,92,7,8
"70",97,272,5.7,92,7,9
"71",85,175,7.4,89,7,10
"72",NA,139,8.6,82,7,11
"73",10,264,14.3,73,7,12
"74",27,175,14.9,81,7,13
"75",NA,291,14.9,91,7,14
"76",7,48,14.3,80,7,15
"77",48,260,6.9,81,7,16
"78",35,274,10.3,82,7,17
"79",61,285,6.3,84,7,18
"80",79,187,5.1,87,7,19
"81",63,220,11.5,85,7,20
"82",16,7,6.9,74,7,21
"83",NA,258,9.7,81,7,22
"84",NA,295,11.5,82,7,23
"85",80,294,8.6,86,7,24
"86",108,223,8,85,7,25
"87",20,81,8.6,82,7,26
"88",52,82,12,86,7,27
"89",82,213,7.4,88,7,28
"90",50,275,7.4,86,7,29
"91",64,253,7.4,83,7,30
"92",59,254,9.2,81,7,31
"93",39,83,6.9,81,8,1
"94",9,24,13.8,81,8,2
"95",16,77,7.4,82,8,3
"96",78,NA,6.9,86,8,4
"97",35,NA,7.4,85,8,5
"98",66,NA,4.6,87,8,6
"99",122,255,4,89,8,7
"100",89,229,10.3,90,8,8
"101",110,207,8,90,8,9
"102",NA,222,8.6,92,8,10
"103",NA,137,11.5,86,8,11
"104",44,192,11.5,86,8,12
"105",28,273,11.5,82,8,13
"106",65,157,9.7,80,8,14
"107",NA,64,11.5,79,8,15
"108",22,71,10.3,77,8,16
"109",59,51,6.3,79,8,17
"110",23,115,7.4,76,8,18
"111",31,244,10.9,78,8,19
"112",44,190,10.3,78,8,20
"113",21,259,15.5,77,8,21
"114",9,36,14.3,72,8,22
"115",NA,255,12.6,75,8,23
"116",45,212,9.7,79,8,24
"117",168,238,3.4,81,8,25
"118",73,215,8,86,8,26
"119",NA,153,5.7,88,8,27
"120",76,203,9.7,97,8,28
"121",118,225,2.3,94,8,29
"122",84,237,6.3,96,8,30
"123",85,188,6.3,94,8,31
"124",96,167,6.9,91,9,1
"125",78,197,5.1,92,9,2
"126",73,183,2.8,93,9,3
"127",91,189,4.6,93,9,4
"128",47,95,7.4,87,9,5
"129",32,92,15.5,84,9,6
"130",20,252,10.9,80,9,7
"131",23,220,10.3,78,9,8
"132",21,230,10.9,75,9,9
"133",24,259,9.7,73,9,10
"134",44,236,14.9,81,9,11
"135",21,259,15.5,76,9,12
"136",28,238,6.3,77,9,13
"137",9,24,10.9,71,9,14
"138",13,112,11.5,71,9,15
"139",46,237,6.9,78,9,16
"140",18,224,13.8,67,9,17
"141",13,27,10.3,76,9,18
"142",24,238,10.3,68,9,19
"143",16,201,8,82,9,20
"144",13,238,12.6,64,9,21
"145",23,14,9.2,71,9,22
"146",36,139,10.3,81,9,23
"147",7,49,10.3,69,9,24
"148",14,20,16.6,63,9,25
"149",30,193,6.9,70,9,26
"150",NA,145,13.2,77,9,27
"151",14,191,14.3,75,9,28
"152",18,131,8,76,9,29
"153",20,223,11.5,68,9,30
"","x"
"1",-0.62
"2",-0.45
"3",-0.47
"4",-0.62
"5",-0.82
"6",-0.7
"7",-0.7
"8",-0.69
"9",-0.63
"10",-0.32
"11",-0.62
"12",-0.54
"13",-0.5
"14",-0.55
"15",-0.42
"16",-0.39
"17",-0.34
"18",-0.22
"19",-0.36
"20",-0.14
"21",-0.06
"22",-0.03
"23",-0.3
"24",-0.29
"25",-0.43
"26",-0.28
"27",-0.06
"28",-0.52
"29",-0.31
"30",-0.3
"31",-0.25
"32",-0.32
"33",-0.41
"34",-0.25
"35",0.04
"36",0.01
"37",-0.32
"38",-0.57
"39",-0.38
"40",-0.22
"41",-0.22
"42",0.03
"43",-0.16
"44",-0.1
"45",-0.08
"46",0
"47",0.16
"48",-0.06
"49",0.06
"50",-0.35
"51",0.05
"52",0.1
"53",0.09
"54",-0.25
"55",0.07
"56",-0.1
"57",-0.04
"58",0.04
"59",0.23
"60",0.15
"61",0.09
"62",0.12
"63",0.1
"64",0.11
"65",0.22
"66",-0.07
"67",0.05
"68",0.13
"69",0.1
"70",-0.03
"71",-0.29
"72",-0.06
"73",-0.02
"74",0.23
"75",-0.12
"76",-0.07
"77",-0.4
"78",0.01
"79",0.12
"80",0.05
"81",-0.08
"82",0.14
"83",0.08
"84",0.19
"85",-0.27
"86",-0.13
"87",-0.03
"88",0.02
"89",-0.13
"90",-0.1
"91",0.05
"92",-0.05
"93",-0.13
"94",0.3
"95",-0.15
"96",0.12
"97",-0.28
"98",0.21
"99",0.04
"100",0.1
"101",0.35
"102",0.53
"103",0.04
"104",0.42
"105",0.1
"106",0.05
"107",0.24
"108",0.4
"109",0.62
"110",0.38
"111",0.63
"112",0.59
"113",0.24
"114",0.24
"115",0.45
"116",0.74
"117",0.42
"118",0.62
"119",0.95
"120",0.73
"121",0.63
"122",0.81
"123",1
"124",0.9
"125",0.77
"126",1.08
"127",0.91
"128",1.15
"129",0.82
"130",0.91
"131",1.07
"132",0.92
"133",0.92
"134",1.01
"135",1.03
"136",1.26
"137",1.44
"138",1.35
"","x"
"1",-0.05
"2",0.01
"3",0
"4",-0.06
"5",-0.15
"6",-0.21
"7",-0.21
"8",-0.24
"9",-0.05
"10",-0.04
"11",-0.29
"12",-0.13
"13",-0.18
"14",-0.23
"15",-0.26
"16",-0.14
"17",0
"18",-0.05
"19",-0.23
"20",-0.16
"21",-0.07
"22",-0.18
"23",-0.26
"24",-0.41
"25",-0.51
"26",-0.28
"27",-0.26
"28",-0.32
"29",-0.47
"30",-0.52
"31",-0.49
"32",-0.47
"33",-0.31
"34",-0.37
"35",-0.21
"36",-0.14
"37",-0.33
"38",-0.38
"39",-0.22
"40",-0.27
"41",-0.26
"42",-0.24
"43",-0.3
"44",-0.3
"45",-0.3
"46",-0.26
"47",-0.17
"48",-0.23
"49",-0.28
"50",-0.33
"51",-0.19
"52",-0.16
"53",-0.24
"54",-0.29
"55",-0.22
"56",-0.23
"57",-0.19
"58",-0.09
"59",-0.17
"60",-0.09
"61",0.11
"62",0.25
"63",0.05
"64",0.03
"65",0.21
"66",0.19
"67",-0.1
"68",-0.13
"69",-0.17
"70",-0.13
"71",-0.14
"72",-0.05
"73",0.02
"74",0.01
"75",-0.17
"76",-0.19
"77",-0.14
"78",0.05
"79",0.07
"80",0.01
"81",0
"82",0.03
"83",0.02
"84",0.03
"85",-0.13
"86",-0.09
"87",-0.04
"88",-0.06
"89",-0.04
"90",0.14
"91",0.02
"92",-0.12
"93",0.08
"94",0.1
"95",-0.07
"96",-0.09
"97",-0.03
"98",0.14
"99",0.08
"100",0.21
"101",0.23
"102",0.18
"103",0.19
"104",0.26
"105",0.16
"106",0.12
"107",0.16
"108",0.32
"109",0.27