Description
I've found a problem importing a CSV file with numbers: precision is lost for every number in a column.
If any entry in a column is in scientific format, all of the entries in that column are converted to float and lose precision. This only happens when a number in the column is 12 digits or more (in either representation, so either 1E+11 or 123456789012). Much larger numbers in a column with no scientific-format entries don't trigger the problem.
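For the really long ids this is consistent with the column being coerced to float64, which can only represent integers exactly up to 2**53 (losses at just 12 digits would suggest the parser's own float conversion is also involved). A quick check in plain Python:

# float64 stores integers exactly only up to 2**53 = 9007199254740992;
# the 18-digit id from the example below cannot round-trip through float.
print(int(float(135217135789158401)) == 135217135789158401)  # False
print(2**53)  # 9007199254740992 - anything above this can round off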
The mixed type is an error in my data, but I thought I'd report the problem in pandas in case it affects legitimate data.
Happens in both 0.10.0 and 0.10.1.dev-6e2b6ea on OS X with numpy 1.6.2.
CSV file (test.csv):
id, text
135217135789158401, 'testing lost precision from csv'
1352171357E+5, 'any item scientific format loses the precision on all other entries'
import pandas

test = pandas.DataFrame.from_csv('test.csv')
print(test.index[0] == 135217135789158401)
print(test.index[1] == 1352171357E+5)
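As a workaround until this is fixed, supplying an explicit converter for the id column keeps exact values (a sketch using read_csv's converters argument; parse_id is just an illustrative helper):

import pandas

# Parse the ids ourselves so pandas never coerces the column to float64.
def parse_id(value):
    try:
        return int(value)            # exact for plain integers of any length
    except ValueError:
        return int(float(value))     # scientific-format entries go via float

test = pandas.read_csv('test.csv', converters={'id': parse_id}, index_col=0)
print(test.index[0] == 135217135789158401)  # True with the converter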
Example with larger numbers - column A is affected (it contains the scientific-format entry 1234E+0), column C isn't:
id, A, B, C
1, 99999999999, 'a', 99999999999
2, 123456789012345, 'b', 123456789012345
3, 1234E+0, 'c', 1234
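For reference, a check against this second file (assuming it is saved as test2.csv; skipinitialspace is only there to cope with the spaces after the commas):

import pandas

df = pandas.read_csv('test2.csv', skipinitialspace=True, index_col=0)
print(df['A'][2] == 123456789012345)  # reported False: the 1234E+0 entry coerces A
print(df['C'][2] == 123456789012345)  # True: C has no scientific-format entry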
This may be related to issue #2069.