sc.read_csv - 'ValueError: could not convert string to float'
- [x] I have checked that this issue has not already been reported.
- [x] I have confirmed this bug exists on the latest version of scanpy.
- [ ] (optional) I have confirmed this bug exists on the master branch of scanpy.
Note: Please read this guide detailing how to provide the necessary information for us to reproduce your bug.
Minimal code sample (that we can copy&paste without having any data)
adata = sc.read_csv('/stanley/granger_lab_storage/Users/Will/120722_MOp_matrix_GABA_Glut_only_dropped_column.csv', delimiter = "\t")
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-43-462cb48f7ebe> in <module>
----> 1 adata = sc.read_csv('/stanley/granger_lab_storage/Users/Will/120722_MOp_matrix_GABA_Glut_only_dropped_column.csv', delimiter = "\t")
~/.local/lib/python3.8/site-packages/anndata/_io/read.py in read_csv(filename, delimiter, first_column_names, dtype)
51 Numpy data type.
52 """
---> 53 return read_text(filename, delimiter, first_column_names, dtype)
54
55
~/.local/lib/python3.8/site-packages/anndata/_io/read.py in read_text(filename, delimiter, first_column_names, dtype)
358 else:
359 with filename.open() as f:
--> 360 return _read_text(f, delimiter, first_column_names, dtype)
361
362
~/.local/lib/python3.8/site-packages/anndata/_io/read.py in _read_text(f, delimiter, first_column_names, dtype)
423 data.append(np.array(line_list[1:], dtype=dtype))
424 else:
--> 425 data.append(np.array(line_list, dtype=dtype))
426 break
427 # if row names are just integers
ValueError: could not convert string to float: 'AAACCTGAGGAGTCTG-L8TX_171026_01_F03'
Versions
anndata 0.8.0
scanpy 1.9.1
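The error itself hints at the cause: the first field of each row is a cell barcode (a string), and `read_csv` tries to coerce it to a float because the first column was not recognized as row names. A minimal sketch of two possible workarounds, using a tiny synthetic tab-delimited file in place of the real matrix (`first_column_names` is a parameter of `sc.read_csv`, as shown in the traceback above; the file name and contents here are purely illustrative):

```python
import pandas as pd

# Build a tiny stand-in for the real matrix: barcodes in the first
# column, gene counts in the rest, tab-delimited as in the report.
tsv = (
    "barcode\tGeneA\tGeneB\n"
    "AAACCTGAGGAGTCTG-L8TX_171026_01_F03\t3\t0\n"
    "AAACCTGCAGTCAGAG-L8TX_171026_01_F03\t1\t5\n"
)
with open("toy_matrix.tsv", "w") as f:
    f.write(tsv)

# Workaround 1: tell anndata the first column holds row names
# (commented out so this sketch runs without scanpy installed):
#   adata = sc.read_csv("toy_matrix.tsv", delimiter="\t",
#                       first_column_names=True)

# Workaround 2: load with pandas, then wrap the frame in AnnData.
# With the barcodes moved into the index, the remaining values are
# purely numeric and the float conversion succeeds.
df = pd.read_csv("toy_matrix.tsv", sep="\t", index_col=0)
# adata = anndata.AnnData(df)
print(df.shape)  # (2, 2)
```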
Could you please paste a subset of the file? The header + 2 lines or so
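For example, something along these lines prints the header plus the first two data rows (the path is a placeholder, and a toy file is created first only so the snippet runs on its own; point `path` at the real matrix instead):

```python
from itertools import islice

# Placeholder path; substitute the actual CSV/TSV location.
path = "toy_subset.tsv"

# Create a small example file so the snippet is self-contained.
with open(path, "w") as f:
    f.write("barcode\tGeneA\tGeneB\n"
            "AAACCTGAGGAGTCTG-L8TX_171026_01_F03\t3\t0\n"
            "AAACCTGCAGTCAGAG-L8TX_171026_01_F03\t1\t5\n")

# Grab the header and the first two data lines without loading
# the whole file into memory.
with open(path) as f:
    head = [line.rstrip("\n") for line in islice(f, 3)]
print("\n".join(head))
```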
We will close the issue for now; hopefully you obtained the expected behaviour. :)
However, please don't hesitate to reopen this issue or create a new one if you have any more questions or run into any related problems in the future.
Thanks for being a part of our community! :)