Converting from String to DateTime Drops Milliseconds
Please forgive me in advance for this rush job; I spent less than 30 minutes on it, but it seems that the current driver drops milliseconds from timestamps coming out of the database. I hacked a test together based on the encode tests. (I don't think there's an easy way for me to get a DB type constant, since they're private, so I hardcoded the 1184, the OID for timestamptz.)
I have a pending (equally rushed) PR, but I just wanted to bring this up in case I'm overlooking some reason why things are (or have to be) the way they are. Here's the test:
solo_test('decode datetime', () {
  // Pairs of wire string and expected DateTime; 1184 is the timestamptz OID.
  var data = [
    "22001-02-03 00:00:00.000", new DateTime(22001, DateTime.FEBRUARY, 3),
    "2001-02-03 00:00:00.000", new DateTime(2001, DateTime.FEBRUARY, 3),
    "2001-02-03 04:05:06.000", new DateTime(2001, DateTime.FEBRUARY, 3, 4, 5, 6, 0),
    // FAILS!
    "2001-02-03 04:05:06.999", new DateTime(2001, DateTime.FEBRUARY, 3, 4, 5, 6, 999),
    "0010-02-03 04:05:06.123 BC", new DateTime(-10, DateTime.FEBRUARY, 3, 4, 5, 6, 123),
    "0010-02-03 04:05:06.000 BC", new DateTime(-10, DateTime.FEBRUARY, 3, 4, 5, 6, 0)
    // TODO test minimum allowable postgresql date
  ];
  var tc = new TypeConverter();
  var d = new DateTime.now().timeZoneOffset; // Get the user's current timezone.
  pad(int i) => i.toString().padLeft(2, '0');
  // Offset string, e.g. "+13:00:00" (not used in the assertions below).
  var tzoff = '${d.isNegative ? '-' : '+'}'
      '${d.inHours.abs()}:${pad(d.inMinutes % 60)}:${pad(d.inSeconds % 60)}';
  for (int i = 0; i < data.length; i += 2) {
    var str = data[i];
    var dt = data[i + 1];
    expect(tc.decode(str, 1184), dt);
  }
});
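For context, here is a minimal sketch of the kind of parsing needed to keep the millis. decodeTimestamp is a hypothetical stand-in, not the driver's actual internals, and it only covers the formats in the data above (no timezone suffix, exactly three fractional digits):

// Hypothetical sketch, not the driver's code: decode a timestamp string
// without discarding the fractional seconds.
DateTime decodeTimestamp(String s) {
  var re = new RegExp(
      r'^(\d+)-(\d\d)-(\d\d) (\d\d):(\d\d):(\d\d)(?:\.(\d{3}))?( BC)?$');
  var m = re.firstMatch(s);
  if (m == null) throw new FormatException('Unexpected timestamp: $s');
  var year = int.parse(m[1]);
  if (m[8] != null) year = -year; // " BC" suffix present.
  var millis = m[7] == null ? 0 : int.parse(m[7]);
  return new DateTime(year, int.parse(m[2]), int.parse(m[3]),
      int.parse(m[4]), int.parse(m[5]), int.parse(m[6]), millis);
}

With that, the FAILS! case above decodes to 999 ms as expected.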
Thanks for this.
It's the least I can do (assuming that I'm actually helping :-) ). Of course, as I'm sure you're aware, there are more nuances to it all: I wasn't properly padding the optional milliseconds, so a fraction like ".32" parsed as 32 ms rather than 320 ms. I made a change to accommodate that (illustrated just after the tests below) and took another crack at the unit tests:
solo_test('decode datetime', () {
  var data = [
    "2015-01-08 14:04:56.32+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56, 320),
    "2015-01-08 14:04:56.001+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56, 1),
    "2015-01-08 14:04:56.999+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56, 999),
    "2015-01-08 14:04:56.2+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56, 200),
    "2015-01-08 14:04:56.201+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56, 201),
    "2015-01-08 14:04:56+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56),
    "2015-01-08 14:04:56.0+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56),
    "2015-01-08 14:04:56.000+00", new DateTime.utc(2015, DateTime.JANUARY, 8, 14, 4, 56),
  ];
  var tc = new TypeConverter();
  for (int i = 0; i < data.length; i += 2) {
    var str = data[i];
    var dt = data[i + 1];
    expect(tc.decode(str, 1184, isUtcTimeZone: true), dt);
  }
});
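The padding nuance, concretely: Postgres writes the fractional part at whatever precision is significant, so ".32" means 320 ms, and the digits have to be right-padded to three places before parsing. A standalone illustration (not the driver's code):

main() {
  var frac = '32'; // The digits after the point in "14:04:56.32".
  print(int.parse(frac));                  // 32  -- wrong: .32 s is not 32 ms.
  print(int.parse(frac.padLeft(3, '0')));  // 32  -- still wrong: reads ".032".
  print(int.parse(frac.padRight(3, '0'))); // 320 -- right: .32 s == 320 ms.
}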
Oddly enough, there is actually a test for milliseconds. But the test is buggy (it effectively only tests seconds), and, as you've pointed out, the implementation doesn't handle millis at all.
https://github.com/xxgreg/dart_postgresql/blob/master/test/postgresql_test.dart#L305
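To make the distinction concrete, a hypothetical example (not the linked test's actual code): an assertion only exercises millisecond handling when the expected value carries a nonzero millisecond component.

solo_test('millis, not just seconds', () {
  var tc = new TypeConverter();
  // Zero-millisecond expectation: passes even if the decoder drops millis.
  expect(tc.decode("2001-02-03 04:05:06.000", 1184),
      new DateTime(2001, DateTime.FEBRUARY, 3, 4, 5, 6));
  // Nonzero-millisecond expectation: actually catches the regression.
  expect(tc.decode("2001-02-03 04:05:06.789", 1184),
      new DateTime(2001, DateTime.FEBRUARY, 3, 4, 5, 6, 789));
});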
Thanks for bringing this up. It is helpful.