The TimeStamp class uses DateTime.Ticks, which is incompatible with the Unix epoch milliseconds natively used by Redis TimeSeries.
If you have cross-platform clients accessing the time series data, or use the * auto timestamp, which the server fills in with its current time in Unix ms, it's easy to end up with unexpected results when querying ranges and aggregating samples.
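To make the mismatch concrete, here is a minimal sketch in plain .NET (no NRedisTimeSeries calls, class and variable names are illustrative only) comparing the two scales:

```csharp
// DateTime.Ticks counts 100 ns intervals since 0001-01-01, while Redis
// TimeSeries timestamps are milliseconds since 1970-01-01 (Unix epoch).
using System;

class TicksVsUnixMs
{
    static void Main()
    {
        var now = DateTime.UtcNow;

        long ticks  = now.Ticks;                                         // ~6.4e17
        long unixMs = new DateTimeOffset(now).ToUnixTimeMilliseconds();  // ~1.7e12

        Console.WriteLine($"Ticks:   {ticks}");
        Console.WriteLine($"Unix ms: {unixMs}");

        // Interpreted as Unix ms, the ticks value lies tens of millions of
        // years in the future, so a sample written with it will never fall
        // inside a range query built from * / Unix-ms timestamps.
    }
}
```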
I suggest the TimeStamp class bounds-check its values and, by default, support only timestamps within the Unix-ms min and max range. After testing, it appears that Redis TimeSeries only supports positive timestamp values, which means any dates before the epoch won't work.
This would allow * to work as expected from the NRedisTimeSeries client, and timestamps would be compatible across Unix/macOS and Windows platforms. However, using Unix ms means that DateTime.MinValue would need to throw an out-of-range exception. I feel that if we build out a decent TimeStamp struct, this is an acceptable tradeoff.
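As a rough sketch of what that could look like (the type name and API here are hypothetical, not the existing NRedisTimeSeries TimeStamp; assumes a runtime where DateTime.UnixEpoch is available):

```csharp
using System;

// Hypothetical sketch: store Unix ms internally and reject anything before
// the epoch, so DateTime.MinValue (0001-01-01) throws rather than producing
// a timestamp Redis TimeSeries cannot store.
public readonly struct UnixMsTimeStamp
{
    public long UnixMilliseconds { get; }

    public UnixMsTimeStamp(long unixMilliseconds)
    {
        if (unixMilliseconds < 0)
            throw new ArgumentOutOfRangeException(
                nameof(unixMilliseconds),
                "Redis TimeSeries timestamps cannot be earlier than the Unix epoch.");
        UnixMilliseconds = unixMilliseconds;
    }

    // DateTime.Kind handling is glossed over for brevity; pre-epoch dates
    // produce a negative value and hit the check above.
    public UnixMsTimeStamp(DateTime dateTime)
        : this((dateTime - DateTime.UnixEpoch).Ticks / TimeSpan.TicksPerMillisecond)
    {
    }

    public DateTime ToDateTime() =>
        DateTimeOffset.FromUnixTimeMilliseconds(UnixMilliseconds).UtcDateTime;

    public override string ToString() => UnixMilliseconds.ToString();
}
```

With something along these lines, the Unix-ms value can be passed straight through to TS.ADD / TS.RANGE, and DateTime.MinValue fails fast at construction time instead of silently producing a timestamp the server can't store.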