I have a large array of this type:
type TradeData =
{
Timestamp: DateTime
Instrument: Instrument
Price: decimal
Quantity: decimal
Direction: Direction
}
We're talking about several gigabytes loaded every time during debugging. The data is stored on disk in binary using FSPickler, which is much faster than JSON, but it still takes a long time to load. I sped it up by slicing the file into 1-hour chunks and loading them in parallel, but it still takes 15-20 seconds at the start of every debugging session.
What I would like to know is if I can put the data in an array, write the array as a binary blob. Then I'd load the blob and do something like:
myData = (TradeData *) pBinaryBlob
I'm not sure whether this is doable under .NET, or whether each object really needs to be initialized independently.
This can probably be done provided you use structs rather than objects. If you have already serialised the file as objects, you may need to re-serialise it once so the on-disk layout matches the structs, or carefully annotate your struct with the necessary byte alignments.
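For illustration, a struct version of the record might look like the sketch below. The reference-typed Instrument field is replaced by a hypothetical integer id (InstrumentId is my invention, not part of the original type), because a struct that gets reinterpreted from raw bytes cannot contain object references:

```fsharp
open System
open System.Runtime.InteropServices

// Hypothetical stand-in for the original Direction type
type Direction =
    | Buy = 0
    | Sell = 1

// A blittable struct version of TradeData. Sequential layout with
// Pack = 1 removes compiler-inserted padding, so the in-memory
// layout is exactly the sum of the fields - the serialized bytes
// must be written with the same layout for the cast to be valid.
[<Struct; StructLayout(LayoutKind.Sequential, Pack = 1)>]
type TradeRecord =
    { Timestamp: DateTime     // 8 bytes (backed by a single uint64)
      InstrumentId: int       // 4 bytes; replaces the Instrument reference
      Price: decimal          // 16 bytes
      Quantity: decimal       // 16 bytes
      Direction: Direction }  // 4 bytes (enum backed by int)
```

Whether you need `Pack = 1` or the default alignment depends on how the writer laid out the bytes; the essential point is that reader and writer must agree on one layout.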
If you read the FileStream into byte buffers, I'd then look at using either nativeptr<'T> or (ReadOnly)Span<'T> together with MemoryMarshal.Cast to perform an unsafe conversion.
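A minimal sketch of that load path, assuming a blittable struct matching the on-disk layout and a file written as one contiguous array of such structs (loadTrades and the field names are assumptions for illustration, not the original code):

```fsharp
open System
open System.IO
open System.Runtime.InteropServices

// Assumed blittable struct matching the on-disk record layout;
// no object references, so MemoryMarshal.Cast accepts it
[<Struct>]
type TradeRecord =
    { Timestamp: DateTime
      InstrumentId: int
      Price: decimal
      Quantity: decimal
      Direction: int }

let loadTrades (path: string) : TradeRecord[] =
    // Read the whole blob into a single byte array
    let bytes = File.ReadAllBytes path
    // Reinterpret the bytes as a span of structs: no per-record
    // deserialization, just a cast over the existing buffer
    let records = MemoryMarshal.Cast<byte, TradeRecord>(ReadOnlySpan<byte>(bytes))
    // ToArray copies once; to avoid even that copy, keep working
    // on the span (or memory-map the file) instead of materializing
    records.ToArray()
```

MemoryMarshal.Cast throws at runtime if the struct contains references, and the byte count must be a multiple of the struct size, so a quick `bytes.Length % sizeof<TradeRecord> = 0` sanity check on load is worthwhile.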