c# - Cleaning up memory after reading a giant XML element value


I turn here for help; this is driving me crazy. I'm reading an XML file that wraps an arbitrary number of items, each a Base64-encoded file (with accompanying metadata). I originally read the whole file into an XmlDocument, and while that made for cleaner code, I realized there's no limit on the size of the file, and XmlDocument eats a lot of memory and can run out if the file is large enough. So I rewrote the code to use XmlTextReader instead, which works great if the issue is that the program is sent an XML file with a large number of reasonably-sized attachments... but there's still one big problem, and that's why I turn to you:

if xml reader @ file element, element contains value that's enormous (say, 500mb), , call reader.readelementcontentasstring(), have string occupies 500mb (or possibly outofmemoryexception). in either case write log, "that file attachment totally way big, we're going ignore , move on", move onto next file. doesn't appear string tried read ever garbage collected, happens string takes ram, , every other file tries read after throws outofmemoryexception, though of files quite small.

Recall: at this point I'm reading the element's value into a local string, so I would have expected it to be eligible for garbage collection immediately (and to actually be collected, at the latest, when the program attempts to read the next item and discovers it has no memory available). I've tried everything, just in case: setting the string to null, calling an explicit GC.Collect()... no dice. Task Manager indicates the GC collected only about 40 KB of the ~500 MB requested to store the string, and I still get out-of-memory exceptions when attempting to read anything else.
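One detail worth checking here: strings this large live on the Large Object Heap, which a plain GC.Collect() sweeps but, by default, does not compact, so the freed space can remain fragmented and unusable for the next big allocation. A minimal sketch, assuming .NET 4.5.1 or later (where this setting exists), of requesting a one-time LOH compaction:

```csharp
using System;
using System.Runtime;

class LohCompactionSketch
{
    static void ReclaimLargeObjectHeap()
    {
        // Ask the GC to compact the Large Object Heap on the next
        // blocking collection, instead of only sweeping it.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
    }
}
```

This only helps if the string really is unreachable; it does nothing about the cost of having materialized it in the first place.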

There doesn't seem to be a way to know the length of the value contained in an XML element using XmlTextReader without actually reading the element, so I imagine I'm stuck reading the string... Am I missing something, or is there really no way to read a giant value from an XML file without totally destroying the program's ability to do anything further afterwards? I'm going insane over this.
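One way around "you can't know the length without reading it" is to read the value in fixed-size chunks rather than as one string, and bail out once a cap is exceeded. A hedged sketch using XmlReader.ReadValueChunk (the element name and the cap are assumptions, not from the question):

```csharp
using System;
using System.Text;
using System.Xml;

class ChunkedReadSketch
{
    const int MaxChars = 10 * 1024 * 1024; // hypothetical cap: 10M chars

    // Call with the reader positioned on the element.
    // Returns the element's text, or null if it exceeds the cap.
    static string ReadElementContentCapped(XmlReader reader)
    {
        var sb = new StringBuilder();
        var buffer = new char[4096];

        reader.Read(); // advance from the element to its text content
        int read;
        while ((read = reader.ReadValueChunk(buffer, 0, buffer.Length)) > 0)
        {
            if (sb.Length + read > MaxChars)
            {
                // Drain the remainder without storing it, then give up,
                // so the reader can continue with the next item.
                while (reader.ReadValueChunk(buffer, 0, buffer.Length) > 0) { }
                return null;
            }
            sb.Append(buffer, 0, read);
        }
        return sb.ToString();
    }
}
```

Since the content is Base64, ReadElementContentAsBase64 can be used the same way to decode chunks straight into a FileStream, so the full attachment never has to exist in memory at all.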

I have read a bit about C#'s GC and the LOH, but nothing I've read indicates to me that this should happen...

Let me know if you need further information, and thanks!

Edit: I did realize the process was running as a 32-bit process, which meant it was being starved of memory a bit more than it should have been. After fixing that, this becomes less of an issue, but it's still behavior I'd like to fix. (It takes more and/or larger files to reach the point where an OutOfMemoryException is thrown, but once it is thrown, I still can't seem to reclaim the memory in a timely fashion.)

I had a similar issue with a SOAP service that was used to transfer large files as a Base64 string.

I used XDocument instead of XmlDocument back then, and that did the trick for me.
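A middle ground that keeps the convenience of LINQ to XML without loading the whole document is to walk the file with an XmlReader and materialize only one item at a time via XNode.ReadFrom. A sketch under assumed element names (`item`, `fileName` are hypothetical):

```csharp
using System;
using System.Xml;
using System.Xml.Linq;

class StreamingXDocumentSketch
{
    static void ProcessItems(string path)
    {
        using (var reader = XmlReader.Create(path))
        {
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.Element &&
                    reader.Name == "item")
                {
                    // Builds an XElement for just this one item and
                    // advances the reader past it.
                    var item = (XElement)XNode.ReadFrom(reader);
                    Console.WriteLine((string)item.Element("fileName"));
                    // The XElement becomes collectible after this iteration.
                }
            }
        }
    }
}
```

Note that XNode.ReadFrom still builds the whole item in memory, so a single 500 MB attachment would still be a problem; it only bounds memory use to the largest single item rather than the whole file.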

