[xml] very large XML docs


From: F. David Sacerdoti (fds@cs.ucsd.edu)
Date: Wed Oct 18 2000 - 15:53:14 EDT


I am using libxml for a scientific library project here at UCSD in
conjunction with the San Diego Supercomputing Center. The library will
be running on supercomputer clusters (Blue Horizon), and is used in very
large calculations.

I have a question about dealing with very large documents. Since I may
be creating elements containing several hundred megabytes of data, I
would like to stream the XML document to disk as it is created. That
would let me get by with a fixed-size buffer for these very large
elements.

I have seen that xmlOutputBufferCreateFile() can be used with
xmlOutputBufferWrite() to write a fixed-length buffer to disk, but is it
true that the entire XML document must be built in memory before any of
it can be written out?

Would it be possible to write a partially created xml tree to disk,
allowing me to free some memory buffers, and then go on to compute and
write the rest of the tree?

Thank you,
David Sacerdoti

----
Message from the list xml@rpmfind.net
Archived at : http://xmlsoft.org/messages/
to unsubscribe: echo "unsubscribe xml" | mail  majordomo@rpmfind.net



This archive was generated by hypermail 2b29 : Wed Oct 18 2000 - 18:43:19 EDT