Importing SQLite file: out of memory
Hi all,
I'm migrating my SQLite database to Berkeley DB.
I have an SQL file produced by the sqlite dump utility; it's about 400 MB.
I'm trying to read it with the dbsql utility that ships with Berkeley DB (I'm using the latest version, 11g Release 2). The migration seems to go fine, but after a while I get an error: suddenly it cannot find a table (even though it was found just a second before), and then it runs out of memory.
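For reference, what I'm running looks roughly like this (file names are just placeholders):

    # dump the existing SQLite database to a plain SQL text file (~400 MB)
    sqlite3 mydata.sqlite .dump > mydata.sql

    # open a new Berkeley DB SQL database with the dbsql shell and import the dump
    dbsql mydata.db
    dbsql> .read mydata.sql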
So my questions are:
1) Does the '.read' command keep the whole file in memory? Is there any way to write the SQL file out directly as a new Berkeley DB database file?
2) Is the memory problem due to the fact that it keeps the data in memory? Is every SQL statement in the dump file its own transaction (autocommit), or does the .read command execute all the statements of my big file in a single transaction (which would make it very easy to run out of memory)? See the sketch after these questions for what I mean.
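To make question 2 concrete, the distinction I'm asking about is this (using a made-up table t):

    -- case (a): no explicit BEGIN/COMMIT, so each statement autocommits
    INSERT INTO t VALUES (1);
    INSERT INTO t VALUES (2);

    -- case (b): everything executed inside one big transaction
    BEGIN;
    INSERT INTO t VALUES (1);
    INSERT INTO t VALUES (2);
    -- ... hundreds of thousands more rows ...
    COMMIT;

If .read effectively behaves like case (b) for the whole 400 MB file, I can see how the import would run out of memory before anything is committed.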
Thanks
LB