Berkeley DB Family

importing sqlite file: out of memory

user603182 (Apr 7 2010, edited Apr 8 2010)
Hi all,
I'm migrating my SQLite file.
I have my SQL file from the SQLite dump utility; it's about 400 MB.
I'm trying to read it with the Berkeley DB dbsql utility (I'm using the latest version of Berkeley DB, 11g Release 2). The migration seems to be going fine, but after a while I get an error message: suddenly it cannot find a table (even though it was found just a second before), and then it runs out of memory.
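For reference, this is roughly how I'm running the import (the file names here are just placeholders):

    dbsql mydb.db
    dbsql> .read mydb_dump.sql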
So my questions are:

1) Does the '.read' command keep the whole file in memory? Is there any way to write the SQL file out as a new Berkeley DB file directly?

2) Is the memory problem due to the fact that it keeps data in memory? Is every SQL command in the file its own transaction (autocommit), or does the .read command execute all the SQL statements of my big file in a single transaction (which could very easily run out of memory)? See the sketch below for what I mean.
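To illustrate question 2, here is a sketch of the two cases I have in mind (made-up table name and values, and assuming the dump looks like the usual sqlite dump output):

    -- Case A: autocommit, each statement is committed on its own
    INSERT INTO mytable VALUES (1, 'a');
    INSERT INTO mytable VALUES (2, 'b');

    -- Case B: the whole dump executed as one big transaction
    -- (a sqlite dump normally wraps everything in BEGIN TRANSACTION ... COMMIT)
    BEGIN TRANSACTION;
    INSERT INTO mytable VALUES (1, 'a');
    INSERT INTO mytable VALUES (2, 'b');
    -- ... hundreds of MB more statements ...
    COMMIT;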

Thanks
LB
Post Details
Locked on May 6 2010
Added on Apr 7 2010
7 comments
2,799 views