

Welcome to the SQLitening support forums!

Many Inserts

Started by Fim, March 05, 2018, 06:53:02 AM



I have a conversion program that reads a sequential file of about 4,000,000 records with 19 fields each. For each record I do an INSERT into a table. It takes about 13 hours, because compiling each INSERT takes about 12 ms. Is there any way to use the feature in slExeBind that tells SQLite3 to compile the
SQL statement only once and then insert each row re-using the compiled statement?
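What is being asked for here is SQLite's prepare/bind/step/reset pattern: compile the statement once, then re-bind parameters for each row. As a minimal sketch of that mechanism (in Python rather than PowerBASIC/SQLitening, since the slExeBind call itself is not shown in the thread), `executemany` compiles the INSERT once and re-uses it for every row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT)")

# Hypothetical sample data standing in for the 4,000,000-record file.
rows = [(i, "name%d" % i) for i in range(10_000)]

# executemany compiles the INSERT once, then binds and steps per row --
# the same prepare-once/re-bind idea the question is after.
with conn:  # one transaction around all the inserts
    conn.executemany("INSERT INTO t (a, b) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 10000
```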

/Fim W.
Fim W

Paul Squires

If I were to take a guess... I bet you need to wrap all of the inserts into one BEGIN/END transaction. That should speed things up immensely, even more so than eliminating the compilation time of the SQL statement would.
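The point of this advice is that in autocommit mode SQLite journals and syncs once per INSERT, while a single transaction pays that cost once for the whole batch. A minimal sketch of explicit transaction control (Python here for illustration; the slExec/slExeBind calls would play the same role in SQLitening):

```python
import os
import sqlite3
import tempfile

# Use a file-backed database so the journal/sync cost is real.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path, isolation_level=None)  # manual transaction control
conn.execute("CREATE TABLE t (a INTEGER)")

# One BEGIN/COMMIT around all the inserts: one journal flush for the
# whole batch instead of one per row.
conn.execute("BEGIN")
for i in range(10_000):
    conn.execute("INSERT INTO t (a) VALUES (?)", (i,))
conn.execute("COMMIT")

total = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(total)  # 10000
```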



It is all in one transaction.
Fim W


I will have a look at it.
/Fim W.
Fim W

Bern Ertl

Code not shown, so forgive me if I'm making some assumptions, but are you opening the source file, reading a record, inserting into the db, reading the next record, etc.?  It is *much* more efficient to load a sequential file into a memory buffer in one go and process the records from memory.  I use a function like this:

FUNCTION FileToBuffer( BYVAL sFile AS STRING, sBuffer AS STRING) AS LONG

   'Returns %True if successful, %False otherwise
   LOCAL lDev AS LONG

   lDev = FREEFILE
   OPEN sFile FOR BINARY AS #lDev
   GET$ #lDev, LOF( lDev), sBuffer
   CLOSE #lDev

   FUNCTION = %True

END FUNCTION

You can then use PARSE to split the buffer into an array of records (assuming you don't hit any memory limitations), and then parse each record (one at a time as you process it) into an array of elements.

For even better efficiency, you could forgo PARSE and just use a BYTE pointer to read the data from the buffer.  It's a bit more programming on your end, but much more efficient when executing.


I will try to use slExeBind.
I will be back.

/Fim W.
Fim W