SQLitening Support Forum


Author Topic: Many Insert's  (Read 401 times)

Fim

Many Insert's
« on: March 05, 2018, 05:23:02 AM »

I have a conversion program that reads a sequential file of about 4,000,000 records with 19 fields each. For each record I do an INSERT into a table. It takes about 13 hours, because compiling each INSERT takes about 12 ms. Is there any way to use the function in slExeBind that tells SQLite3 to compile the SQL statement only once and then insert each row re-using the compiled statement?

/Fim W.
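For readers outside SQLitening: the prepare-once, bind-per-row pattern Fim is asking about can be sketched with Python's built-in sqlite3 module (a hedged illustration only, not SQLitening/slExeBind syntax; the table and column names are made up):

```python
import sqlite3

# In-memory database for illustration; a real conversion job would open a file.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a TEXT, b INTEGER)")

rows = [("rec%d" % i, i) for i in range(1000)]

# executemany() compiles the INSERT statement once, then re-binds the
# parameters for each row, instead of parsing the SQL 1000 times.
with con:  # one transaction wrapped around all of the inserts
    con.executemany("INSERT INTO t (a, b) VALUES (?, ?)", rows)

count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # → 1000
```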

Paul Squires

Re: Many Insert's
« Reply #1 on: March 05, 2018, 02:05:07 PM »

If I were to take a guess... I bet you need to wrap all of the inserts into one BEGIN/END transaction. That should speed things up immensely, even more than eliminating the compilation time of the SQL statement would.
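A minimal sketch of why this matters, again using Python's sqlite3 as a stand-in (not SQLitening code): without an explicit transaction, SQLite treats each INSERT as its own transaction, paying the commit cost (and, on disk, an fsync) per row; one BEGIN/COMMIT pays it once.

```python
import sqlite3

con = sqlite3.connect(":memory:", isolation_level=None)  # manual transaction control
con.execute("CREATE TABLE t (a INTEGER)")

# One explicit transaction around all the inserts, instead of
# one implicit commit per row.
con.execute("BEGIN")
for i in range(100000):
    con.execute("INSERT INTO t (a) VALUES (?)", (i,))
con.execute("COMMIT")

total = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```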
Paul Squires
http://www.planetsquires.com
support@planetsquires.com

Fim

Re: Many Insert's
« Reply #2 on: March 05, 2018, 05:14:06 PM »

Paul,

It is all in one transaction already.
/Fim

Fim

Re: Many Insert's
« Reply #3 on: March 06, 2018, 09:53:56 AM »

I will have a look at it.
/Fim W.

Bern Ertl

Re: Many Insert's
« Reply #4 on: March 06, 2018, 10:15:08 AM »

Code not shown, so forgive me if I'm making some assumptions, but are you opening the source file, reading a record, inserting it into the db, reading the next record, and so on? It is *much* more efficient to load a sequential file into a memory buffer in one go and process the records from memory. I use a function like so:
Code:
FUNCTION FileToBuffer( BYVAL sFile AS STRING, sBuffer AS STRING) AS LONG

   'Returns %True if successful, %False otherwise

   LOCAL lDev AS LONG

   lDev = FREEFILE
   TRY
      OPEN sFile FOR BINARY AS #lDev
      GET$ #lDev, LOF( lDev), sBuffer
      CLOSE #lDev
   CATCH
      CLOSE #lDev
      EXIT FUNCTION
   END TRY

   FUNCTION = %True

END FUNCTION

You can then use PARSE to massage the buffer into an array of records (assuming you don't hit any memory limitations), and each record (one at a time, as you process it) into an array of fields.

For even better efficiency, you could forgo PARSE and just use a BYTE pointer to read the data from the buffer.  It's a bit more programming on your end, but much more efficient when executing.

Fim

Re: Many Insert's
« Reply #5 on: March 07, 2018, 02:31:41 AM »

I will try to use slExeBind.
I will be back.

/Fim W.