

Welcome to the SQLitening support forums!


Topics - Peter Presland

I need to test whether table creation was successful or not. The documentation simply says that a 'No-op' occurs if the table already exists, which is not an error, and I can find no further reference to it.

So, question: How do I test for the 'No-op' condition immediately following the Create command?
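As far as I can tell, SQLitening just passes the statement through to SQLite, and SQLite itself never reports the no-op as such; the usual workaround is to check the sqlite_master catalogue for the table name before issuing the CREATE. A minimal sketch of the idea in Python's sqlite3 module (the ticks table and the create_if_absent helper are invented for illustration - it is the SQL that matters):

```python
import sqlite3

def create_if_absent(conn, name, ddl):
    """Run a CREATE TABLE IF NOT EXISTS and report whether it was a no-op.
    Returns True if the table was actually created, False if it already existed."""
    # sqlite_master lists every table in the database, so checking it first
    # tells us whether the CREATE that follows will do anything.
    cur = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?", (name,))
    existed = cur.fetchone() is not None
    conn.execute(ddl)          # harmless either way, thanks to IF NOT EXISTS
    return not existed

conn = sqlite3.connect(":memory:")
ddl = "CREATE TABLE IF NOT EXISTS ticks(ts TEXT, price REAL)"
first = create_if_absent(conn, "ticks", ddl)    # creates the table
second = create_if_absent(conn, "ticks", ddl)   # no-op this time
```

The same SELECT against sqlite_master can be issued through SQLitening immediately before the Create command.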
This problem is probably a function of my beginner status with PB9. However, since it concerns my use of SQLitening as well, I'm posting it here first.

Briefly, when running an SQLite transaction involving the import of several million records from a .csv file, PB seems to start a separate thread for the transaction loop - at least that is my assumption, since two instances of the application appear in Task Manager. Just before the 'BEGIN TRANSACTION' statement, I disable various controls and enable an 'Abort' button. I have a 'DoEvents' statement in the loop to ensure that any abort button click will be actioned - and it IS actioned. It initiates the following code:

Close #1


The #1 being the input .csv file. The requirement is to be able to gracefully abort an import-in-progress.

Problem is that SQLite returns an error saying that there is no transaction in progress and no open SQL database, when there most certainly are both, since the transaction loop continues to run to successful completion.

I suspect it has to do with either the separate thread or 'sub-classing' (or both) - neither of which I yet understand.

Any help / observations welcome

Peter Presland
Is there a simple way to move backwards through a record set whilst inside a 'While slGetRow Wend' construction?

I need to be able to back up by one record before continuing forward through a selection, and I can't figure out if or how it may be doable.
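I don't believe SQLite result sets (and hence slGetRow) can be walked backwards - they are forward-only. The usual trick is to keep the previous row in a variable as you go, which gives a one-row look-back without ever moving the cursor. Sketched in Python's sqlite3 (the ticks table and the price-drop test are invented for illustration):

```python
import sqlite3

def rows_before_drops(cursor):
    """Walk a forward-only result set, keeping a one-row buffer so the
    'previous' record is always available without moving the cursor back."""
    found, prev = [], None
    for row in cursor:
        if prev is not None and row[1] < prev[1]:
            found.append(prev)     # the record we would have 'backed up' to
        prev = row
    return found

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ticks(ts INTEGER, price REAL);
    INSERT INTO ticks VALUES (1, 10.0), (2, 10.5), (3, 10.2);
""")
drops = rows_before_drops(conn.execute("SELECT ts, price FROM ticks ORDER BY ts"))
```

If a deeper look-back is needed, the same buffer can become a small list of the last few rows.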
I have a trial installation of SQLite Expert Pro (it has a text file import facility). I tried importing the same text file and aborted after an hour, by which time it had still only reached 1.5 million records. I presume therefore that it does not use the "Begin Immediate Transaction" syntax when importing text files. On top of that it displays an escalating 'No of records imported' count, the display overhead of which I presume gobbles up a good few milliseconds per record too.

Maybe my usage is a bit extreme, but nonetheless I reckon they'd do themselves a favour by addressing that import speed issue - unless of course there is a user-tweakable parameter that I've missed somewhere, which is very likely.
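For what it's worth, the usual culprit in cases like this is autocommit: without an explicit transaction, SQLite writes and syncs its journal on every single INSERT, which on its own easily costs milliseconds per record. A rough demonstration of the two modes in Python's sqlite3 (timings vary by machine; the table and row count are invented):

```python
import sqlite3, tempfile, os, time

def insert_rows(conn, n, one_transaction):
    """Insert n rows either in autocommit mode (one journal sync per row)
    or inside a single transaction; return the elapsed time in seconds."""
    conn.execute("CREATE TABLE IF NOT EXISTS t(x INTEGER)")
    conn.execute("DELETE FROM t")
    start = time.perf_counter()
    if one_transaction:
        conn.execute("BEGIN IMMEDIATE")
    for i in range(n):
        # in autocommit mode each of these INSERTs is its own transaction,
        # with its own journal write and sync
        conn.execute("INSERT INTO t VALUES (?)", (i,))
    if one_transaction:
        conn.commit()              # one sync for the whole batch
    return time.perf_counter() - start

path = os.path.join(tempfile.mkdtemp(), "speed_test.db")
conn = sqlite3.connect(path, isolation_level=None)   # we manage transactions
slow = insert_rows(conn, 500, one_transaction=False)
fast = insert_rows(conn, 500, one_transaction=True)
```

On a real disk the per-row mode is commonly one to two orders of magnitude slower, which is consistent with the hour-plus import times above.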
Joe may recall suggesting I look at SQLitening in reply to a posting of mine on Chris Boss's site early last month. I'm very glad I did.

Briefly, I have hundreds of separate text files collected over several years containing financial time series tick data. Each one contains between 2 and 15 million records. My need is to parse these files into a form that is usable by my existing analytical software. I have made good progress in learning PB9 and had produced working prototypes of what I need. Apart from the PB9 learning curve, my biggest problem turned out to be processing time.

I started by using Input and Write statements to read from a text file and then, following minor conditional manipulation, write the result to a separate output text file. A typical 5 million record file took just over 40 minutes to process. So I tried SQLitening - another learning curve followed by serious disappointment: the same file took well over an hour to process.

Then I stumbled upon Pat Dooley's 'Slow insert speed' thread and BINGO!!! - it now takes just 3 minutes to process the same file, so I am one happy bunny.

It's still early days (for a rank beginner) but thanks a bunch to Joe for the steer and to Fred and whoever for a cracking product.