
Topics - Andrew Lindsay

#1
Hello, the amount of data I am generating is getting quite large, and I would like to be able to split the data from one large table into smaller tables.
My database looks like the following:

Table1
ID1 as INTEGER
FixedID as STRING * 12
FreeID as STRING * 12
Data1 as REAL
Data2 as REAL
...
Data86 as REAL

For each FixedID I have many FreeID rows, so I would like to create one table per FixedID, with the FreeID and its subsequent data columns as the elements of that table.

So if I had 36 records spanning three (3) unique FixedIDs, I'd like to put the associated FreeID records into three separate tables.

I hope this makes sense.

Best regards

Andrew
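
A sketch of one possible approach, assuming the names above (Table1, FixedID, FreeID): SQLite cannot split a table in a single statement, so each per-FixedID table would be created with CREATE TABLE ... AS SELECT, repeated for every value returned by SELECT DISTINCT FixedID FROM Table1. 'ABC123' below is a hypothetical FixedID value.

  -- Repeat for each value of: SELECT DISTINCT FixedID FROM Table1;
  -- 'ABC123' stands in for one such FixedID.
  CREATE TABLE Table_ABC123 AS
    SELECT FreeID, Data1, Data2 -- ...through Data86
    FROM Table1
    WHERE FixedID = 'ABC123';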
#2
Hello,
I have two tables with the following structures:


Table 1
ID
Text1
Text2
Text3
Data1
Data2
Data3
Data4 - Currently NULL
Data5 - Currently NULL

Table 2
ID
Text1
Text2
Text3
Data4
Data5

I would like to update the NULL columns in Table 1 wherever Table 2 has data for the matching ID rows.

Ideally I would like to know if there is a single SQL statement that performs this operation, or do I need to do it manually/programmatically?

Any assistance would be appreciated.
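
One possible single-statement answer, assuming the tables are literally named Table1 and Table2 and share an ID column: SQLite supports correlated subqueries in UPDATE, so the NULL columns can be filled in one pass.

  UPDATE Table1
  SET Data4 = (SELECT Data4 FROM Table2 WHERE Table2.ID = Table1.ID),
      Data5 = (SELECT Data5 FROM Table2 WHERE Table2.ID = Table1.ID)
  WHERE Data4 IS NULL
    AND EXISTS (SELECT 1 FROM Table2 WHERE Table2.ID = Table1.ID);

  -- The EXISTS guard leaves rows with no match in Table2 untouched,
  -- rather than overwriting them with NULL.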
#3
Hello,

I have an extremely large table (over 19,000,000 rows).  I would like a quick way to keep a tally of the number of records.  I have tried 'Select Count(*) from tTable;' but this takes nearly 5 minutes to return the count.

I would like to set up a second table called 'Status', with a single entry called Records, that would be incremented when a record is added, and decremented when a record is deleted.

I have read that triggers can be set up to modify a figure in another table, but I am not sure how to do this for inserts or deletes.  Any assistance would be appreciated.

Best regards

Andrew Lindsay
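
A minimal sketch of the trigger approach, assuming the table is named tTable as above: seed the counter once from COUNT(*), then let an insert trigger and a delete trigger maintain it.

  CREATE TABLE Status (Records INTEGER);
  INSERT INTO Status (Records) SELECT COUNT(*) FROM tTable;

  -- Bump the tally on every insert...
  CREATE TRIGGER tTally_ins AFTER INSERT ON tTable
  BEGIN
    UPDATE Status SET Records = Records + 1;
  END;

  -- ...and drop it on every delete.
  CREATE TRIGGER tTally_del AFTER DELETE ON tTable
  BEGIN
    UPDATE Status SET Records = Records - 1;
  END;

SELECT Records FROM Status then returns instantly in place of the five-minute COUNT(*).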
#4
Greetings again,

I have a table that has the following set-up

RecordTime DATETIME,
extTens1 INTEGER,
Pitch1 REAL,
Pitch2 REAL,
Pitch3 REAL,
Roll1 REAL,....

RecordTime is a unique, indexed key, and values are stored every second from some time in February until the beginning of October.

I would like to create a table, say for Pitch1, that extracts the Pitch1 results per month, so I would end up with columns newRecordTime, FebPitch1, MarPitch1, ...

Any assistance would be appreciated.

Regards

Andrew
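
A hedged sketch, assuming the table is named Data (as in a later post) and that rows should line up by day-and-time within each month: strftime() can tag each reading with its month, and conditional aggregation then spreads Pitch1 across one column per month. Pitch1ByMonth, Mon, and newRecordTime are hypothetical names.

  CREATE VIEW Pitch1ByMonth AS
    SELECT strftime('%m', RecordTime)          AS Mon,
           strftime('%d %H:%M:%S', RecordTime) AS newRecordTime,
           Pitch1
    FROM Data;

  SELECT newRecordTime,
         MAX(CASE Mon WHEN '02' THEN Pitch1 END) AS FebPitch1,
         MAX(CASE Mon WHEN '03' THEN Pitch1 END) AS MarPitch1
         -- ...repeat through '10' for October
  FROM Pitch1ByMonth
  GROUP BY newRecordTime;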
#5
I have a table with a DateTime field and an Integer field.  I have imported approximately 4 GB of CSV text into the database, but now find that a considerable fraction (maybe 30%) of the records are duplicates.  I would like to know if there is a way to find and delete the duplicates (based on the DateTime field).

Best regards

Andrew Lindsay
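
One common approach, assuming it does not matter which row of each duplicate set survives: every ordinary SQLite table has an implicit rowid, so keeping the smallest rowid per timestamp removes the rest. tData and RecordTime are hypothetical names for the table and the DateTime column.

  -- Keep one row per timestamp, delete the duplicates.
  DELETE FROM tData
  WHERE rowid NOT IN (SELECT MIN(rowid)
                      FROM tData
                      GROUP BY RecordTime);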
#6
I am having trouble with the following Query.

  sqlQuery = "SELECT COUNT(*) As RowCount FROM Data Where RecordTime >= '2009/04/01T00:00:00' AND RecordTime < '2009/04/01T01:00:00'"
  Replace "'" With $Dq In sqlQuery
  slSEL(sqlQuery,1)
  slGetRow(1)
  NumberOfRows = Val(slFN("RowCount",1))
  slCloseSet(1)


I should get an answer of 60 (I have data for every minute); however, I get an answer of 0.  Any assistance would be appreciated.

Regards


Andrew
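
A likely culprit, assuming the RecordTime values were stored in SQLite's usual ISO-8601 text form: text comparison is literal, so '2009/04/01T00:00:00' with slashes never matches values stored with dashes. Replacing the single quotes with double quotes is also risky, since standard SQL reads double-quoted tokens as identifiers (SQLite tolerates it, but single quotes are the safe form). With both fixed, the SQL itself would read:

  SELECT COUNT(*) AS RowCount
  FROM Data
  WHERE RecordTime >= '2009-04-01T00:00:00'  -- dashes, not slashes
    AND RecordTime <  '2009-04-01T01:00:00';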
#7
Hopefully the last question for tonight.

How do I insert the following value into a table column that is declared as TIME?

The value is "30/04/2009 0:00".

I do not get any errors when I insert the record, but when I look at the value in SQLite Maestro, it seems to have stored the value as "20:09:00".  Any assistance would be appreciated.

Regards

Andrew
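
A guess at the cause and a hedged workaround: SQLite has no native TIME type, so the column simply stores whatever text it is handed, and viewer tools are free to reinterpret that text. Converting the value to ISO-8601 before the insert keeps it unambiguous; MyTable and EventTime are hypothetical names.

  -- '30/04/2009 0:00' rewritten as ISO-8601 before inserting:
  INSERT INTO MyTable (EventTime) VALUES ('2009-04-30T00:00:00');

  -- SQLite's date functions then read the value back cleanly:
  SELECT datetime(EventTime) FROM MyTable;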
#8
I am connecting to a database and I do not know if a certain table is present.  How do I check whether the table exists and, if not, create it?

I tried "If Not Exists FooBar Create FooBar ( Column1 TEXT, Column2 TEXT );"

Any assistance would be appreciated.

Regards

Andrew
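
SQLite's CREATE TABLE statement accepts an IF NOT EXISTS clause for exactly this case, and the sqlite_master catalog can be queried when an explicit check is preferred:

  -- Create the table only if it is missing:
  CREATE TABLE IF NOT EXISTS FooBar (Column1 TEXT, Column2 TEXT);

  -- Or check for it explicitly first (returns a row when it exists):
  SELECT name FROM sqlite_master
  WHERE type = 'table' AND name = 'FooBar';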