Bob Barker — Mar 9, 2010, 2:30 PM

I am attempting to reduce the number of hard-disk writes I make into a [table]. Currently, I am inserting 1 row at a time because I am parsing a text [file]:

INSERT INTO tblTest (Field1, Field2, Field3, Field4) VALUES [...]

Apparently, the above statement is NOT supported by MS Access! How [...]

INSERT INTO tblTest (Field1, Field2, Field3, Field4) [...]

However, because I am parsing a text file and storing the individual [...] this table either, since now I will be doing the same number of writes to [...] SELECT" method. This is not what I want to do -- my objective is to [...]

Marshall Barton — Mar 9, 2010, 3:22 PM

Bob Barker wrote: [...]

Have you tested using a recordset with the AddNew/Update [methods]? Open it with a Where False criteria so you do not waste time retrieving existing [records]. That may also help because it keeps the connection to the [database open]. Another thing that may(?) help is to remove all indexes on [the table].
--

paii, Ron — Mar 9, 2010, 3:37 PM

"Marshall Barton" wrote in message [...]

Doing the inserts within a transaction may also help.

Albert D. Kallal — Mar 9, 2010, 6:02 PM

I don't think inserting 10,000 rows should take more than about 1 or 2 minutes. Open a recordset... don't use separate SQL insert commands. Using inserts [...]. So, try a recordset... open the recordset once... and use AddNew. Also, test your processing loop without actually doing ANY inserts to find [where the time goes]. Post back here... the insert process will likely only take a minute or two [...]. So open a recordset... keep it open, and use AddNew for each row...
--

Salad — Mar 9, 2010, 6:12 PM

10,000 records is a few records to be sure, but taking 1 hour to process? We'll assume you can't link the text file to the mdb and append from [it]. Maybe you could [...]

The Frog — Mar 10, 2010, 3:38 AM

Hi Bob,

I use a disconnected ADO recordset when doing bulk inserting. I create [...]. Next I start a transaction with the recordset, add the records I want, [...]. I firmly believe that a recordset is your way to go. Having it [...]

If you would like to post a little more of the code and SQL that you [are using ...]

Cheers
The Frog

Bob Barker — Mar 10, 2010, 2:02 PM

Thank you all who responded thus far! In my test environment I set up, I was able to insert
1,000,000 [rows]. Some issues I ran into: First: I was sending a CommandType enum [...]. 10,000 ROWS - 1 [...] Average. Each row contained a String of variable length between 2 and 50, 1 [...]

THANK YOU ALL! ONCE AGAIN!

gregharward — Jun 6, 2010, 1:53 PM

The fastest method is to use the Indexed Sequential Access Method - ISAM. This allows for directly transferring data from your connected ADO record source to a second, dynamically connected data source. I figured out how to do this this week and it is the only way to fly! Do yourself a favor and try this; you will be glad that you did!

1. Use an ADODB.Connection to connect to ANY ADO DB using the standard connection strings (www.connectionstrings.com).

Here is the syntax:

SELECT * INTO <Put ISAM Name Here> FROM <normal query syntax to connected DB and table>

...other options are possible; the point is just to replace the table name with the ISAM name in the query.

ISAM names: CSV & TXT, Excel 2003 & Excel 2007, SQL Server.

Here are some links: [...]

Additionally, if you would like to create a file-based CSV, Excel 2003/2007, or Access 2003/2007 database via the ADO connection, that can be done using the ADOX object:

Set oADOXDatabase = New ADOX.Catalog
oADOXDatabase.Create ("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & <path to file>)

The combination of these concepts allows for the creation and population of a DB file using ADO very quickly. I have working code for this, so I can tell you that it works great, if you feel the need... the need for speed!

Greg Harward
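Greg's SELECT ... INTO trick can be sketched in VBA as follows. This is a minimal sketch, not code from the thread: the database path, output file, and table name (tblTest, borrowed from Bob's example) are assumptions, and it requires a reference to the Microsoft ActiveX Data Objects library.

```vba
' Sketch of the ISAM "SELECT ... INTO" transfer Greg describes.
' Assumed names: C:\Data\sample.mdb, tblTest, C:\Export\out.xls.
Sub ExportViaIsam()
    Dim cn As ADODB.Connection
    Set cn = New ADODB.Connection
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\sample.mdb"

    ' The destination table name is replaced by an ISAM specifier,
    ' so Jet writes the result set straight into a new Excel file.
    cn.Execute "SELECT * INTO [Excel 8.0;Database=C:\Export\out.xls].[Sheet1] " & _
               "FROM tblTest"

    cn.Close
End Sub
```

The same pattern works in the other direction (SELECT FROM an ISAM specifier INTO a local table), which is what makes it useful for bulk loading as well as exporting.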
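Pulling together the suggestions above (Marshall's Where False recordset, Ron's transaction, Albert's open-once/AddNew loop), a minimal DAO sketch might look like the following. The field names come from Bob's tblTest example, but the field types and the parsing loop are placeholders, not code from the thread:

```vba
' Sketch: batch inserts through one open recordset inside a transaction.
Sub BulkInsertViaRecordset()
    Dim ws As DAO.Workspace
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim i As Long

    Set ws = DBEngine.Workspaces(0)
    Set db = CurrentDb

    ' "Where False" returns no rows, so opening the recordset is cheap.
    Set rs = db.OpenRecordset("SELECT * FROM tblTest WHERE False")

    ws.BeginTrans                        ' batch the disk writes
    For i = 1 To 10000                   ' stand-in for the text-file parsing loop
        rs.AddNew
        rs!Field1 = "parsed value " & i  ' assumed Text field
        rs!Field2 = i                    ' assumed Number field
        rs!Field3 = "x"
        rs!Field4 = "y"
        rs.Update
    Next i
    ws.CommitTrans                       ' one commit instead of 10,000

    rs.Close
End Sub
```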
Phillip Holmes — Jun 6, 2010, 6:09 PM

[...]

MCSDPhil
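As a footnote to Salad's earlier point about linking the text file and appending from it: Jet's Text ISAM can read a delimited file directly, turning the whole import into a single append query. The file name, folder, and column layout below are assumptions for illustration:

```vba
' Sketch: append straight from a delimited text file via the Text ISAM.
Sub AppendFromTextFile()
    ' With HDR=No, Jet names the file's columns F1, F2, F3, ...
    CurrentDb.Execute _
        "INSERT INTO tblTest (Field1, Field2, Field3, Field4) " & _
        "SELECT F1, F2, F3, F4 " & _
        "FROM [Text;HDR=No;FMT=Delimited;Database=C:\Data].[import#txt]", _
        dbFailOnError
End Sub
```

Note that the file name is written with # in place of the dot (import#txt), which is how the Text ISAM expects it.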