using System.Data;
using System.Data.SqlClient;
using System.IO;

DataSet reportData = new DataSet();
reportData.ReadXml(Server.MapPath("report.xml"));

SqlConnection connection = new SqlConnection("CONNECTION STRING");
SqlBulkCopy sbc = new SqlBulkCopy(connection);
sbc.DestinationTableName = "report_table";

// if your DB col names don't match your XML element names 100%
// then relate the source XML elements (1st param) with the destination DB cols
sbc.ColumnMappings.Add("campaign", "campaign_id");
sbc.ColumnMappings.Add("cost", "cost_USD");

connection.Open();
sbc.WriteToServer(reportData.Tables[0]);
connection.Close();

// remove the xml file
File.Delete(Server.MapPath("report.xml"));
Basically you're creating a DataSet, populating it with XML data from a *.xml file somewhere on your system, and passing the DataTable of interest from your DataSet into the WriteToServer method of the SqlBulkCopy class. The table specified in DestinationTableName must already exist. Additionally, if your DB column names do not match your XML element names 100%, you will need to use the ColumnMappings property to tell SqlBulkCopy which XML elements map to which DB columns. In the code above I'm telling SqlBulkCopy to pair the 'campaign_id' and 'cost_USD' columns in the DB with the 'campaign' and 'cost' elements from the XML file. XML elements which are not explicitly mapped will not be written to the DB table.
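Since the table named in DestinationTableName has to exist before the copy runs, the destination for this example might look something like the sketch below. The column types are assumptions on my part (DataSet.ReadXml infers string columns by default, and the post doesn't show the real schema), so adjust to taste:

```
CREATE TABLE report_table (
    campaign_id NVARCHAR(50) NOT NULL,  -- fed by the XML 'campaign' element
    cost_USD DECIMAL(10, 2) NOT NULL    -- fed by the XML 'cost' element
);
```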
The SqlBulkCopy class lives in the System.Data.SqlClient namespace, so you will need to add a using statement at the top of your code before you can instantiate it. Although my project will only really involve XML data with a maximum of about 2,000 elements, I've done a bit of reading about the performance benefits some .NET developers are getting from SqlBulkCopy: a couple of people said they were writing 40-50K XML elements (how big were the elements, though?) to their DB in only a few seconds. Now that's quick... it sure beats looping over 'insert into' statements.
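For comparison, the looped-insert approach that SqlBulkCopy beats might look something like this. This is just a sketch: it assumes the same reportData DataSet, connection string, and table/column names as the example above, and it needs a live SQL Server to actually run against. The key cost is that every ExecuteNonQuery call is a separate round trip to the server:

```csharp
using System.Data;
using System.Data.SqlClient;

// Naive alternative: one parameterized INSERT per DataRow.
// For a handful of rows this is fine; for tens of thousands,
// the per-statement round trips add up fast.
using (SqlConnection connection = new SqlConnection("CONNECTION STRING"))
{
    connection.Open();
    foreach (DataRow row in reportData.Tables[0].Rows)
    {
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO report_table (campaign_id, cost_USD) " +
            "VALUES (@campaign, @cost)", connection))
        {
            cmd.Parameters.AddWithValue("@campaign", row["campaign"]);
            cmd.Parameters.AddWithValue("@cost", row["cost"]);
            cmd.ExecuteNonQuery();  // one server round trip per row
        }
    }
}
```

SqlBulkCopy avoids all of that by streaming the rows to the server in bulk, which is where those few-second numbers for 40-50K elements come from.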