I have more than 100 files posted every night, and I need to import all of them into SQL Server. I have a script task that gets the list of file names into a temporary table. Then I dynamically create one job per file, and each job loads its file into its own staging table (so there are as many staging tables as files).
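For illustration, the file list my script task builds is equivalent to something like this in pure T-SQL (a sketch only; `xp_dirtree` is undocumented, and the path and table names are placeholders):

```sql
-- Sketch: collect tonight's file names into a temp table.
-- xp_dirtree is undocumented; 'C:\Import\' is a placeholder path.
CREATE TABLE #FileList
(
    FileName NVARCHAR(260),
    Depth    INT,
    IsFile   BIT
);

INSERT INTO #FileList (FileName, Depth, IsFile)
EXEC master.sys.xp_dirtree 'C:\Import\', 1, 1;  -- depth 1, include files

DELETE FROM #FileList WHERE IsFile = 0;         -- keep only files
```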
So, if 100 files need to be loaded:
1. Create 100 staging tables.
2. Create 100 jobs, each with a T-SQL query that runs BULK INSERT into its staging table and then inserts from there into a final table. (Please note that there is only one final table for all of these staging tables. Also, updates are rare; the load is essentially insert-only. After the insert, the staging table is dropped. A sketch of one such per-file step appears after this list.)
3. I start all the jobs at once to achieve parallelism.
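Each per-file job step runs T-SQL along these lines (a minimal sketch; `Staging_File001`, `dbo.FinalTable`, the column list, and the BULK INSERT options are placeholders, not my exact code):

```sql
-- Sketch of one dynamically generated per-file job step.
-- Staging_File001, dbo.FinalTable, and the file path are placeholders.
CREATE TABLE dbo.Staging_File001
(
    Col1 INT,
    Col2 NVARCHAR(100),
    Col3 DATETIME
);

BULK INSERT dbo.Staging_File001
FROM 'C:\Import\File001.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- Inserts only; updates are rare, so no MERGE is needed here.
INSERT INTO dbo.FinalTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.Staging_File001;

DROP TABLE dbo.Staging_File001;
```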
The whole purpose of creating this many staging tables and jobs is to run the process in parallel.
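The jobs are created and started through the msdb Agent procedures, roughly like this (a sketch; the job name and @Cmd body are built per file in my actual loop). Since sp_start_job returns immediately, starting each job in a loop effectively runs all 100 concurrently:

```sql
-- Sketch: create and start one per-file job via SQL Server Agent.
-- @JobName and @Cmd are placeholders built per file in the real loop.
DECLARE @JobName SYSNAME = N'Load_File001';
DECLARE @Cmd     NVARCHAR(MAX) = N'/* per-file bulk insert T-SQL here */';

EXEC msdb.dbo.sp_add_job       @job_name = @JobName;
EXEC msdb.dbo.sp_add_jobstep   @job_name = @JobName,
                               @step_name = N'Load',
                               @subsystem = N'TSQL',
                               @command   = @Cmd,
                               @database_name = N'MyDatabase';
EXEC msdb.dbo.sp_add_jobserver @job_name = @JobName;  -- target local server

-- sp_start_job is asynchronous: it queues the job and returns at once,
-- so looping over 100 jobs starts them all in parallel.
EXEC msdb.dbo.sp_start_job     @job_name = @JobName;
```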
Any thoughts would be highly appreciated.