Channel: Transact-SQL forum

Performance updating a very large table


Hi guys, just looking for some advice. I'm handling tables with more than 300 million rows, sometimes even 800 million, and so far I've come up with some decent solutions, but now I really need to be concerned about performance. I have a table with:

FlyID int, FlyNumber int, SettlDate datetime2, SettlPeriod double, Consumpt dec, Ixl dec, Aunit int

300 million rows. SettlDate is a date, and SettlPeriod is a half-hour slot (so 48 periods per day).

The other table is:

BMUnit int,  SettlDate datetime2, SettlPeriod double, Chargefact dec

I'm going to join the two tables on BMUnit = BMUnit, SettlDate = SettlDate, SettlPeriod = SettlPeriod, and fill a new table with an INSERT.

Fingers crossed; I hope it runs in a reasonable time (3 hours... more?).
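The join-and-insert described above might look roughly like this. The table names (FlyReadings, BMCharges, FlyCharges) are placeholders of mine, and since the first table has no BMUnit column listed, the matching column on that side is a guess:

```sql
-- Table names are assumptions; substitute your real ones.
-- WITH (TABLOCK) on the target can enable minimal logging for a large load.
INSERT INTO dbo.FlyCharges WITH (TABLOCK)
    (FlyID, FlyNumber, SettlDate, SettlPeriod, Consumpt, Ixl, Aunit, Chargefact)
SELECT f.FlyID, f.FlyNumber, f.SettlDate, f.SettlPeriod,
       f.Consumpt, f.Ixl, f.Aunit, b.Chargefact
FROM dbo.FlyReadings AS f
JOIN dbo.BMCharges   AS b
  ON b.BMUnit      = f.Aunit        -- guess: Aunit may be the BMUnit on this side
 AND b.SettlDate   = f.SettlDate
 AND b.SettlPeriod = f.SettlPeriod;
```

For a join of this size, having the smaller table indexed (or clustered) on (BMUnit, SettlDate, SettlPeriod) generally matters more than the query text itself; a hash join over a full scan of the big table is often the best plan anyway.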

The real concern is:

I have another table with:

FlyID int, Company varchar, CompanyID int, FromDate datetime, ToDate datetime

The logic should be something like this:

Update table1 set companyid = company.companyid, company = company.company

where table1.flyid = company.flyid

and settlementdate >= fromdate and settlementdate <= todate
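In T-SQL that logic is usually written with the UPDATE ... FROM form. Column names are taken from the post where given; "Table1" is the post's own placeholder:

```sql
UPDATE t1
SET    t1.CompanyID = c.CompanyID,
       t1.Company   = c.Company
FROM   dbo.Table1  AS t1
JOIN   dbo.Company AS c
  ON   c.FlyID = t1.FlyID
 AND   t1.SettlDate >= c.FromDate
 AND   t1.SettlDate <= c.ToDate;  -- half-open (< ToDate) is safer if ranges abut
```

An index on Company (FlyID, FromDate, ToDate) including CompanyID and Company should keep the per-row lookup cheap; worth checking the actual plan before running it against 300 million rows.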

but just yesterday I tried something without the date filter and the query ran for more than seven hours, so I had to kill it. I'm wondering if there is a better way. All of this is because I'm going to build several cubes using one big table as the source, which makes retrieval really fast; so far I've cut off practically entire hours. But now I need this extra element, and before I start writing code I'd like to hear some of your advice.
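One common mitigation for a huge single UPDATE (my suggestion, not something from the post) is to run it in batches, so each transaction stays small, the log doesn't balloon, and a kill loses minutes rather than hours. A rough sketch, assuming CompanyID starts out NULL on rows not yet touched:

```sql
DECLARE @BatchSize int = 1000000,
        @Rows      int = 1;

WHILE @Rows > 0
BEGIN
    UPDATE TOP (@BatchSize) t1
    SET    t1.CompanyID = c.CompanyID,
           t1.Company   = c.Company
    FROM   dbo.Table1  AS t1
    JOIN   dbo.Company AS c
      ON   c.FlyID = t1.FlyID
     AND   t1.SettlDate >= c.FromDate
     AND   t1.SettlDate <= c.ToDate
    WHERE  t1.CompanyID IS NULL;   -- only rows not yet updated

    SET @Rows = @@ROWCOUNT;        -- stop when a batch updates nothing
END;
```

If the target table is being built fresh anyway, it can be cheaper still to skip the UPDATE entirely and fold the Company join into the initial INSERT ... SELECT, writing the final row once instead of writing it and then rewriting it.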

Thanks

