I don't know whether to use some combination of data flow transformations or to resort instead to a Script Component to accomplish my goal. I am receiving electric usage odometer-type meter readings, anywhere from zero to many per day for a given meter. The granularity of the fact table flattens that out to one reading per meter per day.

When today's odometer reading comes in, consumption is calculated by subtracting the most recent reading for that meter in the fact table from today's odometer value. E.g. the last time I got a reading it was 132, and now the meter says 150, so consumption between the readings is 18. To complicate things further, if for example six days have passed since that last reading, I need to insert one row per day into the fact table and spread that 18 evenly across them.

So, the question is: which transforms are most appropriate when I have to look into the fact table's recent past in order to determine the record(s) to introduce to the destination pipeline?
Example on Fact table:
Date: 7/10/2013 Meter: MeterA Consumption: 5 Estimated? N LastGoodReading: 132
--------------------------------------------------------
Six days later, on 7/16/2013, MeterA reports a reading of 150. From that I should create the following output by using the earlier row, the last good reading, and the time that has elapsed between readings:
--------------------------------------------------------
Date: 7/11/2013 Meter: MeterA Consumption: 3 Estimated? Y LastGoodReading: NULL
Date: 7/12/2013 Meter: MeterA Consumption: 3 Estimated? Y LastGoodReading: NULL
Date: 7/13/2013 Meter: MeterA Consumption: 3 Estimated? Y LastGoodReading: NULL
Date: 7/14/2013 Meter: MeterA Consumption: 3 Estimated? Y LastGoodReading: NULL
Date: 7/15/2013 Meter: MeterA Consumption: 3 Estimated? Y LastGoodReading: NULL
Date: 7/16/2013 Meter: MeterA Consumption: 3 Estimated? N LastGoodReading: 150
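To illustrate the logic I'm after (independent of how it would be wired up in SSIS), here is a rough Python sketch. The function name and row layout are mine, just for illustration; in practice this would live in a Script Component or be expressed in SQL against the fact table:

```python
from datetime import date, timedelta

def spread_reading(last_date, last_reading, new_date, new_reading):
    """Spread the consumption between two odometer readings evenly across
    the elapsed days. Interpolated days are flagged Estimated = 'Y' with no
    LastGoodReading; the actual reading day carries the new odometer value."""
    days_elapsed = (new_date - last_date).days
    total_consumption = new_reading - last_reading   # e.g. 150 - 132 = 18
    per_day = total_consumption / days_elapsed       # e.g. 18 / 6 = 3
    rows = []
    for i in range(1, days_elapsed + 1):
        day = last_date + timedelta(days=i)
        is_actual = (day == new_date)                # only the final day is a real reading
        rows.append({
            "Date": day,
            "Meter": "MeterA",
            "Consumption": per_day,
            "Estimated": "N" if is_actual else "Y",
            "LastGoodReading": new_reading if is_actual else None,
        })
    return rows

# Reproduces the example above: 6 rows of 3, last one marked actual.
rows = spread_reading(date(2013, 7, 10), 132, date(2013, 7, 16), 150)
```

A real implementation would also have to decide how to handle remainders when the consumption does not divide evenly across the days.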