Question by Neg*_*ion (score 14). Tags: python, performance, inventory-management, pandas
I have an inventory ledger that contains products, their running stock quantity (resulting_qty), and the loss/gain each time stock is added or removed (delta_qty).
The problem is that the ledger is not updated daily, only when a stock change occurs. That makes it hard to extract the total stock quantity across all items for a given date, because some items have no record on certain days even though they did have stock on hand, given that their last recorded resulting_qty was greater than 0. Logically, this means an item went unchanged for a number of days equal to the gap between the maximum date and its last recorded date.
My data looks like this, but in reality there are thousands of product IDs:
| date | timestamp | pid | delta_qty | resulting_qty |
|------------|---------------------|-----|-----------|---------------|
| 2017-03-06 | 2017-03-06 12:24:22 | A | 0 | 0.0 |
| 2017-03-31 | 2017-03-31 02:43:11 | A | 3 | 3.0 |
| 2017-04-08 | 2017-04-08 22:04:35 | A | -1 | 2.0 |
| 2017-04-12 | 2017-04-12 18:26:39 | A | -1 | 1.0 |
| 2017-04-19 | 2017-04-19 09:15:38 | A | -1 | 0.0 |
| 2019-01-16 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-19 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-05 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-22 | 2019-04-22 11:06:33 | B | -1 | 1.0 |
| 2019-04-23 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-09 | 2019-05-09 16:25:41 | C | 2 | 2.0 |
Essentially, I need to make the data look more like the following, so that grouping by date works (e.g. df.groupby('date').resulting_qty.sum()):
Note: due to the character limit I removed the pid = A rows, but hopefully you get the idea:
| date | timestamp | pid | delta_qty | resulting_qty |
|------------|---------------------|-----|-----------|---------------|
| 2019-01-16 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-17 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-18 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-19 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-20 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-21 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-22 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-23 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-24 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-25 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-26 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-27 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-28 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-29 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-30 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-01-31 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-01 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-02 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-03 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-04 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-05 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-06 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-07 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-08 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-09 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-10 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-11 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-12 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-13 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-14 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-15 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-16 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-17 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-18 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-19 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-20 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-21 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-22 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-23 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-24 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-25 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-26 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-27 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-02-28 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-01 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-02 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-03 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-04 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-05 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-06 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-07 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-08 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-09 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-10 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-11 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-12 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-13 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-14 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-15 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-16 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-17 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-18 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-19 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-20 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-21 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-22 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-23 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-24 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-25 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-26 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-27 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-28 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-29 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-30 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-03-31 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-04-01 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-04-02 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-04-03 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-04-04 | 2019-01-16 23:37:17 | B | 0 | 0.0 |
| 2019-04-05 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-06 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-07 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-08 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-09 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-10 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-11 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-12 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-13 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-14 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-15 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-16 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-17 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-18 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-19 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-20 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-21 | 2019-04-05 16:40:32 | B | 2 | 2.0 |
| 2019-04-22 | 2019-04-22 11:06:33 | B | -1 | 1.0 |
| 2019-04-23 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-24 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-25 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-26 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-27 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-28 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-29 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-04-30 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-01 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-02 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-03 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-04 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-05 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-06 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-07 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-08 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-09 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-05-10 | 2019-04-23 13:23:17 | B | -1 | 0.0 |
| 2019-01-19 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-20 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-21 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-22 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-23 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-24 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-25 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-26 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-27 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-28 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-29 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-30 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-01-31 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-01 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-02 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-03 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-04 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-05 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-06 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-07 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-08 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-09 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-10 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-11 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-12 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-13 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-14 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-15 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-16 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-17 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-18 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-19 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-20 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-21 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-22 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-23 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-24 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-25 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-26 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-27 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-02-28 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-01 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-02 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-03 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-04 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-05 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-06 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-07 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-08 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-09 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-10 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-11 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-12 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-13 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-14 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-15 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-16 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-17 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-18 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-19 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-20 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-21 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-22 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-23 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-24 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-25 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-26 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-27 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-28 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-29 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-30 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-03-31 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-01 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-02 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-03 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-04 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-05 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-06 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-07 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-08 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-09 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-10 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-11 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-12 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-13 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-14 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-15 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-16 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
| 2019-04-17 | 2019-01-19 09:40:38 | C | 0 | 0.0 |
What I've done so far is write a set of loops that generate a date range between the minimum date of each product's lifetime and the maximum date across all products. Then, whenever a date has no record, I append the last recorded row's values as a new row with the new date. I collect these in lists and build a new DataFrame from them. The code is extremely slow, taking more than 2 hours to finish on the full dataset:
date_list = []
pid_list = []
time_stamp_list = []
delta_qty_list = []
resulting_qty_list = []
timer = len(test.product_id.unique().tolist())
counter = 0
for product in test.product_id.unique().tolist():
    counter += 1
    print((counter / timer) * 100)  # crude progress indicator
    temp_df = test.query(f'product_id == "{product}"', engine='python')
    for idx, date in enumerate(pd.date_range(temp_df.index.min(), test.index.max()).tolist()):
        min_date = temp_df.index.min()
        if date.date() == min_date:
            # first day of the product's life: copy the recorded row
            date2 = min_date
            pid = temp_df.loc[date2]['product_id']
            timestamp = temp_df.loc[date2]['timestamp']
            delta_qty = temp_df.loc[date2]['delta_qty']
            resulting_qty = temp_df.loc[date2]['resulting_qty']
            date_list.append(date2)
            pid_list.append(pid)
            delta_qty_list.append(delta_qty)
            time_stamp_list.append(timestamp)
            resulting_qty_list.append(resulting_qty)
        else:
            if date.date() in temp_df.index:
                # a record exists for this day: copy it
                date2 = date.date()
                pid = temp_df.loc[date2]['product_id']
                timestamp = temp_df.loc[date2]['timestamp']
                delta_qty = temp_df.loc[date2]['delta_qty']
                resulting_qty = temp_df.loc[date2]['resulting_qty']
                date_list.append(date2)
                pid_list.append(pid)
                delta_qty_list.append(delta_qty)
                time_stamp_list.append(timestamp)
                resulting_qty_list.append(resulting_qty)
            elif date.date() > date2:
                # no record for this day: carry the last recorded row forward
                date_list.append(date.date())
                pid_list.append(pid)
                time_stamp_list.append(timestamp)
                delta_qty_list.append(delta_qty)
                resulting_qty_list.append(resulting_qty)
            else:
                pass
Can someone help me understand the correct approach here? I'm 100% sure this isn't the optimal way.
Thanks
Answer by one*_*pan (score 10)
The idea here is to reindex the DataFrame to fill in your gaps.
Setup: the DataFrame generated from your example:
from io import StringIO

import pandas as pd

buffer = StringIO()
buffer.write('''\
date|timestamp|pid|delta_qty|resulting_qty
2017-03-06|2017-03-06 12:24:22|A|0|0.0
2017-03-31|2017-03-31 02:43:11|A|3|3.0
2017-04-08|2017-04-08 22:04:35|A|-1|2.0
2017-04-12|2017-04-12 18:26:39|A|-1|1.0
2017-04-19|2017-04-19 09:15:38|A|-1|0.0
2019-01-16|2019-01-16 23:37:17|B|0|0.0
2019-01-19|2019-01-19 09:40:38|C|0|0.0
2019-04-05|2019-04-05 16:40:32|B|2|2.0
2019-04-22|2019-04-22 11:06:33|B|-1|1.0
2019-04-23|2019-04-23 13:23:17|B|-1|0.0
2019-05-09|2019-05-09 16:25:41|C|2|2.0
''')
buffer.seek(0)
df = pd.read_csv(buffer, sep='|', parse_dates=['date', 'timestamp'])
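As a quick sanity check (my addition, not part of the original answer), the parsed frame should come back with datetime dtypes on date and timestamp:

print(df.dtypes)
# date             datetime64[ns]
# timestamp        datetime64[ns]
# pid                      object
# delta_qty                 int64
# resulting_qty           float64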
First, we generate a new, gap-free index covering the dates between each product's minimum and maximum date. Given your example, a consequence of this is that a product has no rows after its most recent update. However, this step can easily be tailored to your exact requirements; for example, if you want each date range to run from the product's first entry through today, you can set start and end manually, as in the sketch after the following code.
from itertools import chain, cycle

date_ranges = df.groupby('pid').agg({'date': ['min', 'max']})
pairs = (zip(cycle([pid]), pd.date_range(start, end))
         for pid, (start, end) in date_ranges.iterrows())
new_index = pd.MultiIndex.from_tuples(chain.from_iterable(pairs),
                                      names=['pid', 'date'])
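For instance, here is a minimal sketch of that variant (my addition, reusing date_ranges, chain, and cycle from above; taking today as the endpoint is an assumption), extending every product's range through today:

# extend each product's date range from its first entry through today
today = pd.Timestamp.today().normalize()
pairs = (zip(cycle([pid]), pd.date_range(start, today))
         for pid, (start, _end) in date_ranges.iterrows())
new_index = pd.MultiIndex.from_tuples(chain.from_iterable(pairs),
                                      names=['pid', 'date'])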
Then, we apply the new index. There are two options here:

- fill the remaining columns from each product's last update
- fill delta_qty with 0 and the remaining columns from the last update (this deviates from your request, but seems logical, being only a minor change)

In either case, the .reindex and .fillna methods are the two essential concepts. We can use reindex to expand the dense DataFrame to include all dates, leaving the new rows sparse. Then we fill the NaNs with appropriate data. Since we fill forward from the last update, we specify method='ffill' per the pandas docs:
# this fills the rows per last update
results = df.set_index(['pid', 'date'])\
            .reindex(new_index).reset_index()
results.fillna(method='ffill', inplace=True)
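As an aside (my note, not part of the original answer): recent pandas releases deprecate fillna(method='ffill'), so results = results.ffill() is the equivalent modern spelling.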
This returns
   pid       date           timestamp  delta_qty  resulting_qty
0    A 2017-03-06 2017-03-06 12:24:22        0.0            0.0
1    A 2017-03-07 2017-03-06 12:24:22        0.0            0.0
2    A 2017-03-08 2017-03-06 12:24:22        0.0            0.0
3    A 2017-03-09 2017-03-06 12:24:22        0.0            0.0
..  ..        ...                 ...        ...            ...
24   A 2017-03-30 2017-03-06 12:24:22        0.0            0.0
25   A 2017-03-31 2017-03-31 02:43:11        3.0            3.0
..  ..        ...                 ...        ...            ...
29   A 2017-04-04 2017-03-31 02:43:11        3.0            3.0
for pid == 'A'.
which returns:
   pid       date           timestamp  delta_qty  resulting_qty
0    A 2017-03-06 2017-03-06 12:24:22        0.0            0.0
1    A 2017-03-07 2017-03-06 12:24:22        0.0            0.0
2    A 2017-03-08 2017-03-06 12:24:22        0.0            0.0
3    A 2017-03-09 2017-03-06 12:24:22        0.0            0.0
..  ..        ...                 ...        ...            ...
24   A 2017-03-30 2017-03-06 12:24:22        0.0            0.0
25   A 2017-03-31 2017-03-31 02:43:11        3.0            3.0
..  ..        ...                 ...        ...            ...
29   A 2017-04-04 2017-03-31 02:43:11        0.0            3.0