TOC and APS

The necessary conditions for APS

Correct data

An APS needs correct data because it tries to optimize the operations by calculating the ‘best’ solution based on the available data. In a regular operational environment different scenarios are compared and ‘the best’ scenario is presented. But some data can never be fully accurate, for example:

1. Scrap rates. Depending on the risk, we find either average scrap rates or ‘inflated’ scrap rates that include a lot of protection. In both cases reality will differ from the theoretical value. But in a regular MRP calculation the feeding operations are ‘corrected’ by this scrap rate, which means that the WIP on the shop floor differs from the theoretical WIP. If those scrap rates are significant, all the dependent operations suffer from the differences (see the sketch after this list).
2. Unstable set-ups. The same logic applies to unstable set-ups caused by technical problems or by sequence-dependent set-ups. Yes, we know that in some cases processes can be optimized by choosing the right sequence, but the impact on the throughput can only be measured if this method is applied to a constraint resource. In practice we see that this type of optimization is often also applied to non-constraint resources, where it becomes a waste of effort and leads to unnecessary complexity.
3. Fluctuating process times are the most common source of inaccuracy. They can be reduced by regular improvement procedures, but that takes time, and in the meantime we cannot stop operations…
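
To make the scrap-rate effect from point 1 concrete, here is a minimal sketch in Python. All numbers (net demand, planned and actual scrap rates) are illustrative assumptions, not figures from any real system; it only shows how MRP’s yield correction creates a gap between planned and actual WIP as soon as the real scrap rate deviates from the stored one.

    # Illustrative sketch: a fixed scrap rate in the routing data makes MRP
    # inflate the released quantity; when the actual scrap rate differs,
    # the shop floor ends up with more (or fewer) pieces than planned.

    def mrp_release_qty(net_demand: float, scrap_rate: float) -> float:
        """MRP 'corrects' the feeding operation by inflating the quantity
        so that, after the planned scrap, the net demand is still covered."""
        return net_demand / (1.0 - scrap_rate)

    net_demand = 100      # pieces required by the next operation (assumed)
    planned_scrap = 0.10  # scrap rate stored in the routing data (assumed)
    actual_scrap = 0.04   # what really happens this period (assumed)

    released = mrp_release_qty(net_demand, planned_scrap)  # ~111 pieces
    good_output = released * (1.0 - actual_scrap)          # ~107 pieces

    print(f"released: {released:.1f}")
    print(f"good output: {good_output:.1f}")
    print(f"WIP deviation vs. plan: {good_output - net_demand:+.1f} pieces")

Every feeding operation in the routing repeats this correction, so the deviations compound along the chain of dependent operations.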

Sufficient stability

The previous causes are important drivers of instability, but they are not the complete picture. Planning decisions are made for a certain horizon, and the longer the horizon, the higher the risk of changing demand. Customers request urgent deliveries, or a sudden stock disruption creates a panic wave. Internal problems such as machine breakdowns or missing parts may also cause deviations from the original plan.

As a result, the planning and the resulting detailed schedules change frequently. The more ‘optimization’ is calculated, the more difficult it becomes to stabilize the schedule. Some people even try to optimize within the noise of the system. The reasoning looks acceptable, because 2% of the global operating expenses represents a lot of money! So why not try to optimize down to the last percentage? On the other hand, the accuracy of the data is often less than 95%. And let’s not forget that the optimizations are calculated on dependent events, which means that if only one event fluctuates, the global picture starts to move (a simple simulation of this effect follows below).

This chaotic situation is reinforced by the fact that an APS often tries to find the shortest lead time and thus eliminates the necessary buffers. This is why some managers tend to ‘freeze’ the schedule for a certain horizon, which unavoidably leads to less flexibility to cope with operational problems and unexpected demands. The other solution, re-calculating the schedule for each and every change, leads to chaotic situations that cost a lot of effort and money.
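
A small Monte Carlo sketch (with assumed, purely illustrative numbers) shows the dependent-events effect: each operation can start only when the previous one has finished, so delays propagate downstream while early finishes are lost. Even with symmetric fluctuations around the planned process times, the average lead time drifts above the plan, and the run-to-run spread is far larger than the few percent an optimizer is chasing.

    # Illustrative sketch: a job flows through a chain of dependent
    # operations. Each operation is planned at 60 minutes but actually
    # takes +/-20% (assumed). An operation cannot start before the
    # previous one finishes; an early finish does not help, because the
    # next resource only becomes available at its scheduled time.
    import random

    random.seed(1)

    OPS = 10        # number of dependent operations (assumed)
    PLANNED = 60.0  # planned minutes per operation (assumed)
    NOISE = 0.20    # assumed process-time fluctuation

    def lead_time() -> float:
        clock, scheduled = 0.0, 0.0
        for _ in range(OPS):
            actual = PLANNED * random.uniform(1 - NOISE, 1 + NOISE)
            start = max(clock, scheduled)  # delays propagate, gains are lost
            clock = start + actual
            scheduled += PLANNED
        return clock

    runs = [lead_time() for _ in range(10_000)]
    plan = OPS * PLANNED
    mean = sum(runs) / len(runs)
    print(f"planned lead time: {plan:.0f} min")
    print(f"average simulated: {mean:.0f} min ({(mean - plan) / plan:+.1%})")
    print(f"run-to-run spread: {max(runs) - min(runs):.0f} min")
    print(f"a 2% 'optimization' would be worth {0.02 * plan:.0f} min")

Note that this sketch ignores queuing from other jobs, breakdowns and missing parts; a real shop floor is noisier still.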

Optimization criteria

The optimization criteria must be coherent. Too often we have seen that the first focus for optimization goes to cost reduction. This is translated into rules for batch sizes, set-ups and maximum inventories, while at the same time the objectives for improving delivery performance are maintained. If one tries to put all those rules into an optimization algorithm, it turns out that the global throughput often suffers from the local optimizations. Another conflict is created by the assumptions behind the cost-optimization rules, which are mostly never verified. The classical examples of this type of mistake are rules for batch sizing and ‘economical’ re-order points, which are based on theoretical costs that are not at all variable within the planning horizon, as the sketch below illustrates.
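
As one concrete illustration of the batch-sizing point, consider the classical EOQ rule, Q* = sqrt(2 * D * S / H). The ‘economic’ batch depends entirely on a theoretical set-up cost S and holding cost H that are essentially fixed within the planning horizon. A minimal sketch with assumed numbers:

    # Illustrative sketch: the classical EOQ (economic order quantity)
    # rule. All cost figures are assumptions; the point is how strongly
    # the 'optimal' batch swings with a set-up cost estimate that is not
    # actually variable in the planning horizon.
    from math import sqrt

    def eoq(annual_demand: float, setup_cost: float,
            holding_cost: float) -> float:
        """Q* = sqrt(2 * D * S / H)."""
        return sqrt(2 * annual_demand * setup_cost / holding_cost)

    D = 12_000  # units per year (assumed)
    H = 2.0     # holding cost per unit per year (assumed)

    for S in (50.0, 100.0, 200.0):  # alternative 'theoretical' set-up costs
        print(f"assumed set-up cost {S:5.0f} "
              f"-> 'economic' batch {eoq(D, S, H):5.0f}")

Doubling the assumed set-up cost inflates the ‘economic’ batch by about 41% (a factor of sqrt 2), so a cost figure that was never verified directly drives batch sizes, WIP and lead times.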

Conclusion

Looking at the finite-capacity planning picture, one could get the feeling that trying to improve the throughput while reducing inventory and operating expenses is a mission impossible. If we try to apply the regular (finite-capacity) solutions, the answer is: YES!

To find a realistic solution to this problem we must take into account several prerequisites:

Accept the reality that our data are not, and most probably never will be, 100% accurate.

Accept the fact that the market is asking for short delivery times and very high due-date performance.

Accept the fact that we have to reach better performance with less inventory.

Accept the fact that if we cannot expand our sales (in TOC language: break the market constraint), we will be obliged to gradually reduce operating expenses in a way that does not hurt our throughput.