Granularity in cost calculation – how detailed is really useful?
Many companies believe that the more detailed the cost calculation, the better the basis for decision-making. Controllers develop multi-level allocation models, designers record every screw in the parts list, and calculation teams work with decimal places that suggest more precision than actually exists. But on closer inspection, a paradox emerges: companies invest weeks in creating highly detailed calculations, only to find in the end that the decision has long since been made on the basis of rough rules of thumb – or that the competitor has already won the contract with a faster, less detailed calculation.

The problem is not a lack of expertise or tools. It lies in the fundamental question that many companies rarely ask themselves: How much detail is really necessary for the decision in question? The answer is uncomfortable, because it is not ‘as much as possible’ but ‘as much as necessary’. Between these two extremes lies an area of tension that companies have to navigate on a daily basis: when is a calculation detailed enough to make reliable statements, but at the same time lean enough to be completed on time?
This article examines the often underestimated issue of granularity in cost calculation from a practical perspective. You will learn why both overly rough and overly detailed calculations harbour risks, how to determine the optimal level of detail for different decision-making situations, and what role modern costing solutions can play in this process. Ultimately, it is not about calculating as accurately as possible, but about making better decisions faster.
What does granularity mean in cost calculation?
Granularity describes the level of detail with which costs are recorded, allocated and analysed. In practice, this involves deciding at which level cost objects should be defined, cost drivers identified and data structured. At product level, the question arises as to whether a complete end product, individual assemblies or each individual part should be calculated separately. At the process level, the question is whether manufacturing steps are considered as a whole or broken down into sub-operations. At the resource level, a distinction can be made between average machine hourly rates for entire production areas or specific cost rates for individual machines and systems. Each of these levels can be mapped in varying degrees of detail, and the combinations thereof result in the overall granularity of a calculation.
Why companies calculate too roughly
In many companies, it is customary to calculate costs rather roughly, often due to a combination of habit, lack of data and time pressure. Lump sums, surcharges or average values have established themselves as standard because they are quickly available and seem sufficient at first glance. They offer a pragmatic solution for situations in which detailed information is lacking or would be too time-consuming to collect.
The most common reason for rough calculations is a lack of data availability. In early development phases, there are no detailed design drawings, no final parts lists and no validated process parameters. Which machine will be used later, what cycle times are realistic or what reject rates are to be expected – none of this information is available yet. Even in later phases, reliable data is often missing: ERP systems contain outdated machine hour rates, material prices fluctuate more than the calculation reflects, and actual throughput times deviate considerably from planned values. Companies then calculate with the data that is available, but not with the data that would be ideal.
In addition, there is massive time pressure, especially in the quotation and development phase. When a customer makes an enquiry, they often expect a reliable quotation within a few days. Several projects are running in parallel, designers are working at full capacity and the calculation has to be done on the side. Under these conditions, there is no time for detailed process analyses or complex simulations. Instead, companies fall back on tried and tested rules of thumb. This approach works as long as products and processes do not change fundamentally, but fails precisely when innovation or cost pressure would require real transparency.
The risks of rough calculations often only become apparent when it is too late. Incorrect prices are the most obvious consequence: products are offered at too high a price and the order is lost, or they are calculated too favourably and every unit sold erodes the margin. Make-or-buy decisions are particularly dangerous: if the company's own production costs are only estimated as a lump sum, decisions on external procurement can be made on the wrong basis, with the result that profitable production capacities are reduced or unprofitable processes are kept in-house. At least as critical are hidden cost drivers that remain invisible in rough calculations. If, for example, the costs for custom-made products, complex geometries or additional quality checks disappear in general overhead rates, there is no basis for targeted optimisation.
Why companies calculate in too much detail
While some companies work with overly rough calculations, others suffer from the opposite problem: they calculate in such detail that the calculation itself becomes a bottleneck. Every screw is recorded individually, every work step is planned to the second, every overhead cost item is calculated on three levels. The result is costing models with hundreds of parameters that take weeks to create and tie up entire teams. What begins as a quest for precision often ends in ‘analysis paralysis’ – a state in which so much is analysed that decisions are delayed or, in the worst case, not made at all.
There are many reasons for excessive depth of detail. It is often the fear of making the wrong decisions that leads to ever more detailed analyses. Out of professional ambition, controlling departments develop highly complex allocation models that can theoretically allocate every cost type exactly. The possibilities offered by software systems tempt teams to record ever more detail. And last but not least, some companies are convinced that only a fully calculated item is really reliable. The result is a costing system that is impressively complex, but whose benefits are disproportionate to the effort involved. At the same time, the maintenance effort for master data, calculation models and parameters grows disproportionately. Every change, every adjustment requires extensive updates, which ties up resources and restricts flexibility.
The phenomenon of pseudo-precision is particularly problematic: calculations show results accurate to several decimal places, even though key input variables are based on assumptions or outdated data. A typical example: the production costs of a component are calculated at 47.23 euros – determined from detailed process times, machine hour rates and material consumption. In reality, however, the material price used is three months old, the cycle time comes from a simulation without validation in series production, and the machine hourly rate does not take into account the current capacity utilisation situation. The precise result suggests an accuracy that does not actually exist. Worse still, the complex calculation distracts from the really crucial questions. Instead of discussing whether the component should be manufactured in this form at all, there is a debate about cent amounts in secondary items. Precisely calculated, but wrongly decided.
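A small sketch illustrates the pseudo-precision problem. The formula and all figures are invented for illustration: a unit cost is composed of material plus machine time, and the same formula is evaluated once as a point estimate and once with plausible input ranges (stale material price, unvalidated cycle time, utilisation-dependent machine rate).

```python
# Illustrative sketch only: how a seemingly precise unit cost dissolves
# once input uncertainty is made explicit. All figures are invented.

def unit_cost(material_eur, cycle_time_s, machine_rate_eur_h):
    """Unit cost = material cost + machine time share."""
    return material_eur + cycle_time_s / 3600 * machine_rate_eur_h

# The point estimate looks precise to the cent.
point = unit_cost(material_eur=28.50, cycle_time_s=240, machine_rate_eur_h=280.95)
print(f"point estimate: {point:.2f} EUR")  # 47.23 EUR

# The same formula with plausible tolerances on the inputs
# (material price +/-10%, cycle time -10%/+20%, rate -15%/+15%):
low = unit_cost(28.50 * 0.9, 240 * 0.9, 280.95 * 0.85)
high = unit_cost(28.50 * 1.1, 240 * 1.2, 280.95 * 1.15)
print(f"realistic range: {low:.2f} to {high:.2f} EUR")
```

The two decimal places of the point estimate are meaningless next to a range of roughly 40 to 57 euros; the honest statement is the band, not the cent amount.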
Finally, there are situations in which calculations are simply too slow for the dynamics of the market. If it takes two weeks to prepare a detailed calculation, but the competition submits binding offers within three days, greater accuracy is of little use. In fast-moving markets or for initial enquiries, the speed of response often counts more than the decimal place. Companies that focus on maximum detail risk being systematically late. The supposedly better calculation becomes a competitive weakness because it makes the company too sluggish. This also shows that more detail is not automatically better.
The correlation between granularity and decision-making purpose
The question of the right granularity cannot be answered in general terms – it depends directly on the purpose of the calculation. A calculation always serves a concrete decision: Do we accept the offer? Do we continue to develop the product in this variant? Do we manufacture ourselves or do we buy in? Do we optimise this process or that one? Each of these decisions places different demands on the level of detail in the calculation. What is essential for one question may be completely irrelevant for another.
The difference is particularly clear between quotation costing and development-related costing. In the case of quotation costing, the main question is: At what price can and do we want to offer this product? What is important here is a reliable overall cost estimate that covers all the main cost blocks – material, production, assembly, overheads. The level of detail must be sufficient to identify risks and calculate the price competitively, but it does not have to cover every single step. Costing during development works differently: the aim here is to identify cost drivers and highlight optimisation potential. Which design variant is more cost-effective? Which tolerance specification causes disproportionately high production costs? A high level of granularity is required at critical points where decisions can be specifically influenced.
There are also fundamental differences between strategic and operational decisions. Strategic decisions such as the choice of location, technology investments or make-or-buy decisions require a calculation that correctly depicts orders of magnitude and makes critical assumptions transparent. Whether a component costs 49 or 51 euros is less relevant to the strategic question ‘Do we build our own production facility?’ than the basic cost structure and economies of scale. Bandwidths and scenarios can be used here. For operational control decisions, on the other hand, such as the question of which product mix is manufactured on which machine, details can be decisive. Here, machine hour rates, set-up times and batch size effects must be precisely recorded because they directly influence daily production planning.
The product life cycle is another key influencing factor. In the early phase, when only concept sketches or initial design drafts are available, a high level of granularity is simply not possible, nor does it make sense. The aim here is to determine an order of magnitude using parametric models and cost driver analyses: Are we in the region of 100, 500 or 2,000 euros per unit? Which assembly is the biggest cost driver? Detailed process calculations would be premature precision. In later phases, when design and processes are fixed and series production starts, the need for depth of detail increases: actual production costs must now be compared with planned costs, deviations analysed and optimisations implemented. This is where granular costing pays off because specific levers can be identified.
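An early-phase parametric model of the kind described above can be sketched in a few lines. The drivers (weight, part count) and all coefficients are purely hypothetical; the point is the form of the answer: a coarse band for the order of magnitude rather than a false point value.

```python
# Hedged sketch of a parametric early-phase estimate. The cost drivers
# and coefficients below are invented for illustration, not real data.

def parametric_estimate(weight_kg, part_count):
    """Coarse driver-based model: base share + weight and complexity terms.

    Returns a (low, nominal, high) band of -30% / +30% around the
    nominal value, reflecting early-phase uncertainty.
    """
    base = 40.0  # assumed fixed handling/overhead share, EUR
    nominal = base + 6.5 * weight_kg + 3.0 * part_count
    return 0.7 * nominal, nominal, 1.3 * nominal

low, nominal, high = parametric_estimate(weight_kg=12.0, part_count=25)
print(f"order of magnitude: {low:.0f} to {high:.0f} EUR "
      f"(nominal {nominal:.0f} EUR)")
```

Two parameters and a band are enough to answer the early-phase question "100, 500 or 2,000 euros per unit?"; a routing-level calculation at this stage would only simulate precision that the inputs cannot support.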
To summarise: the purpose determines the necessary level of detail. The best calculation is not the most detailed, but the one that provides the relevant information in the appropriate depth for the specific decision to be made. Companies that understand this correlation can use their costing resources in a targeted manner where they create the greatest added value, instead of making depth of detail a principle.
Influencing technological factors
The discussion about the right granularity has been fundamentally changed by technological developments in recent years. Modern costing software now enables a level of flexibility in terms of detail that was previously unthinkable. Instead of having to decide between ‘coarse’ or ‘detailed’, companies can increasingly control which level of detail is used depending on the situation.
The decisive difference lies in the ability to maintain different levels of granularity in parallel and activate them as required. Conventional costing systems required prior specification: either costing was done at assembly level or at individual part level, either with flat-rate production surcharges or with detailed routings. Modern solutions, on the other hand, allow the same assembly to be estimated as a lump sum using similarity analyses, roughly dimensioned using parametric models or fully calculated – depending on the project phase, the information available and how critical the component is. This flexibility not only reduces the effort involved, but also the risk: companies no longer have to commit prematurely to a level of detail that later turns out to be unsuitable.
The integration of CAD, ERP and MES systems opens up additional possibilities, but also harbours challenges. CAD systems can automatically extract geometric parameters that serve as the basis for parametric calculations, e.g. volumes, surfaces, materials, weights, complexity indices. ERP systems provide current material and resource costs as well as capacity utilisation information. MES systems feed back real process data: actual cycle times, set-up times, reject rates. In theory, this integration makes it possible to keep calculations highly automated and always up-to-date. However, practice shows that the opportunities can only be realised if data quality and consistency are guaranteed across all systems. Different part numbering, inconsistent master data or missing interfaces can lead to integration creating more problems than it solves. There is also a risk that technical feasibility will lead to excessive detailing: Just because a system can read every parameter from the CAD does not mean that all these parameters are relevant for the calculation.
Conclusion
The key insight is that granularity is not a quality feature in itself, but a design parameter that must be consciously adapted to the respective decision-making purpose. A calculation in early product development requires a different level of detail than a quotation calculation, strategic decisions require different information than operational control, and different priorities are relevant in the series production phase than in post-calculation. Companies that understand these differences and differentiate their costing practices accordingly waste less time on irrelevant details and at the same time gain the crucial insights where they are really needed.
In the end, it's not about calculating as accurately as possible, but about making better decisions faster. A calculation that makes the relevant cost drivers transparent in a reasonable amount of time and thus enables well-founded decisions is more valuable than a perfectly calculated analysis that comes too late or gets bogged down in details that nobody uses. Granularity should therefore not be seen as a technical issue, but as a strategic one: Where is it worth going into detail, where is orientation enough, and how can resources be used to maximise the gain in knowledge? Companies that can answer this question for themselves not only calculate more efficiently - they also gain a real competitive advantage.

Professional costing software from 4cost
The software and service solutions from 4cost provide you with maximum cost transparency in all phases. For improved cost control and increased profitability.
Request a no-obligation presentation now. Our experts will be happy to advise you on the right solutions for your company.
