A series of products that help master complexity and solve problems in industrial settings.
Solution for the automatic quality assurance of complex BOMs
Designs of complex product variants or their sub-assemblies in discrete manufacturing can contain hidden, hard-to-find BOM errors with severe impacts on production, parts logistics, and lifecycle maintenance.
KENDAXA BOM Validator is an AI-based, continuously self-learning solution for the automatic quality assurance of complex BOMs (Bills of Materials) in discrete variant manufacturing. Its core algorithms are system-agnostic and can optionally be integrated with different ERP and PLM data sources.
From historical parts and variant BOM data, BOM Validator learns what BOMs for a product family should look like, including alternative parts and structures. Based on the results of this continuous association rule mining process and a complementary set of expert knowledge rules, the system identifies potential errors and anomalies with a high level of confidence and labels the affected parts of a BOM as:
- incorrect / incompatible
- missing
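The association-rule idea behind this check can be illustrated with a short sketch. This is a simplified, hypothetical example rather than the actual BOM Validator algorithm: it mines only pairwise "part A implies part B" rules from historical BOMs and then flags parts that a high-confidence rule expects but that are missing from a new BOM.

```python
from itertools import combinations
from collections import Counter

def mine_pair_rules(historical_boms, min_support=0.3, min_confidence=0.9):
    """Mine simple 'part A implies part B' co-occurrence rules from historical BOMs."""
    n = len(historical_boms)
    item_counts = Counter()
    pair_counts = Counter()
    for bom in historical_boms:
        parts = set(bom)
        item_counts.update(parts)
        pair_counts.update(combinations(sorted(parts), 2))
    rules = []
    for (a, b), pair_n in pair_counts.items():
        if pair_n / n < min_support:
            continue
        # Rule ante -> cons holds with confidence P(cons | ante).
        for ante, cons in ((a, b), (b, a)):
            confidence = pair_n / item_counts[ante]
            if confidence >= min_confidence:
                rules.append((ante, cons, confidence))
    return rules

def check_bom(bom, rules):
    """Flag parts that high-confidence rules expect but that are missing."""
    parts = set(bom)
    return [(ante, cons, conf) for ante, cons, conf in rules
            if ante in parts and cons not in parts]
```

Given a history in which every frame came with both backrests, checking a BOM that contains only the left backrest would flag the missing right one as a rule violation.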
Significant, highly automated reduction of the number of hidden errors in BOMs: up to 95% of the errors that go undetected by traditional quality assurance systems can be found.
BOM Validator can perform even very complex checks in near real time, typically within a fraction of a minute.
It sees the whole BOM at once, so even a complex check is completed in a single checking iteration.
Cost savings due to earlier error detection and prevention of follow-up costs (repairs, scrap disposal, etc.)
Reduction of engineering time and manual quality assurance processes
Shorter lead times, accelerated processes, improved quality
Want To Know More?
The size of the company is not that important; the relative complexity of the product is.
BOM Validator is ideal for companies with their own product lines whose customer orders typically require a certain level of customization.
It is particularly suitable for production environments where errors are very costly.
The greater the customization possibilities, the greater the risk of errors and the greater the need for BOM Validator.
Typical errors that occur:
- Wrong part
- Wrong quantity
- Wrong position
- Wrong attribute, such as wrong color, finish, etc.
- Missing mirror part, e.g., a left chair backrest without its matching right one
- Faulty material
- Part incompatible with the rest of the BOM, e.g., a color that is not suitable for the material
Basically, we need data on historical BOMs. This data is usually stored in a PLM system, which the customer uses as standard for designing and maintaining production documentation and BOMs.
Most PLM systems have an interface for reading data from their databases, which is exactly what we need to access the data. The rest can be done entirely outside of these systems. Of course, the greater the degree of integration, the greater the ease of use, but integration is not a prerequisite.
We continue to work on standard, close integrations with the largest players in the PLM market, such as Siemens, PTC, SAP, and Dassault.
On the contrary, our solution aims to speed up the process of designing new BOMs, especially those containing tens of thousands of components, where manual validation would take a very long time. BOM Validator validates BOMs automatically in minutes, and the user simply reviews the flagged errors.
Another advantage is that by avoiding errors that would otherwise be found at a later stage of production, we speed up the entire production process and avoid the delay and cost of remanufacturing.
The end user, most often the designer of the proposed technical solution, has a “BOM validation” button in the application used to design Bill of Materials structures. Clicking this button redirects the user to the graphical interface of the BOM Validator, which shows the validation result for the proposed solution. The user therefore gets immediate feedback on whether the design contains an error. Validation parameters are detected automatically based on learned patterns, but they can also be set manually.

After reviewing a reported error, the user can mark it as invalid, or specify the conditions under which it is or is not an error. This guarantees that the user makes the final decision. It also creates feedback that BOM Validator keeps for further use: extracting new rules, adjusting the weights of existing rules, and improving the validation itself.
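One way such feedback can adjust rule weights is a simple learning-rate update, sketched below. This is an illustrative assumption, not a description of BOM Validator's actual weighting scheme: weights live in [0, 1], a confirmed error pulls the weight toward 1, and a finding marked invalid pulls it toward a small floor so the rule can still recover later.

```python
def update_rule_weight(weight, user_confirmed, lr=0.2, floor=0.05):
    """Move a rule's weight toward 1.0 when the user confirms a flagged
    error, and toward 0.0 (clipped at a small floor) when the user marks
    the finding as invalid."""
    target = 1.0 if user_confirmed else 0.0
    return max(weight + lr * (target - weight), floor)

# A rule that users repeatedly mark as invalid decays geometrically,
# so it soon stops dominating the validation results.
w = 0.9
for _ in range(10):
    w = update_rule_weight(w, user_confirmed=False)
```

The multiplicative decay means noisy one-off feedback shifts a weight only slightly, while consistent feedback moves it decisively.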
It depends on the complexity of the products, but in general, the more complicated and expensive the product, the more expensive a design error can be. In addition, for a large order where hundreds of product units must be reworked, the costs rise quickly. There may also be additional costs such as contractual penalties for late delivery.
In general, the sooner an error is intercepted, the lower the cost.
The acquisition often pays for itself very quickly.
Finding identical or most similar parts using 3D models
With a wide range of similar parts used in discrete manufacturing, some (new) parts could often be replaced by other (existing) ones. But because these are stored under different numbers, names, or codes, such opportunities are hard to recognize and take advantage of.
This results in additional engineering, logistics, production, and even maintenance costs as new parts are being developed or procured from third parties while there would be alternatives ready to use in stock.
Based on drawings and 3D models, the KENDAXA Intelligence Platform can calculate the similarity of different parts of the same kind (e.g., bearings, gears, special screws) and their substitutability.
This helps reuse existing parts. It is also very useful when a spare part is urgently needed but out of stock, as the platform can find other available parts with identical or similar properties that could be used instead.
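A minimal sketch of such a similarity search is shown below. It assumes that a few geometric features (volume, surface area, bounding-box dimensions) have already been extracted from each part's 3D model; a production system would use much richer shape descriptors, and the field names here are illustrative only.

```python
import math

def descriptor(part):
    """Geometric fingerprint of a part, assumed precomputed from its 3D model."""
    lx, ly, lz = sorted(part["bbox"])  # sort dims to be orientation-invariant
    return [part["volume"], part["area"], lx, ly, lz]

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def most_similar(query, catalog, top_k=3):
    """Rank catalog parts by descriptor similarity to the query part."""
    q = descriptor(query)
    scored = [(cosine_similarity(q, descriptor(p)), p["id"]) for p in catalog]
    return sorted(scored, reverse=True)[:top_k]
```

Because the descriptors can be computed once per part and stored, querying for substitutes reduces to comparing small feature vectors, which matches the one-time pre-processing benefit mentioned below.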
Prevents redundant production or purchasing when alternatives are already available in stock. For some specific part types, redundancy can be reduced by more than 35%.
Significantly decreased wait times and delays.
Low consumption of computational resources thanks to one-time pre-processing of the models in the database.
Master Data Refinement
Cleansing, deduplication and enrichment of data.
A high quality of material master data is of enormous importance for numerous departments and processes in the company. Incorrect, incomplete or outdated material masters have a negative impact on the processes of the entire value chain. A typical result of careless data maintenance is multiple data records. When duplicates exist, it is not clear which is the correct record and whether the information is reliable. Manual duplicate checking takes a lot of time because the correct record has to be identified and duplicate records have to be merged.

The results: longer process cycle times, lower efficiency, errors in the BOM, delays in delivery and wrong pricing. The areas of the company that benefit from intelligent material master data management are scheduling, purchasing, work preparation, production, warehouse management, quality management, accounting, service, sales and engineering.
Given the consequences of poor data quality, it is worth analyzing existing errors and information gaps in the data and remedying the resulting problems. The software puts master data maintenance on a stable footing by offering the option of defining your own rules and designing processes to fit the conditions and requirements in your own company.
Duplicates in product data and material master data regularly cause enormous – and above all unnecessary – costs. They falsify inventory evaluations and show incorrect stock levels. They increase process costs and capital commitment without any productivity benefit. And they prevent purchasing from exploiting the savings available from larger procurement quantities. Only with high data quality in your ERP system can you ensure that neither too much nor too little material is kept in stock. Valid, consistent product and material data is also a prerequisite for a uniform view of your company information – for instance in finance and controlling, production and logistics, materials management, and sales and marketing.
Part name standardization
Mirror part linking & Assignment correction
Business rule enforcement
Consistent information on materials, products, items, goods, supplies, etc.
Efficient supply chain management: on-time processing of purchases and deliveries
Support for compliance with regulations such as food and drug safety and transportation guidelines
A reliable source with all the latest information on classifications, parts lists, documents and drawings, volumes and EAN/packaging materials
Cost savings through more efficient processes around product master data
Sustainable cost reduction by optimizing pricing through consistent product master data throughout the product lifecycle
Forecasting models and optimization
Time series are everywhere around us, running through every business process: the number of incoming calls, daily visitors in a shop, online orders and leads, machine failures, and so on. All these factors have a direct impact on your business, and correct forecasting is necessary for planning further steps: staff shift planning, production planning, stock capacity planning, logistics optimization, machine service planning, etc.
We perform a comprehensive analysis of your historical data, taking into account trends and detected anomalies. This allows us to identify the key external inputs that significantly impact your business. Based on this, we select a forecasting model and train (and retrain) it with different external factors. The model is then fitted to the data, and the final result is delivered as an intuitive, easily readable output including a graphical visualization.
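The core idea of forecasting from history plus external inputs can be sketched very simply. This is a minimal illustration, assuming a single external input and a plain linear model with lag features fitted by ordinary least squares; the actual models, inputs, and retraining scheme are chosen per project.

```python
import numpy as np

def fit_ar_forecaster(series, exog, n_lags=3):
    """Fit a linear model y_t ~ intercept + recent lags of y + external input,
    using ordinary least squares."""
    y = np.asarray(series, dtype=float)
    x = np.asarray(exog, dtype=float)
    rows, targets = [], []
    for t in range(n_lags, len(y)):
        rows.append(np.concatenate(([1.0], y[t - n_lags:t], [x[t]])))
        targets.append(y[t])
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return coef

def forecast_next(coef, recent, next_exog):
    """One-step-ahead forecast from the last n_lags observations and the
    expected value of the external input."""
    features = np.concatenate(([1.0], np.asarray(recent, dtype=float), [next_exog]))
    return float(features @ coef)
```

Retraining then simply means refitting the coefficients as new observations and external factors arrive.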
An important source of information for different business planning processes such as capacity and material planning, production and maintenance planning, and financial planning.
Minimization of risks in decision-making.
Significant decrease in the total cost of insurance settlements related to supply chain events.
Prevention of up to 80% of additional and unnecessary costs.