
Industry 4.0 and The Future of Manufacturing



I would like to share a few thoughts about how modeling and converging architecture disciplines can help manufacturing organizations (aerospace, automotive, pharma, appliances, electronics) take advantage of Industry 4.0. Much of the recent excitement around IoT is focused on auditing performance and monitoring anomalies, which is a natural extension of the instrumentation practices we have had for years. The exponential growth in storage capacity and compute resources has opened the door to mining this information to identify opportunities across asset lifecycles, optimizing maintenance in alignment with operating effectiveness (efficiency, quality, performance). There are a few key areas where models and architecture can be introduced to further the advancement of Industry 4.0:


Analytics:

The use of algorithms to provide analytics makes sense; however, there are some challenges that come with building analytics, especially when you introduce scale. Building analytic models is only one part of the puzzle: we also need to build quality models that can assess the accuracy of those analytics and minimize the likelihood of significant error propagating through the data (I spent six years doing this, and it is not for the faint of heart). So, we would use analytics (linear and non-linear algorithms) to assess the accuracy of the analytics used to monitor performance, predict failure rates, and interpret the impact of unplanned changes within inputs, processing, and outputs (IPO).
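
As a minimal sketch of what such a "quality model" could look like in practice, the snippet below (hypothetical function names, illustrative data) watches the residuals of a predictive analytic and flags when its recent error drifts away from the long-run baseline, so that bad predictions do not silently propagate downstream:

```python
# Hedged sketch: a quality check that assesses the accuracy of another analytic.
import numpy as np

def assess_analytic_quality(predictions, actuals, window=50, threshold=2.0):
    """Compare recent prediction error against the long-run baseline error."""
    residuals = np.asarray(actuals) - np.asarray(predictions)
    baseline_mae = np.mean(np.abs(residuals))          # long-run error level
    recent_mae = np.mean(np.abs(residuals[-window:]))  # error over the latest window
    drifting = recent_mae > threshold * baseline_mae   # simple drift signal
    return {"baseline_mae": baseline_mae, "recent_mae": recent_mae, "drifting": drifting}

# Illustrative use: failure-rate predictions for a line of machines vs. observed outcomes
report = assess_analytic_quality(predictions=np.random.rand(500),
                                 actuals=np.random.rand(500))
print(report)
```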


Identity:

A lot of emphasis is placed on the assets involved in Industry 4.0, as they are the "nodes" being instrumented. Most of these assets are physical, but many more are virtual (logical devices, systems, virtual locations in networks, services, processes, bots, etc.). The challenge that many organizations are struggling with is identifying these assets. I see this in organizations with large numbers of SKUs that have to deal with serialization (pharma, with its traceability requirements up and down the supply chain). These challenges are not easily handled with conventional approaches for managing identity in security solutions (IDAM). This is where new models and solution architectures can play a pivotal role. Think about the challenge of temporal nodes (identities), which occur with emergent processes and services. These are self-assembled responses to demand for one or more outcomes.
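
One way to picture this is a single identity record that can cover physical assets, virtual assets, and short-lived temporal nodes alike. The sketch below is purely illustrative (the schema and labels are assumptions, not a standard):

```python
# Hedged sketch: one identity model spanning physical, virtual, and temporal assets.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional
from uuid import uuid4

@dataclass
class AssetIdentity:
    kind: str                      # e.g. "physical", "logical-device", "service", "process", "bot"
    label: str                     # human-readable name, SKU, or serial reference
    asset_id: str = field(default_factory=lambda: str(uuid4()))
    created_at: datetime = field(default_factory=datetime.utcnow)
    expires_at: Optional[datetime] = None   # set for temporal nodes that self-assemble and dissolve

# A serialized SKU and an emergent, self-assembled process get the same treatment
sku = AssetIdentity(kind="physical", label="SKU-000123/serial-9987")
emergent = AssetIdentity(kind="process", label="order-fulfilment-run",
                         expires_at=datetime(2030, 1, 1))
```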


RPA provides a scripted response with multiple defined pathways. As RPA footprints mature, shared services and components are introduced to create scale and promote reuse. So, when a process is executed, it has a specific identity at that moment, along with the identities of all the constituent components used for the final delivery of the service. Should an issue occur, the traceability of the parts and their context (data, events, semantics, stakeholders, etc.) would need to be analyzed to assess root cause and explain causality and impact.
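
A minimal sketch of that idea might look like the record below: each execution of an RPA process captures its own identity plus the identities of the shared components and the context it ran in, so root cause can be reconstructed later. The names and structure here are assumptions for illustration only:

```python
# Hedged sketch: an execution record tying a process run to its constituent components.
from dataclasses import dataclass, field
from datetime import datetime
from uuid import uuid4

@dataclass
class ExecutionRecord:
    process_name: str
    component_ids: list[str]   # identities of the shared components used in this run
    context: dict              # data, events, stakeholders, etc. at execution time
    execution_id: str = field(default_factory=lambda: str(uuid4()))
    executed_at: datetime = field(default_factory=datetime.utcnow)

run = ExecutionRecord(
    process_name="invoice-reconciliation",
    component_ids=["ocr-service:v3", "erp-connector:v12", "approval-bot:v2"],
    context={"plant": "A-17", "trigger": "nightly-batch"},
)
```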


This is simple compared to the more elaborate situations where processes and services are being assembled to respond to one or more demands. As process design moves from modeling into dynamic assembly of activities, each activity will now need its own identity, and the decisions used to guide assembly and directionality (selecting a given pathway or executing simultaneous pathways of activities) will need their own identities. This gets even more interesting as assembled orchestration involves components that are controlling other devices and, increasingly, controlling the end product being used by a customer. The lines between product, service, and experience become increasingly blurry when viewed through current perspectives. We will need to look more closely at smaller and smaller components and find ways of identifying them so we can trace, audit, and understand the performance of processes, services, and products across the span of the supply chain and value chain.
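
To make that concrete, here is a hedged sketch (hypothetical names, toy logic) of a dynamically assembled flow where both the activities and the routing decisions that chose between pathways carry their own identities, so the full assembly can be traced afterwards:

```python
# Hedged sketch: per-activity and per-decision identities in a dynamically assembled process.
from uuid import uuid4

def new_id(prefix):
    return f"{prefix}-{uuid4().hex[:8]}"

assembly = []   # trace of everything that took part in this assembled process

def run_activity(name, fn, payload):
    assembly.append({"id": new_id("act"), "activity": name})
    return fn(payload)

def decide(name, condition, if_true, if_false):
    chosen = if_true if condition else if_false
    assembly.append({"id": new_id("dec"), "decision": name, "chosen_path": chosen})
    return chosen

# Assemble a tiny two-step flow: inspect, then route to rework or packaging
result = run_activity("inspect", lambda p: {"defect": p["score"] < 0.8}, {"score": 0.74})
path = decide("route-after-inspection", result["defect"], "rework", "packaging")
run_activity(path, lambda p: p, {})

print(assembly)   # every activity and decision now carries its own identity
```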


Relationships:

So, a lot of work is focused on capturing the "nodes," but very little is known or captured about the relationships between these nodes. That is where causal factor analysis often becomes very difficult (time consuming, resource intensive, and perhaps incomplete). For example: suppose a cutting tool needs to have its blade sharpened or replaced. How do you make the right decision about when to perform maintenance on something as "simple" as a cutting blade? What are the temporal impacts of making that decision? How would an algorithm address the complexities of causality and balance the need for sub-optimal cost management in supply chain operations against the potential impacts on fabrication (downstream assembly) versus operational performance (the product in use by the customer)? Trying to insert all of these variables into a matrix will drive compute costs up (to say nothing of the energy needed to support those complex analyses).
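
One lighter-weight alternative, sketched below with entirely hypothetical node names, is to represent the relationships explicitly as a small graph and walk only the edges that matter for the blade-maintenance decision, rather than loading every variable into one large matrix:

```python
# Hedged sketch: relationships between nodes as a simple directed graph.
edges = {
    "blade-wear":      ["cut-quality", "cycle-time"],
    "cut-quality":     ["downstream-assembly-rework"],
    "cycle-time":      ["line-throughput"],
    "maintenance-now": ["spare-parts-stock", "line-downtime"],
}

def downstream_impacts(node, graph, seen=None):
    """Collect everything a node can influence by following its relationships."""
    seen = seen or set()
    for nxt in graph.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            downstream_impacts(nxt, graph, seen)
    return seen

# The set of downstream effects a sharpening decision would need to weigh
print(downstream_impacts("blade-wear", edges))
```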


Our brains are able to handle these much more efficiently because we have the ability to map models on demand, run the data (as best we can) through those models, and quickly arrive at a "gut reaction" for a decision. Of course, this used to work fine before we started to connect everything in the supply chain into a large-scale Digital Twin. More opportunity, perhaps, to evolve models, analytics, and machine learning (to identify relationships, instantiate them, and help attribute them in support of the analytics).


Scenarios:

Part of the challenge with building these Digital Twins is the need to formalize how scenarios are captured, scope them out, and identify natural boundary points for separating responsibilities and simplifying the problem space so compute resources can be focused more effectively. I am a big fan of DDD (domain-driven design) because it can help articulate domains, resources, and capabilities (yes, this is where I believe we can incorporate business architecture more effectively) into a dynamic model that can guide the modeling of scenarios within agreed-upon boundary points. Done right, we might actually be able to incorporate the elusive requirements into this space to augment decisions and impact analysis about how to drive change management using What-If planning for operations (initially), for products (over time), and for the business (eventually).
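
As a hedged illustration of that idea (the domain, capabilities, and parameters below are assumptions, not a prescribed model), a bounded context that owns its capabilities gives a natural boundary for scoping a What-If scenario before it is pushed into the digital twin:

```python
# Hedged sketch: a DDD-flavoured bounded context that scopes What-If scenarios.
from dataclasses import dataclass, field

@dataclass
class BoundedContext:
    name: str
    capabilities: list[str]
    scenarios: list[dict] = field(default_factory=list)

    def what_if(self, description, changed_parameters):
        """Register a scenario that stays inside this context's boundary."""
        scenario = {"description": description, "changes": changed_parameters}
        self.scenarios.append(scenario)
        return scenario

maintenance = BoundedContext(
    name="Asset Maintenance",
    capabilities=["condition monitoring", "maintenance scheduling", "spare parts planning"],
)
maintenance.what_if("defer blade sharpening by one shift",
                    {"sharpening_delay_hours": 8})
```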


Many of these challenges can be addressed today using Biz:DNA© models and applying them with conventional technologies. As the key areas involving models and architecture disciplines converge to form a digital twin, organizations will be able to build scale into their supply chains and value chains to manage the accelerating pace of complexity and change associated with the increasingly democratized future of manufacturing.
