Part 1 of this article discussed the new ways of interacting with connected products and how manufacturers will change their approach to development, operations and service.
Part 2 maps out how this change will create new opportunities, and new frontiers of competition, for software vendors who target the smart product sectors.
The scope gets bigger, again
Removing product switches and displays makes some things simpler, but not enough to turn the tide of growing complexity.
Handling the transition to a smart product is tough because of the multiple technologies involved.
Trade-off decisions are now even more complex – so much so that a systems-engineering discipline may be needed to avoid a committee vote on every decision!
A smart connected product, sold with operation or service agreements, means a much stronger connection between the engineering team and the product in operation.
Instead of engineering being largely isolated in the old ‘development’ and ‘production’ parts of the organisation (see figure 1), data streams from the product now provide a high-fidelity view of the product in operation.
This will help calibrate simulations. And the new service team will be fiercer than any customer in feeding back problems.
Figure 1: Some of the changing engineering dataflows
New life in the field
Product function and performance depend on all of its components (including the software), as well as on the capabilities of the connected back-end systems.
So, development engineers (and, of course, the sales and marketing teams) have a new method of providing new capabilities – update the software (and remember to update the as-maintained records).
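The ‘update the software, and update the as-maintained records’ point can be sketched in code. This is a minimal, hypothetical illustration – the `Unit` record and `apply_update` function are invented for this article, not any vendor’s API – showing an over-the-air update that also appends to the unit’s as-maintained history so engineering data stays in sync.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Unit:
    """Hypothetical in-service product instance."""
    serial: str
    firmware: str
    as_maintained: list = field(default_factory=list)  # audit trail of field changes

def apply_update(unit: Unit, new_version: str) -> None:
    """Push new firmware AND record the change in the as-maintained history."""
    old = unit.firmware
    unit.firmware = new_version
    unit.as_maintained.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "from": old,
        "to": new_version,
    })

unit = Unit(serial="SN-1001", firmware="2.3.0")
apply_update(unit, "2.4.0")
print(unit.firmware)            # 2.4.0
print(len(unit.as_maintained))  # 1
```

The key design point is that the record update is part of the same operation as the firmware push, so the as-maintained view can never silently drift from what is actually running in the field.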
Caught in the dataflows?
It is easy to imagine engineering teams getting caught out by the volume, frequency, scope and detail of even these new dataflows – and we haven’t even mentioned software configuration, support for resellers who want to demonstrate the new capabilities, or coordinating a new software baseline with production and test.
Fortunately, for most design and manufacturing organizations, this is familiar territory, given that engineering dataflows and processes have been getting more and more complicated for decades, for a range of reasons including:
- distributed development teams
- global supply chains
- gaining regulatory approvals
Software from the Product Lifecycle Management stable has provided the tools needed to manage data and workflows, and PLM has the structures needed to handle the new dataflows.
The new engineering software battlegrounds
The transition of smart connected products from the special case (NASA has been building smart connected products for decades) to more widespread adoption is a shift in the tectonic plates of the engineering software landscape.
Handling new dataflows is just one example, but there are loads of other opportunities for competing engineering software vendors to gain an edge over their rivals.
Agile systems definition:
Agile methods are established in software development, and include characteristics that would be described as “just good engineering” by traditionalists and hardware developers.
But few tools for agile software development offer the visibility and control needed to exchange complex requirements databases between a customer and a complex supply chain.
Configuration management, product line engineering and platform architectures all offer partial answers. But smart connected products will create demand for new agile systems-definition tools that support concept and early-stage architecture development, and that can drive consistent use of the many early-stage simulations product architects will need.
ALM or PLM or both?
In software development, Application Lifecycle Management tools play the role that PLM plays for the physical parts of a product. So how can integrated software/hardware teams manage their work?
There are several ways of answering this question.
One is to separate out ‘management’ of everything into a higher level function that supports access control, versions, workflows, baselines, variants, dependencies … everything excluding the content of the object being managed.
Others compete with this concept by creating integrated environments – the Integrated Development Environment (IDE) used in software development is an example – in which authoring and test tools are included, so the result manages the content as well as the status of the managed objects.
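The first answer – a higher-level ‘management’ function that knows nothing about content – can be sketched as a generic wrapper. This is an illustrative data structure only, with invented names (`ManagedObject`, `revise`, `baseline`); it is not any PLM or ALM vendor’s data model.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class ManagedObject:
    """Tracks management metadata for ANY content: CAD model, source file, document..."""
    content: Any
    version: int = 1
    state: str = "in-work"                        # workflow state
    baselines: dict = field(default_factory=dict)  # label -> version captured

    def revise(self, new_content: Any) -> None:
        """Replace the content and bump the version; content type is irrelevant."""
        self.content = new_content
        self.version += 1

    def baseline(self, label: str) -> None:
        """Record which version belongs to a named baseline."""
        self.baselines[label] = self.version

doc = ManagedObject(content="requirements draft")
doc.revise("requirements final")
doc.baseline("Release-A")
print(doc.version)                 # 2
print(doc.baselines["Release-A"])  # 2
```

The point of the sketch is the separation: `version`, `state` and `baselines` are managed identically whether `content` is a mechanical part, a schematic or a software module – which is exactly what the integrated-environment camp argues is not enough, since nothing here can open, diff or test the content itself.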
Cambashi research interviews have indicated that engineering managers feel that ‘software is different’, yet still expect PLM vendors to take the lead on how to configure tools for integrated hardware/software development.
The BoM boundaries:
When talking about product definition, the problem has always been “Which Bill?” As designed, as planned, as manufactured, as installed, as maintained – they all have a claim.
This has been a traditional battleground between PLM providers and ERP providers. PLM has been secure in its control of the engineering parts list; the battle starts as this is translated into the as-designed bill of materials.
For many companies, this is where ERP takes over and becomes the owner of the BoM used for production scheduling, including all the handling of alternate parts. Similarly, PLM has control of development of the manufacturing process, and of the manufacturing process plan for each product, sometimes called the ‘Bill of Process’.
But ERP providers can get involved as this gets translated into shop-floor documentation and electronic work instructions. Adding embedded software as a component of the product will disrupt this battle.
Service and Over-the-Air update:
Most service organisations will want to make sure that engineering has no more than read-only access to products in the field. Similarly, service organisations will want control over the applications that handle data (especially alarms) from in-service products.
The service organisation will want its processes of escalation and adherence to service-level agreements to take priority over engineering’s desire to identify root causes. This is a new and interesting area, because PLM systems already contain all the configuration dependencies.
Could PLM be extended so that these dependencies drive service decisions in the field? Or do service organisations need their own as-maintained BoM and configurator rules?
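To make the question concrete, here is a minimal sketch of a configuration dependency – of the kind a PLM system already holds – driving a field-service decision. The dependency table, hardware revisions and `update_allowed` rule are all invented for illustration.

```python
# Hypothetical dependency data: each firmware version declares the minimum
# controller hardware revision it requires (the kind of relationship a PLM
# configuration record would capture).
DEPENDENCIES = {
    "fw-3.0": "ctrl-revB",
    "fw-2.5": "ctrl-revA",
}

HARDWARE_ORDER = ["ctrl-revA", "ctrl-revB"]  # oldest to newest revision

def update_allowed(installed_hw: str, target_fw: str) -> bool:
    """Return True if this unit's hardware meets the firmware's minimum revision."""
    required = DEPENDENCIES[target_fw]
    return HARDWARE_ORDER.index(installed_hw) >= HARDWARE_ORDER.index(required)

print(update_allowed("ctrl-revA", "fw-3.0"))  # False: fw-3.0 needs revB
print(update_allowed("ctrl-revB", "fw-3.0"))  # True
```

Whether this check lives in an extended PLM system or in the service organisation’s own as-maintained configurator is exactly the open question – the logic is trivial; ownership of the dependency data is the battleground.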
Test definition and automation:
Some design methods start with ‘how can this capability be tested?’. It is also possible to parameterise tests and link those parameters to product parameters – so the final choice of a product parameter in effect generates the test specification:
- Will these concepts help manage and automate test creation and execution for smart products?
- To what extent will the tests on software that allow the master version to be released to manufacturing need to be supplemented with further tests once the software is loaded onto the smart product?
- Will the simulation environments used during product development define the external operating conditions or the response of the product in a way that allows re-use in testing?
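The parameterised-test idea above can be sketched simply. In this hypothetical example (all parameter names and limits are invented), each test template derives its pass/fail limits from the product parameters, so choosing a parameter value in the design in effect generates the test specification.

```python
# Invented product parameters, as chosen by the design team.
PRODUCT_PARAMS = {"max_load_kg": 250, "supply_voltage_v": 24}

# Each test template computes its stimulus and expected result
# from the product parameters rather than hard-coding them.
TEST_TEMPLATES = {
    "overload_trip": lambda p: {"apply_kg": p["max_load_kg"] * 1.1,
                                "expect": "trip"},
    "undervoltage":  lambda p: {"apply_v": p["supply_voltage_v"] * 0.8,
                                "expect": "warn"},
}

def generate_test_spec(params: dict) -> dict:
    """Instantiate every test template with the chosen product parameters."""
    return {name: make(params) for name, make in TEST_TEMPLATES.items()}

spec = generate_test_spec(PRODUCT_PARAMS)
print(spec["overload_trip"]["expect"])  # trip
```

If the rated load later changes from 250 kg to 300 kg, the overload test regenerates itself – which is the property that would let such tests be re-run automatically after a software update is loaded onto the smart product.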
Simulation:
Embedded software is critical to smart product performance. Simulation technologies have grown to handle multi-physics and interconnected sub-systems; software is a new technology for them to handle.
The simulation battleground for engineering software vendors is active on many fronts, including:
- simulation data management
- the practicality of flexible ways of enabling hardware- (and software-) in-the-loop testing as the various prototypes of electronics, sensors and actuators become available
- the feedback of actual test and product performance to calibrate and improve simulation models, enabling simulation at an early stage in development
- making simulation accessible to a wider range of engineers
In addition, as the role of the digital twin of a product becomes larger, there will be more demand for simulation to support product operation decisions.
Getting used to a product with no visible means of control is just the start.
Security, internet access, and the likely need to replace controllers with new generations of electronics during the lifetime of a machine are just some of the new factors for product developers to think about.
As with previous new technologies, engineering processes and dataflows will adapt.
For PLM vendors with ALM capability, this is a time of opportunity – the information their technology holds about a product now has even more value in manufacturing, as well as for operation and maintenance.
But ERP vendors will point out that their systems help match processes to costs, and that is often the message budget holders want to hear.
What did you think?
Did you attend Liveworx 2016? Let me know what you thought – tweet @Cambashi_Peter or contact me via LinkedIn.