System Design Principles: adaptation to time for long living autonomous systems

Authors

Kornieiev, S.

DOI:

https://doi.org/10.32347/tit.2022.51.0301

Keywords:

complex system, adaptation, time, system failure, artificial general intelligence, non-stationary stochastic process

Abstract

The article presents the principles of creating systems that adapt to their operation time. In the literature on system design, adaptation has mostly concerned: 1) an unknown object structure; 2) unknown object parameters; 3) unknown parameters of input signals; 4) unknown functions of system state dynamics; and 5) unknown environment conditions. It is usually assumed that the control process aims to reach a certain, usually optimal, state of the system; in this way the "adaptive system" concept is close to the "optimal system" concept. Another approach to system design under these conditions is robustness. Adaptive control does not need a priori information about the bounds on the uncertain or time-varying parameters, whereas robust control guarantees that, as long as the changes stay within given bounds, the control law need not be changed; adaptive control, by contrast, is concerned with changing the control law itself. In the last 10–15 years a new approach, "resilient systems", has been introduced. "Resilience engineering" may look like "repair engineering": it is assumed that errors or malfunctions will certainly occur, and the system should respond to them appropriately.
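The contrast between robust and adaptive control described above can be illustrated with a toy scalar plant x[k+1] = a·x[k] + u[k], where the true parameter a is unknown. This is a minimal sketch, not from the article: the plant model, gains, and the gradient update rule are illustrative assumptions. The robust controller uses one fixed gain chosen for a worst-case bound; the adaptive controller changes its control law as its parameter estimate improves.

```python
# Toy comparison: robust (fixed law) vs adaptive (self-adjusting law) control
# of the scalar plant x[k+1] = a*x[k] + u[k] with unknown 'a'.
# Illustrative sketch only; model and gains are assumptions.

def robust_run(a_true, a_bound=1.5, x0=1.0, steps=30):
    """Fixed control law u = -a_bound * x, designed for a worst-case bound
    |a| <= a_bound; the gain never changes during operation."""
    x = x0
    for _ in range(steps):
        u = -a_bound * x
        x = a_true * x + u
    return abs(x)

def adaptive_run(a_true, x0=1.0, steps=30, gamma=0.5):
    """Certainty-equivalence control u = -a_hat * x, with a gradient update
    of the estimate a_hat driven by the prediction error."""
    x, a_hat = x0, 0.0
    for _ in range(steps):
        u = -a_hat * x                 # control law changes as a_hat changes
        x_next = a_true * x + u
        e = x_next - (a_hat * x + u)   # prediction error (model predicts 0 here)
        a_hat += gamma * e * x         # gradient step on the parameter estimate
        x = x_next
    return abs(x), a_hat

print(robust_run(a_true=1.2))   # residual state under the fixed worst-case gain
print(adaptive_run(a_true=1.2)) # residual state and the learned estimate of 'a'
```

Both stabilize this plant, but for different reasons: the robust law works only because the true a lies inside the assumed bound, while the adaptive law identifies a online and would keep working if a drifted.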

The system operation time, as a cause of adaptation, has rarely been considered, mostly in discussions of reliability issues, so the proposed approach is new. The proposed principles should be used together with the known approaches to dependable system design, namely engineering redundancy and information redundancy. Both approaches must be applied in the design phase and remain unchanged structural parameters of the system during operation. The focus is mostly on "long-living systems" and the corresponding reliability task.
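Design-time engineering redundancy of the kind mentioned above can be sketched with classic triple modular redundancy (TMR): three replicas compute the same result and a majority vote masks a single faulty unit. The structure is fixed at design time and never changes during operation. This is a generic textbook sketch, not the article's own design.

```python
from collections import Counter

def tmr_vote(replica_outputs):
    """Majority vote over three replica outputs; masks one faulty replica.
    The redundant structure is fixed at design time and does not adapt."""
    winner, count = Counter(replica_outputs).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: more than one replica failed")
    return winner

# One replica returns a wrong value; the vote masks the fault.
print(tmr_vote([42, 42, 17]))  # -> 42
```

The limitation this article addresses is visible here: TMR tolerates faults, but the voting structure itself cannot be reconfigured as failure statistics change over the system's lifetime.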

The proposed principles can be used in developing systems designed for continuous operation without the possibility of external human intervention to restore system performance or to carry out maintenance procedures. By "system" this article means "Complex Adaptive Systems" (CAS). Currently, the proposed approach can be attributed to the development of "Artificial General Intelligence" (AGI). Examples of such systems include space-based and underwater robotic systems. By "adaptability of the system to time", in the sense of the control process, is meant a certain structural reconfiguration of the system that takes into account the non-stationary nature of the stochastic processes of errors, damages, and system failures.
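Adaptation to time in the sense just described can be sketched as a policy that reconfigures the system when the hazard rate of its (non-stationary) failure process grows too high. The Weibull hazard model, the parameter values, and the threshold below are illustrative assumptions, not the article's prescriptions.

```python
def weibull_hazard(t, shape=2.0, scale=1000.0):
    """Hazard rate h(t) of a Weibull failure law. A shape > 1 models wear-out:
    failures become more likely as operating time grows (non-stationary)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def reconfigure_if_needed(t, threshold=0.003):
    """Time-driven structural adaptation: switch to a degraded-mode
    configuration once the predicted hazard crosses the threshold."""
    return "degraded-mode" if weibull_hazard(t) >= threshold else "nominal"

print(reconfigure_if_needed(100))   # early life: hazard still low
print(reconfigure_if_needed(2000))  # wear-out region: hazard exceeds threshold
```

The point of the sketch is that the trigger is the elapsed operation time itself, via the time-varying failure statistics, rather than an observed error or an unknown plant parameter.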

The formulation of the principles is of a general declarative nature: at this stage the author gives preference to the essence of the proposal rather than its formalization. The article does not provide specific design guidelines, but it contains some examples of possible applications, mainly to highlight the essence of the proposals.

References

Tsypkin, Y. (1971). Adaptation and Learning in Automatic Systems. Esso Production Research Company, Academic Press, New York and London, 290.

Pontryagin, L.S., Boltyanskii, V.G., Gamkrelidze, R.V., Mishchenko, E.F. (1962). The Mathematical Theory of Optimal Processes. Wiley (Interscience), New York, 3.

Zhou, K., Doyle, J.C., Glover, K. (1996). Robust and Optimal Control. Prentice Hall, 596.

Hollnagel, E., Woods, D.D., Leveson, N. (2006). Resilience Engineering: Concepts and Precepts. Ashgate, Aldershot UK, 416.

Bi, Z.M., Lin, Y., Zhang, W.J. (2010). The architecture of adaptive robotic systems for manufacturing applications. Robotics and Computer-Integrated Manufacturing, 26, 461-470.

Sun, Z., Yang, G.S., Zhang, B., Zhang, W. (2011). On the concept of the resilient machine. In Proceedings of the 2011 6th IEEE Conference on Industrial Electronics and Applications, Beijing, China, 21-23 June 2011, 357-360.

Butler, M., Jones, C., Romanovsky, A., Troubitsyna, E. (2006). Rigorous Development of Complex Fault-Tolerant Systems. Springer-Verlag, Berlin Heidelberg, 241-261.

Butler, M., Jones, C., Romanovsky, A., Troubitsyna, E. (2009). Methods, Models and Tools for Fault Tolerance. Springer-Verlag, Berlin Heidelberg, 350.

Stapelberg, R.F. (2009). Handbook of Reliability, Availability, Maintainability and Safety in Engineering Design. Springer-Verlag London Limited, 856.

Ioannou, P., Fidan, B. (2006). Adaptive Control Tutorial. Society for Industrial and Applied Mathematics (SIAM), Philadelphia, 403.

Fritsch, S., Senart, A., Schmidt, D.C., Clarke, S. (2008). Time-bounded Adaptation for Automotive System Software. ICSE, Leipzig, Germany, 89-96.

Barbour, J. (1999). The End of Time. Oxford University Press, New York, USA, 384.

Gomaa, H. (2011). Software Modeling and Design: UML, Use Cases, Patterns, and Software Architectures. Cambridge University Press, USA, 578.

Kornieiev, S. (2016). Operating System of Artificial Intelligence: the basic definitions. Artificial Intelligence, Vol.74, 4. ISSN 1561-5359, 7-13.

Scheffer, M. et al. (2009). Early warning signals for critical transitions. Nature, Vol.461, No.3, Sep. 3, 53-59.

Fisher, L. (2011). Crashes, Crises, and Calamities: How We Can Use Science to Read the Early-Warning Signs. Basic Books, New York, 256.

Thurner, S., Hanel, R., Klimek, P. (2018). Introduction to the Theory of Complex Systems. Oxford University Press, 448.

Kalman, R. (1960). Contributions to the theory of optimal control. Boletin Sociedad Matematica Mexicana, Vol.5, 102-119.

Wertz, J., Larson, W. (1999). Space Mission Analysis and Design. Microcosm Press, USA; Kluwer Academic Publishers, The Netherlands, 923.

Kornieiev, S. (2015). The AUV Approach. World Pipelines (USA), "Coatings & Corrosion", 63-66.

Kornieiev, S. (2017). Operating System to base AI-applications: the overview and general technical requirements. Proceedings of XIV International Scientific & Applied Conference TAAPSD, 45-56.

Published

2023-04-26

How to Cite

Kornieiev, S. (2023). System Design Principles: adaptation to time for long living autonomous systems. Transfer of Innovative Technologies, 5(1), 54–61. https://doi.org/10.32347/tit.2022.51.0301

Issue

Section

Information Technology