Friday 20 March, Morning
The workshop, which is open to all registered participants, will take place on Friday 20 March from 9:00 to 12:00.
The workshop will focus on the role of software in 6G. It will use the results of the 12 expert groups on Thursday as input for the challenges and opportunities for software and software engineering.
The agenda includes short presentations followed by group work. The group work will contribute to a white paper “Future Software Engineering”. More information coming soon.
Co-chairs for the workshop are Prof. Tommi Mikkonen, University of Helsinki, and Prof. Markku Oivo, University of Oulu.
As global megatrends and challenges unfold, the role of software and data-intensive solutions will grow in importance. The evolution to 5G and 6G networks, together with AI and IoT, will introduce entirely new challenges and opportunities for software engineering. Current software engineering methods and architectures support conventional systems, with some support for distributed systems. 5G and 6G will bring the need to develop massively distributed, dynamic, heterogeneous systems, edge/fog computing, and a multitude of IoT systems. Functionalities are distributed dynamically, and various types of systems must interoperate even when they were created at different times by different suppliers, creating challenges that go far beyond current configuration management practices.
New communication paradigms will include more and more software, and the role of software in telecommunications technologies in general is growing. Software-defined radio has been on the agenda for a long time, but softwarization now goes much further, using software, machine learning and artificial intelligence to build, manage and optimize networks. AI is hitting the mainstream on several fronts. Trustworthy, industrial-scale AI will require new software engineering methods and tools integrated with traditional software and systems. We will consume software on numerous devices, either sequentially or simultaneously, when executing certain processes; enabling a seamless, “liquid” application/experience flow from one device to another is therefore becoming a reality.
All of the above challenges will affect software, its architecture and its implementation. Software and its engineering approaches are evolving in numerous directions simultaneously. Designs must both support and take advantage of ultra-fast communications and extremely short latencies. Architectures must be highly dynamic and support massively distributed, heterogeneous and dynamic systems. Applications, platforms, ecosystems, and business models (e.g., micro-operators) require new software solutions, both technically and in their management. Even at present, new models are constantly emerging, ranging from new architectural styles and supporting facilities on the technical side to dynamically evolving business models in which service provider and service user are roles that can be assumed dynamically.
Quality assurance, testing, verification and validation will need novel approaches. How are quality assurance, verification and validation handled for completely new types of systems that include, for example, AI modules producing only statistical results? How do DevOps, Continuous Integration, and Continuous Delivery & Deployment work in such new environments? What are the ethical consequences of systems that are always on and that are responsible for our everyday operations?
Services will be increasingly based on software, and the amount of software will explode. A good example is the car: a modern car may contain 200 million lines of code, and even this will be dwarfed by the amount of software in the 5G and 6G era.
Personal computing is no longer personal but based on clouds that tie together all the computers an individual uses. This implies new protocols, interfaces, and design decisions regarding what will be located where, who owns the master copy, procedures for shutting down and disposing of computers, and mechanisms for managing all data in a secure, private fashion.
The sheer amount of data is also becoming an issue. How do we manage the massive amounts of data that sensors produce, especially when data is embedded and distributed across the network and processed in the network, at the edge and in the cloud? How do we build smart applications on top of this data?