Working on the evolution of the internet

Kuipers initiated, among other things, the Delft on Internet of Things (Do IoT) field lab at TU Delft, a collaboration in which scientists, entrepreneurs and government agencies innovate together in the field of IoT. “We provide companies with a secure and advanced environment in which they can test and improve their ideas and prototypes. For example, we are currently working on 5G applications in autonomous transportation and healthcare.” And in the long run? “The ambition is to develop an open-source, self-regulating 6G network, the successor to 5G, in which AI will play a major role.”

Internet maintenance is a tightrope walk

All these innovations pose great challenges, however. For example, how do we make sure there isn’t a jumble of protocols that makes the makeup of the Internet even more inscrutable? “You can see that challenge on several levels,” Kuipers continues. “First, science can develop tools that help operators find optimal settings and parameters. Not only is that efficient, it can also provide a centralized understanding of error codes, allowing them to be resolved quickly. Take, for example, the total outage of Facebook on October 4, 2021, caused by a misconfiguration. We want to avoid those kinds of mistakes.” But of course Kuipers prefers to go further. “The software itself also deserves attention. You may want to configure it optimally, but in practice you find that you need at most half the features for a well-functioning router. The rest is unnecessary, and even a risk. In other words, flexibility in software is extremely important. We should no longer be tied to large, cumbersome programs, but rather move toward an Internet where every router runs exactly the software it needs. And with the recent rise of programmable networks, that is now possible.”
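To make the idea of a router running exactly the software it needs more concrete, here is a minimal sketch in Python. It is not any real router or SDN framework; all names and stages are hypothetical. It only illustrates the principle that features are composed per deployment rather than bundled into one monolithic program.

```python
# Illustrative sketch only: a toy "router" composed of exactly the
# packet-processing stages it needs, in the spirit of programmable
# networks. No real router API is used; all names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Packet:
    src: str
    dst: str
    ttl: int = 64

# A stage takes a packet and returns it (possibly modified) or None (drop).
Stage = Callable[[Packet], Optional[Packet]]

def decrement_ttl(pkt: Packet) -> Optional[Packet]:
    pkt.ttl -= 1
    return pkt if pkt.ttl > 0 else None  # drop expired packets

def forward(pkt: Packet) -> Optional[Packet]:
    print(f"forwarding {pkt.src} -> {pkt.dst}")
    return pkt

@dataclass
class Router:
    stages: list[Stage] = field(default_factory=list)

    def process(self, pkt: Packet) -> Optional[Packet]:
        for stage in self.stages:   # run only the installed features
            pkt = stage(pkt)
            if pkt is None:         # a stage dropped the packet
                return None
        return pkt

# Compose a router from only the two features this deployment needs;
# unused features are simply never installed, so they pose no risk.
router = Router(stages=[decrement_ttl, forward])
router.process(Packet(src="10.0.0.1", dst="10.0.0.2"))
```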

There’s just one problem: The Internet already seems to be in a precarious balance. Research shows that only a small percentage of internet outages are caused by hackers, with the vast majority due to misconfigurations and hardware failures. Is it therefore prudent to operate ‘on the living patient’? And if so, how can you avoid unnecessary risks? “Those are tough questions,” Kuipers acknowledges, “but we can’t help but go in that direction. That is why we are working on standardized tools and processes with which we can adapt the internet safely.” A quick look at the numbers shows that preventing internet outages is a worthwhile ambition. It is estimated that a failure of the Dutch network can easily cost 15 million euros per hour. And that Facebook blackout? It cost the company billions of dollars in market value.

Genetic programming and the power of AI

One of the most promising technologies for making the Internet more robust and resilient is AI. In fact, Kuipers’ ambition is to develop a completely self-regulating network: a network that monitors itself and makes adjustments based on changes in usage, new techniques, or sudden congestion. “We now have the first proof of concept of such a self-regulating network. We use ‘genetic programming’ for this. As in a biological process, small variations in the programming of the network are continually created. If the algorithm notices that a variation has a favorable outcome, that variation is selected. Finally, we can also merge two promising pieces of code into one. The result is a network that does exactly what you ask it to do.”
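The mutate-evaluate-select-merge loop Kuipers describes can be sketched in a few lines. Note the simplification: his work evolves network software via genetic programming, while the toy below evolves a vector of numeric parameters (a genetic algorithm) against an invented fitness function, purely to show the mechanics of variation, selection, and crossover.

```python
# Deliberately simplified sketch of the evolutionary loop described above.
# The fitness function and parameters are invented for illustration only.
import random

PARAMS = 4  # e.g. buffer sizes, timer values... (purely illustrative)

def fitness(cfg: list[float]) -> float:
    # Hypothetical stand-in: pretend the optimal setting is all 0.5.
    return -sum((x - 0.5) ** 2 for x in cfg)

def mutate(cfg: list[float]) -> list[float]:
    # Small random variations, as in a biological process.
    return [min(1.0, max(0.0, x + random.gauss(0, 0.1))) for x in cfg]

def crossover(a: list[float], b: list[float]) -> list[float]:
    # "Merge two promising codes": take each parameter from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

population = [[random.random() for _ in range(PARAMS)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                    # selection
    children = [crossover(random.choice(survivors), random.choice(survivors))
                for _ in range(10)]                # merging
    population = survivors + [mutate(c) for c in children]

print("best configuration:", [round(x, 2) for x in max(population, key=fitness)])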

“But let me say one thing before I continue: these kinds of techniques need not become a ‘black box.’ An integral part of the self-regulating network is a self-reporting component: operators must be able to request information about the status and progress of network software and configuration at any time. That way you stay in control.” Kuipers is now busy creating a ‘user interface’, so that the self-regulating program can be used in a realistic environment. Here, too, he allows himself a forward-looking idea: the operator would manage the network through a dialogue with the AI, instead of through an antiquated control panel. “Where we want to go is a sort of Alexa or Siri that you can actually have a conversation with. Adjusting a network is such precise work that feedback from the AI adds great value to all your choices.” For example: the operator makes a suggestion for further development of the network, but it doesn’t seem specific enough. “Then the AI asks questions: ‘Gosh, operator, what exactly do you mean when you say X or Y?’ Not literally in such colloquial language, of course, but as understandable as possible.”
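One way to picture the self-reporting component is as a network that records every change it makes to itself, so an operator can ask for status and history at any moment. The sketch below is an assumption-laden illustration of that idea; every class and field is hypothetical, not part of Kuipers’ actual system.

```python
# Hedged sketch of a self-reporting network: every adjustment the
# self-regulating system makes is logged with a reason, so operators
# can query it at any time. All names here are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRecord:
    timestamp: datetime
    setting: str
    old_value: object
    new_value: object
    reason: str  # why the system made this adjustment

@dataclass
class SelfReportingNetwork:
    config: dict = field(default_factory=dict)
    history: list[ChangeRecord] = field(default_factory=list)

    def apply_change(self, setting: str, value: object, reason: str) -> None:
        # Record before applying, so the log is never incomplete.
        self.history.append(ChangeRecord(
            datetime.now(timezone.utc), setting,
            self.config.get(setting), value, reason))
        self.config[setting] = value

    def report(self) -> str:
        lines = [f"current config: {self.config}"]
        lines += [f"{r.timestamp:%H:%M:%S} {r.setting}: "
                  f"{r.old_value} -> {r.new_value} ({r.reason})"
                  for r in self.history]
        return "\n".join(lines)

net = SelfReportingNetwork()
net.apply_change("queue_limit", 200, "congestion observed on link A-B")
print(net.report())  # the operator stays in control via this report
```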

A transparent internet

Kuipers has a clear point on the horizon: a secure and transparent Internet. An internet whose users can always see and control their traffic. Now, for example, the end user has little way of knowing whether their internet traffic passes through countries or providers that do not conform to that user’s values and standards. “And there is quite a bit at stake there in terms of privacy,” says Kuipers. “One of the great benefits of self-regulating, self-scheduling and self-reporting networks is that you get a clear picture of what’s going on behind the scenes, and you can adjust it right away. Perhaps in the future we will all set up our own Internet profile, through which you indicate your preferences for how you want to use the Internet. For example, you could require that all your traffic go through operators that meet certain security requirements. Or maybe even through operators that meet certain sustainability requirements? In short: more transparency and control over the handling of what is ultimately your own data.”
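The “Internet profile” Kuipers imagines could amount to a user-stated policy that filters which paths traffic may take. Here is a hedged sketch under that assumption; the operators, attributes, and paths below are entirely made up for illustration.

```python
# Sketch of a hypothetical "Internet profile": a user states requirements,
# and only paths whose every operator satisfies them are accepted.
from dataclasses import dataclass

@dataclass(frozen=True)
class Operator:
    name: str
    secure: bool   # meets the user's security requirements
    green: bool    # meets the user's sustainability requirements

@dataclass(frozen=True)
class Profile:
    require_secure: bool = True
    require_green: bool = False

    def accepts(self, op: Operator) -> bool:
        return (op.secure or not self.require_secure) and \
               (op.green or not self.require_green)

def allowed_paths(paths: list[list[Operator]], profile: Profile):
    # Keep only paths where every hop conforms to the user's profile.
    return [p for p in paths if all(profile.accepts(op) for op in p)]

a = Operator("OperatorA", secure=True, green=True)
b = Operator("OperatorB", secure=True, green=False)
c = Operator("OperatorC", secure=False, green=True)

paths = [[a, b], [a, c], [a]]
for path in allowed_paths(paths, Profile(require_secure=True)):
    print(" -> ".join(op.name for op in path))  # prints the conforming paths
```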

Because ultimately, Kuipers acknowledges, the Internet remains a black box for the average user. “Even in the self-regulating Internet, the reality is still incredibly complex. But the benefits are clear: far fewer interruptions and far more control and transparency. The average user benefits from this too, whether they notice it or not.”
