Meeting Minutes
===
###### tags: `Meetup`
:::spoiler Document Usage
This document is used for all minutes of the WoT CG.
After the meeting, the contents will be copied to GitHub with a PR.
For one week, changes can be requested and then the PR will be merged.
After the PR is merged, the content here can be deleted.
Document Access:
- Everyone has read and commenting access
- Cristiano Aguzzi and Ege Korkan can edit the document
- Other people can request write access for a meeting
- The scribe of the meeting will have write access
<!-- Do not change this spoiler container when writing an instance document -->
<!-- See https://github.com/ikatyang/emoji-cheat-sheet for list of emojis -->
:::
:::info
:date: **Date:** 17 July 2025
### :bust_in_silhouette: Participants
<!-- This list will be copied over from the meeting tool -->
- Ege Korkan
- Cristiano Aguzzi
### :scroll: Agenda
:writing_hand: **Scribe:** Cris
:::
----
### Introduction
Ege: Welcome everyone! This is our 27th meetup! All slides are public.
Ege: You can find us in different places on the web.
### News
Ege: We don't have any major events planned, other than TPAC and the office hours.
Ege: We are also publishing two new tutorial videos, and more are underway.
Ege: TPAC will be held in Kobe, Japan. We have a meeting planned for the 14th of November. Let us know if you want to participate.
Ege: If you want to present at this kind of event (like this meetup), let us know.
### Meetup Presentation
Ege: Today with us we have Dimitrios and Nikos from the National Technical University of Athens. Dimitrios is very active in different EU projects around IoT and cyber-physical systems. Today, they will present the Nephele project and how it can help build IoT applications with IoT virtualization and edge computing.
Dimitrios: Thank you for giving us the opportunity to present the Nephele project. The project will be finalized in September; we are trying to integrate the IoT part into the cloud orchestration platform.
Dimitrios: The NETMODE lab is focused on 5G/6G technologies and is involved in various EU and national projects.
Dimitrios: In Nephele we pinpointed two main challenges: IoT convergence and integrated meta-orchestration. We are trying to build synergies between the edge and the cloud.
Dimitrios: We built on top of a well-known Python implementation of the Web of Things and extended it with additional functionalities to enable orchestration across edge and cloud.
Dimitrios: In Nephele we have two main concepts: the Virtual Object (VO) and the Composite Virtual Object (cVO). VOs are entry points of the Nephele ecosystem, and cVOs are compositions of those VOs. We tried to build an active community where we asked people to build application graphs, which consist of a set of reusable VOs and cVOs.
Dimitrios: VOs are usually deployed on edge servers.
<Dimitrios shows an example of a complex application graph built in Nephele>
Dimitrios: You can see that there are different communication schemes; this is just an example. The VO stack architecture has three layers. The first one is the Physical Convergence Layer. The second is the Backend Logic Layer, which holds the "intelligence" of a VO; here you can find bindings to databases, for example. Finally, in the Edge/Cloud Convergence Layer we want to solve issues around scaling and monitoring.
Dimitrios: VOs and cVOs are described by a document which allows you to deploy them in orchestration platforms like Kubernetes.
Dimitrios: In WoTPy we added new protocol bindings, security mechanisms, periodic functions, and a binding to InfluxDB.
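As an illustration of the periodic functions and the time-series binding mentioned above, here is a minimal, self-contained Python sketch. It uses an in-memory buffer standing in for InfluxDB, and all names are hypothetical rather than the actual WoTPy extension API:

```python
import time
from collections import deque

# Sketch of a "periodic function" that samples a property and appends
# timestamped points to a buffer. In the real stack, such points would
# be written to InfluxDB; no database client is used here.

class PeriodicSampler:
    def __init__(self, read_fn, maxlen=1000):
        self._read_fn = read_fn
        self.points = deque(maxlen=maxlen)  # (timestamp, value) pairs

    def sample(self):
        """One sampling tick; a scheduler would call this every period."""
        self.points.append((time.time(), self._read_fn()))


sampler = PeriodicSampler(lambda: 42)  # fake sensor always returning 42
for _ in range(3):
    sampler.sample()
print(len(sampler.points))  # 3
```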
Dimitrios: For example, we included the Zenoh protocol, which is used in robotics applications.
Dimitrios: We also added an RTSP server (as a sidecar) for video streams, and a proxy mode.
Dimitrios: Another addition was implementing OpenID and Verifiable Credentials for access control to VOs and cVOs.
Dimitrios: We were also able to properly describe a proxy pattern between VOs.
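A minimal sketch of such a proxy pattern, assuming a simple property-read interface; the names are hypothetical and do not reflect the Nephele implementation:

```python
# A proxy VO exposes the same interface as its target VO and forwards
# requests, optionally caching results. Illustrative only.

class SensorVO:
    def read_property(self, name):
        return {"temperature": 21.5}[name]


class ProxyVO:
    """Forwards property reads to a target VO and caches the results."""

    def __init__(self, target):
        self._target = target
        self._cache = {}

    def read_property(self, name):
        if name not in self._cache:
            self._cache[name] = self._target.read_property(name)
        return self._cache[name]


proxy = ProxyVO(SensorVO())
print(proxy.read_property("temperature"))  # 21.5
```

Note that, from the client's point of view, the proxy is indistinguishable from the real VO, which is exactly the descriptive gap raised later in the discussion.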
Dimitrios: We also experimented with SDN; for example, we implemented Time-Sensitive Networking over NETCONF.
Dimitrios: Thanks to the modularity of VOs and cVOs, we also successfully developed Digital Twins.
Dimitrios: We submitted research work in the context of the Internet of Robotic Things. We used VOs in an AI-assisted mechanism to control task offloading in an Edge AI scenario.
Dimitrios: Finally, we have defined a roadmap. The main focus is keeping the project alive and following the development of the standards and the Node.js implementation. The VO stack could be published under Eclipse Thingweb or as a standalone project.
Dimitrios: Thank you, WoT community, for the continuous interaction during these years.
<Nikos takes over, showing a video and documentation>
Nikos: The video shows an example of using InfluxDB with a VO and how the stack stores property values in it.
Nikos: We also developed a web-based dashboard and control application.
### Discussion
Ege: In the last part you mentioned hazard detection. Is this logic happening in the cVOs or outside the stack?
Nikos: There we used cVOs that listen to events or conditions and execute some specific code if a condition arises.
Leonhard: What standard or format do you use for defining VOs?
Nikos: We have a YAML file with a specific format.
Ege: The Zenoh protocol seems interesting, but it is more like a single implementation rather than a standard. Do you think it is better than MQTT, or what was your experience?
Nikos: We mostly had the requirement from one participant who couldn't use MQTT for their (robotics) use case.
Dimitrios: Zenoh's performance is even better than MQTT's, especially for large payloads, even though MQTT is more widely used.
Ege: Was it just used with ROS?
Dimitrios: in our case yes.
Ege: You showed us different domains, but everything was described with a TD. Were there any drawbacks to using the Thing Description as a single vocabulary or model for describing these use cases?
Anastasios: In Digital Twins it is important to merge different sources.
Cris: Did you need to manage long-running actions or queues of actions? Also, did you need to map the data, like InfluxDB data, to the properties?
Anastasios: We have logic to handle complex interactions with state and action requests. For example, you can't reboot a device if somebody is requesting data.
Nikos: We have a mechanism to handle queues of actions, but the Thing Description does not reflect this behaviour.
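The action handling described here (queuing requests and deferring a disruptive action while data is still being read) could look roughly like this minimal Python sketch; all names are hypothetical, not the actual Nephele code:

```python
from collections import deque

# Illustrative action queue with a busy-state guard: a "reboot" action
# is deferred while any client is still reading data.

class ActionManager:
    def __init__(self):
        self.queue = deque()
        self.readers = 0  # number of clients currently reading data

    def request(self, action):
        self.queue.append(action)

    def step(self):
        """Run the next queued action unless it must wait for readers."""
        if not self.queue:
            return None
        if self.queue[0] == "reboot" and self.readers > 0:
            return "deferred"  # leave the action queued for later
        return self.queue.popleft()


mgr = ActionManager()
mgr.readers = 1
mgr.request("reboot")
print(mgr.step())  # deferred
mgr.readers = 0
print(mgr.step())  # reboot
```

As noted in the answer above, a plain Thing Description has no standard way to express this queueing behaviour to clients.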
Ege: Did you see any missing features in the Web of Things?
Nikos: There is no support for proxies; you can't tell if you are interacting with the proxy or with the real Thing.
Ege: We have a similar issue: sometimes people describe a room with a Thing Description, but it is a collection of sensors and there is no "Room" device.
## :ballot_box_with_check: Resolutions
## :exclamation: Action Items
## :envelope_with_arrow: Feedbacks
## Presentation Summary
- What is your evaluation of Zenoh and its applicability here? How does it compare to MQTT in your view?
- Video in a VO: what does it look like? Is it a live stream, a webpage?
- You are able to "merge" different domains. Typically, video handling, SDN, classical IoT data are all separate. By putting them together, we can manage them at the same time, which is nice. Do you see some drawbacks, like too much abstraction etc?
- Anything missing in the standards? or anything that annoyed you?
- Hazard detection: Where does the logic live? "if detected, actuate" logic