Meeting Minutes
===
###### tags: `Meetup`
:::spoiler Document Usage
This document is used for all minutes of the WoT CG.
After the meeting, the contents will be copied to GitHub with a PR.
For one week, changes can be requested and then the PR will be merged.
After the PR is merged, the content here can be deleted.
Document Access:
- Everyone has read and commenting access
- Cristiano Aguzzi and Ege Korkan can edit the document
- Other people can request write access for a meeting
- The scribe of the meeting will have write access
<!-- Do not change this spoiler container when writing an instance document -->
<!-- See https://github.com/ikatyang/emoji-cheat-sheet for list of emojis -->
:::
:::info
:date: **Date:** 18 November 2024
### :bust_in_silhouette: Participants
<!-- This list will be copied over from the meeting tool -->
- Ege Korkan
- Cristiano Aguzzi
### :scroll: Agenda
:writing_hand: **Scribe:** Ege
:computer:
:::
----
### Introduction
Ege: Welcome to the 22nd meetup!
### News
Cris: The week after this one, we will have the WoT Week, a physical event dedicated to WoT.
Cris: We have office hours.
Cris: Let us know if you want to present something!
#### WoT Week
Cris: We will see each other from 25-29 November in Munich, at two venues, Siemens and Microsoft.
Cris: During the event we will host a Plugfest, an interoperability hackathon!
Cris: Wednesday is demo time, where you can show your WoT solution.
Cris: Let us know if you want to join!
### Meetup Presentation
V: I work at CALA. We accelerate ions with lasers and measure them. Everything is remotely controlled due to the high-power laser and the radiation.
V: You can measure the pulse or the cross section, which allows you to derive physical characteristics. So there are data analysis steps as well.
V: Data acquisition is IoT in the end.
V: High-power lasers work by stretching the pulse, reducing the frequency, and then amplifying it. In the end, it is compressed, where you get a high-power pulse.
V: All these steps use custom-built components. The whole setup cannot be bought.
> Vignesh shows the UI with different cameras and controls.
V: Here we can use WoT to work with the integration time value in the UI.
V: When we look at the devices, e.g. an oscilloscope, it has properties like time resolution. Actions like start acquisition. Events like measurement points.
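> A minimal sketch (device and affordance names are illustrative) of the kind of Thing Description fragment such an oscilloscope would expose, written here as a Python dict:
```python
# Hypothetical TD fragment for an oscilloscope; names are examples only.
oscilloscope_td = {
    "title": "Oscilloscope",
    "properties": {
        "timeResolution": {"type": "number", "unit": "s"}
    },
    "actions": {
        "startAcquisition": {"description": "Begin acquiring data"}
    },
    "events": {
        "measurementPoints": {"data": {"type": "array"}}
    },
}
```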
V: However, those devices also have state machines. While a measurement is happening, you cannot change the trigger value of a camera. Only 1 out of 4 cameras can do that.
> Vignesh shows the code part of Hololinked
V: from the code, we can autogenerate the TD.
> Vignesh shows property, action and event generation.
V: We can add state information. You can tell if an action is allowed in a state or not.
V: This is also reflected in the UI.
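> A minimal sketch of that idea (generic Python, not hololinked's actual API): an action checks the device state before it is allowed to run.
```python
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    MEASURING = auto()


class Camera:
    """Toy device: some actions are only allowed in certain states."""

    def __init__(self):
        self.state = State.IDLE
        self.trigger = "internal"

    def set_trigger(self, value: str) -> None:
        # As discussed above: the trigger cannot be changed while a
        # measurement is running.
        if self.state is State.MEASURING:
            raise RuntimeError("set_trigger is not allowed in state MEASURING")
        self.trigger = value

    def start_acquisition(self) -> None:
        self.state = State.MEASURING
```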
V: One important thing is to correlate the timing of different measurements.
V: Another part is how we can synchronize GUIs. Say, one user starts measurement from one PC.
V: GUI information about the state is also very important and it is device specific. So we can notify the other clients about state changes.
V: You can use the context API of frameworks like React or Svelte.
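> Sketch of the server-side idea with hypothetical names: every connected GUI subscribes to a state-change event, so when one user starts a measurement the others are notified.
```python
from typing import Callable, List


class StateChangeEvent:
    """Toy observer: notify every connected GUI when the device state changes."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        self._subscribers.append(callback)

    def notify(self, new_state: str) -> None:
        for callback in self._subscribers:
            callback(new_state)


# One user starts a measurement; all other clients see the new state.
event = StateChangeEvent()
event.subscribe(lambda s: print("GUI A sees state:", s))
event.subscribe(lambda s: print("GUI B sees state:", s))
event.notify("MEASURING")
```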
V: That is it for today. I can take questions.
### Questions
Cris: Is the last event implementation built on top of MQTT?
V: It is on ZMQ. You don't have to define multiple sockets. The messaging contract is different.
V: We don't have a ZMQ binding though. It is interesting, and the WebThing Protocol could be used over ZMQ.
V: Images need to be a separate frame though.
Cris: What if you encode the image in JSON with base64 then you don't need another frame.
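> Cris's suggestion as a minimal Python sketch (the file name is a placeholder): base64-encode the image so it fits inside the JSON payload instead of a separate frame.
```python
import base64
import json

# Encode raw image bytes as base64 so they can travel inside JSON.
with open("frame.png", "rb") as f:  # placeholder path
    image_b64 = base64.b64encode(f.read()).decode("ascii")

payload = json.dumps({"camera": "cam1", "image": image_b64})

# The receiver recovers the image byte-for-byte.
decoded = base64.b64decode(json.loads(payload)["image"])
```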
V: I also could not get nice image manipulation in JS, so I use Python Qt for that.
Richard: How do you handle big arrays like numpy?
V: You can pre-encode it for ZeroMQ. You can dump the memory into the message quite easily.
Richard: Is there a standard way in msgpack (or JSON or whatever) to serialise a big numeric array? I'm wondering if there's something that's not numpy-specific?
V: Python's own parser was much more performant than numpy's. C++ and JSON
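> A common way to do what Vignesh describes, dumping the array memory into the message, sketched with pyzmq; the endpoint and socket pattern are illustrative.
```python
import json

import numpy as np
import zmq

ctx = zmq.Context()

# Sender: ship the raw buffer plus enough metadata to rebuild the array.
push = ctx.socket(zmq.PUSH)
push.bind("tcp://127.0.0.1:5555")  # illustrative endpoint
data = np.random.rand(1024, 1024)
header = json.dumps({"dtype": str(data.dtype), "shape": data.shape}).encode()
push.send_multipart([header, data.tobytes()])

# Receiver: reconstruct the array without a numpy-specific wire format.
pull = ctx.socket(zmq.PULL)
pull.connect("tcp://127.0.0.1:5555")
header_bytes, raw = pull.recv_multipart()
meta = json.loads(header_bytes)
array = np.frombuffer(raw, dtype=meta["dtype"]).reshape(meta["shape"])
```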
Ege: How is the state information conveyed to the Consumer?
V: There is a property, but the state machine itself is in my head. I hope that the standards can help.
Cris: You can also show different TDs per state, as long as there aren't too many states.
V: I also do a lot of time-critical processing. Like 1ms time intervals are important.
Richard: I'm also curious how "dynamic" a thing description is allowed to be - I've sometimes assumed I can parse it once when I set up my client then assume it won't change: is that a bad idea?
Cris: There are some use cases for dynamic TDs, but they are not that developed yet. So you are doing it correctly.
Ege: What does it mean to delete a property btw?
V: I want to free up the memory. I can either write an empty array or send a DELETE request.
Ege: So it is more like resetting
Cris: Or clearing.
V: I also fire a property update when there is a write.
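> Sketch of the client side of this exchange (the URL is hypothetical): a write updates the property and, per the discussion, triggers a property-update notification, while DELETE clears it to free the memory.
```python
import requests

# Hypothetical property endpoint of a device.
url = "http://device.local:8080/spectrometer/properties/lastSpectrum"

# Write a new value; the server also fires a property update towards observers.
requests.put(url, json=[0.1, 0.2, 0.3])

# "Delete" the property, i.e. clear/reset it so the buffer memory is freed.
requests.delete(url)
```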
Ege: How do you do async actions?
V: You can add `synchronous:False`, which starts a thread, but you cannot do anything with that thread.
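> One generic way to handle a long-running action (not hololinked's actual `synchronous` mechanism): run it in a background thread and expose progress through a status property that the client can poll.
```python
import threading
import time


class Device:
    """Toy async action: start a long measurement without blocking the caller."""

    def __init__(self) -> None:
        self.status = "idle"

    def _measure(self) -> None:
        self.status = "running"
        time.sleep(5)  # stand-in for the real acquisition
        self.status = "done"

    def start_measurement(self) -> None:
        # Returns immediately; clients follow progress via the status property.
        threading.Thread(target=self._measure, daemon=True).start()


device = Device()
device.start_measurement()
print(device.status)  # likely "running" right after the call
```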
Ege: Maybe you can combine the devices you are bringing with the OpenFlexure microscope we have.
Richard: It would be cool to test if the UI can control the microscope backend. I also have a new implementation that is almost ready; you can use a new OS image.
## :ballot_box_with_check: Resolutions
## :exclamation: Action Items
## :envelope_with_arrow: Feedbacks
- State machine modeling is missing.
- Long-running actions are handled differently between the two implementations (hololinked vs. OpenFlexure).
## Presentation Summary
- Deleting properties? What does that mean?
- Async actions -> how to handle them later on?
- For the state machine, what is generated in the TD?