# InfluxDB UI for Visualization

###### tags: `Independent Study`

:::success
**Reference:**
[Influxdata Document - Chronograf](https://docs.influxdata.com/chronograf/v1.8/introduction/getting-started/)
[Docker Hub - chronograf](https://hub.docker.com/_/chronograf/)
[Influxdata Blog - Basic C++ client](https://w2.influxdata.com/blog/getting-started-c-influxdb/)
[GitHub - influxdb-cxx (C++ client)](https://github.com/offa/influxdb-cxx)
[Influxdata Document - Client libraries](https://docs.influxdata.com/influxdb/v1.8/tools/api_client_libraries/)
[O-RAN Virtual Exhibition](https://www.virtualexhibition.o-ran.org/classic/generation/2021/category/intelligent-ran-control-demonstrations/sub/intelligent-control/118)
[influxdb v1 - issue](https://github.com/influxdata/influxdb/issues/11035)
[influxdb1-clientv2 github](https://github.com/influxdata/influxdb1-client)

**Note:**
We are using the old version of InfluxDB, so we have to install the old UI (Chronograf), open the UI, and configure the dashboard so the data can be displayed.

In this page, I will:
1. Introduce why InfluxDB is used.
2. Roughly explain the data I want to show for the independent study project.
3. Show how the data can be visualized on Chronograf.
4. Provide the configuration and source code that let xApps access InfluxDB.
:::

[TOC]

## InfluxDB Introduction
1. InfluxDB is a time-series database used to store runtime data, i.e., data that changes over time, e.g., RSRP, RSRQ, throughput.
2. The database used in the Cherry release is Redis, a key-value store for data that does not change over time, e.g., the gNB name or PLMN ID.
3. Combining the first two points, you can see why we need InfluxDB: the Near-RT RIC wants to analyze data that changes over time and then adjust the operation of the RAN accordingly.
4. In addition, compared with Prometheus (PromQL), InfluxDB has a UI for monitoring visualizations and, most importantly, durable long-term storage.

## What I Want to Show on InfluxDB
This part explains what I want to show on InfluxDB and why.

### Cell Load with Slice - Gauge
The goal is to let the operator visually track how the RIC adjusts cell load while the load-balancing algorithm is running. The pictures below roughly show what I want to display on InfluxDB: a gauge shows the current cell load and slice load, and a graph shows the cell load over time, the cell load increment, and the offloaded cell load. Currently I have no slice data, so I am not sure yet how to present it; it will be put aside for now.
![](https://i.imgur.com/1shyR7q.png)
![](https://i.imgur.com/b5usW1v.png)
![](https://i.imgur.com/P5SbDIf.png)

### Future Throughput of the Cell a UE Belongs to - Graph
The goal is to let the operator visually see the relation between UE downlink throughput and other UE data, e.g., signal strength and signal-to-noise ratio. The picture below shows what I want to display on InfluxDB.
![](https://i.imgur.com/yfc6GMy.png)

## Create an InfluxDB UI
The old version of InfluxDB (1.8.0) does not ship with a UI, so you have to install Chronograf and connect it to InfluxDB. Newer versions have the UI built in, but OSC seems to use the old version of InfluxDB, so I installed Chronograf.
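Before installing Chronograf, it can help to confirm which InfluxDB version the platform is actually running, since only the 1.x series needs a separate UI. A minimal check, assuming the InfluxDB service is reachable at the cluster IP and port found in Step 2 below:
```bash=
# The /ping endpoint returns an X-Influxdb-Version header showing the running
# version; a 1.x value means a separate UI such as Chronograf is required.
curl -sI http://10.97.7.71:8086/ping | grep -i influxdb-version
```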
### Step 1 - Install Chronograf
```bash=
docker pull chronograf:1.9
# docker pull grafana/grafana
```

### Step 2 - Check the IP & Port of InfluxDB
```bash=
kubectl get service -n ricplt
```
![](https://i.imgur.com/6yEpkE1.png)

### Step 3 - Run Chronograf
```bash=
docker run -p 8888:8888 --net=influxdb chronograf --influxdb-url=http://10.97.7.71:8086
# docker run -d --name=grafana -p 3000:3000 grafana/grafana
```

### Step 4 - Run AD xApp
[Installation Guide (E-Release)](https://hackmd.io/@OP6n8eNbREK5elXuZw1wBw/Hy1RaOFTK)

### Outcome
The screenshots below show that the data is indeed written to the database and can be browsed from Chronograf.
![](https://i.imgur.com/ehXIwVr.png)
![](https://i.imgur.com/ihyrzuW.png)
![](https://i.imgur.com/YvbLHtf.png)

## InfluxDB Client - C++
Since I might use the TS xApp to upload the cell load, we need the C++ client library for InfluxDB. Below are the installation steps and a basic example of writing data to InfluxDB from C++.

### Step 1 - Install cmake
```bash=
sudo apt-get update
sudo apt-get install libssl-dev
sudo apt-get install -y libcurl4-openssl-dev
sudo apt-get install -y autoconf gawk libtool automake pkg-config autoconf-archive
sudo apt install openssl
sudo apt install libsctp-dev
cd
wget https://github.com/Kitware/CMake/releases/download/v3.16.2/cmake-3.16.2.tar.gz
tar -xzvf cmake-3.16.2.tar.gz
cd cmake-3.16.2/
./bootstrap
make
sudo make install
cd
cp ./cmake-3.16.2/bin/cmake /usr/bin/
```

### Step 2 - Install InfluxDB Package
location: ts/Dockerfile
```bash=
apt-get update
apt install -y curl
apt-get install -y libcurl4-openssl-dev
apt-get install libboost-all-dev
cd
git clone https://github.com/offa/influxdb-cxx.git
cd influxdb-cxx
mkdir build && cd build
cmake -D INFLUXCXX_TESTING:BOOL=OFF ..
sudo make install
```

### Step 3 - Modify Dockerfile
location: ts/Dockerfile
```dockerfile=
#
# snarf up SDL dependencies, then pull SDL package and install
# RUN apt-get update
# RUN apt-get install -y libboost-filesystem1.65.1 libboost-system1.65.1 libhiredis0.13
# RUN wget -nv --content-disposition ${PC_STG_URL}/sdl_${SDL_VER}-1_amd64.deb/download.deb && \
#     wget -nv --content-disposition ${PC_STG_URL}/sdl-dev_${SDL_VER}-1_amd64.deb/download.deb &&\
#     dpkg -i sdl-dev_${SDL_VER}-1_amd64.deb sdl_${SDL_VER}-1_amd64.deb

RUN git clone https://github.com/Tencent/rapidjson && \
    cd rapidjson && \
    mkdir build && \
    cd build && \
    cmake -DCMAKE_INSTALL_PREFIX=/usr/local .. && \
    make install && \
    cd ${STAGE_DIR} && \
    rm -rf rapidjson

######### Ken Install influxDB ###
RUN apt-get update
RUN apt install -y curl
RUN apt-get install -y libcurl4-openssl-dev
RUN apt-get install -y libboost-all-dev
RUN git clone https://github.com/offa/influxdb-cxx.git
RUN cd influxdb-cxx && mkdir build && cd build && cmake -D INFLUXCXX_TESTING:BOOL=OFF .. && make install
RUN ldconfig

# install curl and gRPC dependencies
```
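The Dockerfile change above only installs the library; before wiring it into the build in Step 4, it is worth confirming the linker can actually find influxdb-cxx. A quick check (run it wherever the library was installed, on the host or in a container built from the modified image):
```bash=
# influxdb-cxx installs libInfluxDB.so under /usr/local/lib by default;
# the library should appear in the linker cache once ldconfig has run.
ldconfig -p | grep -i influxdb
```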
### Step 4 - Modify CMakeLists.txt
location: ts/src/ts_xapp/CMakeLists.txt
```cmake=
find_package(Protobuf REQUIRED)

add_executable( ts_xapp ts_xapp.cpp )

######### Ken Install influxDB ###
link_libraries(InfluxDB)
find_package(InfluxDB)

target_link_libraries( ts_xapp InfluxDB;ricxfcpp;rmr_si;pthread;curl;rc-api;grpc++;${Protobuf_LIBRARY} )

install( TARGETS ts_xapp DESTINATION ${install_bin} )
```

### Step 5 - Copy InfluxDB Header Files
```bash=
cd
cp -a /usr/local/include/ ts/src/ts_xapp/
```

### Step 6 - Modify Source Code for Testing (Write Data)
```cpp=
#include <grpcpp/security/credentials.h>

#include "../../ext/protobuf/api.grpc.pb.h"

//----- Ken Create the Route for InfluxDB
#include "InfluxDBFactory.h"

using namespace rapidjson;
using namespace std;
using namespace xapp;

using Namespace = std::string;
using Key = std::string;
using Data = std::vector<uint8_t>;
using DataMap = std::map<Key, Data>;
using Keys = std::set<Key>;

//----- Ken Create the Route for InfluxDB
string influxdb_url = "http://10.97.7.71:8086?db=UEData";
auto db_influx = influxdb::InfluxDBFactory::Get(influxdb_url);
```

```cpp=
extern int main( int argc, char** argv ) {

    //----- Ken Create the Route for InfluxDB
    db_influx->write(influxdb::Point{"test"}
        .addField("value", 10)
        .addTag("host", "localhost")
    );
    //-----------------------------------------

    int nthreads = 1;
    char* port = (char *) "4560";
    shared_ptr<grpc::Channel> channel;
```

### Outcome
For the basic operation, as the source code shows, we can specify the measurement, fields, and tags, and upload the data to the time-series database (InfluxDB) from C++.
![](https://i.imgur.com/Hp50qqT.png)

## InfluxDB Client - Go
Since I might use the KPIMON xApp to upload the data coming from RIC Test, we need the Go client library for InfluxDB. Below are the installation steps and a basic example of writing data to InfluxDB from Go.
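Before modifying the xApp, it can also be worth confirming that the database accepts writes over the HTTP API that the Go client wraps. This is just a manual sanity check against the same instance, database, and sample point used in the example below:
```bash=
# CREATE DATABASE is a no-op if UEData already exists; the second call writes
# one point in line protocol, which is what the Go client sends under the hood.
curl -XPOST 'http://10.97.7.71:8086/query' --data-urlencode 'q=CREATE DATABASE UEData'
curl -XPOST 'http://10.97.7.71:8086/write?db=UEData' \
  --data-binary 'cell_usage,kpi=kpi-total idle=10.1,system=53.3,user=46.6'
```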
### Step 1 - Modify the Dockerfile
location: kpimon/Dockerfile
```dockerfile=
# "COMPILING E2SM Wrapper"
RUN cd e2sm && \
    gcc -c -fPIC -Iheaders/ lib/*.c wrapper.c && \
    gcc *.o -shared -o libe2smwrapper.so && \
    cp libe2smwrapper.so /usr/local/lib/ && \
    mkdir /usr/local/include/e2sm && \
    cp wrapper.h headers/*.h /usr/local/include/e2sm && \
    ldconfig

WORKDIR /go/src/gerrit.o-ran-sc.org/r/scp/ric-app/kpimon

RUN mkdir pkg
RUN go env -w GO111MODULE=off

######### Ken Install influxDB ###
RUN go get -u github.com/influxdata/influxdb1-client/v2
####

RUN go build ./cmd/kpimon.go && pwd && ls -lat

FROM ubuntu:20.04
```

### Step 2 - Modify the Source Code for Testing (Write Data)
location: kpimon/control/control.go
```go=
import (
    "encoding/json"
    "errors"
    "log"
    "os"
    "strconv"
    "strings"
    "sync"
    "time"

    "gerrit.o-ran-sc.org/r/ric-plt/sdlgo"
    "gerrit.o-ran-sc.org/r/ric-plt/xapp-frame/pkg/xapp"

    //----- Ken Create the Route for InfluxDB
    "github.com/influxdata/influxdb1-client/v2"
    "fmt"
    //"github.com/go-redis/redis"
)
```

```go=
func (c *Control) sendRicSubRequest(subID int, requestSN int, funcID int) (err error) {

    //----- Ken Create the Route for InfluxDB
    cl, err := client.NewHTTPClient(client.HTTPConfig{
        Addr: "http://10.97.7.71:8086",
    })
    if err != nil {
        fmt.Println("Error creating InfluxDB Client: ", err.Error())
    }
    defer cl.Close()

    // Create a new point batch
    bp, _ := client.NewBatchPoints(client.BatchPointsConfig{
        Database: "UEData",
    })

    // Create a point and add it to the batch
    tags := map[string]string{"kpi": "kpi-total"}
    fields := map[string]interface{}{
        "idle":   10.1,
        "system": 53.3,
        "user":   46.6,
    }

    pt, err := client.NewPoint("cell_usage", tags, fields, time.Now())
    if err != nil {
        fmt.Println("Error: ", err.Error())
    }
    bp.AddPoint(pt)

    // Write the batch
    cl.Write(bp)
    //------------------------------------------------------------

    var e2ap *E2ap
    var e2sm *E2sm
```

### Outcome
For the basic operation, as the source code shows, we can specify the measurement, fields, and tags, and upload the data to the time-series database (InfluxDB) from Go.
![](https://i.imgur.com/RU5Rv8r.png)

## InfluxDB Client - Python
The AD & QP xApps can already access InfluxDB from Python, so I will just list the settings and source code needed to access InfluxDB, using the AD xApp source code as an example.

### Step 1 - Modify the Setting
location: setup.py
```python=
from setuptools import setup, find_packages

setup(
    name="test",
    # ... some settings here ...
    install_requires=["A", "B", "C", "influxdb"],
    # ... some settings here ...
)
```
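If you just want to try the following snippets outside the xApp build (for example, from a plain Python environment on the host), the same packages can be installed directly; this is only a convenience step, not part of the AD xApp packaging:
```bash=
# The influxdb package provides DataFrameClient; pandas is needed for the
# DataFrame-based writes used below.
pip install influxdb pandas
```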
### Step 2 - Import InfluxDB Package
```python=
from influxdb import DataFrameClient
import pandas
```

### Create Database
```python=
class INSERTDATA:

    def __init__(self):
        host = 'r4-influxdb.ricplt'
        self.client = DataFrameClient(host, '8086', 'root', 'root')
        self.dropdb('UEData')
        self.createdb('UEData')

    def createdb(self, dbname):
        print("Create database: " + dbname)
        self.client.create_database(dbname)
        self.client.switch_database(dbname)

    def dropdb(self, dbname):
        print("DROP database: " + dbname)
        self.client.drop_database(dbname)

    def dropmeas(self, measname):
        print("DROP MEASUREMENT: " + measname)
        self.client.query('DROP MEASUREMENT ' + measname)
```

### Write & Cluster (Tag) Data
```python=
db.client.write_points(data, 'valid')
#####################################################################
db.client.write_points(df, 'train', batch_size=500, protocol='line')
#####################################################################
db.client.write_points(df, 'liveUE', batch_size=500, protocol='line')
#####################################################################
def write_anomaly(self, df, meas='AD'):
    """Write data method for a given measurement

    Parameters
    ----------
    meas: str (default='AD')
    """
    self.client.write_points(df, meas)
#####################################################################
```

### Read Data
```python=
result = self.client.query('select * from ' + meas + ' limit ' + str(limit))
```

### Outcome
![](https://i.imgur.com/PLP2Bg3.png)

## Chronograf
### Template Variable
[Ref](https://docs.influxdata.com/chronograf/v1.9/guides/dashboard-template-variables/#quoting-template-variables-in-influxql)
```
# At the dashboard, import the CSV
# Name = ueid

# At the panel
SELECT mean("nr_cell_slice_usage_0_embb") AS "mean_nr_cell_slice_usage_0_embb"
FROM "UEData"."autogen"."slice_usage"
WHERE "ueid" = ':ueid:' AND time > :dashboardTime: AND time < :upperDashboardTime:
GROUP BY time(:interval:) FILL(null)
```

## Grafana
### Template Variable
[Ref](https://grafana.com/docs/grafana/latest/variables/variable-types/chained-variables/)
```
# At the variable settings, use the type "Query"
SELECT ("ueid") FROM "UEData"."autogen"."slice_usage"

# At the panel
WHERE ueid =~ /^$ueid$/
```

### Show Prediction
```
```

## Add InfluxDB Source Code into the Project (CMake)
**Add one project into another project** (the added project must already have its own CMakeLists.txt):
```
add_subdirectory(influxdb-cxx)
```
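As a rough end-to-end sketch of that approach (the paths are assumptions, not taken from the actual ts xApp tree): vendor influxdb-cxx next to the xApp sources, reference it with `add_subdirectory()` in the top-level CMakeLists.txt, and rebuild.
```bash=
# Build influxdb-cxx as a sub-project instead of installing it system-wide.
cd ts
git clone https://github.com/offa/influxdb-cxx.git
# In ts/CMakeLists.txt add:  add_subdirectory(influxdb-cxx)
# then link the xApp target against InfluxDB as in the C++ client Step 4 above.
mkdir -p build && cd build
cmake .. && make
```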