From: josephthaliath
Date: Tue, 6 Dec 2022 05:11:38 +0000 (+0530)
Subject: Installation documentation
X-Git-Url: https://gerrit.o-ran-sc.org/r/gitweb?a=commitdiff_plain;h=3774e9226327edeb59997447cba69c431e18da56;p=aiml-fw%2Faimlfw-dep.git

Installation documentation

Issue-Id: AIMLFW-12
Signed-off-by: josephthaliath
Change-Id: I8edebaa5fc10eea2f9bf9223c0bfa9f3b105fd75
---

diff --git a/docs/_static/logo.png b/docs/_static/logo.png
new file mode 100644
index 0000000..3803e04
Binary files /dev/null and b/docs/_static/logo.png differ
diff --git a/docs/conf.py b/docs/conf.py
new file mode 100644
index 0000000..d96be56
--- /dev/null
+++ b/docs/conf.py
@@ -0,0 +1,23 @@
+# ==================================================================================
+#
+# Copyright (c) 2022 Samsung Electronics Co., Ltd. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# ==================================================================================
+
+from docs_conf.conf import *
+
+linkcheck_ignore = [
+    'http://localhost',
+]
diff --git a/docs/conf.yaml b/docs/conf.yaml
new file mode 100644
index 0000000..c799c4e
--- /dev/null
+++ b/docs/conf.yaml
@@ -0,0 +1,6 @@
+---
+project_cfg: oran
+project: aimlfw-dep
+
+# Change this to ReleaseBranchName to modify the header
+version: latest
diff --git a/docs/favicon.ico b/docs/favicon.ico
new file mode 100644
index 0000000..00b0fd0
Binary files /dev/null and b/docs/favicon.ico differ
diff --git a/docs/index.rst b/docs/index.rst
new file mode 100644
index 0000000..8d39895
--- /dev/null
+++ b/docs/index.rst
@@ -0,0 +1,18 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. SPDX-License-Identifier: CC-BY-4.0
+
+.. Copyright (c) 2022 Samsung Electronics Co., Ltd. All Rights Reserved.
+
+Welcome to O-RAN SC aimlfw-dep Documentation
+============================================
+
+.. toctree::
+   :maxdepth: 2
+   :caption: Contents:
+
+   installation-guide.rst
+
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/docs/installation-guide.rst b/docs/installation-guide.rst
new file mode 100644
index 0000000..648346c
--- /dev/null
+++ b/docs/installation-guide.rst
@@ -0,0 +1,334 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+.. Copyright (c) 2022 Samsung Electronics Co., Ltd. All Rights Reserved.
+
+
+Installation Guide
+==================
+
+.. contents::
+   :depth: 3
+   :local:
+
+Abstract
+--------
+
+This document describes how to install AIMLFW, its dependencies and required system resources.
+
+
+Version history
+
++--------------------+--------------------+--------------------+--------------------+
+| **Date**           | **Ver.**           | **Author**         | **Comment**        |
+|                    |                    |                    |                    |
++--------------------+--------------------+--------------------+--------------------+
+| 2022-11-30         | 0.1.0              |                    | First draft        |
+|                    |                    |                    |                    |
++--------------------+--------------------+--------------------+--------------------+
+|                    |                    |                    |                    |
+|                    |                    |                    |                    |
++--------------------+--------------------+--------------------+--------------------+
+|                    |                    |                    |                    |
+|                    |                    |                    |                    |
+|                    |                    |                    |                    |
++--------------------+--------------------+--------------------+--------------------+
+
+
+Introduction
+------------
+
+..
+
+
+This document describes the supported software and hardware configurations for the reference component, and provides guidelines on how to install and configure such a reference system.
+
+The audience of this document is assumed to have good knowledge of RAN networks and Linux systems.
+
+
+Hardware Requirements
+---------------------
+..
+
+Below are the minimum requirements for installing AIMLFW:
+
+1. OS: Ubuntu 18.04 server
+2. 8 CPU cores
+3. 16 GB RAM
+4. 60 GB hard disk
+
+Software Installation and Deployment
+------------------------------------
+..
+
+.. code:: bash
+
+   git clone "https://gerrit.o-ran-sc.org/r/aiml-fw/aimlfw-dep"
+   cd aimlfw-dep
+
+Update the recipe file “RECIPE_EXAMPLE/example_recipe_latest_stable.yaml” with the VM IP address and the datalake details.
+Note: if the Influx DB datalake is not available, this step can be skipped at this stage and done after installing the datalake.
+
+.. code:: bash
+
+   bin/install_traininghost.sh
+
+
+
+Check the running state of all pods and services using the commands below:
+
+.. code:: bash
+
+   kubectl get pods --all-namespaces
+   kubectl get svc --all-namespaces
+
+
+Check the AIMLFW dashboard using the following URL:
+
+.. code:: bash
+
+   http://localhost:32005/
+
+If the Influx DB datalake is not available, it can be installed using the steps in the section “Install Influx DB as datalake”. Once installed, update the access details of the datalake in RECIPE_EXAMPLE/example_recipe_latest_stable.yaml. Once updated, follow the steps below to reinstall some components:
+
+.. code:: bash
+
+   bin/uninstall.sh
+   bin/install.sh -f RECIPE_EXAMPLE/example_recipe_latest_stable.yaml
+
+Following are the steps to build the sample training pipeline image for the QoE prediction example.
+This step is required before triggering training for the QoE prediction example.
+
+.. code:: bash
+
+   cd /tmp/
+   git clone "https://gerrit.o-ran-sc.org/r/portal/aiml-dashboard"
+   docker build -f aiml-dashboard/kf-pipelines/Dockerfile.pipeline -t traininghost/pipelineimage:latest aiml-dashboard/kf-pipelines/.
+
+Software Uninstallation
+-----------------------
+
+.. code:: bash
+
+   bin/uninstall_traininghost.sh
+
+Install Influx DB as datalake
+-----------------------------
+
+.. code:: bash
+
+   helm install my-release bitnami/influxdb
+   kubectl exec -it <Influx DB pod name> -- bash
+
+The command below prints the username, org name, org ID and access token:
+
+.. code:: bash
+
+   cat bitnami/influxdb/influxd.bolt | tr -cd "[:print:]"
+
+e.g.: {"id":"0a576f4ba82db000","token":"xJVlOom1GRUxDNkldo1v","status":"active","description":"admin's Token","orgID":"783d5882c44b34f0","userID":"0a576f4b91edb000","permissions" ...
+
+Use this token in the configurations below and in the recipe file.
+
+Following are the steps to add QoE data to Influx DB.
+
+
+Execute the command below from inside the Influx DB container to create a bucket:
+
+.. code:: bash
+
+   influx bucket create -n UEData -o primary -t <access token>
+
+
+Install the following dependencies:
+
+.. code:: bash
+
+   sudo pip3 install pandas
+   sudo pip3 install influxdb_client
+
+
+Use insert.py in the ric-app/qp repository to upload the QoE data into Influx DB:
+
+
+.. code:: bash
+
+   git clone https://gerrit.o-ran-sc.org/r/ric-app/qp
+   cd qp/qp
+
+Update the insert.py file with the following content:
+
+.. code:: python
+
+   import pandas as pd
+   from influxdb_client import InfluxDBClient
+   from influxdb_client.client.write_api import SYNCHRONOUS
+   import datetime
+
+
+   class INSERTDATA:
+
+       def __init__(self):
+           self.client = InfluxDBClient(url="http://localhost:8086", token="")
+
+
+   def explode(df):
+       for col in df.columns:
+           if isinstance(df.iloc[0][col], list):
+               df = df.explode(col)
+           d = df[col].apply(pd.Series)
+           df[d.columns] = d
+           df = df.drop(col, axis=1)
+       return df
+
+
+   def jsonToTable(df):
+       df.index = range(len(df))
+       cols = [col for col in df.columns if isinstance(df.iloc[0][col], dict) or isinstance(df.iloc[0][col], list)]
+       if len(cols) == 0:
+           return df
+       for col in cols:
+           d = explode(pd.DataFrame(df[col], columns=[col]))
+           d = d.dropna(axis=1, how='all')
+           df = pd.concat([df, d], axis=1)
+           df = df.drop(col, axis=1).dropna()
+       return jsonToTable(df)
+
+
+   def time(df):
+       df.index = pd.date_range(start=datetime.datetime.now(), freq='10ms', periods=len(df))
+       df['measTimeStampRf'] = df['measTimeStampRf'].apply(lambda x: str(x))
+       return df
+
+
+   def populatedb():
+       df = pd.read_json('cell.json.gz', lines=True)
+       df = df[['cellMeasReport']].dropna()
+       df = jsonToTable(df)
+       df = time(df)
+       db = INSERTDATA()
+       write_api = db.client.write_api(write_options=SYNCHRONOUS)
+       write_api.write(bucket="UEData", record=df, data_frame_measurement_name="liveCell", org="primary")
+
+   populatedb()
+
+
+Update the access token in the insert.py file.
+
+Use the command below to port-forward and access Influx DB:
+
+.. code:: bash
+
+   kubectl port-forward svc/my-release-influxdb 8086:8086
+
+To insert data:
+
+.. code:: bash
+
+   python3 insert.py
+
+To check the inserted data in Influx DB, execute the command below inside the Influx DB container:
+
+.. code:: bash
+
+   influx query 'from(bucket: "UEData") |> range(start: -1000d)' -o primary -t <access token>
+
+
+Install Kserve for deploying models
+-----------------------------------
+
+To install Kserve, run the command below:
+
+.. code:: bash
+
+   curl -s "https://raw.githubusercontent.com/kserve/kserve/release-0.7/hack/quick_install.sh" | bash
+
+Deploy trained QoE prediction model on Kserve
+---------------------------------------------
+
+Create a namespace using the command below:
+
+.. code:: bash
+
+   kubectl create namespace kserve-test
+
+Create a qoe.yaml file with the contents below:
+
+.. code:: yaml
+
+   apiVersion: "serving.kserve.io/v1beta1"
+   kind: "InferenceService"
+   metadata:
+     name: qoe-model
+   spec:
+     predictor:
+       tensorflow:
+         storageUri: ""
+         runtimeVersion: "2.5.1"
+         resources:
+           requests:
+             cpu: 0.1
+             memory: 0.5Gi
+           limits:
+             cpu: 0.1
+             memory: 0.5Gi
+
+
+To deploy the model, update the model URL (storageUri) in the qoe.yaml file and execute the command below:
+
+.. code:: bash
+
+   kubectl apply -f qoe.yaml -n kserve-test
+
+Check the running state of the pod using the command below:
+
+.. code:: bash
+
+   kubectl get pods -n kserve-test
+
+
+Test predictions using model deployed on Kserve
+-----------------------------------------------
+
+Use the command below to obtain the ingress port for Kserve:
+
+.. code:: bash
+
+   kubectl get svc istio-ingressgateway -n istio-system
+
+Obtain the NodePort corresponding to port 80. In the example below, the port is 31206:
+
+.. code:: bash
+
+   NAME                   TYPE           CLUSTER-IP       EXTERNAL-IP   PORT(S)                                                                       AGE
+   istio-ingressgateway   LoadBalancer   10.105.222.242                 15021:31423/TCP,80:31206/TCP,443:32145/TCP,31400:32338/TCP,15443:31846/TCP    4h15m
+
+
+Create a predict.sh file with the following contents:
+
+.. code:: bash
+
+   model_name=qoe-model
+   curl -v -H "Host: $model_name.kserve-test.example.com" http://<node IP>:<ingress port>/v1/models/$model_name:predict -d @./input_qoe.json
+
+Update the IP of the host where Kserve is deployed and the ingress port of Kserve obtained using the method above.
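The request that predict.sh issues with curl can also be built and sent from Python, which makes it easier to inspect the response programmatically. A minimal sketch, using only the standard library; the node IP `10.0.0.1` and port `31206` are placeholder values (substitute the values obtained above), and the Host header follows the `<model>.kserve-test.example.com` convention used in predict.sh:

```python
import json
import urllib.request

def build_predict_request(model_name, node_ip, ingress_port, payload):
    """Build the KServe v1 predict request that predict.sh issues with curl."""
    url = f"http://{node_ip}:{ingress_port}/v1/models/{model_name}:predict"
    # KServe routes on the Host header, e.g. qoe-model.kserve-test.example.com
    headers = {"Host": f"{model_name}.kserve-test.example.com",
               "Content-Type": "application/json"}
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(url, data=data, headers=headers, method="POST")

# Same window shape as input_qoe.json: one instance of 10 timesteps x 2 features
payload = {"signature_name": "serving_default",
           "instances": [[[2.56, 2.56]] * 10]}
req = build_predict_request("qoe-model", "10.0.0.1", 31206, payload)
print(req.full_url)
# To actually send it (needs a reachable cluster):
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read()))
```

This is equivalent to the curl call above; only the host, port and payload are assumptions to be replaced with real values.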
+
+Create sample data for predictions in a file named input_qoe.json. Add the following content to the input_qoe.json file:
+
+.. code:: json
+
+   {"signature_name": "serving_default", "instances": [[[2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56],
+   [2.56, 2.56]]]}
+
+
+Use the command below to trigger predictions:
+
+.. code:: bash
+
+   source predict.sh
diff --git a/docs/requirements-docs.txt b/docs/requirements-docs.txt
new file mode 100644
index 0000000..7372123
--- /dev/null
+++ b/docs/requirements-docs.txt
@@ -0,0 +1,6 @@
+sphinx
+sphinx_rtd_theme>=1.0.0
+docutils<0.17
+sphinxcontrib-httpdomain
+recommonmark
+lfdocs-conf
diff --git a/tox.ini b/tox.ini
new file mode 100644
index 0000000..72742cf
--- /dev/null
+++ b/tox.ini
@@ -0,0 +1,30 @@
+# documentation only
+[tox]
+minversion = 2.0
+envlist =
+    docs,
+    docs-linkcheck,
+skipsdist = true
+
+[testenv:docs]
+basepython = python3
+deps =
+    sphinx
+    sphinx-rtd-theme
+    sphinxcontrib-httpdomain
+    recommonmark
+    lfdocs-conf
+
+commands =
+    sphinx-build -W -b html -n -d {envtmpdir}/doctrees ./docs/ {toxinidir}/docs/_build/html
+    echo "Generated docs available in {toxinidir}/docs/_build/html"
+whitelist_externals = echo
+
+[testenv:docs-linkcheck]
+basepython = python3
+deps = sphinx
+    sphinx-rtd-theme
+    sphinxcontrib-httpdomain
+    recommonmark
+    lfdocs-conf
+commands = sphinx-build -W -b linkcheck -d {envtmpdir}/doctrees ./docs/ {toxinidir}/docs/_build/linkcheck
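As a closing note on the prediction test earlier in this guide: the input_qoe.json sample (ten `[2.56, 2.56]` timesteps) can be generated programmatically instead of typed by hand. A minimal sketch; the file name matches the one predict.sh reads, and the helper function name is illustrative only:

```python
import json

def make_qoe_payload(timesteps=10, features=(2.56, 2.56)):
    """Build one prediction instance: a timesteps x len(features) window."""
    instance = [list(features) for _ in range(timesteps)]
    return {"signature_name": "serving_default", "instances": [instance]}

payload = make_qoe_payload()
with open("input_qoe.json", "w") as f:
    json.dump(payload, f)
print(len(payload["instances"][0]))  # → 10
```

Varying `timesteps` or `features` makes it easy to test other window shapes against a retrained model.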