- ============LICENSE_START===============================================
-# Copyright (C) 2019-2023 Nordix Foundation. All rights reserved.
+# ============LICENSE_START===============================================
+# Copyright (C) 2020-2023 Nordix Foundation. All rights reserved.
# ========================================================================
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
#
---
-# .readthedocs.yml
-# Read the Docs configuration file
-# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
-# Required
version: 2
formats:
branch = 'latest'
-language = 'en'
-
linkcheck_ignore = [
'http://localhost.*',
'http://127.0.0.1.*',
'https://gerrit.o-ran-sc.org.*',
- './pm-producer-api.html', #Generated file that doesn't exist at link check.
]
extensions = ['sphinx.ext.intersphinx',]
---
project_cfg: oran
-project: nonrtric-plt-pmproducer
+project: nonrtric-plt-datafilecollector
This document provides a quickstart for developers of the Non-RT RIC Data File Collector.
-Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR/Release+E>`_.
+Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR>`_.
-Data File Collector
--------------------
+Data File Collector Service
+---------------------------
The DFC is configured via the application.yaml file.
.. SPDX-License-Identifier: CC-BY-4.0
.. Copyright (C) 2023 Nordix
-Non-RT RIC PM Producer
-======================
+Non-RT RIC PM Data File Collector
+=================================
.. toctree::
:maxdepth: 2
Introduction
************
-The task of the Data File Collector is to collect files from traffical nodes in the RAN.
+The task of the Data File Collector is to collect OAM data files from RAN traffic-handling nodes.
The main use case is:
-* The DFC received a File Ready VES event from a Kafka topic. This contains a list of all available files.
-
-* The DFC fetches files that are not already fectched from the traffical node. This is done using one of the supported file transfer protocols.
-
+* The DFC receives a "File Ready" VES event from a Kafka topic. This contains a list of all available files.
+* The DFC fetches files that are not already fetched from the relevant RAN traffic-handling nodes. This is done using one of the supported file transfer protocols.
* Each file is stored in an S3 Object Store bucket or in the file system (in a persistent volume).
-
-* For each stored file, a file publish message is sent to a Kafka topic for further processing.
+* For each stored file, a "File Publish" message is sent to a Kafka topic for further processing.
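The fetch step above (collect only files not already stored) can be sketched as a small filtering helper. This is an illustrative sketch only; the function name and data shapes are assumptions, not the actual DFC implementation:

```python
def files_to_fetch(available, already_stored):
    """Return the file names from a "File Ready" event that are not yet stored.

    available:       iterable of file names listed in the event.
    already_stored:  set of file names already in the S3 bucket / file system.
    """
    return [name for name in available if name not in already_stored]

# Example: only the file not yet stored is selected for fetching.
event_files = ["A20220418.1900-1915_gnb1.xml", "A20220418.1915-1930_gnb1.xml"]
stored = {"A20220418.1900-1915_gnb1.xml"}
print(files_to_fetch(event_files, stored))  # ['A20220418.1915-1930_gnb1.xml']
```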
Supported file transfer protocols are:
* SFTP
-
* FTPES
-
* HTTP
-
* HTTPS
"name":"A20220418.1900-1915_seliitdus00487.xml",
"hashMap":{
"fileFormatType":"org.3GPP.32.435#measCollec",
- "location":"https://launchpad.net/ubuntu/+source/perf-tools-unstable/1.0+git7ffb3fd-1ubuntu1/+build/13630748/+files/perf-tools-unstable_1.0+git7ffb3fd-1ubuntu1_all.deb",
+ "location":"https://gnb1.myran.org/pmfiles/",
"fileFormatVersion":"V10",
"compression":"gzip"
}
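As a sketch of how the event fields above could be used, the following joins "location" and "name" into a download URL. This assumes "location" is a base directory URL; whether a real event carries a directory or the full file URL may differ, so treat this as an illustration only:

```python
from urllib.parse import urljoin

def file_url(entry):
    """Join the event's "location" with the file "name".

    Assumes "location" is a base directory URL ending in '/'; in real
    events "location" may already be the complete file URL.
    """
    return urljoin(entry["location"], entry["name"])

entry = {
    "name": "A20220418.1900-1915_seliitdus00487.xml",
    "location": "https://gnb1.myran.org/pmfiles/",
}
print(file_url(entry))
# https://gnb1.myran.org/pmfiles/A20220418.1900-1915_seliitdus00487.xml
```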
+# ============LICENSE_START===============================================
+# Copyright (C) 2021-2023 Nordix Foundation. All rights reserved.
+# ========================================================================
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ============LICENSE_END=================================================
+#
+
from docs_conf import *
branch = 'latest'
'https://gerrit.o-ran-sc.org.*'
]
-#branch configuration
-
-
-linkcheck_ignore = [
- 'http://localhost.*',
- 'http://127.0.0.1.*',
- 'https://gerrit.o-ran-sc.org.*',
-]
-
extensions = [
'sphinx.ext.intersphinx',
'sphinx.ext.autosectionlabel',
---
project_cfg: oran
project: nonrtric-plt-ranpm
-
-# Change this to ReleaseBranchName to modify the header
-default-version: latest
sphinx_bootstrap_theme
sphinxcontrib-redoc
lfdocs-conf
-sphinx-multiproject
API-Docs
========
-Here we describe the APIs to access the Non-RIC Influx Logger.
+Here we describe the APIs to access the Non-RT RIC Influx Logger.
-Influx Logger
-=============
+Influx Logger API
+=================
The API is also described in Swagger-JSON and YAML:
from docs_conf.conf import *
-#branch configuration
-
branch = 'latest'
-language = 'en'
-
linkcheck_ignore = [
'http://localhost.*',
'http://127.0.0.1.*',
'./pmlog-api.html', #Generated file that doesn't exist at link check.
]
-extensions = ['sphinx.ext.intersphinx','sphinxcontrib.redoc']
+extensions = ['sphinx.ext.intersphinx','sphinxcontrib.redoc', 'sphinx.ext.autosectionlabel',]
redoc = [
{
redoc_uri = 'https://cdn.jsdelivr.net/npm/redoc@next/bundles/redoc.standalone.js'
-
-
-
#intersphinx mapping with other projects
intersphinx_mapping = {}
---
project_cfg: oran
-project: nonrtric-plt-pmproducer
+project: nonrtric-plt-influxlogger
This document provides a quickstart for developers of the Non-RT RIC Influx Logger.
-Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR/Release+E>`_.
+Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR>`_.
-Influx Logger
--------------
+Influx Logger Service
+---------------------
The component is configured via the application.yaml file.
.. SPDX-License-Identifier: CC-BY-4.0
.. Copyright (C) 2023 Nordix
-Non-RT RIC PM Producer
-======================
+Non-RT RIC Influx Logger
+========================
.. toctree::
:maxdepth: 2
************
The task of the Influx Logger is to receive PM Measurement reports from a Kafka topic and to
-store the mesurement in an Influx time serier database.
+store the measurements in an Influx time series database.
.. image:: ./Architecture.png
:width: 1000pt
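To illustrate what "storing a measurement in Influx" amounts to, the sketch below renders one counter value in Influx line protocol. The measurement, tag and field names here are invented for illustration; the actual schema is defined by the Influx Logger itself:

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Render one data point in Influx line protocol:
    <measurement>,<tag=val,...> <field=val,...> <timestamp-ns>
    (No escaping of special characters is done in this sketch.)"""
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_part} {field_part} {ts_ns}"

# Hypothetical counter from one cell (names are illustrative only).
line = to_line_protocol(
    "pm_counter",
    {"source": "gnb1", "measObjInstId": "NRCellCU-32"},
    {"pmCounterAtt": 42},
    1650308400000000000,
)
print(line)
# pm_counter,measObjInstId=NRCellCU-32,source=gnb1 pmCounterAtt=42 1650308400000000000
```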
-*****************************************
-Setting up the PM mesurement subscription
-*****************************************
+******************************************
+Setting up the PM measurement subscription
+******************************************
The influx logger will create its data subscription automatically. This is done by reading a file that
defines the data to log and which Kafka topic to use (1). The contents of this file are used to create
Input PM Measurement
********************
-The PM measuremenet information received from the Kafka topic is produced by the pmproducer.
+The PM measurement information received from the Kafka topic is produced by the pm-producer.
Here follows an example of the expected input object:
.. code-block:: javascript
Here we describe the APIs to access the Non-RT RIC PM Producer.
-PM Producer
-=============
+PM Producer API
+===============
The PM Producer provides support for delivery of PM measurement data over Kafka.
.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0
-.. Copyright (C) 2022 Nordix
+.. Copyright (C) 2023 Nordix
Developer Guide
===============
This document provides a quickstart for developers of the Non-RT RIC PM Producer.
-Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR/Release+E>`_.
+Additional developer guides are available on the `O-RAN SC NONRTRIC Developer wiki <https://wiki.o-ran-sc.org/display/RICNR>`_.
PM Producer Service
----------------------
+-------------------
The following properties in the application.yaml file have to be modified:
+
* server.ssl.key-store=./config/keystore.jks
* app.webclient.trust-store=./config/truststore.jks
* app.configuration-filepath=./src/test/resources/test_application_configuration.json
.. Copyright (C) 2023 Nordix
-PM Producer
-~~~~~~~~~~~~~
+Non-RT RIC PM Producer
+~~~~~~~~~~~~~~~~~~~~~~
************
Introduction
Sent Kafka headers
==================
-For each filtered result sent to a Kafka topic, there will the following proerties in the Kafa header:
+For each filtered result sent to a Kafka topic, there will be the following properties in the Kafka header:
-* type-id, this propery is used to indicate the ID of the information type. The value is a string.
-* gzip, if this property exists the object is gzipped (otherwise not). The property has no value.
-* source-name, the name of the source traffical element for the measurements.
+* type-id, this property is used to indicate the ID of the information type. The value is a string.
+* gzip, if this property exists the object is gzipped (otherwise not). The property has no value.
+* source-name, the name of the source RAN traffic-handling element from which the measurements will originate.
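A consumer can interpret the headers above roughly as follows. This is a hedged sketch assuming headers arrive as (key, bytes) pairs, as typical Kafka clients deliver them; the type-id value used is invented for the example:

```python
import gzip

def decode_record(headers, value):
    """Decode a filtered-result record using the headers described above.

    headers: list of (key, value) tuples as delivered by a Kafka client.
    Returns (type_id, source_name, payload_bytes); the payload is
    decompressed if the valueless "gzip" header is present.
    """
    keys = {k: v for k, v in headers}
    payload = gzip.decompress(value) if "gzip" in keys else value
    type_id = keys.get("type-id", b"").decode()
    source_name = keys.get("source-name", b"").decode()
    return type_id, source_name, payload

# Example with a gzipped payload ("PmData_1.0" is an illustrative type ID):
raw = gzip.compress(b'{"measurements": []}')
hdrs = [("type-id", b"PmData_1.0"), ("gzip", b""), ("source-name", b"gnb1")]
print(decode_record(hdrs, raw))
# ('PmData_1.0', 'gnb1', b'{"measurements": []}')
```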
******************
Configuration File
******************
-The configuration file defines Kafka topics that should be listened to and registered as subscribeable information types.
+The configuration file defines Kafka topics that should be listened to and registered as information types which can be subscribed to.
There is an example configuration file in config/application_configuration.json
Each entry will be registered in ICS as an information type which can be subscribed to. The following attributes can be used in each entry:
* filter, the value of the filter expression. This selects which data to subscribe to. All fields are optional and excluding a field means that everything is selected.
- * sourceNames, section of the names of the reporting traffical nodes
- * measObjInstIds, selection of the measured resources. This is the Relative Distingusished Name of the MO that
+ * sourceNames, selection of the names of the reporting RAN traffic-handling nodes
+ * measObjInstIds, selection of the measured resources. This is the Relative Distinguished Name (RDN) of the MO that
has the counter.
If a given value is contained in the filter definition, it will match (partial matching).
For instance a value like "NRCellCU" will match "ManagedElement=seliitdus00487,GNBCUCPFunction=1,NRCellCU=32".
* measTypes, the name of the measurement type (counter). The measurement type name is only
unique in the scope of an MO class (measured resource).
- * measuredEntityDns, selection of DNs for the traffical elements.
+ * measuredEntityDns, selection of DNs for the RAN traffic-handling elements.
* pmRopStartTime, if this parameter is specified, already collected PM measurement files will be scanned to retrieve historical data.
The start time is the time from when the information shall be returned.
In this case, the query is only done for files from the given "sourceNames".
- If this parameter is excluded, only "new" reports will be delivered as they are collected from the traffical nodes.
+ If this parameter is excluded, only "new" reports will be delivered as they are collected from the RAN traffic-handling nodes.
* pmRopEndTime, for querying already collected PM measurements. Only relevant if pmRopStartTime is given.
If this parameter is given, no reports will be sent as new files are collected.
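The partial matching described for the filter fields can be sketched as containment over the configured values (an empty selection matching everything, as stated above). The function name is illustrative:

```python
def matches(filter_values, candidate):
    """Partial matching: a candidate matches if any configured filter value
    is contained in it; an empty/absent filter selects everything."""
    return not filter_values or any(v in candidate for v in filter_values)

dn = "ManagedElement=seliitdus00487,GNBCUCPFunction=1,NRCellCU=32"
print(matches(["NRCellCU"], dn))  # True  (partial match)
print(matches(["NRCellDU"], dn))  # False
print(matches([], dn))            # True  (no filter -> everything selected)
```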
}
Here follows an example of a filter that will
-match two counters from all cells in two traffical nodes.
+match two counters from all cells in two RAN traffic-handling nodes.
.. code-block:: javascript
# limitations under the License.
# ==================================================================================
+# documentation only
[tox]
minversion = 2.0
envlist =
docs,
docs-linkcheck,
-
skipsdist = true
[testenv:docs]
basepython = python3
-deps = -r{toxinidir}/docs/requirements-docs.txt
+deps =
+ -r{toxinidir}/docs/requirements-docs.txt
+ -r{toxinidir}/datafilecollector/docs/requirements-docs.txt
+ -r{toxinidir}/influxlogger/docs/requirements-docs.txt
+ -r{toxinidir}/pmproducer/docs/requirements-docs.txt
commands =
sphinx-build -W -b html --keep-going -n -d {envtmpdir}/doctrees ./docs/ {toxinidir}/docs/_build/html