[REC-80] RIC port to AArch64 Created: 20/Jan/20  Updated: 15/Jun/20  Resolved: 05/Mar/20

Status: Done
Project: Radio Edge Cloud
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Task Priority: Medium
Reporter: Alex Antone Assignee: Alex Antone
Resolution: Done Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Attachments: Microsoft Word RIC Components.xlsx     Text File RIC_tarfile_contents.txt     File job-Install_RIC_on_OpenEdge1-206-console-timestamp.log    

 Description   

 See attached RIC Components.xlsx for porting status.
 
Looking at logs from https://nexus.akraino.org/content/sites/logs/att/job/Install_RIC_on_OpenEdge1/203/

  • These Docker images are listed in the ric.tar file (some of them have multiple versions):
    /ric/ric_images/ :
      aaf_agent:2.1.15
      aaf_cass:2.1.15
      aaf_config:2.1.15
      aaf_core:2.1.15
      aaf_hello:2.1.15
      all-in-one:1.12
      alpine:latest
      busybox:latest
      cassandra:3.11.3
      cassandra_music:3.0.0
      chartmuseum:v0.8.2
      distcenter:4.0.0
      dmaap-mr:1.1.13
      filebeat:5.5.0
      it-dep-init:0.0.1
      it-dep-secret:0.0.2
      kafka111:1.0.0
      kong:1.3
      kong-ingress-controller:0.6.0
      mariadb-client-init:3.0.0
      minideb:latest
      org.onap.dcaegen2.collectors.ves.vescollector:1.4.4
      portal-app:2.5.0
      portal-db:2.5.0
      portal-sdk:2.5.0
      portal-wms:2.5.0
      postgresql:10.6.0
      readiness-check:2.0.2
      ric-dashboard:1.2.2
      ric-plt-a1:0.10.3
      ric-plt-appmgr:0.1.9
      ric-plt-dbaas:0.1.0
      ric-plt-e2:2.0.7
      ric-plt-e2mgr:2.0.7
      ric-plt-rsm:2.0.6
      ric-plt-rtmgr:0.3.3
      ric-plt-submgr:0.10.0
      ric-plt-vespamgr:0.0.5
      rtmgr:0.3.4
      sms:4.0.1
      smsquorumclient:4.0.0
      submgr:0.6.2
      testcaservice:4.0.0
      tiller:v2.12.3
      ubuntu:16.04
      ubuntu-init:2.0.0
      zookeeper:5.0.0
    

 



 Comments   
Comment by Alex Antone [ 20/Feb/20 ]

RIC cluster seems to be deploying successfully.
Working on standalone testing:

  • xApps installation
  • Workload generator tests
Comment by Alex Antone [ 04/Feb/20 ]

WIP RIC integration scripts at:
https://github.com/alexantone/ric-it-dep/commits/aarch64
in:
https://github.com/alexantone/ric-it-dep/tree/aarch64/_install

Comment by Alex Antone [ 04/Feb/20 ]

https://wiki.akraino.org/display/AK/REC+Project+Minutes+2020.01.23 :

  • We continue to monitor ORAN-SC progress on the second release (codename "Bronze") for any new requirements on REC/TA so we can integrate it. In the meantime, our deployment script for post-Amber (Amber being the codename for ORAN-SC's first release) is slightly broken (most of the RIC deploys successfully, but two pods have issues), and we are working on transitioning from the "old" to the "new" RIC installation procedure.
  • The installation scripts used by the AT&T CD are in the process of being updated. The current scripts should probably not be used, because the RIC installation process changed and the scripts are for the "old" way. Instead, either use the documentation from https://gerrit.o-ran-sc.org/r/gitweb?p=it/dep.git;a=tree;f=docs;h=c9c96f7b3ee8b9ec77d5527990048166e636f62c;hb=HEAD or wait until we finish revising the current REC deployment automation scripts to use the new RIC install process.

 

https://wiki.akraino.org/display/AK/REC+Project+Minutes+2020.01.31 :

  • YAML files and tar file currently used in CD flow are obsolete but have not been refactored out yet. They are related to the old way of installing the RIC. Work is in progress to move to the new installation procedure.
Comment by Alex Antone [ 03/Feb/20 ]

Regarding RIC integration, based on the CI/CD logs, there is one more step, "Install_CAAS_Ingress_on_OE1", before installing the RIC, whose logs are not pushed to Nexus:
https://nexus.akraino.org/content/sites/logs/att/job/Cloudtaf_test_suite_Middletown_OE1/173/console-timestamp.log.gz :

13:54:55  Started by upstream project "Install_RIC_on_OpenEdge1" build number 206
13:54:55  originally caused by:
13:54:55   Started by upstream project "Install_CAAS_Ingress_on_OE1" build number 38
13:54:55   originally caused by:
13:54:55    Started by upstream project "Install_REC_on_OpenEdge1" build number 412
13:54:55    originally caused by:
13:54:55     Started by upstream project "Synchronize_REC_ISO_cache" build number 3900

Install_REC_on_OpenEdge1 build number 412 -> https://nexus.akraino.org/content/sites/logs/att/job/Install_REC_on_OpenEdge1/412/ (Jan 27 2020)
Install_CAAS_Ingress_on_OE1 build number 38 -> ??
Install_RIC_on_OpenEdge1 build number 206 -> https://nexus.akraino.org/content/sites/logs/att/job/Install_RIC_on_OpenEdge1/206/ (Jan 27 2020)

Comment by Alex Antone [ 03/Feb/20 ]

danmnet network configuration from the attached job-Install_RIC_on_OpenEdge1-206-console-timestamp.log log file:

 + cd ./REC_integration/danm/Middletown_OE1
 + ls danmnet40-e2adapter.yaml danmnet40-portal.yaml danmnet40-ricplatform.yaml
 + xargs -n 1 kubectl apply -f
 Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
 clusternetwork.danm.k8s.io/default configured
 clusternetwork.danm.k8s.io/e2agent created
 clusternetwork.danm.k8s.io/e2adapter created
 clusternetwork.danm.k8s.io/default unchanged
 clusternetwork.danm.k8s.io/ext created
 clusternetwork.danm.k8s.io/ext2 created
 clusternetwork.danm.k8s.io/ew created
 clusternetwork.danm.k8s.io/default unchanged
 clusternetwork.danm.k8s.io/ran created
 clusternetwork.danm.k8s.io/ext1 created
 clusternetwork.danm.k8s.io/ew configured
Comment by Alex Antone [ 31/Jan/20 ]

Found some more charts under ric-infra/kong (Updated RIC Components.xlsx)

kong:
  • kong
  • kong-ingress-controller
  • busybox

kong/charts/cassandra:
  • cassandra
  • nuvo/cain
  • criteord/cassandra_exporter

kong/charts/postgresql:
  • bitnami/postgresql
  • bitnami/minideb
  • wrouesnel/postgres_exporter

xapp-tiller:
  • kubernetes-helm/tiller
Comment by Alex Antone [ 31/Jan/20 ]

Ported:
jaegertracing/all-in-one -> akrainoenea/jaegertracing-all-in-one:1.12 commit

Comment by Alex Antone [ 29/Jan/20 ]

Ported:
ric-plt-e2mgr -> akrainoenea/ric-plt-e2mgr:2.0.10 commit
ric-plt-e2 -> akrainoenea/ric-plt-e2:3.0.4.5 commit

Comment by Alex Antone [ 29/Jan/20 ]

The fact that some Docker images clone and build from the latest master of external repositories could also be a significant problem.
Versions should be pinned by tag.

For ric-plt-e2:
git clone https://gerrit.o-ran-sc.org/r/ric-plt/lib/rmr -b ${RMR_VER} # RMR_VER=1.13.0 converted from .deb package
git clone http://gerrit.o-ran-sc.org/r/com/log
git clone https://github.com/jarro2783/cxxopts.git
git clone https://github.com/Tencent/rapidjson.git
git clone https://github.com/cgreen-devs/cgreen.git -b 1.2.0 # converted from .deb package
git clone https://github.com/bilke/cmake-modules.git
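To pin the remaining clones, something like the following could work (a sketch: the helper name is hypothetical, and only the two refs already noted above are known; the rest of the dependencies would need real tags chosen the same way). The echo keeps it a dry run:

```shell
# Hypothetical helper: clone a repo at an exact ref, shallow, so builds
# stay reproducible. 'echo' makes this a dry run; remove it to clone.
pin_clone() {
    url=$1; ref=$2; dir=$3
    echo git clone --depth 1 --branch "$ref" "$url" "$dir"
}

# Refs taken from the Dockerfile excerpt above:
pin_clone https://gerrit.o-ran-sc.org/r/ric-plt/lib/rmr 1.13.0 rmr
pin_clone https://github.com/cgreen-devs/cgreen.git 1.2.0 cgreen
```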

Comment by Alex Antone [ 29/Jan/20 ]

ric-plt-e2 fails to build on AArch64 due to x86-specific assembler code:

Scanning dependencies of target e2
[ 99%] Building CXX object CMakeFiles/e2.dir/RIC-E2-TERMINATION/sctpThread.cpp.o
In file included from /opt/e2/RIC-E2-TERMINATION/sctpThread.cpp:22:0:
/opt/e2/RIC-E2-TERMINATION/sctpThread.h: In function 'double approx_CPU_MHz(unsigned int)':
/opt/e2/RIC-E2-TERMINATION/sctpThread.h:430:71: error: impossible constraint in 'asm'
     asm volatile ("rdtscp\n" : "=a" (rax), "=d" (rdx), "=c" (aux) : :);
                                                                       ^
/opt/e2/RIC-E2-TERMINATION/sctpThread.h:430:71: error: impossible constraint in 'asm'
     asm volatile ("rdtscp\n" : "=a" (rax), "=d" (rdx), "=c" (aux) : :);
                                                                       ^
CMakeFiles/e2.dir/build.make:62: recipe for target 'CMakeFiles/e2.dir/RIC-E2-TERMINATION/sctpThread.cpp.o' failed
make[2]: *** [CMakeFiles/e2.dir/RIC-E2-TERMINATION/sctpThread.cpp.o] Error 1
CMakeFiles/Makefile2:149: recipe for target 'CMakeFiles/e2.dir/all' failed
Makefile:129: recipe for target 'all' failed
make[1]: *** [CMakeFiles/e2.dir/all] Error 2
make: *** [all] Error 2
The command '/bin/sh -c cd /opt/e2/ && git clone https://github.com/bilke/cmake-modules.git     && cd /opt/e2/ && /usr/local/bin/cmake -D CMAKE_BUILD_TYPE=$BUILD_TYPE . && make' returned a non-zero code: 2
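One possible direction for a fix (a sketch only, not the upstream resolution): keep rdtscp behind an architecture guard and fall back to the AArch64 virtual counter, or to clock_gettime, elsewhere. The header name below is made up for illustration:

```shell
cat > /tmp/cycle_counter.h <<'EOF'
/* Sketch of an architecture-guarded counter; not the actual upstream fix. */
#include <stdint.h>
#include <time.h>

static inline uint64_t cycle_counter(void) {
#if defined(__x86_64__)
    uint64_t rax, rdx;
    uint32_t aux;
    asm volatile ("rdtscp\n" : "=a" (rax), "=d" (rdx), "=c" (aux) : :);
    return (rdx << 32) | rax;
#elif defined(__aarch64__)
    uint64_t cnt;
    /* cntvct_el0 is the EL0-readable virtual count register */
    asm volatile ("mrs %0, cntvct_el0" : "=r" (cnt));
    return cnt;
#else
    /* portable fallback: monotonic clock in nanoseconds */
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ULL + ts.tv_nsec;
#endif
}
EOF
```

Note that on AArch64 the counter ticks at the frequency reported by cntfrq_el0, not the CPU clock, so a function like approx_CPU_MHz() would need rethinking rather than a drop-in swap.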
Comment by Alex Antone [ 29/Jan/20 ]

It also seems that Docker image tags are selected based on the example recipes.

For ric-plt-e2mgr:

  • On the Amber branch, 1.6.0 is overridden by 2.0.10:
    ric-platform/50-RIC-Platform/helm/e2mgr/values.yaml
      39: e2mgr:
      40:   image:
      41:     name: e2mgr
      42:     tag: 1.6.0
    
    RECIPE_EXAMPLE/RIC_PLATFORM_RECIPE_EXAMPLE
      132: e2mgr:
      133:   # Use the following option to override the docker registry value
      134:   # repositoryOverride:
      135:   image:
      136:     name: ric-plt-e2mgr
      137:     tag: 2.0.10
  • On the master branch, 1.6.0 is overridden by 3.0.1:
    ric-platform/50-RIC-Platform/helm/e2mgr/values.yaml
      39: e2mgr:
      40:   image:
      41:     name: e2mgr
      42:     tag: 1.6.0 
    
    RECIPE_EXAMPLE/RIC_PLATFORM_RECIPE_EXAMPLE
      126: e2mgr:
      127:   # Use the following option to override the docker registry value
      128:   # repositoryOverride:
      129:   image:
      130:     name: ric-plt-e2mgr
      131:     tag: 3.0.1
    

     

But in the ric.tar archive:

  ric/ric_image/ric-plt-e2mgr:2.0.4
  ric/ric_image/ric-plt-e2mgr:2.0.7
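If the deploy scripts layer the recipe over the chart's values.yaml (as helm does with multiple -f files, where later files win for scalar keys), that would explain the override. A toy demonstration of the "last file wins" behavior, using hypothetical minimal fragments; the awk one-liner is a stand-in, not how the RIC scripts actually merge values:

```shell
cat > /tmp/values.yaml <<'EOF'
tag: 1.6.0
EOF
cat > /tmp/recipe.yaml <<'EOF'
tag: 2.0.10
EOF

# Keep the last definition seen across the layered files, the way helm
# treats scalar keys in successive -f values files.
awk -F': ' '$1=="tag" {v=$2} END {print v}' /tmp/values.yaml /tmp/recipe.yaml
# prints: 2.0.10
```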
Comment by Alex Antone [ 29/Jan/20 ]

Ported:
  ric-plt-appmgr, 0.1.3 -> 0.3.3, akrainoenea/ric-plt-appmgr:0.3.3, commit
  ric-plt-a1, 2.0.0 -> 2.1.2,  akrainoenea/ric-plt-a1:2.1.2 (no changes required) 

 Unfortunately, these images built from the master branch might be too new; we might actually need to build from the recently pulled Amber (frozen?) release branch.

Comment by Jimmy Lafontaine [ 27/Jan/20 ]

Successfully built ric-plt-lib-rmr, on which some other containers depend. To build and export the artifacts, run:

docker build -t lib-rmr -f ci/Dockerfile .
docker run -v `pwd`/build:/export lib-rmr

I also published the artifacts to http://artifacts.cachengo.com/ric-deps/

Comment by Jimmy Lafontaine [ 27/Jan/20 ]

Successfully built the infra images, which are the base of some of the other images in ric-plt. Building them required some changes: https://github.com/cachengo/oran-ci-management/commit/a976e3939d4c0cf52c4c6c28535190fd81116e55 . Some of these containers also pull a Java app whose AArch64 support I'm not sure about.

Comment by Alex Antone [ 27/Jan/20 ]

Successfully built ric-plt-a1 image on the Ampere POD1 Jumphost:

git clone "https://gerrit.o-ran-sc.org/r/ric-plt/a1" ric-plt-a1 && cd ric-plt-a1
docker build  --network=host -f Dockerfile .
Comment by Alex Antone [ 27/Jan/20 ]

Build arguments for most of the o-ran-sc projects can be inferred from the https://gerrit.o-ran-sc.org/r/admin/repos/ci-management repo jjb files:

jjb/ric-plt-a1/ric-plt-a1.yaml
jjb/ric-plt-appmgr/ric-plt-appmgr.yaml
jjb/ric-plt-asn1/ric-plt-asn1-documents.yaml
jjb/ric-plt/dbaas/hiredis-vip/info-ric-plt-dbaas-hiredis-vip.yaml
jjb/ric-plt-dbaas/ric-plt-dbaas.yaml
jjb/ric-plt/demo1/info-ric-plt-demo1.yaml
jjb/ric-plt-e2mgr/ric-plt-e2mgr.yaml
jjb/ric-plt-e2/ric-plt-e2.yaml
jjb/ric-plt/jaegeradapter/info-ric-plt-jaegeradapter.yaml
jjb/ric-plt-lib-rmr/ric-plt-lib-rmr.yaml
jjb/ric-plt/nodeb-rnib/info-ric-plt-nodeb-rnib.yaml
jjb/ric-plt-resource-status-manager/ric-plt-resource-status-manager.yaml
jjb/ric-plt/resource-status-processor/info-ric-plt-resource-status-processor.yaml
jjb/ric-plt/ric-dep/info-ric-plt-ric-dep.yaml
jjb/ric-plt/ric-test/info-ric-plt-ric-test.yaml
jjb/ric-plt-rtmgr/ric-plt-rtmgr.yaml
jjb/ric-plt-sdlgo/ric-plt-sdlgo.yaml
jjb/ric-plt-sdlpy/ric-plt-sdlpy.yaml
jjb/ric-plt-sdl/ric-plt-sdl.yaml
jjb/ric-plt/streaming-protobufs/info-ric-plt-streaming-protobufs.yaml
jjb/ric-plt-submgr/ric-plt-submgr.yaml
jjb/ric-plt-tracelibcpp/ric-plt-tracelibcpp.yaml
jjb/ric-plt-tracelibgo/ric-plt-tracelibgo.yaml
jjb/ric-plt/utils/info-ric-plt-utils.yaml
jjb/ric-plt-vespamgr/ric-plt-vespamgr.yaml
jjb/ric-plt-xapp-frame/ric-plt-xapp-frame.yaml

For example in jjb/ric-plt-a1/ric-plt-a1.yaml:

- a1_common: &a1_common
    # values apply to all A1 projects
    name: a1-common
    # git repo
    project: ric-plt/a1
    # jenkins job name prefix
    project-name: ric-plt-a1
    # maven settings file has docker credentials
    mvn-settings: ric-plt-a1-settings
- project:
    <<: *a1_common
    name: ric-plt-a1
    # image name
    docker-name: 'o-ran-sc/{name}'
    # source of docker tag
    container-tag-method: yaml-file
    # use host network
    docker-build-args: '--network=host'
    build-node: ubuntu1804-docker-4c-4g
    stream:
      - master:
          branch: master
    jobs:
      - '{project-name}-gerrit-docker-jobs'
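Read together, those fields imply a build invocation roughly like the following (a sketch; the assumed field-to-flag mapping is mine, and the tag placeholder stands for whatever the repo's container-tag yaml file specifies, which was not inspected). The echo keeps this a dry run:

```shell
# Assumed mapping from jjb fields to a docker build command:
#   docker-name          -> image name ('o-ran-sc/{name}' with name=ric-plt-a1)
#   docker-build-args    -> extra flags (--network=host)
#   container-tag-method -> tag source (a yaml file in the repo; placeholder here)
TAG="<tag-from-yaml-file>"
echo docker build --network=host -t "o-ran-sc/ric-plt-a1:$TAG" .
```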

Comment by Alex Antone [ 23/Jan/20 ]

Compiled docs from the it/dep repo can be found here:
 https://docs.o-ran-sc.org/projects/o-ran-sc-it-dep/en/latest/index.html

Installation procedure:
https://docs.o-ran-sc.org/projects/o-ran-sc-it-dep/en/latest/installation-guides.html

Comment by Alex Antone [ 23/Jan/20 ]

As mentioned in https://wiki.akraino.org/display/AK/REC+Project+Minutes+2020.01.23 , there will be a new RIC install procedure, as ORAN-SC's RIC has a new release (Bronze).

The installation scripts used by the AT&T CD are in the process of being updated. The current scripts should probably not be used, because the RIC installation process changed and the scripts are for the "old" way. Instead, either use the documentation from https://gerrit.o-ran-sc.org/r/gitweb?p=it/dep.git;a=tree;f=docs;h=c9c96f7b3ee8b9ec77d5527990048166e636f62c;hb=HEAD or wait until we finish revising the current REC deployment automation scripts to use the new RIC install process.

 

Comment by Alex Antone [ 22/Jan/20 ]

The REC blueprint & workflow repo https://gerrit.akraino.org/r/admin/repos/rec also has some RIC references and some steps similar to those in REC_integration/INSTALL_RIC.sh from https://gerrit.mtlab.att-akraino.org/r/rec_integration :

# Docker images referenced
xapp-manager:latest
e2mgr:1.0.0
e2:1.0.0
rtmgr:0.0.2
redis-standalone:latest

# Namespace creation
kubectl create namespace ricplatform

# Run install script from it/dep/generated/ricplt/
#  (source repo: https://gerrit.o-ran-sc.org/r/it/dep )
bash -x ./ric_install.sh
 11: kubectl create namespace rictest
 12: kubectl create namespace ricxa
Comment by Alex Antone [ 21/Jan/20 ]

Contents of final ric.tar image broken into sections:

ric/

Comment by Alex Antone [ 21/Jan/20 ]

Looking at logs from https://nexus.akraino.org/content/sites/logs/att/job/Install_RIC_on_OpenEdge1/203/ the following steps are performed:

1. Updating and Repackaging of the ric.tar file

* # Cloning repository ssh://attjenkins@gerrit.mtlab.att-akraino.org:29418/rec_integration -> REC_integration
  > git init /home/jenkins/workspace/Install_RIC_on_OpenEdge1/REC_integration
  > git fetch --tags --progress -- ssh://attjenkins@gerrit.mtlab.att-akraino.org:29418/rec_integration

* # Cloning repository https://gerrit.o-ran-sc.org/r/it/dep -> dep
   > git init /home/jenkins/workspace/Install_RIC_on_OpenEdge1/dep
   > git fetch --tags --progress -- https://gerrit.o-ran-sc.org/r/it/dep

* # Fetch ric.tar from http://www.mtlab.att-akraino.org/RIC/latest and rebuild tarfile
  + curl --output ./ric.tar http://www.mtlab.att-akraino.org/RIC/latest
  + tar xf ric.tar
  + rm -fr ric/dep ric/REC_integration ric.tar
  + rm -fr dep/.git REC_integration/.git
  + mv dep REC_integration ric
  + tar cf jenkins-Install_RIC_on_OpenEdge1-203/ric.tar ric
  + rm -fr ric
  # Push tarfile to 172.28.16.201

2. Install RIC

* # Executing /tmp/INSTALL_RIC.sh /tmp/ric.tar Middletown_OE1
  + cd /opt
  + sudo tar xfv /tmp/ric.tar

Extracted files: RIC_tarfile_contents.txt

* + sudo chown -R cloudadmin:cloudadmin ric
  + rm -f /tmp/ric.tar

* # Preload images
  + cd /opt/ric
  + bash ./dep/bin/preload-images -p /opt/ric/ric_image -d registry.kube-system.svc.rec.io:5555/ric

* # Deploying DANM Networks
  + cd ./REC_integration/danm/Middletown_OE1
  + ls danmnet40-e2adapter.yaml danmnet40-portal.yaml danmnet40-ricplatform.yaml | xargs -n 1 kubectl apply -f

* # Creating RIC K8S Namespaces
  + cd /opt/ric
  + echo ricinfra ricplt ricaux ricxapp | xargs -n 1 kubectl create ns
  + kubectl apply -f /opt/ric/REC_integration/ricplt-role.yaml
  + kubectl apply -f /opt/ric/REC_integration/rbac.yaml
  + kubectl apply -f /opt/ric/REC_integration/rbac2.yaml

* # Deploying RIC Infrastructure
  + bash -e ./dep/bin/deploy-ric-infra /opt/ric/REC_integration/recipe/Middletown_OE1/infra.yaml
  Deploying RIC infra components [kong]
  Deploying RIC infra components [credential]
  Deploying RIC infra components [xapp-tiller]

* # Deploying RIC Platform
  + bash -e ./dep/bin/deploy-ric-platform /opt/ric/REC_integration/recipe/Middletown_OE1/platform.yaml
  Deploying RIC infra components [appmgr rtmgr dbaas1 e2mgr e2term a1mediator submgr vespamgr rsm jaegeradapter]
  Deploying RIC infra components [extsvcplt]

* # Finished installation
  + exit 0
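The preload step above is opaque in the log; the following is a rough dry-run sketch of what a script like ./dep/bin/preload-images plausibly does (the real script was not inspected; the loop, file layout, and flags are assumptions): load each image archive, retag it for the cluster-local registry, and push it.

```shell
preload_images() {
    # dry-run sketch: print the docker commands instead of executing them
    dir=$1; registry=$2
    for img in "$dir"/*; do
        [ -e "$img" ] || continue
        name=$(basename "$img")              # e.g. ric-plt-e2mgr:2.0.7
        echo docker load -i "$img"
        echo docker tag "$name" "$registry/$name"
        echo docker push "$registry/$name"
    done
}

preload_images /opt/ric/ric_image registry.kube-system.svc.rec.io:5555/ric
```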

 

Generated at Sat Feb 10 06:04:56 UTC 2024 using Jira 9.4.5#940005-sha1:e3094934eac4fd8653cf39da58f39364fb9cc7c1.