[RC-2] RC port to AArch64 Investigation Created: 16/Jan/20  Updated: 13/May/20

Status: In Progress
Project: Regional Controller
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Task Priority: Medium
Reporter: Alex Antone Assignee: Alex Antone
Resolution: Unresolved Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Epic Link: RC-AARCH64

 Description   

Regional controller repositories:

Repository Name | Repository Description | Codebase | Component Type | Name
regional_controller | Scripts to set up the Regional Controller | bash | shell scripts | start_akraino_portal.sh
regional_controller/api-server | The Version 2 Regional Controller | java | component included in portal container |
postgres_db_schema | The Postgres DB for the Portal and workflow engine | bash, sql | docker | akraino-postgres
camunda_workflow | Akraino workflow engine code | java | docker | akraino-workflow
portal | Akraino Portal | | docker | akraino-portal
portal_user_interface | The GUI portion of the Akraino Portal | java webapp | docker | akraino-portal
portal_user_interface | The GUI portion of the Akraino Portal | java webapp | war file | portal_user_interface.war
portal_user_interface/portal-onapsdk | | java webapp | docker |
test_automation | Test automation scripts for Akraino (tempest) | yaml, templates | archive | test_automation.tgz
yaml_builds | Scripts and YAML files/templates used to boot an Akraino cluster | bash, yaml, jinja2 | archive | yaml_builds.tgz
addon-onap | Scripts to add ONAP on top of a running Akraino cluster | bash, python, yaml, json | archive | addon-onap.tgz
sample_vnf | A sample VNF (Virtual Network Function) for Akraino | bash, yaml | archive | sample_vnf.tgz
airshipinabottle_deploy | | bash | archive | airshipinabottle_deploy.tgz
redfish | Akraino support for Redfish imaging/booting of nodes | bash, json | archive | redfish.tgz

External components:

Repository Name | Repository Description | Codebase | Component Type | Name
openmicroscopy/apacheds:latest | ApacheDS LDAP server | | docker | akraino-ldap

The Regional Controller is based on a Debian-based distro.

Some Dockerfile base image tags:

Dockerfile | Base image
api-server/docker/airflow/Dockerfile | puckel/docker-airflow:1.10.2
api-server/docker/api/Dockerfile | tomcat:8.5.50
portal_user_interface/Dockerfile | tomcat:8.5.37
camunda_workflow/Dockerfile | java:8
postgres_db_schema/Dockerfile | postgres:9.6.9
addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/portal/docker/init/mariadb-client/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/portal/docker/init/ubuntu/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/dcae/pgaas/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/dcae/cdap/docker/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/dcae/cdap/docker/fs/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/portal/docker/init/mariadb-client/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/portal/docker/init/ubuntu/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/dcae/pgaas/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/dcae/cdap/docker/Dockerfile | ubuntu:16.04
addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/dcae/cdap/docker/fs/Dockerfile | ubuntu:16.04
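For the AArch64 port, each of these base tags needs an arm64 variant on the registry. A quick way to check is to look for an arm64 platform entry in the image's manifest list; a minimal sketch (assumes a Docker CLI with the `docker manifest` command enabled; the helper name is illustrative):

```shell
#!/bin/sh
# Sketch: succeed if a "docker manifest inspect" JSON document on stdin
# advertises an arm64 image among its platform entries.
has_arm64() {
    grep -q '"architecture": *"arm64"'
}

# Typical use, e.g. for one of the base tags above:
#   docker manifest inspect tomcat:8.5.50 | has_arm64 && echo "arm64 available"
```

Tags that fail this check (or have no manifest list at all) would need an arm64 rebuild or a replacement base image.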

Other base image tags:

File | Base image
api-server/scripts/start_arc.sh | DB_IMAGE = mariadb:10.4
api-server/scripts/start_arc.sh | LDAP_IMAGE = osixia/openldap:1.2.4
api-server/scripts/start_arc.sh | nginx:1.14.2
api-server/scripts/start_arc.sh | redis:3.2.7
api-server/scripts/start_arc.sh | postgres:9.6
regional_controller/start_akraino_portal.sh | LD_IMAGE = openmicroscopy/apacheds *
portal-onapsdk/ONAP-SDK-APP/docker-scripts/arcportal/deploy.sh | akraino/portal-onapsdk:arcportal-latest
portal-onapsdk/ONAP-SDK-APP/docker-scripts/mariadb/deploy.sh | akraino/portal-onapsdk:mariadb-latest
portal-onapsdk/ONAP-SDK-APP/docker-scripts/mariadb/deploy_with_existing_persistent_storage.sh | akraino/portal-onapsdk:mariadb-latest
sample_vnf/ats-demo/run_openstack_cli.sh | docker.io/openstackhelm/heat:newton
test_automation/openstack_tempest/run_openstack_cli.sh | docker.io/openstackhelm/heat:newton
airshipinabottle_deploy/version.properties | docker.io/openstackhelm/heat:ocata
yaml_builds/tools/run_openstack_cli.sh | docker.io/openstackhelm/heat:ocata
yaml_builds/tools/pegleg.sh | quay.io/airshipit/pegleg:09d85465827f1468d3469e5bbcf6b48f25338e7c
redfish/install_server_os.sh | httpd:alpine
redfish/install_server_os.sh | networkboot/dhcpd
  • regional_controller seems to be obsolete; the current one is the V2 in the regional_controller/api-server git repo.

Hardcoded arch amd64:

 addon-onap/src/onap_vm_scripts/oom_no_proxy/cloudify-onap/plugins/onap-installation-plugin/k8s_installer/common/constants.py
    16:HELM_URL = 'https://kubernetes-helm.storage.googleapis.com/helm-canary-linux-amd64.tar.gz'
 addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/dcae/cdap/docker/00-provisioning.sh
    240:wget -qO - http://repository.cask.co/ubuntu/precise/amd64/cdap/3.5/pubkey.gpg | apt-key add -
 addon-onap/src/onap_vm_scripts/oom_no_proxy/kubernetes/aai/templates/data-router-deployment.yaml
    35:          value: usr/lib/jvm/java-8-openjdk-amd64

 addon-onap/src/onap_vm_scripts/oom_proxy/cloudify-onap/plugins/onap-installation-plugin/k8s_installer/common/constants.py
    16:HELM_URL = 'https://kubernetes-helm.storage.googleapis.com/helm-canary-linux-amd64.tar.gz'
 addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/dcae/cdap/docker/00-provisioning.sh
    240:wget -qO - http://repository.cask.co/ubuntu/precise/amd64/cdap/3.5/pubkey.gpg | apt-key add -
 addon-onap/src/onap_vm_scripts/oom_proxy/kubernetes/aai/templates/data-router-deployment.yaml
    35:          value: usr/lib/jvm/java-8-openjdk-amd64

 addon-onap/src/regional_controller_scripts/INSTALL.sh
    122:    curl -o $BINDIR/ubuntu-16.04-server-cloudimg-amd64-disk1.img https://cloud-images.ubuntu.com/releases/16.04/release/ubuntu-16.04-server-cloudimg-amd64-disk1.img
    123:    openstack image create --file $BINDIR/ubuntu-16.04-server-cloudimg-amd64-disk1.img --disk-format qcow2 --container-format bare --public ubuntu16_04
    ...
    128:    curl -o $BINDIR/ubuntu-14.04-server-cloudimg-amd64-disk1.img https://cloud-images.ubuntu.com/releases/14.04/release/ubuntu-14.04-server-cloudimg-amd64-disk1.img
    129:    openstack image create --file $BINDIR/ubuntu-14.04-server-cloudimg-amd64-disk1.img --disk-format qcow2 --container-format bare --public ubuntu14_04

 yaml_builds/tools/j2/serverrc.j2
    40:# valid options are script-hwe-16.04.6-amd64.ipxe or script-16.04.6-amd64.ipxe
    41:SRV_BLD_SCRIPT=script-hwe-16.04.6-amd64.ipxe

 sample_vnf/ats-demo/run_ats-demo.sh
    38:UBUNTU_URL=${UBUNTU_URL:-https://cloud-images.ubuntu.com/releases/16.04/release/ubuntu-16.04-server-cloudimg-amd64-disk1.img}
 
 api-server/docker/airflow/Dockerfile
    27: add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/$(. /etc/os-release; echo "$ID") $(lsb_release -cs) stable" && \
 api-server/docker/api/airflow.cfg
    562:git_sync_container_repository = gcr.io/google-containers/git-sync-amd64

 redfish/buildrc
    39:export UBUNTU_URL=${UBUNTU_URL:-http://releases.ubuntu.com/16.04/ubuntu-16.04.6-server-amd64.iso}
 redfish/dhcpd.conf.template
    41:#        filename "http://192.168.2.5:8090/script-hwe-16.04.4-amd64.ipxe";
    50:#        filename "http://192.168.2.5:8090/script-hwe-16.04.4-amd64.ipxe";
 redfish/serverrc.template
    30:# valid options are hwe-16.04.6-amd64 or 16.04.6-amd64
    31:SRV_BLD_SCRIPT=hwe-16.04.6-amd64
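Most of these occurrences could be parameterized by detecting the architecture at run time instead of hardcoding amd64; a minimal sketch (function names and the mapping table are illustrative, to be extended per repo):

```shell
#!/bin/sh
# Sketch: map a machine name (as printed by "uname -m") to the
# Debian/Docker architecture token used in the URLs above.
map_arch() {
    case "$1" in
        x86_64)        echo amd64 ;;
        aarch64|arm64) echo arm64 ;;
        *)             echo "unsupported arch: $1" >&2; return 1 ;;
    esac
}

# Detect the host architecture.
detect_arch() { map_arch "$(uname -m)"; }

# Example rewrite of the hardcoded Helm URL in constants.py:
#   ARCH=$(detect_arch)
#   HELM_URL="https://kubernetes-helm.storage.googleapis.com/helm-canary-linux-${ARCH}.tar.gz"
```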

Hardcoded arch x86:

 addon-onap/src/onap_vm_scripts/oom_no_proxy/cloudify-onap/blueprint.yaml
    11:  # Plugin required: https://github.com/cloudify-incubator/cloudify-kubernetes-plugin/releases/download/1.2.1rc1/cloudify_kubernetes_plugin-1.2.1rc1-py27-none-linux_x86_64-centos-Core.wgn
 
 addon-onap/src/onap_vm_scripts/oom_no_proxy/message-router-blueprint.yaml
    29:  # Plugin required: https://github.com/cloudify-cosmo/cloudify-openstack-plugin/releases/download/2.2.0/cloudify_openstack_plugin-2.2.0-py27-none-linux_x86_64-centos-Core.wgn
    31:  # Plugin required: https://github.com/cloudify-incubator/cloudify-utilities-plugin/releases/download/1.2.5/cloudify_utilities_plugin-1.2.5-py27-none-linux_x86_64-centos-Core.wgn
    33:  # Plugin required: https://github.com/cloudify-incubator/cloudify-kubernetes-plugin/releases/download/1.2.0/cloudify_kubernetes_plugin-1.2.0-py27-none-linux_x86_64-centos-Core.wgn
    35:  # Plugin required: http://repository.cloudifysource.org/cloudify/wagons/cloudify-diamond-plugin/1.3.5/cloudify_diamond_plugin-1.3.5-py27-none-linux_x86_64-centos-Core.wgn
    37:  # Plugin required: http://repository.cloudifysource.org/cloudify/wagons/cloudify-fabric-plugin/1.5/cloudify_fabric_plugin-1.5-py27-none-linux_x86_64-centos-Core.wgn
    483: baseurl=https://packages.cloud.google.com/yum/repos/kubernetes-el7-x86_64
 
 addon-onap/src/onap_vm_scripts/oom_proxy/cloudify-onap/blueprint.yaml
    11:  # Plugin required: https://github.com/cloudify-incubator/cloudify-kubernetes-plugin/releases/download/1.2.1rc1/cloudify_kubernetes_plugin-1.2.1rc1-py27-none-linux_x86_64-centos-Core.wgn
 addon-onap/src/onap_vm_scripts/oom_proxy/message-router-blueprint.yaml
    29:  # Plugin required: https://github.com/cloudify-cosmo/cloudify-openstack-plugin/releases/download/2.2.0/cloudify_openstack_plugin-2.2.0-py27-none-linux_x86_64-centos-Core.wgn
    31:  # Plugin required: https://github.com/cloudify-incubator/cloudify-utilities-plugin/releases/download/1.2.5/cloudify_utilities_plugin-1.2.5-py27-none-linux_x86_64-centos-Core.wgn
    33:  # Plugin required: https://github.com/cloudify-incubator/cloudify-kubernetes-plugin/releases/download/1.2.0/cloudify_kubernetes_plugin-1.2.0-py27-none-linux_x86_64-centos-Core.wgn
    35:  # Plugin required: http://repository.cloudifysource.org/cloudify/wagons/cloudify-diamond-plugin/1.3.5/cloudify_diamond_plugin-1.3.5-py27-none-linux_x86_64-centos-Core.wgn
    37:  # Plugin required: http://repository.cloudifysource.org/cloudify/wagons/cloudify-fabric-plugin/1.5/cloudify_fabric_plugin-1.5-py27-none-linux_x86_64-centos-Core.wgn
    483: baseurl=https://packages.cloud.google.com/yum/repos/kubernetes-el7-x86_64

 portal-onapsdk/ONAP-SDK-APP/db-scripts/EcompSdkDDLMySql_2_4_Common.sql
    16:-- mysql  Ver 15.1 Distrib 10.1.17-MariaDB, for Linux (x86_64) using readline 5.1

 test_automation/openstack_tempest/tempest/values.yaml
    144:      http_image: "http://download.cirros-cloud.net/0.3.5/cirros-0.3.5-x86_64-disk.img"

 redfish/create_ipxe.sh
    125: rm -f $IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi
    127: make -C $IPXE_ROOT/src bin-x86_64-efi/ipxe.efi EMBED=$IPXE_ROOT/boot.ipxe 2>&1 | grep -v "[DEPS]"| sed -e "s/^/    /g"
    128: if [ ! -f "$IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi" ]; then
    129:     echo "ERROR:  failed creating ipxe.efi [$IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi]"
    134: cp -f $IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi $WEB_ROOT/ipxe.efi
    147: cp -f $IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi $IPXE_IMG_MNT/EFI/BOOT/BOOTX64.EFI
    159: cp -f $IPXE_ROOT/src/bin-x86_64-efi/ipxe.efi $IPXE_ISO_DIR/EFI/BOOT/BOOTX64.EFI
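The listings above can be regenerated (and re-checked as the port progresses) with a recursive grep over the checkouts; a sketch, with a pattern list that is a starting point rather than exhaustive:

```shell
#!/bin/sh
# Sketch: sweep a source tree for hardcoded architecture strings,
# printing file, line number, and matching line.
# Usage: arch_sweep <dir>
arch_sweep() {
    grep -rnE 'amd64|x86_64' "$1" 2>/dev/null
}

# Example, run from a directory containing the repo clones:
#   for repo in addon-onap yaml_builds redfish; do arch_sweep "$repo"; done
```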




 Comments   
Comment by Alex Antone [ 22/Jan/20 ]

Based on information from https://wiki.akraino.org/display/AK/Starting+the+Regional+Controller the api-server is the current Regional Controller repo:

There is one shell script, start_arc.sh, that may be downloaded and run to start the RC. This script takes no command line parameters; some internal values may be changed by setting and exporting certain environment variables.
To start the Regional Controller, clone the scripts from Gerrit, and then run this script as follows:

$ git clone https://gerrit.akraino.org/r/regional_controller/api-server.git
$ cd api-server/scripts
$ ./start_arc.sh
Comment by Alex Antone [ 20/Jan/20 ]

OBSOLETE:

    regional_controller/start_akraino_portal.sh:
    ----------------------------------------------
        DOCKER_REPO=nexus3.akraino.org:10003/

        # Database container
        #   DB_IMAGE=akraino_schema_db:$DATABASE_VERSION
        docker run ... --name akraino-postgres $DB_IMAGE

        # LDAP container updated with files from portal.war
        #   $LD_IMAGE=openmicroscopy/apacheds
        wget -q "$PORTAL_URL" -O /tmp/portal_user_interface.war
        unzip -oj /tmp/portal_user_interface.war WEB-INF/classes/*.ldif -d $LDAP_FILE_HOME
        rm -f /tmp/portal_user_interface.war
        docker run ... --name akraino-ldap $LD_IMAGE

        # Akraino portal container
        #   $PT_IMAGE=akraino-portal:$PORTAL_VERSION
        wget -q "$YAML_BUILDS_URL" -O - | tar -xoz -C $YAML_BUILDS_HOME
        docker run ... --name akraino-portal $PT_IMAGE

        # Workflow started as a systemd-hosted service (extracted from the container ?!)
        #   WF_IMAGE=akraino-camunda-workflow-engine:$WORKFLOW_VERSION
        docker run ... --name akraino-workflow $WF_IMAGE
        docker cp  akraino-workflow:/config $CAMUNDA_HOME
        docker cp  akraino-workflow:/$jar_name $CAMUNDA_HOME/$jar_name
        docker stop akraino-workflow &> /dev/null
        docker rm akraino-workflow &> /dev/null

        cp -f /opt/akraino/region/akraino-workflow.service /etc/systemd/system/
        systemctl start akraino-workflow

        # Setting up content/repositories for:
        #   tempest 
        #   ONAP
        #   sample vnf
        #   airshipinabottle
        #   redfish
        wget -q "$TEMPEST_URL"    -O - | tar -xoz -C $TEMPEST_HOME
        wget -q "$ONAP_URL"       -O - | tar -xoz -C $ONAP_HOME
        wget -q "$SAMPLE_VNF_URL" -O - | tar -xoz -C $SAMPLE_VNF_HOME
        wget -q "$AIRSHIPINABOTTLE_URL" -O - | tar -xoz -C $AIRSHIPINABOTTLE_HOME
        wget -q "$REDFISH_URL" -O - | tar -xoz -C $REDFISH_HOME



    [regional_controller/]api-server/scripts/start_arc.sh
    -----------------------------------------------------
        DOCKER_REPO=nexus3.akraino.org:10003/

        # Just copy stuff from container then remove it
        #   API_IMAGE=akraino/arc_api:0.0.2-SNAPSHOT
        docker run --rm ... --name "arc-init" $API_IMAGE /bin/cp -R init/ /init/..

        # Start LDAP. LDAP_IMAGE=osixia/openldap:1.2.4
        docker run ... --name "arc-ldap" $LDAP_IMAGE

        # Start DB. DB_IMAGE=mariadb:10.4
        docker run ... --name "arc-db" $DB_IMAGE

        # Start API server. API_IMAGE=akraino/arc_api:0.0.2-SNAPSHOT
        docker run ... --name "arc-api" $API_IMAGE

        # Start NGiNX
        docker run ... --name "arc-nginx" nginx:1.14.2

        # Start the Airflow containers
        #    AF_IMAGE=akraino/airflow:0.0.1-SNAPSHOT
        docker run ... --name "arc-airflow-redis"     redis:3.2.7
        docker run ... --name "arc-airflow-postgres"  postgres:9.6
        docker run ... --name "arc-airflow-webserver" $AF_IMAGE webserver
        docker run ... --name "arc-airflow-flower"    $AF_IMAGE flower
        docker run ... --name "arc-airflow-scheduler" $AF_IMAGE scheduler
        docker run ... --name "arc-airflow-worker"    $AF_IMAGE worker

Comment by Alex Antone [ 20/Jan/20 ]

To me it seems there is some overlap/similarity between the following 2 scripts in their respective repositories:

    - regional_controller -> start_akraino_portal.sh:
      Database container (postgres based), LDAP container (apacheds), Portal container, Workflow container -> systemd service (camunda), tempest, ONAP, sample vnf, airshipinabottle, redfish (obsolete)
   
    -  regional_controller/api-server -> scripts/start_arc.sh:
       Database container (mariadb:10.4), LDAP container (osixia/openldap:1.2.4), API server container, Airflow containers (redis, postgres, airflow components), nginx.

To investigate:
    1. What is the relationship between the 2 projects: regional_controller and regional_controller/api-server ?
    2. Does api-server supersede regional_controller (looking at commit history api-server is more recent)?
    3. Where do portal_user_interface and portal_user_interface/portal-onapsdk fit in with respect to the Regional Controller (hosted on the RC as containers?) and with respect to each other?
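For question 2, the commit-history comparison can be made concrete by comparing last-commit timestamps of the two clones; a sketch (paths are illustrative and assume local clones exist):

```shell
#!/bin/sh
# Sketch: which of two repos was committed to more recently?
latest_commit_date() {
    git -C "$1" log -1 --format=%cI    # ISO-8601 date of last commit
}

# newer_of prints the later of two ISO-8601 timestamps
# (ISO-8601 with a common timezone suffix sorts lexicographically).
newer_of() {
    printf '%s\n%s\n' "$1" "$2" | sort | tail -n 1
}

# Example:
#   newer_of "$(latest_commit_date regional_controller)" \
#            "$(latest_commit_date api-server)"
```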

Generated at Sat Feb 10 06:05:03 UTC 2024 using Jira 9.4.5#940005-sha1:e3094934eac4fd8653cf39da58f39364fb9cc7c1.