SystemAdmin feature scenarios addition

itsmekumari authored and Vipinofficial11 committed Jan 5, 2024
1 parent 6f245ef commit dbbae99

Showing 13 changed files with 633 additions and 2 deletions.
102 changes: 102 additions & 0 deletions .github/workflows/e2e.yml
@@ -0,0 +1,102 @@
# Copyright © 2023 Cask Data, Inc.
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
# Note: Any changes to this workflow will take effect only after they are merged into develop
name: Build e2e tests

on:
  push:
    branches: [ develop, release/** ]
  pull_request:
    branches: [ develop, release/** ]
    types: [opened, synchronize, reopened, labeled]
  workflow_dispatch:

jobs:
  build:
    runs-on: k8s-runner-e2e
    # We allow builds:
    # 1) When triggered manually
    # 2) When it's a merge into a branch
    # 3) For PRs that are labeled as build and
    #  - It's a code change
    #  - A build label was just added
    # A bit complex, but prevents builds when other labels are manipulated
    if: >
      github.event_name == 'workflow_dispatch'
      || github.event_name == 'push'
      || (contains(github.event.pull_request.labels.*.name, 'build')
      && (github.event.action != 'labeled' || github.event.label.name == 'build')
      )
    strategy:
      matrix:
        tests: [ cdap-e2e-tests ]
      fail-fast: false
    steps:
      # Pinned 1.0.0 version
      - uses: actions/checkout@v3
        with:
          path: plugin

      - uses: dorny/paths-filter@4512585405083f25c027a35db413c2b3b9006d50
        # Pinned version 2.11.1
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push'
        id: filter
        with:
          working-directory: plugin
          filters: |
            e2e-test:
              - '**/e2e-test/**'
      - name: Checkout e2e test repo
        uses: actions/checkout@v3
        with:
          repository: cdapio/cdap-e2e-tests
          path: e2e

      - name: Cache
        uses: actions/cache@v3
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-${{ github.workflow }}
      - name: Run required e2e tests
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push' && steps.filter.outputs.e2e-test == 'false'
        run: python3 e2e/src/main/scripts/run_cdap_e2e_test.py --testRunner TestRunnerRequired.java --cdapBranch "${{ github.event.pull_request.base.ref }}"

      - name: Run all e2e tests
        if: github.event_name == 'workflow_dispatch' || github.event_name == 'push' || steps.filter.outputs.e2e-test == 'true'
        run: python3 e2e/src/main/scripts/run_cdap_e2e_test.py --testRunner TestRunner.java --cdapBranch "${{ github.event.pull_request.base.ref }}"

      - name: Upload report
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Cucumber report - ${{ matrix.tests }}
          path: ./plugin/${{ matrix.tests }}/target/cucumber-reports

      - name: Upload debug files
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Debug files - ${{ matrix.tests }}
          path: ./**/target/e2e-debug

      - name: Upload files to GCS
        uses: google-github-actions/upload-cloud-storage@v0
        if: always()
        with:
          path: ./plugin/cdap-e2e-tests/target/cucumber-reports
          destination: e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}
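
The --testRunner argument above points at one of the JUnit/Cucumber runner classes that ship with the plugin's e2e tests (they are among the 13 changed files but are not shown in this excerpt). As an illustrative sketch only, a tag-filtered runner for the @SysAdminRequired scenarios could look like the following; the package name, feature path, glue package, and report location are assumptions, not the commit's actual values.

package io.cdap.plugin.sysadmin.runners;  // hypothetical package

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
 * Illustrative runner: executes only scenarios tagged @SysAdminRequired, which is
 * what the "Run required e2e tests" step selects via --testRunner TestRunnerRequired.java.
 */
@RunWith(Cucumber.class)
@CucumberOptions(
    features = {"src/e2e-test/features"},
    glue = {"io.cdap.plugin.sysadmin.stepsdesign"},
    tags = "@SysAdminRequired",
    monochrome = true,
    plugin = {"pretty", "html:target/cucumber-reports/sysadmin-required.html"}
)
public class TestRunnerRequired {
}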
@@ -0,0 +1,64 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@Sysadmin
Feature: Sysadmin - Validate system admin page design time scenarios

Background:
Given Open Datafusion Project to configure pipeline
When Open "System Admin" menu
Then Click on the Configuration link on the System admin page

@Sysadmin @SysAdminRequired
Scenario:Validate user is able to create new system preferences and able to delete the added system preferences successfully
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Set system preferences with key: "keyValue" and value: "systemPreferences1"
Then Click on the Save & Close preferences button
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Delete the preferences
Then Click on the Save & Close preferences button
Then Verify the system admin page is navigated successfully

Scenario:Validate user is able to add multiple system preferences inside system admin successfully
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Set system preferences with key: "keyValue" and value: "systemPreferences2"
Then Click on the Save & Close preferences button
Then Click on edit system preferences
Then Delete the preferences
Then Delete the preferences
Then Click on the Save & Close preferences button
Then Verify the system admin page is navigated successfully

Scenario:Validate user is able to successfully reload system artifacts using the Reload button
Then Click on Reload System Artifacts from the System admin page
Then Click on Reload button on popup to reload the System Artifacts successfully
Then Verify the system admin page is navigated successfully

Scenario:Validate user is able to open compute profile page and create a compute profile for selected provisioner
Then Click on the Compute Profile from the System admin page
Then Click on create compute profile button
Then Select a provisioner: "remoteHadoopProvisioner" for the compute profile
Then Verify the Create a Profile page is loaded for selected provisioner
Then Enter input plugin property: "profileLabel" with value: "validProfile"
Then Enter textarea plugin property: "profileDescription" with value: "validDescription"
Then Enter input plugin property: "host" with value: "testHost"
Then Enter input plugin property: "user" with value: "testUser"
Then Enter textarea plugin property: "sshKey" with value: "testSSHKey"
Then Click on: "Create" button in the properties
Then Verify the created compute profile: "validProfile" is displayed in system compute profile list
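
Each of the steps above is backed by a Cucumber step definition in the plugin's step-design classes, which are part of this commit but not shown here. A minimal sketch of what one such binding could look like, assuming plain Selenium and a placeholder locator (the real class names, driver wiring, and data-cy attributes will differ):

package io.cdap.plugin.sysadmin.stepsdesign;  // hypothetical package

import io.cucumber.java.en.Then;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

import java.time.Duration;

/**
 * Sketch of a step definition backing "Then Click on edit system preferences".
 * The locator and driver wiring are placeholders for the framework's real helpers.
 */
public class SysAdminSteps {

  private final WebDriver driver;

  public SysAdminSteps(WebDriver driver) {
    // In practice the driver would come from the e2e framework's shared Selenium session.
    this.driver = driver;
  }

  @Then("Click on edit system preferences")
  public void clickOnEditSystemPreferences() {
    WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(30));
    By editButton = By.cssSelector("[data-cy='edit-system-prefs-btn']");  // placeholder locator
    wait.until(ExpectedConditions.elementToBeClickable(editButton)).click();
  }
}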
@@ -0,0 +1,48 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@Sysadmin
Feature: Sysadmin - Validate system admin page design time validation scenarios

Background:
Given Open Datafusion Project to configure pipeline
When Open "System Admin" menu
Then Click on the Configuration link on the System admin page

@SysAdminRequired
Scenario:Validate user is able to reset the system preferences added inside system admin successfully
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Set system preferences with key: "keyValue" and value: "systemPreferences1"
Then Reset the preferences
Then Verify the reset is successful for added preferences

Scenario:To verify the validation error message with invalid profile name
Then Click on the Compute Profile from the System admin page
Then Click on create compute profile button
Then Select a provisioner: "existingDataProc" for the compute profile
Then Enter input plugin property: "profileLabel" with value: "invalidProfile"
Then Enter textarea plugin property: "profileDescription" with value: "validDescription"
Then Enter input plugin property: "clusterName" with value: "validClusterName"
Then Click on: "Create" button in the properties
Then Verify that the compute profile is displaying an error message: "errorInvalidProfileName" on the footer

Scenario:To verify the validation error message with invalid namespace name
Then Click on Create New Namespace button
Then Enter the New Namespace Name with value: "invalidNamespaceName"
Then Enter the Namespace Description with value: "validNamespaceDescription"
Then Click on: "Finish" button in the properties
Then Verify the failed error message: "errorInvalidNamespace" displayed on dialog box
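
The quoted arguments in these scenarios (for example "invalidNamespaceName", "validClusterName", "errorInvalidNamespace") are placeholder keys rather than literal UI values; CDAP e2e suites typically resolve them against a test-data properties file bundled with the tests. A minimal lookup sketch, assuming a hypothetical pluginParameters.properties resource on the test classpath:

package io.cdap.plugin.sysadmin.stepsdesign;  // hypothetical package

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

/**
 * Sketch of resolving placeholder keys used in the feature files into concrete
 * test data. The resource name and keys are illustrative, not the commit's actual files.
 */
public final class TestDataLookup {

  private static final Properties PROPS = load("pluginParameters.properties");

  private TestDataLookup() {
  }

  private static Properties load(String resource) {
    Properties props = new Properties();
    try (InputStream in = TestDataLookup.class.getClassLoader().getResourceAsStream(resource)) {
      if (in != null) {
        props.load(in);
      }
    } catch (IOException e) {
      throw new IllegalStateException("Unable to load test data resource: " + resource, e);
    }
    return props;
  }

  // Returns the concrete value for a placeholder key, e.g. "errorInvalidNamespace";
  // falls back to the key itself if it is not defined.
  public static String get(String key) {
    return PROPS.getProperty(key, key);
  }
}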
@@ -0,0 +1,149 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@Sysadmin
Feature: Sysadmin - Validate system admin page Run time scenarios

Background:
Given Open Datafusion Project to configure pipeline
When Open "System Admin" menu
Then Click on the Configuration link on the System admin page

@Sysadmin
Scenario:To verify user should be able to create Namespace successfully in System Admin
Then Click on Create New Namespace button
Then Enter the New Namespace Name with value: "namespaceName"
Then Enter the Namespace Description with value: "validNamespaceDescription"
Then Click on: "Finish" button in the properties
Then Verify the namespace created success message displayed on confirmation window
Then Verify the created namespace: "namespaceName" is displayed in Namespace tab

@SysAdminRequired
Scenario:To verify user should be able to add a secure key from Make HTTP calls successfully with PUT calls
Then Click on Make HTTP calls from the System admin configuration page
Then Select request dropdown property with option value: "httpPutMethod"
Then Enter input plugin property: "requestPath" with value: "secureKey"
Then Enter textarea plugin property: "requestBody" with value: "bodyValue"
Then Click on send button
Then Verify the status code for success response

@SysAdminRequired
Scenario:To verify user should be able to fetch secure key from Make HTTP calls successfully with GET calls
Then Click on Make HTTP calls from the System admin configuration page
Then Select request dropdown property with option value: "httpGetMethod"
Then Enter input plugin property: "requestPath" with value: "secureKey"
Then Click on send button
Then Verify the status code for success response

@SysAdminRequired
Scenario:To verify user should be able to delete secure key from Make HTTP calls successfully with DELETE calls
Then Click on Make HTTP calls from the System admin configuration page
Then Select request dropdown property with option value: "httpDeleteMethod"
Then Enter input plugin property: "requestPath" with value: "secureKey"
Then Click on send button
Then Verify the status code for success response

@BQ_SOURCE_TEST @BQ_SINK_TEST @SysAdminRequired
Scenario:To verify user should be able to run a pipeline successfully using the System preferences created
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Set system preferences with key: "keyValue" and value: "systemPreferences2"
Then Click on the Save & Close preferences button
Then Click on the Hamburger menu on the left panel
Then Select navigation item: "studio" from the Hamburger menu list
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"

@BQ_SOURCE_TEST @BQ_SINK_TEST @SysAdminRequired
Scenario:To verify user should be able to run a pipeline successfully using existing System preferences and the Namespace preferences created
Then Click on Create New Namespace button
Then Enter the New Namespace Name with value: "sampleNamespaceName"
Then Enter the Namespace Description with value: "validNamespaceDescription"
Then Click on: "Finish" button in the properties
Then Verify the namespace created success message displayed on confirmation window
Then Click on the switch to namespace button
Then Click on the Hamburger menu on the left panel
Then Select navigation item: "namespaceAdmin" from the Hamburger menu list
Then Click "preferences" tab from Configuration page for "sampleNamespaceName" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Set system preferences with key: "keyValue" and value: "systemPreferences1"
Then Click on the Save & Close preferences button
Then Click on the Hamburger menu on the left panel
Then Select navigation item: "studio" from the Hamburger menu list
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
Then Click on the Macro button of Property: "dataset" and set the value to: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
Then Click on the Macro button of Property: "dataset" and set the value to: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
When Open "System Admin" menu
Then Click on the Configuration link on the System admin page
Then Select "systemPreferences" option from Configuration page
Then Click on edit system preferences
Then Delete the preferences
Then Delete the preferences
Then Click on the Save & Close preferences button
Then Click on the Hamburger menu on the left panel
Then Select navigation item: "namespaceAdmin" from the Hamburger menu list
Then Click "preferences" tab from Configuration page for "sampleNamespaceName" Namespace
Then Click on edit namespace preferences to set namespace preferences
Then Delete the preferences
Then Click on the Save & Close preferences button
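
The @BQ_SOURCE_TEST and @BQ_SINK_TEST tags on the runtime scenarios above are usually paired with tag-driven Cucumber hooks that provision the BigQuery tables behind placeholders such as "bqSourceTable" before a scenario and remove them afterwards. A sketch of that pattern, with the actual BigQuery client calls left as comments since they are not part of this excerpt:

package io.cdap.plugin.sysadmin.stepsdesign;  // hypothetical package

import io.cucumber.java.After;
import io.cucumber.java.Before;

/**
 * Sketch of tag-driven setup/teardown hooks for the BigQuery-tagged scenarios.
 * The table name registered here is what a "bqSourceTable" placeholder would resolve to.
 */
public class BigQueryTestHooks {

  private static String sourceTableName;

  @Before(order = 1, value = "@BQ_SOURCE_TEST")
  public void createSourceTable() {
    sourceTableName = "e2e_source_" + System.currentTimeMillis();
    // Placeholder: create sourceTableName in BigQuery and load sample rows here.
  }

  @After(order = 1, value = "@BQ_SOURCE_TEST")
  public void dropSourceTable() {
    // Placeholder: drop sourceTableName from BigQuery here.
    sourceTableName = null;
  }
}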