
Reporting and Alerts Engine

Test Plan
Version 1.0

MD5 Team

Paul Cho
Jeff Gordy
Dana Stevenson
Wayne Fischer
Aaron Toren
Jorge Silva

University of San Diego


CSOL 560

October 22, 2018


____________________________________________________________________________

MD5 Team Software, Inc.


____________________________________________________________________________

Table of Contents
Executive Summary 3
Unit Tests 4
Overview 4
Task Unit Tests 5
Task 1.1 Retrieve Token From Header 5
Task 1.2 and 1.3 Validate Token 5
Task 1.3.1 Request 2FA 6
Task 1.3.2 Validate 2FA 6
Task 2.1 Log Status Event 6
Task 2.2 Send Event to Threat Determination 7
Task 3.1 Receive Threat Data 7
Task 3.2 Flag Threat Data 7
Task 3.3 Send Data to Reporting Mechanisms 9
Task 4.1 Receive Report Event 9
Task 4.2 Alert Response Event 9
Task 4.3 Execute Alert Action Event 10
Task 4.4 Log Alert Action Status 10
Task 5.1 Capture Session Details (login, IP, timestamp) 10
Task 5.2 Push Record to Fusion Engine 11
Task 6.1 Execute Report Query 11
Task 6.2 Produce Report Data 12
Task 6.3 Format and Delivery 12
Task 6.4 Log Batch Report Execution Status 12
Task 7.1 Retrieve Token From Header 13
Task 7.2 Validate Token 13
Task 7.3 Request 2FA 13
Task 7.4 Validate 2FA 14
Task 8.1 Log User Access 14
Task 8.2 Wait For Request 15
Task 8.3 Log User Request 15
Task 9.1 Retrieve Search Filters 15
Task 9.2 Retrieve Data from Fusion Engine 16
Task 9.3 Output Report 16
Task 9.4 Log Report 16
Functional Tests 18
Core Functions 18
Usability Functions 20
Accessibility Functions 21
Exception/Systematic Event Handling 22

MD5_Team ©MD5_Team Software, Inc., 2018 Page 1


Regression Tests 24
Regression Testing Methodology 24
Prioritize High Impact Test Cases 24
Test Case Selection 25
Retest All 25
Regression Testing Tools 26
Regression Testing and Configuration Management 26
Verification 26
Validation 27
Validation Requirements Process 27
Validation Goal Analysis 27
Customer Acceptance Test 28
Usability Test 28
Model/Specification Inspection and Checking 29
Mitigation Plan 29


Executive Summary
The Reports and Alerts Engine Test Plan is a comprehensive plan to test the features and
functionality of the alerts and reports generated in the Supply Chain Risk Management (SCRM)
system. While this document presents testing in a sequential, waterfall-style order, part of the
test plan uses agile testing, which is outlined in further detail in the Mitigation Plan section.

Unit testing of individual components and functions is conducted first as the code is being
developed. Once a unit is deemed functional, it is then tested for full functionality using black
box methodology and sanity tests. After the system is deemed functional, it is fully regression
tested, checking for any errors or defects that might have occurred during the integration process.
Lastly, we perform Verification and Validation testing to verify that the code is ready and then
validated in the runtime environment.

The mitigation strategy to prevent defects and errors in the code revolves around an iterative
approach (agile testing) which heavily weighs in risk identification, assessment, analysis and
implementation of remediation procedures to minimize and eliminate risks and vulnerabilities.
Earned Value Management (EVM) will also be included in the risk management practice so that
planning managers can budget risk management costs more tightly while gaining insight into the
risk management system's performance.

Testing is not limited to these methodologies, and this document should serve as a baseline rather
than the definitive processes and procedures to conduct testing of the Reports and Alerts Engine
as the product will evolve and change with the SCRM system as a whole.


Unit Tests
Overview

The Reporting and Alerts Module uses Node.js, Express.js, and MuleSoft for the custom in-
house software components and the interaction between microservices. Unit tests are written
using the Chai Assertion Library, an assertion library for Node.js that supports behavior-driven
development (BDD). Chai's BDD style provides chainable getters that improve the readability of
assertions. Chai supports the following chains:

Chai’s Chainable Function Getters


● to ● be ● been
● is ● that ● which
● and ● has ● have
● with ● at ● of
● same ● but ● does

These getters allow us to use a more natural readable language in the test such as shown in the
following assertion.
expect(tokenIsPresentInHeader).to.be.true;

Chai also supports other special chain elements. An example of a special element is the .not
element which can be placed anywhere in a chain. Everything following the .not is negated.

As an example, if we want to assert that a particular function called MD5Team() does not throw
an exception, we can pass a function reference to the following Chai assertion:
expect(() => MD5Team()).to.not.throw();

This type of syntax allows for readable, behavior-based unit testing. Chai also supports the
traditional TDD assertion style through methods such as assert.equal(actual, expected),
assert.notEqual(actual, expected), assert.isTrue(value), assert.isFalse(value), and others.
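The chainable-getter style can be illustrated with a minimal, self-contained sketch. This is not Chai's real implementation, only an illustration of how getters such as .to, .be, and .not compose into readable assertions:

```javascript
// Minimal illustration of Chai-style chainable getters (not the real Chai).
// Getters such as .to and .be simply return `this`, so chains read like
// English; .not flips the expectation for everything that follows it.
class Expectation {
  constructor(value) {
    this.value = value;
    this.negated = false;
  }
  get to() { return this; }
  get be() { return this; }
  get not() { this.negated = !this.negated; return this; }
  get true() {
    this.check(this.value === true, 'expected value to be true');
    return this;
  }
  equal(expected) {
    this.check(this.value === expected,
      `expected ${this.value} to equal ${expected}`);
    return this;
  }
  check(condition, message) {
    // Throw when the condition disagrees with the (possibly negated) expectation.
    if (condition === this.negated) throw new Error(message);
  }
}
const expect = (value) => new Expectation(value);

// Usage mirroring the assertions in this section:
expect(true).to.be.true;        // passes
expect(41 + 1).to.equal(42);    // passes
expect('a').to.not.equal('b');  // passes
```

The real Chai library implements the same idea with many more getters and assertion methods, but the mechanism — each getter returning the assertion object itself — is the same.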

The MD5 Team uses a standard test naming convention to quickly bring new team members up
to speed and allow a geographically diverse development team to have a unified code base. The
method naming convention followed is MethodName_StateUnderTest_ExpectedBehavior(). As
an example, the following two method names are acceptable for the user token extraction
function tests:

● IsAuthorizedUser_ValidToken_Pass()
● IsAuthorizedUser_InvalidToken_Fail()

It is important to note that if the expected behavior is “Fail”, the method under test is expected
to fail in a controlled manner. The unit test for an expected failure that does fail should
therefore be reported as a pass in the test suite. The goal for every run of the test suite is for
every test to pass with its expected behavior. Any failing test must trigger a code review by
developers.

Task Unit Tests

Task 1.1 Retrieve Token From Header


Function Under Test: AuthorizeUser()
Unit Test Method Name Objective
IsAuthorizedUser_ValidToken_Pass() This test will present a valid HTTP Post
request with a valid token in the header. It
should pass.
IsAuthorizedUser_InvalidToken_Fail() This test will present a valid HTTP Post
request with an invalid, undersized token
in the header. It should fail.
IsAuthorizedUser_MissingToken_Fail() This test will present a valid HTTP Post
request without a token. It should fail.
IsAuthorizedUser_GiantToken_Fail() This test will present a valid HTTP Post
request with progressively larger headers,
starting at 8KB and doubling to 16KB,
32KB, 64KB, and so on until the web
service rejects the post as too large. In all
cases, the authorization call should fail
normally without crashing.

Task 1.2 and 1.3 Validate Token


Function Under Test: ValidateToken()
Unit Test Method Name Objective
ValidateToken_ValidToken_Pass() This test will present a valid token to the
function. It should pass.
ValidateToken_InvalidToken_Fail() This test will present a token that has a
valid structure but is not in the
authorization database. It should fail.
ValidateToken_MalformedToken_Fail() This test will present a malformed token of
variable length to the function. It should
fail.


Task 1.3.1 Request 2FA


Function Under Test: Request2FA()
Unit Test Method Name Objective
Request2FA_ExtractOTP_Pass() This test will present a valid JSON body
with the parameter 2ndFactorOTP present
for extraction. It should pass.
Request2FA_ExtractMissingOTP_Fail() This test will present a valid JSON body
with the parameter 2ndFactorOTP missing.
It should fail.
Request2FA_ExtractMalformedOTP_Fail() This test will present a valid JSON body
with the parameter 2ndFactorOTP
improperly formatted. It should fail.
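The three extraction cases in the table above can be sketched with a hypothetical extractOTP helper. The real function name, the OTP format, and the body shape are defined by the API specification; only the parameter name 2ndFactorOTP comes from this document:

```javascript
// Hypothetical OTP-extraction helper for the Request2FA tests.
function extractOTP(body) {
  const otp = body && body['2ndFactorOTP'];
  // Assumed format for the sketch: an OTP is a 6-digit numeric string;
  // anything else is treated as malformed.
  if (typeof otp !== 'string' || !/^\d{6}$/.test(otp)) return null;
  return otp;
}

// Usage mirroring the three unit tests in the table above:
extractOTP({ '2ndFactorOTP': '123456' }); // OTP present: returns '123456'
extractOTP({});                           // OTP missing: returns null
extractOTP({ '2ndFactorOTP': '12ab' });   // OTP malformed: returns null
```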

Task 1.3.2 Validate 2FA


Function Under Test: Validate2FA()
Unit Test Method Name Objective
Validate2FA_ValidOTP_Pass() This test will present a valid OTP to the
authorization engine for acceptance. It
should pass.
Validate2FA_InvalidOTP_Fail() This test will present an invalid OTP to the
authorization engine. The engine should
not be able to find the password. It should
fail.

Task 2.1 Log Status Event


Function Under Test: LogStatusEvent()
Unit Test Method Name Objective
LogStatusEvent_LogReady_Pass() This test will connect to the logging
endpoint and send a status event message
to log. It should pass.
LogStatusEvent_EndpointNotFound_Fail() This test will attempt to connect to an
invalid endpoint and should not return a
successful result.
LogStatusEvent_EndpointBusyStoreLog_Pass() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should store the event for future
delivery to the endpoint.
LogStatusEvent_EndpointBusyCantStore_Fail() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should attempt to store the event
for future delivery to the endpoint, but the
local storage should indicate full. It
should fail.
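The four LogStatusEvent cases describe a store-and-forward pattern: deliver if the endpoint accepts, queue locally if it is busy, and fail if the local store is full. A minimal sketch of that pattern (the endpoint object shape, field names, and store limit are assumptions; the real transport is MuleSoft-based):

```javascript
// Store-and-forward sketch for the LogStatusEvent behavior.
function logStatusEvent(event, endpoint, localStore, maxStored = 100) {
  if (!endpoint.reachable) {
    return { ok: false, reason: 'endpoint-not-found' };
  }
  if (endpoint.acceptingEvents) {
    endpoint.events.push(event); // delivered immediately
    return { ok: true, stored: false };
  }
  // Endpoint is busy: queue locally for later delivery.
  if (localStore.length >= maxStored) {
    return { ok: false, reason: 'local-store-full' };
  }
  localStore.push(event);
  return { ok: true, stored: true };
}

// Usage: a busy endpoint causes the event to be queued locally.
const endpoint = { reachable: true, acceptingEvents: false, events: [] };
const localStore = [];
logStatusEvent({ msg: 'status' }, endpoint, localStore);
```

The same pattern recurs in Tasks 4.4, 5.1, 6.4, 8.1, 8.3, and 9.4, so one shared implementation (and one shared set of unit tests) could cover all of them.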

Task 2.2 Send Event to Threat Determination


Function Under Test: ValidateThreat()
Unit Test Method Name Objective
ValidateThreat_ValidThreat_Pass() This test will inspect the JSON body of the
threat status event message and find that all
parameters are properly constructed and
ready for processing.
ValidateThreat_InvalidThreat_Param1_Fail() This test will send an invalid body into the
ValidateThreat() function having an issue
with the first parameter of the specification.
There will be one or more unit tests for each
parameter in the specification. New unit
tests will be added here as necessary.
… Continuation of Parameter Tests 2 through
n-1
ValidateThreat_InvalidThreat_ParamN_Fail() This test will send an invalid body into the
ValidateThreat() function having an issue
with the n-th parameter of the specification.
There will be one or more unit tests for each
parameter in the specification. New unit
tests will be added here as necessary.

Task 3.1 Receive Threat Data


Function Under Test: ProcessThreat()
Unit Test Method Name Objective
ProcessThreat_Received_Pass() This test will forward a valid threat to the
ProcessThreat() function. It should pass.
ProcessThreat_Empty_Fail() This test will send a message that is empty or
not a valid threat to the ProcessThreat()
function. It should fail.

Task 3.2 Flag Threat Data


Function Under Test: ThreatReceipt()
Unit Test Method Name Objective

ThreatReceipt_CanFlag_Pass() This test will send a status event threat
message that can be properly classified by the
system defined severity levels and pre-defined
report tags. It should pass.
ThreatReceipt_CannotFlag_Fail() This test will send a status event threat
message that cannot be classified by the
system. The ThreatReceipt function should
indicate it is unable to classify it and the unit
test assertion should indicate a failure.

Function Under Test: FlagThreat()


Unit Test Method Name Objective
FlagThreat_CanFlag_Pass() This test will send a status event threat
message that can be properly classified by the
system defined severity levels and pre-defined
report tags. It should pass.
FlagThreat_CannotFlag_Fail() This test will send a status event threat
message that cannot be classified by the
system. The FlagThreat function should
indicate it is unable to classify it and the unit
test assertion should indicate a failure.

Function Under Test: FlagSeverity()


Unit Test Method Name Objective
FlagSeverity_CanFlag_Pass() This test will send a status event threat
message that can be properly classified by the
system defined severity levels and pre-defined
report tags. It should pass.
FlagSeverity_CannotFlag_Fail() This test will send a status event threat
message that cannot be classified by the
system. The FlagSeverity function should
indicate it is unable to classify it and the unit
test assertion should indicate a failure.

Function Under Test: FlagReportTags()


Unit Test Method Name Objective
FlagReportTags_CanFlag_Pass() This test will send a status event threat
message that can be properly classified by the
system defined severity levels and pre-defined
report tags. It should pass.

FlagReportTags_CannotFlag_Fail() This test will send a status event threat
message that cannot be classified by the
system. The FlagReportTags function should
indicate it is unable to classify it and the unit
test assertion should indicate a failure.
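The Flag* tests all exercise the same classification step: a threat event either matches a system-defined severity level and at least one pre-defined report tag, or it is reported as unclassifiable. A hedged sketch of that step (the severity set, field names, and matching rule are illustrative assumptions, not the real system's configuration):

```javascript
// Illustrative classification step behind the Flag* unit tests.
const SEVERITY_LEVELS = ['low', 'medium', 'high', 'critical']; // assumed set

function flagThreat(event, reportTags) {
  // Unknown severity: the system cannot classify the event.
  if (!SEVERITY_LEVELS.includes(event.severity)) return { flagged: false };
  // Keep only tags that appear in the pre-defined report tag list.
  const tags = (event.tags || []).filter((t) => reportTags.includes(t));
  if (tags.length === 0) return { flagged: false };
  return { flagged: true, severity: event.severity, tags };
}

// Usage mirroring the CanFlag/CannotFlag cases:
flagThreat({ severity: 'high', tags: ['supply'] }, ['supply', 'transit']); // flagged
flagThreat({ severity: 'unknown' }, ['supply']);                           // cannot classify
```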

Task 3.3 Send Data to Reporting Mechanisms


Function Under Test: AlertReadEvent()
Unit Test Method Name Objective
AlertReadEvent_IsFlagged_Pass() This test will send a properly flagged Alert
status event message to the AlertReadEvent()
function.
AlertReadEvent_IsNotFlagged_Fail() This test will send an Alert status event
message to the AlertReadEvent() function that
has NOT been flagged. It should fail.

Function Under Test: ReportReadEvent()


Unit Test Method Name Objective
ReportReadEvent_IsFlagged_Pass() This test will send a properly flagged
Reporting status event message to the
ReportReadEvent() function.
ReportReadEvent_IsNotFlagged_Fail() This test will send a Reporting status event
message to the ReportReadEvent() function
that has NOT been flagged. It should fail.

Task 4.1 Receive Report Event


No unit tests are appropriate for testing this idle loop.

Task 4.2 Alert Response Event


Function Under Test: AlertResponseEvent()
Unit Test Method Name Objective
AlertResponseEvent_IsFlagged_Pass() This test will send a properly flagged Alert
status event message to the
AlertResponseEvent() function. It should pass.
AlertResponseEvent_IsNotFlagged_Fail() This test will send an Alert status event
message to the AlertResponseEvent() function
that has NOT been flagged. It should fail.


Task 4.3 Execute Alert Action Event


Function Under Test: ExecuteAlert()
Unit Test Method Name Objective
ExecuteAlert_ValidAlert_Pass() This test will send a valid alert message into
the ExecuteAlert function for submission to an
end-user. The ExecuteAlert function should be
able to execute the indicated alert and
terminate normally.
ExecuteAlert_InvalidAlert_Fail() This test will send an invalid alert message into
the ExecuteAlert function for submission to an
end-user. The ExecuteAlert function should
fail to issue the alert.

Task 4.4 Log Alert Action Status


Function Under Test: LogRequiredAlertAction()
Unit Test Method Name Objective
LogAlertEvent_LogReady_Pass() This test will connect to the logging
endpoint and send an alert execution event
message to log. It should pass.
LogAlertEvent_EndpointNotFound_Fail() This test will attempt to connect to an
invalid endpoint and should not return a
successful result.
LogAlertEvent_EndpointBusyStoreLog_Pass() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should store the event for future
delivery to the endpoint.
LogAlertEvent_EndpointBusyCantStore_Fail() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should attempt to store the event
for future delivery to the endpoint, but the
local storage should indicate full. It should
fail.

Task 5.1 Capture Session Details (login, IP, timestamp)


Function Under Test: MetadataCapture()
Unit Test Method Name Objective
MetadataCapture_LogReady_Pass() This test will connect to the logging
endpoint and present valid metadata to
log. It should pass.

MetadataCapture_EndpointNotFound_Fail() This test will attempt to connect to an
invalid capture endpoint and should not
return a successful result.
MetadataCapture_EndpointBusyStoreLog_Pass() This test will connect to a valid capture
endpoint that indicates it cannot accept
new log events. It should store the event
for future delivery to the endpoint.
MetadataCapture_EndpointBusyCantStore_Fail() This test will connect to a valid capture
endpoint that indicates it cannot accept
new log events. It should attempt to
store the event for future delivery to the
endpoint, but the local storage should
indicate full. It should fail.

Task 5.2 Push Record to Fusion Engine


Function Under Test: PushRecordToFE()
Unit Test Method Name Objective
PushRecordToFE_ValidRecord_Pass() This test will connect to submit a record
for inclusion back into the Fusion Engine
module. The record will be properly
tagged and formatted and will be accepted
by the Fusion Engine.
PushRecordToFE_InvalidRecord_Fail() This test will connect to submit a record
for inclusion back into the Fusion Engine
module. The record will be improperly
formatted. It should fail.

Task 6.1 Execute Report Query


Function Under Test: ExecuteReportQuery()
Unit Test Method Name Objective
ExecuteReportQuery_ValidRecord_Pass() This test will connect to the Reporting
Module backend and issue a valid report
query. The function under test should
validate the query and pass.
ExecuteReportQuery_InvalidRecord_Fail() This test will connect to the Reporting
Module backend and issue an invalid
report query. It should fail.


Task 6.2 Produce Report Data


Function Under Test: ProduceReportData()
Unit Test Method Name Objective
ProduceReportData_ValidRecord_Pass() This test will connect to the Reporting
Module backend and issue a valid report
query. The function under test should
produce the RAW requested report data and
return it to the test.
ProduceReportData_InvalidRecord_Fail() This test will connect to the Reporting
Module backend and issue an invalid report
query. It should fail.

Task 6.3 Format and Delivery


Function Under Test: FormatDeliver()
Unit Test Method Name Objective
FormatDeliver_ValidRawReportData_Pass() This test will submit valid raw reporting data
to the format and delivery function. It should
be properly formatted and delivered to the
requested end destination.
FormatDeliver_InvalidRawReportData_Fail() This test will submit invalid raw reporting
data to the format and delivery function. It
should fail.

Task 6.4 Log Batch Report Execution Status


Function Under Test: LogBatch()
Unit Test Method Name Objective
LogBatch_LogReady_Pass() This test will connect to the logging endpoint
and present a valid set of report batch data to
log. It should pass.
LogBatch_EndpointNotFound_Fail() This test will attempt to connect to an invalid
capture endpoint and should not return a
successful result.
LogBatch_EndpointBusyStoreLog_Pass() This test will connect to a valid capture
endpoint that indicates it cannot accept new
log events. It should store the event for
future delivery to the endpoint.

LogBatch_EndpointBusyCantStore_Fail() This test will connect to a valid capture
endpoint that indicates it cannot accept new
log events. It should attempt to store the
event for future delivery to the endpoint, but
the local storage should indicate full. It
should fail.

Task 7.1 Retrieve Token From Header


Function Under Test: AuthorizeUser()
Unit Test Method Name Objective
IsAuthorizedUser_ValidToken_Pass() This test will present a valid HTTP Post
request with a valid token in the header. It
should pass.
IsAuthorizedUser_InvalidToken_Fail() This test will present a valid HTTP Post
request with an invalid, undersized token in
the header. It should fail.
IsAuthorizedUser_MissingToken_Fail() This test will present a valid HTTP Post
request without a token. It should fail.
IsAuthorizedUser_GiantToken_Fail() This test will present a valid HTTP Post
request with progressively larger headers,
starting at 8KB and doubling to 16KB, 32KB,
64KB, and so on until the web service rejects
the post as too large. In all cases, the
authorization call should fail normally
without crashing.

Task 7.2 Validate Token


Function Under Test: ValidateToken()
Unit Test Method Name Objective
ValidateToken_ValidToken_Pass() This test will present a valid token to the
function. It should pass.
ValidateToken_InvalidToken_Fail() This test will present a token that has a
valid structure but is not in the authorization
database. It should fail.
ValidateToken_MalformedToken_Fail() This test will present a malformed token of
variable length to the function. It should fail.

Task 7.3 Request 2FA


Function Under Test: Request2FA()
Unit Test Method Name Objective

Request2FA_ExtractOTP_Pass() This test will present a valid JSON body
with the parameter 2ndFactorOTP present
for extraction. It should pass.
Request2FA_ExtractMissingOTP_Fail() This test will present a valid JSON body
with the parameter 2ndFactorOTP missing.
It should fail.
Request2FA_ExtractMalformedOTP_Fail() This test will present a valid JSON body
with the parameter 2ndFactorOTP
improperly formatted. It should fail.

Task 7.4 Validate 2FA


Function Under Test: Validate2FA()
Unit Test Method Name Objective
Validate2FA_ValidOTP_Pass() This test will present a valid OTP to the
authorization engine for acceptance. It
should pass.
Validate2FA_InvalidOTP_Fail() This test will present an invalid OTP to the
authorization engine. The engine should not
be able to find the password. It should fail.

Task 8.1 Log User Access


Function Under Test: LogUserAccess()
Unit Test Method Name Objective
LogUserAccess_LogReady_Pass() This test will connect to the logging
endpoint and send a user access message to
log. It should pass.
LogUserAccess_EndpointNotFound_Fail() This test will attempt to connect to an
invalid endpoint and should not return a
successful result.
LogUserAccess_EndpointBusyStoreLog_Pass() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should store the event for future
delivery to the endpoint.
LogUserAccess_EndpointBusyCantStore_Fail() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should attempt to store the event
for future delivery to the endpoint, but the
local storage should indicate full. It should
fail.

Task 8.2 Wait For Request


No unit tests are appropriate for testing this idle loop.

Task 8.3 Log User Request


Function Under Test: LogUserRequest()
Unit Test Method Name Objective
LogUserRequest_LogReady_Pass() This test will connect to the logging
endpoint and send a user report request
message to log. It should pass.
LogUserRequest_EndpointNotFound_Fail() This test will attempt to connect to an
invalid endpoint and should not return a
successful result.
LogUserRequest_EndpointBusyStoreLog_Pass() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should store the event for
future delivery to the endpoint.
LogUserRequest_EndpointBusyCantStore_Fail() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should attempt to store the
event for future delivery to the endpoint,
but the local storage should indicate full.
It should fail.

Task 9.1 Retrieve Search Filters


Function Under Test: RetrieveSearchFilters()
Unit Test Method Name Objective
RetrieveSearchFilters_ValidJSON_Pass() This test will present a valid JSON body
to the RetrieveSearchFilters function
turning it into a requestBody object and
return it to the test. It should pass.
RetrieveSearchFilters_InvalidJSONParm1_Fail() This test will present valid JSON, but an
invalid parameter number one according
to the specification. Tests must be
written for every parameter of the
specification, and if the specification
changes new tests must be inserted to
keep this up to date.

… Continuation of Parameter Tests 2
through n-1
RetrieveSearchFilters_InvalidJSONParmN_Fail() This test will present valid JSON, but an
invalid n-th parameter according to the
specification.
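The per-parameter validation these tests describe can be sketched as follows. The parameter names (startDate, endDate, eventTypes) are hypothetical placeholders; the real set comes from the search-filter specification, and a new check plus a new unit test is added whenever the specification gains a parameter:

```javascript
// Sketch of RetrieveSearchFilters: parse the JSON body into a requestBody
// object, validating one specification parameter at a time.
function retrieveSearchFilters(json) {
  let body;
  try {
    body = JSON.parse(json);
  } catch {
    return null; // not valid JSON at all
  }
  if (body === null || typeof body !== 'object') return null;
  // One check per specification parameter (names assumed for illustration):
  if (typeof body.startDate !== 'string') return null;
  if (typeof body.endDate !== 'string') return null;
  if (!Array.isArray(body.eventTypes)) return null;
  return body;
}
```

Keeping one check and one unit test per parameter makes it mechanical to spot which tests must be added or updated when the specification changes.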

Task 9.2 Retrieve Data from Fusion Engine


Function Under Test: RetrieveData()
Unit Test Method Name Objective
RetrieveData_ValidRequest_Pass() This test will present a valid report
request to the RetrieveData function. It
should pass.
RetrieveData_InvalidRequest_Fail() This test will present a properly
formatted, but an invalid request to the
RetrieveData function. It should fail.

Task 9.3 Output Report


Function Under Test: GenerateReport()
Unit Test Method Name Objective
GenerateReport_ValidRawReportData_Pass() This test will submit valid raw reporting
data to the generation/format function. It
should pass.
GenerateReport_InvalidRawReportData_Fail() This test will submit invalid raw
reporting data to the generation/format
function. It should fail.

Task 9.4 Log Report


Function Under Test: LogReport()
Unit Test Method Name Objective
LogReport_LogReady_Pass() This test will connect to the logging
endpoint and send a report event to log. It
should pass.
LogReport_EndpointNotFound_Fail() This test will attempt to connect to an
invalid endpoint and should not return a
successful result.

LogReport_EndpointBusyStoreLog_Pass() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should store the event for future
delivery to the endpoint.
LogReport_EndpointBusyCantStore_Fail() This test will connect to a valid endpoint
that indicates it cannot accept new log
events. It should attempt to store the
event for future delivery to the endpoint,
but the local storage should indicate full.
It should fail.


Functional Tests
The goal of functional testing is to validate the basic functionality of the system’s core features
and functions. Each function is tested with the prescribed test procedure and the expected
outcome is compared with the actual outcome.

Core Functions
Function Test Procedure Expected Outcome Actual Outcome

Function: SCRM threshold configuration - users can modify quantitative and qualitative
variables for alerting thresholds

Test Procedure:
1- Access the alerting engine UI.
2- Navigate to the threshold configuration page.
3- Select the relevant API/data model.
4- Ensure that the data model is defined and mapped to the appropriate values.
5- Using the threshold configuration wizard, define a minimum tolerance for the combination
of quantity on hand of material A + the days until next delivery of material A.
6- Using the threshold configuration wizard, define a minimum tolerance for material A’s
relevant transit channel weather condition + the remaining quantity on hand.

Expected Outcome: The defined thresholds are logged and stored in the SCRM threshold
repository and visible in reporting.


Function: Ad-hoc report generation - A user can login and generate an ad-hoc report based on
relevant variables

Test Procedure:
1- Access the reporting engine UI.
2- From the reports page, select the following variables:
   a. Date/Time - last 24 hours
   b. Event category - SCRM
   c. Event Type - Threshold exceeded
   d. Product - widget 1
3- Select create.

Expected Outcome: A tabular report is shown on the screen with summarized information.
Summary numbers can be drilled down.

Function: Automated alerting - The alerts engine automatically executes the defined alerting
requirement
· Push notifications
· Email notifications

Test Procedure:
1- Define an extremely low alert threshold.
2- Configure the alerting response to automatically generate a report, email the report, and
send the user a push notification.
3- Verify that when the threshold is met, the defined users are notified, and the expected
report is delivered within 30 seconds of the alert trigger.

Expected Outcome: When the threshold is met, an alert is automatically sent to the defined
users using the prescribed method (email, push) within 30 seconds of the threshold being
reached.

Function: Automated report generation - The alerts engine is able to generate reports based on
defined variables

Test Procedure: See above.

Expected Outcome: The report delivered as part of the automated alert matches the
configuration outlined as part of the alert. The data is accurate as of the time of the alert.


Usability Functions
Function Test Procedure Expected Outcome Actual Outcome

Function: UI responsiveness - The UI is responsive for normal navigation and click-through actions.
Test Procedure:
1- Authenticate in the reporting and alerts web UI.
2- Navigate from page to page and execute basic reporting requests.
3- Log any page loads or buttons which do not initiate a response within 1 second of being clicked.
4- Log any page that does not load completely within 3 seconds.
5- Log any page that does not scroll smoothly.
Expected Outcome:
1- All links and buttons begin to load within 1 second of being clicked.
2- All pages load completely within 3 seconds.
3- Scrolling is smooth and consistent.
Actual Outcome:
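The responsiveness steps above can be automated as a simple budget check over measured timings: flag any page that misses the 1-second click-response or 3-second full-load budget. The function and page names below are illustrative, not a shipped API:

```python
# Budgets taken from the UI responsiveness test above.
RESPONSE_BUDGET_S = 1.0   # click-to-first-response budget
FULL_LOAD_BUDGET_S = 3.0  # complete page-load budget

def pages_over_budget(timings: dict[str, tuple[float, float]]) -> list[str]:
    """timings maps page -> (first_response_s, full_load_s); returns violators sorted."""
    return sorted(
        page
        for page, (first_response, full_load) in timings.items()
        if first_response > RESPONSE_BUDGET_S or full_load > FULL_LOAD_BUDGET_S
    )

measured = {
    "/reports": (0.4, 2.1),      # within both budgets
    "/alerts": (1.6, 2.8),       # slow first response
    "/admin/config": (0.7, 4.2), # slow full load
}
print(pages_over_budget(measured))  # ['/admin/config', '/alerts']
```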

Function: Links to nowhere/Dead ends - There are no pages that users are unable to backtrack from or understand how they were reached. All buttons and links are active and link to valid pages.
Test Procedure:
1- While testing for UI responsiveness, log any dead links.
2- Navigate to each page within the site and verify that the path followed to reach every page is indicated in the toolbar/menus.
3- Log all pages that do not have a prominent title that uniquely identifies that page to the user.
Expected Outcome:
1- All links are active.
2- All pages are reached through a linear and natural menu progression that is easily repeated.
3- All pages have a unique and prominent name that corresponds with the menu/path followed to reach it.
Actual Outcome:


Accessibility Functions
Function Test Procedure Expected Outcome Actual Outcome

Function: User login - A user is able to reach the login page using a standard web browser and is required to authenticate prior to access.
Test Procedure:
1- Obtain the URL used to reach the Reporting and Alerts Engine UI.
2- Obtain a valid login credential for the test environment.
3- Attempt to log in with the test credential using the following browsers:
a. Google Chrome
b. Firefox
c. Microsoft Internet Explorer
4- Record any unsuccessful attempts.
Expected Outcome: Login attempts are partially successful from all three browsers and the user is asked for a second factor (refer to the next test).
Actual Outcome:

Function: MFA - The system only allows user access when the MFA requirement is met.
Test Procedure:
1- After attempting to log in from each browser, confirm that successful login does not occur without supplying the second factor.
Expected Outcome: Login fails if the second factor is not supplied within the allowed time limit.
Actual Outcome:


Function: Logout - The logout button can be reached from the top menu on any page and immediately logs out the authenticated user.
Test Procedure:
1- After logging in, select the logout button, which should be located on the top right side of every page.
2- Confirm that the existing session is immediately terminated.
3- Attempt to select the back button in the web browser and verify that the login page is displayed, and that back navigation is not allowed.
Expected Outcome:
1- The logout button is visible on the top right of every page.
2- Selecting the logout button immediately terminates the current session.
3- It is not possible to navigate backwards after logging out.
Actual Outcome:

Function: Automatic account lockout - Users are locked out after N unsuccessful login attempts.
Test Procedure:
1- Obtain the lockout threshold (N) from the system administrator.
2- Attempt to log in with invalid credentials N times within 30 minutes so that the account is locked out.
Expected Outcome:
1- Accounts are automatically locked out after N failed login attempts; a failed attempt can be an invalid password or an invalid second factor.
2- Account lockout is not automatically lifted.
Actual Outcome:
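The lockout rule under test can be sketched directly: N or more failures inside a trailing 30-minute window lock the account. The threshold value and function names are illustrative stand-ins for whatever the administrator configures:

```python
from datetime import datetime, timedelta

# Illustrative values; the real N comes from the system administrator.
LOCKOUT_THRESHOLD_N = 5
LOCKOUT_WINDOW = timedelta(minutes=30)

def is_locked_out(failed_attempts: list[datetime], now: datetime) -> bool:
    """True when N or more failures fall inside the trailing 30-minute window."""
    recent = [t for t in failed_attempts if now - t <= LOCKOUT_WINDOW]
    return len(recent) >= LOCKOUT_THRESHOLD_N

now = datetime(2018, 10, 22, 12, 0)
burst = [now - timedelta(minutes=m) for m in (1, 3, 5, 8, 12)]  # 5 recent failures
assert is_locked_out(burst, now)
# Five failures spread over hours do not trip the 30-minute window.
spread = [now - timedelta(hours=h) for h in (1, 2, 3, 4, 5)]
assert not is_locked_out(spread, now)
```

Note the rule is one-way: nothing here clears the lock, matching the expectation that lockout is not automatically lifted.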

Exception/Systematic Event Handling


Function Test Procedure Expected Outcome Actual Outcome


Function: User logging - All user access events are captured in the audit log.
Test Procedure:
1- Request a user log from the system administrator.
2- Verify that all of the previous login attempts, page navigation, and report access are visible in the log and accurately recorded.
3- Using multiple browsers, load multiple pages simultaneously and ensure that the log reflects both access events.
4- Verify that the account unlock performed by the user administrator is captured in the log.
Expected Outcome:
1- All user access events are captured in the audit log, including all successful and failed login attempts.
2- All account administration events are captured in the log.
Actual Outcome:

Function: Change logging - All configuration changes are captured in the audit log.
Test Procedure:
1- Access an alert threshold setting from the alerts engine configuration screen within the test environment.
2- Modify the threshold settings and alert response configuration for a stratified sample of alert types and responses. Record the before and after change values for future reference.
3- Obtain the configuration change log from the system administrator.
4- Verify that all configuration changes are accurately recorded in the log and reconcile them to the test records.
Expected Outcome: All configuration changes are accurately captured in the configuration change log. The user who made the change, the date/time, and the before and after values are visible.
Actual Outcome:
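The reconciliation in step 4 amounts to checking that every test-recorded change appears verbatim in the audit log. A sketch with an illustrative record shape (the engine's actual log schema is not specified here):

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical change-log record: who, when, what, and before/after values,
# matching the fields the expected outcome says must be visible.
@dataclass(frozen=True)
class ConfigChange:
    user: str
    timestamp: datetime
    setting: str
    before: str
    after: str

def reconcile(test_records: list[ConfigChange], audit_log: list[ConfigChange]) -> list[ConfigChange]:
    """Return test-recorded changes that are missing from the audit log."""
    logged = set(audit_log)  # frozen dataclass instances are hashable
    return [c for c in test_records if c not in logged]

change = ConfigChange("jsilva", datetime(2018, 10, 22, 10, 5), "alert_threshold", "100", "5")
assert reconcile([change], [change]) == []   # fully reconciled
assert reconcile([change], []) == [change]   # missing from the log -> test failure
```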


Function: Generic Error Messages - Only generic error messages are displayed.
Test Procedure:
1- While browsing the UI, record any errors displayed on the screen.
2- Attempt to inject invalid text into a report query (OR 1=1, etc.).
3- Attempt to browse to site directory paths which do not exist.
4- Review the details of the errors to verify that no information about backend systems can be deduced from the error text.
Expected Outcome:
1- Injection attempts fail.
2- Invalid directory paths do not load.
3- Error text shown on the screen is generic and displays no information about the backend systems.
Actual Outcome:
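The behavior this test verifies is typically implemented by mapping every internal exception to a single generic message before it reaches the screen. A sketch with illustrative exception types and wording (not the engine's actual error handler):

```python
# Single user-facing message; internal detail never reaches the UI.
GENERIC_MESSAGE = "An error occurred. Please contact your administrator."

def user_facing_error(exc: Exception) -> str:
    """Return only generic text; the full exception would go to the audit log,
    not the screen, in the real engine."""
    return GENERIC_MESSAGE

for exc in (ValueError("OR 1=1 rejected"), FileNotFoundError("/etc/passwd")):
    message = user_facing_error(exc)
    assert message == GENERIC_MESSAGE
    # No backend detail leaks into what the user sees.
    assert "passwd" not in message and "1=1" not in message
```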

Regression Tests

Regression Testing Methodology


Regression testing is the execution of a full or partial selection of already executed test cases to
verify that existing functionality still works as intended. The conditions under which
regression testing is normally performed include, but are not limited to:

● Change in code due to requirements change


● New code (new features)
● Code modification to fix Defects
● Code modification to fix Performance issues

Regression testing techniques generally involve prioritizing high impact test cases, selecting test
cases that are impacted by the changes in the code, and lastly considering the retest-all
condition.

Prioritize High Impact Test Cases


The Reports and Alerts Engine system has several high impact test cases that must be run in
every scenario, specifically those related to real-time alerts. The heart of the system depends on

threat analysis and release of alerts to appropriate end users to take action on changes in the
Supply Chain Risk Management (SCRM) system as a whole.

We are particularly concerned with safety hazards, delays in processing, and custom user-generated events (especially those related to safety).

Unit test cases that fulfill this regression testing requirement are Task 2.x through Task 5.x.
Special attention is given to Task 3.2, as this is the data-flagging step that routes data throughout
the rest of the alerts and reports system.

System and User Authentication test cases also carry equal weight with the real time alerts, as
only authenticated events and authorized users should be in use in the system.

Unit Test Cases that fulfill this regression testing requirement are: Task 1.1, 1.2, 1.3, 1.3.1, 1.3.2,
and Task 7.1, 7.2, 7.3, and 7.4.

Secondary high impact consideration is given to Report generation. These can be broken down
into two categories, regulatory and audit, and two types: system generated and user generated.

Unit Test Cases that fulfill this regression testing requirement are Task 6.x, and Task 7.x through
Task 9.x

Test Case Selection


Test cases should be selected based upon the following criteria: frequency of defect, visibility of
functionality, core features, integration testing, complex test cases, boundary value test cases,
and a sample of successful and failing test cases (in this case, our unit testing cases).
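One way to operationalize these criteria is to score each candidate test case on the factors it satisfies and run the highest-scoring cases first. The weights and case names below are illustrative, not a defined project artifact:

```python
# Illustrative selection criteria drawn from the list above.
CRITERIA = ("defect_frequency", "visibility", "core_feature", "boundary_value")

def impact_score(case: dict) -> int:
    """Sum the criteria values; higher scores run earlier in the regression cycle."""
    return sum(case.get(criterion, 0) for criterion in CRITERIA)

candidates = [
    {"name": "Task 3.2 flag threat data", "defect_frequency": 2, "core_feature": 1},
    {"name": "Task 9.3 output report", "visibility": 1},
    {"name": "Task 1.2 validate token", "core_feature": 1, "boundary_value": 1},
]
ranked = sorted(candidates, key=impact_score, reverse=True)
print([c["name"] for c in ranked][:2])  # the two highest-impact cases
```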

Reusable Test Cases are defined as cases that can be reused in regression testing cycles. The
Unit Testing and Functional Testing cases for the Reports and Alerts engine fall into this
category, as well as any future defined custom defined test cases for new functionality and/or
code modification verification.

Obsolete Test Cases are defined as test cases which will not be recycled in the future.
Performance fix and Defect fix test cases would be determined and then placed into this
category.

Retest All
This test scenario involves rerunning all existing tests in the test bucket and should only be run if
regression testing has confirmed a defect which could not be isolated to a specific code change.

It generally requires a huge amount of resources and time, and should be used sparingly.

Regression Testing Tools


Selenium is an open source tool that can be used for browser-based regression testing. Selenium
provides a tool for authoring tests without the need to learn a test scripting language (Selenium
IDE), as well as a test domain-specific language (Selenese) and client bindings for a number of
popular programming languages, including C#, Groovy, Java, Perl, PHP, Python, Ruby, and
Scala. The tests can be run against most modern web browsers, and Selenium works on
Windows, Linux, and macOS platforms. It is an open-source software package, released under
the Apache 2.0 license, which allows web developers to download and use it without charge.
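A real regression check would drive a browser through Selenium WebDriver; the sketch below stubs the driver so the shape of such a test is visible without a browser. The `FakeDriver`, URL, and page title are stand-ins, not the real UI:

```python
import unittest

class FakeDriver:
    """Minimal stand-in exposing the two WebDriver members the test uses
    (`get` and `title`); a real run would construct webdriver.Chrome() etc."""
    def __init__(self, pages):
        self.pages = pages
        self.title = ""

    def get(self, url):
        self.title = self.pages[url]  # pretend the page loaded

class LoginPageRegression(unittest.TestCase):
    def test_login_page_title(self):
        driver = FakeDriver({"https://example.test/login": "Reporting and Alerts Engine - Login"})
        driver.get("https://example.test/login")
        self.assertIn("Login", driver.title)

# Run the suite programmatically so it composes with other tooling.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoginPageRegression)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

Swapping `FakeDriver` for a real WebDriver instance turns this into a reusable browser regression case in the sense defined above.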

Regression Testing and Configuration Management


The Reports and Alerts Engine will use the project’s GitHub and Bitbucket code repositories for
configuration management. Regression testing branches will be created and isolated from the rest
of the project code. The following rules will be observed to ensure the integrity and success of
the regression testing process:

● No changes to code are allowed during the regression test phase (regression test
code must be kept immune to developer changes).
● The database(s) used for regression testing must be isolated. No database changes are
allowed, to ensure that the results of the regression test are produced in a controlled and
constant environment.

Verification
The verification phase ensures that the code and testing meet the intended results and the
tenets of the Reporting and Alerts Engine. Output analysis will be conducted on each of the tests,
and verification will be conducted on Task Unit Tests (security & malformed data), Functionality
Tests (customer experience), Regression Tests (fixes), and Test Case Selection (quality
assurance). Tests will leverage tools which monitor application behavior, corruption, user
privilege issues, and other critical areas such as security. Verification ensures that all potential
issues, vulnerabilities, defects, and customer experience concerns are addressed. Anything that is
identified will be reviewed, fixed, and then re-tested to ensure that quality is built into the
Reporting and Alerts Engine and that it is capable of performing all intended functions for the
customer. The subsequent

phase (Deployment, Maintenance & Support) shall only begin once testing requirements have
been met and the verification process is complete.

Validation

Validation Requirements Process


The process to validate the software requires inputs from the Fusion Engine component.
However, as we have developed our system independently, we provide the Reports and Alerts
Engine with structured data, as it would receive in production, using STIX-, XML-, and JSON-
compliant input. Software testers will perform the software validation during the software quality
control phase of our software development lifecycle. During this phase of development, testers
ensure that all the requirements described in the Software Requirements document were met by
the Reports and Alerts Engine component. User acceptance tests are performed alongside the
customer, who tests each requirement as specified in the requirements document (e.g. the Vision
Document).

Discrepancies are notated and compared to the requirements. If any are valid, these are compared
to the contract and a change request is put in to make the change to the code base. Once Change
Control Board review occurs the correction is released in a minor release version.

Validation Goal Analysis


The following requirements are tested.

1. The Fusion Engine provides instantaneous alert notifications and generates reports in real
time
2. Alerts are sent to endpoints within thirty seconds of detection (soft real-time constraint)
3. Key performance indicator changes are tracked and alerted upon when they exceed
their constraints

4. Alerts are allowed to be customized by non-technical users based on any state change
reported by the Fusion Engine
5. A REST API is available, and functions appropriately based on unit test cases
6. Data is exchanged in required and compliant STIX, XML, and JSON formats.
7. The STIX2 information is processed and complies with JSON formatting standards.
8. All Fusion Engine messages are stored and validated using a reliable blockchain hashing
algorithm.
9. Multi-Factor authentication works for all accounts and services where it is applied.
10. Role-based access properly distinguishes between the three levels of access and enforces
privileges.
11. Sensitive data is anonymized in all alerts to prevent leakage of sensitive data.
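Requirement 8 can be checked with a minimal hash chain: each stored Fusion Engine message is hashed together with the digest of everything before it, so tampering with any stored message invalidates every later digest. This is a sketch of the chaining idea, not a full blockchain implementation, and the message payloads are illustrative:

```python
import hashlib

def chain_hashes(messages: list[bytes]) -> list[str]:
    """Return one SHA-256 hex digest per message, each covering all prior history."""
    digests, prev = [], b""
    for message in messages:
        prev = hashlib.sha256(prev + message).digest()  # link to previous digest
        digests.append(prev.hex())
    return digests

original = [b'{"event": "threshold"}', b'{"event": "recovered"}']
tampered = [b'{"event": "TAMPERED"}', b'{"event": "recovered"}']
# Changing the first stored message changes the final digest as well.
assert chain_hashes(original)[-1] != chain_hashes(tampered)[-1]
```

The validation test would recompute the chain over stored messages and compare it to the recorded digests.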

Customer Acceptance Test


Customers will be invited to bring in end-users to perform an acceptance test once Quality
Assurance has verified requirements were met in the goal analysis. Customers may provide test
scenarios which will be applied to demonstrate that the system conforms to the requirements
outlined in the contract. System performance metrics will be reviewed and confirmed to meet
requirements with the customer, and all described business processes will be reviewed to ensure
they meet desired performance and system engineered specifications.

Test data will be generated and fed into the system to review system performance and ensure
customers accept the system functions. Any feedback will be reviewed by architects and
compared to requirements. Any requested changes which lie outside the scope of the
original requirements may be formally documented and are subject to additional contracting
costs and scheduling upon formal acceptance by the company and the government contracting
office.

Usability Test
End-users will follow user manual procedures regarding alert creation, alert review, reports
creation, reports analysis, account creation, role-based assignments and modifications, and REST
API usability. During the usability test end-users are encouraged to provide feedback regarding
user-interface configuration, input buttons, links, and active system placement, and system
design. If design elements are found not to meet requirement specifications, they will be
refactored and are subject to regression testing. Additional element changes are subject to
contract negotiation between the company and the government.


Model/Specification Inspection and Checking


Architectural models and all specifications used by the system will be validated by reviewing
model flows from previous documentation. Specifications will be validated by feeding in
structured data in JSON and STIX2 formats via the REST interface. Output will be
reviewed to ensure that the recorded outputs follow the appropriate industry-standard
specifications as outlined in the requirements documentation.
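A simple form of that output check: each message must parse as JSON and carry the fields that every STIX 2 object is required to have ("type" and "id"). The sample message below is illustrative, not real engine output:

```python
import json

# Fields required of every STIX 2 object.
REQUIRED_STIX2_FIELDS = ("type", "id")

def is_spec_compliant(raw: str) -> bool:
    """True when the message is valid JSON carrying the required STIX 2 fields."""
    try:
        obj = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return isinstance(obj, dict) and all(f in obj for f in REQUIRED_STIX2_FIELDS)

sample = '{"type": "indicator", "id": "indicator--0001", "name": "threshold exceeded"}'
assert is_spec_compliant(sample)
assert not is_spec_compliant("not json at all")
```

A full validation pass would additionally check the format of each field against the STIX 2 specification; this sketch covers only the structural minimum.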

Mitigation Plan
Our plan is to develop an actionable risk mitigation strategy that includes an iterative
approach and methodology with a set of defined handling options and with methods and
procedures for risk monitoring, reduction, and remediation. For the Supply Chain Risk
Management (SCRM) system, the Reports and Alerts Engine will have ongoing
monitoring of vulnerabilities, software errors, and other risks identified during
operations. The identified risks will result in follow-up actionable items once reviewed
within our risk model.

Vulnerabilities
● Insecure storage of data at rest that is not encrypted
● Security misconfigured for access control
● Source data compromised before it reaches the Fusion Engine (data in transit)
● Regulatory reports delivered unencrypted via IoT or mobile devices
● Remote Administration
● Site Injection to manipulate the data

Software Errors
● Improper neutralization of SQL database commands or OS commands (injection)
● Programming language bugs or interoperability issues in input handling for reports and alerts
(cross-site scripting - XSS)
● Improper validation of an array index (Index out of bounds)
● Database objects lack authorization checks on referenced objects, exposing them externally
to filesystem path traversal (Missing Authorization)
● Broken Authentication and Session Failure due to interrupted network connections or
latency timeouts
● Bad Data format including integer overflow
● Reports display format is incorrect
● Alerts missing initialization of variables
● Concurrent execution using shared resource without correct synchronization


Model & Approach


The approach will be an iterative plan that includes identification, assessment, monitoring
and tracking, analysis, and implementation with progress monitoring. The
mitigation plan will be designed to manage, track, and either eliminate risks through
remediation or reduce them to a level that is acceptable. After the plan is implemented, it
will continue to be tested and monitored, and it will be adjusted if its efficacy requires revision.

Figure 1: Iterative Risk Management Model for Vulnerabilities and Software Errors

Handling Options:

● Assume/Accept: We will acknowledge that the risk exists and make a
deliberate decision to accept it without using capabilities to control it. (This would
require Program Leader approval.)
● Avoid: The program requirements or constraints could be adjusted to reduce the
risk, for example through a change in funding or the technical requirements. (This would
require Program Leader approval.)
● Control: Implement actions to minimize the impact of the risk.
● Transfer: Reassign the accountability, responsibility, and authority to another
organization or stakeholder that is capable of accepting and managing the risk.
● Watch/Monitor: Continuously monitor the environment for adjustments or
changes that would directly influence the risks that have been identified.

EVM - Earned Value Management:
Per the Government Cost Estimating and Assessment Guide, we will use Earned Value
Management (EVM) to reserve a budget for identifying and remediating risks.
This tool will be used not only to evaluate the risks identified but also to predict any
potential risks that may emerge and to associate the cost related to those risks. Including
EVM in the risk management process will provide us with a view for resource and technical
capability planning around risk remediation. We can include participation from the
Supply Chain Organization teams, the Government, and the Fusion Engine team to ensure that
these systems work together to better communicate and manage the risks as they
are identified and discovered. Involving all cross-functional teams will facilitate
consistent mitigation of the risks in a timely period. This will permit the planning
manager to use the Cost/Schedule Control Systems Criteria (C/SCSC) to receive data
on the status of costs, schedule, and technical achievements, which will permit tighter
management of the budget for risk management.

Figure 2: GAO Cost Estimating and Assessment Guide (GAO-09-3SP, 2009).
