Tuesday, June 9, 2020

Cloud Native Application Builder

This is my new Open Source project, aiming to manage cloud native applications in a more efficient way, with an experience similar to working on a monolithic application.



The project is designed to allow adding code and deployment generators per technology, so more components can be added easily over time.
The code generation will be enhanced to include draft implementations, and in the future the tool will commit the generated code into a git repository.
Currently the project has initial support for Quarkus, Spring Boot and Java, in addition to generating deployment commands for Tomcat, JBoss, MySQL, MongoDB, MariaDB, Redis and PostgreSQL. It is still in an early development phase, so expect some changes and improvements over time.





Project GitHub Location:



You can import this maven project into an IDE and execute it, or run the maven commands:

mvn clean install compile package
And then deploy it into any application server such as Glassfish, Tomcat, JBoss, WebLogic, WebSphere, etc.

You can also deploy it directly into OpenShift using S2I:

oc new-app --name=my-designer jboss-webserver30-tomcat8-openshift:1.3~https://github.com/osa-ora/app-designer  
And then expose a router to the application:
oc expose service my-designer
Access the application using {route_url}/AppBuilder-1.0-SNAPSHOT/

There is a "sample.json" file in the GIT repository in the "samples" folder where you can load it or create your own application stack (it load and save applications in JSON format).

Disclaimer: This tool is still under development, anyone willing to join and work on it, feel free to contact me :)



Sunday, January 19, 2020

CI/CD in OpenShift using Jenkins

One of the interesting parts of OpenShift is its CI/CD support, which provides built-in capabilities using an embedded Jenkins. There are 2 ways to build CI/CD using Jenkins: either use Jenkins directly and configure everything yourself, or use the OpenShift Pipeline feature, which uses Jenkins behind the scenes. We will demonstrate both in this post.

In this post, I will go through this using GitHub + OpenShift + Jenkins.

Pre-requisites:

You have deployed your application and have build configurations for that application already configured in OpenShift.

Before We Start: Add Jenkins to OpenShift

From the search box, search for Jenkins and add it to your project. Go through the configuration wizard and wait for the deployment until you get the Jenkins endpoint.

Get the Jenkins URL from the Routes section:


Use Case I: Using Jenkins Directly

1) Configure Jenkins

The Jenkins instance that comes with OpenShift has a built-in plugin for interacting with OpenShift. There is a sample project which you can either use or create a new one from; simply rename the sample project to your project name, and let's configure it step by step.

a) General Tab
Add a project description and maybe the GitHub URL (just some information about the project)



b) Source Code Management
Add your project's source code repository. In my case I added https://github.com/osa-ora/AcmeInternetBankingApp as my own repository, and I use the master branch for all the builds in my sample project.



c) Build Triggers
Check "Poll SCM" and configure it with a schedule like */2 * * * *, which means: every 2 minutes, check for any new commits on the configured branch of the configured GitHub repository.
This is similar to cron job configurations: MINUTE HOUR DOM MONTH DOW
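For example, a schedule of 0 2 * * 1-5 would poll once at 2:00 AM on weekdays only.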



This is one way to achieve CI/CD from GitHub. The other way, which we are going to use, is a WebHook that pushes any changes from GitHub to Jenkins, as in steps 2 & 3 below; it uses push notifications rather than repeatedly polling GitHub for changes.

d) Build
This is the core of any Jenkins project; you need to configure every step from the build to the deployment.
In "Trigger OpenShift Build", add the project name and build name as configured in OpenShift, and check the box for getting the build output. Both the project name and the build config must match exactly what is configured in OpenShift for this application.


In the deployment step do the same; also configure the image stream name, tags, etc., and check the box for deployment validation.

You can optionally add scaling to the project as per the required number of replicas (pods).


e) Post Build Actions
You can optionally add some post-build actions, such as running a certain script, some test cases, etc.

To verify that everything is configured properly, execute "Build Now" and check the console output; fix any issues you find.
The console output should show successful build and deployment steps:


2) Enable Jenkins WebHook

Now, from the Jenkins settings (Manage Jenkins ==> Configure System),
go to the GitHub section, get the WebHook URL, and save it in a notepad file.


3) Configure GitHub WebHooks

Go to the GitHub project, open Settings, select WebHooks, and add the Jenkins hook URL with content type application/json and "send only push events" selected. Make sure it is active, then save the webhook configuration.

Now, with every new push to GitHub, the Jenkins hook URL is invoked, so Jenkins can start building the project and deploying it.

4) Test it

Now commit a new change to your project and push it to the Git repository branch that you configured in GitHub, then wait for the workflow to be invoked and the changes to be deployed.


You can see Jenkins invoked by this new commit push.
And now we have a complete cycle from pushing the code to building, deploying, and maybe testing as well.

Use Case II: Using OpenShift Pipelines


OpenShift has the capability to hide all the Jenkins configuration by using the OpenShift Pipeline feature in the Build tab.
To use it, all you need to do is structure a pipeline file such as the following:

1) Build the Pipeline File

This is a sample file that has 3 stages: one for the build, then the deployment, and a 3rd one for scaling.
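A minimal sketch of such a pipeline file, assuming a build config and a deployment config both named myapp and a target of 2 replicas (adjust these names and values to your application), could look like:

apiVersion: v1
kind: BuildConfig
metadata:
  name: sample-pipeline
spec:
  strategy:
    type: JenkinsPipeline
    jenkinsPipelineStrategy:
      jenkinsfile: |-
        node('maven') {
          stage('Build') {
            // trigger the OpenShift build config (assumed name: myapp)
            openshiftBuild(buildConfig: 'myapp', showBuildLogs: 'true')
          }
          stage('Deploy') {
            // roll out the corresponding deployment config
            openshiftDeploy(deploymentConfig: 'myapp')
          }
          stage('Scale') {
            // scale the deployment to the desired number of replicas
            openshiftScale(deploymentConfig: 'myapp', replicaCount: '2')
          }
        }

Each stage maps to a step from the OpenShift Pipeline Jenkins plugin: openshiftBuild triggers the build config, openshiftDeploy rolls out the deployment config, and openshiftScale sets the replica count.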



Note that the main components of the pipeline file are the stages, each of which has a name and one or more actions to perform.

2) Import the Pipeline File

Select Import YAML/JSON and copy and paste the file content.


Validate by invoking the pipeline to check that it progresses without any issues in your configuration.

3) Get the Pipeline WebHook URL

From inside the pipeline get the Hook URL.

4) Configure GitHub WebHook 

Similar to what we did in step 3 in the first use case above.

And now we have another complete cycle from pushing the code to building, deploying, and maybe testing as well, but using the OpenShift Pipeline feature this time.



Monday, January 13, 2020

Using Liquibase to Manage DB Changes for CI/CD

Liquibase is an open source tool for database schema change management; it helps teams track, version, and deploy database schema and logic changes.
It comes in 2 flavors: the community edition and the pro one.



In this post, we will manage local MySQL database schema changes between 2 schemas.
Source: the bankaccounts schema; target: the bankaccounts2 schema.

In one use case we will move the whole schema into the target DB, and then we will capture any changes and apply them to the target DB, so we can make sure the source and target DBs stay consistent.

1) Download the Liquibase tool:
You can download it from this page: https://download.liquibase.org/download/?frm=n
If, for example, you are using Windows, just unzip the folder and add it to the PATH environment variable.

Done? let's move to the next step.

2) Create separate folders for the source and the target; name them as you wish

3) Do initial DB configurations:

Open MySQL DB and create the following schemas:

CREATE SCHEMA `bankaccounts` ;
GRANT ALL PRIVILEGES ON *.* TO 'bankaccounts'@'localhost' IDENTIFIED BY 'bankaccounts';

CREATE SCHEMA `bankaccounts2` ;
GRANT ALL PRIVILEGES ON *.* TO 'bankaccounts2'@'localhost' IDENTIFIED BY 'bankaccounts2';

Now we have 2 identical empty schemas

4) Load DB Schema:
Load schema 1, our source schema, with some DB objects by executing this script:

CREATE TABLE `bankaccounts`.`accounts` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `account_no` VARCHAR(45) NULL,
  `balance` DOUBLE NULL,
  `currency` VARCHAR(45) NULL,
  PRIMARY KEY (`id`));

CREATE TABLE `bankaccounts`.`transactions` (
  `transaction_id` INT NOT NULL AUTO_INCREMENT,
  `account_no` VARCHAR(45) NULL,
  `transaction` DOUBLE NULL,
  `date` DATETIME NULL,
  `transaction_details` VARCHAR(45) NULL,
  PRIMARY KEY (`transaction_id`));


Now our source DB contains 2 tables.

FIRST USE CASE: INITIAL DB MIGRATION:

Now let's prepare our Liquibase tool to capture and perform the initial DB migration.
- In a directory named "source":
1- Create a configuration file named "liquibase.properties"
with the following content:

changeLogFile: ./dbchangelog.xml
driver: com.mysql.jdbc.Driver
url: jdbc:mysql://localhost:3306/bankaccounts
username: bankaccounts
password: bankaccounts
classpath: ../../../mysql-connector-java-5.1.23-bin.jar

Note: you also need to fix the MySQL connector jar location (the classpath entry) as per your environment

2- Run the Liquibase tool:
liquibase generateChangeLog

The output will be written to the dbchangelog.xml file specified in our properties file.
Here is the content I got; you should get something similar (except for the changeSet ids):
<?xml version="1.1" encoding="UTF-8" standalone="no"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog" xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext" xmlns:pro="http://www.liquibase.org/xml/ns/pro" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd http://www.liquibase.org/xml/ns/pro http://www.liquibase.org/xml/ns/pro/liquibase-pro-3.8.xsd http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
    <changeSet author="ooransa (generated)" id="1578934983284-1">
        <createTable tableName="accounts">
            <column autoIncrement="true" name="id" type="INT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="account_no" type="VARCHAR(45)"/>
            <column name="balance" type="DOUBLE"/>
            <column name="currency" type="VARCHAR(45)"/>
        </createTable>
    </changeSet>
    <changeSet author="ooransa (generated)" id="1578934983284-2">
        <createTable tableName="transactions">
            <column autoIncrement="true" name="transaction_id" type="INT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="account_no" type="VARCHAR(45)"/>
            <column name="transaction" type="DOUBLE"/>
            <column name="date" type="VARCHAR(45)"/>
            <column name="transaction_details" type="VARCHAR(45)"/>
        </createTable>
    </changeSet>
</databaseChangeLog>


As you can see, the 2 tables are captured in this file, and we can now execute it against the target DB to do a schema migration by configuring that DB in a new properties file and executing:

liquibase update

This will apply all the changes in the XML file. The file structure is simple: a collection of changeSet elements, where every changeSet captures one DB modification.
Changes are uniquely identified by the combination of the author and id fields, for example:
author="ooransa (generated)" id="1578934983284-2"

We will skip this part as it will be demonstrated with the second use case.


SECOND USE CASE: DB SCHEMA CHANGE MANAGEMENT:

Let's now prepare our target DB to receive the initial schema and any updates to it.
In the "target" folder:
1- Create a new file named "liquibase.properties"
with the following content:

changeLogFile: ./dbchangelog.xml
driver: com.mysql.jdbc.Driver
url: jdbc:mysql://localhost:3306/bankaccounts2
username: bankaccounts2
password: bankaccounts2
referenceUrl: jdbc:mysql://localhost:3306/bankaccounts
referenceUsername: bankaccounts
referencePassword: bankaccounts
referenceDriver: com.mysql.jdbc.Driver
classpath: ../../../mysql-connector-java-5.1.23-bin.jar

Note: you also need to fix the MySQL connector jar location (the classpath entry) as per your environment

As you can see, this time we configured both the source (reference) DB and the target DB in this configuration.

2- Execute the command:
liquibase diffChangeLog

This will capture the difference between our empty schema and the source/reference schema, producing a file similar to the previous XML file.

Now apply these changes into the DB using the command:

liquibase update

That's it; that's all we need to do. Whenever new changes happen to the source DB, execute the diffChangeLog command and then the update command, nothing more :)

Note that a change log table named databasechangelog will also be created in our target DB, plus a lock table (databasechangeloglock) to make sure only one Liquibase command executes at a time.

Now we have done the initial DB migration from the source DB to the target DB.

Let's try this sample modification:

- Create a new table in your source DB "bankaccounts" for example:

CREATE TABLE `bankaccounts`.`test_change` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(45) NULL,
  PRIMARY KEY (`id`));

- Capture the new changes using the command: (while still in the target folder)

liquibase diffChangeLog

The new modification will be appended to our initial dbchangelog.xml, so you will see a new entry:

<changeSet author="ooransa (generated)" id="1578935272217-1">
        <createTable tableName="test_change">
            <column autoIncrement="true" name="id" type="INT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="name" type="VARCHAR(45)"/>
        </createTable>
    </changeSet>

- Apply the changes to our target DB by executing the command: (while still in the target folder)

liquibase update

Now you can see the new table created in the target DB.



Drop the table from the source DB and re-execute both commands:

liquibase diffChangeLog

You can see the change appended to the file:

<changeSet author="ooransa (generated)" id="1578935549140-1">
        <dropTable tableName="test_change"/>
    </changeSet>


Then apply it using:

liquibase update

So whatever changes you make in the source, you capture them using diffChangeLog and apply them using update; in the end, all the changes you made to your DB are persisted in this single DB change log file.


You can also roll back changes by using the rollback option.
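For example, assuming the same liquibase.properties is in place, rolling back the most recent changeSet looks like:

liquibase rollbackCount 1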

For more information: visit: https://www.liquibase.org/index.html

Using Maven Plugin


It is better to use the maven plugin and generate the DB change log as part of your build and package process. In the source folder, create a new folder named "maven", and in this folder create a "pom.xml" file as follows:

1) Add the plugin to the maven pom file, as simply as:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>osa.ora</groupId>
<artifactId>BankingServiceDB</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>BankingServiceDB</name>
<dependencies>
<dependency>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.8.5</version>
</dependency> 
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.23</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.8.5</version>
<configuration>       
                    <outputChangeLogFile>./dbchangelog.xml</outputChangeLogFile>
<driver>com.mysql.jdbc.Driver</driver>
<url>jdbc:mysql://localhost:3306/bankaccounts</url>
<username>BankAccounts</username>
<password>BankAccounts</password>
</configuration>                
</plugin> 
</plugins>
</build>
</project>

This also includes the MySQL DB driver as a dependency.

2) The configuration is now part of our maven file, but in case you need to keep it outside, in the liquibase.properties property file, you need to modify the configuration section in the POM file as follows:
<configuration>
    <propertyFile>liquibase.properties</propertyFile>
</configuration>

This points to a "liquibase.properties" configuration file containing the following items:

outputChangeLogFile: ./dbchangelog.xml
driver: com.mysql.jdbc.Driver
url: jdbc:mysql://localhost:3306/bankaccounts
username: bankaccounts
password: bankaccounts

Note that you don't need to specify the jar location for MySQL connector jar, it will be added to the classpath by maven.

3) Execute it using maven:

mvn liquibase:generateChangeLog

Or execute any of the other liquibase command options!
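For example, assuming a changeLogFile entry is added to the plugin configuration, applying the changelog uses the plugin's update goal:

mvn liquibase:update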




Let's now try the other way around: modify the POM file to use the properties file:

<configuration>
    <propertyFile>liquibase.properties</propertyFile>
</configuration>

And configure the properties file as follows:

diffChangeLogFile: ./dbchangelog_diff.xml
changeLogFile: ./dbchangelog.xml
driver: com.mysql.jdbc.Driver
url: jdbc:mysql://localhost:3306/bankaccounts2
username: bankaccounts2
password: bankaccounts2
referenceUrl: jdbc:mysql://localhost:3306/bankaccounts
referenceUsername: bankaccounts
referencePassword: bankaccounts
referenceDriver: com.mysql.jdbc.Driver

Now run the diff command to get the delta between the 2 databases:

mvn liquibase:diff


Make some changes to the source schema, then re-run the command and check the output file.

For the rollback details, I suggest reading this nice article:

https://www.baeldung.com/liquibase-rollback



Wednesday, December 18, 2019

PCI DSS - Single Slide

The PCI Data Security Standard specifies twelve requirements for compliance. I created the following slide to summarize these requirements.




For more details:
https://www.pcisecuritystandards.org/document_library?category=pcidss&document=pci_dss

Monday, December 9, 2019

Twelve-Factors Application - Single Slide

I made this one slide to summarize all twelve factors:



Review these factors on the site https://12factor.net/ and then memorize them using this single slide.

Wednesday, September 18, 2019

Use Google Authenticator for DFA

Double Factor Authentication (DFA) is one of the best methods to prevent brute force attacks or password hijacking. It allows the end user to authenticate and log in to a system with two factors, i.e., a password and an OTP (One Time Password).

There are many ways to generate the one-time password. The easiest is to send a random number to a pre-defined user mobile phone number, so the user can use this random number to log in to the system.
The random number must be generated in a way that guarantees no one can guess it and bypass this security measure.

Another simple way is to use a known algorithm that generates a unique number based on some calculation; the user enters this number as the OTP while logging in to the system. The user can have a hardware device that runs this algorithm, or a mobile app such as Google Authenticator.

Google Authenticator uses the Time-based One-Time Password algorithm (TOTP), an extension of the HMAC-based One-Time Password algorithm (HOTP) that generates a one-time password by taking its uniqueness from the current time instead of a counter.
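To make the algorithm concrete, here is a minimal Java sketch of RFC 6238 TOTP generation with a 30-second time step and HMAC-SHA1 (the defaults Google Authenticator uses); the demo secret is an illustrative placeholder, and a real deployment would share the per-user secret with the app Base32-encoded:

import java.nio.ByteBuffer;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class TotpSketch {

    // Generates a 6-digit TOTP code from the raw secret bytes and a Unix time,
    // following RFC 6238 (TOTP) on top of RFC 4226 (HOTP).
    public static int generateTotp(byte[] secret, long epochSeconds) throws Exception {
        long counter = epochSeconds / 30;  // 30-second time window
        byte[] counterBytes = ByteBuffer.allocate(8).putLong(counter).array();  // big-endian

        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret, "HmacSHA1"));
        byte[] hash = mac.doFinal(counterBytes);

        // Dynamic truncation as defined in RFC 4226
        int offset = hash[hash.length - 1] & 0x0F;
        int binary = ((hash[offset] & 0x7F) << 24)
                   | ((hash[offset + 1] & 0xFF) << 16)
                   | ((hash[offset + 2] & 0xFF) << 8)
                   |  (hash[offset + 3] & 0xFF);
        return binary % 1_000_000;  // keep the last 6 digits
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "12345678901234567890".getBytes();  // demo secret only
        System.out.printf("OTP: %06d%n", generateTotp(secret, System.currentTimeMillis() / 1000L));
    }
}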


In order to configure your web site to work with Google Authenticator, you need to modify your login screens and write some code that enables DFA, as follows:

Enablement steps

1. Workflow Changes:

- The user goes to profile management and requests enabling double factor authentication (DFA) using OTP; 2 options can be implemented easily:
                          1. Google Authenticator
                          2. SMS
- SMS should only be enabled if we have an SMS gateway configured in the system, so in this post we will discuss only Google Authenticator:

- The user selects one method and is routed to a new page, which will contain:

[1] Links to the mobile apps on Android & iOS; here are the links:
Google Authenticator App (Apple iOS)
Google Authenticator App (Google Android)



[2] A QR code generated using either:
the Google public chart service (an img tag whose src is the URL described below), or
Java backend libraries, which is the recommended way.


  •  The URL for the Google public service looks like: https://chart.googleapis.com/chart?chs=200x200&cht=qr&chld=M|0&chl=otpauth://totp/osa@myCompany.com%3Fsecret%3DJBSWY3DPEHPK3PXP
  • The account label follows loginName@companyName.com, e.g., osama_oransa@myCompany.com
  • JBSWY3DPEHPK3PXP is the encoded secret code auto-generated for this user, based on his profile (e.g., name + timestamp) or any random key
  • This URL will display an image that contains the QR code, or the QR code can be generated by the Java backend (see the Java sketch after this list).




[3] A statement displayed to the user: "Please scan this QR code with the Google Authenticator App"
[4] An Input field for OTP number
[5] A "Validate" button

Once the user scans the QR code, the Google Authenticator app will display an OTP; the user will enter it and click on the validate button. If it successfully matches the value that we generate in the backend using the secret code, DFA will be enabled for the user; otherwise an "OTP not matched" error will be displayed.

After 3 failed trials the user will be routed back to the profile management page with an error message.
Once DFA is enabled, the user profile page will have a "disable DFA" button instead of "enable DFA" (from a security perspective, it is better to have this as "re-configure DFA")

Note: For better security, some systems do not allow the user to disable DFA once it is enabled.

2. Login Changes

- Once the user enters the correct username and password, a new page will be displayed.
- The page will show a text field and a validate button, and will request the user to enter the one-time password.



- Once the user enters the value and clicks the validate button, the backend will calculate the expected value (based on the user's secret code) and compare it to the user input.

- If successfully matched, the user will be directed to the home page; otherwise he will be given 3 trials, after which he will be logged out and it will be counted as one invalid login trial.
- Note: if the user exceeds the max invalid login trials, the account should be locked.
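As a sketch of this backend comparison, reusing the generateTotp() helper from the TOTP sketch earlier, the server might check the current 30-second window plus one window on each side to tolerate small clock drift between the server and the phone (the drift window size is an assumption; tune it to your security needs):

// Assumes the generateTotp(byte[], long) helper sketched earlier in this post.
public static boolean validateOtp(byte[] secret, int userInput) throws Exception {
    long now = System.currentTimeMillis() / 1000L;
    for (int drift = -1; drift <= 1; drift++) {
        // compare against the previous, current, and next time windows
        if (generateTotp(secret, now + drift * 30L) == userInput) {
            return true;
        }
    }
    return false;
}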


3. Implementation Details

- DB Changes: 

  • 2 additional DB fields in the user/profile table (see the SQL sketch after this list): 
    • OTP_ENABLED (default No) 
    • OTP_SECRET_CODE (encrypted, default null)

- User interface changes: 

  • New pages as per the flow above, for both DFA enablement and the post-successful-login step.
  • Add a button for enabling DFA in the profile management page when DFA is not enabled, and a button for disabling DFA when it is enabled.
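For the DB changes above, a hedged MySQL sketch (the users table name is an assumption; adapt it to your schema) could be:

-- Add the two DFA columns to the existing user/profile table (assumed name: users)
ALTER TABLE `users`
  ADD COLUMN `OTP_ENABLED` TINYINT(1) NOT NULL DEFAULT 0,
  ADD COLUMN `OTP_SECRET_CODE` VARCHAR(255) NULL DEFAULT NULL;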



4. Implementation Reference

All the logic will be in the Java backend, which will generate the secret code, do the calculation, generate the QR code, and validate the OTP, as per the Java reference below.
The JavaScript library is listed here as it is the best reference for understanding what is required for the implementation.
- In JavaScript: can be tested easily in the browser with all code libraries:


- In Java:


- Reference for other languages: