Monday, 28 January 2019

How to install Ubuntu updates via command line

Execute the following commands in bash:

sudo apt-get update        # Fetches the list of available updates
sudo apt-get upgrade       # Upgrades the currently installed packages
sudo apt-get dist-upgrade  # Installs updates, adding or removing packages if needed
sudo apt-get autoremove    # Removes packages that were installed automatically and are no longer needed



Have a good time...

apt-get documentation

Bye Folks

Saturday, 30 June 2018

Protocol violations: JDK 7-based projects and public Nexus indexes

On a very hot day in June, a colleague of mine told me:

"Marco, I added some new dependencies to the core project"

I thought, "Here it is! It won't compile at the first attempt", I joked, BUT

After a Git pull, the core project didn't compile.

The error was "failing in dependency download -> protocol violation"

Many public repositories changed their HTTPS configuration to support only TLS 1.2.

We use:

NetBeans 8.1 powered by Oracle JDK 1.7 as our development environment

The source of the problem was clear: the Oracle JDK doesn't support TLS 1.2 easily. My advice is to read the Oracle blogs that introduce the topic.
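To see the gap concretely, here is a small sketch (the class name is my own; it runs offline and only inspects the default SSL socket configuration, it doesn't contact any repository):

```java
import java.util.Arrays;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class Tls12Check {
    public static void main(String[] args) throws Exception {
        // Create an unconnected SSL socket just to inspect the protocol configuration.
        SSLSocket socket = (SSLSocket) SSLSocketFactory.getDefault().createSocket();
        // On JDK 7 this list contains TLSv1 but not TLSv1.2;
        // on JDK 8 TLSv1.2 is enabled by default.
        System.out.println("Enabled by default: " + Arrays.toString(socket.getEnabledProtocols()));
        // TLSv1.2 is implemented in JDK 7, it is just not enabled for clients: opt in explicitly.
        socket.setEnabledProtocols(new String[] { "TLSv1.2" });
        System.out.println("After opt-in: " + Arrays.toString(socket.getEnabledProtocols()));
        socket.close();
    }
}
```

A common workaround without changing the JDK is to start the JVM (for example the one running Maven) with -Dhttps.protocols=TLSv1.2, which makes HttpsURLConnection-based clients use TLS 1.2.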

The resolution was to change the JDK used to start NetBeans. In our scenario the change was from 1.7 to 1.8.

To change the JDK, I went to the directory:

<netbeans-inst-dir>/etc/

and edited the file netbeans.conf, changing the property netbeans_jdkhome.
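For example, the relevant line looks like this (the JDK path below is only an illustration; use your own install location):

```shell
# <netbeans-inst-dir>/etc/netbeans.conf
# Point NetBeans at a JDK 8 installation (example path)
netbeans_jdkhome="/usr/lib/jvm/java-8-oracle"
```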

If you have problems with the JDK configuration of your projects, my advice is to add the old JDK as a platform (not the default one) and configure it for every project in which the old JDK is mandatory.

Bye Folks,

Friday, 29 September 2017

Oracle JDK 9 and Eclipse Oxygen 4.7

Oh Shit!!!

WTF?

I downloaded JDK 9... hey guys, we're talking about a 350 MB installer... ok, not a problem, a 1 Gb bandwidth doesn't suffer for a few MBs.

The installation is the typical raw Oracle package, JDK and JRE (I want to analyze the Server JRE option from Oracle: the JRE plus the typical JDK tools for monitoring and other features).

My idea was to analyze some features of the latest version of the Eclipse IDE. Currently I develop with Eclipse Neon 3 and OpenJDK 8.

I downloaded the Eclipse Oxygen installer and started it. In this case the idea behind the installer is a package selector. Helpful? I don't know, I prefer the zip archive (vintage lover? maybe!!)

The first installation went down... Pay attention: update the installer first. The installer upgrade fixes the repository references.

The second attempt was good.

I launched Eclipse and the result was ugly... NoClassFoundException ... I thought about the new module loader of Java 9.

I found a nice article from the Eclipse Foundation with the resolution of the problem:


  • add the parameter --add-modules=ALL-SYSTEM to eclipse.ini
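For reference, a sketch of where the flag lives in eclipse.ini (only the relevant tail of the file is shown; everything after -vmargs is passed to the JVM, so the flag must appear below it):

```ini
-vmargs
--add-modules=ALL-SYSTEM
```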


Finally, Eclipse IDE was opened in front of my eyes.

My evening ended with this error:

"Target is not a JDK root. System library was not found"

Eclipse Oxygen doesn't support the JDK 9 installation for "JRE Environments".

I'll search for a solution to fix this.

In the past, I used NetBeans IDE to try the new features of JDK 8 in the same situation: Eclipse didn't support the new Java construct for lambda expressions.

Bye all, folks


EDIT: Folks, follow this guide to start your Java 9 training using Eclipse Oxygen.

Wednesday, 17 May 2017

My first experience with JUnit5 - Environment


I'm a TestNG lover. I think JUnit is too limited for a complete customization of the test-running phase. I read many articles on JUnit 5 and its new features.

JUnit 5 = JUnit Platform + JUnit Jupiter + JUnit Vintage

  • The JUnit Platform serves as a foundation for launching testing frameworks on the JVM. It also defines the TestEngine API for developing a testing framework that runs on the platform. Furthermore, the platform provides a Console Launcher to launch the platform from the command line and build plugins for Gradle and Maven as well as a JUnit 4 based Runner for running any TestEngine on the platform.

  • JUnit Jupiter is the combination of the new programming model and extension model for writing tests and extensions in JUnit 5. The Jupiter sub-project provides a TestEngine for running Jupiter based tests on the platform.

  • JUnit Vintage provides a TestEngine for running JUnit 3 and JUnit 4 based tests on the platform.

I appreciated the attempt to improve the JUnit framework significantly while reducing the impact on as-is test suites based on JUnit 3 or 4.
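As a taste of the Jupiter programming model, here is a minimal test class (the class, method names and assertions are my own illustration; it assumes the junit-jupiter-api artifact is on the classpath):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;

@DisplayName("A first Jupiter test")
class FirstJupiterTest {

    @Test
    @DisplayName("integer division truncates toward zero")
    void integerDivision() {
        assertEquals(2, 7 / 3);
    }

    @Test
    @DisplayName("integer division by zero throws")
    void divisionByZero() {
        // Jupiter's assertThrows replaces JUnit 4's @Test(expected = ...)
        assertThrows(ArithmeticException.class, () -> {
            int unused = 7 / 0;
        });
    }
}
```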

Some materials to approach the new paradigm:




I found this GitHub project interesting as a starting point with a valid configuration for Maven (a Gradle version is available):


The junit5-maven-consumer project demonstrates how to execute tests based on JUnit 5 milestones using Maven. In addition, it showcases that existing JUnit 4 based tests can be executed in the same test suite as JUnit 5 based tests or any other tests supported on the JUnit Platform.
This sample project does not aim to demonstrate how to use the JUnit Jupiter APIs; for detailed information on the JUnit Jupiter programming and extension models, see the JUnit 5 User Guide.

Please note that this project uses the Maven Wrapper. Thus, to ensure that the correct version of Maven is used, invoke mvnw instead of mvn.

I think a simple JUnit Jupiter extension that integrates Mockito into JUnit Jupiter tests can be very helpful.
The MockitoExtension showcases the TestInstancePostProcessor and ParameterResolver extension APIs of JUnit Jupiter by providing dependency injection support at the field level and at the method-parameter level via Mockito 2.x's @Mock annotation. See here:


Mockito is a mocking framework that tastes really good. It lets you write beautiful tests with a clean & simple API. Mockito doesn’t give you hangover because the tests are very readable and they produce clean verification errors.

If you have read the second link I reported (testing with Spring), you can probably see that Mockito and some of Spring's test features overlap.

For my first test I used 'junit5-maven-consumer', which implies the use of the Maven Surefire plugin.
I add two details of my environment:

  1. JUnit 5 needs JDK 8
  2. Apache Maven 3.3.9

This is the base of my environment for unit tests.

Tuesday, 11 April 2017

Move from SVN local repository to GitLab.com


I had a big problem. I was working with a Subversion repository, but I'd prefer a gitflow policy to manage the source code repository.

My initial idea was to migrate to Atlassian BitBucket (http://www.bitbucket.org/). I had used both versions in the past: free and paid (team group). It's a great service, especially when BitBucket is integrated with the Atlassian suite (JIRA, Confluence and so on, https://www.atlassian.com/software).

After a period of evaluation I discovered GitLab.com. This platform offers many versions of the solution, in the cloud or on premise.

I adopted the free plan on cloud (https://about.gitlab.com/gitlab-com/):

Advantages

  • Unlimited repositories
  • Unlimited private collaborators
  • 10GB disk space per project (including Git LFS and artifacts)
  • Completely free, no credit card required
  • Unlimited CI Runners to do parallel testing
  • Alternate SSH port for git+ssh (443)

Features

  • Unlimited public and private repos
    • Create a new repo for even the smallest projects.
  • Project importing
    • Import existing projects from GitHub, BitBucket, Google Code, Fogbugz, or any git repo with a URL.
  • Protected branches
    • Control read/write permissions to specific branches.
  • Wiki
    • Keep your documentation within the project using GitLab’s built-in wiki system.
  • Code Snippets
    • Collect and share reusable code.
  • Powerful APIs
    • Control GitLab with a set of powerful APIs.
  • Issue and MR Templates
    • Create templates for issues and merge requests.
  • Milestones
    • Organize issues and merge requests into groups.
  • Labels
    • Categorize and track issues or merge requests based on descriptive titles.
  • External Services
    • Integrate with 24 external services (such as JIRA, Slack, Asana and more) to track progress across all the tools your team uses.
  • Due dates
    • Assign due dates to issues to make sure things get done on time.
  • Powerful search
    • Spend less time searching and more time building software.


I appreciated the wiki, the issues, the Slack integration and the repository import from BitBucket.

I started using the free plan. I created a group of projects and imported my projects from SVN.

Importing a repository is easy.

Here is the first problem with GitLab: there isn't a GUI available to manage the resources easily on your computer; some operations are only available via web browser at gitlab.com. Clearly, for Git lovers this is an opportunity to use Git Bash.

The steps to import a project follow:

  1. Create a group via the gitlab.com console (optional)
  2. Create a project under the group and save the repository reference, something like this: https://marco.genova@gitlab.com/my-group/super-project.git
  3. Locally, check out from SVN the branch of the project to migrate; I used TortoiseSVN (https://tortoisesvn.net/)
  4. Open Git Bash
  5. Reach the target directory of the checkout operation
  6. Launch the command:
    • git svn clone svn://myhost/svn/repos/project/Branches/the/last/branch
  7. Launch the command to expand the buffer of the HTTP connection:
    • git config http.postBuffer 209715200
  8. Add the origin to the project via the command:
    • git remote add origin https://marco.genova@gitlab.com/my-group/super-project.git
  9. Push to the server:
    • git push -u origin --all
  10. Exit Git Bash

After these steps, I created the development branch via web browser at gitlab.com.

A detail on the SSH connection: I already had all the configuration for GitHub and an SSH key pair, so the operation was completed in 30 seconds (... to Mars). The GitLab guide is easy to follow: https://docs.gitlab.com/ce/ssh/README.html#ssh

... For Raspberry Pi lovers, you can create a cluster of Pis with Swarm and deploy GitLab with a Docker approach. I want to spend some time implementing this solution.

That's all Folks..

Please comment if you have some tips or tricks on this approach.

Tuesday, 28 February 2017

One Time Action token-based - First Part


Hi Folks,
I have a new problem.

Currently, I'm managing a service to submit purchase orders (OMS - order management service).

Many 3rd-party clients submit requests to the OMS; in the last months I noticed many double requests (two requests with the same characteristics) from some types of client.
This dynamic creates some problems for the back-office operations because many of these requests aren't correct requests.

My idea is to add a token to the order submit request in order to:

  1. Track the request
  2. Block erroneous requests

The idea is based on the concept of an OTP (One-Time Password): https://en.wikipedia.org/wiki/One-time_password and on creating a Token Lifecycle Management Service (TLCMS).

Unfortunately, I don't use an API gateway (e.g., with a bearer token) and the service uses a stateful approach, so I need to build all the elements to meet the requirement.

First step: Token Generation

I like something like the Firebase token generator: https://github.com/firebase/firebase-token-generator-java

Adding an encryption phase based on DES or another algorithm: it's easy to find some resources:


The flow is this:

  • TLCMS receives a request from a client; the request can contain some information on the order, or other information related to the request, to protect with the token
  • TLCMS generates a JSON payload from that information
  • TLCMS serializes the JSON payload into a string
  • TLCMS encrypts the string
  • TLCMS applies a Base64 encoder to the string
  • TLCMS returns the string as a token to the client
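A minimal sketch of the generation flow above (the class name and payload fields are my own illustrations, and AES is used instead of DES for the example; none of this is the real TLCMS implementation):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class TokenSketch {

    // Encrypts the serialized JSON payload and Base64-encodes it into a token.
    public static String createToken(String jsonPayload, SecretKey key) throws Exception {
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        byte[] encrypted = cipher.doFinal(jsonPayload.getBytes(StandardCharsets.UTF_8));
        // URL-safe Base64, so the token can travel in an HTTP header
        return Base64.getUrlEncoder().encodeToString(encrypted);
    }

    public static void main(String[] args) throws Exception {
        // In a real TLCMS the key would be managed and shared, not generated per run
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        String payload = "{\"orderId\":\"A-42\",\"client\":\"shop-3\"}";
        System.out.println(createToken(payload, key));
    }
}
```

Validation reverses the steps: Base64-decode the header value, decrypt it with the same key, and compare the payload with the stored request.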

This token will be used in the header of http request to enable the action execution.


See you on...

Friday, 30 December 2016

Web Monitoring Mechanism


In recent years it has become normal to use JavaScript frameworks to test the JavaScript components of HTML pages, or to use a framework such as Selenium to implement regression tests directly on a remote website. I have a problem today: I want to avoid non-performing services, so I need to examine a remote website and understand the performance and the potential non-blocking bugs of a retail portal (such as an e-commerce site). I need a Web Monitoring Mechanism.

These are some of the main goals of a good service.

My idea is to use PhantomJS to automate the tests for some UI functionalities. PhantomJS is a headless WebKit, scriptable with a JavaScript API. It has fast and native support for various web standards: DOM handling, CSS selectors, JSON, Canvas, and SVG.

PhantomJS is an optimal solution for:

  • Headless Website Testing, run functional tests with frameworks such as Jasmine, QUnit, Mocha, Capybara, WebDriver, and many others. 
  • Programmatically capture web contents: including SVG and Canvas. Create web site screenshots with thumbnail preview 
  • Page Automation, Access and manipulate webpages with the standard DOM API, or with usual libraries like jQuery 
  • Network Monitoring, Monitor page loading and export as standard HAR files. Automate performance analysis using YSlow and Jenkins

A good starting point is this: http://phantomjs.org/ :)

Page Automation and Network Monitoring are my features. Pay attention: for a good tool to monitor an infrastructure, my advice is to introduce Nagios components (https://www.nagios.org/); the scalability of this solution is the best, BUT you need to study many aspects in order not to create damage on your infrastructure.

My idea is much simpler than Nagios, but its effectiveness is very good. I want to create test cases with Jenkins and PhantomJS in order to test the UI functionalities.

I won't use YSlow. The latter analyzes web pages and why they're slow based on Yahoo!'s rules for high-performance web sites.
YSlow for PhantomJS also introduces new output formats for automated test frameworks: TAP (Test Anything Protocol) and JUnit. With this feature, YSlow can be added to your continuous integration pipeline as a performance test suite, preventing performance regressions from being pushed to production. (guide for this approach: http://yslow.org/phantomjs/)

The stack is quite different in my case.

My initial idea was to use the stack: PhantomJS, Jasmine (or Karma) and Jenkins.

I've found this guide and I implemented the solution.

In my tests I'm going to use the new Jenkins UI, Blue Ocean (beta). The installation of Jenkins is easy and totally web-based.

To start using Blue Ocean:

  1. Login to your Jenkins server
  2. Click Manage Jenkins in the sidebar then Manage Plugins
  3. Choose the Available tab and use the search bar to find BlueOcean beta
  4. Click the checkbox in the Install column
  5. Click either Install without restart or Download now and install after restart
  6. When installation is complete click the Use Blue Ocean button in the classic UI

Jenkins already works perfectly with the standard plugins.

PhantomJS itself is not a test framework, it is only used to launch the tests via a suitable test runner.
Jasmine is a Behavior Driven Development testing framework for JavaScript. It does not rely on browsers, DOM, or any JavaScript framework. Thus it's suited for websites, Node.js projects, or anywhere that JavaScript can run. Documentation & guides live here: http://jasmine.github.io For a quick start guide of Jasmine 2.0, see the beginning of http://jasmine.github.io/2.0/introduction.html

In order to integrate PhantomJS and Jasmine I tried to use Phantom-Jasmine, a simple set of two scripts for running your Jasmine tests via PhantomJS. The first script, lib/console-runner.js, is a plugin for Jasmine that outputs test results (with ANSI color codes) via console.log (included with a script tag inside TestRunner.html). The second script, lib/run_jasmine_test.coffee, takes an HTML file as its first and only argument and then executes any Jasmine tests that file loads. See https://github.com/jcarver989/phantom-jasmine for more details (this is so-so... too old).

The test implementation is easy, so I tried to follow this guide: http://www.slideshare.net/WapAdmin/drupalcon-2013 . The limit in this case was executing the tests directly in the staging environment, practically on a remote site.

With Jasmine I have to modify my HTML files, adding something like this:

<link rel="shortcut icon" type="image/png" href="jasmine/lib/jasmine-2.0.0/jasmine_favicon.png">
<link rel="stylesheet" type="text/css" href="jasmine/lib/jasmine-2.0.0/jasmine.css">
<script type="text/javascript" src="jasmine/lib/jasmine-2.0.0/jasmine.js"></script>
<script type="text/javascript" src="jasmine/lib/jasmine-2.0.0/jasmine-html.js"></script>
<script type="text/javascript" src="jasmine/lib/jasmine-2.0.0/boot.js"></script>


In the past I used Jasmine or Karma (for AngularJS components) to test JavaScript components; in that case I used a custom page to publish a component and to test it directly in this custom page. The approach is the same, but the difference is the browser used for the test session, in this case a headless browser such as PhantomJS.

I tried the Karma runner (https://karma-runner.github.io/1.0/index.html) and it works the same way; I reused the Jasmine tests using the Jasmine plugin for Karma (https://github.com/karma-runner/karma-jasmine), and I followed this guide to integrate it in Jenkins: https://karma-runner.github.io/0.8/plus/Jenkins-CI.html

BUT

I want to monitor my remote application; it's not only a unit test. While I was reading the PhantomJS documentation, I found CasperJS. CasperJS is a navigation scripting & testing utility for the PhantomJS (WebKit) and SlimerJS (Gecko) headless browsers, written in JavaScript. (See http://casperjs.org/)

It's very easy to install, with only one command:

  • npm install casperjs -g

The -g parameter installs it globally; for my purpose this is a good solution.

The tests aren't compatible with the Jasmine format, but I can reuse some parts; my advice is to read this related information:


My simple login test is available here: https://github.com/MarcoGenova/WebSiteMonitor


casper.test.begin('Testing Login on Backoffice', 4, function (test) {
    casper.start('http://localhost:8080/myApp');

    casper.then(function () {
        // We expect to be redirected to the login page
        test.assertUrlMatch('http://localhost:8080/myApp/login.jsp', 'redirected to the login page');
        test.assertTitle('Login', 'The web site has the correct title');
    });

    casper.wait(1500, function () {
        this.sendKeys('input[name="j_username"]', 'admin');
        this.sendKeys('input[name="j_password"]', 'adminpwd');
    });

    casper.then(function () {
        this.click('.isubmit');
    });

    casper.wait(3000, function () {
        test.assertHttpStatus(200);
        test.assertTitle('Logged Admin', 'The admin page has the correct title');
    });

    casper.run(function () {
        test.done();
    });
});

Pay attention to this: ('Testing Login on Backoffice', 4

4 is the number of assertions expected in this test suite; if the number differs from the real asserts, you'll experience a "dubious test" and a test-suite failure!!

In my scenario it is important to publish the test results in the xUnit format in order to publish them automatically on Jenkins CI. CasperJS supports log generation via the parameter --xunit=<filename>; an example of execution is this:

>casperjs.exe test test-suites.js --xunit=log.xml

The last step is to configure a new job on Jenkins; the sub-steps are these:

  1. Configure the connection to GitHub, following this guide: https://gist.github.com/misterbrownlee/3708738
  2. Create a "Generic Build Job"
  3. Add a nice name for the job
  4. Add the reference to the project on GitHub (another repository can be used)
  5. Enable the option "Delete workspace before build starts"
  6. Add a "Windows batch execute" build step (in my case Windows 7)
  7. Add the code "casperjs.exe test test-suites.js --xunit=xunit.xml"
  8. Add post-build actions:
    1. Add JUnit and fill the pattern field: "*.xml" (a relative directory IS MANDATORY); without this the next step won't work
    2. Publish the xUnit test results after the build
    3. Add an email notification for every broken build
  9. Save

Now you are ready to launch the build...

Sorry... You need to configure the mail server connection before launching... You can follow this guide: https://www.safaribooksonline.com/library/view/jenkins-the-definitive/9781449311155/ch04s08.html


The result is approved for my goals; probably I need to tune the e-mail notification configuration and discriminate jobs better in order to parallelize the process in the future. Pipelines will help me for this purpose.

I hope to produce new material on this topic in the next months.

Give me some tips if you read this post :)

Bye