Kohei Nozaki's blog 

Building environment specific artifacts with classifier


Posted on Friday Mar 20, 2015 at 03:09PM in Maven


Consider the following multi-module Maven project:

  • classifier: The parent & aggregator project

  • persistence: A jar project which holds JPA entities and a persistence descriptor (persistence.xml)

  • web: A war project which depends on the persistence project

The persistence project contains the following persistence descriptor:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
    <persistence-unit name="myPU">
        <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action"
                      value="${javax.persistence.schema-generation.database.action}"/>
        </properties>
    </persistence-unit>
</persistence>

I need to set the javax.persistence.schema-generation.database.action property to drop-and-create for the development environment, but to none for the production environment. Using profiles and filtering might be a solution, but its shortcoming is that we can hold only one of the environment-specific builds in the local repository at a time, because those builds share the same coordinates. That may bring unexpected results, such as deploying a development build to the production environment by accident. In such a case, using a classifier is a better solution.
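For illustration, with profiles alone both builds would be installed to exactly the same path in the local repository, so whichever build runs last silently overwrites the other, whereas a classifier gives the production build its own file name (these paths are taken from the install log shown later in this post):

~/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT.jar
~/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT-prod.jar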

Preparation

First, let’s enable resource filtering in the persistence project.

<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <filtering>true</filtering>
        </resource>
    </resources>

    <plugins>
        <plugin>
            <artifactId>maven-resources-plugin</artifactId>
            <executions>
                <!-- Filter and copy resources under src/main/resources into target/classes (default location) -->
                <execution>
                    <id>default-resources</id>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>resources</goal>
                    </goals>
                    <configuration>
                        <filters>
                            <filter>${basedir}/filters/dev.properties</filter>
                        </filters>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Then put filters/dev.properties:

javax.persistence.schema-generation.database.action=drop-and-create

Set a dependency on the persistence project in the web project:

<dependencies>
    <dependency>
        <groupId>org.nailedtothex.examples.classifier</groupId>
        <artifactId>persistence</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>

Add configurations for producing artifacts for production

Now we can build artifacts that hold the filtered persistence.xml for the development environment. Next, let’s add configuration for producing artifacts for the production environment with the prod classifier.

Put the following profile definition into the persistence project to make it produce both the development artifact and the production artifact (with the prod classifier):

<profile>
    <id>prod</id>
    <properties>
        <filteredResources>target/filtered-classes</filteredResources>
    </properties>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-resources-plugin</artifactId>
                <executions>
                    <!-- Filter and copy resources under src/main/resources into target/filtered-classes/prod -->
                    <execution>
                        <id>prod-resources</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>resources</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${filteredResources}/prod</outputDirectory>
                            <filters>
                                <filter>${basedir}/filters/prod.properties</filter>
                            </filters>
                        </configuration>
                    </execution>
                    <!-- Copy classes under target/classes into target/filtered-classes/prod -->
                    <!-- Existing files will not be overwritten. -->
                    <!-- see http://maven.apache.org/plugins/maven-resources-plugin/resources-mojo.html#overwrite -->
                    <execution>
                        <id>copy-classes-prod</id>
                        <phase>process-classes</phase>
                        <goals>
                            <goal>copy-resources</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${filteredResources}/prod</outputDirectory>
                            <resources>
                                <resource>
                                    <directory>${project.build.outputDirectory}</directory>
                                    <filtering>false</filtering>
                                </resource>
                            </resources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-jar-plugin</artifactId>
                <executions>
                    <!-- Create the production jar with files inside target/filtered-classes/prod  -->
                    <execution>
                        <id>prod-jar</id>
                        <phase>package</phase>
                        <goals>
                            <goal>jar</goal>
                        </goals>
                        <configuration>
                            <classifier>prod</classifier>
                            <classesDirectory>${filteredResources}/prod</classesDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

Also put filters/prod.properties:

javax.persistence.schema-generation.database.action=none

Issue the following command:

$ mvn clean install -P prod

Result:

...
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ persistence ---
[INFO] Building jar: /Users/kyle/src/classifier/persistence/target/persistence-1.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.6:jar (prod-jar) @ persistence ---
[INFO] Building jar: /Users/kyle/src/classifier/persistence/target/persistence-1.0-SNAPSHOT-prod.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ persistence ---
[INFO] Installing /Users/kyle/src/classifier/persistence/target/persistence-1.0-SNAPSHOT.jar to /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT.jar
[INFO] Installing /Users/kyle/src/classifier/persistence/pom.xml to /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT.pom
[INFO] Installing /Users/kyle/src/classifier/persistence/target/persistence-1.0-SNAPSHOT-prod.jar to /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT-prod.jar
...

You can see that both artifacts are installed as expected:

$ unzip -p /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT.jar META-INF/persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
    <persistence-unit name="myPU">
        <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action"
                      value="drop-and-create"/>
        </properties>
    </persistence-unit>
</persistence>

$ unzip -p /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/persistence/1.0-SNAPSHOT/persistence-1.0-SNAPSHOT-prod.jar META-INF/persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1" xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
    <persistence-unit name="myPU">
        <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action"
                      value="none"/>
        </properties>
    </persistence-unit>
</persistence>

So what is needed in the web project? Put the following profile there as well:

<profile>
    <id>prod</id>
    <dependencies>
        <dependency>
            <groupId>org.nailedtothex.examples.classifier</groupId>
            <artifactId>persistence</artifactId>
            <version>1.0-SNAPSHOT</version>
            <classifier>prod</classifier>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-war-plugin</artifactId>
                <executions>
                    <execution>
                        <id>default-war</id>
                        <phase>package</phase>
                        <goals>
                            <goal>war</goal>
                        </goals>
                        <configuration>
                            <packagingExcludes>WEB-INF/lib/persistence-1.0-SNAPSHOT-prod.jar</packagingExcludes>
                        </configuration>
                    </execution>
                    <execution>
                        <id>prod-war</id>
                        <phase>package</phase>
                        <goals>
                            <goal>war</goal>
                        </goals>
                        <configuration>
                            <classifier>prod</classifier>
                            <packagingExcludes>WEB-INF/lib/persistence-1.0-SNAPSHOT.jar</packagingExcludes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

Then you will get both artifacts installed after issuing mvn clean install -P prod, as follows:

$ unzip -l /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/web/1.0-SNAPSHOT/web-1.0-SNAPSHOT.war | grep persistence
     3521  03-20-15 14:25   WEB-INF/lib/persistence-1.0-SNAPSHOT.jar
$ unzip -l /Users/kyle/.m2/repository/org/nailedtothex/examples/classifier/web/1.0-SNAPSHOT/web-1.0-SNAPSHOT-prod.war | grep persistence
     3513  03-20-15 14:25   WEB-INF/lib/persistence-1.0-SNAPSHOT-prod.jar
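Note that the regular development build still works as before. Since all of the prod-specific plugin executions live in the prod profile, omitting the profile produces only the development artifacts:

$ mvn clean install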


Arquillian EJB-JAR/EAR testing examples


Posted on Friday Mar 20, 2015 at 10:33AM in Arquillian


There are plenty of examples of Arquillian testing with WAR deployments, but not many for other deployment types such as EJB-JAR or EAR, so I created some examples. They were tested against Arquillian 1.1.7.Final, using WildFly 8.2.0.Final as a remote container. The entire project can be obtained from GitHub.

Testing against EJB-JAR deployment

Assume we have a simple EJB in an EJB-JAR project as follows:

@Stateless
@LocalBean
public class SomeEjb {
    public String hello(String name) {
        return "Hello, " + name;
    }
}

Test class:

@RunWith(Arquillian.class)
public class EjbJarIT {
    @Deployment
    public static Archive<?> createDeploymentPackage() {
        final JavaArchive archive = ShrinkWrap.create(JavaArchive.class).addClass(SomeEjb.class);
        return archive;
    }

    @EJB
    private SomeEjb someEjb;

    @Test
    public void test() {
        Assert.assertEquals("Hello, Kyle", someEjb.hello("Kyle"));
    }
}

Although the method annotated with @Deployment produces a JavaArchive, the actual deployment will be a WAR, because Arquillian wraps the archive through its automatic enrichment process.
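If you want to inspect what the @Deployment method hands to the Arquillian runtime, ShrinkWrap archives can print their content tree. A minimal sketch, placed just before the return statement:

// Dump the archive's content tree to the build log for inspection
System.out.println(archive.toString(true));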

Testing against EAR deployment

Assume we have a simple EAR project which depends on the preceding EJB-JAR project.

Test class:

@RunWith(Arquillian.class)
public class EarIT {

    @Deployment
    public static Archive<?> createDeploymentPackage() throws IOException {
        final JavaArchive ejbJar = ShrinkWrap.create(JavaArchive.class, "ejb-jar.jar").addClass(SomeEjb.class);

        // Embedding a WAR that contains the test class is needed
        // so that Arquillian can invoke the test class through its servlet test runner
        final WebArchive testWar = ShrinkWrap.create(WebArchive.class, "test.war").addClass(EarIT.class);
        final EnterpriseArchive ear = ShrinkWrap.create(EnterpriseArchive.class)
                .setApplicationXML("test-application.xml")
                .addAsModule(ejbJar)
                .addAsModule(testWar);
        return ear;
    }

    @EJB
    private SomeEjb someEjb;

    @Test
    public void test() {
        Assert.assertEquals("Hello, Kyle", someEjb.hello("Kyle"));
    }
}

test-application.xml, which will be embedded as application.xml:

<application>
    <display-name>ear</display-name>
    <module>
        <ejb>ejb-jar.jar</ejb>
    </module>
    <module>
        <web>
            <web-uri>test.war</web-uri>
            <context-root>/test</context-root>
        </web>
    </module>
</application>

I also have another example that uses the EAR produced by Maven, because creating an EAR with ShrinkWrap would be tedious in some complex cases. The @Deployment method embeds the test WAR into the EAR and adds a module element to the existing application.xml before returning the archive to the Arquillian runtime. The @Deployment method would be something like this:

...
@Deployment
public static Archive<?> createDeploymentPackage() throws IOException {
    final String testWarName = "test.war";

    final EnterpriseArchive ear = ShrinkWrap.createFromZipFile(
            EnterpriseArchive.class, new File("target/ear-1.0-SNAPSHOT.ear"));

    addTestWar(ear, EarFromZipFileIT.class, testWarName);
...
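The addTestWar() helper is not shown above. The following is only a rough sketch of what such a helper might look like, not the actual implementation from the GitHub project; it packages the test class into a WAR and registers it as a module of the EAR, while the rewrite of the existing application.xml (adding a matching <web> module entry) is left out:

private static void addTestWar(EnterpriseArchive ear, Class<?> testClass, String testWarName) {
    // Package the test class into a WAR so that Arquillian's servlet test runner can reach it
    final WebArchive testWar = ShrinkWrap.create(WebArchive.class, testWarName).addClass(testClass);
    // Register the WAR as a module of the EAR
    ear.addAsModule(testWar);
    // Note: the EAR's existing application.xml also needs a <module><web> entry for testWarName;
    // that descriptor rewrite is omitted in this sketch
}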


Arquillian Persistence Extension examples


Posted on Wednesday Mar 18, 2015 at 05:47PM in Arquillian


The whole project can be obtained from GitHub. It was tested with WildFly 8.2.0.Final as a remote container.

Implementation (test target)

Assume we have two very simple entities as follows:

@Entity
public class Dept implements Serializable {
    @Id
    private Integer id;
    @Column(nullable = false)
    private String name;
    @OneToMany(mappedBy = "dept")
    private Collection<Employee> employees;
...

@Entity
public class Employee implements Serializable {
    @Id
    private Integer id;
    @Column(nullable = false)
    private String name;
    @JoinColumn(nullable = false)
    @ManyToOne
    private Dept dept;
...

Test target EJB:

@Stateless
@LocalBean
public class HumanResourcesBean {

    @PersistenceContext
    private EntityManager em;

    public void addEmployee(Employee employee, Integer deptId) {
        final Dept dept = em.find(Dept.class, deptId);
        dept.getEmployees().add(employee);
        employee.setDept(dept);
        em.persist(employee);
    }

    public void addDept(Dept dept, Employee employee) {
        Collection<Employee> employees = new ArrayList<>();
        dept.setEmployees(employees);
        employees.add(employee);
        employee.setDept(dept);
        em.persist(dept);
        em.persist(employee);
    }
}

addEmployee() testing

Test method of addEmployee():

@Test
@UsingDataSet("input.xml")
@ShouldMatchDataSet(value = "addEmployee-expected.xml", orderBy = "id")
public void addEmployeeTest() throws Exception {
    Employee emp = new Employee();
    emp.setId(2002);
    emp.setName("Todd");
    humanResourcesBean.addEmployee(emp, 200);
}

Initial entry data (input.xml):

<dataset>
    <Dept id="100" name="Sales"/>
    <Dept id="200" name="Finance"/>
    <Employee id="1000" name="Scott"  dept_id="100"/>
    <Employee id="1001" name="Martin" dept_id="100"/>
    <Employee id="1002" name="Nick"   dept_id="100"/>
    <Employee id="2000" name="Jordan" dept_id="200"/>
    <Employee id="2001" name="David"  dept_id="200"/>
</dataset>

Expected data (addEmployee-expected.xml):

<dataset>
    <Employee id="1000" name="Scott"  dept_id="100"/>
    <Employee id="1001" name="Martin" dept_id="100"/>
    <Employee id="1002" name="Nick"   dept_id="100"/>
    <Employee id="2000" name="Jordan" dept_id="200"/>
    <Employee id="2001" name="David"  dept_id="200"/>
    <Employee id="2002" name="Todd"   dept_id="200"/> <!-- Newly added -->
</dataset>

addDept() testing

Test method of addDept():

@Test
@UsingDataSet("input.xml")
@ShouldMatchDataSet(value = "addDept-expected.xml", orderBy = "id")
public void addDeptTest() throws Exception {
    Dept dept = new Dept();
    dept.setId(300);
    dept.setName("Engineering");
    Employee emp = new Employee();
    emp.setId(3000);
    emp.setName("Carl");
    humanResourcesBean.addDept(dept, emp);
}

Initial data (input.xml) is the same as in the previous test.

Expected data (addDept-expected.xml):

<dataset>
    <Dept id="100" name="Sales"/>
    <Dept id="200" name="Finance"/>
    <Dept id="300" name="Engineering"/> <!-- Newly added -->
    <Employee id="1000" name="Scott"  dept_id="100"/>
    <Employee id="1001" name="Martin" dept_id="100"/>
    <Employee id="1002" name="Nick"   dept_id="100"/>
    <Employee id="2000" name="Jordan" dept_id="200"/>
    <Employee id="2001" name="David"  dept_id="200"/>
    <Employee id="3000" name="Carl"   dept_id="300"/> <!-- Newly added -->
</dataset>

It works well with multiple tables.

addDept() testing with DBUnit

Sometimes using DBUnit directly is useful for complex assertions. In such cases you need to take care of the following:

  • If you use JPA, force the EntityManager to execute DML by invoking em.flush() before the assertion

  • Include the test data in Arquillian’s application archive so that DBUnit can load it on the server side

The XML can be included via the addAsResource() method as follows:

@Deployment
public static Archive<?> createDeploymentPackage() {
    final WebArchive webArchive = ShrinkWrap.create(WebArchive.class, "test.war")
            .addPackage(Dept.class.getPackage())
            .addClass(HumanResourcesBean.class)
            .addAsResource("datasets/addDept-expected.xml") // to be loaded by DBUnit on the server side
            .addAsResource("test-persistence.xml", "META-INF/persistence.xml");
//        System.out.println(webArchive.toString(true));
    return webArchive;
}

The test method of addDept() and related convenience methods:

@Test
@UsingDataSet("input.xml")
public void addDeptTestWithDbUnit() throws Exception {
    Dept dept = new Dept();
    dept.setId(300);
    dept.setName("Engineering");
    Employee emp = new Employee();
    emp.setId(3000);
    emp.setName("Carl");

    humanResourcesBean.addDept(dept, emp);
    em.flush(); // force JPA to execute DMLs before assertion

    final IDataSet expectedDataSet = getDataSet("/datasets/addDept-expected.xml");
    assertTable(expectedDataSet.getTable("Dept"), "select * from dept order by id");
    assertTable(expectedDataSet.getTable("Employee"), "select * from employee order by id");
}

private static IDataSet getDataSet(String path) throws DataSetException {
    return new FlatXmlDataSetBuilder().build(HumanResourcesBeanIT.class.getResource(path));
}

private void assertTable(ITable expectedTable, String sql) throws SQLException, DatabaseUnitException {
    try (Connection cn = ds.getConnection()) {
        IDatabaseConnection icn = null;
        try {
            icn = new DatabaseConnection(cn);
            final ITable queryTable = icn.createQueryTable(expectedTable.getTableMetaData().getTableName(), sql);
            Assertion.assertEquals(expectedTable, queryTable);
        } finally {
            if (icn != null) {
                icn.close();
            }
        }
    }
}


Disabling color escape sequences in WildFly's console logging for IDE


Posted on Wednesday Mar 18, 2015 at 11:31AM in WildFly


I’m using IntelliJ IDEA for developing Java EE applications on WildFly. Its built-in WildFly console output appears as follows:

...
[0m11:20:28,608 INFO  [org.jboss.as.server] (Controller Boot Thread) JBAS015888: Creating http management service using socket-binding (management-http)
11:20:28,625 INFO  [org.xnio] (MSC service thread 1-9) XNIO version 3.3.0.Final
11:20:28,631 INFO  [org.xnio.nio] (MSC service thread 1-9) XNIO NIO Implementation Version 3.3.0.Final
...

You can see strange characters at the head of the lines. They are color escape sequences, which work fine with terminal emulators but not with IntelliJ IDEA’s console output. To disable them, issue the following command in jboss-cli:

/subsystem=logging/console-handler=CONSOLE:write-attribute(name=named-formatter, value=PATTERN)

Now the strange characters have disappeared from IntelliJ IDEA’s console:

...
2015-03-18 11:26:11,800 INFO  [org.jboss.as.server] (Controller Boot Thread) JBAS015888: Creating http management service using socket-binding (management-http)
2015-03-18 11:26:11,812 INFO  [org.xnio] (MSC service thread 1-4) XNIO version 3.3.0.Final
2015-03-18 11:26:11,817 INFO  [org.xnio.nio] (MSC service thread 1-4) XNIO NIO Implementation Version 3.3.0.Final
...
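If you want the colored output back later (for example when tailing the log in a real terminal), the attribute can be restored. This assumes the stock WildFly configuration, where the console handler’s formatter is named COLOR-PATTERN:

/subsystem=logging/console-handler=CONSOLE:write-attribute(name=named-formatter, value=COLOR-PATTERN)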


Using JPA 2.1 AttributeConverter against Java8 LocalDate / LocalDateTime


Posted on Tuesday Mar 17, 2015 at 01:50PM in JPA


I created an example project based on https://weblogs.java.net/blog/montanajava/archive/2014/06/17/using-java-8-datetime-classes-jpa which runs on WildFly 8.2.0.Final (Hibernate 4.3.7) with the H2 and Apache Derby databases.

The whole project can be obtained from https://github.com/lbtc-xxx/jpa21converter .

You don’t need to define any additional configuration in persistence.xml if you use converters in an EE environment. It goes like this:

The converter between LocalDate and DATE

@Converter(autoApply = true)
public class MyLocalDateConverter implements AttributeConverter<java.time.LocalDate, java.sql.Date> {

    @Override
    public java.sql.Date convertToDatabaseColumn(java.time.LocalDate attribute) {
        return attribute == null ? null : java.sql.Date.valueOf(attribute);
    }

    @Override
    public java.time.LocalDate convertToEntityAttribute(java.sql.Date dbData) {
        return dbData == null ? null : dbData.toLocalDate();
    }
}

The converter between LocalDateTime and TIMESTAMP

@Converter(autoApply = true)
public class MyLocalDateTimeConverter implements AttributeConverter<java.time.LocalDateTime, java.sql.Timestamp> {

    @Override
    public java.sql.Timestamp convertToDatabaseColumn(java.time.LocalDateTime attribute) {
        return attribute == null ? null : java.sql.Timestamp.valueOf(attribute);
    }

    @Override
    public java.time.LocalDateTime convertToEntityAttribute(java.sql.Timestamp dbData) {
        return dbData == null ? null : dbData.toLocalDateTime();
    }
}

Entity class

@Entity
public class MySimpleTable implements Serializable {
    @Id
    @GeneratedValue
    private Long id;
    private java.time.LocalDateTime someLocalDateTime;
    private java.time.LocalDate someLocalDate;
...

Hibernate produces the DDL against H2 as follows:

create table MySimpleTable (
    id bigint not null,
    someLocalDate date,
    someLocalDateTime timestamp,
    primary key (id)
)
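As a quick check, here is a minimal usage sketch. It assumes the usual getters and setters (omitted from the entity listing above) and a container-managed EntityManager named em:

MySimpleTable row = new MySimpleTable();
row.setSomeLocalDate(java.time.LocalDate.now());
row.setSomeLocalDateTime(java.time.LocalDateTime.now());
em.persist(row); // both converters are applied automatically thanks to autoApply = true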

Using converters with @EmbeddedId

Converters don’t work with fields annotated as @Id (see http://stackoverflow.com/questions/28337798/hibernate-fails-to-load-jpa-2-1-converter-when-loaded-with-spring-boot-and-sprin ) but do work with fields of an @EmbeddedId class.

Entity class:

@Entity
public class MyCompositeKeyTable implements Serializable {
    @EmbeddedId
    private MyCompositeKeyEmbeddable key;
...

Embeddable class:

@Embeddable
public class MyCompositeKeyEmbeddable implements Serializable {
    @Column(nullable = false)
    private java.time.LocalDateTime someLocalDateTime;
    @Column(nullable = false)
    private java.time.LocalDate someLocalDate;
...

Produced DDL:

create table MyCompositeKeyTable (
    someLocalDate date not null,
    someLocalDateTime timestamp not null,
    primary key (someLocalDate, someLocalDateTime)
)