Entries tagged [wildfly]
In-container JMS consumer/producer example
Posted on Saturday May 14, 2016 at 11:06PM in Technology
In this entry, I’ll show you a complete example of using JMS in a Java EE 7 compliant application container, by creating a webapp which contains both a consumer and a producer. We’ll deploy it to a container (WildFly 10.0.0.Final) and watch the webapp produce and consume a message.
Launch the container
Launch the WildFly server with the following parameter so that the bundled message queue broker (ActiveMQ Artemis) will be launched:
./standalone.sh -c standalone-full.xml
For IntelliJ IDEA, see this Stack Overflow answer on how to launch WildFly with -c standalone-full.xml:
http://stackoverflow.com/questions/25849810/how-to-run-wildfly-with-standalone-full-xml-from-intellij-idea
Define a queue
Launch jboss-cli (see the note below on starting the CLI) and define a queue with this command:
jms-queue add --queue-address=testQueue --entries=queue/test,java:jboss/exported/jms/queue/test
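If you have not started the CLI yet, it can typically be launched from the WildFly installation directory like this (the path is an assumption; adjust it to your environment):
./bin/jboss-cli.sh --connect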
Check that it was created successfully:
[standalone@localhost:9990 /] /subsystem=messaging-activemq/server=default/jms-queue=testQueue:read-resource
{
    "outcome" => "success",
    "result" => {
        "durable" => true,
        "entries" => [
            "queue/test",
            "java:jboss/exported/jms/queue/test"
        ],
        "legacy-entries" => undefined,
        "selector" => undefined
    }
}
Create the webapp which contains the consumer and producer
Here we’re going to create the following three classes in the webapp:
- MyProducer: a Stateless Session Bean which produces a message to the queue
- MyConsumer: a Message-Driven Bean which consumes any message sent to the queue
- MyServlet: receives an HTTP GET request and kicks MyProducer
The whole project can be obtained from my GitHub repository.
MyProducer
@Stateless
@LocalBean
public class MyProducer {

    // The queue defined earlier via jboss-cli, looked up through its JNDI entry
    @Resource(mappedName = "java:/queue/test")
    Queue testQueue;

    // Container-managed JMS context (JMS 2.0 simplified API)
    @Inject
    JMSContext jmsContext;

    public void enqueue(final String text) {
        // Send the text as a TextMessage to the queue
        jmsContext.createProducer().send(testQueue, text);
    }
}
MyConsumer
@MessageDriven(name = "MyMDB", activationConfig = {
        @ActivationConfigProperty(propertyName = "destination", propertyValue = "queue/test"),
        @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
        @ActivationConfigProperty(propertyName = "acknowledgeMode", propertyValue = "Auto-acknowledge")})
public class MyConsumer implements MessageListener {

    private final static Logger LOGGER = Logger.getLogger(MyConsumer.class.toString());

    // Invoked by the container for every message delivered to the queue
    @Override
    public void onMessage(final Message msg) {
        if (msg instanceof TextMessage) {
            try {
                final String text = ((TextMessage) msg).getText();
                LOGGER.info(() -> "Received: " + text);
            } catch (final JMSException e) {
                throw new RuntimeException(e);
            }
        }
    }
}
MyServlet
@WebServlet(urlPatterns = "/")
public class MyServlet extends HttpServlet {

    @EJB
    MyProducer myProducer;

    @Override
    protected void doGet(final HttpServletRequest req, final HttpServletResponse resp) throws ServletException, IOException {
        final String text = "Hello, JMS!";
        myProducer.enqueue(text);
        resp.getWriter().write("Published! check output of the consumer: " + text + "\n");
    }
}
Of course, you don’t need to put MyConsumer into the same webapp that contains MyProducer; this is just an example. In fact, if all you need is asynchronous/background processing in a webapp, you are better off using the simpler EJB asynchronous methods or a ManagedExecutorService instead of JMS (a sketch follows below). For a real use case, you may create a dedicated app for the queue consumers and place your MyConsumer equivalent into it.
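As a rough illustration of that simpler alternative, here is a minimal sketch of an EJB asynchronous method. The class and method names are hypothetical and not part of the example project:
import javax.ejb.Asynchronous;
import javax.ejb.Stateless;

@Stateless
public class MyBackgroundWorker {

    // Runs on a container-managed thread; the caller returns immediately.
    @Asynchronous
    public void process(final String text) {
        // Do the background work here instead of routing it through JMS.
        System.out.println("Processing in background: " + text);
    }
}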
Trigger producing and consuming
Deploy the app and submit an HTTP GET request to it as follows:
$ curl http://localhost:8080/jms-example/
If it worked successfully, you’ll see the following response from the Servlet:
Published! check output of the consumer: Hello, JMS!
Then check the console/output/log of your application container. If the consumer worked successfully, you will see output something like the following:
13:55:18,168 INFO [class jms.MyConsumer] (Thread-438 (ActiveMQ-client-global-threads-271921988)) Received: Hello, JMS!
Conclusion
As described above, thanks to JMS, we can develop a messaging app with lean and simple semantics, without tons of annoying, vendor-specific boilerplate code. There is no need to maintain connection-handling logic, polling loops, and so forth in the application code.
Also note that no vendor-specific classes are used: the example relies only on the standardized API which comes from javaee-api. You may need to alter some portion of the code when you deploy this example to another Java EE 7 compliant container such as GlassFish, WebLogic or WebSphere, but such changes should be few.
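For reference, if the webapp is built with Maven, the only API dependency it needs at compile time is the standard javaee-api artifact, the same dependency used in the JBatch entries below; a minimal sketch of the relevant pom.xml fragment:
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>7.0</version>
    <scope>provided</scope>
</dependency>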
JBatch examples: bulk loading from database to CSV file
Posted on Sunday May 24, 2015 at 03:52PM in JBatch
In the previous entry, we looked at how to load data from a CSV file into a database. In this entry, we will look at how to load data from a database into a CSV file. We’ll use JdbcItemReader to read data from the database and CsvItemWriter to write the data out as a CSV file.
Setup
In this setup we’ll use WildFly 9.0.0.CR1.
Assume we already have the forex table and data in the H2 database that were created and populated in the previous entry.
For JdbcItemReader, we need another datasource which is non-JTA and references the same database as the JTA one. For details, see this conversation.
data-source add \
    --name=MyNonJtaDS \
    --driver-name=h2 \
    --jndi-name=java:jboss/datasources/MyNonJtaDS \
    --user-name=sa \
    --password=sa \
    --connection-url=jdbc:h2:/tmp/myds;AUTO_SERVER=TRUE \
    --jta=false
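As with the JTA datasource from the previous entry, you can verify the new datasource with the same test-connection-in-pool operation, just pointed at MyNonJtaDS:
/subsystem=datasources/data-source=MyNonJtaDS:test-connection-in-pool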
Next, create job artifacts.
/src/main/resources/META-INF/batch-jobs/save-csv.xml
Note that MyNonJtaDS is used here, not MyDS. All of the classes used in the job are supplied by jberet-support.
<job id="save-csv" version="1.0" xmlns="http://xmlns.jcp.org/xml/ns/javaee"> <step id="save"> <chunk> <reader ref="jdbcItemReader"> <properties> <property name="dataSourceLookup" value="java:jboss/datasources/MyNonJtaDS"/> <property name="sql" value="SELECT symbol, ts, bid_open, bid_high, bid_low, bid_close, volume FROM forex ORDER BY symbol, ts"/> <property name="beanType" value="java.util.List"/> </properties> </reader> <writer ref="csvItemWriter"> <properties> <property name="resource" value="#{jobParameters['resource']}"/> <property name="header" value="symbol, ts, bid_open, bid_high, bid_low, bid_close, volume"/> <property name="beanType" value="java.util.List"/> </properties> </writer> </chunk> </step> </job>
Run the job
Issue the following command. This saves a CSV file to /tmp/save.csv:
curl 'http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/rest/start/save-csv?resource=/tmp/save.csv'
After the job execution is done, check that the CSV file has been created as expected:
symbol,ts,bid_open,bid_high,bid_low,bid_close,volume
USDJPY,2015-04-01 00:00:00.0,119.566,119.566,119.551,119.565,0
USDJPY,2015-04-01 00:01:00.0,119.566,119.581,119.565,119.579,0
USDJPY,2015-04-01 00:02:00.0,119.581,119.586,119.581,119.583,0
...
The project used in this entry can be obtained from my GitHub repository.
JBatch examples: bulk loading from CSV file to database
Posted on Sunday May 24, 2015 at 03:07PM in JBatch
Bulk loading is a typical use case for a batch application. In this entry, I give you an example of bulk loading from a CSV file.
There is a supplemental package named jberet-support, which contains many useful classes implementing ItemReader or ItemWriter for common use cases. In this entry, we’ll use CsvItemReader to read a CSV file and JdbcItemWriter to write the data to a database.
Setup
In this setup we’ll use WildFly 9.0.0.CR1.
First, we need a CSV file. We’ll use forex historical data which can be downloaded from http://www.histdata.com/download-free-forex-historical-data/?/ascii/1-minute-bar-quotes/usdjpy/2015/4 . Download it, unpack it, and put DAT_ASCII_USDJPY_M1_201504.csv somewhere in your environment. This file contains data like:
20150401 000000;119.566000;119.566000;119.551000;119.565000;0
20150401 000100;119.566000;119.581000;119.565000;119.579000;0
20150401 000200;119.581000;119.586000;119.581000;119.583000;0
...
Next, define a JTA datasource on WildFly. The following is an example command which defines an H2 datasource using jboss-cli:
data-source add \
    --name=MyDS \
    --driver-name=h2 \
    --jndi-name=java:jboss/datasources/MyDS \
    --user-name=sa \
    --password=sa \
    --connection-url=jdbc:h2:/tmp/myds;AUTO_SERVER=TRUE
After confirming that the outcome was success, issue the following command to test the connection:
/subsystem=datasources/data-source=MyDS:test-connection-in-pool
Next, create a table to store the dataset. Issue the following command in the base directory of your WildFly instance to start the H2 shell:
java -cp ./modules/system/layers/base/com/h2database/h2/main/h2*.jar org.h2.tools.Shell -url "jdbc:h2:/tmp/myds;AUTO_SERVER=TRUE" -user sa -password sa
Execute the following DDL:
CREATE TABLE forex (
    symbol VARCHAR(6) NOT NULL,
    ts TIMESTAMP NOT NULL,
    bid_open NUMERIC(10,3) NOT NULL,
    bid_high NUMERIC(10,3) NOT NULL,
    bid_low NUMERIC(10,3) NOT NULL,
    bid_close NUMERIC(10,3) NOT NULL,
    volume INTEGER NOT NULL,
    PRIMARY KEY(symbol, ts)
);
Next, create a batch application.
pom.xml
You need the following dependencies in your pom.xml:
<dependencies>
    <dependency>
        <groupId>javax</groupId>
        <artifactId>javaee-api</artifactId>
        <version>7.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.batchee</groupId>
        <artifactId>batchee-servlet-embedded</artifactId>
        <version>0.2-incubating</version>
    </dependency>
    <dependency>
        <groupId>org.jberet</groupId>
        <artifactId>jberet-support</artifactId>
        <version>1.1.0.Final</version>
    </dependency>
    <dependency>
        <groupId>net.sf.supercsv</groupId>
        <artifactId>super-csv</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.5.3</version>
    </dependency>
</dependencies>
/src/main/resources/META-INF/batch-jobs/load-csv.xml
The job uses csvItemReader and jdbcItemWriter, which are supplied by the jberet-support package.
<job id="load-csv" version="1.0" xmlns="http://xmlns.jcp.org/xml/ns/javaee"> <step id="load"> <chunk> <reader ref="csvItemReader"> <properties> <property name="resource" value="#{jobParameters['resource']}"/> <property name="headerless" value="true"/> <property name="delimiterChar" value=";"/> <property name="beanType" value="java.util.List"/> </properties> </reader> <processor ref="forexItemProcessor"> <properties> <property name="symbol" value="#{jobParameters['symbol']}"/> </properties> </processor> <writer ref="jdbcItemWriter"> <properties> <property name="dataSourceLookup" value="java:jboss/datasources/MyDS"/> <property name="sql" value="INSERT INTO forex (symbol, ts, bid_open, bid_high, bid_low, bid_close, volume) values (?, ?, ?, ?, ?, ?, ?)"/> <property name="beanType" value="java.util.List"/> </properties> </writer> </chunk> </step> </job>
/src/main/java/jbatch/ForexItemProcessor.java
@Named
@Dependent
public class ForexItemProcessor implements ItemProcessor {

    private static final DateTimeFormatter FORMATTER = DateTimeFormatter.ofPattern("uuuuMMdd HHmmss");

    @Inject
    @BatchProperty
    private String symbol;

    // Each item comes in as a List of raw CSV column strings; convert it to typed values matching the INSERT columns
    @Override
    public Object processItem(final Object item) throws Exception {
        final List items = (List) item;
        return Arrays.asList(symbol,
                Timestamp.valueOf(LocalDateTime.parse((String) items.get(0), FORMATTER)),
                new BigDecimal((String) items.get(1)),
                new BigDecimal((String) items.get(2)),
                new BigDecimal((String) items.get(3)),
                new BigDecimal((String) items.get(4)),
                Integer.valueOf((String) items.get(5)));
    }
}
Run the batch
For example, if you put the forex CSV at /tmp/DAT_ASCII_USDJPY_M1_201504.csv:
curl 'http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/rest/start/load-csv?symbol=USDJPY&resource=/tmp/DAT_ASCII_USDJPY_M1_201504.csv'
After the job is done, check the dataset in your database using the H2 CLI:
sql> select * from forex;
SYMBOL | TS                    | BID_OPEN | BID_HIGH | BID_LOW | BID_CLOSE | VOLUME
USDJPY | 2015-04-01 00:00:00.0 | 119.566  | 119.566  | 119.551 | 119.565   | 0
USDJPY | 2015-04-01 00:01:00.0 | 119.566  | 119.581  | 119.565 | 119.579   | 0
USDJPY | 2015-04-01 00:02:00.0 | 119.581  | 119.586  | 119.581 | 119.583   | 0
...
(31572 rows, 505 ms)
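If you just want a quick sanity check of how many rows were loaded, a plain count works too (standard SQL; it should report the same 31572 rows shown above):
sql> select count(*) from forex;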
The project used in this entry can be obtained from my GitHub repository.
JBatch examples: getting started with JBatch on WildFly and batchee-servlet-embedded
Posted on Sunday May 24, 2015 at 12:43PM in JBatch
JSR 352, aka JBatch, is the standardized batch processing framework for the Java EE platform. It eases tedious work in batch programming such as transaction management of bulk processing, parallel processing, and flow control. It also gives a well-integrated job information management mechanism and well-designed interfaces that enable us to develop common modules for frequent use, and there are some convenient modules aimed at typical situations. In this entry, I introduce some examples to get you started.
Setup
Setup WildFly 9.0.0.CR1: download the full distribution from wildfly.org and unpack.
Next, create a war application that contains the following resources:
pom.xml
This contains a dependency on batchee-servlet-embedded. It brings a simple web application which enables us to control batch jobs, and it also supplies a simple REST-style interface.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>jbatch-example</groupId>
    <artifactId>jbatch-example</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>war</packaging>
    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <failOnMissingWebXml>false</failOnMissingWebXml>
    </properties>
    <dependencies>
        <dependency>
            <groupId>javax</groupId>
            <artifactId>javaee-api</artifactId>
            <version>7.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.batchee</groupId>
            <artifactId>batchee-servlet-embedded</artifactId>
            <version>0.2-incubating</version>
        </dependency>
    </dependencies>
</project>
/src/main/java/jbatch/MyBatchlet.java
@Named
@Dependent
public class MyBatchlet extends AbstractBatchlet {

    @Override
    public String process() throws Exception {
        System.out.println("Hello, JBatch");
        // The returned value becomes the step's exit status; null is fine here
        return null;
    }
}
/src/main/resources/META-INF/batch-jobs/simple-job.xml
<job id="simple-job" version="1.0" xmlns="http://xmlns.jcp.org/xml/ns/javaee"> <step id="myStep"> <batchlet ref="myBatchlet"/> </step> </job>
Deploy and run the batch
Then deploy the war and go to http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/ in your browser. You’ll see the following page:
Then click the New Batch button. You’ll be taken to the following page. Enter simple-job into the text box, click Set Job Name, and then click Submit.
If the batch executed successfully, you’ll be taken to the following page:
You’ll also see the following output in your WildFly console:
12:23:02,046 INFO [stdout] (Batch Thread - 2) Hello, JBatch
You can also see the job execution history from the web UI. Click simple-job on the home page and you’ll be taken to the following page:
Instead of using a web browser, you can launch a job with the simple REST-style API as follows:
curl http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/rest/start/simple-job
For details of the REST API, you can see the help with the following command:
curl http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/rest/
It shows:
Known commands are:
* start/ - start a new batch job
  Sample: http://localhost:8080/myapp/jbatch/rest/start/myjobname?param1=x&param2=y
  BatchEE will start the job and immediately return
* status/ - query the current status
  Sample: http://localhost:8080/myapp/jbatch/rest/status/23 will return the state of executionId 23
* stop/ - stop the job with the given executionId
  Sample: http://localhost:8080/myapp/jbatch/rest/stop/23 will stop the job with executionId 23
* restart/ - restart the job with the given executionId
  Sample: http://localhost:8080/myapp/jbatch/rest/restart/23 will restart the job with executionId 23
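For example, once you know an execution id you can poll its state with the status command described above (executionId 23 here is just the hypothetical id from the help text; the context path matches this example app):
curl http://localhost:8080/jbatch-example-1.0-SNAPSHOT/jbatch/rest/status/23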
The project used in this entry can be obtained from my GitHub repository.
Enabling RequestDumpingHandler of Undertow
Posted on Friday Mar 20, 2015 at 05:01PM in WildFly
Tested with WildFly 8.2.0.Final. Issue the following commands via jboss-cli and restart the server:
/subsystem=undertow/configuration=filter/custom-filter=request-dumper:add(class-name=io.undertow.server.handlers.RequestDumpingHandler, module=io.undertow.core)
/subsystem=undertow/server=default-server/host=default-host/filter-ref=request-dumper:add
The following log will be dumped to the console:
----------------------------REQUEST---------------------------
URI=/batcheetest/jbatch/batchee/execution/start/myjob
characterEncoding=null
contentLength=95
contentType=[application/json]
header=Accept=*/*
header=Content-Type=application/json
header=Content-Length=95
header=User-Agent=curl/7.30.0
header=Host=localhost:8080
locale=[]
method=POST
protocol=HTTP/1.1
queryString=
remoteAddr=/127.0.0.1:57668
remoteHost=localhost
scheme=http
host=localhost:8080
serverPort=8080
--------------------------RESPONSE--------------------------
contentLength=-1
contentType=application/json
header=Connection=keep-alive
header=X-Powered-By=Undertow/1
header=Server=WildFly/8
header=Transfer-Encoding=chunked
header=Content-Type=application/json
header=Date=Fri, 20 Mar 2015 07:58:13 GMT
status=200
==============================================================
I’m disappointed that there is no dump of the request body :(
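If you want to turn the dumper off again later, it should be possible to reverse the two commands above with the corresponding :remove operations (a sketch; not verified here):
/subsystem=undertow/server=default-server/host=default-host/filter-ref=request-dumper:remove
/subsystem=undertow/configuration=filter/custom-filter=request-dumper:remove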