Entries tagged [jbatch]
JBeret JdbcItemWriter / JdbcItemReader example on EE environment
Posted on Sunday Mar 01, 2015 at 12:11AM in JBatch
We need to use a non-transactional datasource for JdbcItemReader.
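As a sketch, such a non-JTA datasource can be defined through jboss-cli on WildFly. The datasource name, JNDI name, and URL below are placeholders; `--jta=false` is what makes the datasource non-transactional:

```
data-source add \
  --name=MyNonTxDS \
  --driver-name=h2 \
  --jndi-name=java:jboss/datasources/MyNonTxDS \
  --connection-url=jdbc:h2:mem:test \
  --user-name=sa \
  --password=sa \
  --jta=false
```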
JBeret on Java SE with JDBC chunk oriented processing example
Posted on Saturday Feb 28, 2015 at 09:58PM in JBatch
I pushed an example to https://github.com/lbtc-xxx/jberet-se-example. However, it feels awkward due to the lack of the automatic injection mechanism and transaction management that are available in an EE environment. I guess a better approach might exist, but I don't know it yet; if you know a better way, please let me know.
Tags: jbatch
Using JSL Inheritance with JBeret
Posted on Sunday Feb 22, 2015 at 06:28PM in Technology
In practical use, a lot of boilerplate appears in the Job definition XMLs of JSR 352 batch jobs, such as definitions of Listeners and Properties, and attributes such as item-count on the chunk element. To overcome this, there is the JSL Inheritance specification. It was deferred to the next version of the JBatch spec, but JBeret supports it already. It's very useful for eliminating annoying fragments that appear repeatedly, so I tested it. WildFly 8.2.0.Final was used for the test.
Parent XML
Parent XML begins as follows:
<?xml version="1.0" encoding="UTF-8"?> <job id="parent" abstract="true" version="1.0" xmlns="http://xmlns.jcp.org/xml/ns/javaee"> <properties> <property name="someJobProp" value="someJobPropValue"/> </properties> <listeners> <listener ref="myJobListener"/> </listeners>
The declared property named someJobProp becomes visible from child XMLs. myJobListener will be included implicitly in child XMLs as well.
The parent XML has an abstract step element which will be referenced from child XMLs as follows. These properties and listeners for a step are included implicitly too. We can override javax.transaction.global.timeout there; it will affect all child steps.
<step id="mySimpleStep" abstract="true"> <properties> <property name="javax.transaction.global.timeout" value="1800"/> </properties> <listeners> <listener ref="myStepListener"/> </listeners> </step>
The same goes for a chunk-based step. It can be used to avoid defining item-count for every chunk step.
<step id="myChunkStep" abstract="true"> <properties> <property name="javax.transaction.global.timeout" value="900"/> <property name="jberet.local-tx" value="true"/> </properties> <listeners> <listener ref="myStepListener"/> <listener ref="myChunkListener"/> </listeners> <chunk item-count="3"/> </step> </job>
Child XML
The child XML becomes very simple thanks to inheritance, as follows. You need to specify the parent and jsl-name attributes to reference the parent: parent is the id of the parent element, and jsl-name is the name of the JSL file that contains that parent element.
<?xml version="1.0" encoding="UTF-8"?> <job id="child" parent="parent" jsl-name="parent" version="1.0" xmlns="http://xmlns.jcp.org/xml/ns/javaee"> <step id="simpleStep" next="chunkStep" parent="mySimpleStep" jsl-name="parent"> <batchlet ref="myBatchlet"/> </step> <step id="chunkStep" parent="myChunkStep" jsl-name="parent"> <chunk> <reader ref="myItemReader"/> <writer ref="myItemWriter"/> </chunk> </step> </job>
Launching the example job
- Clone the repository
- Build the WAR and deploy it to your WildFly
- Run a JUnit test class named JobTest. It invokes the job through a remote EJB invocation.
Log
You can see that Properties, Listeners, and the item-count attribute were inherited as expected.
(Batch Thread - 1) Job child starting
(Batch Thread - 1) Job child, Step simpleStep starting
(Batch Thread - 1) hello world!
(Batch Thread - 1) jobProps: {someJobProp=someJobPropValue}
(Batch Thread - 1) stepProps: {javax.transaction.global.timeout=1800}
(Batch Thread - 1) Job child, Step simpleStep done
(Batch Thread - 1) Job child, Step chunkStep starting
(Batch Thread - 1) jobProps: {someJobProp=someJobPropValue}
(Batch Thread - 1) stepProps: {jberet.local-tx=true, javax.transaction.global.timeout=900}
(Batch Thread - 1) Job child, Step chunkStep chunk starting
(Batch Thread - 1) write: [1, 2, 3]
(Batch Thread - 1) Job child, Step chunkStep chunk done
(Batch Thread - 1) Job child, Step chunkStep chunk starting
(Batch Thread - 1) write: [4, 5, 6]
(Batch Thread - 1) Job child, Step chunkStep chunk done
(Batch Thread - 1) Job child, Step chunkStep chunk starting
(Batch Thread - 1) write: [7, 8, 9]
(Batch Thread - 1) Job child, Step chunkStep chunk done
(Batch Thread - 1) Job child, Step chunkStep chunk starting
(Batch Thread - 1) write: [0]
(Batch Thread - 1) Job child, Step chunkStep chunk done
(Batch Thread - 1) Job child, Step chunkStep done
(Batch Thread - 1) Job child done
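The chunk boundaries in the log follow from the inherited item-count="3": the reader yields ten items (1 through 9, then 0), and the writer receives them in groups of at most three. A plain-Java sketch of that grouping (hypothetical illustration, not JBeret code):

```java
import java.util.ArrayList;
import java.util.List;

// Groups a stream of items into chunks of at most `itemCount` elements,
// mirroring how a chunk step buffers items between writer invocations.
public class ChunkDemo {
    public static <T> List<List<T>> chunks(List<T> items, int itemCount) {
        List<List<T>> result = new ArrayList<>();
        for (int i = 0; i < items.size(); i += itemCount) {
            result.add(items.subList(i, Math.min(i + itemCount, items.size())));
        }
        return result;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4, 5, 6, 7, 8, 9, 0);
        for (List<Integer> chunk : chunks(items, 3)) {
            System.out.println("write: " + chunk);
        }
    }
}
```

Running it produces the same four write groups seen in the log above.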
JDBC chunk oriented processing with jberet.local-tx
Posted on Saturday Feb 21, 2015 at 11:40PM in Technology
As the following URL says, in my understanding, each open, read, write, and close of every JDBC resource (Connection, Statement, and ResultSet) needs to be in its own transaction.
There are open and close methods in the ItemReader and ItemWriter interfaces. These methods look like a good place to create and dispose of JDBC resources such as Connection, Statement, and ResultSet, but we can't go that way because of the JSR 352 spec: before invoking these methods, the framework starts a transaction, then commits it after the invocation finishes. According to the preceding URL, those resources become unusable in the readItem and writeItems methods because they were created in another transaction.
It's terrible to open a cursor again and again at the start of every chunk. To overcome this problem, JBeret supplies an implementation-specific property named jberet.local-tx, so I created a sample batch project to test that parameter. My test was done with WildFly 8.2.0.Final.
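The property is set at the step level in the job XML. A fragment along these lines enables it (the step id and artifact names here are illustrative, not necessarily the exact ones from the sample project):

```xml
<step id="test">
  <properties>
    <property name="jberet.local-tx" value="true"/>
  </properties>
  <chunk>
    <reader ref="myItemReader"/>
    <writer ref="myItemWriter"/>
  </chunk>
</step>
```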
How the batch works
The entire project can be obtained from my GitHub repository. The batch has 2 steps as follows:
- prepare: creates the SRC and DEST tables, and inserts 100 rows into the SRC table. Refer to the source of the Batchlet for details.
- test: loads data from the SRC table, then simply writes it into the DEST table as-is using chunk-oriented processing. This step has a property jberet.local-tx with the value true. MyItemReader creates and disposes of JDBC resources in its open and close methods. Refer to the source of MyItemReader and MyItemWriter for details.
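As a rough illustration of the reader lifecycle described above, here is a pure-Java sketch (no real JDBC or JSR 352 API; the in-memory iterator stands in for the ResultSet that open would create once and readItem would consume):

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical stand-in for an ItemReader: open() acquires the "cursor"
// once, readItem() pulls from it until exhausted, close() releases it.
public class ReaderLifecycleSketch {
    private Iterator<Integer> cursor; // stands in for a ResultSet

    public void open() {
        // The real reader would create Connection/Statement/ResultSet here.
        cursor = List.of(1, 2, 3).iterator();
    }

    public Integer readItem() {
        // Returning null signals end-of-input in chunk processing.
        return cursor.hasNext() ? cursor.next() : null;
    }

    public void close() {
        // The real reader would close ResultSet/Statement/Connection here.
        cursor = null;
    }

    public static void main(String[] args) {
        ReaderLifecycleSketch reader = new ReaderLifecycleSketch();
        reader.open();
        Integer item;
        while ((item = reader.readItem()) != null) {
            System.out.println("read: " + item);
        }
        reader.close();
    }
}
```

With jberet.local-tx, the resources acquired in open stay usable in readItem because they are no longer tied to the per-invocation global transaction.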
How to run the batch
- Define an H2 DataSource:
xa-data-source add \ --name=MyDS \ --driver-name=h2 \ --jndi-name=java:jboss/datasources/MyDS \ --user-name=sa \ --password=sa \ --xa-datasource-properties={ \ "URL" => "jdbc:h2:/tmp/localtxtest;AUTO_SERVER=TRUE"}
- Deploy the project
- Access http://localhost:8080/localtxtest-1.0-SNAPSHOT/ to launch the batch through the Servlet mapped at /
- Look at your database to check that the batch worked as expected
Notes
Actually, the ARJUNA016087 warning has disappeared in the latest WildFly 8.2.0.Final without using jberet.local-tx, but I don't know whether this is an intended fix or simply by chance. I'll keep following this discussion.
Multiple deployment use mode is implemented in jberetweb
Posted on Thursday Jan 15, 2015 at 09:06PM in jberetweb
Now jberetweb can operate (start, stop, etc.) distributed batches across multiple deployments. To enable multiple deployment use mode, omit -DjobOperator.jndi and specify -DjobOperator.name=${facade-class-name} (e.g. JobOperatorFacade) in the mvn options when you build it.
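For example, a build for multiple deployment use mode might look like this (the facade class name is whatever your deployments expose):

```
mvn clean package -DjobOperator.name=JobOperatorFacade
```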
In multiple deployment use mode, an "App Name" column is added so you can see at a glance which deployment each job belongs to. Actions such as restart and stop are executed by looking up the appropriate remote EJB interface according to "App Name".
Also, in the Start Job window, "App Name" can be specified; this looks up the appropriate remote interface accordingly, too.