Spring Dynamic Language Support with Groovy

Groovy is a dynamic, object-oriented programming language that runs on the JVM. Its syntax is similar to Java, it can be embedded in Java, and it is compiled to bytecode. Java code can be called from Groovy, and vice versa. Some of Groovy's features are meta and functional programming, dynamic typing (with the def keyword), closures, GroovyBeans, Groovlets, integration with the Bean Scripting Framework (BSF), and support for generics, annotations, and collections.

This article explains fundamental Spring Dynamic Language Support for Groovy in the following ways:

1) By using Java syntax and Spring stereotypes,
2) By using Groovy syntax and Spring stereotypes,
3) By using the inline-script feature,
4) By using Spring Groovy language support (lang:groovy).
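The fourth option, for example, uses Spring's lang namespace to wire a Groovy script as a regular bean. A minimal configuration sketch follows; the bean id, script path, property name and refresh interval are illustrative, and the lang namespace is assumed to be declared on the beans element:

```xml
<!-- Illustrative: exposes a Groovy script as a Spring bean.
     refresh-check-delay lets Spring reload the script when it changes. -->
<lang:groovy id="messenger"
             script-source="classpath:Messenger.groovy"
             refresh-check-delay="5000">
    <lang:property name="message" value="Hello from Groovy!"/>
</lang:groovy>
```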

Used Technologies:

JDK 1.7.0_09
Spring 3.2.0
Groovy 2.0.4
Maven 3.0.4


Chunk Oriented Processing in Spring Batch

Processing big data sets is one of the most important problems in the software world. Spring Batch is a lightweight and robust batch framework for processing such data sets.

The Spring Batch Framework offers ‘TaskletStep Oriented’ and ‘Chunk Oriented’ processing styles. In this article, the Chunk Oriented Processing Model is explained. The TaskletStep Oriented Processing in Spring Batch article is also suggested for how to develop TaskletStep-oriented processing in Spring Batch.

The Chunk Oriented Processing feature arrived with Spring Batch v2.0. It refers to reading the data one item at a time and creating ‘chunks’ that are written out within a transaction boundary. One item is read from an ItemReader and handed to an ItemProcessor. Once the number of items read equals the commit interval, the entire chunk is written out via the ItemWriter, and then the transaction is committed.

Basically, this feature should be used when both reading and writing of data items are required. If only reading or only writing is required, TaskletStep-oriented processing can be used instead.

The Chunk Oriented Processing Model exposes three important interfaces, ItemReader, ItemProcessor and ItemWriter, via the org.springframework.batch.item package.

ItemReader : This interface is used for providing the data. It reads the data which will be processed.

ItemProcessor : This interface is used for item transformation. It processes an input object and transforms it into an output object.

ItemWriter : This interface is used for generic output operations. It writes the data transformed by the ItemProcessor. For example, the data can be written to a database, to memory, or to an output stream. In this sample application, we will write to a database.
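The read-process-write cycle described above can be sketched in plain Java. Note that the three nested interfaces below are simplified stand-ins for Spring Batch's real ItemReader, ItemProcessor and ItemWriter, and the loop is only an illustration of the chunk mechanics, not the framework's actual implementation:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ChunkLoopSketch {
    // Simplified stand-ins for the Spring Batch interfaces (not the real API)
    interface ItemReader<T> { T read(); }                    // null signals end of input
    interface ItemProcessor<I, O> { O process(I item); }
    interface ItemWriter<T> { void write(List<? extends T> chunk); }

    static <I, O> void run(ItemReader<I> reader, ItemProcessor<I, O> processor,
                           ItemWriter<O> writer, int commitInterval) {
        List<O> chunk = new ArrayList<>();
        I item;
        while ((item = reader.read()) != null) {
            chunk.add(processor.process(item));          // one item at a time
            if (chunk.size() == commitInterval) {        // chunk is full:
                writer.write(chunk);                     // write it out ...
                chunk.clear();                           // ... and "commit"
            }
        }
        if (!chunk.isEmpty()) writer.write(chunk);       // final partial chunk
    }

    public static void main(String[] args) {
        Iterator<Integer> source = List.of(1, 2, 3, 4, 5).iterator();
        List<String> written = new ArrayList<>();
        run(() -> source.hasNext() ? source.next() : null,   // reader
            i -> "item-" + i,                                // processor
            written::addAll,                                 // writer
            2);                                              // commit interval
        System.out.println(written); // [item-1, item-2, item-3, item-4, item-5]
    }
}
```

In the real framework the commit interval and the transaction boundary come from the step configuration; here the "commit" is just clearing the buffer.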

Let us take a look at how to develop the Chunk Oriented Processing Model.

Used Technologies:

JDK 1.7.0_09
Spring 3.1.3
Spring Batch 2.1.9
Hibernate 4.1.8
Tomcat JDBC 7.0.27
MySQL 5.5.8
MySQL Connector 5.1.17
Maven 3.0.4


Hazelcast Distributed Execution with Spring

The ExecutorService feature arrived with Java 5 and lives in the java.util.concurrent package. It extends the Executor interface and provides thread-pool functionality for executing asynchronous short tasks. The Java Executor Service Types article is suggested for a look at basic ExecutorService implementations.

Also, ThreadPoolExecutor is a very useful implementation of the ExecutorService interface. It extends AbstractExecutorService, which provides default implementations of the ExecutorService execution methods. It offers improved performance when executing large numbers of asynchronous tasks and maintains basic statistics, such as the number of completed tasks. The article How to develop and monitor Thread Pool Services by using Spring is also suggested for that topic.
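As a quick refresher, the basic (undistributed) usage pattern looks like the following; the pool size and the squaring task are arbitrary examples:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ExecutorServiceDemo {
    public static List<Integer> squareAll(List<Integer> inputs) throws Exception {
        // Fixed-size thread pool: a ThreadPoolExecutor under the hood
        ExecutorService executor = Executors.newFixedThreadPool(4);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int n : inputs) {
                final int value = n;
                futures.add(executor.submit(() -> value * value)); // short async task
            }
            List<Integer> results = new ArrayList<>();
            for (Future<Integer> f : futures) {
                results.add(f.get()); // block until each task completes
            }
            return results;
        } finally {
            executor.shutdown(); // accept no new tasks; let running ones finish
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(squareAll(List.of(1, 2, 3, 4))); // [1, 4, 9, 16]
    }
}
```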

So far, we have only talked about the undistributed Executor Service implementation. Let us also investigate the Distributed Executor Service.

The Hazelcast Distributed Executor Service feature is a distributed implementation of java.util.concurrent.ExecutorService. It allows business logic to be executed in the cluster. There are four alternative ways to use it:

1) The logic can be executed on a specific, chosen cluster member.
2) The logic can be executed on the member owning the chosen key.
3) The logic can be executed on the member Hazelcast will pick.
4) The logic can be executed on all or a subset of the cluster members.
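Option 2 above can be imitated in a single JVM with plain java.util.concurrent: tasks for the same key are always routed to the same single-threaded executor, just as Hazelcast routes a task to the member owning the key. This is a local analogue for illustration only, not the Hazelcast API:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class KeyOwnerRoutingSketch {
    private final ExecutorService[] owners;

    public KeyOwnerRoutingSketch(int members) {
        owners = new ExecutorService[members];
        for (int i = 0; i < members; i++) {
            owners[i] = Executors.newSingleThreadExecutor(); // one local "member" each
        }
    }

    // Route the task to the executor "owning" the key's hash partition,
    // loosely imitating Hazelcast's execute-on-key-owner behavior
    public <T> Future<T> submitToKeyOwner(Object key, Callable<T> task) {
        int owner = Math.floorMod(key.hashCode(), owners.length);
        return owners[owner].submit(task);
    }

    public void shutdown() {
        for (ExecutorService e : owners) e.shutdown();
    }

    public static void main(String[] args) throws Exception {
        KeyOwnerRoutingSketch cluster = new KeyOwnerRoutingSketch(3);
        Future<String> f = cluster.submitToKeyOwner("user-42",
                () -> "processed on owner of user-42");
        System.out.println(f.get());
        cluster.shutdown();
    }
}
```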

This article shows how to develop Distributed Executor Service via Hazelcast and Spring.

Used Technologies:

JDK 1.7.0_09
Spring 3.1.3
Hazelcast 2.4
Maven 3.0.4


TaskletStep Oriented Processing in Spring Batch

Many enterprise applications require batch processing to process billions of transactions every day. These big transaction sets have to be processed without performance problems. Spring Batch is a lightweight and robust batch framework to process these big data sets.

Spring Batch offers ‘TaskletStep Oriented’ and ‘Chunk Oriented’ processing styles. In this article, the TaskletStep Oriented Processing Model is explained.

Let us investigate the fundamental Spring Batch components:

Job : An entity that encapsulates an entire batch process. Steps and Tasklets are defined under a Job.

Step : A domain object that encapsulates an independent, sequential phase of a batch job.

JobInstance : A batch domain object representing a uniquely identifiable job run; its identity is given by the pair of Job and JobParameters.

JobParameters : A value object representing runtime parameters to a batch job.

JobExecution : A JobExecution refers to the technical concept of a single attempt to run a Job. An execution may end in failure or success, but the JobInstance corresponding to a given execution will not be considered complete unless the execution completes successfully.

JobRepository : An interface responsible for the persistence of batch meta-data entities. In the following sample, an in-memory repository is used via MapJobRepositoryFactoryBean.

JobLauncher : An interface exposing the run method, which launches and controls the defined jobs.

Tasklet : An interface exposing the execute method, which is called repeatedly until it either returns RepeatStatus.FINISHED or throws an exception to signal a failure. It is used when neither readers nor writers are required, as in the following sample.
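The repeat-until-FINISHED contract can be sketched with stdlib Java. RepeatStatus and Tasklet below are simplified stand-ins for the Spring Batch types of the same name, not the real API:

```java
public class TaskletLoopSketch {
    enum RepeatStatus { CONTINUABLE, FINISHED }          // simplified stand-in

    interface Tasklet { RepeatStatus execute() throws Exception; }

    // A step keeps calling the tasklet until it reports FINISHED
    // (an exception thrown by execute() would fail the step)
    static int runStep(Tasklet tasklet) throws Exception {
        int calls = 0;
        RepeatStatus status;
        do {
            status = tasklet.execute();
            calls++;
        } while (status == RepeatStatus.CONTINUABLE);
        return calls;
    }

    public static void main(String[] args) throws Exception {
        int[] remaining = {3};  // pretend we have 3 units of work
        int calls = runStep(() -> --remaining[0] > 0
                ? RepeatStatus.CONTINUABLE
                : RepeatStatus.FINISHED);
        System.out.println("tasklet executed " + calls + " times"); // 3 times
    }
}
```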

Let us take a look at how to develop the TaskletStep Oriented Processing Model.

Used Technologies:

JDK 1.7.0_09
Spring 3.1.3
Spring Batch 2.1.9
Maven 3.0.4


Coherence Event Processing by using Map Trigger Feature

This article shows how to process Coherence events by using Map Triggers. The Distributed Data Management in Oracle Coherence article is suggested for basic configuration and implementation of the Oracle Coherence API.

Map Triggers are one of the most important features of Oracle Coherence for building a highly customized cache management system. A MapTrigger represents a functional agent that allows mutating operations against an underlying map to be validated, rejected or modified. Map triggers can also prevent invalid transactions, enforce security, provide event logging and auditing, and gather statistics on data modifications.

For example, suppose we have code working with a NamedCache, and we want to change an entry’s behavior or contents before the entry is inserted into the map. This change can be made without modifying the existing code by enabling a map trigger.
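The idea can be illustrated with plain Java: a trigger hook inspects, modifies or rejects an entry before it reaches the underlying map. The Trigger interface below is a simplified stand-in for Coherence's MapTrigger, not the real API:

```java
import java.util.HashMap;
import java.util.Map;

public class TriggeredMapSketch<K, V> {
    // Simplified stand-in for a map trigger: validate/modify a value before insert
    public interface Trigger<V> {
        V onInsert(V value); // return the (possibly modified) value, or throw to reject
    }

    private final Map<K, V> map = new HashMap<>();
    private final Trigger<V> trigger;

    public TriggeredMapSketch(Trigger<V> trigger) { this.trigger = trigger; }

    public void put(K key, V value) {
        map.put(key, trigger.onInsert(value)); // trigger runs before the mutation
    }

    public V get(K key) { return map.get(key); }

    public static void main(String[] args) {
        // Trigger that normalizes user names and rejects blank ones
        TriggeredMapSketch<String, String> users = new TriggeredMapSketch<>(v -> {
            if (v.isBlank()) throw new IllegalArgumentException("rejected: blank name");
            return v.trim().toUpperCase();
        });
        users.put("user-1", "  alice ");
        System.out.println(users.get("user-1")); // ALICE
    }
}
```

The calling code only sees put and get; all validation and normalization happens in the trigger, which is the point of the Coherence feature.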

There are two ways to add the Map Trigger feature to an application:

1) A MapTriggerListener can be used to register a MapTrigger with a NamedCache.
2) The class-factory mechanism can be used in the coherence-cache-config.xml configuration file.

In the following sample application, MapTrigger functionality is implemented in the first way. A new cluster called OTV is created, and a User bean is distributed via a user-map NamedCache object shared between two members of the cluster.

Used Technologies:

JDK 1.6.0_35
Spring 3.1.2
Coherence 3.7.1
Maven 3.0.2


How to distribute Spring Beans by using EntryProcessor and PortableObject features in Oracle Coherence

This article shows how to distribute Spring beans by using the EntryProcessor and Portable Object Format (POF) features in Oracle Coherence.

Coherence supports a lock-free programming model through the EntryProcessor API. This feature improves system performance by reducing network access and performing an implicit low-level lock on the entries. This implicit low-level locking functionality is different from the explicit lock(key) provided by the ConcurrentMap API.
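The entry-processor idea of sending the mutation to the entry instead of locking it explicitly has a rough single-JVM analogue in ConcurrentHashMap.compute, which runs an update function atomically per key. This is only an analogue for illustration, not the Coherence EntryProcessor API:

```java
import java.util.concurrent.ConcurrentHashMap;

public class EntryProcessorAnalogy {
    // Atomically add delta to the value stored under key; the lambda is the
    // "processor" that runs against the entry, with no explicit lock/unlock.
    public static int addToCounter(ConcurrentHashMap<String, Integer> cache,
                                   String key, int delta) {
        return cache.compute(key, (k, v) -> v == null ? delta : v + delta);
    }

    public static void main(String[] args) {
        ConcurrentHashMap<String, Integer> cache = new ConcurrentHashMap<>();
        cache.put("counter", 10);
        System.out.println(addToCounter(cache, "counter", 5)); // 15
    }
}
```

In Coherence the same shape applies across the network: the processor is shipped to the node owning the key, so the data never has to travel to the caller and back.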

Explicit locking, the Transaction Framework API and the Coherence Resource Adapter are other Coherence transaction options besides entry processors. For detailed information about Coherence transaction options, please look at the References section. In addition, the Distributed Data Management in Oracle Coherence article can be suggested for a Coherence explicit-locking implementation.

Portable Object Format (POF) is a platform-independent serialization format. It allows equivalent Java, .NET and C++ objects to be encoded into an identical sequence of bytes. POF is suggested for system performance, since POF serialization and deserialization perform better than standard Java serialization (according to the Coherence reference document, in a simple test class with a String, a long, and three ints, (de)serialization was seven times faster than standard Java serialization).

Coherence offers many cache types, such as Distributed (or Partitioned), Replicated, Optimistic, Near, Local and Remote caches. A distributed cache is defined as a collection of data that is distributed (or partitioned) across any number of cluster nodes such that exactly one node in the cluster is responsible for each piece of data in the cache, and that responsibility is distributed (or load-balanced) among the cluster nodes. Please note that the distributed cache type is used in this article. Since the other cache types are outside the scope of this article, please look at the References section or the Coherence reference document; their configurations are very similar to the distributed cache configuration.

The How to distribute Spring Beans by using Coherence article, which covers explicit locking with standard Java serialization, is suggested for comparing the two implementations (EntryProcessor with Portable Object Format (POF) versus explicit locking with standard Java serialization).

In this article, a new cluster named OTV has been created, and a Spring bean has been distributed by using a cache object named user-cache, shared between two members of the cluster.


Dynamic Property Management in Spring

Static and dynamic properties are very important both for operational management and for changing the behavior of the system in production. In particular, dynamic parameters reduce service interruptions. This article shows how to manage dynamic properties in Spring applications by using Quartz.

The Multi-Job Scheduling Service by using Spring and Quartz article is suggested for Spring and Quartz integration.
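The underlying idea, a property holder refreshed on a schedule so that changed values take effect without a restart, can be sketched with the stdlib alone. The class and property names are illustrative, and the ScheduledExecutorService stands in for the Quartz job that would fire the refresh in the article's setup:

```java
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

public class DynamicPropertyHolder {
    private final AtomicReference<Map<String, String>> current;
    private final Supplier<Map<String, String>> source; // e.g. a file or DB read

    public DynamicPropertyHolder(Supplier<Map<String, String>> source) {
        this.source = source;
        this.current = new AtomicReference<>(source.get());
    }

    // In the article a Quartz job would trigger this periodically;
    // here a ScheduledExecutorService plays that role.
    public ScheduledExecutorService scheduleRefresh(long periodSeconds) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(this::refresh, periodSeconds, periodSeconds,
                TimeUnit.SECONDS);
        return scheduler;
    }

    public void refresh() { current.set(source.get()); }

    public String get(String key) { return current.get().get(key); }

    public static void main(String[] args) {
        AtomicReference<Map<String, String>> backing =
                new AtomicReference<>(Map.of("batch.size", "10"));
        DynamicPropertyHolder props = new DynamicPropertyHolder(backing::get);
        System.out.println(props.get("batch.size")); // 10
        backing.set(Map.of("batch.size", "50"));     // property changed at runtime
        props.refresh();                             // normally fired by the scheduler
        System.out.println(props.get("batch.size")); // 50
    }
}
```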

Let us look at Dynamic Property Management in Spring.

Used Technologies:

JDK 1.6.0_31
Spring 3.1.1
Quartz 1.8.5
Maven 3.0.2


Spring Testing Support with TestNG

TestNG is a test framework designed to cover all categories of tests: unit, functional, end-to-end, integration, etc. It includes many features, such as flexible test configuration, support for data-driven testing (with @DataProvider) and a powerful execution model (no more TestSuite).

Spring testing support covers very useful and important features for unit and integration testing of Spring-based applications. The org.springframework.test.context.testng package provides support classes for TestNG-based test cases. This article shows how to test Spring service-layer components by using the Spring and TestNG integration. The next article will show how to test Spring data-access-layer components by using the same integration.

Used Technologies:

JDK 1.6.0_31
Spring 3.1.1
TestNG 6.4
Maven 3.0.2


JSF2 + Primefaces3 + Spring3 & Hibernate4 Integration Project

This article shows how to integrate the JSF2, PrimeFaces3, Spring3 and Hibernate4 technologies. It provides a general project template for Java developers.

Also, if Spring is not used for the business and data-access layers, the JSF – PrimeFaces & Hibernate Integration Project can be used instead.

Used Technologies:

JDK 1.6.0_31
Spring 3.1.1
JSF 2.1
Hibernate 4.1.0
Primefaces 3.1.1
MySQL Connector 5.1.17
MySQL 5.5.8
c3p0 0.9.1.2
Tomcat 7.0
Maven 3.0.2


Spring Remoting Support with Http Invoker

Spring HTTP Invoker is an important solution for Java-to-Java remoting. This technology uses the standard Java serialization mechanism to expose services through HTTP and can be thought of as an alternative to the custom serialization found in Hessian and Burlap. However, it is provided only by Spring, so both client and server applications have to be based on Spring.

Spring supports the HTTP invoker infrastructure via HttpInvokerProxyFactoryBean and HttpInvokerServiceExporter. HttpInvokerServiceExporter exports the specified service bean as an HTTP invoker service endpoint, accessible via an HTTP invoker proxy. HttpInvokerProxyFactoryBean is a factory bean for HTTP invoker proxies.
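The standard Java serialization that HTTP invoker rides on is plain java.io serialization; the round trip below shows what effectively happens to a payload on the wire. The UserResult DTO is illustrative; the actual Spring classes wrap invocations in their own serializable envelope objects:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationRoundTrip {
    // An illustrative serializable DTO, as any HTTP invoker payload must be
    public static class UserResult implements Serializable {
        private static final long serialVersionUID = 1L;
        public final String name;
        public UserResult(String name) { this.name = name; }
    }

    // What the exporter side effectively does before writing the response body
    public static byte[] serialize(Object obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        return bytes.toByteArray();
    }

    // What the client-side proxy effectively does with the response body
    public static Object deserialize(byte[] data)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(data))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] wire = serialize(new UserResult("alice"));
        UserResult back = (UserResult) deserialize(wire);
        System.out.println(back.name); // alice
    }
}
```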

The Spring Remoting Support & RMI article is also suggested as an introduction to Spring Remoting and an RMI service & client sample project.

Let us look at Spring Remoting support by developing an HTTP invoker service and client.

Used Technologies:

JDK 1.6.0_31
Spring 3.1.1
Tomcat 7.0
Maven 3.0.2
