Spring Batch 2.2 – JavaConfig Part 4: Job inheritance

One important feature in XML is the possibility to write abstract job definitions like these:

	<job id="abstractJob" abstract="true">
		<listeners>
			<listener ref="commonJobExecutionListener"/>
		</listeners>
	</job>

Concrete job definitions may inherit parts of their definition from it:

	<job id="myJob" parent="abstractJob">
		...
	</job>

In enterprise environments it’s often necessary to define common functionality, for example a common job protocol, common logging or common return code mapping, but of course there are many more use cases. You achieve this by registering certain listeners, and with the parent functionality above, it’s easy to register listeners in an abstract job. And often you have very similar jobs in a certain line of business that share more than just listeners, maybe they have the same reader and writer, or the same skip policy etc. In XML you extract this common stuff to abstract job definitions. How can you achieve this with Java based configuration? What are the advantages / disadvantages?
This is the fourth post about the new Java based configuration features in Spring Batch 2.2. Previous posts are about a comparison between the new Java DSL and XML, JobParameters, ExecutionContexts and StepScope and profiles and environments. Future posts will be about modular configurations and partitioning and multi-threaded step, everything regarding Java based configuration, of course. You can find the JavaConfig code examples on Github.

Builders and builder factories

There’s no direct equivalent to abstract job definitions in Java based configuration. But we have builders for jobs and steps, and we can prepare them with default functionality. If you look at the JobBuilderFactory in Spring Batch 2.2, you see that it creates a JobBuilder and calls the method repository on it:

public class JobBuilderFactory {
 
	private JobRepository jobRepository;
 
	public JobBuilderFactory(JobRepository jobRepository) {
		this.jobRepository = jobRepository;
	}
 
	public JobBuilder get(String name) {
		JobBuilder builder = new JobBuilder(name).repository(jobRepository);
		return builder;
	}
 
}

That’s exactly how you implement job inheritance in Java based configuration: create a custom builder factory for your job or step and add the default functionality by calling the appropriate methods on the builder. The following CustomJobBuilderFactory allows for adding JobExecutionListeners to a JobBuilder.

public class CustomJobBuilderFactory extends JobBuilderFactory {
 
	private JobExecutionListener[] listeners;
 
	public CustomJobBuilderFactory(JobRepository jobRepository, JobExecutionListener... listeners) {
		super(jobRepository);
		this.listeners = listeners;
	}
 
	@Override
	public JobBuilder get(String name) {
		JobBuilder jobBuilder = super.get(name);
		for (JobExecutionListener jobExecutionListener: listeners){
			jobBuilder = jobBuilder.listener(jobExecutionListener);
		}
		return jobBuilder;
	}
 
}

Configuration with delegation

Now that we have our custom job builder factory, how do we use it? We create a common configuration class that contains the listener we want to add to every job, and the factory, of course:

@Configuration
public class CommonJobConfigurationForDelegation {
 
	@Autowired
	private JobRepository jobRepository;
 
	@Bean
	public CustomJobBuilderFactory customJobBuilders(){
		return new CustomJobBuilderFactory(jobRepository, protocolListener());
	}
 
	@Bean
	public ProtocolListener protocolListener(){
		return new ProtocolListener();
	}
 
}

As its name suggests, it is meant to be included by delegation in concrete job configurations like this:

@Configuration
@Import(CommonJobConfigurationForDelegation.class)
public class DelegatingConfigurationJobConfiguration{
 
	@Autowired
	private CommonJobConfigurationForDelegation commonJobConfiguration;
 
	@Bean
	public Job delegatingConfigurationJob(){
		return commonJobConfiguration.customJobBuilders()
				.get("delegatingConfigurationJob")
				.start(step())
				.build();
	}
 
	...
}

Why delegation? As you might know, there are two ways in Java to reuse common functionality: either you delegate to an object that performs the logic, or you inherit the functionality from a super class. In the case above we use delegation, because we don’t inherit from CommonJobConfigurationForDelegation; we just import it and delegate the creation of the JobBuilder to its method customJobBuilders. In general I prefer delegation over inheritance because it is less strict and couples classes less tightly: we can extend only one class, but we can delegate to as many objects as we want.
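Stripped of all Spring Batch types, the delegation pattern can be sketched in a few lines of framework-free Java. All names here (JobSketch, CommonConfigurationSketch and so on) are hypothetical and only illustrate the mechanics: a common object hands out builders that already carry the shared defaults, and the concrete configuration merely adds its specifics.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a Spring Batch Job: just a name plus listeners.
class JobSketch {
	final String name;
	final List<String> listeners;
	JobSketch(String name, List<String> listeners) {
		this.name = name;
		this.listeners = listeners;
	}
}

// Hypothetical stand-in for a JobBuilder.
class JobSketchBuilder {
	private final String name;
	private final List<String> listeners = new ArrayList<String>();
	JobSketchBuilder(String name) { this.name = name; }
	JobSketchBuilder listener(String listener) {
		listeners.add(listener);
		return this;
	}
	JobSketch build() { return new JobSketch(name, listeners); }
}

// Plays the role of CommonJobConfigurationForDelegation: every builder it
// hands out already carries the common listener.
class CommonConfigurationSketch {
	JobSketchBuilder customJobBuilders(String name) {
		return new JobSketchBuilder(name).listener("protocolListener");
	}
}

// Plays the role of the concrete job configuration: it delegates the builder
// creation and only adds the job-specific parts.
class ConcreteConfigurationSketch {
	private final CommonConfigurationSketch common = new CommonConfigurationSketch();
	JobSketch job() {
		return common.customJobBuilders("myJob")
				.listener("jobSpecificListener")
				.build();
	}
}
```

Calling new ConcreteConfigurationSketch().job() yields a job named myJob that carries both the common and the job-specific listener, which is exactly what the Spring version above does with the real Job and JobBuilder types.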
But let’s compare the Java configuration to the XML configuration now. The approaches are technically very different, though they achieve the same result. In XML we define abstract Spring bean definitions that are completed with the information in the concrete job definition. In Java we prepare a builder with some default calls, and the concrete job is created with the prepared builder. The first thing you notice: the Java approach is much more natural, with less Spring magic in it. Now let’s assume that the parent functionality resides in some common library, in our case either the XML file with the abstract job definition or the class CommonJobConfigurationForDelegation. The common library is added as a Maven dependency. Let’s see how the everyday handling differs:
XML: In Eclipse you cannot open the parent XML with the ‘Open resource’ shortcut; you have to search for it by hand in the dependencies. And even if you find it, there’s no direct connection between the concrete and the parent job definition; you have to do a full-text search in the parent XML to find it.
Java: You just take the class with the concrete job definition and do ‘Open implementation’ on the method customJobBuilders, and you jump directly to the place where the common stuff is defined.
The advantages are obvious, aren’t they?

Configuration with inheritance

I said I prefer delegation over inheritance, but that doesn’t mean that there aren’t valid use cases for inheritance. Let’s take a look at a configuration class designed for inheritance:

public abstract class CommonJobConfigurationForInheritance {
 
	@Autowired
	private JobRepository jobRepository;
 
	@Autowired
	private PlatformTransactionManager transactionManager;
 
	@Autowired
	private InfrastructureConfiguration infrastructureConfiguration;
 
	protected CustomJobBuilderFactory customJobBuilders(){
		return new CustomJobBuilderFactory(jobRepository, protocolListener());
	}
 
	protected CustomStepBuilderFactory<Partner,Partner> customStepBuilders(){
		return new CustomStepBuilderFactory<Partner,Partner>(
				jobRepository,
				transactionManager,
				completionPolicy(),
				reader(),
				processor(),
				writer(),
				logProcessListener());
	}
 
	@Bean
	public CompletionPolicy completionPolicy(){
		return new SimpleCompletionPolicy(1);
	}
 
	public abstract ItemProcessor<Partner,Partner> processor();
 
	@Bean
	public FlatFileItemReader<Partner> reader(){
		FlatFileItemReader<Partner> itemReader = new FlatFileItemReader<Partner>();
		itemReader.setLineMapper(lineMapper());
		itemReader.setResource(new ClassPathResource("partner-import.csv"));
		return itemReader;
	}
 
	@Bean
	public LineMapper<Partner> lineMapper(){
		DefaultLineMapper<Partner> lineMapper = new DefaultLineMapper<Partner>();
		DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
		lineTokenizer.setNames(new String[]{"name","email","gender"});
		lineTokenizer.setIncludedFields(new int[]{0,2,3});
		BeanWrapperFieldSetMapper<Partner> fieldSetMapper = new BeanWrapperFieldSetMapper<Partner>();
		fieldSetMapper.setTargetType(Partner.class);
		lineMapper.setLineTokenizer(lineTokenizer);
		lineMapper.setFieldSetMapper(fieldSetMapper);
		return lineMapper;
	}
 
	@Bean
	public ItemWriter<Partner> writer(){
		JdbcBatchItemWriter<Partner> itemWriter = new JdbcBatchItemWriter<Partner>();
		itemWriter.setSql("INSERT INTO PARTNER (NAME, EMAIL) VALUES (:name,:email)");
		itemWriter.setDataSource(infrastructureConfiguration.dataSource());
		itemWriter.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Partner>());
		return itemWriter;
	}
 
	@Bean
	public ProtocolListener protocolListener(){
		return new ProtocolListener();
	}
 
	@Bean
	public LogProcessListener logProcessListener(){
		return new LogProcessListener();
	}
 
}

We have two builder factories, one for the job and one for the step. They are protected and may be used by a subclass. If you’re interested in the implementation of the CustomStepBuilderFactory, take a look at Github. The builder factories use a lot of the components defined in this configuration class. The processor has an abstract definition, so a subclass has to add a processor. All the other components may be overridden by a subclass if needed. Let’s take a look at such a subclass.

@Configuration
public class InheritedConfigurationJobConfiguration extends CommonJobConfigurationForInheritance{
 
	@Bean
	public Job inheritedConfigurationJob(){
		return customJobBuilders().get("inheritedConfigurationJob")
				.start(step())
				.build();
	}
 
	@Bean
	public Step step(){
		return customStepBuilders().get("step")
				.faultTolerant()
				.skipLimit(10)
				.skip(UnknownGenderException.class)
				.listener(logSkipListener())
				.build();
	}
 
	@Override
	@Bean
	public ItemProcessor<Partner, Partner> processor() {
		return new ValidationProcessor();
	}
 
	@Override
	@Bean
	public CompletionPolicy completionPolicy() {
		return new SimpleCompletionPolicy(3);
	}
 
	@Bean
	public LogSkipListener logSkipListener(){
		return new LogSkipListener();
	}
 
}

So, what do we have here? This concrete configuration class implements the processor method, of course. Furthermore it overrides the definition of the CompletionPolicy. And then it uses the builder factories to create the job and the step, and it adds fault tolerance to the step.
Let’s take a look at the advantages and disadvantages. The coupling between the parent and the concrete definition is very tight, but in this case that’s fine. We want the parent to define required components (the abstract method) and overridable default components (the other methods), and you cannot do this with delegation. Of course, you can only inherit from one parent class. Use this pattern if you clearly want such a tight coupling, for example if you have a lot of very similar jobs that share the same type of components. In general you should have only one level of inheritance; take it as a bad smell and a warning sign if there are more! And of course it’s always possible to combine delegation and inheritance.
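The mechanics of this inheritance variant can likewise be reduced to framework-free Java. Again, all names are hypothetical; the sketch only shows how an abstract method forces the subclass to supply a component while a non-abstract method provides an overridable default:

```java
// Hypothetical stand-in for CommonJobConfigurationForInheritance: the parent
// declares one required component as an abstract method and offers an
// overridable default for another one.
abstract class CommonConfigurationForInheritanceSketch {

	// Required component: every subclass must supply one (mirrors the
	// abstract processor() method above).
	abstract String processor();

	// Overridable default (mirrors completionPolicy() returning a
	// SimpleCompletionPolicy(1)).
	int chunkSize() {
		return 1;
	}

	// Uses both components, whichever versions the subclass provides.
	String describeStep() {
		return "chunk=" + chunkSize() + ", processor=" + processor();
	}
}

// Hypothetical stand-in for InheritedConfigurationJobConfiguration: it must
// implement processor() and chooses to override the default chunk size.
class InheritedConfigurationSketch extends CommonConfigurationForInheritanceSketch {

	@Override
	String processor() {
		return "validationProcessor";
	}

	@Override
	int chunkSize() {
		return 3;
	}
}
```

Here describeStep() on an InheritedConfigurationSketch picks up both the implemented processor and the overridden chunk size, just as the builder factories in the Spring parent class pick up whatever beans the concrete configuration class provides.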

Conclusion

Inheritance between jobs is important in enterprise environments. It’s achievable in XML and in Java based configuration in very different technical ways. The Java way may be a little bit more verbose, but has a lot of advantages that I pointed out in the paragraphs above.
