Spring Batch 2.2 – JavaConfig Part 5: Modular configurations

When you add more jobs to an ApplicationContext, you will soon run into problems with naming and the uniqueness of beans. Normally you define one configuration class or one configuration XML file per job, and then it feels natural to name the ItemReader bean reader. When both configurations are added to the same ApplicationContext, we have two beans with the same name: reader. If we're lucky, we get an Exception when starting the ApplicationContext; if we're not, the beans silently override each other. How can we deal with this problem?
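To make the collision concrete, here is a minimal sketch of the situation (the class names JobOneConfiguration and JobTwoConfiguration and the reader contents are invented for illustration): two job configurations that each define a bean named reader.

```java
import java.util.Arrays;

import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical configuration for the first job.
@Configuration
class JobOneConfiguration {

	// The bean name is derived from the method name: "reader".
	@Bean
	public ItemReader<String> reader() {
		return new ListItemReader<String>(Arrays.asList("a", "b"));
	}
}

// Hypothetical configuration for a second job.
@Configuration
class JobTwoConfiguration {

	// Also named "reader" -- if both configurations end up in the same
	// ApplicationContext, one definition overrides the other (or the
	// context fails to start, depending on the settings).
	@Bean
	public ItemReader<String> reader() {
		return new ListItemReader<String>(Arrays.asList("x", "y"));
	}
}
```

Renaming the beans per job (jobOneReader, jobTwoReader) works, but it is exactly the kind of boilerplate the modular configuration feature lets us avoid.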
This is the fifth post about the new Java-based configuration features in Spring Batch 2.2. Previous posts cover a comparison between the new Java DSL and XML, JobParameters, ExecutionContexts and StepScope, profiles and environments, and job inheritance. Future posts will cover partitioning and multi-threaded steps – everything regarding Java-based configuration, of course. You can find the JavaConfig code examples on Github.

The solution to this problem is the following construct: instead of having just one ApplicationContext, we have one for each job. This way we have no problems with bean naming or overriding. The infrastructure resides in a common parent ApplicationContext, and we access the jobs via the JobRegistry defined in the parent context.

[Figure: ModularConfigurations – a parent context holding the infrastructure, with one child context per job]

Let's take a look at a modular job configuration:

@Configuration
@EnableBatchProcessing(modular=true)
public class ModularJobConfiguration {
 
	@Bean
	public DataSource dataSource(){
		EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
		return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
				.addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
				.addScript("classpath:schema-partner.sql")
				.setType(EmbeddedDatabaseType.HSQL)
				.build();
	}
 
	@Bean
	public ApplicationContextFactory someJobs() {
		return new GenericApplicationContextFactory(FlatfileToDbJobConfiguration.class);
	}
 
	@Bean
	public ApplicationContextFactory someMoreJobs() {
		return new GenericApplicationContextFactory(FlatfileToDbWithParametersJobConfiguration.class);
	}
 
}
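The referenced classes are ordinary job configurations. As a hedged sketch (the real FlatfileToDbJobConfiguration in the Github examples reads a flat file into the database; here a ListItemReader and a logging writer keep the sketch self-contained), such a child configuration might look like this. Note that it can safely use a generic bean name like reader, because it lives in its own child context:

```java
import java.util.Arrays;
import java.util.List;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlatfileToDbJobConfiguration {

	// The builder factories come from the infrastructure set up by
	// @EnableBatchProcessing in the parent context.
	@Autowired
	private JobBuilderFactory jobBuilders;

	@Autowired
	private StepBuilderFactory stepBuilders;

	@Bean
	public Job flatfileToDbJob() {
		return jobBuilders.get("flatfileToDbJob")
				.start(step())
				.build();
	}

	@Bean
	public Step step() {
		return stepBuilders.get("step")
				.<String, String>chunk(2)
				.reader(reader())
				.writer(writer())
				.build();
	}

	// A generic bean name like "reader" is fine here: every job gets
	// its own child ApplicationContext, so names cannot collide.
	@Bean
	public ItemReader<String> reader() {
		return new ListItemReader<String>(Arrays.asList("partner1", "partner2"));
	}

	@Bean
	public ItemWriter<String> writer() {
		return new ItemWriter<String>() {
			@Override
			public void write(List<? extends String> items) {
				for (String item : items) {
					System.out.println(item);
				}
			}
		};
	}
}
```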

This way, an AutomaticJobRegistrar is added to the ApplicationContext; it is responsible for creating a separate child ApplicationContext for each bean of type ApplicationContextFactory and for registering the jobs defined there. So we have two jobs registered in two different child ApplicationContexts, and we can access them through the JobRegistry:

@ContextConfiguration(classes=ModularJobConfiguration.class)
@RunWith(SpringJUnit4ClassRunner.class)
public class ModularJobTests {
 
	@Autowired
	private JobRegistry jobRegistry;
 
	@Autowired
	private JobLauncher jobLauncher;
 
	@Autowired
	private DataSource dataSource;
 
	private JdbcTemplate jdbcTemplate;
 
	@Before
	public void setup(){
		jdbcTemplate = new JdbcTemplate(dataSource);
	}
 
	@Test
	public void testLaunchJob() throws Exception {
		Job job = jobRegistry.getJob("flatfileToDbJob");
		jobLauncher.run(job, new JobParameters());
		assertThat(jdbcTemplate.queryForObject("select count(*) from partner",Integer.class),is(6));
		job = jobRegistry.getJob("flatfileToDbWithParametersJob");
		assertThat(job.getName(),is("flatfileToDbWithParametersJob"));
	}
 
}

Conclusion

If you want to separate jobs into different contexts so that they don't interfere with each other's bean definitions, make use of this modular configuration feature.
