In this example I show how to use Spring Boot to create a simple Spring Batch application.

I used the same architecture in production software. Spring Boot is great for improving productivity, but the documentation is still sparse (the framework is still at version 0.5).

You can find the complete example here:

The only requirement is to have a MySQL database installed.

You can find more information in the source code.

If you start a new project in an IDE, you can simply create a plain new Maven project (without Spring wizards or similar).

Spring Boot

Spring Boot offers the developer some great features to improve productivity:

The main difficulties I encountered:


For the example we load a fixed-length flat file into a MySQL database.

Here is the content of the file:

Here is the result in the DB:

Directory structure:


We have to declare a parent; this is mandatory.

The spring-boot-starter-parent has a reference to all the boot modules:
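A sketch of the parent declaration (the version shown is an assumption based on the 0.5 milestone mentioned above):

```xml
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <!-- version is an assumption; use the 0.5 milestone you are working with -->
    <version>0.5.0.M6</version>
</parent>
```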


Because we use Spring Batch, JPA and MySQL, we declare the following dependencies:
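A sketch of the dependency section (the artifact names are the standard Boot starters plus the MySQL driver; versions are managed by the parent):

```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-batch</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
    </dependency>
</dependencies>
```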


Input flat file and FixedLengthTokenizer

The input fixed-length flat file has this structure:
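A sketch of the layout (the first two column ranges come from the tokenizer described below; the 4-character year column is an assumption):

```text
columns  1-30   firstName
columns 31-60   familyName
columns 61-64   year
```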

To read the file correctly we created a PersonFixedLengthTokenizer:

RangeArrayPropertyEditor range = new RangeArrayPropertyEditor();
// we define how to split the text line: columns 1 to 30 go to 'firstName', 31 to 60 to 'familyName'
range.setAsText("1-30,31-60,61-64"); // the 'year' range (61-64) is an assumption
// names have to be the same as the properties in the model class
setNames(new String[]{"firstName", "familyName", "year"});
setColumns((Range[]) range.getValue());

The tokenizer assigns the first 30 characters of the line to the ‘firstName’ attribute, the characters from 31 to 60 to ‘familyName’, and so on. Because we passed the Person class to the ItemReader, the tokenizer automatically fills the Person attributes with the values found in the line.
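As an illustration of what the tokenizer does, here is a plain-Java sketch of the same fixed-width split (the 30-character widths come from the text above; the sample values and the 4-character year are assumptions):

```java
public class FixedWidthDemo {

    // split one fixed-width line: columns 1-30 -> firstName, 31-60 -> familyName, rest -> year
    static String[] tokenize(String line) {
        String firstName = line.substring(0, 30).trim();
        String familyName = line.substring(30, 60).trim();
        String year = line.substring(60).trim();
        return new String[]{firstName, familyName, year};
    }

    public static void main(String[] args) {
        // build a sample line padded to the fixed widths
        String line = String.format("%-30s%-30s%s", "John", "Doe", "1980");
        String[] fields = tokenize(line);
        System.out.println(fields[0] + " " + fields[1] + " " + fields[2]);
    }
}
```

This is exactly the kind of boilerplate the FixedLengthTokenizer saves you from writing by hand.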


We created an import.sql script to work around a current limitation of Spring Batch, which does not handle parameters passed to the CommandLineJobRunner well.

drop table BatchDB.Batch_JOB_EXECUTION_CONTEXT;
drop table BatchDB.Batch_JOB_EXECUTION_PARAMS;
drop table BatchDB.Batch_JOB_EXECUTION_SEQ;
drop table BatchDB.Batch_JOB_SEQ;
drop table BatchDB.Batch_STEP_EXECUTION_CONTEXT;
drop table BatchDB.Batch_STEP_EXECUTION_SEQ;
drop table BatchDB.Batch_STEP_EXECUTION;
drop table BatchDB.Batch_JOB_EXECUTION;
drop table BatchDB.Batch_JOB_INSTANCE;

The script (automatically found by Hibernate) deletes all the tables created by Spring Batch to track the batch status, allowing us to restart the batch without using the -next parameter and avoiding the following error:

Caused by: org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException: A job instance already exists and is complete for parameters={}. If you want to run this job again, change the parameters.

Here we define the database parameters.

Very important: in our example we have to define:


If ddl-auto is not defined by the developer, Spring will use ‘create-drop’ by default, deleting the tables at the end of the batch.
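A sketch of the application.properties file (the key names and values are assumptions; the connection keys must match the @Value expressions in BatchConfiguration):

```properties
# database connection parameters
database.driver=com.mysql.jdbc.Driver
database.url=jdbc:mysql://localhost/BatchDB
database.username=root
database.password=root
# without this Spring defaults to 'create-drop' and the tables are deleted at the end of the batch
spring.jpa.hibernate.ddl-auto=update
```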

This class simply calls the BatchConfiguration class containing the batch configuration.

import org.springframework.boot.SpringApplication;

public class Application {
    public static void main(String[] args) {
        SpringApplication.run(BatchConfiguration.class, args);
    }
}

The BatchConfiguration class defines the basic beans to execute the batch.

The annotation

@EnableAutoConfiguration

is specific to Spring Boot; you can read more about this annotation here: Spring Boot auto configure

@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration // spring boot configuration
@PropertySource("classpath:application.properties") // file that contains the properties
public class BatchConfiguration {

    /**
     * Load the properties
     * (the property keys are assumptions; they must match application.properties)
     */
    @Value("${database.driver}")
    private String databaseDriver;
    @Value("${database.url}")
    private String databaseUrl;
    @Value("${database.username}")
    private String databaseUsername;
    @Value("${database.password}")
    private String databasePassword;

    /**
     * We define a bean that reads each line of the input file.
     */
    @Bean
    public ItemReader<Person> reader() {
        // we read a flat file that will be used to fill a Person object
        FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
        // we pass as parameter the flat file location
        reader.setResource(new ClassPathResource("PersonData.txt"));
        // we use a default line mapper to assign the content of each line to the Person object
        reader.setLineMapper(new DefaultLineMapper<Person>() {{
            // we use our custom fixed length tokenizer
            setLineTokenizer(new PersonFixedLengthTokenizer());
            // as field set mapper we use the names of the fields in the Person class
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class); // we create a Person object
            }});
        }});
        return reader;
    }

    /**
     * The ItemProcessor is called after a new line is read and allows the developer
     * to transform the data read.
     * In our example it simply returns the original object.
     */
    @Bean
    public ItemProcessor<Person, Person> processor() {
        return new PersonItemProcessor();
    }

    /**
     * Nothing special here, a simple JpaItemWriter.
     */
    @Bean
    public ItemWriter<Person> writer() {
        JpaItemWriter<Person> writer = new JpaItemWriter<Person>();
        writer.setEntityManagerFactory(entityManagerFactory().getObject());
        return writer;
    }

    /**
     * This method declares the steps that the batch has to follow.
     */
    @Bean
    public Job importPerson(JobBuilderFactory jobs, Step s1) {
        return jobs.get("import")
                .incrementer(new RunIdIncrementer()) // because of a spring config bug, this incrementer is not really useful
                .flow(s1)
                .end()
                .build();
    }

    /**
     * Step
     * We declare that every 1000 lines processed the data has to be committed.
     */
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Person> reader,
                      ItemWriter<Person> writer, ItemProcessor<Person, Person> processor) {
        return stepBuilderFactory.get("step1")
                .<Person, Person>chunk(1000)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    /**
     * As data source we use an external database.
     */
    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(databaseDriver);
        dataSource.setUrl(databaseUrl);
        dataSource.setUsername(databaseUsername);
        dataSource.setPassword(databasePassword);
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean lef = new LocalContainerEntityManagerFactoryBean();
        lef.setDataSource(dataSource());
        lef.setJpaVendorAdapter(jpaVendorAdapter());
        lef.setPackagesToScan(Person.class.getPackage().getName()); // scan the package containing the entities
        return lef;
    }

    @Bean
    public JpaVendorAdapter jpaVendorAdapter() {
        HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
        jpaVendorAdapter.setDatabase(Database.MYSQL);
        return jpaVendorAdapter;
    }
}