Introducing the Delegate Pattern
Delegate: A person who is chosen or elected to vote or act for others – Merriam-Webster.
Delegate pattern: In software engineering, the delegation pattern is a design pattern in object-oriented programming where an object, instead of performing one of its stated tasks, delegates that task to an associated helper object – Wikipedia.
Make things as simple as possible, but not simpler – Albert Einstein, paraphrased.
Spring Batch is an important tool in the Enterprise Java toolkit. It provides great functionality out of the box, especially for reading and writing data from different sources. We have provided several articles in this blog introducing Spring Batch. If you are not familiar with Spring Batch and the Reader, Processor, and Writer Tasklets, please take a moment and review those.
The paraphrased quote I use above is important to me. One of the things I try to do is keep the code I deliver as maintainable as possible. I want it to work, and work well, but the code I check in today will be maintained by somebody at some future date. Keeping the code as simple as possible is one way of ensuring that it can be easily maintained.
So what happens when you have a complicated data source that you have to process?
We find that the input files we have to process are often not as simple as one record per line. Often, multiple lines in the file describe just one record.
For instance:
HKaren Traviss
LAB00KW3VG2G
LI0345478274
LI0345511131
F00000003
HJim Butcher
LI0451457811
F00000001
HDave Duncan
LI0380791277
LI0345352912
F00000002
HRik Scarborough
LI9999999999
F00000001
Here, we have a file that contains four records across fifteen lines. Each record starts with a Header line, contains one or more Body lines, and ends with a Footer. The Header contains a line type (H for header) and a name. Each Body line contains a line type (L), the type of lookup (in this example, either an ISBN or an Amazon code), and the key to look up a book. The Footer contains, again, a line type, plus the number of Body lines in the block.
Using a standard Reader, each line would be read and then passed on to the Processor, which would have to determine what kind of line it is dealing with. The Processor would then have to retain the information from each Header as it processed each Body line, until a Footer was processed. The Writer would then have to be aware of each line that the Processor sent, and whether it should be written. This is complex, in part, because multiple objects have to be aware of how the file is read in, instead of the Processor caring only about a single object and the Writer being concerned only with writing what it's been given.
Instead, let's introduce the Delegate pattern into the Reader and let it handle creating the entire record. Since we have information from multiple lines, as well as a Header and Footer that we will use to create each record, we will have to pass the Processor a list of records. The observant among you will have noticed that each Record contains either an ISBN or an Amazon book notation, which could be used to look up the author, who is also named in the Header. In a real-life example, this type of redundancy may or may not happen.
Let’s wrap the output in another object to make it easier to work with.
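The Reader and Processor below use two small wrapper classes, OrderList and Order, which are not shown elsewhere in this article. Here is a minimal sketch of what they might look like, with the fields inferred from the getters and setters the code calls; each class would live in its own file:

// Minimal sketch only: fields are inferred from the accessors used below.
public class OrderList {

    private String name;         // author name from the Header line
    private List<Order> orders;  // one Order per Body line

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public List<Order> getOrders() { return orders; }
    public void setOrders(List<Order> orders) { this.orders = orders; }
}

public class Order {

    private String lookupType;   // lookup type from the Body line ("I" or "A" in the sample)
    private String lookup;       // the lookup key itself

    public String getLookupType() { return lookupType; }
    public void setLookupType(String lookupType) { this.lookupType = lookupType; }
    public String getLookup() { return lookup; }
    public void setLookup(String lookup) { this.lookup = lookup; }
}

With those wrappers in place, here is the Reader: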
public class OrderReaderStep implements ItemReader<OrderList> {

    private static final Logger logger = LoggerFactory.getLogger(OrderReaderStep.class);
    private static final String HEADER = "H*";
    private static final String BODY = "L*";
    private static final String FOOTER = "F*";

    private FlatFileItemReader<FieldSet> delegate;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        delegate = new FlatFileItemReader<>();
        delegate.setResource(new ClassPathResource("orders.txt"));

        // One tokenizer per line type, keyed by the pattern that matches its prefix.
        final PatternMatchingCompositeLineTokenizer orderFileTokenizer =
                new PatternMatchingCompositeLineTokenizer();
        final Map<String, LineTokenizer> tokenizers = new HashMap<>();
        tokenizers.put(HEADER, buildHeaderTokenizer());
        tokenizers.put(BODY, buildBodyTokenizer());
        tokenizers.put(FOOTER, buildFooterTokenizer());
        orderFileTokenizer.setTokenizers(tokenizers);

        final DefaultLineMapper<FieldSet> defaultLineMapper = new DefaultLineMapper<>();
        defaultLineMapper.setLineTokenizer(orderFileTokenizer);
        defaultLineMapper.setFieldSetMapper(new PassThroughFieldSetMapper());
        delegate.setLineMapper(defaultLineMapper);

        delegate.open(stepExecution.getExecutionContext());
    }

    @AfterStep
    public void afterStep(StepExecution stepExecution) {
        delegate.close();
    }

    @Override
    public OrderList read() throws Exception, UnexpectedInputException,
            ParseException, NonTransientResourceException {
        logger.info("start read");
        OrderList record = null;
        FieldSet line;
        List<Order> bodyList = new ArrayList<>();
        while ((line = delegate.read()) != null) {
            String prefix = line.readString("lineType");
            if (prefix.equals("H")) {
                // Header: start a new record.
                record = new OrderList();
                record.setName(line.readString("name"));
            } else if (prefix.equals("L")) {
                // Body: collect one Order per line.
                Order order = new Order();
                order.setLookup(line.readString("lookupKey"));
                order.setLookupType(line.readString("keyType"));
                bodyList.add(order);
            } else if (prefix.equals("F")) {
                // Footer: validate the count and close out the record.
                if (record != null) {
                    if (line.readLong("count") != bodyList.size()) {
                        throw new ValidationException("Size does not match file count");
                    }
                    record.setOrders(bodyList);
                }
                break;
            }
        }
        logger.info("end read");
        return record;
    }

    private LineTokenizer buildHeaderTokenizer() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range[]{
                new Range(1, 1),  // lineType
                new Range(2, 20)  // name
        });
        tokenizer.setNames(new String[]{"lineType", "name"});
        tokenizer.setStrict(false);
        return tokenizer;
    }

    private LineTokenizer buildBodyTokenizer() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range[]{
                new Range(1, 1),  // lineType
                new Range(2, 2),  // keyType
                new Range(3, 12)  // lookup key
        });
        tokenizer.setNames(new String[]{"lineType", "keyType", "lookupKey"});
        tokenizer.setStrict(false);
        return tokenizer;
    }

    private LineTokenizer buildFooterTokenizer() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range[]{
                new Range(1, 1),  // lineType
                new Range(2, 9)   // count
        });
        tokenizer.setNames(new String[]{"lineType", "count"});
        tokenizer.setStrict(false);
        return tokenizer;
    }
}
This Reader implements the ItemReader interface, which gives us a read method that is called by the job until it returns null or, in case of an error, throws an exception. In our Reader, we declare another reader, a FlatFileItemReader. This is our Delegate, the object that has been selected to perform a function for us. Our read method loops on the Delegate's read until a Footer is read. It then bundles the entire record into its wrapper and passes it on to the Processor.
The Delegate reader must be opened before it can be used and should be closed only when it is done. I open it here in the BeforeStep, since I have to initialize and set it up there anyway. I could also implement the containing reader as an ItemStreamReader and use the open, close, and update methods that interface gives us.
Returning a simplified object to the Processor allows us to greatly simplify the Processor:
@Override
public List<BookList> process(OrderList orderList) throws Exception {
    logger.info("process");
    List<BookList> books = new ArrayList<>();
    for (Order order : orderList.getOrders()) {
        BookList bl = doProcessing(orderList.getName(), order);
        books.add(bl);
    }
    return books;
}
The doProcessing method can contain the business logic for this Job and needs to create a valid BookList object. Since we are dealing with multiple records, the process will create multiple BookLists that can be returned and passed on to the Writer. I'll leave it to you to fill in the rest of this object, but it is just a standard ItemProcessor. The Processor does not have to retain record information between calls, so the programmer can concentrate on the business logic.
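Since I left doProcessing out, what follows is only a hypothetical sketch: it assumes BookList is a simple bean with the bookName and author properties the Writer's field extractor expects, and bookLookupService stands in for whatever your real business logic is.

// Hypothetical sketch: bookLookupService and findTitle() are stand-ins,
// not part of the actual implementation.
private BookList doProcessing(String author, Order order) {
    BookList bookList = new BookList();
    bookList.setAuthor(author);
    // Resolve the title from the ISBN or Amazon key; this is the business logic.
    String title = bookLookupService.findTitle(order.getLookupType(), order.getLookup());
    bookList.setBookName(title);
    return bookList;
}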
Our Writer implements the ItemStreamWriter interface. This gives us more methods than ItemWriter would, but if you prefer using ItemWriter similarly to the way we did the Reader, make sure you open the Delegate in the BeforeStep and close it in the AfterStep, as sketched below.
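A minimal sketch of that ItemWriter alternative might look like the following; the class name is mine, and the delegate's configuration would match the full Writer shown further down:

// Sketch of the ItemWriter alternative: the delegate's lifecycle is managed
// in the step listeners instead of the ItemStream callbacks.
public class ListWriterAlternative implements ItemWriter<List<BookList>> {

    private FlatFileItemWriter<BookList> delegate;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        delegate = new FlatFileItemWriter<>();
        // ...configure the resource and line aggregator as in ListWriter below...
        delegate.open(stepExecution.getExecutionContext());
    }

    @AfterStep
    public void afterStep(StepExecution stepExecution) {
        delegate.close();
    }

    @Override
    public void write(List<? extends List<BookList>> items) throws Exception {
        for (List<BookList> books : items) {
            delegate.write(books);
        }
    }
}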
Using a Delegate in the Writer gives us the ability to walk through the List the Writer receives from the Reader and Processor.
public class ListWriter implements ItemStreamWriter<List<BookList>> {

    private static final Logger logger = LoggerFactory.getLogger(ListWriter.class);

    private FlatFileItemWriter<BookList> delegate;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        delegate = new FlatFileItemWriter<>();
        delegate.setResource(new FileSystemResource("booklist.csv"));
        delegate.setShouldDeleteIfEmpty(true);
        delegate.setAppendAllowed(true);

        DelimitedLineAggregator<BookList> dla = new DelimitedLineAggregator<>();
        dla.setDelimiter(",");
        BeanWrapperFieldExtractor<BookList> fieldExtractor = new BeanWrapperFieldExtractor<>();
        fieldExtractor.setNames(new String[]{"bookName", "author"});
        dla.setFieldExtractor(fieldExtractor);
        delegate.setLineAggregator(dla);
    }

    @Override
    public void open(ExecutionContext ec) throws ItemStreamException {
        delegate.open(ec);
    }

    @Override
    public void update(ExecutionContext ec) throws ItemStreamException {
        delegate.update(ec);
    }

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }

    @Override
    public void write(List<? extends List<BookList>> list) throws Exception {
        logger.info("write");
        // Each item from the Processor is itself a list; unwrap it and delegate.
        for (List<BookList> bookList : list) {
            delegate.write(bookList);
        }
    }
}
This gives us the following output:
Going Grey,Karen Traviss
Hard Contact,Karen Traviss
501st,Karen Traviss
Storm Front,Jim Butcher
Lord of the Fire Lands,Dave Duncan
The Reluctant Swordsman,Dave Duncan
Wolfbrander Series Unpublished,Rik Scarborough
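We have not shown how these pieces are wired into a Job. As a rough sketch, assuming Spring Batch 4-style Java configuration (OrderListProcessor is a placeholder name for the ItemProcessor discussed above, and the chunk size is arbitrary), the step might be defined like this:

// Sketch only: bean and processor class names are placeholders.
@Bean
public Step orderStep(StepBuilderFactory stepBuilderFactory,
                      OrderReaderStep reader,
                      OrderListProcessor processor,
                      ListWriter writer) {
    return stepBuilderFactory.get("orderStep")
            .<OrderList, List<BookList>>chunk(10)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .build();
}

Because ListWriter implements ItemStreamWriter, the step registers it as a stream and calls its open, update, and close methods automatically.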
So what happens if it’s a little more complicated and the input file does not contain a footer?
The logical record still starts at the Header line, but it now ends at the line before the next Header. In our previous approach, the system would have to read the next line before it knew it was done, and then it would need some complicated logic to retain that information for the next go-round.
HKaren Traviss
LAB00KW3VG2G
LI0345478274
LI0345511131
HJim Butcher
LI0451457811
HDave Duncan
LI0380791277
LI0345352912
HRik Scarborough
LI9999999999
Asking our current Reader to read ahead and hold on to that record during the next call is unnecessarily complex, which leads to maintenance headaches. However, we can simplify this by using the PeekableItemReader:
class OrderReaderStep2 implements ItemStreamReader<OrderList> {

    private static final Logger logger = LoggerFactory.getLogger(OrderReaderStep2.class);
    private static final String HEADER = "H*";
    private static final String BODY = "L*";

    private SingleItemPeekableItemReader<FieldSet> delegate;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        FlatFileItemReader<FieldSet> fileReader = new FlatFileItemReader<>();
        fileReader.setResource(new ClassPathResource("orders2.txt"));

        final PatternMatchingCompositeLineTokenizer orderFileTokenizer =
                new PatternMatchingCompositeLineTokenizer();
        final Map<String, LineTokenizer> tokenizers = new HashMap<>();
        tokenizers.put(HEADER, buildHeaderTokenizer());
        tokenizers.put(BODY, buildBodyTokenizer());
        orderFileTokenizer.setTokenizers(tokenizers);

        final DefaultLineMapper<FieldSet> defaultLineMapper = new DefaultLineMapper<>();
        defaultLineMapper.setLineTokenizer(orderFileTokenizer);
        defaultLineMapper.setFieldSetMapper(new PassThroughFieldSetMapper());
        fileReader.setLineMapper(defaultLineMapper);

        // Wrap the file reader so we can peek at the next line without consuming it.
        delegate = new SingleItemPeekableItemReader<>();
        delegate.setDelegate(fileReader);
    }

    @Override
    public void open(ExecutionContext ec) throws ItemStreamException {
        delegate.open(ec);
    }

    @Override
    public void update(ExecutionContext ec) throws ItemStreamException {
        delegate.update(ec);
    }

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }

    @Override
    public OrderList read() throws Exception, UnexpectedInputException,
            ParseException, NonTransientResourceException {
        logger.info("start read");
        OrderList record = null;
        FieldSet line;
        List<Order> bodyList = new ArrayList<>();
        while ((line = delegate.read()) != null) {
            String prefix = line.readString("lineType");
            if (prefix.equals("H")) {
                record = new OrderList();
                record.setName(line.readString("name"));
            } else if (prefix.equals("L")) {
                Order order = new Order();
                order.setLookup(line.readString("lookupKey"));
                order.setLookupType(line.readString("keyType"));
                bodyList.add(order);
            }
            // Peek ahead: a new Header, or the end of the file, closes this record.
            FieldSet nextLine = delegate.peek();
            if (nextLine == null || nextLine.readString("lineType").equals("H")) {
                if (record != null) {
                    record.setOrders(bodyList);
                }
                break;
            }
        }
        logger.info("end read");
        return record;
    }

    private LineTokenizer buildHeaderTokenizer() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range[]{
                new Range(1, 1),  // lineType
                new Range(2, 20)  // name
        });
        tokenizer.setNames(new String[]{"lineType", "name"});
        tokenizer.setStrict(false);
        return tokenizer;
    }

    private LineTokenizer buildBodyTokenizer() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setColumns(new Range[]{
                new Range(1, 1),  // lineType
                new Range(2, 2),  // keyType
                new Range(3, 12)  // lookup key
        });
        tokenizer.setNames(new String[]{"lineType", "keyType", "lookupKey"});
        tokenizer.setStrict(false);
        return tokenizer;
    }
}
This time, I do implement the containing Reader as an ItemStreamReader to show you the difference. It could have been implemented as an ItemReader, as our previous one was.
The PeekableItemReader allows us to look ahead at the next item to see whether we have reached the end of the record or the end of the file. The same Processor and Writer can then be used to produce the same output as before.
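To see the peek semantics in isolation: peek returns the next item without consuming it, so the following read returns that same item. Here is a tiny standalone sketch, with input strings made up for illustration:

import java.util.Arrays;

import org.springframework.batch.item.support.ListItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;

public class PeekDemo {

    public static void main(String[] args) throws Exception {
        SingleItemPeekableItemReader<String> peekable = new SingleItemPeekableItemReader<>();
        peekable.setDelegate(new ListItemReader<>(Arrays.asList("HJim Butcher", "LI0451457811")));

        peekable.peek(); // returns "HJim Butcher" without consuming it
        peekable.read(); // returns the same "HJim Butcher", now consumed
        peekable.read(); // returns "LI0451457811"
        peekable.read(); // returns null: end of input
    }
}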
Final Thoughts
At first glance, the Delegate Pattern may not seem as simple as using a single Reader or Writer. There is more configuration for both of these objects.
But my favorite paraphrased quote says to be as simple as possible, and no simpler. A slightly more complex Reader and Writer will make your Processor much simpler, and help with maintenance down the road.
Code well, my friend.