java - File processing design suggestion
I am developing an application to process several CSV files using Apache Camel. There are many variations and validations involved in the processing.
The format of the files can be dynamic, but we will always receive the CSV header before the data arrives. I do not want to add a new model class every time a new format comes in; instead I want to parse each row into a map of key/value pairs. But for both simple and complex validation rules I had to convert the rows to beans (since I am using Drools for bean validation).
If I create many models for the incoming files, then the Camel processor needs either instanceof checks (a bad idea) or a type-based strategy selection to pick the correct processing logic.
Can someone recommend a design approach to deal with this?
Use the CSV data format with `useMaps` enabled, so each file is unmarshalled into a list of maps keyed by the header values:

```java
final CsvDataFormat format = new CsvDataFormat();
format.setUseMaps(true);
format.setDelimiter(",");

from("direct:start")
    .unmarshal(format)
    .process(new Processor() {
        @Override
        public void process(final Exchange exchange) throws Exception {
            final List<Map<String, String>> body = exchange.getIn().getBody(List.class);
            // transform and/or validate the data ...
        }
    });
```

The processor can then convert your data into Java beans, or validate the content directly.
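To avoid instanceof checks when several formats flow through one route, one option is a registry that selects a conversion/validation strategy by the header signature. The sketch below is a minimal, Camel-independent illustration of that idea; the class and method names (`RowStrategyRegistry`, `register`, `validate`) are hypothetical, not part of any library, and the strategy here is a plain function rather than a Drools rule set:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Hypothetical registry: picks a per-format strategy by the CSV header
// signature, so supporting a new format only means registering a new
// strategy, not adding a new route or model class everywhere.
public class RowStrategyRegistry {

    // A strategy maps one parsed row (column name -> value) to a
    // validation error message, or null if the row is valid.
    private final Map<String, Function<Map<String, String>, String>> strategies =
            new HashMap<>();

    public void register(String headerSignature,
                         Function<Map<String, String>, String> strategy) {
        strategies.put(headerSignature, strategy);
    }

    // Validate all rows using the strategy registered for this header.
    public List<String> validate(String headerSignature,
                                 List<Map<String, String>> rows) {
        Function<Map<String, String>, String> strategy = strategies.get(headerSignature);
        if (strategy == null) {
            throw new IllegalArgumentException("No strategy for header: " + headerSignature);
        }
        List<String> errors = new ArrayList<>();
        for (Map<String, String> row : rows) {
            String error = strategy.apply(row);
            if (error != null) {
                errors.add(error);
            }
        }
        return errors;
    }
}
```

Inside the Camel processor you would read the header line (or a known format identifier), look up the strategy, and hand it the `List<Map<String, String>>` body; the strategy itself can build a bean and pass it to Drools if the rules are complex.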