
beanio's Introduction

BeanIO

A Java library for marshalling and unmarshalling bean objects from XML, CSV, delimited and fixed length stream formats.

Installation

If you're coming from BeanIO 2.x, please note the new groupId com.github.beanio. Package names remain the same as before (org.beanio.*).

Maven

To use snapshot versions, configure the following repository:
<repositories>
    <repository>
        <id>ossrh</id>
        <url>https://s01.oss.sonatype.org/content/repositories/snapshots/</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
        <releases>
            <enabled>false</enabled>
        </releases>
    </repository>
</repositories>

Add the following dependency to your pom.xml:

<dependency>
    <groupId>com.github.beanio</groupId>
    <artifactId>beanio</artifactId>
    <version>3.0.0</version>
</dependency>

Gradle

To use snapshot versions, configure the following repository:
repositories {
    maven {
        url 'https://s01.oss.sonatype.org/content/repositories/snapshots'
    }
}

Add the following dependency to your build.gradle:

implementation 'com.github.beanio:beanio:3.0.0'

What's new in v3?

See changelog.txt

Project status

This is a fork of the original BeanIO library.

The website for version 3.x is available at https://beanio.github.io.

The website for version 2.x is available at http://www.beanio.org.

beanio's People

Contributors

andre-cardoso, bigloupe, bjansen, bjansen-caps, davsclaus, garcia-jj, johnpoth, krisbanas, nicoschl, saxicek, willsoto


beanio's Issues

Fixed length, optional, ignored fields do not honor non-space padding setting

The following side effect of the fix to issue #11 has been observed:

A field defined such as the one below will always marshal as spaces no matter 
the padding setting.

<field name="xxx" length="3" padding="A" ignore="true" />

As a workaround, if the stream is used for writing/marshaling only, you can set 
required="true" to marshal the field as "AAA".

The rules defined in issue #11 will likely be modified to exclude ignored 
fields from writing null values as all spaces (no matter the padding 
configuration), because ignored fields are always null.
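The marshaling behavior described above can be sketched in plain Java. This is an illustration of the reported behavior, not BeanIO's internal code, and the class and method names are made up:

```java
class PaddingSketch {
    // Sketch of the reported behavior: an ignored field with a null value
    // always marshals as spaces, regardless of the configured padding.
    static String marshalNull(int length, char padding, boolean ignored) {
        char fill = ignored ? ' ' : padding;
        char[] buf = new char[length];
        java.util.Arrays.fill(buf, fill);
        return new String(buf);
    }
}
```

With the workaround (required="true", so the null value is not suppressed), the field marshals using the padding character, e.g. "AAA" for the mapping above.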

Original issue reported on code.google.com by [email protected] on 6 Oct 2011 at 1:37

Record Identification Based on Field Existence

BeanIO requires a regex or literal expression to be configured for record 
identifying fields in a CSV, delimited or fixed length format.  This could be 
relaxed to allow record identification based on the existence of a field where 
minOccurs="1".  For now, the workaround is to simply configure a regex that 
matches anything (e.g. regex=".*").

Original issue reported on code.google.com by [email protected] on 10 Mar 2012 at 5:22

Import feature implementation is flawed

In all but the simplest cases, importing one mapping file into another can lead 
to some unexpected side effects.  If a mapping file A imports mapping file B, 
here are a few examples:

1.  Circular references are not prevented, such that if B imports A, a 
StackOverflowError may occur.
2.  Stream definitions in B may adopt type handlers declared in A.
3.  If A and B both import a third mapping file C, C may be loaded multiple 
times, causing duplicate stream or other exceptions.

Please note this list is not exhaustive.

Original issue reported on code.google.com by [email protected] on 13 Sep 2011 at 10:34

Fixed length position field attribute off by 1

The position field attribute incorrectly starts at 1 for fixed length formatted 
streams; it should start at 0.  Present since 0.9.0.

Temporary workaround: do not configure a field position, use length attributes 
only.


Original issue reported on code.google.com by [email protected] on 27 Jan 2011 at 10:35

Identify a record based on number of fields?

I have a delimited file with following structure:

550 3034 244 1 0 660 0 0 0 0 0 0 0 0 01
3090 120
3082 202
424 3180 219 1 0 660 0 0 0 0 0 0 0 0 05
3300 300
3300 390
3220 390

The longer line(with 15 fields) is mapped to a parent class, and the short line 
(with 2 fields) is mapped to a list inside the parent class.

The problem is that I cannot identify a record correctly because the first 2 
fields are both integers. Is there a way to identify a record based on the 
number of fields?
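One workaround is to pre-classify each line by its token count before parsing. A minimal sketch using the field counts from the sample above (class and method names are hypothetical, not BeanIO API):

```java
class RecordSketch {
    // Classify a space-delimited line by its number of fields: 15 fields
    // marks the parent record, anything else is treated as a child record.
    static String identify(String line) {
        int fields = line.trim().split("\\s+").length;
        return fields == 15 ? "parent" : "child";
    }
}
```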

I am using BeanIO version 2.0.1 with JDK6.

Many thanks,
Steve

Original issue reported on code.google.com by [email protected] on 23 Nov 2012 at 8:01

Attachments:

NullPointerException at org.beanio.internal.util.Settings.loadProperties

I am getting the below exception when I try to deploy my application on Apache 
Karaf. Can you please help me with this issue.

BeanIO version: 2.0.1
Using BeanIO with Apache Camel

Caused by: java.lang.NullPointerException
    at org.beanio.internal.util.Settings.loadProperties(Settings.java:233)
    at org.beanio.internal.util.Settings.getInstance(Settings.java:162)
    at org.beanio.StreamFactory.newInstance(StreamFactory.java:323)
    at org.beanio.StreamFactory.newInstance(StreamFactory.java:295)
    at org.apache.camel.dataformat.beanio.BeanIODataFormat.doStart(BeanIODataFormat.java:79)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.processor.UnmarshalProcessor.doStart(UnmarshalProcessor.java:100)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.start(AsyncProcessorConverterHelper.java:92)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.processor.interceptor.TraceInterceptor.doStart(TraceInterceptor.java:358)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.RedeliveryErrorHandler.doStart(RedeliveryErrorHandler.java:1049)
    at org.apache.camel.support.ChildServiceSupport.start(ChildServiceSupport.java:41)
    at org.apache.camel.support.ChildServiceSupport.start(ChildServiceSupport.java:28)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.interceptor.DefaultChannel.doStart(DefaultChannel.java:152)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:85)
    at org.apache.camel.processor.MulticastProcessor.doStart(MulticastProcessor.java:941)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.processor.UnitOfWorkProcessor.doStart(UnitOfWorkProcessor.java:88)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:52)
    at org.apache.camel.util.ServiceHelper.startServices(ServiceHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.doStart(DelegateAsyncProcessor.java:78)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.util.ServiceHelper.startService(ServiceHelper.java:62)
    at org.apache.camel.impl.RouteService.startChildService(RouteService.java:322)
    at org.apache.camel.impl.RouteService.warmUp(RouteService.java:151)
    at org.apache.camel.impl.DefaultCamelContext.doWarmUpRoutes(DefaultCamelContext.java:1941)
    at org.apache.camel.impl.DefaultCamelContext.safelyStartRouteServices(DefaultCamelContext.java:1869)
    at org.apache.camel.impl.DefaultCamelContext.doStartOrResumeRoutes(DefaultCamelContext.java:1662)
    at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1550)
    at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1427)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
    at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1395)
    at org.apache.camel.blueprint.BlueprintCamelContext.maybeStart(BlueprintCamelContext.java:86)
    at org.apache.camel.blueprint.BlueprintCamelContext.init(BlueprintCamelContext.java:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.7.0_04]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)[:1.7.0_04]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.7.0_04]
    at java.lang.reflect.Method.invoke(Method.java:601)[:1.7.0_04]
    at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:225)[9:org.apache.aries.blueprint:0.3.2]
    at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:838)[9:org.apache.aries.blueprint:0.3.2]
    at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:638)[9:org.apache.aries.blueprint:0.3.2] 

Original issue reported on code.google.com by [email protected] on 20 Sep 2012 at 6:34

It would be great for BeanIO to pick up its properties from the mapping file's top-level <property> element

Currently, to add a custom format I need to tell users to pass a 
beanio.properties file to the application.

It would be much better if the org.beanio.<format>.streamDefinitionFactory property 
could be specified in the mapping file itself. 
As far as I can see, top-level properties are only used for property 
substitution. I think they could also be added to the settings.

Original issue reported on code.google.com by [email protected] on 11 Jan 2013 at 1:04

Reading attributes into a class that has a superclass

Steps:
1. Create a POJO with a superclass that you want to populate from the XML
2. Try to read it using BeanIO
3. It's read, but without any properties

The expected result is to have the attributes read from the XML, but the object comes back empty.

BeanIO 2.0.1, SDK 6


Original issue reported on code.google.com by psybox on 24 Oct 2012 at 4:49

An array of beans causes position to be calculated incorrectly for fixed length records

If there is a fixed-size array or collection of beans in a record followed by 
some additional fields, the position calculated for the fields following the 
collection is incorrect.  Attached is a simple test case.  Notice that the 
'results' field ends up in the collection.

<stream name="header" format="fixedlength">
  <record name="header" class="com.abc.HeaderTest">
    <field name="districtCode" length="10" />
    <bean name="taxDetails" class="com.abc.TaxDetails" collection="array"
        minOccurs="10" maxOccurs="10">
      <field name="taxCode" length="4" />
      <field name="taxRate" length="10" padding="0" justify="right" />
      <field name="taxAmount" length="16" padding="0" justify="right" />
    </bean>
    <field name="status" length="2" />
    <field name="results" length="12" />
  </record>
</stream>

ABCS          00000000000000000000000000S Results     0000000000034.45    
00000000000000000000000000    00000000000000000000000000    
00000000000000000000000000    00000000000000000000000000    
00000000000000000000000000    00000000000000000000000000    
00000000000000000000000000    00000000000000000000000000

Original issue reported on code.google.com by [email protected] on 7 Apr 2011 at 9:14

Attachments:

BeanReaderContext does not take into account bean nesting

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1. Create a mapping with a nested bean
2. Try to read a file with a field error on one of the nested bean fields
3. The field errors returned do not take into account the bean nesting. 

What is the expected output? What do you see instead?

Given a record with a bean inside such as:
<record name="bigBean" class="example.BigBean">
  <bean name="myBean" class="example.MyBean">
    <field name="num" type="integer" />
  </bean>
  <bean name="yourBean" class="example.MyBean">
    <field name="num" type="integer" />
  </bean>
</record>

If there is a field error on 'num' I would expect field errors to be created 
under the key "myBean.num" or "yourBean.num", giving the full property path. 
Instead, we are seeing field errors created under the key "num". 

What version of BeanIO are you using?
BeanIO 1.2.3

Please provide any additional information below.

This is a problem because our error handler needs to know the full path to the 
invalid field in order to generate the correct error message. Also, the 
fieldErrorMap within the BeanReaderContext maintains a map of field names to 
values. In the example above, this map might look like "num" -> "5" So when 
interpolating field values into error messages you might get the wrong value 
(ie, if one field was "A" and one field was "B" the two field errors generated 
would interpolate the same field value because the fieldErrorMap only has one 
entry for "num" rather than having two entries, such as "myBean.num" -> "A", 
"yourBean.num" -> "B"). I don't know if this is resolved in BeanIO 2.0.
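The fully qualified error key the reporter expects could be built like this (a sketch, not BeanIO code; the class and method names are hypothetical):

```java
class ErrorKeySketch {
    // Prefix a field name with the path of enclosing bean names,
    // e.g. beanPath ["myBean"] and field "num" yield "myBean.num".
    static String errorKey(java.util.List<String> beanPath, String field) {
        if (beanPath.isEmpty()) {
            return field;
        }
        return String.join(".", beanPath) + "." + field;
    }
}
```

Keying the fieldErrorMap this way would keep "myBean.num" and "yourBean.num" as distinct entries.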

Original issue reported on code.google.com by [email protected] on 7 Jun 2012 at 4:30

BeanIO mapping XSD not publicly accessible by Eclipse

The BeanIO mapping XSD is available online at 'http://www.beanio.org/2011/01', 
but Eclipse is unable to resolve the schema from that location.  To rectify, 
the XSD will be moved to the public URL 
'http://www.beanio.org/2011/01/mapping.xsd'.  Once the XSD is relocated, the 
mapping file schema location can be modified as follows: 

<beanio xmlns="http://www.beanio.org/2011/01" 
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://www.beanio.org/2011/01 http://www.beanio.org/2011/01/mapping.xsd">

Note the existing schema location 'http://www.beanio.org/2011/01' (documented 
in the reference guide) will still be supported when loading mapping files.

Original issue reported on code.google.com by [email protected] on 9 May 2011 at 11:58

Better error messages

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1. Any invalid configuration or input file

What is the expected output? What do you see instead?
On any invalid configuration or input, BeanIO should throw error messages that 
carry more information about the invalid record or field, such as the expected 
value, the current literal value, and the full record line.

What version of BeanIO are you using? What JDK version?
BeanIO 1.2.2

Original issue reported on code.google.com by mikhas.rock on 13 Jan 2012 at 6:12

OSGi bundle

Hi

At Apache Camel we have added a new camel-beanio component:
https://issues.apache.org/jira/browse/CAMEL-4964

And as we also support OSGi with Camel, it would be nice if beanio was OSGi 
compliant out of the box.
If you want help making your JAR OSGi compliant, let us know, as we 
can help with that.

And thanks for this great library. It's great that you guys take up this task 
and have an actively maintained project, to allow people to parse their CSV, 
fixed length, and whatnot files.

Original issue reported on code.google.com by [email protected] on 5 Feb 2012 at 11:18

Not computing record lengths properly for nested beans

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1.Run attached sample with mapping.xml. It will work fine.
2.Run with mapping-bad.xml, it will not parse, throwing a 'field gaps not 
allowed' error. 
3.Datafile.txt also attached.

What is the expected output? What do you see instead?
Expected the mapping file to parse. Instead, we get: "Invalid <beanNameHere> bean 
configuration: field gaps not allowed for children of collections"

What version of BeanIO are you using? What JDK version?
1.2.4, JDK 1.5

Please provide any additional information below.
The issue only occurs when a field is declared after a bean which is already a 
child of a bean. I.e.: this works fine:
<record ...>
  <bean name="outer">
   <field name="atOuterBeanLevelBefore" />
   <bean name="inner">
      <field .../>
    </bean>
  </bean>
</record>

this does NOT work:
<record ...>
  <bean name="outer">
    <bean name="inner">
      <field .../>
    </bean>
    <field name="atOuterBeanLevelAfter" />
  </bean>
</record>

Original issue reported on code.google.com by [email protected] on 25 Apr 2012 at 9:17

Attachments:

Zero padded fixed length fields cannot be made optional.

Given a zero padded fixed length field bound to a nullable Java type, such as 
the Integer value defined below, BeanIO will throw a TypeConversionException if 
the field is optional and passed as all spaces, "     ".

     <field name="value" type="java.lang.Integer" length="5" padding="0" justify="right" required="false"/> 

Instead, a field consisting of all spaces should be unmarshalled as null even 
when the padding character is not a space.  To disallow all spaces, required 
should be set to true, rather than relying on the type conversion error.

Until this is fixed in the next release, the following workarounds are 
available:

1.  (Simplest) Set trim="true" on the field.  However this will allow some 
invalid values to pass type conversion, such as "  10 ", which should be 
formatted as "00010".

2.  (Best) Create and use a custom type handler that parses all spaces as null. 
 For example, note the "".equals(text.trim()) check in parse(...) shown below.

public class IntegerTypeHandler implements TypeHandler {

    public Integer parse(String text) throws TypeConversionException {
        if (text == null || "".equals(text.trim()))
            return null;

        try {
            return Integer.parseInt(text);
        }
        catch (NumberFormatException ex) {
            throw new TypeConversionException("Invalid integer", ex);
        }
    }

    public String format(Object value) {
        if (value == null)
            return "";
        else
            return ((Integer) value).toString();
    }

    public Class<?> getType() {
        return Integer.class;
    }
}

Special thanks to Mauro for pointing this out to me on the mailing list.

Original issue reported on code.google.com by [email protected] on 15 Jul 2011 at 1:45

Completely padded fixed length fields may parse incorrectly

Given a fixed length field where padding='0' and type='int', a field '000' is 
parsed as '' and should be parsed as '0'.

If the field type extends from Number and the padding character is a digit, a 
fully padded field should return the padding character.

If the field type is Character, a fully padded field should return the padding 
character.
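The intended unpadding rule can be sketched in plain Java (an illustration of the rules above, not BeanIO's implementation; names are made up):

```java
class UnpadSketch {
    // Strip leading padding from a right-justified field; a fully padded
    // numeric field yields the padding character itself, not an empty string.
    static String unpad(String field, char padding, boolean numeric) {
        int start = 0;
        while (start < field.length() && field.charAt(start) == padding) {
            start++;
        }
        String text = field.substring(start);
        if (text.isEmpty() && numeric && Character.isDigit(padding)) {
            return String.valueOf(padding);
        }
        return text;
    }
}
```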


Original issue reported on code.google.com by [email protected] on 29 Jan 2011 at 9:34

org.beanio.internal.util.Settings#getFileURL does not use location parameter

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1. Set the org.beanio.configuration system property to a file name, not a 
classpath location
2. Try to use BeanIO

What is the expected output? What do you see instead?

BeanIO should use the file. Instead I get:
org.beanio.BeanIOException: BeanIO configuration settings not found at 
'home/tivv/dev/BeanIO/trunk/beanio/beanio-lib/src/main/resources/com/ubs/beanio/copybook/beanio.properties'
    at org.beanio.internal.util.Settings.getInstance(Settings.java:214)
    at org.beanio.StreamFactory.newInstance(StreamFactory.java:323)

What version of BeanIO are you using? What JDK version?

BeanIO: 2.0.3
JDK:
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.5) (6b24-1.11.5-0ubuntu1~12.10.1)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
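The resolution order the reporter expects could be sketched as follows (an illustration of the desired behavior, not the actual Settings code; the class and method names are hypothetical):

```java
class SettingsSketch {
    // Resolve a settings location as a filesystem path first, falling back
    // to the classpath; the reported bug is that the file location is ignored.
    static java.net.URL resolve(String location) throws java.io.IOException {
        java.io.File file = new java.io.File(location);
        if (file.exists()) {
            return file.toURI().toURL();
        }
        return Thread.currentThread().getContextClassLoader().getResource(location);
    }
}
```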



Original issue reported on code.google.com by [email protected] on 11 Jan 2013 at 1:01

Trimming spaces from input number fields

What steps will reproduce the problem?  If applicable, please provide a mapping 
configuration and sample record input to recreate the problem.
- mapping: 
<beanio xmlns="http://www.beanio.org/2012/03" 
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://www.beanio.org/2012/03 http://www.beanio.org/2012/03/mapping.xsd">
  <stream name="blahblah" format="csv">
    <parser>
      <property name="delimiter" value="|" />
    </parser>
    <record name="foobar" minOccurs="1" maxOccurs="unbounded" class="blah">
      <field name="someNumber" type="java.lang.Long" format="#0" />
      <field name="somethingElse" />
    </record> 
  </stream>
</beanio>

- input:
156000033 |Hello World

What is the expected output?
- Pojo with someNumber=156000033 and somethingElse="Hello World"

What do you see instead?
Invalid 'someNumber': Type conversion error: Invalid Long value '156000033 '
at org.beanio.internal.parser.UnmarshallingContext.validate(UnmarshallingContext.java:196)
at org.beanio.internal.parser.BeanReaderImpl.internalRead(BeanReaderImpl.java:105)
at org.beanio.internal.parser.BeanReaderImpl.read(BeanReaderImpl.java:64)
at org.apache.camel.dataformat.beanio.BeanIODataFormat.readModels(BeanIODataFormat.java:148)
at org.apache.camel.dataformat.beanio.BeanIODataFormat.unmarshal(BeanIODataFormat.java:110)
....

What version of BeanIO are you using? What JDK version?
- Java 1.6.0_26
- BeanIO 2.0.1
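As a workaround, a custom type handler can trim the raw text before conversion. The core idea in plain Java (a sketch of the approach, not the BeanIO or Camel API; the method name is made up):

```java
class TrimSketch {
    // Trim the raw field text before numeric conversion, mirroring what
    // trim="true" on the field (or a trimming type handler) would do.
    static Long parseTrimmed(String text) {
        if (text == null || text.trim().isEmpty()) {
            return null;
        }
        return Long.valueOf(text.trim());
    }
}
```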

Original issue reported on code.google.com by [email protected] on 27 Aug 2012 at 7:07

  • Merged into: #38

NullPointerException on add record error when using spring batch modules

1. Create a BeanIOStreamFactory bean
2. Use the factory in a BeanIOFlatFileItemReader and BeanIOFlatFileItemWriter
3. Run a spring batch chunk step using the reader and writer

When a record validation error occurs and the framework tries to add the record 
error, a NullPointerException is thrown.

What version of BeanIO are you using? What JDK version?
beanio-1.2.2.jar
jdk-1.6.0_21

See stack trace:
java.lang.NullPointerException
    at java.util.Hashtable.get(Hashtable.java:334)
    at java.text.NumberFormat.getInstance(NumberFormat.java:742)
    at java.text.NumberFormat.getInstance(NumberFormat.java:376)
    at java.text.MessageFormat.subformat(MessageFormat.java:1237)
    at java.text.MessageFormat.format(MessageFormat.java:836)
    at java.text.Format.format(Format.java:140)
    at org.beanio.parser.Record.addRecordError(Record.java:194)
    at org.beanio.parser.StreamDefinition$BeanReaderImpl.nextRecord(StreamDefinition.java:572)
    at org.beanio.parser.StreamDefinition$BeanReaderImpl.read(StreamDefinition.java:455)
    at org.beanio.spring.BeanIOFlatFileItemReader.doRead(BeanIOFlatFileItemReader.java:82)
    at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:85)


The issue is caused by a null Locale variable in BeanIOFlatFileItemReader.
If it's not manually set on the Spring bean, no default value is defined.

Suggested fix: default to Locale.getDefault() in the onOpen() method, line 121.

Original issue reported on code.google.com by mikhas.rock on 12 Jan 2012 at 8:50

org.beanio.BeanIOException: Failed to load stream factory implementation class 'org.beanio.parser.DefaultStreamFactory'

Good morning,

I have a thread that writes Java beans to a CSV file.

When the thread invokes the write method, Java throws the following exception 
while instantiating the StreamFactory:

StreamFactory factory = StreamFactory.newInstance();

org.beanio.BeanIOException: Failed to load stream factory implementation class 
'org.beanio.parser.DefaultStreamFactory'
        at org.beanio.StreamFactory.newInstance(StreamFactory.java:240)

Can you suggest a way to avoid this problem?

Extract of the method:
....
StreamFactory factory = StreamFactory.newInstance();
factory.load(mappingFileName);
....

BeanIO version: 1.2.2
JDK version: 1.6.0_11


Is there a method that returns the String delimited by the separator, given the 
mapping and the Java bean?


Thanks in advance for your support.

Best Regards.

Original issue reported on code.google.com by [email protected] on 16 Feb 2012 at 9:26

Cannot handle quoted CSV with space delimiter

Using the following mapping to handle a file that uses a quote qualifier and 
space delimiter.

<beanio xmlns="http://www.beanio.org/2012/03" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.beanio.org/2012/03 
http://www.beanio.org/2012/03/mapping.xsd">
   <stream name="input" format="csv">
      <parser>
      <property name="whitespaceAllowed" value="true"/>
      <property name="delimiter" value=" "/>
      <property name="quote" value="&quot;"/>
      <property name="escape" value="&quot;"/>
      </parser>
      <record name="record" class="com.silverlink.ceh.importpreview.ListBean">
         <field name="values" collection="list" maxOccurs="unbounded"/>
      </record>
   </stream>
</beanio>


Now, trying to use this on a file that contains:
"MemberID" "MemberNameFirst" "MemberNameLast" "MemberNameMI" "PersonalEmail"
"RF223939891444331766741" "Lashawn" "Bjerknes" "Q" "[email protected]"
"RF751994296140475361441" "Tonette" "Cyler" "L" "[email protected]"

Results in an error:
Malformed record at line 1: Invalid character found outside of quoted field at 
line 1
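As a pre-processing workaround, such quote-qualified, space-delimited lines can be tokenized before parsing. A minimal sketch (an assumption-laden illustration, not BeanIO code; it does not handle the escaped-quote sequence configured above):

```java
class QuoteSketch {
    // Tokenize a space-delimited line where fields may be wrapped in
    // double quotes; escaped quotes inside fields are not supported.
    static java.util.List<String> split(String line) {
        java.util.List<String> out = new java.util.ArrayList<>();
        java.util.regex.Matcher m = java.util.regex.Pattern
                .compile("\"([^\"]*)\"|(\\S+)").matcher(line);
        while (m.find()) {
            out.add(m.group(1) != null ? m.group(1) : m.group(2));
        }
        return out;
    }
}
```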

Original issue reported on code.google.com by [email protected] on 1 Aug 2012 at 6:01

Some beans may be instantiated more than once

Non-collection beans assigned to a segment may be instantiated more than once.  
There does not appear to be any negative side effects of this behavior, other 
than some impact on performance.

Original issue reported on code.google.com by [email protected] on 8 Sep 2012 at 1:28

Trim doesn't work on last field

I get a validation exception, as trimming seems to fail on delimited streams 
with trailing spaces in the last field (I didn't actually test whether other 
positions have the same problem).
There is no error if I remove spaces at the end of the record.

Everything was fine using version 1.2.2 and stopped working when I moved to 
2.0.1. I am using JDK 6.

To get it working I had to move back to version 1.2.2.

Here is an extract of the mapping:

<parser>
<property name="delimiter" value=";" />
</parser>
<record name="onBoardingTrailer" minOccurs="1" 
class="net.giomag.sbp.io.OnBoardingHeader">
 <field name="recordType" rid="true" literal="10" ignore="true" />
 <field name="dateSent" format="yyyyMMddHHmmss" />
 <field name="dateResponse" format="yyyyMMddHHmmss" trim="true" />
</record>


SAMPLE RECORD (quotation marks delimit the record in order to show the trailing 
spaces; they're not present in the real stream.)
"10;;20120712134531                                   " --> FAIL
"10;;20120712134531" --> OK

Best regards
Giovanni

Original issue reported on code.google.com by [email protected] on 19 Jul 2012 at 7:35

Disable DTD loading when reading XML

BeanIO is not meant for validating XML against a schema/DTD, therefore loading 
external DTD's should be disabled when reading XML to improve performance.

Original issue reported on code.google.com by [email protected] on 20 Jul 2011 at 2:36

Errors when not all records on stream have RID

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1. Create a BeanIO configuration with header and trailer RIDs
2. The detail record may not have an RID

What is the expected output? What do you see instead?
If BeanIO does not find a record definition whose RID matches, the framework 
should fall back to the next record definition that has no RID attribute.

What version of BeanIO are you using? What JDK version?
BeanIO 1.2.2

Please provide any additional information below.
See attached files

Original issue reported on code.google.com by mikhas.rock on 13 Jan 2012 at 6:20

Attachments:

Nested groups bound to non-collection type beans are unmarshalled as null

<group name="g1" class="my.Group1Class" maxOccurs="1">
    <record name="r1.1" class="map" maxOccurs="1">
        ... a list of fields...
    </record>

    <group name="g1.2" class="my.Group2Class" maxOccurs="1">
        <record name="r1.2.1" template="t121" maxOccurs="1" />

        <record name="r1.2.2" setter="addR122" class="my.Record122Class" template="t122" collection="list" />
    </group>
</group>

Group2Class field in Group1Class is null: setter method is never called

Horrible regression in version 2.0.2: it still works in version 2.0.1.

java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)


Original issue reported on code.google.com by [email protected] on 9 Oct 2012 at 11:25

The maximum length of a fixed length record may be incorrectly defaulted

If minLength is set on a fixed length record and maxLength is not set, 
maxLength incorrectly defaults to minLength, which may be less than the sum 
of all defined field lengths.
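The intended defaulting can be sketched as follows (helper and names hypothetical): when maxLength is not configured, it should never fall below the sum of the field lengths.

```java
public class RecordLength {
    // When maxLength is not configured, default it to at least the sum of all
    // defined field lengths -- never to a smaller configured minLength.
    static int defaultMaxLength(Integer configuredMax, int minLength, int sumOfFieldLengths) {
        if (configuredMax != null) {
            return configuredMax;
        }
        return Math.max(minLength, sumOfFieldLengths);
    }

    public static void main(String[] args) {
        // minLength=5 but the fields total 20: the default must be 20, not 5
        System.out.println(defaultMaxLength(null, 5, 20)); // 20
    }
}
```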

Affects release 0.9.2.  

Workaround: when a minLength value is configured on a fixed length record, also 
set maxLength.

Original issue reported on code.google.com by [email protected] on 2 Feb 2011 at 3:38

NullPointerException in fixedlength stream with field ignore=true

What steps will reproduce the problem?
1. Define <stream name="employeeFile" format="fixedlength">
2. Define <field name="firstName" length="10" ignore="true" />

What is the expected output? What do you see instead?
Exception in thread "main" java.lang.NullPointerException
    at org.beanio.parser.fixedlength.FixedLengthFieldDefinition.parseField(FixedLengthFieldDefinition.java:61)
    at org.beanio.parser.FieldDefinition.parsePropertyValue(FieldDefinition.java:128)
    at org.beanio.parser.PropertyDefinition.parseValue(PropertyDefinition.java:102)
    at org.beanio.parser.BeanDefinition.parsePropertyValue(BeanDefinition.java:80)
    at org.beanio.parser.PropertyDefinition.parseValue(PropertyDefinition.java:102)
    at org.beanio.parser.RecordDefinition.parseBean(RecordDefinition.java:79)
    at org.beanio.parser.StreamDefinition$BeanReaderImpl.read(StreamDefinition.java:464)
    at teste.Employee.main(Employee.java:30)


What version of the product are you using? On what operating system?
1.2, Mac OS X Snow Leopard

Please provide any additional information below.

Original issue reported on code.google.com by [email protected] on 30 Sep 2011 at 5:29

Add support for filtering rows in a delimited file (wish list)

I have a few very large files where I'm only interested in parsing a small 
subset of the rows into beans. Parsing all the rows with BeanIO and then 
filtering them in Java has higher memory requirements than filtering the rows 
as they're being parsed. My request is for expression-based filtering on 
individual fields, like:

<field name="mappingType" position="1" filter="INTERESTING_RECORD"/>

In this case, only rows with the string literal "INTERESTING_RECORD" as the 
second column would be parsed; all other rows would be skipped.
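Pending such a feature, rows can be pre-filtered outside BeanIO before they are ever mapped to beans; a self-contained sketch (class and names hypothetical):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class RowFilter {
    // Keep only rows whose second column (position 1) matches the wanted
    // literal; everything else is skipped before bean mapping.
    public static List<String> filter(List<String> rows, String wanted) {
        return rows.stream()
                .filter(r -> {
                    String[] cols = r.split(",", -1);
                    return cols.length > 1 && wanted.equals(cols[1]);
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList(
                "1,INTERESTING_RECORD,a",
                "2,OTHER,b",
                "3,INTERESTING_RECORD,c");
        System.out.println(filter(rows, "INTERESTING_RECORD").size()); // 2
    }
}
```

Only the surviving lines would then be handed to the BeanIO reader, keeping memory proportional to the interesting subset.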

Original issue reported on code.google.com by [email protected] on 3 Jan 2013 at 10:15

Cannot represent "no escape" in parser configuration

Although the documentation says that CSV streams can support not having an 
"escape", there is no way (to my knowledge) to represent this in the XML 
mapping definition.

For example:
<beanio>
   <stream name="input" format="csv">
      <parser>
         <property name="delimiter" value=","/>
         <property name="quote" value="&quot;"/>
         <property name="escape" value=""/>
      </parser>
      <record name="record" class="map">
         <field name="values" collection="list" maxOccurs="unbounded"/>
      </record>
   </stream>
</beanio>

An exception is thrown if the escape property is set to an empty value as shown above.

Original issue reported on code.google.com by [email protected] on 22 May 2012 at 3:10

Throw an exception when no record class is defined

What steps will reproduce the problem?  If applicable, please provide a
mapping configuration and sample record input to recreate the problem.
1. Create a BeanIO configuration file with a "classless" record

What is the expected output? What do you see instead?
BeanIO returns a null record instead of throwing an exception or otherwise notifying the caller

What version of BeanIO are you using? What JDK version?
BeanIO 1.2.2 - JDK 1.6

Please provide any additional information below.
class:  org.beanio.parser.BeanDefinition
method: Object parsePropertyValue(Record)
line:   107

Original issue reported on code.google.com by mikhas.rock on 13 Jan 2012 at 5:57

Add support for "static" record marshalling

Add the capability to marshal "static" records not bound to a bean object, for 
instance, the header record in the following example:

<beanio xmlns="http://www.beanio.org/2012/03">
  <stream name="stream" format="csv">
    <record name="header" order="1" occurs="1">
      <field name="h1" default="Header1" />
      <field name="h2" default="Header2" />
      <field name="h3" default="Header3" />
    </record>
    <record name="detail" class="map" order="2" occurs="0+">
      <field name="d1" />
      <field name="d2" />
      <field name="d3" />
    </record>
  </stream>
</beanio>


Original issue reported on code.google.com by [email protected] on 26 May 2012 at 8:13

Allow a stream definition to support multiple formats

A single stream definition should support multiple formats, perhaps by allowing 
more than one 'parser' configuration.  Something like:

<stream name="s"...
  <parser name="s1" format="xml">...</parser>
  <parser name="s2" format="delimited">...</parser>
  ...
</stream>

The parser name could override or somehow extend the stream name.

Original issue reported on code.google.com by [email protected] on 26 Mar 2012 at 3:40

Field type is not validated if a record or bean class is not set

A mapping file such as the following should still validate that the 'fileDate' 
field in the 'header' record is a valid date in MMddyyyy format (even though 
the header record is not returned from BeanWriter because the 'class' attribute 
is not set).

  <stream name="stream4" format="csv">
    <record name="header" minOccurs="1" maxOccurs="1">
      <field name="fileDate" type="date" format="MMddyyyy" />
      <field name="fileName" />
    </record>
    <record name="record" class="map">
      <field name="field1" />
      <field name="field2" />
    </record>
  </stream>

This has likely been a defect in all previous versions.

Original issue reported on code.google.com by [email protected] on 16 Sep 2011 at 9:40

Is there any way to get the size of a list (Collection)?

I have a Java class with a list of Items (a collection) for order booking, and 
I want to transform these items into a delimited format where the first field 
of the file is the count of items (the size of the collection). I checked the 
reference manual but didn't find any property to represent the count. Is there 
any way to do it?

For reference, the configuration file (Mapping.xml) is attached.

Input to mapping
-----------------
List of Items (Code,Quantity)

Output expected
----------------
count|code1|quantity1|code2|quantity2|code3|quantity3
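The mapping schema has no built-in count property; one common workaround (sketched below with hypothetical class and field names) is to expose the collection size as a derived getter on the order bean, which a leading mapping field can then bind to via the field's getter attribute.

```java
import java.util.ArrayList;
import java.util.List;

public class Order {
    private final List<String> items = new ArrayList<>();

    public List<String> getItems() { return items; }

    // Derived property: a leading "count" field in the mapping could bind to
    // this getter, so the item count is marshalled before the items themselves.
    public int getCount() { return items.size(); }

    public static void main(String[] args) {
        Order o = new Order();
        o.getItems().add("code1,2");
        o.getItems().add("code2,5");
        System.out.println(o.getCount()); // 2
    }
}
```

Because the getter computes the size at marshalling time, the count stays consistent with the list without any extra bookkeeping.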
Original issue reported on code.google.com by [email protected] on 15 Feb 2012 at 4:41

Attachments:

Can't override the default XML namespace

Even if I set xmlPrefix="" on a bean, it still uses the defined namespace 
prefix. I want to use the default namespace when writing the "series" bean.

Config file:
<stream name="ScoresXML" format="xml" xmlType="none">
 <writer>
  <property name="version" value="1.0"/>
  <property name="encoding" value="UTF-8"/>
  <property name="indentation" value="1"/>
  <property name="lineSeparator" value="&#xA;"/>
  <property name="namespaces" value="
   d urn:oid:1.3.6.1.4.1.39777.1.0.1.0
   xml http://www.w3.org/XML/1998/namespace
   xsi http://www.w3.org/2001/XMLSchema-instance
  "/>
 </writer>
 <record name="cursus" class="eu.lp0.cursus.xml.scores.ScoresXML" xmlPrefix="" xmlNamespace="urn:oid:1.3.6.1.4.1.39777.1.0.1.1">
  <field name="generator" xmlType="attribute" xmlNamespace=""/>
  <bean name="series" minOccurs="1" class="eu.lp0.cursus.xml.data.entity.DataXMLSeries" xmlPrefix="" xmlNamespace="urn:oid:1.3.6.1.4.1.39777.1.0.1.0">
   <include template="series"/>
  </bean>
  <bean name="seriesResults" minOccurs="0" class="eu.lp0.cursus.xml.scores.results.ScoresXMLSeriesResults">
   <include template="seriesResults"/>
  </bean>
  <bean name="eventResults" minOccurs="0" maxOccurs="unbounded" class="eu.lp0.cursus.xml.scores.results.ScoresXMLEventResults" collection="list">
   <include template="eventResults"/>
  </bean>
  <bean name="raceResults" minOccurs="0" maxOccurs="unbounded" class="eu.lp0.cursus.xml.scores.results.ScoresXMLRaceResults" collection="list">
   <include template="raceResults"/>
  </bean>
 </record>
</stream>

Expected output:
<?xml version="1.0" encoding="UTF-8"?>
<cursus xmlns="urn:oid:1.3.6.1.4.1.39777.1.0.1.1" 
xmlns:d="urn:oid:1.3.6.1.4.1.39777.1.0.1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" generator="cursus 0.0.1">
 <series xml:id="Seriesa1464332ac848" xmlns="urn:oid:1.3.6.1.4.1.39777.1.0.1.0">
  <name>...

Actual output:
<?xml version="1.0" encoding="UTF-8"?>
<cursus xmlns="urn:oid:1.3.6.1.4.1.39777.1.0.1.1" 
xmlns:d="urn:oid:1.3.6.1.4.1.39777.1.0.1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" generator="cursus 0.0.1">
 <d:series xml:id="Seriesa1464332ac848">
  <d:name>...

Original issue reported on code.google.com by [email protected] on 22 Apr 2012 at 5:09

Allow definition of class loader

In OSGi-based environments it might be necessary to define the class loader 
used for creating beans. Unfortunately, BeanIO does not let the user do so. A 
possible solution would be to extend StreamFactory's load methods with an 
additional parameter of type ClassLoader and use that ClassLoader anywhere a 
class is loaded or a bean is created.
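A minimal sketch of the requested behavior (helper name hypothetical, not BeanIO's API): resolve and instantiate bean classes through a caller-supplied ClassLoader instead of the library's own defining loader.

```java
public class BeanLoader {
    // Resolve the bean class through the caller-supplied ClassLoader (e.g. an
    // OSGi bundle's loader) rather than the loader that defined this library.
    public static Object newBean(ClassLoader cl, String className) throws Exception {
        Class<?> type = Class.forName(className, true, cl);
        return type.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Object bean = newBean(Thread.currentThread().getContextClassLoader(),
                "java.lang.StringBuilder");
        System.out.println(bean.getClass().getName()); // java.lang.StringBuilder
    }
}
```

BeanIO 2.x later gained a StreamFactory.newInstance(ClassLoader) overload in this spirit.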

Original issue reported on code.google.com by [email protected] on 8 Nov 2011 at 3:25
