Oracle Coherence GE 3.7.0.0 <Error> - Failed to apply delta

René van Wijk, Aug 8 2011 (edited Oct 20 2011)
The following exception is occurring:
2011-08-08 14:12:32.338/25.815 Oracle Coherence GE 3.7.0.0 <Error> (thread=DistributedCache:HibernateReadWriteDistributedCache, member=2): Failed to apply delta: partition=175; key=Binary(length=7, value=0x0DAF021541A50F); old=null; new=Binary(length=1, value=0xFF); java.lang.NullPointerException
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.applyDelta(PartitionedCache.CDB:6)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.doBackupAllCache(PartitionedCache.CDB:66)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onBackupAllRequest(PartitionedCache.CDB:31)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BackupAllRequest.onReceived(PartitionedCache.CDB:2)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:33)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
	at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
	at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
	at java.lang.Thread.run(Thread.java:619)
It occurs in the following situation (a simplified set-up): a write-behind architecture that uses Hibernate to communicate with the database.

We have one entity:
package model.entities;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;

import java.io.IOException;

public class Klant implements PortableObject {

    private Integer klantnummer;
    private String naam;
    private String adres;
    private String stad;
    private String provincie;
    private String postcode;
    private Integer gebied;
    private String telefoonnummer;
    private Integer reputatieNummer;
    private Double kredietlimiet;
    private String commentaar;

    public Klant() {
    }

    public Integer getKlantnummer() {
        return klantnummer;
    }

    public void setKlantnummer(Integer klantnummer) {
        this.klantnummer = klantnummer;
    }

    public String getNaam() {
        return naam;
    }

    public void setNaam(String naam) {
        this.naam = naam;
    }

    public String getAdres() {
        return adres;
    }

    public void setAdres(String adres) {
        this.adres = adres;
    }

    public String getStad() {
        return stad;
    }

    public void setStad(String stad) {
        this.stad = stad;
    }

    public String getProvincie() {
        return provincie;
    }

    public void setProvincie(String provincie) {
        this.provincie = provincie;
    }

    public String getPostcode() {
        return postcode;
    }

    public void setPostcode(String postcode) {
        this.postcode = postcode;
    }

    public Integer getGebied() {
        return gebied;
    }

    public void setGebied(Integer gebied) {
        this.gebied = gebied;
    }

    public String getTelefoonnummer() {
        return telefoonnummer;
    }

    public void setTelefoonnummer(String telefoonnummer) {
        this.telefoonnummer = telefoonnummer;
    }

    public Integer getReputatieNummer() {
        return reputatieNummer;
    }

    public void setReputatieNummer(Integer reputatieNummer) {
        this.reputatieNummer = reputatieNummer;
    }

    public Double getKredietlimiet() {
        return kredietlimiet;
    }

    public void setKredietlimiet(Double kredietlimiet) {
        this.kredietlimiet = kredietlimiet;
    }

    public String getCommentaar() {
        return commentaar;
    }

    public void setCommentaar(String commentaar) {
        this.commentaar = commentaar;
    }

    @Override
    public boolean equals(Object object) {
        if (this == object) {
            return true;
        }

        if (object == null) {
            return false;
        }

        if (!(object instanceof Klant)) {
            return false;
        }

        Klant klant = (Klant) object;
        // null-safe: klantnummer may not have been set yet
        return klantnummer != null && klantnummer.equals(klant.getKlantnummer());
    }

    @Override
    public int hashCode() {
        if (klantnummer != null) {
            return 37 * klantnummer.hashCode();
        } else {
            return System.identityHashCode(this);
        }
    }

    @Override
    public String toString() {
        return naam + ", " + adres;
    }

    public void readExternal(PofReader reader) throws IOException {
        setKlantnummer(reader.readInt(0));
        setNaam(reader.readString(1));
        setAdres(reader.readString(2));
        setStad(reader.readString(3));
        setProvincie(reader.readString(4));
        setPostcode(reader.readString(5));
        setGebied(reader.readInt(6));
        setTelefoonnummer(reader.readString(7));
        setReputatieNummer(reader.readInt(8));
        setKredietlimiet(reader.readDouble(9));
        setCommentaar(reader.readString(10));
    }

    public void writeExternal(PofWriter writer) throws IOException {
        if (getKlantnummer() != null) {
            writer.writeInt(0, getKlantnummer());
        }
        if (getNaam() != null) {
            writer.writeString(1, getNaam());
        }
        if (getAdres() != null) {
            writer.writeString(2, getAdres());
        }
        if (getStad() != null) {
            writer.writeString(3, getStad());
        }
        if (getProvincie() != null) {
            writer.writeString(4, getProvincie());
        }
        if (getPostcode() != null) {
            writer.writeString(5, getPostcode());
        }
        if (getGebied() != null) {
            writer.writeInt(6, getGebied());
        }
        if (getTelefoonnummer() != null) {
            writer.writeString(7, getTelefoonnummer());
        }
        if (getReputatieNummer() != null) {
            writer.writeInt(8, getReputatieNummer());
        }
        if (getKredietlimiet() != null) {
            writer.writeDouble(9, getKredietlimiet());
        }
        if (getCommentaar() != null) {
            writer.writeString(10, getCommentaar());
        }
    }
}
which has the following Hibernate mapping:
<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
        "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="model.entities">
    <class name="Klant" table="CUSTOMER">
        <id name="klantnummer" type="integer" column="CUSTID">
            <generator class="assigned"/>
        </id>
        <property name="naam" type="string" column="NAME"/>
        <property name="adres" type="string" column="ADDRESS"/>
        <property name="stad" type="string" column="CITY"/>
        <property name="provincie" type="string" column="STATE"/>
        <property name="postcode" type="string" column="ZIP"/>
        <property name="gebied" type="integer" column="AREA"/>
        <property name="telefoonnummer" type="string" column="PHONE"/>
        <property name="reputatieNummer" type="integer" column="REPID"/>
        <property name="kredietlimiet" type="double" column="CREDITLIMIT"/>
        <property name="commentaar" type="string" column="COMMENTS"/>
    </class>
</hibernate-mapping>
We have the following logic to communicate with Coherence:
package model.logic;

import java.io.Serializable;
import java.util.List;

public interface GenericDAO<T, ID extends Serializable> {
    public void addEntity(ID id, T entity);

    public void removeEntity(ID id);

    public void updateEntity(ID id, T entity);

    public T findEntity(ID id);

    public List<T> findEntities();
}
and the following implementation:
package model.logic;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.util.MapTriggerListener;
import com.tangosol.util.ValueExtractor;
import com.tangosol.util.extractor.ChainedExtractor;
import com.tangosol.util.filter.EqualsFilter;

import java.io.Serializable;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public abstract class GenericCoherenceDAO<T, ID extends Serializable> implements GenericDAO<T, ID> {

    private Class<T> persistentClass;
    private NamedCache namedCache;

    public GenericCoherenceDAO() {
        Type type = getClass().getGenericSuperclass();
        if (type instanceof ParameterizedType) {
            ParameterizedType parameterizedType = (ParameterizedType) type;
            setPersistentClass((Class<T>) parameterizedType.getActualTypeArguments()[0]);
        } else {
            System.out.println("Not an instance of parameterized type: " + type);
        }

        setNamedCache(CacheFactory.getCache(getPersistentClass().getName()));
    }

    public Class<T> getPersistentClass() {
        return persistentClass;
    }

    public void setPersistentClass(Class<T> persistentClass) {
        this.persistentClass = persistentClass;
    }

    public NamedCache getNamedCache() {
        return namedCache;
    }

    public void setNamedCache(NamedCache namedCache) {
        this.namedCache = namedCache;
    }

    public void addEntity(ID id, T entity) {
        getNamedCache().put(id, entity);
    }

    public void removeEntity(ID id) {
        getNamedCache().remove(id);
    }

    public void updateEntity(ID id, T entity) {
        getNamedCache().put(id, entity);
    }

    public T findEntity(ID id) {
        return (T) getNamedCache().get(id);
    }

    public List<T> findEntities() {
        ValueExtractor extractor = new ChainedExtractor("getClass.getName");
        Set keys = getNamedCache().keySet(new EqualsFilter(extractor, getPersistentClass().getName()));
        return new ArrayList<T>(getNamedCache().getAll(keys).values());
    }
}
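The constructor above relies on reflection over the generic superclass to recover the entity type. A minimal, dependency-free sketch of that trick (class names here are illustrative, not part of the project) shows why it only works when a subclass such as KlantDAOBean supplies a concrete type argument directly:

```java
import java.lang.reflect.ParameterizedType;

// Minimal model of the type-resolution trick used in the
// GenericCoherenceDAO constructor (names are illustrative).
public class TypeResolutionDemo {

    static abstract class GenericDao<T> {
        final Class<T> persistentClass;

        @SuppressWarnings("unchecked")
        GenericDao() {
            // getGenericSuperclass() is a ParameterizedType only when the
            // direct subclass binds T to a concrete class, e.g.
            // KlantDAOBean extends GenericCoherenceDAO<Klant, Integer>.
            ParameterizedType pt = (ParameterizedType) getClass().getGenericSuperclass();
            persistentClass = (Class<T>) pt.getActualTypeArguments()[0];
        }
    }

    static class StringDao extends GenericDao<String> {
    }

    public static void main(String[] args) {
        // prints java.lang.String
        System.out.println(new StringDao().persistentClass.getName());
    }
}
```

If another layer of inheritance with an unbound type variable is inserted, the cast to ParameterizedType fails, which is why the constructor logs "Not an instance of parameterized type" in that case.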
For every entity a DAO is constructed:
package model.logic;

import model.entities.Klant;

public interface KlantDAO extends GenericDAO<Klant, Integer> {
}
and the implementation:
package model.logic;

import model.entities.Klant;

public class KlantDAOBean extends GenericCoherenceDAO<Klant, Integer> implements KlantDAO {
}
The following Hibernate configuration is used:
<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD//EN"
        "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <!-- Database connection settings -->        
        <property name="hibernate.connection.driver_class">oracle.jdbc.OracleDriver</property>
        <property name="hibernate.connection.url">jdbc:oracle:thin:@hostname:1521:SID</property>
        <property name="hibernate.connection.username">username</property>
        <property name="hibernate.connection.password">password</property>
        <!-- SQL dialect -->
        <property name="hibernate.dialect">org.hibernate.dialect.Oracle10gDialect</property>
        <!-- Enable Hibernate's automatic session context management -->
        <property name="hibernate.current_session_context_class">thread</property>
        <!-- Echo all executed SQL to stdout -->
        <property name="hibernate.show_sql">false</property>
        <property name="hibernate.format_sql">false</property>
        <!-- Configure SessionFactory -->
        <mapping resource="model/entities/Klant.hbm.xml"/>
    </session-factory>
</hibernate-configuration>
And the following Coherence configuration:
<?xml version="1.0"?>
<cache-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
              xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config coherence-cache-config.xsd">
    <defaults>
        <serializer>
            <instance>
                <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
                <init-params>
                    <init-param>
                        <param-type>String</param-type>
                        <param-value>hibernate-pof-config.xml</param-value>
                    </init-param>
                </init-params>
            </instance>
        </serializer>
    </defaults>
    <caching-scheme-mapping>
        <cache-mapping>
            <cache-name>*</cache-name>
            <scheme-name>hibernate-read-write-distributed</scheme-name>
            <init-params>
                <init-param>
                    <param-name>back-size-limit</param-name>
                    <param-value>250M</param-value>
                </init-param>
            </init-params>
        </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
        <distributed-scheme>
            <scheme-name>hibernate-read-write-distributed</scheme-name>
            <service-name>HibernateReadWriteDistributedCache</service-name>
            <thread-count>5</thread-count>
            <backup-count>1</backup-count>
            <backup-count-after-writebehind>0</backup-count-after-writebehind>
            <backing-map-scheme>
                <read-write-backing-map-scheme>
                    <scheme-ref>hibernate-read-write-backing-map</scheme-ref>
                </read-write-backing-map-scheme>
            </backing-map-scheme>
            <autostart>true</autostart>
        </distributed-scheme>
        <read-write-backing-map-scheme>
            <scheme-name>hibernate-read-write-backing-map</scheme-name>
            <internal-cache-scheme>
                <local-scheme>
                    <scheme-ref>hibernate-backing-map</scheme-ref>
                </local-scheme>
            </internal-cache-scheme>
            <cachestore-scheme>
                <class-scheme>
                    <class-name>com.tangosol.coherence.hibernate.HibernateCacheStore</class-name>
                    <init-params>
                        <init-param>
                            <param-type>String</param-type>
                            <param-value>{cache-name}</param-value>
                        </init-param>
                    </init-params>
                </class-scheme>
            </cachestore-scheme>
            <write-delay>{write-delay 10s}</write-delay>
            <write-batch-factor>{write-batch-factor 0.75}</write-batch-factor>
            <write-requeue-threshold>{write-requeue-threshold 128}</write-requeue-threshold>
            <refresh-ahead-factor>{refresh-ahead-factor 0.75}</refresh-ahead-factor>
        </read-write-backing-map-scheme>
        <local-scheme>
            <scheme-name>hibernate-backing-map</scheme-name>
            <eviction-policy>hybrid</eviction-policy>
            <high-units>{back-size-limit 0}</high-units>
            <unit-calculator>binary</unit-calculator>
            <expiry-delay>{back-expiry-delay 1h}</expiry-delay>
        </local-scheme>
    </caching-schemes>
</cache-config>
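The brace entries such as {write-delay 10s} are cache-configuration parameter macros: the value after the name is the default, and a cache-mapping can override it via init-params, exactly as back-size-limit is overridden in the wildcard mapping above. For example, a dedicated mapping for the Klant cache (the cache name here is illustrative) could raise the write delay:

```xml
<cache-mapping>
    <cache-name>model.entities.Klant</cache-name>
    <scheme-name>hibernate-read-write-distributed</scheme-name>
    <init-params>
        <init-param>
            <param-name>write-delay</param-name>
            <param-value>30s</param-value>
        </init-param>
    </init-params>
</cache-mapping>
```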
and the following POF configuration:
<?xml version="1.0"?>
<pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
            xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd">
    <user-type-list>
        <include>coherence-pof-config.xml</include>
        <user-type>
            <type-id>1001</type-id>
            <class-name>model.entities.Klant</class-name>
        </user-type>
    </user-type-list>
    <allow-interfaces>true</allow-interfaces>
    <allow-subclasses>true</allow-subclasses>
</pof-config>
The environment is tested using the following class:
package model.test;

import model.entities.Klant;
import model.logic.KlantDAO;
import model.logic.KlantDAOBean;

import java.util.Random;

public class Test {

    private Random generator = new Random();

    public static void main(String[] args) {
        Test test = new Test();

        KlantDAO klantDAO = new KlantDAOBean();

        test.doRandomReadWriteTest(klantDAO);
    }

    private void doRandomReadWriteTest(KlantDAO klantDAO) {
        //klantDAO.preload();

        while (true) {
            // generate a random client
            Klant klant = createKlant();
            // insert or update a client - an update is issued when the client already exists
            klantDAO.addEntity(klant.getKlantnummer(), klant);
            if (generator.nextDouble() < 0.001) {
                System.out.println("JACKPOT");
                // remove a client
                klantDAO.removeEntity(generateKlantNummer());
                // get all client data
                klantDAO.findEntities();
            } else {
                // find a client by ID
                klantDAO.findEntity(generateKlantNummer());
            }
        }
    }

    private Klant createKlant() {
        int klantnummer = generateKlantNummer();

        Klant klant = new Klant();
        klant.setKlantnummer(klantnummer);
        klant.setNaam("Person" + klantnummer);
        klant.setAdres("Someware");
        klant.setStad("Else");
        klant.setProvincie("NL");
        klant.setPostcode("1234AB");
        klant.setGebied(1);
        klant.setTelefoonnummer("123-4567");
        klant.setReputatieNummer(1);
        klant.setKredietlimiet(Math.rint(generator.nextDouble() * 5000.0));
        klant.setCommentaar(Long.toString(Math.abs(generator.nextLong()), 36));
        
        return klant;
    }

    private Integer generateKlantNummer() {
        int klantnummer = generator.nextInt(10000);
        if (klantnummer == 0 || (klantnummer >= 100 && klantnummer <= 109)) {
            return 42;
        } else {
            return klantnummer;
        }
    }
}
When the test is run stand-alone, the error is not observed; when it is run with multiple nodes, it is.

A second node is created by using the default cache server, which is started as follows:
#!/bin/sh
# coherence options
COHERENCE_MANAGEMENT_OPTIONS="-Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true"
COHERENCE_OPTIONS="-Dtangosol.coherence.cacheconfig=hibernate-cache-config.xml ${COHERENCE_MANAGEMENT_OPTIONS}"
export COHERENCE_OPTIONS
 
JAVA_HOME="/home/oracle/aqualogic/jrrt-4.0.1-1.6.0"
export JAVA_HOME
MEM_ARGS="-jrockit -Xms1024m -Xmx1024m -Xns256m -XXkeepAreaRatio:25 -Xgc:pausetime -XpauseTarget:200ms -XX:+UseCallProfiling"
export MEM_ARGS

CLASSPATH="/home/oracle/temp/TryOut/lib/Coherence/coherence-hibernate.jar:/home/oracle/temp/TryOut/lib/Coherence/commonj.jar:/home/oracle/temp/TryOut/lib/Coherence/coherence.jar:/home/oracle/temp/TryOut/lib/Coherence/coherence-work.jar:/home/oracle/temp/TryOut/lib/Database/ojdbc6.jar:/home/oracle/temp/TryOut/lib/Hibernate/jta.jar:/home/oracle/temp/TryOut/lib/Hibernate/ehcache-1.2.3.jar:/home/oracle/temp/TryOut/lib/Hibernate/hibernate3.jar:/home/oracle/temp/TryOut/lib/Hibernate/commons-logging-1.0.4.jar:/home/oracle/temp/TryOut/lib/Hibernate/dom4j-1.6.1.jar:/home/oracle/temp/TryOut/lib/Hibernate/cglib-2.1.3.jar:/home/oracle/temp/TryOut/lib/Hibernate/commons-collections-2.1.1.jar:/home/oracle/temp/TryOut/lib/Hibernate/asm.jar:/home/oracle/temp/TryOut/lib/Hibernate/c3p0-0.9.1.jar:/home/oracle/temp/TryOut/lib/Hibernate/antlr-2.7.6.jar:/home/oracle/temp/TryOut/lib/Hibernate/asm-attrs.jar:/home/oracle/temp/TryOut/out/artifacts/test/test.jar"
export CLASSPATH
 
# start the test
${JAVA_HOME}/bin/java ${MEM_ARGS} ${COHERENCE_OPTIONS} com.tangosol.net.DefaultCacheServer
Note that when the entity just implements java.io.Serializable instead of PortableObject, the error is not observed.
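One difference between the two serialization paths is that the writeExternal above skips null properties, producing a sparse POF stream; note also that a skipped Integer comes back from readInt as 0, not null, so the round trip is not symmetric. A sketch of a dense variant (untested against 3.7.0.0, same field indices assumed) that writes every index unconditionally and preserves nulls via writeObject/readObject, useful for ruling the sparse stream out as the trigger:

```java
// Sketch only: dense PortableObject implementation for Klant.
// writeObject/readObject preserve nulls on a round trip, whereas the
// conditional writeInt/readInt pair turns a skipped null into 0.
public void readExternal(PofReader reader) throws IOException {
    setKlantnummer((Integer) reader.readObject(0));
    setNaam((String) reader.readObject(1));
    // ... indices 2-9 analogous ...
    setCommentaar((String) reader.readObject(10));
}

public void writeExternal(PofWriter writer) throws IOException {
    writer.writeObject(0, getKlantnummer());
    writer.writeObject(1, getNaam());
    // ... indices 2-9 analogous ...
    writer.writeObject(10, getCommentaar());
}
```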

Can someone explain why the error occurs, and how it can be resolved?