
Spring and Quartz JobDataMap serialization exception

We don't run our nodes in HA yet; once a customer registers he is assigned a node and he lives there. The problem is that if that node dies we incur downtime for that customer, and we also need to overallocate hardware to prepare for the worst-case scenario. For the past 6 months I have been working on making the code stateless so that we can do HA and reduce our node count by a factor of 4.

So we used to run Quartz with the in-memory scheduler, but for HA I needed to run Quartz in a cluster. We chose org.quartz.impl.jdbcjobstore.JobStoreTX for this.
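For reference, a clustered JobStoreTX setup needs a quartz.properties along these lines. This is a minimal sketch: the instance name, table prefix, check-in interval, and data source name are placeholder values, not the ones we actually used.

```properties
org.quartz.scheduler.instanceName = haQuartzScheduler
org.quartz.scheduler.instanceId = AUTO

# Persist jobs/triggers in the database so any node in the cluster can run them
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.StdJDBCDelegate
org.quartz.jobStore.tablePrefix = QRTZ_
org.quartz.jobStore.isClustered = true
org.quartz.jobStore.clusterCheckinInterval = 20000
org.quartz.jobStore.dataSource = quartzDS

org.quartz.threadPool.threadCount = 2
```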

The problem was that as soon as I tried it I ran into issues: I was injecting Spring beans into our Quartz jobs via the JobDataMap, JobStoreTX tried to serialize the job data into a table, and our Spring beans are not serializable. There were two options:
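The failure mode is easy to reproduce with plain Java serialization, which is essentially what JobStoreTX does when it persists the JobDataMap. A minimal sketch (the helper class here is a stand-in for a real Spring bean):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.util.HashMap;

public class JobDataMapSerializationDemo {

    // Stands in for a typical Spring service bean: it does not implement Serializable.
    static class CommitFileJobHelper { }

    public static void main(String[] args) throws IOException {
        // Quartz's JobDataMap is essentially a Map<String, Object> that
        // JobStoreTX serializes into the job details table.
        HashMap<String, Object> jobData = new HashMap<>();
        jobData.put("commitFileJobHelper", new CommitFileJobHelper());

        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(jobData);
            System.out.println("serialized OK");
        } catch (NotSerializableException e) {
            // This is the exception JobStoreTX surfaces when a non-serializable
            // bean is stuffed into the JobDataMap.
            System.out.println("NotSerializableException: " + e.getMessage());
        }
    }
}
```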

1) Load the entire ApplicationContext in each job and look the bean up from there.

2) Use the schedulerContextAsMap.

After evaluating the options I found the scheduler context to be the best one. The way to do it:

    <bean id="haQuartzScheduler" class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
        <property name="configLocation" value="classpath:quartz.properties"/>
        <property name="triggers">
            <list>
                <ref bean="CommitFileJobTrigger" />
            </list>
        </property>
        <property name="schedulerContextAsMap">
            <map>
                <entry key="commitFileJobHelper" value-ref="commitFileJobHelper" />
                <entry key="numThreads" value="2" />           
            </map>
        </property>
    </bean>
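The CommitFileJobTrigger referenced above is an ordinary Spring-managed Quartz trigger; a sketch of what its definition might look like (the job class and cron expression here are illustrative, not the originals):

```xml
    <bean id="CommitFileJob" class="org.springframework.scheduling.quartz.JobDetailFactoryBean">
        <property name="jobClass" value="com.example.CommitFileJob"/>
        <property name="durability" value="true"/>
    </bean>

    <bean id="CommitFileJobTrigger" class="org.springframework.scheduling.quartz.CronTriggerFactoryBean">
        <property name="jobDetail" ref="CommitFileJob"/>
        <property name="cronExpression" value="0 */5 * * * ?"/>
    </bean>
```

Note that the JobDataMap stays empty; everything non-serializable travels through the scheduler context instead.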

In the quartz job I wrote a function

    protected <T> T getSchedulerContextBean(JobExecutionContext context, String beanName) throws JobExecutionException {
        SchedulerContext schedulerContext;
        try {
            schedulerContext = context.getScheduler().getContext();
        } catch (SchedulerException e) {
            throw new JobExecutionException(e);
        }
        @SuppressWarnings("unchecked")
        T bean = (T) schedulerContext.get(beanName);
        return bean;
    }


and then every job can use this function like so:

        CommitFileJobHelper commitFileJobHelper = getSchedulerContextBean(context, "commitFileJobHelper");
