Saturday, February 22, 2014

Camel Groovy script validator for cxfEndpoint

This applies to Camel 2.12.1/2.12.2
Using a Groovy script as a validator for incoming messages received through an inbound cxfEndpoint shows erratic behaviour: sometimes the message body has the value of a previous message. This looks like being dependent on the thread allocated to process the message: if it is the original thread that processed the first incoming message (when the script was compiled), then the body content is the expected one; otherwise it is the body of the previous message processed on that first thread.
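For reference, a minimal sketch of the kind of route where I hit this (the endpoint bean name and the Groovy expression are placeholders, not from an actual configuration):

```xml
<route>
  <!-- inbound CXF endpoint; "myCxfEndpoint" is a hypothetical bean name -->
  <from uri="cxf:bean:myCxfEndpoint"/>
  <!-- Groovy validator; this is where the stale message body shows up -->
  <validate>
    <groovy>request.body != null</groovy>
  </validate>
  <to uri="direct:process"/>
</route>
```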

Solution:

  • get the sources for camel-script-2.12.x
  • in org.apache.camel.builder.script.ScriptBuilder, in the runScript method, change:
    • from: result = compiledScript.eval();
    • to: result = compiledScript.eval(getScriptContext());
  • recompile the package and deploy it
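The change amounts to a one-line patch; a sketch of the relevant spot in runScript (a fragment, not the full method):

```java
// org.apache.camel.builder.script.ScriptBuilder#runScript (fragment)
// before:
//   result = compiledScript.eval();
// after: pass the per-exchange script context explicitly, so the
// evaluation uses the current message's bindings instead of whatever
// context was associated with the compiling thread
result = compiledScript.eval(getScriptContext());
```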

Camel, cxf:cxfEndpoint and OTRS

When used as an outbound endpoint, cxf:cxfEndpoint uses a default threshold of 4096 bytes above which it switches from Content-Length based HTTP transfer to Transfer-Encoding: chunked. This behaviour can cause problems if the target server does not support chunked transfer (in particular, the default OTRS/Apache configuration does not).
Solution: add the following XML snippet to the Camel configuration file (outside the camelContext tag):
  <http-conf:conduit name="*.http-conduit">
    <http-conf:client ChunkingThreshold="15000000" />
    <!--http-conf:client AllowChunking="false" /-->
  </http-conf:conduit>

This will change the default threshold from 4096 bytes to the value specified in the ChunkingThreshold attribute. The more general option of AllowChunking="false" didn't work for me (the outbound endpoint just freezes and no content is sent over the wire).
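For the snippet above to be valid, the http-conf namespace has to be declared on the Spring root element; a sketch of the declarations involved (schema locations as published by CXF):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:http-conf="http://cxf.apache.org/transports/http/configuration"
       xsi:schemaLocation="
         http://www.springframework.org/schema/beans
         http://www.springframework.org/schema/beans/spring-beans.xsd
         http://cxf.apache.org/transports/http/configuration
         http://cxf.apache.org/schemas/configuration/http-conf.xsd">
  <!-- camelContext and the http-conf:conduit element go here -->
</beans>
```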

Controlling ActiveMQ through JMX

See http://activemq.apache.org/jmx.html; the first option described there ("The ActiveMQ broker should appear in the list of local connections, if you are running JConsole on the same host as ActiveMQ.") didn't work for me (ActiveMQ was started as a service, maybe that was the cause...)

So:

  • go into bin/win64 and edit wrapper.conf; uncomment the following lines:
    • wrapper.java.additional.16=-Dcom.sun.management.jmxremote.port=1616
    • wrapper.java.additional.17=-Dcom.sun.management.jmxremote.authenticate=false
    • wrapper.java.additional.18=-Dcom.sun.management.jmxremote.ssl=false
  • be aware that you need to renumber the entries so that .16, .17 and .18 continue the existing wrapper.java.additional.n sequence (the exact numbers might differ depending on other changes you made to the conf file)
  • restart ActiveMQ
  • go to your JDK\bin directory and start jconsole.exe
  • provide the following URL for Remote Process:
    • service:jmx:rmi:///jndi/rmi://localhost:1616/jmxrmi
  • press Connect (do not fill in any data for user/password)
  • press Insecure on the pop-up
  • etc...
If a secure connection is required (I suppose by changing the wrapper options accordingly), provide the user name/password for the relevant role as configured in conf\jmx.access and conf\jmx.password
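The same URL can also be used programmatically from a JMX client; a minimal sketch (the host/port match the wrapper settings above, and the connection calls are shown commented out since they need a running broker):

```java
import javax.management.remote.JMXServiceURL;

public class ActiveMqJmxUrl {
    public static void main(String[] args) throws Exception {
        // the same URL entered in JConsole's Remote Process field
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://localhost:1616/jmxrmi");
        System.out.println(url.getProtocol()); // rmi
        System.out.println(url.getURLPath());  // /jndi/rmi://localhost:1616/jmxrmi
        // with the broker running, continue with:
        // JMXConnector jmxc = JMXConnectorFactory.connect(url);
        // MBeanServerConnection conn = jmxc.getMBeanServerConnection();
    }
}
```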


Debugging ActiveMQ/Camel


  • Edit activemq.bat and uncomment the following line:
    • SET ACTIVEMQ_DEBUG_OPTS=-Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005
  • start ActiveMQ through command line
  • create a Maven project with NetBeans
  • add the required dependencies to the project according to the modules you want to debug (the sample is for ActiveMQ 5.9 with cxf, scripting/Groovy and stream):

    <dependencies>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-cxf</artifactId>
            <version>2.12.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-script</artifactId>
            <version>2.12.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-stream</artifactId>
            <version>2.12.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.cxf</groupId>
            <artifactId>cxf-rt-transports-http-jetty</artifactId>
            <version>2.7.8</version>
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-all</artifactId>
            <version>2.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.jetty.aggregate</groupId>
            <artifactId>jetty-all-server</artifactId>
            <version>8.1.14.v20131031</version>
        </dependency>            
        <dependency>
            <groupId>org.eclipse.jetty</groupId>
            <artifactId>jetty-websocket</artifactId>
            <version>8.1.14.v20131031</version>
        </dependency>
    </dependencies>
  • build the project so the dependencies are downloaded to the local Maven repository
  • in NetBeans go to Dependencies and select Download Sources
  • in NetBeans go to Debug menu and choose Attach Debugger
  • change the port to 5005 (or whatever was specified in ACTIVEMQ_DEBUG_OPTS)
  • look for the desired package/class in Dependencies, open the class, place a breakpoint, etc

log4j.properties

Things to remember :-):

  • The log4j.properties file is searched for in the class path.
  • The rootLogger appenders are inherited by all loggers (if, for instance, the root logger has a console appender, all other defined loggers will also write to that appender even if not specifically configured to)
In order to configure your specific logger(s):
  • add the desired appender(s) (like below):
    • log4j.appender.myLog=org.apache.log4j.FileAppender
    • log4j.appender.myLog.File=logs/myLog.txt
    • log4j.appender.myLog.layout=org.apache.log4j.PatternLayout
    • log4j.appender.myLog.layout.conversionPattern=%-5p %d [%t] %c: %m%n
  • add the logger by specifying logging level and appender(s):
    • log4j.logger.myLog=DEBUG, myLog

The actual place where the logs are generated depends on the implicit/explicit setting of the working directory of your application; for more specific control over this behaviour you can use environment variable expansion, like:

  • log4j.appender.file.File=${mule.home}/logs/mule-logging-example.log
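Putting the pieces together, a minimal log4j.properties sketch (the appender name myLog and the category com.example are placeholders):

```properties
# root logger: everything at INFO to the console
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%-5p %d [%t] %c: %m%n

# dedicated file appender for our own category
log4j.appender.myLog=org.apache.log4j.FileAppender
log4j.appender.myLog.File=logs/myLog.txt
log4j.appender.myLog.layout=org.apache.log4j.PatternLayout
log4j.appender.myLog.layout.ConversionPattern=%-5p %d [%t] %c: %m%n

# DEBUG for com.example goes to myLog (and, via the root, to the console)
log4j.logger.com.example=DEBUG, myLog
```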

Running ActiveMQ as a Windows service

After downloading/extracting ActiveMQ:

  • open a command window as administrator
  • place the right activemq.xml in conf directory (along with any other related config files like the one describing Camel context)
  • navigate to bin\win64 (do not try the win32 version, the service startup will fail on x64 Windows)
  • edit wrapper.conf by changing the working directory to:
    • wrapper.working.dir=%ACTIVEMQ_BASE%
  • (otherwise the working directory will be win64)
  • run InstallService.bat
  • manually start the service (named ActiveMQ)

Run Mule ESB as a service on Windows

After downloading/extracting standalone version:

  • open a command line window (cmd.exe) as administrator
  • navigate to Mule bin directory
  • run mule install
  • check in Services that a service with Mule name exists; start the service
  • place any Mule packages (zip archives) in the apps directory and wait for the deployment to occur (a subdirectory with the exploded package content is created and a text file <package name>-anchor.txt is present)
  • you can control the service by running mule [start|stop|restart]
When running Mule as a service, by default the current path is set to the root directory of the expanded standalone package (at the same level as bin, apps, logs, etc), so if we have defined in log4j.properties
log4j.appender.MyLog.File=logs/MyLog.txt
the MyLog.txt file will be created in the logs subdirectory of the installation, along with Mule's standard log files (mule.log, etc)

Just supposing (not tried with Mule): you can change the working path by adding an entry to conf/wrapper.conf, something like:
  • wrapper.working.dir="%MULE_HOME%/workingDir"

Be aware: if there are unrecoverable errors at start-up, Mule will try to undeploy the application by also deleting the exploded directory; since the original zip package was already deleted, this means that your application will never be launched again. To be more explicit:
  • we have a Mule application that uses as inbound endpoint an ActiveMQ JMS queue, configured (by default) without the reconnect-forever option on the connector it references
  • the first time we configure Mule as a service, the ActiveMQ broker is up, so the application is deployed fine and works as expected
  • for some maintenance activity we restart the server. Suppose the Mule service comes up before ActiveMQ; the application start-up will fail and the application will be deleted. The failure is due to the fact that Mule tries to activate the connector even before the flow that references it is started.
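A way to avoid this would be the reconnect-forever retry policy on the JMS connector; a sketch of Mule 3 configuration (untested here; the connector name and broker URL are placeholders):

```xml
<jms:activemq-connector name="amqConnector"
                        brokerURL="tcp://localhost:61616"
                        specification="1.1">
  <!-- keep retrying the broker connection instead of failing the deployment -->
  <reconnect-forever frequency="5000"/>
</jms:activemq-connector>
```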