Tuesday, 17 September 2013

Apache Web Server, Tomcat AJP: ajp_read_header: ajp_ilink_receive failed


An Apache Web Server was configured to proxy requests to a web application running on Tomcat (7.0.39) over AJP. The applications were installed on virtual machines in a cloud environment. On completion of a load test (running for more than 24 hours), the system became unresponsive: whenever a request was made to Apache, an HTTP 503 status code was returned. The Tomcat logs showed no errors, requests could still be sent over the HTTP channel directly to Tomcat, and CPU and memory consumption were very low. The Apache log files, however, showed errors of the following nature:

[error] ajp_read_header: ajp_ilink_receive failed
[error] (70007)The timeout specified has expired: proxy: read response failed from (

Tomcat could no longer handle any more requests from Apache over AJP and required a restart.


Running 'netstat' for port 8009 on the app server VM showed 200 connections still in an ESTABLISHED state and a further 100 connections in a CLOSE_WAIT state. From this initial analysis, a number of questions arose:

  • How did the number of AJP connections grow so large?
  • Why did the number of connections not close after a period of inactivity?
  • What was preventing Tomcat from accepting any more requests?
Reading the AJP documentation confirmed that by default the AJP connection pool is configured with a size of 200 and an accept count (request queue when all connections are busy) of 100. To confirm the findings, Tomcat was configured with a smaller AJP connection pool (20) and as expected the errors in Apache occurred sooner and more frequently. 
To address the issue, Apache (MaxClients) and Tomcat (maxConnections) were both configured to support 25 concurrent requests. This worked perfectly: Apache no longer returned 503 responses and the log files no longer showed the ajp_ilink errors. The test was then repeated after increasing the connection pool to 50. Running a load test for an hour showed the servers working well, with improved response times and no 503 responses. However, after the test completed, the Tomcat VM still showed 50 connections in an ESTABLISHED state.

A further read of the Tomcat AJP documentation revealed that connections remain open indefinitely until the client closes them. The next thing to try was the 'keepAliveTimeout' attribute on the AJP connector. Ideally, the number of AJP connections should grow as load increases and then reduce back to an optimal number when load decreases. The 'keepAliveTimeout' achieves exactly that by closing any connection that has been inactive for the configured period, and this resolved the issue.


  • Configure Apache 'MaxClients' to be equal to the Tomcat AJP 'maxConnections' configuration.
  • Configure Tomcat AJP 'keepAliveTimeout' to close connections after a period of inactivity.
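For reference, a sketch of the corresponding AJP connector in Tomcat's server.xml; the port is the default, while the maxConnections and keepAliveTimeout values are illustrative and should be tuned to the expected load:

```xml
<!-- server.xml AJP connector sketch; keepAliveTimeout is in milliseconds -->
<Connector port="8009" protocol="AJP/1.3"
           redirectPort="8443"
           maxConnections="50"
           keepAliveTimeout="60000" />
```

Apache's MaxClients (in the worker MPM configuration) should then be set to the same value, so that httpd never opens more AJP connections than Tomcat will accept.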

Tomcat AJP: http://tomcat.apache.org/tomcat-7.0-doc/config/ajp.html
Apache MPM Worker: http://httpd.apache.org/docs/2.2/mod/worker.html

Tuesday, 3 September 2013

Runtime Configuration of log4j properties

log4j is used widely to provide logging in applications. There are numerous articles about how to configure log4j.properties or log4j.xml to enable/disable logging at various levels and categories. Often it is necessary to enable the debug log level for a short duration to identify the cause of a production issue. Without runtime configuration, this would require stopping the application, changing the log configuration (after extracting the properties file from the war) and then redeploying.

This post describes how log4j can be configured outside of the application and also changed at runtime.

File Watchdog

Log4j provides runtime configuration through DOMConfigurator.configureAndWatch for XML files, or PropertyConfigurator.configureAndWatch for properties files. Both methods take the absolute path of the configuration file and a refresh interval. This allows the configuration file to be located outside of a web application's war file and allows administrators to change logging levels at runtime.
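The call itself is a one-liner. A sketch, assuming the configuration lives at an external path such as /etc/myapp/log4j.properties (an illustrative path) and should be re-checked every 60 seconds:

```java
import org.apache.log4j.PropertyConfigurator;

public class LoggingInitializer {

    public static void main(String[] args) {
        // Load the external file now and start a watchdog thread that
        // reloads it whenever it changes; path and interval are illustrative
        PropertyConfigurator.configureAndWatch("/etc/myapp/log4j.properties", 60000L);
    }
}
```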

The configureAndWatch API can be invoked from a custom servlet listener. For users of the Spring framework, there already exists the Log4jConfigListener, which is a wrapper around the Log4j configureAndWatch API. This listener is configured in the web application's web.xml.

Spring Log4jConfigListener

By specifying the log4jConfigLocation to a file outside the web application, it allows different environments (dev, test, prod) to have different levels of logging enabled. The log4jRefreshInterval specifies how often the log4j.properties file should be checked for changes and reloaded. This allows runtime changes to the log4j configuration to be performed.
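A sketch of the web.xml entries, assuming the external configuration lives at /opt/myapp/log4j.properties (an illustrative path) with a 60-second refresh interval:

```xml
<context-param>
    <param-name>log4jConfigLocation</param-name>
    <param-value>file:/opt/myapp/log4j.properties</param-value>
</context-param>
<context-param>
    <param-name>log4jRefreshInterval</param-name>
    <param-value>60000</param-value>
</context-param>
<listener>
    <listener-class>org.springframework.web.util.Log4jConfigListener</listener-class>
</listener>
```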


Log4j also has built-in support for JMX: the classes org.apache.log4j.jmx.LoggerDynamicMBean and org.apache.log4j.jmx.HierarchyDynamicMBean can be used to expose and register loggers as MBeans. However, this still requires wrapper code to add loggers that are not defined at start-up.

Tomcat JMX Connectivity through firewall

JMX is great for monitoring the JVM to identify potential problems with memory and concurrency. Usually this is quite simple, as a locally running JVM can be connected to using tools such as JConsole. When connecting remotely, however, some additional configuration is required.

During a recent project, the team faced this issue while monitoring a web application deployed using Tomcat 7 in a cloud environment.

To enable JMX monitoring for Tomcat 7, the following system properties were configured via setenv.sh:
Tomcat JMX configuration
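A sketch of the setenv.sh entries; the credential file locations under $CATALINA_BASE/conf are illustrative:

```shell
# setenv.sh sketch: enable authenticated remote JMX on port 7099
CATALINA_OPTS="$CATALINA_OPTS \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=7099 \
  -Dcom.sun.management.jmxremote.ssl=false \
  -Dcom.sun.management.jmxremote.authenticate=true \
  -Dcom.sun.management.jmxremote.access.file=$CATALINA_BASE/conf/jmxremote.access \
  -Dcom.sun.management.jmxremote.password.file=$CATALINA_BASE/conf/jmxremote.password"
export CATALINA_OPTS
```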
The above configuration properties specify that JMX remote access should be enabled on port 7099 and authenticated using the credentials found in jmxremote.access and jmxremote.password.

For more information about configuring JMX see the Oracle documentation in the references section.

The firewall on the VPN between the cloud environment and the local network was configured to allow connections on port 7099. However, this still didn't prove successful.

After reading numerous online resources, the problem was identified: when the JMX server starts up, it opens two ports, one for the JMX registry and another, dynamically generated, for the RMI server. The above configuration only specifies the JMX registry port, and there isn't a system property to configure the RMI server port. Both ports need to be opened in the firewall, but as the RMI server port is dynamically chosen, how can the firewall be configured?

One solution is to develop a custom JMX agent and configure the java runtime to use it. This is outlined in the references below.

For Tomcat there is a more elegant solution for out-of-the-box JMX monitoring: the JMXRemoteLifecycleListener allows specifying both the JMX registry port and the RMI server port.

Tomcat JMXRemoteLifecycleListener
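A sketch of the listener entry in server.xml; 7099 and 7100 are example ports, and both must be opened in the firewall:

```xml
<!-- Fixes both the JMX registry port and the RMI server port -->
<Listener className="org.apache.catalina.mbeans.JmxRemoteLifecycleListener"
          rmiRegistryPortPlatform="7099"
          rmiServerPortPlatform="7100" />
```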

The rmiRegistryPortPlatform replaces the use of the com.sun.management.jmxremote.port system property.

The JMXRemoteLifecycleListener requires the deployment of the catalina-jmx-remote.jar in the ${CATALINA_HOME}/lib directory.

Tomcat can now be remotely monitored and managed using the following JMX service URL:
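Assuming the example ports 7099 (registry) and 7100 (RMI server), the URL takes the standard JMX/RMI form:

```text
service:jmx:rmi://<host>:7100/jndi/rmi://<host>:7099/jmxrmi
```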


Thursday, 11 July 2013

Environment Specific Configuration of Web Applications

We've all faced this situation. A web application needs to be deployed to a number of different environments (dev, test, prod) and each of these requires slightly different configuration such as which database to use etc.

Assuming a typical Maven-based Spring application, these configuration parameters are usually held within a properties file in the src/main/resources folder.

sample properties file
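A minimal illustration; the property names and values are made up for the example:

```properties
# src/main/resources/config.properties
database.url=jdbc:mysql://localhost:3306/appdb
database.username=appuser
database.password=secret
```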

These properties are then used to substitute place-holders within the Spring application context files.

sample application context file
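A sketch of how such a file typically references the properties; the dataSource bean and property names are illustrative:

```xml
<context:property-placeholder location="classpath:config.properties"/>

<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="url" value="${database.url}"/>
    <property name="username" value="${database.username}"/>
    <property name="password" value="${database.password}"/>
</bean>
```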

Now how do we provide environment specific configuration for every deployment?

  • Manually unpack the artefact and modify the properties file and then repackage it. This is clearly unacceptable from a maintenance and quality control perspective.
  • Use Maven profiles to include an environment specific properties file at build time.
  • Use Maven filtering in combination with profiles to produce an environment specific build.
  • Use Maven assembly plug-in to generate multiple environment specific artefacts in a single build.
  • Use Spring profiles for runtime configuration.

Our initial approach was to use Maven filtering with profiles and further augment the artefact by using the profile id as a classifier.

Maven filters
There exists a properties file for every required environment (dev, test, prod), but instead of placing these files within the src/main/resources directory, they are placed within the src/main/filters directory. The property files follow a naming convention such as config-{profileid}.

Maven pom filtering configuration
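A sketch of the relevant pom.xml extract; the directory layout follows the naming convention described above:

```xml
<build>
    <!-- Filter file selected by the env property set in the active profile -->
    <filters>
        <filter>src/main/filters/config-${env}.properties</filter>
    </filters>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <filtering>true</filtering>
        </resource>
    </resources>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-war-plugin</artifactId>
            <configuration>
                <!-- Tag the artefact with the environment, e.g. app-1.0-dev.war -->
                <classifier>${env}</classifier>
            </configuration>
        </plugin>
    </plugins>
</build>
```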

The above Maven pom extract specifies that when resources are copied from the source to the target directory, place-holder substitution (filtering) will take place using the environment specific filter file. The artefact built with the maven-war-plugin will also be classified with the environment property (env).

The env variable is a Maven property that is set by the active profile.
Maven profiles
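A sketch of the profiles section; dev is assumed to be the default environment:

```xml
<profiles>
    <profile>
        <id>dev</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <env>dev</env>
        </properties>
    </profile>
    <profile>
        <id>test</id>
        <properties>
            <env>test</env>
        </properties>
    </profile>
    <profile>
        <id>prod</id>
        <properties>
            <env>prod</env>
        </properties>
    </profile>
</profiles>
```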

The above Maven pom extract specifies a number of profiles that simply set the env property, which effectively points to a configuration file containing all the properties. This is a cleaner approach than listing all the environment specific properties within the Maven profile.

The above approach works perfectly well but there are still some deficiencies such as:

  • Requires running the build multiple times for environment specific builds. This is not recommended as the artefact should be built only once and deployed anywhere.
  • Doesn't work well with the Maven release plug-in and automated build tools.
  • Every new environment requires a new profile, new configuration filter and a new artefact.
Taking a step back, we first need to identify and understand the problem we are trying to solve. 

Ideally the configuration should not be tightly coupled to the build but external and applied at runtime. By storing the configuration property file outside of the application in a specified location on the deployment environment, the application can be configured with environment specific properties at runtime.

External Runtime Configuration
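A sketch of the property-placeholder configuration; the external file path is illustrative, and ignore-resource-not-found lets the application fall back to the packaged defaults when no external file exists:

```xml
<!-- Later locations override earlier ones, so the external file wins -->
<context:property-placeholder
    location="classpath:config.properties,file:/etc/myapp/config.properties"
    ignore-resource-not-found="true"/>
```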
This is achieved simply by configuring the Spring application context property-placeholder to load properties from a file location outside the application. Note that there is also a properties file inside the application (src/main/resources) that supplies default properties; the environment specific property files can then override the defaults if required.

This approach is much simpler (no profiles, no filtering, single artefact) than the initial approach and easier to maintain. For additional deployment environments, there is no need to modify the build process; all that is required is a customized properties file in the expected location.

Thursday, 18 April 2013

Spring MVC, Jackson and Filtering JSON Serialization

The default Spring JSON configuration will show all the fields of an object during serialization. Often we would like to return a subset of an entity’s properties and hide some which are only used internally such as the id and version.  Or perhaps return different properties depending on who/what is requesting the data.

As Spring uses Jackson for JSON serialization, it supports the Jackson JSON annotations. Jackson provides a number of options for "narrowing" or "filtering" the properties of an entity that get serialized:

  • Jackson JsonViews (requires adding @JsonView to entity classes, which pollutes them)
  • Jackson mix-ins (requires extending the entity to specify @JsonIgnoreProperties)

Jackson Filtering options

Example of using Jackson views and mixin for filtering:

One approach creates a new annotation to use with JsonViews, but it is a little too involved for my liking as it requires changing some Spring classes.

This link provides a good demo of Jackson Views and Mixins

Within my SearchController I was returning a SearchSummary object that referenced a Collector object. However, I didn't want to return all the Collector details within the SearchSummary as it's not relevant. Shown below is all I required:

public class SearchController {

    // Mix-in target: carries the Jackson annotations that restrict
    // which Collector properties are serialized
    private static class CollectorIdOnlyView extends Collector {
        // Empty by design ...
    }

    private final ObjectMapper objectMapper = new ObjectMapper();
    private final MappingJacksonJsonView view = new MappingJacksonJsonView();

    private AccountService accountService;
    private SearchService searchService;

    public SearchController() {
        objectMapper.getSerializationConfig().addMixInAnnotations(Collector.class, CollectorIdOnlyView.class);
        view.setObjectMapper(objectMapper);
    }

    @RequestMapping(value = "/search/{id}", method = RequestMethod.GET)
    public View getSummary(@PathVariable("id") String id, Model uiModel) {
        Collector collector = accountService.findCollectorById(id);
        if (collector == null) {
            throw new RuntimeException("Collector Not Found");
        }
        SearchSummary searchSummary = searchService.getSearchSummary(collector);
        uiModel.addAttribute(searchSummary);
        return view;
    }
}

For the filtering I had to create a subclass of my entity (CollectorIdOnlyView) to define the properties I wanted to ignore. This is then used to configure the ObjectMapper and View.
The changes I had to make to the getSummary method were to change the return type from SearchSummary to View, add the SearchSummary to the Model and remove the @ResponseBody annotation.

Tuesday, 22 January 2013

Spring Roo Updating two entities from one View

Following on from the last post about Spring Roo, this post looks at how to customize the views and controller to update multiple entities from a single view.

Using the web mvc add-on, Spring Roo generates views with CRUD functionality for each entity. For a given entity, Roo will typically generate views for create, show, list and update that map to methods on the generated Spring controller. By specifying attributes to the RooWebScaffold annotation, it is possible to control which views get generated.

However, there are some situations where it may be desirable to allow updating multiple entities (parent-child) from within the same view, perhaps to provide a better user experience. For example, consider a domain model with Business and User entities having a one-to-one relationship to an Address entity (managed from the Business/User). When creating a new Business or User entity, it would provide a better user experience to allow entering all the Business or User details and Address details from a single view, rather than having to create the Address from a separate view and then select the address identifier from a drop-down list within the create Business/User view (as would be the default behaviour with Roo generation).

To achieve this behaviour, the create.jspx view can be augmented with additional fields to enter the address details. Note the use of the dot-notation to navigate to the nested property of the address object.

    <form:create id="fc_com_changesoft_samples_domain_Business" modelAttribute="business" path="/businesses" render="${empty dependencies}" z="0qN2h7loptVj50JEP4Htm9WcQ2U=">
        <field:select field="businessType" id="c_com_changesoft_samples_domain_Business_businessType" items="${businesstypes}" path="businesstypes" z="lbHfbOekwe5M/bFdLTQsGY43NTs="/>
        <field:input field="businessName" id="c_com_changesoft_samples_domain_Business_businessName" z="H/Qzpi7gDgY45Ag3tLqLZiYL9wY="/>
        <field:input field="registrationNumber" id="c_com_changesoft_samples_domain_Business_registrationNumber" z="v0EWCe/zpM89BvaLz1O6+1p1ZEs="/>
        <field:input field="businessDescription" id="c_com_changesoft_samples_domain_Business_description" z="oly643Jm218C3sMXDHNnxbCaKC8="/>
        <field:input field="address.addressLine" id="c_com_changesoft_samples_domain_Address_addressLine" z="nDT/wm1FdsgfzEXLv9zJkmZybmg="/>
        <field:select field="address.addressType" id="c_com_changesoft_samples_domain_Address_addressType" items="${addresstypes}" path="addresstypes" z="aYI9UiQIlRB+F0Bwu93/eo4ZFAc="/>
        <field:input field="address.city" id="c_com_changesoft_samples_domain_Address_city" z="FVwmWn28G7JtqIfUIvAdZORMVkM="/>
        <field:input field="address.postcode" id="c_com_changesoft_samples_domain_Address_postcode" z="eezB9GBw9FiIb+ISiJosM7zXbi4="/>
    </form:create>

This results in the following rendered jsp view.

When the form is submitted, it correctly persists a new Business with a new Address entity (due to the address attribute of Business having cascading behaviour).

The update.jspx view requires slightly more work. When the update form of an entity is submitted, the id and version attributes are also present as hidden form fields and sent as request parameters to identify and retrieve the detached entity and update it. Therefore when updating 2 entities from a single view, it is also necessary to encapsulate the id and version attributes of the Address entity as hidden form fields; and subsequently submit them along with the Business attributes. The jsp tags that are provided by the web mvc add-on do not cater for hidden form fields. Therefore to fulfil this requirement it was necessary to create a custom tag that would generate an html hidden form field that could then be used within the update view jsp.

  <c:set value="hidden" var="type" />

  <c:choose>
    <c:when test="${disableFormBinding}">
      <input id="_${sec_field}_id" name="${sec_field}" type="${fn:escapeXml(type)}" />
    </c:when>
    <c:otherwise>
      <form:hidden id="_${sec_field}_id" path="${sec_field}" />
      <br />
      <form:errors cssClass="errors" id="_${sec_field}_error_id" path="${sec_field}" />
    </c:otherwise>
  </c:choose>

With the new hidden.tagx custom tag file, the update view for the Business entity is adjusted to mark any required fields as hidden:

    <field:hidden field="address.id" id="c_com_changesoft_samples_domain_Address_Id" z="user-managed"/>
    <field:hidden field="address.version" id="c_com_changesoft_samples_domain_Address_Version" z="user-managed"/>

Using the new hidden field tags, the update view renders as follows:

This view allows updating both the Business and Address entities by correctly sending the id and version properties for each entity as hidden form fields, shown below in the HTML source.

<div style="display: none;" id="_c_com_changesoft_samples_domain_Address_Id_id">
  <input id="_address.id_id" name="address.id" type="hidden" value="1"/>
  <br />
</div>
<div style="display: none;" id="_c_com_changesoft_samples_domain_Address_Version_id">
  <input id="_address.version_id" name="address.version" type="hidden" value="2"/>
  <br />
</div>
<div id="_c_com_changesoft_samples_domain_Address_addressLine_id">
  <label for="_address.addressLine_id">Address Line</label>
  <input id="_address.addressLine_id" name="address.addressLine" type="text" value="119 Middlesex Street"/>
</div>

This works quite well without requiring any custom code within the controller.

Spring Roo MVC Read-Only Inputs

I've recently been using Spring Roo quite regularly and also recommended it as part of the development tool stack at a client to boost productivity. It provides a lot of nice features and generates a lot of boiler-plate code for CRUD functionality. Obviously it has its limitations and when you start trying to do things differently then Roo will start having problems. Nevertheless, as long as you are willing to accept the limitations and customize where necessary, you can use it to your benefit.

Recently, I came across an issue where I wanted to update an entity from the web UI, but have some fields marked as readonly so that the user cannot modify them from the update view. Unfortunately, the custom JSP tags that the Spring Roo web mvc add-on provides don't support readonly input fields. The input.tagx provides a number of attributes such as render, disableFormBinding and disabled, but none of these fulfilled the requirement. Using the disabled attribute marks the input field as disabled, which renders fine and prevents editing of the field, but the field also doesn't get submitted with the form; consequently, the entity's validation fails or the field's value is replaced by an empty value.

The solution I found was to create a new inputreadonly tag by copying and customizing the input.tagx file.

I added a new jsp directive:

<jsp:directive.attribute name="readonly" type="java.lang.Boolean" required="false" rtexprvalue="true" description="Specify if this field should be readonly" />

and modified the body where it generates the input field:

<c:choose>
  <c:when test="${readonly eq true}">
    <form:input disabled="${disabled}" id="_${sec_field}_id" path="${sec_field}" readonly="${readonly}" />
  </c:when>
  <c:otherwise>
    <form:input disabled="${disabled}" id="_${sec_field}_id" path="${sec_field}" />
  </c:otherwise>
</c:choose>

Then I modified the update.jspx file to use the new jsp tag.

<field:inputreadonly field="businessType" id="c_com_changesoft_samples_domain_Business_businessType" z="user-managed" readonly="true"/>

The result is an HTML read-only input field.