Sunday, May 26, 2013

Doing JSON on CXF

Doing RESTful web services with Apache CXF can be an adventure. Unfortunately, this adventure once again requires some ploughing through mud. Let's look at a very simple JAX-RS resource implementing the canonical order-item example:
@Path("/orders")
public class OrderResource {

 @GET
 @Produces(MediaType.APPLICATION_JSON)
 public List<Order> getOrders() {
  List<Order> orders = new ArrayList<>();
  orders.add(new Order(123L, new Item("Foo", 2)));
  //orders.add(new Order(456L, new Item("Foo", 1), new Item("Bar", 3)));
  return orders;
 }

 public static class Order {

  private long id;
  private List<Item> items = new ArrayList<>();

  public Order(long id, Item... toAdd) {
   this.id = id;
   this.items.addAll(Arrays.asList(toAdd));
  }

  public long getId() {
   return id;
  }

  public List<Item> getItems() {
   return Collections.unmodifiableList(items);
  }
 }

 public static class Item {

  private String name;

  private int quantity;

  public Item(String name, int quantity) {
   this.name = name;
   this.quantity = quantity;
  }

  public String getName() {
   return name;
  }

  public int getQuantity() {
   return quantity;
  }
 }
}
The next step is deploying this on a Java EE 6 application server. I used GlassFish, which internally uses Jersey as its JAX-RS implementation and Jackson as its JSON provider. The resource spits out the following JSON:
[
  {
    "id":123,
    "items":[
      {
        "name":"Foo",
        "quantity":2
      }
    ]
  }
]
That's pretty much what you would expect: a list of orders which are made up of an id and a list of items. Uncommenting the second order in the Java code above confirms that the structure is consistent:
[
  {
    "id":123,
    "items":[
      {
        "name":"Foo",
        "quantity":2
      }
    ]
  },
  {
    "id":456,
    "items":[
      {
        "name":"Foo",
        "quantity":1
      },
      {
        "name":"Bar",
        "quantity":3
      }
    ]
  }
]
So far so good! Let's repeat this exercise and deploy the resource on CXF. The JSON with a single order now looks like this:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    }
  ]
}
That's not quite what we expected: we've got a wrapping JSON object with a list of orders inside it under the key "order"! Furthermore, the list of items inside the order no longer appears to be a list! Note that there are no square brackets surrounding it. Getting back the list of two orders makes things even more surprising:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    },
    {
      "id":456,
      "items":[
        {
          "name":"Foo",
          "quantity":1
        },
        {
          "name":"Bar",
          "quantity":3
        }
      ]
    }
  ]
}
Wow, the list of items is back inside the order structure! Apparently you get a list if you have multiple items, and just a single item object if you have only one. That's pretty bad: the JSON structure for an order is no longer consistent, which of course makes parsing it a headache.

The reason for all of this madness is the fact that CXF is at its core an XML framework (notice the X in CXF). Doing JSON with CXF involves a bit of trickery. Internally, CXF will first use JAXB to marshal your objects into XML. Actually, I was cheating earlier: you can't directly deploy the OrderResource class shown above on CXF. First you'll have to add JAXB annotations, getters and setters, and other such frivolities to appease JAXB:

@XmlRootElement
public class Order {

 private long id;
 private List<Item> items = new ArrayList<>();

 public long getId() {
  return id;
 }

 public void setId(long id) {
  this.id = id;
 }

 public List<Item> getItems() {
  return items;
 }

 public void setItems(List<Item> items) {
  this.items = items;
 }
}
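For completeness: the Item class needs the same treatment, dropping the two-argument constructor (so JAXB gets a no-arg constructor back) and adding setters. A minimal sketch of what that might look like:
public class Item {

 private String name;
 private int quantity;

 public String getName() {
  return name;
 }

 public void setName(String name) {
  this.name = name;
 }

 public int getQuantity() {
  return quantity;
 }

 public void setQuantity(int quantity) {
  this.quantity = quantity;
 }
}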
JAXB produces XML but we want JSON! To resolve this conundrum CXF uses a StAX implementation called Jettison, which does not actually write XML but instead outputs JSON. Clever! The downside is that the JSON is produced from XML, not from Java objects. Consequently, Java type information is no longer available when the JSON is written out. Looking at the XML produced by JAXB for an order with a single item clarifies things:
<orders>
  <order>
    <id>123</id>
    <items>
      <name>Foo</name>
      <quantity>2</quantity>
    </items>
  </order>
</orders>
Looking at this XML, you have no way of knowing that there can be multiple <items> elements. Since Java type information is no longer available, Jettison cannot see that items is actually a java.util.List, and consequently it doesn't produce a JSON array for the single item:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    }
  ]
}
If you have an order with multiple items, multiple <items> elements will be present in the XML and as a result the JSON will contain a list.

This type of translation from XML to JSON is called the mapped convention. CXF (or rather Jettison) also supports the BadgerFish convention, but it's much more esoteric.

By default CXF uses Jettison to produce JSON. That can be useful if you're relying on JAXB for other reasons, but if you're just doing JAX-RS with JSON you're probably better off configuring CXF to use Jackson instead. Luckily this is easy to do.
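One way to do that (a sketch, not the only option) is to register Jackson's JAX-RS provider when bootstrapping the endpoint with CXF's JAXRSServerFactoryBean. The class names below assume the Jackson 1.x jackson-jaxrs module is on the classpath, and the address is just an example:
// uses org.apache.cxf.jaxrs.JAXRSServerFactoryBean and
// org.codehaus.jackson.jaxrs.JacksonJsonProvider (Jackson 1.x)
JAXRSServerFactoryBean factory = new JAXRSServerFactoryBean();
factory.setResourceClasses(OrderResource.class);
factory.setProviders(Collections.singletonList(new JacksonJsonProvider()));
factory.setAddress("http://localhost:9000/"); // example address
factory.create();
If you configure CXF through Spring, the equivalent is registering the provider bean under <jaxrs:providers> in your endpoint definition. Either way the JSON is then produced directly from your Java objects, and the Jettison quirks described above disappear.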

Thursday, February 14, 2013

@Transactional mud

More wading through mud today at work, @Transactional mud on this occasion. The @Transactional annotation is Spring's way of specifying that a method needs transactional semantics. It sounds simple enough: all you need is <tx:annotation-driven/> in your application context to tell Spring you'll be using annotations to demarcate transactions, and a few of those @Transactional annotations sprinkled around your code, like so:
@Service
@Transactional(rollbackFor = Throwable.class, readOnly = true)
public class MyService {

  @Transactional(readOnly = false)
  public void doStuff() throws SomeCheckedException {
    // ...
  }
}
Following the principle of least astonishment, I assumed the code above defined the doStuff() method as running in a read-write transaction (readOnly = false) which rolls back for all types of exceptions (rollbackFor = Throwable.class). The cool thing here is that you can use a class-level annotation to set up useful defaults while the method-level annotation adjusts settings as required for a particular method.

It took quite a bit of head scratching to realize this is not actually true! With the above code, no rollback would occur if doStuff() threw SomeCheckedException! The reason is that the transaction attributes in the method-level annotation replace those in the class-level annotation, rather than the two being merged. If you think about it this makes sense: Spring has no way of knowing at run-time whether you explicitly specified rollbackFor on the annotation or just used the default value! Since the default rollback behavior is to not roll back on checked exceptions, this can give quite surprising results.
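In other words, if you want the class-level defaults plus a method-level override, you have to repeat the attributes you care about on the method-level annotation. Spelled out fully, the service behaves the way I originally expected:
@Service
@Transactional(rollbackFor = Throwable.class, readOnly = true)
public class MyService {

  // rollbackFor is repeated because the method-level annotation
  // replaces the class-level one instead of being merged with it
  @Transactional(rollbackFor = Throwable.class, readOnly = false)
  public void doStuff() throws SomeCheckedException {
    // ...
  }
}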

Lesson learned: check!

Tuesday, January 8, 2013

Check caching of HTTP resources

Setting up HTTP caching can be a bit of a pain. Essentially the HTTP response needs to contain appropriate cache control headers: either an Expires header or a Cache-Control max-age directive. Here's a quick Java program using HttpClient and HttpClient Cache (both part of Apache HttpComponents) to test client-side caching of HTTP responses.
public class TestHttpCaching {

 public static void main(String[] args) throws Exception {
  // configure the cache to accept arbitrarily large responses
  CacheConfig config = new CacheConfig();
  config.setMaxObjectSize(Long.MAX_VALUE);
  HttpClient httpClient = new CachingHttpClient(config);

  // request the URL given on the command line twice;
  // the second request should be answered from the cache
  doGetRequest(httpClient, args[0]);
  doGetRequest(httpClient, args[0]);
 }

 private static void doGetRequest(HttpClient httpClient, String url) throws Exception {
  HttpGet httpGet = new HttpGet(url);
  HttpContext httpContext = new BasicHttpContext();
  try {
   System.out.print("Getting... ");
   System.out.print(httpClient.execute(httpGet, httpContext).getStatusLine());
   System.out.println(": " + httpContext.getAttribute(CachingHttpClient.CACHE_RESPONSE_STATUS));
  } finally {
   httpGet.releaseConnection();
  }
 }
}
The standard DefaultHttpClient does not do any caching. Instead you have to use the CachingHttpClient, which wraps a DefaultHttpClient and adds caching functionality. Also notice how the cache is configured with a very large maximum object size. This prevents resources from being skipped simply because they are too large for the standard in-memory cache (BasicHttpCacheStorage).
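When pointed at a resource that does send proper cache control headers, you'd expect output roughly like the following, with the second request answered from the cache (the exact status line depends on the server):
Getting... HTTP/1.1 200 OK: CACHE_MISS
Getting... HTTP/1.1 200 OK: CACHE_HIT
If the second request also reports CACHE_MISS, the response most likely lacks an Expires header or a Cache-Control max-age directive.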

Monday, December 31, 2012

HTTP basic authentication in Grails with Spring Security

Setting up HTTP basic authentication in Grails using Spring Security is pretty straightforward. Here's a quick how-to:
  1. grails create-app basicauthdemo
  2. cd basicauthdemo
  3. grails install-plugin spring-security-core
  4. grails s2-quickstart basicauthdemo User Role
  5. Edit grails-app/conf/Config.groovy and add two lines telling the Spring Security plugin to use HTTP basic authentication:
    grails.plugins.springsecurity.useBasicAuth = true
    grails.plugins.springsecurity.basic.realmName = "HTTP Basic Auth Demo"
    
  6. Edit grails-app/conf/BootStrap.groovy to set up a user and role:
    import basicauthdemo.*
    
    class BootStrap {
    
        def init = { servletContext ->
            def userRole = Role.findByAuthority("ROLE_USER") ?: new Role(authority: "ROLE_USER").save(flush: true)
            def user = User.findByUsername("tst") ?: new User(username: "tst", password: "foo", enabled: true).save(flush: true)
            UserRole.create(user, userRole, true)
        }
        def destroy = {
        }
    }
    
  7. grails create-controller hello
  8. Edit grails-app/controllers/basicauthdemo/HelloController.groovy and add a security annotation:
    package basicauthdemo
    
    import grails.plugins.springsecurity.Secured
    
    class HelloController {
    
        @Secured(['ROLE_USER'])
        def index() {
            render "Hello World!"
        }
    }
    
  9. grails run-app
  10. Open http://localhost:8080/basicauthdemo/hello
Presto!
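If you'd rather verify from the command line, a request without credentials should come back as a 401, while one using the user created in BootStrap.groovy should get a 200:
curl -i http://localhost:8080/basicauthdemo/hello
curl -i -u tst:foo http://localhost:8080/basicauthdemo/hello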

Monday, December 17, 2012

Watching digital HD TV without a Telenet decoder

And now for something completely different. If you live in Flanders and use Telenet as your cable TV provider, it might be interesting to know that you can watch quite a few digital HD channels without using a Telenet decoder (either an HD Digibox or HD Digicorder; Telenet also has a FAQ article documenting this). All you need is:
  • A Telenet cable TV subscription
  • A television set with DVB-C tuner
Since every somewhat recent HD TV has a built-in DVB-C tuner, these requirements aren't very demanding. Just have your TV do a full scan of the digital spectrum. That takes quite a while and will find more than 200 channels. Of all those channels, you can currently watch the following unencrypted in their full digital HD 720p magnificence:
  • één HD
  • Canvas HD
  • Ketnet/OP12
  • France 3
  • Arte Belgique
  • TV5 Monde
  • Actua TV
  • CNBC Europe
  • BBC World
  • CNN
Besides these TV channels, you'll also be able to listen to a number of digital radio stations (STU BRU, ...). Of course there is one caveat: you won't be able to use interactive features such as Net Gemist.

Saturday, November 10, 2012

Getting started with Spring Shell

I thought I'd write up a quick getting-started guide for Spring Shell. Spring Shell is the new Spring portfolio project that helps you build easy-to-use command line interfaces for whatever commands you provide. Commands can really be anything. For instance, on most projects the developers end up writing a bunch of tools and utilities automating tedious tasks such as setting up a database schema, scanning through log files or doing some code generation. All of these would be perfect examples of what a Spring Shell command can be. Using Spring Shell to house all of your tooling and utility commands in a coherent shell makes them self-documenting and easier to use for other developers on the team.

Let's set up a trivial Spring Shell application to get started. In follow-up posts I'll cover more advanced Spring Shell functionality. I'll be using Maven in this example because I think that's what most people are familiar with (Spring Shell itself is built with Gradle).

Here's the POM for the spring-shell-demo project (available on GitHub):

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
 
 <modelVersion>4.0.0</modelVersion>
 <groupId>com.ervacon</groupId>
 <artifactId>spring-shell-demo</artifactId>
 <packaging>jar</packaging>
 <version>1.0-SNAPSHOT</version>
 <name>Spring Shell Demo</name>

 <properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
 </properties>

 <repositories>
  <!-- see https://jira.springsource.org/browse/SHL-52 -->
  <repository>
   <id>ext-release-local</id>
   <url>http://repo.springsource.org/simple/ext-release-local/</url>
  </repository>
 </repositories>

 <dependencyManagement>
  <dependencies>
   <dependency>
    <groupId>org.springframework.shell</groupId>
    <artifactId>spring-shell</artifactId>
    <version>1.0.0.RELEASE</version>
   </dependency>
  </dependencies>
 </dependencyManagement>
 
 <dependencies>
  <dependency>
   <groupId>org.springframework.shell</groupId>
   <artifactId>spring-shell</artifactId>
  </dependency>
 </dependencies>
 
 <build>
  <plugins>
   <!-- copy all dependencies into a lib/ directory -->
   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.1</version>
    <executions>
     <execution>
      <id>copy-dependencies</id>
      <phase>prepare-package</phase>
      <goals>
       <goal>copy-dependencies</goal>
      </goals>
      <configuration>
       <outputDirectory>${project.build.directory}/lib</outputDirectory>
       <overWriteReleases>true</overWriteReleases>
       <overWriteSnapshots>true</overWriteSnapshots>
       <overWriteIfNewer>true</overWriteIfNewer>
      </configuration>
     </execution>
    </executions>
   </plugin>
   
   <!-- make the jar executable by adding a Main-Class and Class-Path to the manifest -->
   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    <configuration>
     <archive>
      <manifest>
       <addClasspath>true</addClasspath>
       <classpathPrefix>lib/</classpathPrefix>
       <mainClass>org.springframework.shell.Bootstrap</mainClass>
      </manifest>
     </archive>
    </configuration>
   </plugin>
  </plugins>
 </build>

</project>

There's quite a bit going on in this POM. The first thing to note is the declaration of the ext-release-local SpringSource repository. This is needed to resolve one of Spring Shell's dependencies: JLine (check SHL-52 for more background info). Hopefully this will no longer be necessary in future versions of Spring Shell. Next there's a dependency on Spring Shell itself. No surprise there. Finally, the POM customizes the build by copying all dependencies into a lib/ directory and adding Main-Class and Class-Path entries to the manifest of the generated jar file. Doing this produces the following directory structure in your target folder:

target/
 spring-shell-demo-1.0-SNAPSHOT.jar
 lib/
  all dependencies
You can simply package this up and distribute your shell. It will be fully self-contained and launching it is trivial:
java -jar spring-shell-demo-1.0-SNAPSHOT.jar

Hold on there, we're getting ahead of ourselves. Before launching the shell let's first add an echo command that just prints its input text back out to the console. Here's the code, which lives in the com.ervacon.ssd package:

@Component
public class DemoCommands implements CommandMarker {

 @CliCommand(value = "echo", help = "Echo a message")
 public String echo(
   @CliOption(key = { "", "msg" }, mandatory = true, help = "The message to echo") String msg) {
  return msg;
 }
}

As you can see, a Spring Shell command is just a @CliCommand annotated method on a Java class tagged with the CommandMarker interface. Our echo method takes a single argument which will be a mandatory @CliOption for the command. By using both the empty string and msg as keys for the option, you'll be able to invoke the command both as echo test and echo --msg test. The method simply returns the message: Spring Shell will make sure it gets printed to the console.

Alright, we've got our command implemented! We still have to tell Spring Shell about it. Since Spring Shell is Spring-based, adding a command simply means defining a Spring bean. On startup, Spring Shell will automatically load the application context defined in classpath:/META-INF/spring/spring-shell-plugin.xml. You typically set up component scanning and simply mark your command classes as @Components, making development of new commands trivial since the shell will automatically detect them. Here's the application context definition:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xmlns:context="http://www.springframework.org/schema/context"
 xsi:schemaLocation="
  http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
  http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.1.xsd">

 <context:component-scan base-package="com.ervacon.ssd" />

</beans>

And that's it! We're ready to build and launch our shell:

mvn package
java -jar target/spring-shell-demo-1.0-SNAPSHOT.jar
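
Once the shell is up you can take the echo command for a spin. The prompt shown below is just illustrative (the actual text depends on your Spring Shell configuration), but the interaction should look roughly like this:
spring-shell> echo --msg "Hello World"
Hello World
spring-shell> echo test
test
spring-shell> exit
Built-in commands such as help and exit come for free, so you immediately get a usable shell around your own commands.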

Thursday, October 25, 2012

More mud: SimpleDateFormat and the Gregorian calendar

I spent another few hours wading through mud at work the other day, SimpleDateFormat related mud this time around. You would assume "yyyy/MM/dd HH:mm z" to be a pretty solid date format for representing a timestamp with minute precision:
  • yyyy/MM/dd: the date, e.g. "2012/10/23"
  • HH:mm: the time in 24-hour format, e.g. "18:32"
  • z: the timezone, e.g. "CET"
This date format can be used to store dates in a text file (or any other textual format for that matter, e.g. XML). Let's go one step further and always use UTC as the time zone for the dates stored in our text file. That avoids all confusion when parsing the file again: all dates are expressed in UTC. Here's a bit of code that does what we need: it takes an input Date object and formats it as a UTC date string, then parses that string again and verifies that the input and output are the same.
Date input = new Date(-62135773200000L); // "0001/01/01 00:00 CET"

SimpleDateFormat utc = new SimpleDateFormat("yyyy/MM/dd HH:mm z");
utc.setTimeZone(TimeZone.getTimeZone("UTC"));

String str = utc.format(input);

Date output = utc.parse(str);

if (input.equals(output)) {
 System.out.println("Equal!");
} else {
 System.out.println(input + " != " + output);
}
For most dates this would print Equal!. However, I used a special date in the code above: midnight on January 1st of the year 1 expressed in CET. The output on my machine is:
Sat Jan 01 00:00:00 CET 1 != Sun Jan 01 00:00:00 CET 2
What!? Year 1 became 2? Let's look at the string produced by the date formatting:
0001/12/31 23:00 UTC
The time becomes 23:00 UTC because CET is one hour ahead of UTC. To understand why the day jumped to December 31st, you have to realize that the Gregorian calendar, on which UTC and CET are based, does not have a year 0. This means the timeline looks like this (BC is Before Christ, AD is Anno Domini, the era indicator in SimpleDateFormat terms):
..., 2 BC, 1 BC, 1 AD, 2 AD, ...

Our input date, January 1st of the year 1 CET, is in the AD era. Moving from midnight CET to 23:00 UTC takes us into the previous day: December 31st of the year 1 BC. However, our date format does not encode the era (BC / AD), so when the string is parsed, year 1 is again assumed to be AD, which leads to the rather surprising result that year 1 AD becomes year 2 AD after the UTC to CET time adjustment.

Luckily fixing the problem is easier than understanding it! You simply need to add the era indicator to the date format pattern: "yyyy/MM/dd HH:mm z G", and the above code will work as expected.
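For reference, here's the relevant part of the code with the fixed pattern. With the era included, the formatted string becomes something like "0001/12/31 23:00 UTC BC" (the exact era text depends on the locale), and parsing it yields the original Date again:
SimpleDateFormat utc = new SimpleDateFormat("yyyy/MM/dd HH:mm z G"); // G = era designator
utc.setTimeZone(TimeZone.getTimeZone("UTC"));

String str = utc.format(input); // e.g. "0001/12/31 23:00 UTC BC"
Date output = utc.parse(str);   // now equal to input again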

PS: This problem popped up during an XStream version upgrade. Check XSTR-556 and XSTR-711 in the XStream JIRA for more information.