Saturday, April 25, 2015

Ubuntu 15.04 on Lenovo X1 Carbon 3rd Generation

Following up on my previous post, where I talked about installing Ubuntu 14.10 on my Lenovo X1 Carbon 3rd generation, I've now gone ahead and reinstalled the machine from scratch using the recently released Ubuntu 15.04.

I'm happy to tell you installation is now a complete breeze:

  • No more USB startup disk problems.
  • The brightness function keys work out-of-the-box.
  • The trackpoint buttons work out-of-the-box.
  • No more graphics glitches (mainly text rendering) in Unity.

It even seems that the default font sizes are better suited for a WQHD display, although I can't actually confirm anything has changed here compared to 14.10. In the end, the only things I did were setting the "Scale for menu and title bars" to 1,25 in the display settings and changing the "Page zoom" to 125% in Google Chrome.

In summary: Ubuntu 15.04 just works on the 3rd generation Lenovo X1 Carbon. No tweaking really required. I highly recommend this combination for those looking for a new Linux-based laptop!

Saturday, February 28, 2015

Ubuntu 14.10 on Lenovo X1 Carbon 3rd Generation

I just switched my main laptop from an Asus Zenbook Prime UX31A to a Lenovo X1 Carbon 3rd generation (the 2015 model). Since I always run Ubuntu, I started by wiping the hard disk and installing Ubuntu 14.10 as the main operating system.

Although installing Linux on brand new hardware like the Lenovo X1 Carbon can be a hairy adventure, it turned out to be a pretty smooth ride. I thought I'd share what I had to do to get to an (almost) fully functional system.

  • The first snag I hit was trying to boot the Ubuntu 14.10 startup USB disk. I was greeted with a somewhat unnerving "gfxboot.c32: not a COM32R image" message and a "boot:" prompt. A quick search brought me to Ask Ubuntu: simply type "live" at the prompt and hit enter. You'll boot into the Live CD where you can start the installation.
    I haven't seen other people running Ubuntu 14.10 on their X1 Carbon complain about this, so I'm not sure why I ran into this problem.
  • The next issue I faced was the brightness function keys not working. Some more web searching revealed an Arch Linux thread compiling a number of issues people had encountered trying to run Linux on the 3rd generation X1 Carbon. Fixing the function key problem was again easy: just force the thinkpad_acpi module to load (run the following as root):
    echo thinkpad_acpi > /etc/modules-load.d/thinkpad_acpi.conf
    echo "options thinkpad_acpi force_load=1" > /etc/modprobe.d/thinkpad_acpi.conf 
    
  • My laptop sports a fancy 2560 x 1440 WQHD display. That's great and all, but with the default Ubuntu fonts text gets really tiny. Working around that involved installing the unity-tweak-tool and setting the Text scaling factor to "1,20" (explained here). I also set the default "Page zoom" to 125% in Google Chrome.

And that's about it as far as I'm concerned. There are a few more things not working properly, like the trackpoint buttons or the fingerprint reader, but these are things I never use so I didn't bother fixing them.

So far the system has been really sweet! The hardware seems up to par with my previous ThinkPad, which I owned many years ago (a real IBM ThinkPad X40): really rugged feel and a superb keyboard. Combine this with good Linux support and you've got a winning combination!

Wednesday, October 1, 2014

SWF Humor

Funny little blast from the (Spring Web Flow) past. Back in 2006, while working on Spring Web Flow 1.0, we needed an exception class signaling a failure to parse a flow execution key. Keith Donald and I jokingly considered calling it the FuckedUpFlowExecutionKeyException but ultimately settled for BadlyFormattedFlowExecutionKeyException. Indeed, it would be quite painful if a user was presented with a FuckedUpFlowExecutionKeyException stack trace after having accidentally corrupted the key in the URL of his browser. :-)

The joke continued for a little bit with the following piece of JavaDoc that you can find in the Spring Web Flow 1.0 source code:

package org.springframework.webflow.execution.repository;

/**
 * Thrown when an encoded flow execution key is badly formatted and could not be
 * parsed. We debated calling this the FuckedUpFlowExecutionKeyException.
 * 
 * @author Keith Donald
 * @author Erwin Vervaet
 */
public class BadlyFormattedFlowExecutionKeyException extends FlowExecutionRepositoryException {
This was not enough, however: a particularly sensitive person actually complained about Spring Web Flow using offensive language! In the end this bit of nostalgia was kicked out in the 1.0.1 release. It still makes for a fun anecdote, though! ;-)

Monday, June 16, 2014

Puzzling loop

Another fun Java puzzler: the kind of thing you'd think you would never encounter in the wild, but I happened to run across something similar a few days ago:
public class Puzzler {

 public static void main(String[] args) throws Exception {
  PuzzleFunction f = new PuzzleFunction();
  for (int i = 0; i < f.invoke(); i++) {
   System.out.println(i);
  }
 }

 public static class PuzzleFunction {

  private int invocationCount;

  public int invoke() {
   invocationCount++;
   return invocationCount < 2 ? 2 : 5;
  }
 }
}
What does this print? In other words: does the loop boundary (f.invoke()) get re-evaluated on every iteration? Of course it does! :-)
0
1
2
3
4
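To see why, it helps to desugar the for loop: the loop condition, including the call to f.invoke(), is re-evaluated before every single iteration. A small sketch (not in the original post) equivalent to the loop above:
public class PuzzlerDesugared {

 public static void main(String[] args) {
  // Equivalent to the for loop in Puzzler.main(): invoke() ends up being
  // called six times in total; the first call returns 2, every later call
  // returns 5, so the loop prints 0 through 4.
  Puzzler.PuzzleFunction f = new Puzzler.PuzzleFunction();
  int i = 0;
  while (i < f.invoke()) {
   System.out.println(i);
   i++;
  }
 }
}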

Sunday, May 26, 2013

Doing JSON on CXF

Doing RESTful web services using Apache CXF can be an adventure. Unfortunately this adventure again requires some ploughing through mud. Let's look at a very simple JAX-RS resource implementing the canonical order-item example:
@Path("/orders")
public class OrderResource {

 @GET
 @Produces(MediaType.APPLICATION_JSON)
 public List<Order> getOrders() {
  List<Order> orders = new ArrayList<>();
  orders.add(new Order(123L, new Item("Foo", 2)));
  //orders.add(new Order(456L, new Item("Foo", 1), new Item("Bar", 3)));
  return orders;
 }

 public static class Order {

  private long id;
  private List<Item> items = new ArrayList<>();

  public Order(long id, Item... toAdd) {
   this.id = id;
   this.items.addAll(Arrays.asList(toAdd));
  }

  public long getId() {
   return id;
  }

  public List<Item> getItems() {
   return Collections.unmodifiableList(items);
  }
 }

 public static class Item {

  private String name;

  private int quantity;

  public Item(String name, int quantity) {
   this.name = name;
   this.quantity = quantity;
  }

  public String getName() {
   return name;
  }

  public int getQuantity() {
   return quantity;
  }
 }
}
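Before deploying, one detail not shown in the post: on a Java EE 6 server the resource typically needs to be activated through a JAX-RS Application subclass. A minimal sketch, with an assumed base path:
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;

// Assumed activation class: an empty Application subclass is enough on a
// Java EE 6 server; resource classes such as OrderResource are then picked
// up automatically.
@ApplicationPath("/api") // illustrative base path
public class OrdersApplication extends Application {
}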
The next step is deploying this on a Java EE 6 application server. I used GlassFish, which internally uses Jersey as its JAX-RS implementation and Jackson as its JSON provider. The resource spits out the following JSON:
[
  {
    "id":123,
    "items":[
      {
        "name":"Foo",
        "quantity":2
      }
    ]
  }
]
That's pretty much what you would expect: a list of orders, each made up of an id and a list of items. Uncommenting the second order in the Java code above confirms that the structure is consistent:
[
  {
    "id":123,
    "items":[
      {
        "name":"Foo",
        "quantity":2
      }
    ]
  },
  {
    "id":456,
    "items":[
      {
        "name":"Foo",
        "quantity":1
      },
      {
        "name":"Bar",
        "quantity":3
      }
    ]
  }
]
So far so good! Let's repeat this exercise and deploy the resource on CXF. The JSON with a single order now looks like this:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    }
  ]
}
That's not quite what we expected: we've got a wrapping JSON map with a list of orders inside it labelled "order"! Furthermore, the list of items inside the order no longer appears to be a list! Note that there are no square brackets surrounding it. Getting back the list of two orders makes things even more surprising:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    },
    {
      "id":456,
      "items":[
        {
          "name":"Foo",
          "quantity":1
        },
        {
          "name":"Bar",
          "quantity":3
        }
      ]
    }
  ]
}
Wow, the list of items is back inside the order structure! Apparently you get a JSON array if there are multiple items, and just a single object if there is only one. That's pretty bad since the JSON structure for an order is no longer consistent, which of course makes parsing it a headache.
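As an aside (not from the original post), a Jackson-based client can at least be made tolerant of this inconsistency by treating a lone JSON object as a one-element array. A minimal sketch with a simplified, illustration-only Item class:
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LenientItemParsing {

 public static void main(String[] args) throws Exception {
  ObjectMapper mapper = new ObjectMapper();
  // Accept a single JSON object where an array is expected.
  mapper.enable(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY);
  String json = "{\"name\":\"Foo\",\"quantity\":2}"; // a lone item, no surrounding array
  Item[] items = mapper.readValue(json, Item[].class); // parsed as a one-element array
  System.out.println(items.length); // prints 1
 }

 public static class Item {
  public String name;
  public int quantity;
 }
}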

The reason for all of this madness is the fact that CXF is at its core an XML framework (notice the X in CXF). Doing JSON with CXF involves a bit of trickery. Internally, CXF will first use JAXB to marshal your objects into XML. Actually, I was cheating earlier: you can't directly deploy the OrderResource class shown above on CXF. First you'll have to add JAXB annotations, getters and setters, and other such frivolities to appease JAXB:

@XmlRootElement
public class Order {

 private long id;
 private List<Item> items = new ArrayList<>();

 public long getId() {
  return id;
 }

 public void setId(long id) {
  this.id = id;
 }

 public List<Item> getItems() {
  return items;
 }

 public void setItems(List<Item> items) {
  this.items = items;
 }
}
JAXB produces XML but we want JSON! To resolve this conundrum, CXF uses a StAX implementation called Jettison, which does not actually write XML but instead outputs JSON. Clever! The downside here is that the JSON is produced from XML, not from Java objects. Consequently, Java type information is no longer available when the JSON is written out. Looking at the XML produced by JAXB for an order with a single item clarifies things:
<orders>
  <order>
    <id>123</id>
    <items>
      <name>Foo</name>
      <quantity>2</quantity>
    </items>
  </order>
</orders>
Looking at this XML, you have no way of knowing that you can have multiple <items> elements. Since Java type information is no longer available, Jettison cannot see that items is actually a java.util.List and consequently it omits the list in the JSON structure:
{
  "order":[
    {
      "id":123,
      "items":{
        "name":"Foo",
        "quantity":2
      }
    }
  ]
}
If you have an order with multiple items, multiple <items> elements will be present in the XML and as a result the JSON will contain a list.

This type of translation from XML to JSON is called the mapped convention. CXF (or rather Jettison) also supports the BadgerFish convention, but it's much more esoteric.
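If you do want to stick with Jettison's mapped convention, it can at least be told which element names should always be serialized as JSON arrays. A minimal sketch, assuming a recent CXF version (the JSONProvider class has moved between packages across releases); the resulting provider would then be registered on the endpoint in place of the default one:
import java.util.Arrays;

import org.apache.cxf.jaxrs.provider.json.JSONProvider;

public class JettisonArrayConfig {

 // Hedged sketch: force the mapped convention to always emit JSON arrays for
 // the listed element names, even when only a single element is present.
 public static JSONProvider<Object> arrayAwareJsonProvider() {
  JSONProvider<Object> provider = new JSONProvider<Object>();
  provider.setSerializeAsArray(true); // enable array serialization
  provider.setArrayKeys(Arrays.asList("order", "items")); // element names to always wrap in arrays
  return provider;
 }
}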

By default CXF uses Jettison to produce JSON. Although that can be useful if you're using JAXB for other reasons anyway, it's usually better to configure CXF to use Jackson if you're just doing JAX-RS with JSON. Luckily this is easy to do.
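A minimal sketch of what that could look like with a programmatically created CXF endpoint, assuming the Jackson 2.x JAX-RS provider is on the classpath (the address and server bootstrap are made up for illustration):
import java.util.ArrayList;
import java.util.List;

import org.apache.cxf.jaxrs.JAXRSServerFactoryBean;

import com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider;

public class JacksonBackedServer {

 public static void main(String[] args) {
  JAXRSServerFactoryBean factory = new JAXRSServerFactoryBean();
  factory.setAddress("http://localhost:8080/api"); // illustrative address
  factory.setServiceBean(new OrderResource()); // the resource from the start of this post

  // Let Jackson produce the JSON directly from the Java objects,
  // bypassing the JAXB/Jettison detour described above.
  List<Object> providers = new ArrayList<Object>();
  providers.add(new JacksonJsonProvider());
  factory.setProviders(providers);

  factory.create();
 }
}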

Thursday, February 14, 2013

@Transactional mud

More wading through mud today at work, @Transactional mud on this occasion. The @Transactional annotation is Spring's way of specifying that a method needs transactional semantics. It sounds simple enough: all you need is <tx:annotation-driven/> in your application context to tell Spring you'll be using annotations to demarcate transactions, and a few of those @Transactional annotations sprinkled around your code, like so:
@Service
@Transactional(rollbackFor = Throwable.class, readOnly = true)
public class MyService {

  @Transactional(readOnly = false)
  public void doStuff() throws SomeCheckedException {
    // ...
  }
}
Following the principle of least astonishment, I assumed the code above defined the doStuff() method as running in a read-write transaction (readOnly = false) which rolls back for all types of exceptions (rollbackFor = Throwable.class). The cool thing here is that you can use a class-level annotation to set up useful defaults while the method-level annotation adjusts settings as required for a particular method.

It took quite a bit of head scratching to realize this is not actually true! With the above code, no rollback would occur if doStuff() threw SomeCheckedException! The reason is that the transaction attributes in the method-level annotation replace those in the class-level annotation rather than the two being merged together. If you think about it, this makes sense: Spring has no way of knowing at run-time whether you explicitly specified rollbackFor on the annotation or just used the default value! Since the default rollback behavior is not to roll back on checked exceptions, this can give quite surprising results.
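One way to get the behaviour I originally expected is simply to repeat the rollback rule on the method-level annotation, since it replaces the class-level defaults entirely (a sketch based on the example above):
@Service
@Transactional(rollbackFor = Throwable.class, readOnly = true)
public class MyService {

  // rollbackFor is repeated here because the method-level annotation
  // replaces, rather than merges with, the class-level attributes.
  @Transactional(rollbackFor = Throwable.class, readOnly = false)
  public void doStuff() throws SomeCheckedException {
    // ...
  }
}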

Lesson learned: check!

Tuesday, January 8, 2013

Check caching of HTTP resources

Setting up HTTP caching can be a bit of a pain. Essentially the HTTP response needs to contain appropriate cache control headers: either an Expires header or a Cache-Control max-age directive. Here's a quick Java program using HttpClient and HttpClient Cache (both part of the Apache HttpComponents project) to test client-side caching of HTTP responses.
public class TestHttpCaching {

 public static void main(String[] args) throws Exception {
  CacheConfig config = new CacheConfig();
  config.setMaxObjectSize(Long.MAX_VALUE);
  HttpClient httpClient = new CachingHttpClient(config);

  doGetRequest(httpClient, args[0]);
  doGetRequest(httpClient, args[0]);
 }

 private static void doGetRequest(HttpClient httpClient, String url) throws Exception {
  HttpGet httpGet = new HttpGet(url);
  HttpContext httpContext = new BasicHttpContext();
  try {
   System.out.print("Getting... ");
   System.out.print(httpClient.execute(httpGet, httpContext).getStatusLine());
   System.out.println(": " + httpContext.getAttribute(CachingHttpClient.CACHE_RESPONSE_STATUS));
  } finally {
   httpGet.releaseConnection();
  }
 }
}
The standard DefaultHttpClient does not do any caching. Instead you have to use the CachingHttpClient, which wraps a DefaultHttpClient and adds caching functionality. Also notice how the cache is configured with a very large maximum object size. This prevents resources from being skipped by the cache simply because they are too large for the standard in-memory cache (BasicHttpCacheStorage).
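For reference, the first paragraph mentioned that the response needs an Expires header or a Cache-Control max-age directive. Here's a minimal servlet sketch (not from the original post) emitting such a header; the test program above should then report a cache hit on the second request:
import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CachedResourceServlet extends HttpServlet {

 @Override
 protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
  // Allow clients and intermediaries to cache this response for one hour.
  resp.setHeader("Cache-Control", "max-age=3600");
  resp.setContentType("text/plain");
  resp.getWriter().write("Hello, cacheable world!");
 }
}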