Oops Null Pointer

Java programming related

Category Archives: Java

Minecraft Modding: A Template Mod

My son loves playing Minecraft mods and wanted to make his own. He had a plethora of grand ideas, so I advised him to start small and create something achievable. And thus the Sandwich Mod (GitHub link) was created. It adds a piece of bread and a slice of cheese, which can be crafted into a cheese sandwich.

I couldn’t find any quick-start mod samples for Minecraft 1.11, but I did find a reasonably straightforward and detailed video tutorial series by CJMinecraft. From the ideas there I created a simple mod that adds 3 new edible items and a recipe.

To get started you need:

  • The Java 1.8 JDK installed
  • A Java IDE (Eclipse or IntelliJ)
  • git to check out the code
  • A copy of Minecraft

The mod is based on Minecraft Forge, which provides a bunch of hooks to get started. I use Gradle as the build tool, as it downloads all the dependencies you need and compiles and packages the mod.

Forge also provides a launcher so you can start Minecraft from your IDE with your mod installed, to play-test or debug.

Check out the readme file on GitHub for all the instructions and let me know if you get stuck.


XStream 1.3.1 to 1.4.3 ReflectionConverter changes

Recently, while upgrading to Java 7, I had to upgrade XStream due to (I think) Oracle changing the name of the JVM or reflection providers. My converters that subclass the ReflectionConverter class started to fail. I had been omitting fields and then manually marshalling them in by overriding the marshal and marshallField methods.

The 1.3.1 version of the ReflectionConverter (in AbstractReflectionConverter) passed all non-transient fields to the marshallField method. In 1.4.3, however, it checks whether fields should be omitted and will not pass omitted fields through to the marshallField method.

While mine was a relatively unusual case I hope this post can help others out stuck on similar issues.

Jackrabbit auto commit exception

At work I have been using a Jackrabbit (JCR) repository in our core product for quite a few months. Recently we added features for managing financial publications, which entails a bunch of CRUD activity over REST, including creating versions and performing pessimistic locking.

With all the new activity occurring in the repository we started seeing occasional auto-commit errors. Our JDBC driver (jTDS) reports the following error on commit:

org.apache.jackrabbit.core.state.ItemStateException: commit() should not be called while in auto-commit mode.

Caused by: java.sql.SQLException: commit() should not be called while in auto-commit mode.

Hmm… it occurred with some orders of operations (create, edit, finish, find, edit, cancel) but not with others. It also failed on the build machine and on another dev box, but I earned the “works on my machine” badge.

We currently use Jackrabbit without a transaction, as it turns auto-commit on and off itself and thus needs to be unmanaged (see here for a raised issue). I’ve seen hints of using it via XASessions but have not yet found any clear documentation on how to do this.

We also had Jackrabbit using the database to store everything but the Lucene indexes, and it shared the same database (SQL Server 2005 / 2008) and database connection as the rest of the application. This last point was causing the issue – giving Jackrabbit its own data source to the same database fixed it.

The type of data source did not matter – I tried: no-tx-datasource, local-tx-datasource and xa-datasource.

TL;DR: Jackrabbit needs its own data source when sharing a SQL Server database.

I’m using:

  • Jackrabbit 2.4.2 (latest stable)
  • JBoss 4.2.3 (old and friendly)
  • SQL Server 2005 / 2008
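
For reference, a dedicated data source can be declared in its own JBoss 4 deployment descriptor (e.g. a jackrabbit-ds.xml in the deploy directory). This is only a sketch – the JNDI name, host, database and credentials below are placeholders, not values from my setup:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<datasources>
  <local-tx-datasource>
    <!-- Placeholder JNDI name - point Jackrabbit's repository config at this -->
    <jndi-name>JackrabbitDS</jndi-name>
    <!-- Same SQL Server database as the application, but a separate connection pool -->
    <connection-url>jdbc:jtds:sqlserver://dbhost:1433/appdb</connection-url>
    <driver-class>net.sourceforge.jtds.jdbc.Driver</driver-class>
    <user-name>jackrabbit</user-name>
    <password>secret</password>
  </local-tx-datasource>
</datasources>
```

As noted above, the no-tx-datasource and xa-datasource variants worked just as well.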

GXT: BorderLayout starting with a collapsed region

Calling collapse on a BorderLayout region before it has been rendered doesn’t work. One suggestion was to add a deferred command to collapse the region. I found this “flashed” the panel (showing and then hiding it), and also that when you pop out a collapsed panel (by clicking on the collapsed section) the popup was empty, as the panel was still effectively collapsed. Instead I used a once-only AfterLayout event listener:

layoutContainer.addListener(Events.AfterLayout, new Listener<ComponentEvent>() {
  public void handleEvent(ComponentEvent be) {
    // Remove this listener - collapse only once
    be.getComponent().removeListener(Events.AfterLayout, this);
    // The layout has now run, so collapsing no longer "flashes" the panel.
    // borderLayout and collapsiblePanel are assumed names for the BorderLayout
    // instance and the ContentPanel in the region to collapse.
    borderLayout.collapse(collapsiblePanel);
  }
});

Using this method, some child components may have trouble rendering inside the collapsed panel. One such component is a TreePanel – if you add it and then call expandAll() it will throw a JavaScriptException, caused by the parent of the collapsed panel being null.
com.google.gwt.core.client.JavaScriptException: (TypeError): this.appendChild is not a function

To counter this I added a once-only Attach event listener to the panel that will be collapsed:

myContentPanel.addListener(Events.Attach, new Listener<ComponentEvent>() {
  public void handleEvent(ComponentEvent ce) {
    // Remove this listener - expand only once
    ce.getComponent().removeListener(Events.Attach, this);
    // The panel is attached now, so it is safe to expand the tree.
    // myTreePanel is an assumed name for the TreePanel inside this panel.
    myTreePanel.expandAll();
  }
});

Sending PDF files in the response

To send PDF files so that most browsers will display them I’ve found the following headers work:

  • Content-Disposition: inline; filename=sample.pdf or Content-Disposition: attachment; filename=sample.pdf
  • Expires: 0
  • Cache-Control: must-revalidate, post-check=0, pre-check=0
  • Pragma: public
  • Content-Type: application/pdf
  • Content-Length: <number of bytes>

Here is some rough Java to stream an input stream to the HttpServletResponse.

public static void streamInputToResponse(HttpServletResponse resp, boolean isInline, String contentType, String filename,
		InputStream in, long length) throws IOException {
	if (length > Integer.MAX_VALUE) {
		throw new IOException("File too large to stream");
	}
	ServletOutputStream outstr = null;
	try {
		outstr = resp.getOutputStream();
		// Setting required headers
		String inlineOrAttachment = isInline ? "inline;" : "attachment;";
		resp.setHeader("Content-Disposition", inlineOrAttachment + " filename=" + filename);
		resp.setContentType(contentType);
		resp.setContentLength((int) length);
		// Setting some extra response headers to soothe browser issues
		resp.setHeader("Expires", "0");
		resp.setHeader("Cache-Control", "must-revalidate, post-check=0, pre-check=0");
		resp.setHeader("Pragma", "public");

		IOUtils.copy(in, outstr);
	} catch (IOException e) {
		logger.error("Error streaming data to servlet output stream: " + e.getMessage());
		throw e;
	} finally {
		try {
			if (outstr != null) {
				outstr.flush();
			}
		} catch (IOException e1) {
			logger.warn("Error flushing servlet output stream: " + e1.getMessage());
		}
	}
}
The Restlet framework has some issues with PDFs when used in JBoss / Tomcat. Tomcat appends a charset=UTF8 that appears to stop Chrome’s built-in PDF reader from loading the file (Chrome 11). Not setting the content length appeared to help, while surprisingly not breaking IE.
Also, with JBoss and authentication, a no-cache header is added and the Restlet framework doesn’t remove it (even if you add your own Cache-Control header). See here for a way to disable this behaviour.

Sort HashMap by value

Recently I evaluated a few methods of sorting a HashMap by value.

The two main ways I found (based on this thread) were:

Using LinkedLists

  • Handles duplicate values by leaving keys in the same order they are in the source
  • This example passes in a comparator, but if the values are comparable you could use compareTo() instead
  • Performs a once off sort, so you need to re-sort if the map changes
public static <K, V> Map<K, V> sortByValue(Map<K, V> map, final Comparator<V> valueComparator) {
    List<Map.Entry<K, V>> list = new LinkedList<Map.Entry<K, V>>(map.entrySet());
    Collections.sort(list, new Comparator<Map.Entry<K, V>>() {
        public int compare(Map.Entry<K, V> o1, Map.Entry<K, V> o2) {
            return valueComparator.compare(o1.getValue(), o2.getValue());
        }
    });

    // A LinkedHashMap preserves the sorted insertion order
    Map<K, V> result = new LinkedHashMap<K, V>();
    for (Map.Entry<K, V> entry : list) {
        result.put(entry.getKey(), entry.getValue());
    }
    return result;
}
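
To illustrate the LinkedList approach end-to-end, here is a self-contained sketch (the class, map contents and variable names are mine, not from the original post):

```java
import java.util.*;

public class SortByValueDemo {
    // Same technique as above: sort the entries into a list,
    // then rebuild insertion order with a LinkedHashMap
    public static <K, V> Map<K, V> sortByValue(Map<K, V> map, final Comparator<V> valueComparator) {
        List<Map.Entry<K, V>> list = new LinkedList<Map.Entry<K, V>>(map.entrySet());
        Collections.sort(list, new Comparator<Map.Entry<K, V>>() {
            public int compare(Map.Entry<K, V> o1, Map.Entry<K, V> o2) {
                return valueComparator.compare(o1.getValue(), o2.getValue());
            }
        });
        Map<K, V> result = new LinkedHashMap<K, V>();
        for (Map.Entry<K, V> entry : list) {
            result.put(entry.getKey(), entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Integer> scores = new HashMap<String, Integer>();
        scores.put("carol", 3);
        scores.put("alice", 1);
        scores.put("bob", 2);
        Map<String, Integer> sorted = sortByValue(scores, new Comparator<Integer>() {
            public int compare(Integer a, Integer b) { return a.compareTo(b); }
        });
        System.out.println(sorted); // {alice=1, bob=2, carol=3}
    }
}
```

Remember that this is a once-off snapshot: if `scores` changes afterwards, `sorted` is not updated.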

Using a TreeMap

  • Duplicate values would otherwise be removed from the tree, so you need to include the key in the comparison
  • In this example I use a comparator for both the values and the key (if the values are equal)
  • The map always stays sorted, provided entries are added through the sorted map
Map baseMap = ...
KeyComparator kComparator = new KeyComparator();
ValueComparator vComparator = new ValueComparator();
MapValueComparator mapValComparator = new MapValueComparator(baseMap, kComparator, vComparator);
Map sortedMap = new TreeMap(mapValComparator);
sortedMap.putAll(baseMap);

The map value comparator used above:

public class MapValueComparator<K, V> implements Comparator<K> {
    private Map<K, V> map;
    private Comparator<K> keyComparator;
    private Comparator<V> valueComparator;

    public MapValueComparator(Map<K, V> map, Comparator<K> keyComparator, Comparator<V> valueComparator) {
        this.map = map;
        this.valueComparator = valueComparator;
        this.keyComparator = keyComparator;
    }

    public int compare(K o1, K o2) {
        int valueCompare = valueComparator.compare(map.get(o1), map.get(o2));
        if (valueCompare == 0) {
            // Equal values - fall back to the key so duplicate values aren't dropped
            return keyComparator.compare(o1, o2);
        }
        return valueCompare;
    }
}
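
And a self-contained sketch of the TreeMap approach tied together (the demo class and example data are mine; the anonymous comparators stand in for the KeyComparator / ValueComparator classes above):

```java
import java.util.*;

public class TreeMapSortDemo {
    // Orders keys by their mapped value, falling back to the key itself
    // so that equal values aren't collapsed by the TreeMap
    public static class MapValueComparator<K, V> implements Comparator<K> {
        private final Map<K, V> map;
        private final Comparator<K> keyComparator;
        private final Comparator<V> valueComparator;

        public MapValueComparator(Map<K, V> map, Comparator<K> keyComparator, Comparator<V> valueComparator) {
            this.map = map;
            this.keyComparator = keyComparator;
            this.valueComparator = valueComparator;
        }

        public int compare(K o1, K o2) {
            int valueCompare = valueComparator.compare(map.get(o1), map.get(o2));
            return valueCompare == 0 ? keyComparator.compare(o1, o2) : valueCompare;
        }
    }

    public static void main(String[] args) {
        Map<String, Integer> baseMap = new HashMap<String, Integer>();
        baseMap.put("carol", 2);
        baseMap.put("alice", 2); // duplicate value - kept thanks to the key fallback
        baseMap.put("bob", 1);

        Comparator<String> keyCmp = new Comparator<String>() {
            public int compare(String a, String b) { return a.compareTo(b); }
        };
        Comparator<Integer> valCmp = new Comparator<Integer>() {
            public int compare(Integer a, Integer b) { return a.compareTo(b); }
        };

        Map<String, Integer> sortedMap =
                new TreeMap<String, Integer>(new MapValueComparator<String, Integer>(baseMap, keyCmp, valCmp));
        sortedMap.putAll(baseMap);
        System.out.println(sortedMap); // {bob=1, alice=2, carol=2}
    }
}
```

Note the comparator reads values from baseMap, so the TreeMap only stays consistent while baseMap's values are unchanged.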

Chrome returning null content type on file upload

Chrome (up to at least 8.0.552.215) has issues with the content types of files. For me it occurred when the file did not have an extension.

See here for the old (unfixed) issue and here for the new issue.

Note: I’m using Struts 2.0.14 with the FileUploadInterceptor (which uses ServletFileUpload and JakartaMultiPartRequest), and if I upload a file without an extension then Struts throws a null pointer exception. If I take the same file and add any extension (e.g. .bin) then it uploads correctly.

Looking at the code for Struts 2.2.1 or the current trunk (line 275), it still looks like this is a problem.

Firefox defaults to “application/octet-stream” which, in this case, works.

Anyone else had this issue or has a workaround? Maybe allow the setting of a default content type in the FileUploadInterceptor?

Update: This appears to be fixed in the code base – waiting for word on when it will get into a release.

Capturing Integration Testing Input/Output with XStream

A co-worker was looking for a way to capture input and results during a debug session for certain scenarios that were occurring at run time. Ideally they wanted a tool that could be used in Eclipse to capture parameters and return values from frames in the debug session, and then reuse those values in JUnit tests.

A quick search couldn’t find any tools that fit the bill, but as a quick and dirty method you can use XStream in an Eclipse “Display” view to dump out input and return variables. Then in a JUnit test you deserialise the XML back into objects and test the inputs against the results.

The XML format gives you two instant advantages:

  • No need to reconstruct the input / result objects from scratch (which can be difficult, especially in complex or legacy systems)
  • Easy tweaking of the XML to get the results you want

A disadvantage is that if the object’s structure changes then refactoring tools will miss the XML data and the test may no longer work.

To use this technique:

  • Put XStream in your project classpath
  • Show the Display view in Eclipse (Window -> Show View -> Display)
  • Put breakpoints in the code to capture input and result objects
  • At a breakpoint run the following code in the Display view to serialise the objects to XML
(new com.thoughtworks.xstream.XStream()).toXML(someObjectToSerialise)
  • Save the XML to files and load the files into a JUnit test (say with Commons FileUtils.readFileToString())

If you know of any other tools to do this in a more automated fashion then let me know (it would be cool to do this with a few clicks via a plugin – even generate the test classes).