Monday, December 31, 2012

Sed: Multi-Line Replacement Between Two Patterns

This post presents some useful sed commands for performing replacements and deletions between two patterns across multiple lines. For example, consider the following file:
$ cat file
line 1
line 2
foo
line 3
line 4
line 5
bar
line 6
line 7
1) Replace text on each line between two patterns (inclusive):
To perform a replacement on each line between foo and bar, including the lines containing foo and bar, use the following:
$ sed '/foo/,/bar/{s/./x/g}' file
line 1
line 2
xxx
xxxxxx
xxxxxx
xxxxxx
xxx
line 6
line 7
2) Replace text on each line between two patterns (exclusive):
To perform a replacement on each line between foo and bar, excluding the lines containing foo and bar, use the following:
$ sed '/foo/,/bar/{/foo/n;/bar/!{s/./x/g}}' file
line 1
line 2
foo
xxxxxx
xxxxxx
xxxxxx
bar
line 6
line 7
3) Delete lines between two patterns (inclusive):
To delete all lines between foo and bar, including the lines containing foo and bar, use the same /foo/,/bar/ range as above, but with the d (delete) command in place of the substitution:

$ sed '/foo/,/bar/d' file
line 1
line 2
line 6
line 7
4) Delete lines between two patterns (exclusive):
To delete all lines between foo and bar, excluding the lines containing foo and bar, use the same approach as the exclusive replacement above, but with the d (delete) command in place of the substitution:
$ sed '/foo/,/bar/ {/foo/n;/bar/!d}' file
line 1
line 2
foo
bar
line 6
line 7
5) Replace all lines between two patterns (inclusive):
To perform a replacement on a block of lines between foo and bar, including the lines containing foo and bar, use:
$ sed -n '/foo/{:a;N;/bar/!ba;N;s/.*\n/REPLACEMENT\n/};p' file
line 1
line 2
REPLACEMENT
line 6
line 7
How it works:
/foo/{                   # when "foo" is found
  :a                     # label "a"
    N                    # append the next line to the pattern space
  /bar/!ba               # if "bar" has not been read yet, go back to "a"
  N                      # append one more line (the line after "bar")
  s/.*\n/REPLACEMENT\n/  # replace everything up to the last newline with REPLACEMENT
}
p                        # print the pattern space
6) Replace all lines between two patterns (exclusive):
To perform a replacement on a block of lines between foo and bar, excluding the lines containing foo and bar, use:
$ sed -n '/foo/{p;:a;N;/bar/!ba;s/.*\n/REPLACEMENT\n/};p' file
line 1
line 2
foo
REPLACEMENT
bar
line 6
line 7
References:
Sed - An Introduction and Tutorial by Bruce Barnett

Saturday, December 22, 2012

Useless Use of Echo

Most of us are familiar with the Useless Use of Cat Award, which is given for unnecessary use of the cat command. For example, in nearly all cases, cat file | command arg can be rewritten as command arg < file.
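As a quick illustration (the sample file below is invented for this demo), both pipelines produce the same result, but the cat version spawns an extra process:

```shell
# create a throwaway sample file (hypothetical, for illustration only)
printf 'alpha\nbeta\ngamma\n' > /tmp/uuoc_demo.txt

# useless use of cat: an extra process just to feed stdin
cat /tmp/uuoc_demo.txt | wc -l

# equivalent: redirect the file straight into the command
wc -l < /tmp/uuoc_demo.txt
```

Both report the same line count.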

In a similar vein, this post is about the useless use of the echo command. In nearly all cases:

echo string | command arg
can be rewritten using a heredoc:
command arg << END
string
END
or, using a here-string:
command arg <<< string
Note: Here-strings are not portable (but most modern shells support them) so use the heredoc alternative shown above if you are writing a portable script!
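To make the three forms concrete, here they are side by side using tr as the example command (any command that reads stdin would do):

```shell
# 1. useless use of echo
echo 'hello world' | tr a-z A-Z

# 2. heredoc - portable across POSIX shells
tr a-z A-Z << END
hello world
END

# 3. here-string - bash/ksh/zsh only
tr a-z A-Z <<< 'hello world'
```

All three print HELLO WORLD.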

Saturday, December 01, 2012

Spring: Creating a java.util.Properties Bean

The easiest way to create a java.util.Properties bean in Spring is with a PropertiesFactoryBean as shown in the example below:
<bean id="emailProperties"
      class="org.springframework.beans.factory.config.PropertiesFactoryBean">
  <property name="properties">
    <value>
        smtp.host=mail.host.com
        from=joe.bloggs@domain.com
        to=${mail.recipients}
    </value>
  </property>
</bean>
Spring will parse the key=value pairs and put them into the Properties object.
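Spring applies the same parsing rules as java.util.Properties.load, so you can sketch what happens to the text inside <value> with plain Java (the class name below is mine, for illustration):

```java
import java.io.StringReader;
import java.util.Properties;

public class PropertiesParsingDemo {
  public static void main(String[] args) throws Exception {
    // the same key=value text as in the <value> element above
    // (leading whitespace on each line is ignored by Properties.load)
    String text = "  smtp.host=mail.host.com\n  from=joe.bloggs@domain.com\n";

    Properties props = new Properties();
    props.load(new StringReader(text));

    System.out.println(props.getProperty("smtp.host"));  // mail.host.com
    System.out.println(props.getProperty("from"));       // joe.bloggs@domain.com
  }
}
```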

Saturday, November 24, 2012

Parsing a CSV file into JavaBeans using OpenCSV

opencsv is a very useful CSV parsing library for Java.

The following generic utility method shows how you can parse a CSV file into a list of JavaBeans.

/**
 * Parses a csv file into a list of beans.
 *
 * @param <T> the type of the bean
 * @param filename the name of the csv file to parse
 * @param fieldDelimiter the field delimiter
 * @param beanClass the bean class to map csv records to
 * @return the list of beans or an empty list if there are none
 * @throws FileNotFoundException if the file does not exist
 */
public static <T> List<T> parseCsvFileToBeans(final String filename,
                          final char fieldDelimiter,
                          final Class<T> beanClass) throws FileNotFoundException {
  CSVReader reader = null;
  try {
    reader = new CSVReader(new BufferedReader(new FileReader(filename)),
                           fieldDelimiter);
    final HeaderColumnNameMappingStrategy<T> strategy =
                                         new HeaderColumnNameMappingStrategy<T>();
    strategy.setType(beanClass);
    final CsvToBean<T> csv = new CsvToBean<T>();
    return csv.parse(strategy, reader);
  } finally {
    if (reader != null) {
      try {
          reader.close();
      } catch (final IOException e) {
          // ignore
      }
    }
  }
}
Example:
Consider the following CSV file containing person information:
FirstName,LastName,Age
Joe,Bloggs,25
John,Doe,30
Create the following Person bean to bind each CSV record to:
public class Person {

  private String firstName;
  private String lastName;
  private int age;

  public Person() {
  }
  public String getFirstName() {
    return firstName;
  }
  public void setFirstName(String firstName) {
    this.firstName = firstName;
  }
  public String getLastName() {
    return lastName;
  }
  public void setLastName(String lastName) {
    this.lastName = lastName;
  }
  public int getAge() {
    return age;
  }
  public void setAge(int age) {
    this.age = age;
  }
}
Now, you can parse the CSV file into a list of Person beans with this one-liner:
List<Person> persons = Utils.parseCsvFileToBeans("/path/to/persons.csv", 
                                                 ',', Person.class);

Saturday, October 20, 2012

Joining Two Files with the Unix join Command

The join command is a useful tool for joining two files on a common field. It allows you to join two files, similar to the way you would join two tables in a SQL database.

The following example illustrates the power of the join command. You have two files: one containing a list of employees with their department ids, and the other containing departments and their ids. You want to find the name of the department for each employee. Since join requires its input to be sorted on the join field, you must first sort both files on the department id column (using the sort command) and then join them on that column.

$ cat employees.txt
Jones,33
Steinberg,33
Robinson,34
Smith,34
Rafferty,31
John,

$ cat departments.txt
31,Sales
33,Engineering
34,Clerical
35,Marketing

$ join -a 1 -t, -1 2 -2 1 -o 1.1,2.2 <(sort -t, -k2 employees.txt) <(sort -t, -k1 departments.txt)
John,
Rafferty,Sales
Jones,Engineering
Steinberg,Engineering
Robinson,Clerical
Smith,Clerical

Joining on multiple columns
The join command joins on a single field. What do you do if you want to join on multiple fields? You create a composite field by combining the multiple fields together! This can be done using awk. For example:
$ cat employees2.txt
Jones,33,50
Steinberg,33,51
Robinson,34,50
Smith,34,50
Rafferty,31,51

$ awk -F, '{print $2"_"$3","$0}' employees2.txt
33_50,Jones,33,50
33_51,Steinberg,33,51
34_50,Robinson,34,50
34_50,Smith,34,50
31_51,Rafferty,31,51

As you can see, an additional field has been created by concatenating the second and third fields of the file. Now you can join the files on the new composite field.
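Putting it together, here is a sketch of a complete composite-key join. The salaries2.txt file and its values are hypothetical, keyed on the same (department id, grade) pair:

```shell
# recreate the employees file from above, plus a hypothetical salaries file
printf 'Jones,33,50\nSteinberg,33,51\nRobinson,34,50\nSmith,34,50\nRafferty,31,51\n' > employees2.txt
printf '33,50,55000\n33,51,60000\n34,50,45000\n31,51,40000\n' > salaries2.txt

# prepend a dept_grade composite key to each file, sort on it, then join
# (process substitution requires bash, as in the earlier example)
join -t, \
  <(awk -F, '{print $2"_"$3","$0}' employees2.txt | sort -t, -k1,1) \
  <(awk -F, '{print $1"_"$2","$3}' salaries2.txt  | sort -t, -k1,1)
```

Each employee line is paired with the salary for their (department, grade) combination, e.g. `33_50,Jones,33,50,55000`.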

(File data courtesy of Wikipedia.)

Sunday, October 07, 2012

Java: Find an Available Port Number

In some cases, such as in unit tests, you might need to start up a server or an rmiregistry. What port number do you use? You cannot hardcode the port number because when your unit test runs on a continuous build server or on a colleague's machine, it might already be in use. Instead, you need a way to find an available port on the current machine.

According to IANA (Internet Assigned Numbers Authority), the ports that we are free to use lie in the range 1024-49151:

Port numbers are assigned in various ways, based on three ranges: System Ports (0-1023), User Ports (1024-49151), and the Dynamic and/or Private Ports (49152-65535)
The following utility class can help find an available port on your local machine:
import java.io.IOException;
import java.net.DatagramSocket;
import java.net.ServerSocket;
 
/**
 * Finds an available port on localhost.
 */
public class PortFinder {
 
  // the ports below 1024 are system ports
  private static final int MIN_PORT_NUMBER = 1024;
 
  // the ports above 49151 are dynamic and/or private
  private static final int MAX_PORT_NUMBER = 49151;
 
  /**
   * Finds a free port between 
   * {@link #MIN_PORT_NUMBER} and {@link #MAX_PORT_NUMBER}.
   *
   * @return a free port
   * @throws RuntimeException if a port could not be found
   */
  public static int findFreePort() {
    for (int i = MIN_PORT_NUMBER; i <= MAX_PORT_NUMBER; i++) {
      if (available(i)) {
        return i;
      }
    }
    throw new RuntimeException("Could not find an available port between " + 
                               MIN_PORT_NUMBER + " and " + MAX_PORT_NUMBER);
  }
 
  /**
   * Returns true if the specified port is available on this host.
   *
   * @param port the port to check
   * @return true if the port is available, false otherwise
   */
  private static boolean available(final int port) {
    ServerSocket serverSocket = null;
    DatagramSocket dataSocket = null;
    try {
      serverSocket = new ServerSocket(port);
      serverSocket.setReuseAddress(true);
      dataSocket = new DatagramSocket(port);
      dataSocket.setReuseAddress(true);
      return true;
    } catch (final IOException e) {
      return false;
    } finally {
      if (dataSocket != null) {
        dataSocket.close();
      }
      if (serverSocket != null) {
        try {
          serverSocket.close();
        } catch (final IOException e) {
          // ignore
        }
      }
    }
  }
}

Sunday, September 30, 2012

Testing Email with a Mock MailSender

If you have an application which sends out email, you don't want your unit tests doing that too, so you need to use a "mock mail sender". You can create one by extending JavaMailSenderImpl and overriding the send method so that it doesn't really send an email. Here is an example:
import java.util.Properties;

import javax.mail.internet.MimeMessage;

import org.springframework.mail.MailException;
import org.springframework.mail.MailPreparationException;
import org.springframework.mail.javamail.JavaMailSenderImpl;
import org.springframework.mail.javamail.MimeMessagePreparator;

public class MockMailSender extends JavaMailSenderImpl {

  @Override
  public void send(final MimeMessagePreparator mimeMessagePreparator) throws MailException {
    final MimeMessage mimeMessage = createMimeMessage();
    try {
      mimeMessagePreparator.prepare(mimeMessage);
      final String content = (String) mimeMessage.getContent();
      final Properties javaMailProperties = getJavaMailProperties();
      javaMailProperties.setProperty("mailContent", content);
    } catch (final Exception e) {
      throw new MailPreparationException(e);
    }
  }
}
The mock shown above stores the email body into the mail properties object. This is a quick-and-dirty way of getting access to the content of the email just in case you want to check it in your unit test.

Here is the associated Spring Java-based configuration:

@Configuration
public class MailConfig {

  @Bean
  public JavaMailSender mailSender() {
    final JavaMailSenderImpl sender = new JavaMailSenderImpl();
    sender.setHost("mail.host.com");
    return sender;
  }

  @Bean
  public Notifier notifier() {
    return new Notifier(mailSender());
  }
}
The unit-test configuration:
@Configuration
@Profile("unit-test")
public class UnitTestMailConfig extends MailConfig {
  @Override
  @Bean
  public JavaMailSender mailSender() {
   return new MockMailSender();
  }
}
For more information about sending emails with Spring 3, see the documentation here.

Related Posts:
Spring 3 - JavaConfig: Unit Testing

Saturday, September 29, 2012

Installing Scala IDE for Eclipse Juno

Just spent ages trying to get the Scala IDE plugin working in Eclipse Juno (4.2).

Here are the links that finally worked for me:

To install them, go to Eclipse > Help > Install new software... and enter the URL in the Work with... text field.

stackoverflow - 40k rep

Eight months after crossing the 30k milestone, I've now achieved a reputation of 40k on stackoverflow! The following table shows some interesting stats about my journey so far:
                      0-10k         10-20k        20-30k        30-40k        Total
Date achieved         01/2011       05/2011       01/2012       09/2012
Questions answered    546           376           253           139           1314
Questions asked       46            1             6             0             53
Tags covered          609           202           83            10            904
Badges (g, s, b)      35 (2,10,23)  14 (0,4,10)   33 (2,8,23)   59 (3,20,36)  141 (7,42,92)
As I mentioned before, I have really enjoyed being a member of stackoverflow. For me, it has not simply been a quest for reputation, but more about learning new technologies and picking up advice from other experts on the site. I like to take on challenging questions, rather than the easy ones, because it pushes me to do research into areas I have never looked at before, and I learn so much during the process.

You can probably tell by the number of questions answered, that I haven't spent much time on stackoverflow recently. I've been busy at work and have also been participating in other stackexchange sites like superuser, serverfault and Unix and Linux.

50k, here I come!

Sunday, September 23, 2012

Spring 3 - JavaConfig: Unit Testing using a Different Profile

In unit tests, you should not connect to an external database or webservice. Instead, you should use an in-memory database like hsqldb and mock any other external system dependencies. In order to do so, you need to inject test beans into the Spring container instead of using real ones. This example shows how you can use a different configuration for unit testing using Spring Java-based configuration.

Let's start with the following configuration:

/**
 * Configuration for an external oracle database.
 */
@Configuration
public class DatabaseConfig {

  @Bean
  public DataSource personDataSource() {
    org.apache.commons.dbcp.BasicDataSource ds =
        new org.apache.commons.dbcp.BasicDataSource();
    ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
    ds.setUrl("jdbc:oracle:thin:@firefly:1521:HRP2");
    ds.setUsername("scott");
    ds.setPassword("tiger");
    return ds;
  }
}

/**
 * Main application config.
 */
@Configuration
@Import(DatabaseConfig.class)
public class AppConfig {

  @Bean
  public PersonDao personDao() {
    return new PersonDao(personDataSource());
  }
}
In order to use a different database for your unit tests, you need to create a separate unit test database configuration as shown below. This configuration returns an HSQL data source and, more importantly, is decorated with a @Profile annotation which indicates that it will only be used when the "unit-test" profile is active.
/**
 * Configuration for an embedded HSQL database used by unit tests.
 */
@Configuration
@Profile("unit-test")
public class UnitTestDatabaseConfig extends DatabaseConfig {

  @Override
  @Bean
  public DataSource personDataSource() {
    return new EmbeddedDatabaseBuilder()
               .setType(EmbeddedDatabaseType.HSQL)
               .addScript("person.sql")
               .build();
  }
}
Now, write your unit test as shown below. The @ActiveProfiles annotation tells Spring which profile to use when loading beans for the test classes. Since it is set to "unit-test", the HSQL DataSource will be used.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { AppConfig.class, UnitTestDatabaseConfig.class })
@ActiveProfiles("unit-test")
public class PersonDaoTest {

  @Autowired
  private PersonDao personDao;

  @Test
  public void testGetPerson() {
    Person p = personDao.getPerson("Joe");
  }
}

Saturday, September 22, 2012

Spring 3 - JavaConfig: Loading a Properties File

This example shows how you can load a properties file using Spring Java-based configuration and then use those properties in ${...} placeholders in other beans in your configuration.

First, you need to create a PropertySourcesPlaceholderConfigurer bean as shown below:

import org.springframework.context.annotation.*;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

/**
 * Loads properties from a file called ${APP_ENV}.properties
 * or default.properties if APP_ENV is not set.
 */
@Configuration
@PropertySource("classpath:${APP_ENV:default}.properties")
public class PropertyPlaceholderConfig {

  @Bean
  public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
    return new PropertySourcesPlaceholderConfigurer();
  }
}
Next, import this configuration into your main application config and use @Value to resolve the ${...} placeholders. For example, in the code below, the databaseUrl variable will be set from the db.url property in the properties file.
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.*;

@Configuration
@Import(PropertyPlaceholderConfig.class)
public class AppConfig {

  @Value("${db.url}")      private String databaseUrl;
  @Value("${db.user}")     private String databaseUser;
  @Value("${db.password}") private String databasePassword;

  @Bean
  public DataSource personDataSource(){
    final org.apache.commons.dbcp.BasicDataSource ds =
        new org.apache.commons.dbcp.BasicDataSource();
    ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
    ds.setUrl(databaseUrl);
    ds.setUsername(databaseUser);
    ds.setPassword(databasePassword);
    return ds;
  }

  @Bean
  public PersonDao personDao() {
    return new PersonDao(personDataSource());
  }
}
Alternative approach:
Alternatively, you can load the properties file into the Spring Environment and then lookup the properties you need when creating your beans:
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.*;
import org.springframework.core.env.Environment;

@Configuration
@PropertySource("classpath:${APP_ENV:default}.properties")
public class AppConfig {

  @Autowired
  private Environment env;

  @Bean
  public DataSource personDataSource() {
    final org.apache.commons.dbcp.BasicDataSource ds =
        new org.apache.commons.dbcp.BasicDataSource();
    ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
    ds.setUrl(env.getProperty("db.url"));
    ds.setUsername(env.getProperty("db.user"));
    ds.setPassword(env.getProperty("db.password"));
    return ds;
  }

  @Bean
  public PersonDao personDao() {
    return new PersonDao(personDataSource());
  }
}
A minor downside of this approach is that you need to autowire the Environment into all your configs which require properties from the properties file.

Related posts:
Spring 3: JavaConfig vs XML Config

Sunday, September 16, 2012

Spring 3: JavaConfig vs XML Config

Spring JavaConfig allows you to configure the Spring IoC container and define your beans purely in Java rather than XML. I have been using Java-based configuration in all my new projects now and prefer it over the traditional XML-based configuration.

Here is a small example illustrating what the XML and Java configurations look like:

XML Config

<beans>
  <bean id="personDataSource" destroy-method="close" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
    <property name="url" value="jdbc:oracle:thin:@firefly:1521:HRP2"/>
    <property name="username" value="scott"/>
    <property name="password" value="tiger"/>
  </bean>
  <bean id="personDao" class="com.example.PersonDao">
    <property name="dataSource" ref="personDataSource"/>
  </bean>
</beans>
Java Config
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AppConfig {
  
  @Bean
  public DataSource personDataSource(){
    org.apache.commons.dbcp.BasicDataSource ds =
        new org.apache.commons.dbcp.BasicDataSource();
    ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
    ds.setUrl("jdbc:oracle:thin:@firefly:1521:HRP2");
    ds.setUsername("scott");
    ds.setPassword("tiger");
    return ds;
  }
  
  @Bean
  public PersonDao personDao() {
    return new PersonDao(personDataSource());
  }
}
Why I like JavaConfig:

Here are a few reasons, in no particular order, as to why I like the Java-based configuration:

  1. Easy to learn: With XML, I always found myself copying the app-context.xml from my last project to get started, and I have a hard time remembering the XML schema. JavaConfig is intuitive enough that you can write your configuration from scratch - all you need to remember are a few annotations. Java-based configuration is also more succinct and readable than XML, which makes it easier to see at a glance how your Spring container is configured.

  2. Quicker to write: It is faster to write JavaConfig because your Java IDE will help you complete class names and methods.

  3. Type safety: In XML, it is easy to type the name of a class or property incorrectly, but with JavaConfig this is not possible because you will be using code completion in your Java IDE. Even if you are not, you will get a compiler error and can fix it straightaway.

  4. Faster navigation: It is quicker to jump from one bean to another, track bean references etc because, since they are just Java classes and methods, you can use your IDE shortcuts to find types, go into method declarations and view call hierarchies.

  5. No context switching: Another advantage of JavaConfig is that your brain (and IDE) does not have to keep switching between XML and Java. You can stay happily in the Java world.

  6. No more XML! I hate XML in general. I find Spring XML verbose and hard to follow. It does not "belong" in a Java project. Sorry, but I don't think I will ever be using it again.

Oh, and YMMV :)

Saturday, September 15, 2012

Testing Expected Exceptions with JUnit Rules

This post shows how to test for expected exceptions using JUnit. Let's start with the following class that we wish to test:
public class Person {
  private final String name;
  private final int age;
    
  /**
   * Creates a person with the specified name and age.
   *
   * @param name the name
   * @param age the age
   * @throws IllegalArgumentException if the age is not greater than zero
   */
  public Person(String name, int age) {
    if (age <= 0) {
      throw new IllegalArgumentException("Invalid age:" + age);
    }
    this.name = name;
    this.age = age;
  }
}
In the example above, the Person constructor throws an IllegalArgumentException if the age of the person is not greater than zero. There are different ways to test this behaviour:

Approach 1: Use the ExpectedException Rule
This is my favourite approach. The ExpectedException rule allows you to specify, within your test, what exception you are expecting and even what the exception message is. This is shown below:

import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;

import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;

public class PersonTest {

  @Rule
  public ExpectedException exception = ExpectedException.none();
  
  @Test
  public void testExpectedException() {
    exception.expect(IllegalArgumentException.class);
    exception.expectMessage(containsString("Invalid age"));
    new Person("Joe", -1);
  }
}
Approach 2: Specify the exception in the @Test annotation
As shown in the code snippet below, you can specify the expected exception in the @Test annotation. The test will pass only if an exception of the specified class is thrown by the test method. Unfortunately, you can't test the exception message with this approach.
@Test(expected = IllegalArgumentException.class)
public void testExpectedException2() {
  new Person("Joe", -1);
}
Approach 3: Use a try-catch block
This is the "traditional" approach used with older versions of JUnit, before the introduction of annotations and rules. Surround your code in a try-catch block and test whether the exception is thrown. Don't forget to make the test fail if the exception is not thrown!
@Test
public void testExpectedException3() {
  try {
    new Person("Joe", -1);
    fail("Should have thrown an IllegalArgumentException because age is invalid!");
  } catch (IllegalArgumentException e) {
    assertThat(e.getMessage(), containsString("Invalid age"));
  }
}

Wednesday, August 15, 2012

Java 7: Fork/Join Framework Example

The Fork/Join Framework in Java 7 is designed for work that can be broken down into smaller tasks and the results of those tasks combined to produce the final result.

In general, classes that use the Fork/Join Framework follow this simple algorithm:

// pseudocode
Result solve(Problem problem) {
  if (problem.size < SEQUENTIAL_THRESHOLD)
    return solveSequentially(problem);
  else {
    Result left, right;
    INVOKE-IN-PARALLEL {
      left = solve(extractLeftHalf(problem));
      right = solve(extractRightHalf(problem));
    }
    return combine(left, right);
  }
}
In order to demonstrate this, I have created an example to find the maximum number from a large array using fork/join:
import java.util.Random;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class MaximumFinder extends RecursiveTask<Integer> {

  private static final int SEQUENTIAL_THRESHOLD = 5;

  private final int[] data;
  private final int start;
  private final int end;

  public MaximumFinder(int[] data, int start, int end) {
    this.data = data;
    this.start = start;
    this.end = end;
  }

  public MaximumFinder(int[] data) {
    this(data, 0, data.length);
  }

  @Override
  protected Integer compute() {
    final int length = end - start;
    if (length < SEQUENTIAL_THRESHOLD) {
      return computeDirectly();
    }
    final int split = length / 2;
    final MaximumFinder left = new MaximumFinder(data, start, start + split);
    left.fork();
    final MaximumFinder right = new MaximumFinder(data, start + split, end);
    return Math.max(right.compute(), left.join());
  }

  private Integer computeDirectly() {
    System.out.println(Thread.currentThread() + " computing: " + start
                       + " to " + end);
    int max = Integer.MIN_VALUE;
    for (int i = start; i < end; i++) {
      if (data[i] > max) {
        max = data[i];
      }
    }
    return max;
  }

  public static void main(String[] args) {
    // create a random data set
    final int[] data = new int[1000];
    final Random random = new Random();
    for (int i = 0; i < data.length; i++) {
      data[i] = random.nextInt(100);
    }

    // submit the task to the pool
    final ForkJoinPool pool = new ForkJoinPool(4);
    final MaximumFinder finder = new MaximumFinder(data);
    System.out.println(pool.invoke(finder));
  }
}
The MaximumFinder class is a RecursiveTask which is responsible for finding the maximum number from an array. If the size of the array is less than a threshold (5) then find the maximum directly, by iterating over the array. Otherwise, split the array into two halves, recurse on each half and wait for them to complete (join). Once we have the result of each half, we can find the maximum of the two and return it.

Tuesday, August 14, 2012

Analysing a Java Core Dump

In this post, I will show you how you can debug a Java core file to see what caused your JVM to crash. I will be using a core file I generated in my previous post: Generating a Java Core Dump.

There are different ways you can diagnose a JVM crash, listed below:

The hs_err_pid log file
When a fatal error occurs in the JVM, it produces an error log file called hs_err_pidXXXX.log, normally in the working directory of the process or in the temporary directory for the operating system. The top of this file contains the cause of the crash and the "problematic frame". For example, mine shows:

$ head hs_err_pid21178.log
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000002b1d00075c, pid=21178, tid=1076017504
#
# JRE version: 6.0_21-b06
# Java VM: Java HotSpot(TM) 64-Bit Server VM (17.0-b16 mixed mode linux-amd64 )
# Problematic frame:
# C  [libnativelib.so+0x75c]  bar+0x10
#
There is also a stack trace:
Stack: [0x000000004012b000,0x000000004022c000],  sp=0x000000004022aac0,  free space=3fe0000000000000018k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [libnativelib.so+0x75c]  bar+0x10
C  [libnativelib.so+0x772]  foo+0xe
C  [libnativelib.so+0x78e]  Java_CoreDumper_core+0x1a
j  CoreDumper.core()V+0
j  CoreDumper.main([Ljava/lang/String;)V+7
v  ~StubRoutines::call_stub
V  [libjvm.so+0x3e756d]
The stack trace shows that my java method, CoreDumper.core(), called into JNI and died when the bar function was called in native code.

Debugging a Java Core Dump
In some cases, the JVM may not produce a hs_err_pid file, for example, if the native code abruptly aborts by calling the abort function. In such cases, we need to analyse the core file produced. On my machine, the operating system writes out core files to /var/tmp/cores. You can use the following command to see where your system is configured to write out core files to:

$ cat /proc/sys/kernel/core_pattern
/var/tmp/cores/%e.%p.%u.core
$ ls /var/tmp/cores
java.21178.146385.core
There are a few different ways to look at core dumps:

1. Using gdb
GNU Debugger (gdb) can examine a core file and work out what the program was doing when it crashed.

$ gdb $JAVA_HOME/bin/java /var/tmp/cores/java.14015.146385.core
(gdb) where
#0  0x0000002a959bd26d in raise () from /lib64/tls/libc.so.6
#1  0x0000002a959bea6e in abort () from /lib64/tls/libc.so.6
#2  0x0000002b1cecf799 in bar () from libnativelib.so
#3  0x0000002b1cecf7a7 in foo () from libnativelib.so
#4  0x0000002b1cecf7c3 in Java_CoreDumper_core () from libnativelib.so
#5  0x0000002a971aac88 in ?? ()
#6  0x0000000040113800 in ?? ()
#7  0x0000002a9719fa42 in ?? ()
#8  0x000000004022ab10 in ?? ()
#9  0x0000002a9a4d5488 in ?? ()
#10 0x000000004022ab70 in ?? ()
#11 0x0000002a9a4d59c8 in ?? ()
#12 0x0000000000000000 in ?? ()
The where command prints the stack frames and shows that the bar function called abort() which caused the crash.

2. Using jstack
jstack prints stack traces of Java threads for a given core file.

$ jstack -J-d64 $JAVA_HOME/bin/java /var/tmp/cores/java.14015.146385.core
Debugger attached successfully.
Server compiler detected.
JVM version is 17.0-b16
Deadlock Detection:

No deadlocks found.

Thread 16788: (state = BLOCKED)

Thread 16787: (state = BLOCKED)
 - java.lang.Object.wait(long) @bci=0 (Interpreted frame)
 - java.lang.ref.ReferenceQueue.remove(long) @bci=44, line=118 (Interpreted frame)
 - java.lang.ref.ReferenceQueue.remove() @bci=2, line=134 (Interpreted frame)
 - java.lang.ref.Finalizer$FinalizerThread.run() @bci=3, line=159 (Interpreted frame)

Thread 16786: (state = BLOCKED)
 - java.lang.Object.wait(long) @bci=0 (Interpreted frame)
 - java.lang.Object.wait() @bci=2, line=485 (Interpreted frame)
 - java.lang.ref.Reference$ReferenceHandler.run() @bci=46, line=116 (Interpreted frame)

Thread 16780: (state = IN_NATIVE)
 - CoreDumper.core() @bci=0 (Interpreted frame)
 - CoreDumper.main(java.lang.String[]) @bci=7, line=12 (Interpreted frame)
3. Using jmap
jmap examines a core file and prints out shared object memory maps or heap memory details.
$ jmap -J-d64 $JAVA_HOME/bin/java /var/tmp/cores/java.14015.146385.core
Debugger attached successfully.
Server compiler detected.
JVM version is 17.0-b16
0x0000000040000000      49K     /usr/sunjdk/1.6.0_21/bin/java
0x0000002a9566c000      124K    /lib64/tls/libpthread.so.0
0x0000002a95782000      47K     /usr/sunjdk/1.6.0_21/jre/lib/amd64/jli/libjli.so
0x0000002a9588c000      16K     /lib64/libdl.so.2
0x0000002a9598f000      1593K   /lib64/tls/libc.so.6
0x0000002a95556000      110K    /lib64/ld-linux-x86-64.so.2
0x0000002a95bca000      11443K  /usr/sunjdk/1.6.0_21/jre/lib/amd64/server/libjvm.so
0x0000002a96699000      625K    /lib64/tls/libm.so.6
0x0000002a9681f000      56K     /lib64/tls/librt.so.1
0x0000002a96939000      65K     /usr/sunjdk/1.6.0_21/jre/lib/amd64/libverify.so
0x0000002a96a48000      228K    /usr/sunjdk/1.6.0_21/jre/lib/amd64/libjava.so
0x0000002a96b9e000      109K    /lib64/libnsl.so.1
0x0000002a96cb6000      54K     /usr/sunjdk/1.6.0_21/jre/lib/amd64/native_threads/libhpi.so
0x0000002a96de8000      57K     /lib64/libnss_files.so.2
0x0000002a96ef4000      551K    /lib64/libnss_db.so.2
0x0000002a97086000      89K     /usr/sunjdk/1.6.0_21/jre/lib/amd64/libzip.so
0x0000002b1cecf000      6K      /home/sharfah/tmp/jni/libnativelib.so
Useful Links:
Crash course on JVM crash analysis
Generating a Java Core Dump

Monday, August 13, 2012

Generating a Java Core Dump

This post demonstrates how you can generate a Java core dump manually (using JNI).

1. Create a Java class

/**
 * A class to demonstrate core dumping.
 */
public class CoreDumper {

  // load the library
  static {
    System.loadLibrary("nativelib");
  }

  // native method declaration
  public native void core();

  public static void main(String[] args) {
    new CoreDumper().core();
  }
}
2. Compile the Java class
$ javac CoreDumper.java
$ ls
CoreDumper.class  CoreDumper.java
3. Generate the header file
$ javah -jni CoreDumper
$ ls
CoreDumper.class  CoreDumper.h  CoreDumper.java
4. Implement the native method
Copy the method declaration from the header file and create a new file called CoreDumper.c containing the implementation of this method:
#include "CoreDumper.h"

void bar() {
  // the following statements will produce a core
  int* p = NULL;
  *p = 5;

  // alternatively:
  // abort();
}

void foo() {
  bar();
}

JNIEXPORT void JNICALL Java_CoreDumper_core
  (JNIEnv *env, jobject obj) {
  foo();
}
5. Compile the native code
This command may vary based on your operating system. On my Red Hat Linux machine, I use the following command:
$ gcc -fPIC -o libnativelib.so -shared \
            -I$JAVA_HOME/include/linux/ \
            -I$JAVA_HOME/include/ \
             CoreDumper.c
$ ls
CoreDumper.class  CoreDumper.h  CoreDumper.java libnativelib.so
6. Run the program
$ java -Djava.library.path=. CoreDumper
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000002b1cecf75c, pid=18919, tid=1076017504
#
# JRE version: 6.0_21-b06
# Java VM: Java HotSpot(TM) 64-Bit Server VM (17.0-b16 mixed mode linux-amd64 )
# Problematic frame:
# C  [libnativelib.so+0x75c]  bar+0x10
#
# An error report file with more information is saved as:
# /home/sharfah/tmp/jni/hs_err_pid18919.log
#
# If you would like to submit a bug report, please visit:
#   http://java.sun.com/webapps/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
Aborted (core dumped)
The core file
As shown above, running the program causes it to crash and a core file is produced. On my machine, the operating system writes core files to /var/tmp/cores. You can use the following command to see where your system is configured to write core files:
$ cat /proc/sys/kernel/core_pattern
/var/tmp/cores/%e.%p.%u.core
$ ls /var/tmp/cores
java.21178.146385.core
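If no core file appears at all when a program crashes, the shell's core-size limit is probably zero. A quick check, sketched below (paths and limits will differ on your system):

```shell
# sketch: make sure core dumps are enabled before reproducing a crash
echo "core size limit: $(ulimit -c)"   # 0 means core dumps are disabled
ulimit -c unlimited 2>/dev/null || echo "could not raise the core limit"
echo "core pattern: $(cat /proc/sys/kernel/core_pattern 2>/dev/null || echo unknown)"
```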
In my next post, I will show you how you can perform some quick analysis on a core file to see what caused the crash.

Next Post:
Analysing a Java Core Dump

Thursday, August 09, 2012

Running a command on multiple hosts

There are different ways you can run a command on multiple machines.

1. For loop
If you want to execute the same command on a few hosts, you can use a for loop as shown below:

for host in host1 host2 host3
do
    ssh $host "hostname; who -b"
done
The example above iterates over a list of hosts, and runs two commands on each one to print the name of the host and the time it was rebooted.
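A variation that keeps each host's output in its own file can be useful when the host list grows. In the sketch below, echo stands in for the real ssh call so it runs anywhere, and the hostnames are made up:

```shell
#!/bin/sh
# sketch: one output file per host; 'echo' stands in for the real ssh call
outdir=$(mktemp -d)
for host in host1 host2 host3
do
    # real version: ssh "$host" "hostname; who -b" > "$outdir/$host.out"
    echo "output from $host" > "$outdir/$host.out"
done
ls "$outdir"
```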

2. While loop
If your list of hosts is stored in a file, you can use a while loop as shown below:

while IFS= read -r host
do
    ssh -n $host "hostname; who -b"
done < /tmp/myhosts
You must provide the -n option to ssh; otherwise ssh reads the remaining hostnames from standard input, the command runs only on the first host in your file, and the loop terminates.
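You can see the stdin-swallowing effect with a stand-in command. In this sketch, cat plays the role of ssh: redirecting its input from /dev/null, as ssh -n does, leaves the host list intact for read (the temp file is just for illustration):

```shell
#!/bin/sh
# sketch: why the loop needs ssh -n
hosts=$(mktemp)
printf 'host1\nhost2\nhost3\n' > "$hosts"
count=0
while IFS= read -r host
do
    # without the redirection, this stand-in would swallow the remaining
    # hostnames from stdin, exactly like ssh without -n
    cat > /dev/null < /dev/null
    count=$((count + 1))
done < "$hosts"
echo "visited $count hosts"   # → visited 3 hosts
rm -f "$hosts"
```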

3. Parallel ssh
Parallel ssh (pssh) allows you to run a command on several hosts at the same time and is much faster than using a sequential loop if the number of hosts is large. You can specify how many parallel processes it uses to ssh to the various hosts (default is 32).

$ pssh
Usage: pssh [OPTIONS] -h hosts.txt prog [arg0] ..

  -h --hosts   hosts file (each line "host[:port] [user]")
  -l --user    username (OPTIONAL)
  -p --par     max number of parallel threads (OPTIONAL)
  -o --outdir  output directory for stdout files (OPTIONAL)
  -t --timeout timeout in seconds to do ssh to a host (OPTIONAL)
  -v --verbose turn on warning and diagnostic messages (OPTIONAL)
  -O --options SSH options (OPTIONAL)

$ pssh -h /tmp/myhosts -o /tmp/output "hostname; who -b"

Saturday, August 04, 2012

bash error: value too great for base

I came across this interesting error today:
-bash: 08: value too great for base (error token is "08")
It was coming from a script which works out the previous month by extracting the current month from the current date and then decrementing it. The code looks like this:
today="$(date +%Y%m%d)"
month=${today:4:2}
prevmonth=$((--month))
This script throws an error only if the current month is 08 or 09. I found that the reason for this is that numbers starting with 0 are interpreted as octal numbers and 8 and 9 are not in the base-8 number system, hence the error. There are more details on the bash man page:
Constants with a leading 0 are interpreted as octal numbers. A leading 0x or 0X denotes hexadecimal. Otherwise, numbers take the form [base#]n, where base is a decimal number between 2 and 64 representing the arithmetic base, and n is a number in that base. If base# is omitted, then base 10 is used.
To fix this issue, I specified the base-10 prefix as shown below:
today="$(date +%Y%m%d)"
month=10#${today:4:2}
prevmonth=$((--month))
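A sketch of the same idea that also handles the January wrap-around (the sample date is made up; in the real script it would come from date +%Y%m%d):

```shell
#!/bin/bash
today=20120801                            # stand-in for "$(date +%Y%m%d)"
month=$((10#${today:4:2}))                # force base 10: "08" -> 8
prevmonth=$(( month == 1 ? 12 : month - 1 ))
printf -v prevmonth '%02d' "$prevmonth"   # keep the leading zero
echo "$prevmonth"                         # → 07
```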

Saturday, June 30, 2012

vim: Change statusline colour based on mode

Here is my vimrc statusline configuration:
" statusline
" format markers:
"   %t File name (tail) of file in the buffer
"   %m Modified flag, text is " [+]"; " [-]" if 'modifiable' is off.
"   %r Readonly flag, text is " [RO]".
"   %y Type of file in the buffer, e.g., " [vim]".
"   %= Separation point between left and right aligned items.
"   %l Line number.
"   %L Number of lines in buffer.
"   %c Column number.
"   %P percentage through buffer
set statusline=%t\ %m%r%y%=(ascii=\%03.3b,hex=\%02.2B)\ (%l/%L,%c)\ (%P)
set laststatus=2
" change highlighting based on mode
if version >= 700
  highlight statusLine cterm=bold ctermfg=black ctermbg=red
  au InsertLeave * highlight StatusLine cterm=bold ctermfg=black ctermbg=red
  au InsertEnter * highlight StatusLine cterm=bold ctermfg=black ctermbg=green
endif
It displays some useful information about the file and your position within it. It also automatically changes the colour of the statusline from red to green when you enter INSERT mode and back to red when you leave it.

This is what the status line looks like in INSERT mode:

Foo.java [RO][java] (ascii=097,hex=61) (158/667,23) (26%)

To see a description of all possible status line variables type :help statusline in vim.

To see my complete vimrc visit my GitHub dotfiles repository.

Saturday, May 26, 2012

MultiTail: Viewing Multiple Files with Custom Colorschemes

MultiTail is a program which allows you to tail multiple files in a single terminal. The feature I find most useful is its ability to highlight text in files using "colorschemes". There are a number of pre-defined colorschemes which can be found in the configuration file, multitail.conf.

Here is an example of using multitail. The command below tails two files: an apache access log and a tomcat catalina log using two different colorschemes.

$ multitail -cS apache /tmp/apache/access_log -cS log4j ${TOMCAT_HOME}/logs/catalina.out
You can also add additional colorschemes to your ~/.multitailrc. A colorscheme is simply a set of regular expressions to capture and highlight the text you are interested in. Here is my config file which contains my custom XML colour scheme.
check_mail:0

colorscheme:xml
# element text
cs_re_s:white:>([^<]*)<
# attribute key
cs_re_s:green: ([^ =]*)=
# attribute value
cs_re_s:red:=("[^"]*")
# element name
cs_re:blue,,bold:<[^>]*>
Used like this:
$  multitail -cS xml /var/log/config.xml
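Along the same lines, here is a sketch of a scheme for highlighting log levels, modelled on the cs_re syntax used above (untested; adjust the colours and patterns to taste):

```
colorscheme:loglevels
cs_re:red:ERROR
cs_re:yellow:WARN
cs_re:green:INFO
```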
Related Post:
Highlighting Command Output with Generic Colouriser

Saturday, April 28, 2012

Calling getdate() using Hibernate

This post shows you how to use Hibernate to call Sybase's getdate() function in order to get the current date and time on your database server.

First, you need to create an entity to represent the date object. Hibernate will then map the result of getdate() to this entity.

import java.util.Date;

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

/**
 * Represents a date entity.
 * Used by hibernate to map the getdate() sybase function onto.
 */
@Entity
public class DBDateTime {

    @Id
    @Temporal(TemporalType.TIMESTAMP)
    private Date date;

    /**
     * @return the date
     */
    public Date getDate() {
        return date;
    }
}
Usage:
The code snippet below shows how you would call getdate() on your Sybase database and get a Date returned:
Query query = entityManager.createNativeQuery("SELECT getdate() as date", 
                                              DBDateTime.class);
DBDateTime dateEntity = (DBDateTime) query.getSingleResult();
Date now = dateEntity.getDate();

Saturday, April 14, 2012

Sybase: How to BCP data in and out of databases

To quickly copy data from a table in one database to another database, for example, from production to a development environment, use the Sybase bcp utility as follows:

Step 1: bcp out to a file
First run bcp to copy data out of your database table and into a flat file. Just hit [Return] when prompted for lengths of columns, but remember to save the table format information to a file. An example is shown below:

$ bcp  Customers out /tmp/bcp.out -S server1 -t, -U username -P password
Enter the file storage type of field firstName [char]:
Enter prefix-length of field firstName [0]:
Enter length of field firstName [32]:
Enter field terminator [,]:

Enter the file storage type of field lastName [char]:
Enter prefix-length of field lastName [0]:
Enter length of field lastName [10]:
Enter field terminator [,]:

Enter the file storage type of field accessTime [smalldatetime]:
Enter prefix-length of field accessTime [0]:
Enter field terminator [,]:

Do you want to save this format information in a file? [Y/n] Y

Host filename [bcp.fmt]: /tmp/bcp.fmt

Starting copy...

14 rows copied.
Clock Time (ms.): total = 1  Avg = 0 (14000.00 rows per sec.)
Step 2: bcp in to the target database
Next run bcp to copy data from the flat file to your target database using the format file you saved in Step 1.
$ bcp  Customers in /tmp/bcp.out -S server2 -f /tmp/bcp.fmt -U username -P password
Starting copy...

14 rows copied.
Clock Time (ms.): total = 9  Avg = 0 (1555.56 rows per sec.)

Saturday, April 07, 2012

Using a NamedQuery with a Composite ID in Hibernate

Last week I wrote about how you can create a Composite ID in Hibernate if your table has multiple key columns. In this post, I will show you how you can use a NamedQuery to select entities with embedded composite ids.

The code is shown below. The entity class is the same as before except for the @NamedQueries annotation which defines two named queries. The first query searches for a user with a specific first name and last name. The second query allows you to specify multiple last names in an "in-clause".

import java.io.Serializable;
import java.util.Date;

import javax.persistence.Column;
import javax.persistence.Embeddable;
import javax.persistence.EmbeddedId;
import javax.persistence.Entity;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

/**
 * The User entity which contains an embedded UserId.
 */
@Entity
@Table(name = "Users")
@NamedQueries({
  @NamedQuery(name="user.findByName",
              query = "from User where id.firstName = :firstName and id.lastName = :lastName"),
  @NamedQuery(name="user.withLastNames",
              query = "from User where id.lastName in (:lastNames)")
})
public class User {

  @EmbeddedId
  private UserId id;

  private String website;

  @Temporal(TemporalType.TIMESTAMP)
  @Column(insertable = false)
  private Date lastUpdateTime;

  /**
   * Default constructor required by Hibernate.
   */
  public User() {
  }

  /**
   * This represents a "composite primary key" for the Users table.
   * It contains all the columns that form a unique id.
   * Must implement equals() and hashcode() and be serializable.
   * https://community.jboss.org/wiki/EqualsAndHashCode
   */
  @Embeddable
  public static class UserId  implements Serializable {

    private static final long serialVersionUID = 1L;

    private String firstName;
    private String lastName;

    /**
     * Default constructor required by hibernate.
     */
    public UserId() {
    }

    /** (non-Javadoc)
     * @see java.lang.Object#hashCode()
     */
    @Override
    public int hashCode() {
      final int prime = 31;
      int result = 1;
      result = prime * result + ((firstName == null) ? 0 : firstName.hashCode());
      result = prime * result + ((lastName == null) ? 0 : lastName.hashCode());
      return result;
    }

    /** (non-Javadoc)
     * @see java.lang.Object#equals(java.lang.Object)
     */
    @Override
    public boolean equals(Object obj) {
      if (this == obj)
        return true;
      if (obj == null)
        return false;
      if (getClass() != obj.getClass())
        return false;
      UserId other = (UserId) obj;
      if (firstName == null) {
        if (other.firstName != null)
          return false;
      } else if (!firstName.equals(other.firstName))
        return false;
      if (lastName == null) {
        if (other.lastName != null)
          return false;
      } else if (!lastName.equals(other.lastName))
        return false;
      return true;
    }
  }
}
Usage:
The code snippet below shows how you would use the named queries:

// the first query
Query query1 = entityManager.createNamedQuery("user.findByName")
                            .setParameter("firstName", "Peter")
                            .setParameter("lastName", "Griffin");
List<User> resultList = query1.getResultList();

// the second query
Query query2 = entityManager.createNamedQuery("user.withLastNames")
                            .setParameter("lastNames", Arrays.asList("Dent", "Griffin"));
List<User> resultList2 = query2.getResultList();

Saturday, March 31, 2012

Composite IDs in Hibernate

Consider the following database table which contains a list of users and the times they last accessed a certain website:
FirstName   LastName   Website   AccessTime
Arthur      Dent       Google    2012-03-30 07:09:00.0
Peter       Griffin    Yahoo!    2012-03-30 10:36:00.0
Let's say that you want to use both the FirstName and LastName fields to uniquely identify records in the table and perform searches/updates using Hibernate. There are different ways to define a "composite primary key" and, in this post, I will show you an approach I have used successfully in the past, which involves creating an EmbeddedId.

The @EmbeddedId approach:
The code is shown below and is pretty self-explanatory. It consists of a User class which contains a static nested UserId class. The latter holds the first name and the last name attributes and thus represents the composite ID.

import java.io.Serializable;

import javax.persistence.Embeddable;
import javax.persistence.EmbeddedId;
import javax.persistence.Entity;
import javax.persistence.Table;

import org.hibernate.annotations.Type;
import org.joda.time.DateTime;

/**
 * The User entity which contains an embedded UserId.
 */
@Entity
@Table(name = "Users")
public class User {

  @EmbeddedId
  private UserId id;

  private String website;

  @Type(type = "org.joda.time.contrib.hibernate.PersistentDateTime")
  private DateTime accessTime;

  /**
   * Default constructor required by Hibernate.
   */
  public User() {
  }

  /**
   * @param id the user id
   */
  public User(UserId id) {
    this.id = id;
  }

  /**
   * @return the id
   */
  public UserId getId() {
    return id;
  }

  /**
   * @param id the id to set
   */
  public void setId(UserId id) {
    this.id = id;
  }

  /**
   * @return the website
   */
  public String getWebsite() {
    return website;
  }

  /**
   * @param website the website to set
   */
  public void setWebsite(String website) {
    this.website = website;
  }

  /**
   * @return the accessTime
   */
  public DateTime getAccessTime() {
    return accessTime;
  }

  /**
   * @param accessTime the accessTime to set
   */
  public void setAccessTime(DateTime accessTime) {
    this.accessTime = accessTime;
  }

  /**
   * This represents a "composite primary key" for the Users table.
   * It contains all the columns that form a unique id.
   * Must implement equals() and hashcode() and be serializable.
   * https://community.jboss.org/wiki/EqualsAndHashCode
   */
  @Embeddable
  public static class UserId  implements Serializable {

    private static final long serialVersionUID = 1L;

    private String firstName;
    private String lastName;

    /**
     * Default constructor required by Hibernate.
     */
    public UserId() {
    }

    /**
     * @param firstName the first name
     * @param lastName the last name
     */
    public UserId(String firstName, String lastName) {
      this.firstName = firstName;
      this.lastName = lastName;
    }

    /**
     * @return the firstName
     */
    public String getFirstName() {
      return firstName;
    }

    /**
     * @param firstName the firstName to set
     */
    public void setFirstName(String firstName) {
      this.firstName = firstName;
    }

    /**
     * @return the lastName
     */
    public String getLastName() {
      return lastName;
    }

    /**
     * @param lastName the lastName to set
     */
    public void setLastName(String lastName) {
      this.lastName = lastName;
    }

    /** (non-Javadoc)
     * @see java.lang.Object#hashCode()
     */
    @Override
    public int hashCode() {
      final int prime = 31;
      int result = 1;
      result = prime * result + ((firstName == null) ? 0 : firstName.hashCode());
      result = prime * result + ((lastName == null) ? 0 : lastName.hashCode());
      return result;
    }

    /** (non-Javadoc)
     * @see java.lang.Object#equals(java.lang.Object)
     */
    @Override
    public boolean equals(Object obj) {
      if (this == obj)
        return true;
      if (obj == null)
        return false;
      if (getClass() != obj.getClass())
        return false;
      UserId other = (UserId) obj;
      if (firstName == null) {
        if (other.firstName != null)
          return false;
      } else if (!firstName.equals(other.firstName))
        return false;
      if (lastName == null) {
        if (other.lastName != null)
          return false;
      } else if (!lastName.equals(other.lastName))
        return false;
      return true;
    }
  }
}
Usage:
The code snippet below shows how you would search for a user and update its access time:
UserId userId = new UserId("Peter", "Griffin");

// search for a user
User user = entityManager.find(User.class, userId);
if (user == null) {
    // the user doesn't exist, so create one
    user = new User(userId);
}
user.setAccessTime(new DateTime());

// update the user
entityManager.merge(user);
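The reason the embedded id must implement equals() and hashCode() can be seen with plain collections: Hibernate looks entities up by their id much as a HashMap looks values up by their key. A minimal sketch, with no JPA annotations and a hypothetical class name:

```java
import java.util.HashMap;
import java.util.Map;

// sketch: a composite key only works as a map key (and as a Hibernate id)
// if equals() and hashCode() compare field values, not object identity
public class CompositeKeyDemo {

    static final class UserId {
        final String firstName;
        final String lastName;

        UserId(String firstName, String lastName) {
            this.firstName = firstName;
            this.lastName = lastName;
        }

        @Override
        public boolean equals(Object o) {
            if (!(o instanceof UserId)) return false;
            UserId other = (UserId) o;
            return firstName.equals(other.firstName)
                && lastName.equals(other.lastName);
        }

        @Override
        public int hashCode() {
            return 31 * firstName.hashCode() + lastName.hashCode();
        }
    }

    public static void main(String[] args) {
        Map<UserId, String> cache = new HashMap<>();
        cache.put(new UserId("Peter", "Griffin"), "Yahoo!");
        // a *different* UserId instance with the same fields still finds the entry
        System.out.println(cache.get(new UserId("Peter", "Griffin"))); // prints "Yahoo!"
    }
}
```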

Sunday, March 25, 2012

Mocking with Mockito

Here is an example of using Mockito to mock a service. The service used in this example is a fictitious PersonService which returns Person objects based on their name. It might do this by connecting to an external database. In our unit tests, we don't want to connect to an external database, hence the reason for creating a mock. The mocked version shown below always returns a new Person object with the name that was passed into the service.
import static org.junit.Assert.assertEquals;
import static org.mockito.Matchers.anyString;
import static org.mockito.Mockito.when;

import org.junit.Before;
import org.junit.Test;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;

public class PersonServiceTest {

  @Mock private PersonService personService;

  @Before
  public void setUp() {
    MockitoAnnotations.initMocks(this);
    when(personService.getPerson(anyString())).thenAnswer(
      new Answer<Person>() {
        @Override
        public Person answer(InvocationOnMock invocation) throws Throwable {
          Object[] args = invocation.getArguments();
          return new Person((String)args[0]);
        }
      });
  }

  @Test
  public void testGetPerson(){
    assertEquals("Alice", personService.getPerson("Alice").getName());
  }
}

Saturday, March 24, 2012

Eclipse: Writing Better, Faster Java Documentation

JAutodoc:
I've started using JAutodoc, which is an Eclipse plugin for automatically adding Javadoc to your source code. It helps generate the initial Javadoc for methods which don't have them and can even complete existing Javadoc by adding missing parameters and return types. I have found this plugin very useful in writing documentation fast.

Let's say you have the following method which adds two ints:

public int add(int x, int y) {
    return x + y;
}
To invoke JAutodoc, all you have to do is hit Ctrl+Alt+J inside the method and the following Javadoc template will be automatically generated, which you can then complete.
/**
 * Adds the.
 *
 * @param x the x
 * @param y the y
 * @return the int
 */
public int add(int x, int y) {
    return x + y;
}
Later on, if you decide to change this method by adding another parameter to it, you can press Ctrl+Alt+J again and JAutoDoc will add the new parameter to the Javadoc but leave the rest of it unchanged.

Enabling Eclipse Javadoc warnings:
I've also found it useful to turn on Eclipse warnings for missing or malformed Javadoc comments. You can do this by going to Window > Preferences and then selecting Java > Compiler > Javadoc. Tick the box for processing Javadoc comments and select your desired severity levels. I've got mine set to Warnings for everything. You can get my Eclipse preferences from my git repository.

Saturday, February 04, 2012

Visual GC Error: Not Supported for this JVM

The Visual GC plugin for Visual VM graphically displays garbage collection, class loader, and HotSpot compiler performance data.

If you install this plugin and, like me, get the error "Not Supported for this JVM", when you try to start Visual GC, try the following steps:

  1. Start jstatd on the remote host. You need to specify a policy file, otherwise you will get an "access denied" AccessControlException. Create a policy file, e.g. /tmp/tools.policy, containing:
    grant codebase "file:${java.home}/../lib/tools.jar" {
       permission java.security.AllPermission;
    };
    
    Then start jstatd using the command below:
    jstatd -J-Djava.security.policy=/tmp/tools.policy
    
  2. Start Visual VM. Add a Remote Host and then add a "jstatd connection" to it. Your JVM should appear in the list. You can then click on it and look at the "Visual GC" tab for garbage collection information.

Saturday, January 21, 2012

Play My Code: Wacky Cannon

I have always wanted to make a projectile game and now I have! I've just published Wacky Cannon on Play My Code. The aim of the game is to control the cannon using your mouse and click to shoot the purple targets. See how many levels you can complete!

Here's Wacky Cannon!

Play My Code is a place where you can create your own online games easily using the site's Quby language. The games are compiled into native JavaScript, compatible with all modern HTML5-compliant browsers. Once you've written a game, you can embed it on your website or blog, like I have done above. It's like YouTube, but for games! So, if you think you've got a game in you, head over to Play My Code and write one.

Click here to see all my games, including the popular Chain Reaction!

Did you like Wacky Cannon? Share your thoughts in the comments below.

Related posts:
Play My Code: Chain Reaction

Saturday, January 14, 2012

stackoverflow - 30k rep

Seven months after crossing the 20k milestone, I've now achieved a reputation of 30k on stackoverflow! The following table shows some interesting stats about my journey so far:
                               0-10k           10-20k          20-30k          Total
Date achieved                  01/2011         05/2011         01/2012
Questions answered             546             376             253             1175
Questions asked                46              1               6               53
Tags covered                   609             202             83              894
Badges (gold, silver, bronze)  35 (2, 10, 23)  14 (0, 4, 10)   33 (2, 8, 23)   82 (4, 22, 56)
As I mentioned before, I have really enjoyed being a member of stackoverflow. For me, it has not simply been a quest for a high reputation, but more about learning new technologies and picking up advice from other experts on the site. I like to take on challenging questions, rather than the easy ones, because it pushes me to do research into areas I have never looked at before, and I learn so much during the process.

I have to admit, I haven't spent much time on stackoverflow recently. I've been busy at work and also took up three Stanford online courses which I completed at the end of last year.

Now let's see how fast I can make it to 40k!

Saturday, January 07, 2012

Stackless Exceptions for Improved Performance

One of the reasons standard exceptions are slow is that they have to fill in the execution stack trace for the current thread. Although this is useful for debugging, in most cases you don't really care about the stack trace. What you care about is that an exception of a certain type was thrown and what the error message was. For example, a java.io.FileNotFoundException was thrown with the message "config.xml (The system cannot find the file specified)".

Stackless Exceptions are exceptions without any associated stack information. They are faster to create than normal exceptions because they don't record information about the current state of the stack frames for the current thread.

The class below is an example of a stackless exception. The fillInStackTrace method has been overridden so that it doesn't do anything and simply returns the current instance.

/**
 * An exception which does not fill in a stack trace
 * for performance reasons.
 */
@SuppressWarnings("serial")
public class StacklessException extends Exception {

    /**
     * Constructs a new stackless exception
     * with the specified detail message.
     *
     * @param message the detail message.
     * @see java.lang.Exception#Exception(String)
     */
    public StacklessException(String message) {
        super(message);
    }

    /**
     * Does not fill in the stack trace for this exception
     * for performance reasons.
     *
     * @return this instance
     * @see java.lang.Throwable#fillInStackTrace()
     */
    @Override
    public Throwable fillInStackTrace() {
        return this;
    }
}
I measured performance by comparing the time taken to create a million StacklessException and Exception objects. I found that creating stackless exceptions is nearly 40 times faster than creating normal exceptions.
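The measurement was along these lines (a sketch; the absolute numbers depend on your JVM and on the stack depth at the allocation site, so treat the ratio, not the milliseconds, as the result):

```java
public class ExceptionBenchmark {

    // same idea as the StacklessException above, inlined to keep this self-contained
    static class StacklessException extends Exception {
        StacklessException(String message) { super(message); }
        @Override
        public Throwable fillInStackTrace() { return this; }
    }

    public static void main(String[] args) {
        final int n = 1_000_000;

        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) { new Exception("error"); }
        long normalNanos = System.nanoTime() - t0;

        t0 = System.nanoTime();
        for (int i = 0; i < n; i++) { new StacklessException("error"); }
        long stacklessNanos = System.nanoTime() - t0;

        System.out.printf("normal: %d ms, stackless: %d ms%n",
                normalNanos / 1_000_000, stacklessNanos / 1_000_000);
    }
}
```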

You can then create more specific exception types which subclass StacklessException. For example:

public class PersonNotFoundException extends StacklessException {
    public PersonNotFoundException(String message) {
        super(message);
    }
}
As an aside, note that the JVM omits stack traces if an exception is thrown very frequently. This optimisation is enabled by default and can be disabled using the JVM option -XX:-OmitStackTraceInFastThrow.

Sunday, January 01, 2012

fahd.blog in 2011

Happy 2012!
I'd like to wish everyone a great start to an even greater new year!

During 2011, I posted 58 new entries on fahd.blog. I am also thrilled to have readers from all over the world! Thanks for reading and especially for giving feedback.

Top 5 posts of 2011:

I'm going to be writing a lot more this year, so stay tuned for more great techie tips, tricks and hacks! :)