Automated Testing Layers and Execution Configuration

One of the most beneficial things you can do when developing production-quality applications is to apply automated testing in several different layers. In my most recent projects we’ve successfully built and deployed applications with no dedicated QA and a minimal amount of manual testing. First, let’s take a look at our layers:

  1. Unit Tests
  2. Spring Database/MVC Integration tests
  3. Pre-deployment Selenium Integration Tests
  4. Post-deployment Selenium Integration Tests

and then at how we configure our tests for execution during:

  1. development
  2. build

Unit Tests

Unit testing gives us a few advantages. To put it bluntly, it forces you to design and code loosely coupled objects, like we’ve always been taught but not always practiced. This isn’t to say that other code isn’t testable, but testing becomes a nightmare when a class does 100 different things.

Next, unit testing also documents our code. By writing a unit test I am telling other developers who end up working on the code “this is how it should work in these situations” and “these are my assumptions for this code”. Unit tests combined with verbose, self-documenting code help eliminate our need for large amounts of Javadoc and other written documentation (not to say you should write no documentation, but you should understand when and where it is necessary).

The main thing to understand here is that these tests do not determine whether your system actually works. They just make sure the assumptions of the original developer hold true for a given unit (a class, in this case).

Let’s take a look at an example unit test where we isolate our class under test by using Mockito to mock out its dependencies.

@RunWith(MockitoJUnitRunner.class)
public class ManifestServiceImplTest {
	private static final long FILEID = -1L;
	@InjectMocks
	private ManifestService manifestServiceImpl = new ManifestServiceImpl();
	@Mock
	private UserService userService;
	@Mock
	private MidService midService;
	@Mock
	private ManifestRepository manifestRepository;
	private Manifest manifest;
	private User user;
	private final String username = "abc";
	@Captor
	private ArgumentCaptor<Pageable> manifestPageCaptor;

	@Before
	public void setup() {
		user = new User();
		user.setUsername(username);
		when(userService.find(username)).thenReturn(user);
		manifest = new Manifest();
		when(manifestRepository.findOne(FILEID)).thenReturn(manifest);
		when(manifestRepository.save(manifest)).thenReturn(manifest);
	}

	@Test
	public void getAvailableEnvNone() {
		when(midService.hasCompletedMidCertificationStatus(username))
				.thenReturn(false);
		when(midService.hasIncompletedMidCertificationStatus(username))
				.thenReturn(false);
		assertTrue("no manifestEnvs should be returned if user has no mid",
				manifestServiceImpl.getAvailableManifestEnvs(username)
						.isEmpty());
	}

	@Test
	public void getAvailableEnvCompleteOnly() {
		when(midService.hasCompletedMidCertificationStatus(username))
				.thenReturn(true);
		when(midService.hasIncompletedMidCertificationStatus(username))
				.thenReturn(false);
		Set<ManifestEnv> manifestEnvs = manifestServiceImpl
				.getAvailableManifestEnvs(username);
		assertEquals(
				"manifestEnvs should have 2 entries when user only has completed mid cert",
				2, manifestEnvs.size());
		assertTrue("manifestEnvs should contain all ManifestEnv enums",
				manifestEnvs.containsAll(Arrays.asList(ManifestEnv.values())));
	}

	@Test
	public void getAvailableEnvIncompleteOnly() {
		when(midService.hasCompletedMidCertificationStatus(username))
				.thenReturn(false);
		when(midService.hasIncompletedMidCertificationStatus(username))
				.thenReturn(true);
		Set<ManifestEnv> manifestEnvs = manifestServiceImpl
				.getAvailableManifestEnvs(username);
		assertEquals(
				"manifestEnvs hsould only have 1 entry when user has only incomplete mid cert",
				1, manifestEnvs.size());
		assertTrue("mainfestEnvs should only contain TEM",
				manifestEnvs.contains(ManifestEnv.TEM));
	}

	@Test
	public void getAvailableEnvBoth() {
		when(midService.hasCompletedMidCertificationStatus(username))
				.thenReturn(true);
		when(midService.hasIncompletedMidCertificationStatus(username))
				.thenReturn(true);
		Set<ManifestEnv> manifestEnvs = manifestServiceImpl
				.getAvailableManifestEnvs(username);
		assertEquals(
				"manifestEnvs should have 2 entries when user only has completed mid cert",
				2, manifestEnvs.size());
		assertTrue("manifestEnvs should contain all ManifestEnv enums",
				manifestEnvs.containsAll(Arrays.asList(ManifestEnv.values())));
	}

	@Test
	public void find() {
		when(manifestRepository.findOne(FILEID)).thenReturn(manifest);
		final Manifest returnedManifest = manifestServiceImpl.find(FILEID);
		verify(manifestRepository).findOne(FILEID);
		assertEquals("manifest should be returned when found by FILEID",
				manifest, returnedManifest);
	}

	@Test
	public void findNotFound() {
		when(manifestRepository.findOne(FILEID)).thenReturn(null);
		final Manifest returnedManifest = manifestServiceImpl.find(FILEID);
		verify(manifestRepository).findOne(FILEID);
		assertEquals(
				"null should be returned when a manifest file is not found",
				null, returnedManifest);
	}

	@Test
	public void findUserManifestHistory() {
		final Page<Manifest> page = new PageImpl<Manifest>(
				Lists.newArrayList(manifest));
		when(
				manifestRepository.findByUserUsernameOrderByCreatedTimeDesc(
						eq("abc"), isA(Pageable.class))).thenReturn(page);
		manifestServiceImpl.findUserManifestHistory("abc");
		verify(manifestRepository).findByUserUsernameOrderByCreatedTimeDesc(
				eq("abc"), manifestPageCaptor.capture());
		assertEquals("user manifest histroy should be max 7 for page size", 7,
				manifestPageCaptor.getValue().getPageSize());
		assertEquals(
				"user manifest histroy should always return the first page", 0,
				manifestPageCaptor.getValue().getPageNumber());
	}

	@Test
	public void create() {
		final Manifest returnedManifest = manifestServiceImpl.create(manifest,
				username);
		assertEquals("user should be set to manifest when creating", user,
				returnedManifest.getUser());
		verify(manifestRepository).save(manifest);
	}

}

and our class under test

@Service
public class ManifestServiceImpl implements ManifestService {

	@Autowired
	private ManifestRepository manifestRepository;

	@Autowired
	private UserService userService;

	@Autowired
	private MidService midService;

	@Override
	public Manifest create(final Manifest manifest, final String username) {
		Validate.notNull(manifest);
		Validate.notBlank(username);
		final User user = userService.find(username);
		Validate.notNull(user);
		manifest.setUser(user);
		return manifestRepository.save(manifest);
	}

	@Override
	public Set<ManifestEnv> getAvailableManifestEnvs(final String username) {
		Validate.notBlank(username);
		final Set<ManifestEnv> envs = Sets.newHashSet();
		if (midService.hasCompletedMidCertificationStatus(username)) {
			envs.add(ManifestEnv.PROD);
			envs.add(ManifestEnv.TEM);
			return envs;
		}
		if (midService.hasIncompletedMidCertificationStatus(username)) {
			envs.add(ManifestEnv.TEM);
		}
		return envs;
	}

	@Override
	@PostAuthorize("returnObject == null or returnObject.user.username == principal.username")
	public Manifest find(final long id) {
		return manifestRepository.findOne(id);
	}

	@Override
	public Page<Manifest> findUserManifestHistory(final String username) {
		Validate.notBlank(username);
		final Pageable pageable = new PageRequest(0, 7);
		return manifestRepository.findByUserUsernameOrderByCreatedTimeDesc(
				username, pageable);
	}
}

I mainly want to look at the find method so we can demonstrate what unit tests do and do not do for us.

The find method does nothing except delegate to a Spring Data repository interface, so there is nothing to test except that the method gets called. Even so, we have two test methods for it. Why? We can’t test much functionality here since our method isn’t doing much on its own, and code coverage would be 100% with a single test method verifying that the mocked-out repository had its findOne method called. But we have a found and a notFound test, because our interface can return either a null or a non-null object. So here we aren’t really testing anything; we are documenting that our interface will return null when nothing is found by the repository.

That being said, let’s move on to some tests that actually exercise our system (partially) as a whole.

Spring Database and MVC Integration Tests

This is our first layer of integration tests. We use the spring-test framework to build tests that focus on our Spring container and everything below it. These tests spin up their own Spring context and are not deployed to a container. They also have full access to that Spring context, so we can inject beans directly into our test classes.

First let’s look at a Spring database integration test for our ManifestServiceImpl class, to compare it with the unit tests we created.

@FlywayTest
@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT", "/dbunit/dbunit.base.xml",
		"CLEAN_INSERT", "/dbunit/dbunit.dev.base.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.manifest.xml", "CLEAN_INSERT", "/dbunit/dbunit.mid.xml" })
public class ManifestServiceImplSpringDatabaseITest extends
		AbstractSpringDatabaseTest {

	@Autowired
	private ManifestService manifestService;

	@Autowired
	private UserRepository userRepository;
	private User user;

	@Before
	public void setup() {
		user = userRepository.findOne(-1L);
	}

	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.mid.cert.incomplete.xml" })
	public void getAvailableManifestEnvsTEMOnly() {
		Set<ManifestEnv> manifestEnvs = manifestService
				.getAvailableManifestEnvs(user.getUsername());
		assertEquals(
				"only one ManifestEnv should be returned when user only has incomplete mid",
				1, manifestEnvs.size());
		assertTrue("returned manifestEnvs should contain TEM",
				manifestEnvs.contains(ManifestEnv.TEM));
	}

	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.mid.cert.complete.xml" })
	public void getAvailableManifestEnvsTEMAndPROD() {
		Set<ManifestEnv> manifestEnvs = manifestService
				.getAvailableManifestEnvs(user.getUsername());
		assertEquals(
				"only TEM and PROD should be returned when user only has complete mid",
				2, manifestEnvs.size());
		assertTrue("returned manifestEnvs should contain TEM and PROD",
				manifestEnvs.containsAll(Arrays.asList(ManifestEnv.values())));
	}

	@Test
	public void findFound() {
		assertNotNull(manifestService.find(-1L));
	}

	@Test
	public void findNotFound() {
		assertNull("null should be returned when file not found",
				manifestService.find(-10L));
	}

	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.manifest.xml" })
	public void findUserManifestHistory() {
		assertEquals("user should have 2 manifest in their history", 2,
				manifestService.findUserManifestHistory(user.getUsername())
						.getNumberOfElements());
	}

	@Test
	public void create() throws IOException {
		final Manifest manifest = new Manifest();
		manifest.setEnvironment(ManifestEnv.PROD);
		byte[] data = "hello".getBytes();
		final MultipartFile multipartFile = new MockMultipartFile("somefile",
				"file.txt", null, new ByteArrayInputStream(data));

		manifest.setMultipartFile(multipartFile);
		final Manifest returnManifest = manifestService.create(manifest,
				user.getUsername());
		assertTrue(returnManifest.getFileSystemResource().exists());
		assertNotNull("id should be set for saved manifest", manifest.getId());
		assertNotNull("createdTime should be set on manifest",
				manifest.getCreatedTime());
		assertNotNull("path should be set on manifest", manifest.getPath());
		assertNotNull("filename should be set on manifest",
				manifest.getFilename());
		assertEquals("file should be saved when manifest is saved",
				data.length, IOUtils.toByteArray(manifest
						.getFileSystemResource().getInputStream()).length);
	}
}

Again, similar test methods, but this time our intentions are different. For integration tests we are now actually testing the application as it would run in a live environment. We don’t mock anything here, but we do need to set up test data in our database so that our tests are reproducible.

Let’s look at the create method this time. Since our object is a Hibernate entity, we expect Hibernate to perform some operations for us. In this case, our entity has a prePersist method that writes a file to the file system before the entity is saved to the database. That method also sets up the state of our entity by storing the path to the file, its original filename, and the time it was created, and Hibernate assigns an id.
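To make the callback concrete, here is a rough sketch of what that entity hook might look like. The real Manifest is a Hibernate entity whose method carries JPA’s @PrePersist annotation; the field and helper names below are assumptions, and a byte array stands in for the MultipartFile.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Date;

// Sketch of the Manifest entity's persistence callback (names assumed).
public class ManifestSketch {

	private final String originalFilename;
	private final byte[] uploadBytes; // stand-in for the MultipartFile contents
	private String filename;
	private String path;
	private Date createdTime;

	public ManifestSketch(final String originalFilename, final byte[] uploadBytes) {
		this.originalFilename = originalFilename;
		this.uploadBytes = uploadBytes;
	}

	// @PrePersist in the real entity: runs just before Hibernate INSERTs
	// the row, so the file is on disk before the database row exists.
	public void prePersist(final Path storageDir) throws IOException {
		createdTime = new Date();
		filename = originalFilename;
		final Path target = storageDir.resolve(System.nanoTime() + "-" + originalFilename);
		Files.write(target, uploadBytes);
		path = target.toString();
	}

	public String getFilename() { return filename; }
	public String getPath() { return path; }
	public Date getCreatedTime() { return createdTime; }
}
```

The create integration test above is asserting exactly these side effects: the id, path, filename, and createdTime are set, and the bytes actually landed on disk.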

The @FlywayTest annotation handles our database lifecycle. It can be placed at the method or class level and will clean and rebuild the database. Combined with the @DBUnitSupport annotation, this lets us fully control the state of our database for each test. See https://github.com/flyway/flyway-test-extensions for more information.
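The files passed to @DBUnitSupport are DBUnit flat XML datasets. As a sketch of what a file like /dbunit/dbunit.manifest.xml might contain (table and column names here are assumptions inferred from the entities shown in this post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
    <!-- Each element is a row: the element name is the table, the
         attributes are columns. Negative ids keep hand-written test
         data clearly separate from sequence-generated ids. -->
    <MANIFEST ID="-1" USER_ID="-1" ENVIRONMENT="PROD"
              FILENAME="file.txt" PATH="/tmp/file.txt"
              CREATED_TIME="2014-01-01 00:00:00"/>
    <MANIFEST ID="-2" USER_ID="-1" ENVIRONMENT="TEM"
              FILENAME="file2.txt" PATH="/tmp/file2.txt"
              CREATED_TIME="2014-01-02 00:00:00"/>
</dataset>
```

The CLEAN_INSERT operation deletes everything in the listed tables and inserts these rows, which is what makes each test start from a known database state.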

That being said, let’s take a look at the AbstractSpringDatabaseTest class that we extend, so we can see how everything is configured for these tests.

@RunWith(SpringJUnit4ClassRunner.class)
@Category(IntegrationTest.class)
@ContextConfiguration(classes = { SpringTestConfig.class })
@ActiveProfiles({ "DEV_SERVICES", "TEST_DB", "DEV_SECURITY" })
@TestExecutionListeners({ DependencyInjectionTestExecutionListener.class,
		FlywayDBUnitTestExecutionListener.class,
		TransactionalTestExecutionListener.class })
@TransactionConfiguration(defaultRollback = false)
@Transactional("transactionManager")
public class AbstractSpringDatabaseTest {

}

A few things here. First, the @RunWith and @ContextConfiguration annotations set up our Spring context, and @ActiveProfiles sets the Spring profiles we want active while running these tests. @TestExecutionListeners lets us register listeners that spring-test provides hooks for: DependencyInjectionTestExecutionListener allows us to inject beans directly into our tests, FlywayDBUnitTestExecutionListener handles the @FlywayTest and @DBUnitSupport annotations, and TransactionalTestExecutionListener makes our tests transactional so Hibernate has transactions to work within. Finally, for transaction support we have @TransactionConfiguration, which allows us to configure our transactions, and @Transactional("transactionManager"), which actually wraps our test methods in transactions (you have most likely seen this annotation when writing transactional code).

Next we need to take a look at the SpringTestConfig class

@Configuration
@Import({ DatabaseTestConfig.class })
@ComponentScan(value = "com.company.artifact.app")
@PropertySource("classpath:test.properties")
public class SpringTestConfig {
	@Bean
	public static PropertySourcesPlaceholderConfigurer propertyPlaceholder() {
		final PropertySourcesPlaceholderConfigurer placeholder = new PropertySourcesPlaceholderConfigurer();
		placeholder.setIgnoreUnresolvablePlaceholders(true);
		return placeholder;
	}
}

Again, this class isn’t doing too much. It tells Spring to scan our base package for beans and imports another configuration class. It also imports some properties for our tests.

@Configuration
@Profile("TEST_DB")
@PropertySource({ "classpath:flyway.properties" })
public class DatabaseTestConfig {

	@Value("${flyway.user}")
	private String user;
	@Value("${flyway.password}")
	private String password;
	@Value("${flyway.url}")
	private String url;
	@Value("${flyway.locations}")
	private String locations;
	@Value("${flyway.placeholdersPrefix}")
	private String prefix;
	@Value("${flyway.placeholderSuffix}")
	private String suffix;

	@Bean(destroyMethod = "close")
	public DataSource dataSource() {
		final BasicDataSource basicDataSource = new BasicDataSource();
		basicDataSource.setUsername(user);
		basicDataSource.setPassword(password);
		basicDataSource.setUrl(url);
		basicDataSource.setMaxActive(-1);
		return basicDataSource;
	}

	@Bean
	public FlywayHelperFactory flywayHelperFactory() {
		final FlywayHelperFactory factory = new FlywayHelperFactory();
		final Properties flywayProperties = new Properties();
		flywayProperties.setProperty("flyway.user", user);
		flywayProperties.setProperty("flyway.password", password);
		flywayProperties.setProperty("flyway.url", url);
		flywayProperties.setProperty("flyway.locations", locations);
		flywayProperties.setProperty("flyway.placeholderPrefix", prefix);
		flywayProperties.setProperty("flyway.placeholderSuffix", suffix);
		factory.setFlywayProperties(flywayProperties);
		return factory;
	}

	@Bean
	public Flyway flyway() {
		final Flyway flyway = flywayHelperFactory().createFlyway();
		flyway.setDataSource(dataSource());
		return flyway;
	}

	@Bean
	@Qualifier("userNumber")
	public String userNumber() {
		return "";
	}
}

We need to set up our own datasource for these tests since they won’t be run from within our container and thus won’t have access to any JNDI resources. We also configure Flyway to use that datasource. flyway.properties is populated with defaults by our parent Maven pom and can be overridden during a test run if needed. We’ll see later, when we talk about the Maven build, how we use these properties to run against either an Oracle or an H2 database.
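For reference, a sketch of the kind of defaults flyway.properties might carry. The concrete values below are assumptions for illustration; in the real build they are filtered in from the parent pom, which is what lets a test run swap H2 for Oracle:

```properties
# Default test database (in-memory H2 sketch; the Maven build can
# override these properties to point at Oracle instead)
flyway.user=sa
flyway.password=
flyway.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
flyway.locations=db/migration
flyway.placeholderPrefix=${
flyway.placeholderSuffix=}
```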

Ignore the userNumber bean for now; we’ll get to it when we talk about pre- and post-deployment Selenium tests.

Next let’s look at how we extend these database tests to support using spring-test for testing Spring MVC.

@WebAppConfiguration
public class AbstractSpringMvcTest extends AbstractSpringDatabaseTest {

  @Autowired
  protected WebApplicationContext webApplicationContext;

  @Autowired
  private FilterChainProxy springSecurityFilterChain;

  protected MockMvc mockMvc;

  @Before
  public void setup() {
    mockMvc =
        MockMvcBuilders.webAppContextSetup(webApplicationContext)
            .addFilter(springSecurityFilterChain).build();
  }

  protected UserDetailsRequestPostProcessor custregUser(final String username) {
    return SecurityRequestPostProcessors.userDeatilsService(username).userDetailsServiceBeanId(
        "custregUserDetailsService");
  }
}

Here we extend our database test functionality and add in some Spring MVC test configuration. The code sets up Spring’s MockMvc for us, and sets a username and UserDetailsService for our security so we don’t need to mock out our user. We also annotate the configuration with @WebAppConfiguration so that a WebApplicationContext is created for us.

Currently Spring MVC Test does not support Spring Security, but there is an example at https://github.com/spring-projects/spring-test-mvc/blob/master/src/test/java/org/springframework/test/web/server/samples/context/SecurityRequestPostProcessors.java that we borrow from. This lets us create some request post-processors and add our Spring Security information before tests run, so we don’t need to mock out our SecurityContextHolder wrapper bean, since we’ll set an actual Authentication object on it. This feature will most likely be added in a later version of spring-test.

There’s not much configuration here beyond what we’ve covered in the database tests, so let’s take a look at an example using Spring MVC Test.

@FlywayTest
@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT", "/dbunit/dbunit.base.xml",
		"CLEAN_INSERT", "/dbunit/dbunit.dev.base.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.mid.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.mid.cert.complete.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.manifest.xml", })
public class ManifestControllerSpringMvcITest extends AbstractSpringMvcTest {

	private User user;
	@Autowired
	private UserRepository userRepository;
	private MockMultipartFile file;

	@Before
	public void setupData() {
		user = userRepository.findOne(-1L);
		file = new MockMultipartFile("multipartFile", "orig", null,
				"bar".getBytes());
	}

	@Test
	public void index() throws Exception {
		mockMvc.perform(get("/manifests").with(custregUser(user.getUsername())))
				.andExpect(view().name(is("manifest/index")))
				.andExpect(
						model().attributeExists("manifestHistory",
								"manifestEnvironments"));
	}

	@Test
	public void uploadAjaxEnvironmentValidationErrors() throws Exception {

		mockMvc.perform(doFileUpload(file).accept(MediaType.APPLICATION_JSON))
				.andExpect(status().isBadRequest())
				.andExpect(
						jsonPath("$.fieldErrors[0].field", is("environment")))
				.andExpect(
						jsonPath("$.fieldErrors[0].error",
								is("This field cannot be null.")));
	}

	@Test
	public void uploadAjaxFileEmptyValidationErrors() throws Exception {
		mockMvc.perform(
				doFileUpload(
						new MockMultipartFile("multipartFile", "orig", null,
								new byte[0]))
						.accept(MediaType.APPLICATION_JSON).param(
								"environment", "PROD"))
				.andExpect(content().contentType(MediaType.APPLICATION_JSON))
				.andExpect(status().isBadRequest())
				.andExpect(
						jsonPath("$.fieldErrors[0].field", is("multipartFile")))
				.andExpect(
						jsonPath("$.fieldErrors[0].error",
								is("Please select a valid file to upload.")));

	}

	@Test
	public void uploadAjaxSuccess() throws Exception {
		mockMvc.perform(
				doFileUpload(file).param("environment", "PROD").accept(
						MediaType.APPLICATION_JSON)).andExpect(status().isOk())
				.andExpect(content().contentType(MediaType.APPLICATION_JSON));

	}

	@Test
	public void uploadEnvironmentValidationErrors() throws Exception {

		mockMvc.perform(doFileUpload(file))
				.andExpect(status().isOk())
				.andExpect(model().hasErrors())
				.andExpect(
						model().attributeHasFieldErrors("manifest",
								"environment"));
	}

	@Test
	public void uploadEmptyFileValidationErrors() throws Exception {

		mockMvc.perform(
				doFileUpload(new MockMultipartFile("multipartFile", "orig",
						null, new byte[0])))
				.andExpect(status().isOk())
				.andExpect(model().hasErrors())
				.andExpect(
						model().attributeHasFieldErrors("manifest",
								"multipartFile"));
	}

	@Test
	public void uploadSuccess() throws Exception {
		mockMvc.perform(doFileUpload(file).param("environment", "PROD"))
				.andExpect(redirectedUrl("/manifests"))
				.andExpect(model().hasNoErrors());
	}

	private MockHttpServletRequestBuilder doFileUpload(
			final MockMultipartFile file) {
		return fileUpload("/manifests").file(file).with(
				custregUser(user.getUsername()));
	}
}

and the controller under test

@Controller
public class ManifestController {

	@Autowired
	private ManifestService manifestService;

	@Autowired
	private SecurityHolder securityHolder;

	@RequestMapping(value = "manifests", method = RequestMethod.GET)
	public String index(@ModelAttribute final Manifest manifest,
			final Model model) {
		setupManifestModel(model);
		return "manifest/index";
	}

	private void setupManifestModel(final Model model) {
		model.addAttribute("manifestHistory", manifestService
				.findUserManifestHistory(securityHolder.getName()));
		model.addAttribute("manifestEnvironments", manifestService
				.getAvailableManifestEnvs(securityHolder.getName()));
	}

	@RequestMapping(value = { "manifests", "api/manifests" }, method = RequestMethod.POST, produces = MediaType.APPLICATION_JSON_VALUE)
	public @ResponseBody
	Manifest uploadAjax(@Valid @ModelAttribute final Manifest manifest,
			final BindingResult bindingResult)
			throws MethodArgumentNotValidException {
		if (!manifestService.getAvailableManifestEnvs(securityHolder.getName())
				.contains(manifest.getEnvironment())) {
			bindingResult.rejectValue("environment", "invalid.manifest.env");
		}
		if (bindingResult.hasErrors()) {
			throw new MethodArgumentNotValidException(null, bindingResult);
		}
		return manifestService.create(manifest, securityHolder.getName());
	}

	@RequestMapping(value = "manifests", method = RequestMethod.POST)
	public String upload(@Valid @ModelAttribute final Manifest manifest,
			final BindingResult bindingResult, final Model model,
			final RedirectAttributes redirectAttributes) {
		if (!manifestService.getAvailableManifestEnvs(securityHolder.getName())
				.contains(manifest.getEnvironment())) {
			bindingResult.rejectValue("environment", "invalid.manifest.env");
		}
		if (bindingResult.hasErrors()) {
			setupManifestModel(model);
			return "manifest/index";
		}
		manifestService.create(manifest, securityHolder.getName());
		redirectAttributes.addFlashAttribute("flashMessage",
				"manifest.upload.success");
		return "redirect:/manifests";
	}

	@RequestMapping(value = { "manifests/{id}", "api/manifests/{id}" }, method = RequestMethod.GET, produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
	public @ResponseBody
	FileSystemResource download(@PathVariable final Long id,
			final HttpServletResponse httpServletResponse)
			throws FileNotFoundException, IOException {
		final Manifest manifest = manifestService.find(id);
		if (manifest == null)
			throw new NotFoundException();
		httpServletResponse.setHeader("Content-Disposition",
				"attachment; filename=" + manifest.getFilename());
		return manifest.getFileSystemResource();
	}

	@Autowired
	private MessageSource messageSource;

	@ExceptionHandler(MethodArgumentNotValidException.class)
	@ResponseStatus(value = HttpStatus.BAD_REQUEST)
	public @ResponseBody
	HttpValidationMessage validation(final MethodArgumentNotValidException e,
			final Locale locale) {
		return new HttpValidationMessage(e.getBindingResult(), messageSource,
				locale);
	}
}

So what are we and aren’t we testing here? First off, we are not testing the UI or servlet container features. What we are testing is the HTTP API we’ve created in Spring, along with all the services and other objects involved. Your Spring code will be executed as if it had received a real request from the servlet container.

Spring-test provides a nice builder-pattern API for creating mock HTTP requests and running them through Spring MVC. We can easily include things like request parameters, content type, and more. Spring then gives us access to the HTTP response, such as the content type, status code, and other headers, along with Spring MVC artifacts like the model and views.

Decoupling External Systems

Before we get into Selenium testing, we need to talk about our application profiles. We rely on external systems for all our user/company data and for security, and we sometimes even create or modify data in other systems, which would need to be reset between runs. To allow for fast, easily reproducible Selenium tests we need to decouple our system from these other systems. To achieve this we use Spring profiles to provide database-backed implementations of our API calls. So instead of a class using Spring’s RestOperations to make an HTTP call, we have the interface backed by a Hibernate object. These database implementations are activated by our DEV_SERVICES profile, which you have seen in our test configurations. We do something similar with security: instead of using the custom filter provided by the external system, we use Spring Security’s JDBC implementation and tie that to the DEV_SECURITY profile. With this done we can control all the data from the external systems using Flyway and DBUnit, and cover the missing API calls in post-deployment Selenium tests or in Spring tests.
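Stripped of the Spring wiring, the decoupling pattern is just one interface with two interchangeable implementations. The sketch below uses hypothetical names (CompanyGateway and friends are not from the real project); the comments note where the real beans would carry @Service and @Profile annotations, and a map stands in for the Hibernate-backed table.

```java
import java.util.Map;

// Hypothetical port for the external user/company system.
interface CompanyGateway {
	String findCompanyName(String companyId);
}

// Production implementation: would be annotated @Service and call the
// remote HTTP API (e.g. via Spring's RestOperations). Stubbed here.
class RestCompanyGateway implements CompanyGateway {
	@Override
	public String findCompanyName(final String companyId) {
		throw new UnsupportedOperationException("remote system not reachable from tests");
	}
}

// Test implementation: would be annotated @Service and
// @Profile("DEV_SERVICES"), backed by a Hibernate entity whose rows are
// controlled by Flyway/DBUnit. A map stands in for the table here.
class DatabaseCompanyGateway implements CompanyGateway {
	private final Map<String, String> table;

	DatabaseCompanyGateway(final Map<String, String> table) {
		this.table = table;
	}

	@Override
	public String findCompanyName(final String companyId) {
		return table.get(companyId);
	}
}

public class ProfileSketch {
	public static void main(final String[] args) {
		// With DEV_SERVICES active, callers receive the database-backed bean.
		final CompanyGateway gateway = new DatabaseCompanyGateway(Map.of("-1", "Acme"));
		System.out.println(gateway.findCompanyName("-1")); // prints Acme
	}
}
```

The callers never know which implementation they get, which is exactly what lets the pre-deployment Selenium tests run against data we fully control.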

Selenium Integration Tests

Now we can start talking about our Selenium tests. The idea here is to split them into two categories: pre- and post-deployment.

The purpose of pre-deployment tests is to test the UI and functionality of the system in a reproducible manner, so they can be run by developers before committing code and during continuous integration builds, catching issues before we deploy the application. A regression, a database script error, or many of the other things that could go wrong should be caught here. At this point we are testing 90% of the application, including browser, server, and database interactions. We are not testing our external services, since they are backed by the database, and we are not testing any production server settings or features.

Post-deployment tests are less about verifying that the application’s features and validations work properly and more about testing that the application deployed correctly. These tests will need to set up user/company data in the external systems before they run, and use the custom authentication those systems provide. They test the happy paths of the functionality, verifying that all the external APIs, database connections, etc. are working properly. You can also test web server configuration here, like making sure you redirect all HTTP traffic to HTTPS, along with any other URL rewriting or proxying that would be configured in your production environment but not your development ones.
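A check like the HTTP-to-HTTPS redirect doesn’t even need Selenium. A minimal sketch, assuming a hypothetical deployment URL, might issue a plain request with redirect-following disabled and inspect the response; the decision logic is factored out so it can be exercised without a live server:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class HttpsRedirectCheck {

	// Pure decision helper: a correct redirect is a 301/302 whose
	// Location header points at the https form of the site.
	static boolean isHttpsRedirect(final int status, final String location,
			final String expectedHttpsPrefix) {
		return (status == 301 || status == 302) && location != null
				&& location.startsWith(expectedHttpsPrefix);
	}

	// A post-deployment test would call this against the real (here
	// hypothetical) deployment URL.
	static boolean checkLive(final String httpUrl, final String expectedHttpsPrefix)
			throws Exception {
		final HttpURLConnection conn = (HttpURLConnection) new URL(httpUrl)
				.openConnection();
		conn.setInstanceFollowRedirects(false); // inspect the redirect itself
		return isHttpsRedirect(conn.getResponseCode(),
				conn.getHeaderField("Location"), expectedHttpsPrefix);
	}

	public static void main(final String[] args) {
		// Offline demonstration of the decision logic only.
		System.out.println(isHttpsRedirect(301,
				"https://example.com/manifests", "https://example.com"));
	}
}
```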

Let’s start with our ManifestController pre-deployment Selenium test.

/**
 * 
 * @author stephen.garlick
 * @author lindsei.berman
 * 
 */
@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT", "/dbunit/dbunit.base.xml",
		"CLEAN_INSERT", "/dbunit/dbunit.dev.base.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.mid.xml", "CLEAN_INSERT",
		"/dbunit/dbunit.mid.cert.incomplete.xml" })
@FlywayTest
public class ManifestControllerSeleniumITest extends AbstractDevSeleniumTest {

	@Value("${selenium.base.url}")
	private String baseUrl;

	@Autowired
	private ManifestPage manifestPage;

	@Autowired
	private ResourceLoader resourceLoader;

	@Autowired
	private SeleniumElementVisibilityTester seleniumElementVisibilityTester;

	@Before
	@Override
	public void setup() throws Exception {
		super.setup();
		getWebDriver().get(baseUrl + "/manifests");
	}

	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.mid.cert.incomplete.xml" })
	public void fileSizeCannotBeZero() throws IOException {
		manifestPage
				.selectFile(getAbsoluteFilePath("classpath:files/manifest-empty-test-data"));
		assertTrue(manifestPage.isFileErrorDisplayed());
	}

	@Test
	public void successfulUpload() throws IOException {

		manifestPage
				.selectFile(
						getAbsoluteFilePath("classpath:files/manifest-not-empty-test-data"))
				.submit();
		assertTrue(manifestPage.getManifestHistorySize() >= 1);
	}

	/**
	 * When Client's certification is incomplete he/she should be able to view
	 * only the Pre-Production option in the Environment selection drop down box
	 * 
	 * @throws IOException
	 */
	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.mid.cert.incomplete.xml" })
	public void userIncompleteCertificationOnlyViewPreProduction()
			throws IOException {
		assertEquals("Pre-Production", manifestPage.getTEMEnvironmentText());

	}

	/**
	 * When Client is Certified to upload manifests he/she should be able to
	 * view both Pre-Production and Production options in the Environment
	 * selection drop down box
	 * 
	 * @throws IOException
	 */
	@Test
	@DBUnitSupport(loadFilesForRun = { "CLEAN_INSERT",
			"/dbunit/dbunit.mid.cert.complete.xml" })
	public void userCertifiedViewBothPreProductionAndProduction()
			throws IOException {
		assertEquals("user should see both prod and preprod options", 2,
				manifestPage.getNumberOfEnvironmentOptions());
	}

	/**
	 * When the user picks a manifest using the manifest select button, the
	 * manifest name should be displayed beside the cancel and upload buttons.
	 * Once the cancel button is pressed, the name should no longer be
	 * displayed and the file-select should be displayed again.
	 * 
	 * @throws IOException
	 */

	@Test
	public void manifestCancelSuccessful() throws IOException {
		int before = manifestPage.getManifestHistorySize();
		manifestPage
				.selectFile(
						getAbsoluteFilePath("classpath:files/manifest-not-empty-test-data"))
				.cancel();
		assertTrue(manifestPage.isFileSelectDisplayed());
		int after = manifestPage.getManifestHistorySize();
		assertEquals(before, after);
	}

	/**
	 * After the manifest select button is pressed and a file is chosen
	 * successfully (not empty), the upload and cancel buttons should be
	 * visible.
	 * 
	 * @throws IOException
	 */
	@Test
	public void manifestClickandFileChoosenUploadandCancelDisplayed()
			throws IOException {
		manifestPage
				.selectFile(getAbsoluteFilePath("classpath:files/manifest-not-empty-test-data"));
		List<String> buttons = Lists.newArrayList("upload-file-button",
				"cancel-file-button");
		seleniumElementVisibilityTester.testElementsDisplayedAndEnabled(
				getWebDriver(), buttons);
	}

	private String getAbsoluteFilePath(final String resource)
			throws IOException {
		return resourceLoader.getResource(resource).getFile().getAbsolutePath();
	}
}

Again you’ll see we control the database with Flyway and DBUnit. One thing you may notice is that this test requires an already-running server. We solve this for builds later with Maven, but during development we need to start our server before running these tests. Arquillian, which is quickly approaching production readiness, also solves this; we won’t be going into it today, but look for a future post.

If you’ve done browser work you’ll notice a lot of familiar things in the code above, like CSS selectors. We can test that specific elements on our page are visible, enabled, or anything else you could determine from within a browser. This is because Selenium’s WebDriver interacts directly with a native API for each browser; turn on debugging and you can see the HTTP calls made for each interaction you perform within the test.

Let’s go a little deeper and start looking at our base classes.

@Category(IntegrationTest.class)
@TestExecutionListeners({DependencyInjectionTestExecutionListener.class,
    FlywayDBUnitTestExecutionListener.class})
@ActiveProfiles({"TEST_DB", "TEST_DEV"})
@ContextConfiguration(classes = {DatabaseTestConfig.class, SeleniumTestConfig.class})
public class AbstractDevSeleniumTest extends AbstractSeleniumTest {

}

Most of this should look familiar. We have a new profile, TEST_DEV, which we’ll discuss in a moment. We also see a new configuration class:

@Configuration
@ComponentScan("com.company.artifact.test.selenium")
@PropertySource({ "classpath:selenium/selenium.properties" })
public class SeleniumTestConfig {

	@Bean(destroyMethod = "stop", initMethod = "start")
	public ChromeDriverService chromeDriverService() throws IOException {
		final ChromeDriverService chromeDriverService = new ChromeDriverService.Builder()
				.usingDriverExecutable(
						new File(System.getProperty("user.home")
								+ "/chromedriver")).usingAnyFreePort().build();
		return chromeDriverService;
	}

	@Bean(destroyMethod = "quit")
	public ChromeDriver chromeDriver() throws IOException {
		final ChromeDriver chromeDriver = new ChromeDriver(
				chromeDriverService());
		return chromeDriver;
	}

	/**
	 * Configuration for integration tests that run during the build process.
	 * 
	 * @author stephen.garlick
	 * 
	 */
	@Configuration
	@Profile("TEST_DEV")
	@PropertySource("classpath:selenium/selenium-build.properties")
	static class BuildSeleniumConfig {

	}

	/**
	 * Configuration for integration tests that run post deployment.
	 * 
	 * @author stephen.garlick
	 * 
	 */
	@Configuration
	@Profile("TEST_SIT")
	@PropertySource("classpath:selenium/selenium-sit.properties")
	static class SitSeleniumConfig {

	}

	@Bean
	public static PropertySourcesPlaceholderConfigurer propertyPlaceholder() {
		final PropertySourcesPlaceholderConfigurer placeholder = new PropertySourcesPlaceholderConfigurer();
		placeholder.setIgnoreUnresolvablePlaceholders(true);
		return placeholder;
	}

}

Here we set up our chromeDriverService bean, which expects the chromedriver executable in the user’s home directory, and the chromeDriver bean itself, which we’ll use to interact with the browser. We also component-scan for our reusable Selenium beans and pull in some properties.

Next, let’s take a look at our base test class:

@RunWith(SpringJUnit4ClassRunner.class)
public abstract class AbstractSeleniumTest {

	@Value("${bcg.user.name}")
	private String bcgUserName;
	private String userNumber;
	private String username;

	@Autowired
	@Qualifier("userNumber")
	private Provider<String> userNumberProvider;

	@Before
	public void setup() throws Exception {
		userNumber = userNumberProvider.get();
		username = bcgUserName + userNumber;
		createUserTester.createUser(webDriver, username);
		loginTester.loginIn(webDriver, username);
	}

	@After
	public void tearDown() throws Exception {
		webDriver.manage().deleteAllCookies();
	}

	@Autowired
	private WebDriver webDriver;

	public WebDriver getWebDriver() {
		return webDriver;
	}

	@Autowired
	private SeleniumLoginTester loginTester;

	@Autowired
	private SeleniumCreateUserTester createUserTester;

}

This is where a lot of the work happens, so let’s break it down.

First, the setup method. It runs for both our pre- and post-deployment tests, but behaves differently depending on whether the TEST_DEV or TEST_SIT profile is active. In a TEST_DEV (pre-deployment) test, it takes the bcgUserName property, appends an empty string to it, and uses the result as the username for the test. The createUser call is an empty implementation, since DBUnit takes care of creating the user during database setup, and login uses the dev login page we discussed earlier. Under the TEST_SIT profile, userNumber pulls the next number from a database sequence (we’ll see this when we look at the post-deployment configuration), createUser actually creates a user in the external system, and login does nothing because we are already logged in after creating the user.
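
Stripped of Spring, the profile-swapped behavior can be sketched as one contract with two implementations. This is an illustrative sketch only; the class and method names are hypothetical, not our actual code:

```java
import java.util.ArrayList;
import java.util.List;

// One createUser contract, two profile-specific implementations,
// as described above. Names are hypothetical.
interface CreateUserTester {
    void createUser(String username);
}

// TEST_DEV: a no-op, because DBUnit has already inserted the user.
class NoopCreateUserTester implements CreateUserTester {
    @Override
    public void createUser(String username) {
        // intentionally empty
    }
}

// TEST_SIT: would create the user in the external system; here we just
// record the call as a stand-in for the real external call.
class SitCreateUserTester implements CreateUserTester {
    final List<String> created = new ArrayList<>();

    @Override
    public void createUser(String username) {
        created.add(username);
    }
}
```

Under Spring, each implementation would carry a @Profile annotation, and the base test would simply inject the CreateUserTester interface.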

The only thing we do after each test is clear the cookies, and with them the authentication info, from the webDriver. We do this instead of instantiating a new WebDriver so that we can declare the bean as a singleton and reduce the time spent creating and tearing down browsers.

Next, let’s look at our post-deployment configuration.

@Category(SitIntegrationTest.class)
@TestExecutionListeners({DependencyInjectionTestExecutionListener.class})
@ActiveProfiles({"TEST_SIT"})
@ContextConfiguration(classes = {SitIntegrationTestConfig.class, SeleniumTestConfig.class})
public class AbstractSitSeleniumTest extends AbstractSeleniumTest {

}

Again, this is similar to our pre-deployment setup except for the Spring profiles and configuration classes. This time we aren’t setting up the database, so all of that configuration is gone.

Let’s take a look at the new configuration class:

@Configuration
@Profile("TEST_SIT")
public class SitIntegrationTestConfig {

  @Bean(destroyMethod = "close")
  public DataSource dataSource() {
    final BasicDataSource basicDataSource = new BasicDataSource();
    basicDataSource.setDriverClassName("oracle.jdbc.OracleDriver");
    basicDataSource.setUsername("user");
    basicDataSource.setPassword("password");
    basicDataSource.setUrl("url");
    basicDataSource.setMaxActive(-1);
    return basicDataSource;
  }

  @Bean
  public JdbcOperations jdbcOperations() {
    return new JdbcTemplate(dataSource());
  }

  @Bean
  @Qualifier("userNumber")
  @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
  public String userNumber() {
    return jdbcOperations().queryForObject("select sit_user_seq.nextval from dual", String.class);
  }
}

Here we set up a connection to a database where we have a sequence object created. Before each test our base class pulls a new userNumber bean, which returns the next number from the sequence, and appends it to our username so that we can create new users in the live system without needing to update the username every time the tests run.

Finally, remember that the setup method on the base class can be overridden to change the default create/login behavior. This can be useful when creating helper scripts that create X users in the external system for manual testing and other purposes.

Page Pattern Support

It’s a recommended practice to use the page pattern with Selenium. I won’t go into the pattern itself; see Page Objects for an explanation from the Selenium developers.

We are going to look at a small bit of code that supports this pattern within our tests. You may have seen this object in our Selenium test earlier:

/**
 * 
 * @author stephen.garlick
 * @author linsei.berman
 * 
 */
@Page
public class ManifestPage {
	@FindBy(id = "upload-file-button")
	private WebElement submitButton;
	@FindBy(id = "file-select")
	private WebElement fileSelect;
	@FindBy(id = "fileinput")
	private WebElement fileUpload;
	@FindBy(id = "cancel-file-button")
	private WebElement cancelButton;
	@FindBy(id = "file-error")
	private WebElement fileError;
	@FindBys(value = { @FindBy(id = "history-body"), @FindBy(tagName = "tr") })
	private List<WebElement> manifestHistory;
	@FindBy(xpath = "//*[@id='environment']/option[1]")
	private WebElement temEnvironmentOption;
	@FindBy(xpath = "//*[@id='environment']")
	private WebElement environmentOptions;

	public boolean isFileSelectDisplayed() {
		return fileSelect.isDisplayed();
	}

	public ManifestPage selectFile(final String filePath) {
		fileUpload.sendKeys(filePath);
		return this;
	}

	public ManifestPage submit() {
		submitButton.click();
		return this;
	}

	public int getNumberOfEnvironmentOptions() {
		return new Select(environmentOptions).getOptions().size();
	}

	public ManifestPage cancel() {
		cancelButton.click();
		return this;
	}

	public boolean isFileErrorDisplayed() {
		return fileError.isDisplayed();
	}

	public int getManifestHistorySize() {
		return manifestHistory.size();
	}

	public String getTEMEnvironmentText() {
		return temEnvironmentOption.getText();
	}
}

and the @Page annotation, which is meta-annotated with Spring’s @Component so page objects can be registered as beans via classpath scanning:

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Component
public @interface Page {

}

and the PageBeanPostProcessor, which checks each created bean for the annotation and calls PageFactory.initElements to wire the bean’s Selenium-annotated fields with our webDriver:

@Component
public class PageBeanPostProcessor implements BeanPostProcessor {

	@Autowired
	private WebDriver webDriver;

	@Override
	public Object postProcessBeforeInitialization(Object bean, String beanName)
			throws BeansException {
		if (bean.getClass().isAnnotationPresent(Page.class)) {
			PageFactory.initElements(webDriver, bean);
		}
		return bean;
	}

	@Override
	public Object postProcessAfterInitialization(Object bean, String beanName)
			throws BeansException {
		return bean;
	}

}

Now we don’t need to initialize our page beans in each test that uses them; we can simply inject them with Spring.

Decoupling the Database

By default we develop against an Oracle database, but this means our tests need an Oracle instance set up in advance. To remove this requirement we use an in-memory database, H2, which supports an Oracle compatibility mode. While H2 is fine for projects using ORMs like Hibernate, it may not be the best option if you rely on vendor-specific features that H2 doesn’t support; keep that in mind when deciding whether to use it.

We use Maven to spawn our H2 database as a TCP server so that our WebSphere instance and our tests can both connect to it while running in different JVMs. Let’s take a look at our parent pom.

	<profile>
		<id>h2-database</id>
		<activation>
			<property>
				<name>db</name>
				<value>h2</value>
			</property>
		</activation>
		<properties>
			<flyway.url>jdbc:h2:tcp://localhost:8082/mem:test;MODE=Oracle;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE</flyway.url>
			<flyway.user>sa</flyway.user>
			<flyway.password>sa</flyway.password>
			<database.datasource.class>org.h2.jdbcx.JdbcDataSource</database.datasource.class>
			<database.driver.jar>h2-1.3.175.jar</database.driver.jar>
		</properties>
	</profile>

First we have a profile that changes all the Flyway connection settings to those of our H2 database.

<plugin>
	<groupId>com.btmatthews.maven.plugins.inmemdb</groupId>
	<artifactId>inmemdb-maven-plugin</artifactId>
	<version>1.4.2</version>
	<configuration>
		<monitorPort>11527</monitorPort>
		<monitorKey>inmemdb</monitorKey>
		<daemon>true</daemon>
		<type>h2</type>
		<port>8082</port>
		<database>test</database>
		<username>${flyway.user}</username>
		<password>${flyway.password}</password>
	</configuration>
	<dependencies>
		<dependency>
			<groupId>com.h2database</groupId>
			<artifactId>h2</artifactId>
			<version>${h2.version}</version>
		</dependency>
	</dependencies>
	<executions>
		<execution>
			<id>start-db</id>
			<goals>
				<goal>run</goal>
			</goals>
			<phase>pre-integration-test</phase>
		</execution>
		<execution>
			<id>stop</id>
			<goals>
				<goal>stop</goal>
			</goals>
			<phase>post-integration-test</phase>
		</execution>
	</executions>
</plugin>

Here is our plugin configuration for spawning the H2 database in the pre-integration-test phase and stopping it in the post-integration-test phase.

And finally, with our H2 database running, we can use the Flyway plugin to perform the initial migration:

<plugin>
	<groupId>com.googlecode.flyway</groupId>
	<artifactId>flyway-maven-plugin</artifactId>
	<version>${plugin-version.flyway}</version>
	<executions>
		<execution>
			<phase>pre-integration-test</phase>
			<goals>
				<goal>clean</goal>
				<goal>init</goal>
				<goal>migrate</goal>
			</goals>
			<configuration>
			</configuration>
		</execution>
	</executions>
	<dependencies>
		<dependency>
			<groupId>com.oracle</groupId>
			<artifactId>ojdbc6</artifactId>
			<version>${ojdbc6.version}</version>
		</dependency>
		<dependency>
			<groupId>com.h2database</groupId>
			<artifactId>h2</artifactId>
			<version>${h2.version}</version>
		</dependency>
		<dependency>
			<groupId>${project.groupId}</groupId>
			<artifactId>db</artifactId>
			<version>${project.version}</version>
		</dependency>
	</dependencies>
	<configuration>
	</configuration>
</plugin>

Now our database is up and the schema migrated, and we are ready to deploy to our WebSphere server and run our Selenium integration tests via the Maven Failsafe plugin.

Bootstrapping the Websphere Liberty Profile Server

Now that we’ve gotten a database up and migrated we need a way to setup our test server, in this case websphere liberty profile, so that we can deploy the application and let our selenium tests run.

Again we turn to our pom.xml:

<plugin>
	<groupId>com.ibm.websphere.wlp.maven.plugins</groupId>
	<artifactId>liberty-maven-plugin</artifactId>
	<version>1.0</version>
	<executions>
		<execution>
			<id>pre-integration-setup</id>
			<phase>pre-integration-test</phase>
			<goals>
				<goal>start-server</goal>
				<goal>deploy</goal>
			</goals>
		</execution>
		<execution>
			<id>post-integration-setup</id>
			<phase>post-integration-test</phase>
			<goals>
				<goal>stop-server</goal>
			</goals>
		</execution>
	</executions>
	<configuration>
		<assemblyArtifact>
			<groupId>com.company</groupId>
			<artifactId>wlp-test-server</artifactId>
			<version>1.1</version>
			<type>zip</type>
		</assemblyArtifact>
		<configFile>${project.build.directory}/test-classes/server.xml</configFile>
		<appArchive>${project.build.directory}/webapp.war</appArchive>
		<timeout>60</timeout>
		<verifyTimeout>60</verifyTimeout>
	</configuration>
</plugin>

This plugin lets us start up a WebSphere Liberty Profile server and deploy our war file automatically from Maven. We’ve packaged the server as a Maven artifact and deployed it to a private repo; the server includes any necessary provided dependencies in its /lib folder before being zipped up.

The plugin also lets us supply a server.xml file, WebSphere’s configuration file. We have the following server.xml template, which Maven processes during the build to set the correct database (H2 or Oracle).

<server description="new server">
	<!-- Enable features -->

	<webContainer deferServletLoad="false" />

	<featureManager>
		<feature>jsp-2.2</feature>
		<feature>servlet-3.0</feature>
		<feature>localConnector-1.0</feature>
		<feature>jdbc-4.0</feature>
		<feature>jndi-1.0</feature>
		<feature>beanValidation-1.0</feature>
		<feature>jpa-2.0</feature>
	</featureManager>
	<httpEndpoint host="0.0.0.0" httpPort="9080" httpsPort="9443"
		id="defaultHttpEndpoint" />
	<applicationMonitor updateTrigger="mbean" />

	<dataSource id="db" jndiName="jdbc/datasource">
		<jdbcDriver libraryRef="driverLib" javax.sql.DataSource="${database.datasource.class}"/>
		<properties URL="${flyway.url}"
			password="${flyway.password}" user="${flyway.user}" />
	</dataSource>

	<library id="driverLib">
		<fileset dir="${wlp.install.dir}/lib" includes="${database.driver.jar}" />
	</library>
	<jndiEntry id="profile" jndiName="spring.profiles.active"
		value="DEV_SECURITY,DEV_SERVICES,CONTAINER_DB" />
</server> 

You’ll notice properties like database.datasource.class, which were defined in our pom.xml.
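
The substitution itself can be done with standard Maven resource filtering. A sketch, assuming the server.xml template lives in src/test/resources (the directory is an assumption about the project layout):

```xml
<!-- Filter test resources so Maven replaces ${flyway.url},
     ${database.datasource.class}, etc. in the server.xml template
     before the liberty-maven-plugin picks it up from target/test-classes. -->
<build>
  <testResources>
    <testResource>
      <directory>src/test/resources</directory>
      <filtering>true</filtering>
    </testResource>
  </testResources>
</build>
```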

Now all of our tests can run during the build, and we no longer need to manually set up databases or web servers for our integration tests.

Closing Comments

Now we have ways to easily develop each layer of our tests and run them during our Maven build with a single command. From here we could easily create CI jobs in Jenkins to handle testing, reporting, and deployments so that we can focus on developing our app and tests.

Brief Overview of Automation and Testing Tools

I’ve been asked recently to give a short overview of the automation and testing tools used on my current project. I’ve had a lot of time to experiment with various test tools and configurations, as well as ways to automate the setup and configuration of a build system. We’ll take a quick look at what each tool provides and list the pros and cons I’ve run into.

Databases and tools

Flyway

What is it?

Flyway is a database versioning and migration tool. It gives us a standard way to build our database schema in a reproducible manner, which in turn makes reproducible integration tests easy. We supply a list of folders containing our scripts; Flyway executes them and maintains a “schema_version” table with metadata about the scripts that have already run. It uses a specific file-naming convention to determine execution order: VXXXXX__script_name.sql, where XXXXX is a version number. Once new scripts are added, you run the migrate command and every script with a version higher than the last executed one runs. Flyway also has a Maven plugin, which lets us trigger migrations manually and hook easily into a Jenkins build/deployment.
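
As a rough illustration of the ordering convention (plain Java, not Flyway’s actual implementation), scripts sort by the numeric version embedded in the file name:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch only: orders migration scripts following the
// V<version>__<name>.sql convention. Flyway's real version comparison
// is richer (dotted versions, configurable prefixes, etc.).
class MigrationOrder {
    static long version(String fileName) {
        // e.g. "V00003__add_users.sql" -> 3
        return Long.parseLong(fileName.substring(1, fileName.indexOf("__")));
    }

    static List<String> sorted(List<String> scripts) {
        return scripts.stream()
                .sorted(Comparator.comparingLong(MigrationOrder::version))
                .collect(Collectors.toList());
    }
}
```

Given V00010__add_index.sql, V00002__create_tables.sql, and V00003__seed_data.sql, sorted() returns them in 2, 3, 10 order, which is the order Flyway would run them in.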

How we’ve used it

We’ve used Flyway to handle migrations in our DEV/SIT environments, and to determine which scripts need to be packaged for our CAT/PROD environments, where we don’t have permission to run database scripts ourselves. This multi-environment setup is easy because Flyway accepts a list of folders, combines their scripts, and runs them in order; we just supply different Maven properties/profiles to change the folder list (e.g. db-all,db-dev for dev; db-all,db-prod for prod) and the database connection.
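
As a sketch, the per-environment folder lists might be wired up like this (the profile ids and the property name are illustrative; check your Flyway plugin version for the exact locations property):

```xml
<profile>
  <id>db-dev</id>
  <properties>
    <!-- folders whose scripts Flyway combines and runs in version order -->
    <flyway.locations>db-all,db-dev</flyway.locations>
  </properties>
</profile>
<profile>
  <id>db-prod</id>
  <properties>
    <flyway.locations>db-all,db-prod</flyway.locations>
  </properties>
</profile>
```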

We’ve also used it to maintain database state during integration tests. By combining Flyway, Spring Test, and Flyway Test Extensions we can clean and rebuild our database so that each test runs against a clean schema. DBUnit also comes into play, setting up test data after Flyway has cleaned and migrated the schema.

Pros

  • Simple and easy to use
  • Reproducible schemas
  • Database scripts committed to version control and versioned in the database
  • Scripts written in SQL
  • Good support for spring, testing, and maven

Cons

  • Can’t use it in our upper envs (not a tool issue)
  • Possible issues with multiple users creating/committing scripts, due to the versioning scheme.
    • If someone uses the same version number as you, there will be conflicts (this is why we use timestamps, which reduce the chance).
    • For example: someone commits a script with a higher version than mine and the migration runs, then I commit my script. My script’s version is now lower than the latest in the schema_version table. Developers need to be aware of this and update their script versions when necessary. Flyway can also run scripts out of order, going back to pick up lower-version scripts added after a higher version ran, but then you risk non-reproducible schemas. However you configure it, this is something to be aware of when using Flyway.
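
The timestamp convention mentioned above can be sketched like so (the pattern is a team convention, not something Flyway mandates):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Sketch: generate timestamp-based migration versions, e.g.
// V20140305143000__add_users.sql, so two developers working
// concurrently are unlikely to pick the same version number.
class TimestampVersion {
    static String fileName(String description, LocalDateTime now) {
        String version = now.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
        return "V" + version + "__" + description + ".sql";
    }
}
```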

Other options

dbmigrate
Liquibase

Oracle/Postgresql/Mysql/Etc

What is it?

Everyone should be familiar with at least one of these databases, as they tend to be the most widely used. Most applications connect to a database to persist collected data. Here we’ll mostly talk about how we’ve used the database in our integration tests.

How we’ve used it

Our project uses Oracle, and we originally set up our tests to run against an Oracle XE install. Our tests use Flyway to connect to the database and manage its lifecycle during integration tests.

Pros

  • Same database your application will connect to in a live env
  • Allows for database specific features to be used (plsql, triggers, etc)

Cons

  • Additional setup required
  • Need to host database externally from tests
  • Slower than in memory database
  • Potentially slower depending on network connection
  • Database must be up and accessible to run tests

H2 Database

What is it?

H2 is a fast, lightweight SQL database written in Java. It supports embedded and server modes, can run completely in memory, and offers syntax-compatibility modes for a number of other databases.
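
For illustration, the JDBC URLs for these modes look roughly like this (host, port, and database names are examples):

```
jdbc:h2:mem:test                                    embedded, in memory
jdbc:h2:tcp://localhost:8082/mem:test               server mode over TCP
jdbc:h2:tcp://localhost:8082/mem:test;MODE=Oracle   with Oracle compatibility mode
```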

How we’ve used it

Our integration tests were originally set up to run against an Oracle instance. This became an issue when automating our build, due to a poor network connection to our dev database test schema. To remedy this we updated our SQL scripts to be compatible with both H2 and Oracle (a minimal change, since we weren’t using any Oracle-specific features, and we enabled H2’s compatibility mode). We then added a Maven profile that uses the inmemdb Maven plugin to start H2 in server mode and switches the connection properties from Oracle to this new H2 server. This way our tests can run against either a full Oracle install or H2 with a single command-line property.

We’ve also created a POC in a separate branch that uses Arquillian (more on that later) and H2 in embedded mode. I’ll explore setting up the H2 server as a Spring bean instead of a Maven plugin in the future.

Pros

  • Quick to setup using spring
  • No need to externally host
  • Initialized from spring/test (embedded)
  • Initialized during pre-integration maven phase or spring (server)
  • In memory mode very fast for integration tests that need to reset database often

Cons

  • Limited database compatibility (no PL/SQL support or other database-specific features)
  • Additional memory required in your JVM
  • Embedded mode is only accessible from within the same JVM it’s launched in (remedied by running in server mode)

Testing

JUnit

What is it?

Again, this is another tool most people are familiar with. JUnit is a framework for writing test code in Java. It provides the foundation of our tests: test lifecycle hooks, assertions, Maven support, third-party extensions, and more.

How we’ve used it

All our tests, unit and integration, start as JUnit tests and hook in the appropriate runner depending on the type of test being performed.

Our unit tests use vanilla JUnit, or JUnit with Mockito for quick and easy dependency mocking.

We’ll go into our integration test setups shortly.

While JUnit has been a great tool, we’ll be looking at TestNG in the future because it has better support for parameterized tests.

Pros

  • Heavily used in the Java community and simple to use
  • Lots of 3rd party extensions
  • Maven support

Cons

  • Tends to lag behind other frameworks in features

Other Options

TestNG
Spock

Spring Test

What is it?

Spring Test is a Spring module that supports and eases the testing of Spring-enabled applications. It includes test runners for JUnit/TestNG, builds the application context from Spring Java config/XML, supports Spring profiles, and more. It also provides extended support for testing Spring MVC, offering a way to exercise routing and assert various parts of the response (e.g. the view returned, the status code, model attributes).

How we’ve used it

For our database integration tests we’ve wired our tests with the Spring test runner and loaded the entire core module of our project (services/repositories/etc.) with a Spring-defined datasource (normally the app uses a container-supplied datasource via JNDI). We use Spring profiles to ignore the JNDI datasource and pick up our test datasource, then use the Flyway extension to execute flyway clean/migrate and DBUnit to set up data, putting the database in a known state and enabling completely reproducible tests.

Pros

  • Actively supported and developed by the Spring Source teams
  • Test using spring, thus allowing the creation of reusable test beans
  • Provides additional information to tests that is not available when unit testing (mvc test features)
  • Allows for testing all spring components

Cons

  • Still not inside our servlet container
  • No access to container resources
  • Cannot test UI
  • Can’t test things controlled by the container (filters, web.xml mappings, etc)

Selenium Webdriver

What is it?

Selenium WebDriver is a tool that lets us write browser tests in code, from just about any language. It uses each browser’s internal API to find and interact with elements; this differs from old Selenium, where mouse and keyboard recordings drove the browser. It also supports a wide variety of browsers, including mobile. See the Selenium WebDriver page for more.

How we’ve used it

We use Selenium in a few different configurations so that we can test as much as possible during the build, catching as many issues as possible before any deployment happens. We can also control our data using properties. We’ll discuss these configurations here.

First off, our configurations have very little duplication thanks to using Spring to wire our tests together. This lets us create a common base class that sets up Spring/Selenium and then creates and logs in a user for both environments. Let’s look at how each is set up.

Our dev environment mocks out any external APIs using database tables, allowing rapid prototyping and letting us test the majority of the application without connecting to test environments of systems we don’t control. This lets us use DBUnit to populate the external data our system relies on and then run tests against it. The environment configuration is controlled by Spring profiles, which swap out the real API calls for database writes and enable a web interface for creating mock data.

Now that you understand our dev environment, let’s talk about how the Selenium tests run against it. As discussed earlier, you can back these tests with either an external database or H2 running in server mode, since the application is deployed into a container running in its own JVM. For application deployment we have two options. First, the application may already be deployed in a container running in Eclipse, connected to the same database the tests use; this is generally how you will develop new tests. Second, we use Maven to deploy our war during the pre-integration-test phase of the build; in this case a Maven plugin spawns a WebSphere Liberty Profile server and deploys the app before the integration tests run (our client uses WebSphere, but you can easily do the same for other containers). Our Spring profile then uses a NOOP implementation of the create-user call in setup (DBUnit takes care of this data) and logs in through our dev panel (Spring Security database auth).

Now that you understand how the tests are set up and executed, let’s look at what we can and can’t test in this configuration. We can test all UI features and the majority of our functional requirements (data creation, validations, etc.). What we can’t test are environment-specific concerns such as connectivity between systems (firewalls/ports) and webserver request modifications.

Next is our SIT configuration. Here the app is deployed to our test servers (after the dev integration tests run and pass) and runs in its production configuration, making HTTP calls to external APIs. By this point the majority of our application testing is already covered, so we mainly test the happy paths to make sure the API calls go through and the application works end to end. Again, we change our Spring profile to pick up different properties/test beans. In this profile the test is NOT connected to any database, since all data is created through the application and external systems. So this time createUser, instead of being a NOOP, creates the user in the external system, and the login code is a NOOP since the user is automatically logged in after creation.

We’ll discuss the test flow more when we talk about our Jenkins configuration.

Pros

  • Most tests execute during the build, so errors are caught fast
  • Using browser APIs allows for tests closer to a real user experience
  • Tests are fully reproducible in the dev configuration thanks to database lifecycle control

Cons

  • Tests don’t run in-container and can’t access the container datasource, so they must define their own connection to the same database
  • The server must be started outside of the tests (IDE, Maven-spawned, etc.)

Arquillian

What is it?

Arquillian is a tool under development by JBoss that allows for in-container tests. It lets you define deployments from within your JUnit/TestNG tests and have them deployed to a servlet container. It has extensions for Spring, persistence, and various servlet containers, plus extensions that add abstraction on top of Selenium. Arquillian itself is still in the early stages of development, with the extensions slightly behind the core module.

How we’ve used it

We’ve used Arquillian to build a proof of concept that lets all our integration tests run fully self-contained. We use the Tomcat container, the Spring extension, and the persistence extension. We added a @Deployment to our Selenium test base class, which packages up our war, and changed the Spring runner to the Arquillian runner. Combined with an embedded H2 database, tests run fully self-contained (in our development profile) without any external resources having to be started first.

The flow of a test in this configuration is as follows.

  1. The Arquillian Spring extension packages up our app with all Maven dependencies (including test/provided, though this is configurable)
  2. Arquillian extracts an embedded Tomcat server and starts it
  3. Arquillian deploys the war to the embedded Tomcat server
  4. Spring starts up and connects to the database
  5. The test starts, using the same Spring context as the Tomcat deployment, with access to everything running on Tomcat including JNDI resources
  6. The test runs and passes, and everything shuts down
  7. Repeat

As you can see, the main benefit of this setup is that you just hit run and Arquillian takes care of the rest. This removes a lot of extra configuration and container lifecycle management on our part and lets the tests run as 100% real tests inside our container.

The only downside at this point is that each test class redeploys Tomcat and the application, which greatly increases our test times (and is the main reason we haven’t merged this POC into trunk). Luckily, the Arquillian developers are already aware of this issue and are planning to add test suite support to reduce these runtimes (see https://issues.jboss.org/browse/ARQ-567 ).

Pros

  • Tests completely control their own container/test lifecycles
  • No need for external resources to be in place before tests run
  • Full access to the Spring context running on the server from inside tests
  • Tests have access to JNDI and other container resources
  • Supported by JBoss
  • Lots of additional support through extensions

Cons

  • Slow test times due to container redeployment per test class
  • Early stage of development, so API changes are common
  • Increased memory usage on the JVM running tests, due to the embedded container

Build Environment

Vagrant

What is it?

Vagrant is a tool, written in Ruby, that provides a useful abstraction over multiple VM providers (VirtualBox, VMware, EC2, etc.) and allows for easy VM configuration and startup from the command line. It provides hooks for provisioners (such as Chef) and other extensions in the form of plugins, and it lets us define VMs in code, allowing us to check them into version control.

How we’ve used it

We’ve used Vagrant to set up an Ubuntu 13.04 server, mount some shared folders, and configure networking. We used the vagrant-omnibus plugin to manage our Chef install and the vagrant-berkshelf plugin to copy our Chef dependencies to the VM before provisioning. This is all configured in a Vagrantfile and started up by running the “vagrant up” command.
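A Vagrantfile along those lines looks roughly like this (the box name, IP, shared folder paths, and cookbook name are illustrative, not our actual configuration):

```ruby
Vagrant.configure("2") do |config|
  config.vm.box = "raring64"                       # Ubuntu 13.04 base box
  config.vm.network "private_network", ip: "192.168.33.10"
  config.vm.synced_folder "./shared", "/vagrant_shared"

  config.omnibus.chef_version = :latest            # vagrant-omnibus plugin
  config.berkshelf.enabled = true                  # vagrant-berkshelf plugin

  config.vm.provision :chef_solo do |chef|
    chef.add_recipe "buildbox"                     # our cookbook (illustrative name)
  end
end
```

“vagrant up” boots and provisions the VM from this file; “vagrant provision” reruns just the Chef step.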

After this initial setup I developed the Chef cookbook, which we’ll cover shortly. This allowed me to develop Chef scripts iteratively, adding new functionality and testing it with “vagrant provision” to rerun the cookbook. We’ll discuss this development flow more when we talk about Chef.

Pros

  • Simplifies VM configuration and provisioning
  • VMs defined as code
  • Easy development and testing of Chef cookbooks
  • Good community support

Cons

  • Evolving API and tools

Chef

What is it?

Chef is a tool focused on IT infrastructure automation and management. Chef comes in two flavors: a full server/client setup intended to fully manage your IT infrastructure, and a standalone version called chef-solo that focuses on executing Chef cookbooks without the server management side. Chef provides abstractions for installing software, executing scripts, and other common OS operations. It’s written in Ruby and can execute arbitrary Ruby code, which gives it a large additional library of useful scripting tools.

You develop Chef cookbooks, which define an idempotent, procedural way of installing a piece of software. This means a cookbook can be rerun over and over and your system will always end up in the same state without error. This development style, combined with Vagrant, allows us to quickly and easily develop cookbooks in small focused chunks that we can rerun repeatedly as we add features. It also allows you to deploy an update to a large number of servers at once with minimal chance of error.
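For example, a recipe built from idempotent resources can be converged repeatedly without error; the paths and download URL below are illustrative:

```ruby
# Each resource declares desired state, so rerunning the recipe converges
# to the same result instead of failing or duplicating work.
package "git"

directory "/var/lib/myapp" do
  owner "jenkins"
  mode  "0755"
  action :create          # no-op if the directory already exists as described
end

remote_file "/opt/maven.tar.gz" do
  source "https://example.com/maven.tar.gz"   # illustrative URL
  action :create_if_missing                   # skips the download on reruns
end
```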

Chef cookbooks can be linted, unit tested, and integration tested to verify working code. Combined with Vagrant and Jenkins you can setup a continuous integration server for chef cookbooks.
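With the current standard tooling, that verification pipeline typically looks something like the following (the exact commands depend on the cookbook’s setup; foodcritic, ChefSpec, and test-kitchen are the usual suspects):

```shell
foodcritic .     # lint the cookbook
rspec            # run ChefSpec unit tests
kitchen test     # converge a throwaway VM for an integration test
```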

Chef also has the benefit of being the base of Amazon Web Services’ OpsWorks platform, which lets you execute custom Chef scripts there easily.

How we’ve used it

We used Chef, along with Vagrant and berkshelf (chef cookbook dependency manager), to develop a Jenkins build box for our app. The build box installs the following software to support the build/test configurations we’ve been talking about.

  1. install/update apt
  2. install java 7
  3. setup dns-search domains and restart networking
  4. install jenkins server
  5. install git and svn
  6. setup svn proxy
  7. download and store svn certs
  8. install maven
  9. install chrome
  10. install xvfb (headless selenium tests)
  11. install chromedriver for selenium test
  12. configure jenkins server

Most of the software above already has a Chef cookbook that can run against multiple OSes (for example, the java cookbook supports Debian/RHEL/Windows with multiple versions: 6/7/IBM/Oracle/etc.). And all cookbooks can be parameterized, allowing for high reusability.

And since everything is defined in code, we can of course check it into version control.

Pros

  • Infrastructure as Code/Version Control
  • Automate installation and configuration of machines
  • Reduce error due to repetition
  • Continuous integration/Testable
  • Quickly stand up identical environments (test, prod mirror, etc.)
  • Cookbooks are much smaller to store and share than VM images
  • Large base cookbook library
  • AWS OpsWorks support
  • Strong community support
  • Ruby Libraries

Cons

  • Provisioning time when starting from a clean VM (more of an issue for things like AWS autoscaling)

Jenkins

What is it?

Jenkins is a tool written in Java that provides a continuous integration environment for projects. It has a very large plugin library and provides scripting support via the Groovy language, which has helped make it one of the most popular CI tools. As with most things, there is a lot more to Jenkins than what we’ll be discussing here.

The goal of Jenkins is to provide an interface for building build/deployment pipelines, task automation, reporting, and more. It has support for most major build tools, including Maven, which we’ll discuss. Jobs can be triggered manually, on a schedule, from version control hooks, by polling version control for updates, when other builds complete, and more.

How we’ve used it

As mentioned before, we used Chef to build our Jenkins server. This required figuring out how Jenkins manages its configuration. Jenkins has a number of XML/JSON/text files that it and its plugins use to persist configuration changes. First I initialized a git repo in my Jenkins home folder and then committed changes as I updated the Jenkins configuration. This let me track each configuration update and add it to my cookbook. Given the sheer number of configuration files, this is itself a large hassle and will most likely need to change over to using the scripting console to configure and maintain the server.
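The tracking setup itself is just a git repository over the Jenkins home directory; roughly (the path is the Debian/Ubuntu package default, and which files you track is up to you):

```shell
cd /var/lib/jenkins                    # default JENKINS_HOME on Ubuntu
git init
git add config.xml jobs/*/config.xml   # top-level and per-job configuration
git commit -m "baseline jenkins configuration"
```

After any change in the Jenkins UI, “git status” shows exactly which files that change touched, which is what gets copied into the cookbook.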

By default Jenkins comes with a plugin to perform Maven builds. We took advantage of this since the project is a Maven project. We have two jobs configured in Jenkins.

The first job performs CI for our trunk branch. It polls svn for changes. Once a change is picked up, Jenkins checks out the source code and runs our Maven build command to execute the full (dev) integration test suite. We end up with either a successful build (all tests passed), an unstable build (all unit tests passed but there were integration test failures), or a failed build (compilation errors or unit test failures).
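The build step itself is a single Maven invocation; the profile and property names here are assumptions about how such a build is typically wired, not our exact command:

```shell
mvn clean verify -Pintegration-tests -Dspring.profiles.active=dev
```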

The second job uses the buildresult-trigger plugin. Its goal is to tag and deploy the app to our test servers. It triggers twice a day if it sees that the first job has a new build in a successful state. If so, it handles database migration using Maven/Flyway, handles WebSphere deployment using the WebSphere admin client, and then executes our happy-path integration tests to make sure everything deployed correctly. If the first job is not in a success state, this job will not run, preventing a build with integration test errors from being deployed.

Pros

  • Mature CI platform
  • Tons of Community support
  • Support for just about anything through plugins
  • Scriptable

Cons

  • Text file configuration hell (most likely solved by using the scripting console)
  • Doesn’t seem to be a ton of info on scripting outside of the javadocs