Inject beans into JBoss 7 Modules

It seems that CDI in JBoss 7.0 has an issue injecting beans that live in shared libraries (modules) and are not part of the EAR file. Irritating (but solved in the next version). However, I didn't want to upgrade, so I decided to work around it instead.
Here’s a short step-by-step:

Deltaspike

Deltaspike is an Apache library that provides several useful CDI extensions. Add it as a module to your JBoss AS installation. You can use the following module.xml file:

<?xml version="1.0" encoding="UTF-8"?>
 
<module xmlns="urn:jboss:module:1.1" name="org.deltaspike">
 
    <resources>
        <resource-root path="deltaspike-core-api.jar" />
        <resource-root path="deltaspike-core-impl.jar" />
    </resources>
 
    <dependencies>
        <module name="com.google.guava" />
        <module name="javax.enterprise.api" />
        <module name="javax.inject.api" />
    </dependencies>
</module>
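
The module.xml above, together with the two Deltaspike JARs it references, should be placed in the module's directory, which on a default installation would be JBOSS_HOME/modules/org/deltaspike/main/ (the path is derived from the module name; adjust to your setup).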

Reflections

Reflections is a very useful library that lets you find classes carrying a given annotation at runtime.
Add it as a module to your jboss-as.

<?xml version="1.0" encoding="UTF-8"?>
 
<module xmlns="urn:jboss:module:1.1" name="org.reflections">
 
    <resources>
        <resource-root path="reflections.jar" />
    </resources>
 
    <dependencies>
        <module name="com.google.guava" />
        <module name="org.javassist" />
        <module name="org.slf4j" />
    </dependencies>
</module>
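
As before, this file goes in the module's directory, presumably JBOSS_HOME/modules/org/reflections/main/, together with reflections.jar.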

Module extensions

This is the heart of the solution: a CDI extension that runs when the CDI container starts and registers the relevant module beans with it. Make sure this class is part of a JAR file that is inside your EAR file!

This code is greatly influenced by https://rmannibucau.wordpress.com/2013/08/19/adding-legacy-beans-to-cdi-context-a-cdi-extension-sample/

package com.tona.cdi;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import javax.enterprise.event.Observes;
import javax.enterprise.inject.Any;
import javax.enterprise.inject.spi.AfterBeanDiscovery;
import javax.enterprise.inject.spi.AnnotatedType;
import javax.enterprise.inject.spi.Bean;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.inject.spi.BeforeBeanDiscovery;
import javax.enterprise.inject.spi.Extension;
import javax.enterprise.util.AnnotationLiteral;
import javax.inject.Named;
import javax.inject.Singleton;

import org.apache.deltaspike.core.util.bean.BeanBuilder;
import org.apache.deltaspike.core.util.metadata.builder.AnnotatedTypeBuilder;
import org.reflections.Reflections;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Strings;

public class ModuleConfigurationExtension implements Extension {
    private static final Logger log = LoggerFactory.getLogger(ModuleConfigurationExtension.class);
    private final Map<String, AnnotatedType<?>> beans = new HashMap<String, AnnotatedType<?>>();

    /**
     * This method is automatically activated by CDI, and loads all classes in the com.tona
     * package that carry the {@code @Named} or {@code @Singleton} annotation.
     * @param bdd
     */
    void readAllConfigurations(final @Observes BeforeBeanDiscovery bdd, BeanManager bm) {
        log.info("Starting to load beans from modules");
        addBeansFromPackage(bdd, bm, "com.tona");
    }

    private void addBeansFromPackage(final BeforeBeanDiscovery bdd, BeanManager bm, String packageName) {
        Reflections reflections = new Reflections(packageName);
        Set<Class<?>> beanClasses = reflections.getTypesAnnotatedWith(Named.class);
        beanClasses.addAll(reflections.getTypesAnnotatedWith(Singleton.class));

        for (Class<?> bean : beanClasses) {
            @SuppressWarnings({ "unchecked", "rawtypes" })
            AnnotatedType<?> annotatedType = new AnnotatedTypeBuilder().readFromType(bean).create();
            Set<Bean<?>> foundBeans = bm.getBeans(annotatedType.getBaseType(), new AnnotationLiteral<Any>() {
            });

            if (foundBeans.size() == 0) {
                bdd.addAnnotatedType(annotatedType);
                String name;
                Named named = bean.getAnnotation(Named.class);
                if (named == null || Strings.isNullOrEmpty(named.value())) {
                    name = bean.getSimpleName();
                } else {
                    name = named.value();
                }
                beans.put(name, annotatedType);
            }
        }
    }

    /**
     * This method actually initializes the beans we discovered in <code>readAllConfigurations</code>. Again - this
     * method is automatically activated by CDI
     * @param abd
     * @param bm
     * @throws Exception
     */
    public void addCdiBeans(final @Observes AfterBeanDiscovery abd, final BeanManager bm) throws Exception {
        log.info("Starting to initialize beans from modules");

        for (Map.Entry<String, AnnotatedType<?>> bean : beans.entrySet()) {
            Set<Bean<?>> foundBeans = bm.getBeans(bean.getValue().getBaseType(), new AnnotationLiteral<Any>() {
            });

            if (foundBeans.size() == 0) {
                final Bean<?> cdiBean = createBean(bm, bean.getKey(), bean.getValue());
                abd.addBean(cdiBean);
                log.debug("Added bean " + cdiBean.getName());
            }
        }
    }

    @SuppressWarnings({ "unchecked", "rawtypes" })
    private static Bean<?> createBean(final BeanManager bm,
            final String name,
            final AnnotatedType<?> annotatedType)
            throws Exception {
        final BeanBuilder beanBuilder = new BeanBuilder(bm).
                readFromType(annotatedType).
                name(name);

        return beanBuilder.create();
    }
}

Configuring the extension

Create a file called META-INF/services/javax.enterprise.inject.spi.Extension. It should contain only the following line:

com.tona.cdi.ModuleConfigurationExtension

Updating your EAR file

The EAR file should declare dependencies on the org.deltaspike and org.reflections modules. Add them in the MANIFEST.MF file.
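
For example, assuming the module names used above, the relevant MANIFEST.MF entry would be:

Dependencies: org.deltaspike, org.reflections

With the extension in place, a @Named bean that lives in a shared module can be injected inside the EAR like any other CDI bean. A minimal sketch (class and package names are hypothetical, but the package must sit under com.tona, since that is what the extension scans):

package com.tona.module;

import javax.inject.Named;

// Deployed in a shared JBoss module, outside the EAR
@Named
public class SharedGreeter {
    public String greet() {
        return "hello from a module";
    }
}

and, anywhere inside the EAR:

@Inject
private SharedGreeter greeter;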

jmap, jstack not working properly with OpenJDK

I ran into an issue lately with the jmap and jstack implementations in OpenJDK. Quite frankly – they didn't work…
When running jmap -heap, I would get:

Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at sun.tools.jmap.JMap.runTool(JMap.java:197)
	at sun.tools.jmap.JMap.main(JMap.java:128)
Caused by: java.lang.RuntimeException: unknown CollectedHeap type : class sun.jvm.hotspot.gc_interface.CollectedHeap
	at sun.jvm.hotspot.tools.HeapSummary.run(HeapSummary.java:146)
	at sun.jvm.hotspot.tools.Tool.start(Tool.java:221)
	at sun.jvm.hotspot.tools.HeapSummary.main(HeapSummary.java:40)
	... 6 more

When running jstack -F I would get:

java.lang.RuntimeException: Unable to deduce type of thread from address 0x00007fc980001000 (expected type JavaThread, CompilerThread, ServiceThread, JvmtiAgentThread, or SurrogateLockerThread)
	at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:162)
	at sun.jvm.hotspot.runtime.Threads.first(Threads.java:150)
	at sun.jvm.hotspot.runtime.DeadlockDetector.createThreadTable(DeadlockDetector.java:149)
	at sun.jvm.hotspot.runtime.DeadlockDetector.print(DeadlockDetector.java:56)
	at sun.jvm.hotspot.runtime.DeadlockDetector.print(DeadlockDetector.java:39)
	at sun.jvm.hotspot.tools.StackTrace.run(StackTrace.java:52)
	at sun.jvm.hotspot.tools.StackTrace.run(StackTrace.java:45)
	at sun.jvm.hotspot.tools.JStack.run(JStack.java:60)
	at sun.jvm.hotspot.tools.Tool.start(Tool.java:221)
	at sun.jvm.hotspot.tools.JStack.main(JStack.java:86)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at sun.tools.jstack.JStack.runJStackTool(JStack.java:136)
	at sun.tools.jstack.JStack.main(JStack.java:102)
Caused by: sun.jvm.hotspot.types.WrongTypeException: No suitable match for type of address 0x00007fc980001000
	at sun.jvm.hotspot.runtime.InstanceConstructor.newWrongTypeException(InstanceConstructor.java:62)
	at sun.jvm.hotspot.runtime.VirtualConstructor.instantiateWrapperFor(VirtualConstructor.java:80)
	at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:158)
	... 15 more
Can't print deadlocks:Unable to deduce type of thread from address 0x00007fc980001000 (expected type JavaThread, CompilerThread, ServiceThread, JvmtiAgentThread, or SurrogateLockerThread)

The fix was simple (as some kind folks from the OpenJDK project explained to me): make sure the openjdk-debuginfo package is installed. That should fix it.
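
On yum-based distributions this is typically done with something like the following (the exact package name varies by distribution and JDK version):

debuginfo-install java-1.7.0-openjdk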

Postgres and multi-dimensions arrays in JDBC

For a side project I'm doing I needed to use multi-dimensional arrays in PostgreSQL through JDBC. There were no clear examples online on how to do this, and some forum posts claimed it wasn't doable, so I wrote this short JUnit class to test the functionality. Needless to say – it works…
A few points: the JDBC spec recommends calling the Array.free() method after using an array, but in the PostgreSQL driver version I was using (9.0 build 801) this was not supported.

import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Arrays;

import org.junit.Assert;
import org.junit.Test;

public class TestMultiDimensionalArray {

	public Connection getConnection() throws Exception {
		Class.forName("org.postgresql.Driver");
		Connection c = DriverManager.getConnection(
				"jdbc:postgresql://localhost/engine", "user", "pass");

		return c;
	}

	@Test
	public void testCallableMultiDimensionRetValue() throws Exception {
		Connection c = getConnection();
		
		CallableStatement stmt = c.prepareCall("select * from select_schedules()");
		ResultSet rs = stmt.executeQuery();
		while (rs.next()) {
			Array outputArray = rs.getArray(1);
			String[][] realArray = (String[][])outputArray.getArray();
			System.out.println(realArray.length + "-->" + Arrays.toString(realArray[0]));
			
		}
		stmt.close();
		c.close();
	}

	@Test
	public void testCallableMultiDimensionInOutParams() throws Exception {
		Connection c = getConnection();
		
		CallableStatement stmt = c.prepareCall("{ call select_schedules_params(?,?)}");
		String[][] elements = new String[2][];
		elements[0] = new String[] {"meeting_m","lunch_m"};
		elements[1] = new String[] {"training_m","presentation_m"};
		
		Array inArray = c.createArrayOf("text", elements);
		
		stmt.setArray(1, inArray);
		stmt.registerOutParameter (2, java.sql.Types.ARRAY);
		
		stmt.execute();
		
		Array outputArray = stmt.getArray(2);
		Assert.assertNotNull(outputArray);
			
		String[][] realArray = (String[][])outputArray.getArray();
		Assert.assertEquals(2, realArray.length);
		
		stmt.close();
		c.close();
	}

	@Test
	public void testInsertSingleDimension() throws Exception {
		Connection c = getConnection();
		
		PreparedStatement stmt = c.prepareStatement("INSERT INTO sal_emp VALUES ('Bill',?,'{{\"meeting\", \"lunch\"}, {\"training\", \"presentation\"}}');");
		Array myArray = c.createArrayOf("integer", new Integer[] {1000,1000,1000,1000});
		stmt.setArray(1, myArray);
		stmt.execute();
		stmt.close();
		c.close();
	}
	
	@Test
	public void testInsertMultiDimension() throws Exception {
		Connection c = getConnection();
		
		PreparedStatement stmt = c.prepareStatement("INSERT INTO sal_emp VALUES ('multi_Bill',?,?);");
		Array intArray = c.createArrayOf("integer", new Integer[] {1000,1000,1000,1000});
		String[][] elements = new String[2][];
		elements[0] = new String[] {"meeting_m","lunch_m"};
		elements[1] = new String[] {"training_m","presentation_m"};

		//Note - although this is a multi-dimensional array, we still supply the base element of the array
		Array multiArray = c.createArrayOf("text", elements);
		stmt.setArray(1, intArray);
		stmt.setArray(2, multiArray);
		stmt.execute();
		//Note - free is not implemented
//		myArray.free();
		stmt.close();
		c.close();
	}
	
	@Test
	public void testSelectSingleDimension() throws Exception {
		Connection c = getConnection();
		Statement stmt = c.createStatement();
		ResultSet rs = stmt.executeQuery("SELECT 1 || ARRAY[2,3] AS array;");
		if (rs.next()) {
			Array outputArray = rs.getArray(1);
			Integer[] intArray = (Integer[]) outputArray.getArray();

			Assert.assertEquals(3, intArray.length);
			Assert.assertEquals(1, intArray[0].intValue());
			Assert.assertEquals(2, intArray[1].intValue());
			Assert.assertEquals(3, intArray[2].intValue());
		} else {
			Assert.fail("Didn't get array results");
		}

		rs.close();
		stmt.close();
		c.close();
	}

	@Test
	public void testSelectMultiDimension() throws Exception {
		Connection c = getConnection();
		Statement stmt = c.createStatement();
		ResultSet rs = stmt.executeQuery("SELECT ARRAY[1,2] || ARRAY[[3,4]] AS array");
		if (rs.next()) {
			Array outputArray = rs.getArray(1);
			Integer[][] intArray = (Integer[][]) outputArray.getArray();

			Assert.assertEquals(2, intArray.length);
			Assert.assertEquals(1, (int) intArray[0][0]);
			Assert.assertEquals(2, (int) intArray[0][1]);
			Assert.assertEquals(3, (int) intArray[1][0]);
			Assert.assertEquals(4, (int) intArray[1][1]);
		} else {
			Assert.fail("Didn't get array results");
		}

		rs.close();
		stmt.close();
		c.close();

	}
}

Initial SQL configuration was:

CREATE TABLE sal_emp (
    name            text,
    pay_by_quarter  integer[],
    schedule        text[][]
);


CREATE OR REPLACE FUNCTION select_schedules() RETURNS setof sal_emp.schedule%TYPE AS $$
DECLARE
    row sal_emp.schedule%TYPE;
BEGIN
    return query select schedule from sal_emp;
    return;
END
$$ LANGUAGE plpgsql;

CREATE OR REPLACE FUNCTION select_schedules_params(query text[][],OUT data text[][])  AS $$
DECLARE
    row sal_emp.schedule%TYPE;
BEGIN
    select schedule into data from sal_emp where schedule[1][1]=query[1][1];
END
$$ LANGUAGE plpgsql;

Improving LifeRay 6 CAS integration

Lately, I had the dubious pleasure of integrating CAS with LifeRay (the results of which can be seen in my previous posts). Unfortunately, LifeRay assumes that both CAS and LifeRay are connected to the same user store (an LDAP server or a similar security store), and thus that no user import is necessary. But since CAS supports a much wider range of user stores, this is not always the case.
I needed to address this issue, meaning: allow users to log in through CAS even if they are not LifeRay users.

Concept

I replaced the LifeRay CAS filter, making sure that the AttributePrincipal object arriving from the CAS client is stored in the HttpSession.
Then I replaced the LifeRay auto-login class, using the LifeRay API to create a user whenever someone logs in through CAS but does not exist in the internal LifeRay user database.

July-17, 2013 – Since I got many comments on this topic, I decided to open source the code mentioned here. Please see https://github.com/liranzel/liferay-cas-no-ldap/ for details.

The How

Here’s what I did:

  1. Configure LifeRay for CAS (see my previous post – http://tonaconsulting.com/configuring-liferay-and-cas-to-work-with-ldap/), but DON'T configure LifeRay for LDAP
  2. Create a new Java project.
  3. As I use Maven, I used the following pom.xml file:
    <?xml version="1.0"?>
    <project
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
        xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.tona.liferay</groupId>
        <artifactId>Authenticator</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>jar</packaging>
        <name>Authenticator</name>
        <dependencies>
     
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>1.6.6</version>
            </dependency>
     
            <dependency>
                <groupId>javax.portlet</groupId>
                <artifactId>portlet-api</artifactId>
                <version>2.0</version>
            </dependency>
     
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>3.8.1</version>
                <scope>test</scope>
            </dependency>
     
            <dependency>
                <groupId>org.jasig.cas.client</groupId>
                <artifactId>cas-client-core</artifactId>
                <version>3.2.1</version>
            </dependency>
     
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <version>1.2.14</version>
            </dependency>
     
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-client</artifactId>
                <version>6.0.4</version>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-impl</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-service</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>util-java</artifactId>
                <version>6.0.4</version>
            </dependency>
     
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>util-bridges</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
     
        </dependencies>
    </project>
    
  4. I created a new class, called TonaCasFilter, that derives from CASFilter. Note that I had to copy some code from the parent class, as it was not easily extensible 😦
    public class TonaCasFilter extends CASFilter {
    
    	public static String LOGIN = CASFilter.class.getName() + "LOGIN";
    
    	public static void reload(long companyId) {
    		_ticketValidators.remove(companyId);
    	}
    
    	protected Log getLog() {
    		return _log;
    	}
    
    	protected TicketValidator getTicketValidator(long companyId)
    		throws Exception {
    
    		TicketValidator ticketValidator = _ticketValidators.get(companyId);
    
    		if (ticketValidator != null) {
    			return ticketValidator;
    		}
    
    		String serverName = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_SERVER_NAME, PropsValues.CAS_SERVER_NAME);
    		String serverUrl = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_SERVER_URL, PropsValues.CAS_SERVER_URL);
    		String loginUrl = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_LOGIN_URL, PropsValues.CAS_LOGIN_URL);
    
    		Saml11TicketValidator saml11TicketValidator = new Saml11TicketValidator(serverUrl);
    
    		Map<String, String> parameters = new HashMap<String, String>();
    
    		parameters.put("serverName", serverName);
    		parameters.put("casServerUrlPrefix", serverUrl);
    		parameters.put("casServerLoginUrl", loginUrl);
    		parameters.put("redirectAfterValidation", "false");
    
    		saml11TicketValidator.setCustomParameters(parameters);
    
    		_ticketValidators.put(companyId, saml11TicketValidator);
    
    		return saml11TicketValidator;
    	}
    
    	protected void processFilter(
    			HttpServletRequest request, HttpServletResponse response,
    			FilterChain filterChain)
    		throws Exception {
    
    		long companyId = PortalUtil.getCompanyId(request);
    
    		if (PrefsPropsUtil.getBoolean(
    				companyId, PropsKeys.CAS_AUTH_ENABLED,
    				PropsValues.CAS_AUTH_ENABLED)) {
    
    			HttpSession session = request.getSession();
    
    			String pathInfo = request.getPathInfo();
    
    			if (pathInfo.indexOf("/portal/logout") != -1) {
    				session.invalidate();
    
    				String logoutUrl = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_LOGOUT_URL,
    					PropsValues.CAS_LOGOUT_URL);
    
    				response.sendRedirect(logoutUrl);
    
    				return;
    			}
    			else {
    				String login = (String)session.getAttribute(LOGIN);
    
    				String serverName = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_SERVER_NAME,
    					PropsValues.CAS_SERVER_NAME);
    
    				String serviceUrl = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_SERVICE_URL,
    					PropsValues.CAS_SERVICE_URL);
    
    				if (Validator.isNull(serviceUrl)) {
    					serviceUrl = CommonUtils.constructServiceUrl(
    						request, response, serviceUrl, serverName, "ticket",
    						false);
    				}
    
    				String ticket = ParamUtil.getString(request, "ticket");
    
    				if (Validator.isNull(ticket)) {
    					if (Validator.isNotNull(login)) {
    						processFilter(
    								TonaCasFilter.class, request, response, filterChain);
    					}
    					else {
    						String loginUrl = PrefsPropsUtil.getString(
    							companyId, PropsKeys.CAS_LOGIN_URL,
    							PropsValues.CAS_LOGIN_URL);
    
    						loginUrl = HttpUtil.addParameter(
    							loginUrl, "service", serviceUrl);
    
    						response.sendRedirect(loginUrl);
    					}
    
    					return;
    				}
    
    				TicketValidator ticketValidator = getTicketValidator(
    					companyId);
    
    				Assertion assertion = ticketValidator.validate(
    					ticket, serviceUrl);
    
    				if (assertion != null) {
    					AttributePrincipal attributePrincipal =
    						assertion.getPrincipal();
    
    					login = attributePrincipal.getName();
    
    					session.setAttribute(LOGIN, login);
    					session.setAttribute("principal", attributePrincipal);
    				}
    			}
    		}
    
    		processFilter(TonaCasFilter.class, request, response, filterChain);
    	}
    
    	private static Log _log = LogFactoryUtil.getLog(TonaCasFilter.class);
    
    	private static Map<Long, TicketValidator> _ticketValidators =
    		new ConcurrentHashMap<Long, TicketValidator>();
    
    }
    
  5. I then created the new auto-login class. Again – as it was not easily extensible, I had to copy-paste a lot of code from the parent class…
    public class TonaCASAutoLogin extends CASAutoLogin {
    	private Logger logger = LoggerFactory.getLogger(TonaCASAutoLogin.class.getName());
    
    	@Override
    	public String[] login(HttpServletRequest request, HttpServletResponse response) {
    		String[] credentials = null;
    
    		try {
    			long companyId = PortalUtil.getCompanyId(request);
    
    			if (!PrefsPropsUtil.getBoolean(companyId, PropsKeys.CAS_AUTH_ENABLED, PropsValues.CAS_AUTH_ENABLED)) {
    
    				return credentials;
    			}
    
    			HttpSession session = request.getSession();
    
    			String login = (String) session.getAttribute(CASFilter.LOGIN);
    
    			if (Validator.isNull(login)) {
    				return credentials;
    			}
    
    			AttributePrincipal principal = (AttributePrincipal) session.getAttribute("principal");
    			if (principal != null) {
    
    			Map<String, Object> attrs = principal.getAttributes();
    
    				Configuration.getInstance().load();
    				
    				Object groupMembership = attrs.get(Configuration.getInstance().getMemberOfProperty());
    
    				if (groupMembership != null) {
    					com.liferay.portal.service.ServiceContext context = new com.liferay.portal.service.ServiceContext();
    
    					User user = null;
    					
    					String email = attrs.get("email").toString();
    					String lastName = attrs.get("lastName").toString();
    					String firstName = attrs.get("firstName").toString();
    
    					try {
    						user = UserLocalServiceUtil.getUserByScreenName(companyId, login);
    					} catch (NoSuchUserException nsue) {
    						// User not found.
    					}
    
    					// The groups the user needs to belong to
    					long[] mapToGroupsArray = getUserGroups(companyId, groupMembership.toString());
    					
    					// The community we want to map the user to
    					long groupId = 10131;
    
    
    					// User not found - create it.
    					if (user == null) {
    						try {
    							UserLocalServiceUtil.addUser(0, companyId, false, "not-used", "not-used", false,
    									fixScreenName(login), email, 0, "", Locale.getDefault(), firstName, "", lastName,
    									0, 0, true, 1, 1, 1970, null, new long[] {groupId}, null, null, mapToGroupsArray, false, context);
    
    						} catch (Exception e) {
    							logger.error("Can't add user", e);
    						}
    					} else {
    						// User exists - remap groups
    						UserGroupLocalServiceUtil.setUserUserGroups(user.getUserId(), mapToGroupsArray);
    						
    						// Ensure user has the right community
    						
    						UserLocalServiceUtil.addGroupUsers(groupId, new long[] { user.getUserId()});
    					}
    				} 
    			} 
    
    			return super.login(request, response);
    
    		} catch (Throwable e) {
    			logger.error("Can't auto-login, reverting to default behavior", e);
    		}
    
    		return super.login(request, response);
    	}
    
    	private String fixScreenName(String loginName) {
    		
    		String name = loginName;
    		
    		if (name.contains("@")) {
    			name = name.substring(0,name.indexOf("@"));
    		}
    
    		return name;
    	}
    
    	private long[] getUserGroups(long companyId, String groupMembership) throws Exception {
    		String[] groups = groupMembership.split(";");
    
    		List<Long> mapToGroups = new ArrayList<Long>();
    
    		for (String group : groups) {
    			if (group.contains("[")) {
    				group = group.replace('[', ' ');
    				group = group.replace(']', ' ');
    				group = group.trim();
    			}
    			String groupName = group;
    
    			if (groupName != null) {
    				UserGroup liferayGroup = UserGroupLocalServiceUtil.getUserGroup(companyId, groupName);
    				if (liferayGroup != null) {
    					logger.debug("Found user group " + liferayGroup.getUserGroupId());
    					mapToGroups.add(liferayGroup.getUserGroupId());
    				} else {
    					logger.debug("Liferay group " + groupName + " not found");
    				}
    			}
    		}
    
    		long[] mapToGroupsArray = new long[mapToGroups.size()];
    		int i = 0;
    		for (long l : mapToGroups) {
    			mapToGroupsArray[i] = l;
    			++i;
    		}
    		
    		return mapToGroupsArray;
    	}
    }
    Note that you must make sure CAS sends all the relevant properties in the returned SAML response, and that the groups it sends exist in LifeRay.
  6. Now, create a JAR file (mvn clean install), and copy the JAR file to TOMCAT_HOME/webapps/ROOT/WEB-INF/lib
  7. Edit the LifeRay web.xml file. It can be found in TOMCAT_HOME/webapps/ROOT/WEB-INF. Replace the line

    <filter-class>com.liferay.portal.servlet.filters.sso.cas.CASFilter</filter-class>

    with the following line:

    <filter-class>com.tona.security.TonaCasFilter</filter-class>
    
  8. Edit the LifeRay portal-ext.properties file. It can be found in TOMCAT_HOME/webapps/ROOT/WEB-INF/classes. Add the following line:
    auto.login.hooks=com.tona.security.TonaCASAutoLogin
    
  9. Restart LifeRay. All should work...

Configuring LifeRay and CAS to work with LDAP

I saw many tutorials on CAS, Liferay and LDAP – but unfortunately, none of them worked for me. So I decided to document what does work (at least for me).
Note that my environment is based on LifeRay 6.0.5 and CAS 3.5.1.

  1. Configure Tomcat for SSL. I have used port 443. You can read all about it here
    1. After creating the certificates, I ended up adding the following Connector tag to TOMCAT_HOME/conf/server.xml:
    2.  
      <Connector
                 port="443" maxThreads="200"
                 scheme="https" secure="true" SSLEnabled="true"
                 keystoreFile="/root/.keystore" keystorePass="password"
                 clientAuth="false" sslProtocol="TLS"/>  
      
    3. IMPORTANT: I did not manage to get CAS working with a self-signed certificate, so I used a temporary free one.
  2. Configure LifeRay for LDAP
    1. Login to LifeRay
    2. Go to the Control Panel–>Settings–>Authentication–>LDAP
    3. Ensure the “Enabled” check box is selected
    4. I strongly suggest enabling the “Import” checkbox and ensure Import is enabled for server startup.
    5. Add a server
    6. Fill in the LDAP server details (it’s easy to check them with an LDAP browser like jxplorer)
    7. Save your configuration
    8. I usually restart Tomcat after that change, and view the log to see all users were successfully imported
  3. Build CAS
    1. Download CAS (I downloaded it from here)
    2. Unzip the file
    3. Edit the CAS_HOME/cas-server-webapp/pom.xml file and add the following:
    4. <dependency>
           <groupId>org.jasig.cas</groupId>
           <artifactId>cas-server-support-ldap</artifactId>
           <version>3.5.1</version>
      </dependency>
      
    5. Build CAS using maven. The command to run is mvn clean install
  4. Deploy CAS
    1. Copy the newly created WAR file from CAS_HOME/cas-server-webapp/target/cas.war to TOMCAT_HOME/webapps
  5. Configure CAS for LDAP
    1. Edit the TOMCAT_HOME/webapps/cas/WEB-INF/deployerConfigContext.xml
    2. Add the following at the end of the file (just before the closing </beans> tag)
    3. <bean id="contextSource" class="org.springframework.ldap.core.support.LdapContextSource">
        <!-- DO NOT enable JNDI pooling for context sources that perform LDAP bind operations. -->
        <property name="pooled" value="false"/>
       
        <!--
      Although multiple URLs may be defined, it's strongly recommended to avoid this configuration
          since the implementation attempts hosts in sequence and requires a connection timeout
          prior to attempting the next host, which incurs unacceptable latency on node failure.
          A proper HA setup for LDAP directories should use a single virtual host that maps to multiple
          real hosts using a hardware load balancer.
        -->
        <property name="url" value="ldap://LDAP_SERVER:389" />
       
        <!--
          Manager credentials are only required if your directory does not support anonymous searches.
          Never provide these credentials for FastBindLdapAuthenticationHandler since the user's
          credentials are used for the bind operation.
        -->
        <property name="userDn" value="cn=Manager"/>
        <property name="password" value="test"/>
       
        <!-- Place JNDI environment properties here. -->
        <property name="baseEnvironmentProperties">
          <map>
            <!-- Three seconds is an eternity to users. -->
            <entry key="com.sun.jndi.ldap.connect.timeout" value="3000" />
            <entry key="com.sun.jndi.ldap.read.timeout" value="3000" />
       
            <!-- Explained at http://download.oracle.com/javase/1.3/docs/api/javax/naming/Context.html#SECURITY_AUTHENTICATION -->
            <entry key="java.naming.security.authentication" value="simple" />
          </map>
        </property>
      </bean>
      
    4. Add the following under the list tag of the authenticationHandlers tag
    5.       <bean class="org.jasig.cas.adaptors.ldap.BindLdapAuthenticationHandler"
      p:filter="mail=%u"
      p:searchBase="ou=people,dc=test,dc=com"
      p:contextSource-ref="contextSource" />
                            </list>
                    </property>
            </bean>
      
  6. Configure LifeRay for CAS
    1. Login to LifeRay
    2. Go to the Control Panel–>Settings–>Authentication–>CAS
    3. Ensure the “Enabled” check box is selected
    4. Ensure the “LDAP Import” check box is selected
    5. Enter the URLs of the CAS server (an example follows this list)
    6. Save
    7. Add the following line to TOMCAT_HOME/webapps/ROOT/WEB-INF/classes/system-ext.properties
    8. com.liferay.filters.sso.cas.CASFilter=true
      
    9. Add the following line to TOMCAT_HOME/webapps/ROOT/WEB-INF/classes/portal-ext.properties
    10. auto.login.hooks=com.liferay.portal.security.auth.CASAutoLogin
      
    11. Restart Tomcat
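
For reference, the CAS URLs requested above typically have the following form, where cas.example.com stands for your CAS host:

https://cas.example.com/cas/login (login URL)
https://cas.example.com/cas/logout (logout URL)
https://cas.example.com/cas (server URL)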

You can now access your LifeRay instance and get the CAS login page instead…

RuntimeException in Action for tag [rollingPolicy] java.lang.IndexOutOfBoundsException: No group 1

A customer of mine would sometimes get the following exception in his catalina.out log:

ERROR in ch.qos.logback.core.joran.spi.Interpreter@23:25 - RuntimeException in Action for tag [rollingPolicy] java.lang.IndexOutOfBoundsException: No group 1
java.lang.IndexOutOfBoundsException: No group 1
	at java.util.regex.Matcher.group(Matcher.java:470)
	at ch.qos.logback.core.rolling.helper.FileFilterUtil.extractCounter(FileFilterUtil.java:109)
	at ch.qos.logback.core.rolling.helper.FileFilterUtil.findHighestCounter(FileFilterUtil.java:93)
	at ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP.computeCurrentPeriodsHighestCounterValue(SizeAndTimeBasedFNATP.java:65)
	at ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP.start(SizeAndTimeBasedFNATP.java:49)
	at ch.qos.logback.core.rolling.TimeBasedRollingPolicy.start(TimeBasedRollingPolicy.java:87)
	at ch.qos.logback.core.joran.action.NestedComplexPropertyIA.end(NestedComplexPropertyIA.java:167)
	at ch.qos.logback.core.joran.spi.Interpreter.callEndAction(Interpreter.java:318)
	at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:197)
	at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:183)
	at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:62)
	at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:147)
	at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:133)
	at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:96)
	at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:55)
	at ch.qos.logback.classic.util.ContextInitializer.configureByResource(ContextInitializer.java:75)
	at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:148)
	at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:84)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:54)
	at org.slf4j.LoggerFactory.bind(LoggerFactory.java:121)
	at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:111)
	at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:268)
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:241)
	at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:156)
	at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:645)
	at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:184)
	at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:3972)
	at org.apache.catalina.core.StandardContext.start(StandardContext.java:4467)
	at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
	at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
	at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
	at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1041)
	at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:964)
	at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
	at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
	at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
	at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
	at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
	at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
	at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
	at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
	at org.apache.catalina.core.StandardService.start(StandardService.java:519)
	at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
	at org.apache.catalina.startup.Catalina.start(Catalina.java:581)

Turns out the problem was with his logback.xml configuration file:

<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${env.PANORAMA_HOME}/var/log/panorama-sl.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <!-- daily rollover -->
        <fileNamePattern>${env.PANORAMA_HOME}/var/log/panorama-sl.%d{yyyy-MM-dd}.log.gz</fileNamePattern>
                                         
        <timeBasedFileNamingAndTriggeringPolicy   class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <!-- or whenever the file size reaches 100MB -->
            <maxFileSize>100MB</maxFileSize>
       </timeBasedFileNamingAndTriggeringPolicy>
    <!-- keep 10 files worth of history -->
        <maxHistory>10</maxHistory>
    </rollingPolicy>
               
    <Append>true</Append>
    <encoder>
        <!-- default ISO date format enables lexical sorting of dates -->
        <pattern>%-30.-30(%date %level) [%-50.-50thread] %logger{25} %msg%n</pattern>
    </encoder>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>ALL</level>
    </filter>
</appender>

Since the log file sometimes grew beyond 100MB within a single day, logback needed to roll it mid-period – but couldn't, since the fileNamePattern didn't include a %i counter. The correct configuration is:

<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${env.PANORAMA_HOME}/var/log/panorama-sl.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <!-- daily rollover -->
        <fileNamePattern>${env.PANORAMA_HOME}/var/log/panorama-sl.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
                                         
        <timeBasedFileNamingAndTriggeringPolicy   class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <!-- or whenever the file size reaches 100MB -->
            <maxFileSize>100MB</maxFileSize>
       </timeBasedFileNamingAndTriggeringPolicy>
    <!-- keep 10 files worth of history -->
        <maxHistory>10</maxHistory>
    </rollingPolicy>
               
    <Append>true</Append>
    <encoder>
        <!-- default ISO date format enables lexical sorting of dates -->
        <pattern>%-30.-30(%date %level) [%-50.-50thread] %logger{25} %msg%n</pattern>
    </encoder>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>ALL</level>
    </filter>
</appender>

Monitoring seda queues with JBoss JMX console

In my current project we're using Camel, and depend heavily on its SEDA component.
Since we didn't monitor our queues at first, we constantly hit OutOfMemory errors (usually after ~48 hours of heavy use).
We overcame this by limiting the size of the seda queues (using the size attribute – see here for more info).
But now we face QueueFull exceptions, and need to constantly monitor the queue sizes. Since our application runs on top of JBoss, we can use its JMX API for that. And since I'm a bit lazy, I decided to access it through the HTTP jmx-console.
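
For reference, a size-limited seda endpoint is declared with a URI of the following form (the endpoint name here matches one of the monitoring URLs in the code below):

seda:workflowTriggerManager?size=50000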

package com.tona.monitor;

import java.io.ByteArrayOutputStream;
import java.io.FileWriter;
import java.io.InputStream;
import java.net.URL;

public class Main {
	public static void main(String[] args) throws Exception {
		String[] urls = new String[] {
				"http://192.168.155.101:8080/jmx-console/HtmlAdaptor?action=invokeOpByName&name=org.apache.camel%3Acontext%3DacsAdapterCamelContext%2Ctype%3Dendpoints%2Cname%3D%22seda%3A%2F%2FworkflowTriggerManager%5C%3Fsize%3D50000%22&methodName=queueSize",
				"http://192.168.155.101:8080/jmx-console/HtmlAdaptor?action=invokeOpByName&name=org.apache.camel%3Acontext%3DacsAdapterCamelContext%2Ctype%3Dendpoints%2Cname%3D%22seda%3A%2F%2FsyslogAppender%5C%3FconcurrentConsumers%3D4%26timeout%3D5000%22&methodName=queueSize",
		};
		FileWriter fos = new FileWriter("/tmp/queue_log" + System.currentTimeMillis() + ".csv");

		for (String url : urls) {
			System.out.print(getQueueName(url) + ",");
			fos.write(getQueueName(url) + ",");
		}
		System.out.println();
		fos.write("n");
		
		boolean flag = true;
		
		while (flag) {
			for (String url : urls) {
				URL u = new URL(url);
				InputStream is = u.openStream();
				int i = 0;
				ByteArrayOutputStream baos = new ByteArrayOutputStream();
				while ((i = is.read()) != -1) {
					baos.write(i);
				}
				is.close();

				// System.out.println(baos.toString());
				String body = baos.toString();
				int start = body.indexOf("<pre>");
				int end = body.indexOf("</pre>");
				String numOfMessages = body.substring(start + 5, end).trim();
				System.out.print(numOfMessages + ",");
				fos.write(numOfMessages + ",");

			}
			System.out.println();
			fos.write("n");
			fos.flush();
			Thread.sleep(1000);			
		}
		
		fos.close();

	}
	
	private static String getQueueName(String url) {
		String queueNameStart = "seda%3A%2F%2F";
		String queueNameEnd = "%5C%3";
		
		int queueNameStartPos = url.indexOf(queueNameStart) + queueNameStart.length();
		int queueNameEndPos = url.indexOf(queueNameEnd);
		
		if (queueNameEndPos == -1)
			queueNameEndPos = url.length();
		
		return url.substring(queueNameStartPos,queueNameEndPos);
	}

}

Import users into Liferay

If you're using Liferay, and not using LDAP, you will probably run into the same problem I did – how to import a large number of users without manually adding them to the system.

So, loving automation, I’ve decided to create a simple Portlet that does just that.

  1. Create a new Dynamic Web App in Eclipse.
  2. Configure all necessary deployment files (liferay-portlet.xml, portlet.xml, web.xml etc)
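    For reference, a minimal portlet.xml entry for the portlet created in the next steps might look like this (portlet name and view template path are placeholders; adapt to your project):
    
    <?xml version="1.0"?>
    <portlet-app xmlns="http://java.sun.com/xml/ns/portlet/portlet-app_2_0.xsd" version="2.0">
        <portlet>
            <portlet-name>import-users</portlet-name>
            <portlet-class>com.tona.liferay.web.ImportUsersPortlet</portlet-class>
            <init-param>
                <name>view-template</name>
                <value>/view.jsp</value>
            </init-param>
            <portlet-info>
                <title>Import Users</title>
            </portlet-info>
        </portlet>
    </portlet-app>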
  3. Create a new User class:
    package com.tona.liferay.web;

    public class User {
        private String firstName;
        private String lastName;
        private String email;
        private String phoneNo;
        private String screenName;
        private String password;

        public String getEmail() {
            return email;
        }

        public void setEmail(String email) {
            this.email = email;
        }

        public String getPhoneNo() {
            return phoneNo;
        }

        public void setPhoneNo(String phoneNo) {
            this.phoneNo = phoneNo;
        }

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }

        public String getLastName() {
            return lastName;
        }

        public void setLastName(String lastName) {
            this.lastName = lastName;
        }

        public String getScreenName() {
            return screenName;
        }

        public void setScreenName(String screenName) {
            this.screenName = screenName;
        }

        public String getPassword() {
            return password;
        }

        public void setPassword(String password) {
            this.password = password;
        }

        public User(String line) {
            String[] tokens = line.split(",");
            setFirstName(tokens[1]);
            setLastName(tokens[2]);
            setEmail(tokens[3]);
            setPhoneNo(tokens[4]);
            String screenName = getFirstName() + getLastName().substring(0, 3);
            setScreenName(screenName.toLowerCase());
            setPassword(getScreenName() + "123");
        }

        public User() {
        }
    }
    
  4. Create the portlet itself:
    package com.tona.liferay.web;

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;

    import javax.portlet.ActionRequest;
    import javax.portlet.ActionResponse;
    import javax.portlet.PortletException;

    import com.liferay.portal.model.Company;
    import com.liferay.portal.service.CompanyLocalServiceUtil;
    import com.liferay.portal.service.UserLocalServiceUtil;
    import com.liferay.util.bridges.mvc.MVCPortlet;

    public class ImportUsersPortlet extends MVCPortlet {
        public void importUsers(ActionRequest actionRequest,
                ActionResponse actionResponse) throws IOException, PortletException {

            String fileName = actionRequest.getParameter("fileName");

            BufferedReader fr = new BufferedReader(new FileReader(fileName));

            List<User> users = new ArrayList<User>();

            String line;

            while ((line = fr.readLine()) != null) {
                users.add(new User(line));
            }

            fr.close();

            // We now have the user list
            com.liferay.portal.service.ServiceContext context = new com.liferay.portal.service.ServiceContext();
            long companyId = 0;

            try {
                Company company = CompanyLocalServiceUtil.getCompanies().get(0);
                companyId = company.getCompanyId();
                for (User user : users) {
                    try {
                        UserLocalServiceUtil.addUser(0, companyId, false,
                                user.getPassword(), user.getPassword(), false, user.getScreenName(),
                                user.getEmail(), 0, "", Locale.getDefault(),
                                user.getFirstName(), "", user.getLastName(), 0, 0,
                                true, 1, 1, 1970, null, null, null, null, null, false,
                                context);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    
  5. Create a WAR file and deploy it in Liferay
  6. Note that the portlet does not upload the CSV file – it expects the file to already exist on the Liferay server itself
  7. You can of course change how screen names and passwords are generated, by changing the User constructor.
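
For reference, a CSV line the User constructor above will accept looks like this (the first token is ignored by the parsing code):

1,John,Smith,john.smith@example.com,+1-555-0100

This would yield the screen name johnsmi and the password johnsmi123.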

Updating Pentaho PRPT files to add a PreProcessor

In my previous post (see here) I mentioned that I couldn’t add a pre-processor to a Pentaho report using the report designer. So, I’ve written a short Java program that does just that.
Note that I use a neat open source library called Zip4J (you can get it here).

package com.tona.rprt;

import java.io.File;
import java.io.FileWriter;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;

import net.lingala.zip4j.core.ZipFile;
import net.lingala.zip4j.model.ZipParameters;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

public class Main {
	private static final String CONFIG_FILE_NAME = "layout.xml";
	
	public static void main(String[] args) throws Exception {
		ZipFile reportFile = new ZipFile(""); // put the path to your .prpt file here

		File tempDirectory = createTempDirectory();
		String path = tempDirectory.getAbsolutePath();
		reportFile.extractFile(CONFIG_FILE_NAME, path);

		System.out.println("Extraced file to " + path);
		File updatedFile = new File(path + File.separator + CONFIG_FILE_NAME);

		// Update the file
		DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
		DocumentBuilder db = dbf.newDocumentBuilder();
		Document doc = db.parse(updatedFile);
		
		System.out.println("Parsed document");
		
		Element layoutNode = doc.getDocumentElement();
		Element preProcessorElement = doc.createElement("preprocessor");
		preProcessorElement.setAttribute("class", "com.tona.report.infra.TonaWizardProcessor");
		Node firstLayoutChild = layoutNode.getFirstChild(); 
		layoutNode.insertBefore(preProcessorElement, firstLayoutChild);
		
		System.out.println("Added child");

		FileWriter output = new FileWriter(updatedFile);
		javax.xml.transform.stream.StreamResult result = new javax.xml.transform.stream.StreamResult(output);

		TransformerFactory tf = TransformerFactory.newInstance();
		Transformer t = tf.newTransformer();
		t.transform(new DOMSource(doc), result);
		
		System.out.println("Updated XML file");
		
		ZipParameters parameters = new ZipParameters();
		reportFile.removeFile(CONFIG_FILE_NAME);
		reportFile.addFile(updatedFile, parameters);
		
		System.out.println("Update ZIP file");
		
		// Delete the extracted copy first, otherwise the directory is not empty and cannot be deleted
		updatedFile.delete();
		tempDirectory.delete();
		
		System.out.println("Removed temporary directory");
	}
	
	private static File createTempDirectory() throws Exception
		{
		    File temp = File.createTempFile("temp", Long.toString(System.nanoTime()));

		    if(!(temp.delete())) {
		        throw new Exception("Could not delete temp file: " + temp.getAbsolutePath());
		    }

		    if(!(temp.mkdir())) {
		        throw new Exception("Could not create temp directory: " + temp.getAbsolutePath());
		    }

		    return temp;
		}	
}
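
After running this against a report, the layout.xml inside the PRPT should contain, directly under its root element, a line like the following (with the preprocessor class configured above):

<preprocessor class="com.tona.report.infra.TonaWizardProcessor"/>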

Using Oracle tkprof with JDBC thin client application

When attempting to profile a Hibernate-based application, I found a statement that was incredibly slow and caused the system to basically halt for a few seconds before resuming execution. I wanted to profile it at the database level, and the best tool for the job is Oracle's own tkprof.
The input for tkprof is a session trace file, and enabling one is a bit tricky. The reason: a JavaEE application with multiple threads has multiple connections, and therefore multiple database sessions, while SQL_TRACE operates on a per-session level (I didn't want to enable it for the entire database – the trace file would be totally unusable…).
So I took the code, ran it in a standalone Java application, and enabled SQL trace. Here's how:

package com.tona.jdbc;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.sql.Types;

import oracle.jdbc.OracleCallableStatement;

public class TestJDBC {
	public static void main(String[] args) throws Exception {
		Connection c = DriverManager.getConnection(JDBC_CONNECTION_STRING); // placeholder for your Oracle thin URL, e.g. jdbc:oracle:thin:@host:1521:SID
		
		// Turn SQL_TRACE on for this session
		Statement stmt = c.createStatement();
		stmt.executeUpdate("ALTER SESSION SET SQL_TRACE=TRUE");
		stmt.close();

		// Set the trace file location
		stmt = c.createStatement();
		stmt.executeUpdate("alter system set USER_DUMP_DEST='/tmp'");
		stmt.close();

		// JDBC logic comes here...

		c.close();
	}
}

Note
Changing the USER_DUMP_DEST parameter did not have any effect, and the SQL_TRACE output was written to the default trace directory on the server (in my case /u01/app/oracle/diag/rdbms/SID/SID/trace).
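
Once the .trc file shows up in the trace directory, feeding it to tkprof is straightforward. A typical invocation (file names are examples) would be:

tkprof SID_ora_12345.trc report.txt sys=no sort=exeela

Here sys=no filters out recursive SYS statements, and sort=exeela sorts statements by elapsed execution time, putting the slowest ones on top.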