Mocking final classes with Mockito and Javassist

Sometimes you need to mock final classes. I never understood why people write final classes, but since they do – and we need to mock those classes – here is a nice solution. I’m using Javassist here, a framework that can manipulate classes at run time.

Note that this code must run before anything else in your unit tests – a constructor, @Before, @BeforeClass or @PostConstruct (if you use Spring) annotation will not do. It has to be a static block on your unit test class (or its base class). Otherwise, your final class will already be loaded, and you’ll get weird errors such as a LinkageError (“attempted duplicate class definition”).
First of all, add Javassist to your project. I use Gradle, so:

testCompile group: 'org.javassist', name: 'javassist', version: '3.22.0-GA'

Note the testCompile – we only use Mockito and Javassist in our test classes, so there is no need to bundle them with your production code.
Then, write a static block on your unit test class (or its base class – it will work either way):

static {
    try {
        ClassPool cp = ClassPool.getDefault();
        // Replace with the fully qualified name of the final class you need to mock
        CtClass cc = cp.get("fully qualified class name");
        cc.defrost();

        // Clear the FINAL modifier and reload the modified class
        int clzModifiers = cc.getModifiers();
        clzModifiers = javassist.Modifier.clear(clzModifiers, Modifier.FINAL);
        cc.setModifiers(clzModifiers);
        cc.toClass();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

And voilà. You can now use @MockBean, Mockito.mock, or any other method you use to mock classes.
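For example, assuming the static block above defrosted a hypothetical final class called MyFinalClass:

```java
// MyFinalClass stands in for whatever final class you defrosted above
MyFinalClass mock = Mockito.mock(MyFinalClass.class);
Mockito.when(mock.getValue()).thenReturn("stubbed");
```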
Enjoy.

Using BouncyCastle FIPS for Java FIPS support

How do you use BC FIPS in a Java app?

I’ve been trying recently to use the BC FIPS module in my Java app. It turns out it’s not as simple as you’d think.
The problems I faced were mainly with the keystore format, but other issues came up as well.

1. Download the bc-fips-1.0.0.jar file (download the latest and greatest from here)
2. Place it in jre/lib/ext
3. Edit the jre/lib/security/java.security file. Change the SunJSSE provider entry so that it runs in FIPS mode, backed by the BCFIPS provider:
security.provider.4=com.sun.net.ssl.internal.ssl.Provider BCFIPS
4. Edit jre/lib/security/java.security file. Add the following line:
security.provider.11=org.bouncycastle.jcajce.provider.BouncyCastleFipsProvider
(Make sure you use the right numbering – the provider numbers must be consecutive)
5. Create your keystore:
keytool -genkey -storetype BCFKS -alias mykey -keyalg RSA -provider org.bouncycastle.jcajce.provider.BouncyCastleFipsProvider -storepass test123 -keystore test_fips
Of course, you can change the parameters as you need
6. Add the following line in your code (I prefer that over the java.security changes)
new com.sun.net.ssl.internal.ssl.Provider("BCFIPS");
7. If your code requires specifying the keystore type, use the following constant – BCFKS

You should be OK…
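For the programmatic route (step 6), a rough sketch – untested, and assuming bc-fips-1.0.0.jar is on the classpath – would be:

```java
import java.security.KeyStore;
import java.security.Security;

import org.bouncycastle.jcajce.provider.BouncyCastleFipsProvider;

// Register the BC FIPS JCE provider, then SunJSSE in FIPS mode backed by it
Security.addProvider(new BouncyCastleFipsProvider());
Security.addProvider(new com.sun.net.ssl.internal.ssl.Provider("BCFIPS"));

// Step 7: load the keystore created above using the BCFKS store type
KeyStore ks = KeyStore.getInstance("BCFKS");
```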

Implementing IGNORE_ROW_ON_DUPKEY_INDEX in Postgres

Unfortunately for us all, Postgres does not support the IGNORE_ROW_ON_DUPKEY_INDEX hint. So if you have highly concurrent code that inserts data into a table with a unique constraint, you’re in for a lot of potential problems.

There is no easy fix (besides handling that scenario in the code). But if the insert rate on the table is not high, the following solution can work for you:

CREATE OR REPLACE FUNCTION lock_table() RETURNS trigger AS $$
DECLARE
  cnt integer;
  p_table_name varchar;
  p_query varchar;
  p_query_temp varchar;
  p_param_name varchar;
  p_param_type varchar;
BEGIN
  p_param_type = TG_ARGV[0];
  p_param_name = TG_ARGV[1];
  p_table_name = TG_TABLE_NAME || '_lock';

  EXECUTE 'SELECT $1."' || p_param_name || '"'
       INTO p_query_temp
      USING NEW;

  IF p_param_type = 'string' THEN
    p_query = 'select 1 from ' || TG_TABLE_NAME || ' where ' || p_param_name || '=''' || p_query_temp || '''';
  ELSE
    p_query = 'select 1 from ' || TG_TABLE_NAME || ' where ' || p_param_name || '=' || p_query_temp;
  END IF;

  execute 'lock table ' || p_table_name || '  in exclusive mode';
  execute p_query into cnt;
  IF cnt = 1 THEN
    RETURN NULL;
  ELSE
    RETURN NEW;
  END IF;

END; $$ LANGUAGE plpgsql;

create table my_table_lock(id varchar(1024) not null);

CREATE TRIGGER my_table_on_duplicate_ignore BEFORE INSERT OR UPDATE ON my_table
    FOR EACH ROW EXECUTE procedure lock_table('string','name');

Now, locking the table in exclusive mode forces the database to write rows one at a time, disabling concurrency. It’s not perfect – but it will work.

Another word of caution – this solution might break your ORM tool (specifically, it happened to me with Hibernate), as returning NULL from the trigger causes the update count to drop to 0. I couldn’t find a good way around that problem. If you have an idea – I’d be happy to hear it.
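To illustrate the end result – assuming the trigger above is installed on a hypothetical my_table with a name column (plus the matching my_table_lock table) – a duplicate insert is silently dropped:

```sql
INSERT INTO my_table (name) VALUES ('foo');  -- row inserted
INSERT INTO my_table (name) VALUES ('foo');  -- duplicate: trigger returns NULL, row skipped
```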

Inject beans into JBoss 7 Modules

It seems like JBoss 7.0 CDI has an issue with injecting beans that are located in shared libraries and are not part of an EAR file. Irritating (but solved in the next version). However, I didn’t want to upgrade, so I decided to work around it instead.
Here’s a short step-by-step:

Deltaspike

Deltaspike is an Apache library that provides several useful CDI extensions. Add it as a module to your JBoss AS. You can use the following module.xml file:

<?xml version="1.0" encoding="UTF-8"?>
 
<module xmlns="urn:jboss:module:1.1" name="org.deltaspike">
 
    <resources>
        <resource-root path="deltaspike-core-api.jar" />
        <resource-root path="deltaspike-core-impl.jar" />
    </resources>
 
    <dependencies>
        <module name="com.google.guava" />
        <module name="javax.enterprise.api" />
        <module name="javax.inject.api" />
    </dependencies>
</module>

Reflections

Reflections is a very useful library that lets you find, at runtime, classes that carry a given annotation.
Add it as a module to your jboss-as.

<?xml version="1.0" encoding="UTF-8"?>
 
<module xmlns="urn:jboss:module:1.1" name="org.reflections">
 
    <resources>
        <resource-root path="reflections.jar" />
    </resources>
 
    <dependencies>
        <module name="com.google.guava" />
        <module name="org.javassist" />
        <module name="org.slf4j" />
    </dependencies>
</module>

Module extensions

This is the heart of the solution. This module runs when the CDI container starts, and adds relevant beans to the CDI. Make sure this class is part of a JAR file that is inside your EAR file!

This code is greatly influenced by https://rmannibucau.wordpress.com/2013/08/19/adding-legacy-beans-to-cdi-context-a-cdi-extension-sample/

package com.tona.cdi;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import javax.enterprise.event.Observes;
import javax.enterprise.inject.Any;
import javax.enterprise.inject.spi.AfterBeanDiscovery;
import javax.enterprise.inject.spi.AnnotatedType;
import javax.enterprise.inject.spi.Bean;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.inject.spi.BeforeBeanDiscovery;
import javax.enterprise.inject.spi.Extension;
import javax.enterprise.util.AnnotationLiteral;
import javax.inject.Named;
import javax.inject.Singleton;

import org.apache.deltaspike.core.util.bean.BeanBuilder;
import org.apache.deltaspike.core.util.metadata.builder.AnnotatedTypeBuilder;
import org.reflections.Reflections;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Strings;

public class ModuleConfigurationExtension implements Extension {
    private static final Logger log = LoggerFactory.getLogger(ModuleConfigurationExtension.class);
    private final Map<String, AnnotatedType<?>> beans = new HashMap<String, AnnotatedType<?>>();

    /**
     * This method is automatically activated by CDI, and loads all classes in the com.tona package that carry the
     * {@link Named} or {@link Singleton} annotation.
     * @param bdd
     */
    void readAllConfigurations(final @Observes BeforeBeanDiscovery bdd, BeanManager bm) {
        log.info("Starting to load beans from modules");
        addBeansFromPackage(bdd, bm, "com.tona");
    }

    private void addBeansFromPackage(final BeforeBeanDiscovery bdd, BeanManager bm, String packageName) {
        Reflections reflections = new Reflections(packageName);
        Set<Class<?>> beanClasses = reflections.getTypesAnnotatedWith(Named.class);
        beanClasses.addAll(reflections.getTypesAnnotatedWith(Singleton.class));

        for (Class<?> bean : beanClasses) {
            @SuppressWarnings({ "unchecked", "rawtypes" })
            AnnotatedType<?> annotatedType = new AnnotatedTypeBuilder().readFromType(bean).create();
            Set<Bean<?>> foundBeans = bm.getBeans(annotatedType.getBaseType(), new AnnotationLiteral<Any>() {
            });

            if (foundBeans.isEmpty()) {
                bdd.addAnnotatedType(annotatedType);
                String name;
                Named named = bean.getAnnotation(Named.class);
                if (named == null || Strings.isNullOrEmpty(named.value())) {
                    name = bean.getSimpleName();
                } else {
                    name = named.value();
                }
                beans.put(name, annotatedType);
            }
        }
    }

    /**
     * This method actually initializes the beans we discovered in <code>readAllConfigurations</code>. Again - this
     * method is automatically activated by CDI.
     * @param abd
     * @param bm
     * @throws Exception
     */
    public void addCdiBeans(final @Observes AfterBeanDiscovery abd, final BeanManager bm) throws Exception {
        log.info("Starting to initialize beans from modules");

        for (Map.Entry<String, AnnotatedType<?>> bean : beans.entrySet()) {
            Set<Bean<?>> foundBeans = bm.getBeans(bean.getValue().getBaseType(), new AnnotationLiteral<Any>() {
            });

            if (foundBeans.isEmpty()) {
                final Bean<?> cdiBean = createBean(bm, bean.getKey(), bean.getValue());
                abd.addBean(cdiBean);
                log.debug("Added bean " + cdiBean.getName());
            }
        }
    }

    private static <T> Bean<T> createBean(final BeanManager bm,
            final String name,
            final AnnotatedType<T> annotatedType)
            throws Exception {
        final BeanBuilder<T> beanBuilder = new BeanBuilder<T>(bm).
                readFromType(annotatedType).
                name(name);

        return beanBuilder.create();
    }
}

Configuring the extension

Create a file called META-INF/services/javax.enterprise.inject.spi.Extension. It should only have the following line:

com.tona.cdi.ModuleConfigurationExtension

Updating your EAR file

The EAR file should declare dependencies on the org.deltaspike and org.reflections modules. Add them in the MANIFEST.MF file.
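For example (module names matching the module.xml files above), the manifest entry would look something like:

```
Dependencies: org.deltaspike, org.reflections
```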

Salesforce Delegated Authentication

This post is radically different from my previous posts – it’s going to be written in C#!
Salesforce allows users to use a delegated authentication mechanism for SSO. One option is SAML, which is nice – but it doesn’t work on mobile devices in disconnected mode. The other is delegated authentication. With this option, Salesforce calls a web service that implements a predefined WSDL. The parameters the web service receives are the username, password and source IP address, and the service needs to return a true/false value.
So, let’s get down to business:

  1. Configure Delegated Authentication
    1. Open your Salesforce account for delegated authentication. For some reason, this is not enabled by default, and you need to ask your SF contacts to enable this feature.
    2. Login to Salesforce, and click the Setup link
    3. Click Security Controls→Single Sign-On Settings
    4. Click on Edit, and enter your Web Service URL
  2. Assign users to the Delegated Authentication
    1. Login to Salesforce, and click the Setup link
    2. Click Manage Users→Profiles
    3. Select the user profile
    4. Click the Edit button
    5. Make sure the “Is Single Sign-On Enabled” checkbox is enabled
    6. Click Save

    And now the code

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.DirectoryServices;
    using System.DirectoryServices.Protocols;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    using System.Threading;
    using System.Web;
    using System.Web.Services;
    
    namespace DelegatedAuthenticationService
    {
        /// <summary>
        /// This service is used for delegated security for force.com
        /// </summary>
        [WebService(Namespace = "urn:authentication.soap.sforce.com", Description="v1.1.3")]
        [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
        [System.ComponentModel.ToolboxItem(false)]
        public class DelegatedSecurityService : System.Web.Services.WebService, IAuthenticationBinding
        {
            [WebMethod]
            public bool Authenticate(string username, string password, string sourceIp, System.Xml.XmlElement[] Any)
            {
                    try
                    {
                        // Run the business logic here (e.g. validate the
                        // credentials against your user store)
                        return true;
                    }
                    catch (Exception e)
                    {
                        // Connection could not be created - password is incorrect.
                        // log() and audit() are helper methods omitted from this listing.
                        log(ERROR, "Failed to get LDAP connection. Error message is : " + e.Message);
                        audit(username, "FAIL", e.Message);
                        return false;
                    }
            }
        }
    }
    

    It’s important to note that Salesforce limits the time it will wait for the service – the entire request/response (including network time) must take less than ~5 seconds, otherwise users will get a failed-login message.

    Good luck!

jmap, jstack not working properly with OpenJDK

I ran into an issue lately with the jmap and jstack implementations in OpenJDK. Quite frankly – they didn’t work…
When running jmap -heap, I would get:

Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at sun.tools.jmap.JMap.runTool(JMap.java:197)
	at sun.tools.jmap.JMap.main(JMap.java:128)
Caused by: java.lang.RuntimeException: unknown CollectedHeap type : class sun.jvm.hotspot.gc_interface.CollectedHeap
	at sun.jvm.hotspot.tools.HeapSummary.run(HeapSummary.java:146)
	at sun.jvm.hotspot.tools.Tool.start(Tool.java:221)
	at sun.jvm.hotspot.tools.HeapSummary.main(HeapSummary.java:40)
	... 6 more

When running jstack -F I would get:

java.lang.RuntimeException: Unable to deduce type of thread from address 0x00007fc980001000 (expected type JavaThread, CompilerThread, ServiceThread, JvmtiAgentThread, or SurrogateLockerThread)
	at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:162)
	at sun.jvm.hotspot.runtime.Threads.first(Threads.java:150)
	at sun.jvm.hotspot.runtime.DeadlockDetector.createThreadTable(DeadlockDetector.java:149)
	at sun.jvm.hotspot.runtime.DeadlockDetector.print(DeadlockDetector.java:56)
	at sun.jvm.hotspot.runtime.DeadlockDetector.print(DeadlockDetector.java:39)
	at sun.jvm.hotspot.tools.StackTrace.run(StackTrace.java:52)
	at sun.jvm.hotspot.tools.StackTrace.run(StackTrace.java:45)
	at sun.jvm.hotspot.tools.JStack.run(JStack.java:60)
	at sun.jvm.hotspot.tools.Tool.start(Tool.java:221)
	at sun.jvm.hotspot.tools.JStack.main(JStack.java:86)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at sun.tools.jstack.JStack.runJStackTool(JStack.java:136)
	at sun.tools.jstack.JStack.main(JStack.java:102)
Caused by: sun.jvm.hotspot.types.WrongTypeException: No suitable match for type of address 0x00007fc980001000
	at sun.jvm.hotspot.runtime.InstanceConstructor.newWrongTypeException(InstanceConstructor.java:62)
	at sun.jvm.hotspot.runtime.VirtualConstructor.instantiateWrapperFor(VirtualConstructor.java:80)
	at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:158)
	... 15 more
Can't print deadlocks:Unable to deduce type of thread from address 0x00007fc980001000 (expected type JavaThread, CompilerThread, ServiceThread, JvmtiAgentThread, or SurrogateLockerThread)

The fix was simple (as some helpful folks on the OpenJDK mailing list explained to me): make sure you install the openjdk-debuginfo package. That should fix it.
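On yum-based distributions that would be something along these lines (package names vary by distribution and JDK version – check your repositories):

```
# RHEL / CentOS / Fedora, with the debuginfo repositories enabled
debuginfo-install java-1.7.0-openjdk
# or install the package directly
yum install java-1.7.0-openjdk-debuginfo
```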

Using Oracle’s ListAgg with distinct

Oracle 11g’s ListAgg function is a great way to concatenate multiple rows into a single column. However, it has a serious limitation (in addition to the sparse documentation) – you can’t use distinct inside ListAgg. Online solutions suggest subqueries or regular expressions; I decided to write my own aggregate function to replace ListAgg.
Here goes:

  • First of all, I created the type specification:
    create or replace type TextAggregation as object
    (
      aggString VARCHAR2(32767), 
      static function ODCIAggregateInitialize(sctx IN OUT TextAggregation) 
        return number,
      member function ODCIAggregateIterate(self IN OUT TextAggregation, 
        value IN VARCHAR2) return number,
      member function ODCIAggregateTerminate(self IN TextAggregation, 
        returnValue OUT VARCHAR2, flags IN number) return number,
      member function ODCIAggregateMerge(self IN OUT TextAggregation, 
        ctx2 IN TextAggregation) return number
    );
    /
    
  • Then, the type body:
    create or replace type body TextAggregation is 
    static function ODCIAggregateInitialize(sctx IN OUT TextAggregation) 
    return number is 
    begin
      sctx := TextAggregation('');
      return ODCIConst.Success;
    end;
    
    member function ODCIAggregateIterate(self IN OUT TextAggregation, value IN VARCHAR2) return number is
      location number;
    begin
      -- Skip values that were already aggregated - this is the "distinct" part
      location := instr(',' || aggString || ',', ',' || value || ',');

      if location > 0 then
        return ODCIConst.Success;
      end if;

      if (aggString is null) then
        aggString := value;
      else
        aggString := aggString || ',' || value;
      end if;

      return ODCIConst.Success;
    end;
    
    member function ODCIAggregateTerminate(self IN TextAggregation, 
        returnValue OUT VARCHAR2, flags IN number) return number is
    begin
      returnValue := self.aggString;
      return ODCIConst.Success;
    end;
    
    member function ODCIAggregateMerge(self IN OUT TextAggregation, ctx2 IN TextAggregation) return number is
    begin
      -- Combine partial results from parallel execution (note: duplicates
      -- across partial results are not re-checked here)
      if self.aggString is null then
        self.aggString := ctx2.aggString;
      elsif ctx2.aggString is not null then
        self.aggString := self.aggString || ',' || ctx2.aggString;
      end if;
      return ODCIConst.Success;
    end;
    end;
    /
    
  • And then the actual function
    CREATE or replace FUNCTION MyListAgg (input VARCHAR2) RETURN VARCHAR2 PARALLEL_ENABLE AGGREGATE USING TextAggregation;
    /
    
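Usage is then the same as any other aggregate. For example, against a hypothetical emp table:

```sql
-- Concatenates the distinct ename values in each department
SELECT deptno, MyListAgg(ename) AS names
FROM emp
GROUP BY deptno;
```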

Postgres and multi-dimensions arrays in JDBC

For a side project I’m doing I needed to use multi-dimensional arrays in PostgreSQL through JDBC. There were no clear examples online of how to do this – and some forum posts claimed it wasn’t doable – so I wrote this short JUnit class to test the functionality. Needless to say, it works…
One note – the JDBC spec recommends calling the Array.free() method after using an array, but in the PostgreSQL driver version I was using (9.0 build 801) this was not supported.

import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Arrays;

import org.junit.Assert;
import org.junit.Test;

public class TestMultiDimensionalArray {

	public Connection getConnection() throws Exception {
		Class.forName("org.postgresql.Driver");
		Connection c = DriverManager.getConnection(
				"jdbc:postgresql://localhost/engine", "user", "pass");

		return c;
	}

	@Test
	public void testCallableMultiDimensionRetValue() throws Exception {
		Connection c = getConnection();
		
		CallableStatement stmt = c.prepareCall("select * from select_schedules()");
		ResultSet rs = stmt.executeQuery();
		while (rs.next()) {
			Array outputArray = rs.getArray(1);
			String[][] realArray = (String[][])outputArray.getArray();
			System.out.println(realArray.length + "-->" + Arrays.toString(realArray[0]));
			
		}
		stmt.close();
		c.close();
	}

	@Test
	public void testCallableMultiDimensionInOutParams() throws Exception {
		Connection c = getConnection();
		
		CallableStatement stmt = c.prepareCall("{ call select_schedules_params(?,?)}");
		String[][] elements = new String[2][];
		elements[0] = new String[] {"meeting_m","lunch_m"};
		elements[1] = new String[] {"training_m","presentation_m"};
		
		Array inArray = c.createArrayOf("text", elements);
		
		stmt.setArray(1, inArray);
		stmt.registerOutParameter (2, java.sql.Types.ARRAY);
		
		stmt.execute();
		
		Array outputArray = stmt.getArray(2);
		Assert.assertNotNull(outputArray);
			
		String[][] realArray = (String[][])outputArray.getArray();
		Assert.assertEquals(2, realArray.length);
		
		stmt.close();
		c.close();
	}

	@Test
	public void testInsertSingleDimension() throws Exception {
		Connection c = getConnection();
		
		PreparedStatement stmt = c.prepareStatement("INSERT INTO sal_emp VALUES ('Bill',?,'{{\"meeting\", \"lunch\"}, {\"training\", \"presentation\"}}');");
		Array myArray = c.createArrayOf("integer", new Integer[] {1000,1000,1000,1000});
		stmt.setArray(1, myArray);
		stmt.execute();
		stmt.close();
		c.close();
	}
	
	@Test
	public void testInsertMultiDimension() throws Exception {
		Connection c = getConnection();
		
		PreparedStatement stmt = c.prepareStatement("INSERT INTO sal_emp VALUES ('multi_Bill',?,?);");
		Array intArray = c.createArrayOf("integer", new Integer[] {1000,1000,1000,1000});
		String[][] elements = new String[2][];
		elements[0] = new String[] {"meeting_m","lunch_m"};
		elements[1] = new String[] {"training_m","presentation_m"};

		//Note - although this is a multi-dimensional array, we still supply the base element of the array
		Array multiArray = c.createArrayOf("text", elements);
		stmt.setArray(1, intArray);
		stmt.setArray(2, multiArray);
		stmt.execute();
		//Note - free is not implemented
//		myArray.free();
		stmt.close();
		c.close();
	}
	
	@Test
	public void testSelectSingleDimension() throws Exception {
		Connection c = getConnection();
		Statement stmt = c.createStatement();
		ResultSet rs = stmt.executeQuery("SELECT 1 || ARRAY[2,3] AS array;");
		if (rs.next()) {
			Array outputArray = rs.getArray(1);
			Integer[] intArray = (Integer[]) outputArray.getArray();

			Assert.assertEquals(3, intArray.length);
			Assert.assertEquals(intArray[0].intValue(), 1);
			Assert.assertEquals(intArray[1].intValue(), 2);
			Assert.assertEquals(intArray[2].intValue(), 3);
		} else {
			Assert.fail("Didn't get array results");
		}

		rs.close();
		stmt.close();
		c.close();
	}

	@Test
	public void testSelectMultiDimension() throws Exception {
		Connection c = getConnection();
		Statement stmt = c.createStatement();
		ResultSet rs = stmt.executeQuery("SELECT ARRAY[1,2] || ARRAY[[3,4]] AS array");
		if (rs.next()) {
			Array outputArray = rs.getArray(1);
			Integer[][] intArray = (Integer[][]) outputArray.getArray();

			Assert.assertEquals(2, intArray.length);
			Assert.assertEquals(1, (int) intArray[0][0]);
			Assert.assertEquals(2, (int) intArray[0][1]);
			Assert.assertEquals(3, (int) intArray[1][0]);
			Assert.assertEquals(4, (int) intArray[1][1]);
		} else {
			Assert.fail("Didn't get array results");
		}

		c.close();

	}
}

Initial SQL configuration was:

CREATE TABLE sal_emp (
    name            text,
    pay_by_quarter  integer[],
    schedule        text[][]
);


CREATE OR REPLACE FUNCTION select_schedules() RETURNS setof sal_emp.schedule%TYPE AS $$
DECLARE
    row sal_emp.schedule%TYPE;
BEGIN
    return query select schedule from sal_emp;
    return;
END
$$ LANGUAGE plpgsql;

CREATE OR REPLACE FUNCTION select_schedules_params(query text[][],OUT data text[][])  AS $$
DECLARE
    row sal_emp.schedule%TYPE;
BEGIN
    select schedule into data from sal_emp where schedule[1][1]=query[1][1];
END
$$ LANGUAGE plpgsql;

Improving LifeRay 6 CAS integration

Lately, I had the dubious pleasure of integrating CAS with LifeRay (the results of which can be seen in my previous posts). Unfortunately, LifeRay assumes that both CAS and LifeRay are connected to the same user store (an LDAP server or a similar security store), and thus that no user import is necessary. But since CAS supports a much wider range of user stores – this is not always the case.
I needed to address this issue – meaning, allow users to log in through CAS even if they are not LifeRay users.

Concept

I replaced the LifeRay CAS filter, and made sure that the AttributePrincipal object arriving from the CAS client is stored in the HTTP session.
Then, I replaced the LifeRay auto-login class, and used the LifeRay API to create a user if someone logged in who did not exist in the internal LifeRay user database.

July-17, 2013 – Since I got many comments on this topic, I decided to open source the code mentioned here. Please see https://github.com/liranzel/liferay-cas-no-ldap/ for details.

The How

Here’s what I did:

  1. Configure LifeRay for CAS (see my previous post – http://tonaconsulting.com/configuring-liferay-and-cas-to-work-with-ldap/), but DON’T configure LifeRay for LDAP
  2. Create a new Java project.
  3. As I use Maven, I used the following pom.xml file:
    <?xml version="1.0"?>
    <project
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
        xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.tona.liferay</groupId>
        <artifactId>Authenticator</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>jar</packaging>
        <name>Authenticator</name>
        <dependencies>
     
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>1.6.6</version>
            </dependency>
     
            <dependency>
                <groupId>javax.portlet</groupId>
                <artifactId>portlet-api</artifactId>
                <version>2.0</version>
            </dependency>
     
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>3.8.1</version>
                <scope>test</scope>
            </dependency>
     
            <dependency>
                <groupId>org.jasig.cas.client</groupId>
                <artifactId>cas-client-core</artifactId>
                <version>3.2.1</version>
            </dependency>
     
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <version>1.2.14</version>
            </dependency>
     
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-client</artifactId>
                <version>6.0.4</version>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-impl</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>portal-service</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>util-java</artifactId>
                <version>6.0.4</version>
            </dependency>
     
            <dependency>
                <groupId>com.liferay.portal</groupId>
                <artifactId>util-bridges</artifactId>
                <version>6.0.4</version>
                <scope>provided</scope>
            </dependency>
     
        </dependencies>
    </project>
    
  4. I created a new class, called TonaCasFilter, that derives from CASFilter. Note that I had to copy some code from the parent class, as it was not easily extensible 😦
    public class TonaCasFilter extends CASFilter {
    
    	public static String LOGIN = CASFilter.class.getName() + "LOGIN";
    
    	public static void reload(long companyId) {
    		_ticketValidators.remove(companyId);
    	}
    
    	protected Log getLog() {
    		return _log;
    	}
    
    	protected TicketValidator getTicketValidator(long companyId)
    		throws Exception {
    
    		TicketValidator ticketValidator = _ticketValidators.get(companyId);
    
    		if (ticketValidator != null) {
    			return ticketValidator;
    		}
    
    		String serverName = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_SERVER_NAME, PropsValues.CAS_SERVER_NAME);
    		String serverUrl = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_SERVER_URL, PropsValues.CAS_SERVER_URL);
    		String loginUrl = PrefsPropsUtil.getString(
    			companyId, PropsKeys.CAS_LOGIN_URL, PropsValues.CAS_LOGIN_URL);
    
    		Saml11TicketValidator saml11TicketValidator = new Saml11TicketValidator(serverUrl);

    		Map<String, String> parameters = new HashMap<String, String>();

    		parameters.put("serverName", serverName);
    		parameters.put("casServerUrlPrefix", serverUrl);
    		parameters.put("casServerLoginUrl", loginUrl);
    		parameters.put("redirectAfterValidation", "false");

    		saml11TicketValidator.setCustomParameters(parameters);

    		_ticketValidators.put(companyId, saml11TicketValidator);

    		return saml11TicketValidator;
    	}
    
    	protected void processFilter(
    			HttpServletRequest request, HttpServletResponse response,
    			FilterChain filterChain)
    		throws Exception {
    
    		long companyId = PortalUtil.getCompanyId(request);
    
    		if (PrefsPropsUtil.getBoolean(
    				companyId, PropsKeys.CAS_AUTH_ENABLED,
    				PropsValues.CAS_AUTH_ENABLED)) {
    
    			HttpSession session = request.getSession();
    
    			String pathInfo = request.getPathInfo();
    
    			if (pathInfo.indexOf("/portal/logout") != -1) {
    				session.invalidate();
    
    				String logoutUrl = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_LOGOUT_URL,
    					PropsValues.CAS_LOGOUT_URL);
    
    				response.sendRedirect(logoutUrl);
    
    				return;
    			}
    			else {
    				String login = (String)session.getAttribute(LOGIN);
    
    				String serverName = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_SERVER_NAME,
    					PropsValues.CAS_SERVER_NAME);
    
    				String serviceUrl = PrefsPropsUtil.getString(
    					companyId, PropsKeys.CAS_SERVICE_URL,
    					PropsValues.CAS_SERVICE_URL);
    
    				if (Validator.isNull(serviceUrl)) {
    					serviceUrl = CommonUtils.constructServiceUrl(
    						request, response, serviceUrl, serverName, "ticket",
    						false);
    				}
    
    				String ticket = ParamUtil.getString(request, "ticket");
    
    				if (Validator.isNull(ticket)) {
    					if (Validator.isNotNull(login)) {
    						processFilter(
    								TonaCasFilter.class, request, response, filterChain);
    					}
    					else {
    						String loginUrl = PrefsPropsUtil.getString(
    							companyId, PropsKeys.CAS_LOGIN_URL,
    							PropsValues.CAS_LOGIN_URL);
    
    						loginUrl = HttpUtil.addParameter(
    							loginUrl, "service", serviceUrl);
    
    						response.sendRedirect(loginUrl);
    					}
    
    					return;
    				}
    
    				TicketValidator ticketValidator = getTicketValidator(
    					companyId);
    
    				Assertion assertion = ticketValidator.validate(
    					ticket, serviceUrl);
    
    				if (assertion != null) {
    					AttributePrincipal attributePrincipal =
    						assertion.getPrincipal();
    
    					login = attributePrincipal.getName();
    
    					session.setAttribute(LOGIN, login);
    					session.setAttribute("principal", attributePrincipal);
    				}
    			}
    		}
    
    		processFilter(TonaCasFilter.class, request, response, filterChain);
    	}
    
    	private static Log _log = LogFactoryUtil.getLog(TonaCasFilter.class);
    
	private static Map<Long, TicketValidator> _ticketValidators =
		new ConcurrentHashMap<Long, TicketValidator>();
    
    }
    
  5. I then created the new auto-login class. Again, since the parent class was not very extensible, I had to copy-paste a lot of code from it…
    public class TonaCASAutoLogin extends CASAutoLogin {
    	private Logger logger = LoggerFactory.getLogger(TonaCASAutoLogin.class.getName());
    
    	@Override
    	public String[] login(HttpServletRequest request, HttpServletResponse response) {
    		String[] credentials = null;
    
    		try {
    			long companyId = PortalUtil.getCompanyId(request);
    
    			if (!PrefsPropsUtil.getBoolean(companyId, PropsKeys.CAS_AUTH_ENABLED, PropsValues.CAS_AUTH_ENABLED)) {
    
    				return credentials;
    			}
    
    			HttpSession session = request.getSession();
    
    			String login = (String) session.getAttribute(CASFilter.LOGIN);
    
    			if (Validator.isNull(login)) {
    				return credentials;
    			}
    
    			AttributePrincipal principal = (AttributePrincipal) session.getAttribute("principal");
    			if (principal != null) {
    
    				Map attrs = principal.getAttributes();
    
    				Configuration.getInstance().load();
    				
    				Object groupMembership = attrs.get(Configuration.getInstance().getMemberOfProperty());
    
    				if (groupMembership != null) {
    					com.liferay.portal.service.ServiceContext context = new com.liferay.portal.service.ServiceContext();
    
    					User user = null;
    					
    					String email = attrs.get("email").toString();
    					String lastName = attrs.get("lastName").toString();
    					String firstName = attrs.get("firstName").toString();
    
    					try {
    						user = UserLocalServiceUtil.getUserByScreenName(companyId, login);
    					} catch (NoSuchUserException nsue) {
    						// User not found.
    					}
    
    					// The groups the user needs to belong to
    					long[] mapToGroupsArray = getUserGroups(companyId, groupMembership.toString());
    					
    					// The community we want to map the user to
    					long groupId = 10131;
    
    
    					// User not found - create it.
    					if (user == null) {
    						try {
    							UserLocalServiceUtil.addUser(0, companyId, false, "not-used", "not-used", false,
    									fixScreenName(login), email, 0, "", Locale.getDefault(), firstName, "", lastName,
    									0, 0, true, 1, 1, 1970, null, new long[] {groupId}, null, null, mapToGroupsArray, false, context);
    
    						} catch (Exception e) {
    							logger.error("Can't add user", e);
    						}
    					} else {
    						// User exists - remap groups
    						UserGroupLocalServiceUtil.setUserUserGroups(user.getUserId(), mapToGroupsArray);
    						
    						// Ensure user has the right community
    						
    						UserLocalServiceUtil.addGroupUsers(groupId, new long[] { user.getUserId()});
    					}
    				} 
    			} 
    
    			return super.login(request, response);
    
    		} catch (Throwable e) {
    			logger.error("Can't auto-login, reverting to default behavior", e);
    		}
    
    		return super.login(request, response);
    	}
    
    	private String fixScreenName(String loginName) {
    		
    		String name = loginName;
    		
    		if (name.contains("@")) {
    			name = name.substring(0,name.indexOf("@"));
    		}
    
    		return name;
    	}
    
	private long[] getUserGroups(long companyId, String groupMembership) throws Exception {
		String[] groups = groupMembership.split(";");

		List<Long> mapToGroups = new ArrayList<Long>();

		for (String group : groups) {
			if (group.contains("[")) {
				group = group.replace('[', ' ');
				group = group.replace(']', ' ');
				group = group.trim();
			}

			String groupName = group;

			if (Validator.isNotNull(groupName)) {
				try {
					UserGroup liferayGroup = UserGroupLocalServiceUtil.getUserGroup(companyId, groupName);

					logger.debug("Found user group " + liferayGroup.getUserGroupId());
					mapToGroups.add(liferayGroup.getUserGroupId());
				} catch (NoSuchUserGroupException nsuge) {
					// getUserGroup throws rather than returning null when the group is missing
					logger.debug("Liferay group " + groupName + " not found");
				}
			}
		}
    
    		long[] mapToGroupsArray = new long[mapToGroups.size()];
    		int i = 0;
    		for (long l : mapToGroups) {
    			mapToGroupsArray[i] = l;
    			++i;
    		}
    		
    		return mapToGroupsArray;
    	}
    }
    Note that you must make sure CAS sends all the relevant properties in the returned SAML response, and that the groups sent exist in LifeRay.
  6. Now, create a JAR file (mvn clean install), and copy the JAR file to TOMCAT_HOME/webapps/ROOT/WEB-INF/lib
  7. Edit the LifeRay web.xml file. It can be found in TOMCAT_HOME/webapps/ROOT/WEB-INF. Replace the line

     <filter-class>com.liferay.portal.servlet.filters.sso.cas.CASFilter</filter-class>

     with the following line:

     <filter-class>com.tona.security.TonaCasFilter</filter-class>

  8. Edit the LifeRay portal-ext.properties file. It can be found in TOMCAT_HOME/webapps/ROOT/WEB-INF/classes. Add the following line:

     auto.login.hooks=com.tona.security.TonaCASAutoLogin

  9. Restart LifeRay. All should work...
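The memberOf parsing buried in getUserGroups above can be exercised in isolation, without a running LifeRay. The class below is a hypothetical, self-contained sketch of just that loop (GroupMembershipParser and its parse method are my names, not part of the original code): it turns the semicolon-separated, bracket-wrapped attribute value CAS sends into clean group names.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the group-name parsing done in getUserGroups()
// above, minus the Liferay UserGroup lookups.
public class GroupMembershipParser {

    // Splits a CAS memberOf value like "[admins;developers]" into clean names.
    public static List<String> parse(String groupMembership) {
        List<String> names = new ArrayList<>();
        for (String group : groupMembership.split(";")) {
            // CAS wraps multi-valued attributes in brackets: strip them, then trim
            group = group.replace('[', ' ').replace(']', ' ').trim();
            if (!group.isEmpty()) {
                names.add(group);
            }
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println(parse("[admins;developers]"));
        // → [admins, developers]
    }
}
```

Each resulting name is then looked up with UserGroupLocalServiceUtil.getUserGroup, as in the auto-login class above.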

Configuring LifeRay and CAS to work with LDAP

I saw many tutorials on CAS, Liferay and LDAP, but unfortunately none of them worked for me. So I decided to document what does work (at least for me).
Note that my environment is based on LifeRay 6.0.5 and CAS 3.5.1.

  1. Configure Tomcat for SSL. I have used port 443. You can read all about it here
    1. After creating the certificates, I just ended up adding the following tag in TOMCAT_HOME/conf/server.xml:

       <Connector
                  port="443" maxThreads="200"
                  scheme="https" secure="true" SSLEnabled="true"
                  keystoreFile="/root/.keystore" keystorePass="password"
                  clientAuth="false" sslProtocol="TLS"/>

    2. IMPORTANT: I did not manage to make CAS work with a self-signed certificate, so I've used a temporary free one.
  2. Configure LifeRay for LDAP
    1. Login to LifeRay
    2. Go to the Control Panel–>Settings–>Authentication–>LDAP
    3. Ensure the “Enabled” check box is selected
    4. I strongly suggest enabling the “Import” checkbox and ensure Import is enabled for server startup.
    5. Add a server
    6. Fill in the LDAP server details (it’s easy to check them with an LDAP browser like jxplorer)
    7. Save your configuration
    8. I usually restart Tomcat after that change, and view the log to see all users were successfully imported
  3. Build CAS
    1. Download CAS (I downloaded it from here)
    2. Unzip the file
    3. Edit the CAS_HOME/cas-server-webapp/pom.xml file and add the following dependency:

       <dependency>
           <groupId>org.jasig.cas</groupId>
           <artifactId>cas-server-support-ldap</artifactId>
           <version>3.5.1</version>
       </dependency>

    4. Build CAS using Maven. The command to run is mvn clean install
  4. Deploy CAS
    1. Copy the newly created WAR file from CAS_HOME/cas-server-webapp/target/cas.war to TOMCAT_HOME/webapps
  5. Configure CAS for LDAP
    1. Edit the TOMCAT_HOME/webapps/cas/WEB-INF/deployerConfigContext.xml file
    2. Add the following at the end of the file (just before the closing /beans tag):

       <bean id="contextSource" class="org.springframework.ldap.core.support.LdapContextSource">
         <!-- DO NOT enable JNDI pooling for context sources that perform LDAP bind operations. -->
         <property name="pooled" value="false"/>

         <!--
           Although multiple URLs may be defined, it's strongly recommended to avoid this configuration
           since the implementation attempts hosts in sequence and requires a connection timeout
           prior to attempting the next host, which incurs unacceptable latency on node failure.
           A proper HA setup for LDAP directories should use a single virtual host that maps to multiple
           real hosts using a hardware load balancer.
         -->
         <property name="url" value="ldap://LDAP_SERVER:389" />

         <!--
           Manager credentials are only required if your directory does not support anonymous searches.
           Never provide these credentials for FastBindLdapAuthenticationHandler since the user's
           credentials are used for the bind operation.
         -->
         <property name="userDn" value="cn=Manager"/>
         <property name="password" value="test"/>

         <!-- Place JNDI environment properties here. -->
         <property name="baseEnvironmentProperties">
           <map>
             <!-- Three seconds is an eternity to users. -->
             <entry key="com.sun.jndi.ldap.connect.timeout" value="3000" />
             <entry key="com.sun.jndi.ldap.read.timeout" value="3000" />

             <!-- Explained at http://download.oracle.com/javase/1.3/docs/api/javax/naming/Context.html#SECURITY_AUTHENTICATION -->
             <entry key="java.naming.security.authentication" value="simple" />
           </map>
         </property>
       </bean>

    3. Add the following inside the list tag of the authenticationHandlers property (the trailing tags are the existing closers of that list, shown for placement):

       <bean class="org.jasig.cas.adaptors.ldap.BindLdapAuthenticationHandler"
             p:filter="mail=%u"
             p:searchBase="ou=people,dc=test,dc=com"
             p:contextSource-ref="contextSource" />
           </list>
         </property>
       </bean>
  6. Configure LifeRay for CAS
    1. Login to LifeRay
    2. Go to the Control Panel–>Settings–>Authentication–>CAS
    3. Ensure the “Enabled” check box is selected
    4. Ensure the “LDAP Import” check box is selected
    5. Enter the URLs of the CAS server
    6. Save
    7. Add the following line to TOMCAT_HOME/webapps/ROOT/WEB-INF/classes/system-ext.properties:

       com.liferay.filters.sso.cas.CASFilter=true

    8. Add the following line to TOMCAT_HOME/webapps/ROOT/WEB-INF/classes/portal-ext.properties:

       auto.login.hooks=com.liferay.portal.security.auth.CASAutoLogin

    9. Restart Tomcat

You can now access your LifeRay instance, and get the CAS login instead…
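A closing note on the p:filter="mail=%u" setting above: the bind handler builds its LDAP search filter by substituting the entered username for the %u placeholder, then searches under the configured searchBase and binds as the matching entry. The helper below is a hypothetical illustration of that substitution, not CAS's actual implementation; the escaping of filter-special characters follows RFC 4515.

```java
// Hypothetical sketch of how a filter template like "mail=%u" becomes a
// concrete LDAP search filter. Not CAS's real code.
public class LdapFilterTemplate {

    // Replaces every %u in the template with the RFC 4515-escaped username.
    public static String expand(String template, String username) {
        // Escape the characters that have special meaning in LDAP filters
        String escaped = username
            .replace("\\", "\\5c")
            .replace("*", "\\2a")
            .replace("(", "\\28")
            .replace(")", "\\29");
        return template.replace("%u", escaped);
    }

    public static void main(String[] args) {
        System.out.println(expand("mail=%u", "john@test.com"));
        // → mail=john@test.com
        // CAS then searches ou=people,dc=test,dc=com (per the config above)
        // with this filter and binds as the entry it finds.
    }
}
```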