RuntimeException in Action for tag [rollingPolicy] java.lang.IndexOutOfBoundsException: No group 1

A customer of mine sometimes got the following exception in his catalina.out log.

ERROR in ch.qos.logback.core.joran.spi.Interpreter@23:25 - RuntimeException in Action for tag [rollingPolicy] java.lang.IndexOutOfBoundsException: No group 1
java.lang.IndexOutOfBoundsException: No group 1
    at java.util.regex.Matcher.group(Matcher.java:470)
    at ch.qos.logback.core.rolling.helper.FileFilterUtil.extractCounter(FileFilterUtil.java:109)
    at ch.qos.logback.core.rolling.helper.FileFilterUtil.findHighestCounter(FileFilterUtil.java:93)
    at ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP.computeCurrentPeriodsHighestCounterValue(SizeAndTimeBasedFNATP.java:65)
    at ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP.start(SizeAndTimeBasedFNATP.java:49)
    at ch.qos.logback.core.rolling.TimeBasedRollingPolicy.start(TimeBasedRollingPolicy.java:87)
    at ch.qos.logback.core.joran.action.NestedComplexPropertyIA.end(NestedComplexPropertyIA.java:167)
    at ch.qos.logback.core.joran.spi.Interpreter.callEndAction(Interpreter.java:318)
    at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:197)
    at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:183)
    at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:62)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:147)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:133)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:96)
    at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:55)
    at ch.qos.logback.classic.util.ContextInitializer.configureByResource(ContextInitializer.java:75)
    at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:148)
    at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:84)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:54)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:121)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:111)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:268)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:241)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:156)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:645)
    at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:184)
    at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:3972)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4467)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
    at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1041)
    at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:964)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:519)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:581)

Turns out the problem was with his logback.xml configuration file –

<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${env.PANORAMA_HOME}/var/log/panorama-sl.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <!-- daily rollover -->
        <fileNamePattern>${env.PANORAMA_HOME}/var/log/panorama-sl.%d{yyyy-MM-dd}.log.gz</fileNamePattern>
        <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
            <!-- or whenever the file size reaches 100MB -->
            <maxFileSize>100MB</maxFileSize>
        </timeBasedFileNamingAndTriggeringPolicy>
        <!-- keep 10 files worth of history -->
        <maxHistory>10</maxHistory>
    </rollingPolicy>
    <Append>true</Append>
    <encoder>
        <!-- default ISO date format enables lexical sorting of dates -->
        <pattern>%-30.-30(%date %level) [%-50.-50thread] %logger{25} %msg%n</pattern>
    </encoder>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>ALL</level>
    </filter>
</appender>

Since the log file sometimes grew beyond 100MB, logback needed to roll the log file – but couldn't, since the fileNamePattern didn't include the %i counter that SizeAndTimeBasedFNATP requires. The correct configuration is:

<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${env.PANORAMA_HOME}/var/log/panorama-sl.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <!-- daily rollover -->
        <fileNamePattern>${env.PANORAMA_HOME}/var/log/panorama-sl.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
        <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
            <!-- or whenever the file size reaches 100MB -->
            <maxFileSize>100MB</maxFileSize>
        </timeBasedFileNamingAndTriggeringPolicy>
        <!-- keep 10 files worth of history -->
        <maxHistory>10</maxHistory>
    </rollingPolicy>
    <Append>true</Append>
    <encoder>
        <!-- default ISO date format enables lexical sorting of dates -->
        <pattern>%-30.-30(%date %level) [%-50.-50thread] %logger{25} %msg%n</pattern>
    </encoder>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>ALL</level>
    </filter>
</appender>
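The exception itself is easy to reproduce in isolation. Logback derives a regex from the fileNamePattern and calls Matcher.group(1) to extract the %i counter from existing archive names; with no %i there is no capturing group to extract. A minimal sketch (the file name and pattern are illustrative, not logback's exact internals):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NoGroupDemo {
    public static void main(String[] args) {
        // A pattern without %i has no capturing group for the counter.
        Matcher m = Pattern
                .compile("panorama-sl\\.\\d{4}-\\d{2}-\\d{2}\\.log\\.gz")
                .matcher("panorama-sl.2012-01-01.log.gz");
        if (m.matches()) {
            try {
                m.group(1); // there is no group 1 in this pattern
            } catch (IndexOutOfBoundsException e) {
                System.out.println(e.getMessage()); // No group 1
            }
        }
    }
}
```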

Using Liferay SSO with Oracle Access Manager 11g

Well, I was given a challenging task – using Oracle Access Manager 11g as an SSO provider for Liferay 6.0 (the Community Edition…).
As you might know, there is no built-in OAM support in Liferay, so I was stuck with configuring it myself. Since I didn't even have OAM installed, I'll detail all the steps I took. To simplify matters, I installed OAM on Microsoft Windows Server, but the same should hold for Linux.

All Oracle components were downloaded from edelivery.oracle.com. The version is 11.1.1.6.

OAM Installation

  1. Install the Oracle database. I didn't install Oracle XE, but rather the Enterprise Edition.
  2. Alter the Oracle database.
    1. Open sqlplus as sys and run the following commands:
    2. alter system set open_cursors=1000 scope=both;
      alter system set processes=1000 scope=SPFILE;
      
  3. Restart the Oracle DB.
  4. Run RCU (V33643-01), and check the Identity Management checkbox. Proceed with the installation.
  5. Install WebLogic Server (wls1036_generic)
  6. Install SOA Suite (ofm_soa_generic_11.1.1.6.0_disk1_1of2 and ofm_soa_generic_11.1.1.6.0_disk1_2of2).
  7. Install IdM (V33644-01_1of2 and V33644-01_2of2)
  8. From your ORACLE_HOME/IDM_HOME/common/bin run the config.cmd file.
  9. Install all the required components (especially all the Oracle Access Manager relevant components).
  10. DO NOT START THE ADMIN SERVER.
  11. Run the following WLST scripts (thank you, Warren):
  12. $MW_HOME/oracle_common/common/bin/wlst.cmd $ORACLE_HOME/common/tools/configureSecurityStore.py -d $IAM_DOMAIN_LOCATION -m create -c IAM -p $ORA_PASS
    $MW_HOME/oracle_common/common/bin/wlst.cmd $ORACLE_HOME/common/tools/configureSecurityStore.py -d $IAM_DOMAIN_LOCATION -m validate
    
  13. Where:
    1. $MW_HOME is where you put the Middleware home (e.g. ~/Oracle/Middleware)
    2. $ORACLE_HOME is the Oracle IAM home (e.g. ~/Oracle/Middleware/Oracle_IAM1)
    3. $IAM_DOMAIN_LOCATION is the domain home (e.g. ~/Oracle/Middleware/user_projects/domains/OAMDomain)
    4. $ORA_PASS is the password needed to talk to the database
  14. Now you can safely run the admin server. Connect to it using IP_ADDR:7001/em, and start the OAM managed server too.

Apache installation/configuration

On a separate machine (I used RedHat Linux 5.5):

  1. Install Apache 2.2
  2. Install WebGate (I used the ZIP file oam_int_linux_v10_cd1.zip)
  3. Configure Apache to act as a proxy for your Liferay server by using ProxyPass and ProxyPassReverse. For instance:
  4. ProxyRequests Off
    ProxyPass /web http://LIFERAY_SERVER:8080/web
    ProxyPassReverse /web http://LIFERAY_SERVER:8080/web
    
  5. Configure WebGate in Apache. On my machine the configuration looked like this (the Location container directives follow the standard webgate.conf template):
  6. LoadModule obWebgateModule "/usr/local/webgate/product/access/oblix/apps/webgate/bin/webgate.so"
    
    LoadFile "/usr/local/webgate/libgcc_s.so.1"
    LoadFile "/usr/local/webgate/libstdc++.so.6"
    
    WebGateInstalldir "/usr/local/webgate/product/access"
    WebGateMode PEER
    #webgateload obWebgateModule "/usr/local/webgate/product/access/oblix/apps/webgate/bin/webgate.so"
    
    <LocationMatch "/oberr.cgi">
        SetHandler obwebgateerr
    </LocationMatch>
    
    <Location "/access/oblix/apps/webgate/bin/webgate.cgi">
        SetHandler obwebgateerr
    </Location>
    
    <Location />
        AuthType Oblix
        require valid-user
    </Location>
    

OAM Configuration

  1. Open the Access Manager console, and click on the “New OAM10g WebGate”
  2. Fill in the details, exactly as you did during the WebGate installation.
  3. Go to “Application Domains”, and select the newly created Application Policy
  4. Change any required value, and select “Authorization Policies”
  5. Select the “Protected Resource Policy”
  6. Select “Responses”
  7. Add a new response: an HTTP Header with the name LIFERAY_SCREEN_NAME and the value uid

Liferay Configuration

  1. Edit the portal-ext.properties file and add the following line:
  2. auto.login.hooks=com.liferay.portal.security.auth.RequestHeaderAutoLogin
    

Restart Apache and browse to it. You should get the OAM login page, and after logging in you should see that you have been automatically logged in to Liferay…
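The whole chain relies on Liferay trusting the header that the Apache/WebGate tier injects. Conceptually, a request-header auto-login hook boils down to something like this hypothetical sketch (not Liferay's actual implementation; only the header name matches the OAM response configured above):

```java
import java.util.HashMap;
import java.util.Map;

public class HeaderLoginDemo {
    // Resolve the Liferay screen name from proxy-injected headers; returns
    // null when the trusted header is absent (i.e. OAM did not authenticate).
    static String resolveScreenName(Map<String, String> headers) {
        return headers.get("LIFERAY_SCREEN_NAME");
    }

    public static void main(String[] args) {
        Map<String, String> headers = new HashMap<String, String>();
        headers.put("LIFERAY_SCREEN_NAME", "jdoe");
        System.out.println(resolveScreenName(headers)); // jdoe
    }
}
```

This is also why the Apache tier must be the only path to Liferay: anyone who can reach Tomcat directly could forge the header.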

Monitoring seda queues with JBoss JMX console

In my current project we're using Camel, and depend heavily on its SEDA component.
Since we didn't monitor our queues at first, we constantly encountered OutOfMemory errors (usually after ~48 hours of heavy use).
We overcame this by limiting the size of the SEDA queue (using the size attribute – see here for more info).
But now we face queue-full exceptions, and need to constantly monitor the queues' sizes. Since our application runs on top of JBoss, we can use its JMX API for that. And since I'm a bit lazy, I've decided to access it through the HTTP jmx-console.

package com.tona.monitor;

import java.io.ByteArrayOutputStream;
import java.io.FileWriter;
import java.io.InputStream;
import java.net.URL;

public class Main {
	public static void main(String[] args) throws Exception {
		String[] urls = new String[] {
				"http://192.168.155.101:8080/jmx-console/HtmlAdaptor?action=invokeOpByName&name=org.apache.camel%3Acontext%3DacsAdapterCamelContext%2Ctype%3Dendpoints%2Cname%3D%22seda%3A%2F%2FworkflowTriggerManager%5C%3Fsize%3D50000%22&methodName=queueSize",
				"http://192.168.155.101:8080/jmx-console/HtmlAdaptor?action=invokeOpByName&name=org.apache.camel%3Acontext%3DacsAdapterCamelContext%2Ctype%3Dendpoints%2Cname%3D%22seda%3A%2F%2FsyslogAppender%5C%3FconcurrentConsumers%3D4%26timeout%3D5000%22&methodName=queueSize",
		};
		FileWriter fos = new FileWriter("/tmp/queue_log" + System.currentTimeMillis() + ".csv");

		for (String url : urls) {
			System.out.print(getQueueName(url) + ",");
			fos.write(getQueueName(url) + ",");
		}
		System.out.println();
		fos.write("\n");
		
		boolean flag = true;
		
		while (flag) {
			for (String url : urls) {
				URL u = new URL(url);
				InputStream is = u.openStream();
				int i = 0;
				ByteArrayOutputStream baos = new ByteArrayOutputStream();
				while ((i = is.read()) != -1) {
					baos.write(i);
				}
				is.close();

				// System.out.println(baos.toString());
				String body = baos.toString();
				int start = body.indexOf("<pre>");
				int end = body.indexOf("</pre>");
				String numOfMessages = body.substring(start + 5, end).trim();
				System.out.print(numOfMessages + ",");
				fos.write(numOfMessages + ",");

			}
			System.out.println();
			fos.write("\n");
			fos.flush();
			Thread.sleep(1000);			
		}
		
		fos.close();

	}
	
	private static String getQueueName(String url) {
		String queueNameStart = "seda%3A%2F%2F";
		String queueNameEnd = "%5C%3";
		
		int queueNameStartPos = url.indexOf(queueNameStart) + queueNameStart.length();
		int queueNameEndPos = url.indexOf(queueNameEnd);
		
		if (queueNameEndPos == -1)
			queueNameEndPos = url.length();
		
		return url.substring(queueNameStartPos,queueNameEndPos);
	}

}
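The getQueueName method above works directly on the percent-encoded URL. An alternative sketch of the same extraction (my own variant, not the code we ran) decodes the URL first, which makes the markers human-readable – the queue name sits between "seda://" and the "\?" that starts the endpoint options:

```java
import java.net.URLDecoder;

public class QueueNameDemo {
    public static String queueName(String url) throws Exception {
        String decoded = URLDecoder.decode(url, "UTF-8");
        int start = decoded.indexOf("seda://") + "seda://".length();
        int end = decoded.indexOf("\\?", start); // endpoint options start at \?
        if (end == -1) {
            end = decoded.length();
        }
        return decoded.substring(start, end);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(queueName(
            "http://192.168.155.101:8080/jmx-console/HtmlAdaptor?action=invokeOpByName"
            + "&name=org.apache.camel%3Acontext%3DacsAdapterCamelContext%2Ctype%3Dendpoints"
            + "%2Cname%3D%22seda%3A%2F%2FworkflowTriggerManager%5C%3Fsize%3D50000%22"
            + "&methodName=queueSize")); // workflowTriggerManager
    }
}
```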

Import users into Liferay

If you're using Liferay without LDAP, you will probably face the same problem I did – how to import a large number of users without manually adding them to the system.

So, loving automation, I’ve decided to create a simple Portlet that does just that.

  1. Create a new Dynamic Web App in Eclipse.
  2. Configure all necessary deployment files (liferay-portlet.xml, portlet.xml, web.xml etc)
  3. Create a new User class:
    package com.tona.liferay.web;

    public class User {
        private String firstName;
        private String lastName;
        private String email;
        private String phoneNo;
        private String screenName;
        private String password;

        public String getEmail() {
            return email;
        }

        public void setEmail(String email) {
            this.email = email;
        }

        public String getPhoneNo() {
            return phoneNo;
        }

        public void setPhoneNo(String phoneNo) {
            this.phoneNo = phoneNo;
        }

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }

        public String getLastName() {
            return lastName;
        }

        public void setLastName(String lastName) {
            this.lastName = lastName;
        }

        public String getScreenName() {
            return screenName;
        }

        public void setScreenName(String screenName) {
            this.screenName = screenName;
        }

        public String getPassword() {
            return password;
        }

        public void setPassword(String password) {
            this.password = password;
        }

        // Parses a CSV line of the form: id,firstName,lastName,email,phoneNo
        // (tokens[0] is skipped)
        public User(String line) {
            String[] tokens = line.split(",");
            setFirstName(tokens[1]);
            setLastName(tokens[2]);
            setEmail(tokens[3]);
            setPhoneNo(tokens[4]);
            String screenName = getFirstName() + getLastName().substring(0, 3);
            setScreenName(screenName.toLowerCase());
            setPassword(getScreenName() + "123");
        }

        public User() {
        }
    }
    
    
  4. Create the portlet itself:
    package com.tona.liferay.web;

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;

    import javax.portlet.ActionRequest;
    import javax.portlet.ActionResponse;
    import javax.portlet.PortletException;

    import com.liferay.portal.model.Company;
    import com.liferay.portal.service.CompanyLocalServiceUtil;
    import com.liferay.portal.service.UserLocalServiceUtil;
    import com.liferay.util.bridges.mvc.MVCPortlet;

    public class ImportUsersPortlet extends MVCPortlet {
        public void importUsers(ActionRequest actionRequest,
                ActionResponse actionResponse) throws IOException, PortletException {

            String fileName = actionRequest.getParameter("fileName");

            BufferedReader fr = new BufferedReader(new FileReader(fileName));

            List<User> users = new ArrayList<User>();

            String line;

            while ((line = fr.readLine()) != null) {
                users.add(new User(line));
            }
            fr.close();

            // We now have the user list
            com.liferay.portal.service.ServiceContext context = new com.liferay.portal.service.ServiceContext();
            long companyId = 0;

            try {
                Company company = CompanyLocalServiceUtil.getCompanies().get(0);
                companyId = company.getCompanyId();
                for (User user : users) {
                    try {
                        UserLocalServiceUtil.addUser(0, companyId, false,
                                user.getPassword(), user.getPassword(), false, user.getScreenName(),
                                user.getEmail(), 0, "", Locale.getDefault(),
                                user.getFirstName(), "", user.getLastName(), 0, 0,
                                true, 1, 1, 1970, null, null, null, null, null, false,
                                context);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    
    
  5. Create a WAR file and deploy it in Liferay
  6. Note that the portlet does not upload the CSV file – and expects it to exist on the Liferay server itself
  7. You can of course change the algorithm of the screen-name and password creation, by changing the User constructor method.
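For reference, the screen-name and password rule in the User constructor boils down to the following standalone sketch of the same logic:

```java
public class ScreenNameDemo {
    static String screenName(String firstName, String lastName) {
        // Note: substring(0, 3) will throw for last names shorter than
        // three characters, which is worth guarding against in real data.
        return (firstName + lastName.substring(0, 3)).toLowerCase();
    }

    public static void main(String[] args) {
        System.out.println(screenName("John", "Smith"));         // johnsmi
        System.out.println(screenName("John", "Smith") + "123"); // the generated password
    }
}
```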

Tracking Locks in Oracle Database

A web application I was tracing had serious DB performance issues. I reviewed the AWR report and found that the IO was incredibly slow, but a weird behavior made me suspect that locks were also a reason for the poor performance.
While running JMeter, I saw a situation where no responses were received from the server until, suddenly, everything started running – only to get stuck again.
So, how did I track locks in the database? I used the following queries (all require SYSDBA to run):

SELECT l.session_id||','||v.serial# sid_serial,
       l.ORACLE_USERNAME ora_user,
       o.object_name, 
       o.object_type, 
       DECODE(l.locked_mode,
          0, 'None',
          1, 'Null',
          2, 'Row-S (SS)',
          3, 'Row-X (SX)',
          4, 'Share',
          5, 'S/Row-X (SSX)',
          6, 'Exclusive', 
          TO_CHAR(l.locked_mode)
       ) lock_mode,
       o.status,
       to_char(o.last_ddl_time,'dd.mm.yy') last_ddl
FROM dba_objects o, gv$locked_object l, v$session v
WHERE o.object_id = l.object_id
      and l.SESSION_ID=v.sid
order by 2,3;

I use the above query to see if actual locks exist in the DB (it runs pretty fast, so I use it often).

select l1.sid, ' IS BLOCKING ', 
       l2.sid 
from v$lock l1, 
     v$lock l2 
where l1.block =1 and 
      l2.request > 0 and 
      l1.id1=l2.id1  and 
      l1.id2=l2.id2;

I use the above query to see which session is blocking which (very useful for detecting deadlocks).

select s1.username || '@' || s1.machine 
  || ' ( SID=' || s1.sid || ' )  is blocking '
  || s2.username || '@' || s2.machine || ' ( SID=' 
  || s2.sid || ' ) ' AS blocking_status,
  a1.sql_text,
  a2.sql_text
from v$lock l1, 
     v$session s1, 
     v$lock l2, 
     v$session s2,
     v$sqlarea a1, 
     v$sqlarea a2
where s1.sid=l1.sid and 
      l1.BLOCK=1 and 
      s2.sid=l2.sid and 
      l2.request > 0 and 
      l1.id1 = l2.id1 and 
      l1.id2 = l2.id2 and 
      s1.sql_hash_value = a1.hash_value and 
      s1.sql_address = a1.address and 
      s2.sql_hash_value = a2.hash_value and 
      s2.sql_address = a2.address;

The above query is my favorite (alas, it is very slow…), as it shows both the locking SQL and the locked SQL. If you know your code, you can easily identify which query causes the locking problem.

Updating Pentaho PRPT files to add a PreProcessor

In my previous post (see here) I mentioned that I couldn’t add a pre-processor to a Pentaho report using the report designer. So, I’ve written a short Java program that does just that.
Note that I use a neat open source library called Zip4J (you can get it here).

package com.tona.rprt;

import java.io.File;
import java.io.FileWriter;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;

import net.lingala.zip4j.core.ZipFile;
import net.lingala.zip4j.model.ZipParameters;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

public class Main {
	private static final String CONFIG_FILE_NAME = "layout.xml";
	
	public static void main(String[] args) throws Exception {
		ZipFile reportFile = new ZipFile("");

		File tempDirectory = createTempDirectory();
		String path = tempDirectory.getAbsolutePath();
		reportFile.extractFile(CONFIG_FILE_NAME, path);

		System.out.println("Extracted file to " + path);
		File updatedFile = new File(path + File.separator + CONFIG_FILE_NAME);

		// Update the file
		DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
		DocumentBuilder db = dbf.newDocumentBuilder();
		Document doc = db.parse(updatedFile);
		
		System.out.println("Parsed document");
		
		Element layoutNode = doc.getDocumentElement();
		Element preProcessorElement = doc.createElement("preprocessor");
		preProcessorElement.setAttribute("class", "com.tona.report.infra.TonaWizardProcessor");
		Node firstLayoutChild = layoutNode.getFirstChild(); 
		layoutNode.insertBefore(preProcessorElement, firstLayoutChild);
		
		System.out.println("Added child");

		FileWriter output = new FileWriter(updatedFile);
		javax.xml.transform.stream.StreamResult result = new javax.xml.transform.stream.StreamResult(output);

		TransformerFactory tf = TransformerFactory.newInstance();
		Transformer t = tf.newTransformer();
		t.transform(new DOMSource(doc), result);
		
		System.out.println("Updated XML file");
		
		ZipParameters parameters = new ZipParameters();
		reportFile.removeFile(CONFIG_FILE_NAME);
		reportFile.addFile(updatedFile, parameters);
		
		System.out.println("Updated ZIP file");
		
		// Delete the extracted file first, otherwise the directory is not empty
		// and delete() silently fails
		updatedFile.delete();
		tempDirectory.delete();
		
		System.out.println("Removed temporary directory");
	}
	
	private static File createTempDirectory() throws Exception
		{
		    File temp = File.createTempFile("temp", Long.toString(System.nanoTime()));

		    if(!(temp.delete())) {
		        throw new Exception("Could not delete temp file: " + temp.getAbsolutePath());
		    }

		    if(!(temp.mkdir())) {
		        throw new Exception("Could not create temp directory: " + temp.getAbsolutePath());
		    }

		    return temp;
		}	
}
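The createTempDirectory helper above deletes a temp file and re-creates it as a directory, which leaves a small race window. On Java 7+ the standard library can do it in one call – a sketch using java.nio:

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirDemo {
    public static void main(String[] args) throws Exception {
        // Atomically creates a uniquely named directory under java.io.tmpdir
        Path temp = Files.createTempDirectory("prpt-work");
        System.out.println(Files.isDirectory(temp)); // true
        Files.delete(temp); // clean up
    }
}
```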

Templating with Pentaho BI

I'm trying to build multiple reports in Pentaho Report Designer, and couldn't find a really good way to implement templating. You see, I want the report design to change when I change my template.
So, I wrote some code. The WizardProcessor class is a Pentaho class used by report pre-processors (see here for more info on the subject), and extending it is easy.

public class TonaWizardProcessor extends WizardProcessor {

	public MasterReport performPreProcessing(MasterReport definition,
			DefaultFlowController flowController)
			throws ReportProcessingException {

		// Init header
		ReportHeader header = definition.getReportHeader();

		float pageWidth = definition.getPageDefinition().getWidth();

		// Add black background

		Element e = RectangleElementFactory.createFilledRectangle(0, 0,
				pageWidth, 100, Color.black);

		header.addElement(e);

		// Add report title

		// Add date
		Element dateElement = DateFieldElementFactory.createDateElement(
				"TopDateLabel", new Rectangle2D.Double(pageWidth - 100, 0, 100,
						100), Color.WHITE, ElementAlignment.RIGHT,
				new FontDefinition("Arial", 12), "-", "MMM dd, yyyy",
				"report.date");

		header.addElement(dateElement);

		// Init footer
		ReportFooter footer = definition.getReportFooter();

		// Add date
		Element footerDateElement = DateFieldElementFactory.createDateElement(
				"FooterDateLabel", new Rectangle2D.Double(0, 0, pageWidth / 2,
						14), Color.BLACK, ElementAlignment.RIGHT,
				new FontDefinition("Arial", 12), "-", "MMM dd, yyyy",
				"report.date");

		makeBorder(footerDateElement);

		footer.addElement(footerDateElement);

		// Add pages count

		Element footerPagesElement = TextFieldElementFactory
				.createStringElement("FooterPageLabel", new Rectangle2D.Double(
						pageWidth / 2, 0, pageWidth / 2, 14), Color.BLACK,
						ElementAlignment.RIGHT,
						new FontDefinition("Arial", 12), "-", "PageFunction0");

		makeBorder(footerPagesElement);

		footer.addElement(footerPagesElement);

		// Create orange row banding
		Element[] items = definition.getItemBand().getElementArray();

		// Setting temp names for the row banding
		for (Element item : items) {
			item.setName("TempRowBand");
			makeBorder(item);
		}

		RowBandingFunction bandingFunction = new RowBandingFunction();
		bandingFunction.setVisibleBackground(Color.ORANGE);
		bandingFunction.setElement("TempRowBand");
		definition.addExpression(bandingFunction);

		// Change all group header background to orange

		for (int i = 0; i < definition.getGroupCount(); ++i) {
			for (Element groupHeaderElement : definition.getGroup(i).getHeader().getElementArray()) {
				makeBorder(groupHeaderElement);
				groupHeaderElement.getStyle().setStyleProperty(
						ElementStyleKeys.BACKGROUND_COLOR, Color.ORANGE);
			}
		}

		return super.performPreProcessing(definition, flowController);
	}

	private void makeBorder(Element element) {

		element.getStyle().setStyleProperty(ElementStyleKeys.BOX_SIZING,
				BoxSizing.BORDER_BOX);

		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_TOP_WIDTH,
				new Float(1));
		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_LEFT_WIDTH,
				new Float(1));
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BOTTOM_WIDTH, new Float(1));
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_RIGHT_WIDTH, new Float(1));
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BREAK_WIDTH, new Float(1));

		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_TOP_COLOR,
				Color.black);
		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_LEFT_COLOR,
				Color.black);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BOTTOM_COLOR, Color.black);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_RIGHT_COLOR, Color.black);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BREAK_COLOR, Color.black);

		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_TOP_STYLE,
				BorderStyle.SOLID);
		element.getStyle().setStyleProperty(ElementStyleKeys.BORDER_LEFT_STYLE,
				BorderStyle.SOLID);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BOTTOM_STYLE, BorderStyle.SOLID);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_RIGHT_STYLE, BorderStyle.SOLID);
		element.getStyle().setStyleProperty(
				ElementStyleKeys.BORDER_BREAK_STYLE, BorderStyle.SOLID);
	}

}

After you write your class, write a simple JUnit class to test it:

public class ProcessorTest {

	@Test
	public void test() {
		ClassicEngineBoot.getInstance().start();
		ResourceManager manager = new ResourceManager();
		manager.registerDefaults();
		String reportPath = "file:/home/liran/test.prpt";

		try {
			Resource res = manager.createDirectly(new URL(reportPath),
					MasterReport.class);
			MasterReport report = (MasterReport) res.getResource();
			
			report.addPreProcessor(new TonaWizardProcessor());
			
			File file = new File("/home/liran/output.html");
			PrintStream ps = new PrintStream(file);
			HtmlReportUtil.createStreamHTML(report, ps);
		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}

Implementing it at the report level is a bit more difficult, as I couldn't do it in the PRD (Pentaho Report Designer). But changing the layout.xml file (found inside your prpt file) is easy. Just add the following line under the root layout element:

<preprocessor class="com.tona.report.infra.TonaWizardProcessor"/>

And you're done.

Note that your pre-processor class (contained in a JAR file) must be found in the REPORT_DESIGNER/lib directory – otherwise the designer won’t open your report. Same is true for the BI-SERVER (put it in biserver-ee/tomcat/lib directory).

I'm writing Java code that will allow you to automatically patch the prpt file to use the pre-processor. I'll post the code in a future post.

Using Oracle tkprof with JDBC thin client application

When attempting to profile a Hibernate-based application, I found a statement that was incredibly slow and caused the system to basically halt for a few seconds before resuming execution. I wanted to profile it at the database level, and the best tool for the job is Oracle's own tkprof.
The input for tkprof is a session trace file, and enabling one is a bit tricky. The reason: a JavaEE application with multiple threads has multiple connections, hence multiple database sessions, and SQL_TRACE is set at the session level (I didn't want to enable it for the entire database – the trace file would be totally unusable…)
So, I took the code, ran it in a standalone Java application, and enabled SQL trace. Here's how:

package com.tona.jdbc;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class TestJDBC {
	public static void main(String[] args) throws Exception {
		Connection c = DriverManager.getConnection(JDBC_CONNECTION_STRING);
		
		// Set SQL_TRACE on for this session
		Statement stmt = c.createStatement();
		stmt.executeUpdate("ALTER SESSION SET SQL_TRACE=TRUE");
		stmt.close();

		// Set the SQL trace location
		stmt = c.createStatement();
		stmt.executeUpdate("alter system set USER_DUMP_DEST='/tmp'");
		stmt.close();

		// JDBC logic comes here...

		c.close();
	}
}

Note
Changing the USER_DUMP_DEST parameter did not have any effect, and the SQL_TRACE was written to the default trace log directory on the server (in my case it was ./u01/app/oracle/diag/rdbms/SID/SID/trace)
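A related trick that makes the session's file easier to find in that crowded trace directory is tagging the trace file name with the standard TRACEFILE_IDENTIFIER session parameter (run it from the same session, before the traced statements):

ALTER SESSION SET TRACEFILE_IDENTIFIER = 'tkprof_run';

The identifier then appears in the generated *.trc file name, so a simple ls pattern finds it.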

Batch Update with JPA

A customer of mine has a highly scalable system, with high database load. To improve performance, we’ve decided to use Batch update. Alas – the application uses JPA, so how do we do it?

SessionFactoryImplementor sfi = (SessionFactoryImplementor) entityManager.unwrap(Session.class).getSessionFactory();
// Retrieve the underlying DB connection
Connection connect = sfi.getConnectionProvider().getConnection();
PreparedStatement ps = connect.prepareStatement("INSERT INTO temp_table values(?,?)");
for (Data p : insertCommands) {
	ps.setInt(1, p.getId());
	if (p.isWorking() != null) {
		ps.setInt(2, p.isWorking() ? 1 : 0);
	} else {
		ps.setNull(2, Types.INTEGER);
	}
	ps.addBatch();
}
ps.executeBatch();

Some points to remember:

  1. Do not close the connection!!! When closed, it goes back to the connection pool without a commit or rollback. You’ll get some exceptions in your code, but worse – you’re exposed to database locks and deadlocks, because the connection still holds every lock taken during the batch update, and since other threads will reuse the pooled connection, those locks are never released.
  2. Do not use Statement.addBatch – use PreparedStatement.addBatch. When I compared the two, the Statement version took anywhere between 30s and 60s, while the PreparedStatement version took under 300ms…
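Following point 1, the batch should end with an explicit commit (or a rollback on failure) before the connection goes back to the pool, so no locks are left behind. Here is a minimal sketch of that pattern – insertBatch and toDbFlag are my own hypothetical helpers, and temp_table is the hypothetical table from the snippet above:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

public class BatchHelper {
	// Maps the nullable Boolean from the entity (e.g. Data.isWorking()) to a nullable DB flag
	static Integer toDbFlag(Boolean working) {
		return working == null ? null : (working ? 1 : 0);
	}

	// Runs the whole batch in one transaction: commit on success, rollback on failure
	static void insertBatch(Connection connect, int[] ids, Boolean[] flags) throws SQLException {
		boolean oldAutoCommit = connect.getAutoCommit();
		connect.setAutoCommit(false);
		try (PreparedStatement ps = connect.prepareStatement("INSERT INTO temp_table values(?,?)")) {
			for (int i = 0; i < ids.length; i++) {
				ps.setInt(1, ids[i]);
				Integer flag = toDbFlag(flags[i]);
				if (flag == null) {
					ps.setNull(2, Types.INTEGER);
				} else {
					ps.setInt(2, flag);
				}
				ps.addBatch();
			}
			ps.executeBatch();
			connect.commit();
		} catch (SQLException e) {
			connect.rollback();  // release the locks even if the batch failed
			throw e;
		} finally {
			connect.setAutoCommit(oldAutoCommit);
		}
	}
}
```

The try/finally also restores the connection’s original auto-commit setting, so the pooled connection is handed back in the state the pool expects.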

How to reuse a java.lang.StringBuilder

It’s common knowledge in Java that you should clear a StringBuilder object for reuse rather than recreate it – to reduce garbage-collection pressure, for instance. But, surprisingly enough, there is no clear() method on the StringBuilder class, so we resort to manual clearing techniques. The ones I’ve seen are:

  • new StringBuilder, which simply recreates the object
  • setLength(0), which resets the internal length field without touching the underlying buffer
  • delete(0, sb.length()), which removes the entire buffer contents
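Before timing anything, it is worth seeing what each technique does to the builder’s internal buffer: setLength(0) (and delete) keep the already-grown char array for reuse, while new StringBuilder() starts over at the default capacity of 16. A quick check:

```java
public class StringBuilderCapacity {
	public static void main(String[] args) {
		StringBuilder sb = new StringBuilder();
		for (int i = 0; i < 1000; i++) {
			sb.append('a');
		}
		int grownCapacity = sb.capacity();                  // buffer grew to hold 1000 chars

		sb.setLength(0);                                    // logically empty...
		System.out.println(sb.length());                    // 0
		System.out.println(sb.capacity() == grownCapacity); // true - the buffer is kept

		sb = new StringBuilder();                           // fresh object, fresh buffer
		System.out.println(sb.capacity());                  // 16 - the default capacity
	}
}
```

So clearing in place avoids re-growing the buffer on every reuse – which is exactly why one would expect it to win the benchmark below.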

One would think that the best technique is setLength – but I decided to put it to a short test.
Here is the code I’ve used:

package com.tona;

import java.util.Random;

public class TestStringBuffer {
	private static final int NUM_OF_RUNS = 100000;
	private static final int MAX_LENGTH = 1000;
	private static Random rand = new Random();

	private static int fillString(StringBuilder sb,boolean isSameLenString) {
		
		int length;
		
		if (isSameLenString)
			length = MAX_LENGTH;
		else 
			length = rand.nextInt(MAX_LENGTH);
		
		for (int i = 0; i < length; ++i) 
			sb.append("a");
		
		return length;
	}
	
	private void runBenchmark(boolean isSameLenString) {
		StringBuilder sb = new StringBuilder();
		long start = System.currentTimeMillis();

		for (int i = 0; i < NUM_OF_RUNS; ++i) {
			fillString(sb,isSameLenString);
			sb.setLength(0);
		}
		long end = System.currentTimeMillis();
		long diffWithLen0 = end - start;
		
		start = System.currentTimeMillis();

		for (int i = 0; i < NUM_OF_RUNS; ++i) {
			fillString(sb,isSameLenString);
			sb = new StringBuilder();
		}
		end = System.currentTimeMillis();
		long diffWithNew = end - start;

		start = System.currentTimeMillis();

		for (int i = 0; i < NUM_OF_RUNS; ++i) {
			fillString(sb,isSameLenString);
			sb = new StringBuilder(MAX_LENGTH);
		}
		end = System.currentTimeMillis();
		long diffWithNewConstLength = end - start;

		start = System.currentTimeMillis();

		for (int i = 0; i < NUM_OF_RUNS; ++i) {
			fillString(sb,isSameLenString);
			sb.delete(0, sb.length());
		}
		end = System.currentTimeMillis();
		long diffWithDelete = end - start;

		start = System.currentTimeMillis();

		for (int i = 0; i < NUM_OF_RUNS; ++i) {
			int length = fillString(sb,isSameLenString);
			sb.delete(0, length);
		}
		end = System.currentTimeMillis();
		long diffWithDeleteConstLength = end - start;

		System.out.println("With setLength(0) " + diffWithLen0);
		System.out.println("With new StringBuilder() " + diffWithNew);
		System.out.println("With new StringBuilder(MAX_LENGTH) " + diffWithNewConstLength);
		System.out.println("With delete(0, sb.length()) " + diffWithDelete);
		System.out.println("With delete(0, length) " + diffWithDeleteConstLength);
	}
	
	public static void main(String[] args) {
		TestStringBuffer test = new TestStringBuffer();
		System.out.println("Constant length string");
		test.runBenchmark(true);
		System.out.println("Changing length string");
		test.runBenchmark(false);
		
	}
}

And here are the results:

Constant length string
With setLength(0) 1524
With new StringBuilder() 1501
With new StringBuilder(MAX_LENGTH) 1365
With delete(0, sb.length()) 1369
With delete(0, length) 1391
Changing length string
With setLength(0) 686
With new StringBuilder() 743
With new StringBuilder(MAX_LENGTH) 796
With delete(0, sb.length()) 715
With delete(0, length) 698

(Note that the varying-length runs use a Random, so results may vary.)
For fixed-length strings, new StringBuilder with the predefined capacity is the best option, though only marginally faster than the delete method. For varying-length strings, setLength and delete are more or less on par (give it a few runs and you’ll see it for yourself).
So, now you know how to clean your StringBuilder object. Enjoy.