TAR – What is it and how to use it


Recently, I had to migrate several WordPress sites from one server to another, which might make for a good technical post on its own.  What I kept running into, however, was the crucial bit of knowledge needed to zip up directories and move them, something I had to do over and over and over again.  For my own personal benefit (and maybe for others), I decided to write this quick blog on how to compress files and directories using tar and how to uncompress them.

Before we begin, what the heck does tar stand for?  After doing some research (because I use this command quite often now), I learned it is a shortened form of “tape archive,” and it does what tools like WinZip do: it bundles files and directories down into a single manageable file that you can quickly move around or archive.

For this tutorial, we will look at an example of the command, the flag directives used, and what they mean. We will also consider not only how to “tar” something – compress a set of files in a directory – but also how to uncompress a tar file.  As a side note, there are a good number of related compression formats such as bzip2, gzip, and xz (and archive extensions like .tgz), and I won’t be dealing with those here.

Why not just zip?

So, when I first started down this path of “tar’ing” files, I wondered about using what I was comfortable with from my time in Windows: why not just use zip?  The reason is that zip does not retain all of the file permissions when you zip up a directory.  If you want to preserve the user and/or group permissions, then you really need to use the tar command.
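To see this for yourself, here is a quick sketch (the directory and file names are just illustrative): tar’s -tv listing shows the ownership and mode bits it recorded, which is exactly the information a plain zip round trip can lose.

```shell
# create a throwaway directory with a locked-down config file
rm -rf tar_demo && mkdir -p tar_demo/wordpress
echo "<?php // secrets ?>" > tar_demo/wordpress/wp-config.php
chmod 600 tar_demo/wordpress/wp-config.php   # owner-only permissions

# archive it (the flags are explained in the next section)
(cd tar_demo && tar -czf wordpress_back.tgz wordpress/)

# -tv lists the archive contents along with the stored mode bits;
# note that rw------- survives inside the archive
tar -tvzf tar_demo/wordpress_back.tgz
```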

Tar Command Basics

# tar -cvzf wordpress_back.tgz wordpress/
....and on and on it goes...

So what actually happened in the following command?  Let’s break it down.

# tar -cvzf wordpress_back.tgz wordpress/

The flag directives in order mean:

  • c – create an archive
  • v – verbosely show the progress of the archive as it is being created
  • z – filter the archive through gzip
  • f – specifies the name of the archive file

Interestingly, the order matters: the f flag must come last, because tar treats the very next argument as the archive filename. If you get these out of order, the process will fail; however, removing the dash entirely (old-style options) will still work correctly.


# tar -cfzv wordpress_back.tgz wordpress/
tar: wordpress_back.tgz: Cannot stat: No such file or directory


# tar cvzf wordpressBak.tgz wordpress/
 ....and on and on it goes...

If you need more directives, you can run man tar to see the full list.


Decompressing a .tgz or tar file is also pretty easy with the following command:

# tar -xzf wordpressBak.tgz

OK, so what happened here?  Again, let’s break it down and look at the directives used.

  • x – extract the archive file
  • z – same as before: filter the archive through gzip
  • f – same as before: specifies the name of the archive file

Note that if you run this command where a directory already exists with the same name as the one that was archived, the extraction will overwrite it, along with all of its permissions. So, one way to deal with this is to extract only the file you want:

# tar -xzf wordpressBak.tgz wordpress/wp-config.php

This gives us only the wp-config.php file from the wordpress directory inside wordpressBak.tgz.  Alternatively, you could write it this way and get the same result:

# tar --extract --file=wordpressBak.tgz wordpress/wp-config.php

For multiple files, just list each file after the archive name (the quotes are optional but protect against shell globbing):

# tar -xzf wordpressBak.tgz "wordpress/wp-config.php" "wordpress/wp-cron.php"

You can also get at these using wildcards:

# tar -xzf wordpressBak.tgz --wildcards '*.php'
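Putting the pieces together, here is a small end-to-end sketch of the wildcard extraction (file names are illustrative; --wildcards is a GNU tar flag, so BSD tar users may need a different invocation):

```shell
# build a sample directory with PHP and non-PHP files
rm -rf wild_demo && mkdir -p wild_demo/wordpress
echo "<?php /* config */ ?>" > wild_demo/wordpress/wp-config.php
echo "<?php /* cron */ ?>"   > wild_demo/wordpress/wp-cron.php
echo "body { margin: 0; }"   > wild_demo/wordpress/style.css

cd wild_demo
tar -czf wordpressBak.tgz wordpress/
rm -rf wordpress                # simulate a fresh server

# pull back only the PHP files
tar -xzf wordpressBak.tgz --wildcards '*.php'
ls wordpress                    # wp-config.php and wp-cron.php, but no style.css
cd ..
```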


Now, I have only scratched the surface of what you can do with the tar command. Again, if you need more information, run man tar and it will give you plenty of details.

Alright that is it for now and don’t get stuck in the tar!

Uploading XML into XMLTYPE column in database with Spring JDBCTemplate

A couple of days ago I was really struggling with how to upload and store an XML file into an XMLTYPE column in the database.  I have uploaded XML files into Oracle db table columns of type CLOB and BLOB with no difficulty at all, but XMLTYPE columns seemed to throw me for a loop. I would get the following error no matter what I tried:

ORA-01461 can bind a LONG value only for insert into a LONG column

This was very perplexing and searching the web on how to code this type of upload seemed to yield next to no results.  I thought putting down the steps I performed to accomplish this might help someone else struggling with this.

Let’s start with a DAO class and walk through the code and what is needed:

UploadObjectDAOImpl.java – this class utilizes Spring JdbcTemplate to perform an insert into the database.

package com.mdrsolutions.web.dao.impl;

import com.mdrsolutions.external.sql.SqlCalls;
import com.mdrsolutions.web.dao.UploadObjectDao;
import com.mdrsolutions.web.dao.entity.UploadObject;
import com.mdrsolutions.web.dao.util.CustomPreparedStatementSetter;
import java.sql.SQLException;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.PreparedStatementSetter;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

@Repository
public class UploadObjectDAOImpl<T extends UploadObject> extends BaseDaoImpl implements UploadObjectDao<T> {

   private static final Logger logger = Logger.getLogger(UploadObjectDAOImpl.class);
   private static final String UPLOAD_XML_SQL_PATH = "db/scripts/application/uploadXML.sql";

   @Autowired
   private CustomPreparedStatementSetter customPreparedStatementSetter;

   @Transactional
   public void upload(final T t) throws SQLException {

       //read the externalized INSERT statement from its .sql file
       String uploadXMLSql = SqlCalls.getSql(UPLOAD_XML_SQL_PATH);

       //build a PreparedStatementSetter that knows how to bind the XMLTYPE value
       PreparedStatementSetter ps = customPreparedStatementSetter.create(getJdbcTemplate(), t);

       int update = getJdbcTemplate().update(uploadXMLSql, ps);
       logger.debug("Successfully updated rows = " + update);
   }
}

Let’s briefly examine a couple of things about this class.

First, you will notice the declaration of the customPreparedStatementSetter field, which is used inside upload(), passing in a reference to the jdbcTemplate object as well as the T object representing the UploadObject.

Next, SqlCalls.getSql(path) retrieves the sql statement from a file and assigns it to the String uploadXMLSql.

Finally, the jdbcTemplate.update method is called, passing in both the uploadXMLSql and the PreparedStatementSetter.

Concerning the SqlCalls.getSql(...) call, you could replace it with any valid insert sql statement. The externalized file basically represents SQL along the following lines (the table and column names here are illustrative):

INSERT INTO upload_object (doc_id, recvd_date, xml_doc)
values (?, ?, ?)

As you can tell, it is a very basic prepared INSERT SQL statement.

The call to CustomPreparedStatementSetter in UploadObjectDAOImpl.java is by far the most important part of this tutorial; let’s take a look at that class next.


package com.mdrsolutions.web.dao.util.impl;

import com.mdrsolutions.web.dao.entity.UploadObject;
import com.mdrsolutions.web.dao.util.CustomPreparedStatementSetter;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import oracle.jdbc.OracleConnection;
import oracle.xdb.XMLType;
import org.apache.log4j.Logger;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.PreparedStatementSetter;
import org.springframework.jdbc.support.nativejdbc.SimpleNativeJdbcExtractor;
import org.springframework.stereotype.Component;

@Component
public class CustomPreparedStatementSetterImpl implements CustomPreparedStatementSetter {

 private static final Logger logger = Logger.getLogger(CustomPreparedStatementSetterImpl.class);

 public PreparedStatementSetter create(JdbcTemplate jdbcTemplate, final UploadObject t) throws SQLException {
    SimpleNativeJdbcExtractor jdbcExtractor = new SimpleNativeJdbcExtractor();

    //XMLType insert requires the use of an OracleConnection,
    //which is usually wrapped inside the jdbcTemplate connection object
    Connection nativeConnection =
            jdbcExtractor.getNativeConnection(jdbcTemplate.getDataSource().getConnection());

    OracleConnection conn = null;
    //Now, we have to unwrap the OracleConnection
    if (nativeConnection.isWrapperFor(OracleConnection.class)) {
        conn = (OracleConnection) nativeConnection.unwrap(OracleConnection.class);
    } else {
        conn = (OracleConnection) nativeConnection;
    }

    //Next, we have to wrap the byte array in an InputStream so the XMLType constructor will accept it
    InputStream is = new ByteArrayInputStream(t.getObject().getBytes());

    //Then, instantiate an XMLType object by using the native OracleConnection and the InputStream of the byte array
    final XMLType xmldoc = new XMLType(conn, is);

    //Finally, create a Spring PreparedStatementSetter object and bind the prepared statement parameters
    PreparedStatementSetter xmlPs = new PreparedStatementSetter() {

        public void setValues(PreparedStatement ps) throws SQLException {
             ps.setString(1, t.getDocId());
             ps.setDate(2, new java.sql.Date(t.getRecvdDate().getTime()));
             ps.setObject(3, xmldoc);
        }
    };

    return xmlPs;
 }
}

Please take note of the comments made throughout this class. Because we are using JdbcTemplate, we don’t have to worry about closing connections here; Spring’s JdbcTemplate handles that for us, which makes this a much easier implementation.

OK, so let’s look briefly at the connection-unwrapping portion of the class.  This part gave me a headache because the documentation doesn’t clearly define the use of a regular Connection object versus an OracleConnection object, which you must have to make this work. If you are using a Tomcat JNDI connection to get your DataSource, you will need to ensure that accessToUnderlyingConnectionAllowed="true" is set on the Resource in your context.xml file for Tomcat.  I included an example below to help out with this.

    <Resource auth="Container"
        type="javax.sql.DataSource"
        driverClassName="oracle.jdbc.OracleDriver"
        accessToUnderlyingConnectionAllowed="true"
        validationQuery="SELECT 1"
        name="jdbc/objectUploadDB" password="tomcat"
        url="jdbc:oracle:thin:@testdb.myoracle.com:1521:DEV" username="tomcat"/>

So, by having this attribute set to true, the pooled Connection can be unwrapped to get at the underlying OracleConnection, which we have to have to continue.

So, looking at the XMLType construction, we see the following:

//Next, we have to wrap the byte array in an InputStream so the XMLType constructor will accept it
InputStream is = new ByteArrayInputStream(t.getObject().getBytes());

//Then, instantiate an XMLType object by using the native OracleConnection and the InputStream of the byte array
final XMLType xmldoc = new XMLType(conn, is);

Oracle requires that, in order to insert XML into a column of type XMLType, you must wrap the XML in a Java XMLType object. Like I said at the beginning of this post, normal insertion of XML into CLOB or BLOB database columns is not difficult at all, but if the column type is XMLType, it gets a little more involved.

So, we first have to convert the byte array into an InputStream, and then we have to pass the native OracleConnection as a parameter into the XMLType() constructor. This constructor takes two parameters: a Connection and either a String, byte[], or InputStream.

Concerning that second parameter, I tried the other two parameter types of String and byte[], but each generated a strange Oracle error (that I didn’t write down); using an InputStream that wraps a byte[] works every time (or at least that’s what my testing shows so far).

And, concerning the first parameter, the documentation doesn’t make it clear that a simple Connection object is not sufficient – you end up finding out from the error messages that an OracleConnection is required.

OK, so that is about it. Looking back at the update call in UploadObjectDAOImpl.java, you will see the following:

int update = getJdbcTemplate().update(uploadXMLSql, ps);

This takes the insert sql statement and the PreparedStatementSetter and performs the insert, returning the number of rows affected.

Some may be wondering where the getJdbcTemplate() method comes from. Well, I have a BaseDaoImpl class which provides that getJdbcTemplate() method for me, and because I extend from BaseDaoImpl…well, if you understand basic Java, you should be able to infer the rest.

How to externalize SQL into separate sql files for JDBC queries

The other day I was working on a project where I had to troubleshoot some sql that was nested in a java class file. After copying and pasting the sql statement, I had to spend some time removing all of the concatenation symbols used to make it readable on a java page…you know, instead of having one long line scrawled across the page. After I fixed it, I had to go back and re-add all of those nasty “+” symbols and extra quotes. I thought to myself, wouldn’t it be nice if I could just store that sql in a separate file like a normal sql file?

So, I came up with the idea of a Java class that could reference those files and read them back in. It’s a first stab at it, but it worked and made the sql much easier to debug as well as to read.

package com.mdrsolutions.dao.sql;

import com.google.common.io.CharStreams;
import com.google.common.io.Closeables;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import org.apache.log4j.Logger;

public class SqlCalls {

    private static final Logger logger = Logger.getLogger(SqlCalls.class);

    public static final String RETRIEVE_GROUP_ALL_SQL = "db/sql/retrieve_group_all.sql";
    public static final String UPDATE_GROUP_SQL = "db/sql/update_group.sql";

    private SqlCalls() {}

    /**
     * Method takes a file path String and returns the String representation of
     * that sql file
     * @param filePath of location of the sql file
     * @return a String of the contents of the SQL file
     */
    public static final String getSql(String filePath) {
        String sql = "";

        //get the file stream object off the classpath
        InputStream is = SqlCalls.class.getClassLoader().getResourceAsStream(filePath);
        try {
          //convert the stream to a String
          sql = CharStreams.toString(new InputStreamReader(is));
          if (null == sql || sql.isEmpty()) {
            throw new IOException("File path to SQL file could not be read!");
          }
          return sql;
        } catch (IOException ex) {
           logger.error("Could not read SQL file " + filePath, ex);
           return sql;
        } finally {
           //close the stream quietly
           Closeables.closeQuietly(is);
        }
    }
}

In the calling class I simply call the static method

SqlCalls.getSql(String path)

like so, with a little Spring JdbcTemplate magic:

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;

public class GroupDaoImpl extends BaseDaoImpl implements GroupDao {

    public GroupDaoImpl(DataSource dataSource){
        super(dataSource);  //BaseDaoImpl wires up the JdbcTemplate from the DataSource
    }

    public void update(final List<Group> groups){
        String updateSQL = SqlCalls.getSql(SqlCalls.UPDATE_GROUP_SQL);
        int[] batchUpdate = getJdbcTemplate().batchUpdate(updateSQL, new BatchPreparedStatementSetter(){

              public void setValues(PreparedStatement ps, int i) throws SQLException {
                   Group grp = groups.get(i);
                   ps.setString(1, grp.getName());
                   ps.setString(2, grp.getLocation());
              }

              public int getBatchSize() {
                   return groups.size();
              }
        });
    }
}

I supplied the path to the sql file as a static variable and it worked like a charm. I was able to externalize my sql, maintain nice formatting, and quickly pull it up to test and debug in my favorite SQL developer IDE.

I stored my sql files under the project’s resources directory so that they resolve on the classpath, something like:

    src/main/resources/
        db/sql/retrieve_group_all.sql
        db/sql/update_group.sql
In the future, it might be cool to take this to a different level and allow the application to scan the directory of sql files on application start-up and store them in a HashMap for easier retrieval, then use the key to get the sql…just some thoughts…