Category Archives: Tools

Arquillian and GlassFish: porting ‘Hello World’ from TomEE

So far we have seen how to run a Hello World test with Arquillian and WildFly, and how to port that test to Arquillian + TomEE.

Today we will see how to run that test against GlassFish, using its managed adapter.

1. Dependency configuration

In theory, the only thing that should change at the dependency level is the Arquillian libraries for GlassFish (change #1).

We will modify build.gradle to use org.jboss.arquillian.container:arquillian-glassfish-managed-3.1:1.0.0.Final-SNAPSHOT, as follows:

   // *****************************************************************************
   // GlassFish managed
   testRuntime 'org.jboss.arquillian.container:arquillian-glassfish-managed-3.1:1.0.0.Final-SNAPSHOT'

2. Code

The same as for WildFly and TomEE.

3. Runtime configuration

Although this would be the place for it, we already changed the test runtime dependencies above.

We must change the arquillian.xml file to configure the GlassFish managed server (change #2), so that it ends up as follows:

<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.org/schema/arquillian"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
   <container qualifier="glassfish-managed" default="true">
      <configuration>
         <property name="glassFishHome">/home/pagdev/pagde/tool/glassfish-4.1/</property>
      </configuration>
   </container>
</arquillian>

The only mandatory property in this configuration is glassFishHome, which must point to the GlassFish installation directory.

4. Running the tests

They are run exactly the same way:

gradle test --tests *Test

Notes

We only had to make two changes to run the simplest possible test using an EJB bean.

Gradle took around 18 seconds to run the test against managed GlassFish, and around 11 seconds against managed WildFly; over many runs the times varied by no more than 2 seconds. Against embedded Tomcat, the tests ran in about 6 or 7 seconds.

Software used

This has been tested with GlassFish 4.1 and Arquillian 1.1.8, using Gradle 2.3.

Arquillian and TomEE: porting ‘Hello World’ from WildFly to TomEE

In theory, Arquillian lets us write integration tests for several servers using the same code, with only minimal configuration changes.

In this post my goal is to use Arquillian with TomEE instead of WildFly, and to see which changes we have to make to run the same Hello World test we already saw with WildFly.

The goal is to detail each and every configuration change between embedded TomEE and embedded WildFly.

1. Dependency configuration

Since we do not use any functionality specific to a particular application server, the compile and test dependencies are the same as in the WildFly project.

In theory, the only thing that should change at the dependency level is the TomEE runtime libraries (change #1), needed to start and run TomEE instead of WildFly.

We will modify build.gradle to use org.apache.openejb:arquillian-tomee-embedded:1.7.1, as follows:

   // *****************************************************************************
   // TomeePlume: embedded
   testRuntime 'org.apache.openejb:arquillian-tomee-embedded:1.7.1'

In practice, however, when running the tests we hit a java.lang.ClassFormatError, which is solved by switching the JEE 6 API dependency from javax:javaee-api:6.0 to org.apache.openejb:javaee-api:6.0-4 (change #2) for TomEE.

Since WildFly also works perfectly well against this dependency, this is really a change to make to the WildFly configuration rather than to the TomEE one.
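
For reference, here is a minimal sketch of what the dependencies block of the TomEE build.gradle might look like once both changes are in place (assuming the org.apache.openejb:javaee-api:6.0-4 coordinates mentioned in the summary below):

dependencies {
  // JEE 6 API that embedded TomEE can load without the ClassFormatError
  // (replaces the original javax:javaee-api:6.0 line)
  compile 'org.apache.openejb:javaee-api:6.0-4'
  testCompile 'junit:junit:4.11',
    'org.jboss.arquillian.junit:arquillian-junit-container:1.1.8.Final'
  // TomEE embedded adapter instead of the WildFly one (change #1)
  testRuntime 'org.apache.openejb:arquillian-tomee-embedded:1.7.1'
}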

2. Code

The same as for WildFly.

3. Runtime configuration

Although this would be the place for it, we already changed the dependencies to run TomEE instead of WildFly above.

Of course, we must change the arquillian.xml file to configure the embedded TomEE server (change #3), so that it ends up as follows:

<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.org/schema/arquillian"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
   <container qualifier="tomee" default="true">
      <configuration>
      </configuration>
   </container>
</arquillian>

And, yes, no parameters need to be configured at all to use embedded TomEE.

Another thing to bear in mind is that with the WildFly embedded adapter it was mandatory to pass the java.util.logging.manager parameter to the server JVM, selecting the WildFly log manager. That log manager is not available in TomEE, so the container will not start.

Solution: do not pass the java.util.logging.manager parameter to the TomEE runtime (change #4).
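
A possible way to keep a single build.gradle that works for both containers is to guard the WildFly-only property behind a project property; this is just a sketch, and the wildflyTests property name is made up for illustration:

test {
  // Only WildFly needs (and ships) org.jboss.logmanager.LogManager;
  // setting it for TomEE would keep the container from starting.
  if (project.hasProperty('wildflyTests')) {
    systemProperties 'java.util.logging.manager': 'org.jboss.logmanager.LogManager'
  }
}

With this in place, the property would be set only when targeting WildFly, for instance by passing -PwildflyTests on the gradle command line.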

4. Running the tests

They are run exactly the same way, with gradle test --tests *Test.

In short

We had to make four changes to run the simplest possible test using an EJB bean.

Actually, if we update the WildFly project to use the org.apache.openejb:javaee-api:6.0-4 dependency as the JEE 6 API, with which it works perfectly, the number of changes drops to three, all of them trivial.

Not bad at all, is it?

Notes

As a side note, Gradle took 5.856 seconds to run the test against TomEE. The same test, run against WildFly with the standalone.xml configuration, took 9.443 seconds.

Over many runs the times always stayed around these values, with variations of no more than 1 second. It is a noticeable difference worth mentioning, but I would not give it too much weight, since ours is not a realistic scenario.

Software used

This has been tested with TomEE Plume 1.7.1 and Arquillian 1.1.8, using Gradle 2.3.

Arquillian and WildFly: a ‘Hello World’ test

Arquillian makes it fairly easy to create integration tests against web and JEE servers/containers such as WildFly (or Tomcat).

To work with a server, Arquillian uses container adapters, which let it start and stop the containers. These come in three flavors: embedded, managed, and remote.

For this Hello World we will use the embedded adapter, and we will leave for another article the discussion of the advantages and drawbacks of each adapter type and when to use each one.

1. Dependency configuration

To compile the tests we will need JUnit and Arquillian: just including junit:junit:4.11 and org.jboss.arquillian.junit:arquillian-junit-container:1.1.8.Final in build.gradle will pull in all the other required jars, especially the Arquillian dependencies.

To run the tests, we must include a WildFly container adapter, which Arquillian will use to start and stop the container the test runs against. Since we will use the embedded WildFly container, we must include org.wildfly:wildfly-arquillian-container-embedded:8.2.0.Final in the dependency list.

Finally, for Gradle to be able to find some of these dependencies we must add a JBoss repository to our build.gradle, which once all of this is done will look as follows:

apply plugin: 'java'

sourceCompatibility = 1.6

repositories {
  mavenCentral()
  maven {
    url "http://repository.jboss.org/nexus/content/groups/public-jboss"
  }
}

dependencies {
  compile 'javax:javaee-api:6.0'
  testCompile 'junit:junit:4.11',
    'org.jboss.arquillian.junit:arquillian-junit-container:1.1.8.Final'
  testRuntime 'org.wildfly:wildfly-arquillian-container-embedded:8.2.0.Final'
}

test {
  // If the log manager is not specified with the WildFly 8.2 embedded adapter,
  // it will not start correctly
  systemProperties 'java.util.logging.manager': 'org.jboss.logmanager.LogManager'
}

2. Code

To show how to implement and run simple tests we will create a stateless session bean, MyStatelessEjb, which simply adds two numbers:

import javax.ejb.Stateless;

@Stateless
public class MyStatelessEjb {
   public int sum(int a, int b) {
      return a + b;
   }
}

To write a JUnit test with Arquillian we need a special JUnit runner, included with Arquillian, so we annotate the test class with @RunWith(Arquillian.class).

We must also specify the basic configuration of the application, so that the test can run in the right context: where the required .class files are, whether we are building a war or something else, whether to include a configuration file such as persistence.xml, and so on.

To do this we create a method annotated with @Deployment, which uses the ShrinkWrap class to carry out all of this configuration work.

The resulting test code will look as follows:

import javax.ejb.EJB;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.Archive;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class MyStatelessEjbTest {
   @EJB
   private MyStatelessEjb ejb;

   @Deployment
   public static Archive<?> createTestArchive() {
       return ShrinkWrap.create(WebArchive.class)
               .addPackage(MyStatelessEjb.class.getPackage());
   }

   @Test
   public void testSum() {
      Assert.assertEquals(5, ejb.sum(6,-1));
   }
}

Other than that, the tests are plain, ordinary JUnit tests, as testSum shows. Note that this test uses our bean, which is injected into the test with @EJB and managed by the WildFly container itself.

3. Runtime configuration

At this point the application will compile, but a few extra steps are still needed before the application starts and we can run the tests.

We already included the additional runtime libraries in a previous step, but we still have to configure Arquillian. This is done through arquillian.xml, which we place in the project's src/test/resources directory. It will look as follows:

<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.org/schema/arquillian"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://jboss.org/schema/arquillian
      http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
  <container qualifier="jboss-embedded" default="true">
    <configuration>
      <property name="jbossHome">/home/pagdev/pagde/tool/wildfly-8.2.0.Final</property>
      <property name="modulePath">/home/pagdev/pagde/tool/wildfly-8.2.0.Final/modules</property>
      <property name="serverConfig">standalone-full.xml</property>
    </configuration>
  </container>
</arquillian>

Here we use the required jbossHome property to point to the directory where JBoss lives (like JBOSS_HOME).

This version of the adapter also requires the modulePath parameter, which points to where the usable JBoss modules live; by default they are in the modules subdirectory of JBoss. It is odd that this parameter is mandatory, given that a default value could easily have been used.

I have also added the serverConfig parameter, which lets us specify which server configuration to use, and which is not required.

4. Running the tests

Unfortunately we need a small workaround for a bug in the embedded adapter. We must pass the java.util.logging.manager parameter to the WildFly container started by Arquillian, simply using WildFly's default LogManager. This is what the following build.gradle fragment does:


test {
  systemProperties 'java.util.logging.manager':
         'org.jboss.logmanager.LogManager'
}

An alternative would be to call gradle from the command line passing that parameter. To run all the tests in the project, for instance, we would use the following command line:

gradle test -Djava.util.logging.manager=org.jboss.logmanager.LogManager --tests *Test

With the current build.gradle configuration, we can run the test with

gradle test --tests *Test

On the other hand, if we run the tests from Eclipse instead of using Gradle from the command line, we will have to tell the corresponding launcher to pass the -Djava.util.logging.manager=org.jboss.logmanager.LogManager parameter to the VM.

If we want to run these tests from the command line and debug them with Eclipse, it is worth taking a look at this other article.

Software used

This has been tested with WildFly 8.2 and Arquillian 1.1.8, using Gradle 2.3.

Debugging build.gradle with Eclipse

It would be nice if we could debug our Gradle scripts in Eclipse, using all the goodies that development environment provides: syntax highlighting, code auto-completion, debugging support, etc.

And it would be even better if I could create .groovy files with all the goodies Eclipse can offer and be able to use them in our build.gradle script with little or no effort. I try hard to avoid spaghetti code, and being able to distribute part of my scripts in isolated Groovy classes could help.

The good news is that we can do all of this. The bad news? There are some ugly limitations with which we’ll have to live -at least as of Eclipse Luna 4.4.2 with plugins Groovy-Eclipse 2.9.2 and Gradle IDE 3.6.4.

Before we begin

Before starting, make sure you have the following plugins installed:

  • Gradle Plugin for Eclipse.
  • Groovy Plugin for Eclipse.

Properly configured, they’ll provide us with syntax highlighting, code completion and debugging support.

Debugging a simple Gradle Script

Let’s create a debug-gradle-in-eclipse directory for the build.gradle file we want to debug and the Eclipse project we will use to debug it –which can be at the same time the Eclipse project corresponding to the gradle project.

The following build.gradle file will help us understand how and what can be debugged with Eclipse and Gradle:

apply plugin: 'eclipse'

class ClassInBuildGradle {
   ClassInBuildGradle() {
      println '*BUT* you *CAN* place a breakpoint in a class method ;)'
   }
}

void scriptFunction() {
   println 'You can\'t place a breakpoint in a script function :('
}

task sayHello << {
   def configMessage = 'You can\'t place a breakpoint in a task :('
   println configMessage
   scriptFunction()
   new ClassInBuildGradle()
}

Here, we have defined a class (ClassInBuildGradle) and a function, scriptFunction, to show where you can place a breakpoint –or not.

To make things easier, we have added the eclipse plugin to generate an Eclipse project for this gradle build file. To create it, execute

   gradle eclipse

Now open Eclipse, and import the project with File|Import… and then General|Existing Projects into Workspace, choosing the project directory.

Open build.gradle in Eclipse, and you’ll notice you’ve got syntax highlighting, thanks to the Groovy/Gradle plugins.

[Figure 1]

Now, before adding breakpoints you need to tell Eclipse this is a Groovy project: just choose the Configure|Convert to Groovy Project from the project contextual menu (I did this in the Navigator view, selecting the project itself and right-clicking the mouse).

Once this is done, you can add breakpoints: add them at line 5 (inside a class method), line 10 (inside a script function) and line 14 (inside a task), as seen in the figure above.

Now, create a remote Eclipse debug configuration, going to Run|Debug Configurations…, and then choosing Remote Java Application. Call it debug-gradle-in-eclipse, build.gradle, and set the Port to 5005 and check the Allow termination of remote VM option. Save the configuration with Apply and then Close -do not launch it by clicking Debug.

[Figure 2]

Note that port 5005 is the port gradle expects a debugger to use by default.

I recommend storing this debug configuration in your project directory, by going to the Common tab and then selecting the directory in Shared file: I stored it in the eclipse-launchers subdirectory, which is where I place this and other Eclipse launch configurations.

Next, run the sayHello gradle task from the command line as follows:

gradle sayHello -Dorg.gradle.debug=true

This will start the task for debugging, and Gradle will wait for a debugger to connect to port 5005. Do it by opening the debug configuration we just created and clicking Debug. Eclipse will start debugging, showing you something like this:

[Figure 3]

You’ll notice several things going on here.

First, Eclipse will not open the build.gradle file. This is a limitation, but you can work around it here by taking a look at the stack trace, where you can see that the source line is number 5. So, you can open build.gradle and go to that line.

Or you can click in the Breakpoints view to jump to the source line. This is my preferred way, but you need to have build.gradle open in the editor. The Variables view is fully operative, and it will allow you to check and modify variable values.

Next, you’ll notice that the other breakpoints are simply ignored. If you take a look at the Breakpoints view you’ll see they are reported as being placed in build, which is a way of saying they are located in code belonging to the script object.

No, you can’t place breakpoints in the script code Gradle generates under the hood; you can only place them in code for other classes defined in the script, or in Groovy classes used by the script -more on that later. Therefore, moving as much code as possible to classes can be a good idea.
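
As a minimal, hypothetical sketch of that idea (the SayHelloLogic class and the sayHelloDebuggable task are made-up names), the task body can be reduced to a thin wrapper around a class whose methods do accept breakpoints:

class SayHelloLogic {
   void run() {
      // Breakpoints placed inside class methods like this one are honored
      println 'Hello from a debuggable class method'
   }
}

task sayHelloDebuggable << {
   // The task closure itself still cannot hold breakpoints,
   // so it just delegates to the class above
   new SayHelloLogic().run()
}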

Besides, by debugging step by step you’ll be able to see the values of variables in the stack, with the same names you gave them in the source code, and you’ll quickly find the task objects, the project object, etc. Surprisingly, this can be very useful.

Yeah, ugly, but it might be workable, and better than debugging without a debugger. I hope the next version of the Eclipse Groovy plugin solves this issue. I think we are not that far from there, given what we already have.

Important
You must execute the gradle task first, and only then launch the Eclipse debugger. If you launch the debugger first, you’ll get an error, as shown below.

[Figure 4]

Moving part of the Gradle Script to separate Groovy classes

If your script starts to grow, you will probably want to create your own Groovy classes to avoid spaghetti code. Gradle supports this out of the box, by allowing you to write Groovy code in buildSrc. The default source directory is buildSrc/src/main/groovy.

Let’s create a Greeter.groovy class in buildSrc/src/main/groovy, as follows:

class Greeter {
   private String name
   Greeter(String name) {
      this.name = name
   }

   void greet() {
      def message='Greeter Groovy class greeting you, '+ this.name+'!'
      println message
   }
}

Now, we’ll use it in build.gradle by adding a line that uses it, such as

new Greeter('Gradle').greet()

You’ll need to configure Gradle so that it can process these Groovy files, adding the following lines at the beginning of build.gradle:

repositories {
   mavenCentral()
}

apply plugin: 'groovy'

dependencies {
   compile gradleApi()
   compile localGroovy()
}

The last step is to help Eclipse show the source code for these Groovy files when a breakpoint is hit. Just edit the debug configuration, go to its Source tab, and add an entry to the Source Lookup Path: click Add…, choose the Workspace Folder option and then choose the debug-gradle-in-eclipse project and its buildSrc/src/main/groovy directory.

[Figure 5]

Now, run the gradle sayHello task from the command line as explained before, execute the Eclipse debugging configuration, and when a breakpoint in Greeter.groovy is hit the source file will open automatically.

Now, that’s much better than what we’ve got for the build.gradle script, and very helpful to create complex scripts or even Gradle plugins.

Now that we are at it…

I recommend you enable a couple of options via the project contextual menu, clicking Gradle|Enable DSL Support and Groovy|Enable DSL Support, which can help when working with .gradle source code.

Last, but not least, you can run the gradle tasks with the gradle daemon, as follows:

gradle sayHello --daemon -Dorg.gradle.debug=true

In this scenario, the Eclipse debugger will attach itself to the daemon, and it will be active until the daemon is stopped. This can make debugging faster, though some instability might result –both things inherent to using the gradle daemon.

Moving log4js-ext development from Windows to Linux -while keeping Windows dev supported

I have been moving my development environment from Windows to Linux, step by step, and the time for moving log4js-ext has arrived.

Log4js-ext is a pure javascript + Java project, and therefore source code portability is not a big issue: the only portability issues that arose were due to the build tools I use to create the deliverables, and the ant build file.

Issues

Here is my main issues list:

  • Differences in OS conventions, including directory separator, etc., make tools fail.
  • Can’t use good old CMD to run command line tools.
  • Some tools are not available in the Linux world.
  • Other nuisances.

Differences in OS conventions, including directory separator character, etc.

There are differences in the way Linux and Windows name things, such as case sensitivity or the directory separator.

Being aware of that, and knowing that Windows supports both ‘\’ and ‘/’ as directory separators, I’ve been using ‘/’ for ages, so that did not pose a problem.

Besides, some file paths changed (e.g., the Tomcat config directory location), but that was easy to spot and fix, and I moved those paths to ant properties to make them easier to deal with.

Thankfully, no other issues appeared in this area.

Can’t use good old CMD to run command line tools

While the project was hosted on Windows, I added calls to external tools to generate the Javascript API documentation, optimize image size, etc., invoking cmd.exe with the ant exec task.

For example, this is the relevant part of the ant code I used to generate the API documentation:

<exec executable="cmd" dir="${lib.basedir}/src" >
  <arg line="/c jsduck .../>
</exec>

I had to modify the exec to make it work both in Windows and Linux, as follows:

<!-- Ok, executables are called differently in Windows and Linux... -->
<condition property="jsduck.cmd" value="cmd.exe">
  <os family="windows"/>
</condition>
<property name="jsduck.cmd" value="jsduck"/>

<condition property="jsduck.cmd.extraparams" value="/c jsduck">
  <os family="windows"/>
</condition>
<property name="jsduck.cmd.extraparams" value=""/>

<exec executable="${jsduck.cmd}" dir="${lib.basedir}/src" >
  <arg line="${jsduck.cmd.extraparams} .../>
</exec>

As you can see, I had to provide slightly different commands to exec. It was easy, but I have to admit that ant makes this verbose and makes it look like a bigger deal than it is.

Another annoyance was running java to minify the consolidated javascript file that includes all of log4js-ext. In Windows, I used an exec task to call Java, passing the YUI minifier jar, as follows

<exec executable="cmd">
  <arg line="/c java -jar ${yui.jar}
   ${pack.tmpdir}/${lib.project}-all-dev.js
   -o ${pack.tmpdir}/${lib.project}-all.js"/>
</exec>

Now, I decided to use the ant java task instead, as it looked easier to use it in a portable way:

<java classpath="${log4j.jar}" jar="${yui.jar}"
  spawn="true" fork="true">
  <arg line="${pack.tmpdir}/${lib.project}-all-dev.js -o
     ${pack.tmpdir}/${lib.project}-all.js"/>
</java>

However, I discovered that I had to fork the java task in order to invoke a jar, and that meant that subsequent operations were executed *in parallel* with it…and therefore ran before the java task output they required was available, failing.

The workaround was to add the following lines immediately after the java task, so that it waited until the generated file was available, or for 30 seconds, whichever came first:

<waitfor maxwait="30" maxwaitunit="second">
  <available file="${pack.tmpdir}/${lib.project}-all.js"/>
</waitfor>

Thank God, the problem with parallel execution was easily reproducible and I was able to find the cause in no time. Had it manifested itself only from time to time, it would have been a real pain.

Some tools are not available in the Linux world

Too bad, but it is true: some tools that are readily available in the Windows world are not there in the Linux world.

That was the case with the tools I use to optimize PNG, GIF and JPG images, pngslim and pngout.

Well, since log4js-ext is very mature when it comes to the front-end, there is no real need to optimize new images, which will probably never come into being. The smart thing to do was to ignore the problem, and not to look for alternatives 🙂

So, that’s what I placed at the beginning of the corresponding ant task:

<fail message="NOT PORTED TO LINUX TOOLS: if the need arises to add new images, then it will make a lot of sense to look for alternatives that work in Linux"/>

Call me pragmatic!

Other issues

Whenever you move or install a complex development environment, it is always hard to reproduce it exactly, unless you go to great lengths.

Log4js-ext being a personal project, that was asking too much, and I just installed the latest versions of the different tools in Linux.

That meant I found the typical problems due to my Windows environment using version x.y.1 of JSDuck and Linux apt-get installing x.y.2: in fact, I had not one, but two problems related to using a newer version of JSDuck.

I know, this is not related to Windows vs Linux per se, but it is worth taking into account because it will happen. For big projects this can be a real problem, and I tend to be very, very strict with the exact version of tools and libraries I use.

A non-trivial ant build file that works in Linux *and* Windows

Just for the sake of it, I ran the new ant file in Windows and performed several tests to check the results: everything worked, so I called it a day.

All in all, making my build and test scripts work in Windows and Linux was not as much of a chore as I had anticipated, and I was pleased with the time it took to migrate and the result.

An introduction to version control, using Git: Pragmatic Version Control Using Git

Right now, if someone asked me which version control system to use, and which book to pick to get started with a version control system, my answer would be “use Git, and take a look at Pragmatic Version Control Using Git”.

The book gets straight to the point, without wandering off track, and above all without losing sight of the fact that a version control system is a means to an end: first comes the concrete need, and then what a system like Git offers to satisfy that need.

Too often we are told the how without a glimpse of the why: that is not the case with this book, which never loses sight of the why of things.

On the other hand, if you already have experience with a version control system, a book that very quickly connects the concrete need (how do I create a separate branch to tinker in?) with what Git provides to satisfy it will not hurt either.

Finally, if you are a real expert… look elsewhere. This is a book of just over 150 pages; it simply cannot be a book for everyone.

An excellent introduction both to version control systems and to Git, ideal for programmers with little or no experience with either.

Juicer: issues and workarounds

Juicer is a nice tool to merge and minify several CSS files into just one file. Kudos to the guy that created it!

That said, I’ve been bitten several times by it. Here are some of the troubles I experienced and what I did to get things to work for me.

Juicer: embedding images in CSS

Juicer wants to know the --document-root when embedding images with --force-image-embed or --embed-images data_uri.

Note that I’m talking about image inclusion in a css using relative URLs; I have not checked this for absolute urls because I don’t care: I never use them.

For things to work, the --document-root must be the directory of the css file you are processing.

Again, a clarification is in order: I always process just one css file each time with Juicer, because I make a point of creating a single css file that @imports all css files I want to consolidate. The net effect is that I’m processing several files, but specifying just one in the command line.

I’m making this clear so that you know I just haven’t tested Juicer behavior with --document-root and multiple css files in the command line. Better safe than sorry!

If you notice I’m being defensive, it is because I am. Juicer is great, but I’ve found that its ways are not obvious to me, and I made many assumptions that did not hold.

Fixing relative CSS urls with ant

I run juicer from my project ant tasks, and that can be problematic because there are ant ways and there are Juicer ways, and they collide from time to time.

Case in point: I have to perform text substitution on the generated css file so that relative urls are right: this is due to the fact that Juicer seems to be intended to be run from the webapp root directory if you want correct relative urls in your css.

*But* I have not managed to run the ant task in the webapp root directory. I tried the dir attribute in the exec task, which seemed to be the cure, but it didn’t do what I needed.

Yes, I can bypass ant and run things from the command line. But then I will have ant’s way, the command line way, and will need to duplicate configuration information for ant and batch files (quick, where is the YUI compressor jar in your system/project?). And I hate breaking the DRY (Don’t Repeat Yourself) rule.

Workaround: replace ../WebContent/ with ../, WebContent being the subdirectory relative to my ant build.xml file. This is way simple in ant:

<replace 
   file="${juicer-friendly-output-dir}/${ant.project.name}-all.css" 
   token="url(../../WebContent/" value="url(../"/>

Juicer is written in Ruby, and Ruby has its ways

Ruby seems not to like ‘:’ or ‘\’ in file names -or Juicer uses them in a bizarre way.

Ok, I know almost nil about Ruby, but I had some bizarre problems with file names. In some cases, the error message I got helped me clearly diagnose the problem.

In other cases, I had no error message, and it took a *lot* of time to identify the problem.

The workaround is avoiding ‘:’ or ‘\’ in file names passed directly or indirectly to Juicer.

Note that this can happen inadvertently. For example, if you execute an ant task in Windows, ant’s ${basedir} can end up being c:\x\y\z.

Workaround: convert all directory and file names to Unix style, which Windows handles correctly, like this:

<pathconvert property="juicer-friendly-input-dir" targetos="unix">
  <path location="${basedir}/WebContent/css"/>
</pathconvert>

Juicer does not generate a non-minified css: but you can trick it into it

Juicer provides two nice features for CSS: it consolidates several css files into one, and it minifies the resulting file.

I would love to have a single production time css that is minified, and another one that is not, to help users of my js libraries with customizing the css. But Juicer does not provide a way to generate a non-minified version.

Workaround: specify a fake, nonexistent .jar file with the --path INEXISTENT_MINIFIER_JAR_TO_GET_NON_MINIFIED_CSS argument. Juicer will generate the consolidated css and will fail with an Unable to access jarfile error…but the consolidated css file will be there.

Yes, ugly, but very useful. I wish they had added a none option to the --minifyer argument, which would have the same effect without the error message, but they did not.