Monday, August 10, 2009

Adding new package sources to Ubuntu Synaptic

Our web team decided a couple of months ago to require Maven 2.2 for their builds, which was a bit of a problem for me because the Debian/Ubuntu maintainers took their sweet time getting a 2.2 build into the default package repositories that my out-of-the-box Synaptic configuration pointed to.

I found this guy's Personal Package Archive with maven-2.2 available, and here are the steps I followed (stolen from his how-to) to get maven upgraded locally:

1) System -> Administration -> Software Sources -> 3rd Party -> Add

2) Added his 2 links to the 3rd party sources

3) Clicked close and ignored the errors

4) On a terminal, typed:
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 5FEF2294
(where the 5FEF2294 is the part after the slash on his Signing Key)

5) sudo apt-get update
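
For reference, the two "links" in step 2 are plain "deb" lines in the standard Launchpad PPA form. A hypothetical example (USERNAME and the "jaunty" release name are placeholders, not his actual archive):

```
deb http://ppa.launchpad.net/USERNAME/ppa/ubuntu jaunty main
deb-src http://ppa.launchpad.net/USERNAME/ppa/ubuntu jaunty main
```

These can also be pasted into /etc/apt/sources.list by hand instead of going through the Software Sources dialog.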

Monday, July 27, 2009

Spring load order: applicationContext vs. dispatcher-servlet

I've forgotten this more times than I care to admit, so let me just write it down in my own words to get it to sink in:

* an applicationContext.xml is "global" for all beans except for BeanFactoryPostProcessors / BeanPostProcessors - like PropertyPlaceholderConfigurer, for example. Those apply ONLY to beans in their own context.

* each servlet defined by a dispatcher-servlet.xml represents a child application context that inherits the beans from the parent (root) app context.
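
A minimal sketch of that scoping rule (the bean class com.example.Mailer, the ids, and app.properties are made-up names for illustration):

```xml
<!-- applicationContext.xml (root context): this configurer resolves
     ${...} placeholders ONLY for beans defined in the root context -->
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="location" value="classpath:app.properties"/>
</bean>

<!-- dispatcher-servlet.xml (child context): beans here can reference
     root-context beans by id, but ${smtp.host} will NOT be resolved
     unless the child context declares its own configurer -->
<bean id="mailer" class="com.example.Mailer">
    <property name="host" value="${smtp.host}"/>
</bean>
```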

Here...this guy says it better than I do:


Thursday, July 09, 2009

Command line find & replace

Thanks to http://schestowitz.com/Software/Search_and_Replace/, I re-remembered how to find and replace across a suite of files (either recursively or just within a directory).

find . -maxdepth 1 -type f -name '*.html' -print |
while read filename; do
  sed 's/old/new/i' "$filename" > "$filename.xxxxx"
  mv "$filename.xxxxx" "$filename" # replace the original with the edited copy
done

You can rock it with just an "ls" instead of a "find", etc.

So for example I helped Anh out by changing a ton of YAML test case definition files that needed updating after a big schema change:

ls |
while read filename; do
  sed 's/pdf_document/document_path/g' "$filename" > "$filename.xxx"
  mv "$filename.xxx" "$filename"
done
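
If your sed is GNU sed, a variation (my addition, not from the original snippet) skips the temp-file-and-mv shuffle entirely with the -i (in-place) flag, assuming the YAML files end in .yml:

```shell
# In-place edit with GNU sed: no temp file or mv needed.
find . -maxdepth 1 -type f -name '*.yml' -exec sed -i 's/pdf_document/document_path/g' {} +
```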

Monday, June 29, 2009

Double port forwarding

I had a problem today that involved a couple levels of SSH port forwarding...

I was evaluating an email vendor who had asked what machine they should expect my test emails to be sent from. I replied with our dev02 server, thinking that I'd just create a little web page on our reportapp that you could "click button to send mime email" or something.

In the end, I wound up writing the email test as a JUnit test case, which I could execute on demand via NetBeans on my localhost.

So the problem was the vendor was expecting traffic from dev02, but I was only generating traffic from localhost. Complicating the matter was that I couldn't get directly to dev02 from localhost...I had to go through another server named "gateway".

  1. Tunnel my localhost's port 5555 to gateway's port 5555:
    ssh -AL 5555:localhost:5555 my.name@gateway.mycompany.com

    Technically, I think the "-A" flag isn't necessary (it just forwards my SSH authentication agent, which I have set up so I don't need to type a password)

  2. The first command got me out to gateway with the 5555 tunnel open. Now tunnel gateway's 5555 port to dev02 and have dev02 forward that port on to the email vendor's machine and port (vendor.example.com below stands in for the vendor's actual hostname):
    ssh -L 5555:vendor.example.com:9936 my.name@dev02.mycompany.com

  3. Then in my unit test, I can send mail to a JavaMailSenderImpl set up with host "localhost" and port "5555", and it ends up being sent to the vendor's machine on port 9936; the vendor sees the traffic as dev02's IP and lets it through the firewall.

Note that I'm pretty sure you can combine steps 1 and 2 into a single step, since ssh can take a command to run on the remote box (and that remote command can itself be another ssh port-forwarding command). Something like this . . . though use at your own risk (vendor.example.com is a stand-in for the vendor's actual hostname):

ssh -AnfL 5555:localhost:5555 my.name@gateway.mycompany.com \
"ssh -L 5555:vendor.example.com:9936 my.name@dev02.mycompany.com"

Maybe try adding a "sleep 30" at the end of that command (inside the double quotes) if the -nf flags cause it to exit immediately.

Friday, June 26, 2009

Reminding myself about Base64

I've learned and forgotten Base64 a couple times now, and after I re-looked up how it worked, I thought I'd take advantage of the old adage about "you remember 80% of what you write" by blogging my own explanation of Base64.

Here goes...

Let's say you wanted to send the string "{|}" in an email as some sort of crude emoticon. Those are 3 characters that are in the ASCII character set, but they aren't in the Base64 encoding table. They could just as easily be 3 non-printable characters or maybe multi-byte UTF-16 character points, but let's just stick with {|} for this example.

The 3 bytes for those ASCII characters are:
{ = 7B = 123 = 01111011
| = 7C = 124 = 01111100
} = 7D = 125 = 01111101

Stringing those 3 bytes together into a single 24-bit stream gives:

011110110111110001111101

The number 24 is both (3 * 8) and (4 * 6), so splitting it into 4 x 6-bit chunks yields:
011110 = 30
110111 = 55
110001 = 49
111101 = 61

Now looking those up in the Base64 conversion chart yields:
011110 = 30 = e
110111 = 55 = 3
110001 = 49 = x
111101 = 61 = 9

So "{|}" is "e3x9" in Base64 encoding.
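
You can double-check that result from a shell, assuming the coreutils base64 tool is installed:

```shell
# Encode the three bytes 7B 7C 7D; prints e3x9
printf '{|}' | base64
```

Note there's no "=" padding because 3 input bytes map exactly onto 4 output characters.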

I found this nifty base64 encoder/decoder site to validate my guerrilla math.

Monday, May 18, 2009

DBUnit does what??

I found out today that if you don't declare all the columns you want to use in the very first row of a DBUnit flat XML file, then you can't supply values for the undeclared columns in any later rows.

I knew DBUnit nulled out things I didn't declare by default, but I didn't realize you couldn't define them later. For example, if the cancel_date is null, that has a special meaning...so I defined this:

<dataset>
  <commitment id="1" customer_id="1" sign_up_date="2009-1-1" start_date="2009-2-1" end_date="2009-4-30" percent="0.05"/>
  <commitment id="2" customer_id="1" sign_up_date="2008-11-15" start_date="2008-11-1" end_date="2009-1-1" percent="0.10"/>
  <commitment id="3" customer_id="2" sign_up_date="2005-2-22" start_date="2005-3-1" end_date="2020-7-30" percent="0.20" cancel_date="2009-5-15"/>
</dataset>

Because "cancel_date" only shows up on the 3rd row (and not explicitly in the first row where I'd set it to null for declarative purposes only), DBUnit passes cancel_date as null even when parsing the 3rd row.

If I instead re-order my rows so that the row with the superset of attributes I want to use shows up first, then it's ok (e.g., see below)

Note that I couldn't find an easy way of just saying "cancel_date=[null]" on the initial rows #1 and #2 in the above block, or I would have done that instead of re-ordering the rows.

<dataset>
  <commitment id="1" customer_id="2" sign_up_date="2005-2-22" start_date="2005-3-1" end_date="2020-7-30" percent="0.20" cancel_date="2009-5-15"/>
  <commitment id="2" customer_id="1" sign_up_date="2009-1-1" start_date="2009-2-1" end_date="2009-4-30" percent="0.05"/>
  <commitment id="3" customer_id="1" sign_up_date="2008-11-15" start_date="2008-11-1" end_date="2009-1-1" percent="0.10"/>
</dataset>

Thursday, January 29, 2009

Quartz Scheduler Not Working? Don't forget lazy-init=false

I followed the line-by-line instructions in the Spring 2.5.x documentation on integrating Quartz with Spring and couldn't, for the life of me, get the Quartz jobs to run.

Long story short, try throwing in a lazy-init="false" in your configuration for the SchedulerFactoryBean.

Here's my configuration for automatically re-loading a bunch of XML files off the file system every 120 seconds:

<bean id="abstractDAORefreshTrigger" class="org.springframework.scheduling.quartz.SimpleTriggerBean" abstract="true">
    <!-- Delay 10 seconds before starting -->
    <property name="startDelay" value="10000"/>
    <!-- Then run every 120 seconds thereafter -->
    <property name="repeatInterval" value="120000"/>
</bean>

<bean id="tipCacheRefreshTrigger" parent="abstractDAORefreshTrigger">
    <property name="jobDetail" ref="tipCacheRefreshJobDetail"/>
</bean>

<!-- A list of Triggers to be scheduled and executed by Quartz -->
<bean lazy-init="false" class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
    <property name="autoStartup" value="true"/>
    <property name="triggers">
        <list>
            <ref bean="tipCacheRefreshTrigger"/>
        </list>
    </property>
</bean>

<bean id="tipCacheRefreshJobDetail" class="org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean">
    <property name="targetObject" ref="tipCache"/>
    <property name="targetMethod" value="refresh"/>
    <!-- Don't run 2 of these jobs at the same time! -->
    <property name="concurrent" value="false"/>
</bean>