It’s been a while since I posted something, but that’s just because I’ve been swamped at work. One of the main reasons that I even post here is so that I can remember how I did something later on down the line. Here’s a perfect example. A while back I wanted to make a quarterly average of a 2D time series (i.e. average a 2D field every three months). You can make climatologies in Ferret, but here I wanted a subset to average over, not the entire time range. One thing that seems to work is to do a 3 month average starting from the middle month of the range you want. The example below makes a three month average of a SeaWiFS chlorophyll-a field for October – December 1997:

let swseas = CHLA[l=3:200:3@AVE]

This starts at month 3 in the time series (Nov 1997 in SeaWiFS) and goes to the end of the series (yes, 200 is past the end, but that’s OK), averaging every three months.

It almost seems to make more sense to start at October and step forward every three months, but that doesn’t work, as the average must be centered on the middle node…


yes? list CHLA[x=190,y=35,l=2:4]
VARIABLE : Chlorophyll-a Concentration (Milligrams per cubic meter)
FILENAME : chla-SeaWiFS_Monthly_Chla
FILEPATH : las-FDS/LAS/SeaWiFS_Monthly_Chla/
SUBSET : 3 points (TIME)
18-OCT-1997 / 2: 0.1265
17-NOV-1997 / 3: 0.1700
18-DEC-1997 / 4: 0.2466

Average is 0.181033

yes? list CHLA[x=190,y=35,l=2:4:3@AVE]
VARIABLE : Chlorophyll-a Concentration (Milligrams per cubic meter)
regrid: 2192 hour on T@AVE
FILENAME : chla-SeaWiFS_Monthly_Chla
FILEPATH : las-FDS/LAS/SeaWiFS_Monthly_Chla/
TIME : 18-OCT-1997 06:30

yes? list CHLA[x=190,y=35,l=3:5:3@AVE]
VARIABLE : Chlorophyll-a Concentration (Milligrams per cubic meter)
regrid: 2192 hour on T@AVE
FILENAME : chla-SeaWiFS_Monthly_Chla
FILEPATH : las-FDS/LAS/SeaWiFS_Monthly_Chla/
TIME : 17-NOV-1997 17:00
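As a sanity check, the same every-three-months block averaging is easy to sketch in NumPy (synthetic data here, not the actual SeaWiFS values):

```python
import numpy as np

# synthetic monthly series standing in for CHLA at one grid point
chla = np.arange(12, dtype=float)  # 12 months of fake data

# average non-overlapping 3-month blocks, e.g. Oct-Dec, Jan-Mar, ...
nblocks = chla.size // 3
quarterly = chla[:nblocks * 3].reshape(nblocks, 3).mean(axis=1)
print(quarterly.tolist())  # [1.0, 4.0, 7.0, 10.0]
```

Each output value is the mean of one three-month block, which is exactly what the centered l=3:200:3@AVE trick produces in Ferret.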

OK, another night, another trial. I must say, tonight was a lot more fun than the last couple of nights, because I really felt that I learned something, which is really the whole point of this exercise. So the example I was trying to code tonight is a simple EOF of a 3D data series. This is something that I just had to code up at work today, so it was a perfect chance for me to try out Sage. For work I ended up altering an existing m-file and running the EOFs in Matlab, but that’s OK, because now I know what I expect to see after running this in Sage.
The data names have been changed to protect the innocent.

# Load in required modules
sage: from import *
sage: from pylab import *
sage: from scipy.stats.stats import nanmean
sage: import datetime

#Load data from NetCDF file
sage: ncfile = netcdf_file('','r')
sage: varnames = ncfile.variables.keys()
sage: varnames


#Now that I have the order I can load into arrays
sage: lon = ncfile.variables[varnames[0]][:]
sage: lat = ncfile.variables[varnames[2]][:]
sage: dates = ncfile.variables[varnames[1]][:]
sage: raw = ncfile.variables[varnames[3]][:,0:50,:] #I only want 50 records in Y
sage: data = raw.copy() #make a copy
sage: data.shape
(124, 50, 151)
sage: (ncycles, ny, nx) = data.shape

#deal with dates
sage: ncfile.variables[varnames[1]].attributes

{'axis': 'TIME',
'time_origin': '15-JAN-1901 00:00:00',
'units': 'HOURS since 1901-01-15 00:00:00'}

sage: off = datetime.datetime(1901,1,15,0,0,0)
sage: months = ones(ncycles)

sage: for i in range(0,ncycles):
....tdel = datetime.timedelta(days=dates[i]/24)
....td = off + tdel
....months[i] = td.month

sage: ind = where(raw<0)
sage: data[ind] = nan

And here was the first real bottleneck, as things just slowed to a crawl as python tried to find all the instances where the data was less than zero. This is something that is instantaneous in Matlab, and took over 30 seconds to go through 124*50*151 values. There must be a faster way to do this.
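My best guess at a faster way (not something I tried in the session above) is to skip where() entirely and assign through the boolean mask, which avoids building the tuple of index arrays:

```python
import numpy as np

# synthetic stand-in for the (124, 50, 151) chlorophyll array
raw = np.random.randn(124, 50, 151).astype(np.float32)
data = raw.copy()

# boolean-mask assignment in one shot; no intermediate index arrays
data[raw < 0] = np.nan

# every negative value is masked, and nothing else is
print(np.isnan(data).sum() == (raw < 0).sum())  # True
```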

#Take out monthly averages
sage: data2 = data.copy() #data2 holds the deseasoned field
sage: for i in range(1,13):
....index = where(months==i)[0]
....mclim = nanmean(data[index,:,:]) #monthly climatology, shape (ny,nx)
....data2[index,:,:] = data[index,:,:] - mclim

sage: data2.shape = (ncycles, nx*ny)
sage: ltmean = nanmean(data2) #get mean of each time series

#take out long term mean
sage: anom = data2.copy()
sage: for i in range(0,ncycles):
....anom[i,:] = data2[i,:] - ltmean

sage: EOF = nan_to_num(anom) #push land back to zero
sage: [u,s,v] = linalg.svd(EOF)
sage: s2 = zeros((ncycles,ncycles)) #diagonal matrix of singular values
sage: for i in range(0,ncycles):#build array so that we can project eigenvalues back onto timeseries
....s2[i,i] = s[i]
sage: amp = dot(s2.transpose(),u.transpose()) #get amplitude
sage: spatial = v[0:4,:]# pull out spatial fields
sage: ratios = pow(s,2)/sum(pow(s,2))*100 #get %variance explained for each mode
sage: temp = spatial[0,:]
sage: temp.shape = (ny,nx) #push back to original dims
sage: plot(amp)
sage: savefig('amplitude.png')
sage: imshow(flipud(temp))
sage: savefig('spatial.png')
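For reference, the whole recipe above boils down to a few lines. Here’s a self-contained sketch on a small synthetic array (nothing in it comes from the real dataset), using the fact that for an anomaly matrix X = U S V', the amplitude time series are the columns of U*S and the rows of V' are the spatial modes:

```python
import numpy as np

ncycles, ny, nx = 24, 5, 7
rng = np.random.default_rng(0)
data = rng.standard_normal((ncycles, ny, nx))

# flatten space, remove the long-term mean of each pixel's time series
flat = data.reshape(ncycles, ny * nx)
anom = flat - flat.mean(axis=0)

# SVD: rows of v are spatial modes, u*s gives amplitude time series
u, s, v = np.linalg.svd(anom, full_matrices=False)
amp = u * s                         # (ncycles, nmodes) amplitudes
spatial = v[0].reshape(ny, nx)      # leading spatial mode, back on the grid
ratios = s**2 / np.sum(s**2) * 100  # percent variance explained per mode

print(ratios[0] > ratios[-1])  # True; modes come out ordered by variance
```

With full_matrices=False the shapes stay small, which sidesteps the giant v array you get from a full SVD of a long, wide matrix.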


I felt really positive about this whole example, as I learned a lot more. This was probably too large an array to test with (measure twice, cut once!), but it’s what I was working with, and I wanted a real world example. The more I worked in Sage, the more comfortable I felt. The geographic projection issue is still there, as are some indexing speed issues, but overall I was really impressed with the Sage/SciPy/NumPy experience today. I feel that more of a transition was made for me last night/today, which was great timing, as a co-worker actually called me and asked if I knew of any free replacements for Matlab…


This was quite possibly the worst idea for title naming that I could have thought of. Anyway, I played around a bit more tonight, and I thought that I would give an update to the three people that are waiting with bated breath.

Anywho, I decided to continue trying to map the data from the netcdf file onto a projection, and here’s what I ran into.

It looks like the basemap module is installed (as basemap), but it depends on matplotlib >= 0.98, while 0.91 is installed. I tried to be tricky and move my locally installed matplotlib over to the sage/local/lib/python2.5/site-packages directory, but then that version of matplotlib needed a newer version of numpy than what was installed. At this point I tried

hostname $> sage -upgrade

to see if updated packages/modules were available. This started a huge chain reaction of downloads and source compiling to get to the latest, greatest versions. This process took exactly 59m10.482s to complete (I know because it told me!).

But once again, I get this error:

sage: from basemap import basemap

ImportError: your matplotlib is too old – basemap requires version 0.98 or higher, you have version 0.91.1

At this point though, it’s not working on either the linux or OSX platforms due to outdated dependencies, so either I need to find another way to plot mapped projections or use something else.

Again, this isn’t a knock against Sage, because I really don’t think that this is an ideal test for this software. But honestly, a lot of why I went for this approach was to avoid having to use separate approaches for data manipulation and visualization, and this would be a common task. Matlab’s mapping toolbox is useless to me for plotting, so I end up using m_map, which is still not as good as GMT, but it gets the job done in house.

My main thoughts at this point are that it seems easy to get into dependency hell here, as one module upgrade can force another, and so on. At this point it’s another block of time spent on setup, and no result. Time to stop for the time being.


Part 1 of the sage experience was just installing the software. This was incredibly easy on both OSX and linux (CentOS 5.2 and Fedora 9). For the Fedora 9 install I just downloaded the latest version of Sage which was compiled for Fedora 8, and this seemed to be just fine.

So for me, I really just wanted to be able to do a few different examples which would be close to “real world applications” for me.

Some things that I would like to be able to do in sage:

1. Load in a 2-D NetCDF satellite data file and display it as a map projection. This should be really simple. I would usually just use GMT for this (a small shell script wrapping psbasemap, grdimage, and pscoast).

2. Load in a data series with dates and locations, and match this to corresponding satellite data in time and space. Normally I would use a perl script that I wrote many moons ago to do this. I would basically sort the data, then match a block of data at a time using GMT’s grdtrack function. I know that this is inefficient, and really I would like to be able to pull extra data in x,y, or t and take the mean or median value, which would be more CPU intensive, but better than matching just one point in space and time to the nearest pixel.

3. Load in a multivariate data series and do multivariate statistics (e.g. LME, GLM/GAM, RDA). This is where the R interface would come into play. Normally I would prepare the data elsewhere, then import the flat table into R and use the R functions. This may involve installing more packages (nlme, mgcv, etc).

4. Load in a 3-D set (x,y,t) of satellite data files and perform an EOF analysis on them (akin to SVD in Matlab). Normally I would do this in Matlab or Ferret. I’m just curious how easy it would be to do this here.

There are other things that I could do, but these are a few off the top of my head, and things that I am doing now, so it would be incentive to try Sage out with. For tonight, I’ll just work on #1, which should be really fast.
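To give an idea of what #2 might look like without grdtrack, here’s a nearest-pixel matchup sketch. Every name and number in it is invented for illustration, and for large point sets a KD-tree lookup would scale much better:

```python
import numpy as np

# hypothetical satellite grid and one made-up in-situ observation
lon = np.arange(160.0, 230.0, 0.1)
lat = np.arange(12.5, 32.5, 0.1)
sst = np.random.rand(lat.size, lon.size)

obs_lon, obs_lat = 190.03, 21.97  # fake ship position

# nearest-pixel match: closest grid node in each dimension
ix = np.abs(lon - obs_lon).argmin()
iy = np.abs(lat - obs_lat).argmin()
matched = sst[iy, ix]

# pulling a 3x3 box and taking the median is the "extra data" variant
box = sst[max(iy-1, 0):iy+2, max(ix-1, 0):ix+2]
matched_med = np.median(box)
```

The box-median version is what I’d actually want: a little more CPU, but more robust than matching one point in space and time to a single pixel.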

The data file I’m using is just a NetCDF file (created by GMT) which I can read with pupynere in python. Here I’m going to use the module (which is actually based on pupynere I believe).

sage: from import *
sage: from pylab import *

# Read in file metadata to object
sage: ncfile = netcdf_file('RS2006001_2006031_sst.grd','r')

# get the variables in the data file
sage: ncfile.variables

{'x': < object at 0xb47b08c>,
'y': < object at 0xb47b16c>,
'z': < object at 0xb47b1ec>}

# Yank out data
sage: longitude = ncfile.variables['x'][:]
sage: latitude = ncfile.variables['y'][:]
sage: sst = ncfile.variables['z'][:]

# just plot sst to test 2D image plotting
sage: imshow(sst)
[<matplotlib.AxesImage instance at 0xc03636c>]

Nice, but it’s upside down. Let’s flip it vertically.

sage: clf()
sage: imshow(flipud(sst))
[<matplotlib.AxesImage instance at 0xb86a2ac>]
sage: savefig('temp.png')


Easy, but I want to put this on a projection. Normally I would use the basemap toolkit, an add-on to matplotlib. I don’t see it installed, and I didn’t see it in the extra Sage packages online, so I downloaded it from SourceForge and installed it.

The first step is to install the geos package: just read the README in the geos folder and run


and then we get our first epic fail. Something in the geos chain won’t compile, and I’m just about fried enough to call it quits for this evening.

At this point I’ve been playing with this for more than 2 hours, and I have yet to make a simple map on a projection. There has to be something I’m missing, but I’m going to pause until tomorrow. So not the best testing evening, but there are some positives so far. The bundling of most packages is a plus, and the ease of loading NetCDF files is nice. Data displays well using the Pylab interface, even though I am still forced to save to a file at this point.

So immediate goals:

1. Get a backend working for viewing plots in widgets (akin to ipython -pylab)

2. Get the basemap tools installed so that I can make a map with a projection!


In an earlier post I alluded to some thoughts that I had on proprietary vs. free open source systems for dealing with scientific data. Somewhat out of time constraints and somewhat out of laziness, I had decided to pretty much just stick with the proprietary status quo system that I had in place, rather than commit the time to learning something new. The double shot of my activation woes with Matlab and my recent realization of how much I had been reinventing the wheel by not using Ferret has given me a new perspective, and after a recent comment inviting me to try out the Sage open source mathematical software, I figured that it was time to give it a try.

As Sage is “a free mathematics software system which combines the power of many existing open-source packages into a common Python-based interface”, that means that there are many libraries bundled together. These include ipython, numpy, scipy, R, RPy, and more. While I already had many of these installed on my Macbook, rather than look for a solution to cobble things together to save space, I just took the easy route and downloaded the .dmg file to keep everything together in one environment. I’m not sure if it’s possible (or even wise) to use existing components (probably not) but I figured that it would be more hassle than it was worth.

The .dmg file is fairly large (~256 MB), but smaller than the last Matlab release, which puts things in perspective. The download also only took about 6 minutes over wireless (>500 KB/sec). Installation was as simple as dragging the sage folder from the mounted .dmg file into my Applications folder. Expanded, the folder was pretty hefty (~780 MB), larger than the last Matlab install (~680 MB), but not by much.

The next step in the installation instructed me to double click the sage icon (inside the Sage folder), and then change some preferences. When I double clicked it, however, a terminal fired up and Sage started to work:

[user@localbox ~]$ /Applications/sage/sage ; exit;
| SAGE Version 3.0.2, Release Date: 2008-05-24 |
| Type notebook() for the GUI, and license() for information. |
The SAGE install tree may have moved.
Regenerating Python.pyo and .pyc files that hardcode the install PATH (please wait at most a few minutes)…
Please do not interrupt this.

Setting permissions of DOT_SAGE directory so only you can read and write it.

After this step I was presented with the sage prompt. Typing in

sage: notebook()

prompted me to create an admin password, and then opened the sage notebook GUI in my default web browser. I closed this down and then went back to the command prompt to start to play with the Tutorial.

I’m going to stop here for now, because it’s late, but I’ll post more after playing around with Sage a bit. I’ve been through the tutorial a bit, and plotting works, yet on the surface there seems to be no way to view plots “on the fly” with matplotlib; instead you have to save and view with an external viewer. Just an extra step, but I’m used to having graphics on the fly with ipython -pylab. As I said though, I just wanted to talk about installation. Once I give it a fair shake I’ll post up what I think so far.


OK, so I spent a bit of time yesterday thinking about some alternatives to Matlab versus the time/money balance of learning a completely new system. Today’s adventure activating Matlab 2008a has me definitely leaning in the “alternative” direction.

A little background on Matlab 2008a and the activation model. With 2008a Matlab has moved to an activation model, where you have to basically register your computer with The Math Works. With an individual license there are two choices you have: lock the license to a computer, or lock it to an individual. As I use Matlab at home, at work, and on my laptop I chose to lock it to myself.

Now Matlab 2008a came out in February 2008, and I’m just trying to install it now, which is all based on my fear of activation, since usually I am eager to download and install the latest version. So I bit the bullet yesterday and installed the OSX (laptop) and linux versions. Usually I get a nice CD package with installations for all platforms but that must have ended as well.

So this morning I figured that I would give installation a go. What is usually a 5-10 minute affair took me 90+ minutes, and it still doesn’t work on the linux box. Basically what would happen is that I would go through the automated steps (generate the file key, license file, download and give info where needed, rinse, repeat) and watch Matlab completely have a hissy fit that either my username didn’t match, or my host ID didn’t match, or that I didn’t have the desk pointed northeast (well, not that last one). I did my civic duty and dug around the troubleshooting site, where I found exactly three entries for activation problems. This must mean that I am completely unlucky to both have a problem and to have a non-standard problem that could not be fixed with answers #1-3.

At any rate, time is money, right? So I figured I would give tech support a call. I got through on the second try, and actually got a human who was surprisingly helpful. The main problem on both machines was that the host ID generated was from the MAC address of the active internet connection, which is not what the license manager wants to see. So the laptop fix was an easy one after all: just use the MAC address of the primary connection, which was the wired connection (I was using the wireless at the time). In hindsight, yeah, that makes sense, but there’s still too much voodoo involved for me.

The linux box was worse, where I do not have eth0 enabled since I have problems with it, and I use eth1 instead. So while to the operating system there is only eth1 (as far as I understand) the license file wants to communicate with eth0, which of course doesn’t exist in the software space (now I may be way off on this, but this is what I could come up with for now). So my options are to either rename eth1 to eth0, or activate eth0. Well guess what, at this point I’m inclined to just not run Matlab 2008a since I don’t really feel like risking the possibility of borking my internet connection just to get the latest version of Matlab running. So remind me again why I am paying those maintenance fees? Maybe I’m no activation super genius, but should it really be this difficult to get software that you’ve paid good money for to run? Sure, there’s still an open thread with tech support but how much more time do I want to spend on this today? Well that’s easy, none.

Now I may feel differently in a bit and try again. Or I may just drink more coffee and start installing Sage or Enthought and see how that goes. Or maybe I’ll just try to get some work done.


I’ve been a Matlab user for 15 years, and over that time period I’ve of course become fairly dependent on it to get things done quickly. The downside? It’s expensive. It’s a pretty penny to buy the base package, toolboxes are extra, and there are recurrent “maintenance” costs each year to get upgrades.

Sure, that’s standard practice, but each year I have to stand up and justify to my boss why we need to pay these costs for our multiple Matlab users in our shop (a multi-user concurrent license is out of the question, don’t even ask). So what’s a user to do?

For years we’ve just bitten the bullet and paid the fee, but with options such as R and NumPy/SciPy out there, it may be time to loosen the chain a bit. Or maybe not.

A couple of possible alternatives to Matlab and their respective pros and cons:


R is a really nice statistical environment which has pretty much become the industry standard, replacing the very expensive S-Plus. It’s easy to install, has an excellent GUI on OS X, and has a ton of community released packages which are usually made during the preparation of scientific papers. There are some downsides, as there can be multiple (sometimes possibly conflicting) packages (e.g. gam vs mgcv) but choice is good, right? The cons for me are that it’s a new language to learn, and even though I write an m-script for everything, I find the scripting in R a bit clunky, even writing in TextWrangler and then hitting CTRL-R to have the SendToR script source the code for me. It’s just something new, and while the built in functions are really nice, the learning curve for coding things is higher, and will it be faster in the long run than just using Matlab?


The Numpy/SciPy combo in Python is a viable alternative to Matlab, even having a page dedicated to showing you how easy it is to transition from Matlab. As with R, it’s free, and there are a ton of functions available, but there is a downside for me. I’ve successfully installed it on CentOS 5.1 and OS X 10.5, but it was a bit complicated. I know that these are packaged in many distributions, but not in CentOS, and I had to install from either source or .egg files, which isn’t all that tough, but took some time. I’m not writing the 24.3 steps I did to get it installed because honestly, I didn’t write it down and I don’t remember what I did. Next time I promise to list it out! On OS X I did it all through MacPorts on the MP version of python 2.5. Again, it took some massaging to get it all set up since I was using the non-default install of python.

Overall though, the reason for this little diatribe is that while there are alternatives to Matlab, they all involve learning new ways to do things which, after I successfully learn them, may not be faster than just doing it in Matlab. Most of the time I just need to get things done, and the $7/day cost of Matlab may be well worth it if I’m saving more than 10 minutes of time during that day (assuming for a minute that I am earning $42/hour).

I’m rambling a bit here, but these are just questions that I ask myself as I code things up at the desk. Each of these tools has its place, and in terms of maximum comfort and speed, I use each for its strengths. The main dilemma is that in a perfect situation I would drop the commercial Matlab for the free/open source alternatives, but at a minimal cost in dollars and time.


Playing around a bit more with Ferret, one of the things I wanted to do was load a binary flat file of pixels, where each pixel represents the area of a 0.1 x 0.1 degree cell (which of course changes with longitude and latitude).

To do this I wrote out the data as real*4 floating point numbers in a single column, running from south to north and then west to east. This is equivalent to the Generic Mapping Tools (GMT) command

grd2xyz -R159.9/230/12.4/32.5 -ZBLf area.grd > area.bin

where the grid spacing is x = 0.1 and y = 0.1

To load this into Ferret (on a linux system):

DEFINE AXIS/X=160:230:0.1 x10
DEFINE AXIS/Y=12.5:32.5:0.1 y10
DEFINE GRID/X=x10/Y=y10 g10
FILE/VAR=MYVAR/GRID=g10/FORMAT=stream area.bin
shade MYVAR ! To view

The only difference on the Mac was that I had to byte swap, which is done with a /swap qualifier on the loading line:

FILE/VAR=MYVAR/GRID=g10/FORMAT=stream/swap area.bin
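The same flat file can be round-tripped from NumPy as a sanity check. This is just a sketch: the grid shape follows from the axis definitions above, and swapping the dtype to '>f4' covers the byte-order case that needed /swap in Ferret:

```python
import numpy as np
import os
import tempfile

nx, ny = 701, 201  # from X=160:230:0.1 and Y=12.5:32.5:0.1 above

# fabricate a little-endian float32 file laid out the same way:
# one column per longitude, each column running south to north
path = os.path.join(tempfile.mkdtemp(), 'area.bin')
np.arange(nx * ny, dtype='<f4').tofile(path)

# read it back; dtype='>f4' would handle the byte-swapped case
flat = np.fromfile(path, dtype='<f4')
grid = flat.reshape(nx, ny).T  # -> (ny, nx) with south-to-north rows
print(grid.shape)  # (201, 701)
```

The reshape-then-transpose puts longitude on the fast axis of the final array, matching what shade expects from the g10 grid.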

Blogged with the Flock Browser


Just a mini post here, but as I would really like to move the home video editing I do over to Linux, I have been collecting links to get the ball rolling.

Specifically, what I am looking at right now are:

Cinelerra or Kino for video editing
Gaupol for subtitles
Dvdauthor to put it all together

There’s a nice guide here on one user’s experience with Kino that I might check out.

I’ll post more as I play…

With the release of CentOS 5.2 I upgraded both my home server and my work machine. Ironically, while there is a known issue with kernel panics on older architecture machines (like my PIII home server), it was my core duo work machine which refused to boot with kernel 2.6.18-92.1.6.el5.

The workaround? I am now using 2.6.18-53.1.21.el5…

Other than that, things seem to have worked well. I appreciate the updates in many programs such as Open Office, yet others such as Firefox were not necessary, as I had already installed FF3 locally.

