Coding Tips


Just an update on the Matlab situation. I recently renewed my maintenance fees, but dropped support for the optimization, mapping, and GARCH toolboxes, as I never used them. The only thing I ever used the mapping toolbox for was calculating distances, and that got busted at some point. The free m_map software is a far superior toolbox, and did I mention it was free?

Part of the decision was that all of the analysis for the last chapter of my dissertation was done in R. Some data mining was done with Python, oceanographic transects/profiles in Ocean Data View, and maps in GMT, but all analysis was done in R. This was a revelation to me, mostly because I no longer had to write an m-script full of loops. I know that some of this is due to my bad habits in Matlab (writing loops instead of using the vector operations), but there are so many convenience functions in R that just do what I want, and I think that this reflects a change to a more statistically demanding analysis style that Matlab really just isn't built for.

This isn’t to say that one is better than the other, but with the cost and difficulty in getting Matlab up and running on new machines, R is definitely more than a viable option, and a joy to work with.


For a while I've been looking for a way to add error bars in R. It's actually not that trivial in some cases, and I think that I wrote my own m-script to do it in Matlab for bar plots.

At any rate, Google is my friend, and I found a really good post detailing how to do this. In a nutshell, just grab the gplots package and look at the plotCI function. Works great. The only thing I had to tweak was uiw, which is how far from the mean value you want the bar to extend (i.e. one S.E., not mean + S.E.).
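For comparison, the same idea in Python's matplotlib is a one-liner with errorbar, where yerr is, like uiw, the distance from the mean rather than the absolute endpoint. The group means and standard errors here are made up purely for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render to file; no display needed
import matplotlib.pyplot as plt

# Made-up group means and standard errors for illustration
means = [3.2, 4.1, 2.8]
se = [0.3, 0.5, 0.2]  # distance from the mean, not mean + S.E.

fig, ax = plt.subplots()
ax.errorbar(range(len(means)), means, yerr=se, fmt="o", capsize=4)
fig.savefig("errorbars.png")
```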

Nice.

At this point just a small update to say that since the activation fiasco, I have not used Matlab at all. Everything that I have done for the last four presentations has been done with R and Python, and I am both the happier and wiser for it.

I have been playing with JMP a bit, but honestly, it’s a bit too “high level” and while it’s neat for data exploration, it really made me nuts that two weeks after I got a license, they were *offering* me a special deal on the impending upgrade. Nothing like dropping a bundle on instantly outdated software. Way to go guys.

So for me, it's been a pleasure to use R, Python (with IPython of course), and Ferret on my Mac Pro. If I ever take a break from playing on the computer I'll post up what I installed on the new Mac in terms of scientific software.

A coworker recently approached me and asked if I knew how to make pie wedge plots in the Generic Mapping Tools using the -Sw switch. I had never done this before, but I thought it would be a cool thing to do, so I tried my hand at making one.

It was tougher than I thought, and while I have seen these types of plots quite a bit in the fisheries world, there didn’t seem to be any examples of how to do this. So I thought I would post up here what I did, both for myself in the future and for anyone else in the world who may be interested in this type of plot.

Basically, what I want to do is make a dummy plot with a pie chart centered in every 5×5 degree box, with the size of the outer circle based on the total number in the box and wedges representing percentages of that total amount.

For this exercise I am using the psxy routine of GMT 4.2.0 with my default MEASURE_UNIT = in.

To make this chart you have to have 5 columns of data:

Longitude — Latitude — Radius — StartAngle — EndAngle

So say I want a pie chart to represent how many types of widgets I sold in the area from 160-155W, 18-23N, with each wedge a portion of the widgets and centered in the middle of the 5×5 box. Here’s the data:

#lon lat blueWidget greenWidget redWidget
-157.5 20.5 200 200 400

So let's say I make a map where each inch = 1 degree. The largest I want any pie wedge diameter to be is 1 inch, so I know that for this example (only one data point) I will make the radius 0.5. I also know that the total for this example point is 800, so I convert the counts into angles. I actually have to make 3 rows of data now, since I have three widgets. I also converted the longitudes to 0-360 degrees.
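That count-to-angle arithmetic is easy to script. Here is a hypothetical Python sketch that turns a row of widget counts into cumulative start/end angles, reproducing the angles used in the data file below:

```python
# Convert widget counts into cumulative pie-wedge angles (degrees)
counts = {"blueWidget": 200, "greenWidget": 200, "redWidget": 400}

total = sum(counts.values())
rows, start = [], 0.0
for name, n in counts.items():
    end = start + 360.0 * n / total  # each wedge spans its share of 360
    rows.append((name, start, end))
    start = end  # next wedge starts where this one ended

for name, s, e in rows:
    print(f"{name}: {s:.0f} to {e:.0f}")
```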

$>cat pienc.xy
#lon lat radius startAngle endAngle
202.5 20.5 0.5 0 90 #end angle is 360 * (200/800)
202.5 20.5 0.5 90 180 #start angle is row-1 end angle
202.5 20.5 0.5 180 360 #Finish circle

So, a nice shell script to plot this up with the output below:

$>cat pienc
#!/bin/ksh
psfile=pie.ps
psbasemap -Jm1 -R200/206/17/23 -Bf1a1g1/f1a1g1WeSn -X1.5 -Y4 -P -K > $psfile
pscoast -Jm -R -O -K -Di -G200/200/200 -W1/0/0/0 >> $psfile
psxy pienc.xy -Sw -Jm -R -O -K >> $psfile
echo "203 24 12 0 0 6 Pie Chart Example" | pstext -Jm -R -O -N >> $psfile

$>display pie.ps

Pie Wedge Plot no Color

So that's all well and good, except it would be nice to have a different color for each wedge representing a different widget. To get color in there you have to add a new column of values that will be mapped to colors in a color lookup table (a cpt file in GMT). This must be the third column, and the -C switch must be given in the psxy call.

$>cat pie.xy
#lon lat COLORVALUE radius startAngle endAngle
202.5 20.5 1 0.5 0 90 #end angle is 360 * (200/800)
202.5 20.5 2 0.5 90 180 #start angle is row-1 end angle
202.5 20.5 3 0.5 180 360 #Finish circle

And my cptfile:

$>cat pie.cpt
0 0 0 255   1.1 0 0 255
1.1 0 255 0   2.1 0 255 0
2.1 255 0 0   3.1 255 0 0

The adjusted script:

$>cat pie
#!/bin/ksh
psfile=pie.ps
psbasemap -Jm1 -R200/206/17/23 -Bf1a1g1/f1a1g1WeSn -X1.5 -Y4 -P -K > $psfile
pscoast -Jm -R -O -K -Di -G200/200/200 -W1/0/0/0 >> $psfile
psxy pie.xy -Sw -Jm -R -O -K -Cpie.cpt >> $psfile
echo "203 24 12 0 0 6 Pie Chart Example with Color" | pstext -Jm -R -O -N >> $psfile

$>display pie.ps

Pie Wedge Plot with Color

And that’s pretty much it. Now to go sell some more widgets.

This was quite possibly the worst idea for title naming that I could have thought of. Anyway, I played around a bit more tonight, and I thought that I would give an update to the three people that are waiting with bated breath.

Anywho, I decided to continue trying to map the data from the netcdf file onto a projection, and here’s what I ran into.

It looks like the basemap module is installed (as basemap), but it depends on matplotlib >= 0.98 and 0.91 is installed. I tried to be tricky and moved my locally installed matplotlib over to the sage/local/lib/python2.5/site-packages directory, but then that version of matplotlib needed a newer version of numpy than what was installed. At this point I tried

hostname $> sage -upgrade

to see if updated packages/modules were available. This started a huge chain reaction of downloads and source compiling to get to the latest, greatest versions. This process took exactly 59m10.482s to complete (I know because it told me!).

But once again, I get this error:

sage: from basemap import basemap

ImportError: your matplotlib is too old – basemap requires version 0.98 or higher, you have version 0.91.1

At this point though, it’s not working on either the linux or OSX platforms due to outdated dependencies, so either I need to find another way to plot mapped projections or use something else.

Again, this isn't a knock against Sage, because I really don't think this is an ideal test for the software. But honestly, a lot of why I went for this approach was to avoid having to use separate tools for data manipulation and visualization, and this would be a common task. Matlab's mapping toolbox is useless to me for plotting, so I end up using m_map, which is still not as good as GMT, but it gets the job done in house.

My main thoughts at this point are that it seems easy to get into dependency hell here, as one module upgrade can force another, and so on. At this point it’s another block of time spent on setup, and no result. Time to stop for the time being.


Part 1 of the sage experience was just installing the software. This was incredibly easy on both OSX and linux (CentOS 5.2 and Fedora 9). For the Fedora 9 install I just downloaded the latest version of Sage which was compiled for Fedora 8, and this seemed to be just fine.

So for me, I really just wanted to be able to do a few different examples which would be close to “real world applications” for me.

Some things that I would like to be able to do in sage:

1. Load in a 2-D NetCDF satellite data file and display it as a map projection. This should be really simple. I would usually just use GMT for this (a small shell script wrapping psbasemap, grdimage, and pscoast).

2. Load in a data series with dates and locations, and match this to corresponding satellite data in time and space. Normally I would use a perl script that I wrote many moons ago to do this. I would basically sort the data, then match a block of data at a time using GMT’s grdtrack function. I know that this is inefficient, and really I would like to be able to pull extra data in x,y, or t and take the mean or median value, which would be more CPU intensive, but better than matching just one point in space and time to the nearest pixel.

3. Load in a multivariate data series and do multivariate statistics (e.g. LME, GLM/GAM, RDA). This is where the R interface would come into play. Normally I would prepare the data elsewhere, then import the flat table into R and use the R functions. This may involve installing more packages (nlme, mgcv, etc).

4. Load in a 3-D set (x,y,t) of satellite data files and perform an EOF analysis on them (akin to SVD in Matlab). Normally I would do this in Matlab or Ferret. I’m just curious how easy it would be to do this here.
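As a point of reference for #4, the bare-numpy version of an EOF analysis really is just an SVD on a demeaned space × time matrix. This is only a sketch on random stand-in data; the array sizes and variable names are all made up:

```python
import numpy as np

# Fake (t, y, x) stack standing in for monthly satellite fields
nt, ny, nx = 24, 10, 12
data = np.random.rand(nt, ny, nx)

# Reshape to (time, space) and remove the time mean at each grid point
X = data.reshape(nt, ny * nx)
X = X - X.mean(axis=0)

# SVD: rows of Vt are the spatial EOF patterns; U*S are the PC time series
U, S, Vt = np.linalg.svd(X, full_matrices=False)
eof1 = Vt[0].reshape(ny, nx)          # leading spatial pattern
variance_frac = S**2 / (S**2).sum()   # fraction of variance per mode
```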

There are other things that I could do, but these are a few off the top of my head, and things that I am doing now, so it would be incentive to try Sage out with. For tonight, I’ll just work on #1, which should be really fast.
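As an aside, the nearest-pixel half of #2 is also simple to sketch in numpy, assuming a regular grid; the grid, the random data, and the nearest_pixel helper here are all hypothetical, and a real version would pull a block of neighbors and take the mean or median as described above:

```python
import numpy as np

# Hypothetical regular satellite grid: 1-degree pixels, one time step
lon = np.arange(180.0, 210.0, 1.0)        # grid longitudes (0-360 convention)
lat = np.arange(10.0, 30.0, 1.0)          # grid latitudes
sst = np.random.rand(lat.size, lon.size)  # fake data for illustration

def nearest_pixel(x, y):
    """Match one observation location to the nearest grid pixel."""
    i = np.abs(lat - y).argmin()
    j = np.abs(lon - x).argmin()
    return sst[i, j]

value = nearest_pixel(202.3, 20.6)
```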

The data file I’m using is just a NetCDF file (created by GMT) which I can read with pupynere in python. Here I’m going to use the scipy.io.netcdf module (which is actually based on pupynere I believe).

sage: from scipy.io.netcdf import *
sage: from pylab import *

# Read in file metadata to object
sage: ncfile = netcdf_file('RS2006001_2006031_sst.grd', 'r')

# get the variables in the data file
sage: ncfile.variables

{'x': <scipy.io.netcdf.netcdf_variable object at 0xb47b08c>,
'y': <scipy.io.netcdf.netcdf_variable object at 0xb47b16c>,
'z': <scipy.io.netcdf.netcdf_variable object at 0xb47b1ec>}

# Yank out data
sage: longitude = ncfile.variables['x'][:]
sage: latitude = ncfile.variables['y'][:]
sage: sst = ncfile.variables['z'][:]

# just display sst to test 2D image plotting
sage: imshow(sst)
[<matplotlib.AxesImage instance at 0xc03636c>]

Nice, but it’s upside down. Let’s flip it vertically.


sage: clf()
sage: imshow(flipud(sst))
[<matplotlib.AxesImage instance at 0xb86a2ac>]
sage: savefig('temp.png')

RStest

Easy, but I want to put this on a projection. Normally I would use the basemap tools, which are an add-on to matplotlib. I don't see these installed, and I didn't see them in the extra sage packages online, so I downloaded them from SourceForge and installed them.

The first step is to install the geos package; just read the README in the geos folder and run

./configure
make

and then we get our first epic fail. Something in the geos chain won’t compile, and I’m just about fried enough to call it quits for this evening.

At this point I’ve been playing with this for more than 2 hours, and I still have yet to make a simple map on a projection. There has to be something I’m missing, but at this point I’m going to pause until tomorrow. So not the best testing evening, but there are some positives so far. The bundling of most packages is a plus, and the ease of loading in NetCDF files is nice. Data displays well using the Pylab interface, even though I am still forced to save to a file at this point.

So immediate goals:

1. Get a backend working for viewing plots in widgets (akin to ipython -pylab)

2. Get the basemap tools installed so that I can make a map with a projection!


In my travels one of the things that I end up doing is making seasonal climatologies out of 3-dimensional data sets (longitude, latitude, time). For example, I may have a series of monthly sea surface temperature images for the North Pacific from January 1950 through December 2007 and I want to get the spatial averages for each season. In other words, for each spatial point I want to average all the points in time for all January-March months (season 1), April-June months (season 2), etc.

What I would usually do is something kludgy: either write a shell script to cycle through all the months that I wanted to collapse, dump the data from the GMT NetCDF format into binary files, and then run blockmean and regrid them in GMT's NetCDF format; or, if the data was reasonable in size, just load it directly into Matlab, index the months that I wanted, subset the array, and do a nanmean(data,3).
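For comparison, here is roughly what that Matlab-style approach looks like in Python/numpy: build a month index, then average every time slice that falls in each season. The random data and array sizes are stand-ins for illustration, and the season groupings are the JFM/AMJ/JAS/OND ones described above:

```python
import numpy as np

# Fake monthly SST stack: 58 years x 12 months on a small grid
months = np.tile(np.arange(1, 13), 58)     # month number of each time slice
sst = np.random.rand(months.size, 20, 30)  # (time, lat, lon)

# Seasons as month groups: JFM, AMJ, JAS, OND
seasons = [(1, 2, 3), (4, 5, 6), (7, 8, 9), (10, 11, 12)]

# For each season, select the matching slices and average over time
clim = np.stack([sst[np.isin(months, s)].mean(axis=0) for s in seasons])
```

The result `clim` is a (4, lat, lon) array, one spatial average per season, which is the same shape of answer the Ferret approach below produces.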

Neither one of these methods is the easiest or most efficient way to do things, this I know. So with my newfound friend Ferret I figured that there must be a really simple, built-in way to make the climatologies from this XYT data, and there was.

Basically Ferret allows for a Modulo regridding to a built-in climatological time axis. What would have taken quite a bit of time in my previous attempts took me 10 seconds in Ferret.

! load example data set
SET DATA levitus_climatology
USE climatological_axes
LET sst_clim = SST[d=1,X=100:260,Y=10:60,GT=seasonal_reg@MOD]
CANCEL DATA climatological_axes
SHADE sst_clim[L=1] ! shade Quarter 1

This now provides an L=4 data set of seasonal climatologies centered in time on [15-Feb, 16-May, 16-Aug, 15-Nov].

So, while 10 years ago half the fun may have been trying to be clever and code a faster solution than the last one I had made, in my old age I tend to just want to get this portion done so that I can interpret the results. This is very nice indeed.

