Archive of articles classified as 'programming'

Back home

Skynet in Python

5/02/2009

After a long hiatus I have come back to doing some (extremely basic, I have to admit) Python coding. This xkcd comic is a timely reminder:

Well, that and minimization of the objective function.

Filed in miscellanea, programming No Comments

Generating dynamic Google maps with Python

1/02/2009

As I have mentioned before, I have been putting together some dynamically generated maps for environmental information. A barebones version of my Python code to generate the KML file is:

#!/usr/bin/env python
# encoding: utf-8
 
import urllib, random
 
# Charting function
def lineChart(data, size = '250x100'):
    baseURL = 'http://chart.apis.google.com/chart?cht=lc&chs='
    baseData = '&chd=t:'
    newData = ','.join(data)
    baseData = baseData + newData
    URL = baseURL + size + baseData    
    return URL
 
# Reading test data: connecting to server and extracting lines
f = urllib.urlopen('http://gis.someserver.com/TestData.csv')
stations = f.readlines()
kmlBody = ('')
 
for s in stations:
    data = s.strip().split(',')  # strip the trailing newline before splitting the CSV line
    # Generate random data
    a = []
    for r in range(60):
        a.append(str(round(random.gauss(50,10), 1)))
 
    chart = lineChart(a)
 
    # data is csv as station name (0), long (1), lat (2), y (3)
    kml = (
        '<Placemark>\n'
        '<name>%s</name>\n'
        '<description>\n'
        '<![CDATA[\n'
        '<p>Value: %s</p>\n'
        '<p><img src="%s" width="250" height="100" /></p>\n'
        ']]>\n'
        '</description>\n'
        '<Point>\n'
        '<coordinates>%f,%f</coordinates>\n'
        '</Point>\n'
        '</Placemark>\n'
        ) %(data[0], data[3], chart, float(data[1]), float(data[2]))
 
    kmlBody = kmlBody + kml
 
# Bits and pieces of the KML file
contentType = ('Content-Type: application/vnd.google-earth.kml+xml\n')
 
kmlHeader = ('<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n'
             '<kml xmlns=\"http://earth.google.com/kml/2.1\">\n'
             '<Document>\n')
 
kmlFooter = ('</Document>\n'
             '</kml>\n')
 
 
print contentType
print kmlHeader
print kmlBody
print kmlFooter

Well, this is not exactly barebones, because we also wanted to generate dynamic graphs for each placemark in the easiest possible way. My first idea was to use one of the many javascript libraries available on the net. However, a quick search revealed that KML files do not support javascript in the description tag. That was when I remembered playing with Google Charts a while ago. The lineChart function above is simply a call to create a line chart using the charts API. Because this is a test, I used 60 randomly generated data points, which explains the presence of random among the imported libraries.

Originally, I did not want to use javascript at all, so I inserted the code as a search in Google Maps, generating a link like http://maps.google.co.nz/maps?q=http://gis.someserver.com/dynamicmap.py. Just copy the address, send it to someone and, presto, they have access to my map. However, I wanted to embed it in a blog post and I was struggling to do it. The solution was to click on the ‘Link’ link in the generated map and copy the ‘Paste HTML to embed in website’ code. This gives an iframe block that can be pasted into any page or blog post.

While helping a friend to create another map, we faced the problem that the data set was being updated every five minutes. What was the problem? The map was not being refreshed often enough. I am not sure if the culprit was the browser cache or Google Maps, but it could be solved by calling the KML file with a random extra argument (the script does not take any arguments, so anything after the question mark is ignored). In my case I needed a frequently changing argument, so I used the current time (using the date would work for once-a-day updates). This meant inserting the map using javascript (and using a Google Maps key). The code for a simple page, from the header onwards, would look like:

<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8"/>
<title>A simple dynamic python generated map</title>
<script src="http://maps.google.com/maps?file=api&amp;v=2&amp;key=my_key"
  type="text/javascript"></script>
<script type="text/javascript">
    //<![CDATA[
 
    function load() {
      if (GBrowserIsCompatible()) {
        var map = new GMap2(document.getElementById("map"));
        map.setCenter(new GLatLng(-33.458943, -70.658569), 11);
        var pollution = new GGeoXml("http://gis.uncronopio.org/testmapscsv.py?"+
                        (new Date()).getTime());
        map.addOverlay(pollution);
      }
    }
    //]]>
</script>
</head>
<body onload="load()" onunload="GUnload()">
<div id="map" style="width:750px;height:600px"></div>
</body>

It was not too bad for mucking around on a Friday in between house chores.

Filed in geocoded, programming, web No Comments

Displaying air pollution data

2/07/2008

Last week I was contacted by my friend Marcelo about increasing awareness of air pollution problems in Santiago, Chile. He was becoming involved in the problem from a technical point of view (GIS and urban forestry). One of the main problems was the lack of proper information for decision making, so we decided to quickly put together a prototype. Today the page on particulate matter pollution went online.

[Image: ICAP.jpg]

The general process was relatively simple. CONAMA provides data on pollution in graphical form (see, for example, here). I had a quick look at the pages using Firebug, which showed that all the data used for the graphs was contained in one of the javascript files called by the page (variable.js). This meant I could obtain up-to-date pollution data by reading that file, which seems to be updated hourly.

The other component was the location of the air quality stations together with the coordinates of the polygon that marks the city boundary. Marcelo provided me with a KML file containing all the coordinates.

The really fun part was writing a Python script gluing all these components together. One of the advantages of working with such a great high-level language is its standard library, which makes chores like reading a file located on another web site very simple:

import urllib
f = urllib.urlopen('http://www.conama.cl/rm/airviro/hoy/variable.js')
lines = f.readlines()
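
Extracting the actual pollution values from variable.js is then mostly pattern matching. The snippet below is only a sketch of the idea: the variable names and the array layout inside variable.js are hypothetical, since the real format is not reproduced here.

import re

# Hypothetical layout: lines like  var pm10_lascondes = [12.3, 15.7, 18.2];
pattern = re.compile(r'var\s+(\w+)\s*=\s*\[([^\]]*)\]')
series = {}
for line in lines:
    match = pattern.search(line)
    if match:
        name, values = match.groups()
        series[name] = [float(v) for v in values.split(',') if v.strip()]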

Probably the most challenging part has been quickly learning the basics of KML (without having much free time to do so). The documentation for KML is OK, but the tutorial did not cover exactly what I was trying to do, so there was a fair amount of trial and error to get things working properly.

Overall, coming back to Python (which I started using in version 1.5) has been a lot of fun, particularly when one has a project of ’social value’.

Filed in chile, environment, geocoded, programming No Comments

Back to Python

7/10/2006

The eleventh of September again, but I am not thinking of 2001—which was terrible, I agree—but of 1973. I always stop for a minute to remember, as an old political campaign said: without hate, without fear, without violence. I was six and still remember.

I am preparing lectures, really busy. I took over the students after more than a semester and most of them appear to be clueless. I am going over a general review, again, because they have to learn. I am always listening to energetic music when preparing lectures or running analyses. This time it is Rammstein’s ‘Sehnsucht’. But, anyway, I also like to go back a few years, and Yes’s ‘Close to the Edge’ (yes, from 1972) is now coming from the headphones.

I start writing most of the ideas in Writeroom and then copy the lot to TeXShop. Packages included in the document preamble for the notes:

%! program = pdflatex
\documentclass[11pt]{article}
\usepackage{pslatex}
\usepackage{amsbsy}
\usepackage{graphicx}
\usepackage{geometry}
\geometry{a4paper}

Although the Statistics course uses Mendenhall & Sincich’s ‘Regression Analysis’ as the basic text, I am using quite a few other references:

  • Quinn and Keough’s ‘Experimental design and data analysis for biologists’.
  • Searle’s ‘Linear models’.
  • Harrell’s ‘Regression modeling strategies’.
  • Steel and Torrie’s ‘Principles and procedures of statistics’ (the first edition!).
  • Hamilton’s ‘Regression with graphics’.
  • Neter and Wasserman’s ‘Applied linear statistical models’.

In addition I am also preparing grant applications, so if you try to contact me please be understanding: I will not read my email (or act upon any emails) until Wednesday next week (around September 20th).

Moved to King Crimson’s ‘Discipline’. Now listening to ‘Elephant talk’… Talk is only talk…

Less than a month ago I bought a Panasonic DMC-TZ1 digital camera. Basically, it is a small ‘point and shoot’, but with a very nice 10x optical zoom. Not really a substitute for the Nikon D70 (our main camera these days), but with the advantage of having it with me most of the time.

It has quite decent image quality, a nice image stabilizer (really needed when using it as a telephoto) and a good screen. The flash is a bit flaky but, hey, it is tiny. Battery life is quite good and I really like the small charger.

The photo below belongs to the ‘first roll’, taken in the carpark at work (30 metres from my office).

Portrait: when you are not here

It all started last week with problems coming back from sleep mode. The MBP would wake up for a few seconds and then shut down completely. It was an annoyance, but given that I was still teaching and preparing lectures using the laptop, I just kept going. Last Monday 19th September it was happening often enough that I sent an email to the University’s IT support, for which I received an automatic reply on Tuesday saying ‘we got your message’.

Yesterday, Wednesday 21st September, the computer just ‘died’ four hours before my last lecture for STAT220. It would turn on and then shut down almost immediately. I had prepared the last lecture at home, without a chance for the automatic network backup that I get at work. Therefore, I spent the last few hours frantically trying to recreate the lecture (in PowerPoint rather than Keynote) on a borrowed HP laptop. Not the best final lecture, but at least a decent end for my participation in the course.

Marcela dropped the laptop at MagnumMac, the authorised Mac reseller in Christchurch. We were presented with the following options: for normal warranty work (which this certainly was) I would need to wait until next week for the technicians to have a look at the computer. That is, waiting at least from Wednesday until Monday before someone would even touch the computer, let alone repair it. The other option: spend NZ$75 so the technicians would put my computer at the front of the queue—skipping another 14 computers—and start diagnosing the problem. Not having the computer costs me far more than $75 per day, so I decided to pay, but the whole system seems wrong:

  • The computer is under warranty (and with additional AppleCare), so why should I pay to have it fixed in a reasonable amount of time?
  • MagnumMac’s technical support appears to be seriously understaffed if it takes them so long just to ‘have a look’.
  • At least in theory, people that choose not to pay the extra cost can be continuously pushed down the queue to have their computers repaired.
  • Apple did not offer any support so I could keep working while they repair my computer. My mechanic can lend me a car while he is fixing my vehicle, so how is it that a computer company cannot do the same? How do they expect me to continue working with my Keynote documents?
  • Several supposedly ‘lesser quality’ brands offer a much faster turnaround for repairs.

Apple has not found the problem yet: I was told—after I called a couple of times—that they have reproduced the problem but are still trying to figure out its cause. Once that happens, it will take several days for me to get my computer back (tomorrow is Friday already): normally they order the parts from Australia, which may take several days, even weeks.

This is the second time that my computer goes back to Apple: the first time was a battery problem (before the big battery recall). It has been very disappointing to have these problems, particularly when they seem to be so commonplace.

By the way, I am still waiting to receive an email written by humans from the University’s IT support.

P.S. 2006-09-27. I received my laptop back on Monday 25th. Apple’s technician explained to me that they were able to replicate the problem but not to pinpoint the cause. They replaced 512MB of third-party RAM with Apple-branded RAM, and that seems to have temporarily fixed the problem. Today I had a shutdown when running on battery (with supposedly 67% of charge). It may be that the battery needs calibration; however, it may also be the same problem coming back. We’ll see.

P.S. 2006-09-28. Random shutdowns are back. Called tech support and they will order the next potentially problematic part.

P.S. 2006-10-05. The left I/O board was replaced on Monday 2nd October, got the computer back yesterday and it failed again tonight. I am returning it tomorrow and will ask for a replacement computer. The issue is now how to get a bit over 10GB of data backed up (mostly the multimedia part, the rest—around 2GB—is already safely stored in our network), while the computer is working. If it is not possible, we will need to access the hard drive from somewhere else.

P.S. 2006-10-10. I received a new MacBook Pro, which did not present any of the problems of the previous one. Annoyances aside, I received a faster computer (2GHz instead of 1.86GHz) and I used the opportunity to upgrade the RAM from 1GB to 1.5GB. All in all, I am a lot happier with the new computer. Apple was not very keen on exchanging the computer, but I refused to accept the old one back. At least sometimes it pays to be a real pain in the butt.

I started using Python way back at the end of 1996 or early 1997. I was working on my PhD, for which the first project involved writing some simulations in FORTRAN. Originally I was using FORTRAN90, but then I needed to move my project to a server that had only FORTRAN77, so I was stuck with something that looked—at least to me—really ugly. While I was looking for alternatives (I used Mathematica, Matlab, SAS, Python and ASReml in my PhD), I stumbled on an article by Konrad Hinsen discussing using Python to glue FORTRAN programs together. Intrigued, I downloaded Python and ordered a copy of Mark Lutz’s Programming Python (the October 1996 first edition). After reading the book for a while I was hooked on the language.

I used Python on and off for small projects, and later dropped almost all programming (that was not stats) around 2001. I had missed it quite a bit, until yesterday, when working with a list of words that Orlando is using. We had typed around 450 words in Spanish (and he uses around the same number in English) and I wanted to check if we had repeated words. I downloaded Python, wrote a few lines and presto! We did have around 20 repeated words, and it was so nice to be able to write something in Python again.
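
For the record, the few lines were nothing more sophisticated than counting occurrences; a sketch along these lines (the file name is made up, and one word per line is assumed):

# Count how many times each word appears and report the duplicates
counts = {}
for line in open('palabras.txt'):
    word = line.strip().lower()
    if word:
        counts[word] = counts.get(word, 0) + 1

repeated = [w for w, n in counts.items() if n > 1]
print 'Repeated words:', ', '.join(sorted(repeated))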

After that I checked a few web pages and realised that the language has evolved quite nicely (although I rarely use the object oriented stuff) and there are at least two books that I will be browsing soon:

Both books are available as free downloads in a variety of formats, as well as in real old-fashioned paper. I will certainly buy the nicest one in a paper copy.

I forgot to mention that one of the great things about Python was the existence of an excellent set of libraries for matrix operations (at the time it was Numpy) that has grown into a great set of resources for scientific computing called SciPy.
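
As a small taste of why that matters, a couple of lines of matrix work go a long way. A minimal sketch using today’s numpy (the interface of the library I used back then differed slightly):

import numpy as np

# Least squares by hand: solve (X'X) b = X'y
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 5.0]])
y = np.array([2.1, 2.9, 5.2])
b = np.linalg.solve(np.dot(X.T, X), np.dot(X.T, y))
print(b)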

Filed in programming, software No Comments

Calling R from Visual Basic

4/02/2005

Our telephone service for the last three years has been provided by Ecomtel (a small company), although the physical infrastructure belongs to Telstra (the largest telecommunications provider in Australia). Initially we were very happy with Ecomtel’s services: they had low charges and their service seemed to be very responsive. The icing on the cake for me was their reliance on Open Source Software (e.g., Linux), which made it easier for them to be very competitive in price—particularly for international calls.

This year we logged an issue with Ecomtel because of our low speed of connection to the internet (a maximum of 14.4 Kbps, pathetic, isn’t it?). After some investigation, it was established that the problem was the quality of our line—which belongs to Telstra—and which happens to be a pair gain system rather than an individual copper line. We pointed out to Ecomtel that, according to the TIO, the minimum speed of connection to the internet should be 19.2 Kbps:

The Internet Assistance Program was set up as a joint venture between Telstra Corporation and the Federal Government to ensure a minimum transmission speed of at least 19.2 kilobits per second to all users of its fixed network. Subsequently, it was decided that a minimum speed of 19.2 kilobits per second would become a condition of Telstra’s licence agreement. While this condition is not binding on other network carriers (where Telstra does not provide the underlying infrastructure), the TIO views this as an industry benchmark and expects that regardless of which network a customer is connected to, the standard telephone line provided should be capable of a minimum transmission speed of 19.2 kilobits per second.

The fact is that it is physically not possible to achieve 19.2 Kbps with a pair gain connection. Telstra and Ecomtel say that we should pay for a new telephone connection (cost AU$209) to change to a copper line. Our position is that i) we were never offered a choice between types of line when connected in the first place, and ii) the current line does not meet the condition of Telstra’s licence agreement anyway. By the way, the ‘new connection’ only involves unplugging our line from one connector and plugging it back into a connector sitting next to the original one.

We have been discussing the issue with Ecomtel for over a month, and they have not been very responsive during this time. We will now take this issue to the TIO and see if we can get a new connection without the extra cost. During this process, we changed from loyal (telling our friends to switch to Ecomtel) to dissatisfied (writing this post) customers. A real business lesson in how to alienate your customers.

Checking the server logs I have discovered that many people that arrive at my posts on calling VB from R are, in fact, looking for the reverse. I have never done any programming calling R from VB; however, while I was looking for COM clients for R I also found information on COM servers. OmegaHat lists RDCOMServer as a package that exports S (or R) objects as COM objects in Windows. It provides examples on using VB, Python and Perl to call R code.

Another option is Thomas Baier’s R(D)COM Server, which comes with examples in the same languages used by RDCOMServer.
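
Out of curiosity, the Python route through Baier’s server would look roughly like the sketch below, using the win32com package. I have not run this: the ProgID and the method names are from memory and may differ between versions, so treat it as an illustration rather than working code.

from win32com.client import Dispatch

# Connect to the R (D)COM statistics connector (ProgID assumed, not verified)
r = Dispatch("StatConnectorSrv.StatConnector")
r.Init("R")                                   # start an R session
r.EvaluateNoReturn("x <- rnorm(100, 10, 3)")  # run R code without returning a value
print r.Evaluate("mean(x)")                   # evaluate and bring the result back
r.Close()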

Filed in programming, statistics No Comments

Matrix languages and quickies

22/01/2005

This post is an utterly miscellaneous brain dump:

  • Last week I got pharyngitis and am still taking antibiotics, a situation that I really dislike.
  • Yes, I am a doctor in the real sense of the word, despite what the physician that prescribed the antibiotics thinks.
  • Last Saturday I got one of the worst haircuts ever—at least that I can remember—at Just Cuts. Yes, it is my fault for choosing to go to such a dubious place in the first place: avoid it if you can. Nevertheless, every time I passed outside Paul’s barber shop he was busy. Today I went to his place with my tail between my legs and begged him to fix it. We had a laugh, he fixed it, and he made me promise not to repeat my sin.
  • Cooked a beautiful marinated octopus pasta last night. Looking forward to eating the leftovers at lunch time.
  • Last Christmas I got a few vouchers from ‘Music without Frontiers’, one of the few music stores in Hobart where one can find something outside the ‘top 20′. I went back to my old listening habits, and got:
  • Andrew told me that last Saturday was Captain Beefheart’s birthday. I did not know who CB was so I will have to borrow some of his art.

It is hard for me to get interested in current mainstream music: no challenges, one can guess what is coming so easily that it tends to be a big yawn. That’s all folks.

Matrix languages like Matlab and S+, or their Open Source siblings Scilab and R, are highly productive and a joy to use. I wrote programs in Matlab during my PhD and I can still go back to the code and perfectly understand what is going on there. Now I am writing a lot of S+ and R code where a few lines manage to perform complex operations.

A good programmer can certainly produce better performing programs (in terms of speed and memory requirements) using a low(ish) level language like C or C++. I am not such a good programmer and it would take me ages to do some of my work if I needed to write things in those languages. Most of the time execution speed and memory usage are not the limiting factors, and speed of development rules.

I am extremely happy now using R and playing with the idea to use it as a statistics server for a few small applications. Omega Hat seems to be a very valuable resource for all things ‘connecting R to other software’.

A long lived quicky

Around 2001 I wrote a ‘temporary quicky’ to compare new Eucalyptus samples to already identified haplotypes. I did that in a few lines of VBA in MS Excel, which was the software used as a repository for these haplotypes. At the time I said ‘this is a quick fix and it would be a good idea to develop a proper database’, and suggested a structure allowing for user roles, web access, etc. I was told that ‘this is not a priority’ and ‘we are happy with the spreadsheet’.

Yesterday I was having lunch with the owner of this spreadsheet, who told me that a) it is still being used after four years! and b) they were having some problems because they changed the structure for storing the haplotypes a bit. I offered to help fix the problem but I was told that ‘one of my students will try to fix it, because the problem has to be something very simple’.

I thought the comment was a bit dismissive: if it was so easy, why hadn’t they fixed it in over a month? Granted, the code is extremely simple, but they do not have any programming experience whatsoever.

VBA is a fine scripting language, which allows people to write short and useful programs. However, I would question whether, in this case, an Excel spreadsheet is the best option for storing molecular genetics information.

A better generic language

In general, scripting languages (like Matlab or R) feel like a better fit for me. Python, my all-time favourite language, feels much more productive than any other language I have ever used. In addition, combining Python with the Numerical Python library produces an excellent all-purpose/matrix programming language. This can be used for prototyping and—if one is happy with performance—transformed into a standalone program using a utility like py2exe.
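
The py2exe step itself is pleasantly short; a minimal setup script would be along these lines (assuming py2exe is installed, and with prototype.py standing in for whatever script is being packaged):

# setup.py -- build a standalone executable with:  python setup.py py2exe
from distutils.core import setup
import py2exe

setup(console=['prototype.py'])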

Filed in programming, software No Comments

Calling Visual Basic DLLs from R, part 2

21/12/2004

After yesterday’s experience calling other software from R, I continued working on connecting to my DLL. I first registered the DLL as a COM server in Windows by typing at the command line:

regsvr32 c:\data\connectgrowthmod.dll

Then I had a look at the DLL using the SWinTypeLibs library, typing in R:

library(SWinTypeLibs)
wood <- LoadTypeLib("c:\\data\\connectgrowthmodels.dll")
getTypeLibTypes(wood)
DLLFuncs <- getFuncs(wood[[1]])
DLLElements <- getElements(wood[[1]])

The command getTypeLibTypes(wood) showed that the DLL contained _cPlantationModels (of type dispatch) and cPlantationModels (of type coclass). That is why I then continued working with wood[[1]], which is of type dispatch.

After that I needed help from Tim. He had compiled the DLL leaving the class module cPlantationModels with an instancing of type 5 (multiUse), making it available for outside users. Connecting to the library was relatively straightforward with Tim’s help:

library(RDCOMClient)
growthmod <- COMCreate("growthmodels.cPlantationModels")
growthmod$ModelCreate(30,11,8,4,0,25,5,5,1000,0,1,2,0,1)
standvol <- growthmod$TreeValue(20)
growthmod$ModelDestroy()

This code loaded the RDCOM client library, created a connection to the DLL, instantiated a growth model with starting values, evaluated the model and destroyed the instantiation. Now we are ready to start testing the models from inside R.

Filed in programming, software, statistics 1 Comment

Calling Visual Basic DLLs from R

20/12/2004

I want to test a series of growth models that are part of a DLL (Dynamic Link Library) written in Visual Basic. Although I could write several pieces of code in VB to test the library, I will try to access it from R. In this way I can run simulations and then process the results from within a statistical package.

I obtained the RDCOMClient and the SWinTypeLibs facilities from the Omega Hat Project for Statistical Computing. If you are running R 2.0 you should get the latest compilations from here (RDCOMClient_0.8-1.zip, 11-Oct-2004, 12:08, 433K and SWinTypeLibs_0.2-1.zip, 11-Oct-2004, 12:09, 147K).

It is possible to learn about the classes, methods, parameters and return types of a COM server using the SWinTypeLibs library. Once we have learned about the server, we can access it using RDCOMClient (this is explained in this presentation in PDF format, 50KB). For example:

library(RDCOMClient)
ie = COMCreate("InternetExplorer.application")
ie[["Visible"]] <- TRUE
ie$Navigate2("http://www.uncronopio.org/quantum/")

will load the library, connect to Internet Explorer, display the window and point the browser to Quantum Forest. An example using Excel would be:

exc <- COMCreate("Excel.Application")
exc[["Visible"]] <- TRUE
books <- exc[["Workbooks"]]
books$Add()
actsheet <- books[[1]][["ActiveSheet"]]
myrange <- actsheet$Range("A1:C10")
myrange[["Value"]] <- asCOMArray(matrix(rnorm(30, 10, 3)))

This will connect to Excel, show the program, get the list of workbooks, add one, get the active sheet, define a range and fill it with normally distributed random numbers (mean 10, standard deviation 3).

I will continue exploring the package to see if I can connect to my own DLL.

Filed in programming, software, statistics 1 Comment

Prototyping breedOmatic

28/09/2004

This was the question I asked — tongue in cheek, almost a year ago — in The Agora. What explains my buying a bottle of ‘organic water’ in Scamander (north-eastern Tasmania), the high prices for crappy ‘organic tomatoes’ (small, full of blemishes and tasting no different from a normal one), or the resistance against GMO (Genetically Modified Organisms) in many parts of the world?

I have participated in a few discussions about GMO and some people have a visceral reaction against them because they are ‘unnatural’. Well, plastics, planes, cars and computers are unnatural and we happily interact with them on a day-to-day basis. People eat varieties of crops and fruits that are the result of mutations induced through radiation exposure, but they will complain against GMO plants. People complain against cloning, but they will buy apple or rose varieties that are, in fact, clones. What makes a chimera, clone or GMO so different from a ‘natural’ organism?

Similar issues arise with the distinction between native and exotic animals, for example. Of course exotic animals must be native to somewhere on the planet. A related distinction is often made between ‘cute and furry’ and ugly animals. Thus, people complain about poisoning a possum, but will happily eat a beef steak or a chicken burger, poison a spider (normally native) or squash an insect (probably native too).

I have no answer for this situation, but my sense of curiosity makes me wonder about the differential treatment given to living organisms based mostly on cultural codes.

In June I made some comments on how chemists (pharmacies) were stifling competition. That was a comment based solely on my experience dealing with them. Ten days ago Choice Magazine, owned by the Australian Consumers’ Association, published a report on the quality of advice and pricing provided by pharmacies.

Access to the report requires a paid subscription (which I have), but the main findings were made public. The study included 87 pharmacies in Sydney, the Wollongong area and Adelaide, and found that:

  • Advice given in 58 out of the 87 pharmacies we visited was rated ‘poor’ by our experts. The pharmacy profession needs to improve the quality of advice being given to consumers.
  • Speaking to a pharmacist rather than a pharmacy assistant didn’t guarantee good advice.
  • In a price spot-check of two of the products our researchers bought, the most expensive of each product in a supermarket was still cheaper than the cheapest pharmacy price for the same item.

The report also claimed that:

…there’s some evidence that restricting pharmacy ownership may be limiting competition and make prices for some medicines higher than they would otherwise be, and the results of our spot-check seem to support this.

Sadly, it seems that my experience with pharmacies was not an exception at all.

One of the good things of having a job is receiving an income that I can spend as I prefer. One of my monthly expenses is supporting charities. I like the feeling of contributing to something useful—and it is tax deductible. There are many good causes, and I tend to support organisations that work on improving people’s lives, like Amnesty International or World Vision.

Today a friend brought to my attention the Sponsor a Giant campaign to save the Styx Valley. This is organised by the Wilderness Society to provide the Society ‘with both lobbying power and the financial support needed to protect ancient trees’. Contributors pledge A$50/month and get:

  • Latest Edition of Wilderness News
  • Styx information pack
  • Double sided El Grande image
  • Journey into the Old Growth CD Rom
  • Styx sticker
  • Special welcome letter with El Grande press release on the back
  • Sponsor a Giant certificate

I couldn’t help comparing it with World Vision’s Sponsor a Child campaign, where for A$39/month the child’s community receives help ‘for vital development work such as providing clean water, immunisation and healthcare and training in improved farming methods’, and the sponsor receives:

  • A picture folder with photo and details of your sponsored child, and information about his/her country
  • A yearly report on your sponsored child’s progress plus a new photo
  • Your choice of a regular online newsletter or printed magazine updating you on World Vision’s work around the world.

After thinking for two milliseconds, I chose a child.

Yesterday I changed from an old, clunky Dell laptop to a plain, business-like Acer TravelMate 290. I received the computer with a barebones installation: Windows 2000, Office 97, GroupWise and the Java Virtual Machine. This is equivalent to getting a new car with only a chassis and engine, and puts on me the burden of installing a large number of programs and trying to get the machine up to my standards. Incidentally, the computer comes with old Microsoft software because IT services at work can’t see the advantage of switching to XP and spending a couple of million in the process. I tend to agree, although I wouldn’t mind ditching GroupWise for MS Outlook.

This will involve immediately installing:

This will be followed by a plethora of other programs. In addition, programs with the symbol require some sort of product activation, which I always find a hassle. Thus, a few projects will be delayed, including updating a couple of web sites.

This weekend I was working on my latest project — codename breedOmatic — writing a Python script to process the information obtained through a web form (see the sketch at the end of this post). I started working directly on the ‘production machine’; however, with a lousy internet connection the development and testing process was really slow.

I decided to use IIS (Internet Information Services) on my laptop — I am running Windows 2000 — but I discovered that it was not available. As an aside, at work machines are always set up in a very limited, barebones way. So I needed to download a very small server (so I had some hope with my connection). Here comes TinyWeb, a wonderfully small (53KB) free web server by Ritlabs. I also installed TinyBox, a free controller for TinyWeb. The two programs work very well together.

In a previous post I mentioned that I would give JEdit, a Java-based text editor, a try. Well, I installed it, I have had a great experience using it, and I am writing this post with it. Tip: if you are going to be working with HTML or XHTML, a very useful plugin for this editor is the XML plugin. There are some dependencies though, so you also need two other small plugins: Sidekick and ErrorLine.

P.S. 2004-09-28
1. I found a bug in JEdit when printing to networked printers under Windows 2000.
2. Note to self: the first line of a Python script needs to contain #!c:\python23\python.exe to be run on a Windows server.
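
As a footnote, a script like the one described above boils down to plain CGI; a minimal sketch of the shape of it (the form field name is invented, and the shebang follows the note above):

#!c:\python23\python.exe
# Minimal CGI sketch: read a form field and echo it back as HTML
import cgi

print "Content-Type: text/html"
print

form = cgi.FieldStorage()
sample = form.getvalue("sample_id", "unknown")
print "<html><body>"
print "<p>Received sample: %s</p>" % cgi.escape(sample)
print "</body></html>"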

Filed in programming, software, web No Comments

Time in Splus

12/02/2004

It has been a while since I wrote the previous post. Most of my ‘web time’ has been devoted to putting together the Forestry in Tasmania site (where I am still doing some work). Anyway, this week I imported an Excel spreadsheet into Splus, where I had defined a few columns as time. I usually do not use time formats in Excel, but this was not my project and I needed the times in Splus. Surprise! All the times were there, but with weird dates attached. Then I tried importing dates and, surprise again, the dates had 00:00:00 attached. To be more precise, only the time showed in the column when using the data viewer.

A few visits to the online language reference and to the discussion group archives were useful. Times and dates are stored using the timeDate class, and it is possible to query a variable containing time and date using the functions hours, minutes and seconds for time, and days, weekdays, months, quarters, years, and yeardays (which gives the number of the day within a year) for dates. These functions return vectors. In addition, it is possible to create data frames where each column contains an integer, using the following functions: hms (hours, minutes and seconds), mdy (month, day and year) and wdydy (weekday, yearday and year).

timeDate can also be used to transform strings to objects of class timeDate. For example:

x <- timeDate(c("1/1/1998 3:05:23.4",
"5/10/2005 2:15:11.234 PM"))

which then can be queried as:

mdy(x)
hms(x)
wdydy(x)
hours(x)
etc.

Filed in programming, statistics No Comments