Friday, May 15, 2009

A trip to Persia - GIS conference for municipalities in Mashhad

To start, a few pictures from an excursion to the tomb of Ferdosi Toosi near Mashhad:

Thursday, May 14, 2009

Ontology with Processes for GIS gives a Geographic Simulation Model

The introduction of process descriptions into GIS is a long-standing desire. A partial answer is given by models of continuous geographic processes with partial differential equations. Other spatial processes can be modeled with cellular automata and multi-agent simulations.

If a GIS maintains data in relation to observation time (e.g., snapshots) and includes in its ontology the descriptions of the processes, then the datasets in the GIS become linked by the processes. The GIS can simulate the development and compare it with the observed changes in the real world, deriving values for (local) constants of the process models, etc.

Integrating time-related observations with process models in a GIS gives a substantially different information system from current GIS, which are well-organized, useful repositories for data. The new GIS is a spatial simulation system.

It may be interesting to speculate on the timeframe in which such a transformation could occur; consider that from a gleam of GIS in researchers' eyes to the current reality of GIS workhorses took some 20 years; consider further that the integration of snapshot time into GIS is past the research stage but not yet fully integrated in practice. One may conclude that the "GIS as a geographic simulation system" may take another 25 years.

Friday, May 8, 2009

Synchronizing multiple computers

For 20 years I had only one computer, a laptop and later a tablet. I like the slate tablet from Fujitsu Siemens, because it lets me arrange screen and keyboard as I like. Unfortunately, it is heavy and the batteries run out quickly, thus I have added an Asus eeePC 901 "for the road" - ending up with three machines: a tablet at home, a tablet at work and the eeePC anywhere else.

Synchronizing the three computers became crucial. I regularly missed a file I needed because it was stored on another machine, and I had to repeat on each machine every change I made to the setup.

Synchronizing the files I work on was easy: Unison does it. I set up a directory on a machine on the web (computer number 4!) and I sync each of the three machines I work on against this fourth one. Unison works reliably and points out files which have a conflict and need manual (human-intelligence) resolution. But still, every change to a setup in an application had to be redone on the other machines... maddening!
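
For illustration, a minimal Unison profile for such a hub setup could look like the following sketch (hostname and paths are invented, not my actual configuration):

# ~/.unison/hub.prf - sync the local home directory against the hub
root = /home/me
root = ssh://hub.example.org//home/me
batch = true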

When I upgraded all three computers to Ubuntu 9.04 Jaunty, I set them up with exactly the same software and then decided to synchronize all my files - including the hidden files describing my configuration and user preferences - using Unison. The unix-style operating system separates the setup of the computer (which is evidently different for each piece of hardware) from all user-specific aspects, which are stored under /home/user_name. This works!

As an added benefit, the files I have on computer number 4 are automatically backed up every night (with rdiff-backup, keeping all old copies). This is sufficient for retrieving previous versions of files; a backup of the installation is not needed, because it is faster to re-install everything, which takes less than 2 hours, including installing additional software and the one or two edits necessary.
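
The nightly backup amounts to a cron entry like this sketch (the paths are invented; rdiff-backup keeps the increments, so older versions of files remain retrievable):

# crontab entry on computer number 4: back up the home directory at 03:00
0 3 * * * rdiff-backup /home/me /backup/me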

To reduce traffic on the web, I excluded all "cache"-type files and try to find the files which are specific and must remain different on the three machines. The file with the list of paths and filenames to ignore when synchronizing is included at the end, in the hope that I may get some hints what I should add - or rather what I must synchronize to avoid disasters.

I guess others may have similar needs - having all your files and setup available all the time is what Google and Microsoft try to sell. To make it a reality under my full control, software designers should think of:
1. keeping all cached data in files or directories called 'cache', and not relying blindly on cached data being preserved,
2. keeping configuration in small files, easily shared,
3. allowing local configurations (and other data needed per machine) in files or directories called 'local' (or with names including the hostname).
It should be possible to have a live system on a stick, which you can plug into any reasonable computer, start up, sync and have most of your files locally to work with.

For things which must be different on different machines, I use scripts with a case statement, as shown here:

#!/bin/sh
# to execute depending on the host: each function runs the Unison
# profile appropriate for the machine this script runs on

p904Atlanta () {
unison -sshargs "-o ProtocolKeepAlives=30" gi41_mergeAtlantaBern
exit 0
}

p904Bern () {
unison -sshargs "-o ProtocolKeepAlives=30" gi41_mergeAtlantaBern
exit 0
}

p904Hamburg () {
unison -sshargs "-o ProtocolKeepAlives=30" gi41_mergeHamburg
exit 0
}

THISHOST=$(hostname)

# select the sync profile by hostname
case "$THISHOST" in
bernJ) p904Atlanta;;
atlantaJ) p904Bern;;
hamburgI) p904Hamburg;;
*) echo "unknown host $THISHOST";;
esac

The list of files excluded for Unison:
ignore = Path .unison/*
ignorenot = Path .unison/*.prf
ignorenot = Path .unison/*.common

ignorenot = Path .nx
ignore = Path .nx/*
ignorenot = Path .nx/config

ignorenot = Path .svngzPrefs.txt

ignore = Path .beagle
ignore = Path .cache
ignore = Path .cabal
ignore = Path .eclipse
ignore = Path .evolution
ignore = Path .mozilla-thunderbird
ignore = Path .popcache
ignore = Path .wine
ignore = Path .nautilus
ignore = Path .thumbnails

ignore = Path .xsession-errors
ignore = Path .pulse
ignore = Path .ICEauthority
ignore = Path .Xauthority

ignore = Path .dbus
ignore = Path .config/transmission
ignore = Path .opensync*
ignore = Path .ssh/id_rsa
ignore = Path .gnome2/gnome-power-manager/profile*
ignore = Path .gconfd/saved*
ignore = Path .recently-used.xbel

ignore = Path {unison.log}
ignore = Path {local_*}
ignore = Path Photos
ignore = Path {workspace}
ignore = Path {experiments}

#avoid temp files
ignore = Name temp.*
ignore = Name .*~
ignore = Name {*.tmp}
ignore = Name theResult
ignore = Name *cache*
ignore = Name *Cache*
ignore = Name cache*
ignore = Name insertedRepo_*

ignore = Name trash
ignore = Name .trash*
ignore = Name Trash
ignore = Name *Trash*
ignore = Name *trash*

ignore = Name urlclassifier*

perms = 0o1777

Laws of nature and laws of human nature

The current crisis is a welcome opportunity to rethink politics, the economy and "all that". In this and related short blog posts to come, I will write down my current understanding of "how the world works". I start with what I think is fixed and cannot be changed by humans within the time frame of human history:
  1. The laws of nature: physics, chemistry, etc.

    Lawrence Lessig (Code, 2000) has pointed out that the laws of nature apply to everybody and nobody can escape them. Water flows downhill for everybody!

  2. The fundamental aspects of human nature.
    In principle, human nature is changeable, but within the timespan of history it must be accepted as constant. The human desire for love, the fear of death and pain, hope and all the rest are constant. Ignoring or maintaining illusions about human nature is as damaging as ignoring the laws of nature.

In contrast to the constant nature of these laws, our knowledge and understanding of the laws of nature and the laws of human nature changes greatly over time. Economically relevant is the current knowledge of these laws and how they can be used in production processes.

Changes in the knowledge of the laws of nature and the laws of human nature change technology in a wide sense, and thus the economy; the values of things owned change, some becoming more valuable, others less. Examples: the oil buried under the sand of Arabia became valuable with the advent of individual motorization; the quarries of limestone suitable for lithography lost value when other, less demanding printing methods were found.

Sunday, May 3, 2009

Install Cabal with Ubuntu 9.04 Jaunty and GHC 6.8.2 or 6.10.1

Cabal is a marvelous help for installing the Haskell packages found on Hackage. There is a very long and rapidly growing list of Haskell packages on Hackage to solve nearly any programming task: reading XML files, connecting to databases, building graphical user interfaces (wx) or running a web server - all this and much more is available. The only problem was to find the right versions that work together, such that the GHC package manager is satisfied and the result runs.

The regular method to install a package was (an example sequence follows the list):
  1. find the package on Hackage http://hackage.haskell.org/packages/archive/pkg-list.html#cat:user-interface

  2. download the package as a .tar.gz file (link at the bottom of the page)

  3. unpack the file in a directory of your choice

  4. change into this directory

  5. runghc or runhaskell Setup.hs configure (or Setup.lhs - whatever is in the package)

  6. runghc or runhaskell Setup.hs build

  7. sudo runghc or sudo runhaskell Setup.hs install
    With this command, the compiled content is moved to /usr/local/lib and registered with the package manager of ghc (in /usr/lib/ghc-6.8.2)
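
As a concrete sketch, the whole sequence for a hypothetical package foo-1.0 (name and URL are invented, following the usual Hackage pattern):

wget http://hackage.haskell.org/packages/archive/foo/1.0/foo-1.0.tar.gz
tar xzf foo-1.0.tar.gz
cd foo-1.0
runhaskell Setup.hs configure
runhaskell Setup.hs build
sudo runhaskell Setup.hs install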

The procedure is simple, but satisfying the dependencies between packages is time-consuming. A package may need other packages to be installed first, which is discovered in step 5 (configure). The required package must then be installed first; it is usually easy to find on the Hackage page of the dependent package, but may require yet another package...

Cabal automates this. The only problem was that I could not find a ready-made cabal-install program and had to construct it. I had a new and clean installation of GHC 6.8.2 on Ubuntu 9.04 Jaunty (and the same should apply to Ubuntu 8.04 Hardy and 8.10 Intrepid). I loaded a bunch of packages already available in the Ubuntu repository, of which libghc6-network, libghc6-parsec and libghc6-zlib (with the dependent zlib...) are likely the only relevant ones here.

The blog http://www.kuliniewicz.org/blog/archives/2009/03/24/installing-ghc-610-on-ubuntu-intrepid/comment-page-1/ gave me a start, but I ran into the problems with cabal-install-0.6.2 described there, probably because I had GHC 6.8.2, and into difficulties building network, which I could not download. I gave up on GHC 6.10, which is not yet available for Ubuntu.

I tried first to use the traditional method to install cabal-install, but later found a simpler method:

  1. get cabal-install-0.6.0 from Hackage, and
  2. run sudo sh bootstrap.sh

Required are the packages mtl, parsec and network, which I had. The result is an executable ~/.cabal/bin/cabal, which I copied to /usr/bin (sudo cp ~/.cabal/bin/cabal /usr/bin). Next, it may be necessary to run cabal update to download the list of packages from Hackage; the whole route is sketched below.
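
Put together, the bootstrap route looks roughly like this (the download URL is an assumption, following the usual Hackage pattern):

wget http://hackage.haskell.org/packages/archive/cabal-install/0.6.0/cabal-install-0.6.0.tar.gz
tar xzf cabal-install-0.6.0.tar.gz
cd cabal-install-0.6.0
sudo sh bootstrap.sh
sudo cp ~/.cabal/bin/cabal /usr/bin
cabal update   # fetch the current list of packages from Hackage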

Now life should be wonderful! (I did just install WxGeneric with a simple sudo cabal install WxGeneric...)

Alternatively, and on a different computer, the traditional approach with manual download and configure/build/install worked for the combination:

  1. HTTP 3001.0.4 (produces some warnings)
  2. Cabal 1.2.3
  3. cabal-install 0.4.0

I could not get higher versions working - but the result is satisfying.

Later note for installation with GHC 6.10.1:

I installed manually only parsec-2.1.0.1 and network-2.1.1.2 (with runhaskell Setup.hs configure/build/install), and then the ./bootstrap script in cabal-install-0.6.2 ran.
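
As a sketch, this route condenses to the following (assuming the three packages are already downloaded and unpacked):

for pkg in parsec-2.1.0.1 network-2.1.1.2; do
  (cd $pkg && runhaskell Setup.hs configure && runhaskell Setup.hs build && sudo runhaskell Setup.hs install)
done
cd cabal-install-0.6.2 && sudo sh ./bootstrap.sh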

Sunday, April 12, 2009

Where is the money gone?

We have an economic crisis. Nobody has money; many starve, have lost their homes, etc. But where did the money go? On the blog http://scienceblogs.com/goodmath/2009/03/bad_bailouts.php#more the question was raised and some partial answers given. I will try an explanation here:

There is some confusion about terminology: by money I mean the abstract rights which are described with amounts of a currency (Dollar, Euro). By value, I describe real things which can be used and produce benefits (e.g. my home, my car, a company with its tangible and intangible assets).


The real confusion starts when newspapers report 'today x billions were lost on the stock exchange' (or, even worse, 'x billions were destroyed'). Who destroyed the money? Where did it go when it was lost? The answer is simply: 'the illusion of value went away'. Here is why:

In the morning I hold 1000 shares of, say, Ford Motor Company; this means I own a (small) piece of the company with all its assets. They were traded for $16 the evening before, and my banker thinks my net value is $16,000. In the evening I still have the same 1000 shares of the same company, but they are traded now for only $12. What have I lost? Obviously no value was lost, but in the view of my banker, my net value is reduced by $4,000. When the stock price came down, the stock did not represent less real value in terms of the assets behind it; what came down was the market value of the stock.

Before I look at why people really lost money, let me discuss what happened while stock prices were rising:

I recently bought 1000 shares of a company owning large buildings in Vienna for Euro 0.70 each. These shares are now traded for 1.30. Have I now earned 600 Euros? Not yet. If I sell the stock, then I have earned 600, which I can use to invite my friends to a sumptuous dinner. Likewise, I would have realized my loss from holding Ford stock only if I had sold it. A gain or a loss in money terms is associated with an action: converting money to another asset and then back. The same is true for buying a home - there is no gain or loss as long as I hold on to it. The transactions produce a gain or loss, or better, an increase or decrease in my net value as seen by my banker.

So – where is the money gone?

If I buy stock or a home but do not pay all of it with my own money, and instead ask the bank for a loan, guaranteed by the stock or a mortgage on the home, I can buy more than I could pay cash for. If the stock trades higher later, I sell it, pay back the loan and pocket the difference. The percentage I can earn in this form is higher than if I buy with cash. Consider the above example: instead of using my 700 Euro to pay for 1000 shares, I buy 4000 and ask the bank to loan me 2100 Euro, which together with my 700 pays for the 2800 total (I disregard fees etc.). If I sell at 1.30 I get 5200, pay back the 2800 and keep 2400; my net value has increased by 2400 in a few weeks - well over 300 percent on my own 700. This is aptly called 'leverage'.

Where is the catch?

If the stock now trades lower, say at 60 cents, then the bank fears for its loan and asks me to reduce the loan by sending them cash - and if I do not send it promptly, they will sell the stock for whatever price it fetches, say 55 cents. This gives 2200, of which the bank takes 2100 to pay back the loan, and 100 remains for me - that is what is left of the 700 Euro I had initially! Now I have really lost 600 Euro (compared to not having bought and sold the stock).
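
The worked numbers of both scenarios, as a small shell sketch (amounts in cents so the arithmetic stays in integers; fees disregarded as above):

#!/bin/sh
SHARES=4000
OWN=70000    # 700 Euro of my own money
LOAN=210000  # 2100 Euro loan from the bank
# upside: sell at 1.30 - proceeds minus loan minus my own stake
echo "gain selling at 1.30: $(( (SHARES * 130 - LOAN - OWN) / 100 )) Euro"    # 2400
# downside: forced sale at 0.55; the bank takes the loan back first
echo "left after forced sale at 0.55: $(( (SHARES * 55 - LOAN) / 100 )) Euro" # 100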

If many buy stock or homes paid for partly by loans, and market prices slide down such that banks see their loans no longer covered and force the owners to sell, then prices go further down, more owners may be forced by their banks to sell, and the market prices are pushed down further still - as happened last year.

Reading Galbraith's [The Great Crash of 1929] and Krugman's [The Return of Depression Economics and the Crisis of 2008] accounts of economic crises indicates that the economic downturn - an event which happens regularly every 7 to 15 years - started before the crisis broke out in every case (e.g. fall 2007); the crisis itself is the product of leveraged buying, which multiplies the effects of the normal economic ups and downs. We had many years of ups, and some people benefited from leveraged buying; now we had the downturn and many got caught.

Still, where is the money gone?

The market values of homes, stocks, companies etc. all increased steadily, and leveraged buying of these valuable things was a good deal. People were able to sell and realize the gains in money (or were just increasing their net value in money). They used the money appearing in their bank accounts to buy things of real value - cars, dinners, massages, ships, companies - keeping the real economy going.

The non-intuitive aspect of this account is that the gains were realized first and the losses occurred later. The bubble economy was giving credit to the ones willing to run risks. When everybody else jumped on the bandwagon, it started sliding backwards. The clever ones had realized their gains and moved them to other, more stable forms of assets; the latecomers lost their investment.

Now: who benefited from the bubble? Certainly the ones who cashed the big bonuses and realized the big gains, but also all the ones who bought and sold homes and used the money for this and that - meaning, a little bit, everybody (including state employees, benefiting from increased taxes used to pay salaries... me included).

The figures reported in the press describe Ponzi schemes built on top of the simple examples used above: one can leverage the leveraged investment in stocks, do leveraged buy-outs and translate them into stocks; one can produce securities from risky mortgages - and then create leveraged investments in these securities. For example, the loss in Madoff's investment vehicle - a classic Ponzi scheme - is reported at around 50 billion; the actual investment of real (?) money from outside is closer to 15-20 billion; the rest are gains appearing in the books but never realized.

The billions necessary to keep the system-relevant banks afloat - a classic scheme of privatizing gains and socializing losses, as Stiglitz has pointed out - are inflated by the leverage schemes used to create the investment vehicles now appearing in their books as toxic assets. I do not see enough analysis of the causes of the crisis to believe that the bailout is using the best (least public cost) medication to overcome the current illness of the finance system.